DELTA Module 1 - Paper 2 Task 1 - Assessment and Testing

Practicability

Def: (TESTING) the extent to which a test is easy and cheap to construct, administer, score and interpret.

Testing

Def: (TESTING) A form of assessment that can happen at any stage of the teaching/learning process. Ex: diagnostic, placement, progress, achievement, formative, summative.

Computer adaptive testing

Def: (TESTING) Candidates are presented initially with an item of average difficulty: those who respond correctly are presented with a more difficult item, those who respond incorrectly get an easier item. At the end of the test, an estimate of the candidate's ability is produced. Ex: an oral interview where the tester changes questions based on the apparent level of the candidate's answers. Extra info: a potentially more efficient way of collecting information on people's ability.
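
The adaptive item-selection described above can be illustrated with a short sketch. This is a minimal illustration only, assuming a hypothetical item bank organised by difficulty level and a crude step-up/step-down rule; operational CAT systems estimate ability with Item Response Theory models rather than a raw difficulty level.

```python
# A minimal sketch (not an operational algorithm) of the adaptive loop:
# start at average difficulty, move up after a correct answer, down after
# an incorrect one. 'item_bank' and 'ask' are hypothetical names; real CAT
# systems estimate ability with Item Response Theory, not a raw level.

def run_adaptive_test(item_bank, ask, n_items=10, start_level=3):
    """item_bank: dict {difficulty level: list of items}, 1 = easiest.
    ask: function(item) -> True if the candidate answers correctly."""
    level = start_level
    lo, hi = min(item_bank), max(item_bank)
    for _ in range(n_items):
        item = item_bank[level].pop()      # present an item at the current level
        if ask(item):
            level = min(level + 1, hi)     # correct -> harder item next time
        else:
            level = max(level - 1, lo)     # incorrect -> easier item next time
    return level                           # crude stand-in for the ability estimate
```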

Diagnostic test

Def: (TESTING) a test designed to evaluate learners' needs and current ability.

Objective testing

Def: (TESTING) a test where no judgement is required on the part of the assessor when scoring. This element gives these tests more reliability. Ex: a multiple choice quiz with no ambiguity.

Proficiency test

Def: (TESTING) a type of test designed to measure people's ability in a language, regardless of any training they may have had in that language. The content is not based on the content or objectives of a language course, but on a specification of what candidates have to be able to do in the language in order to be considered proficient. Ex: a placement test / a university entrance exam assessing language ability / a language test for a job as a translator.

Placement test

Def: (TESTING) a type of test learners are given at entry level to ascertain their level.

Objective approach

Def: (TESTING) basing the test content directly on the objectives of the course, but not necessarily following the same content that was taught on the course.

Syllabus-content approach

Def: (TESTING) basing the test directly on a detailed course syllabus or on the materials/books that have been used. Ex: using (or slightly adapting) texts and questions from the students' coursebook. Extra info: can be seen as a 'fair' test in that it only contains what students have actually encountered. A problem arises, however, if the course itself is badly designed!

Content validity

Def: (TESTING) how far the content of the test constitutes a representative sample of the language skills, structures, etc. with which it is meant to be concerned. We measure this by comparing the test with the specifications of skills/language it is supposed to cover. Ex: a speaking test which tests all elements of the specification would have high ________ validity.

Concurrent validity

Def: (TESTING) measures how well a new test compares to a well-established test of the same construct (i.e. testing the same thing). Ex: the teacher's assessment of the student in class is compared to the test results. OR if the test had to be reduced to 10 mins, a sample of students is given the 'full' 45 min test and their results are compared to their 10 min results to see how strongly the two sets of scores correlate.
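
The 10-minute vs 45-minute comparison above is usually reported as a correlation coefficient. A minimal sketch, assuming invented scores for eight candidates (statistics.correlation requires Python 3.10+):

```python
# Pearson correlation between full-length and shortened test scores.
# The score lists are invented for illustration only; a coefficient
# close to 1 would suggest good concurrent validity for the short test.
from statistics import correlation  # available from Python 3.10

full_45min_scores = [72, 65, 88, 54, 91, 60, 77, 83]   # hypothetical data
short_10min_scores = [70, 61, 85, 58, 89, 63, 74, 80]  # hypothetical data

r = correlation(full_45min_scores, short_10min_scores)
print(f"Concurrent validity coefficient: {r:.2f}")
```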

Progress/formative test

Def: (TESTING) periodic tests given to learners throughout a course to monitor the learning process. Teachers can use the results to modify teaching and learning activities to further boost student attainment. Ex: a pop quiz.

Integrative test

Def: (TESTING) testing many language elements together to complete a task. These tests tend to be direct and communicative, but some are indirect. They are often more subjective in terms of marking. Ex: a dictation (students have to listen and write)

Discrete-point test

Def: (TESTING) testing one element at a time item-by-item. These tests will almost always be indirect. These tests are easy to administer and mark, and are objective in terms of marking. Ex: a test where students have to re-write past simple sentences into the present perfect.

Criterion-referenced testing

Def: (TESTING) tests and assessments that are designed to measure student performance against a fixed set of predetermined criteria or learning standards. Candidates have to achieve the standard in order to pass. They show the candidate what THEY can DO with the language, as opposed to comparing them to other candidates. Ex: IELTS, TOEFL

Subjective testing

Def: (TESTING) tests where judgement is required by the assessor when scoring. Ex: a writing composition. Extra info: the degree of subjectivity can vary (e.g. marking a composition is more subjective than marking short answers to specific-information reading activities).

Norm-referenced testing

Def: (TESTING) tests where there is no criterion for passing, but where a candidate's results are interpreted in relation to the results of other candidates. Ex: SAT tests, IQ tests.
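
To contrast this with criterion-referenced testing above, here is a short sketch interpreting the same raw score both ways, with an invented cohort and an assumed pass mark:

```python
# Invented scores: the same candidate result read two ways.
# Criterion-referenced: pass/fail against a fixed standard (assumed 60 here).
# Norm-referenced: the result reported relative to the rest of the cohort.

cohort_scores = [45, 52, 58, 61, 67, 70, 74, 81, 88, 93]  # hypothetical
candidate_score = 67
pass_mark = 60                                            # assumed criterion

criterion_view = "pass" if candidate_score >= pass_mark else "fail"
beaten = sum(s < candidate_score for s in cohort_scores)
percentile = 100 * beaten / len(cohort_scores)

print(f"Criterion-referenced: {criterion_view}")
print(f"Norm-referenced: above {percentile:.0f}% of the cohort")
```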

Indirect testing

Def: (TESTING) tests which attempt to measure the abilities that underlie the skills in which we are interested, without concern for authentic application. Ex: In a writing test, candidates have to identify punctuation errors in sentences (but don't actually write their own sentences).

Achievement/summative test

Def: (TESTING) tests which happen at the end of a course where the focus is on the outcome. They are directly related to language courses, their purpose being to establish how successful individual students, groups of students, or the courses themselves have been in achieving objectives. Ex: final achievement tests at the end of a course of study.

Construct validity

Def: (TESTING) the degree to which a test measures accurately what it is intended to measure. Evidence for this needs to be found through 'content' validity and 'criterion-related' validity. Ex: a test for a course with the objective of teaching reading sub-skills that contained no reading texts or tasks would have no ___________ validity.

Criterion-related validity

Def: (TESTING) the degree to which results on the test agree with those provided by some independent and highly dependable assessment of the candidate's ability.

Backwash/washback

Def: (TESTING) the effect of testing on teaching and learning. It can be harmful or beneficial. If the test's content and techniques are at variance with the course objectives, the effect is likely to be harmful. Ex: the teacher might try to narrow down the course content to help Ss pass a test, which could create negative ______.

Scorer reliability

Def: (TESTING) the extent to which a candidate would achieve the same score on a test if they performed it exactly the same way the next day. Ex: a multiple choice test with only one correct answer per question would have a high _______ ________ , because of its high objectivity.

Face validity

Def: (TESTING) the extent to which a test looks as if it measures what it is supposed to measure from the perspective of candidates/teachers/institutions etc. Ex: receptionists on an English course who need to learn how to write an email - a test where they have to write an email would LOOK like it tests the ability they are hoping to gain. Extra info: a test without this may not be accepted by students or may demotivate them. New testing techniques need to be introduced slowly to students so the ______ _________ of the test is not lost.

Validity

Def: (TESTING) the extent to which a test measures accurately what it is intended to measure. Ex: a gap-fill grammar activity is _____ if the intent is to test a learner's ability to match grammatical forms to their contexts, but is not _____ for testing communicative ability.

Reliability

Def: (TESTING) the extent to which a test measures consistently. The degree to which an assessment tool produces stable and consistent results. Ex: if a student gets more or less the same score on the test whether they take it on a Monday or a Tuesday. Extra info: people may be hungry, tired, demotivated etc., so the test itself must not add to this variation through unclear instructions, ambiguous questions or items that encourage guessing.

Predictive validity

Def: (TESTING) the extent to which the results of a test can be used to predict future performance. A type of criterion-related validity. Ex: seeing how many students from a placement test 'actually' went into the correct level. OR getting feedback from a university tutor about a candidate's performance following their proficiency test.

Direct testing

Def: (TESTING) when a test requires the candidate to perform precisely the skill that we wish to measure. Every effort is made to make the tests as realistic as possible through 'authentic' texts. Ex: having an interview or a presentation as a speaking assessment (real-world speaking task)

Fresh starts

Def: (TESTING) when an item on a test does not rely on knowledge of a previous item. Ex: in the IELTS reading exam there are 3 passages, so there are 3 _____ ______ in the test.

