PROVIDING COMPUTER-BASED TESTING FOR LOW-LEVEL STUDENTS AT A RUSSIAN UNIVERSITY

Author(s): Anna Astanina, V. I. Beliaeva, Tatiana Rasskazova
Subject(s): Social Sciences, Education, Higher Education
Published by: Carol I National Defence University Publishing House
Keywords: computer-based testing; test-design; low-level learners; university environment

Summary/Abstract: Computer-based testing is not a new method; it has been widely adopted because it is cost-effective and quick to administer. Universities where English is not the language of instruction also seek efficient assessment through computer-based tests. However, testing low-level learners poses particular challenges, as very few valid ready-made tests exist for pre-A1 adult learners, owing to numerous factors: an insufficient number of test versions, difficulty in defining appropriate language items for testing, etc. Test designers face challenges connected with choosing test constructs in general, and vocabulary and grammar items in particular, as well as with finding appropriate sound tracks. The data described in the article present the results of two different tests: one offered by the Russian federal test-designing body for university students irrespective of their level, and another that is the first test designed specifically for pre-A1 adult students. The tests were administered as end-of-semester achievement tests for first-year students of STEM subjects. In the academic year 2015-2016, pre-A1 learners made up 20% of all first-year students entering the university (N=508). At the end of the first semester, the Federal Internet Exam in Professional Education (FEPE) was administered as an independent computer-based achievement test for university students. However, the failure rate among pre-A1 students was 100%. According to the university's internal regulations, a student has to answer at least 40% of the test tasks correctly to pass. Therefore, the need for a context-specific pre-A1 achievement test was obvious. The test was designed by the end of the second term and included tasks on Reading (with Use of English tasks integrated), Writing and Listening. The time limit was set at 40 minutes. The test results show that only one pre-A1 learner (out of the 59 students who took the test) failed.
The article dwells upon the issues that test designers face in meeting the requirements of testing pre-A1 learners, and suggests some practical implications for designing tests for pre-A1 learners.

  • Issue Year: 13/2017
  • Issue No: 01
  • Page Range: 132-139
  • Page Count: 8
  • Language: English