ORBIS SCHOLAE

We would like to inform authors and readers that, following an agreement with the Karolinum publishing house, the journal Orbis scholae has been published in electronic form only since 2024 (Volume 18).

Orbis scholae is an academic journal published by Charles University, Prague. It features articles on school education in its wider socio-cultural context. It aims to contribute to the understanding and development of school education, and to reflection on teaching practice and educational policy.

The journal is indexed in SCOPUS, CEEOL, DOAJ, EBSCO, and ERIH Plus.

ORBIS SCHOLAE, 1–26

Vztah rychlého odpovídání v testu z anglického jazyka a výsledného skóre u různých skupin žáků

[The Relationship Between Fast Response Time in an English Language Test and the Final Score Among Different Groups of Students]

Jiří Štípek, Hana Voňková, Kateřina Králová

DOI: https://doi.org/10.14712/23363177.2025.7
published online: 08. 10. 2025

abstract

This study focuses on the effort students invest when answering achievement test questions and its relationship to their final test score. The aim of the study was to determine the extent to which insufficient effort responding occurs among different groups of students defined by gender and school type, and how the varying degrees of insufficient effort among these groups may be reflected in comparisons based on the final test score. The research sample consisted of male and female students from the 9th grade of basic school (N = 631) and the corresponding grades of multi-year secondary general schools (N = 713). As a research tool, we employed an online questionnaire application that included an English placement test (Cambridge EFL Placement Test) with 25 multiple-choice questions. To identify insufficient effort, we used time characteristics of students’ behavior during testing, recorded by an online testing application developed for this purpose. The results indicate that insufficient effort is a marginal phenomenon among students of multi-year secondary general schools (3.1%), whereas it affects almost a quarter of basic school students (23.6%). In both school types, boys exhibit insufficient effort roughly twice as often as girls. Analysis of the test results suggests that boys’ lower scores compared to girls may be largely explained by boys’ lower effort when completing the test. Our findings suggest that insufficient effort responding may have a significant impact on the conclusions of studies that compare groups of students based on test results.

keywords: achievement test; students’ effort; response time; test score; rapid guessing behavior
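
To make the detection approach described in the abstract concrete, the following minimal Python sketch illustrates response-time-based effort screening in the spirit of the normative threshold method (Wise & Ma, 2012) and the response time effort index (Wise & Kong, 2005). It is not the authors' actual procedure: the 10% threshold fraction, the 0.90 RTE cutoff, the simulated response times, and all names are illustrative assumptions (only the sample shape, 631 + 713 = 1,344 students by 25 items, follows the abstract).

import numpy as np

def rapid_guess_flags(rt, threshold_frac=0.10):
    """Flag responses faster than threshold_frac of the item's median time.

    rt: response times in seconds, shape (n_students, n_items).
    An NT10-style rule (assumed here, not taken from the paper)."""
    thresholds = threshold_frac * np.median(rt, axis=0)  # one cutoff per item
    return rt < thresholds  # True where a response looks like a rapid guess

def response_time_effort(flags):
    """RTE: proportion of a student's responses showing solution behavior."""
    return 1.0 - flags.mean(axis=1)

# Illustrative usage with simulated lognormal times (median around 20 s).
rng = np.random.default_rng(0)
rt = rng.lognormal(mean=3.0, sigma=0.5, size=(1344, 25))
rte = response_time_effort(rapid_guess_flags(rt))
low_effort = rte < 0.90  # an RTE cutoff often used in the literature
print(f"students flagged as low-effort: {low_effort.mean():.1%}")

Students flagged in this way can then be excluded before group comparisons (motivation filtering; Wise et al., 2004), which is the kind of adjustment the abstract suggests may change gender comparisons based on raw scores.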

references (44)

1. Anaya, L. M., & Zamarro, G. (2024). The role of student effort on performance in PISA: Revisiting the gender gap in achievement. Oxford Economic Papers, 76(2), 533–560.

2. Cambridge University Press & Assessment. (2023, April 20). Cambridge English: Test your English – for schools. https://www.cambridgeenglish.org/test-your-english/for-schools/

3. Curran, P. G. (2016). Methods for the detection of carelessly invalid responses in survey data. Journal of Experimental Social Psychology, 66, 4–19.

4. DeMars, C. E., Bashkov, B. M., & Socha, A. B. (2013). The role of gender in test-taking motivation under low-stakes conditions. Research & Practice in Assessment, 8, 69–82.

5. Duckworth, A. L., Quinn, P. D., Lynam, D. R., Loeber, R., & Stouthamer-Loeber, M. (2011). Role of test motivation in intelligence testing. Proceedings of the National Academy of Sciences, 108(19), 7716–7720.

6. Eklöf, H. (2007). Test-taking motivation and mathematics performance in TIMSS 2003. International Journal of Testing, 7(3), 311–326.

7. Eurostat. (2021, December 5). NUTS – Nomenclature of territorial units for statistics. https://ec.europa.eu/eurostat/web/nuts/background

8. Goldhammer, F., Martens, T., & Lüdtke, O. (2017). Conditioning factors of test-taking engagement in PIAAC: An exploratory IRT modelling approach considering person and item characteristics. Large-scale Assessments in Education, 5(1), Article 18.

9. Guo, H., Rios, J. A., Haberman, S., Liu, O. L., Wang, J., & Paek, I. (2016). A new procedure for detection of students' rapid guessing responses using response time. Applied Measurement in Education, 29(3), 173–183.

10. Huang, J. L., Curran, P. G., Keeney, J., Poposki, E. M., & DeShon, R. P. (2012). Detecting and deterring insufficient effort responding to surveys. Journal of Business and Psychology, 27(1), 99–114.

11. Jacoby, W. G. (2000). Loess: A nonparametric, graphical tool for depicting relationships between variables. Electoral Studies, 19(4), 577–613.

12. Kong, X. J., Wise, S. L., & Bhola, D. S. (2007). Setting the response time threshold parameter to differentiate solution behavior from rapid-guessing behavior. Educational and Psychological Measurement, 67(4), 606–619.

13. Kornhauser, Z. G. C., Minahan, J., Siedlecki, K. L., & Steedle, J. T. (2014, April). A strategy for increasing student motivation on low-stakes assessments [Paper presentation]. Annual Meeting of the American Educational Research Association, Philadelphia, USA.

14. Kröhne, U., Deribo, T., & Goldhammer, F. (2020). Rapid guessing rates across administration mode and test setting. Psychological Test and Assessment Modeling, 62(2), 147–177.

15. Lee, Y.-H., & Haberman, S. J. (2015). Investigating test-taking behaviors using timing and process data. International Journal of Testing, 16(3), 240–267.

16. Lee, Y.-H., & Jia, Y. (2014). Using response time to investigate students' test-taking behaviors in a NAEP computer-based study. Large-scale Assessments in Education, 2, Article 8.

17. Meade, A. W., & Craig, S. B. (2012). Identifying careless responses in survey data. Psychological Methods, 17, 437–455.

18. Penk, C., & Schipolowski, S. (2015). Is it all about value? Bringing back the expectancy component to the assessment of test-taking motivation. Learning and Individual Differences, 42, 27–35.

19. Pinheiro, J., Bates, D., DebRoy, S., Sarkar, D., & R Core Team. (2021). nlme: Linear and nonlinear mixed effects models. https://CRAN.R-project.org/package=nlme

20. Pools, E., & Monseur, C. (2021). Student test-taking effort in low-stakes assessments: Evidence from the English version of the PISA 2015 science test. Large-scale Assessments in Education, 9(1), Article 10.

21. Rios, J. A., Deng, J., & Ihlenfeldt, S. D. (2022). To what degree does rapid guessing distort aggregated test scores? A meta-analytic investigation. Educational Assessment, 27(4), 356–373.

22. Rios, J. A., & Guo, H. (2020). Can culture be a salient predictor of test-taking engagement? An analysis of differential noneffortful responding on an international college-level assessment of critical thinking. Applied Measurement in Education, 33(4), 263–279.

23. Rios, J. A., Guo, H., Mao, L., & Liu, O. L. (2017). Evaluating the impact of careless responding on aggregated-scores: To filter unmotivated examinees or not? International Journal of Testing, 17, 74–104.

24. Rios, J. A., & Soland, J. (2022). An investigation of item, examinee, and country correlates of rapid guessing in PISA. International Journal of Testing, 22(2), 154–184.

25. Sahin, F., & Colvin, K. F. (2020). Enhancing response time thresholds with response behaviors for detecting disengaged examinees. Large-scale Assessments in Education, 8, Article 5.

26. Setzer, J. C., Wise, S., van den Heuvel, J., & Ling, G. (2013). An investigation of examinee test-taking effort on a large-scale assessment. Applied Measurement in Education, 26(1), 34–49.

27. Schnipke, D. L., & Scrams, D. J. (1997). Modeling item response time with a two-state mixture model: A new method of measuring speededness. Journal of Educational Measurement, 34(3), 213–232.

28. Soland, J. (2018). Are achievement gap estimates biased by differential student test effort? Putting an important policy metric to the test. Teachers College Record, 120(12), 1–26.

29. Soland, J., Kuhfeld, M., & Rios, J. (2021). Comparing different response time threshold setting methods to detect low effort on a large-scale assessment. Large-scale Assessments in Education, 9(1).

30. Soland, J., Wise, S. L., & Gao, L. (2019). Identifying disengaged survey responses: New evidence using response time metadata. Applied Measurement in Education, 32(2), 151–165.

31. Welling, J., Gnambs, T., & Carstensen, C. H. (2024). Identifying disengaged responding in multiple-choice items: Extending a latent class item response model with novel process data indicators. Educational and Psychological Measurement, 84(2), 314–339.

32. Wigfield, A., & Eccles, J. S. (2000). Expectancy-value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68–81.

33. Wise, S. L. (2015). Effort analysis: Individual score validation of achievement test data. Applied Measurement in Education, 28(3), 237–252.

34. Wise, S. L. (2017). Rapid-guessing behavior: Its identification, interpretation, and implications. Educational Measurement: Issues and Practice, 36(4), 52–61.

35. Wise, S. L. (2019). An information-based approach to identifying rapid-guessing thresholds. Applied Measurement in Education, 32(4), 325–336.

36. Wise, S. L., & DeMars, C. E. (2005). Low examinee effort in low-stakes assessment: Problems and potential solutions. Educational Assessment, 10(1), 1–17.

37. Wise, S. L., & DeMars, C. E. (2006). An application of item response time: The effort-moderated IRT model. Journal of Educational Measurement, 43(1), 19–38.

38. Wise, S. L., & Gao, L. (2017). A general approach to measuring test-taking effort on computer-based tests. Applied Measurement in Education, 30(4), 343–354.

39. Wise, S. L., Kingsbury, G. G., Thomason, J., & Kong, X. (2004, April 13). An investigation of motivation filtering in a statewide achievement testing program [Paper presentation]. Annual Meeting of the National Council on Measurement in Education, San Diego, California, USA.

40. Wise, S. L., & Kong, X. (2005). Response time effort: A new measure of examinee motivation in computer-based tests. Applied Measurement in Education, 18(2), 163–183.

41. Wise, S. L., & Ma, L. (2012, April). Setting response time thresholds for a CAT item pool: The normative threshold method [Paper presentation]. Annual Meeting of the National Council on Measurement in Education, Vancouver, Canada.

42. Wise, S. L., Pastor, D. A., & Kong, X. J. (2009). Correlates of rapid-guessing behavior in low-stakes testing: Implications for test development and measurement practice. Applied Measurement in Education, 22(2), 185–205.

43. Wolf, L. F., Smith, J. K., & Birnbaum, M. E. (1995). Consequence of performance, test, motivation, and mentally taxing items. Applied Measurement in Education, 8(4), 341–351.

44. Zhao, A., Brown, G. T. L., & Meissel, K. (2022). New Zealand students' test-taking motivation: An experimental study examining the effects of stakes. Assessment in Education: Principles, Policy & Practice, 29(4), 397–421.

Vztah rychlého odpovídání v testu z anglického jazyka a výsledného skóre u různých skupin žáků is licensed under a Creative Commons Attribution 4.0 International License.

157 × 230 mm
periodicity: 3 issues per year
print price: 150 CZK
ISSN: 1802-4637
E-ISSN: 2336-3177
