Applying controlled usability-testing technology to investigate learning behaviours of users interacting with e-learning tutorials
M.R. (Ruth) de Villiers, University of South Africa, South Africa
E-Learn 2009: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education, Vancouver, Canada. ISBN 978-1-880094-76-1. Publisher: Association for the Advancement of Computing in Education (AACE), San Diego, CA.
Abstract
This paper suggests innovative ways of using the facilities of usability-testing laboratories to learn more about the learning processes and behaviours of users interacting with e-learning applications. The applications investigated in this study are tutorials that present cognitive subject matter. Three added-value techniques are described: visualisation of how time is distributed in the learning process; verbalisation by participants, particularly co-participants; and methods of error analysis that distinguish usability errors from cognitive errors. The proposals are illustrated with data from three studies of interactive e-learning tutorials, which demonstrate the techniques and show their value in providing fine-grained detail about learning experiences. A notable finding is that different users learn from the software in very different ways. The research mechanisms are transferable to other domains.
Citation
de Villiers, M.R. (2009). Applying controlled usability-testing technology to investigate learning behaviours of users interacting with e-learning tutorials. In T. Bastiaens, J. Dron & C. Xin (Eds.), Proceedings of E-Learn 2009--World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 2512-2521). Vancouver, Canada: Association for the Advancement of Computing in Education (AACE). Retrieved August 9, 2024 from https://www.learntechlib.org/primary/p/32839/.
© 2009 Association for the Advancement of Computing in Education (AACE)