Choosing or Designing the Perfect WebQuest for Your Learners Using a Reliable Rubric
ARTICLE

Z. Unal, USF St. Petersburg, United States; Y. Bodur, Georgia Southern University, United States; A. Unal, Usak University, United States

CITE Journal, Volume 12, Number 2, ISSN 1528-5804. Publisher: Society for Information Technology & Teacher Education, Waynesville, NC, USA

Abstract

The researchers in this study developed a webquest evaluation rubric and investigated its reliability. The rubric draws on the strengths of currently available webquest rubrics, with improvements based on critiques in the literature and feedback from educators. Once the rubric was finalized, 23 participants were given a week to evaluate three preselected webquests with it. A month later, the same evaluators were asked to reevaluate the same webquests. Statistical analyses of the resulting scores demonstrated high levels of reliability.
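The abstract does not name the specific statistics behind its reliability claim, but the procedure it describes (multiple evaluators scoring the same webquests on two occasions a month apart) corresponds to interrater consistency and test-retest stability. The Python sketch below is illustrative only: it assumes Cronbach's alpha across raters and a Pearson correlation between the two scoring occasions, and all scores, the rater count, and the cronbach_alpha helper are hypothetical rather than taken from the study.

    import numpy as np

    def cronbach_alpha(ratings):
        """Cronbach's alpha for a (webquests x raters) matrix of total scores."""
        k = ratings.shape[1]                         # number of raters
        rater_vars = ratings.var(axis=0, ddof=1)     # per-rater score variance
        total_var = ratings.sum(axis=1).var(ddof=1)  # variance of summed scores
        return (k / (k - 1)) * (1 - rater_vars.sum() / total_var)

    # Hypothetical first-round scores: three preselected webquests (rows)
    # rated by five of the evaluators (columns); the study used 23 raters.
    time1 = np.array([[18, 17, 19, 18, 17],
                      [12, 13, 12, 14, 13],
                      [20, 19, 20, 19, 20]], dtype=float)

    # Simulated second round a month later: same scores plus small drift.
    time2 = time1 + np.random.default_rng(0).integers(-1, 2, time1.shape)

    alpha = cronbach_alpha(time1)                    # interrater consistency
    retest_r = np.corrcoef(time1.mean(axis=1),       # stability of mean scores
                           time2.mean(axis=1))[0, 1]
    print(f"Cronbach's alpha: {alpha:.2f}; test-retest r: {retest_r:.2f}")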

Citation

Unal, Z., Bodur, Y., & Unal, A. (2012). Choosing or Designing the Perfect WebQuest for Your Learners Using a Reliable Rubric. Contemporary Issues in Technology and Teacher Education, 12(2), 209-231. Waynesville, NC: Society for Information Technology & Teacher Education.

References

  1. Abbitt, J., & Ophus, J. (2008). What we know about the impacts of webquests: A review of research. AACE Journal, 16(4), 441-456.
  2. Abu-Elwan, R. (2007). The use of webquest to enhance the mathematical problem-posing skills of pre-service teachers. International Journal for Technology in Mathematics Education, 14(1), 31-39.
  3. Allan, J., & Street, M. (2007). The quest for deeper learning: An investigation into the impact of a knowledge pooling webquest in primary initial teacher training. British Journal of Educational Technology, 38(6), 1102-1112.
  4. Arter, J., & McTighe, J. (2001). Scoring rubrics in the classroom. Thousand Oaks, CA: Corwin Press Inc.
  5. Barrett, P. (2001, March). Assessing the reliability of rating data. Retrieved from the author’s personal website: http://www.pbarrett.net/presentations/rater.pdf
  6. Barroso, M., & Clara, C. (2010). A webquest for adult learners: A report on a biology course. In J. Sanchez & K. Zhang (Eds.), Proceedings of the World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2010 (pp. 1566-1569). Chesapeake, VA: Association for the Advancement of Computing in Education.
  7. Bartoshesky, A., & Kortecamp, K. (2003). WebQuest: An instructional tool that engages adult learners, promotes higher level thinking and deepens content knowledge. In C. Crawford et al. (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2003 (pp. 1951-1954). Chesapeake, VA: Association for the Advancement of Computing in Education.
  8. Bellofatto, L., Bohl, N., Casey, M., Krill, M., & Dodge, B. (2001). A rubric for evaluating webquests. Retrieved from the San Diego State University Webquest website: http://webquest.sdsu.edu/webquestrubric.html
  9. Black, P. (1998). Testing: Friend or foe? London, England: Falmer Press.
  10. Bresciani, M. J., Zelna, C. L., & Anderson, J. A. (2004). Techniques for assessing student learning and development: A handbook for practitioners. Washington, DC: National Association of Student Personnel Administrators.
  11. Busching, B. (1998). Grading inquiry projects. New Directions for Teaching and Learning, 74, 89-96.
  12. Colton, D. A., Gao, X., Harris, D. J., Kolen, M. J., Martinovich-Barhite, D., Wang, T., & Welch, C. J. (1997). Reliability issues with performance assessments: A collection of papers (ACT Research Report Series 97-3). Iowa City, IA: American College Testing Program.
  13. Cortina, J. M. (1993). What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology, 78, 98-104.
  14. Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. Orlando, FL: Harcourt Brace Jovanovich.
  15. Dodge, B. (1995). WebQuests: A technique for Internet-based learning. Distance Educator, 2, 10-13.
  16. Dodge, B. (1997). Some thoughts about webquests. Retrieved from http://webquest.sdsu.edu/about_webquests.html
  17. Dodge, B. (2001). FOCUS: Five rules for writing a great webquest. Learning and Leading with Technology, 28(8), 6-9, 58.
  18. Glass, G. V., & Hopkins, K. H. (1996). Statistical methods in education and psychology. Boston, MA: Allyn and Bacon.
  19. Gorrow, T., Bing, J., & Royer, R. (2004, March). Going in circles: The effects of a webquest on the achievement and attitudes of prospective teacher candidates in
  20. Hildebrand, G. (1996). Redefining achievement. In P. Murphy & C. Gipps (Eds.), Equity in the classroom: Towards effective pedagogy for girls and boys (pp. 149-172). London, England: Falmer.
  21. Johnson, R. L., Penny, J., & Gordon, B. (2000). The relation between score resolution methods and interrater reliability: An empirical study of an analytic scoring rubric. Applied Measurement in Education, 13, 121–138.
  22. Laborda, J. G. (2009). Using webquests for oral communication in English as a foreign language for tourism studies. Educational Technology & Society, 12(1), 258–270.
  23. Lim, S., & Hernandez, P. (2007). The webquest: An illustration of instructional technology implementation in MFT training. Contemporary Family Therapy, 29, 163-175.
  24. MacGregor, S. K., & Lou, Y. (2004/2005). Web-based learning: How task scaffolding and web site design support knowledge acquisition. Journal of Research on Technology in Education, 37(2), 161-175.
  25. Maddux, C. D., & Cummings, R. (2007). WebQuests: Are they developmentally appropriate? The Educational Forum, 71(2), 117-127.
  26. March, T. (2003). The learning power of webquests. Educational Leadership, 61(4), 42-47.
  27. March, T. (2004). Criteria for assessing best webquests. Retrieved from the Best WebQuests website: http://bestwebquests.com/bwq/matrix.asp
  28. Morrison, G. R., & Ross, S. M. (1998). Evaluating technology-based processes and products. New Directions for Teaching and Learning, 74, 69-77.
  29. Moskal, B. M. (2000). Scoring rubrics: What, when, and how? Practical Assessment, Research & Evaluation, 7(3). Retrieved from http://pareonline.net/getvn.asp?v=7&n=3
  30. Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, Research & Evaluation, 7(10). Retrieved from http://pareonline.net/getvn.asp?v=7&n=10
  31. Perkins, R., & McKnight, M. L. (2005). Teachers' attitudes toward webquests as a method of teaching. Computers in the Schools, 22(1/2), 123-133.
  32. Perlman, C.C. (2003). Performance assessment: Designing appropriate performance
  33. Peterson, C. L., & Koeck, D. C. (2001). When students create their own webquests. Learning and Leading with Technology, 29(1), 10–15.
  34. Pohan, C., & Mathison, C. (1998). WebQuests: The potential of Internet-based instruction for global education. Social Studies Review, 37(2), 91-93.
  35. Stemler, S. E. (2004). A comparison of consensus, consistency, and measurement approaches to estimating interrater reliability. Practical Assessment, Research & Evaluation, 9(4). Retrieved from http://pareonline.net/getvn.asp?v=9&n=4
  36. Tsai, S. (2006, June). Students' perceptions of English learning through EFL webquest. Paper presented at the World Conference on Educational Multimedia, Hypermedia and Telecommunications 2006, Orlando, FL.
  37. Watson, K. L. (1999). WebQuests in the middle school. Meridian, 2(2). Retrieved from http://www.ncsu.edu/meridian/jul99/downloads/webquest.pdf
  38. Wiggins, G. (1998). Educative assessment. San Francisco, CA: Jossey-Bass.
