Graduate Course Proposal Form Submission Detail - EDF7469
Tracking Number - 1833

Current Status: Approved, Permanent Archive - 2005-11-10
Campus:
Submission Type:
Course Change Information (for course changes only):
Comments:


Detail Information

  1. Date & Time Submitted: 2005-09-28
  2. Department: Educational Measurement and Research
  3. College: ED
  4. Budget Account Number: 171100000
  5. Contact Person: Robert F. Dedrick
  6. Phone: 45722
  7. Email: dedrick@tempest.coedu.usf.edu
  8. Prefix: EDF
  9. Number: 7469
  10. Full Title: Introduction to Computer-Based Testing
  11. Credit Hours: 3
  12. Section Type: C - Class Lecture (Primarily)
  13. Is the course title variable?: N
  14. Is a permit required for registration?: N
  15. Are the credit hours variable?: N
  16. Is this course repeatable?:
  17. If repeatable, how many times?: 0
  18. Abbreviated Title (30 characters maximum): Computer-Based Testing
  19. Course Online?: -
  20. Percentage Online:
  21. Grading Option: R - Regular
  22. Prerequisites: EDF 6432 Foundations of Educational Measurement or equivalent
  23. Corequisites:
  24. Course Description: This course serves as an introduction to the field of computer-based testing. The material covered will be applicable to most operational educational, psychological, credentialing, and licensure assessments, as well as to research and measurement applications.

  25. Please briefly explain why it is necessary and/or desirable to add this course: An analysis of the conference proceedings of national professional organizations (e.g., American Educational Research Association, National Council on Measurement in Education) shows a substantial number of measurement/testing sessions devoted specifically to computer-based testing.
  26. What is the need or demand for this course? (Indicate if this course is part of a required sequence in the major.) What other programs would this course service? In addition to students from educational measurement and research, the course may be of interest to doctoral students from other programs in education, as well as doctoral students in psychology, public health, nursing, and business.
  27. Has this course been offered as a Selected Topics/Experimental Topics course? If yes, how many times? 4 times
  28. What qualifications for training and/or experience are necessary to teach this course? (List minimum qualifications for the instructor.) Doctoral degree meeting departmental requirements of at least 50% of doctoral coursework in the areas of Measurement, Statistics, and Evaluation. Documented training in computer-based testing.
  29. Objectives: This course is intended to provide broad coverage of the field of computerized assessment. The goal of the course is to inform students about relevant psychometric and computer issues, and to enable them to apply that learning to practical computerized testing applications.

    The successful completion of the course requirements is expected to result in increased ability to (a) intelligently read and evaluate relevant computer-based testing literature, (b) recognize the strengths and limitations of methods used in practical computer-based testing situations, (c) develop computer-based tests that meet professional measurement standards, (d) design research studies requiring the use of computer-based tests, and (e) communicate with peers and other professionals on computer-based testing topics.

  30. Learning Outcomes: Grades will be based on 3 distinct activities. The first is a set of software evaluations that the student must conduct on different CBT programs. The second is a research proposal in which a CBT is either a major research tool or the focus of the research question. For the third activity, the student must design and develop a CBT and present it to the class.
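    As a minimal illustration of how these weights combine into a final grade (the component scores and helper name below are hypothetical, not part of the grading policy):

        # Sketch: weighted combination of the three graded activities.
        # Component scores are hypothetical examples on a 0-100 scale.
        WEIGHTS = {
            "software_evaluations": 0.30,
            "research_proposal": 0.35,
            "cbt_project": 0.35,
        }

        def final_grade(scores):
            """Weighted average of the three activity scores."""
            return sum(WEIGHTS[name] * s for name, s in scores.items())

        print(final_grade({
            "software_evaluations": 85.0,
            "research_proposal": 90.0,
            "cbt_project": 88.0,
        }))  # -> 87.8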

    Descriptions of these activities and the rubrics used to grade them are provided below.

    Software Evaluations (30%):

    Several computer test programs and demos will be available for students to view and evaluate. Students will also be expected to conduct an online search for information on additional CBT programs. Students will need to conduct a total of 3 software evaluations, describing the computer test software in terms of available measurement features, appropriate use of computer technology, quality of screen design, ease of use, information provided, etc.

    The first evaluation should be 1 page in length; the remaining two should each be 2-3 pages in length. The evaluations should include both descriptive and evaluative components. The first evaluation should be of a computer test program available in the lab. The remaining two evaluations should be of programs of the student's choosing, from those in the lab, online selections, and "marketing-demo" versions. (In conjunction with these evaluations, students also need to conduct an online search for CBTs. They will need to turn in a printed copy of the home page of a commercial CBT web site, along with a disk containing a demo version of the CBT program.)

    Rubric (for each of the 3 evaluations):

    ___ Required supplementary materials included (0-2)

    ___ Descriptive elements fully addressed (0-5)

    ___ measurement features

    ___ use of technology

    ___ screen design

    ___ ease of use

    ___ help/information provided

    ___ Evaluative elements fully addressed (0-5)

    ___ measurement features

    ___ use of technology

    ___ screen design

    ___ ease of use

    ___ help/information provided

    Research Proposal (35%):

    Students will be expected to develop a research proposal related to computer-based testing. The proposal should be approximately 1200-1500 words in length (roughly 3 single-spaced or 5 double-spaced pages). It should include an introduction, background or literature review, purpose of the current study, methods, and (anticipated) results or conclusions. CBT applications described in the proposals can be for the purpose of conducting research, instrument development, prototype demonstration, or other justified use.

    Students will need to turn in copies of the abstracts of at least 3 relevant articles (used in the literature review), a draft version of the proposal, and a final version of the proposal (in a form suitable for submission). Students will be encouraged (but not required) to submit their proposal for presentation at FERA or another conference.

    Rubric:

    ____ Required supplementary materials included (0-2)

    ____ Rationale for questions included

    ____ Appropriate methods addressed

    ____ Analyses clearly described

    ____ Limitations noted

    ____ Accurate description/discussion of results

    ____ Conclusions consistent with results

    ____ Publishable writing of results section

    Computer-Based Test Project/Presentation (35%):

    By the end of the course the students should have designed and developed a computerized exam. The exam should be developed for a particular test purpose and audience, using principles of measurement, CBT development, and screen design. The students may work individually or in small groups (perhaps a group of 2 or 3 students, supplying measurement, computer, and content skills). Several computer test programs will be available in the computer lab, or other programs may be used at the student's discretion. Usability analyses should be conducted on draft versions of the student projects.

    The items will need to be entered in a software program, and a computer test generated. The exam should include introductory and instruction screens, example items, closing screens, and score reporting output. Alternatively, if the student or group has selected a truly ambitious project, an option would be to thoroughly conceptualize the test while entering only some sections into the computer. Some components of the CBT can be illustrated through storyboards instead of generating them within the software program. The final project will be presented in class.
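    As a rough sketch of these required exam components as a data structure (a hypothetical outline, not a prescribed implementation; all names are illustrative):

        from dataclasses import dataclass, field

        @dataclass
        class CBTExam:
            """Skeleton mirroring the required components of the exam."""
            intro_screens: list = field(default_factory=list)
            instruction_screens: list = field(default_factory=list)
            example_items: list = field(default_factory=list)
            items: list = field(default_factory=list)
            closing_screens: list = field(default_factory=list)

            def score_report(self, raw_score):
                """Minimal score-reporting output for an examinee."""
                return f"Raw score: {raw_score} / {len(self.items)}"

        exam = CBTExam(items=["item 1", "item 2", "item 3"])
        print(exam.score_report(2))  # Raw score: 2 / 3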

    The supporting materials for this exam should include full test specifications. The purpose of the test should be discussed, perhaps in terms of target audience, construct/domain, and test use (e.g., placement, proficiency, diagnostics). The test delivery model (i.e., CBT) should be specified, perhaps with a rationale for the model chosen. Item and test development can be addressed in terms of item format (e.g., multiple choice), item development and review process, content categories, and test specifications. The most appropriate estimates of reliability and validity for this application can be mentioned. Information about how the test would be administered (e.g., in a classroom setting or on a walk-in basis) could be included. Finally, the method for score reporting, to the examinee or others, could be addressed.
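    For instance, internal-consistency reliability is often reported via Cronbach's coefficient alpha; the following is a minimal sketch assuming dichotomously scored items in an examinees-by-items matrix (the data are hypothetical):

        import numpy as np

        def coefficient_alpha(scores):
            """Cronbach's alpha for an (examinees x items) score matrix."""
            k = scores.shape[1]                         # number of items
            item_vars = scores.var(axis=0, ddof=1)      # per-item variances
            total_var = scores.sum(axis=1).var(ddof=1)  # total-score variance
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Hypothetical 0/1 scores: 5 examinees, 4 items.
        x = np.array([[1, 1, 1, 0],
                      [1, 0, 1, 1],
                      [0, 0, 1, 0],
                      [1, 1, 1, 1],
                      [0, 0, 0, 0]])
        print(round(coefficient_alpha(x), 2))  # ~0.79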

    Rubric:

    ____ Required supplementary materials included (0-2)

    ____ CBT satisfied all requirements (0-15)

    ____ Introductory screens

    ____ Instruction screens

    ____ Example items

    ____ Closing screens

    ____ Score reporting output

    ____ Presentation addressed all points appropriately (0-15)

    ____ Test purpose

    ____ Test delivery model

    ____ Item and test format decisions

    ____ Consideration of reliability and validity

    ____ Test administration plan

    ____ Score reporting plan

  31. Major Topics: Overview of course

    Overview of CBT

    Administration issues

    Site-based security

    Development issues

    Examinee issues

    Software issues

    Screen design

    Usability testing

    Innovative items

    Research proposals & CBT projects

    Test development

    Classical test theory

    Computer Fixed Tests (CFT)

    Automated Test Assembly (ATA)

    Item response theory (IRT); see the illustrative sketch after this list

    Computer adaptive tests (CAT)

    Computer classification tests (CCT)

    Testlets

    Comparison of test delivery methods

    Item pool evaluation

    Professional Guidelines for CBTs
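    To illustrate two of these topics concretely, the following is a minimal sketch assuming the three-parameter logistic (3PL) IRT model and the maximum-information item selection used in many CATs (the item parameters are hypothetical):

        import numpy as np

        def p_3pl(theta, a, b, c):
            """3PL item response function: P(correct | ability theta)."""
            return c + (1 - c) / (1 + np.exp(-a * (theta - b)))

        def item_information(theta, a, b, c):
            """Fisher information of a 3PL item at ability theta."""
            p = p_3pl(theta, a, b, c)
            return a**2 * ((1 - p) / p) * ((p - c) / (1 - c))**2

        # Hypothetical item pool: (a, b, c) parameters per item.
        pool = [(1.2, -0.5, 0.20),
                (0.8,  0.0, 0.25),
                (1.5,  0.7, 0.20)]

        theta = 0.3  # current ability estimate
        info = [item_information(theta, a, b, c) for a, b, c in pool]
        print("administer item", int(np.argmax(info)))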

  32. Textbooks: Required Text:

    Parshall, C. G., Spray, J. A., Kalohn, J. C., & Davey, T. (2002). Practical considerations in computer-based testing. New York: Springer-Verlag.

    Additional Readings:

    Apple Computer, Inc. (1995). Human interface design and the development process. In Macintosh human interface guidelines. Reading, MA: Addison-Wesley.

    Association of Test Publishers (2002). Guidelines for Computer-Based Testing.

    Bennett, R. E. (1999). Using new technology to improve assessment. Educational Measurement: Issues and Practice, 18, 5-12.

    CAT mini-tutorial – http://www.ericae.net/scripts/cat/startcat.

  33. Course Readings, Online Resources, and Other Purchases:
  34. Student Expectations/Requirements and Grading Policy:
  35. Assignments, Exams and Tests:
  36. Attendance Policy:
  37. Policy on Make-up Work:
  38. Program This Course Supports:
  39. Course Concurrence Information:


- If you have questions about any of these fields, please contact chinescobb@grad.usf.edu or joe@grad.usf.edu.