Computer-Based Testing

  • Faculty: Armando Fox & Dan Garcia (co-PIs), Michael Ball, Pamela Fox
  • Students: a great many, for a variety of courses!
We’re adapting and adopting UIUC’s PrairieLearn platform for proficiency-based learning with computer-based assessments. Interested in joining the project? Already been invited to join? Email Armando Fox & Dan Garcia with the following info:
  • Name, year, student teaching experience (GSI, tutor, etc.)
  • Relevant EECS/CS courses taken
  • Course(s) you’d like to develop content for or do research on (it’s assumed you have the approval of the course’s instructor of record)
  • GitHub user name
This work is supported by seed funding from the Office of the Vice Chancellor for Undergraduate Education, course improvement funds from the College of Engineering, and a major award from the California Education Learning Lab (in collaboration with CSU Long Beach and El Camino College), a program of the California Governor’s Office of Planning and Research.

Just joined the project? Onboarding materials are in this private GitHub repo. Your advisor can get you access.

Project overviews

Dan Garcia on the why of CBT: Proficiency-Based Learning (5 minutes)

Armando Fox on the how of CBT: from Questions to Question Generators

  • Slides for Instructional Resiliency Task Force (presented by Armando Fox)
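To make “question generators” concrete: a PrairieLearn question can include a server.py whose generate() function draws fresh parameters for every student, so a single authored question yields many distinct instances. A minimal sketch (the arithmetic question here is our own illustration, not drawn from any actual course):

    import random

    def generate(data):
        # Draw fresh parameters so each student sees a distinct but
        # equally difficult instance of the question.
        a = random.randint(2, 9)
        b = random.randint(2, 9)
        data["params"]["a"] = a
        data["params"]["b"] = b
        # PrairieLearn grades the submission against this value.
        data["correct_answers"]["product"] = a * b

The companion question.html displays {{params.a}} and {{params.b}} and collects the response with an element such as <pl-integer-input answers-name="product">.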

More Details…

See the main LEARNER Lab website for recent publications and more details on the team and current efforts.

Help Wanted

We’re looking for Master’s students and ambitious undergraduates to help with various PrairieLearn-related enhancements. These projects are a mix of research and engineering, but we’d expect any of them to lead to one or more submitted publications, as well as to contributions to an assessment-authoring system used by thousands of students!

Prerequisites: Strong software development skills, including solid knowledge of Python or the ability to pick it up fast because you’ve used other web frameworks or modern scripting languages. A self-starter who works well in a small team or as one of a pair. The ability to do significant pathfinding on your own: researching what has been done on a topic, finding relevant articles online where appropriate (with your advisor’s or a senior student’s guidance), and in general getting from a high-level description of a project to a specific plan to build and execute it.

Compensation: Pay or credit may be possible, depending on the specific project.

Courseware for CS169A Software Engineering

Did you do well in, and enjoy, CS169A? Do you love testing, agile workflows, and other areas of focus in CS169, and want to help fellow students understand and learn them better? We’re looking for students who recently took CS169 and feel confident with the material, to help develop innovative new assessments, both formative (labs, homework) and summative (exam questions), that take full advantage of PrairieLearn’s flexibility to create really cool interactive learning materials. Contact Prof. Armando Fox directly to apply.

Courseware developers for high-enrollment CS courses

Have you been a GSI/reader/etc. for a high-enrollment upper- or lower-division CS course? Are you excited about the possibility of improving the course’s assessments using PrairieLearn? Ask your course’s instructor for their blessing to develop content for the course as part of our group. We will embed you in a team to get you started and figure out a way to get you compensated for the work. If interested, contact Prof. Dan Garcia or the instructor or co-instructor of the CS course(s) you’d like to develop courseware for!

Automatic support for “Cheat traps”

Prof. Nick Weaver has developed a mechanism that helps detect certain forms of cheating on remote (take-home) online exams. We believe his techniques can be adapted so that anyone authoring assessments in PrairieLearn can build them in. Is this possible, and if so, do the resulting traps successfully detect certain types of cheating attempts?
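Prof. Weaver’s actual mechanism isn’t described here, so purely as an illustration of what a built-in authoring hook might look like, here is a hypothetical sketch: a question’s server.py plants a per-student decoy answer and flags any submission that matches it. How a decoy would reach would-be cheaters, and whether this reliably catches them, is exactly what the project would investigate.

    import random

    def generate(data):
        a = random.randint(3, 50)
        data["params"]["a"] = a
        data["correct_answers"]["ans"] = a * a
        # Hypothetical "trap": a plausible-looking wrong answer that an
        # honest student is very unlikely to produce on this instance.
        data["params"]["decoy"] = a * a + random.randint(1, 9)

    def grade(data):
        # Flag (rather than penalize) an exact decoy match for human review.
        if data["submitted_answers"].get("ans") == data["params"]["decoy"]:
            data["feedback"]["flag"] = "matched planted decoy"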

Automatic support for cheating detection via UI events

Picture this scenario: a student taking a remote exam has another window open on their screen, through which they are illicitly obtaining the answer to (e.g.) a coding question. They copy and paste the answer and submit. But if you could look at the timing of the keystrokes, you’d see that they would have had to type impossibly fast to produce that answer. Timing anomalies like this, among other signals, can indicate less-than-honorable behavior on an online exam. This project would investigate, and ideally prototype, ways of building such detection mechanisms directly into PrairieLearn; a sketch of the core timing heuristic appears below. Contact Prof. Nick Weaver if interested.
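As a sketch of what such a detector might compute, here is a pure-Python heuristic over a client-side change log of (timestamp, characters added) events, the kind of log this project would first need to collect; the event format and thresholds are illustrative assumptions, not calibrated values:

    def flag_suspicious_input(events, paste_threshold=30, max_chars_per_sec=20.0):
        """
        events: chronological (timestamp_sec, chars_added) pairs, as might be
        reconstructed from a browser-based editor's change log. Flags
        (a) single events that insert a large block of text at once, and
        (b) an overall input rate faster than plausible human typing.
        """
        flags = []
        for t, added in events:
            if added >= paste_threshold:
                flags.append(("bulk-insert", t, added))
        total = sum(added for _, added in events)
        duration = events[-1][0] - events[0][0] if len(events) > 1 else 0.0
        if duration > 0 and total / duration > max_chars_per_sec:
            flags.append(("rate", duration, total))
        return flags

    # Example: 200 characters appear in a single event a few seconds in.
    log = [(0.0, 1), (0.8, 1), (2.1, 200), (9.5, 2)]
    print(flag_suspicious_input(log))
    # [('bulk-insert', 2.1, 200), ('rate', 9.5, 204)]

A real implementation would need finer-grained signals (inter-keystroke gaps, burst detection over sliding windows) and careful calibration to avoid flagging legitimate fast typists or editor autocomplete.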