| QuestUP | clickUP Ultra |
Managing delivery of assessment |
Managing time and availability | - Timer and availability window can be set separately.
- Assessment is closed when the timer runs out.
- Availability window overrules the timer: assessment is closed when the end of the availability window is reached, even if the student starts the assessment late.
- Separate timers are available for sections within an assessment.
- The PIN can be changed while students write, to prevent access to the exam after the first 30 minutes.
- Extra-time students are flagged centrally; the lecturer just adds extra time to the assessment without having to link individual students.
| - Timer and availability window can be set separately.
- Assessment is closed when the timer runs out.
- Prohibit late submissions: in-progress attempts are submitted automatically at the due date and time, which means the timer does not overrule the availability window.
- Students can be given extra time to work after the time limit has expired.
- Cannot set separate timers for sections within an assessment; separate tests have to be created instead.
- The Access Code can be changed while students write, to prevent access to the exam after the first 30 minutes.
- Extra-time students are accommodated in all assessments in a module.
|
Monitoring student activity during assessment | - A flashing icon indicates when a student loses connection.
- Log files for each student indicate separately when each question is accessed and answered.
- Students may continue writing assessments even if Internet connectivity drops.
- Invigilators can communicate with students via on-screen messages during a test.
- Proctorio available.
- Lockdown browser available.
| - Students need to provide proof of connection failure.
- Log files are available for each student, indicating which answers were saved in tests.
- Students cannot continue writing a clickUP test if Internet connectivity drops.
- Invigilators have to communicate with students via email, clickUP, or an alternative medium during a test.
- Proctorio available.
- Lockdown browser not available.
|
Managing question banks |
| QuestUP | clickUP Ultra |
Question types | - 20 question types (See comparison below)
- Dichotomous (all or nothing) or polytomous (partial credit) scoring is available for most questions. Marks are distributed equally among distractors. Drag-and-drop questions are dichotomous only (see the scoring sketch after this row).
- A notepad area with a rich text editor, in which students can work out solutions, is available for all question types. Assessors can view students' notes.
- Existing QuestUP databanks will be migrated to the new version.
- Possible to upload a subset of question types using the Excel format.
| - 10 question types.
- Dichotomous (all or nothing) or polytomous (partial credit) available for most questions.
- Can set a different percentage for each distractor.
- No additional notepad area for students to work out solutions.
- Migration between systems is not possible yet.
- Importing of questions in Ultra:
- Can copy tests from Original Course View.
- Can import questions from a file.
- Can publish questions from Respondus.
|
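To make the scoring models in this row concrete, here is a minimal sketch of the arithmetic, assuming that "marks distributed equally among distractors" means the marks are split across the correct options and that per-distractor percentages sum to 100. The function names and the 4-mark example are illustrative only, not either platform's actual scoring engine.

```python
# Illustrative scoring arithmetic for a 4-mark multiple-answer question.
def dichotomous(selected: set, correct: set, marks: float) -> float:
    """All-or-nothing: full marks only if the selection matches exactly."""
    return marks if selected == correct else 0.0

def polytomous_equal(selected: set, correct: set, marks: float) -> float:
    """Partial credit: marks split equally across the correct options."""
    return marks / len(correct) * len(selected & correct)

def polytomous_weighted(selected: set, percentages: dict, marks: float) -> float:
    """Partial credit: a different percentage assigned to each option."""
    return marks * sum(percentages.get(option, 0.0) for option in selected) / 100

correct = {"A", "C"}
selected = {"A"}                      # student picked only one of the two correct options
percentages = {"A": 60, "C": 40}      # hypothetical per-option weighting

print(dichotomous(selected, correct, 4))              # 0.0
print(polytomous_equal(selected, correct, 4))         # 2.0
print(polytomous_weighted(selected, percentages, 4))  # 2.4
```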
Organisation | Questions are grouped into Collections. | Questions are grouped into Question Banks. |
Setting an assessment |
| QuestUP | clickUP Ultra |
Question selection | - Can set one assessment in which both MCQs and long essay questions are written, with each section controlled by its own timer.
- Possible to select questions from multiple collections.
| - Set separate tests if MCQs need a specific time slot, and essay-type questions need their own time allocation.
- Possible to select questions from multiple Question Banks and Tests.
|
Randomisation | Possible to randomise distractors in all questions. The following possibilities exist for the display of questions in an assessment:
- Display questions in a fixed sequence.
- Randomise all questions selected for an assessment.
- Create sections within an assessment in which only a specific set of questions are delivered in sequence while the rest are randomised.
- Randomise questions based on specific topics and the number of questions or marks within those topics.
- Randomise questions based on a blueprint that uses the learning objectives and taxonomies you assign to your questions.
- Set up multiple fixed forms for the same schedule to distribute different tests randomly among students, without having to set up groups for the distribution.
| Possible to randomise distractors in all questions. The following possibilities exist for the display of questions in an assessment:
- Display questions in a fixed sequence.
- Randomise all questions selected for an assessment.
- Create Page Breaks within an assessment in which only a specific set of questions are delivered in sequence.
- Randomise questions based on specific topics and the number of questions within those topics (using Pools).
- No randomisation of questions based on a blueprint.
- Set up multiple tests for the same schedule to distribute different tests randomly among students; groups must be set up and individual tests linked to each group for the distribution.
|
Feedback options | The following feedback options are available:
- Display no feedback.
- Display student’s score and feedback directly after the assessment and on the student dashboard.
- Automatically show feedback reports immediately after the test and on the student dashboard.
- Show feedback reports upon publication - where you control the moment of publication.
- Set a Candidate Review Session to show results and feedback temporarily; select individuals or complete groups to access the review session. The review session opens and closes on the date and time you set for review/perusal.
- Show feedback on questions or even answer alternatives (MC) during the assessment.
| The following feedback options are available:
- Display no feedback.
- Display student’s score and automated feedback directly after the assessment.
- Show feedback reports immediately after the test and on the Gradebook.
- Show feedback upon publication, where you control the moment of publication.
- Set a perusal session, which requires specific selections under Show Test Results and Feedback to Students in the Test Settings of the relevant assessment. It also requires linking individual students to groups if not all students should view their feedback. The Gradebook item needs to be opened and closed manually.
|
Marking |
| QuestUP | clickUP Ultra |
Autoscoring | - Auto Scoring of objective questions.
- Remarking of auto-scored questions is possible.
| - Auto Scoring of objective questions.
- Remarking of auto-scored questions is possible.
|
Marking of essay questions | - Simple or extended marking workflows for essay questions.
- Marking per student or per question.
- Distribution of students/questions to assessors: random, manual, assessor choice.
- On-screen marking tools: select, comment, strike through, criteria (if set for the question).
- Similarity checker (Ouriginal: https://www.ouriginal.com) available in a future release.
| Marking of essay questions in a Test can be done as follows:
- Gradable Items:
- Filter by Student Status = Submitted.
- Filter by Grading Status = Needs grading.
- Flexible Grading Interface:
- Filter = Grade by student
- Grade by question
- Delegated grading
- No Smart Views, but you can achieve similar results using the Gradebook filters (Gradebook - Grades).
|
Moderation | Automated, based on a choice of rules: difference in scoring attempts, 'Proximity to pass-mark', pass-mark range, or sample (%). | Complex, with many manual actions required. In most cases the lecturer downloads a selection of marked assessments from the Grade Center. |
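As a rough illustration of how moderation rules like these could combine, here is a short Python sketch; the function name, thresholds, and rule interpretations are assumptions for illustration only, not QuestUP's actual implementation.

```python
# Illustrative moderation-selection rules: proximity to the pass mark,
# difference between two assessors' scores, and a random sample (%).
import random

def select_for_moderation(scores, pass_mark=50.0, proximity=5.0, sample_pct=10.0,
                          second_scores=None, max_diff=10.0):
    """Return indices of marked scripts flagged for moderation."""
    flagged = set()
    for i, score in enumerate(scores):
        if abs(score - pass_mark) <= proximity:               # proximity to pass-mark rule
            flagged.add(i)
        if second_scores and abs(score - second_scores[i]) > max_diff:
            flagged.add(i)                                    # difference-in-scoring rule
    sample_size = round(len(scores) * sample_pct / 100)       # sample (%) rule
    flagged.update(random.sample(range(len(scores)), sample_size))
    return sorted(flagged)

print(select_for_moderation([48, 62, 53, 90, 30], second_scores=[55, 60, 54, 70, 32]))
```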
Reports |
| QuestUP | clickUP Ultra |
Item analysis | It is possible to view the following per item: Chance score, Discrimination (Rir/Rit), Corrected difficulty, Difficulty, Number of times answered, Distribution of answers, Status (OK/Warning based on difficulty). | Tests: Question analysis provides statistics on overall test performance and on individual test questions, to help you recognise questions that might be poor discriminators of student performance. |
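For readers unfamiliar with these statistics, the sketch below uses the usual classical-test-theory definitions: Difficulty as the proportion of correct responses, Rit as the item-total correlation and Rir as the item-rest correlation. It is illustrative only; neither platform exposes its internal calculation.

```python
# Classical item statistics for dichotomously scored items (illustrative only).
import statistics

def pearson(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def item_stats(item_scores, all_items):
    """item_scores: 0/1 scores for one item; all_items: one such list per item."""
    totals = [sum(per_student) for per_student in zip(*all_items)]  # total score per student
    rests = [t - s for t, s in zip(totals, item_scores)]            # total excluding this item
    return {
        "Difficulty": statistics.mean(item_scores),                 # proportion correct
        "Rit": pearson(item_scores, totals),
        "Rir": pearson(item_scores, rests),
    }

items = [[1, 1, 0, 1, 0],   # item 1: scores of 5 students
         [1, 0, 0, 1, 0],   # item 2
         [1, 1, 1, 1, 0]]   # item 3
print(item_stats(items[0], items))
```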
Assessment performance | - Assessment statistics: Version, Chance score, Avg Rir, Avg Rit, Reliability (Cronbach Alpha), Avg Corrected Difficulty, Avg Difficulty.
- Assessment performance: Pass rate, Reliability, Number of times delivered, Number of items. (A sketch of the Cronbach Alpha reliability calculation follows this row.)
| Tests:
- Summary of statistics for an individual assessment:
- Average Score.
- Total number of questions.
- Number of submitted assessments.
- Average completion time for all submitted attempts.
- Question analysis: Discrimination and Difficulty.
|
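The Reliability figure in the QuestUP statistics is reported as Cronbach Alpha. A minimal sketch of the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), is shown below, with the per-item score lists assumed purely for illustration; this is not either platform's code.

```python
# Cronbach's alpha from per-item score lists (illustrative only).
import statistics

def cronbach_alpha(all_items):
    """all_items: one list of per-student scores per item."""
    k = len(all_items)
    totals = [sum(per_student) for per_student in zip(*all_items)]
    item_variance = sum(statistics.pvariance(item) for item in all_items)
    return k / (k - 1) * (1 - item_variance / statistics.pvariance(totals))

items = [[1, 1, 0, 1, 0],
         [1, 0, 0, 1, 0],
         [1, 1, 1, 1, 0]]
print(round(cronbach_alpha(items), 2))   # 0.79 for this toy data
```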
Support | QuestUP Help pages | |