What is the response analysis report?
The response analysis report displays a summary of student responses for a class of students. It's useful for educators to see how students responded to an activity and to analyse trends in the class, such as which Items were easy or difficult for learners, and which responses were most common. This information can be used to make targeted interventions for individuals, and to group them with classmates who have similar or complementary learning needs.
The distribution of responses can also be used to identify unusual Items within a test, for example where a question was misleading or confused a large number of students. This can enable content quality improvements, to ensure Items are fair, accurate and targeting the desired knowledge area and skill level.
The report is optimized for Learnosity's single-select multiple choice questions, but also supports Learnosity's wide variety of other question types. Note that responses for the audio recorder, video recorder and file upload question types are not currently shown within the report; all other question types are supported.
You can try a live demo of the report on the Learnosity demos site.
Figure 1: The Learnosity response analysis report.
The report is powered by the Reports API. For technical information on implementation and configuration options, see the reference documentation page.
UI features of the report
The report includes two views: a summarised grid view of student responses and a drill-down detail view to see specific responses to each Item.
Figure 2: Response analysis grid view.
The grid view is displayed by default when the report is loaded.
- The grid shows the list of students included in the report. The columns show how each student responded to the Items in the learning activity.
- The total column displays the student's overall score on the activity. The column header shows the maximum possible score for the activity.
- The column headers identify each Item and the question type included in that Item. If the Item contains multiple Questions, the number of Questions is shown as "2Qs" for 2 Questions, "3Qs" for 3 Questions, and so on. The maximum score of the Item is also displayed, e.g. "1pt".
- Each cell represents the individual student's response to that Item.
- Responses that are fully correct (i.e. received the maximum possible score) are highlighted and show the checkmark icon. A score is shown for Items that provide partial credit.
- A dash (–) is used for unattempted Items.
- "-pt" is displayed where a manual score has not yet been provided.
- For single-select MCQs, we show the letter corresponding to the student's selection.
- For multi-select MCQs, multi-part Items and other question types, we display a value like R1, R2, etc. Students who gave the same response to the Item are allocated the same R number. Students who gave a unique response to the Item are displayed with an ellipsis (…).
- Columns can be sorted by clicking the chevron icon in the column header, which groups correct responses at the top, followed by alternate responses ordered from most common to least common.
Click on a column header in the grid view to see the detail view of that Item.
Figure 3: Response analysis detail view.
Within the detail view:
- The Item content is shown on the right side, along with the response and validation feedback that corresponds to the selected (darkened) student rows.
- Click through the student rows to see the responses given by each student. Where multiple students all gave the same response, their student rows will appear selected as well.
- Browse left and right between Items using the chevron controls in the top right.
- Click the All button to return to the grid view.
Figure 4: Interacting with the detail view.
Which student sessions will be included in the report?
The report accepts a list of session_ids, so you can choose exactly which session is shown for each student. If students have attempted an activity multiple times, you can decide which attempt should be included for each student.
Importantly, all sessions in the report must be based on an identical set of Items and Questions. This ensures comparisons are made between students who saw the same content, and prevents confusion or misinterpretation. If one or more sessions contain different content, an error will indicate which session_ids have variances.
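For example, if students have attempted the activity several times and you want to include only each student's most recent attempt, you can select one session per student before building the report request. This is a sketch only: the `pickLatestSessions` helper and the attempt-record shape are hypothetical, not part of the Reports API.

```javascript
// Hypothetical attempt records from your own data store: one entry per
// (student, attempt). Only the session_id is passed to the report; the
// other fields exist purely to drive the selection logic.
const attempts = [
  { user_id: "student_1", session_id: "s1", completed_at: "2024-03-01T10:00:00Z" },
  { user_id: "student_1", session_id: "s2", completed_at: "2024-03-05T10:00:00Z" },
  { user_id: "student_2", session_id: "s3", completed_at: "2024-03-02T09:30:00Z" },
];

// Keep the most recently completed session for each student.
function pickLatestSessions(records) {
  const latest = {};
  for (const r of records) {
    const current = latest[r.user_id];
    if (!current || r.completed_at > current.completed_at) {
      latest[r.user_id] = r;
    }
  }
  return Object.values(latest).map((r) => r.session_id);
}

const sessionIds = pickLatestSessions(attempts);
```

The resulting `sessionIds` list is what you would pass to the report configuration, so each student appears exactly once.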
Can I include multiple attempts for a single student?
Yes. In this case, each included attempt is displayed as a separate row in the report.
What configuration/customization options are available for the report?
The initialization options for the report allow you to:
- Specify your own Item labels for each column header. Use this to show a user-friendly label for each Item, instead of the authored item_reference or item id.
- Specify student names. Learnosity stores anonymized user_ids only, so you can specify the names of students in your preferred format when you display the report.
- Access the raw data and create a fully customized report UI. You can provide a DataListener to the report, disable rendering of the Learnosity UI and implement your own fully custom UI with your desired visualization and analysis features.
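The custom-UI path above can be sketched roughly as follows. The option names (`type`, `render`, the listener hook) are illustrative assumptions, not confirmed API fields; check the Reports API reference documentation for the exact configuration shape.

```javascript
// Illustrative report configuration -- the "type" string and "render"
// flag are assumptions; consult the Reports API reference for the
// real option names and values.
const reportConfig = {
  id: "response-analysis-1",
  type: "response-analysis-by-item", // assumed report type name
  render: false,                     // skip the built-in Learnosity UI
  sessions: ["session-id-1", "session-id-2"],
};

// A DataListener receives the raw report data so you can build a fully
// custom visualization (e.g. your own heat map of response frequencies).
function dataListener(data) {
  // The data shape is defined by the Reports API; treat it as opaque here.
  renderCustomGrid(data);
}

function renderCustomGrid(data) {
  // Placeholder for your custom UI rendering logic.
  console.log("rendering custom report UI with", data);
}
```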
What are the limitations of the report?
Note the following limitations:
- All sessions included in the report must contain an identical set of Items/Questions. This ensures comparisons are made between students who saw the same content, and prevents confusion or misinterpretation. If one or more sessions contain different content, an error will indicate which session_ids have variances.
- The report is optimized for single-select multiple choice questions, but can be used to analyse other question types as well. Note that audio, video and file upload questions currently cannot be shown in the detail view; a text warning is displayed in their place indicating they are not currently available.
How do I start using the response analysis report?
You can get started implementing the report right now using the Reports API v2020.1.LTS and above. For technical implementation and configuration details, see the reference documentation page.
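Putting the pieces together, a browser-side initialization might look like the sketch below. The security object must be generated and signed server-side (typically with a Learnosity server-side SDK), and the request fields shown here are illustrative placeholders, so verify them against the reference documentation before use.

```javascript
// Illustrative initialization object. In production, the signed
// security block comes from your server via a Learnosity SDK; the
// values below are placeholders, not working credentials.
const initOptions = {
  security: {
    consumer_key: "yourConsumerKey",       // placeholder
    domain: "localhost",                   // placeholder
    timestamp: "20240305-1200",            // placeholder
    signature: "generated-server-side",    // placeholder
  },
  request: {
    reports: [
      {
        id: "response-analysis-1",
        type: "response-analysis-by-item", // assumed report type name
        sessions: ["session-id-1", "session-id-2"],
      },
    ],
  },
};

// In the browser, after loading the Reports API script:
if (typeof LearnosityReports !== "undefined") {
  LearnosityReports.init(initOptions);
}
```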