What is the response analysis report?
The response analysis report displays a summary of student responses for a class of students. It's useful for educators to see how students responded to an activity and to analyse trends in the class, like which Items were easier or more difficult for learners, and which responses were most common among the class. This information can be used to make targeted interventions for individuals, and to group them with classmates who have similar or complementary learning needs.
The distribution of responses can also be used to identify unusual Items within a test, for example where a question was misleading or confused a large number of students. This can enable content quality improvements, to ensure Items are fair, accurate and targeting the desired knowledge area and skill level.
The report is optimized for Learnosity's single-select multiple choice questions, and also supports Learnosity's wide variety of other question types. Note that responses for the audio recorder, video recorder and file upload question types are not currently shown within the report, but all other question types are supported.
You can try a live demo of the report on the Learnosity demos site.
Figure 1: The Learnosity response analysis report.
The report is powered by the Reports API. For technical information on implementation and configuration options, see the reference documentation page.
UI features of the report
The report includes two views: a summarised grid view of student responses and a drill-down detail view to see specific responses to each Item.
Grid view
Figure 2: Response analysis grid view.
The grid view is displayed by default when the report is loaded.
- The grid shows the list of students included in the report. The columns show how each student responded to the Items in the learning activity.
- The total column displays the student's overall score on the activity. The column header shows the maximum possible score of the activity.
- The column headers identify each Item, and the question type of that Item. If the Item contains multiple Questions, the number of Questions is shown as "2Qs" for 2 Questions, "3Qs" for 3 Questions, etc. The maximum score of the Item is also displayed, e.g. "1pt".
- The column headers also show a histogram chart of the number of students that gave each response. The bars corresponding to fully correct responses (i.e. those that received the maximum possible score) are highlighted. The rightmost (unfilled) bar counts students that did not attempt the Item. Hover over a histogram chart to see detailed counts and labels for the responses.
- Each cell represents an individual student's response to that Item.
- Responses that are fully correct are highlighted and show the checkmark icon. A score is shown for Items that provide partial credit.
- A dash "-" is used for unattempted Items. "-pt" is displayed where a manual score has not yet been provided.
- For single-select MCQs, we show the letter corresponding to the student's selection.
- For multi-select MCQs, multi-part Items and other question types, we display a value like "R1", "R2", etc. Students who gave the same response to the Item are allocated the same R number. Students who gave a unique response to the Item are displayed with an ellipsis "...".
- Columns can be sorted by clicking on the chevron icon in the column header, which groups correct responses at the top, followed by alternate responses ordered from most common to least common.
Detail view
Click on a column header of the grid view to see the detail view of that Item.
Figure 3: Response analysis detail view.
Within the detail view:
- The Item content is shown on the right side, along with the response and validation feedback that correspond to the selected (darkened) student rows.
- A histogram chart shows the number of students who selected each response.
- There are two ways to navigate through the students' responses:
- Click one of the responses in the histogram to display that response. All students who gave that response will become highlighted.
- Click a student row on the left to see the response given by that student. Other students who gave the same response will also become highlighted.
- Browse between Items using the arrow controls in the top right.
- Click the All button to return to the grid view.
Figure 4: Interacting with the detail view.
Which student sessions will be included in the report?
The report accepts a list of session_ids, so you can choose exactly which session is shown for each student. If students have attempted an activity multiple times, you can decide which attempt should be included for each student.
Importantly, all sessions in the report must be based on an identical set of Items and Questions. This ensures comparisons are made between students who saw the same content, and prevents confusion or misinterpretation. If one or more sessions contain different content, an error will indicate which session_ids have variances.
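To make the shape of this concrete, here is a minimal sketch of how the sessions to include might be declared in the report's request. The report type name and field names shown are assumptions for illustration only; the authoritative request format is on the reference documentation page.

```ts
// Sketch only: field names here are illustrative, not authoritative.
// One entry per student, pairing the student with the specific attempt (session) to show.
const reportsRequest = {
  reports: [
    {
      id: "response-analysis-report",       // id of the page element the report renders into
      type: "response-analysis-by-item",    // assumed report type name; confirm in the reference docs
      sessions: [
        { user_id: "student_0001", session_id: "5b7eac3c-0000-0000-0000-000000000001" },
        { user_id: "student_0002", session_id: "5b7eac3c-0000-0000-0000-000000000002" },
        { user_id: "student_0003", session_id: "5b7eac3c-0000-0000-0000-000000000003" },
      ],
    },
  ],
};
```

Every session_id listed this way must come from an attempt at the same set of Items and Questions, as described above.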
Can I include multiple attempts for a single student?
Yes - in this case, each included attempt will be displayed as a separate row in the report.
What configuration/customization options are available for the report?
The initialization options for the report allow you to:
- Specify your own Item labels for each column header. Use this to show a user-friendly label for each Item, instead of the authored item_reference or item id.
- Specify student names. Learnosity stores anonymized user_ids only, so you can specify the names of students in your preferred format when you display the report.
- Show or hide the histogram charts.
- Access the raw data and create a fully customized report UI. You can provide a DataListener to the report, disable rendering of the Learnosity UI and implement your own fully custom UI with your desired visualization and analysis features.
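As a sketch of how these options might fit together in a report definition (the option names below are illustrative assumptions; see the reference documentation for the exact names and shapes):

```ts
// Sketch only: option names are illustrative, not authoritative.
const reportDefinition = {
  id: "response-analysis-report",
  type: "response-analysis-by-item",        // assumed report type name
  sessions: [
    { user_id: "student_0001", session_id: "5b7eac3c-0000-0000-0000-000000000001" },
  ],
  // Friendly column headers shown instead of the authored item_reference:
  items: [
    { reference: "fractions_q1_v3", label: "Question 1" },
    { reference: "fractions_q2_v3", label: "Question 2" },
  ],
  // Learnosity stores anonymized user_ids only, so display names are supplied at render time:
  users: [
    { id: "student_0001", name: "Dana Scully" },
  ],
  // Toggle the histogram charts in the column headers (assumed flag name):
  render_histogram: true,
};
```

For a fully customized UI, the DataListener approach mentioned above supplies the same underlying data to your own code instead of rendering the Learnosity UI; its exact callback signature is documented on the reference page.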
There is an additional option, configurable by Learnosity Support, described below.
How can I handle items with multiple versions?
All sessions included in the report must contain an identical set of Items/Questions. This ensures comparisons are made between students who saw the same content, and prevents confusion or misinterpretation. You have two options to handle the case where one or more sessions contain different content:
- By default, the response analysis report will not render and an error will indicate which session_ids have variances.
- By reaching out to Learnosity Support, you can enable an option where the report will still render, in spite of items with multiple versions. The items with multiple versions will be excluded from the report and will not be available to display to the user. This enhancement is available from v2024.1.LTS onwards.
Figure 5: Handling items with multiple versions
What are the limitations of the report?
Note the following limitations:
- All sessions included in the report must contain an identical set of Items/Questions. If there are items with multiple versions, the report will either not render or render but exclude the multi-versioned items. See How can I handle items with multiple versions? above.
- The report is optimized for single-select multiple choice questions, but can be used to analyse other question types as well. Note that audio, video and file upload questions currently cannot be shown in the detail view - a text warning is displayed in their place indicating they are not currently available.
How do I start using the response analysis report?
You can implement the report right now using Reports API v2020.2.LTS and above.
An earlier version of the report is also available in v2020.1.LTS with reduced features.
For technical implementation and configuration details, see the reference documentation page.
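To give a feel for the overall wiring, here is a hedged sketch of the client-side initialization, assuming the Reports API script (e.g. https://reports.learnosity.com/?v2020.2.LTS) is loaded on the page and that the signed request is generated server-side with a Learnosity SDK:

```ts
// The Reports API exposes a LearnosityReports global once its script is loaded.
// These ambient declarations stand in for the script tag and the server-side signing step.
declare const LearnosityReports: { init: (signedRequest: unknown) => unknown };
declare const signedReportsRequest: unknown; // signed server-side; wraps a report definition like the sketches above

// Initialize the Reports API. The report renders into the element whose id matches
// the "id" in the report definition (e.g. <div id="response-analysis-report"></div>).
const reportsApp = LearnosityReports.init(signedReportsRequest);
```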