This is a premium feature and may not be included in your product license. For more information, reach out to your Learnosity Customer Success Manager.
This article describes how to use the Feedback Aide AI Scoring engine integration to enable auto-scoring for Essay with rich text Questions within your existing Learnosity implementation.
With this integration, you can:
- Create Feedback Aide–optimized rubrics directly within Learnosity.
- Auto-score open-text responses based on those rubrics.
- Evaluate learner responses in the context of shared passage content.[1]
- Review and retrieve scores through Learnosity Reports and the Data API.
The integration supports short response[2] rubric types:
- General
- Key points
- X from Y
- Categorize
As well as the essay[3] rubric types:
- Analytic
- Holistic
Short response rubrics are designed for fact-based, skills-based, or multi-point responses with a recommended maximum size of 250 words.
Analytic and holistic rubrics are ideal for open-ended extended responses, including essays, reports, and reflections. For these rubrics, the recommended response size is about 200–5,000 words.
Note: There is a hard limit of 100,000 characters; responses above this limit will not be scored.
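The hard limit above can be enforced with a simple pre-submission check. This is a minimal sketch of a hypothetical helper, not part of any Learnosity API:

```javascript
// Hypothetical pre-submission check mirroring the Feedback Aide hard limit
// stated in this article: responses over 100,000 characters are not scored.
const MAX_RESPONSE_CHARS = 100000;

function isScorable(responseText) {
  // Returns true only when the response is within the scorable size.
  return responseText.length <= MAX_RESPONSE_CHARS;
}
```

A check like this lets you warn authors or learners before the response reaches the scoring pipeline.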
Authoring
Authors control whether and how a Question is scored with Feedback Aide through the Author API.
1. Enable Scoring
When creating a new essay with rich text Item:
- Open the Scoring section.
- Select the Score with Feedback Aide checkbox.
Note: If the option appears but cannot be selected, contact your system administrator to confirm the feature is fully enabled for your environment.
Scoring section in the Author API with Feedback Aide enabled
2. Use Shared Passages for Scoring
From v2026.1.LTS, an additional scoring option called 'Use shared passages for scoring' is available when Feedback Aide is enabled.
When this option is selected, all shared passages within the same Item are referenced during Feedback Aide scoring. Sources incur additional Feedback Aide credits at a rate of 1 credit per 1,000 words of source content.
Enable this option when the learner’s response should be evaluated in the context of shared passage content authored within the same Item.
For example, enable this option if:
- The learner must analyze, reference, or interpret a passage included in the Item.
- The rubric evaluates understanding of specific content contained in a shared passage.
If this option is not selected, Feedback Aide evaluates the learner response independently, without access to shared passage content.
Note: The shared passage content must not exceed 30,000 characters. If this limit is exceeded, the entire passage is rejected and will not be considered during scoring.
For more information, see the Feedback Aide Sources article.
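The passage size limit and the credit rate described above can be sketched as pre-flight checks. The limits come from this article, but the rounding behaviour in the credit estimate is an assumption — confirm the actual billing granularity with Learnosity:

```javascript
// Passages over 30,000 characters are rejected entirely and ignored at
// scoring time, so it is worth checking before saving the Item.
const MAX_PASSAGE_CHARS = 30000;

function passageWithinLimit(passageText) {
  return passageText.length <= MAX_PASSAGE_CHARS;
}

// Rough cost estimate at 1 credit per 1,000 words of source content.
// Assumption: each started block of 1,000 words is charged in full.
function estimatePassageCredits(passageText) {
  const words = passageText.trim().split(/\s+/).filter(Boolean).length;
  return Math.ceil(words / 1000);
}
```

For example, under this assumption a 1,500-word passage would cost 2 credits per scored response.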
3. Create the Rubric
Once scoring is enabled:
- Click Create rubric to open the Rubric Editor.
- Follow How to Create a Rubric Using Rubric Editor for guidance.
The Maximum points field will be automatically disabled. Its value is derived directly from the rubric, ensuring alignment between AI scoring and assessment scoring.
Note: You must create and save a rubric before saving a Question with Feedback Aide scoring enabled.
4. Manage Scoring Settings
If an administrator or author later disables Feedback Aide, the Question reverts to manual scoring. The previously created rubric remains stored and becomes available again immediately if Feedback Aide scoring is re-enabled.
Assessment
For learners, the assessment experience remains unchanged. Questions are delivered as usual within the Learnosity assessment player.
Scoring and Grading
When a learner submits an Activity, Learnosity triggers the auto-scoring process for any Items configured with Feedback Aide.
- Expect a slightly longer delay before scores are available due to AI processing time.
- If a scoring error occurs, the Question defaults to unscored, allowing for manual grading as with any other non-auto-scorable Item.
- Once processing is complete, the final score becomes available in Learnosity Reports and through the Data API.
Linking to Feedback Aide API
Auto-scores are stored within Learnosity APIs, but accessing detailed rubric ratings, feedback comments, and the Feedback Aide grader UI requires a simple custom integration step.
To access full feedback and rubric details, your developers must:
- Retrieve the session UUID from the Learnosity Data API.
- Instantiate the Feedback Aide API using this UUID.
- (Optional) Save any updated scores or feedback back to Learnosity through the Data API.
This approach provides both:
- The convenience of viewing scores in Learnosity Reports, and
- The depth of complete AI analysis through the dedicated Feedback Aide API.
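The linking steps above can be sketched as follows. The field names passed to the Feedback Aide API here are illustrative assumptions, not the documented init signature — consult the Feedback Aide API and Data API references for the actual request formats:

```javascript
// Minimal sketch of wiring a Learnosity session into Feedback Aide.
// buildFeedbackAideInit is a hypothetical helper; treat the field names
// below as placeholders rather than the real Feedback Aide API options.
function buildFeedbackAideInit(sessionUuid, graderUserId) {
  if (!sessionUuid) {
    // The UUID must first be retrieved from the Learnosity Data API.
    throw new Error('Retrieve the session UUID from the Data API first');
  }
  return {
    // The Learnosity session to load into the Feedback Aide grader UI.
    session_id: sessionUuid,
    // The reviewer who will see and adjust AI-generated scores.
    user_id: graderUserId,
  };
}
```

In the browser, an object like this would then be passed to the Feedback Aide API's initialization call.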
Grading: Human-in-the-Loop
To ensure quality assurance and pedagogical integrity, we strongly recommend maintaining a human-in-the-loop review process for AI-generated scores and comments.
- Use the retrieved UUID (see the Linking to Feedback Aide API section).
- Open the session in the Feedback Aide API.
- Review and adjust scores, modify rubric ratings, or refine AI-generated feedback before finalizing results for learners.
- Save the adjusted scores back to Learnosity through the Data API.
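As a sketch, the final write-back step might assemble a payload like the one below. The field names are illustrative assumptions, not the documented Data API schema — check the Data API reference for the actual format for updating scores:

```javascript
// Hypothetical payload builder for saving a reviewed score back to
// Learnosity. Field names are placeholders, not the real Data API schema.
function buildScoreUpdate(sessionUuid, questionReference, score, maxScore) {
  if (score < 0 || score > maxScore) {
    // Keep human adjustments within the rubric's derived maximum points.
    throw new Error('Reviewed score must be between 0 and the rubric maximum');
  }
  return {
    session_id: sessionUuid,
    question_reference: questionReference,
    score: score,
    max_score: maxScore,
  };
}
```

Validating against the rubric maximum before saving keeps the reviewed score consistent with the Maximum points value derived during authoring.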
1. Note: The 'Use shared passages for scoring' option is supported. Version added: v2026.1.LTS
2. Note: Short response rubric types (General, Key points, X from Y, Categorize) are supported. Version added: v2025.3.LTS
3. Note: Analytic and Holistic rubric types are supported. Version added: v2026.1.LTS