Using Feedback Aide within Learnosity APIs

This is a premium feature and may not be included in your product license. For more information, reach out to your Learnosity Customer Success Manager.

This article describes how to integrate the Feedback Aide API with Learnosity's essay with rich text Question type to enable auto-scoring for constructed response questions. The integration combines Learnosity's assessment ecosystem with Feedback Aide's AI scoring engine, giving you seamless scoring within your existing Learnosity implementation.

With this integration, you can:

  • Create Feedback Aide–optimized rubrics directly within Learnosity.
  • Auto-score open-text responses based on those rubrics.
  • Review and retrieve scores through Learnosity Reports and the Data API.

Currently, the integration supports the following short response rubric types:

  • General
  • Key points
  • X from Y
  • Categorize

These rubrics are designed for fact-based, skills-based, or multi-point responses of up to approximately 250 words. There is a hard limit of 2000 characters on learner responses scored with these rubric types. 
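These limits can be enforced with a pre-flight check before relying on Feedback Aide scoring. The sketch below uses the limits stated above; the function name and return shape are illustrative, not part of any Learnosity API:

```python
# Sketch: pre-flight check against Feedback Aide's short-response limits.
# The 2000-character hard limit and ~250-word guideline come from this
# article; everything else here is illustrative.

MAX_CHARS = 2000             # hard limit on responses scored with these rubrics
RECOMMENDED_MAX_WORDS = 250  # soft guideline for short-response rubric types

def check_response_limits(response_text: str) -> dict:
    """Classify a learner response against Feedback Aide's limits."""
    chars = len(response_text)
    words = len(response_text.split())
    return {
        "within_hard_limit": chars <= MAX_CHARS,
        "within_guideline": words <= RECOMMENDED_MAX_WORDS,
        "chars": chars,
        "words": words,
    }
```

A response that exceeds the hard limit should be trimmed or routed to manual grading rather than sent for AI scoring.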

Authoring

Authors control whether and how a Question is scored with Feedback Aide through the Author API.

1. Enable Scoring

When creating a new essay with rich text Item:

  • Open the Scoring section.
  • Select the checkbox Score with Feedback Aide.
    Note: If the option appears but cannot be selected, contact your system administrator to confirm the feature is fully enabled for your environment.

    Scoring section in the Author API with Feedback Aide enabled

2. Create the Rubric

Once scoring is enabled, the Max score field is automatically disabled. Its value is derived directly from the rubric, ensuring alignment between AI scoring and assessment scoring.

Note: You must create and save a rubric before saving a Question with Feedback Aide scoring enabled.
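The relationship between the rubric and the disabled Max score field can be sketched as follows. The rubric structure below is hypothetical (real rubric JSON will differ), but the principle is the same: the Question's maximum score is the sum of the highest rating available on each rubric criterion.

```python
# Sketch: why Max score is read-only once Feedback Aide scoring is enabled.
# The rubric shape here is illustrative, not the actual Learnosity schema.

def derive_max_score(rubric: dict) -> int:
    """Return the maximum achievable score implied by a rubric."""
    return sum(
        max(rating["points"] for rating in criterion["ratings"])
        for criterion in rubric["criteria"]
    )

rubric = {
    "criteria": [
        {"name": "Accuracy", "ratings": [{"points": 0}, {"points": 2}]},
        {"name": "Evidence", "ratings": [{"points": 0}, {"points": 1}, {"points": 3}]},
    ]
}
```

With this example rubric, the derived maximum score is 5 (2 for Accuracy plus 3 for Evidence), which is what the disabled Max score field would display.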

3. Manage Scoring Settings

If an administrator or author later disables Feedback Aide, the Question reverts to an unscored state. However, the previously created rubric remains safely stored and becomes immediately available if Feedback Aide scoring is re-enabled.

Assessment

For learners, the assessment experience remains unchanged. Questions are delivered as usual within the Learnosity assessment player.

Scoring and Grading

When a learner submits an Activity, Learnosity triggers the auto-scoring process for any Items configured with Feedback Aide.

  • Expect a slightly longer delay before scores are available due to AI processing time.
  • If a scoring error occurs, the Question defaults to unscored, allowing for manual grading as with any other non-auto-scorable Item.
  • Once processing is complete, the final score becomes available in Learnosity Reports and through the Data API.
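The fallback behavior above can be sketched as a small classifier. The record fields below loosely mirror what the Data API's sessions/responses endpoint returns, but treat them as illustrative rather than an exact schema:

```python
# Sketch: deciding how to treat an Item's score after an Activity is
# submitted. Field names ("attempted", "score") are illustrative.

def scoring_state(record: dict) -> str:
    """Return 'scored', 'needs_manual_grading', or 'not_attempted'."""
    if not record.get("attempted", False):
        return "not_attempted"
    if record.get("score") is None:
        # AI scoring failed (or has not completed): fall back to manual
        # grading, as with any other non-auto-scorable Item.
        return "needs_manual_grading"
    return "scored"
```

In practice this means a scoring error never blocks grading; the Item simply behaves like any manually graded Item.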

Linking to Feedback Aide API

Auto-scores are stored within Learnosity APIs, but accessing detailed rubric ratings, feedback comments, and the Feedback Aide grader UI requires a simple custom integration step.

To access full feedback and rubric details, your developers must:

  1. Retrieve the session UUID from the Learnosity Data API.
  2. Instantiate the Feedback Aide API using this UUID.
  3. (Optional) Save any updated scores or feedback back to Learnosity through the Data API.
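The first two steps can be sketched as follows. The sessions/responses Data API endpoint is real, but the response handling is simplified here, and the Feedback Aide initialization options (including the idea that they accept a session UUID directly) are assumptions; consult the Feedback Aide API reference for the exact shape.

```python
# Sketch of the linking flow: pull session UUIDs from a Data API response
# body, then assemble (hypothetical) Feedback Aide init options. Request
# signing and the HTTP call are omitted for brevity.

DATA_API_ENDPOINT = "https://data.learnosity.com/v1/sessions/responses"

def extract_session_ids(data_api_response: dict) -> list:
    """Step 1: collect session UUIDs from a Data API response body."""
    return [row["session_id"] for row in data_api_response.get("data", [])]

def feedback_aide_init_options(session_id: str, consumer_key: str) -> dict:
    """Step 2: assemble hypothetical init options keyed by session UUID."""
    return {
        "security": {"consumer_key": consumer_key, "domain": "localhost"},
        "request": {"session_id": session_id},
    }
```

Step 3 (writing updated scores back) goes through the Data API in the same signed-request style as step 1.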

This approach provides both:

  • The convenience of viewing scores in Learnosity Reports, and
  • The depth of complete AI analysis through the dedicated Feedback Aide API.

Grading: Human-in-the-Loop

To safeguard scoring quality and pedagogical integrity, we strongly recommend maintaining a human-in-the-loop review process for AI-generated scores and comments.

  • Use the retrieved UUID (see the Linking to Feedback Aide API section).
  • Open the session in the Feedback Aide API.
  • Review and adjust scores, modify rubric ratings, or refine AI-generated feedback before finalizing results for learners.
  • Save the adjusted scores back to Learnosity through the Data API.
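The review step can be sketched as a simple override rule: prefer the reviewer's score, clamped to the rubric maximum so the stored score stays consistent with the Question's max score. The payload shape below is illustrative, not the exact Data API schema:

```python
# Sketch: a human reviewer adjusting an AI-assigned score before it is
# written back through the Data API. Field names are illustrative.

def apply_review(ai_score: int, reviewer_score: int, max_score: int) -> dict:
    """Prefer the reviewer's score, clamped to [0, max_score]."""
    final = max(0, min(reviewer_score, max_score))
    return {
        "score": final,
        "max_score": max_score,
        "overridden": final != ai_score,
    }
```

Clamping matters because a reviewer entering a score above the rubric maximum would otherwise break the alignment between the stored score and the rubric-derived max score.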