This article details the available request methods for accessing the scoring service.
The scoring endpoint of the Data API lets you submit full Question JSON (including the validation object) along with the user's responses (the JSON object returned by the getResponses() call of the Questions API), and returns the auto-scored results on a per-user basis. See the Questions API documentation for the Question JSON format and for example Question Responses.
Currently, the following Question types are supported for auto-scoring:
- Choice Matrix
- Classification
- Cloze Association
- Cloze Dropdown
- Cloze Inline Text
- Cloze Math Formula
- Cloze Text
- Graph Plotting
- Image Association
- Image Cloze Math Formula
- Image Dropdown
- Image Cloze Text
- Match List
- Math Formula V2 (Deprecated)
- Multiple Choice Question
- Number Line Association
- Number Line Plot
- Order List
- Short Text
- Simple Chart
- Simple Shading
- Sort List
- Text Highlight
- Token Highlight
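The request body pairs each question's full JSON with the learner's responses, keyed by response_id. The following Python sketch builds a minimal scoring payload in that shape; the contents of the security object and the transport used to send it are placeholders, not confirmed API details.

```python
import json

def build_scoring_request(security, questions, responses):
    """Assemble a scoring request body: full Question JSON objects
    (including their validation blocks) plus a map of
    response_id -> the object returned by getResponses()."""
    return {
        "security": security,
        "request": {
            "questions": questions,
            "responses": responses,
        },
    }

# Minimal choicematrix example mirroring the first sample below.
question = {
    "response_id": "385",
    "type": "choicematrix",
    "options": ["True", "False"],
    "validation": {
        "scoring_type": "exactMatch",
        "valid_response": {"score": 1, "value": [0, 1, 0, 1]},
    },
}
responses = {"385": {"value": [0, 1, 0, 1], "type": "array"}}

body = build_scoring_request({"consumer_key": "..."}, [question], responses)
print(json.dumps(body, indent=2))
```

The same structure applies to every question type below; only the question objects and response values change.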
Choice Matrix: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"response_id": "385",
"options": ["True", "False"],
"stems": ["Washington is the capital of United States", "Sydney is the capital of Australia", "Beijing is the capital of China", "Singapore is in China"],
"stimulus": "Select True or False for the following",
"type": "choicematrix",
"ui_style": {
"type": "table"
},
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [0, 1, 0, 1]
}
}
}
],
"responses" : {
"385": {
"value": [0, 1, 0, 1],
"type": "array"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420608028,
"records": 1
},
"data": [
{
"response_id": "385",
"type": "choicematrix",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
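Each element of the data array reports score and max_score for one response_id. If you need an aggregate across questions, that summation happens client-side; a minimal sketch, assuming the field names shown in the samples:

```python
def total_score(data):
    """Sum per-question score / max_score pairs from a scoring
    response's "data" array, skipping entries that errored."""
    earned = sum(d["score"] for d in data if d["error"] is None)
    possible = sum(d["max_score"] for d in data if d["error"] is None)
    return earned, possible

# Two hypothetical scored entries, shaped like the samples in this article.
sample = [
    {"response_id": "385", "score": 1, "max_score": 1, "error": None},
    {"response_id": "60010", "score": 1, "max_score": 6, "error": None},
]
print(total_score(sample))  # (2, 7)
```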
Classification: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"possible_responses": ["2", "3", "4", "5", "6", "7", "8", "9", "10", "11", "12", "13", "14", "15", "16", "17", "18", "19", "20", "21", "22", "23", "24", "25", "26", "27", "28", "29"],
"stimulus": "Blah Blah",
"type": "classification",
"ui_style": {
"column_count": 3,
"column_titles": ["Prime", "Odd (Not Prime)", "Even (Not Prime)"],
"row_count": 3,
"row_titles": ["< 10", ">= 10 & < 20", ">= 20 "]
},
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [
[0, 1, 3, 5],
[7],
[2, 4, 6],
[9, 11, 15, 17],
[13],
[8, 10, 12, 14, 16],
[21, 27],
[19, 23, 25],
[18, 20, 22, 24, 26]
]
}
},
"response_id": "3912"
}
],
"responses": {
"3912": {
"value": [
[0, 1, 3, 5],
[7],
[2, 4, 6],
[9, 11, 15, 17],
[13],
[8, 10, 12, 14, 16],
[21, 27],
[19, 23, 25],
[18, 20, 22, 24, 26]
],
"type": "array"
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420610581,
"records": 1
},
"data": [
{
"response_id": "3912",
"type": "classification",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Cloze Association: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60010",
"type": "clozeassociation",
"template": "<p>\"It’s all clear,\" he {{response}}. \"Have you the chisel and the bags? Great Scott! Jump, Archie, jump, and I’ll swing for it!\"</p><p>Sherlock {{response}} had sprung out and seized the {{response}} by the collar. The other dived down the hole, and I heard the sound of {{response}} cloth as Jones clutched at his skirts. The light flashed upon the barrel of a revolver, but Holmes’ {{response}} came down on the man’s wrist, and the pistol {{response}} upon the stone floor.</p>",
"possible_responses": [
"whispered", "holmes", "Holmes", "acquaintance", "intruder", "Homes", "burning", "rending", "broken", "revolver", "hunting crop", "clinked", "spilt"
],
"instant_feedback": true,
"feedback_attempts": 2,
"validation": {
"valid_responses": [
[
"whispered"
],
[
"Holmes"
],
[
"intruder"
],
[
"rending"
],
[
"hunting crop"
],
[
"clinked"
]
]
}
}
],
"responses": {
"60010": {
"value": [
null, "whispered", "intruder", null, null, "broken"
],
"type": "array",
"apiVersion": "v2023.1.lts"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420610672,
"records": 1
},
"data": [
{
"response_id": "60010",
"type": "clozeassociation",
"is_valid": true,
"score": 1,
"max_score": 6,
"attempted": true,
"error": null
}
]
}
Cloze Dropdown: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60011",
"type": "clozedropdown",
"description": "The student needs to select the correct response for each blank ",
"template": "<p>\"It’s all clear,\" he {{response}}. \"Have you the chisel and the bags? Great Scott! Jump, Archie, jump, and I’ll swing for it!\"</p><p>Sherlock {{response}} had sprung out and seized the {{response}} by the collar. The other dived down the hole, and I heard the sound of {{response}} cloth as Jones clutched at his skirts. The light flashed upon the barrel of a revolver, but Holmes’ {{response}} came down on the man’s wrist, and the pistol {{response}} upon the stone floor.</p>",
"instant_feedback": true,
"feedback_attempts": 2,
"possible_responses": [
[
"whispered", "sprinted", "joked"
],
[
"Homes", "holmes", "Holmes"
],
[
"acquaintance", "intruder", "shopkeeper"
],
[
"burning", "departing", "rending", "broken"
],
[
"revolver", "hunting crop"
],
[
"rattled", "clinked", "spilt"
]
],
"validation": {
"valid_responses": [
[
"whispered"
],
[
"Holmes"
],
[
"intruder"
],
[
"rending"
],
[
"hunting crop"
],
[
"clinked"
]
]
}
}
],
"responses": {
"60011": {
"value": [
"whispered", "Holmes", "intruder", null, null, null
],
"type": "array",
"apiVersion": "latest"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420610762,
"records": 1
},
"data": [
{
"response_id": "60011",
"type": "clozedropdown",
"is_valid": true,
"score": 3,
"max_score": 6,
"attempted": true,
"error": null
}
]
}
Cloze Inline Text: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"stimulus": "fill in the blanks",
"template": "<p>Sherlock Homes had sprung out and seized the {{response}} by the collar. The other dived down the hole, and I heard the sound of {{response}} cloth as Jones clutched at his skirts. The light flashed upon the barrel of a revolver, but Holmes’ {{response}} came down on the man’s wrist, and the pistol {{response}} upon the stone floor.</p>",
"type": "clozeinlinetext",
"validation": {
"scoring_type": "partialMatch",
"valid_response": {
"value": ["intruder", "rending", "hunting crop", "clinked"],
"score": 1
}
},
"instant_feedback": false,
"response_id": "950"
}
],
"responses" : {
"950": {
"value": ["intruder", "rending", "hunting crop", "clinked"],
"type": "array",
"apiVersion": "v2.51.1"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1421018171,
"records": 1
},
"data": [
{
"response_id": "950",
"type": "clozeinlinetext",
"is_valid": true,
"score": 4,
"max_score": 4,
"attempted": true,
"error": null
}
]
}
Cloze Math Formula: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"type": "clozeformula",
"ui_style": {
"type": "block-on-focus-keyboard"
},
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [
[{
"method": "equivLiteral",
"value": "25+45=70",
"options": {
"ignoreOrder": false,
"inverseResult": false
}
}],
[{
"method": "equivLiteral",
"value": "1",
"options": {
"ignoreOrder": false,
"inverseResult": false
}
}],
[{
"method": "equivLiteral",
"value": "10",
"options": {
"ignoreOrder": false,
"inverseResult": false
}
}],
[{
"method": "equivLiteral",
"value": "7",
"options": {
"ignoreOrder": false,
"inverseResult": false
}
}],
[{
"method": "equivLiteral",
"value": "50",
"options": {
"ignoreOrder": false,
"inverseResult": false
}
}]
]
}
},
"text_blocks": [],
"response_containers": [{
"template": "25+45={{response}}"
}],
"response_container": {
"template": ""
},
"stimulus": "It takes John 25 minutes to walk to the car park and 45 minutes to drive to work.",
"template": "<p>{{response}} minutes = {{response}} hour and {{response}} minutes</p><p>John needs to get out of the house at {{response}}:{{response}}a.m. in order to get to work at 9:00a.m.</p>",
"response_id": "915"
}
],
"responses" : {
"915": {
"value": ["25+45=70", "1", "10", "7", "50"],
"responses": [
["70"],
[],
[],
[],
[]
],
"type": "array",
"apiVersion": "v2.51.1"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1421018227,
"records": 1
},
"data": [
{
"response_id": "915",
"type": "clozeformula",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Cloze Text: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60009",
"type": "clozetext",
"template": "<table><thead><tr><th><strong>Multiply</strong></th><th><strong>_ x 1</strong></th><th><strong>_ x 2</strong></th><th><strong>_ x 3</strong></th><th><strong>_ x 4</strong></th><th><strong>_ x 5</strong></th></tr></thead><tbody><tr><td><strong>1 x _</strong></td><td>{{response}}</td><td>2</td><td>3</td><td>4</td><td>5</td></tr><tr><td><strong>2 x _</strong></td><td>2</td><td>{{response}}</td><td>6</td><td>8</td><td>10</td></tr><tr><td><strong>3 x _</strong></td><td>3</td><td>6</td><td>{{response}}</td><td>12</td><td>15</td></tr><tr><td><strong>4 x _</strong></td><td>4</td><td>8</td><td>12</td><td>{{response}}</td><td>20</td></tr><tr><td><strong>5 x _</strong></td><td>5</td><td>10</td><td>15</td><td>20</td><td>{{response}}</td></tr></tbody></table>",
"instant_feedback": true,
"feedback_attempts": 2,
"case_sensitive": false,
"max_length": 2,
"validation": {
"valid_responses": [
[
"1"
],
[
"4"
],
[
"9"
],
[
"16"
],
[
"25"
]
],
"valid_score": 1
}
}
],
"responses": {
"60009": {
"value": [
"1", "4", "9", "16", "25"
],
"type": "array",
"apiVersion": "v2023.1.lts"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420611105,
"records": 1
},
"data": [
{
"response_id": "60009",
"type": "clozetext",
"is_valid": true,
"score": 5,
"max_score": 5,
"attempted": true,
"error": null
}
]
}
Graph Plotting: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"axis_x": {
"draw_labels": true,
"ticks_distance": 1,
"show_first_arrow": true,
"show_last_arrow": true
},
"axis_y": {
"draw_labels": true,
"ticks_distance": 1,
"show_first_arrow": true,
"show_last_arrow": true
},
"background_image": {
"src": ""
},
"canvas": {
"snap_to": "grid",
"x_max": 5.4,
"x_min": -5.4,
"y_max": 5.4,
"y_min": -5.4
},
"grid": {
"x_distance": 1,
"y_distance": 1
},
"is_math": true,
"stimulus": "<p>Graph the points (-3, 3) and (3, -3).</p>",
"toolbar": {
"tools": ["point", "move"],
"default_tool": "point"
},
"type": "graphplotting",
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [{
"id": "lrn_1",
"type": "point",
"coords": {
"x": -3,
"y": 3
}
}, {
"id": "lrn_2",
"type": "point",
"coords": {
"x": 3,
"y": -3
}
}]
}
},
"response_id": "487"
}
],
"responses" : {
"487": {
"value": {
"actions": ["point(-3,3) << id: 'lrn_1', snaptogrid: true, snapsizex: 1, snapsizey: 1 >>;", "point(3,-3) << id: 'lrn_2', snaptogrid: true, snapsizex: 1, snapsizey: 1 >>;"],
"undo": ["remove(lrn_1);", "remove(lrn_2);"],
"composition": [{
"id": "lrn_1",
"type": "point",
"coords": {
"x": -3,
"y": 3
}
}, {
"id": "lrn_2",
"type": "point",
"coords": {
"x": 3,
"y": -3
}
}]
},
"type": "object",
"apiVersion": "v2.51.0"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420611216,
"records": 1
},
"data": [
{
"response_id": "487",
"type": "graphplotting",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Image Association: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60013",
"type": "imageclozeassociation",
"description": "The student needs to choose the correct response for each blank ",
"img_src": "/static/images/clozeireland.jpg",
"response_positions": [
{
"x": "72",
"y": "18.8"
},
{
"x": "19.8",
"y": "49.5"
},
{
"x": "54.5",
"y": "46"
},
{
"x": "83.9",
"y": "47.7"
},
{
"x": "29.8",
"y": "81.4"
}
],
"possible_responses": [
"Athlone", "Belfast", "Dublin", "Cork", "Galway"
],
"instant_feedback": true,
"feedback_attempts": 2,
"validation": {
"valid_responses": [
[
"Belfast"
],
[
"Galway"
],
[
"Athlone"
],
[
"Dublin"
],
[
"Cork"
]
]
}
}
],
"responses": {
"60013": {
"value": [
"Belfast", "Galway", "Athlone", "Dublin", "Cork"
],
"type": "array",
"apiVersion": "latest"
}
},
"enable_logging": false
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1377330382
},
"data": [
{
"attempted": true,
"response_id": "60013",
"score": 5,
"type": "imageclozeassociation"
}
]
}
Image Cloze Math Formula: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"image": {
"alt": "Scalene triangle with sixty-four degree angle",
"src": "//assets.learnosity.com/organisations/1/triangle.png"
},
"is_math": true,
"response_container": {
"pointer": "",
"template": ""
},
"response_containers": [{
"template": "",
"pointer": "left",
"width": "100px",
"height": "20px"
}, {
"template": "",
"pointer": "right",
"width": "100px",
"height": "20px"
}],
"response_positions": [{
"x": 31.25,
"y": 0
}, {
"x": 72.75,
"y": 16.8
}],
"stimulus": "<p>Express the angles a & b in terms of simple arithmetic involving all the angles of the given triangle.</p><p>E.g. \(180-2\left(64\right)\)</p>",
"text_blocks": [],
"type": "imageclozeformula",
"ui_style": {
"type": "block-on-focus-keyboard"
},
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 2,
"value": [
[{
"method": "equivValue",
"value": "64",
"options": {
"decimalPlaces": 10,
"inverseResult": false,
"allowDecimal": true
}
}],
[{
"method": "equivValue",
"value": "52",
"options": {
"decimalPlaces": 10,
"inverseResult": false,
"allowDecimal": true
}
}]
]
}
},
"response_id": "554"
}
],
"responses" : {
"554": {
"value": ["64", "52"],
"responses": [
[],
[]
],
"type": "array",
"apiVersion": "v2.51.1"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1419317415,
"records": 1
},
"data": [
{
"response_id": ,
"type": ,
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Image Dropdown: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60014",
"type": "imageclozedropdown",
"description": "The student needs to select the correct response for each blank ",
"img_src": "/static/images/clozeireland.jpg",
"response_positions": [
{
"x": "71",
"y": "19.5"
},
{
"x": "21.1",
"y": "50.1"
},
{
"x": "54.6",
"y": "46.8"
},
{
"x": "83.9",
"y": "48.5"
},
{
"x": "31.7",
"y": "82"
}
],
"instant_feedback": true,
"feedback_attempts": 2,
"possible_responses": [
[
"Belfast", "Tullamore", "Madrid"
],
[
"London", "Galway", "Cork"
],
[
"Ennis", "Athlone"
],
[
"Arkow", "Dublin", "Sydney"
],
[
"Bantry", "Cork"
]
],
"validation": {
"valid_responses": [
[
"Belfast"
],
[
"Galway"
],
[
"Athlone"
],
[
"Dublin"
],
[
"Cork"
]
]
}
}
],
"responses": {
"60014": {
"value": [
"Belfast", "Galway", "Athlone", "Dublin", "Cork"
],
"type": "array",
"apiVersion": "latest"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420611347,
"records": 1
},
"data": [
{
"response_id": "60014",
"type": "imageclozedropdown",
"is_valid": true,
"score": 5,
"max_score": 5,
"attempted": true,
"error": null
}
]
}
Image Cloze Text: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60012",
"type": "imageclozetext",
"description": "The student needs to fill in the blanks ",
"img_src": "/static/images/clozeireland.jpg",
"response_positions": [
{
"x": "70.5",
"y": "19.8"
},
{
"x": "18",
"y": "50.4"
},
{
"x": "54.5",
"y": "47.1"
},
{
"x": "83.9",
"y": "48.7"
},
{
"x": "46.2",
"y": "82.1"
}
],
"instant_feedback": true,
"feedback_attempts": 2,
"case_sensitive": true,
"max_length": 8,
"validation": {
"valid_responses": [
[
"Belfast"
],
[
"Galway"
],
[
"Athlone"
],
[
"Dublin"
],
[
"Cork"
]
]
}
}
],
"responses": {
"60012": {
"value": [
"belfast", "galway", "cork", "dublin", "athlone"
],
"type": "array",
"apiVersion": "latest"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420607768,
"records": 1
},
"data": [
{
"response_id": "60012",
"type": "imageclozetext",
"is_valid": false,
"score": 0,
"max_score": 5,
"attempted": true,
"error": null
}
]
}
Match List: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"possible_responses": ["Australia", "United States", "England", "Ireland", "France"],
"stimulus": "<p>Match the cities to the parent nation.</p>",
"stimulus_list": ["London", "Dublin", "Paris", "Boston", "Sydney"],
"type": "association",
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": ["England", "Ireland", "France", "United States", "Australia"]
}
},
"response_id": "1328"
}
],
"responses": {
"1328": {
"value": ["England", "Ireland", "France", "United States", "Australia"],
"type": "array",
"updatedFormat": true,
"apiVersion": "v2.51.0"
}
},
"enable_logging": false
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420611282,
"records": 1
},
"data": [
{
"response_id": "1328",
"type": "association",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Math Formula V2 (Deprecated): Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"response_id": "125",
"is_math": true,
"stimulus": "<p>State the mass-energy equivalence</p>",
"template": "E=mc^2",
"text_blocks": [],
"type": "formulaV2",
"ui_style": {
"type": "floating-keyboard"
},
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [{
"method": "equivSymbolic",
"value": "E=mc^2",
"options": {
"ignoreOrder": false,
"inverseResult": false
}
}]
}
}
}
],
"responses" : {
"125": {
"value": "E=mc^2",
"type": "string",
"apiVersion": "v2.51.0",
"responses": []
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420611708,
"records": 1
},
"data": [
{
"response_id": "125",
"type": "formulaV2",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Multiple Choice Question: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60004",
"type": "mcq",
"options": [
{
"label": "Sydney",
"value": "sydney"
},
{
"label": "Perth",
"value": "perth"
},
{
"label": "Brisbane",
"value": "brisbane"
},
{
"label": "Darwin",
"value": "darwin"
}
],
"valid_responses": [
{
"value": "sydney",
"score": "50"
},
{
"value": "brisbane",
"score": "50"
}
],
"multiple_responses": true
}
],
"responses": {
"60004": {
"value": [
"sydney", "brisbane"
],
"type": "array",
"apiVersion": "v2023.2.lts"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420611831,
"records": 1
},
"data": [
{
"response_id": "60004",
"type": "mcq",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Number Line Association: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"is_math": true,
"labels": {
"frequency": 1,
"show_max": true,
"show_min": true
},
"line": {
"max": 10,
"min": 0
},
"points": ["A", "B", "C"],
"snap_to_ticks": true,
"stimulus": "<p>Place the markers with the following configurations:</p><ol><li>A has a value of 1.</li><li>B is 7 more than A.</li><li>C is 2 less than B.</li></ol>",
"ticks": {
"distance": 1,
"show": true
},
"type": "numberline",
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [{
"point": "A",
"position": "1"
}, {
"point": "B",
"position": "8"
}, {
"point": "C",
"position": "6"
}]
}
},
"response_id": "167"
}
],
"responses" : {
"167": {
"value": {
"A": {
"x": 1,
"y": 7
},
"B": {
"x": 8,
"y": 7
},
"C": {
"x": 6,
"y": 7
}
},
"type": "object"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420783344,
"records": 1
},
"data": [
{
"response_id": "167",
"type": "numberline",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Number Line Plot: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"line": {
"max": 10,
"min": -10,
"right_arrow": true,
"left_arrow": true
},
"stacked_elements": 2,
"stimulus": "Insert a segment from 4 to -9",
"ticks": {
"distance": 3,
"minor_ticks": 2,
"show": true
},
"type": "numberlineplot",
"ui_style": {
"width": "370px"
},
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [{
"type": "segment",
"point1": 4,
"point2": "-9"
}]
}
},
"response_id": "380"
}
],
"responses" : {
"380": {
"value": [{
"type": "segment",
"point1": 4,
"point2": -9
}],
"type": "array"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1421013094,
"records": 1
},
"data": [
{
"response_id": "380",
"type": "numberlineplot",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Order List: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60016",
"ui_style": "button",
"type": "orderlist",
"list": [
"cat",
"horse",
"pig",
"elephant",
"mouse"
],
"description": "Arrange these animals from smallest to biggest:",
"validation": {
"valid_response": [
4, 0, 2, 1, 3
],
"valid_score": "1",
"partial_scoring": "true",
"penalty_score": "0",
"pairwise": "0"
},
"instant_feedback": true
}
],
"responses": {
"60016": {
"value": [4, 0, 2, 1, 3],
"type": "array",
"apiVersion": "latest"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420611882,
"records": 1
},
"data": [
{
"response_id": "60016",
"type": "orderlist",
"is_valid": true,
"score": 5,
"max_score": 5,
"attempted": true,
"error": null
}
]
}
Short Text: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60002",
"type": "shorttext",
"description" : "The <strong>student</strong> needs to say what the capital of <u>Ireland</u> is.<br>Valid Responses:<dl><dt>Dublin</dt><dd>Score 1</dd><dt>Dubh Linn</dt><dd>Score 2</dd><dt>Baile Átha Cliath</dt><dd>Score 3</dd><dt>Baile Atha Cliath</dt><dd>Score 4</dd></dl>",
"valid_responses": [
{"value": "Dublin", "score":1},
{"value": "Dubh Linn", "score":2},
{"value": "Baile Átha Cliath", "score":3},
{"value": "Baile Atha Cliath", "score":4}
],
"instant_feedback": true,
"case_sensitive": true,
"character_map": true
}
],
"responses": {
"60002": {
"value": "Baile Atha Cliath",
"characterCount": 17,
"type": "string",
"apiVersion": "latest"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420612116,
"records": 1
},
"data": [
{
"response_id": "60002",
"type": "shorttext",
"is_valid": true,
"score": 4,
"max_score": 4,
"attempted": true,
"error": null
}
]
}
Simple Chart: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"chart_data": {
"data": [{
"x": "Comedy",
"y": 2
}, {
"x": "Action",
"y": 2
}, {
"x": "Romance",
"y": 2
}, {
"x": "Drama",
"y": 2
}, {
"x": "SciFi",
"y": 2
}],
"name": "Favourite movie type"
},
"stimulus": "<p>Plot the following:</p><table><tbody><tr><td>Comedy</td><td> </td><td>4</td></tr><tr><td>Action</td><td> </td><td>5</td></tr><tr><td>Romance</td><td> </td><td>4</td></tr><tr><td>Drama</td><td> </td><td>1</td></tr><tr><td>SciFi</td><td> </td><td>6</td></tr></tbody></table>",
"type": "simplechart",
"ui_style": {
"chart_type": "bar"
},
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [{
"x": "Comedy",
"y": 4
}, {
"x": "Action",
"y": 5
}, {
"x": "Romance",
"y": 4
}, {
"x": "Drama",
"y": 1
}, {
"x": "SciFi",
"y": 6
}]
}
},
"y_axis_label": "Y Axis",
"max_y_value": 10,
"response_id": "720"
}
],
"responses" : {
"720": {
"value": {
"data": [{
"x": "Comedy",
"y": 4,
"interactive": true
}, {
"x": "Action",
"y": 5,
"interactive": true
}, {
"x": "Romance",
"y": 4,
"interactive": true
}, {
"x": "Drama",
"y": 1,
"interactive": true
}, {
"x": "SciFi",
"y": 6,
"interactive": true
}],
"name": "Favourite movie type"
},
"type": "object",
"apiVersion": "v2.51.0"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420612185,
"records": 1
},
"data": [
{
"response_id": "720",
"type": "simplechart",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Simple Shading: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"type": "simpleshading",
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": {
"method": "byCount",
"value": 3
}
}
},
"canvas": {
"row_count": 3,
"column_count": 3
},
"stimulus": "<p>Use this model to solve the problem.</p><p>Click parts of the model to shade \(\frac{1}{3}\) of the whole model.</p>",
"is_math": true,
"response_id": "6463"
}
],
"responses" : {
"6463": {
"value": [
[1, 3],
[2, 2],
[1, 1]
],
"type": "array",
"apiVersion": "v2.51.0"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420612625,
"records": 1
},
"data": [
{
"response_id": "6463",
"type": "simpleshading",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}
Sort List: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"response_id": "60022",
"type": "sortlist",
"description": "In this question, the student needs to sort the events, chronologically earliest to latest.",
"list": [
"Russian Revolution", "Discovery of the Americas", "Storming of the Bastille", "Battle of Plataea", "Founding of Rome", "First Crusade"
],
"instant_feedback": true,
"feedback_attempts": 2,
"validation": {
"valid_response": [
4, 3, 5, 1, 2, 0
],
"valid_score": 1,
"partial_scoring": true,
"penalty_score": -1
}
}
],
"responses": {
"60022": {
"value": [
0, null, 3, null, null, null
],
"source": [
null, 1, 2, null, 4, 5
],
"type": "array",
"apiVersion": "v2023.2.lts"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420612678,
"records": 1
},
"data": [
{
"response_id": "60022",
"type": "sortlist",
"is_valid": false,
"score": -6,
"max_score": 6,
"attempted": true,
"error": null
}
]
}
Text Highlight: Sample Request
{
"security": {...},
"request": {
"questions": [
{
"template": "Then he <valid>walked</valid> slowly up the street, and then down again to the corner, still <valid>looking</valid> keenly at the houses. Finally he <valid>returned</valid> to the pawnbroker's, and, having <valid>thumped</valid> vigorously upon the pavement with his stick two or three times, he <valid>went</valid> up to the door and <valid>knocked</valid>. It was instantly <valid>opened</valid> by a bright-looking, clean-shaven young fellow, who <valid>asked</valid> him to <valid>step</valid> in.",
"type": "texthighlight",
"validation": {
"partial_scoring": true,
"valid_score": 1,
"responses": [{
"index": 8,
"value": "walked"
}, {
"index": 78,
"value": "looking"
}, {
"index": 119,
"value": "returned"
}, {
"index": 161,
"value": "thumped"
}, {
"index": 236,
"value": "went"
}, {
"index": 260,
"value": "knocked"
}, {
"index": 286,
"value": "opened"
}, {
"index": 345,
"value": "asked"
}, {
"index": 358,
"value": "step"
}]
},
"stimulus": "Highlight the verbs.",
"response_id": "6483"
}
],
"responses": {
"6483": {
"value": [{
"value": "walked",
"index": 8
}, {
"value": "looking",
"index": 78
}, {
"value": "returned",
"index": 119
}, {
"value": "thumped",
"index": 161
}, {
"value": "went",
"index": 236
}, {
"value": "knocked",
"index": 260
}, {
"value": "opened",
"index": 286
}, {
"value": "asked",
"index": 345
}, {
"value": "step",
"index": 358
}],
"type": "array",
"apiVersion": "v2.51.0"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420613079,
"records": 1
},
"data": [
{
"response_id": "6483",
"type": "texthighlight",
"is_valid": true,
"score": 9,
"max_score": 9,
"attempted": true,
"error": null
}
]
}
Token Highlight: Sample Request
{
"security": {...},
"request": {
"questions" : [
{
"stimulus": "<p>Which sentence or sentences imply that the cheetahs run fast?</p>",
"template": "<p><span class="lrn_token">Most cheetahs live in the wilds of Africa</span>. <span class="lrn_token">There are also some in Iran and northwestern Afghanistan</span>. <span class="lrn_token">The cheetah's head is smaller than the leopard's, and its body is longer</span>. <span class="lrn_token">This cat is built for speed</span>. <span class="lrn_token">Its legs are much longer than the leopard', allowing it to run at speeds of up to 70 miles per hour</span>! <span class="lrn_token">This incredible ability helps the cheetahs catch their dinner, which is usually an unfortunate antelope</span>. <span class="lrn_token">A cheetahs spots are simply black spots, not rosettes or circles</span>.</p>",
"tokenization": "custom",
"type": "tokenhighlight",
"validation": {
"scoring_type": "exactMatch",
"valid_response": {
"score": 1,
"value": [3, 4]
}
},
"response_id": "6535"
}
],
"responses" : {
"6535": {
"value": [3, 4],
"type": "array",
"apiVersion": "v2.51.0"
}
}
}
}
Sample Response
{
"meta": {
"status": true,
"timestamp": 1420613341,
"records": 1
},
"data": [
{
"response_id": "6535",
"type": "tokenhighlight",
"is_valid": true,
"score": 1,
"max_score": 1,
"attempted": true,
"error": null
}
]
}