
Module ExamGrader

To start, you'll want to define some graders and a mapping from the IDs of the questions you're using to those graders. You might put these in their own files, in the same file as the question definitions, or somewhere else. Whatever organization you like is fine, as long as you can eventually import the graders you define into your grading script.

The association between a question ID and the grader that handles that question is done with a GraderMap.

src/rubric/tf.ts

import { GraderMap, SimpleMCGrader } from "examma-ray";

export const TF_Graders : GraderMap = {
  "sp20_mc_time_complexity_1" : new SimpleMCGrader(0),
  "sp20_mc_time_complexity_2" : new SimpleMCGrader(0)
};

src/rubric/s7_3.ts

import { GraderMap, StandardSLGrader } from "examma-ray";

export const S7_3_Grader : GraderMap = {
  "sp20_7_3_assn_op": new StandardSLGrader([
    {
      title: "Function Header",
      description: `Function header has correct name, parameter, and return ...`,
      points: 1,
      required: [1],
      prohibited: [0]
    },
    {
      title: "Self-Assignment Check",
      description: "The function should compare the `this` pointer, which ...",
      points: 0.5,
      required: [3],
      prohibited: [2]
    },
    ...

and so on...

Then, set up a top-level grading script to create an ExamGrader, register your graders with it, load exams, grade the exams, and write out reports:

src/grade.ts

import { ExamGrader } from "examma-ray";
import { exam } from "./exam-spec";
import { TF_Graders } from "./rubric/tf";
import { S7_3_Grader } from "./rubric/s7_3";

let grader = new ExamGrader(exam, [
  TF_Graders,
  S7_3_Grader
]);

grader.loadAllSubmissions();
grader.gradeAll();
grader.writeAll();

Note the import of exam in the example above. This comes from the exam specification you've created in a separate file. TODO link to that documentation.
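
For reference, here is a minimal sketch of what that file might contain, assuming the Exam.create() factory. The exam ID and title are illustrative placeholders, and other required specification fields (e.g. instructions) are elided; consult the exam specification documentation for the real options.

src/exam-spec.ts

import { Exam } from "examma-ray";

// Illustrative sketch only. The exam_id and title are made up, and
// additional required specification fields are elided here.
export const exam = Exam.create({
  exam_id: "sp20_final",
  title: "Spring 2020 Final Exam",
  sections: [
    // ...your section specifications...
  ]
});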

You might also have some questions (e.g. open-ended code writing) that require manual grading. Calling gradeAll() won't fully grade those, but it will trigger the appropriate graders to create grading assignment files. Once those are filled in, just run the grading script again and it will pick up the human-generated results from those files.

Several graders are currently supported:

  • FreebieGrader - Gives points to everyone (or, optionally, to all non-blank submissions); see the sketch after this list
  • SimpleMCGrader - Grades an MC question with one right answer
  • SummationMCGrader - Grades a multiple-select MC question where each selection is worth positive or negative points
  • FITBRegexGrader - Uses regular expressions to grade each blank in an FITB question. Also comes with an interface for human review of unique answers
  • StandardSLGrader - Grades SL ("select-a-statement") questions based on which lines should/shouldn't be included
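
For example, a GraderMap entry using FreebieGrader might look like the following sketch. This assumes FreebieGrader's constructor takes the number of points to award, and the question ID is hypothetical; check the grader's own documentation for the exact parameters, such as how grading only non-blank submissions is enabled.

import { GraderMap, FreebieGrader } from "examma-ray";

// Sketch: award 2 points for this question to every submission.
// (Assumes the constructor takes the points to award; the question
// ID below is a made-up example.)
export const Freebie_Graders : GraderMap = {
  "sp20_extra_credit_1" : new FreebieGrader(2)
};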

The format for the graders looks like JSON, but it's actually TypeScript code defining an object literal, so autocomplete, etc. should be available in VS Code.

For the FITBRegexGrader, you'll need to be familiar with JavaScript regular expression syntax.

  • Tutorial/Documentation at https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions
  • https://regex101.com/ is a really neat interactive tool for testing out regexes. Make sure to select the "ECMAScript/Javascript" flavor on the left side.
  • Tip: Normally, the regex will match against any substring of what the student entered. If you want it to match only the WHOLE thing, use ^ and $. For example, if you're looking to match any decimal number, /[\d\.]+/ will match 6.2 as well as My answer is 6.2, whereas /^[\d\.]+$/ will match only 6.2. Essentially, ^ means "beginning of string" and $ means "end of string". (See the sketch below.)
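
The following sketch demonstrates this anchoring behavior with plain ECMAScript regular expressions, independent of any particular grader configuration:

// Unanchored: matches a decimal number anywhere in the string.
const unanchored = /[\d\.]+/;
// Anchored: matches only if the entire string is a decimal number.
const anchored = /^[\d\.]+$/;

console.log(unanchored.test("6.2"));               // true
console.log(unanchored.test("My answer is 6.2"));  // true (matches the "6.2" substring)
console.log(anchored.test("6.2"));                 // true
console.log(anchored.test("My answer is 6.2"));    // false (extra text around the number)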

For now, refer to examples of existing graders. More thorough documentation coming.

Index

Type aliases

ExamGraderOptions: { frontend_js_path: string; frontend_assets_dir: string; uuid_strategy: UUID_Strategy; uuidv5_namespace?: string }

Type declaration

  • frontend_js_path: string
    Deprecated: this is ignored.

  • frontend_assets_dir: string
  • uuid_strategy: UUID_Strategy
  • Optional uuidv5_namespace?: string
ExamGraderSpecification: Partial<ExamGraderOptions>
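
For illustration, an ExamGraderSpecification might look like the sketch below. This assumes "uuidv5" is a valid UUID_Strategy value (suggested by the uuidv5_namespace option); the namespace UUID shown is a made-up placeholder.

import { ExamGraderSpecification } from "examma-ray";

// Sketch only: "uuidv5" is assumed to be a valid UUID_Strategy, and
// the namespace UUID below is a placeholder, not a real value.
const grader_spec : ExamGraderSpecification = {
  uuid_strategy: "uuidv5",
  uuidv5_namespace: "00000000-0000-0000-0000-000000000000"
};
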
GraderMap: {}

A mapping of question ID to grader.

Type declaration

  • [index: string]: QuestionGrader | undefined
GraderSpecificationMap: {}

A mapping of question ID to grader specification.

Type declaration

  • [index: string]: GraderSpecification | undefined
Exception: { explanation: string; adjustedScore?: number; pointAdjustment?: number }

An exception including an adjusted score and an explanation of why the exception was applied.

Type declaration

  • explanation: string
  • Optional adjustedScore?: number
  • Optional pointAdjustment?: number
ExceptionMap: {}

A mapping from (uniqname, question id) to any exceptions applied for that student for that question. The question's display index (e.g. "3.2") may be used in place of the question ID. Only one exception may be specified per student/question pair.

Type declaration

  • [index: string]: { [index: string]: Exception | undefined } | undefined
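
For illustration, here is a sketch of an ExceptionMap consistent with the description above. The nested shape (uniqname, then question ID) is inferred from that description, and the uniqname and question ID used here are hypothetical.

import { ExceptionMap } from "examma-ray";

// Sketch: the outer key is a student's uniqname, the inner key is a
// question ID (or display index like "3.2"), and the value is the
// Exception to apply. Names here are made up for illustration.
export const Exceptions : ExceptionMap = {
  "student1": {
    "sp20_7_3_assn_op": {
      explanation: "Answered on scratch paper due to a technical issue; score adjusted.",
      adjustedScore: 8
    }
  }
};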

Generated using TypeDoc