New lessons, planning decisions, and ways of engaging students involve, in a sense, carrying out experiments. These “experiments” are a positive and integral part of a teacher’s professional development, but what’s missing from these everyday classroom innovations?

To know whether a new strategy works, students must be randomly assigned to treatment conditions. Without random assignment, teachers are left making comparisons based on subjective reflection, or on confounded comparisons with other cohorts. This can make ineffective strategies seem to work, or cause effective strategies to be overlooked. Random assignment, by contrast, lets a teacher or researcher infer without bias whether the strategy caused improvements.
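As an illustration of what random assignment involves (Terracotta does this for you; the roster, condition names, and function below are hypothetical), a minimal Python sketch might look like:

```python
import random

def assign_conditions(student_ids, conditions, seed=42):
    """Randomly assign each student to one condition, keeping group sizes balanced."""
    rng = random.Random(seed)   # fixed seed so the assignment is reproducible
    shuffled = student_ids[:]
    rng.shuffle(shuffled)
    # Deal shuffled students round-robin into conditions,
    # so group sizes differ by at most one
    return {sid: conditions[i % len(conditions)] for i, sid in enumerate(shuffled)}

roster = ["s01", "s02", "s03", "s04", "s05", "s06"]
groups = assign_conditions(roster, ["retrieval-practice", "re-reading"])
```

Because assignment is determined by a shuffle rather than by who volunteered or which class period a student is in, any later difference between the groups can be attributed to the treatment rather than to pre-existing differences.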
Terracotta makes it easy to meet the standards of responsible, ethical experimentation:
Students should be assigned to experimental variants of assignments, rather than deprived of instruction in no-treatment control groups.
Whenever possible, research should favor within-subject crossover designs where all students receive all treatments (eliminating any bias due to condition assignment).
Students should be empowered to provide informed consent.
Student data should be deidentified prior to export and analysis.
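The within-subject crossover design mentioned above means every student receives every treatment, with the order counterbalanced across students so that order effects cancel out. As a hedged sketch of the general idea (the function and names below are illustrative, not Terracotta's implementation):

```python
import random
from itertools import permutations

def crossover_schedule(student_ids, treatments, seed=7):
    """Give every student every treatment, counterbalancing order across students."""
    rng = random.Random(seed)
    orders = list(permutations(treatments))  # every possible treatment order
    shuffled = student_ids[:]
    rng.shuffle(shuffled)
    # Cycle through the orders so each sequence is used roughly equally often
    return {sid: orders[i % len(orders)] for i, sid in enumerate(shuffled)}

schedule = crossover_schedule(["s01", "s02", "s03", "s04"], ["A", "B"])
```

Because no student is ever excluded from a treatment, this design avoids both the ethical problem of withholding instruction and the statistical noise introduced by between-group differences.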

Educators, instructional designers, and technologists are increasingly expected to show evidence for the claims they make about educational interventions.

Terracotta makes it easy to link claims about education interventions back to evidence in two important ways:

Mapping Outcomes to Assignments

Experiments examine how different treatments affect relevant outcomes, such as learning outcomes (like exam scores) and behavioral outcomes (like attendance). Terracotta allows instructors to import outcome data, identify it in the gradebook, or enter it manually, either after each treatment event (for within-subject crossover designs) or at the end of the experiment (for between-subject designs).

Exporting De-identified Data from Consenting Participants

At the end of the term, Terracotta allows the export of all study data, with student identifiers replaced by random codes and with non-consenting participants removed. The export includes: 1) condition assignments, 2) scores on manipulated learning activities, 3) granular clickstream data from interactions with Terracotta assignments, and 4) outcome data as entered by the instructor. By joining these data, de-identifying them, and removing non-consenting participants, Terracotta prepares an export that is shareable with research collaborators (it contains no personally identifiable information) and that meets ethical requirements (it includes only data from consenting students).
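To make the two export steps concrete (Terracotta performs them automatically; the record structure, field names, and salted-hash scheme below are illustrative assumptions, not Terracotta's actual mechanism), a minimal Python sketch might look like:

```python
import hashlib

def prepare_export(records, consenting_ids, salt="study-salt"):
    """Drop non-consenting students, then replace identifiers with opaque codes."""
    export = []
    for rec in records:
        if rec["student_id"] not in consenting_ids:
            continue  # ethical requirement: export only consenting participants
        # Salted hash yields a stable code that cannot be reversed to the student id
        code = hashlib.sha256((salt + rec["student_id"]).encode()).hexdigest()[:8]
        clean = {k: v for k, v in rec.items()
                 if k not in ("student_id", "name", "email")}
        clean["participant_code"] = code
        export.append(clean)
    return export

records = [
    {"student_id": "s01", "name": "Ana", "condition": "A", "score": 88},
    {"student_id": "s02", "name": "Ben", "condition": "B", "score": 91},
]
export = prepare_export(records, consenting_ids={"s01"})
```

Because the same student id always hashes to the same code, rows from different tables (condition assignments, scores, clickstream events) can still be joined after de-identification, without exposing who the participant is.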

Terracotta eliminates the barriers that keep teachers and researchers from conducting experimental research on instructional practices.

Curious about Terracotta?

We're here to help.