Case studies

Have you ever wondered…?

At Terracotta, we spend a lot of time wondering, and we're guessing you do, too!

Terracotta can help you test how well a new assignment idea (or even a district-wide intervention) works; whether an instructional practice will work in multiple contexts; which of two strategies is more helpful to students; and how an assignment affects students' critical thinking skills. This is hardly an exhaustive list, but read on to learn what other researchers and teachers have wondered, and how they've used Terracotta to explore those questions.
Case study

To what extent does this assignment help students?

Let’s say you’re a teacher, and you’re wondering if a retrieval practice assignment will improve students’ performance on the midterm exam. In Terracotta, you can build the experiment and link it to the midterm exam outcome.
Case study

How well does a department- or district-wide intervention work?

District administrators have been talking about recent research on mindset interventions in STEM courses and want to see how well such an intervention works in their own context. Rather than relying on anecdotal evidence, they decide to create the experiment in Terracotta, link it to outcomes, and contribute to the broader conversation.
Case study

What can I learn from an experiment in an authentic classroom setting?

As a researcher, you’re always on the lookout for a tool that can help you access high-quality data, but you’re well aware of the difficulties of conducting responsible experiments in classroom contexts. With Terracotta, you can build the experiment, integrate informed consent into the process, run the experiment, and, when it’s done, access your deidentified data, with students who opted out automatically excluded from the data export.
Case study

How well does an instructional practice work in different contexts?

One research team sought to determine the effect of an instructional practice (in this case, prequestions) when its implementation might vary between classes. To do so, the team used Terracotta to run a multi-site, within-subject randomized controlled experiment across 26 diverse classes, with students ranging from 6th graders to college seniors, examining the generalizable effect of prequestions on student learning from online media. Terracotta made this otherwise unwieldy study efficient and expanded the scope of data collection to include raw clickstream interactions with learning materials, item-level assessment responses, and more. Because Terracotta also standardizes these data, deidentifies them, and automatically removes nonconsenting participants, the team could post all the raw data publicly, without manipulation, in a common, well-documented format.
Case study

Which strategy works best for students?

Researchers used Terracotta to manipulate online review assignments so that consenting students alternated, on a weekly basis, between taking multiple-choice quizzes (retrieval practice) and reading the answers to those quizzes (restudy). Students' performance on subsequent exams was significantly better for items that had appeared in retrieval practice review assignments. All materials, data, and analyses are publicly available at https://osf.io/yrbhe/, and you can read the full study at https://link.springer.com/article/10.3758/s13428-023-02164-8.
Case study

How does a virtual exchange affect students' critical thinking skills?

A business professor at a US university wanted to test whether a virtual exchange experience affected students’ intercultural communication and critical thinking skills. To do so, she and a colleague from a university in Ecuador conducted a common activity with their students over two Zoom sessions, and the professor used Terracotta to administer surveys before and after the exchange. She found that her students at the US university did improve their intercultural communication and critical thinking skills through the virtual exchange program.

Terracotta helps you build the experiment and gives you the data you need when it's complete.