
Revolutionizing the Experience of Writing and Student Learning

The Turnitin Team


If you ask anyone what's going on with Turnitin, most will know us for our plagiarism detection tools. That's something we have done for many years now, and we have remarkable market share in a very narrowly focused tool. Previously we have talked about the technology behind automated assessment and automated understanding of text; now I'll talk about the real-world impact it can have, and how you can actually shape the classroom.

"And when I talk about that, I usually think about it in two terms. One is impact, and the other is adoption."

"I would like to measure impact as: every student learning to communicate well in writing and able to express themselves in text. And that’s something that broadens the access to communication, to technology, and to the various things that society can provide to students by giving them a communication medium that they have not previously had. They simply have been locked out of that. Revision Assistant and the tools that we’re building at Turnitin really changes the narrative around that, around access."

"And then when I talk about adoption, I think about: ‘what is it going to take to have teachers use technology?’ Natural language processing, artificial intelligence, machine learning... most English teachers, if you start talking to them about these facts, won’t get it. They don’t want to have that conversation because it sounds very much like they don’t have a role in their own classroom, that you can provide tools that replace them, that you are trying to really dehumanize the process of learning, and you run into that brick wall very quickly if you start talking about the technology behind what you do. So in my talk today, I’m going to focus a lot on how you can communicate the impact of automated tools to provide feedback on student writing, demonstrate to teachers that it’s something that they want in their classroom and that can help their students, and really get them engaged in the process and encourage them to adopt our tools in their classrooms."

"I have four points to make. The first is that the technology does matter, to a point."

"So, let’s talk about what the technology does. When we talk about automated essay scoring, fundamentally what we’re doing is collecting data, putting it into piles, and trying to understand how they're categorized so that when a computer comes out and tries to do that again, it can do that automatically. In order to do this, the first step is collecting data. For any given writing prompt, hundreds of papers, sorting them into buckets of 1s, 2s, 3s. Once you’ve collected that data, you need to pull out these features, finding the elements of vocabulary, of syntax, some higher-level concepts that you can target specific things that the student did within their text. You don’t know whether these are good or bad things, but they're enough to tell you a description of what’s going on, the different choices that the student made as an author. And with those choices that they’ve made, all of the features that you’ve collected, you can then find patterns."

"You can pull all this information out, build out thousands of dimensions of various features of a text, and then say, all of them summed together tell us something meaningful about which pile this essay should fall into. This is a “1” essay because it looks like all the previous “1” essays, or a “3.” And then when a new text comes in, you can assign that to it by extracting those same features and doing a similarity match, essentially."

"It’s not rocket science. It’s really trying to take a description of a text and match it to what most closely mirrors it, and you can get to an automated scoring engine that does that for text pretty quickly, but that doesn’t get you very far."

"So the next thing that we have to do after that is talk about what they did to get to that score. You need to actually be able to tell them, you know, it’s not that you got this score because you had lots of words; you didn’t write a lot; you didn’t use GRE vocabulary or anything else to get to that point; here are the choices you made, the areas of your text that you should focus on as things that you did particularly well or have a lot of room for improvement. In order to do that, we need to break the essay into chunks, individual sentences or phrases, in some cases paragraphs, and try and say what is it that this text is doing differently from the rest of your essay. Once you do that, you can say for any given chunk, a sentence at a time, here is what the system thought about this sentence. It doesn’t matter about any one particular feature, but there might be 50 or 100 or 150 features that appeared here, and overall, this text, looked like a 2 overall, but there are elements of it that are more 1-like or more 3-like, and you can do that for every individual sentence."

"And then you can start pulling that out and saying well, there’ s a couple that look a lot more like a 1 than they do like a 2 or a 3 compared to the rest of this text. That’s probably an area of weakness for the student, someplace that they should focus their time in. Similarly, if there are subsets of their text, paragraphs or sentences that look much more like a 3, then we can tell them, 'this is something that you did, right, you should be somehow proud of this, or you should focus on it to see what is it that I can do differently in the rest of my text.'"

"Once you have these outliers, then you can start thinking about what you would say about them on a given rubric trait. We want to focus you on something actionable that you can do here in your sentence. This, again, ties back into making the students want to keep writing and getting that immediate feedback."

"Then it gets a little messier because we’ve now generated all of these outliers and have generated comments for lots of them. So you then need to add that element of structure, and then you also need to constrain for tone and for balance, talking about only doing a couple things at a time, not overwhelming them with 20 bits of feedback. Every student has received a paper full of red ink and just thrown it out—not worth their time. And then also balancing for positive and negative tone, saying that we want to give you a couple things you did really well and a couple things that you could work on, and encouraging them to go through that process multiple times, focus on their text in one place, revise in one area, get more feedback, iterate. Think about writing as a process so that, because of our intervention, students now know that their text is a living document."

Motivation to iterate and improve on your writing is something that is really hard... and it's something that our feedback really does have an impact on.

"It’s not something where you start at the top and go to the bottom; instead, it’s something where they are really thinking about, ‘how did I write this previously, what could I do differently, how could I iterate over time to get to something that makes a lot of sense.’ And we think that the biggest thing that we can do as a company with our product is encourage students to think about writing this way, encourage them to edit, to revise to iterate, and to improve on their own, make them confident in their abilities, give them self-efficacy so that they can build up their own writing, and then feel motivated to actually want to do so. Motivation to iterate and improve on your writing is something that is really hard, especially when you're looking at a blank Word document, and it’s something that our feedback really does have an impact on."

"So that leads to my second point, which is: how you define success matters."

"It’s not just the technology. It’s measuring what is it that the student changed about their behavior. The really cool thing about our product is that it works. Isn’t always true in ed tech. So, what does it mean that it works? So I have at least four answers."

"The first of those is reliability. This is what a lot of the literature has focused on for many years: can you reproduce the assessment of humans? It requires relatively sophisticated technology, but the answer is, ‘yes,’ you can get to a point that you reliably assess text, make roughly the same level of errors that you would as a human, and you can do that with relatively small training set sizes."

"So the next is in the product that you’ve built: do the students actually use the software? That’s the next tier of success— engagement. We can see that they're writing many drafts; and we can see that as they're writing and receiving more feedback; their word count improves; they're getting more practice writing, and they keep using it for the majority of the class period and even come back and log in from home. On average, we start at around 160, 170 words in a student’s first draft, and they go up to 350 words by the time they submit, and this is middle school students, grade 6 through 8, and that’s good."

"The next thing that you would want to see is: does the writing get better, are they actually improving the quality of the writing? And we can look at that too. Students using Revision Assistant, from their first draft to the time they submit it, on average, go from a 2/4 to a 3/4 on a given rubric trait. It means that the writing quality at least intrinsically is getting better as they use the product."

With Revision Assistant, 94% of students got feedback at least once and then changed their essay prior to submission... that jump from 29% to 94% is pretty significant.

"But then the thing that I actually care the most about is behavior change. How do students interact with their writing process? The best baseline that we could find suggests that students that don’t have Revision Assistant: about 29% of them revise their essay before they submit it. With Revision Assistant, 94% of students got feedback at least once and then changed their essay prior to submission, so that means a minimum of two drafts, and that jump from 29% to 94% is pretty significant."

"We see that on average in middle schools students write about 11 drafts before submitting. In high schools, grades 9 through 12, they write 8 drafts before submitting. They're thinking about writing differently, and they're 13. That’s something that we’ve been able to change very early in a student’s educational life."

"We’ve seen that Revision Assistant changes student behaviors. What we’ve discovered as we’ve spent all this time building the product is that content matters more than either the technology or the metrics: the actual writing activities that the students are doing, the feedback that you're giving them, all of this."

Students are thinking about writing differently... That's something that we've been able to change very early in a student's educational life.

"So my last point is that audience awareness matters above everything else that I just talked about. When we go into a school and talk about Revision Assistant, what we talk about is the classroom experience, that we’ve spent hours and hours and hours of our time sitting in a classroom, watching students on laptops, on iMacs, using Revision Assistant, bouncing ideas off each other. We’ ve seen where it’s screwed up, where the software wasn’t doing what we wanted, and we changed it because we looked at a 14-year-old struggling to improve their essay because of it."

"We talk about the teachers that we’ve hired to work on this. These are four of the 20 or so people in our office that actually have been teachers for years of their lives, that we’ve pulled out of Penn State, out of a middle school, out of Baltimore city high schools, people that have that classroom experience, years of time in front of a batch of students and now want to work on improving the content and the technology that actually will help out a much larger group of teachers nationwide, and even worldwide."

"Your product, what you’ve built out of it, doesn’t replace the teacher, doesn’t change the conversation in a way that leads students to be totally independent thinkers. It enables teacher-student conversations. It means that when a teacher goes to a student and asks, ‘how are you doing?’ that, instead of staring blankly, the student will say, ‘well, I’m on my third draft and this feedback is coming back. I’m not sure what to do with it. Could you help me out?’ I’ve seen that happen. Fundamentally, that’ s what we want Revision Assistant to do. We want it to change the way teachers interact with their students and create a better writing classroom."

"So with that, the lessons learned: Technology does matter, to a point, and your metrics for defining success matter quite a bit. But the content matters more than any of that, and above all else, if you want teachers to actually use your technology, talk to them like teachers and talk about the classroom that they're in, and that’s where you get the engagement, and that’s where you get the experience to change."

Hear the Q&A and watch the full presentation.

Related content:

Motivating Students to Revise More Leads to More Practice, Greater Success
Seeing the Light! Turnitin Acquires LightSide Labs to Enable Students, Empower Educators and Improve Instruction