Page 22 - Campus Technology, October/November 2018

assessments. The program flags potential cheating and generates a report for the instructor.
Turnitin is also in use at both Miami and Deakin. This widely used plagiarism detection tool checks student papers against a multitude of sources (including papers already submitted by other students) to identify unoriginal work. At Deakin, according to Sutherland-Smith, students upload their papers to the service, which generates a “similarity report.” This report, which instructors may require students to hand in with the assignment, includes a similarity score summarizing the amount of matching or similar text found in the paper.
But technology isn’t a failsafe, Sutherland-Smith added, especially when a faculty or staff member isn’t fully trained in its use. She first learned of Turnitin in 2001, during a university pilot to test the service. “Part of what we found in that pilot trial was that staff would misinterpret the information they were getting. We found that staff would just look at the total percentage and go, ‘All right. It’s a 75 percent text match; [that means the paper is] 75 percent plagiarized.’” In reality, many factors can lie behind that number.
As a guide on the Turnitin website itself laid out, “If the student has used quotes and has referenced correctly, there will be instances where we will find a match.” Or the user may not exclude “small sources” or “quotes and bibliography items.” Sometimes, added Sutherland-Smith, a student will
use content that the software correctly identifies as a match with another source but that is exactly what the instructor told students to use. All those nuances feed into a score that could raise suspicions.
“The tool was never, ever about plagiarism detection,” she asserted. “The tool’s only ever about providing evidence from text matching. From there it has to be a human decision.”
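Turnitin’s matching algorithm is proprietary, so the following is only a toy illustration of Sutherland-Smith’s point: a naive text-matching score counts a properly quoted, correctly cited passage exactly as heavily as a copied one, which is why the percentage alone cannot establish plagiarism. All names and the scoring formula here are illustrative assumptions, not Turnitin’s actual method.

```python
import re

def ngrams(text, n=5):
    """Return the set of word n-grams in a text (case- and punctuation-insensitive)."""
    words = re.findall(r"[a-z]+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def match_score(paper, source, n=5):
    """Fraction of the paper's word n-grams that also appear verbatim in the source.

    Note what this does NOT do: it cannot tell a quoted, cited passage
    apart from an unattributed copy. That judgment is left to a human.
    """
    paper_grams = ngrams(paper, n)
    source_grams = ngrams(source, n)
    if not paper_grams:
        return 0.0
    return len(paper_grams & source_grams) / len(paper_grams)

source = "the quick brown fox jumps over the lazy dog near the river bank"
# A paper that quotes the source verbatim, with attribution:
paper = 'As Smith notes, "the quick brown fox jumps over the lazy dog near the river bank"'
print(match_score(paper, source))  # high score despite the correct citation
```

Even with a proper citation, the score is high (0.75 here), so a staff member reading only the percentage would flag a perfectly legitimate paper.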
A Better Alternative: Authentic Assessment
Another part of the solution, both educators noted, is to design assessments that are harder for contract cheating companies to produce. One way to do that is to make the assignment “authentic and local,” said Sutherland-Smith. It might require the student to use locally gathered data or information that’s hard for contract cheating companies to obtain in time to complete a given assignment, she offered.
Mays, who teaches Excel courses, has modified his assignments to force the inclusion of something original. He’ll have the students create and customize a chart based on the data in a table and then have them write an essay that analyzes the results. “I have caught several students who have totally copied everything,” he said. “Since I grade everything in one sitting, I can remember when one chart looks similar, which makes life easier on the detection end.” Or he might ask students to create a macro and do a save-as to record the full