Accused of Cheating by an Algorithm, and a Professor She Had Never Met

Dr. Orridge did not respond to requests for comment for this article. A spokeswoman from Broward College said she could not discuss the case because of student privacy laws. In an email, she said faculty “exercise their best judgment” about what they see in Honorlock reports. She said a first warning for dishonesty would appear on a student’s record but not have more serious consequences, such as preventing the student from graduating or transferring credits to another institution.

Honorlock hasn’t previously disclosed exactly how its artificial intelligence works, but a company spokeswoman revealed that it performs face detection using Rekognition, an image analysis tool that Amazon started selling in 2016. The Rekognition software looks for facial landmarks — nose, eyes, eyebrows, mouth — and returns a confidence score that what is onscreen is a face. It can also infer the emotional state, gender and angle of the face.
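For readers curious what that output looks like, here is a minimal sketch of a call to Rekognition’s DetectFaces API using boto3, the AWS SDK for Python. The region and file name are placeholders; this illustrates Amazon’s public API, not Honorlock’s internal pipeline.

```python
import boto3

# Rekognition's DetectFaces API, via boto3 (the AWS SDK for Python).
# The region and file name here are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("webcam_frame.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request landmarks, emotions, gender, pose
    )

for face in response["FaceDetails"]:
    # Confidence (0-100) that the detected region is actually a face.
    print("face confidence:", face["Confidence"])
    # Facial landmarks -- eyes, nose, eyebrows, mouth -- as X/Y coordinates.
    for lm in face["Landmarks"]:
        print(lm["Type"], lm["X"], lm["Y"])
    # The inferred attributes mentioned in the article.
    print("gender:", face["Gender"]["Value"])
    print("top emotion:", max(face["Emotions"], key=lambda e: e["Confidence"])["Type"])
    print("pose (angle of the face):", face["Pose"])  # Roll, Yaw, Pitch in degrees
```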

Honorlock will flag a test taker as suspicious if it detects multiple faces in the room, or if the test taker’s face disappears, which could happen when people cover their face with their hands in frustration, said Brandon Smith, Honorlock’s president and chief operating officer.
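As a rough illustration of the rule Mr. Smith describes, the flagging logic might look something like the sketch below. The function name and structure are hypothetical, not Honorlock’s actual code; `face_details` is assumed to be the `FaceDetails` list from a DetectFaces response like the one above.

```python
# Hypothetical sketch of the flagging rule described above; illustrative
# only, not Honorlock's actual implementation.
def flag_frame(face_details):
    """Return suspicion flags for one webcam frame's face-detection output."""
    flags = []
    if len(face_details) > 1:
        # More than one face in view -- possibly a second person in the room.
        flags.append("multiple_faces")
    elif len(face_details) == 0:
        # The test taker's face disappeared, e.g. hands covering it in frustration.
        flags.append("face_missing")
    return flags
```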

Honorlock does sometimes use human employees to monitor test takers; if an exam generates a high number of flags, “live proctors” will pop in by chat to find out what is going on. Recently, these proctors discovered that Rekognition was mistakenly registering faces in photos or posters as additional people in the room.

When something like that happens, Honorlock tells Amazon’s engineers. “They take our real data and use it to improve their A.I.,” Mr. Smith said.

Rekognition was supposed to be a step up from what Honorlock had been using. A previous face detection tool from Google was worse at detecting the faces of people with a range of skin tones, Mr. Smith said.

But Rekognition has also been accused of bias. In a series of studies, Joy Buolamwini, a computer researcher and executive director of the Algorithmic Justice League, found that gender classification software, including Rekognition, worked least well on darker-skinned females.
