Facial recognition and text analysis could keep women from ever meeting an attacker.
There was no way for Katia to know the man she’d met online was planning to rape her. Like many women using online dating apps and websites, she was cautious and did everything right.
On the night of their first date, the college graduate told her mother and best friend where she would be. She presumed she would be safe in the public setting of a city bar, and she was, for a time.
Katia continued to see the man, text him, and speak to him on the phone. As he gained her trust, he learned her home address. It wasn’t until after the fourth date that things started to go wrong.
One night the man she’d met online, and had considered up until that point to be a “cool guy,” showed up at her apartment with a bottle of wine laced with Rohypnol and raped her.
Katia was beaten and threatened at knifepoint in a five-hour aggravated rape and assault. She survived the ordeal, but her life was torn apart. And sadly, even in retrospect, there had been no way to predict the attack. Until now.
Innovative reverse image searching
Named after one of the many victims of aggravated rape, KATIA aims to stop women from ever meeting an attacker. Responding to the flaws of traditional background checks, which require both first and last names, the program takes advantage of the assets women often have: pictures and text.
An evidence-based rape screening tool, the program combines two techniques, reverse image searching and text analysis, implemented in R, a language for statistical computing and graphics, to highlight potential dangers.
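The article does not describe how KATIA’s reverse image search is implemented. One common building block for matching a profile photo against a database of known images is a perceptual “average hash”, which stays stable under small edits such as brightening or recompression. The sketch below is purely illustrative, written in plain Python rather than the program’s R, and the function names (`average_hash`, `is_probable_match`) and the distance threshold are assumptions, not KATIA’s actual code.

```python
# Illustrative perceptual-hash sketch (NOT KATIA's implementation).
# An image is reduced to an 8x8 grid of grayscale values; each pixel
# contributes one bit depending on whether it is above the grid's mean.
# Similar images produce hashes that differ in only a few bits.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Shift in a 1 for above-average pixels, a 0 otherwise.
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_probable_match(hash_a, hash_b, threshold=10):
    # Hashes within a small Hamming distance suggest the same source image,
    # even after minor edits; the threshold here is an arbitrary example.
    return hamming(hash_a, hash_b) <= threshold
```

For example, a photo and a slightly brightened copy of it hash to nearly identical values and are flagged as a probable match, while an unrelated image is not. A real system would first decode and downscale the uploaded photo before hashing.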
Image source: NEC Corporation of America, under a Creative Commons license