Massive changes coming to Tinder and other dating apps under a world-first plan to protect women from sex predators
- Dating apps are set to bring in reforms designed to increase the safety of users
- Match Group, owners of Tinder, have been working on changes with police
- Under reforms the company could use artificial intelligence to scan users’ chats
Dating apps will use artificial intelligence to ‘red flag’ potentially threatening users to protect women from sexual assault.
American holding company Match Group owns 27 apps including Hinge, Tinder, and Plenty of Fish and has been working with NSW Police on a plan to improve safety.
Users who might pose a danger to others on the platforms will be flagged and any assaults reported via the apps will be immediately forwarded to police.
The owners of dating apps like Tinder are working on a plan to increase safety for users on their platforms (stock image)
Match Group will establish a software portal that will feed any sexual assault reports – along with information linked to the case such as conversation histories – to police.
‘They have been very good at recognising that their brand can be damaged if they don’t support victims if there has been an incident that has stemmed from meeting on one of their apps,’ Detective Superintendent Stacey Maloney told The Daily Telegraph.
The changes address two of the biggest concerns that have been raised about the safety of dating apps.
Apps such as Tinder currently don’t refer reports of assault or stalking behaviour directly to police; instead they provide complainants with generic written responses or call centre ‘case workers’.
Victims often do not then pursue the matter with police, so having the platforms forward complaints directly to authorities increases the chances of conviction.
The company could use artificial intelligence to scan the conversations between users (stock image)
The other concern is the ‘unmatch’ function on apps like Tinder, which perpetrators can use to block their victims and wipe their digital interactions – along with possible evidence.
Match Group is working with police on potentially using its artificial intelligence systems to scan conversations for ‘red flags’ and then document them.
‘Very coercive and forceful behaviour that we see in law enforcement in offenders… if we can [record] that throughout the course of them being on those apps… in the event something does occur,’ Det Supt Maloney said.
The company is also considering hiring law enforcement liaisons whose sole job would be to streamline communication and information sharing with police.