
Face-Reading AI Will Tell Police When Suspects Are Hiding Truth

American psychologist Paul Ekman’s research on facial expressions spawned a whole new career of human lie detectors more than four decades ago. Artificial intelligence could soon take their jobs.

While the U.S. has pioneered the use of automated technologies to reveal the hidden emotions and reactions of suspects, the technique is still nascent, and a flock of entrepreneurial ventures is working to make it more efficient and less prone to false signals.

Facesoft, a U.K. start-up, says it has built a database of 300 million images of faces, some of which have been created by an AI system modeled on the human brain, The Times reported. The system built by the company can identify emotions like anger, fear and surprise based on micro-expressions, which are often invisible to the casual observer.


“If someone smiles insincerely, their mouth may smile, but the smile doesn’t reach their eyes — micro-expressions are more subtle than that and quicker,” co-founder and Chief Executive Officer Allan Ponniah, who’s also a plastic and reconstructive surgeon in London, told the newspaper.

Facesoft has approached police in Mumbai about using the system to monitor crowds and detect evolving mob dynamics, Ponniah said. It has also touted its product to police forces in the U.K.


The use of AI algorithms by police has stirred controversy recently. A research group whose members include Facebook Inc., Microsoft Corp., Alphabet Inc., Amazon.com Inc. and Apple Inc. published a report in April stating that current algorithms aimed at helping police determine who should be granted bail, parole or probation, and which help judges make sentencing decisions, are potentially biased, opaque, and may not even work.


The Partnership on AI found that such systems are already in widespread use in the U.S. and are gaining a foothold in other countries too. It said it opposes any use of these systems.

