photo: Facebook

Kelowna’s Two Hat continues to lead charge in fight against child sexual abuse images online

The company has released CEASE.ai, an artificial intelligence model that detects new child sexual abuse material for law enforcement and social platforms

Leading AI technology company Two Hat announced Jan. 22 that it has released CEASE.ai, an image recognition technology for social platforms and law enforcement that detects images containing child sexual abuse.

By making the technology available to public and private sectors, Two Hat aims to address the problem not only at the investigative stage but at its core, by preventing images from being posted online in the first place.

“This issue affects everyone, from the child who is a victim, to the law enforcement agents who investigate these horrific cases, to the social media platforms where the images are posted,” said Two Hat CEO and founder Chris Priebe. “With one hat in social networks and the other in law enforcement we are uniquely positioned to solve this problem. With CEASE.ai, we’ve leveraged our relationship with law enforcement to help platforms protect their most vulnerable users.”

Built in collaboration with Canadian law enforcement, and with support from the Government of Canada’s Build in Canada Innovation Program and Mitacs (www.mitacs.ca), a national research organization that partners with top Canadian universities, CEASE.ai is an artificial intelligence model that uses ensemble technology for groundbreaking precision. With its recent acquisition of image moderation company ImageVision, Two Hat has boosted its existing technology to achieve even greater accuracy and efficiency.

RELATED: Kelowna’s Two Hat changing landscape of content moderation

Unlike similar technology that only identifies known images (“hash lists”), CEASE.ai detects new child sexual abuse material (CSAM). Developed for law enforcement, CEASE.ai aims to reduce investigators’ workloads and trauma by prioritizing images that require immediate review, ultimately rescuing innocent victims faster. Now social platforms can use CEASE.ai to detect and remove child abuse images as they are uploaded, preventing them from being shared.
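As a rough illustration of the distinction the company draws, the Python sketch below contrasts hash-list matching, which can only flag images already catalogued by investigators, with a classifier that scores every upload. The model interface and threshold are hypothetical placeholders, not Two Hat’s actual implementation.

```python
import hashlib

# Hash-list approach: only flags an image whose exact digest already
# appears on a list of known CSAM hashes compiled by investigators.
KNOWN_HASHES = set()  # would be populated from an investigator-maintained hash list

def matches_hash_list(image_bytes: bytes) -> bool:
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES  # misses any image not seen before

# Classifier approach (hypothetical interface): a trained model scores
# every upload, so previously unseen images can still be flagged and
# prioritized for human review.
def flag_for_review(image_bytes: bytes, model, threshold: float = 0.9) -> bool:
    score = model.predict(image_bytes)  # assumed to return a probability in [0, 1]
    return score >= threshold
```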

Predators are increasingly using social platforms to solicit and share images. According to a 2018 NetClean report, “Grooming and extortion are now coming from social media apps, unlike a few years ago where most of it occurred by someone that had access to the child.”

RELATED: Kelowna tech company featured in Ruby Roxx documentary

“Removing child abuse material from the internet and protecting kids is a responsibility that we all share, regardless of sector,” said Julie Inman Grant, Australian eSafety Commissioner. “It’s exciting to see innovative technology solutions being deployed in a space where it’s crucial that the good guys stay one step ahead.”

Learn about Two Hat’s efforts to detect CSAM through CEASE.ai on its site. Priebe will host a webinar in February to share his vision of the future of AI, including CEASE.ai and updates to Two Hat’s suite of content moderation solutions.
