We introduced Auditor after learning from our schools that students were using Gmail to express negative emotions, including signs of bullying and self-harm. We made it free so that no school would have to make compromises when it comes to student safety.
Today, Auditor is thriving – scanning 3.5 million emails per day for 1 million students across the world.
‘Within six months of its launch, Auditor has scanned more than 100 million emails and saved so many lives. That’s the power of AI, and the best part is that it’s all free! This is something that each and every school should have, as kids’ lives are at stake.’
‘Securly has had its filtering product around for quite some time, but our core mission has always been ubiquitous safety for kids. Auditor is one such product that is so core to our mission that we decided to give it away for free. Since its launch six months ago, it has helped save countless lives.’ – Neeraj Thakar, VP India R&D Operations
Auditor also helps schools stay CIPA (Children’s Internet Protection Act) compliant. CIPA requires schools to maintain “the safety and security of minors when using electronic mail, chat rooms, and other forms of direct electronic communications”. However, traditional web filters do not address this. Many schools reported using Google’s default compliance options to flag emails that contain a predefined set of keywords.
This method is undependable because:
- It is prone to many false positives (false alarms) and false negatives (missed alerts).
- It does not scale well in a large district, where IT becomes the bottleneck in sorting through flagged messages.
Instead, Auditor uses an automated sentiment inference approach. For example, consider a post that was flagged by our algorithm:
“slowly im realizing i don’t really have a purpose here say good bye cause Fryday it’s all over <3”
Though the allusion to suicide is clear, the message does not contain the keywords usually associated with this kind of behavior, so a keyword-based approach would not have detected it.
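The gap can be illustrated with a minimal sketch. The keyword list and matching logic below are hypothetical, chosen only to show the failure mode; they are not Securly’s actual configuration:

```python
# Hypothetical keyword filter, for illustration only.
SELF_HARM_KEYWORDS = {"suicide", "kill", "die", "hurt"}

def keyword_flag(message: str) -> bool:
    """Return True if any predefined keyword appears in the message."""
    text = message.lower()
    return any(kw in text for kw in SELF_HARM_KEYWORDS)

# The flagged message above contains none of the keywords: a missed alert.
print(keyword_flag(
    "slowly im realizing i don't really have a purpose here "
    "say good bye cause Fryday it's all over <3"
))  # False

# Meanwhile an innocuous idiom trips the filter: a false alarm.
print(keyword_flag("this homework is going to kill me"))  # True
```

A sentiment-inference approach instead scores the message as a whole, so phrasing like “say good bye” and “it’s all over” can contribute to a flag even when no individual keyword matches.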