Request an invite here: www.securly.com/auditor
Auditor by Securly
Today, in honor of Stop Cyberbullying Day, we are announcing the launch of “Auditor by Securly”. This is a free tool that helps schools using Google Email ensure the safety of their students by monitoring email for messages that are indicative of bullying or self-harm. At this time the product is in Beta, and we are working with early-adopter districts who are willing to collaborate with us to shape this product over the next couple of months before it is available for general release. For an invite, click on the button above.
In this blog post, we outline the thought process that brought us to this announcement.
In Pursuit of Student Safety
We have always thought of Securly as a company with a “double-bottom line” mission. As with any other company that is trying to build a sustainable business, we need to charge a fee for our services and grow our revenues year over year. However, while achieving this somewhat “practical” goal, we aspire to make a dent in the universe. In our minds, that “dent” has always been (and likely always will be) ubiquitous child safety – both at school and at home. We have brought a number of innovations to market in pursuit of this aspiration:
- Sentiment-analysis-based Bullying and Self-harm Detection: The industry’s first bullying and self-harm detection on social media. We use machine learning techniques to infer sentiments that are indicative of bullying or self-harm on social networking websites such as Facebook, Twitter, and Google+.
- 911 Emergency Response notifications for Guidance Counselors: The industry’s first Delegated Administration, which allows district guidance counselors and school principals to be alerted on flagged activities, freeing the district IT team to focus on infrastructure-related operations.
- 911 Emergency Response notifications for Parents: The industry’s first Parent Portal and e-mail reports. While the concept of a Parent Portal is extremely common among EdTech companies, Securly is the first (and at this time only) web-filter to have introduced this concept to the web-filtering and student-safety space. Parents can get immediate notification of any instance of bullying or self-harm.
Expanding Student Safety to Emails
With Google Mail becoming a tool of choice in thousands of schools across the world, blocking it is no longer an option for most middle and high schools. However, we have learned through conversations with our customers that these resources have opened up new vectors for harmful behavior such as bullying and expressions of self-harm.
Generally speaking, we have found that many schools do not have good solutions in place that address this issue because of the following factors at play:
- While web-filters are required by law, they do not cover these vectors. By its very definition, “web-filtering” does not apply to email. Many schools that we’ve spoken to use Google’s default compliance options to flag emails that contain a predefined set of keywords. This approach is prone to both False Positives (false alarms) and False Negatives (missed alerts), and it does not scale well in a large district, where IT becomes the bottleneck in sorting through the flagged messages.
- Old-school approaches that rely on human auditors to monitor these channels are costly.
- The CIPA law is vague about the need to cover this vector – “The policy proposed must address… security and safety of minors using chat rooms, email, instant messaging, or any other types of online communications.” The meaning of “safety” here is left too open to interpretation.
Given the lack of any compliance requirement, and given that cash-strapped schools are already reluctant to spend on paid solutions, we felt compelled to introduce a free tool to address this serious issue. We expect even districts that have never purchased paid solutions to now consider using Auditor as a free tool that could potentially preempt the next bullying or suicide incident.
Automated sentiment inference approach: The key advantage that we will be providing over the state of the art is that while existing tools rely heavily on keyword matching to detect inappropriate behavior (e.g. by looking for words like “suicide” or “ugly”), we will be relying on our tried-and-tested machine learning techniques to do so. To appreciate the value of this approach, consider the following post that was flagged by our algorithm – “slowly im realizing i don’t really have a purpose here say good bye cause Fryday it’s all over <3” It should be clear to the reader that a keyword-based approach would not have worked in detecting this.
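To make the contrast concrete, here is a minimal sketch of the two approaches. Everything in it is a hypothetical toy stand-in – the keyword list, the handful of training phrases, and the tiny Naive Bayes classifier are illustrations, not Securly’s production system – but it shows how a learned model can flag a post like the one above even when no predefined keyword appears in it:

```python
import math
from collections import Counter

# Hypothetical keyword list, in the spirit of fixed compliance rules.
KEYWORDS = {"suicide", "ugly", "die"}

def keyword_flag(text):
    """Flag a message only if it contains one of the predefined keywords."""
    return bool(set(text.lower().split()) & KEYWORDS)

# Tiny, invented training set; a real system would learn from far more data.
TRAINING = [
    ("concerning", "say good bye to everyone"),
    ("concerning", "i have no purpose here"),
    ("concerning", "its all over for me"),
    ("benign", "see you after school"),
    ("benign", "the game was fun"),
    ("benign", "say hi to coach"),
]

def train(labeled):
    """Count token frequencies per label for a multinomial Naive Bayes model."""
    counts, totals, vocab = {}, {}, set()
    for label, text in labeled:
        tokens = text.lower().split()
        counts.setdefault(label, Counter()).update(tokens)
        totals[label] = totals.get(label, 0) + len(tokens)
        vocab.update(tokens)
    return counts, totals, vocab

def classify(text, counts, totals, vocab):
    """Pick the label with the highest smoothed log-likelihood (uniform priors)."""
    def score(label):
        return sum(
            math.log((counts[label][tok] + 1) / (totals[label] + len(vocab)))
            for tok in text.lower().split()
        )
    return max(counts, key=score)

MODEL = train(TRAINING)
POST = "slowly im realizing i dont really have a purpose here say good bye cause fryday its all over"

print(keyword_flag(POST))      # -> False: no keyword matches the post
print(classify(POST, *MODEL))  # -> "concerning": the learned model flags it
```

The point of the sketch is not the specific classifier but the shift in signal: the keyword matcher sees nothing, while the learned model picks up on the cumulative weight of phrases like “purpose here”, “say good bye”, and “all over”.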
911 Emergency Response Notifications to Parents and Guidance Counselors: We will be extending our existing Delegated Administration and Parent Reports functionality from our flagship web-filtering product to Auditor. In the context of Auditor, these services become 911 Emergency Response notifications to guidance counselors and parents respectively. In other words, parents, principals, and guidance counselors will receive these alerts whenever Auditor detects disturbing emails or chats sent or received, or even drafts composed but not yet sent (e.g. a suicide note in progress).
Timeline and Business Model
Securly will be bringing Auditor to market this Fall. We will make it available for free in perpetuity to any school (of any size) that wants to use it. We will be launching with E-mail, Chat and Drafts monitoring.
Like most enterprise “freemium” business models, we expect a fraction of the schools that try this tool to eventually become paid customers of, and referrers to, our premium web-filtering product. This move aligns with the company’s “double bottom-line” goal of helping schools that cannot afford to plug a key hole in student safety, while bringing a key business-model innovation to the table.
If you would like to use this tool in your District and stay updated on our progress, please drop a note here and we will keep you posted!