MySafeTXT: New Moderation Tool Aimed at Children’s Websites

Tia Fisher, freelance writer, mostly blogging for eModeration

Posted on November 30th 2012


No-one knows better than a moderator how imaginative and resourceful children are at getting round filters.

That’s to put a nice spin on it, of course. While it is a great demonstration of their ingenuity, imagination and determination, the truth is that the stream of rude words and private contact information on high-traffic sites is the moderation millstone. Wading through shedloads of swearwords and phone numbers (and their filter-busting permutations) is a chore which keeps moderators from quickly spotting the cry for help, the bullying situation or the grooming conversation that they need to get to, fast.

Hence eModeration’s delight when our long-term child online safety consultancy partner, KidsOKOnline (KOKO), came to us with their new moderation tool MySafeTXT.

MySafeTXT is an online English-language monitoring and alert service: a moderation tool which can be plugged into any existing online community. It is currently being used, for example, by the IGGY community, an online educational social network for gifted 13 to 18 year old students created by the University of Warwick.

All user-generated textual content and communications between community members are scanned, and any content containing swearwords is automatically rejected (without taking up moderator time). Moderators are then automatically alerted to any suspicious patterns of behaviour which indicate bullying, sexual harassment, grooming or self-harm.

The process is lightning fast and content with no suspect language (usually 90% of all content) is published almost instantly.


How MySafeTXT works

Moderators log in to a simple Content Moderation Dashboard and check any flagged content awaiting review, choosing to publish or remove content directly from the dashboard. They don’t actually enter the community itself, and have no direct contact with its members.


MySafeTXT moderation dashboard


MySafeTXT moderation queue


MySafeTXT Reputation Check


Community owners are in control

The community owners set the community rules, and the content is then filtered for rule violations with reference to comprehensive, continuously updated lists of suspect words and phrases. If the content contains banned words (Black List) it is simply rejected and not published. If the content contains suspect words (Grey List), it is presented to moderators, who can review it in context and choose whether or not to publish.
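To make the two-tier filter concrete, here is a minimal sketch of the black-list/grey-list logic described above. The word lists, function names and matching rules are illustrative assumptions, not MySafeTXT’s actual implementation (which also detects filter-avoidance tricks and spelling permutations).

```python
import re

# Illustrative word lists only -- real lists are comprehensive and
# continuously updated by KOKO, per the article.
BLACK_LIST = {"badword1", "badword2"}   # auto-rejected, never published
GREY_LIST = {"meet", "phone"}           # suspect: queued for human review

def classify(message: str) -> str:
    """Return 'rejected', 'review' or 'published' for one piece of content."""
    words = set(re.findall(r"[a-z0-9']+", message.lower()))
    if words & BLACK_LIST:
        return "rejected"    # Black List hit: blocked without moderator time
    if words & GREY_LIST:
        return "review"      # Grey List hit: moderator decides in context
    return "published"       # clean content goes live almost instantly

print(classify("can we meet after school?"))   # -> review
print(classify("hello world"))                 # -> published
```

The key design point the article highlights is that only the middle (“review”) bucket ever reaches a moderator, which is how the bulk of clean content can be published almost instantly.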


Why MySafeTXT gets our approval

eModeration put the tool through its paces and advised KOKO on the final stages of the development of the latest update. We’re pretty picky about our tools, and as far as we’re concerned MySafeTXT ticked most of our boxes and had some really welcome features, such as:

  • Out of the box: the tool plugs into existing platforms.
  • The interface is simple to use and intuitive.
  • A separate alert queue for potentially serious content, with email notification.
  • Flagged content is highlighted for speed of location.
  • The ‘reputation check’ screen (see above) gives an at-a-glance summary of the user and their previous behaviour.
  • Automatic block for content which is clearly against guidelines. The swearwords – and all their aliases – just don’t make it onto the screen at all.
  • Word filters (lists) are built with kids in mind, with systems to detect and prevent filter avoidance.
  • Filters are intelligent: the system tracks all moderator actions, collates their responses to notifications/blocks, and adjusts the lists to remove or reduce false positives and to cover missed suspect terms.
  • Black lists are updated continuously by KOKO, so that when new phrases come into use, or when previously OK words are now being used as insults, the user lists are updated.
  • Custom word lists can be developed for anything project specific, and the community owners control what language is banned, notified or allowed.
  • Guidelines can also be entered into the tool and will appear above each piece of content for the moderators to quickly refer to.
  • Hot Spot reports in the tool will flag up any conversations between users that are hitting a lot of filters, with the aim of surfacing grooming and bullying.
  • Detailed audit logs of every action that takes place in the tool: great for QA and checking moderator performance.
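The Hot Spot idea in the list above – flag conversations whose messages trip the filters unusually often – can be sketched as a simple counter. The threshold and data shape here are assumptions for illustration; MySafeTXT’s real reporting will be more sophisticated.

```python
from collections import Counter

# Assumed threshold: filter hits per conversation before it is flagged.
FLAG_THRESHOLD = 5

def hot_spots(flag_events):
    """flag_events: iterable of conversation IDs, one entry per filter hit.

    Returns the IDs of conversations that have accumulated enough hits
    to be worth a moderator's attention.
    """
    counts = Counter(flag_events)
    return [conv for conv, n in counts.items() if n >= FLAG_THRESHOLD]

# One conversation with 6 hits, another with only 2.
events = ["conv-1"] * 6 + ["conv-2"] * 2
print(hot_spots(events))   # -> ['conv-1']
```

Counting hits per conversation rather than per message is what lets a pattern of borderline (Grey List) messages surface even when no single message would justify an alert on its own.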

MySafeTXT’s focus is on the content rather than on actions against the user: at the moment there are no rules set up for contraventions, such as auto-bars or warning emails, but KOKO say they may incorporate this into the tool in future releases.

MySafeTXT is based on KOKO’s 20+ years’ experience of running safe online communities for children and adults (for example SuperClubsPLUS). KOKO also provides a community for the children in the care of Barnardo’s and advises on child safety systems for J.K. Rowling’s Pottermore website. From Carole Hart-Fletcher, Director of KOKO: “After 20 years of protecting young people online, we’ve learned how they try to get around any moderation system – they’ve taught us many tricks and we’ve built them into MySafeTXT.”

MySafeTXT Plugins are currently available for WordPress, Drupal and PHP-based systems. Do get in touch with Carole Hart-Fletcher from KOKO if you’d like more information.


Tia Fisher


Freelance writer and social media enthusiast, frequently to be found blogging with the social media management agency eModeration.
