Our society is digitised, and so are our conflicts. Most of our conversations now happen online, and conversation can turn toxic quickly.
Some parts of the user population spread hate and push divisive topics whenever they argue, and individuals or specific groups easily become their targets.
Content moderation is important, but it is hard work. It has to be quick, and it has to be on point. Hate speech never sleeps, and therefore content moderation always has to be wide awake.
Making consistent moderation decisions is difficult, and moderating is psychologically demanding.
We provide automated moderation solutions that are unbiased, topic-neutral and socially fair, focusing on hard sociolinguistic problems. We help our customers retain control of their brand, shifting the focus back to the positive aspects of community building and fostering meaningful communities.