They need to run people's text through a sentiment analyzer, categorize players by toxicity, and group similar toxicity tiers together. Maybe include tips on how to be a better human.
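A minimal sketch of what that pipeline could look like, purely as an illustration: the scoring here leans on NLTK's VADER negative-sentiment share as a crude stand-in for a real toxicity model, and the tier thresholds, player IDs, and `group_players` helper are all made up for the example.

```python
from collections import defaultdict
from statistics import mean

# Stand-in scorer: requires `pip install nltk` and nltk.download('vader_lexicon').
from nltk.sentiment.vader import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def toxicity_score(message: str) -> float:
    """Crude proxy: VADER's negative-sentiment share, 0.0 (fine) to 1.0 (hostile)."""
    return analyzer.polarity_scores(message)["neg"]

def tier(avg_score: float) -> str:
    """Bucket an average score into coarse tiers (thresholds are arbitrary here)."""
    if avg_score < 0.1:
        return "green"
    if avg_score < 0.3:
        return "yellow"
    return "red"

def group_players(chat_log: list[tuple[str, str]]) -> dict[str, list[str]]:
    """chat_log is (player_id, message) pairs; returns tier -> players in that tier."""
    per_player = defaultdict(list)
    for player, message in chat_log:
        per_player[player].append(toxicity_score(message))

    tiers = defaultdict(list)
    for player, scores in per_player.items():
        tiers[tier(mean(scores))].append(player)
    return dict(tiers)

if __name__ == "__main__":
    log = [
        ("alice", "nice play, well done"),
        ("bob", "you are all useless, uninstall the game"),
        ("bob", "worst team I have ever seen"),
    ]
    print(group_players(log))  # e.g. {'green': ['alice'], 'red': ['bob']}
```

Obviously a real system would swap the sentiment proxy for a purpose-built toxicity classifier and keep humans in the loop for the judgment calls, which is the point made below.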
Ultimately, this is one of those things that needs subjective judgment and community ambassadors to be handled effectively. That requires human labor with high turnover.
I'm sure at some point one of the big players in the especially bad spaces (like MOBAs) will figure out how to do it on the cheap and create a market efficiency. But until then, all the profit chasers are allergic to creating actual jobs to solve the problem.
Should we have a game toxicity review site to warn people about how toxic a community is?