Reddit CEO Steve Huffman wants to bring in a new set of speech restrictions to the platform, but what those rules will ultimately look like is far from concrete.

On Thursday afternoon during an Ask Me Anything session, the incoming Reddit CEO explained his vision for how content should exist on the platform going forward. “We’re considering a set of additional restrictions on what people can say on Reddit — or at least say on our public pages — in the spirit of our mission,” wrote Huffman.

That includes spam, illegal content, anything that incites harm or violence, harassment, bullying, reference to physical abuse of a specific person or people, and sexually suggestive material involving minors:

As I state in my post, the concept of free speech is important to us, but completely unfettered free speech can cause harm to others and additionally silence others, which is what we’ll continue to address.

When asked if he’d be banning any specific subreddits, Huffman replied,


We’ll consider banning subreddits that clearly violate the guidelines in my post — the ones that are illegal or cause harm to others. There are many subreddits whose contents I and many others find offensive, but that alone is not justification for banning.

Later he specifically cited /r/rapingwomen and /r/fatpeoplehate as subreddits that had already been banned. He also said /r/coontown would be “reclassified.” The plan for reclassification is something akin to the way Reddit handles not-safe-for-work content, flagging salacious threads. To help Redditors understand how to determine whether a thread will be reclassified or banned, he added:

It’s ok to say, “I don’t like this group of people.” It’s not ok to say, “I’m going to kill this group of people.”

He also said he would be supplying moderators with the tools to appropriately remove content and listed a few instances that would result in content being taken down.

  • The user deleted their post. If that’s what they want to do, that’s fine, it’s gone, but we should at least say so, so that the mods or admins don’t get accused of censorship.
  • A mod deleted the post because it was off topic. We should say so, and we should probably be able to see what it was somehow so we can better learn the rules.
  • A mod deleted the post because it was spam. No need for anyone to see this at all.
  • A mod deleted a post from a user that constantly trolls and harasses them. This is where I’d really like to invest in tooling, so the mods don’t have to waste time in these one-on-one battles.

But the new content restrictions and the reclassification system are still fairly vague. They lack clear boundaries, a problem that has plagued Reddit before and that some say is the main reason the platform struggles to contain violent talk and bullying.

For instance, while /r/fatpeoplehate has been shut down, /r/watchpeopledie (which, as the name implies, shows people being fatally harmed) is still very active.

Going forward, Reddit will have to be much more specific not only about what gets banned, but about how the company intends to keep the most distressing content from bubbling to the surface.

In the meantime, Huffman said that the policy will remain the same until moderators have the tools to implement change.
