
Valve will no longer remove games from its Steam marketplace unless they are “illegal, or straight up trolling,” according to a statement posted today by the Bellevue, Wash.-based gaming company.

The announcement comes a week after Valve, amid a nationwide outcry to ban the title, removed a controversial game that simulated school shootings. Last month the company also issued warnings to developers about adult content in games.

In its blog post, Valve executive Erik Johnson writes that “Valve shouldn’t be the ones deciding this.”

“If you’re a player, we shouldn’t be choosing for you what content you can or can’t buy,” the post reads. “If you’re a developer, we shouldn’t be choosing what content you’re allowed to create. Those choices should be yours to make. Our role should be to provide systems and tools to support your efforts to make these choices for yourself, and to help you do it in a way that makes you feel comfortable.”

Valve said it will create tools to let users “override our recommendation algorithms and hide games containing the topics you’re not interested in.”

“And it’s not just players that need better tools either – developers who build controversial content shouldn’t have to deal with harassment because their game exists, and we’ll be building tools and options to support them too,” Johnson wrote.

More from the announcement:

“So what does this mean? It means that the Steam Store is going to contain something that you hate, and don’t think should exist. Unless you don’t have any opinions, that’s guaranteed to happen. But you’re also going to see something on the Store that you believe should be there, and some other people will hate it and want it not to exist.”

After removing the school shooter game last week, Valve called the game developer “a troll, with a history of customer abuse, publishing copyrighted material, and user review manipulation.”

Valve’s Steam platform has become one of the biggest outlets for game publishers, with a reported 7,672 games released on it in 2017 alone.

Other tech companies have grappled with how to police controversial content on their platforms. YouTube last year implemented new policies to age-restrict inappropriate videos posing as children’s content.
