When Ravi Yekkanti puts on his headset to go to work, he never knows what the day will bring him in virtual reality. Who will he meet? Will he be accosted by a child’s voice with a racist remark? Will a cartoon try to grab his genitals?
Yekkanti’s job, as he sees it, is to make sure everyone in the metaverse is safe and having a good time, and he takes pride in it. He is at the forefront of a new field, VR and metaverse content moderation.
Digital safety in the metaverse has had a bit of a rough start, with reports of sexual assault, bullying, and child grooming, an issue that’s only becoming more pressing with Meta’s recent announcement that it’s lowering the minimum age for its Horizon Worlds platform from 18 to 13.
Because traditional moderation tools, such as AI-enabled filters on certain words, don’t translate well to immersive real-time environments, moderators like Yekkanti are the primary way to ensure safety in the digital world. And this work is becoming more important every day. Read the whole story.
—Tate Ryan-Mosley
The flawed logic of rushing extreme climate solutions
Early last year, entrepreneur Luke Iseman says, he launched a pair of weather balloons filled with sulfur dioxide from Mexico’s Baja California peninsula, hoping they would burst miles above the Earth.
It was a trivial act in itself, effectively a small DIY experiment in solar geoengineering, the controversial proposal that the world could counteract climate change by releasing particles that reflect more sunlight back into space.