Nextdoor social network adds features to combat racial profiling
Beset by accusations that his neighborly social network had become a breeding ground for race-based fear-mongering, the head of Nextdoor rolled out new features Wednesday that he said will combat the problem.
“We know it’s not the last bit of work we have to do, but it’s a pretty significant milestone,” Chief Executive Officer Nirav Tolia told a roomful of reporters at the company’s headquarters on Market Street in San Francisco, hours after the changes were implemented in 110,000 neighborhoods across the country.
Tolia said that Nextdoor is the first social media company to deal with racism head-on by changing aspects of its product, at the risk of alienating some users.
Nextdoor, a get-to-know-your-neighbors network meant to be the kind of place where people can advertise garage sales or find babysitters, came under fire last year after Oakland residents and civic leaders said it was being used to spread bias.
Stunned by the onslaught of warnings about African American men in hoodies or “dark-skinned” people driving by in cars, residents in the city’s Dimond and Glenview neighborhoods formed a social media watchdog group, Neighbors for Racial Justice, which began pressing city officials and Nextdoor to address the problem.
The group found an audience in City Councilwoman Annie Campbell Washington, who met with executives from Nextdoor several times and encouraged them to make reforms.
Nextdoor says it will discourage such profiling with new pop-up screens that appear whenever someone posts in its “crime and safety” forums, prompting users to give at least two details besides the race or ethnicity of the person they are describing — such as the person’s hairstyle, or what type of shoes he or she is wearing.
The company has also begun “empathy trainings” for Nextdoor “leads” — the volunteers who start individual Nextdoor networks in their neighborhoods and who are empowered to delete abusive posts.
The point of the trainings, which started with three face-to-face sessions in Oakland but will be offered online in other cities, is to help define racial profiling so that moderators know what to look out for, Tolia said.
In test runs involving thousands of posts, Tolia said, the company saw a 75 percent reduction in posts it characterized as racially biased — but it also saw a 50 percent “abandonment rate,” meaning half of the participants ditched their posts midway through writing them because they didn’t have enough information to fill all the check boxes in the pop-up windows.
Audrey Esquivel, a resident of Oakland’s Glenview neighborhood and a member of Neighbors for Racial Justice, said she was encouraged by the social network’s reforms and hopes the new features will teach people to be more conscientious, in general, about their own implicit biases.