Apple — and app developers — have a responsibility to shield adult content from kids
At 15, I downloaded Grindr, uploaded a photo of my bare chest, and created a profile under a fake name. Though I was uncomfortable at first, my curiosity grew as I encountered usernames like "looking for young" and "discreet hookups only." Seeing LGBTQ+ sexuality displayed so openly was a relief and made me realize, Oh, there are gay people everywhere.
Months of deleting and redownloading Grindr culminated one night after a long day of school, dance and homework. The Grindr notification sounded, and the guy I chatted with the night before was back online, suggesting we meet. Despite my hesitation, his "now or never" pleas led me to crawl out of my bedroom window.
My parents were supportive and open-minded when raising my younger sister and me. They did almost everything right, according to many trusted parenting experts, granting me freedom and lessening restrictions as I grew into an older teenager. I also know that the opposite approach, commonly known as helicopter parenting, can drive a wedge between kids and their parents and lead to rebellion. Because of these widely accepted truths, I believe the urge to explore sexuality as a teenager is natural.
So why do many LGBTQ+ kids seem to find that independent exploration online instead of offline?
Teenagers who identify as heterosexual are not only allowed to explore their sexuality openly and comfortably but are celebrated for doing so (see nearly every adolescent film and television show of the last 75 years). Yet in 2023, one-fourth of high schoolers in the U.S. identified as LGBTQ+, according to data from the Centers for Disease Control and Prevention. Many of that one-fourth are likely in school environments that make them feel unsafe exploring as openly as their heterosexual counterparts. And they are outnumbered. I believe this is a large part of why LGBTQ+ teens turn to the online world for exploration and connection, the very thing I was craving when I snuck out of my house. It also helps explain why queer teens are three times more likely to experience unwanted or unsafe interactions online.
That night ended when I saw the guy's truck turn down my street. I hid behind a bush, deleted the app after his messages and stayed off Grindr until college.
After hearing about experiences like this, most people feel lost and confused about whom to blame. Was it Grindr’s fault for allowing a minor to weasel his way onto an adult hookup app? The parents who didn’t know the app was downloaded? What about Apple, which failed to prevent an adults-only app from being downloaded by a 15-year-old? To answer this, I think about whom I would blame if my puppy, Stormy, suddenly got sick from a chemical in her newest chew toy. I would first wonder why the toy didn’t go through thorough safety testing; then I would ask why a store allowed it on the shelves in the first place.
This logic is missing when tech executives continue to shift the blame back and forth between the app itself and the app marketplace. Both the Apple App Store and Grindr were responsible for my safety.
A lot about Grindr has changed, for better and for worse, since my time on the app. Just this year, the company introduced new safety and privacy features. But Grindr has also seen more users speaking openly about a diminished user experience.
Apple just released new child safety features for the App Store, but they are not enough. Not showing or advertising age-inappropriate apps to kids is an important step, but it does not account for developers who misrate their apps. Apple claims its App Store is a “safe place for kids,” yet in a report I worked on with my colleagues at Heat Initiative and ParentsTogether Action, we found more than 200 risky or inappropriate apps in the App Store rated as appropriate for kids as young as 4. So what is the fix?
I believe Apple’s App Store will remain a risk to youth unless the company implements third-party reviews of app ratings. Just as with movies, television programs, and video games, independent experts would assess the risk to children and assign an age rating in the best interest of kids, not in the best interest of Apple or the app developer.
Big Tech must be held accountable for building safe products and services. Clearly, app store marketplaces still have serious changes to make, but the tech companies developing these apps must also design them with safety as the top priority. That includes thorough age verification, especially considering that many of these apps are accessible through a web browser, without a mobile app at all. Lives depend on it.
As we continue navigating tech safety, we have to balance new measures against privacy concerns. Adding steps like ID verification is a fraught proposition for many LGBTQ+ youth in the Trump era. Put simply, we have to ensure our data is protected from entities that could hurt us.
So whether it’s a mother like Kristin Bride, who lost her son after he was cyberbullied on anonymous apps downloaded from the App Store, or the queer teens assaulted after sneaking out to meet someone from a hookup app, Big Tech’s negligence is contributing to harm, and it must stop.
Lennon Torres is an LGBTQ+ advocate who grew up in the public eye, gaining national recognition as a young dancer on television. With a deep passion for storytelling, advocacy, and politics, Lennon now works in online child safety at Heat Initiative, centering her own lived experience and that of others, and aiming to bridge the gap between online safety and LGBTQ+ representation through intentionally inclusive strategies. Lennon’s LinkedIn: https://www.linkedin.com/in/lennon-torres-325b791b4/