At age six, Sarah Hill got her first iPad from her parents and used it to play games like Angry Birds and Minecraft whenever she was bored. By age 21, the Alabama native had fallen so deep into virtual reality experiences and video games that she'd stopped seeing friends, showering, and brushing her teeth. "If you compare video game and tech addiction to drugs," she says, "VR is the meth of drugs."
At college, she spent so much time holed up in her room compulsively accessing a chatbot site, Character AI, on her phone that she failed classes. "I remember the night I told my parents I'd lied about everything and I flunked," she recalls. "My parents didn't have any words. They were like, 'Just go.' I went to my room, but the last thing I saw was my mom resting her elbows on the counter and just crying. That was the worst thing I ever saw."
Hill's parents flew with her from Alabama to a town just outside of Seattle and enrolled her at reSTART, one of the nation's few residential treatment programs for digital overuse that treats tech addiction as a danger on the scale of alcohol or drug addiction. Clients are required to abstain from the internet, smartphones, gaming, and other technologies, often for months at a time. On her first day there screen-free, Hill lay down on her bed and cried.
Hill and reSTART's other clients are at the center of an intense debate about how harmfully addictive modern tech can be. Once waged mostly in academic white papers and over dinner tables, the debate has escalated to the courts, thanks to a slew of landmark legal cases against Meta, YouTube, TikTok, and Snap. (The last two reached settlements earlier this year. TikTok declined to comment for this article, and Snap did not respond to requests for comment.) These initial "bellwether" cases are being closely watched because their outcomes could set precedent for the thousands of other lawsuits making similar claims, and could even force tech companies to change their products and business models. Some have anticipated a "Big Tobacco moment," a reference to the 1990s lawsuits that proved tobacco companies knew about the addictive nature of nicotine and the health dangers of smoking, and that led to massive damages being paid.
In the case against Meta and YouTube, a now 20-year-old plaintiff, referred to as KGM, testified in February that the "addictive design" of these platforms, including infinite scroll, filters, and autoplay, led her to spend up to 16 hours a day on them, causing depression, anxiety, body dysmorphia, and self-harm. (A jury was deliberating the case as this article went to press.)
The Big Tech companies deny these claims, saying they did their best to protect free expression while keeping users safe. They question the whole concept of "tech addiction," pointing out that there's no scientific evidence that their products were the cause of KGM's and others' issues. The head of Meta's Instagram, Adam Mosseri, said in court that social media was not "clinically addictive." And in a written statement, a Meta spokesperson points to other factors in KGM's life as the cause of her troubles, adding: "The evidence simply doesn't support reducing a lifetime of hardship to a single factor, and our case will continue to underscore that reality."
Reached for comment about YouTube, a spokesperson for owner Google, José Castaneda, said allegations about the platform were "simply not true." "Providing young people with a safer, healthier experience has always been core to our work," he said. He pointed to the company's "services and policies to provide young people with age-appropriate experiences, and parents with robust controls."

But concerned parents, along with researchers, health organizations, and even some former tech industry leaders, are sounding the alarm, saying that the systems we rely on for modern life are designed in ways that may be fundamentally incompatible with human well-being. They cite a growing body of research in psychology and neuroscience arguing that social media use delivers dopamine jolts similar to those associated with addictive drugs like meth or heroin. And with the rapid acceleration of AI, many are calling for the U.S. government to get serious about regulation and pleading with Big Tech to provide stronger safety features that constrain the algorithms, push notifications, and endless swiping that make it so hard to put your phone down.
"Unfortunately, [tech] is taking mostly young people away from the most important thing in their lives and key to their mental health, and that is relationships with other people," says New York University professor and podcaster Scott Galloway. For tech companies, he says, it's all about keeping users' attention locked in: "I don't think [Big Tech] set out in their business plans to depress global youth. I think their algorithms discovered that rage, self-esteem, and funny cat videos just keep people online."
There is, of course, a difference between the kind of low-level "addiction" to our phones that most of us jokingly will cop to (checking email before we're out of bed, scrolling TikTok in the grocery line) and the rarer, all-consuming dependency that leads people to places like reSTART or into courtrooms as plaintiffs. At the same time, the line between a bad habit of using tech several hours a day and a behavioral addiction can be blurry, especially for teens and young adults whose social lives, homework, and entertainment all run through the same devices.
"I'm finally putting a foot down and saying, 'I want to get out of this endless cycle.' I need to do something to better myself and my life."
Sarah Hill, reSTART client
And that's the whole point, argues Roger McNamee, a former tech investor and author of Zucked: Waking Up to the Facebook Catastrophe. "These companies are in the business of attention," he says. "Once they had attention, they were in the business of controlling the choices available to people in order to influence their behavior in ways that were profitable for the platform. That culture and that business model were guaranteed to produce lots of harm."
With its constitutional and cultural emphasis on the importance of free speech, the U.S., unlike many other countries, has largely declined to tell tech companies how they should interact with users. That has had dire effects, McNamee says: "We went from a culture where we used tech as an empowering tool to viewing tech as a tool for controlling people and extracting value. That's the culture of the Valley, and the underlying behaviors that that causes are wrecking our democracy, wrecking public health, and wrecking our economy."
The reluctance to place limits on how tech products engage users, especially in the age of AI, "should disturb everybody," he says.
Some 25 miles northeast of Seattle, past towering Douglas firs, sits the gray-paneled, split-level reSTART clinic. Motivational posters and pillows with phrases like "Healing is not linear" adorn its common areas. The center can house up to 16 clients, who share rooms and are responsible for household chores. They are also required to participate in 24 to 30 hours of structured group and individual therapy each week. reSTART teaches clients multiple evidence-based coping and recovery strategies, ranging from box breathing to physical grounding exercises. The treatment isn't cheap. As an out-of-network provider, reSTART charges an average of about $1,000 per day, though the clinic encourages clients to check with their insurers to see what can be covered. The average length of stay is 12 to 16 weeks, and many clients continue with outpatient services for weeks afterward.
reSTART cofounder Cosette Rae opened the center with therapist Hilarie Cash almost two decades ago. Rae had previously worked as a tech developer and, upon realizing she was overusing technology in unhealthy ways, decided to change careers and pursue social worker training.
She vividly recalls a case in 2009, when she was called to assist a young adult who refused to leave their house or go to school. (Rae uses the pronoun "they" here to protect the individual's identity.) They were not healthy, and had moved their bedroom mattress into the middle of the living room to play World of Warcraft nonstop. Doctors had diagnosed the person with agoraphobia, but Rae suspected that tech addiction was the real problem. She reached out to Cash for advice, and the two realized there was no place to treat people with these types of issues. They decided to open a center themselves.
Rae remembers being both "revered and rejected" in the early days of the center. Much like today, many didn't think tech addiction was real. But there was no shortage of clients: She has treated around a thousand in the nearly two decades since the center opened, and spoken to many thousands more, she says.
What her clients struggle with is more difficult than breaking free from substance abuse, Rae says, partly because there's no getting away from tech; it's everywhere. "When I go out in the community right now, I do not have a lot of friends that are telling me about meth or heroin," she says. "I don't usually go into the store and see people dealing. I don't go to the restaurant and people are doing a line. But when it comes to technology, it's everywhere. So you're constantly being in front of it and having to say no."
It's more akin to an eating disorder, Rae says, in which a person still has to eat but has a problematic relationship with food. In this day and age, clients aren't able to drop technology from their lives completely.
It's not just teenagers who are struggling. Rae mainly works with young and middle-aged adults (reSTART takes clients who are 15 and older), but she has seen clients in their late forties and fifties. The most common addictions Rae sees, besides video games, involve virtual reality, pornography, and, more recently, AI chatbots.
One client, a 23-year-old Seattle-area college student who asks to withhold their name and gender, describes their own overuse of video games, YouTube, and the communication platform Discord. The student wishes schools today would teach kids how to use technology mindfully and warn against addictive behaviors: "Technology is best used when it's a tool to enhance your life. But what I got trapped in is technology being my life."
Some scientists, such as Stanford psychiatrist and Dopamine Nation author Anna Lembke, say compulsive tech use taps into the brain's reward circuitry in strikingly similar ways to substance addiction. When someone scrolls social media or wins a round of a video game, their brain releases dopamine, which trains them to seek that "hit" again and again. Repeated bursts of stimulation can desensitize the pathways and weaken the prefrontal cortex, which is responsible for planning and self-control, making it harder to resist urges even when the habits are causing problems with school, work, or relationships.
Brain imaging studies of people with internet gaming or social media disorders have found structural and functional changes in these regions that mirror what doctors see in other behavioral addictions such as gambling.
Tech addiction is not listed as a condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM), the guide published by the American Psychiatric Association for diagnosing mental health conditions. However, in its most recent edition, the DSM does list "internet gaming disorder" as a condition warranting more clinical study.
"I don't think [Big Tech] set out to depress global youth. I think their algorithms discovered that rage, self-esteem, and funny cat videos just keep people online."
Scott Galloway, NYU professor and podcaster
Its absence doesn't faze Rae. "It took 40 years for gambling [disorder] to get into DSM," she says. "So I don't give any credence to the fact that it's not in there yet."
The science is far from settled, and some studies suggest that tech doesn't cause users' unhappiness. A 2023 University of Oxford study of 2 million people from around the globe found that links between internet adoption and psychological well-being were "small and inconsistent."
And in March, California Institute of Technology researcher Ian Anderson and Wendy Wood, a professor at the University of Southern California, wrote a Washington Post op-ed arguing that calling habitual tech use "addiction" was misleading and harmful. In surveys, they found that when people described their Instagram use as an addiction, "they felt stuck, less confident that they had the ability to change." Yes, they wrote, companies should "amend their platforms to help users regain control over their habits." But they concluded, "The truth is: Heavy use is not necessarily an addiction."
Nir Eyal, a tech investor and author of Hooked: How to Build Habit-Forming Products, says the tech isn't solely to blame for people's addictions. "Every generation has a moral panic about whatever new technology, but you don't fix things by stopping their use," he says. "You fix things by making them better, by making them safer."
Eyal argues that there is nothing unethical about making a product that some people get addicted to, and that asking social media companies to make their products less sticky is not the answer. Why? "Because any product that's good, somebody is going to get addicted to," he says. "Stop making the product interesting? That's dumb. That's why we use the product. That's called 'entertaining and engaging.'"
The debate is only likely to grow more urgent given the rapid adoption, and daunting potency, of AI. Rae fears AI could create new ways for people to get hooked on tech, or lead them to treat AI as a "substitute attachment figure" for real relationships. "I think everybody's been focused on all the talk around the existential threats like, 'Can it take our jobs?'" Rae says. "But what about taking our humanity? That's what's happening." As a practitioner working with tech addicts, she says, "I'm standing here looking down at a tsunami coming to people who have no idea what their kids are going to be facing. How this is going to change them; how it's going to change their relationships with each other; and how it's going to change their futures."
If tech addiction is accepted as real, it raises another thorny and divisive question: What can, and should, be done about it? Some states, including New York and California, have enacted laws requiring warning labels on social media apps that highlight the risks for young people. In September, the New York attorney general proposed a rule requiring social media companies to restrict algorithmically personalized feeds and nighttime notifications for users under 18 unless parental consent is granted. California put legislation into place last year creating safety restrictions on the development of AI.
Federal oversight has been slow or nonexistent, though many legislators have tried. In 2019, Missouri Republican Sen. Josh Hawley introduced a bill that would have banned social media features that exploited human psychology. Hawley's Social Media Addiction Reduction Technology (SMART) Act went nowhere, attracting little bipartisan support and never making it out of committee.
In December, Australia became the first country to ban social media for people under 16, and Greece and Britain are considering similar laws.
Social media platforms have themselves put up some guardrails, mainly via opt-in or parental controls. Meta launched Teen Accounts on Instagram and Facebook, with more restrictive features and nighttime nudges to close the app. Snap has expanded in-app warnings, "friending" safeguards, and location-sharing controls. Google and YouTube announced a $20 million initiative to address teen digital well-being. TikTok launched a daily screen-time limit, with users under 18 automatically cut off after an hour. And in February, Meta, TikTok, and Snap agreed to be independently rated by a group of advocacy organizations on how well they protect teens' mental health.
reSTART's Rae doesn't want to get stuck in semantics. Instead of arguing about whether their products are addictive, she says, Big Tech companies should devote some of their profits to resources that can help those "struggling as a result of loving their product." Many people can't afford treatment like reSTART's, as most health insurers won't cover problematic tech use, though sometimes clients can get coverage for associated disorders such as depression or anxiety.
Companies could also consider shutting off access to their technology for certain time frames, Rae suggests. Eyal recommends something similar. In addition to implementing a legal minimum age to use social media, he suggests that tech companies adopt a "use and abuse" policy: after a certain number of hours, he says, companies should reach out to the user with a message offering resources to prevent or cure addiction.
Sarah Hill recently transitioned out of the center to an apartment owned by reSTART, half an hour away. She still visits the center most days for treatment, but is eyeing a job at a grocery store on her off days, and has even gotten a cell phone. It's a basic "dumb" Gabb phone, with no apps or games. Even so, Hill recently found herself mindlessly scrolling through new screen backgrounds. "I felt myself losing control again, and it scared me," she says, tucking the phone underneath her legs on one of reSTART's oversize chairs.
But Hill has high hopes for managing her addiction, and says her phone usage has improved. "After making so many mistakes, I'm finally putting a foot down and saying, 'I want to get out of this endless cycle,'" she says. "I need to do something to better myself and my life."
Six questions to ask yourself about your tech use
Washington's reSTART clinic developed these screening questions to help potential clients consider whether their tech use has become problematic. Here's an abbreviated version:
- How often do you think about your current, previous, or next online activity?
- Have you become restless, irritable, angry, or anxious when you are unable to engage in online activities?
- Have you tried to reduce participation in online activities but found it too difficult?
- Have you lost interest in non-online activities such as sports, hobbies, or family time?
- Have you deceived a family member, significant other, employer, or therapist regarding the amount of time you spend online?
- Have you jeopardized or lost a significant relationship or an academic or employment opportunity because of your engagement with online activities?
This article appears in the April/May 2026 issue of Fortune with the headline "What is tech addiction? It may well be Big Tech's next problem."