When Kathy Borum hugged 17-year-old Vaughn-Thomas goodnight on a December night three years ago, the Tennessee mom didn’t know she was holding her son for the last time.

A hard-working student and talented athlete, Vaughn-Thomas had just come home from the gym before getting ready for bed. But to cope with the stress of school, he did something that has tragically become all too common among minors: He ingested a Xanax laced with fentanyl that he likely acquired via Snapchat.

When he didn’t wake up from his alarm the next morning, Kathy discovered that her son had fatally overdosed.

For years, my colleagues and I on the Senate Judiciary and Commerce committees have heard similar heartbreaking stories from parents across the country who have lost their children to social media harms. What Kathy told me about her son is the tragic truth for so many of these victims: “One mistake should not have been a death sentence.”

Yet these tragedies are happening every single day because Big Tech would rather put profit over children’s safety.

For years, platforms from Facebook and Instagram to Snapchat and TikTok have developed addictive algorithms to keep children scrolling as long as possible. As a result, teens are on average spending 8.5 hours a day on their screens — and suffering rising rates of anxiety, depression and suicide. At the same time, the algorithms are exposing children to unthinkable harms, including pro-suicide content, drug dealers and human traffickers.

Just in October, reports emerged showing that TikTok developed addictive algorithms even as company executives privately acknowledged the features led to mental health issues, including loss of memory and cognition. At the same time, the platform fed self-harm and eating disorder content to minors while failing to remove posts promoting drug abuse and pedophilia.

Although TikTok’s impending US ban may end the Chinese-owned app’s dangers, many other social media platforms are also completely neglecting m...