The US TikTok lawsuit could change the face of social media forever, and it's about time


Social media in the US, and specifically the key Section 230 protection afforded to platforms, may have met its Waterloo. For most of the two-plus decades we've been using social media like X (née Twitter), Facebook, Instagram, and TikTok, these platforms have operated under protections designed by US lawmakers in 1996 primarily to shield pioneering services like CompuServe and AOL.

Those protections, which are part of the Communications Decency Act of 1996, said that online computer services couldn't be held liable for content posted on their platforms by third parties. These services were like dumb, vast warehouses with shelves of information placed there by others. A warehouse doesn't create what's inside; it just accepts the content and gives consumers access.

This was back in the days of AOL, which controlled the pages you saw using keywords, a rough organizing principle for such a vast amount of information. In some ways, early platforms like Prodigy, CompuServe, and AOL were just one pretty interface removed from the Bulletin Board Systems that preceded them.

Modern digital services, mainly social media, have one major difference: they no longer passively wait for you to discover content and make connections on your own. Everything is tailored by recommendation algorithms. TikTok's vaunted For You page, X's For You page, Threads' For You feed, Facebook's feed, Instagram's recommendations – all of them are driven by algorithms that learn your habits and then deliver other people's content based on those assumed interests.

AOL wanted people to sign up and stay, but it mostly kept its numbers up by outpacing churn: almost as many people stopped paying for and using the service each month as signed up. That's why we all got so many disks and CDs in the mail, begging us to join.

Algorithms in control

These days, the platforms are mostly free. Ads and partner deals pay the bills, so it's crucial that eyeballs remain glued to each service. Hence, the algorithms that do the dirty work of keeping us all engaged.

While AOL, CompuServe, and even ISPs could fairly claim that they had no control over the content we saw online, and that the responsibility still fell on the shoulders of the content originators, the algorithms make the picture far murkier for modern social media, and perhaps even search engines like Google.

Section 230 has been under attack for years. I used to believe that it fairly protected all online services. The argument went that when you look for someone to blame for seeing unwanted violent, hateful, perverse, or even pornographic content in your feed, the ultimate responsibility lies with the creator of that content, not the host.

I don't believe that anymore and, as far as I can tell, US courts could soon set a precedent on this point in a closely watched case.

Precedent could be set

In 2021, a 10-year-old girl, Nylah Anderson, found a viral meme in her TikTok feed. The video promoted something called "The Blackout Challenge." Social media is full of these viral challenges, and the vast majority of them are harmless.

This one was not. It promoted choking yourself until you black out.

Tragically, Nylah, according to the filing, died while attempting the challenge, and her family has been suing TikTok ever since. While the lower court dismissed the case, the US Court of Appeals for the Third Circuit ruled that Nylah's family could sue TikTok, specifically holding that the TikTok algorithm's recommendations are not protected by the federal Section 230.

From the ruling:

"TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech."

While no one person at TikTok curates content for anyone's feed, it is fair to call the algorithm the arbiter, and the algorithm is programmed by TikTok, which is owned by the Chinese company ByteDance (the company is currently being told to sell itself to US entities or face a ban in the States).

The Anderson case will continue, and if Nylah's family wins its suit against TikTok, it could mean a rapid end to protections for all social media currently using algorithms to shape our feeds. If TikTok loses, social media companies could be held liable the next time you see hate speech, violent imagery, pornography, or suggestions of dangerous actions.

In a separate interview, Nylah's family said they wanted these Big Tech firms to be held accountable for the algorithms and to do more to protect their users.

The winds of change

Whatever the final result, any platform that programs an algorithm to analyze your interests, then caters content based on that analysis, has a responsibility to ensure that its algorithm can't deliver dangerous content.

In my own social media use, especially on TikTok, I've marveled at the algorithm's power and flexibility. It will endlessly fill my For You page, keeping me hooked for hours at a time. It does allow for personal curation, which mostly happens by searching for things of interest.

When I stumble on something I like, I pay extra attention to it. I watch it more than once, pause the video, like it, share it, and then watch a few more videos in the same vein. If I do this a few times, I can shape my FYP so that I see more videos about people refurbishing old gadgets or making pasta.

However, these feeds have a needy side. They always throw in a "you might also like" topic that's been popular with others. They're trying to prevent you from losing interest in your feed and the platform.

That's how, I believe, most people end up seeing things like violence and dangerous memes. You need to show the feed how much you dislike that content before you can weed it out – assuming the algorithm allows it.

TikTok will fight this case, as other social media platforms have, but I think the tide has turned and a loss is possible. If that happens, TikTok, X, Threads, Facebook, Instagram, and other social media platforms may be forced to trash and recast all of their algorithms to ensure they don't repeat the mistakes of the past. Otherwise they could end up buried under costly lawsuits – which they might lose again – until the platforms succumb and disappear forever.

Lance Ulanoff
Editor At Large

A 38-year industry veteran and award-winning journalist, Lance has covered technology since PCs were the size of suitcases and “on line” meant “waiting.” He’s a former Lifewire Editor-in-Chief, Mashable Editor-in-Chief, and, before that, Editor in Chief of PCMag.com and Senior Vice President of Content for Ziff Davis, Inc. He also wrote a popular, weekly tech column for Medium called The Upgrade.

Lance Ulanoff makes frequent appearances on national, international, and local news programs including Live with Kelly and Mark, the Today Show, Good Morning America, CNBC, CNN, and the BBC. 
