The death of the internet: why the future is terrifying, and how we fix it

A grim reaper with a scythe, looking at a smartphone in his hand. (Image credit: Shutterstock / Studio Romantic)

The internet is in a precarious place. It’s assaulted from all sides - not by technological problems, but by social ones. Misinformation is rife, marketing and advertising cover every facet of the web, and armies of politicized and automated bots roam the wilds of its social media landscapes - all of it filtered down to you through carefully curated algorithmic feeds designed to induce endorphin kicks and keep you on your platform of choice. Right now, everything is changing, and not necessarily for the better.

For many of us, looking back 10 or 20 years, the 'world wide web' of that golden age looked radically different. The social media platforms, the communities, the gaming landscape, the knowledge and accessibility, the shopping - all of it felt different, and it was different. This goes beyond rose-tinted glasses. The companies that joined the fray were incredible, almost revolutionary. Spotify, Netflix, Amazon, Facebook, Twitter, and Uber: all remarkably impressive, market-upsetting ideas that broke the mold. They drew in masses of customers, users, and consumers with awesome features and affordable pricing.

Yet over time, those same features and prices have steadily gotten worse for the average Joe, as the companies have scooped out the investment in the middle for the sake of greater margins. This usually occurs once they become publicly traded entities, driven by shares, investors, and board members clamoring for greater profits rather than the ideals and concepts that founded them.

A digital world in decline

The same sadly goes for scientific endeavors too. Educational tools and access to information are falling apart just as fast. So much of the information out there has been muddied and diluted by TikTok videos and YouTube Shorts in their thousands, spewing forth all manner of falsehoods from anyone who can pick up a phone and film a 60-second clip. Flat-earthers, fitness and diet influencers, climate-change deniers, moon-landing hoaxers, political “activists” on both sides of the spectrum, so-called journalists pandering to clickbait, you name it. It’s increasingly difficult to identify what’s real and what’s not, what’s verifiable fact and what isn’t. It’s partly why Google changes its search ranking algorithms so often, as it continually tries to promote correct and accurate information over AI-regurgitated content and misinformation.


Millions of people filming themselves dancing probably isn't what the founding fathers of the internet had in mind. (Image credit: Nattakorn_Maneerat via Shutterstock)

We’re in a world of demagogues and social media personalities, where your reach and the number of views on your content dictate whether you’re taken seriously or not, and whether your facts and statements are taken as truth. We saw it during COVID, we saw it during the US elections, we saw it with the war in Ukraine, and we saw it with the recent UK riots. It isn’t slowing down either, and the impact it has is arguably getting worse.

We even have services that capitalize on that: Ground News collates coverage from across the media to give you the full spectrum of political opinion on any given event, fact-checkers cover masses of social media platforms, and Community Notes points out when folks with lots of clout spout utter nonsense. Hell, there are even entire divisions of scientists out there now making a living out of debunking the empirically incorrect insanity spewed by other social media influencers. It’s absolutely wild.

Algorithmic echo chambers

The problem is systemic. It started in social media, with algorithms delivering 'curated' content rather than simply showing you a chronological timeline of those you follow. Your likes and dislikes, what you spend time watching, reading, and listening to: it all became fuel for the fire. Facebook, Instagram, Twitter - all of them feed you content in that manner. Whether that’s right or left-wing politics, 9/11 conspiracy theories, or cute black Labradors, it doesn’t matter, as long as you stay on the platform and consume more ads. In fact, it’s become so prevalent that it’s hard to find a feed system on any social media platform today that doesn’t do that.
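To make the distinction concrete, here’s a minimal, hypothetical sketch of the two feed models described above. The scoring weights, field names, and the affinity map are illustrative assumptions on my part, not any platform’s actual ranking formula.

```python
# A toy comparison of a chronological feed versus an engagement-weighted one.
# All weights and fields here are illustrative assumptions, not a real platform's formula.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Post:
    author: str
    topic: str
    posted_at: datetime  # timezone-aware
    likes: int
    watch_seconds: float


def chronological_feed(posts: list[Post]) -> list[Post]:
    """The 'old' model: the newest posts from accounts you follow, in order."""
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)


def engagement_feed(posts: list[Post], topic_affinity: dict[str, float]) -> list[Post]:
    """The 'curated' model: rank by a crude predicted-engagement score.

    topic_affinity maps a topic to how much this user has engaged with it
    before, so content similar to what you already consume floats to the top.
    """
    now = datetime.now(timezone.utc)

    def score(p: Post) -> float:
        affinity = topic_affinity.get(p.topic, 0.0)     # your past behaviour
        popularity = p.likes + 0.1 * p.watch_seconds    # raw engagement signal
        age_hours = (now - p.posted_at).total_seconds() / 3600
        freshness = 1.0 / (1.0 + age_hours)             # mild recency bias
        return (1.0 + affinity) * popularity * freshness

    return sorted(posts, key=score, reverse=True)
```

Feed both functions the same set of posts and the second list quickly converges on whatever topics you’ve engaged with before, which is the echo-chamber effect in miniature.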

The problem with this is that it has effectively stifled debate. No longer are your opinions challenged or questioned; no longer do you have meaningful conversation and discussion. Instead, you’re fed more and more of the same content, which in turn reinforces and entrenches your beliefs as you sit in an echo chamber of like-minded people repeating the same things. It’s not difficult to see how this actively leads to an increase in extremist beliefs and views.

How can your opinion change or evolve if there’s no one there to challenge it? It’s part of the reason why, in the last few elections across the planet, so many people have been in utter disbelief when their political candidate of choice didn’t win: all they ever saw online was a deluge of support for their chosen party and nothing else.

Hope for the hopeless?

It’s a bloody mess: a relatively free market, held back only by the sparsest amount of regulation. It’s been 31 years since the World Wide Web made its first foray into the public arena, and it’s hard to imagine Sir Tim Berners-Lee envisioned it would look like this, this far into the future. I doubt this is what he imagined (although Tim, if you’re reading this and are free for a chat: hit me up, I’m so up for that).


The man who started it all, Sir Tim Berners-Lee. No, you can't blame him for all the TikTok dances. (Image credit: Paul Clarke)

That said, there’s still hope. The amount of good that’s come out of the WWW since its conception, and even today, is still far greater than the net negatives (no pun intended). Even if in ten years it’s just filled with AI-generated articles and gradually degrading memes while Amazon charges you $90 a month for next-week delivery, as long as people are still using it to actively and openly communicate with one another, it’ll be a net positive.

We don’t hear about the scientific breakthroughs the internet has accelerated, the discoveries made, the health conditions cured, or the humanitarian aid organized; we don’t hear about any of that because it’s not what makes the news. It’s not interesting. It’s not the sort of thing that gets spelled out in the scientific journals or the papers. We don’t hear about the relationships formed, or about how integral the internet is to our modern society’s infrastructure as a whole.

How do you fix it, then? Well, it’s not as simple as slapping a band-aid on something. By its very definition, the World Wide Web is exactly that: global. Reaching any kind of consensus on how to improve the current cesspool requires collective effort. We’ve seen that happen before in the tech industry. There’s a reason JEDEC exists, and standards like USB and DDR are a thing; we need an equivalent for the internet, one with teeth and on a much larger scale. One with smart minds behind it, looking at the monopolization of segments of the internet and pushing governments to act on it. Suggesting legislation. Looking at patterns and predicting what might occur. One that can react rapidly without necessarily being hindered by bureaucratic nonsense.


The USB Implementers Forum incorporates many major players in the tech industry, including Apple, Intel, and Microsoft. (Image credit: ShutterStock / kontrymphoto)

Then there’s education, and I’m not talking just about kids and young adults, but about all ages. In the same way that we strive for complete adult literacy, we need a big push to make every nation computer literate as well: not just “how to turn on the PC” and “this is the internet,” but how to identify fake posts, how to fact-check statements, how to find multiple sources, and the legality behind what you post and how you post it online. So much of that simply isn’t available, or isn’t known to the public, of any age.

Learning new critical skills as a global society is hard. But we did it for the threat of nuclear annihilation in the Cold War; we did it with the introduction of the seat belt in cars; we did it for reading; it needs to be done again, but for the digital age. Is it a challenge? Yes, but this isn’t the first time we’ve faced technological turmoil, nor will it be the last.


Zak Storey
Freelance contributor

Zak is one of TechRadar's multi-faceted freelance tech journalists. He's written for an absolute plethora of tech publications over the years and has worked for TechRadar on and off since 2015. Most famously, Zak led Maximum PC as its Editor-in-Chief from 2020 through to the end of 2021, having worked his way up from Staff Writer. Zak currently writes for Maximum PC, TechRadar, PCGamesN, and Trusted Reviews. He also had a stint working as Corsair's Public Relations Specialist in the UK, which has given him a particularly good insight into the inner workings of larger companies in the industry. He left in 2023, coming back to journalism once more. When he's not building PCs, reviewing hardware, or gaming, you can often find Zak working at his local coffee shop as First Barista, or out in the Wye Valley shooting American Flat Bows.