Why we should embrace AI, not fear it

The media hasn't got a clue about artificial intelligence (AI), or technology in general. One day the cry is 'robots are coming for your job'; the next it's fears of AI starting World War III.

Not only do robots and AI have very little to do with each other, but AI is at a very early stage. What's more, it can be split into several separate technologies.

The masses are being misled into fearing automation and a nebulous super-intelligence, but it's those with a working knowledge of AI – and of how it can be exploited – who will be best prepared for the future of work.

What is AI?

There is no precise answer to this question, but it has nothing to do with robot overlords. AI is a field of computer science that asks whether we can teach a computer to 'think'.

The phrase 'artificial intelligence' has been around since 1956, when it was coined by American computer scientist John McCarthy, six years after English mathematician Alan Turing published his seminal 1950 paper 'Computing Machinery and Intelligence'.

AI is generally split into various subsets that try to emulate specific things that humans do. Speech recognition mimics hearing, natural language processing mimics writing and speaking, image recognition and face scanning mimic sight, and machine learning mimics thinking.

That’s a lot of different, often unrelated technologies; AI is an umbrella term, not a single general-purpose technology.

Advances in processing power and the production of big data are fueling AI. Credit: IBM

Why is AI so hyped up?

Research into AI is currently riding the wave of increased computing power and big data. Together they make AI both possible and imperative; as a society we now produce far too much data to ever process ourselves, let alone extract insight from. Collected data is growing 40% a year, and most of it goes to waste.

The existence of all this data also means that AI software has enough information not only to work with, but to learn from. Is this AI’s big moment? Venture capitalists and technology giants such as Amazon, Google, Facebook, Microsoft and Apple think so, and are investing heavily in research.

It’s these companies that have unimaginably huge data sets collected in the last few decades, and a vested interest in automating tasks on that data. Together they’re becoming the arbiters of AI know-how, so it’s AI techniques developed by Google et al. that are being used by scientists to trawl through data to get new insights.

There’s about to be an AI-powered knowledge explosion.

AI will help scientists make incredible breakthroughs. Credit: Nasa

Supervised machine learning

Machine learning is the practice of training a computer to perform a task by example, rather than programming it explicitly. In practice it's about automating repetitive work: training a computer to recognize patterns and categorize data.

The classic example is image recognition, or 'AI vision': give a computer a large number of images containing labeled objects, and it can learn to identify those objects automatically. To do this, the computer builds what AI researchers call a neural network: a web of weighted connections loosely inspired by how neurons link up in the human brain.
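
To make that concrete, here's a minimal sketch of supervised learning in Python, using scikit-learn's bundled handwritten-digits dataset as a stand-in for any collection of labeled images. The dataset, model and parameters here are illustrative choices, not anything from the projects mentioned below.

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 1,797 labeled 8x8 images of handwritten digits, 0-9
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# A small neural network: it learns by comparing its guesses against
# the human-supplied labels and adjusting its internal weights
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
model.fit(X_train, y_train)

print(f"Accuracy on images it has never seen: {model.score(X_test, y_test):.0%}")

The crucial ingredient is the labels: without them, the model has nothing to check its guesses against.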

However, supervised learning like this takes a lot of human labor, because someone has to label all that data, as well as a lot of processing power. Google AI and the University of Texas at Austin recently used this approach on a labeled dataset of signals from the Kepler space telescope to discover two exoplanets that earlier searches had missed.

It's also being used to identify cracks in nuclear reactors, and even to help engineers at the UK's Joint European Torus facility in their efforts to harness nuclear fusion energy.

This is supervised machine learning, and while researchers are getting better at stopping networks 'forgetting' what they've already learned, its usefulness at predicting patterns is only as good as the labeled data it's fed.

AI is being used to help make nuclear fusion a reality. Credit: JET

Unsupervised machine learning

What if a computer system could teach itself, building algorithms guided not by humans, but by the data itself?

Unsupervised machine learning (also called 'true AI' by some) is what AI researchers really want to achieve. It's where you have only unlabeled data, and you ask the computer to learn without ever telling it what the right answers are.

For example, Google developed an image-recognition neural network and then let it loose on YouTube for a week to see if it could pick out common objects. It found cats, even though it had never been told what a cat was. For AI that's impressive, but it also shows the current limits of what AI is capable of.
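
You can get a feel for the unsupervised idea with a far humbler algorithm than Google's. Here's a sketch using k-means clustering in Python; the three 'blobs' of synthetic points are an invented stand-in for any unlabeled data, and the algorithm groups them purely by similarity.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Three groups of 2D points; the algorithm is never told which point
# belongs to which group, only how many clusters to look for
data = np.vstack([
    rng.normal(loc=(0, 0), scale=0.5, size=(100, 2)),
    rng.normal(loc=(5, 5), scale=0.5, size=(100, 2)),
    rng.normal(loc=(0, 5), scale=0.5, size=(100, 2)),
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(data)
print(kmeans.cluster_centers_)  # lands near (0,0), (5,5) and (0,5)

Like Google's cat-spotter, it discovers structure in data without being given any right answers; unlike Google's network, it works at toy scale.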

However, a similar image-recognition network, Google's DeepVariant, is now being used to accurately identify mutations in DNA sequences that are presented to the computer as images. The AI is essentially spotting the mistakes made by DNA-sequencing machines, gaining insight from data where there would otherwise have been none. AI is also being used to spot fake paintings.

This is what AI is being used for: making computers better at their jobs.

AI doesn't come close to the abilities of the human brain

Neural networks

This is just one of many machine learning techniques. Neural networks loosely mimic what happens in the human brain, but don't think for a moment that AI is on the verge of replicating humans. A neural network can handle hundreds, thousands or sometimes millions of inputs, with data flowing one way through its layers.
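
To show how simple the underlying machinery really is, here's a bare-bones forward pass in Python with NumPy. The layer sizes are arbitrary, and a real network would also need training code to set the weights; this is a sketch of the structure, nothing more.

import numpy as np

def forward(x, w1, b1, w2, b2):
    # Data flows one way: inputs -> hidden layer -> output, no loops back
    hidden = np.maximum(0, x @ w1 + b1)  # ReLU: each 'neuron' either fires or doesn't
    return hidden @ w2 + b2

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))                     # 4 inputs
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # 4 inputs -> 8 hidden neurons
w2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # 8 hidden -> 1 output

print(forward(x, w1, b1, w2, b2))  # weights are random, so the output means nothing until trained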

It's clever stuff, but the human brain has billions of interconnected neurons; we are all several orders of magnitude more complex than AI. So when you hear the phrase 'deep learning', keep it in context; true computer intelligence and artificial general intelligence (AGI) are some way away.

Will AI 'take our jobs'?

There is a lot of fear about AI taking people's jobs. It's made worse by the fact that many economies are experiencing slow growth and job insecurity. AI is about making computers more capable, which will have a significant impact on how society runs. A lot of routine work will be automated, reducing administrative workload.

It means people will be able to concentrate on higher-value work without the soul-destroying report-writing duties. It means scientists will make more discoveries, doctors will have access to more cutting-edge knowledge and save more lives, and police will be able to do more policing.

AI is about boosting productivity, and it may spawn a thousand startups that carve out new markets and industries.

Robots like Relay don't even use AI. Credit: Savioke

The future for AI

AI is a way for computer scientists to get computers to catch up with the reality of big data, and to take on tedious, repetitive tasks that the sheer deluge of data we're now surrounded by has put far beyond human capacity.

It’s a basket of techniques, not a general-purpose technology, and it's not about to automate everything.

Although it will have an effect on many industries, any business adopting AI will need a persuasive business case – most probably a really specific, narrow problem to solve – as well as the services of data scientists who specialize in AI, and plenty of well-ordered data for the AI to learn from.

Will AI change everything? Perhaps, or maybe the hype – and the funding for research – will dry up as researchers hit a wall. After all, AI is already on the verge of becoming a bland marketing term to sell phones. Even if these early days of AI do prove a significant milestone for humanity, it's probably going to be a slow-burner.

However, what we do know for sure is that having an understanding of AI is going to become more important for more professions. For all of us living through the data explosion era, AI is the missing piece of the jigsaw.

TechRadar's AI Week is brought to you in association with Honor.

Jamie Carter

Jamie is a freelance tech, travel and space journalist based in the UK. He's been writing regularly for TechRadar since it launched in 2008, and also writes regularly for Forbes, The Telegraph, the South China Morning Post, Sky & Telescope and Sky at Night magazine, as well as other Future titles including T3, Digital Camera World, All About Space and Space.com. He also edits two of his own websites, TravGear.com and WhenIsTheNextEclipse.com, which reflect his obsession with travel gear and solar eclipse travel. He is the author of A Stargazing Program For Beginners (Springer, 2015).
