An ode to Intel’s 4004 processor: the one that started it all

Intel 4004 50th Anniversary
(Image credit: Intel)

1971 – what a year that was – and I'm afraid I can remember most of it. In November 1971, the Mariner 9 spacecraft entered orbit around Mars, the album Led Zeppelin IV was released and Oman was granted independence from the United Kingdom. I met Led Zeppelin guitarist Jimmy Page in the 1970s – but that's a different story altogether.

And in 1971 Intel started advertising a chip called the 4004. I don't remember noticing that at the time but the semiconductor company has dominated my working life from 1987, until, er, now.

Intel historian Elizabeth Jones claims a lot for the introduction of the 4004. She says that without it we would not have smartphones, AI that recognises facial expressions, live mapping systems and the rest. She doesn't say that Intel failed spectacularly in the late 1990s to recognise that smartphones and tablets would come to overturn the hegemony of desktop and even notebook PCs.

Intel 4004

(Image credit: Intel)

Originally, the 4004 was commissioned by a Japanese company that wanted 12 custom chips for a prototype calculator, but engineers Ted Hoff, Stan Mazor and Federico Faggin managed to do the job with a four-chip set – including the 4004 – launched in November 1971. It wasn't until some five years later, though, that Intel really began to draw on those ideas and “shrink the die”.

Malcolm Penn, CEO of UK semiconductor analyst firm Future Horizons, told TechRadar about the problems the 4004 overcame.

“The symbiotic combination of visionary IC system design and programmability gave birth to the world’s first general purpose microprocessor”, he said.

“An iconic ‘rock star’ design, it stands head and shoulders alongside the ubiquitous 7400, 702/09/41 and 1103 as pioneering revolutionary ICs (integrated circuits).”

He said that by the end of the 1960s, a single MOS (metal oxide semiconductor) IC could contain 100 or more logic gates, although that technology was much slower than chips using TTL (transistor-transistor logic). The 4004 had 2,300 transistors. Contrast that with Intel's most complex system-on-chip today, Ponte Vecchio, detailed in August 2021, which packs around 100 billion transistors.

“Whilst its success in the marketplace was limited, quickly superseded by second and subsequent generation designs, it opened the floodgates for the microprocessor era”, Penn added.

The story is a little more complicated than Intel suggests.

Busicom Calculator

(Image credit: Intel)

According to Ken Shirriff, a former Google programmer and a historian of semiconductors, the 4004, created for Japanese calculator firm Busicom, had competitors, including Mostek and Texas Instruments, which put entire calculators on a single chip.

But the genius of Intel and its engineers was that the 4004 could be used for general-purpose computing. The company went on to develop ever better manufacturing processes for its range of microprocessors, in line with Intel co-founder Gordon Moore's 1965 observation that the number of transistors on a chip doubles roughly every couple of years while the price of chips halves (ed: now universally known as Moore's Law).
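To put those two transistor counts side by side, here's a minimal back-of-the-envelope sketch in Python. It assumes a strict two-year doubling period purely for illustration, and uses the 2,300 and roughly 100 billion figures quoted above; none of this is an official Intel calculation.

```python
# Back-of-the-envelope Moore's Law illustration.
# The 2,300 (4004, 1971) and ~100 billion (2021 SoC) figures come from the
# article; the two-year doubling period is an assumption for illustration.
import math

START_YEAR, START_TRANSISTORS = 1971, 2_300
END_YEAR, END_TRANSISTORS = 2021, 100_000_000_000
DOUBLING_PERIOD_YEARS = 2  # assumed cadence

# Transistor count predicted by a strict two-year doubling from 1971 to 2021.
doublings = (END_YEAR - START_YEAR) / DOUBLING_PERIOD_YEARS
predicted = START_TRANSISTORS * 2 ** doublings

# How many doublings the actual growth from 2,300 to ~100 billion represents.
actual_doublings = math.log2(END_TRANSISTORS / START_TRANSISTORS)

print(f"Predicted after {doublings:.0f} doublings: {predicted:,.0f} transistors")
print(f"Actual growth corresponds to about {actual_doublings:.1f} doublings")
print(f"i.e. roughly one doubling every "
      f"{(END_YEAR - START_YEAR) / actual_doublings:.1f} years")
```

Run as written, the sketch predicts roughly 77 billion transistors after 25 doublings, and shows the real 1971-to-2021 growth works out at about one doubling every two years – which is why the 4004-to-Ponte Vecchio comparison tracks Moore's Law so closely.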

Intel 8080

(Image credit: Intel)

The 4004 was followed by the 8008 and then the 8080, which led in turn to the 16-bit 8086 and its 8088 variant, the chip IBM picked for its original PC. A myriad of other PC companies sprang up in the late 1970s and early 1980s to compete with Big Blue.

Shirriff wrote in 2016 that the “honors for creating the first microprocessor depend on how you define the word. Some define a microprocessor as a CPU on a chip. Others say that all that's required is an arithmetic logic unit on a chip. Still others would allow these functions to be packaged in a few chips, which would effectively make up the microprocessor”.

There was no guarantee at the time that Intel and x86 technology would come to dominate the microprocessor market, said Shirriff; there's plenty more fascinating computer history on his blog. Other factors also helped deliver Intel's success, not least its leadership.

Andy Grove

(Image credit: Intel)

One key figure in that success was the late Andy Grove, a Hungarian refugee who was Intel's third employee and later its third CEO. He coined the phrase “Only the paranoid survive”, meaning that a company must always keep its eyes on its competitors and avoid complacency, and he expanded that principle into a book of the same name in 1996.

I met Grove a couple of times. An obviously driven individual, he welcomed plain speaking, and the harder the questions, even from journalists, the better. He built those principles into the corporation's management culture, expecting employees to speak just as plainly to their bosses, which predictably caused a lot of friction at times.

But Grove also realised that engineering excellence alone wasn't enough to succeed. Intel has a history of very sophisticated, slick marketing to push its products and promote itself; some wit even coined the word “marchitecture” to describe the approach. It also employs a large worldwide team of public relations officers to inform journalists, and to fend off questions it doesn't necessarily want answered.

Intel Inside

(Image credit: Intel)

The marketing slogan “Intel Inside” and its accompanying jingle must have been heard by hundreds of millions of people.

Intel recently announced that it has no intention of switching to a “fabless” model, in which it would design chips that a third party manufactures. Instead, it intends to invest many billions in expanding its manufacturing capabilities: it has already started building two fabs, with plans for a third. CEO Pat Gelsinger, one of Andy Grove's protégés, wants the company to provide a third of semiconductor manufacturing worldwide.

Gelsinger obviously has faith in the future of x86 technology, although Apple has moved away, designing its own processors in the shape of the M1 and its successors. Could that switch mark the beginning of a change of direction in the microprocessor market? It's far too early even to say the jury's out; the case hasn't yet got to court. And Microsoft and Intel remain joined at the hip in the x86 waltz.

The really big questions are how much longer Moore's Law will hold and whether quantum computing technologies will topple current microprocessor designs. If and when that happens, it will be the end of the road for x86 technology.

We've also featured the best Intel processors

Mike Magee

Mike Magee has written about tech since the 1980s, edited many titles and has written many, many words in his lifetime. His speciality is news writing and he takes a keen interest in semiconductors. He co-founded The Register and founded The INQUIRER.
