The UNIX OS – 50 years and counting
2019 marks the year the UNIX OS turns 50
It’s 50 years since a team of computer scientists created the first version of an operating system (OS) that would revolutionise modern-day computing. Created as a multi-tasking system for multiple users, the UNIX OS continues to influence the world five decades later.
Today’s operating systems broadly fall into one of two families: Microsoft’s Windows NT-based line, and the rest, the majority of which can be traced back to the UNIX OS. So, whether you’re running Linux, macOS, Android, iOS, Chrome OS or a PlayStation, the chances are you’re using an OS based on the UNIX approach.
Andrew Josey, VP of standards and certification at The Open Group.
Solid foundations
The UNIX developers decided early on to rewrite the OS in the C language, which allows it to run on a wide range of hardware architectures. At the time, operating systems were typically tied to the platforms they were written for, because they were written in low-level assembly language.
Its founding principles centered on simplicity and modular software development: clear, concise and extensible code that is easy to repurpose and maintain. The approach proved extremely popular, and the ‘UNIX Philosophy’ became highly regarded for this modular software design. Its portability also led to it being quickly and widely adopted across commercial, research and academic organisations.
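To make that philosophy concrete, here is a minimal sketch (illustrative only, not code from any actual UNIX release) of the classic building block it produced: a small filter, written in C, that does one job, reading standard input and writing standard output so it can be chained with other small tools through pipes.

```c
/* upcase.c - an illustrative sketch of the UNIX "do one thing well" style.
 * The program is a filter: it reads text from standard input, transforms
 * it, and writes the result to standard output, so it composes with other
 * small tools via pipes.
 */
#include <stdio.h>
#include <ctype.h>

int main(void)
{
    int c;

    /* Read one character at a time until end of input. */
    while ((c = getchar()) != EOF) {
        /* Do exactly one job: fold each character to upper case. */
        putchar(toupper(c));
    }

    return 0;
}
```

A hypothetical pipeline such as `./upcase < notes.txt | sort | uniq` shows the pay-off: each tool stays small and simple, and the shell combines them into something more capable.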
In force today
UNIX design elements exist in numerous forms today, and the OS’s role in the evolution of computing is evident across entire infrastructures. Having enabled technologies such as cloud computing, security, virtualisation and mobility, it’s integral to the foundation of technologies ranging from cloud Function-as-a-Service to serverless computing.
The Internet was built on UNIX systems from the 1970s onwards, and the first World Wide Web server ran on a UNIX-based system in 1990. Sectors including manufacturing, government, healthcare and financial services have adopted it in huge numbers, and its impact is still evident across numerous Fortune 100 companies today.
Modern-day examples include its use in the Human Genome Project as the platform for decoding the human genome, and in the render farm of UNIX systems behind Toy Story, the first full-length CGI animated film from Disney-Pixar. Most of today’s ATMs and air traffic control platforms also run on UNIX-derived systems, among numerous other current implementations.
A standards journey
The UNIX OS definition sits at the forefront of open standards, but it didn’t start that way. In its early years, the code was made available for just the cost of media and shipping; later, it was licensed commercially by AT&T.
Commercial offerings increased rapidly as adoption grew in academia and the system was developed further by AT&T and the University of California, Berkeley, among others. Multiple variants emerged, some of them incompatible, and this period, known as the ‘UNIX Wars’, led to a call for standardisation.
During this time, the Institute of Electrical and Electronics Engineers published the POSIX® standard – a notable industry first that stimulated federal procurement. Subsequently, the UNIX trademark was transferred from AT&T, via Novell, to The Open Group to hold in trust, in light of its vendor- and technology-neutral stance. This move led to the creation of the Single UNIX Specification and the UNIX certification programme.
Today, The Open Group collaborates closely with the UNIX community to maintain and evolve the standard. This involves allowing the standard’s documentation to be reused in open source projects, delivering test tools, making the documentation freely available on the web, and managing the UNIX and POSIX certification programmes.
Given its open nature, today’s UNIX platform allows users to focus on innovation rather than competition, while offering solution integrators a choice over their preferred foundation. It also gives software developers a degree of portability that lets them spend less time on integration issues and more time solving business problems.
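As a small, hypothetical illustration of that portability (the file name and output format are invented for this article), the C program below calls only interfaces defined by POSIX and the Single UNIX Specification, so it should compile and behave the same on any UNIX-certified or POSIX-conforming system.

```c
/* sysinfo.c - a minimal sketch of POSIX-level portability.
 * Every call here is defined by POSIX / the Single UNIX Specification,
 * so no platform-specific #ifdefs are needed.
 */
#include <stdio.h>
#include <unistd.h>        /* getpid, sysconf */
#include <sys/utsname.h>   /* uname */

int main(void)
{
    struct utsname u;

    /* uname() reports the OS name, release and hardware in a standard struct. */
    if (uname(&u) == -1) {
        perror("uname");
        return 1;
    }

    printf("System:         %s %s (%s)\n", u.sysname, u.release, u.machine);
    printf("Process ID:     %ld\n", (long)getpid());

    /* sysconf() queries run-time values defined by the standard. */
    printf("Page size:      %ld bytes\n", sysconf(_SC_PAGESIZE));
    printf("Max open files: %ld\n", sysconf(_SC_OPEN_MAX));

    return 0;
}
```

The same source builds unchanged with any conforming C compiler, which is the practical meaning of the portability the standard promises.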
A promising future
Much of the tech community expects the UNIX approach to remain the operating system of choice in scientific and technical computing circles for some time to come, but its influence is set to reach far wider.
No other OS has had such a wide-ranging impact on modern-day technology, and the shape of tomorrow’s computing ecosystem means it’s set to have even greater influence.
As we move into a future of widely connected and diverse computing environments, platforms that stimulate business and market innovation will be in high demand. These environments will need a strong OS foundation to drive business continuity and scalability.
As an enabler of key technologies that also delivers a reduced total cost of ownership, increased IT agility, stability, and interoperability in heterogeneous environments, the UNIX OS is set to meet these requirements perfectly – giving it the edge once more.
Andrew Josey is the VP of standards and certification at The Open Group. He is responsible for the Certification business and the Standards Process within The Open Group, specializing in collaboration and the delivery of standards and related intellectual property (publications, test suites and training materials). Andrew is an advocate for open systems, the UNIX system and open source. He has more than 20 years’ experience of running industry certification programs, including the UNIX system, IEEE POSIX standards, the Linux Standard Base, the ArchiMate modeling language and the TOGAF framework.