I was saddened to learn recently of the death of one of the most influential individuals in the history of computing. This person’s elegant sense of design and his brilliant grasp of the potential inherent in computers to transform the world are visible everywhere one finds a computer. The Mac PowerBook on which I write this blog, as well as the server which hosts the UConn Today website, owe a large measure of their power to this individual’s beautiful work. I am speaking, of course, of Dennis Ritchie, one of the creators of the C programming language and the UNIX operating system, whose obituary appeared in the New York Times last week.
An operating system like UNIX is the basic control program for a computer. It mediates the interaction of the keyboard, disk drives, displays, and the computational engine of the computer. A programming language, like C, is a way to express the instructions a programmer wishes the computer to carry out in a relatively readable way. The language’s compiler program then converts the readable code into the machine instructions that the chips in the computer actually obey. What Ritchie, with Kernighan, Thompson and others, accomplished was to design a language, C, that was simple enough to be easily used on many different computer architectures yet powerful enough to make complicated programming feasible. At the same time, they wrote the UNIX operating system – a very complicated program – in C, making it possible to move UNIX to new machines as the underlying technology changed.
Perhaps most importantly, they were open about their methods. As a result, UNIX and C were quickly adopted by computer science researchers and a number of cutting-edge computing companies such as Sun Microsystems. Among the sophisticated companies to adopt UNIX was Steve Jobs’s NeXT. The technology moved from NeXT to Apple when that company was reborn under Jobs’s leadership. Mac OS X, the operating system installed on my Mac PowerBook, is a version of the UNIX operating system developed by Ritchie and his collaborators at Bell Labs back in the early 1970s.
I can’t help but reflect on the difference between Ritchie’s contribution and that of Steve Jobs, since they both died in the last month. As beautiful as Jobs’s work is, it is commercial to the core. The iPhone/iPad is a closed platform, and apps for it come from a central and tightly controlled source. Music bought from iTunes comes with strict controls – have you considered buying a Kindle Fire? How would you play your iTunes music on it? Yet all of the tightly licensed, tightly controlled Apple universe rests on the openly disseminated work of Ritchie and his collaborators. Indeed, the world of Apple.com could not exist without UNIX and the other critical infrastructure of computing and the Internet, whose developers made their work available to everyone.
Ritchie’s work is part of an open development stream that is parallel to, but less visible to the public than, the commercial world of Apple and Microsoft. That development stream includes Richard Stallman, the author of the famous text editor Emacs, who has opposed software patents and promoted the free dissemination of computer code; Linus Torvalds, who began the development of the UNIX-like operating system Linux and pioneered the notion of crowd-sourcing the development of free software; and Tim Berners-Lee, who invented the World Wide Web and released it so that he could better communicate with his fellow physicists.
Among computer scientists and programmers, the names Ritchie, Stallman, Torvalds, and Berners-Lee resonate in a way that Gates and Jobs never will. To fully appreciate Ritchie’s contribution, think for a moment about what has changed in computing since he and his collaborators designed and built C and UNIX around 1970. The Apple II would not be available until 1977; the IBM PC until 1981. The Internet did not begin to take shape until the late ’80s and early ’90s. UNIX was originally developed on a PDP-7, a $70,000 “mini-computer” that shipped with 4K of main memory (that’s 1 millionth of the memory on my laptop); C followed a few years later on its successor, the PDP-11. Yet their design proved capable of scaling across 40 years of advances in hardware and will likely continue to be used for the foreseeable future. Will we be able to say the same thing about the iPhone?
/* Dennis Ritchie was known by his login name, dmr */
/* Here's a farewell in the language that he co-authored */
#include <stdio.h>
int main(void)
{
    printf("Goodbye, dmr\n");
    return 0;
}