MattAndJojang's Blog

God. Life. Spirituality.

Dennis Ritchie: The Shoulders Steve Jobs Stood On

with 4 comments

Dennis Ritchie (standing) and Ken Thompson at a PDP-11 in 1972. (Photo: Courtesy of Bell Labs)

The tributes to Dennis Ritchie won’t match the river of praise that spilled out over the web after the death of Steve Jobs. But they should.

And then some.

“When Steve Jobs died last week, there was a huge outcry, and that was very moving and justified. But Dennis had a bigger effect, and the public doesn’t even know who he is,” says Rob Pike, the programming legend and current Googler who spent 20 years working across the hall from Ritchie at the famed Bell Labs.

On Wednesday evening, with a post to Google+, Pike announced that Ritchie had died at his home in New Jersey over the weekend after a long illness, and though the response from hardcore techies was immense, the collective eulogy from the web at large doesn’t quite do justice to Ritchie’s sweeping influence on the modern world. Dennis Ritchie is the father of the C programming language, and with fellow Bell Labs researcher Ken Thompson, he used C to build UNIX, the operating system that so much of the world is built on — including the Apple empire overseen by Steve Jobs.

“Pretty much everything on the web uses those two things: C and UNIX,” Pike tells Wired. “The browsers are written in C. The UNIX kernel — that pretty much the entire Internet runs on — is written in C. Web servers are written in C, and if they’re not, they’re written in Java or C++, which are C derivatives, or Python or Ruby, which are implemented in C. And all of the network hardware running these programs I can almost guarantee were written in C.

“It’s really hard to overstate how much of the modern information economy is built on the work Dennis did.”

Even Windows was once written in C, he adds, and UNIX underpins both Mac OS X, Apple’s desktop operating system, and iOS, which runs the iPhone and the iPad. “Jobs was the king of the visible, and Ritchie is the king of what is largely invisible,” says Martin Rinard, professor of electrical engineering and computer science at MIT and a member of the Computer Science and Artificial Intelligence Laboratory.

“Jobs’ genius is that he builds these products that people really like to use because he has taste and can build things that people really find compelling. Ritchie built things that technologists were able to use to build core infrastructure that people don’t necessarily see much anymore, but they use every day.”

From B to C

Dennis Ritchie built C because he and Ken Thompson needed a better way to build UNIX. The original UNIX kernel was written in assembly language, but they soon decided they needed a “higher level” language, something that would give them more control over all the data that spanned the OS. Around 1970, they tried building a second version with Fortran, but this didn’t quite cut it, and Ritchie proposed a new language based on a Thompson creation known as B.

Depending on which legend you believe, B was named either for Thompson’s wife Bonnie or BCPL, a language developed at Cambridge in the mid-60s. Whatever the case, B begat C.

B was an interpreted language — meaning it was executed by an intermediate piece of software running atop a CPU — but C was a compiled language. It was translated into machine code, and then directly executed on the CPU. But in those days, C was considered a high-level language. It would give Ritchie and Thompson the flexibility they needed, but at the same time, it would be fast.

That first version of the language wasn’t all that different from C as we know it today — though it was a tad simpler. It offered full data structures and “types” for defining variables, and this is what Ritchie and Thompson used to build their new UNIX kernel. “They built C to write a program,” says Pike, who would join Bell Labs 10 years later. “And the program they wanted to write was the UNIX kernel.”

Ritchie’s running joke was that C had “the power of assembly language and the convenience of … assembly language.” In other words, he acknowledged that C was a less-than-gorgeous creation that still ran very close to the hardware. Today, it’s considered a low-level language, not high. But Ritchie’s joke didn’t quite do justice to the new language. In offering true data structures, it operated at a level that was just high enough.

“When you’re writing a large program — and that’s what UNIX was — you have to manage the interactions between all sorts of different components: all the users, the file system, the disks, the program execution, and in order to manage that effectively, you need to have a good representation of the information you’re working with. That’s what we call data structures,” Pike says.

“To write a kernel without a data structure and have it be as consistent and graceful as UNIX would have been a much, much harder challenge. They needed a way to group all that data together, and they didn’t have that with Fortran.”
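A C struct makes Pike's point concrete. The sketch below is purely illustrative — the field names and sizes are invented, not taken from any real UNIX source — but it shows the idea: everything the kernel needs to know about one open file travels as a single unit, rather than as parallel arrays of names, owners, and offsets, which is roughly what Fortran of that era would have forced.

```c
/* Hypothetical sketch of kernel bookkeeping -- not actual UNIX code.
 * The struct groups every fact about one open file into one unit. */
struct file {
    char name[14];        /* file name, sized like an early UNIX dirent */
    unsigned short uid;   /* owning user */
    long offset;          /* current read/write position */
    int flags;            /* open mode, etc. */
};

/* The whole group can be passed around by a single pointer. */
void file_rewind(struct file *f)
{
    f->offset = 0;
}
```

Every subsystem Pike lists — users, the file system, disks, program execution — gets the same treatment: one struct per kind of thing, passed by pointer between the routines that manage it.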

At the time, it was an unusual way to write an operating system, and this is what allowed Ritchie and Thompson to eventually imagine porting the OS to other platforms, which they did in the late 70s. “That opened the floodgates for UNIX running everywhere,” Pike says. “It was all made possible by C.”

Apple, Microsoft, and Beyond

At the same time, C forged its own way in the world, moving from Bell Labs to the world’s universities and to Microsoft, the breakout software company of the 1980s. “The development of the C programming language was a huge step forward and was the right middle ground … C struck exactly the right balance, to let you write at a high level and be much more productive, but when you needed to, you could control exactly what happened,” says Bill Dally, chief scientist of NVIDIA and Bell Professor of Engineering at Stanford. “[It] set the tone for the way that programming was done for several decades.”

As Pike points out, the data structures that Ritchie built into C eventually gave rise to the object-oriented paradigm used by modern languages such as C++ and Java.
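You can see the lineage in plain C. The example below is a sketch of the general idea rather than any particular codebase: pairing a struct's data with function pointers bundles state and behavior into one unit, which is essentially the "object" that C++ and Java later made a first-class language feature.

```c
/* Illustrative sketch: a struct carrying both data and behavior,
 * anticipating the objects of C++ and Java. Names are invented. */
struct counter {
    int value;
    void (*increment)(struct counter *self);  /* a "method" pointer */
};

static void counter_increment(struct counter *self)
{
    self->value++;
}

/* A crude "constructor" wiring the method to the data. */
struct counter make_counter(void)
{
    struct counter c = { 0, counter_increment };
    return c;
}
```

Calling `c.increment(&c)` is, in spirit, what `c.increment()` does in an object-oriented language: the explicit `self` pointer became the implicit `this`.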

The revolution began in 1973, when Ritchie published his research paper on the language, and five years later, he and colleague Brian Kernighan released the definitive C book: The C Programming Language. Kernighan had written the early tutorials for the language, and at some point, he “twisted Dennis’ arm” into writing a book with him.

Pike read the book while still an undergraduate at the University of Toronto, picking it up one afternoon while heading home for a sick day. “That reference manual is a model of clarity and readability compared to later manuals. It is justifiably a classic,” he says. “I read it while sick in bed, and it made me forget that I was sick.”

Like many university students, Pike had already started using the language. It had spread across college campuses because Bell Labs started giving away the UNIX source code. Among so many other things, the operating system gave rise to the modern open source movement. Pike isn’t overstating it when he says the influence of Ritchie’s work can’t be overstated, and though Ritchie received the Turing Award in 1983 and the National Medal of Technology in 1998, he still hasn’t gotten his due.

As Kernighan and Pike describe him, Ritchie was an unusually private person. “I worked across the hall from him for more than 20 years, and yet I feel like I didn’t know him all that well,” Pike says. But this doesn’t quite explain his low profile. Steve Jobs was a private person, but his insistence on privacy only fueled the cult of personality that surrounded him.

Ritchie lived in a very different time and worked in a very different environment than someone like Jobs. It only makes sense that he wouldn’t get his due. But those who matter understand the mark he left. “There’s that line from Newton about standing on the shoulders of giants,” says Kernighan. “We’re all standing on Dennis’ shoulders.”

~Cade Metz, WIRED


Written by MattAndJojang

October 17, 2011 at 10:21 am

4 Responses


  1. I knew almost none of this. I’m one of those “end users” who treats a computer like a car. I want to turn the key and have it “go”. Still, the history is fascinating, and I’m beginning to appreciate some of the minds that helped to bring this new world about.

    I must say – the thought of someone reading a book about programming language while sick in bed is something I find amusing, terrifying and astonishing, all at once!


    October 19, 2011 at 9:04 pm

  2. I was kind of restless when I was in college and was jumping from one course to another, until my Dad suggested I take up computer science. He had just come from the US and told me computers were the thing of the future (this was in the early ’80s). I followed his suggestion, and I never regretted it. I was hooked (you might say I found my true calling), and I spent a significant part of my adult life in the computer industry, mostly developing business software for corporations.

    My fascination with computers has never waned since then. I’ve seen the birth of the personal computer, and since the ’80s I’ve watched computer technology grow by leaps and bounds. From a machine that only “nerds” (that’s what my wife calls me :-)) like me could program and use, it’s now evolved into something that most people, from all walks of life, can use in whatever profession they’re engaged in.

    There was a time when, like the guy in the article, I ate, drank, and slept computers, and my idea of a good time, after spending 16 hours in the office (this was normal in the computer industry), was lying in bed reading computer manuals and programming books! I can understand why most people would find that strange, to say the least. In your words, “amusing, terrifying and astonishing, all at once!” 🙂

    Anyway, I’d like to think that I’m a more balanced person now. Those days when I was too absorbed with computer technology are over (although from time to time I still keep track of developments in the industry). I’m now pursuing other interests. I just posted this article to honor one of my heroes in the computer industry, Dennis Ritchie, who passed away a few days ago. (I read his book, “The C Programming Language,” many years ago. The book is probably hidden somewhere in our house.)



    October 20, 2011 at 9:07 am

  3. This is not only the end of Dennis Ritchie, it is the end of an era. I’m from India and a great fan of D.R. His contribution and dedication are unforgettable. Without him, no one could have imagined that a machine would work the way it does. He gave us the power of imagination and re-thinking. Thanks, Ritchie. You will be missed, as always.

    Rohit Andani

    October 21, 2011 at 11:28 am

  4. I agree, Rohit, 100%! This is indeed the end of an era!

    A private and self-deprecating person, he downplayed his achievements. When asked about C (the programming language he invented), he replied: “C is quirky, flawed, and an enormous success.”

    Nevertheless, no one can deny his achievements and contributions to computer technology. It would be safe to say that without Dennis Ritchie, there would be no Steve Jobs & Bill Gates. In fact, there would be no Microsoft, Apple, or Google!

    His pioneering & groundbreaking work on computer languages and operating systems is the basis of computer software today. Our present-day computer technology stands on the shoulders of Dennis Ritchie.



    October 21, 2011 at 12:05 pm
