group with the subject line "What would you like to see most in Minix?" He was adding features to his clone, and he wanted to take a poll about where he should add next.

Torvalds already had some good news to report. "I've currently ported bash (1.08) and GCC (1.40), and things seem to work. This implies that I'll get something practical within a few months," he said.

At first glance, he was making astounding progress. He created a working system with a compiler in less than half a year. But he also had the advantage of borrowing from the GNU project. Stallman's GNU project group had already written a compiler (GCC) and a nice text user interface (bash). Torvalds just grabbed these because he could. He was standing on the shoulders of the giants who had come before him.
The core of an OS is often called the "kernel," which is one of the strange words floating around the world of computers. When people are being proper, they note that Linus Torvalds was creating the Linux kernel in 1991. Most of the other software, like the desktop, the utilities, the editors, the web browsers, the games, the compilers, and practically everything else, was written by other folks. If you measure this in disk space, more than 95 percent of the code in an average distribution lies outside the kernel. If you measure it by user interaction, most people using Linux or BSD don't even know that there's a kernel in there. The buttons they click, the websites they visit, and the printing they do are all controlled by other programs that do the work.
Of course, measuring the importance of the kernel this way is stupid. The kernel is sort of the combination of the mail room, boiler room, kitchen, and laundry room for a computer. It's responsible for keeping the data flowing between the hard drives, the memory, the printers, the video screen, and any other part that happens to be attached to the computer.
In many respects, a well-written kernel is like a fine hotel. The guests check in, they're given a room, and then they can order whatever they need from room service and a smoothly oiled concierge staff. Is this new job going to take an extra 10 megabytes of disk space? No problem, sir. Right away, sir. We'll be right up with it. Ideally, the software won't even know that other software is running in a separate room. If that other program is a loud rock-and-roll MP3 playing tool, the other
FreeForAll/1-138/repro 4/21/00 11:44 AM Page 56
software won’t realize that when it crashes and burns up its own room.
The hotel just cruises right along, taking care of business.
In 1991, Torvalds had a short list of features he wanted to add to the kernel. The Internet was still a small network linking universities and some advanced labs, and so networking was a small concern. He was only aiming at the 386, so he could rely on some of the special features that weren't available on other chips. High-end graphics hardware cards were still pretty expensive, so he concentrated on a text-only interface. He would later fix all of these problems with the help of the people on the Linux kernel mailing list, but for now he could avoid them.
Still, hacking the kernel means anticipating what other programmers might do to ruin things. You don't know if someone's going to try to snag all 128 megabytes of RAM available. You don't know if someone's going to hook up a strange old daisy-wheel printer and try to dump a PostScript file down its throat. You don't know if someone's going to create an endless loop that's going to write random numbers all over the memory. Stupid programmers and dumb users do these things every day, and you've got to be ready for it. The kernel of the OS has to keep things flowing smoothly between all the different parts of the system. If one goes bad because of a sloppy bit of code, the kernel needs to cut it off like a limb that's getting gangrene. If one job starts heating up, the kernel needs to try to give it all the resources it can so the user will be happy. The kernel hacker needs to keep all of these things straight.
Creating an operating system like this is no easy job. Many of the commercial systems crash frequently for no perceptible reason, and most of the public just takes it. Many people somehow assume that it must be their fault that the program failed. In reality, it's probably the
"Microsoft now acknowledges the existence of a bug in the tens of millions of copies of Windows 95 and Windows 98 that will cause your computer to 'stop responding (hang)'—you know, what you call crash—after exactly 49 days, 17 hours, 2 minutes, and 47.296 seconds of continuous operation. . . . Why 49.7 days? Because computers aren't counting the days. They're counting the milliseconds. One counter begins when Windows starts up; when it gets to 2³² milliseconds—which happens to be 49.7 days—well, that's the biggest number this counter can handle. And instead of gracefully rolling over and starting again at zero, it manages to bring the entire operating system to a halt."—James Gleick in the New York Times
kernel's. Or more precisely, it's the kernel designer's fault for not anticipating what could go wrong.
By the mid-1970s, companies and computer scientists were already experimenting with many different ways to create workable operating systems. While the computers of the day weren't very powerful by modern standards, the programmers created operating systems that let tens if not hundreds of people use a machine simultaneously. The OS would keep the different tasks straight and make sure that no user could interfere with another.
As people designed more and more operating systems, they quickly realized that there was one tough question: how big should it be? Some people argued that the OS should be as big as possible and come complete with all the features that someone might want to use. Others countered with stripped-down designs that came with a small core of the OS surrounded by thousands of little programs that did the same thing.

To some extent, the debate is more about semantics than reality. A user wants the computer to be able to list the different files stored in one directory. It doesn't matter if the question is answered by a big operating system that handles everything or a little operating system that uses a program to find the answer. The job still needs to be done, and many of the instructions are the same. It's just a question of whether the instructions are labeled the "operating system" or an ancillary program.
But the debate is also one about design. Programmers, teachers, and the Lego company all love to believe that any problem can be solved by breaking it down into small parts that can be assembled to create the whole. Every programmer wants to turn the design of an operating system into thousands of little problems that can be solved individually. This dream usually lasts until someone begins to assemble the parts and discovers that they don't work together as perfectly as they should.
When Torvalds started crafting the Linux kernel, he decided he was going to create a bigger, more integrated version that he called a "monolithic kernel." This was something of a bold move because the academic community was entranced with what they called "microkernels." The difference is partly semantic and partly real, but it can be summarized by analogy with businesses. Some companies try to build large, smoothly integrated operations where one company controls all
the steps of production. Others try to create smaller operations that subcontract much of the production work to other companies. One is big, monolithic, and all-encompassing, while the other is smaller, fragmented, and heterogeneous. It's not uncommon to find two companies in the same industry taking different approaches and thinking they're doing the right thing.
The design of an operating system often boils down to the same decision. Do we want to build a monolithic core that handles all the juggling internally, or do we want a smaller, more fragmented model that should be more flexible as long as the parts interact correctly?

In time, the OS world started referring to this core as the kernel of the operating system. People who wanted to create big OSs with many features wrote monolithic kernels. Their ideological enemies who wanted to break the OS into hundreds of small programs running on a small core wrote microkernels. Some of the most extreme folks labeled their work a nanokernel because they thought it did even less and thus was even more pure than those bloated microkernels.
The word "kernel" is a bit confusing for most people because they often use it to mean a fragment of an object or a small fraction. An extreme argument may have a kernel of truth to it. A disaster movie always gives the characters and the audience a kernel of hope to which to cling.

Mathematicians use the word a bit differently and emphasize the word's ability to let a small part define a larger concept. Technically, a kernel of a function f is the set of values, x, such that f(x) = 0, or whatever the identity element happens to be. The action of the kernel of a function does a good job of defining how the function behaves with all the other elements. The algebraists study a kernel of a function because it reveals the overall behavior.
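In standard notation, the definition above can be written compactly, along with one small worked instance (assuming the additive identity, e = 0):

```latex
\ker f = \{\, x \mid f(x) = e \,\}
\qquad\text{e.g.}\qquad
f(x) = x^2 - 1 \;\Longrightarrow\; \ker f = \{-1,\, 1\}
```

Here the two points where f hits zero are enough to pin down where the parabola sits on the axis, which is the sense in which a small part defines the larger whole.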
The OS designers use the word in the same way. If they define the kernel correctly, then the behavior of the rest of the OS will follow. The small part of the code defines the behavior of the entire computer. If the kernel does one thing well, the entire computer will do it well. If it does one thing badly, then everything will suffer.

Many computer users often notice this effect without realizing why it
The kernel of f(x) = x² − 1 is {−1, 1}, and it illustrates how the function has two branches.