My Background, In General

I've been involved with computers since 1972 when I took my first programming course at Miami University as a requirement in their Engineering Technology program. After taking that first course and the two that followed in the course sequence, I decided to switch majors to the Systems Analysis program. I entered the job market as a business application programmer doing PL/I coding. I eventually transitioned into a little bit of IBM/370 system programming and then into system software programming for a Z-80-based microcomputer system. I finally made the change to UNIX and C where I have been happily working ever since. All of this is outlined in my resume.

On Being an Engineer

According to "Webster ..."

Main Entry: engineer
Function: noun
Etymology: Middle English engineour, from Anglo-French, from enginer to devise, construct, from engin
Date: 14th century
1 : a member of a military group devoted to engineering work
2 obsolete : a crafty schemer : plotter
3 a : a designer or builder of engines b : a person who is trained in or follows as a profession a branch of engineering c : a person who carries through an enterprise by skillful or artful contrivance
4 : a person who runs or supervises an engine or an apparatus

I like definition 3c the best because it describes how I believe I approach my work. Some might call it being a "perfectionist," but I prefer being called an engineer.

In my first two jobs my title was "Programmer." When I moved on to my third job, my title changed to "Software Engineer." I liked the sound of it and thought of myself as someone who "carries through an enterprise" (plans their code out) "by skillful or artful contrivance" as opposed to being a simple coder.

When I started writing, I liked to approach the task in an engineering manner as well. I thought of myself as someone who plans their words, sentences, paragraphs, and chapters out "by skillful or artful contrivance" -- an Engineer of the written word as opposed to a simple writer.

When I started teaching classes, I determined that I wasn't simply teaching. I was transferring my knowledge to students who were open and receptive to it. So, I started planning my lectures with "skillful or artful contrivance" and started calling myself a "Knowledge Transference Engineer."

So, I write software using, write text about, and transfer knowledge on topics related to the UNIX operating system and C. Topics such as:

  • UNIX fundamentals
  • C programming
  • C++ programming
  • UNIX commands/tools
  • vi editor
  • Perl programming
  • Standard C library
  • Shell programming: sh, ksh, and csh
  • UNIX/C system calls
  • UNIX file systems and processes
  • Motif programming with the X Window System
  • Informix database
  • HyperText Markup Language
  • Internet usage
  • Java programming
In addition to these topics, I also have some basic experience with system administration concepts and other topics related to the X Window System. All of this work resulted in my having two books published through McGraw-Hill: UNIX for Application Developers and Motif Programming in the X Window System Environment. I have two completed but unpublished manuscripts relating to C and the standard C library, as well as a couple of other book ideas that have only made it through the outline stage.

For those of you who have attended one of the many classes I taught between 1985 and 2002, I have put together a "guestbook" I would like you to "sign." If you are a past student of mine, I would like you to leave me a note with the title of the class, the company and location where the class took place, and the dates the class took place. Also, feel free to leave a short comment if you like. (Note that your note won't appear in the "guestbook" immediately, and you have to type the Captcha into the box at the bottom of the entry form.)

One comment that appeared consistently in the student evaluations for the classes I taught was a compliment on my use of diagrams and charts in my explanations of class topics. It seems that I have a knack for explaining things graphically.

Several topics came up so frequently that I created a set of "handouts" or "references" (or "cheat sheets," if you will) that many of my students found generically useful. So much so (as I was recently made aware) that some of them are still in use today -- even after having been copied so many times as to make them almost unreadable.

So, for your further elucidation and edification, or even if you're just curious (or if your copy has gotten a little threadbare), here -- in PDF format -- is a selection of some of my more popular UNIX "cheat-sheets:"

Who Said UNIX was Hard?

You know, a lot of people like to complain about UNIX. It's real popular to complain about how hard it is to use, how cryptic its syntax is, how undecipherable its messages are. But no matter how much you dislike it, no matter how much you complain about it, there is one point of fact that is undeniable: Enough people have found UNIX useful and usable enough to keep it around and in use since 1969! And, without it, we most certainly would not have had the Internet as it exists today. There must be "something" about it that keeps it alive.

Actually, UNIX is no harder to learn than any other operating system, such as DOS or DEC's VAX/VMS. I believe there are two things that confuse most people. The first is that UNIX is architecturally different from most other systems; the second is that, from the beginning, there was no design or plan for its growth and evolution.

Architecturally, from the user's perspective, there are no commands for the UNIX operating system. Unless you are booting, debugging, or interacting with the kernel (the actual operating system software) directly, there are no commands that you use to direct the operating system to do something for you. DOS has commands, VMS has commands, almost every other operating system has commands -- UNIX does not.

Instead, UNIX uses a different philosophy. In the place of commands, the UNIX operating system supplies a set of utility programs (or, as they are sometimes called, tools) that are used to perform some task for you. And, in order for a user to execute any of these utilities, there must be a special utility running called a shell. The shell's responsibility is to read what you type at a keyboard and then direct UNIX as to what you want done. The shell is often referred to as a command interpreter. Many people think that you have to learn and memorize all of the available utilities. However, you really only need to learn a subset of the utilities -- only those utilities you find useful in your work.

Now what is so incredibly interesting, flexible, and useful about this architecture is the way that it allows you to combine the functions of the different utilities. Most modern UNIX systems are delivered with, or have the option to have installed, over 300 of these utility programs or tools. All of these software programs can be run by themselves to perform some data processing task, or they can be combined with one another. The two most common ways of combining utilities are through a mechanism called pipes, which uses the utilities as filters to process data in stages, and through batching a sequence of commands together in something called a shell script.
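
To make the pipes idea concrete, here is a minimal sketch in C (my illustration, not anything delivered with the system) that uses the standard popen(3) routine to run a small pipeline -- two utilities acting as filters -- and read its output:

    #include <stdio.h>

    int main(void)
    {
        /* Ask the shell to run a pipeline: pull the login field out
         * of the password file with cut, sort the results, and hand
         * them to us one line at a time.  Each stage is an ordinary
         * utility acting as a filter on the previous stage's output. */
        FILE *fp = popen("cut -d: -f1 /etc/passwd | sort", "r");
        char line[256];

        if (fp == NULL)
            return 1;

        while (fgets(line, sizeof(line), fp) != NULL)
            fputs(line, stdout);

        pclose(fp);
        return 0;
    }

Of course, typing the same pipeline directly at the shell prompt -- cut -d: -f1 /etc/passwd | sort -- does the same job with no C at all, which is rather the point.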

There are literally millions of combinations of tools available to a typical UNIX user. If one utility cannot be found to do the job, if some combination of utilities cannot be constructed with pipes or a shell script, you can write your own utility. That's right -- literally every UNIX command (utility or tool) on the system is either a C program or shell script that someone else wrote. And herein lies the second point of confusion.
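
As an illustration of just how little it takes to "write your own utility," here is a hedged sketch of a trivial filter -- call it upper, a made-up name -- that copies its standard input to its standard output, folding letters to upper case:

    #include <stdio.h>
    #include <ctype.h>

    /* upper -- a hypothetical home-grown filter: copy stdin to
     * stdout, converting lower-case letters to upper case.  Once
     * compiled and placed in your PATH, it combines with pipes
     * just like the utilities delivered with the system. */
    int main(void)
    {
        int c;

        while ((c = getchar()) != EOF)
            putchar(toupper(c));
        return 0;
    }

Compile it, drop it in a directory on your PATH, and who | upper | sort works exactly as you would expect.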

The original version of UNIX that was released to the "public" was a fairly small system. There were little more than 100 utility programs, and the whole system was described in a relatively small two-volume set of books. Then people started saying things like "Wouldn't it be nice if there were a utility that did ..." and, when they didn't find one, they wrote their own. The unfortunate aspect of this is that there was no central "authority" to direct the work. As a result, these tool developers started doing things in the manner they thought was correct, and not necessarily the same way that anybody else had done something similar in the past.

So, that is why the -d option of the cut command does the same thing that the -t option does on the sort command or that the -F option does on the awk command. And, the -l option is a valid option for the ls command while the -name option is a valid option for the find command. (You will note that the terms utility, tool, and command tend to be used interchangeably.) Combine this with the fact that UNIX was developed on an ASR-33 Teletype (if you have ever tried to type on one of these things you will immediately understand) with short, cryptic-looking command names and you will understand what has led some people to refer to UNIX as an expert-friendly system.

Now expert-friendly is not necessarily a bad thing. It simply means that once you have become expert with UNIX -- once you completely understand most of the underlying basic concepts and facilities -- it is a fairly friendly system to use. What you must absolutely understand about UNIX is that it was originally developed by programmers, for programmers, to use for programming. It was never intended to be used by users for day-to-day computing tasks.

There is a fable that made the rounds of UNIX installations in the early days that describes a car that Ken Thompson helped to design. Ken is the originator of the file-system concept that is central to the design of UNIX. The fable goes something like this:

Ken Thompson has an automobile which he helped design. Unlike most automobiles, it has neither speedometer, nor gas gauge, nor any of the numerous idiot lights which plague the modern driver. Rather, if the driver makes any mistake, a giant "?" lights up in the center of the dashboard. "The experienced driver," he says, "will usually know what's wrong."

And so it is with UNIX. The experienced user will know what commands are available; the experienced user will know what options to use; and the experienced user will know how to interpret any warning or error messages. And once you become familiar with the UNIX tools philosophy -- once you grok it in fullness -- the operating system, its tools, and utilities become a wonderfully efficient, flexible, and useful data processing and programming environment.

And, What's Not to Like About C?

It is always interesting to me to note that as much as people love to dislike UNIX, they tend to accept its implementation language much more readily -- The C Programming Language. What is so interesting about this is that C was developed to support UNIX. There were features that they wanted to put into UNIX that weren't supported in the original implementation language called B. So, C was developed with the features necessary to make implementing system software -- specifically UNIX -- much easier.

So I don't feel as strong a need to explain why C is good. Most folks understand that the simplicity, flexibility, power, and elegance available with C make it just about the best programming language ever invented. This is obvious from the number of software vendors -- companies such as Microsoft, Claris, Symantec, and others -- that regularly use C to implement their software products. And, from attendance in the C classes I teach, it is also obvious that other, non-software companies are using C to implement everything from regular business applications to heavy-duty scientific applications.

I used to be a PL/I fanatic and I can see some of PL/I's influence in C. PL/I used to be a pretty good language for structured programming, but C is the perfect structured-programming language. I have also done a bit of FORTRAN programming, and FORTRAN's influence on C is pretty obvious as well. FORTRAN used to be the best scientific language, but C has certainly surpassed it. The only drawback I've seen with C is in the implementation of certain types of business programs. It would be nice if C had the ability to perform manipulations with packed-decimal-like numbers. But then again, I once designed a set of C functions to do these types of manipulations -- implementation would not have been all that difficult.
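
For the curious, here is a minimal sketch (my own illustration here, not the set of functions mentioned above) of how packed-decimal-style arithmetic might be done in C: hold one decimal digit per byte with a fixed, implied decimal point, and add digit by digit with a carry, the way COBOL and PL/I runtimes do it underneath.

    #include <stdio.h>

    #define NDIGITS 7               /* digits per number; scale fixed at 2 */

    /* A toy packed-decimal-style number: one decimal digit per byte,
     * digits[0] most significant, with an implied decimal point two
     * digits from the right (dollars-and-cents style). */
    typedef struct {
        char digits[NDIGITS];
    } Decimal;

    /* dec_set -- load a Decimal from a digit string like "0001234"
     * (meaning 12.34).  Purely illustrative -- no validation. */
    static void dec_set(Decimal *d, const char *s)
    {
        int i;
        for (i = 0; i < NDIGITS; i++)
            d->digits[i] = s[i] - '0';
    }

    /* dec_add -- add digit by digit with carry, the way packed-decimal
     * hardware does it; returns the final carry (non-zero = overflow). */
    static int dec_add(Decimal *sum, const Decimal *a, const Decimal *b)
    {
        int i, carry = 0;
        for (i = NDIGITS - 1; i >= 0; i--) {
            int t = a->digits[i] + b->digits[i] + carry;
            sum->digits[i] = t % 10;
            carry = t / 10;
        }
        return carry;
    }

    int main(void)
    {
        Decimal a, b, c;
        int i;

        dec_set(&a, "0001234");     /* 12.34 */
        dec_set(&b, "0000066");     /*  0.66 */
        dec_add(&c, &a, &b);        /* 13.00 */

        for (i = 0; i < NDIGITS; i++)
            putchar(c.digits[i] + '0');
        putchar('\n');              /* prints 0001300 */
        return 0;
    }

No rounding errors, no binary floating point -- just the exact decimal arithmetic that business programs want.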

All in all, C has turned out to be one of the best languages for the implementation of all kinds of software. It is high-level enough to make it a productive application programming language, and it has enough low-level features to make it a useful system software implementation language. And, the ANSI standard has only made it better by ensuring the portability of software systems from one C implementation to another -- with one exception ...

The ANSI folks made a little boo-boo!

The following is my opinion and my opinion only. You may disagree with me -- and that's okay. People have tried to change my mind without success -- and that's okay too. It's my opinion and I think I'm entitled to it. I believe the new style of function definition and the corresponding prototype mechanism destroyed the beautiful orthogonality of C. Now, before you "jump all over me" let me explain ...

First, let me say that I understand the purpose and the intent of the new function mechanism. When I was just starting out in programming I loved those languages that caught the kind of errors this mechanism prevents. However, absolutely nowhere else in the language -- either before or after the ANSI definition -- do you find a list of declarations or definitions separated by commas. Everywhere else they must be separated by semicolons. Add to this the fact that literally everywhere else in the language that you can declare one item of a particular type you can declare multiple items of the same type in a comma-separated list. Since the comma serves a different purpose in the new function mechanism, this cannot be done with the function's parameters. And finally, the parameter names are required in the function definition -- obviously -- but they're optional in the prototype.
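
A short illustration of the asymmetry (ordinary ANSI C, nothing exotic assumed):

    /* Everywhere else, declarations are separated by semicolons,
     * and one type can introduce several names at once: */
    struct point {
        int x;                  /* semicolons between member declarations */
        int y;
    };
    int a, b, c;                /* three ints from one comma-separated list */

    /* But in a prototype the separators are commas, every parameter
     * carries its own type, and the names are optional: */
    int dist2(int, int, int, int);          /* legal: no names */

    /* ...while in the definition the names are required, and
     * "int x1, y1, x2, y2" is not allowed between the parentheses: */
    int dist2(int x1, int y1, int x2, int y2)
    {
        int dx = x2 - x1, dy = y2 - y1;
        return dx * dx + dy * dy;           /* squared distance */
    }

    /* For contrast, the pre-ANSI ("K&R") form -- comma lists and
     * semicolons, just like everywhere else in the language:
     *
     *     int dist2(x1, y1, x2, y2)
     *     int x1, y1, x2, y2;
     *     {
     *         ...
     *     }
     */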

It's really just a minor point and I don't let it hang me up when I'm programming. If I'm doing work that requires ANSI compliance, I use prototypes and the new style of function definition. If ANSI compliance is not required, I revert to the older style of function definitions and function return-type declarations. It's not a big deal -- really -- I just think that they "broke" what was once a beautiful language, and I do my best to adjust when I have to.

So, What Does Bill Use at Home?

Let me preface this by saying that I have wanted to own my own computer since that first Systems Analysis course at Miami University way back in 1972. Having an interest in electronics, I dreamed of building my own machine. Then there was the used electronic equipment catalog that passed my way with an RCA Spectra (an IBM/370 clone) system for sale. Both of these were dreams that I kept alive for quite a while.

Of course, I followed the personal computer "revolution." I received all the catalogs from Altair, IMSAI, Commodore, Sphere, and others. I lusted after a machine from a company called The Digital Group. But it wasn't until, oh, I'd say around 1983 when I visited one of the little computer stores that seemed to be popping up around Cincinnati that I finally made a decision.

As I walked through the store admiring and coveting the systems I saw, I couldn't help but notice one lone little computer sitting back in a corner that no one seemed to be paying attention to. I was drawn to this system because there were pictures on the screen instead of the more typical 24 lines of text. Then there was this mysterious box sitting next to the keyboard that, when moved, caused a little arrow to move a corresponding distance and direction on the screen. And, lo and behold, when I pressed the button on top of the box, things started happening on the screen.

It wasn't too long before I was pointing and clicking all over the screen of this Apple Lisa computer. The price was still pretty steep, but I remember telling myself "this is the future of computing." I just had to have one or something like it. Well, you all know the story -- the Lisa gave way to the Macintosh, which has, in fact, driven the current state of the art in computer user interfaces.

Now remember -- I am somewhat conservative when it comes to technical things. So, I didn't buy the first 128K Macintosh. I did, however, buy a 512K Mac which I later had upgraded to a 512KE. This lasted until I bought a Macintosh SE (with an internal hard disk -- wow!) and eventually led to a Mac IIvx (I know, I know -- no comments from the peanut gallery, okay?). I recently upgraded to a Power Mac -- a G3 "blue and white" tower: 300MHz, 256MB, 6G, etc. I should have waited a little bit longer for the G4, but that is usually what happens to me when I finally buy. Technology never slows down or stops. :-)

So, you ask, "what's a UNIX hack doing using a Mac?" Well, first of all, I stand by my original claim that the Macintosh has driven the current state of the art in computer user interfaces. Xerox originated the idea with their Star computer (which they didn't market very well), but Apple popularized the graphical user interface to the point that almost every popular system has one. If it weren't for Apple, the computer industry wouldn't be where it is today.

And, Macintosh and UNIX are not mutually exclusive. In addition to the Apple native operating system, I used to run a version of UNIX as a "concurrent" operating system on the same machine at the same time. One of the most interesting Macintosh applications of all time is Tenon Intersystems' MachTen, a complete implementation of version 4.3 of the Berkeley variant of the UNIX operating system. I didn't upgrade my MachTen for the Power Mac; instead, I bought a version of Linux available for the Mac. Several years ago -- via a fellow Grand Funk Railroad fan -- I was able to convert to Apple's version of UNIX, which they call OS X. I've become more than a little enamored with it, as it is the best implementation of a graphical user interface on UNIX I have ever experienced. My only criticism is that it could have more of a SVR4 flavor (which includes Berkeley) as opposed to standard BSD.

So, I have the best of both worlds on my home system. The user-friendliness of the Macintosh operating system and the expert-friendliness of the UNIX operating system. I'm in computing "heaven!"

Opinions On Object Oriented Obfuscation (OOOOO?)

I have followed the development of "object-oriented" technologies over the years. I believe that these technologies have had a positive impact on software development in particular and computing in general. However, I firmly believe that object technology is not the cure-all for the ills that seem to plague the modern programming industry. There are still many reasons to use structured programming techniques, and many uses for them.

In the past, too many people have rushed in to "jump on the bandwagon" simply because it's the latest "thing" to hit the programming industry. Too many people accepted the hype of the day that objects would improve programmer productivity and increase the maintainability and reusability of all programming systems. It used to upset me to hear about someone jumping up and saying "we've got to start using C++ and object-oriented programming to save the project."

What most people fail to understand is that you don't take a non-object programmer and turn them into an object programmer overnight. Using objects is a completely different -- what's the new word here? -- paradigm, one that must be learned completely from scratch. And, you don't just teach a programmer an object-oriented language! They must first learn object-oriented design techniques. Trying to program in an object-oriented language with a non-object-oriented design is like trying to fit the proverbial square peg in the round hole -- it just doesn't fit!

And guess what? Good object-oriented design takes significantly longer. Unless there is a firm commitment, starting from the highest levels of management, down through all levels of the programming staff, to all that object technology entails (including longer learning curves, longer design times, a required object-oriented design methodology, and the resulting longer design-to-delivery cycle time), it should be avoided. Object-anything is not a bandage that you can put on a project to suddenly make it better and fix it; nor is it something that should be taken lightly when implementing a new project.

What's that you say? You don't have to worry much about object design because you're going to use pre-programmed class (object) libraries? Well, here lies the biggest problem with the current state of the art in object-oriented programming. Object (class) libraries are not general purpose -- they cannot be used interchangeably between programming languages. Java has its own class libraries that can only be used in Java programs; classes developed for C++ can only be used in C++ programs; the object extensions to Perl can only be used in Perl programs; and the Ruby class libraries can only be used in Ruby programs. The only way to use a class library from another programming language is to write a set of "wrapper" classes that re-expose the behavior of a class, implemented in one language, to the language the object will be instantiated in. What a management challenge!
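
To make the "wrapper" idea concrete, here is a minimal sketch of the C-visible half of one (all of the names -- stack_wrap.h, StackHandle, the functions -- are hypothetical): an opaque handle plus ordinary functions. The matching function definitions would be compiled by the C++ compiler and would simply forward each call to the underlying class.

    /* stack_wrap.h -- hypothetical C interface wrapped around an
     * (equally hypothetical) C++ Stack class.  C callers see only an
     * opaque handle and ordinary functions; the definitions of these
     * functions are compiled as C++ and forward each call to the
     * corresponding Stack member function. */

    #ifdef __cplusplus
    extern "C" {                    /* give the functions C linkage */
    #endif

    typedef struct StackHandle StackHandle;    /* opaque to C code */

    StackHandle *stack_create(void);
    void         stack_push(StackHandle *s, int value);
    int          stack_pop(StackHandle *s);
    void         stack_destroy(StackHandle *s);

    #ifdef __cplusplus
    }
    #endif

Multiply that by every class in a library, for every pair of languages in the shop, and the "management challenge" becomes clear.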

Finally, I don't think that C++ is the best object implementation language. I have some experience with pseudo-object-oriented programming with X and Motif, and I have recently learned and started teaching C++. Every time I talk about C++ I get the feeling that the language extensions are an unnatural and peculiar growth that has deformed the original beauty of the C programming language. C was never intended to be an object language and if one is going to do object programming they should use a language that was designed for the job. Personally, I used to look forward to learning Smalltalk as an object-oriented programming language.

Java and Ruby to the Rescue?

When Sun first announced the Java programming language, my first response was something like "... Oh, great! Yet another proprietary programming language." However, as I started learning more, I had to step back a bit and think "... Hey! This isn't too bad."

The first thing I noticed is how much more Java was like C than C++. And, better still, it was (except for the "main function" -- notice I didn't say main "method") much, much more object-oriented than C++. Considering my "OOOOO" diatribe above, I had to re-think things a bit.

Smalltalk was certainly much more object-oriented than Java. But, Java was a much newer language with more breadth than Smalltalk. And, Java was a much more open language and was still evolving while Smalltalk was static and fixed. And then I noticed one important and big difference: Java has a huge library defined as part of the standard containing thousands of pre-defined, pre-written classes.

As I dug a little deeper, I had to admire Sun's "moxie." Sun decided to only own/control the standard -- the definition of Java -- and let anybody implement the language on any platform they wanted. As long as they stuck to the standard, as long as they didn't deviate from its specification, anybody could call their product Java. It was sort of an "open-source" type of thing.

Learning still more, I admired still more. Sun's intention was to have a completely scalable language. That is to say, you could write programs to operate everything from the simplest of real-world devices to the most complex systems: from light switches to Air Force fighter jets. And, one step further, no matter which platform you compiled a Java program on, it would run on any computer that had the Java run-time environment implemented and installed. I was really getting into this "Java thing" until I realized ...

... this could only be implemented as an Interpreter.

And, interpreter means ... slow! Of course, one could (and some have) write a real compiler for Java. But then you lose the "write once -- run anywhere" feature that made Java so popular in the first place. When you compile code to native form, you can only run it on the machine -- or family of machines -- it was compiled for.

And, then I saw the "light" ...

Sun has the reputation of having some of the fastest computers available (affordable?) to typical consumers and companies. If Java had to be run on fast machines because of the slow interpreted execution time, then users would most likely have to buy a newer, faster CPU. Since Java was created by Sun, most people would probably think that an interpreted Java program would run faster on a Sun computer. So there you have it: built-in marketing at no extra cost.

But, somehow (compared to "others" in the software industry) this point didn't seem so bad. A language that is more object-oriented than C++; a language that is more like C than C++; a language that can be compiled once and run anywhere ... looked pretty cool to me.

... so I "jumped in the pool -- feet first" because the water felt fine!

Now, understand, I am not a Java expert as yet. But, I'm pretty sure that Java has replaced Smalltalk in my desire to have an object-oriented extension to my life in information technologies. I have written some plain command-line programs. I have written some windowed programs using AWT and Swing. I have used parts of the huge standard class libraries through v1.3. And, I still think it's all "pretty neat!"

If you must do O-O programming, and you are using C++, take some time and check out Java. I think it will be well worth your while. And if you're coming into O-O from Perl, you should check out Ruby. It's just as "brief" and "simple" as Perl, with a predefined class system.

Bill Steps Out Into a New Frontier

If you're a classic Star Trek fan you've heard the intro: "Space, the final frontier ... " For the longest time I thought UNIX was "the final frontier." But, several years ago, a "new frontier" started appearing on the computing horizon. It started out as a paper presented at a UNIX conference. Then, it expanded into some simple press coverage. More recently, it has jumped into the "mainstream" with its own Internet mailing list, USENET newsgroup, funded educational research, and industry usage.

"What is it?" you ask. "It" is called Plan 9.

No, no, no -- it's not that old B-grade science fiction movie called Plan 9 from Outer Space. It's Plan 9 from Bell Labs -- a new computer operating system. I can hear the complaints coming already -- "Oh, no! Not another version of UNIX." And, you would be right -- it's not another version of UNIX! It has its roots in UNIX and some of the developers that helped develop UNIX have been working on Plan 9, but Plan 9 is something different.

The world is networked. It's a fact -- there are millions of computers sending messages back and forth to one another all over the globe. This networking is not only on the Internet but also in thousands of regional and corporate networks as well. It's not unusual to see these computers work together on a common task in a client-server architecture. It has been found that distributing computing tasks in this way can make for more efficient operation of computing resources.

In order for UNIX and other operating systems to perform distributed, network computing tasks, there are extra "layers" of software that have to be added between the application software and the operating system software. These extra layers add overhead and inefficiencies to the computing system. Plan 9 was designed and implemented, from the ground up, to be a network-based, distributed computing system.

A typical Plan 9 system would consist of a network of several computer systems. There would almost certainly be a file server for storing all of the files for the system and a CPU server to run compute-intensive jobs. There might also be an authentication server to administer the login process, and other computing "servers" as well. User access is through another computer system -- typically smaller, perhaps diskless -- with CPU, monitor, and mouse, which the Plan 9 documentation refers to as a terminal.

All of these computing resources run a version of the Plan 9 kernel, sending messages and data back and forth to one another, cooperating on a variety of data processing tasks. Small jobs are run on the local "terminal" while CPU-intensive jobs run on the CPU server. Both kinds of jobs may need resources such as disk files, which are allocated and processed by the file server. The "everything is a file" paradigm pioneered by UNIX is taken to new, more interesting levels in Plan 9.

In support of the ever-popular "i18n" (internationalization), Plan 9 does not use the plain ASCII character set. Instead, it uses a superset of ASCII based on the ISO 10646 standard character set, encoded in a form called UTF. In the UTF byte stream, ASCII characters still occupy a single 8-bit byte, while the non-ASCII characters for other, international languages occupy more than one byte. In memory, every character is held in a new C defined-type called a Rune, a 16-bit quantity.
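
As a rough illustration (a simplified sketch of the encoding itself, not the actual Plan 9 library routines), unpacking one UTF character from a byte stream into a 16-bit Rune looks something like this:

    typedef unsigned short Rune;    /* a 16-bit character code */

    /* utf_decode -- a simplified sketch of decoding one UTF
     * character from byte string s into *r, returning the number
     * of bytes consumed (1 to 3).  ASCII still costs one byte;
     * other characters cost two or three.  (No error checking --
     * real library routines are more careful.) */
    int utf_decode(Rune *r, const unsigned char *s)
    {
        if (s[0] < 0x80) {                      /* 0xxxxxxx */
            *r = s[0];
            return 1;
        }
        if (s[0] < 0xE0) {                      /* 110xxxxx 10xxxxxx */
            *r = ((s[0] & 0x1F) << 6) | (s[1] & 0x3F);
            return 2;
        }
        /* 1110xxxx 10xxxxxx 10xxxxxx */
        *r = ((s[0] & 0x0F) << 12) | ((s[1] & 0x3F) << 6) | (s[2] & 0x3F);
        return 3;
    }

Note that an ASCII byte decodes in a single step, which is exactly what keeps all of the old byte-at-a-time tools working on UTF text.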

You might think that Plan 9 would be implemented as an object-oriented system, but it's not. Although there is a C++ compiler available for Plan 9, the entire system software was written (from scratch) in a superset of ANSI C. Versions of this new operating system exist for MC68020-based systems, MIPS systems, SPARC systems, 386/486/Pentium-based systems, NeXT stations, and others. Source code for Plan 9 is available to the general public on CD-ROM for around $350 U.S.

I have bought the CD and am currently in the market for a system to run it on. Ideally, I would like to run it on a Pentium-based notebook computer -- I'm still looking. Be that as it may, I have a very strong feeling that Plan 9 will be a major player in the networked, distributed, computing world of the very near future.

YASS -- Part Deux

Everybody tries to stand out on the 'Net -- they try to do something that makes them stand out in the crowd. I am no exception. Since I started surfing the 'Net before the Web, and people couldn't see my flamboyant physical signature (which is on my personal page), I decided to try to come up with a distinctive electronic signature.

Although I really enjoy using my pseudo-random personal signature generator, I thought that using it for business purposes wouldn't be professional enough. In my wanderings through the use of ASCII characters for art, I came to find that some folks had taken the time and trouble to come up with ASCII-based fonts. I found the idea intriguing.

I have included samples of some "professional" signatures I have used in the past below. If you ever get mail from me or read one of my USENET posts, please note that the signature will look like a bunch of garbage unless it is displayed in a mono-spaced font such as Courier. And for you telnet folks -- don't worry -- it is also always less than 80 characters wide.

Some professional signatures of Bill's
