Categories
Technology

Today’s Unix: New all over again

Originally published at O’Reilly onlamp.com, recovered from the Wayback Machine.

It used to be that Unix was for the geeks, while the rest of the world used less command-intensive, and usually less powerful, operating systems such as Windows or the Mac OS. Even with the advent of native GUIs such as the X Window system, Unix was not for the uninitiated. If you didn’t understand that pwd returned the name of the working directory, ls listed the contents of a directory, and that you accessed help with man, then you needed to carefully remove your hands from the keyboard and back away slowly, trying not to touch anything as you moved.

Well, the times they are a-changing. Today’s Unix is sexier, friendlier, and moving in completely different neighborhoods than yesterday’s Unix. Gone is the tough, geeks-only image, wrapped in discussions of kernels and cron jobs, communication interspersed with esoteric terms such as awk, sed, daemons, and pipes. In its place is a kinder, gentler, more easily accessible Unix, wearing new clothes in soft undertones of unintimidating blue.

Still, the basic core of the old Unix remains, forming a hybrid of its older, powerful functionality that’s been integrated with modern conveniences. In many ways, today’s Unix, with its blend of old and new technologies, open source support, and bright shiny new interfaces, reminds me of that old wedding attire rhyme, “something old, something new, something borrowed, and something blue.”

Something Old

There is a commonality to all popular flavors of Unix — Solaris, Linux, FreeBSD, the new Mac OS X Darwin, and so on. Mac OS X may have a unique user interface, but if you bring up a terminal (access the Finder menu, select Go, then Applications, then Utilities), you’ll be able to enter Unix commands exactly the same as you would in command line mode in OpenBSD or HP-UX. In addition, the overall boss of the system is still root, though how root is managed can change, dramatically, from Unix box to Unix box, and among individual installations.

Each flavor of Unix is based on the same principles of shell access and kernel system control, and each comes with a minimum functionality that allows you to communicate with the operating system. For instance, you can with assurance go into any Unix box and type vi at the command line, and the vi editor will open. Being a longtime vi fan (having successfully resisted any urge to move over to emacs over the years), I’m always glad to see my old friend, regardless of whether I’m accessing vi on my FreeBSD server, a Solaris box at work, the Linux dual boot on my Dell laptop, or within the Terminal window of my Titanium Powerbook.

Additionally, no matter your version of Unix, you’ll interact with the operating system through a shell; multiple users can share system resources because the operating system supports preemptive multitasking (the ability to run tasks seemingly simultaneously without clashes over resources); you can work with files, directories, and resources with common commands such as cd, ls, mkdir, grep, find, and so on; you’ll have access to a wide variety of open source and freely available utilities such as the previously mentioned vi; the smarts of the system reside in the kernel; and root is still the superuser that can take everything down in one command.
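As a minimal sketch of those everyday commands in action (the directory and file names are made up for illustration):

```shell
# A quick tour of the common commands; a scratch directory is used
# so nothing on the system is disturbed.
workdir=$(mktemp -d)
cd "$workdir"
mkdir notes                  # create a directory
cd notes
echo "meeting at noon" > todo.txt
ls                           # lists: todo.txt
grep noon todo.txt           # prints: meeting at noon
find . -name '*.txt'         # prints: ./todo.txt
```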

However, after having just reassured you that there is little difference between old and new versions of Unix, I’ll now contradict myself and tell you that today’s Unix isn’t exactly the same as the old Unix, as this old dog has learned some new tricks.

Something New

One of the biggest differences I’ve found with the newer forms of Unix, or even more modern versions of old classics, is how much easier it is to do things. For instance, one of the most complicated and nontrivial tasks within the Unix world used to be, at least for me, installing software.

Not that long ago, after downloading and uncompressing the software you wanted, you’d most likely find that the package contained source code, which you’d then have to compile and install. Unfortunately, it seemed as if no two Unix installations were ever alike, so you’d have to work with the source code’s Makefile (basically an instruction file describing how to build the software), tweaking the libraries, the install locations, the compiler flags, and so on, in order to get a successful build and install. The POSIX standards effort helped with some of this, but for the most part you just had to work the build and installation through, and unless you were really lucky, the process could take a frustrating amount of time.

(Really, it was no wonder Unix people became so clever working with basic Unix functionality — no one wanted to go through the hassles of installing new utilities and tools.)

Today, spending hours and days tweaking Makefiles is virtually a thing of the past — most flavors of Unix now come with tools that not only tweak the Makefile for you, but can download the software and build and install it, all in one command.
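With a package that uses these newer tools, the whole dance usually collapses to three commands (shown as a sketch; the last step typically needs root privileges):

```
$ ./configure      # probes your system and writes a tailored Makefile
$ make             # builds using the generated Makefile
$ make install     # copies everything into place (usually run as root)
```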

If you’re a Red Hat-style Linux user, you use RPM to manage your software; if you’re a Debian user, you’ll use dpkg. FreeBSD users use the ports collection to access and install software. Even Mac OS X has a ports-like system called Fink. In addition, all Unix users can use three newer utilities — autoconf, automake, and libtool — to analyze your environment and generate a Makefile that’s customized to your machine.
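As a rough sketch (the package names here are illustrative), the one-command installs look like this on each system:

```
# Red Hat-style Linux (RPM):
$ rpm -ivh somepackage-1.0.i386.rpm

# Debian (dpkg):
$ dpkg -i somepackage_1.0.deb

# FreeBSD ports (fetches, builds, and installs from the ports tree):
$ cd /usr/ports/editors/vim && make install clean

# Fink on Mac OS X:
$ fink install wget
```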

Software installation isn’t the only area of improvement in today’s Unix. Getting a software package, or anything else over the Internet, is easier with new tools and utilities such as GNU’s wget, which can download files over either HTTP or FTP. Though I’ll always remain true to vi, the more modern vim has lured users away from vi‘s simple (but elegant) functionality, not to mention users of that other editor, emacs. For scripting, you have access to shiny new scripting/programming languages such as Python and Ruby, in addition to that old favorite, Perl. And very few people use the old C shell any more — most use the more modern bash (Bourne Again Shell) or newer variants.
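Fetching a package with wget looks the same for both protocols (a sketch; the URLs are illustrative):

```
$ wget http://www.example.com/package.tar.gz
$ wget ftp://ftp.example.com/pub/package.tar.gz
```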

No matter what new toy or tool you use, much of this simplified and friendlier Unix environment is due to the operating system’s implicit partnership with the open source community. In fact, this partnership is the main reason that Unix is such a thriving, vibrant operating system thirty years after its introduction.

Something Borrowed

Though Unix has received support from major corporations such as HP and Sun, it’s the open source developers, spending their time creating utilities and tools and building more portable versions of Unix, who kept what was a cryptic and difficult operating system around long enough for it to mature into the powerful, elegant, and user-friendly OS we have today. Unix owes a huge debt to the open source community for providing applications such as those mentioned in the last section.

Need a database for your FreeBSD box? You can download and use MySQL for no charge as long as your use is personal. Interested in serving up Web pages? Download and install the number one Web server, Apache. Once Apache is installed, you can develop with Java using Tomcat, or you can use an embedded scripting approach with PHP — again, downloadable, open source, free software. It’s true that many of these products also work on other operating systems, such as Windows. But the concepts behind each began with Unix.

One could fill books just trying to list all of the software that’s freely available, or available for a small price, and most of it is open source and runs on the majority of Unix platforms.

(If you have a spare day or two, you can access SourceForge and browse through all of the open source projects it manages. In addition, access FreeBSD software at the FreeBSD Web site; Mac OS X downloads at www.apple.com/downloads/macosx/; more on Linux at www.linux.org/; and general Unix utilities and tools at the GNU site.)

In the last decade, we’ve also seen a blend of traditional open source effort and corporate management, with Linux distributions such as Red Hat and Mandrake, and Apple’s Mac OS X, with its proprietary interface built on an open source Unix known as Darwin. It’s this combination of open source effort and corporate support and stability that’s taking Unix to the desktop and laptop machines of the world; a move that’s taking our old friend out of the basement and giving it new purpose.

Something Blue

One of the first things I tried when I received my new Powerbook was to access the Terminal application and attempt to log in as root using the well-known su -l command. It was then that I discovered that Mac OS X was more than a smart GUI layered onto a Unix kernel — the integration between the two is much more extensive.

If you’ve accessed the Terminal in Mac OS X, then you’re most likely aware that the root user is not enabled by default. To enable root, you must open the NetInfo Manager application (from the Finder menu, select Go, then Applications, then Utilities), click on the Security menu, and choose “Authenticate” to confirm that you’re the computer’s administrator. Once authenticated, you can then enable or disable the root user.

Having to manually enable root is just one of the many twists in the integration of Unix (Darwin) with Apple’s sophisticated user interface, known as Aqua. The reason for the disabled root is security — with the traditional superuser disabled, it’s much more difficult to crack into the Mac OS X system and wreak havoc. Since most Mac users will never access the command line, or ever need true root access, a decision was made to disable it by default.
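Incidentally, you don’t have to enable root just to run the occasional administrative command: Mac OS X also ships with sudo, which lets an administrator run a single command with root privileges after supplying his or her own password (a sketch):

```
$ sudo whoami      # asks for your own (administrator) password
root
```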

You’ll bump into another twistie when you attempt to compile downloaded software, even downloaded software that makes use of the automated configuration tools. Chances are you’ll run into an error similar to the following:

configure: error: installation or configuration problem: C compiler cannot create executables

This less-than-helpful message occurs because Apple’s Developer Tools have to be installed separately; they ship on their own disk, apart from the Mac OS X installation disk.
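If you’re unsure whether the Developer Tools are present, a quick check from the Terminal (a sketch) is simply to ask the shell where the compiler lives:

```
$ which cc      # prints the compiler's path only if the tools are installed
```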

In spite of the differences you might encounter working through the Aqua user interface, one common connection between this new environment and more traditional Unix flavors is your ability to use X Window (X11)-based software within the Mac OS X environment, even though the two could be considered competitive GUIs.

I wrote this article using OpenOffice, an open source office application that’s freely available for Linux, Solaris, and Windows. Recently, the OpenOffice organization released a beta version of the application for Mac OS X. Rather than attempt to port OpenOffice directly to Aqua, the organization took an interim step and ported the source to Darwin with an X Window user interface. Once the first port is successfully tested, the next step will be a port to Apple’s Quartz graphical layer, and finally to Aqua.

To run OpenOffice, I needed to download and install an X Window system, and it just so happens there’s one available — XDarwin. Additionally, there’s an X11 window manager that works with XDarwin — OroborOSX. To install both, it was literally a matter of downloading the packages, uncompressing them with StuffIt, and double-clicking on each installation package — first XDarwin and then OroborOSX. No muss or fuss with paths or parameters or settings of any form (based on the default installation). Once the X Window environment was in place, I used the same procedure with OpenOffice. The total time to download and install all three packages was less than fifteen minutes; a shorter amount of time than it takes to install a certain other office application product. A person could get used to this.

Blue seems to be a lucky color with Unix, because Red Hat’s newest Linux installation, Red Hat Linux 8.0, now comes with a spiffy new GUI the company calls Bluecurve. Having installed Linux many times over the years, I was relieved when the installation program was able to detect and configure all of my peripherals, including my wireless mouse and keyboard. I was also very pleased when I was able to add and configure my wireless network card with just a few clicks of the mouse.

Of course, activating the wireless connection failed at first, and I’m having to do some research to find a solution, and I’ll most likely have to do some tweaking to get it to work. But, hey! It wouldn’t be Unix if all the challenges were removed, now would it?