Linux: A tale of woe

I’ve been using Linux on my desktop at work for years, and during that time I’ve maintained a dog-slow Linux box at home, mainly for remote work on the office desktop. So when I did my latest hardware upgrade of the home computer, I decided to switch over from Windows 2000 to Fedora Core Linux, figuring it would simplify things. I also wanted to roll a home HDTV recorder/server using MythTV and some of the other open source tools. The plan was to build up a nice machine I could use for software development, web work, and TV, hooked into my home network and connected through the Internet to the company.

It turns out it wasn’t so easy.

The hardware I selected is all the latest and greatest stuff: an ASUS A8N-VM/CSM motherboard with an on-board nVidia video adapter with DVI out, S/PDIF out, Gigabit Ethernet, lots of USB, dual-channel DDR, an AMD 64-bit processor, and Serial ATA-II; a 300 GB Maxtor Serial ATA-II drive; and a DVD-RAM burner that can handle all the other formats as well. Windows XP installed on the system without incident, connected to Microsoft, and downloaded updates. It runs great.

Linux was another story. I started with Fedora Core 4, the latest “stable” version and by most accounts a flawed distribution. Never mind that Fedora ripped all the video stuff out for fear of patent lawsuits; you can add it back later. The trouble with Fedora Core 4 is that it didn’t know what to do with my hardware. Sure, it was able to run its install program, Anaconda, from the CDs I burned. But Anaconda didn’t recognize my disk drive, and actually told me my hardware was defective. Now I can understand that some software may not support some hardware, but the current Linux recovery tools can see SATA-II drives attached to nVidia controllers and can format and partition them; just try GNU Parted or QtParted. Apparently you can trick the installer into loading the right driver with the “device” command, but this isn’t immediately obvious.
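For the record, here’s a minimal sketch of the same job done with GNU Parted from a rescue CD. The device name /dev/sda and the partition size are assumptions for illustration, so adjust to taste:

    parted /dev/sda print                          # the nVidia SATA-II controller shows the disk fine
    parted /dev/sda mklabel msdos                  # write a fresh MS-DOS partition table
    parted /dev/sda mkpart primary ext3 0 102400   # carve out a ~100 GB partition (sizes in MB)

The point isn’t the exact commands; it’s that an ordinary rescue environment handles the very controller Anaconda declared defective.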

So rather than screw around with all that (there are multiple problems with FC4 and nVidia), I decided to jump right into Fedora Core 5, currently in the final testing phase and reasonably stable. It turns out that was probably a good move, as the standard set of CDs for Test 3 (4.92) installed on my system without incident and booted up.

But that’s when the fun started. It turns out the Ethernet won’t work if you’ve run Windows on the machine previously and haven’t done a cold start (disconnecting the power cord from the wall). It also turns out you don’t get a mouse cursor in X, from the login screen on, and that the keyboard and network will die within minutes of startup. These issues are all well known, as is the failure of Kudzu (the new-hardware scanner) and some other nice things, but the Fedora people are heads-down to release the first “Release Candidate” on Monday.

I already know it won’t run on any system with an ATI or nVidia chipset, however. One of the developers introduced a new bug after testing closed that prevents these vendors’ drivers from loading, and he plans to roll out a fix a few days after Monday’s release. This points to a couple of problems with the Open Source way of doing things:

1. A programmer shouldn’t be allowed to check untested code into a project, and code that hasn’t been tested shouldn’t be distributed to mirrors all over the world. This is the “adult supervision” problem.

2. Open Source doesn’t interact well with advanced hardware. Vendors are reluctant to share detailed specs with open source developers because such specs are cookbooks for copy shops that want to clone the hardware. Today the competition is between ATI and nVidia; a few years ago it was between 3Com and Intel. So these guys release enough information for the Open Source people to write drivers that perform OK but not great, and they release their own binary drivers that run fast and don’t disclose the tricks they use in the hardware to boost performance. That’s reasonable.

This is the way it has to be, and it’s perfectly consistent with “free as in speech, not as in beer.” You don’t get free hardware with Linux, you just get free software. Sorry.

So all of this tells me that my new computer isn’t going to be running Linux anytime soon, and I’ll probably have to stick with Windows, relegating Linux to the last generation of hardware as before.

Can Linux ever catch up with Windows, or is it doomed to be the red-headed stepchild of software engineering? My guess is that there’s a structural flaw in the Open Source model, and I’ve just hit it.

UPDATE: OK, it wasn’t really all that hard. I went back to Fedora Core 4, since Core 5 isn’t there yet. You have to invoke the installer with “linux noprobe” so you can tell it which driver to use to find the SATA-II drive (sata_nv). After the install, you have to edit the GRUB boot entry to add “noapic” to the kernel command line, and you can cement that into /boot/grub/grub.conf so you don’t have to do it on every boot.
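Concretely, the stanza in /boot/grub/grub.conf ends up looking something like this. Treat it as a sketch: the kernel version and root device below are from a stock FC4 install, so yours will differ.

    title Fedora Core (2.6.11-1.1369_FC4)
        root (hd0,0)
        # "noapic" at the end of the kernel line is the part you add by hand
        kernel /vmlinuz-2.6.11-1.1369_FC4 ro root=/dev/VolGroup00/LogVol00 noapic
        initrd /initrd-2.6.11-1.1369_FC4.img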

Keeping the nVidia drivers in sync with the kernel version is tricky, so more on that later.

I said FC5 ain’t there yet, and this is why:

A note for users of ati-fglrx and nvidia-glx: Due to a bug in the FC5 release kernel, users of non-GPLed kernel modules will have to wait for an errata kernel; that should happen soon. BTW, the driver packages got renamed; they are now xorg-x11-drv-fglrx and xorg-x11-drv-nvidia.
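Once that errata kernel shows up, pulling in the renamed driver should be a one-liner, assuming the third-party repository that carries these packages is enabled on your box (the names are the ones from the note above):

    yum install xorg-x11-drv-nvidia    # or xorg-x11-drv-fglrx for ATI boards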

The official Fedora web site is silent on this problem, much to their shame.

Burning Man brings Internet to New Orleans

Maybe those Burning Man people aren’t completely worthless after all:

Now the group, known as Burners Without Borders, is using new Kyocera mobile hot-spot technology to create a wide-area network in an area with little, if any, Internet access. Their shoestring network, based on $250 routers and $150 wireless cards, could prove to be a model for other volunteer groups in disaster areas.

The credit goes to Kyocera, and it’s a very cool little deal they’ve got.