Silicon Valley still sucks

No recovery in Silicon Valley so far:

Employers in Santa Clara and San Benito counties added 200 jobs to their payrolls in August. But compared to a year ago, Silicon Valley has 2,400 fewer jobs, a decrease of 0.3 percent. Economists say the annual comparison is more important than the monthly one, which is easily swayed by seasonal shifts in employment. And for the past five months, the year-over-year change has stayed fairly close to zero, wavering on either side of it.

But the stock-scammers are still active:

EBay Inc. said it agreed to acquire Internet-calling start-up Skype Technologies SA for about $2.6 billion in cash and stock, posing a new threat to phone companies and expanding the online-auction company’s revenue sources.

Somebody’s smoking drugs at EBay.

Greatest inventor of the 20th century

Jack Kilby, inventor of the integrated circuit, has passed away. He invented the IC during the summer mass-vacation period at TI just weeks after joining the company and before he’d accrued enough time to take vacation. It happened like this:

The innovation came in August 1958, when Kilby was working alone at a Texas Instruments lab in Dallas. Most of the rest of the company was on vacation, but Kilby lacked the seniority to take time off. Instead, he toiled on borrowed equipment and, by September, developed a working prototype.

Robert N. Noyce, co-founder of chip giant Intel Corp., is credited with developing the manufacturing process that made economical the wide-scale production of integrated circuits. Kilby and Noyce bickered for years over the other’s claim to have invented the integrated circuit. Ultimately, the two agreed to share credit. In 1995, Kilby was awarded the Robert N. Noyce Award, the Semiconductor Industry Association’s highest honor. When Kilby won the Nobel Prize, he invited Intel’s other founder, Gordon Moore, to the ceremony as a gesture to the contribution of Noyce, who died in 1990. Nobel Prizes are not awarded posthumously.

Including Noyce in the honor was classic Kilby, said those who knew him.

This Noyce fellow had some good ideas of his own, but Kilby invented the IC first, fair and square. The last project Kilby worked on was the solar cell, and we’re going to see a lot of those as well.

Kilby was a great man, and the world is a much better place for his having been in it.

Dvorak explains it all

Here’s his take on the process by which the Apple OS will go mainstream:

1. Apple releases OS X86 as a proprietary system for its boxes. It’s immediately pirated and goes into the wild.

2. Apple squawks about the piracy to draw attention to it, thus increasing the piracy, creating a virtual or shadow beta test. The complaining is necessary to assure Microsoft that Apple does not intend to compete with Windows. This keeps Microsoft selling MS Office for the Mac.

3. There are driver issues that get resolved by the hobbyists, and OS X86 now remains in shadow beta, being tested in a process that is apparently outside of Apple’s control, but is in fact carefully monitored by the company.

4. Once the system stabilizes in the wild, Apple announces that it cannot do anything about the piracy situation and that it’s apparent that everyone wants this OS rather than Windows. It’s “the will of the public.” Apple then makes the stupendous announcement that it will sell a generic boxed OS, “for the rest of you!” One claim is that it is a solution to spyware.

5. Microsoft freaks out and stops development of Office for the Mac. But in the interim, while not selling OS X86 “for the rest of you,” Apple has been developing a complete Office suite, which it announces at the same time.

6. Spyware and viruses emerge on the Mac.

Sounds about right.

High Definition TV Technologies

High-definition TV is probably confusing to most folks, so I’m going to lay out the basics in the interest of world peace and harmony and explain the technologies currently duking it out for your consumer dollar.

First, let’s understand that high-definition TV is digital, but not all digital TV is hi-def. DVDs, for example, are digital, but they don’t qualify as hi-def because there’s no more detail in the DVD picture than in a good standard-def, analog TV image. Digital TV programming can take any of several formats, defined by their image geometry and the frequency with which the picture is updated. The high end of the scale of these formats is hi-def, the low end is standard def, and the middle is called Enhanced Definition TV or EDTV.

Nobody is currently broadcasting in the best format, an image geometry of 1920 pixels x 1080 lines, progressive scanned at 30 frames/sec. The popular formats are 1280 x 720 progressive (720p) and 1920 x 1080 interlaced (1080i). “Interlaced” means that the video picture is formed out of pairs of images, one consisting of the odd-numbered lines and the other of the even-numbered ones; this is a trick that fools the eye and uses only half as many bits as progressive scan. Digital TV at 480 lines progressive is EDTV, and 480 interlaced is standard def, SDTV, the format used by DVDs.
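To put rough numbers on that last point, here’s a back-of-the-envelope sketch (mine, not from any broadcast spec) of the raw, uncompressed pixel rates. The 60-updates-per-second rate and the 720 x 480 standard-def raster are assumptions picked just to keep the comparison apples-to-apples; the takeaway is simply that an interlaced format pushes half the pixels of its progressive counterpart.

```python
# Raw, uncompressed pixel rates for the common digital TV formats.
# Interlaced formats redraw only half the lines in each update (a "field"),
# which is where the factor-of-two savings over progressive scan comes from.

FORMATS = {
    # name: (width, lines, updates per second, interlaced?)
    "1080p": (1920, 1080, 60, False),  # the "best" format; shown at 60 Hz for comparison
    "1080i": (1920, 1080, 60, True),
    "720p":  (1280,  720, 60, False),
    "480p":  ( 720,  480, 60, False),  # EDTV
    "480i":  ( 720,  480, 60, True),   # SDTV / DVD
}

for name, (width, lines, rate, interlaced) in FORMATS.items():
    lines_per_update = lines // 2 if interlaced else lines
    pixels_per_sec = width * lines_per_update * rate
    print(f"{name}: {pixels_per_sec / 1e6:6.1f} million pixels/sec")
```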

Size and Shape

Hi-Def TV monitors are generally larger and wider than Old-Timey TV (OTTV). The screen shape has a ratio of 16:9 (width:height) compared to 4:3 for OTTV. This is handy when you’re watching movies, but for normal TV programming it means you’re going to have black bars on the left and right sides of your picture. So if you’re used to watching a 27″ set, you would need to get at least a 34″ widescreen HDTV to see an image of the height you’re used to (roughly 16″) when you’re watching shows that aren’t tailored for the wide screen.
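If you want to check that arithmetic against your own living room, here’s a little sketch (mine) of the diagonal-to-height math:

```python
import math

def picture_height(diagonal_inches, aspect_w, aspect_h):
    """Picture height for a screen of the given diagonal and width:height ratio."""
    return diagonal_inches * aspect_h / math.hypot(aspect_w, aspect_h)

print(picture_height(27, 4, 3))   # ~16.2" -- the old 27" 4:3 set
print(picture_height(34, 16, 9))  # ~16.7" -- a 34" widescreen is slightly taller
print(picture_height(30, 16, 9))  # ~14.7" -- a 30" widescreen would actually be shorter
```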

The main advantage of HDTV is its ability to fill large screens with crisp images that aren’t grainy or otherwise funky-looking, so if you don’t get at least a somewhat larger screen than the normal OTTV screen you’re kind of missing the point.

Geometry

When you’re looking for an HDTV monitor, bear in mind that very few of them are capable of displaying the largest formats directly, pixel-for-pixel; that is, they’re all capable of receiving 720p and 1080i, but they typically do some image processing to display the images on a screen that has somewhat different geometry. For example, most HDTV plasma panels have a native resolution of 1024 x 768, just like crappy computer monitors. But they have image processing capability that allows them to “scale” 1280 x 720 or 1920 x 1080 images onto their native geometry. Since the image changes 30 or 60 times a second, and there may not be a whole lot of difference between any two adjacent pixels, these panels produce fine images up to a certain size, depending on how demanding you are, and are better looking than regular TV in any event. But you’re still going to be better off with a display whose native geometry is perfectly matched to HDTV formats, or one that has flexible geometry like an old-fashioned picture tube, because you’ll avoid weird image processing defects that plague all but the most expensive of plasma sets. That being said, this WalMart wonder is a nice TV set, and nobody knows TV like WalMart shoppers.
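Back to the scaling for a second: here’s a rough illustration (assumptions mine; real scalers use far better filtering than this suggests) of what a 1024 x 768 panel has to do to the incoming picture. Panels like that typically have wide, non-square pixels, which is how a 16:9 image ends up filling a screen with a 4:3 pixel count.

```python
def scale_factors(source_w, source_h, native_w, native_h):
    """How much the incoming picture gets stretched or squeezed in each direction."""
    return native_w / source_w, native_h / source_h

for source_w, source_h in [(1280, 720), (1920, 1080)]:
    sx, sy = scale_factors(source_w, source_h, 1024, 768)
    print(f"{source_w}x{source_h} -> 1024x768: {sx:.2f}x horizontal, {sy:.2f}x vertical")
```

The factors aren’t the same horizontally and vertically, and they rarely come out to tidy ratios, which is part of why cheaper scalers produce the weird image-processing defects mentioned above.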

The alternative display technologies are LCD (just like computer displays) and a couple of microdisplay technologies used in projection TVs, DLP and LCoS.

LCD

Like plasma, LCD is a direct-view panel technology that produces screens four or five inches thick that you can hang on a wall like paintings. LCD can be had in HDTV geometries, but some of it uses computer geometries as plasma does, so you should read the fine print. As with all of this stuff, you can pay nearly as much or as little as you want for an LCD HDTV, as these two examples show: BenQ has a 37″ monitor with a native resolution of 1920 x 1080 (just what you want) for $2000 at Crutchfield. And Sharp has some smaller 32″ sets for twice as much.

DLP

Digital Light Processing is a nice, fairly inexpensive projection technology that’s used in medium-sized rear-projection TVs (typically from 46″ to 60″). DLPs use a chipset from Texas Instruments with a native resolution of 1280 x 720, so these sets do have to scale 1080i down, but it’s a pretty straightforward exercise, as every 3 lines of input produce 2 lines of output. DLP TVs have a single gun, and they get the three colors that TV pictures are made from by shooting the light through a “color wheel” that spins at 10,000 RPM or so. It’s a clunky process, but the images are acceptable. This Toshiba is a good example of a DLP set.
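For the curious, here’s a toy version (mine, and deliberately simplified; real scalers filter across whole pixel rows, not single brightness values) of that 3-to-2 line reduction:

```python
def scale_1080_to_720(lines):
    """Collapse each group of 3 input lines into 2 output lines with simple blending."""
    assert len(lines) % 3 == 0
    out = []
    for i in range(0, len(lines), 3):
        a, b, c = lines[i:i + 3]
        out.append((2 * a + b) / 3)  # first output line leans on the first source line
        out.append((b + 2 * c) / 3)  # second output line leans on the third
    return out

print(len(scale_1080_to_720([float(n) for n in range(1080)])))  # 720
```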

LCoS

Liquid Crystal on Silicon is a brilliant concept that JVC developed for video editing systems and has recently adapted for home entertainment, and it’s my bet as the winning technology in this area, as it’s both cheaper and brighter than either LCD or DLP. The trick behind LCoS is that where an LCD projector shines its light through the liquid crystal, an LCoS set bounces the light off a reflective liquid-crystal layer on a silicon chip, so more of it reaches the screen. These sets also use three imaging chips, one per color, so you don’t have a clunky color wheel, and the geometry is HDTV-oriented and not a carry-over from computers. JVC makes the best LCoS sets, but you can also get them from Philips and others, and the prices are reasonable.

CRT

OK, we’ve covered all the new technologies, but what about good, old-fashioned CRTs? It turns out they have a couple of natural advantages over the fixed-pixel arrays we’ve mentioned: flexibility and cost. CRTs form images by shooting an electron beam at a phosphor coating inside the tube, using electromagnets to direct the beam, which sweeps the screen from top left to bottom right 30 times a second, more or less (29.97, actually). They can adjust resolution by altering the speed at which the beam travels and by changing the number of times it turns on and off to form picture elements (pixels). It’s not really as flexible as all that at the high end, where a shadow mask is placed between the beam source and the phosphor to sharpen the dots, but the general principle still applies. And CRTs are cheap to make because we’ve been making them for so long. The LCD companies are having to build brand-new and very expensive factories to produce the larger panels they need at a low cost, and somebody has to pay for them. Sharp is building their own, LG and Philips are collaborating, Sony and Samsung are collaborating, and the Chinese Army is building one with slave labor.

The downsides of CRT are size – they top out at 34″ – and weight, about 200 pounds for a 34″ set. Old projection TVs also used CRT guns, but that’s a downer. Good sources for HDTV CRTs are Toshiba and Sony.

OK, that’s that for displays. There’s a lot to be said about HDTV recorders and programming, but that’s for another post.

Tomorrow’s technology today

Check this about Sharp Labs from EE Times:

Camas, Wash. – Sharp Laboratories of America aims to turn your TV into a Web-surfing, news-gathering, sports-summarizing, on-demand movie viewing, e-mail center. As the beachhead for U.S. imports from Japan’s $20 billion Sharp Corp., Sharp Labs also has designs on your cell phone, video recorder, document-imaging system and more.

“We are charged primarily with researching technologies that Sharp Corp. can develop into products for the U.S. market,” said Sharp Labs’ founder and director, Jon Clemens. “For instance, Sharp had the first camera-enabled mobile phone, and by 2005 we will be producing only LCTVs [liquid-crystal-display televisions], no more CRTs.”

Interesting place, if I say so myself.

California recovering

Despite a housing bubble, UCLA economists expect California to grow in 2005:

Yet the fallout from the bubble in California won’t be devastating, according to the UCLA Anderson Forecast. Indeed, the Golden State’s economy will expand at a faster clip than the nation’s in 2005, thanks in part to a recovering Bay Area, the widely watched forecast says.

All in all, next year is shaping up as “solid but not spectacular” for California, said Christopher Thornberg, a UCLA Anderson Forecast senior economist and author of its state outlook.

Welcome back.

The End of an Era

So it’s official, IBM is getting out of the PC business:

SAN FRANCISCO (CBS.MW) — Lenovo Group Ltd. will buy IBM’s personal computing business in a $1.75 billion deal, creating what the companies said Tuesday night will be the No. 3 PC maker worldwide.

I thought this day would never come. The IBM PC, from its inception in 1981, had the most dramatic effect on the computer industry in general and my career in particular of any technology or event of the last 30 years. Before the PC, I was a system programmer at Texas Instruments developing operating systems and protocols for closed, proprietary systems, systems that were full of fun and complexity with multi-tasking, real-time priorities, virtual memory, and interprocess communications. The PC, with its deficient operating system and marginal hardware, put an end to that sort of system, bringing about a massive shift to bare-metal programming, a retarded CPU architecture, a return to proprietary communication protocols, and assembly language instead of block-structured high-level languages.

As the virus spread, it gradually overcame its origins and evolved into a lower-cost version of the kind of systems I used to work on, only without my having access to the system code so I could simply change whatever I didn’t like; that access didn’t come back until Linux came along.

But now IBM has decided the whole experiment wasn’t such a hot idea. Presumably, they’re still in the server business as well as services and consulting, so the more things change the more they remain the same. Sorta.

Scientific Method Man

Gordon Rugg has devised a simple method of solving scientific problems currently thought to be intractable, such as Alzheimer’s. Here’s how it works:

The verifier method boils down to seven steps: 1) amass knowledge of a discipline through interviews and reading; 2) determine whether critical expertise has yet to be applied in the field; 3) look for bias and mistakenly held assumptions in the research; 4) analyze jargon to uncover differing definitions of key terms; 5) check for classic mistakes using human-error tools; 6) follow the errors as they ripple through underlying assumptions; 7) suggest new avenues for research that emerge from steps one through six.

On its face it seems reasonable, so much so that it’s more or less exactly what many of us do already in the search for unique intellectual property.

Disruptive technology

Take a cheap WiFi router and add some mesh networking software, and before you know it the Telcos are obsolete. Read Cringely’s theory about how it will unfold:

A disruptive technology is any new gizmo that puts an end to the good life for technologies that preceded it. Personal computers were disruptive, toppling mainframes from their throne. Yes, mainframe computers are still being sold, but IBM today sells about $4 billion worth of them per year compared to more than three times that amount a decade ago. Take inflation into account, and mainframe sales look even worse. Cellular telephones are a disruptive technology, putting a serious hurt on the 125 year-old hard-wired phone system. For the first time in telephone history, the U.S. is each year using fewer telephone numbers than it did the year before as people scrap their fixed phones for mobile ones and give up their fax lines in favor of Internet file attachments. Ah yes, the Internet is itself a disruptive technology, and where we’ll see the WRT54G and its brethren shortly begin to have startling impact.

RTWT for the theory, which sounds crazy, but who really knows?