The Internet has lately played host to a near-infinite amount of fol-de-rol regarding a rather silly post from Weblogs, Inc. and Mahalo founder Jason Calacanis in which he railed against Apple’s recent paranoia. There’s plenty wrong with the gist of his argument (as Leander points out in this rather nice post), as well as a few things that are right on.
But I’m not here to dwell on that. I just want to make one thing very clear: what makes Apple great is not what it puts into its products. It’s what gets left out. As exciting as visions of flying iPhones with eight SIM slots, a Zip disk slot, and dual head-mounted displays might seem, the original iPhone (and iPod, for that matter) became iconic because of its limitations, not in spite of them. Innovation, contrary to Calacanis, is often more about editing than possibility. Apple, more than most companies, is defined by its unwillingness to do too much. The greatest design impact is in what we can’t see.
Apple has always run counter to the industry’s scattershot, throw-everything-in approach, sometimes to a fault. The original Apple ][ was both a beneficiary and a victim of this tendency. On the one hand, Steve Wozniak figured out how to provide color graphics with a single chip, cutting the chip count by 80 percent or more compared with previous designs. On the other hand, Woz and Steve couldn’t really conceive of anyone using their computer for anything except programming and playing games, so the keyboard could only produce upper-case letters. The Mac wasn’t directly compatible with the Lisa, which wasn’t compatible with the Apple ///, which wasn’t compatible with any of the Apple II computers (not at all true, actually. See footnote at the end.*), some of which weren’t 100 percent compatible with each other. And, lest we forget, Jobs was so impressed with the original Mac mouse that he left arrow keys off the original Mac keyboard.
Rather than expressing its daring by what interesting possibilities it could add to its products, Apple (at least Steve’s Apple) has always pushed the envelope by testing what can get left out. Done wrong, it’s aggravating as hell. The floppy disk wasn’t dead enough at the introduction of the first iMac, and Apple offered no rewritable removable storage at all. No one had thought up the USB thumb drive yet, CD-RWs didn’t appear in Macs for another three years, and your options were basically the dreadful Zip disk format or the obscure, floppy-compatible, and notoriously unreliable SuperDisk. If Mac OS X had gone ahead as originally planned without the Carbon API, developers would have jumped ship even faster than they already did.
But it’s also this insistence on experimenting with what can be tossed away that makes Apple so brilliant and its products so amazing. No one ever talks about it these days, but the iMac was partly revolutionary because it ditched all of the Mac’s legacy interfaces (SCSI, ADB, and the mini DIN-8 serial port) in favor of USB. Anyone miss them these days? The original iPod was just a wheel and music controls, without the bells and whistles that still define many of its least successful competitors. The current generation of MacBook Pros ditched removable batteries in favor of built-in batteries that last for seven or eight hours. But beyond the feature-level stuff, Apple also has a genius for figuring out when to knock something out of its product line-up. Though painful at the time, it’s quite clear that if Apple had kept the Newton around, we would have neither the iPod nor the iPhone today. And it’s also clear that the post-Jobs leadership’s inability to kill the Apple ][ line before 1993 divided the company’s attention between its past and its present, giving Microsoft the opportunity to encroach.
What makes the design of Apple’s products brilliant is what gets left out. Don’t come here looking for over-the-top features. Look here for the essence of an idea rendered passionately.
That’s the case for Apple.
*I just received this note from a former employee of Apple’s publications department who wrote documentation for both the Apple// and the Apple///:
“Generally a nice article, but I have to pick a minor nit.
“…the Apple ///, which wasn’t compatible with any of the
Apple II computers…”
If not just incorrect, it’s misleading, at best.
The Apple/// had a compatibility mode that made it act like
an Apple][+ with 48K of RAM. In fact, most of the software
used on Apple///s was Apple][ software.
This crippling was driven by marketing, who for some unaccountable
reason believed that a $3,000+ machine would kill sales of the much
less expensive Apple][ (and //) family if it wasn’t crippled.
Which crippling was actually harder to design than to make
a completely transparent compatibility mode.
Nonsensical, but then we’re talking about marketing.
I guess I should note that I worked in Apple’s publications department
from late 1979 through early 1985, and wrote both Apple/// and Apple//
hardware and software manuals. So I’m probably a bit twitchy about the subject.”