How Apple’s PA Semi Acquisition Fits Into Its Chip History
April 28th, 2008
Daniel Eran Dilger
Apple’s acquisition of PA Semi does not signal an entirely new direction for the company. Throughout its history, Apple has designed sophisticated custom chips for use in its computers, in addition to codeveloping complete microprocessors. According to those in the know, it appears that after acting to jettison its internal custom silicon efforts and delegate much of that work to Intel, Apple experienced some remorse and acquired PA Semi to get right back into the chip design business. Here’s a look at Apple’s history in chips, leading up to how Apple’s acquisition of PA Semi relates to the beleaguered future of Microsoft’s Windows.
A Brief History of PC Chip Fabs in the Early 70s.
Intel and Texas Instruments independently developed the microprocessor almost simultaneously. In 1971 Intel released the 4004, what it calls the first commercial microprocessor, while TI introduced the first single-chip microcomputer the same year. The first obvious consumer applications were in calculators (TI’s specialty) and data terminals (the business Intel pursued).
At that time, the chip design process was integrated as part of chip manufacturing. Chip designers had to be large enough to build their own chip manufacturing plants, which created a significant barrier to entry in the business of custom designed chips. The expense of developing a new chip design and the limited demand for custom silicon in an era before ubiquitous personal computing and consumer electronics also helped to limit the playing field to a few major fabs.
MOS Technology got started manufacturing TI designed chips for use in calculators, and became a principal chip vendor for Commodore Business Machines. Shortly after the release of Intel’s 8-bit 8080 in 1974, Motorola introduced its own alternative, the 6800. Both chips cost around $150 to $300 each. This price point prevented the widespread adoption of microprocessors, as their functions could also be performed with individual components for less.
Rapid Innovation in Chip Development in the Late 70s.
Key chip developers at Motorola, including Chuck Peddle, wanted to develop a much cheaper version of the 6800 that could find wider adoption. After Motorola expressed no interest in the strategy, Peddle left with Bill Mensch and a half dozen other designers to build their own processor. They needed a chip manufacturer, so they went to work for MOS, where they developed the 6501, which was similar to the Motorola 6800 but much cheaper, faster, smaller, and more efficient to manufacture.
Motorola sued MOS, resulting in the release of the similarly cheap, but less infringing 6502. Commodore purchased MOS in 1976 after fearing that TI would destroy its calculator business if it did not “vertically integrate” by buying up the means to develop its own components. Owning its own chip fab enabled Commodore to rapidly develop the custom processors used in its computers much faster and cheaper than competitors like Apple and Atari, who had to develop their custom chip designs using outside partners.
Commodore originally hoped to sell the cheap 6502 microprocessors to manufacturers such as Ford, but the real demand would come from personal computers. Steve Wozniak designed his original Apple I computer using the $25 MOS 6502 processor. Many other home computers of the day also used the same chip family, including 8-bit systems from Acorn, Atari, and Commodore, the Apple II line, and the NES game console.
After Commodore purchased MOS, it signaled the intention to enter the gaming market. Mensch, the 6502’s leading engineer, left to set up the Western Design Center, where he licensed MOS’ 6502 to manufacture the improved 65C02 used by Apple in its later Apple II systems, and the 16-bit 65C816, used by the 1986 Apple IIGS, the Super Nintendo, and various embedded devices from automotive controllers to pacemakers, now numbering above five billion chips sold.
Meanwhile, Federico Faggin, a key developer of Intel’s 8080, left to start Zilog, which introduced the binary compatible, simpler, and cheaper Z80 processor. This allowed Z80 systems to run the same CP/M operating system developed for the Intel 8080 by Gary Kildall of Digital Research. Zilog also widely licensed its processor design to other chip manufacturers, creating a software standardization platform behind CP/M that made Z80 coprocessor cards popular even among home computers based on the 6502.
IBM Kills Innovation with the PC.
A wide variety of 6502 and Z80 based systems competed for attention throughout the late 70s as the home computer market exploded. In 1981 however, IBM entered the microcomputer market and forever changed everything.
IBM had been working on an advanced RISC design called the IBM 801, a predecessor to the IBM POWER architecture. However, IBM decided instead to select the relatively old hat Intel 8088 as the basis of its 1981 PC to prevent its new desktop system from eating into its higher end computing products. IBM also chose to license DOS from the then mostly unknown Microsoft; the software was a direct, unauthorized clone of Digital Research’s CP/M.
This resulted in putting IBM’s tremendous marketing power behind one of Intel’s least exceptional processors and some unremarkable copycat software. IBM quickly lost control over the DOS PC industry it invented, unintentionally ceding control to Microsoft and a variety of PC cloners, including Compaq, a company founded by TI employees.
IBM’s PC rapidly killed CP/M and Z80 computers, which had been the closest thing to a lingua franca for personal computers prior to the DOS PC. It eventually also devastated the market for most other alternative desktop platforms. Acorn, Atari, and Commodore increasingly struggled to compete against the PC throughout the 80s. By 1995, Apple was the only non-x86 computer maker with any real business left, and even it was looking less than viable.
Macintosh Reintroduces Innovation.
While IBM’s original PC simply perpetuated more of the same thing, Apple had begun work years earlier to dramatically advance the state of the art past the simple 8-bit text based systems of the late 70s, in parallel to its sales of the wildly profitable Apple II. The company invested $60 million into the development of an entirely graphical desktop computing system called the Lisa, along with a consumer version that became the Macintosh.
Xerox’ Palo Alto Research Center invested in Apple to help bring its advanced graphical computing technologies into the mainstream. In order to power such a system, Apple needed a far more advanced processor than the MOS 6502 family used in the Apple II line or the Intel x86 family used by IBM’s PC. Apple chose Motorola’s new 68000, an elegantly designed, forward looking, hybrid 16/32-bit processor.
Motorola’s 680×0 family powered the advanced graphics capabilities of the 1984 Macintosh and was also used by the Atari ST and Commodore’s Amiga, as well as the futuristic hardware developed by NeXT Computer in the late 80s and early 90s and workstations from HP, Sun, and Apollo. Intel had developed its own advanced processors, but was forced by the economies of scale related to the DOS PC clone market to build upon the weak x86 foundation of the processor selected by IBM.
Microsoft and x86 Slaughter Better Technologies.
Microsoft leveraged its position as an Apple partner to appropriate the Mac’s graphical user interface for use on the x86 PC; starting with Windows 3.0 in 1990 and particularly with Windows 95, Microsoft pushed its DOS user base to Windows, further entrenching the marginal x86 processor architecture.
Motorola’s elegant 680×0 in the Mac was pitted against Intel’s clumsy and complicated but widely used x86 processor line in a battle of intense competition throughout the 80s. The size of the DOS PC market made it increasingly difficult for any better processor architectures to compete.
The weak technology but strong marketing behind x86 systems running Microsoft’s software was also increasingly a problem for Sun’s SPARC, SGI’s MIPS, DEC’s Alpha, IBM’s POWER, and HP’s PA-RISC architectures. All of those architectures were superior to Intel’s x86 line, but the market for those alternatives was consistently eroded throughout the 90s by the high volume sales of the x86 PC.
In the mid 90s, Microsoft even made efforts to develop cross platform versions of Windows NT for Alpha, MIPS, and PowerPC in addition to x86 PCs, in order to get its software working on more substantial and sophisticated hardware. However, Microsoft was not able to deliver a workable cross platform architecture for NT 4.0; it could not even match the processor portability NeXT had perfected years earlier. Microsoft’s next major version, NT 5 (Windows 2000), withdrew support for any hardware other than x86 PCs.
At the same time, Microsoft also worked with Intel to replace the x86 architecture with a new 64-bit processor architecture called Itanium, using a design that originated at HP. Intel and HP partnered with SGI and Compaq in a spectacular boondoggle effort that directly sacrificed three of the five leading processor architectures to create what would ultimately end up being the fourth place loser behind x86, PowerPC and SPARC. More than anything else, Itanium helped promote the hegemony of the x86 by eliminating superior competition.
Apple’s Invisible Chip Business.
Along with Sun and IBM, Apple remained among the minority of computing companies that didn’t blindly follow the Microsoft/Intel juggernaut of marginal PC technology. As Motorola’s 680×0 began to run out of steam in the late 80s after a decade-long run, Apple decided to jump to a new processor architecture in order to keep pace with the fierce investment Intel could afford to pour into its x86 line.
As a vendor of both hardware and software, Apple was commonly compared to x86 PC clone manufacturers such as Compaq, or against Microsoft as a software platform vendor. However, Apple really went far beyond the PC vendors or Microsoft to develop highly integrated custom technology.
While PC vendors simply slapped commodity parts together and sold them with a license to Microsoft’s copycat DOS or Windows, Apple developed groundbreaking, high performance technology that made its desktop computers closer to the workstations sold by higher end vendors such as Sun and SGI. That required sophisticated custom chips.
To develop these, Apple began working with VLSI Technology, a chip maker cofounded by Doug Fairbairn of Xerox PARC. VLSI specialized in ASICs, or application specific integrated circuits. Apple developed a variety of ASICs to reduce part costs and to create components that did not yet exist on the market. As chip development grew increasingly specialized and affordable, it began to make more sense for hardware makers to develop their own custom designs rather than only using mass produced, general purpose components. For example:
- Much of the Apple II had been designed by Wozniak using a brilliant combination of mostly off the shelf parts. In developing the Apple III, Wendell Sander created a custom LSI (large scale integration, referring to 10,000s of transistors) chip that bundled the Apple II’s various disk controller components into a single chip, which he called the “Integrated Woz Machine.” This chip was used in the Macintosh.
- Burrell Smith, working on the prototype Macintosh, developed a design to pack similar components together in what he called the “Integrated Burrell Machine,” as a play on the name of Apple’s top hardware competitor at the time. The design was not used in production.
- In the mid 80s, Apple engineers Dan Hillman and Jay Rickard integrated the entire Apple II logic board into a single VLSI (very large scale integration, referring to 100,000s of transistors) chip called the Mega II, which was used in the Apple IIGS as well as the Apple IIe compatibility card for the Mac LC.
- Throughout the 80s and 90s, Apple designed its own custom audio and graphics chips, SCSI controllers and other components. In the late 80s it even developed FireWire as a high speed, isochronous serial replacement to the existing parallel SCSI bus.
Apple’s highly integrated Mac hardware had little similarity to the basic PC designs that commonly lacked any built in support for audio, networking, SCSI, or even decent graphics throughout the 80s and into the early 90s. However, Microsoft’s increasing market power kept the archaic x86 architecture used by DOS PCs firmly in place.
Apple Assumes RISC with ARM and POWER.
Because Apple was working on projects that few other companies were investing in, including the handheld Newton, it made sense for Apple to investigate the development of its own microprocessors. After meeting with Acorn Computer in the late 80s, Apple, Acorn and VLSI teamed up to develop a joint ARM architecture for low power processors suitable for use in the Newton and Acorn’s desktops.
Inspired by a project at Berkeley to develop a RISC processor, Acorn decided that if students could design a new chip architecture, it could too. Apple got involved after finding that Acorn’s design had most of what it wanted for the Newton’s processor. ARM needed the investment Apple could provide in order to develop additional requirements such as an integrated memory management unit. Along with VLSI as their fab partner, the two companies spun off ARM as an independent joint venture, which later licensed ARM processor designs to other companies.
Berkeley’s RISC processor itself served as the basis for Sun’s SPARC architecture. Stanford University’s parallel MIPS project was also spun out into an independent commercial company later acquired by SGI in 1992, which, similar to Commodore before it, wanted to make sure it had continued access to the processors for use in its systems. The success of both projects inspired additional investment in RISC design, including an awakening of IBM’s dormant 801 project (resurrected as POWER), HP’s PA-RISC, and DEC’s Alpha processor.
In 1986, Apple had set out to develop its own RISC processor under the Aquarius project, as a successor to the 680×0 Macintosh architecture, out of concern that Motorola wouldn’t be able to ship successive generations of the 680×0 on time. To do so, it hired a team of 50 engineers and purchased a Cray supercomputer. Those efforts never materialized into a shipping processor.
After Aquarius, Apple started work with Motorola to use its 88100 RISC processor in new Macs under the name Jaguar. IBM, which had just been jilted by Microsoft in the OS/2 fiasco, approached Apple with plans to collaborate on both a next generation operating system software project called Taligent and the use of IBM’s POWER server processor architecture in Apple’s desktop systems. Because of the work Apple had already invested in Motorola’s 88100, the three companies worked together to develop a hybrid chip called the 601, which used Motorola’s bus interface with IBM’s RISC core, resulting in the first member of the PowerPC family.
Apple’s ARM partnership resulted in the world’s most popular mobile processor family, and its AIM PowerPC partnership produced one of the most popular server and embedded processors. ARM was sheltered from competition with x86 due to the lack of suitability of either x86 or Microsoft’s software in the mobile space, but apart from the Mac, PowerPC was largely forced to find markets outside of the PC desktop it originally aimed at. Efforts to deliver Windows NT, OS/2, NeXTSTEP, and other operating systems to PowerPC were all derailed largely because of the market power behind the x86 PC.
IBM, Apple, RISC, and the Roots of the Power Mac
RISC Architectures – Bill Joy
Origins: Why the iPhone is ARM, and isn’t Symbian
More Custom ASICs.
Apple continued to develop its own custom chips with its VLSI partner for SCSI and serial communication controllers, incorporating support for technologies such as AppleTalk serial port networking and ADB, the Apple Desktop Bus. Developed in the mid 80s, ADB supported daisy chained keyboards, mice, trackballs, tablets, and other input devices a decade prior to the release of Intel’s similar USB.
Apple worked so closely with VLSI that it had its own dedicated division at the chip fab. In a press release from the mid 90s, Umesh Padval, vice president and general manager of VLSI’s Apple Products Division announced, “Integrating customized and standard functions on a single chip results in a number of distinct advantages for the customer. These include enhanced performance combined with decreases in the power, size, weight and price of the end-product. Our proprietary FSB cells have enabled VLSI to address Apple’s silicon needs quickly, thereby contributing to their innovative Power Macintosh 5200/75 LC family.”
As an increasingly beleaguered Apple worked to simplify its hardware operations in the late 90s to stem the flow of blood, it canceled the Newton, shed its ARM holdings, and worked to use more common, industry standard parts whenever possible to save money. It rapidly replaced ADB with USB, dropped its custom monitor adapters for the emerging DVI standard, and began using cheaper ATA drive controllers over custom designed SCSI ones.
Apple continued to recruit chip designers to develop custom silicon, however. A 2004 Apple job posting for a Senior VLSI CAD Engineer said, “an ideal candidate will have extensive knowledge of the design flow required to build complex ASIC designs. This candidate is expected to define and implement process improvements for the Apple IC design and verification teams as well as set technical direction for CAD projects and infrastructure.”
Apple Delegates ASIC Development to Intel.
By 2005, the future of PowerPC processors looked increasingly like the future of Motorola’s 680×0 a decade prior. Intel, motivated by tough competition from AMD, had turned the sow’s ear of the x86 architecture into a silk purse with its new Core processor. Apple hoped to be able to delegate nearly all of its custom chip development work to Intel and benefit from the same economies of scale that had enabled it to outsource its specialized graphics processor efforts to companies such as Nvidia and ATI.
John C Randolph writes, “Apple had a very good in-house VLSI design group that used to develop the ASICs for Apple’s PowerPC motherboards. With the Intel switch, Apple handed that responsibility over to Intel, and rather short-sightedly let most of those engineers go.”
“That really bit them on the ass when they were developing the iPhone, because not having their own chip design experts in-house made for very poor communication with Samsung, which is why the H1 processor in the iPhone wasn’t quite what they wanted, although it was exactly what they’d asked for; in other words, mostly Apple’s fault, not Samsung’s.”
The Future of Apple’s Chip Plans.
“Now, the most significant data point to me about that acquisition,” Randolph added, “is that PA Semi was founded by the man who ran the DEC Alpha project. Apple can afford to develop a whole new CPU architecture, and if Steve Jobs decides it’s worth the risk, then the results could be fantastic. Imagine a processor designed specifically to work well with the new generation of compilers from the LLVM project. Not to mention, it would render cloning just about impossible.”
“PA Semi isn’t a small acquisition, however much Apple’s trying to downplay it. The last company they spent hundreds of millions for was NeXT. At $275M, I don’t believe this is just about better parts for the phone. I think Steve’s got something bigger in mind, although we probably won’t see the results for three years or more.”
Apple appears to have no real use for the PWRficient processor line PA Semi has been developing, although the chip has already seen significant interest from a variety of companies, particularly in aerospace and defense. It is therefore somewhat ironic that so many other chip makers have processor designs nobody seems to be interested in, while Apple now owns one it apparently doesn’t need.
The next segment compares the contenders for the future of microprocessors and points out why the tables are turning for the historical market leverage Microsoft has enjoyed with the x86 architecture in the PC world.