After whipping Intel, Nvidia & AMD in mobile chips, sky’s the limit for Apple Inc’s silicon design team
February 5th, 2015
Daniel Eran Dilger
While rumors have long claimed that Apple has plans to replace Intel’s x86 chips in Macs with its own custom ARM Application Processors, there are a series of more valuable opportunities available to Apple’s silicon design team, each of which has the potential to replicate Apple’s history of beating Intel in mobile chips and in building mobile GPUs without the help of Nvidia or AMD.
Building upon How Intel lost the mobile chip business to Apple, a second segment examining how Apple could muscle into Qualcomm’s Baseband Processor business, a third segment on How AMD and Nvidia lost the mobile GPU chip business to Apple, and a fourth article examining how AMD and Nvidia could face new competitive threats in the desktop GPU market, this article examines other ways Apple’s silicon design team could expand its scope to address new products and markets.
Apple’s appetite for sensors, imaging and other custom silicon
Beyond mobile Baseband Processors and GPUs, Apple also appears interested in acquiring and developing strategically selected sensors, cameras, IO interfaces and other component technology–some of which even technically savvy consumers have never heard of.
For example, last fall Apple devoted an unusual amount of attention to an esoteric display component called the timing controller or “TCON” in introducing the 5K iMac.
Dan Riccio, Apple’s Senior VP of Hardware Engineering, stated during the new iMac’s introduction video, “Communicating to all of those pixels requires a lot of brain power. In a display, it’s called the timing controller or TCON. The TCON tells every pixel what to do and when to do it. For this new Retina display a TCON didn’t exist that could do the job. We had to create it. This single incredibly advanced chip is responsible for directing millions of pixels.”
Apple’s website further details, “Because iMac with Retina 5K display has four times as many pixels as the standard 27-inch iMac display, the TCON had to be able to handle more information than ever. But even the most powerful timing controllers available couldn’t manage this number of pixels, so we had to create a new one with four times the bandwidth of the previous-generation 27-inch iMac — up to 40 Gbps. Now a single supercharged chip beautifully orchestrates the symphony of all 14.7 million pixels.”
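The 14.7 million pixel figure and 40 Gbps bandwidth claim can be sanity-checked with back-of-the-envelope arithmetic. A quick sketch (the 60 Hz refresh rate and 30-bit color depth are assumptions for illustration, not Apple-published TCON specifications):

```python
# Back-of-the-envelope check of the Retina 5K iMac display figures.
# Assumed values (not from Apple's spec sheet): 60 Hz refresh,
# 30 bits per pixel (10 bits per color channel).
width, height = 5120, 2880          # 5K resolution
refresh_hz = 60                     # assumed refresh rate
bits_per_pixel = 30                 # assumed color depth

pixels = width * height
print(f"{pixels:,} pixels")         # 14,745,600 -- the quoted "14.7 million"

raw_gbps = pixels * bits_per_pixel * refresh_hz / 1e9
print(f"{raw_gbps:.1f} Gbps raw")   # ~26.5 Gbps of raw pixel data
```

The raw pixel stream alone lands around 26.5 Gbps; blanking intervals and link encoding overhead push the real requirement toward the quoted 40 Gbps ceiling, which helps explain why no off-the-shelf timing controller could drive the panel at the time.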
In this case, Apple’s motivation for creating custom silicon wasn’t to save costs, but rather to innovate with new technology ahead of the rest of the industry, delivering a 5K all-in-one Mac before consumer electronics rivals could even bring a 5K monitor to market. This is not an isolated example.
Cameras and Imaging
Apple is now building a large swath of the world’s premium smartphones, requiring that it also develop advanced custom imaging technology to process photos taken by relatively limited mobile camera sensors.
Camera imaging logic now takes up considerable space in Apple’s latest A8. None of the company’s investments in camera-related logic benefit Apple’s competitors. That has given iPhones a strong competitive advantage in their ability to take superior photos compared to the majority of Android phones on sale.
Apple also developed its own multiple-element LED flash, branded True Tone, to more realistically illuminate subjects in limited light photos. And the company has patented a mechanism for supporting external lenses, which could result in a secondary licensing business similar to its Lightning and other MFi programs.
Apple is also expected to introduce multiple camera sensor photography, achieving better photos using composite, parallel captures in a way that builds upon previous efforts to counteract the limitations inherent in mobile camera sensors, including its acquisitions resulting in HDR imaging and Panorama capture.
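HDR composites of the kind described above work by merging bracketed exposures, weighting each frame by how far its samples sit from clipping. A toy sketch of that core idea in plain Python (a simplified weighted average for illustration, not Apple's actual imaging pipeline):

```python
# Toy exposure fusion: merge bracketed exposures of one row of pixels.
# Not Apple's pipeline -- just the principle behind HDR composites:
# weight each exposure's sample by how far it is from clipping.

def weight(v, lo=0, hi=255):
    """Favor mid-tone values; clipped shadows/highlights get ~0 weight."""
    mid = (lo + hi) / 2
    return 1.0 - abs(v - mid) / mid

def fuse(exposures):
    """Per-pixel weighted average across equally sized sample rows."""
    fused = []
    for samples in zip(*exposures):
        ws = [weight(s) for s in samples]
        total = sum(ws) or 1e-9   # guard against all-clipped pixels
        fused.append(sum(w * s for w, s in zip(ws, samples)) / total)
    return fused

under = [10, 40, 120, 200]    # underexposed frame keeps highlight detail
over  = [80, 180, 250, 255]   # overexposed frame keeps shadow detail
print(fuse([under, over]))
```

Note how the last pixel resolves to 200: the overexposed frame's fully clipped 255 gets zero weight, so the underexposed frame's usable value wins outright.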
With custom designed silicon capable of capturing high quality images from tiny camera sensors–and creating Time Lapse and SloMo captures–it’s not a stretch to imagine Apple expanding its reach into the market for GoPro-style sport camcorders or even drone cameras.
Fingerprint and payment technology
Apple’s acquisition of fingerprint sensor maker AuthenTec gave the company exclusive access to advanced Touch ID hardware. And just as with its custom camera processing logic, Apple developed its own Secure Enclave logic in silicon to build a proprietary architecture for securing fingerprint-related data right into the A7 and A8.
If Apple had instead relied upon Samsung to design its Application Processors, every technology advance it paid for would also benefit Samsung’s own smartphone sales. As it is, Samsung is still working to catch up, going on two years after Apple first shipped its full-finger touch sensor for iPhone 5s.
That has also enabled Apple to launch its fingerprint-secured Apple Pay before other platforms even had an installed base of devices equipped with functional sensors capable of hosting a competing payment system.
3D scanning and imaging
Apple is also likely to buy up other key manufacturers of motion and environment sensors that could give it a similar strategic edge in producing difficult-to-copy products. The 2013 acquisition of PrimeSense appears to be an example of doing just that; future iOS, Mac and Apple TV products are likely to include new cameras and sensors that support 3D scanning and gesture-based interfaces, in addition to advanced camera imaging features such as the augmented reality concepts PrimeSense demonstrated years ago.
Other vendors have already shipped or experimented with similar features, but the products Apple ships will likely be built with custom silicon that isn’t available as an off-the-shelf component competitors can easily copy, because Apple owns the core technologies and the proprietary silicon designs that power them.
It’s not hard to imagine Apple developing a TV set top box that combines the features of today’s Apple TV with gesture-based navigation and Siri commands, console class video games and home automation features.
With the ability to design its own advanced silicon CPUs, GPUs and custom logic, Apple could also develop advanced upgrades to its current AirPort basestations, enabling mesh networks that stitch together “Internet of Things” sensors for managing everything from home security to inside climate control and outside landscaping.
Apple is also currently pioneering some of the largest deployments of commercial solar and fuel cells at its growing portfolio of data centers.
If it wanted to enter the markets for solar-powered outdoor sensors, indoor mapping, iBeacon retail transmitters, public WiFi, electric vehicles or anything else involving off-the-grid energy, it certainly has a head start in understanding how to manage power supplies and battery charging.
Apple’s Maiden, N.C. solar farm. | Source: Apple
Apple also has the ability to flex vast economies of scale to rapidly expand do-it-yourself solar installations among consumers, leveraging some of the same advanced proprietary technology it already uses to achieve industry leading battery life in its laptops and mobile devices. With nearly $200 billion in cash, Apple can enter any market it determines worthy of its time.
Open standards, vertical hardware
At the same time, Apple is also pursuing efforts that use open technologies that its competitors can benefit from. When everyone uses the same technology, it can broadly benefit everyone. Apple’s use of Unix and its promotion of WebKit and OpenGL have dramatically helped keep the industry competitive, for example.
Ironically, Google–despite its reputation for being “open”–has worked to subvert all of those standards with its own internally developed alternatives, largely because Google makes its money via advertising and software services, and either doesn’t want to contribute to key standards or doesn’t have the long-term vision to deploy such enabling technologies.
Conversely, in hardware–which drives Apple’s bottom line–Google tries to support everything, from ARM to Intel’s x86 Atom CPUs, virtually any Baseband Processor and a broad array of lower end GPUs, all of which contribute to fragmentation-related support issues.
The end result is that Google’s Android provides a sort of hobbyist “primordial soup” that allows for innovative concepts to attempt to evolve into a viable state. The most successful of these can be domesticated and refined by Apple to create products that are ready for sale, in the way that Apple Pay is now driving upon roads that Google financed years ago when it tried to roll out its own NFC-based Wallet.
One big difference between today’s Apple and its status back in 2007 when it introduced the iPhone is that the company is far richer, ships far larger volumes of products and makes up a much greater percentage of the world’s consumer electronics–Apple particularly dominates in profitability. That has given Apple options it didn’t have when it first launched ARM-based iPhones without Intel.
At this point, Apple can literally do anything it wants to. The real question is: what will Apple want to do next?