What is Semiconductors 3.0?

I am sick and tired of hearing the same buzzwords a zillion times every single day. IoT, Cognitive Computing, Cloud Computing, Fog Computing, Big Data, Artificial Intelligence, Smart-whatever (from objects to cities and pretty soon toilet paper, I'm afraid). What really makes me sick is that these buzzwords are used, most of the time, to avoid entering into the "low-level" details and are just thrown at us to justify unclear strategies. Intel wants to build the IoT fabric. What is that, exactly? Does that mean they'll compete with Cisco? For me, it's just useless garbage language to avoid saying they don't have a clue where they are going. On the other side of the competitive landscape, SoftBank buys ARM because of IoT. Right, obvious, isn't it? But what does it mean exactly? Why would a telco (or a pure financial institution, I'm not really sure which) need to own a CPU architecture? And why would ARM dominate IoT like they dominate mobile? What software legacy makes their Instruction Set Architecture desirable in the IoT space? Hey, I am not saying Intel will not be the IoT fabric, or that ARM won't be powering the IoT edge devices. I am just saying that throwing buzzwords at us isn't sufficient to explain and justify strategies. Throwing slogans at people is what advertisers and politicians do. The tech world, i.e. us, should be a little more pragmatic and should not accept slogans as absolute truth.

 

 


Alright, now that I have expressed my frustration, let me try to get into the details and see what IoT might do to the competitive landscape. The title of this post is Semiconductors 3.0, so you might guess that there were a 1.0 and a 2.0.

 

Semiconductors 1.0: The PC era

If we look at the different phases of semiconductor market evolution, we can clearly see the PC era as THE first major continental drift that totally changed the landscape. Intel became the huge behemoth we know, DRAM became strategically important and put Japan Inc. on the map, and, probably even more important, software legacy became a major driving force. This was the first mass-market "platform", the Wintel platform. That era was characterised by several hundred million units of "objects" (PCs, workstations) priced at around $1,000. I am just using round numbers for the sake of simplification; these are the orders of magnitude. A $100B market, that's enough to change the landscape.

 

Semiconductors 2.0: The Smartphone era

The PC stopped growing and started declining. What came next was a battery-powered device, the Smartphone. Intel totally blew it. Qualcomm became the huge behemoth we know. Flash memory became strategically important and put Korea Inc. on the map, and the software changed totally. Microsoft also blew it, and now we have the Apple and Google platforms, both running on ARM cores. Because of the app store phenomenon, only two platforms survive: the app gap is enough to kill even a good platform if you can't find all of the popular apps on it (Windows Phone, Symbian and a few others died or are dying as a result). This era is characterised by several billions of units of objects (in this case, phones or tablets) priced at around $100. Again, I am using round numbers to keep it simple. So, at $100B, the market size is actually similar to the PC era.

 

So, what’s next?

Guess what, Mobile isn't growing much anymore. The only growth is in the low and mid-range, in developing countries, so price pressure is getting worse and worse. This time, Qualcomm is not even alone. Mediatek, Samsung and several Chinese competitors are also capable of making mobile SoCs and modems, running the very same Android. The poor western players who tried to compete are all dead after losing tens of billions of dollars. Broadcom, ST-Ericsson, NXP, Infineon, Icera/Nvidia, Intel, TI: the list is long and sad. One could say that the 2.0 era is also a drift towards the east. Korea and China are much stronger than they were in the 1.0 PC era. Taiwan, which was a strong player in the PC days, remained a strong player in the Mobile period, mostly thanks to TSMC and Mediatek.

 

Semiconductors 3.0: The Internet of Everything era

Where is the next growth sector? I guess we all agree it has to do with the emergence of a bunch of new connected devices. But we should be careful about extending the 1.0 and 2.0 rules (I intentionally did not use "paradigm" to avoid one of my least favorite buzzwords) to this new era. Why? Mostly because of fragmentation.
IoT is not one market. It's a collection of many markets with different cycles, different technical requirements and different growth patterns. To simplify to the extreme, we can probably say that it's a market for tens of billions of $10 units. So, we keep the same constant $100B TAM and everyone is happy… Well, not quite.
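Before getting to the "not quite", here is the back-of-the-envelope arithmetic behind those round numbers, across all three eras. This is just a sketch using the deliberately simplified figures above, nothing more:

```python
# Back-of-the-envelope TAM arithmetic, using the deliberately
# round numbers from the text (orders of magnitude only).
eras = {
    "1.0 PC":         (100e6, 1000),  # ~100M units at ~$1,000 each
    "2.0 Smartphone": (1e9,   100),   # ~1B units at ~$100 each
    "3.0 IoT":        (10e9,  10),    # ~10B units at ~$10 each
}

for name, (units, asp) in eras.items():
    tam = units * asp  # total addressable market = units x average price
    print(f"Semiconductors {name}: ${tam / 1e9:,.0f}B TAM")

# All three print $100B: the pie stays the same size, it just gets
# cut into many more, much smaller slices.
```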
The fragmentation brings new rules compared to the PC and Mobile eras. Automotive, Industrial, Networking, Consumer, Medical and Military applications are not "one" IoT market. For example, there is absolutely no reason why one merchant software platform would win in all the sub-segments. Plain Linux is probably fine for almost all apps. Most IoT devices will not need, and certainly will not encourage, the development of large open app stores, which would be a big driving force against fragmentation, as they were in mobile. Do you really believe BMW will let you download a Lewis Hamilton app to change your autonomous driving style?

 

Uncertainty and fragmentation are not conducive to the way US corporations think. Let's face it, even if it's politically incorrect: most technology companies in the US are hoping to build monopolistic positions. They invest big money when they feel there is a possibility of building a quasi-monopoly. And they are extremely good at it. I would therefore bet that Semiconductors 3.0 will continue the drift towards the east. Uncertainty and fragmentation aren't that bad if you move really fast and adapt really fast. We could also see some European companies doing reasonably well, close to their end markets. Germany is important for Automotive and Industrial, for example, so NXP and Infineon could continue to do well.
How about processor platforms? I think the game is totally open, both at the edge and in the cloud.
At the edge, ARM is definitely under threat because the 32-bit core is a commodity and the software legacy is not so relevant.
In addition, I am a firm believer that more processing will need to happen at the edge, to pre-process the enormous amounts of data generated by hundreds of billions of sensors. This pre-processing is necessary for latency, bandwidth and data-transportation cost reasons. It will most probably be done by embedded neural processing units; let's call them ENPUs. A prototype of this approach is Intel's Quark SE MCU. It has a 32-bit x86 Quark core, but its real power comes from its embedded neural processing unit, which does super-fast, super-low-power sensor data analytics. It would actually be fun to see Intel become the king of the edge almost by accident, as this is clearly not their strategic focus but rather a hobby.
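To make the pre-processing argument concrete, here is a minimal sketch of the idea in plain Python. To be clear, this is not Quark SE code; the names, window size and threshold are all made up for illustration. The point is simply that the edge node reduces raw samples to tiny summaries and stays silent most of the time:

```python
import statistics

# Hypothetical illustration of edge pre-processing: instead of streaming
# every raw sample to the cloud, the node reduces each window of samples
# to a tiny feature vector and only transmits when something interesting
# happens. Window size and threshold are invented for this example.
WINDOW = 256      # raw samples per analysis window
THRESHOLD = 3.0   # report only if the mean drifts > 3 sigma from baseline

def summarize(window):
    """Reduce a raw sample window to two features: mean and spread."""
    return statistics.mean(window), statistics.stdev(window)

def edge_filter(stream, baseline_mean, baseline_std):
    """Yield only anomalous window summaries; drop everything else."""
    buf = []
    for sample in stream:
        buf.append(sample)
        if len(buf) == WINDOW:
            mean, std = summarize(buf)
            if abs(mean - baseline_mean) > THRESHOLD * baseline_std:
                yield (mean, std)  # two floats instead of 256 raw samples
            buf.clear()
```

A real ENPU would replace the fixed threshold with a learned classifier, but the economics are the same: kilobytes of raw sensor data in, a few bytes out, and nothing at all on the wire when nothing interesting is happening. That is the latency, bandwidth and cost argument in a nutshell.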

 

In the Cloud, Intel is under serious threat because the processing required for big data analytics (buzzwords again, sorry) is different. Deep Learning (more buzzwords), or whatever AI technology takes off, does not care about the x86 instruction set. It cares about massive parallelism and removal of the Von Neumann bottleneck, and, I strongly believe, it will want in-memory computing. A new kind of cloud processing unit will emerge, let's call it the CNPU, for Cloud Neural Processing Unit, and the control processor will become a commodity doing ancillary tasks. GPUs are doing this neural processing today, but that's just a gap filler until something better emerges. No matter what Nvidia says, GPUs were designed to do graphics, not neural processing. Our brain isn't full of GPUs, sorry guys. That's ARM's opportunity to finally enter the data center in a semi-big way, but not in the key strategic socket. Who will take that neural processing socket? New names probably. Google, Alibaba, Baidu, Facebook, Amazon, Apple,… Anyone trying to shoehorn its ISA, whether ARM or x86, into that socket will fail. Are ARM or Intel capable of coming up with something radically different from their core technology? I am not betting on that, but I might be wrong.
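One way to see why the instruction set matters so little in that socket: essentially all the work in a neural network layer is one big matrix multiplication, which is exactly the part that gets handed off to the accelerator. A toy illustration in Python/NumPy, with made-up shapes rather than a real model:

```python
import numpy as np

# Toy forward pass of one fully connected layer. Essentially all of the
# arithmetic lives in the matrix multiply, which is massively parallel
# and memory-bandwidth hungry: precisely the part a CNPU/accelerator
# would own, while the host CPU (x86, ARM or anything else) merely
# orchestrates.
batch, d_in, d_out = 64, 4096, 4096
x = np.random.randn(batch, d_in).astype(np.float32)   # activations
w = np.random.randn(d_in, d_out).astype(np.float32)   # weights
b = np.zeros(d_out, dtype=np.float32)                 # bias

y = np.maximum(x @ w + b, 0.0)  # matmul + bias + ReLU

# ~2 * batch * d_in * d_out floating-point operations in the matmul alone;
# the surrounding control code contributes essentially nothing.
print(f"matmul FLOPs: {2 * batch * d_in * d_out / 1e9:.1f} G")
```

Whatever ISA the control processor runs, it only dispatches work; the accelerator owns the matmul, and with it virtually all the FLOPs.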

 

I also believe that the 3.0 era will see a new memory technology emerge. Artificial Intelligence needs a fast-write, fast-read, non-volatile memory. Samsung/Grandis, Intel/Micron, WD/Everspin and others are working hard on this. I don't know who will win, but it will be interesting to watch.

 

I am sure Mr Son, who just wrote a $32B check to acquire ARM, wants everything I just wrote to be totally wrong. He's much smarter than I am, so he's probably right :-) Stay tuned as we see this new era develop.

 

PS: This is a personal note. Most of you won't care, so you can stop reading here. As I write this post about the next phase of the Semiconductor market, I cannot describe how sad I feel when I see ST, my former employer, going through an endless decline. The fragmented market I describe as the Semiconductors 3.0 era would have been a PERFECT market for ST. ST has sensors, MCUs, analog, mixed signal, power. It has almost all the bits and pieces (OK, it's a little weak in connectivity and in software, but that could be fixed) to be the Intel/Qualcomm of the Semiconductors 3.0 era. It also has 28nm FDSOI, which is the perfect process for IoT edge devices. What's missing is the leadership to unify all the people and all the silos in ST around one clear vision. I wish them good luck anyhow.

 

________________

This is a guest post by Philippe Lambinet, CEO of Cogito Instruments

 
