Monthly Archives: September 2016


Fabless Semiconductor Company Value Chain

Michael Porter introduced the value chain concept in his 1985 book “Competitive Advantage”. A value chain is the series of activities a company performs in order to deliver a product or a service. The total value delivered by the company is the sum of the value built up throughout the chain.

 

This description also applies to the semiconductor industry, where every company in the chain must generate value. The sum of the entire chain, end to end, is the total offering delivered to the end customer. To visualize it, imagine a chip (IC or ASIC) being produced all the way from sunny sand dunes: sand is taken to a factory, wafers are processed, the ASIC is designed using EDA tools, then packaged, tested and finally delivered to Apple for a new mobile phone, so that the customer can enjoy the benefits of the new IC.
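The idea that total value is the sum of the value added at each stage can be sketched in a few lines of Python. The stage names and dollar figures below are purely hypothetical, chosen only to mirror the sand-to-smartphone example above:

```python
# Illustrative sketch of a fabless value chain: each stage adds value on
# top of its input. Stage names and dollar figures are made-up examples.
stages = [
    ("raw silicon / wafer processing", 5.0),
    ("ASIC design (EDA tools, IP)",    12.0),
    ("packaging",                      3.0),
    ("test",                           2.0),
]

def total_value(stages):
    """Total value delivered is the sum of the value added at every stage."""
    return sum(value_added for _, value_added in stages)

print(f"total value built up along the chain: ${total_value(stages):.2f}")
```

The point is structural, not numerical: the end customer pays for the accumulated contribution of every partner in the chain.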

 

In today’s semiconductor industry the value chain is dynamic and managed through partnerships among all the value creators. A fabless semiconductor company’s value chain is described in the following block diagram. It assumes the fabless company has agreements with all of the typical suppliers: EDA companies, a wafer supplier and OSAT companies (each of which has its own value chain; packaging companies, for example, have material suppliers, machines, design tools and more).

 

Another type of value chain can be created with an ASIC supply chain partner that manages the fabless company’s operations. Several companies specialize in providing semiconductor supply chain services; you can find them in this link.

 


Qualcomm Opens Semiconductor Test Manufacturing Site in Shanghai

Qualcomm Incorporated (NASDAQ:QCOM) today announced the opening of Qualcomm Communication Technologies (Shanghai) Co. Ltd., a semiconductor test facility in the Waigaoqiao (WGQ) free-trade zone in Shanghai, and its first foray into providing manufacturing services for semiconductors. By working with Amkor Technology, Inc., one of the world’s leading providers of contract semiconductor assembly and test services, the new company will combine Amkor’s extensive test services experience and state of the art cleanroom facilities with Qualcomm Technologies’ industry leadership in cutting-edge product engineering and development.

 

The new manufacturing facility demonstrates Qualcomm Technologies’ commitment to continue to invest and help develop semiconductor expertise in China, and is indicative of growth in semiconductor market leadership in the country. Through the ownership and operation of a semiconductor test center, Qualcomm Technologies will enhance its focus on customer service, continue to develop its expertise in operational excellence, and increase its business presence in China.

“The test facility is part of our continued mission to streamline supply chain operations and improve operational efficiency,” said Roawen Chen, senior vice president, QCT global operations, Qualcomm Technologies, Inc.

 

“Qualcomm Technologies continually strives to improve our manufacturing footprint in China and the formation of Qualcomm Communication Technologies in Shanghai is another example of this dedication,” said Frank Meng, chairman, Qualcomm China.

“We are excited to work with Qualcomm Technologies in their new test operation in China,” said Steve Kelley, Amkor’s president and chief executive officer. “Amkor offers the most advanced outsourced assembly and test technologies in China, and this expanded relationship is a natural extension of the long history of close collaboration between our two companies.”

The Shanghai-based facility is set to begin operations on October 18, 2016.

 

About Qualcomm Incorporated

 

Qualcomm Incorporated (NASDAQ: QCOM) is a world leader in 3G, 4G and next-generation wireless technologies. Qualcomm Incorporated includes Qualcomm’s licensing business, QTL, and the vast majority of its patent portfolio. Qualcomm Technologies, Inc., a subsidiary of Qualcomm Incorporated, operates, along with its subsidiaries, substantially all of Qualcomm’s engineering, research and development functions, and substantially all of its products and services businesses, including its semiconductor business, QCT. For more than 30 years, Qualcomm ideas and inventions have driven the evolution of digital communications, linking people everywhere more closely to information, entertainment and each other. For more information, visit Qualcomm’s website, OnQ blog, Twitter and Facebook pages.

 

About Amkor Technology, Inc.

 

Amkor Technology, Inc. is one of the world’s largest providers of outsourced semiconductor packaging and test services. Founded in 1968, Amkor pioneered the outsourcing of IC packaging and test, and is now a strategic manufacturing partner for more than 250 of the world’s leading semiconductor companies, foundries and electronics OEMs. Amkor’s operating base includes more than 8 million square feet of floor space, with production facilities, product development centers, and sales and support offices located in key electronics manufacturing regions in Asia, Europe and the U.S. For more information, visit www.amkor.com.

Contacts

 

Qualcomm Contacts:
Pete Lancia, Corporate Communications
1-858-845-5959
corpcomm@qualcomm.com
John Sinnott, Investor Relations
1-858-658-4813
ir@qualcomm.com
or
Amkor Contact:
Greg Johnson, Vice President, Finance and Investor Relations
480-786-7594
greg.johnson@amkor.com


Verification, Validation, Testing of ASIC/SOC designs – What are the differences?

If you are involved in any ASIC/SOC design life cycle, you have very likely heard questions like: Have you verified a feature? Is all feature testing complete? How will you validate a new feature? What design defects were found, and how?

 

The terms Verification, Validation and Testing are used interchangeably and can be confusing at times, at least for entry-level engineers.

 

All of these terms relate to testing the chip, but at different stages of the design and manufacturing flow. Here is what they really mean.

 

 

SoC Verification is the process in which a design is tested (or verified) against a given design specification before tape-out. It happens alongside the development of the design and can start as soon as the design architecture/micro-architecture is defined. The main goal of verification is to ensure the functional correctness of the design before tape-out. However, with increasing design complexity, the scope of verification is evolving to include much more than functionality: performance and power targets, security and safety aspects of the design, and the complexities of multiple asynchronous clock domains. Simulation of the design model (RTL) remains the primary vehicle for verification, while many other methodologies, such as formal property verification, power-aware simulation, emulation/FPGA prototyping, and static and dynamic checks, are also used to efficiently verify all aspects of the design before tape-out. The verification process is considered a critical part of the design life cycle, because any serious bug not discovered before tape-out can require new steppings and increase the overall cost of the design process.
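The core of simulation-based verification is comparing the design’s outputs against a golden reference model derived from the specification. Real testbenches are written in languages like SystemVerilog against RTL; the sketch below uses plain Python, with a hypothetical 4-bit adder standing in for the design under test, just to show the self-checking structure:

```python
# Minimal self-checking testbench sketch (hypothetical 4-bit adder DUT).
# In practice the DUT is an RTL model driven through a simulator; here
# both the DUT and the reference are Python functions for illustration.
import itertools

def dut_add4(a, b):
    """Stand-in for the design under test: 4-bit adder with carry-out."""
    s = a + b
    return s & 0xF, (s >> 4) & 0x1   # (sum, carry)

def ref_add4(a, b):
    """Golden reference model written straight from the specification."""
    return (a + b) % 16, (a + b) // 16

def run_testbench():
    """Drive exhaustive stimulus and collect any mismatching inputs."""
    failures = []
    for a, b in itertools.product(range(16), repeat=2):
        if dut_add4(a, b) != ref_add4(a, b):
            failures.append((a, b))
    return failures

assert run_testbench() == [], "functional mismatch found before tape-out"
```

For a 4-bit adder exhaustive stimulus is feasible; for real SoCs, constrained-random stimulus and coverage metrics replace exhaustion, but the checker-versus-reference structure is the same.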

 

SoC Validation is the process in which the manufactured design (chip) is tested for functional correctness in a lab setup. This is done using the real chip assembled on a test board or a reference board, along with all the other components of the system the chip was designed for. The goal is to validate all the use cases a customer might eventually exercise in a real deployment and to qualify the design for all these usage models. Validation initially covers individual features and interfaces of the chip, and can then involve running real software/applications that stress-test all the features of the design. The validation team usually consists of both hardware and software engineers, as the overall process involves validating the chip in a system-level environment with real software running on the hardware.

 

Some companies use the term Validation in a broader sense, classifying activities as before or after silicon/chip availability. Verification is then also referred to as Pre-Silicon Validation (indicating activities before the silicon chip is available), and Validation is also known as Post-Silicon Validation.

 

SoC Testing (manufacturing/production test) involves screening manufactured chips for faults or random defects, reliability, functional defects and electrical characteristics before volume shipment.

 

The first level of testing happens at the wafer level, before individual dies are packaged. Known as wafer sort/probe testing, it characterizes various technology and transistor parameters before the die is cut out, and helps identify faulty dies before packaging.

 

The next level of testing happens on the packaged die, stressing for reliability by testing at elevated temperatures to identify chips likely to fail early. This process is known as burn-in stress testing.

 

The third level of testing identifies manufacturing defects or faults. At a high level, this process involves stimulating the input ports with various test patterns (also known as test vectors) and comparing the output responses against expected results. Test equipment such as ATE (Automatic Test Equipment) takes individual chips and uses a test program to apply the pattern stimulus and check the responses automatically.
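The stimulate-and-compare loop an ATE test program runs can be sketched in a few lines. The two-pin device model and the stored vectors below are hypothetical, chosen only to show the structure of pattern testing:

```python
# Sketch of ATE-style pattern testing: apply stored test vectors to the
# chip's inputs and compare captured outputs against expected responses.

def device(inputs):
    """Toy combinational device: AND and XOR of two input pins."""
    a, b = inputs
    return (a & b, a ^ b)

# (input vector, expected response) pairs, as a test program stores them
test_vectors = [
    ((0, 0), (0, 0)),
    ((0, 1), (0, 1)),
    ((1, 0), (0, 1)),
    ((1, 1), (1, 0)),
]

def run_patterns(dut, vectors):
    """Return indices of failing vectors; an empty list means the part passes."""
    return [i for i, (vec, expected) in enumerate(vectors)
            if dut(vec) != expected]

assert run_patterns(device, test_vectors) == []  # part passes all patterns
```

On real ATE the same loop runs at speed in hardware, and a single failing vector is enough to bin out the part.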

 

The next level of testing characterizes and screens chips before volume shipment. Characterization involves testing the design while shmooing voltage and frequency to find the ideal operating conditions. Designs with high-speed IOs (such as PCIe, Ethernet and DDR) also go through characterization of the IO ports, shmooing various electrical parameters to arrive at ideal transmission and error rates.
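A shmoo is simply a pass/fail test repeated over a grid of operating points. The sketch below builds such a grid; the `passes` model (maximum stable frequency rising with supply voltage) is a hypothetical stand-in for a real silicon measurement:

```python
# Sketch of a voltage/frequency shmoo: run a pass/fail test across a
# grid of operating points to map out the working region of the part.

def passes(voltage_v, freq_mhz):
    """Toy model: max stable frequency rises with supply voltage."""
    return freq_mhz <= 500 + 1000 * (voltage_v - 0.8)

voltages = [0.8, 0.9, 1.0, 1.1]
freqs = [400, 600, 800, 1000]

shmoo = {(v, f): passes(v, f) for v in voltages for f in freqs}

# Print the classic shmoo plot: frequency vertical, voltage horizontal.
for f in reversed(freqs):
    row = " ".join("P" if shmoo[(v, f)] else "." for v in voltages)
    print(f"{f:5d} MHz   {row}")
print("            " + " ".join(str(v) for v in voltages))
```

The boundary between the P and . regions in the printout is what characterization engineers read off to set guardbanded operating conditions.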

 

Functional defects are identified using functional test patterns, which are chosen to exercise the different parts of the chip at actual speed and achieve satisfactory coverage.

 

To summarize, here is what each of these steps includes:

Verification: pre-silicon; the design (RTL) is checked against its specification, mainly through simulation, before tape-out.

Validation: post-silicon; the real chip is exercised in a lab/system environment across all its intended use cases.

Testing: production; manufactured chips are screened for defects, reliability and electrical characteristics before volume shipment.

 

For more related topics and questions on Verification/VLSI, do refer to my Quora profile – https://www.quora.com/profile/Ramdas-Mozhikunnath


Semiconductor Wafer Mask Costs

Semiconductor lithography and wafer mask sets have developed dramatically in recent years. As technology has migrated into nanometer geometries, mask set prices have increased exponentially.

 

The good news is that mask cost decreases every year as the production process matures, along with other factors such as market demand and the competitive landscape. However, when a new process node is introduced, the wafer mask set price is sky-high, allowing only a few companies to access the new process as “early adopters”.

 

Selecting the right process node is a critical choice for a fabless company. Designing at the cutting edge increases complexity, and hence cost and risk, but allows the company to benefit from improved performance, smaller size and lower-power ASICs.

 

We have collected wafer mask set prices from our network and generated a chart comparing mask set prices for each node. Remember that these prices change over time and with production volume.
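The volume dependence is easy to see once you amortize the mask-set NRE over shipped units. The price figures below are rough placeholders for illustration, not the surveyed numbers behind the chart:

```python
# Sketch of mask-set NRE amortization: the per-chip share of mask cost
# falls with volume, which is why leading-edge nodes only pay off for
# high-volume products. Prices below are hypothetical placeholders.
mask_set_cost = {
    "180nm": 50_000,
    "65nm": 500_000,
    "28nm": 1_500_000,
    "16nm": 5_000_000,
}

def mask_cost_per_chip(node, volume):
    """Amortized mask NRE per shipped chip at a given production volume."""
    return mask_set_cost[node] / volume

for node, cost in mask_set_cost.items():
    share = mask_cost_per_chip(node, 1_000_000)
    print(f"{node}: ${share:.2f}/chip at 1M units")
```

At low volumes the mask share can dominate chip cost on advanced nodes, which is exactly why MPW and multi-layer mask options exist.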

 

 

 

Learn more about mask sets here: Understanding Maskset Type – MPW, MLM, MLR and Single-Maskset

Get prices from wafer foundries for mask set and wafers: Get 3 quotes from Semiconductor Foundries


What is Semiconductors 3.0?

I am sick and tired of hearing the same buzzwords a zillion times every single day. IoT, Cognitive Computing, Cloud Computing, Fog Computing, Big Data, Artificial Intelligence, Smart-whatever (from objects to cities and pretty soon toilet paper, I’m afraid). What really makes me sick is that these buzzwords are used, most of the time, to avoid entering into the “low-level” details and are just thrown at us to justify unclear strategies. Intel wants to build the IoT fabric. What is that exactly? Does that mean they’ll compete with Cisco? For me, it’s just useless garbage language to avoid saying they don’t have a clue where they are going. On the other side of the competitive landscape, SoftBank buys ARM because of IoT. Right, obvious, isn’t it? But what does it mean exactly? Why would a telco (or a pure financial institution, I’m not really sure which) need to own a CPU architecture? And why would ARM dominate IoT the way they dominate mobile? What software legacy makes their instruction set architecture desirable in the IoT space? Hey, I am not saying Intel will not be the IoT fabric or that ARM won’t be powering IoT edge devices. I am just saying that throwing buzzwords at us isn’t sufficient to explain and justify strategies. Throwing slogans at people is what advertisers and politicians do. The tech world, i.e. us, should be a little more pragmatic and should not accept slogans as absolute truth.

 

 


 

Alright, now that I have expressed my frustration, let me try to get into the details and see what IoT might do to the competitive landscape. The title of this post is Semiconductors 3.0, so you might guess that there was a 1.0 and a 2.0.

 

Semiconductors 1.0: The PC era

If we look at the different phases of semiconductor market evolution, we can clearly see the PC era as THE first major continental drift that totally changed the landscape. Intel became the huge behemoth we know, DRAM became strategically important and put Japan Inc. on the planet, and, probably even more important, software legacy became a major driving force. This was the first mass-market “platform”, the Wintel platform. That era was characterised by several hundred million “objects” (PCs, workstations) priced at $1000. I am just using round numbers for the sake of simplification; these are the orders of magnitude. A $100B market, that’s enough to change the landscape.

 

Semiconductors 2.0: The Smartphone era

PCs stopped growing and started declining. What came next was a battery-powered device, the smartphone. Intel totally blew it. Qualcomm became the huge behemoth we know. Flash memory became strategically important and put Korea Inc. on the planet, and the software changed totally. Microsoft also blew it, and now we have the Apple and Google platforms, both running on ARM cores. Because of the app store phenomenon, there are only two platforms: the app gap is enough to kill even a good platform if you can’t find all of the popular apps on it (Windows Phone, Symbian and a few others died or are dying as a result). This era is characterised by several billions of units of objects (in this case, phones or tablets) priced at $100. Again, I am using round numbers to keep it simple. So the $100B market size is actually similar to the PC era.

 

So, what’s next?

Guess what, mobile isn’t growing much anymore. The only growth is in the low and mid-range, in developing countries, so price pressure is getting worse and worse. This time, Qualcomm is not even alone. Mediatek, Samsung and several Chinese competitors are also capable of making mobile SoCs and modems, running the very same Android. The poor western guys who tried to compete are all dead after losing tens of billions of dollars. Broadcom, ST-Ericsson, NXP, Infineon, Icera/Nvidia, Intel, TI, the list is long and sad. One could say that the 2.0 era is also a drift towards the east. Korea and China are much stronger than they were in the 1.0 PC era. Taiwan, which was a strong player in the PC days, remained a strong player in the mobile period, mostly thanks to TSMC and Mediatek.

 

Semiconductors 3.0: The Internet of Everything era

Where is the next growth sector? I guess we all agree it has to do with the emergence of a bunch of new connected devices. But, we should be careful trying to extend the 1.0 and 2.0 rules (I intentionally did not use “paradigm” to avoid one of my least favorite buzzwords) to this new era. Why? Mostly because of fragmentation.
IoT is not one market. It’s a collection of many markets that have different cycles, different technical requirements and different growth patterns. To simplify to the extreme, we can probably say that it’s a market for tens of billions of $10 units. So we keep the same constant $100B TAM and everyone is happy… Well, not quite.
The fragmentation brings new rules compared to the PC and Mobile eras. Automotive, Industrial, Networking, Consumer, Medical and Military applications are not “one” IoT market. For example, there is absolutely no reason why one merchant software platform would win in all the sub-segments. Plain Linux is probably fine for almost all apps. Most IoT devices will not need, and certainly will not encourage, the development of large open app stores, which would be a big driving force against fragmentation. Do you really believe BMW will let you download a Lewis Hamilton app to change your autonomous driving style?
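The round-number arithmetic running through the three eras can be made explicit. These are the post’s own deliberately simplified orders of magnitude, not market data:

```python
# Units shift up by ~10x per era while average selling price drops ~10x,
# keeping the total addressable market near a constant $100B.
eras = {
    "1.0 PC":         (100e6, 1000),  # ~100M units at ~$1000
    "2.0 Smartphone": (1e9,   100),   # ~1B units at ~$100
    "3.0 IoT":        (10e9,  10),    # ~10B units at ~$10
}

for name, (units, asp) in eras.items():
    tam = units * asp
    print(f"Semiconductors {name}: ${tam / 1e9:.0f}B")
```

The constant TAM is exactly what makes the 3.0 era deceptive: same money, but spread across far more units and far more fragmented sub-markets.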

 

Uncertainty and fragmentation are not conducive to the way US corporations think. Let’s face it, even if it’s politically incorrect: most technology companies in the US are hoping to build monopolistic positions. They invest big money when they feel there is a possibility to build a quasi-monopoly. And they are extremely good at it. I would therefore bet that Semiconductors 3.0 will continue the drift towards the east. Uncertainty and fragmentation aren’t that bad if you move really fast and adapt really fast. We could also see some European companies doing reasonably well, close to their end markets. Germany is important for Automotive and Industrial, for example, so NXP and Infineon could continue to do well.
How about processor platforms? I think the game is totally open, and I believe it’s open both at the edge and in the cloud.
At the edge, ARM is definitely under threat, because the 32-bit core is a commodity and the software legacy is not so relevant.
In addition, I am a firm believer that more processing will need to happen at the edge to pre-process the enormous amount of data generated by hundreds of billions of sensors. This pre-processing is necessary for latency, bandwidth and data-transportation cost reasons. It will most probably be done by embedded neural processing units, let’s call them ENPUs. A prototype of this approach is the Quark SE MCU from Intel. It has a 32-bit X86 Quark core, but its real power comes from its embedded neural processing unit, which does super-fast and super-low-power sensor data analytics. It would actually be fun to see Intel become the king of the edge, almost by accident, as this is clearly not their strategic focus but rather a hobby.

 

In the cloud, Intel is under serious threat because the processing required for big data analytics (buzzwords again, sorry) is different. Deep Learning (more buzzwords), or whatever AI technology takes off, does not care about the X86 instruction set. It cares about massive parallelism, removal of the Von Neumann bottleneck and, I strongly believe, it will want in-memory computing. A new kind of cloud processing unit will emerge, let’s call it a CNPU for Cloud Neural Processing Unit, and the control processor will become a commodity, doing ancillary tasks. GPUs are doing this neural processing today, but that’s just a gap filler until something better emerges. No matter what Nvidia says, GPUs were designed to do graphics, not neural processing. Our brain isn’t full of GPUs, sorry guys. That’s ARM’s opportunity to finally enter the data center in a semi-big way, but not in the key strategic socket. Who will take that neural processing socket? New names, probably. Google, Alibaba, Baidu, Facebook, Amazon, Apple,… Anyone trying to shoehorn its ISA, whether ARM or X86, into that socket will fail. Are ARM or Intel capable of coming up with something radically different from their core technology? I am not betting on that, but I might be wrong.

 

I also believe that the 3.0 era will see a new memory technology emerge. Artificial intelligence needs a fast-write, fast-read, non-volatile memory. Samsung/Grandis, Intel/Micron, WD/Everspin and others are working hard on this. I don’t know who will win, but it will be interesting to watch.

 

I am sure Mr Son, who just wrote a $32B check to acquire ARM, wants everything I just wrote to be totally wrong. He’s much smarter than I am, so he’s probably right :-) Stay tuned as we see this new era develop.

 

PS: This is a personal note. Most of you won’t care, so you can stop reading here. As I write this post about the next phase of the semiconductor market, I cannot describe how sad I feel when I see ST, my former employer, going through an endless decline. The fragmented market I describe as the Semiconductors 3.0 era would have been a PERFECT market for ST. ST has sensors, MCUs, analog, mixed signal and power. It has almost all the bits and pieces (OK, they’re a little weak in connectivity and in software, but that could be fixed) to be the Intel/Qualcomm of the Semiconductors 3.0 era. It also has 28nm FDSOI, which is the perfect process for IoT edge devices. What’s missing is the leadership to unify all the people, all the silos in ST, around one clear vision. I wish them good luck anyhow.

 

________________

This is a guest post by Philippe Lambinet, CEO of Cogito Instruments