How Could Intel Miss Mobile Processors, Comms Chips and GPUs?

Motorola was a true innovator, but it crashed where Intel succeeded. At the time of the rise of the microprocessor, Motorola (with its 6800/68000 series) had a much better processor than Intel (with its horribly flawed 8086 series). But Intel had two jewels in its crown: being selected for the IBM PC, and András István Gróf (Andy Grove).

Andy continually fretted about future markets and threats, and built a company with a focus that few others could ever match. Initially, he took Intel out of the crowded DRAM market and focused the company on microprocessors. He could see that what engineers needed was development kits, and Intel made sure these were created, allowing system designers to use its processors in the design of new systems. It was a bold move for the company that had invented DRAM.

But is Intel now failing? Well, after Andy left, the company generally moved from an engineering focus to a profit-focused approach, maximising the returns from its x86 architecture. It has since missed the mobile processor market, the communications chip market, and the GPU/parallel processing market.

In GPUs, NVIDIA created CUDA and gave it away for free, but CUDA code runs only on NVIDIA's chips, which locked Intel out of the GPU market. And the great threat to Intel comes from companies that do not need to spend funds on making chips: they just design them and let others build them.
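The lock-in is visible in even the simplest CUDA program: the source below (a minimal sketch; the kernel name, array size and launch geometry are arbitrary) must be compiled with NVIDIA's nvcc compiler and will only run on NVIDIA hardware.

```cuda
#include <cstdio>

// A trivial data-parallel kernel: each GPU thread scales one array element.
// The __global__ qualifier and <<<blocks, threads>>> launch syntax are
// CUDA-specific, so this code is tied to NVIDIA's toolchain and GPUs.
__global__ void scale(float *x, float s, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= s;
}

int main() {
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; i++) host[i] = 1.0f;

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch 4 blocks of 256 threads: one thread per array element.
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    printf("host[0] = %f\n", host[0]);
    return 0;
}
```

An Intel CPU, however fast, simply cannot run this binary; porting the ecosystem of CUDA software away from NVIDIA is the moat the article describes.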

Qualcomm saw the opportunity in communications chips, which mix analogue and digital electronics. And ARM focused on licensing its technology for others to build on. But the greatest threat to Intel comes from Apple, which now designs its own chips and hands them over to others to manufacture. While this creates supply-chain risks for Apple, it allows the company to focus on continually improving its hardware, rather than waiting for Intel to advance its systems. Anyone who uses a MacBook with an ARM-based chip will know that it generally wipes the floor with the equivalent x86 processor.

What went wrong? Intel focused on general-purpose CPUs, which are not well matched to more specialist AI and cryptography workloads. NVIDIA saw the AI market coming and focused on making its chips well matched to machine learning. How did this happen? Well, NVIDIA noticed that PhD students were using its GPUs to run AI algorithms, and it has since matched its chips to these requirements. The company has bet its future on AI, while letting others manufacture its chips.
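Why machine learning maps so well onto GPUs comes down to the shape of the workload: its core operation, matrix multiplication, splits into many independent dot products, and a GPU can hand each one to a separate thread. A minimal illustration in plain Python (no GPU involved; the matrices are arbitrary examples):

```python
# Matrix multiply decomposed into independent tasks: every output cell
# depends only on one row of A and one column of B, so all cells can be
# computed in parallel -- exactly the shape of work a GPU accelerates
# and a general-purpose CPU must grind through largely sequentially.

def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    # Each (i, j) cell is an independent task: on a GPU, one thread per cell.
    return [[sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

With thousands of such cells per layer of a neural network, hardware with thousands of simple cores beats a handful of complex general-purpose ones.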