Former Intel CEO tried to buy Nvidia back in 2005 for $20 billion

Alfonso Maruccia

WTF?! Shortly after taking the reins at Intel in 2005, CEO Paul Otellini proposed that the board of directors acquire Nvidia. The GPU maker was valued at around $20 billion at the time, which would have made it an extremely expensive acquisition for a company dealing mostly in x86 computer CPUs. Today, Nvidia's valuation has surged into the trillions thanks to the generative AI craze, while Intel is struggling for survival.

Intel missed a significant opportunity nearly two decades ago. The company could have purchased Nvidia at a fraction of its current value, but its focus on the x86 architecture and resistance from the board worked against the acquisition.

According to unnamed sources cited in a New York Times report, Intel missed this "golden opportunity" while it was still Silicon Valley's dominant chipmaker. Some board members apparently recognized the potential future role of GPUs in enterprise and data center markets, but ultimately, the board rejected the deal.

Faced with the board's skepticism, Otellini chose not to push the proposal further. Reflecting on 2005, one attendee of the confidential meeting described it as a "fateful moment" in the history of Intel (and of chip technology). Intel had a poor track record with mergers and acquisitions, and the proposed $20 billion deal would have been the most expensive acquisition the company had ever attempted.

Intel executives once described the 2005-era company as the "largest single-cell organism on the planet" – an organization with a highly insular corporate culture and limited openness to technology outside of x86 chips. Former Intel CEO Craig Barrett reportedly compared x86 technology to the creosote bush, a plant known for releasing toxins to hinder the growth of competing plants around it.

Two decades ago, Intel's PC CPU business was thriving, generating strong profits for years. Eventually, however, Intel was forced to confront Nvidia as a potential competitor, answering with the Larrabee project, an odd hardware hybrid meant to extend the x86 instruction set architecture into the graphics realm. Although Larrabee ultimately failed, its chief proponent, Pat Gelsinger, is now Intel's CEO.

Also read: The Last Time Intel Tried to Make a Graphics Card

Today, Nvidia is valued at over $3 trillion on the stock market, roughly 30 times Intel's value. GPUs have become essential for AI acceleration, and nearly every major tech company is vying to carve out a successful business in this area. Despite some debate over the actual returns on AI investments, it would be premature to write Intel off entirely.

 
Intel has never been an outsider, so it hasn't gained the world's sympathy, but it is a significant and great company. For a long time, over 80% of processors sold were Intel's. This means that all the evolution of technology (CAD design software, scientific software, commercial software, etc.) was built on Intel products. In other words, the fact that today we have advanced machines, advanced medicines, advanced processing methods, advanced cars, etc., became possible because Intel was making processors.

Nevertheless, many people struggle to grasp the potential of emerging technologies. Jensen Huang of Nvidia, for example, saw AI before it even existed and built it practically from nothing, while many people today see AI already up and running and still don't understand it.
 
Now that my NDA is lifted: when I was at Symantec on the corporate side of the house in 2012, I was asked by more than one of my assigned customers when we would have hardware-accelerated IPS on x86 devices and when we would have a cloud EDR that did not suck balls. I took this to my manager, who took me to a skip-level, and we told the director, who took it himself to the PM and the sales exec. They came back and said it would cost too much to develop in-house, and they 100% opted to not even look at building one out.

Two or three years later, Crowdstrike was eating at the contracts on the periphery... then they took Microsoft at 500,000 seats. All of a ####ing sudden the board screams at us asking why we did not have a cloud-powered EDR client and a true EDR client (we had SEP 14c, but it was garbage). All of a sudden we buy Javelin and some hole-in-the-wall EDR that CANNOT SCALE past 10,000 endpoints per PHYSICAL SERVER... like, really? We sold it to our #1 financial customer with 560,000 endpoints and it absolutely tanked. After that we spent a billion dollars on emergency development, in 2016 dollars. It was insane. The result was more hot garbage.

In my previous 15 years I had issued only one official apology to a customer for a product issue. After 2016 I issued two or more a year for destroying bandwidth on servers, cratering ESXi hosts with disk read/write issues, BSODs... all because some bean-counting ####er did not want to invest earlier and take the time to do it right. What really chaps my balls is that the execs who refused to invest in the future earlier all sold their stock and then got new Broadcom stock when Symantec was sold off in 2019... they got rich. 21,000 people got a box full of their belongings.
 
For a long time, over 80% of processors sold were Intel's. This means that all the evolution of technology (CAD design software, scientific software, commercial software, etc.) was built on Intel products. In other words, the fact that today we have advanced machines, advanced medicines, advanced processing methods, advanced cars, etc., became possible because Intel was making processors.

That's so true.

E.g., I have never, ever owned an AMD CPU. Never. All the CPUs I have ever owned, from the Pentium II days onward, have been Intel. (I did own ATI and AMD GPUs, though.)

PII->PIII->P4->Core2->4th Gen Core->6th gen Core

Applying here the "MEDIOCRITY PRINCIPLE"

https://en.wikipedia.org/wiki/Mediocrity_principle

...famously applied to argue against the "RARE EARTH" hypothesis, we get a huge global sample of Intel CPU owners, and thus an incredible amount of profit/revenue that Intel squandered and failed to capitalize on.

Thus we have the Chinese as our overlords today: Huang & L. Su (including the likes of the clown Chinese puppet Kim) and, by extension, their allies in the former USSR.

Monumental American (and European and British) fail.
 
If Intel wasn't interested in GPUs, it's a good thing it didn't buy Nvidia.
Companies like Nvidia are built by people who care about what they do.
Do you know how many companies had a chance to grow like Nvidia?
Hundreds. These things don't grow on trees.
Intel would have just turned Nvidia into what Intel is today.
 
... In other words, the fact that today we have advanced machines, advanced medicines, advanced processing methods, advanced cars, etc., became possible because Intel was making processors.
That wasn't because of x86 nor of Intel; that was just natural advancement. There wasn't anything redeeming about x86 at the outset, nothing revolutionary or unique. x86's dominance simply became self-reinforcing. Exactly as described in the article: "... a plant known for releasing toxins to hinder the growth of competing plants around it."

Nevertheless, many people struggle to grasp the potential of emerging technologies. Jensen Huang of Nvidia, for example, saw AI before it even existed and built it practically from nothing, while many people today see AI already up and running and still don't understand it.
Nvidia just filled the existing demand. CUDA grew to fill those uses. You don't have to see anything to do that, just be responsive to customers.

AI still has to prove itself, btw. It's horribly power intensive with little to show for it.
 
That wasn't because of x86 nor of Intel; that was just natural advancement. There wasn't anything redeeming about x86 at the outset, nothing revolutionary or unique. x86's dominance simply became self-reinforcing. Exactly as described in the article: "... a plant known for releasing toxins to hinder the growth of competing plants around it."


Nvidia just filled the existing demand. CUDA grew to fill those uses. You don't have to see anything to do that, just be responsive to customers.

AI still has to prove itself, btw. It's horribly power intensive with little to show for it.
I gave an AI (a Qwen 2.5 32B LLM running locally on my PC) only your post and asked it to respond, and this is what it wrote (a rough sketch of how such a local run works follows the quoted reply):

“user556 raises some interesting points about the development and dominance of certain technologies, particularly focusing on Intel's x86 architecture and Nvidia's role in AI. Here’s a structured response that addresses the key points:

1. **Technological Advancement and Intel's Role**: While it's true that technological advancement is a natural progression driven by human ingenuity and market demand, specific companies like Intel have played significant roles in shaping the trajectory of computing technology. The x86 architecture, developed by Intel, became dominant not just due to its intrinsic qualities but also because of Intel's aggressive marketing, ecosystem development, and strategic partnerships. This dominance reinforced itself over time, making it difficult for alternative architectures to gain traction.

2. **Nvidia and AI**: User556 mentions Nvidia’s role in AI, suggesting that the company filled an existing demand rather than creating a new market from scratch. While this is partly true, Nvidia's CUDA platform did indeed pioneer the use of GPUs for general computing tasks, which was revolutionary at the time. This innovation opened up new possibilities for parallel processing and significantly accelerated the development of AI technologies.

3. **AI's Current State**: The comment about AI needing to prove itself is valid. While AI has made significant strides in recent years, it still faces challenges such as high power consumption and limitations in certain applications. However, ongoing research and development are continuously addressing these issues, and AI is already proving its value in various sectors like healthcare, finance, and autonomous vehicles.

4. **Technological Dominance and Market Dynamics**: The analogy of a plant releasing toxins to hinder competitors is an interesting metaphor for how dominant technologies can create barriers to entry for new innovative players. This can happen through various means such as proprietary standards, patents, ecosystem lock-in, and economies of scale. However, it's also important to recognize that technological evolution often leads to the emergence of new standards and innovations that can disrupt existing dominance.

In summary, while user556’s points highlight the complex dynamics of technological advancement and market dominance, it's also important to acknowledge the significant contributions of companies like Intel and Nvidia in shaping the landscape of computing and AI. The interplay between innovation, market demand, and strategic business decisions continues to drive technological progress.”
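
For anyone curious how a run like that looks in practice, here is a minimal local-inference sketch. It assumes the Hugging Face transformers stack and the Qwen/Qwen2.5-32B-Instruct checkpoint; the model id, dtype, and generation settings are assumptions, not necessarily what the poster actually used.

```python
# Minimal local-inference sketch; assumes enough GPU memory (or a quantized variant of the model).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-32B-Instruct"  # assumed checkpoint, not confirmed by the poster
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

forum_post = "That wasn't because of x86 nor of Intel ..."  # the post being responded to
messages = [{"role": "user", "content": forum_post}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```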

When in the past have we had a synthetic intelligence that even comes close? Let's say it has an IQ of at least 120 and 100 times the knowledge of an above-average human. If you want a person with that kind of intelligence, you have to find a naturally gifted person (1% of the population), supply them with energy for 30-40 years, supply them with educational material, and at the end they will expect to be paid to give you an (inferior) answer. If you go out and ask 100 random people for a coherent analysis of this topic, 99 of them won't know how to answer, and the younger ones won't even know what you are talking about. Is natural intelligence horribly energy intensive?

Recently, research on neural networks has focused on replacing the quadratic attention computation inside them with a linear one, and on improving the linear version by augmenting it with simple perceptron-style networks, so that models become even cheaper to run (a rough sketch of the quadratic-vs-linear idea is below).
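
Roughly, the quadratic-vs-linear point is this: standard transformer attention builds an n x n score matrix over a sequence of length n, while kernelized "linear attention" variants reassociate the computation so that matrix is never formed. Below is a hedged PyTorch sketch assuming the elu+1 feature map used in the linear-attention literature; the function names are illustrative, not from any particular library.

```python
import torch
import torch.nn.functional as F

def softmax_attention(q, k, v):
    # Standard attention: forming the n x n score matrix makes the cost quadratic in sequence length n.
    scores = (q @ k.transpose(-2, -1)) / (q.shape[-1] ** 0.5)
    return torch.softmax(scores, dim=-1) @ v

def linear_attention(q, k, v, eps=1e-6):
    # Kernelized "linear" attention: with phi(x) = elu(x) + 1, computing
    # phi(Q) @ (phi(K)^T @ V) never materializes the n x n matrix, so cost grows linearly in n.
    phi_q, phi_k = F.elu(q) + 1, F.elu(k) + 1
    kv = phi_k.transpose(-2, -1) @ v                                      # d x d summary, independent of n
    z = phi_q @ phi_k.sum(dim=-2, keepdim=True).transpose(-2, -1) + eps   # per-row normalizer
    return (phi_q @ kv) / z

# Toy comparison: identical shapes, very different scaling in n.
n, d = 1024, 64
q, k, v = (torch.randn(n, d) for _ in range(3))
out_quadratic = softmax_attention(q, k, v)   # builds a 1024 x 1024 score matrix
out_linear = linear_attention(q, k, v)       # never does
```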
 
There are many what-ifs in life, and when they concern missed opportunities that led to success, there are plenty of would-be smart guys in hindsight.

But as Alex Honnold once said: as long as I make it to the top untethered, I am the hero worshiped by almost everyone; but when I fall, everybody will say I was irresponsible, stupid, and crazy.

Everybody is a colonel after the battle.
 
IIRC the rumor was that it was more of a merger than a buyout, and the breaking point was Jensen demanding to become CEO.

It's odd that back then Intel could have bought Nvidia with little effort, yet Nvidia buying Intel nowadays is next to impossible even though Nvidia has the pocket money to spare.

BTW, if memory serves me well, AMD also tried to buy Nvidia before it went for ATI.

Actually, Nvidia could buy Intel, Arm, and AMD if it were allowed to.
 