Nvidia CEO believes AMD has suffered a ‘great loss’ and Intel is playing GPU catch-up
Chief exec is in a buoyant mood following strong Q3 results
Nvidia just posted some impressive financial results, and the company is certainly in a buoyant mood, with the CEO taking time to share his thoughts on the recently revealed AMD and Intel partnership on laptop CPUs, as well as the defection of a key executive from the former to the latter.
The Q3 fiscal results were certainly strong, with Nvidia notching up a record revenue of $2.64 billion (around £2 billion, AU$3.45 billion), an increase of a third compared to a year ago. The firm made big gains with data centers, but surprisingly also saw a big jump in gaming revenue, up 25% year-on-year – flying in the face of analysts’ expectations.
After boasting of bulging coffers, chief executive Jensen Huang spoke on the subject of Raja Koduri leaving AMD to become Intel’s senior VP of the Core and Visual Computing Group, with a remit to deliver ‘high-end discrete graphics’. Yes, discrete graphics solutions, not integrated (on-processor) affairs.
As Tom’s Hardware reports, Huang commented: “Yeah, there's a lot of news out there... first of all, Raja leaving AMD is a great loss for AMD, and it's a recognition by Intel probably that the GPU is just incredibly important now.
“The modern GPU is not a graphics accelerator, we just left the letter ‘G’ in there, but these processors are domain-specific parallel accelerators, and they are enormously complex, they are the most complex processors built by anybody on the planet today.”
He goes on to point out that this is exactly why “every major server around the world has adopted Nvidia GPUs.”
Graphic detail
So, there are a couple of things to unpack here. First of all, it’s hardly surprising that the Nvidia CEO wants to paint Koduri’s departure as a ‘great loss’ for AMD and its graphics cards.
Also, on the Intel side of the equation, Huang frames the company’s need to drive forward with graphics processors as a critical one. And this likely reflects the fact that the discrete GPUs Intel mentioned in its press release welcoming Koduri into the fold are more about targeting heavyweight arenas such as AI and machine learning than anything to do with gaming.
In other words, Intel doing discrete graphics is certainly big news that will make big waves, but not in terms of consumer graphics cards.
Note that Intel has tried its hand at discrete graphics cards in the past – or perhaps ‘had brief flirtations’ would be a better way of putting it – but you get the sense that this time around the push is truly serious.
Furthermore, Huang took time to comment on AMD and Intel teaming up to make laptop processors with integrated AMD graphics, news which broke earlier this week.
His somewhat rambling comment on the matter was: “And lastly, with respect to the chip that they [Intel and AMD] built together, I think it goes without saying, now that the energy efficiency of Pascal GeForce and the Max-Q design technology and all of the software we have created has really set a new design point for the industry, it is now possible to build a state of the art gaming notebook with the most leading edge GeForce processors, and we want to deliver gaming experiences many times that of a console in 4K and have that be in a laptop that is 18mm thin.
“The combination of Pascal and Max-Q has really raised the bar, and that's really the essence of it.”
In short: Nvidia’s rivals need to do something, because the firm’s latest advances with Max-Q are pushing the notebook graphics envelope that hard.
Strong words all round then, but given its current form, Nvidia is unlikely to be short of confidence. Particularly when looking to a future in which graphics processors are key to the likes of supercomputers and cutting-edge fields such as AI and machine learning.
- These are the best graphics cards you can buy in 2017
Darren is a freelancer writing news and features for TechRadar (and occasionally T3) across a broad range of computing topics including CPUs, GPUs, various other hardware, VPNs, antivirus and more. He has written about tech for the best part of three decades, and writes books in his spare time (his debut novel - 'I Know What You Did Last Supper' - was published by Hachette UK in 2013).