NVIDIA's Project Denver (ARM-based CPU)

Pressure

Veteran
Interesting, but in my opinion they really didn't have any choice. From Intel, which closed the door on the lucrative chipset and integrated-graphics market (and whose own delayed venture into the GPU market, Larrabee, went nowhere), to AMD, which purchased ATI, NVIDIA was left a sitting duck in a market that focuses more and more on tight integration of CPU, GPU and essential I/O.

No details so far, but it has a strong "Fusion" feel to it, although they seem to be aiming at the integrated market for starters: tablets, phones and ultra-portables (even though they say it is meant for the desktop as well).

I suppose we have reached a point where performance is enough for most mundane things.

Press Release:

NVIDIA Announces "Project Denver" to Build Custom CPU Cores Based on ARM Architecture, Targeting Personal Computers to Supercomputers

NVIDIA Licenses ARM Architecture to Build Next-Generation Processors That Add a CPU to the GPU

LAS VEGAS, NV -- (Marketwire) -- 01/05/2011 -- CES 2011 -- NVIDIA announced today that it plans to build high-performance ARM® based CPU cores, designed to support future products ranging from personal computers and servers to workstations and supercomputers.

Known under the internal codename "Project Denver," this initiative features an NVIDIA® CPU running the ARM instruction set, which will be fully integrated on the same chip as the NVIDIA GPU.

This new processor stems from a strategic partnership, also announced today, in which NVIDIA has obtained rights to develop its own high performance CPU cores based on ARM's future processor architecture. In addition, NVIDIA licensed ARM's current Cortex™-A15 processor for its future-generation Tegra® mobile processors.

"ARM is the fastest-growing CPU architecture in history," said Jen-Hsun Huang, president and chief executive officer of NVIDIA. "This marks the beginning of the Internet Everywhere era, where every device provides instant access to the Internet, using advanced CPU cores and rich operating systems.

"ARM's pervasiveness and open business model make it the perfect architecture for this new era. With Project Denver, we are designing a high-performing ARM CPU core in combination with our massively parallel GPU cores to create a new class of processor," he said.

Warren East, ARM chief executive officer said, "NVIDIA is a key partner for ARM and this announcement shows the potential that partnership enables. With this architecture license, NVIDIA will be at the forefront of next generation SoC design, enabling the Internet Everywhere era to become a reality."

About NVIDIA
NVIDIA (NASDAQ: NVDA) awakened the world to the power of computer graphics when it invented the GPU in 1999. Since then, it has consistently set new standards in visual computing with breathtaking, interactive graphics available on devices ranging from tablets and portable media players to notebooks and workstations. NVIDIA's expertise in programmable GPUs has led to breakthroughs in parallel processing which make supercomputing inexpensive and widely accessible. The Company holds more than 1,600 patents worldwide, including ones covering designs and insights that are essential to modern computing. For more information, see www.nvidia.com.
 
We're getting to the point where ever-increasing processor performance is of little consequence. Sandy Bridge's main claim to fame is its integration. The improvements in per-core execution speed may be significant, but they probably won't be the selling feature for the majority of people.

Microsoft is getting pretty smart by trying to make Windows and its applications architecture-agnostic. This lets them escape the PC model in the future if they so choose.
 
Performance will matter a great deal once speech recognition and realtime voice translation type apps are built. I just worry that the gap in processing power between what we have now and what you need for the next killer app is a large one :?:
 
Performance will matter a great deal once speech recognition and realtime voice translation type apps are built. I just worry that the gap in processing power between what we have now and what you need for the next killer app is a large one :?:

Off-topic, but have you seen the demos of Google's speech-to-text on Android? It's entirely cloud-based; after all, speech and text are both easily transported.
 
Performance will matter a great deal once speech recognition and realtime voice translation type apps are built. I just worry that the gap in processing power between what we have now and what you need for the next killer app is a large one :?:

I'm not sure how computationally intensive those are. Honestly, I think that kind of stuff is mostly I/O- and memory-bound, due to the large data structures. A dual-core 1GHz A9-class chip can probably handle such things without breaking a sweat.
 
I'm not sure how computationally intensive those are.

Understanding natural language can't be that easy, especially with tons of background noise, but I would love to be wrong. Doing it in the cloud would for sure be much easier, but I get a weird feeling knowing that without a good cellular signal my phone becomes a useless brick. However, that's probably something I need to get used to, since it seems like the future ;)

One day everything may be a service. It seems kinda silly to spend $400 on a graphics card that just sits idle 98% of the time. With a cloud setup, taking peak usage into account, I'm sure somewhere close to 50% utilization could be doable.
 
Performance will matter a great deal once speech recognition and realtime voice translation type apps are built. I just worry that the gap in processing power between what we have now and what you need for the next killer app is a large one :?:
I agree completely and think the gap exists.
 
Understanding natural language can't be that easy, especially with tons of background noise, but I would love to be wrong. Doing it in the cloud would for sure be much easier, but I get a weird feeling knowing that without a good cellular signal my phone becomes a useless brick... but that's probably something I need to get used to, since it seems like the future ;)

One day everything may be a service. It seems kinda silly to spend $400 on a graphics card that just sits idle 98% of the time. With a cloud setup, taking peak usage into account, I'm sure somewhere close to 50% utilization could be doable.

Oh, it's not easy, but I was speaking of computationally intensive as opposed to memory-bound, comparatively speaking, in typical mobile SoC architectures.
 
Reading about Nvidia's ARM plans, and Microsoft's Windows 8 announcement, I'm wondering if we're not standing on the threshold of a gigantic paradigm shift in the realm of personal computing - are we even aware of the immense implications for the future this could have?

There's truly been no previous greater threat to Intel's position of absolute domination than Microsoft going ARM, coupled with the rise of (reasonably) powerful ARM chips.

This could be huge, huge, huge. Apple showed not just once, but thrice, that you CAN in fact switch basic hardware architecture, and do so quite successfully and painlessly! If the suits over in Satan Clara don't have the jitters already, they will soon, I bet. :p

Personally I'm quite ready and willing to say FU to x86. It's lived long past its usefulness; the basic PC architecture is archaic and full of old crap that's dragging it down. Even things like x86's little-endian binary format, its stack-based FPU and so on just show what a crazy fucked-up old system it really is. No, a clean restart would be much preferable, and an end to Intel's domination of the semiconductor industry would be a great boon to us all too, I bet.

Intel once tried to kill off x86 with Itanic - this was in retrospect a bad move. However, Intel's reaction, to bet the farm on x86 and put it into everything from supercomputers to PCs and graphics cards, down to portables and cell phones, really isn't much better.
 
I'm sure the first targeted use for this is in consoles. A bunch of high-powered ARM CPUs combined with a huge GPU would be all you need for a fantastic next-gen console.

The one thing I don't know about is the effect of having super fast CPU->GPU communication. I read a bit on smallLUXGPU development and it seemed like one of the big bottlenecks was getting data back and forth to the GPU.
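This isn't smallLuxGPU's actual code, just a minimal CUDA sketch of why that hurts: with a discrete card, the data has to cross PCIe in both directions, and when the kernel itself is cheap the copies easily dominate (the buffer size and kernel here are made-up illustration values).

```cuda
// Hypothetical illustration only: time a trivial kernel against the
// host<->device copies it needs, to show how PCIe transfers can dominate.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;   // trivially cheap per element
}

int main() {
    const int n = 1 << 24;                        // ~16M floats, ~64 MB
    const size_t bytes = n * sizeof(float);
    float *h = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) h[i] = 1.0f;

    float *d;
    cudaMalloc((void **)&d, bytes);

    cudaEvent_t t0, t1, t2, t3;
    cudaEventCreate(&t0); cudaEventCreate(&t1);
    cudaEventCreate(&t2); cudaEventCreate(&t3);

    cudaEventRecord(t0);
    cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);   // upload over PCIe
    cudaEventRecord(t1);
    scale<<<(n + 255) / 256, 256>>>(d, n);              // the actual "work"
    cudaEventRecord(t2);
    cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);    // download over PCIe
    cudaEventRecord(t3);
    cudaEventSynchronize(t3);

    float up, kern, down;
    cudaEventElapsedTime(&up, t0, t1);
    cudaEventElapsedTime(&kern, t1, t2);
    cudaEventElapsedTime(&down, t2, t3);
    printf("upload %.2f ms, kernel %.2f ms, download %.2f ms\n", up, kern, down);

    cudaFree(d);
    free(h);
    return 0;
}
```

On a typical PCIe 2.0 x16 link the copies run at a few GB/s, which is exactly the kind of gap an on-die CPU+GPU sharing one memory controller could shrink.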

However...everything is getting super fragmented right now. Are game developers really going to program for...

- PS3: Cell processor
- Xbox 360: PowerPC
- NVIDIA Maxwell/Kepler
- Fusion/Sandy Bridge (i.e. normal x86 + GPU)

Seems like something radical needs to happen in the software space to make this less of a headache.
 
Reading about Nvidia's ARM plans, and Microsoft's Windows 8 announcement, I'm wondering if we're not standing on the threshold of a gigantic paradigm shift in the realm of personal computing - are we even aware of the immense implications for the future this could have?

More competition is always good. That's why it's exciting to foresee ARM desktops, along with the continuing decrease in the importance of the ISA.

I wouldn't try to read anything more into it beyond these points - I guess no immense implications for me :D
 
Satan Clara? :p

Anyway, Apple's example doesn't really compare here. There are some 15 years of apps since Windows 95 that can still run on modern OSes, and far more users of said apps than Apple ever had to deal with.

I'm not saying it's impossible. I'm just saying that just because Apple did it doesn't mean that Microsoft can do it.

That said, I wouldn't mind if a real third player entered the PC CPU market... We need more competition. Intel pretty much keeps AMD around just to avoid antitrust lawsuits, and I doubt they'll tolerate AMD for much longer. They haven't been a competitor for four and a half years. I'm hoping Fusion changes that in some way, but I doubt it. Somebody with an ARM license doing some serious damage to Intel would be good.

Myself, I don't give a damn about x86/ARM/etc. I just want CPUs to get better, and as long as AMD continues its slow slide into nothingness, the future of good CPUs is in danger. If someone can step up and take some of the weight off of AMD's shoulders, this would be good.

Edit: This is a reply to Grall.
 
There's truly been no previous greater threat to Intel's position of absolute domination than Microsoft going ARM, coupled with the rise of (reasonably) powerful ARM chips.

Ummm... I'm guessing here that you don't remember the early 1990s, then?
 
Myself, I don't give a damn about x86/ARM/etc. I just want CPUs to get better, and as long as AMD continues its slow slide into nothingness, the future of good CPUs is in danger. If someone can step up and take some of the weight off of AMD's shoulders, this would be good.

Edit: This is a reply to Grall.

I agree with this 100%. I actually have AMD CPUs in my desktop and HTPC, but my laptops are all Intel. It is getting difficult to come up with reasons to buy an AMD CPU. The X6 I have now has at least proven its value in modelling workloads where I can keep the cores busy, but the new Intel chips appear to be about as fast even with four cores.

It is funny that the underdog is now NVIDIA; what will people do? Their heads will explode, most likely :)
 
The one thing I don't know about is the effect of having super fast CPU->GPU communication. I read a bit on smallLUXGPU development and it seemed like one of the big bottlenecks was getting data back and forth to the GPU.
Yeah, you have a wtfpwn I/O interface in the PS3 (some 30-40ish GB/s or whatever on paper, way faster than actual framebuffer bandwidth anyway lol), almost all wasted because the clunky GPU Sony picked basically can't do GPGPU calculations...
 
Satan Clara? :p
Hah, I don't mean anything by it. It's just what that guy Mike whatsisface, who fronted The Inquirer before he got ousted, used to call Intel; it popped into my head as I was typing... :LOL: I have all-Intel chips in my PCs right now: a P4, a Core 2 Quad and an i7...

Anyway, Apple's example doesn't really compare here. There are some 15 years of apps since Windows 95 that can still run on modern OSes
Yeah, but how many of those do people actually RUN, and how many of the really old ones still in use actually need cutting-edge performance? The hugely vast majority of the gigantic backlog of all x86 software ever made is obsolete ancient shit that nobody cares about anymore. Take any of the tens of thousands of DOS, Win3.x and 9x apps in existence: you can't friggin' run 'em nowadays, because modern versions of Windows don't support 16-bit software. And that's a good thing too.

You can run 'em through DOSBox today, at a fraction of the speed of a modern system, but in most cases that would be quite enough.

I'm not saying it's impossible. I'm just saying that just because Apple did it doesn't mean that Microsoft can do it.
They don't need to get everything working. It's like making omelettes: you gotta break some eggs. And if your stuff got broken, then either don't upgrade your hardware, so you can keep running your old voodoo stuff, or else bring your application into THIS century, and then it'll work on MS's new ARM-based Windows 8 just peachy... :devilish:

Somebody with an ARM license doing some serious damage to Intel would be good.
Nothing beats good ol' competition to bring good products onto the market. Heck, this theory proves itself over and over in the tech industry, in case anyone still doubts it... Just look at Intel P4 -> AMD Opteron -> Intel Core, IE6 -> Mozilla -> IE7, or GeForce FX -> ATI 9700 -> GeForce 6, and so on.

Myself, I don't give a damn about x86/ARM/etc.
I prefer when good solutions succeed over bad ones. PCs are full of bad solutions that are merely "good enough" to get the job done (any modern x86 CPU is a magnificent example of that, brute-forcing performance out of a shitty base architecture).

A PC "franchise reset" using ARM would be wonderful IMO. Bring in J.J. Abrams, sprinkle lens flares liberally all over it...success! :LOL:

Ummm.... I'm guessing here that you don't remember the earty 1990's then?
Hm, what am I supposed to remember? OS/2? :LOL: Yeah, big threat THAT was... Did it crack 5% market share at any point during its existence? Maybe it did, but it still never left any lasting impression. The sad fact is, DOS and Win3.x ruled during the early 90s, as magnificently crap as they both were.
 
Pretty sure the 1990s reference was to AMD owning Intel. It was actually from about '95ish till 2001ish, as I recall. The fastest x86s were overclocked, pencil-modded Thunderbirds and Athlons.
 
Pretty sure the 1990s reference was to AMD owning Intel. It was actually from about '95ish till 2001ish, as I recall. The fastest x86s were overclocked, pencil-modded Thunderbirds and Athlons.

Probably more a reference to WNT and all the various architectures it ran on back in the day (MIPS, Alpha, etc.).
 
So I wonder if NVIDIA might do real Fusion before even AMD does? I wonder whether the GPU core will do CUDA and/or OpenCL, and whether the CPU and GPU will share an address space.
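Nothing concrete is known about Denver's memory model yet, but as a rough sketch of what a shared address space could buy, CUDA's existing mapped ("zero-copy") host memory already lets a kernel read and write a host buffer directly, with no explicit cudaMemcpy anywhere; the values and kernel below are pure illustration, not anything NVIDIA has announced.

```cuda
// Hypothetical illustration only: mapped ("zero-copy") host memory as a
// stand-in for a shared CPU/GPU address space. The kernel dereferences a
// pointer into host RAM directly; no cudaMemcpy is involved.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void inc(int *p, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] += 1;         // reads and writes land in host memory
}

int main() {
    cudaSetDeviceFlags(cudaDeviceMapHost);        // allow mapped host allocations

    const int n = 1024;
    int *h, *d;
    cudaHostAlloc((void **)&h, n * sizeof(int), cudaHostAllocMapped);
    for (int i = 0; i < n; ++i) h[i] = i;

    cudaHostGetDevicePointer((void **)&d, h, 0);  // device-visible alias of h
    inc<<<(n + 255) / 256, 256>>>(d, n);
    cudaDeviceSynchronize();

    printf("h[0] = %d, h[%d] = %d\n", h[0], n - 1, h[n - 1]);  // prints 1 and 1024
    cudaFreeHost(h);
    return 0;
}
```

On today's discrete cards those accesses still trickle over PCIe; the interesting part of a Denver-style design would be that CPU and GPU hit the same DRAM, so this kind of pointer sharing would come without the bus penalty.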
 