[Beyond3D Article] Intel presentation reveals the future of the CPU-GPU war

Personally I think nVidia is being the arrogant one here, saying that Larrabee is "the GPU that CPU engineers would design" and playing like there is nothing to fear from Intel.

Firstly, as far as I know, there is no specific CPU engineer or GPU engineer education at university... It's not like CPU engineers would be too dense to be able to figure out how to create a good GPU, or vice versa (in fact, I'm quite sure that both nVidia and ATi have a lot of engineers that started their careers on CPUs).

Secondly, they are working with some big names from software rendering, like the Pixomatic team, which includes Michael Abrash, who worked on the legendary software renderers at id Software.

Thirdly, even though Intel has never designed a high-end GPU, their IGPs do contain state-of-the-art hardware... So Intel does have some engineers around who have developed various generations of GPUs before and know how to build a DX10 GPU, with efficient texture filtering, caching and thread management. They're not starting from scratch. The X3000/X4000 series of IGPs is actually remarkably similar to the G80 architecture, just on a much smaller scale.

Now, both the hardware and the software guys at Intel are experts in their field... If they can just work together, so that the software people can explain to the hardware people what kind of instructions they want, and the hardware people can then explain how to take advantage of their design in software, I think this could work out pretty well.

In the meantime nVidia knows that they have to make their GPU design more generic and more programmable with every generation, so they wish they had more CPU engineers, because they will eventually have to design a GPU the way CPU designers would.
 
i am not sure what you are asking for, as it is subject to interpretation, imo

it is popular because i started it at ATF a long time ago. I still say that nVidia and intel were on the brink of some real collaboration - way beyond their "necessary" partnership with MBs; one that they must not tamper with.

Something broke down and i would say it is "ego and arrogance" on intel's part. I perceive that intel wants "intel inside" literally *everything* that has a CPU - from alarm clocks and refrigerators to servers, and PC gaming - nVidia's prime territory.

and Jen is a hell of a lot more visionary than Intel's senior board ever was, never mind the "yes men" they have become as they congratulate themselves on Past Successes.

Jen sees his GT200 - cheap for them to produce - in ALL graphics sectors to kill R700, and he will also take them on with a "shortly to be announced" cool motherboard that is really flexible, to compete with AMD's new ones, and a stable OC'er like intel's.

He also has CUDA, which will be a barracuda tearing at intel as it speeds up many "things not currently associated with GPUs" by five times! And this is all hidden in plain sight on the internet.

Do you even know what the *T* in GT is for - what the naming convention of this new architecture is? it is so obvious .. think 'scientists' unless you already know

And in fact, intel is "fixed" like a huge windmill with their "RT posturing", and i think nVidia has a can of gas and some fiery arrows prepared for them. Intel is using their "2nd string PR team" - i recognize them from P4 NetBust days! .. and i suffered with intel all the way through with my "EE" until my current e4300, which was a "side" project for them.

intel screwed up once before with this same team and my prediction is that nVidia is waiting to black both their eyes. Something AMD was too arrogant to do while they were briefly on top.

Larrabee will not be so impressive, i think.
That is my analysis summary to date.

Please let there be some tongue-in-cheek involved with the writing of that post. Pretty please, pandas around the world would be sad if that were not the case:(
 
Personally I think nVidia is being the arrogant one here, saying that Larrabee is "the GPU that CPU engineers would design" and playing like there is nothing to fear from Intel.

Considering the source of that quote, I would be careful assuming that public statements are accurate indications of internal thinking.

Right now Larrabee is *the* big story that's grabbing all the headlines and being hyped from here to the moon. I think this interview with DK and Jen-Hsun's recent diatribe are an attempt to bring a little reality back into the mainstream conversation. Larrabee is being touted as the second coming for the graphics world, but Nvidia is pointing out that Intel is treading on their turf and that they're pretty damn good at this game, thank you very much.

But that doesn't mean Nvidia isn't taking Intel seriously. They didn't get where they are by being that stupid. They're just playing their cards a bit closer to their chest -- a lot like they did with G80. We'll find out how good that hand really is in a year or two.

(There, I think I've mixed enough metaphors for one post.)
 
Please let there be some tongue-in-cheek involved with the writing of that post. Pretty please, pandas around the world would be sad if that were not the case:(

i always loved panda; barbecued with just a touch of ginger and bird-pepper
- btw your old boss Melkie sends regards and evidently is not so happy with your mishandling of His Ring; he is working on an improved one and will see you shortly, i think.

Yes, it is a 'summary' of what i believe - more or less - with no supporting reasoning whatsoever attached. What part of it would you like me to explain?


But that doesn't mean Nvidia isn't taking Intel seriously. They didn't get where they are by being that stupid. They're just playing their cards a bit closer to their chest -- a lot like they did with G80. We'll find out how good that hand really is in a year or two.

This year. Anytime Jen is THIS confident, you just know he IS taking intel seriously and is more than prepared for them. i believe he has several "weapons" to bring down Goliath's PR; he knows how to expose Intel's [arrogant] weak spot, and has a CUDA stone which is a talisman, if what i am reading about it is just half-true.

EDIT: i have been watching this for a long time. i am not saying Larrabee won't be good - it will be; it is an octo-beast and quite powerful. But i perceive that Intel is not so happy with it - it does have to drag x86 overhead around, and i think that is a bigger penalty than Intel is letting on.

So, when that happens - as with P4 - the "distraction dance" of their tired old PR team is brought out and we hear things like "NetBust will get to 10GHz easily" - OR - "Larrabee and RT will revolutionize gaming" - which i realize they are also not quite saying, but the implication is certainly there; i see Intel's smoke and mirrors to make a so-so launch seem more exciting than it really is.

That is all; it is intel's modus operandi, and Jen is a little bit smarter than i am.
[my tongue is stuck *firmly* in my cheek now; you may get to know me eventually, if you are patient (and i will tell you now, it is not worth it)]
 
That's an incredibly popular point of view. Why do you think that?

Jawed

Cause ATI and Nvidia use a lot of obfuscation to make everything seem complex? In general it's been "he who throws the most resources at the problem wins", slightly influenced by who has the best drivers, slightly influenced by who pays the developers more.

Historically it's been an embarrassingly parallel market with razor-thin margins. As the competition has died out, the margins have improved for both players, which attracted the interest of both AMD and Intel because the margins started to look attractive. That, coupled with the move from GPUs being embarrassingly parallel to actually being computationally driven with much more flexibility, made it make sense for them to get involved in the market.

GPUs are moving to the point of "put lots of ALUs with limited control on chip", "design control and memory architectures to extract maximal performance on a set of given and future workloads" and "write drivers that make it work". These are all tasks that Intel/AMD have been doing forever.

Aaron Spink
speaking for myself inc.
 
apoppin, you stick out like a sore thumb on this forum. I've been a lurker of this forum for quite some time, and I just had to post after chuckling at your posts.

Do you even know what CUDA is?

If you praise CUDA as the be-all and end-all that will finish Intel, then you should at least have a sound understanding of what it is.
 
apoppin, you stick out like a sore thumb on this forum. I've been a lurker of this forum for quite some time, and I just had to post after chuckling at your posts.

Do you even know what CUDA is?

If you praise CUDA as the be-all and end-all that will finish Intel, then you should at least have a sound understanding of what it is.

I stand out everywhere; I am quite used to it by now. As long as I am neither impolite nor insane, I intend to challenge your own beliefs a bit. I have a bit of experience at “that other noisy forum” and I bring perhaps another PoV; you can feel free to expand my own views, for they are flexible and my analysis is adaptive.

I see I have brought you out! Welcome to B3D – if I am even allowed to stay. A sore thumb needs attention – not amputation.

i do understand CUDA .. i have been privileged with a vision that says CUDA will be used to do things "not normally associated with GPU"

all i can say is "wait and See" . . . i can link you to what i can show you
 
I stand out everywhere; I am quite used to it by now. As long as I am neither impolite nor insane, I intend to challenge your own beliefs a bit. I have a bit of experience at “that other noisy forum” and I bring perhaps another PoV; you can feel free to expand my own views, for they are flexible and my analysis is adaptive.

I see I have brought you out! Welcome to B3D – if I am even allowed to stay. A sore thumb needs attention – not amputation.

i do understand CUDA .. i have been privileged with a vision that says CUDA will be used to do things "not normally associated with GPU"

all i can say is "wait and See" . . . i can link you to what i can show you

Thanks for the welcome.

I just can't shake the feeling that although you loudly put down Intel's PR efforts, you're all too diligent in following NVIDIA's PR team.

CUDA will not finish Intel, at all. It may hurt CPU manufacturers (not just Intel) in the supercomputing space, but it won't have much of an effect in the desktop or corporate server sector.
 
CUDA will not finish Intel, at all. It may hurt CPU manufacturers (not just Intel) in the supercomputing space, but it won't have much of an effect in the desktop or corporate server sector.
Actually that's not completely true. It's pretty damn clear from Jen-Hsun's comments at Analyst Day that he wants CUDA to be a major factor in the consumer space - and my info indicates that they have already got design wins thanks to that focus.

The point remains, though, that I am massively unconvinced that they know how to win this battle. They've started focusing on this just now, presumably in great part because Jen-Hsun went 'omfg' at the H.264 Encoding presentation he received from those guys. They could have started 18 months ago - but they didn't get the importance of that back then, and focused exclusively on HPC. That lost them valuable time.

So what's the right strategy? In-house R&D. And just as opening up CUDA (making parts open-source etc.) is the right strategy for HPC, it's NOT the right strategy for the consumer space, yet they'll likely do the former in a way that'll also result in the latter. So yeah - I really don't think they have a good grasp of that space right now, which means that until I see any evidence that they do, or I get to actually shout at them in person about how they don't have a clue, I won't be very confident in their prospects on that front. The key thing to understand is that it's *incredibly* time-sensitive, and if not done fast enough it risks being rather worthless.

And hello Wesker! :) Sorry for having to contradict you in your first post, but if that makes you feel better apoppin is indeed a bit crazy here (no offense intended!) ;) I mean, 'privileged with a vision'? errr... and metaphoric speech like that just doesn't work over multi-paragraph texts, especially not on the internet.

As for what apps are possible via CUDA - I think if you look at any CPU review testing multi-core applications, you'll find plenty of ideas. No, not every single one of those could be done on the GPU - but many, many could be, or at least be substantially accelerated instead of completely offloaded (which also adds a lot of value). If you want me to be a bit more precise here, just ask.
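Actually, to be slightly more precise right away: here's a minimal sketch of what "substantially accelerated instead of completely offloaded" can look like in CUDA. The kernel, the helper names and the split are purely hypothetical, just for illustration - the idea is simply to launch a kernel over part of the data, keep the CPU busy on the rest while the launch runs asynchronously, and only synchronise at the end.

#include <cuda_runtime.h>

// Toy per-element work, just to keep the sketch self-contained.
__global__ void process_kernel(float *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] = data[i] * 0.5f + 1.0f;
}

// Same toy work on the host side.
static float process_on_cpu(float v)
{
    return v * 0.5f + 1.0f;
}

// GPU chews on gpu_n elements already sitting in device memory while the
// CPU handles cpu_n elements in host memory; we only wait at the end.
void process_hybrid(float *d_head, int gpu_n, float *h_tail, int cpu_n)
{
    int threads = 256;
    int blocks = (gpu_n + threads - 1) / threads;

    // Kernel launches are asynchronous, so control returns to the CPU at once.
    process_kernel<<<blocks, threads>>>(d_head, gpu_n);

    // The CPU works on its share of the data in the meantime.
    for (int i = 0; i < cpu_n; ++i)
        h_tail[i] = process_on_cpu(h_tail[i]);

    // Block until the GPU portion is finished before using its results.
    cudaThreadSynchronize();  // called cudaDeviceSynchronize() in later CUDA versions
}

Nothing magic there - the point is simply that the GPU doesn't have to take over the whole job to add value; even a partial, overlapped offload buys you time the CPU would otherwise have spent alone.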
 
i do understand CUDA .. i have been privileged with a vision that says CUDA will be used to do things "not normally associated with GPU"

So you understand that in this case it's actually nVidia treading on Intel's turf, as aaronspink already pointed out.

Intel is just aiming to meet nVidia halfway. I don't see why they can't.
 
So you understand that in this case it's actually nVidia treading on Intel's turf, as aaronspink already pointed out.

Intel is just aiming to meet nVidia halfway. I don't see why they can't.
You betcha!

But my opinion is that intel can't meet nVidia even halfway! .. so i say "wait and see"
- i am not just impressed by what they did with CUDA this year, i am frankly blown away by it
- we will just have to agree to disagree until it becomes clear to all of us

apoppin is indeed a bit crazy here (no offense intended!) I mean, 'privileged with a vision'? errr... and metaphoric speech like that just doesn't work over multi-paragraph texts, especially not on the internet.

Dr Tim Leary was also crazy everywhere ... but then i'd say "visionary" like a fox; i need time to make a transition to B3D .. i am not so sure it will be granted to me here
-I will also work on improving my multi-paragraph text metaphors so as to add a bit more rationality to my insani .. err, vision.


But yes, nVidia IS treading on intel's territory - imo, deliberately - when Jen realized, as i do, that intel is all smoke and mirrors for this PR campaign.
And i believe they intend to focus all their weaponry on the DeathStar; and AMD will be their unwitting ally as the Rebels take on Intel's CPU sector. This is a multi-pronged campaign and we see but part of it.
Jen, a deep one, he is! i see the Force is strong with that one.
[sorry =P]
 
What exactly did Cuda do to blow you away?
It should be no surprise that a GPU is faster at some things than a CPU... After all, that's why we use a GPU for graphics and not the CPU, is it not?
It should also be no surprise that workloads that are similar to vertex/pixel processing, such as massively parallel linear algebra, are also well-suited to a GPU.
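(To illustrate, here's a minimal sketch of the kind of workload I mean - a SAXPY-style kernel, with made-up names and launch sizes rather than anything from a particular SDK sample. One thread per element, trivial control flow, no communication between threads: exactly the shape of work the hardware is built for.)

#include <cuda_runtime.h>

// One thread per element: y[i] = a * x[i] + y[i]
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

// Host side, error checking omitted for brevity.
void run_saxpy(int n, float a, const float *h_x, float *h_y)
{
    float *d_x, *d_y;
    cudaMalloc((void **)&d_x, n * sizeof(float));
    cudaMalloc((void **)&d_y, n * sizeof(float));
    cudaMemcpy(d_x, h_x, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_y, h_y, n * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, a, d_x, d_y);

    cudaMemcpy(h_y, d_y, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(d_x);
    cudaFree(d_y);
}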

However, when I look at the MarchingCubes example that's included with the Cuda SDK, and compare it to my own multithreaded CPU solution, apparently Cuda is not so hot at all tasks. Even on my dual-core processor, the CPU routine is still lots faster than the Cuda routine. So nVidia still has its work cut out for it on more general-purpose tasks.

I get the impression that you just fell for nVidia's Cuda smoke-and-mirrors instead.
 
They've started focusing on this just now, presumably in great part because Jen-Hsun went 'omfg' at the H.264 Encoding presentation he received from those guys.
Where can we see this in action, that's what I want to know. NVidia could make a lot of friends by getting h.264 encoding out there, unlike the useless crap that ATI tried a few years back.

Jawed
 
What exactly did Cuda do to blow you away?
It should be no surprise that a GPU is faster at some things than a CPU... After all, that's why we use a GPU for graphics and not the CPU, is it not?
It should also be no surprise that workloads that are similar to vertex/pixel processing, such as massively parallel linear algebra, are also well-suited to a GPU.

However, when I look at the MarchingCubes example that's included with the Cuda SDK, and compare it to my own multithreaded CPU solution, apparently Cuda is not so hot at all tasks. Even on my dual-core processor, the CPU routine is still lots faster than the Cuda routine. So nVidia still has its work cut out for it on more general-purpose tasks.

I get the impression that you just fell for nVidia's Cuda smoke-and-mirrors instead.
possibly

i will fail in my analysis if i have insufficient data. Here is what we know:

Whenever - in the past - nVidia's CEO has fired a shot across his opponent's bow, he has given a bit of strategy away in his public address to his board and stockholders.

This time he is SO confident, i have reason to believe he is jumping up and down for Joy [inside] at what his engineers have brought him. Normally, i'd be a little depressed that an 11 billion dollar company was bearing down on me. Not Jen - he is Overconfident.

And i'll try and see what i can find for you on CUDA

CUDA is not nVidia's only weapon in this PR war with intel; i believe nVidia has a new secret weapon - but that is insane speculation from me; a prophecy, if you will =P
 
I recall a certain other company that was overly confident that their next CPU would demolish Intel's offerings - after all, those weren't even native quad-cores!
That same company is now struggling for survival.
 
I recall a certain other company that was overly confident that their next CPU would demolish Intel's offerings - after all, those weren't even native quad-cores!
That same company is now struggling for survival.

Yes, they copied Intel and arrogantly assumed intel wouldn't recover quickly. AMD lost a good deal of their enthusiast market as they tried to milk every nickel and dime from us, speed-bumping every 50MHz. That was shameful imo, and the chickens have come home to roost over that one. I think they will be OK now; that Phenom 9850 BE looks awesome for everything except highest-end gaming.
 
I recall a certain other company that was overly confident that their next CPU would demolish Intel's offerings - after all, those weren't even native quad-cores!
That same company is now struggling for survival.

Being confident in statements is hardly indicative of much; everybody does it. Or have we all forgotten the "must be smoking something hallucinogenic" gem that Jen himself put out prior to the FX debacle?

It's beyond silly to dismiss Intel from the get-go just because guys like Kirk or whoever come out and say that CPU engineers can only design CPUs, and that the art of building a GPU is some arcane, impenetrable secret available only to initiates. Even if that were true (it's not), Intel has enough resources to get enough magical GPU engineers and give them enough to work with in order to make themselves competitive... if they're actually interested in doing that.

Does that mean that Intel will steamroll over nV and friends? Hardly, but odds are that they can be very competitive with nV on their own turf, if they put enough money into it. The key here is knowing just how interested Intel is in this GPU business, and that's a major unknown at this point.
 
Anyway, I think apoppin needs to realise that he's now among people who actually develop with Cuda, some even for a living.
 
Nvidia is beating their chest pretty hard. Ever since the first word on Larrabee, they have been getting louder and louder. The David Kirk interview screams of chest beating.
 
It's beyond silly to dismiss Intel from the get-go just because guys like Kirk or whoever come out and say that CPU engineers can only design CPUs, and that the art of building a GPU is some arcane, impenetrable secret available only to initiates. Even if that were true (it's not), Intel has enough resources to get enough magical GPU engineers and give them enough to work with in order to make themselves competitive... if they're actually interested in doing that.

Does that mean that Intel will steamroll over nV and friends? Hardly, but odds are that they can be very competitive with nV on their own turf, if they put enough money into it. The key here is knowing just how interested Intel is in this GPU business, and that's a major unknown at this point.

Money isn't everything though. Corporate ethos is fairly important too. All the smart GPU guys in the world can't make a product a success if the rest of the organisation won't or can't let them.
 