[Beyond3D Article] Intel presentation reveals the future of the CPU-GPU war

So from my POV, the main risk is hardware engineers overestimating what software engineers can do and software engineers not taking certain hardware efficiency metrics into proper consideration until it's too late. Whether that will actually happen is anyone's guess, however.

Don't you think that there's maybe a worrying parallel here with Itanium, in the sense that it was a hardware design which relied heavily on the compiler writers to deliver on its promised performance?
 
Regarding density and performance - I am seriously expecting the 4Q09/1H10 Larrabee to be on 32nm. I said it before and I'll say it again: if it's 45nm, Intel needn't even bother releasing the thing; they'll just look dumb because they'll be at a noticeable process *disadvantage*, given what I've seen of TSMC's 40nm process so far (in terms of (perf/mm²)/$).
Intel is targeting Larrabee at about 150GB/s of bandwidth. Bearing in mind that a 2008 GPU such as GT200 is expected to have ~150GB/s, Larrabee will be an entire performance generation behind, if not more. Also, 32nm at Intel will probably be fully occupied pumping out x86 CPUs. Why would Intel divert its most advanced process away from x86?
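To pin down what a (perf/mm²)/$ comparison actually measures, here's a minimal sketch with purely made-up numbers (none of these figures are real Intel or TSMC data; they only illustrate how a denser but pricier process can still lose on this metric):

```python
def perf_per_area_per_dollar(perf, die_area_mm2, cost):
    """The (perf/mm^2)/$ metric discussed above: performance density
    normalized by manufacturing cost."""
    return (perf / die_area_mm2) / cost

# Hypothetical chip on a denser but more expensive process:
# same performance from a smaller die, at 1.3x the relative cost.
dense = perf_per_area_per_dollar(perf=100.0, die_area_mm2=250.0, cost=1.3)

# Hypothetical chip on a cheaper, less dense process.
cheap = perf_per_area_per_dollar(perf=100.0, die_area_mm2=320.0, cost=1.0)

# With these made-up numbers the cost premium erases the density win.
print(dense < cheap)
```

The point being that a process "advantage" in density alone says nothing until you fold in what the wafers cost.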

I'm expecting Larrabee to compete no higher than the performance category, i.e. analogous to G94 and RV670 at today's prices. I don't see any reason why it won't be very successful if it hits that target.

Jawed
 
I honestly don't see Intel attempting to compete in the enthusiast sector. Other than the odd nod to enthusiasts every now and then, Intel has never really targeted anything at them. Well, other than trying to take some of their money with Extreme Edition processors. :p

So having only 150 GB/s wouldn't necessarily hurt them in that regard.

However, if they are also targeting HPC with it, that might hurt them in comparison to ATI/Nvidia's offerings in those areas.

Or could there be some super efficient bandwidth saving bit of tech they are working on?
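Whether 150 GB/s actually hurts depends on arithmetic intensity: a kernel doing lots of math per byte fetched never sees the bus limit. A back-of-the-envelope roofline sketch (all numbers hypothetical, chosen only to show the reasoning):

```python
def attainable_gflops(peak_gflops, bandwidth_gbs, flops_per_byte):
    """Naive roofline model: performance is capped either by raw compute
    or by how fast memory can feed the chip, whichever is lower."""
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

PEAK = 1000.0  # hypothetical peak compute, GFLOPS
BW = 150.0     # the ~150 GB/s figure discussed above

# A streaming kernel (~1 FLOP per byte moved) is starved by the bus:
print(attainable_gflops(PEAK, BW, 1.0))   # 150.0 (bandwidth-bound)

# A compute-dense kernel (~8 FLOPs per byte) never notices it:
print(attainable_gflops(PEAK, BW, 8.0))   # 1000.0 (compute-bound)
```

So for HPC the question is less the headline bandwidth number and more which workloads sit left of that ridge point; large on-chip caches and tiling are exactly the kind of "bandwidth-saving tech" that shifts kernels to the right.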

Regards,
SB
 
Do not compare Intel with AMD, please.
Intel is a FAR bigger company, which has developed FAR more technology and has been successful in a lot of areas.
With all the resources that Intel has, they can surely make this work. Getting back to your NetBurst example... no, it was not a very efficient design, but because Intel had a huge advantage over AMD in manufacturing technology, they still made the design work, simply by brute force.

Larrabee could be a similar scenario... It might not be as efficient as nVidia's GPUs, but if it runs at much higher clockspeeds, with more cores, cache and all that, then it might just work.
Raytracing is of course nonsense at this point, but that's another story.
Intel is huge but [imo] they have lost their vision. They need something "exciting" because Larrabee is another P4 NetBust

Intel may be Huge but they have no Vision anymore. They have CPU engineers who have ZERO clue about the parallelism of the GPU nor ANY clue how to do "fusion"

That is WHY they went to NVIDIA. However, Jen is smart and sees that NVIDIA holds all the Cards
--CUDA will blow away Intel - it will speed up many functions not traditionally done by a GPU - up to FIVE TIMES FASTER than now!!!

Intel is smoke and mirrors .. imo, Larrabee is a bad joke that will take 10 years to be worth anything for Graphics

NVIDIA will eat them alive in the PR war as Intel is just "buying" researchers and devs for their Ray Tracing NONSENSE. All the intelligent "free" devs say RT MAY be incorporated into traditional engines - RT will never replace a traditional game engine.
 
Back to the topic - Clearly Intel is clueless about Graphics and they hold back the industry. They can't even make a decent IG solution. AMD and nVidia blow them away

Intel is CPU only and their "vision" is to have "intel inside" EVERYTHING that uses a CPU - they are going after all the markets, imo
[yes it IS my opinion and i my analysis is even respected in some quarters; i am only "junior" here]
 
Oh I think Larrabee (and Nehalem) shows that Intel do have a clear vision: there is a future in massively parallel computing, and graphics is the stepping stone that gets us there.

Also I wouldn't say that Intel is clueless about graphics. Their IGPs aren't that bad, if placed in the right context: their price, power consumption and transistor budget.
Intel just designed them to be cheap, not to be fast. I think they actually did a pretty good job on the hardware. It's mostly the drivers that they messed up royally. But Intel has hired many new graphics experts for the Larrabee project, so there's more manpower and expertise available now. I think they have a fair chance of succeeding really.
 
Wow seriously, tell us how you really feel ;)

--CUDA will blow away Intel - it will speed up many functions not traditionally done by a GPU - up to FIVE TIMES FASTER than now!!!
Ok... don't buy into the CUDA hype, especially if you're not buying into the Larrabee hype (personally, I'm not one for hype anyways). CUDA is certainly a useful interface for targeting current NVIDIA hardware, but its utility beyond that remains to be seen. It's already clear that it's no holy grail, though I'm certainly a fan of people trying to figure out better ways to program some of these new processors.

Intel is just "buying" researchers and devs for their Ray Tracing NONSENSE
Okay the raytracing stuff was just Intel marketing going wild. Being able to do some efficient raytracing *is* cool, but as TomF pointed out, Larrabee's main focus is rasterization and always has been.

Don't get me wrong, I actually agree with a lot of what David Kirk is saying personally, but it may be a bit naive to think that Intel can't pull something off in the GPU space given enough time and determination. In any case, why can't people just wait and see and talk more about it when Intel has something to show?

On a separate note, there seems to be a lot of similar discussions going on in different threads now and it's kind of hard to keep track of... do we really want to continue discussing stuff in this ancient thread?
 
Back to the topic - Clearly Intel is clueless about Graphics and they hold back the industry. They can't even make a decent IG solution. AMD and nVidia blow them away

Intel is CPU only and their "vision" is to have "intel inside" EVERYTHING that uses a CPU - they are going after all the markets, imo
[yes it IS my opinion and i my analysis is even respected in some quarters; i am only "junior" here]

Please try to dial back the anti-Intel fire-breathing by a notch or two, 'kay? It is entirely possible to communicate the same concepts as given above with about 10% of the emotional loading of terms.
 
Please try to dial back the anti-Intel fire-breathing by a notch or two, 'kay? It is entirely possible to communicate the same concepts as given above with about 10% of the emotional loading of terms.
You are right, thank-you! I forgot where i am.

OK, i am new here and i will do my very best - i promise - to fit in

To support my present conclusions - which are indeed flexible and adaptive - is Intel's own history. And i am also a historian of sorts, and it is my opinion they are doing the same thing as they did with the P4. It is what i have called a "distraction dance", and it began, in my opinion, with their first forays into previously little-known research on Ray Tracing.

OK, we are agreed that RT is premature - even to be 'kind' to the most fervent believer in Intel's vision. If you can accept that, then you might also consider my premise that Larrabee's capabilities are being exaggerated, perhaps?

i do believe there is some "x86 overhead" that OctoBeast will have to "drag around" [if you can handle my crude pet-name for it]

i am just not "polished" like you guys. I will admit to "crude" but i never mean to be rude, except to what i perceive as "PR". If you will note, my rig is powered by NVIDIA/Intel - next week it will be powered by CrossFire .. again

my agenda is "truth" and my ego is unimportant
 
OK, we are agreed that RT is premature - even to be 'kind' to the most fervent believer in Intel's vision.
I really don't think that replacing rasterization is part of Intel's vision at all. Conversely, doing a bit of raytracing is part of everyone's vision down the road, including NVIDIA by their own admission.

Thus if you're basing your analysis on the raytracing marketing out of Intel, I think you're a little bit misguided. I would recommend considering the raytracing stuff solely in the context of "a cool thing we can do since we have a more flexible graphics pipeline".
 
my agenda is "truth" and my ego is unimportant

No offense, but there is no 'truth' so far, since Larrabee is not even ready yet. We might see the first public demonstrations by the end of the year, which may give us a first clue about how well it will perform, and all that... but right now, nobody really knows what to expect (we're just trying to combine the raw specs from Intel with our own experiences and insights to try and piece together what Intel is actually doing).
I think you're out of line here with your criticism of Intel, based purely on their past.
What is it that they say? Past results are no guarantee for the future? That goes both ways.
Heck, you seem bent on this Pentium 4-thing... might I remind you that Intel followed that up with the Core2 architecture.

You might also want to ignore the raytracing-thing for now... Apparently it started to lead a life of its own in the media, but Intel has already rectified that by a statement of one of their engineers. They do want raytracing in the future... then again everyone does, but for now the focus is on rasterizing.
 
You are right, thank-you! I forgot where i am.

Hang on a minute! Are you AnandTech apoppin? :LOL:

If so, yes, for the love of god don't bring that AnandTech video forum style of fire-breathing over here. Your account won't last very long if you do. :smile:

I understand why it's popular/required over there from my infrequent visits. But treat it like Vegas, and leave it there.
 
Hang on a minute! Are you AnandTech apoppin? :LOL:

If so, yes, for the love of god don't bring that AnandTech video forum style of fire-breathing over here. Your account won't last very long if you do. :smile:

I understand why it's popular/required over there from my infrequent visits. But treat it like Vegas, and leave it there.

The same SoB
{sweet ole boy - NOW!!}

Look , i PROMISED to tame it down; i will!

"apoppin" is my username; my name is Mark and i am actually a former mod for ATF Video for 6 months. Keys and i calmed it down from the insane zoo it was previously after we got rid of some of the worst viral guys. One does what they have to and i posted aggressively as a DEFENSE for over 8 years

Please allow me time to make a transition .. i think i can really contribute to your forum as well as me learning what you do best - the polite "technical white paper" type discussions

MY specialty was this "polite discussion" BEFORE ATF Video .. one just develops a natural "armor" or you just leave there in disgust.

And you Geo, are also "famous" and i think you forgot that i promised you that i would post here after you actually extended me an invitation!!!

So i am accepting your invitation if it is still open [and i behave]

apoppin/mark

and you guys are right, there is no "official" TRUTH about Larrabee yet
- i just "know" from my unofficial research

PS: i don't think i ever was "anandtech's apoppin"
.. and certainly no more, it appears. I need a "technical fix"; i am tired about arguing which company is "better"
 
Welcome to the board. But please stop ITALICIZING in caps in order to try to get your point across by putting inordinate emphasis in (what seems like) every other line. :)

I believe most of us are adults here with a fair bit of reading skill. :) Just post normally and I'm sure most of the people here would have a better and easier time digesting what you have to say.

And the Intel IGP solutions are actually complete successes for the market segment they were primarily designed for: business machines and large OEM contracts, which again are targeted mostly at large-scale business purchases. The fact that they can also be used for the OEM consumer segment is a bonus. However, for that segment OEMs are generally better served going with an ATI or Nvidia solution if the price is right and the boards are stable.

I'm not sure about most people. But back on XP while the occasional Nvidia or ATI driver could cause a hard lock and in extreme cases a BSOD, I don't recall ever seeing that with Intel IGPs on official drivers. Granted they can't seem to get their drivers very game worthy, but hey that wasn't an issue for their target market.

Likewise, I wouldn't put it past Intel to come up with a compelling mainstream, perhaps even performance-segment graphics solution. Lest we forget, they did make a short foray into the PC discrete graphics market with the i740, and while it certainly wasn't great in comparison to some of the competition from Nvidia, 3dfx, Matrox, and ATI, it was pretty decent for when it came out.

But in the end they bowed out due to the fierce competition in the market at the time. 3dlabs, PowerVR, and a few other companies were also still competing.

Now, there's only really Nvidia and ATI. With S3 occasionally making an attempt at the mainstream. In many ways a much easier market for Intel to compete in even if the cost of entry is higher. But then cost isn't something Intel is concerned with.

Regards,
SB
 
Welcome to the board. But please stop ITALICIZING in caps in order to try to get your point across by putting inordinate emphasis in (what seems like) every other line. :)

I believe most of us are adults here with a fair bit of reading skill. :) Just post normally and I'm sure most of the people here would have a better and easier time digesting what you have to say.


Are you serious? Are you going to do this to me also .. i will just leave you to your polite discussion then. You are doing the SAME thing the guys at ATF Video did .. attacking the manner the message is presented instead of looking at my message.

It is up to you .. i am out of here otherwise and i wish you well. i intend to challenge dogma - and i will also do it POLITELY .. do not be impolite yourself for this is "my style"

i am actually a professional and published writer - since 1971 when i was Chess Editor of many publications in Dublin, Ireland and i was the first expert on "gang behavior" in Europe and i have a cover Story to my credit in 1973 in Profile magazine "inside Skinhead gangs" - before i was 20 years old

this is my way of being INFORMAL - i hate structure - except when i am getting paid for it. and i ASSURE you i am NOT getting paid to post here. My analysis is my own. You judge it on what i write - not "how" i write it

EDIT: here is a little directness. I love intel CPU .. i have had "intel inside" since 286 .. no AMD - ever; however i think their IG is shameful ...
contrasted with guys that really UNDERSTAND graphics - nVidia and now AMD because of partnering with ATi ... it is my opinion that Intel will never "get it" as they are CPU engineers and nVidia will not help them
My analysis and it is adaptive and subject to change as you guys input what you think and i analyze the new data i get.

I would love to stay here and get a fresh view from you guys - your insights are exceptional and your benchmarking among the very best anywhere! i am into "IQ" analysis myself and may have found a new means to save 90% of the traditional time needed to do frame-by-frame comparisons. And perhaps i will challenge you politely to get a new view for us all. But i will not fight city hall here as i did at ATF; i ran into Anand himself in P&N and he did not care so much for what i said and i will not overstay my welcome. You are a really cool bunch of guys!

-Mark/apoppin
 
That's an incredibly popular point of view. Why do you think that?

Jawed
i am not sure what you are asking for, as it is subject to interpretation, imo

it is popular because i started it at ATF a long time ago. I still say that nVidia and intel were on the brink of some real collaboration - way beyond their "necessary" partnership with MBs; one that they must not tamper with.

Something broke down and i would say it is "ego and arrogance" on intel's part. I perceive that intel wants "intel inside" literally *everything* that has a CPU - from alarm clocks and refrigerators to Servers, and PC gaming - nVidia's prime territory

and Jen is a hell of a lot more visionary than Intel's senior board ever was, never mind the "yes men" they have become as they congratulate themselves on Past Successes.

Jen sees his GT200 - coming cheaply for them to produce - in ALL graphics sectors to kill R700, and he will also take them on with a "shortly to be announced" cool MB that is really flexible, to compete with AMD's new ones and to be a stable OC'er like Intel's

He also has CUDA, which will be a barracuda tearing at Intel as it speeds up many "things not currently associated with GPUs" by five times! and this is all well-hidden in plain sight on the internet

Do you even know what the *T* in GT is for - what the naming convention of this new architecture is? it is so obvious .. think 'scientists' unless you already know

And in fact, Intel is "fixed" like a huge windmill with their "RT posturing", and i think nVidia has a can of gas and some fiery arrows prepared for them. Intel is using their "2nd string PR team" - i recognize them from P4 NetBust days! .. and i suffered with Intel all the way through with my "EE" until my current E4300, which was a "side" project for them.

intel screwed up once before with this same team and my prediction is that nVidia is waiting to black both their eyes. Something AMD was too arrogant to do while they were briefly on top.

Larrabee will not be so impressive, i think
my analysis summary to date
 
So, Larrabee's going to be crap because it has a second-rate PR team?

Well, I agree, there's no PR like NVidia's.

Jawed
 