Beyond3D's 2007 'Bricks & Bouquets'

Wow... Clearly Intel never left the "GPU market." But they did leave the GPU Market. You would have thought that idea was clear... Guess not...
 
Wow... Clearly Intel never left the "GPU market." But they did leave the GPU Market. You would have thought that idea was clear... Guess not...

Just replace "GPU market" with "graphics market"; it's more than obvious that I didn't go down into meaningless semantics and separate integrated from discrete graphics solutions.

You're most welcome to compare an Intel IGP with the corresponding AMD/NVIDIA IGPs available today; then you can see how much Intel really "cares" so far about graphics in terms of performance and driver support overall. Then you could contribute a tad more to this conversation than meaningless sarcasm.
 
http://www.beyond3d.com/content/articles/31/1

The third slide there is the best evidence I've seen that Intel will be "in it to win it", rather than doing their usual dilettante act. And somewhere near the top at that (at least such is their intention). Unless anyone thinks it is entry-level IGPs they see as driving up their margins.
 
http://www.beyond3d.com/content/articles/31/1

The third slide there is the best evidence I've seen that Intel will be "in it to win it", rather than doing their usual dilettante act. And somewhere near the top at that (at least such is their intention). Unless anyone thinks it is entry-level IGPs they see as driving up their margins.

Difference being that if one reads through all the slides it sounds more like a battle that will take place primarily on the software/applications level.

Was it Damien Triolet who once said that Intel can create outstanding math units? He also added that great math units don't necessarily make a great GPU. IMHLO it's a tough gamble to rely on major shifts when it comes to applications; if that's what ISVs/developers truly want in the long run, then chances of success are high. If not, they've got another Pentium 4 on their hands.
 
Difference being that if one reads through all the slides it sounds more like a battle that will take place primarily on the software/applications level.

Yeah. He started giving that presentation before he became (at least publicly) VCG chief architect. So there is probably a mix of things there that morphed over time.

There has always been the question in my mind where Intel's priorities are in this area:

1). Flick away the GPGPU threat
2). Tie graphics in the bottom 1/2 of the market even closer to platformisation than is currently the case.
3). Really compete in the top 1/2 of the graphics performance and features market.

I'd guess that really is the priority order right there.

And, yes, that's my semi-snarky comment in the presentation analysis suggesting that if Intel gets into a "devrel war" with the GPU IHVs, the historical evidence says the GPU guys are more comfortable and have been more effective on that battleground.
 
Whereby I'd say there should be a reasonable timeframe between (1), (2) and (3). In order to move to the next level, though, the presupposition is that you succeed at steps (1) and (2).


And, yes, that's my semi-snarky comment in the presentation analysis suggesting that if Intel gets into a "devrel war" with the GPU IHVs, the historical evidence says the GPU guys are more comfortable and have been more effective on that battleground.

Frankly, I'd think that if there's something interesting/substantial in the whole story, developers are such enquiring minds that they would make their decisions while completely ignoring what each side theoretically has to say.

It might be a stupid and somewhat irrelevant example, but why was NV1 actually a major flop for NVIDIA?
 
Being one of the first consumer 3D accelerators, it suffered the problems that a lot of early stuff does: people were not ready to buy the card until there were a lot of games made for it, and devs weren't prepared to make games until it had a large install base.

The fact that NV1 used quadratic texture mapping rather than triangles made it absolutely horrible to program for, and doing stuff like collision detection was a nightmare, adding to devs' reluctance to support it.

When the other cards arrived they all used triangles instead, and all the proprietary APIs like Glide and Redline, as well as the IHV-independent ones like DirectX and OpenGL, were designed for triangles; devs were used to working with them.
Also, NVIDIA couldn't make a driver that worked with OpenGL, and as you know, back in those days if a card couldn't do a Direct3D feature it was emulated in software; NVIDIA's driver basically said "I can do nothing at all", so all D3D functions were done in software.
So the only API it supported was its own, whatever that was, which was the final nail in the coffin, as the only proprietary API devs were prepared to program for was Glide.
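
To make the quadratic-vs-triangle point above concrete, here's a minimal Python sketch; the function names and numbers are purely illustrative assumptions, nothing to do with NV1's actual API. A flat triangle is three vertices on one plane, so hit tests reduce to a closed-form check, whereas a biquadratic patch is nine control points and a curved surface, so intersecting it generally means iterative root finding in (u, v).

# Illustrative sketch: flat triangle vs. biquadratic (3x3 control point) patch.

def lerp_triangle(v0, v1, v2, u, v):
    """Point on a flat triangle via barycentric weights (linear)."""
    w = 1.0 - u - v
    return tuple(w * a + u * b + v * c for a, b, c in zip(v0, v1, v2))

def bezier_quadratic(p0, p1, p2, t):
    """Quadratic Bezier curve evaluation along one parameter."""
    return tuple((1 - t) ** 2 * a + 2 * (1 - t) * t * b + t ** 2 * c
                 for a, b, c in zip(p0, p1, p2))

def biquadratic_patch(ctrl, u, v):
    """Point on a 3x3-control-point biquadratic patch (curved)."""
    rows = [bezier_quadratic(*row, u) for row in ctrl]   # collapse in u
    return bezier_quadratic(*rows, v)                    # then in v

# The triangle lies entirely in its plane, so a collision test is one
# plane intersection plus a barycentric inside check; the patch below
# bulges away from any single plane, so hitting it needs iteration.
tri = lerp_triangle((0, 0, 0), (1, 0, 0), (0, 1, 0), 0.25, 0.25)
patch = biquadratic_patch(
    [[(0, 0, 0), (0.5, 0, 1), (1, 0, 0)],
     [(0, 0.5, 1), (0.5, 0.5, 2), (1, 0.5, 1)],
     [(0, 1, 0), (0.5, 1, 1), (1, 1, 0)]],
    0.25, 0.25)
print(tri, patch)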
 
I think ISVs tend to be conservative, and deeply interested in risk/reward. The more niche-y a market is (some of the GPGPU areas), the more likely either side is to be able to convince an ISV "on the merits". The broader based (and, definitionally, lower priced per unit) the market is (like, for instance, gaming), the harder it will be to get ISVs to move outside their comfort zones and the "fat part of the market", because there will be less control over creating a compatible client base.

I think history shows that the gpu guys are much more used to those scenarios than the cpu guys. Doesn't mean the cpu guys can't learn tho, and clearly the evidence in the 2nd half of 2007 is that Intel seems to be moving to shore up the software end. So it's a little better than it looked in the early part of the year. Tho I still think that's a big mountain they've only begun to climb. . .but at least they have some experienced climbers on the team now.

Look at Hyper-Threading, and how often, when dual-core optimizations finally started happening, it was noted that Hyper-Threading got serious benefits too... years later. Or to counter your example, why did NV3 begin NV's climb to success?
 
Just replace "GPU market" with "graphics market"; it's more than obvious that I didn't go down into meaningless semantics and separate integrated from discrete graphics solutions.

You're most welcome to compare an Intel IGP with the corresponding AMD/NVIDIA IGPs available today; then you can see how much Intel really "cares" so far about graphics in terms of performance and driver support overall. Then you could contribute a tad more to this conversation than meaningless sarcasm.

This is amazing. The idea is that Intel would go into the high end or even performance sectors. In doing so they would have to be competitive and couldn't rely on selling to the people who don't care as long as it works with iTunes and YouTube. It's not even possible to hold a conversation with someone who didn't grasp this from the start. I blame myself though, I guess this idea wasn't as clear as I thought.
 
Found this on wiki, don't know how true it is, but 1000 points in 3DMark06 is around 10% of an 8800 GTX.

"The X4500 GPU is scheduled to appear in the forthcoming G45 ("Eaglelake-G") chipset and will be manufactured with 65nm technology. Like the X3500, X4500 will support DirectX 10 and Shader Model 4.0 features. Intel is aiming this solution to be 3x faster than the GMA 3100 (G33) in 3DMark06 performance, resulting in a score of around 1000 points
The X4500 is scheduled to be release in Q2 2008"
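
A quick back-of-the-envelope check of those numbers, assuming a GMA 3100 baseline of roughly 330 3DMark06 points and an 8800 GTX system around 10,000 points (both ballpark assumptions of mine, not figures from the wiki snippet):

# Rough sanity check of the quoted claim; the baseline and high-end
# scores are assumptions and vary a lot with the rest of the system.
gma_3100_score = 330          # assumed G33 IGP ballpark in 3DMark06
x4500_target = 3 * gma_3100_score
gtx_8800_score = 10000        # assumed 8800 GTX system ballpark
print(f"X4500 target: ~{x4500_target} points")
print(f"Share of an 8800 GTX score: ~{x4500_target / gtx_8800_score:.0%}")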
 
This is amazing. The idea is that Intel would go into the high end or even performance sectors.

What I've been saying for several posts now is that the intention alone isn't good enough.

In doing so they would have to be competitive and couldn't rely on selling to the people who don't care as long as it works with iTunes and YouTube.

That's the most pitiful excuse ever for selling a supposed "graphics unit"; while it's true that no one would really expect any outstanding performance from an IGP, you either sell such a thing as a pure 2D pipeline or you support it as you should. And no, it wouldn't be a wasted investment nowadays. If those G3x IGPs were worth anything, Intel could have scaled them down and used them in markets where they are at the moment using 3rd-party IP, which comes with its own drivers last time I checked.

It's not even possible to hold a conversation with someone who didn't grasp this from the start. I blame myself though, I guess this idea wasn't as clear as I thought.

Blame yourself as much as you want; it'll take a wee bit more from Intel before I'm convinced that they've finally started caring about graphics, and by extension developers and finally gamers. The very first step is to recognize that it's not all just about making money, but that even the crowd that just cares about iTunes would play a couple of older games with reduced settings on an IGP if it were a wee bit more capable and they didn't face a ton of glitches.

Within the whole marketing ramble, we did hear that IGPs will become more capable in the future; well, I'll start sitting up when that happens. In the meantime, excuse me if I dump any idiotic marketing bullshit in the same trashcan as XGI's past claim of becoming "market leader" within 5 years.
 
I think ISVs tend to be conservative, and deeply interested in risk/reward. The more niche-y a market is (some of the GPGPU areas), the more likely either side is to be able to convince an ISV "on the merits". The broader based (and, definitionally, lower priced per unit) the market is (like, for instance, gaming), the harder it will be to get ISVs to move outside their comfort zones and the "fat part of the market", because there will be less control over creating a compatible client base.

That's true; but aren't they as conservative as you say because some if not all of them had to learn a few things the hard way?

My point is that yes, both sides will have a segment of the overall graphics market where CPUs and GPUs can compete; yet it'll take a hell of a lot more before one would replace the other for the entire market.

And pardon me, but whether a GPU or CPU has X GPGPU abilities might interest me as a gamer from a technological perspective; it won't in the least define my future buying decisions either.

I think history shows that the gpu guys are much more used to those scenarios than the cpu guys. Doesn't mean the cpu guys can't learn tho, and clearly the evidence in the 2nd half of 2007 is that Intel seems to be moving to shore up the software end. So it's a little better than it looked in the early part of the year. Tho I still think that's a big mountain they've only begun to climb. . .but at least they have some experienced climbers on the team now.
Depends where the experience of those "climbers" comes from; I'm only aware of some of the former 3DLabs folks. Great professional accelerators, but that was about it.

Personally I've been, so to speak, fooled several times in the past with high hopes for X or Y supposed new contender in the graphics market, only for it to turn into a soap bubble in the end. For the past years I've raised an eyebrow every time something like that comes up, since the amount of resources a large company has isn't the defining factor in the end, but rather how many resources that company really intends to devote and whether they're really willing to tolerate red numbers from such a startup project for a couple of years.

Look at Hyper-Threading, and how often, when dual-core optimizations finally started happening, it was noted that Hyper-Threading got serious benefits too... years later. Or to counter your example, why did NV3 begin NV's climb to success?
Multi-threading has been on GPUs since their birth; I guess it was inevitable for CPUs to start working in that direction in order to increase efficiency. The META core is one example that surfaced long before Hyper-Threading did in the mainstream CPU market. In other words it was already proven technology in other areas and it was only a matter of time until it got incorporated into mainstream CPUs too. Ironically, the META was actually developed to avoid multiple cores on one die.

With NV3x now you're making it a wee bit more complicated; for D3D9.0 several IHVs proposed their solutions before the API was defined, and coincidentally ATI provided the better and more efficient solution (R3x0). If NVIDIA learned a lesson from NV3x it's, in my book, twofold: one side is to try to provide the best solution possible, and the other was to minimize risks and maximize margins. In which case, after NV3x did NV release any high-end GPU on the latest and smallest manufacturing process? Of course I understand what you're trying to get at, but I doubt that NV didn't make serious changes to their roadmap when they started to feel that NV30 was going to flop. It took them several years to recover from that one, until they reached the peak of their success with G80.

***edit: ...and G80 was/is as successful because (like the R300 in the past) it combines the best possible balance between features/performance and IQ for its timeframe. We unfortunately don't get such gifts as gamers very often now, do we? ;)
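
On the multi-threading point a few paragraphs up, here is a toy model (my own illustration, not tied to META, Hyper-Threading or any real core) of why hardware threads raise efficiency: while one thread waits on memory, another can keep the ALUs busy, so utilisation climbs as threads are added.

# Toy latency-hiding model with assumed, purely illustrative numbers.
ALU_CYCLES = 4      # assumed compute burst per memory request
MISS_LATENCY = 20   # assumed cycles a thread waits on memory
REQUESTS = 10       # memory requests each thread issues

def utilisation(threads):
    work_per_thread = ALU_CYCLES * REQUESTS
    total_work = threads * work_per_thread
    # Best case: while one thread waits, others compute; the machine is
    # only idle once every thread is stalled at the same time, so the
    # run is bounded by the larger of total ALU work and one thread's
    # compute-plus-stall critical path.
    ideal_span = max(total_work, work_per_thread + MISS_LATENCY * REQUESTS)
    return total_work / ideal_span

for n in (1, 2, 4, 8):
    print(f"{n} hardware thread(s): ~{utilisation(n):.0%} ALU utilisation")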
 
I was already looking forward to that one, a great read, as expected.
I just have one minor nitpick: I'd suggest a "no voting for articles etc. of B3D or affiliates allowed" rule; those choices always have a kind of incestuous feel to me. You know, like if Eurovision Song Contest participants were allowed to vote for themselves. :smile:
And as a long time reader - yes, I'm one of those who even knows there's a frontpage! :p - this would also increase my chance to stumble upon a must-read I might have otherwise missed. (As I already have read the B3D articles, of course.)
 
I'd certainly be happy to hear about others' ideas for "Interview of the Year" and "Article of the Year". I really did/do think Tim and Rys' interviews were the best on offer this year, but I'd be happy to hear about others... certainly part of the point of B&B is to get others offering their own ideas on the categories offered.
 
Great read. I am surprised that only Mark voted for Mass Effect as the best console game of the year. Mass Effect was one of the best games that came out last year, period. It got me to buy an Xbox 360 - something I would not have foreseen myself doing at the beginning of last year.
 
It also prompted me to buy Mass Effect, although I've yet to put it in the damn console. One day!

As for the links to friends of B3D....well, they're friends of B3D for a reason first of all :p , but I get the point, next year will be a little different in that respect.

The "Rys, you're so wrong about Crysis" comments made me pick the game up again and give it another go (although I skipped the first chapter and a half for brevity's sake). I'm still not convinced. For the record, the game doesn't stink, but neither does it move the game on in the way I was expecting, and the little niggles just put me right off at times. It still smells, just a little sweeter than when I wrote my B&B contribution, under duress from G's big whip.

Walt, yes, 3D hardware analysis kind of sucks at the product level, web-wide. May 2008 be the year I change your mind, sorry that it's long overdue. I still think I (and friends who ably assist in the process) do a pretty good job on the arch side (when I actually do the arch side!) though, and that approach won't change much this year.

And 8800 GTX deserved it, you naysaying barstewards :cool:
 