Beyond3D's 2007 'Bricks & Bouquets'

Equal parts Year in Review, psychiatric therapy for the deeply disturbed on the cheap, and love notes to the class cuties...yes, it's once again time for Beyond3D's "Bricks & Bouquets", 2007 edition! Read the full news item
 
Good read, although the reminder of Intel's raytracing nonsense this year got the blood boiling again! That said I have a lot more faith now that Matt Pharr and the crew from Neoptica are there to fix things up. Hell if Matt Pharr tells me that pure raytracing is the future I might actually consider it ;) For now, I'm sticking with the consensus reached at the end of that epic thread.
 
The choice of the 8800GTX by Geo and Rys strikes me as odd. Thought the award title was "Video Card of the Year", not "Only High End Card of the Year."

Also, Rys's take on Crysis makes me wonder what game he was fooled into thinking was Crysis. Derivative physics and graphics? No game out there comes close in either area. The physics in Crysis are outright amazing, and the graphics on medium match every other game, let alone the high settings. Sure, the last 1/3 of the game was bad, real bad. But the first 2/3 were amazing in every regard, especially the free-form gameplay, which only two other FPS games match (Far Cry and STALKER).
 
Part of the joy of "Bricks & Bouquets" is the different perspectives and priorities the participants bring to the table. Between us we tend to cover the waterfront. The result usually is that there's a little something that nearly everyone will find enjoyable, and a little something that will leave nearly everyone feeling a little scandalized.

Personally, I haven't bought a midrange video card in roughly forever, and I consider the big-dog GPU to be a very important driver of agendas and results, both technology-wise and business-wise.

I respect that others can look at that category and be all about price/performance & value, but that ain't me. At least not with video cards. Now with CPUs I'd probably answer along those lines...

Edit: Oh, btw, even just on its own, given its place in history on the cusp of the transition to Vista and DX10 --an inflection point that the entire industry had been looking forward to for several years-- for a GPU a year later to still be flicking performance-crown contenders away like Gulliver with the Lilliputians is downright remarkable.
 
That said I have a lot more faith now that Matt Pharr and the crew from Neoptica are there to fix things up. Hell if Matt Pharr tells me that pure raytracing is the future I might actually consider it ;) For now, I'm sticking with the consensus reached at the end of that epic thread.

While I usually try to hide it beneath a facade of cynicism, the reality is by nature I'm about two thirds "Pollyanna meets 'if you clap loudly we can save Tinkerbell' ".

So, I've been working on the theory of late that Matt and his colleagues will assimilate Intel VCG rather than the other way around.

We'll see. Matt, phone home, all is forgiven. Or something. :D
 
I agree with sentiments directly or indirectly expressed by several of the contributors (Farid in particular). Why is the 8800 GTX still dominant? The pace of progress has slowed to an unacceptable degree.

I'm starting to think that Intel's foray into high performance graphics is doomed to succeed.

If there isn't much room/incentive for the traditional heavyweights to keep offering dramatic performance improvements, how can they compete against a company that will offer unbeatable process and power reduction technology? Intel doesn't need to offer a better pipeline implementation - they just need to have something that performs not-too-much-worse per cycle.

AMD has been shooting itself in the foot (and vital organs) for some time now. But what happened to nVIDIA? The 8800GT is a great *product*, but doesn't demonstrate progress.

Hopefully the Q1 releases by all and sundry will make me feel foolish.
 
How does a budget part not demonstrate progress? Are we seriously of the mind that "if it's not faster, it's not better"? That's crazy; there is much to be said for significantly lowering the price of entry to high levels of performance.

Also, why would Intel coming into high-end graphics be a bad thing? What would be bad is if Nvidia were the only company in high-end graphics; that would be hell for consumers. At least Intel would put a lot of pressure on Nvidia, just hopefully not to the point of destroying the company.
 
How does a budget part not demonstrate progress? Are we seriously of the mind that "if it's not faster, it's not better"? That's crazy; there is much to be said for significantly lowering the price of entry to high levels of performance.
The 8800GT is an excellent *product*. But offering essentially the same SKU at a lower price point thanks to manufacturing and marketing prowess isn't the type of technological progress I'm looking for.

Also, why would Intel coming into high-end graphics be a bad thing?
It isn't bad at all! But... since Intel has been historically weak in this area, the most likely explanation for this outcome would be screwups by the current leaders. I don't care who wins, but screwups are thoroughly undesirable.

What would be bad is if Nvidia were the only company in high-end graphics; that would be hell for consumers. At least Intel would put a lot of pressure on Nvidia, just hopefully not to the point of destroying the company.
My point was that the progress in the graphics field (and its derivative children) has, historically, made Moore's Law type projections laughable. If the creative talents who established that trend can't carry the torch, Intel will take over by virtue of brute force if nothing else.
 
Also, why would Intel coming into high-end graphics be a bad thing? What would be bad is if Nvidia were the only company in high-end graphics; that would be hell for consumers. At least Intel would put a lot of pressure on Nvidia, just hopefully not to the point of destroying the company.

Whether such a move is positive or negative surely depends on the presuppositions under which a new player (so to speak) enters the high-end GPU market, doesn't it?

So far, Intel's track record in anything concerning graphics leaves a bitter taste in departments like driver development and, at times, even image quality. There were times in the past when their texture filtering was just as crappy as that of companies that were bought out long ago (and while the latter received a lot of flak, Intel simply got away with it because not many people really care about IQ on IGPs). Add to that the fact that Intel has for years tried, one way or another, to avoid implementing critical features; Jebus, it took them eons to remember that graphics units actually need a geometry unit too. Which of their DX9.0 IGPs had even a shred of T&L, let alone a sign of a VS unit? Yes, their latest D3D10 IGP finally sports geometry processing, yet it took ages to get the corresponding drivers out the door, and I sure hope there's a D3D10 driver available for those nowadays.

Take PowerVR as an example instead; they might not have achieved anything in GPU development in the past few years, but if you check their track record, from the ancient Dreamcast, the few budget GPUs they released in the past, and arcade hardware, through to the PDA/mobile-oriented MBX today, there aren't any serious complaints about IQ; rather the contrary. IMG's major problem here is that they're an IP-selling company, which as things stand is a millstone for GPU graphics if they don't have a committed partner behind them willing to invest in their IP.

And that example above is merely to show that the amount of resources available is not really an excuse for cutting corners anywhere, least of all for a company such as Intel with its ungodly amount of resources. And no, the fact that they have "just" developed IGPs so far isn't an excuse either.

To sum it up: if Intel makes a radical change in its design and support philosophy for anything concerning graphics (which is necessary in order to compete in the GPU market anyway), then I don't think many, if any, would have second thoughts. Until then, having second thoughts is IMHLO perfectly justified.

The only other reason I dragged IMG into this post is the multiple licensing deals Intel has signed with them. So far the only thing we know about is Intel's coming UMPCs, where IMG's SGX will be integrated. Whether Intel will use that IP for any other markets is still up in the air. At least PowerVR, however small they are, more or less orient their designs around what ISVs and gamers want (just like the big players in that market) and don't just care about how to make the cheapest possible product within each timeframe.
 
I think Rys can't play Crysis, hence his thoughts on it; he probably keeps getting lost in the jungle :) hee hee... they'll find him in another 40 years on one of the islands with a big beard, still fighting the N Koreans, who will by that time be producing all our computers for us in real life...

He is right about monitors though: a good comfy chair and a great monitor should always be the starting points for using a computer, with CPU, graphics, etc. secondary.
 
Also, Rys's take on Crysis makes me wonder what game he was fooled into thinking was Crysis. Derivative physics and graphics? No game out there comes close in either area. The physics in Crysis are outright amazing, and the graphics on medium match every other game, let alone the high settings.

Yep, I loved getting stuck on a rock in the opening swim to the beach. Great physics! It has some cool stuff... but it's also blatantly rough and buggy beyond my imagination. Watching other users' videos of the physics, and especially the AI, indicates this sort of issue isn't something just Scotty and I have hit on. Games have bugs and limitations, but as great as some parts of Crysis's physics are (destructible foliage and buildings, component damage on vehicles, cinematic physics, etc.), I don't think I see enough tempering from the fans about the blatant showstoppers. Having to restart the game 2 minutes in because of a physics bug is pretty substantial in my book and worth noting. But then again, maybe others don't have any of these problems?
 
how can they compete against a company that will offer unbeatable process and power reduction technology?
Are you SURE we are talking of the same company? I think people need to stop taking Intel's marketing statements so literally... (for example: in the Larrabee timeframe, Intel will be at a process density disadvantage)
 
My big concern about Intel is that outside their core CPU area they've historically had the stick-to-itiveness of a hummingbird. If Larrabee isn't seen as "all that and a bag of chips" right out of the gate, will they fold up their tents and slink away 18 months later?
 
Yep, I loved getting stuck on a rock in the opening swim to the beach. Great physics! It has some cool stuff... but it's also blatantly rough and buggy beyond my imagination. Watching other users' videos of the physics, and especially the AI, indicates this sort of issue isn't something just Scotty and I have hit on. Games have bugs and limitations, but as great as some parts of Crysis's physics are (destructible foliage and buildings, component damage on vehicles, cinematic physics, etc.), I don't think I see enough tempering from the fans about the blatant showstoppers. Having to restart the game 2 minutes in because of a physics bug is pretty substantial in my book and worth noting. But then again, maybe others don't have any of these problems?

None. I've cleared it from the start to the alien ship 4 times now without a single one of the issues you mentioned. The first time was on the second-hardest difficulty, the last three on the hardest. I've never had these weird physics issues people talk about, and my friends haven't either. It's only something I've heard about from a handful of people online.
 
How does a budget part not demonstrate progress? Are we seriously of the mind that "if it's not faster, it's not better"? That's crazy; there is much to be said for significantly lowering the price of entry to high levels of performance.

The era of the beast GPU sucking down megawatts of power and producing the heat of a nuclear furnace topped by shoebox-sized heat sinks, while delivering all of that "goodness" for a mere $700, is behind us already, although there are some folks who still don't understand what's happening right before their very eyes...;)

There's SLI, there's CrossFire, and now there's triple SLI and up-to-quad CrossFire X, and next year there will be something else along the lines of multiple GPUs on single PCBs to look forward to. The problems of OS and API support need to be ironed out, of course, but it can and will be done, because these companies are working to move multiple-GPU adoption into the mainstream commodity arena.

You know, my opinion is that in the old days, when the difference between a 3d game running at 10 fps and one running at 30 fps meant the difference between a game you couldn't really play and a game you could, FPS benchmarks made sense and were valuable indicators of relative worth. Today, when we generally talk about the difference between 100 fps and 130 fps as if that difference also denotes what's playable and what's not, it's clear we've completely lost our way.

The fact is that unless you are a benchmark junkie who never plays a game without slowing it down by running FRAPS in the background, those "major" fps benchmark "differences" we see so often in reviews don't mean a single, solitary thing in terms of whether product A will provide as much or more enjoyment in a given game than product B. I think Image Quality is still king, but of course I am, and possibly you are too, in the minority at present on that particular issue. Bar-chart frame-rate benchmark conditioning dies hard...;)

I've long been suggesting and pleading that benchmark-junkie reviewers add one more bar chart to their plethoras when they do their reviews--and that's a "bang-for-the-buck" chart which correlates fps benchmark numbers to dollars spent. Come to think of it, even an "excess fps" bar chart (excess in terms of the fps delivered over the fps actually needed to play a given game as intended by its developers) might be nice to see as well. But this kind of analysis would require a paradigm shift in the way most reviewers approach 3d cards in general. As such, human nature being what it is, it's surely desirable but not very likely that 3d-card reviews will achieve this depth of analysis anytime soon.
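
For what it's worth, here's a back-of-the-envelope sketch of how those two numbers could be computed -- the card names, prices, fps figures, and the 60 fps "needed" target below are purely made up for illustration:

# Hypothetical sketch of "bang-for-the-buck" (fps per dollar) and "excess fps"
# calculations. All card names, prices, and fps figures are invented examples.

NEEDED_FPS = 60  # assumed frame rate for "playable as the developers intended"

cards = [
    # (name, price in USD, average fps in some benchmarked game)
    ("Card A", 700, 130),
    ("Card B", 250, 100),
    ("Card C", 150, 65),
]

for name, price, fps in cards:
    fps_per_dollar = fps / price           # the "bang-for-the-buck" number
    excess_fps = max(0, fps - NEEDED_FPS)  # fps beyond what the game needs
    print(f"{name}: {fps_per_dollar:.2f} fps per dollar, {excess_fps} excess fps")

Plotted as two extra bars per card, that would make the price/performance picture obvious at a glance.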


Also, why would Intel coming into high-end graphics be a bad thing? What would be bad is if Nvidia were the only company in high-end graphics; that would be hell for consumers. At least Intel would put a lot of pressure on Nvidia, just hopefully not to the point of destroying the company.

The problem with Intel's approach to 3d gpus has been fairly evident ever since the company began and ended its initial foray into the discrete 3d-chip markets with the i7xx series of 3d gpus several years ago. Intel's design and idea for "3d" was merely how to leverage it so that it could sell entire systems based around it. As such, Intel's initial and only discrete gpus relied on AGP texturing (a concept pioneered at Intel) from system ram, as opposed to the onboard-texturing gpu model that was so much better it is still with us to this day.

In those days, both 3dfx and nVidia products ran rings around Intel's discrete gpu product line because of that--in many cases games that were quite playable on other gpus simply were not on Intel's line of discrete gpus. Basically, when Intel discovered that it couldn't use discrete 3d technology to move lots of its other chips at the same time, it packed up its 3d marbles and went home, and that was the end of that. We still see this philosophy today in the fact that Intel would rather sell cpus without embedded memory controllers and sell the memory controllers separately, although rumor has it that Intel might one day embed its memory controllers inside its cpus as AMD has been doing for years. As long as Intel can continue to do so, I think it would prefer to sell two chips wherever it could sell one, and in the process earn more money.

Yes, certainly that's also true when it comes to SLI and Crossfire, no doubt about it. But the way that ATi and nVidia are doing it is far more specialized than the way Intel approaches such matters. We'll see pretty soon just how Intel's latest notions about using 3d to sell more Intel chips beyond its own 3d chips pan out, I'd imagine.
 
I really don't see how anyone can actually judge Intel by its past in the GPU market. Their approach can't be the same this time or they'll simply flop, so that history is useless as a guide...
 
Great stuff. This is turning out to be a great tradition. Keep it up! :D
 
I really don't see how anyone can actually judge Intel by its past in the GPU market. Their approach can't be the same this time or they'll simply flop, so that history is useless as a guide...

Granted, past performance is no indicator of future profit (standard SEC disclaimer...;)), but the fact is that nVidia and ATi (even 3dfx) have past gpu production histories which allow us to extrapolate what they are likely to do in the future. The one time in the past that Intel showed any interest at all in the discrete 3d gpu market was during the period when the company's only interest was in selling non-competitive (with 3dfx and nVidia) discrete 3d gpus--the i7xx series. Ever since its withdrawal from that market, Intel's only interest has been in selling integrated 2d and later 3d gpus as a component of its larger system offerings. The integrated 3d graphics Intel has sold ever since are still non-competitive with the discrete 3d products that both ATi and nVidia are selling and have been selling.

What I'm trying to understand is what's leading you to believe that Intel has any intention of going head-to-head with either nVidia or ATi in the discrete 3d graphics markets. Intel's always been in the integrated gpu markets, however.
 
I really don't see how anyone can actually judge Intel by its past in the GPU market. Their approach can't be the same this time or they'll simply flop, so that history is useless as a guide...

Intel never left the GPU market; if anything, the question is about the nature of their presence in that market today.

What I'm trying to understand is what's leading you to believe that Intel has any intention of going head-to-head with either nVidia or ATi in the discrete 3d graphics markets. Intel's always been in the integrated gpu markets, however.

I wouldn't be in the least surprised if Intel tries to compete with AMD/NVIDIA in the future in professional markets. In such a case it would truly mean that Intel will compete against the other two, but it doesn't necessarily mean that the result could yield any significant breakthrough in the mainstream GPU market (if it's ever released there in the end).

For 2008/9 Intel will, as it seems, release UMPCs (and possibly larger form-factor devices) with IMG graphics IP integrated into them. The 3d market overall hasn't only somewhat "expanded" toward the extreme high end, as you noted; it has been, and still is, expanding toward the very lowest end as well. We already have fast-moving 3d in PDAs/mobile phones, and UMPCs, which sit somewhere between PDAs and typical laptops, are next in line.

Personally, there are way too many unknown factors for me at the moment; mobile and small-form-factor devices in general are constantly gaining ground, and in the long run that might bring some interesting shifts in the market overall. There's also AMD's projected migration of GPUs into CPUs in the foreseeable future (dubbed "Fusion"), and we still don't know what Intel intends to do about that one; it could eventually mean that IGPs become redundant and that pure SoCs and "SoC-like" devices gain bigger momentum. And no, that doesn't mean that the PC or the high-end GPU market will vanish, or any other funky theory someone could come up with. It could merely mean that some balances in the market overall might change over time.

AMD and NVIDIA are, as we speak, working on OGL_ES2.0-compliant chips (AMD's often referred to as "mini-Xenos"), and in the markets where those could be sold as chips they will be competing with companies like Intel, Texas Instruments and the like. If and wherever they sell those designs as IP, they'll of course compete with IMG directly.

Long story short: I can see Intel competing with AMD/NVIDIA in graphics, one way or another, "more" than it has up to today; what I have a hard time seeing for the time being is Intel becoming a high-end GPU competitor in the mainstream GPU market overnight.
 