Larrabee delayed to 2011?

but it was short-lived... at *most* a year until the 8800s came out.
Wow, you have high standards. You want a $400 console to outperform and/or be as advanced as a $500 video card that comes out one year later?
Again, pretty off-topic though I suppose :)
True, but I think we all enjoy your posts. Besides, we can't blame ourselves for going off topic when the subject is virtually shelved...
 
Wow, you have high standards. You want a $400 console to outperform and/or be as advanced as a $500 video card that comes out one year later?
No, obviously not! (But yes I do have high standards - aren't I allowed to? :)) Like I said, I'm not making an economic argument. I'm merely addressing the claims that consoles are better than PCs for some span of time after they come out... this is normally simply untrue and even if there are arguable cases (maybe the 360, but was it really overall better than high-end PCs when it came out? Hard to say...), it's a pretty short time span!

Really what they do with the consoles at that price point is quite impressive, no doubt. Just compared to a PC running the same games at higher resolutions, the consoles can't keep up. Not a big issue since the PC obviously costs a lot more, but relevant to someone trying to get the "best experience" and willing to pay more for it.

And then there's the selfish "graphics guy" angle, where I just can't get excited for old/midrange tech :)
 
You have to factor in the fact that the programming model on a console isn't the same, and you don't have to share the hardware with increasingly poorly programmed resident userspace programs you have no control over. Never mind the OS getting in your way (although that's getting worse across the board as these things try to do more each generation).

Maybe I've not been taking an interest long enough, but all I've seen on the consoles is the games getting better over time and making great use of the hardware as the programmers get to grips with it and enjoy the lower level access.

You simply don't have that on the PC, and the vast majority of PC games (even non-ports) are very wasteful of the hardware in the main. Look at the increase in power between Xenos and Cypress. Now tell me you're getting much better looking pixels for it. You drive higher resolutions now, but... well, that's about it for me in the main.

The late-life console games tend to do amazing things with the hardware, and I can't say the same for the vast majority of PC games. They just ride Moore's.
 
Maybe I've not been taking an interest long enough, but all I've seen on the consoles is the games getting better over time and making great use of the hardware as the programmers get to grips with it and enjoy the lower level access.
Right but even with those advantages (and targeting one architecture is a *huge* advantage) they still don't produce noticeably superior graphics to PC games. They're just so much slower that even cleverness can't close the gap.

You simply don't have that on the PC, and the vast majority of PC games (even non-ports) are very wasteful of the hardware in the main.
Sure, but that's the cost of portability, and increasingly just the cost of fewer resources for the dedicated PC SKU.

Look at the increase in power between Xenos and Cypress. Now tell me you're getting much better looking pixels for it. You drive higher resolutions now, but... well, that's about it for me in the main.
Well TBH I really am getting a *whole* lot more pixels. The fact that they aren't always "better" is just because the developers find it easier to jack up the resolution vs. developing specific ways to make "prettier" pixels using that power. And if Cypress/DX11 were the real baseline for a game, the console simply couldn't compete. We're barely getting out of DX9-level engines on the PC here... I can *easily* produce shadows for instance that the consoles can't even dream of on Cypress and still have performance to spare.

The late-life console games tend to do amazing things with the hardware, and I can't say the same for the vast majority of PC games.
Sure but that has *nothing* to do with the hardware... that's just "constraints breed innovation". It doesn't change the fact that Cypress has vastly more power and flexibility than Xenos even given the disadvantages of the portable PC APIs. It's merely the economics that make it infeasible to develop a Cypress-focused engine for instance, but even those are changing as DX11 becomes a reasonable baseline target.

Really though, I haven't seen anything on the current consoles that even a midrange PC can't duplicate at higher resolutions with more MSAA, but feel free to point out an example :) I'd say the most impressive rendering stuff in consoles is the tiled deferred rendering using the SPUs in Uncharted, Blur, etc. (enabled primarily by the fast CPU-GPU bus which PCs don't have), and now a single DX11 GPU can do that much faster and more flexibly without using the CPU at all.
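For anyone who hasn't seen the technique, here's a toy CPU sketch of the tile-binning plus per-tile light loop idea. All the names and numbers below are mine, and a real implementation runs both passes as compute shaders over the G-buffer (or on the SPUs, on PS3) rather than as Python loops, but the structure is the same:

import numpy as np

TILE = 16  # pixels per tile side; a typical compute thread-group size

def bin_lights(lights, width, height):
    # lights: list of (cx, cy, radius) screen-space circles, a stand-in
    # for projected light bounding volumes.
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for i, (cx, cy, r) in enumerate(lights):
        x0 = max(0, int(cx - r) // TILE)
        x1 = min(tiles_x - 1, int(cx + r) // TILE)
        y0 = max(0, int(cy - r) // TILE)
        y1 = min(tiles_y - 1, int(cy + r) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[ty][tx].append(i)
    return bins

def shade(albedo, lights, bins):
    # Each pixel loops only over the lights binned to its own tile rather
    # than over every light in the scene; that culling is the whole win.
    h, w = albedo.shape[:2]
    out = np.zeros_like(albedo)
    for y in range(h):
        for x in range(w):
            for i in bins[y // TILE][x // TILE]:
                cx, cy, r = lights[i]
                d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
                if d < r:
                    out[y, x] += albedo[y, x] * (1.0 - d / r)
    return out

lights = [(16.0, 16.0, 12.0), (48.0, 40.0, 20.0)]
albedo = np.ones((64, 64, 3), dtype=np.float32)
result = shade(albedo, lights, bin_lights(lights, 64, 64))

On the consoles the binning runs on the SPUs and the GPU consumes the per-tile lists over the fast bus; with DX11 compute both passes stay on the GPU, which is what I mean by not touching the CPU at all.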

All obviously my opinion and not from the point of view of buying one hardware platform or another. Still, if you have both platforms, game SKUs on PCs (even ports) are generally superior in the graphics department even if no specific work is put into differentiating the PC SKU other than higher resolutions/MSAA/texture filtering (which is the norm). And I do maintain that if I didn't care at all about the economics and just wanted to develop the best graphics tech on any current platform, a high-end PC is the most capable.
 
I'm coming at it from the point of view that better-looking pixels are almost always preferable to more of them. Consoles are pretty much fixed-resolution devices, and it shows over time after the learning curve has been conquered. Maybe PC developers should ignore the big moving resolution target and spend more time on better-looking pixels at a 720p resolution (and thus performance) target too.

It's a crying shame that I can have a Cypress in my PC, knowing that resolution scaling and turning on MSAA and surface filtering are mostly trivial in code, and not have the game tuned for great-looking pixels at the same resolution targets as a modern console in this generation. I'd love to use a Cypress-based PC to drive epic looking games at 720p on my TV, and let a scaler take that higher if needed like the console gives me (since I own a 1080p TV).

Maybe I don't play enough PC games, but I'd like to see one that looks honestly better in motion than God of War 3, say. Even the PC games that actually sell off the back of their reputation for pushing the hardware don't look great (Crysis is an aliased mess, so's Metro, so's Stalker). There's barely a shred of aliasing of any kind in GoW3 and the game looks great in general. A single Cypress SIMD is more capable than the GPU in a PS3, yet on a Cypress (even at 720p so I can turn on a good level of MSAA) the mentioned PC games look pretty bad in comparison IMHO.

The theoretical advantages of Moore's Law on the PC just aren't exploited the way I think we'd both like :smile:
 
I'm coming at it from the point of view that better-looking pixels are almost always preferable to more of them.
I agree in general, but 720p isn't enough at least for the display devices and viewing distances I'm used to :) 1080p probably is.

I'd love to use a Cypress-based PC to drive epic looking games at 720p on my TV, and let a scaler take that higher if needed like the console gives me (since I own a 1080p TV).
Well for the most part you can do that... I don't think there's any games that look significantly worse on PC than consoles, and there's not that many exclusives with graphics that are head-and-shoulders above the multi-SKU games IMHO. Uncharted 2 is probably the one exception, but those levels of graphics are definitely achievable on a modern PC.

Maybe I don't play enough PC games, but I'd like to see one that looks honestly better in motion than God of War 3, say.
I guess we're into mostly qualitative stuff here, but I'd say Batman AA is a decent example of a similar game that looks great in motion and running it on even a midrange PC @ 1080p w/ AA shows a noticeable bump vs. the console versions. God of War 3 looks pretty good for sure, but it's as much art as tech really... if you watch there's pretty heavy use of LOD (in the titan sections) and restricted camera paths to achieve said visuals. It's more that they simply elect to avoid effects that alias badly - which is a great choice obviously - than doing anything really special technically.

Even the PC games that actually sell off the back of their reputation for pushing the hardware don't look great (Crysis is an aliased mess, so's Metro, so's Stalker).
True to some extent, but again... at 720p Cypress can basically do 4x SSAA on these games nowadays!!
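Napkin math on that claim, using nothing but the standard resolutions: 4x SSAA at 720p means shading 1280 x 720 x 4 = 3,686,400 samples per frame, exactly the shading work of rendering natively at 2560x1440, and still less than the 2560x1600 (4,096,000 pixels) that Cypress routinely gets benchmarked at in reviews. The raw throughput is clearly there.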

There's barely a shred of aliasing of any kind in GoW3 and the game looks great in general.
I was just playing it the other day and that's a bit of an overstatement (this image for instance displays the usual specular and geometric aliasing, blocky polygons and low-resolution, poorly-filtered textures, poor/missing shadows). Still, they avoid high specular exponents, excessive normal maps and apply a pretty aggressive depth of field post-process (in addition to an edge-aware blur I think) which avoids/hides most of the bad aliasing problems of modern games.

A single Cypress SIMD is more capable than the GPU in a PS3, yet on a Cypress (even at 720p so I can turn on a good level of MSAA) the mentioned PC games look pretty bad in comparison IMHO.
You're at best making comments on particular engines/games though... do you know of even a single multi-platform SKU that looks significantly better on the consoles/worse on the PC?

The theoretical advantages of Moore's Law on the PC just aren't exploited the way I think we'd both like :smile:
Yeah we can both definitely agree with that, but that's just economics. There's tons of power available there but not really the market to make good use of it. And as Marco notes, the tides are turning for the better somewhat in the aliasing discussion, etc.
 
I agree in general, but 720p isn't enough at least for the display devices and viewing distances I'm used to :) 1080p probably is.
Agreed, and 2Mpixels means a lot of oomph/pixel on Cypress too :smile:
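To put a rough number on "oomph/pixel" (using the commonly quoted ~2.72 TFLOPS peak for Cypress and a 60 fps target, both my own assumptions): 2.72e12 / (1920 x 1080 x 60) is roughly 21,900 FLOPs per pixel per frame. Peak figures are never reachable in practice, of course, but even a fraction of that is a huge per-pixel shader budget.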

Well for the most part you can do that... I don't think there's any games that look significantly worse on PC than consoles, and there's not that many exclusives with graphics that are head-and-shoulders above the multi-SKU games IMHO. Uncharted 2 is probably the one exception, but those levels of graphics are definitely achievable on a modern PC.
Nothing that looks significantly worse on the PC, sure (and I think that covers the same point you make later), but nothing that looks significantly better either. I come back to Xenos vs Cypress and not having that bigger oomph/pixel exploited on the PC. I agree to some extent that it's just an economics thing, but there have been large sums spent on PC exclusives in the last few years that didn't make the same tech and art choices as Uncharted or GoW3 and didn't blow people away visually for the power available (and that includes making smart choices about what's visible on screen and being smart in the asset pipe before the engine even gets a hold of it).

True to some extent, but again... at 720p Cypress can basically do 4x SSAA on these games nowadays!!
Great! I'd love to see someone like Carsten or Damien prove it (and make the argument that you should use the GPU to render the game in that fashion on that hardware too, versus less IQ at higher resolution).

*cut to save text* Still, they avoid high specular exponents, excessive normal maps and apply a pretty aggressive depth of field post-process which avoids most of the bad aliasing problems of modern games.
The image (that I cut for brevity in my reply) is telling as a still, but it's hard to notice the shadow aliasing and texture resolution in motion. All developers could learn something from that (and do per-situation tuning of assets and performance to get high aggregate IQ at high framerate). I can somewhat agree with you that it's not *that* impressive technically, and more of the magic in motion is from the art and production. Be interesting to see what Christer thinks of that actually :smile:

Yeah we can both definitely agree with that, but that's just economics. There's tons of power available there but not really the market to make good use of it. And as Marco notes, the tides are turning for the better somewhat in the aliasing discussion, etc.
Good, I'd love a swing back in the direction of better per-pixel IQ on the PC, and better use of the $400 GPU I paid good, hard-earned money for. Hypothetically, if there's ever a spare $1M to spend on a game, for whatever reason, I'd rather that go into IQ R&D at this point in the respective generations on PC and console than almost anything else.
 
I agree to some extent that it's just an economics thing, but there have been large sums spent on PC exclusives in the last few years that didn't make the same tech and art choices as Uncharted or GoW3 and didn't blow people away visually for the power available (and that includes making smart choices about what's visible on screen and being smart in the asset pipe before the engine even gets a hold of it).
Agreed, although there have been good games. Battlefield BC2, Batman AA, Far Cry 2 and a few others look pretty great on PC IMHO. They obviously look good on consoles too (and that's a credit to the development teams) but they do look better at the higher PC resolutions with more textures and filtering.

Great! I'd love to see someone like Carsten or Damien prove it (and make the argument that you should use the GPU to render the game in that fashion on that hardware too, versus less IQ at higher resolution).
Sure. I'm not sure just raw super-sampling is the best use of the FLOPS in the long run for better overall IQ, but it's a simple one in the short term.

Be interesting to see what Christer thinks of that actually :smile:
Agreed, I'd be especially interested to hear if they realistically think there's anything being done that couldn't be accomplished on a high-end PC with a DirectX 11 video card (and what those things are!).

Hypothetically, if there's ever a spare $1M to spend on a game, for whatever reason, I'd rather that go into IQ R&D at this point in the respective generations on PC and console than almost anything else.
You're gonna piss off the "gameplay > graphics" crowd who seems to think that quality for these subsystems is directly determined by money ;) Last thing I want is to feed that silliness more!

So I guess we're in agreement overall... I just don't think the current reality has much to do with the hardware involved.
 
The image (that I cut for brevity in my reply) is telling as a still, but it's hard to notice the shadow aliasing and texture resolution in motion. All developers could learn something from that (and do per-situation tuning of assets and performance to get high aggregate IQ at high framerate). I can somewhat agree with you that it's not *that* impressive technically, and more of the magic in motion is from the art and production. Be interesting to see what Christer thinks of that actually :smile:

I'd argue it looks no better, and actually a lot worse, than many of the indoor scenes in HL2!

Good, I'd love a swing back in the direction of better per-pixel IQ on the PC, and better use of the $400 GPU I paid good, hard-earned money for. Hypothetically, if there's ever a spare $1M to spend on a game, for whatever reason, I'd rather that go into IQ R&D at this point in the respective generations on PC and console than almost anything else.

Art assets are generally more important to visual quality than engine capability. I thought everyone knew that by now?
 
You're gonna piss off the "gameplay > graphics" crowd who seems to think that quality for these subsystems is directly determined by money ;) Last thing I want is to feed that silliness more!

But gameplay is largely independent of money. Gameplay is much more about ideas and the cohesiveness of said ideas, whereas visual quality is mostly about money and time.
 
But gameplay is largely independent of money. Gameplay is much more about ideas and the cohesiveness of said ideas, whereas visual quality is mostly about money and time.
Yeah exactly. I am not one of said people, but there are a lot of them and they are easily summoned :) These are the people who think that every bit of effort put into graphics is a bit that makes the gameplay worse.
 
That image linked earlier, the first I've seen from that game, has the same specular for every surface it seems, i.e. it's Doom 3 all over again. That's pretty tragic.
 
Yet in motion it's a completely different story.
I will agree with this. It's definitely nicer in motion and I chose that picture just to highlight that the tech isn't actually significantly better than PC games, etc. They use it well though and it does look great in motion.

I did some digging and it appears they're using some variant of "morphological anti-aliasing" which looks like it's based on image-space shape/curve reconstruction. Neat! I'll have to look into that and maybe write up a shader to see how well it works in motion. Definitely seems to work well for GoW3!
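In the meantime, here's my crude first stab at the core idea, just to have something concrete: find the colour discontinuities, measure how long each edge run is, and blend across the edge with a weight that ramps along the run. The real algorithm classifies L/Z/U edge shapes and computes the exact coverage of the reconstructed silhouette; this Python toy (the threshold and the weighting are my own guesses) only captures the gist, and only for horizontal edges:

import numpy as np

THRESH = 0.1  # luminance step treated as an edge; pure guesswork

def luma(img):
    # img: float32, HxWx3, values in [0, 1]
    return img @ np.array([0.299, 0.587, 0.114], dtype=np.float32)

def mlaa_horizontal(img):
    h, w, _ = img.shape
    y = luma(img)
    out = img.copy()
    # an "edge" sits between row r and row r+1 wherever the step is big
    edges = np.abs(y[:-1, :] - y[1:, :]) > THRESH
    for r in range(h - 1):
        c = 0
        while c < w:
            if not edges[r, c]:
                c += 1
                continue
            start = c
            while c < w and edges[r, c]:
                c += 1
            run = c - start
            for i in range(run):
                # weight is largest near the ends of the run, where the
                # reconstructed silhouette crosses the pixel row; real MLAA
                # derives this from the exact covered area per pixel
                dist_to_end = min(i + 0.5, run - i - 0.5)
                wgt = 0.5 * max(0.0, 1.0 - 2.0 * dist_to_end / run)
                a = img[r, start + i]
                b = img[r + 1, start + i]
                out[r, start + i] = (1 - wgt) * a + wgt * b
                out[r + 1, start + i] = (1 - wgt) * b + wgt * a
    return out

The vertical pass is the same thing transposed, and a proper implementation resolves overlapping blends instead of just overwriting like this toy does. Don't judge the technique by this sketch; it mostly shows why the approach is so cheap, being pure image-space post-processing.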
 
So Larrabee is not going to happen in the next 3 years.

Does anyone still think many small CPU cores + texture units are a good idea for real-time graphics in the next 3 years?

First Sony with Cell: that project failed when they had to introduce RSX. Now Intel has failed to bring out a competitive product despite the investment. Then you have NV with Fermi, which concentrates on general-purpose performance but with a very large die and not enough performance to show for it. And yet, looking at the latest CG in movies, real-time rendering seems to have fallen even further behind.

Shouldn't high-end real-time rendering platforms look into specialised chips with fast interconnects between them, like, say, LM's Real3D chips in the Sega Model 3 in its time? SLI and CrossFire don't scale too well, so how should real-time graphics move forward?
 
Only a very few companies are pushing the boundaries of real-time graphics on PC. Companies like Crytek are really pushing the power available in the latest and greatest processors and graphics cards. It would be unfair to say that real-time rendering has stagnated because GPU power is lacking, IMHO.

With the advent of consoles becoming more PC-like and more profitable for games developers it is a no-brainer for developers to target console performance first and then port to PC (if they bother to port to PC).

What is needed for better real-time graphics, sadly, is the release of the next-gen consoles for the bar to be raised in 3D graphics once again. Unfortunately that means that all that will happen is that 3D graphics will reach the level of detail possible on PCs at the time of launch and will not progress much for the lifetime of the consoles.

Sorry for being so pessimistic. It's the consoles.... I tell ya.
 
With the advent of consoles becoming more PC-like and more profitable for games developers it is a no-brainer for developers to target console performance first and then port to PC (if they bother to port to PC).
OT
Actually, what's bothering PC gamers is that the 360 is five years old with no successor in sight. The hardware is out of date even against low/mid PC GPUs now. Things will get even worse next year when AMD and Intel push out their APU/Fusion processors.
 
Sorry for being so pessimistic. It's the consoles.... I tell ya.
I know it's fashionable for PC fans to piss and moan about what "low tech" consoles have done to PC gaming, but in reality it seems that PC games just don't sell that well anymore, certainly not anywhere near the numbers that console games sell. It's probably hard for PC exclusive devs to recover the cost to develop a game that "can only be done on PCs" because not many would buy it. So you end up with ~$1500 worth of uber CPU/GPU/RAM that just sits there underutilized.
 
Only a very few companies are pushing the boundaries of real-time graphics on PC. Companies like Crytek are really pushing the power available in the latest and greatest processors and graphics cards. It would be unfair to say that real-time rendering has stagnated because GPU power is lacking, IMHO.

With the advent of consoles becoming more PC-like and more profitable for games developers it is a no-brainer for developers to target console performance first and then port to PC (if they bother to port to PC).

What is needed for better real-time graphics, sadly, is the release of the next-gen consoles for the bar to be raised in 3D graphics once again. Unfortunately that means that all that will happen is that 3D graphics will reach the level of detail possible on PCs at the time of launch and will not progress much for the lifetime of the consoles.

Sorry for being so pessimistic. It's the consoles.... I tell ya.

Well, I am not really talking about consoles; I am asking whether general-purpose capability is holding back real-time graphics somewhat. Forget consoles and look at Fermi: it is way faster at general-purpose stuff, but it seems to achieve that at the expense of graphics.

So can a GPU that is even more general-purpose than Fermi work better, or will it just hold real-time graphics back?

For the future we talk about using the GPU for everything and leaving the CPU to become the manager again. Is that a good thing for real-time graphics though? Wouldn't it still be better to spend every one of those six billion transistors on graphics instead of getting sidetracked into the general-purpose realm?

I mean, we have a CPU like Cell that is very good at the heavy lifting of non-graphical stuff and is scalable, and soon Intel will have something similar too, so why make GPUs do what Cell is already able to do well? Shouldn't the GPU do just graphics, given that we are very far from being 'there yet' and can hardly afford to get sidetracked?
 