Graphical effects rarely seen this gen that you expect/hope become standard next-gen

It's all faked: an effect that looks like SSS, but isn't.
Unless it's static and precalculated, which you can do for ice, for example.
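
For reference, one common way to fake it cheaply is wrap lighting, which widens the diffuse term so light appears to bleed past the terminator. A minimal sketch (the function and parameter names are mine, not from any particular engine):

[code]
// Wrap-lighting approximation, a common cheap stand-in for real SSS.
// wrap = 0 gives standard Lambert; pushing wrap toward 1 lets surfaces
// facing away from the light still pick up some "scattered" light,
// which reads as translucency on ice, skin, or leaves.
#include <algorithm>

float wrapDiffuse(float NdotL, float wrap)
{
    return std::max((NdotL + wrap) / (1.0f + wrap), 0.0f);
}
[/code]

Because the term depends only on N.L, it can also be baked into a precalculated lookup, which is presumably what a static ice shader would do.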
 
Yeah, that was annoying. Supposedly they've fixed it for Mass Effect 2.

As for shadows in next-gen games, I hope they go the route of "contact-hardening shadows".

Old way, the shadow is uniformly blurred:
http://www.firingsquad.com/media/gallery_image.asp/2039/5

New way, shadow gets blurrier with distance from occluder:
http://www.firingsquad.com/media/gallery_image.asp/2039/6
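
That's essentially what Percentage-Closer Soft Shadows (PCSS) does: first search the shadow map for the average blocker depth, then scale the filter radius by the blocker-receiver gap. A rough sketch of the penumbra estimate (names are mine, assuming depths are measured from the light):

[code]
// PCSS-style penumbra estimate behind contact-hardening shadows.
// By similar triangles, a wider gap between blocker and receiver
// (or a larger light) produces a wider penumbra, so shadows stay
// sharp at contact points and soften with distance.
float penumbraRadius(float avgBlockerDepth, float receiverDepth, float lightSize)
{
    return lightSize * (receiverDepth - avgBlockerDepth) / avgBlockerDepth;
}
[/code]

The returned radius then drives an ordinary PCF lookup, so the extra cost over uniform blurring is mostly the blocker search.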

Oh man that's awesome. Can't wait until I get a DX11 card...

EDIT: Doesn't Far Cry 2 on consoles also use SSS?

I'm wondering if they use their indirect lighting as well. Also, another console game that used SSS is Brothers in Arms: Hell's Highway.
 
Uncharted 2 uses some very convincing SSS shaders on ice and vegetation; the ice cave level almost looked like scenes from the Planet Earth documentary.
 
semitope said:
I don't quite think there is evidence in their past work to show they could do it but I am pretty anxious to see what they pull off. Hoping this is the 360 exclusive to wow us.

Anyone got a vid of the Halo 3 effects mentioned?

My hopes for next gen lie in lots and lots of physics. Not necessarily level and object destruction, but fluids and particles. Of course good image quality is welcome, and 720p with 8xAA/16xAF as standard would be great. They don't need to aim for 1080p at all (just put in a hardware scaler and call it a day).

There is evidence that they could pull it off, like Halo: Combat Evolved, as that game looked graphically stunning for its time. Halo 2 also looked extremely good. On the Mac, Marathon looked amazing for its time too. I believe Halo 3 is a good-looking game, despite the engine having its roots in the original Xbox hardware. Halo 3's lighting is arguably the best I have seen in a console game thus far, and it sports some pretty good-looking textures as well.

I'm eager to see what Bungie can do with an engine developed from the ground up to specifically take advantage of the 360's hardware. Hopefully we get to see more of Reach at next year's GDC...
 
Yeah, the water in R2 is way ahead of most and needs to be standard.

They put the same water effect in the new Ratchet and Clank and it looks pretty good. Just spent a good deal of time running through a nice length of water with Ratchet's rocket boots. It's a really cool effect.

[Yt]9Lcn10InLBs[/Yt]
 
I remember FEAR having muzzle-flash shadows. This isn't necessarily an effect, but I would love to see more physics-enabled debris flying around in my FPS/TPS games. Gears 2 and Red Faction: Guerrilla did a pretty good job with this.

Also, I loved throwing grenades at the occasional pieces of destructible cover in Uncharted 1.
 
I remember FEAR having muzzle-flash shadows. This isn't necessarily an effect, but I would love to see more physics-enabled debris flying around in my FPS/TPS games. Gears 2 and Red Faction: Guerrilla did a pretty good job with this.

Also, I loved throwing grenades at the occasional pieces of destructible cover in Uncharted 1.
I recommend you try Killzone 2 if you like debris, destruction and chaos :cool:
 
There was a similar thread a while back (2007). This is what I wrote back then:

Shadow Aliasing. As great as a game like Mass Effect is, the "shadow crawl" across the faces is horribly annoying. MotorStorm is another where the shadows are just horrible.

Lack of Self Shadows. Self shadows give a lot of depth to characters and look fabulous. Compare a game like VT3 to Madden and the self shadows just pop out in VT3. I love them and cringe when I don't see them. Don't ask me to choose between no self shadows and self shadows with aliasing... grrr!

Dynamic Shadows. See a trend? Having dynamic objects both cast and receive shadows to/from other dynamic objects isn't too much to ask... is it?

Bloom. Some bloom is good, but the constant glow some games have on every lit edge is very, very annoying. I love HDR effects but bloom is one that must be handled with care.

Flat Textured Grass, Sprite Bushes. Games that feature a lot of grass really need to put some effort into it. Trackmania and BF:V had decent grass a while ago, and a number of launch titles also had solid grass (like Kameo). Even if you cannot go all out on 3D grass, some sprite grass is better than horribly flat, pixelated textures. But for bushes... no. You better make those 3D. If I see another bush that looks the same from every angle and has no lighting or shadowing... grrr! MotorStorm, looking at you!

Non-interactive Worlds. Boom! That was a huge bomb! Oh, and the half-decayed brick wall... isn't scratched. Yawn. If I hit something in the world, I want to see some response. Part of immersion is showing the gamer they DO make an impact in the world.

Animation. Great visuals with poor animation make average graphics at best. Graphics is equal parts visuals and animation, and without the latter the title suffers. I am tired of LOOOONG unbreakable animations, especially ones where you run into an object (wall, another player, NPC, etc.) and the animation just continues as if it is not there. Classic examples abound in Madden. Jump into a tackle animation... and you do a super jump suspended in the same spot in the air. Too Human also has some of those looong unbreakable animations where you do a special move and some enemy may be walking into you (moonwalking, of course, because you cannot be moved!) but you keep going as if no one is touching you.

Texture Flicker and Pop-in. Nothing worse than seeing a texture flicker (texture/black/white/repeat) because it is on a seam or somesuch. Likewise pop-in. You have 512MB of memory, so use it wisely. Yes, DOF can help, but it is still noticeable.

Last Gen Particle Effects. When your smoke or fire looks like a pixelated and static mess, you need to redo it. Lost Planet really showed us what we should expect from game particles, RFoM has some great snow effects, and Kameo has a lot of nice particles all over the place. They go a long way toward giving the world life. Lifeless last-gen particles make a title look dated. No real grey area here.

Body / Facial Animation. HL2 was 2004. We are now in 2007. When only a handful of games can claim parity, that is not a good sign. Valve really nailed the "life" expressed in the characters, not only through excellent facial animation (and acting) and lip sync, but also body gestures. Few things bore me more than really, really bad character animation.

Unbreakable Lights. If I can shoot them I want them to go out. Totally destroys immersion for me--and getting to break them is waaay up on the cool factor. This alone could add a TON of replay to games, especially MP. Take any FPS... you wander into a room and are being chased... hide in a corner, shoot out the lights. When the enemy comes through the lit door (and the HDR iris effect hits him) you have him nailed! Or better yet, he turns on his flash light. Bang!

Dishonorable mentions: Repetitive Textures, Edge Aliasing, Texture Aliasing, Low Poly, although I can excuse some of these depending on the circumstance.

Ambient Occlusion and GI hacks are high up on my list. How much AA is debatable based on resolution, but AF better be a basic feature next gen--I am sick of poor filtering and swimming textures. Yuck. Grass and dynamic particle systems are way up on my list too--the games that use them look substantially better than those that don't.

Oddly, the 'haves' this gen (like Far Cry 2) hit many of these "wants" while the 'have-nots' fall very, very short. In many ways true "next gen" titles on the 360/PS3 can get most of these right. Legacy code/design appears to be as much a bottleneck as the hardware.
 
Haven't numerous developers stated they will not look to implement AA or 60fps in console games because it is not worth the cost? They would rather spend the budget on other items on the list, is what I believe they stated.

Some of this kind of thing should be taken into account in the "next next-gen hardware" thread, instead of just looking at die sizes, power consumption, etc. At the least, one should have an idea of what it will take to implement these "wants." The best example I can think of is the RAM in the 360 doubling thanks to developer begging.
 
eDRAM will be a dead end if developers demand FP16 @ 1080p @ 4xMSAA, unless a smarter implementation (caching [streaming out?] of buffers?) can be found. And it seems developers also want full read/write of the buffers for more advanced post-processing.

I would bet post-processing will be a major factor in future engines, and developers will want a lot of control over it in hardware, with minimal hoop-jumping.
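
The arithmetic behind that "dead end" claim is worth spelling out. A back-of-the-envelope check (my assumptions: FP16 RGBA color at 8 bytes per sample, plus 4 bytes of depth/stencil per sample):

[code]
// Rough estimate of the eDRAM footprint of an FP16 1080p 4xMSAA target.
#include <cstdio>

int main()
{
    const long long width = 1920, height = 1080, samples = 4;
    const long long colorBytes = 8; // FP16 RGBA: 4 channels * 2 bytes
    const long long depthBytes = 4; // 24-bit depth + 8-bit stencil

    long long total = width * height * samples * (colorBytes + depthBytes);
    std::printf("%.1f MB needed\n", total / (1024.0 * 1024.0)); // ~94.9 MB
    return 0;
}
[/code]

Roughly 95 MB of framebuffer, nearly ten times the 10 MB on Xenos' daughter die, which is why something smarter than "make the pool bigger" would be needed.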
 
eDRAM will be a dead end if developers demand FP16 @ 1080p @ 4xMSAA

Why would developers "demand" something like that? Developers demand flexible AND easy to use hardware, with the freedom to put its strengths into whatever aspects of visual quality they see fit.

I would personally value much more the ability to texture from EDRAM than any particular EDRAM size or target resolution.
 
Why would developers "demand" something like that? Developers demand flexible AND easy to use hardware, with the freedom to put its strengths into whatever aspects of visual quality they see fit.

I would personally value much more the ability to texture from EDRAM than any particular EDRAM size or target resolution.

To be fair, you did not quote me in full context. The whole sentence, which summed up my statement much more clearly, was:

eDRAM will be a dead end if developers demand FP16 @ 1080p @ 4xMSAA, unless a smarter implementation (caching [streaming out?] of buffers?) can be found.

You are abstracting the "end point" from the "technicals", which is fine. But when you say, "Developers demand flexible AND easy to use hardware, with the freedom to put its strengths into whatever aspects of visual quality they see fit", my argument stands on the fact that, regardless of the hardware's design implementation, developers have (a) voiced a desire for anti-aliasing and (b) complained about how Xenos goes about doing this. They want the flexibility AND ease of use--exactly what you said.

And what I said, if you quote me properly, is that eDRAM is a dead end if developers demand things like MSAA and higher resolutions and the current implementation is carried forward. Why? Because, to quote you, "Developers demand flexible AND easy to use hardware".

eDRAM, as is, isn't as flexible as a standard memory pool AND it isn't easy to use when attempting to get basic features developers want (e.g. MSAA at HD resolutions) if the eDRAM is too small.

I think your eagerness to jump on my posts has resulted in you leaping before looking, because there is no fundamental axe to grind if you actually read what I said in context ;)

Maybe your concerns are different than other developers', but among the handful I talk to (by no means a huge sample), the #1 gripe I hear about eDRAM from actual developers is kvetching that they do want MSAA at HD resolutions, but 10MB isn't sufficient for a 720p target resolution with MSAA without additional work and workload considerations (see the quick arithmetic below). When asked about future consoles, they have told me that if eDRAM is used again they don't want to fiddle with tiling unless it gets significant changes (i.e. they demand more memory/a better implementation to avoid these issues). But that is just a small survey of the couple of handfuls I have talked to. You probably talk to more developers, but I haven't met many who think anti-aliasing is a bad thing (unless they are not able to do it performantly and fall back to the 'consumers don't notice' excuse/position).
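
The quick arithmetic on that 10MB limit (assuming the usual 4 bytes of color plus 4 bytes of depth/stencil per sample):

[code]
// Why 720p + MSAA forces tiling on a 10 MB eDRAM pool.
#include <cstdio>
#include <initializer_list>

int main()
{
    const double edramMB = 10.0;
    const long long pixels = 1280LL * 720LL;
    for (int samples : {1, 2, 4})
    {
        double mb = pixels * samples * (4 + 4) / (1024.0 * 1024.0);
        std::printf("%dxMSAA: %5.1f MB -> %s\n", samples, mb,
                    mb <= edramMB ? "fits, no tiling" : "needs tiling");
    }
    return 0;
}
[/code]

720p with no AA squeaks in at about 7 MB; 2xMSAA needs roughly 14 MB (two tiles) and 4xMSAA roughly 28 MB (three tiles), hence the tiling gripe.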

#2 gripe with eDRAM is the lack of ability to do more robust operations to utilize the benefits of the bandwidth.

Which leads me to your specific desire--surprising to be quite honest.

I am surprised to see you argue for textures--are you referring more to MRT or straight texturing? Not sure how valuable eDRAM would be due to size limitations. At a high level it appears one of the "strengths" of GPUs is their high tolerance for latency--I could be mistaken, but this seems to fit perfectly with texturing.

Considering how tolerant GPUs are of latency, and the size of textures in memory (100s of MBs), how is texturing from eDRAM (10s of MBs) a major benefit worth the silicon investment? Putting aside legacy design issues (difficulty) and cross-platform development (more difficulty + lower exploitation), I am not sure this is a "win" from a design perspective or a benefit onscreen.

Maybe you can elaborate how using eDRAM for texturing fits your criteria for flexibility and ease of use? How is demanding texturing different than "why would developers demand" an eDRAM pool that supports MSAA without current tiling issues? :???:

[If you mean an eDRAM pool that is a flexible scratchpad I am all for that if it can be designed within a reasonable budget, as I mentioned in the next-gen prediction thread just last week.]
 
I want texturing from EDRAM, in the sense of reading from EDRAM in the shader - this currently isn't possible on the 360. This would allow for very rich, multipass postprocessing effects without leaving the confines of the EDRAM, at huge bandwidth - which will allow the hypothetical GPU to flex its shader muscle.

My point about flexibility vs. resolution was that if you give developers a flexible GPU with enough EDRAM to do fp16 1080p 4xAA in launch titles, three years down the road we'll be back again to quarter-resolution particle buffers and no AA (or smart "edge" AA), but with much more sophisticated effects. And I don't think anyone (except for the B3D regulars ;-) ) would applaud a GPU that forces you to use its power exactly for fp16/1080p/4xAA, and all this massive bandwidth can't be redirected to other uses.
 
All that to say:

(me) eDRAM as implemented is a dead end if (a) developers demand ease of use for a variety of design goals and (b) they want to utilize higher resolutions, MSAA, high-precision color formats, etc.

Or

(you) eDRAM as implemented is a dead end if (a) "Developers demand flexible AND easy to use hardware" and (b) they want the "freedom to put its strengths into whatever aspects of visual quality they see fit".

There is nothing fundamentally different in what we said, unless you think Xenos is flexible, easy to use, and allows a lot of freedom to put its strengths into "whatever aspects of visual quality they see fit" -- and a call to texture from eDRAM points in the opposite direction.
 
EDRAM in its Xenos form forces compromises, but I think the inability to texture (read) from it is much worse than the limited size. We spend half of our postprocessing time resolving (transferring from EDRAM to main RAM so we can read the results of the previous pass in a subsequent postprocessing pass), even though we collapsed some passes to avoid resolves as much as possible. If we had Xenos with 16 MB EDRAM, we'd simply go to 720p 2xAA (instead of the current 0xAA); if we could texture from the current 10 MB without intermediate resolves, we'd do more interesting things to soften shadows, remove shadowmap artefacts, and have much better postprocessing within the same frame budget. I think the latter would improve the image quality of our game more.
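
To make the cost structure concrete, here is a sketch of the data flow being described (buffer and function names are mine, not the 360 API):

[code]
// Today each postprocess pass must resolve EDRAM out to main RAM before
// the next pass can read the result; if the shader could texture from
// EDRAM, passes could ping-pong between two EDRAM regions and resolve
// only once at the end.
struct Buffer {};
Buffer edramA, edramB, mainRam;

void resolve(const Buffer& from, Buffer& to) {}   // copy over the external bus
void runPass(const Buffer& input, Buffer& target) {} // full-screen pass

void chainWithResolves(int passes) // current Xenos model
{
    for (int i = 0; i < passes; ++i)
    {
        runPass(mainRam, edramA); // shader can only sample from main RAM
        resolve(edramA, mainRam); // mandatory copy-out, every pass
    }
}

void chainTexturingFromEdram(int passes) // hypothetical model
{
    for (int i = 0; i < passes; ++i)
        runPass(i % 2 ? edramB : edramA, i % 2 ? edramA : edramB);
    resolve(passes % 2 ? edramB : edramA, mainRam); // one copy-out total
}
[/code]

The second chain trades N resolves for one, which is where that "half of our postprocessing time" would come back.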
 
I want texturing from EDRAM, in the sense of reading from EDRAM in the shader - this currently isn't possible on the 360. This would allow for very rich, multipass postprocessing effects without leaving the confines of the EDRAM, at huge bandwidth - which will allow the hypothetical GPU to flex its shader muscle.

My point about flexibility vs. resolution was that if you give developers a flexible GPU with enough EDRAM to do fp16 1080p 4xAA in launch titles, three years down the road we'll be back again to quarter-resolution particle buffers and no AA (or smart "edge" AA), but with much more sophisticated effects. And I don't think anyone (except for the B3D regulars ;-) ) would applaud a GPU that forces you to use its power exactly for fp16/1080p/4xAA, and all this massive bandwidth can't be redirected to other uses.

I agree. I had posted about this very issue last week :D It seems a waste to devote ~25% of your GPU silicon budget to eDRAM and have it limited to a small fraction of its potential utilization. I think we agree that, based on what developers are saying, the issues they have with eDRAM, and the direction they want to go in terms of visuals, "a more robust eDRAM implementation" (as I noted in the other thread) is needed to deal with these issues.

Does anyone want Xenos style eDRAM, especially if it is big enough to support 1080p with 4xMSAA, but is limited to the same functionality as this gen? Does anyone even think that is a good use of silicon area?

So I think we agree, fundamentally.

EDIT: And we do agree. eDRAM is a dead end as-is (Xenos style) if developers want a more robust resource and want to raise visual quality, both in terms of techniques that could use eDRAM and in terms of the current issues (resolution, MSAA).
 
Isn't that argument a bit obvious though? I mean, the analogue is that unified shaders are a dead end if implemented as in Xenos, despite US being the future! :p Technology moves on, and rather than reuse the hardware designs of yesteryear, if new designs offer a better solution, go with them. So eDRAM remains an option, only as a more versatile local scratch-pad RAM pool, and unified shaders are an option (a given!), only in a more versatile form than Xenos.

Although I do wonder, why was Xenos' eDRAM implemented as it was? What's the overhead in adding texture reads from it, that this functionality wasn't included?
 
Isn't that argument a bit obvious though? I mean, the analogue is that unified shaders are a dead end if implemented as in Xenos, despite US being the future! :p Technology moves on, and rather than reuse the hardware designs of yesteryear, if new designs offer a better solution, go with them. So eDRAM remains an option, only as a more versatile local scratch-pad RAM pool, and unified shaders are an option (a given!), only in a more versatile form than Xenos.

Although I do wonder, why was Xenos' eDRAM implemented as it was? What's the overhead in adding texture reads from it, that this functionality wasn't included?
A more versatile local scratch-pad memory would have to be on the main chip (not on a daughter die), no?
 