This sort of thing is what I expect next-gen...

Vysez said:
Sonic said:
Far Cry on a Radeon 9800 Pro doesn't matter much right now, since the game isn't optimized for a fixed target where the developer knows exactly what to expect and what he/she will get from the hardware. Far Cry is optimized in a different way: it's supposed to perform well on a range of graphics cards with varying levels of features. With that said, I think the game will be trounced next gen in the graphics department simply because there will be much more processing power available. Remember, the further along we go, the better the hardware makers (NV, ATI) will be able to make shaders. Only now have we seen such a focus on shaders, but by the time R500 rolls around it may have triple or quadruple the shader performance of what we have today. Also take RAM limitations into account, and how much Windows and all the other resources eat up. In a console all of that is minimized and you can focus that memory on exactly what you want.

I second that, and when you know that Far Cry uses only a third as many shaders as Half-Life 2 (which runs @60fps @1024 on a "simple" R360), for example, you can easily imagine that a console (with its "locked specs") with more computational power than today's high-end graphics cards could top any recent game, especially at 640x480.

I also agree that a good implementation of LOD techniques could do miracles when it comes to realtime rendering.
Next gen could/will reach the point where polygon counts approach the number of pixels on screen (@640x480, of course); add to that some nice HOS, and I believe that polygons won't be the primary problem next gen.
Lighting techniques will need most of the work if developers are looking for a CGI look in realtime, and when you see some of the "recent" (everything is "relative") breakthroughs in HDR, SHL, stencil shadows etc., and if developers couple those with some shaders (soft shadowing techniques, ...), the CGI look won't be far off. IMHO.

I'm sure most systems will support 1080i and/or 720p next gen.
Considering there are already XBOX games supporting this, I expect most XBOX2 games to support it at the very least.
 
Fafalada said:
I imagine by the time we can afford those storage requirements the computing power will be up there too. Did you look at the space used for that table scene? :oops:

Definitely looks great though, would make one heck of a tech demo for PS3 :D

What's a gig or two for 3 chairs and a table :)
 
Crazyace said:
Although all of the precomputed radiance stuff is really nice for those camera flybys of still scenes, it all starts to fall apart a bit when large amounts of geometry are animating, as the inherent self-occlusion and higher-order reflection/shadowing changes. It's good for technical demos, and another tool to use in games, but not a complete panacea... roll on proper real-time radiosity instead...

Radiosity IS a form of PRT: radiosity pre-calculates the diffuse energy per vertex. Maybe you can optimise the pre-calculation to the point that you can do it in real time, but it's still a form of PRT.

You can factor radiosity solvers as low-frequency, fixed-camera, fixed-light, diffuse-only PRT. What makes advanced PRT 'better' is that the lighting environment can be changed WITHOUT redoing the precomputation, and that specular (and other view-dependent changes) can be captured.

In radiosity you store a colour per vertex that represents the lighting (diffuse only) at that moment, whereas (in this latest paper, for example) you're storing a cube-map per vertex with a compressed version of the 6D lighting response (it's not fully general 6D; several dimensions are reduced).

In both systems any geometry changes will cause the lighting data to become invalid.
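The contrast above can be sketched in a few lines. This is a toy illustration, not anyone's actual implementation: the coefficient count, the transfer values, and the light vectors are all made up, and the "basis" stands in for whatever projection (spherical harmonics, compressed cube-maps) a real PRT system would use. The point is only that radiosity bakes a final colour, while PRT bakes a *response* that can be re-dotted with new lighting.

```python
N = 9  # coefficient count for a hypothetical low-order basis (e.g. 3-band SH)

# Radiosity-style bake: one fixed colour per vertex, valid only for the
# lighting environment it was computed under.
baked_colour = (0.4, 0.3, 0.2)

# PRT-style bake: a transfer vector per vertex (diffuse, monochrome here).
transfer = [i / N for i in range(N)]

def relight(transfer_vec, light_coeffs):
    """Exit radiance is the dot product of the baked transfer vector with
    the coefficients of the *current* lighting -- no re-bake needed."""
    return sum(t * l for t, l in zip(transfer_vec, light_coeffs))

# Two different lighting environments, same precomputed transfer data:
light_a = [1.0] + [0.0] * (N - 1)        # all energy in the constant term
light_b = [0.0, 1.0] + [0.0] * (N - 2)   # all energy in the first-order term

shade_a = relight(transfer, light_a)
shade_b = relight(transfer, light_b)
```

Swapping `light_a` for `light_b` changes the shading without touching `transfer`; what neither system survives is editing the geometry, which invalidates `baked_colour` and `transfer` alike.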
 
MonkeyLicker said:
I'm sure most systems will support 1080i and/or 720p next gen.
Considering there are already XBOX games supporting this, I expect most XBOX2 games to support it at the very least.

The 1080p support in the hardware is almost a given on the next-generation consoles, I'm sure of that.
What's not sure is the number of games that are going to trade effects/polygons for resolution.
And when you know that in the 2nd biggest market for videogames, Europe, HDTV is ranked alongside Santa Claus and the Loch Ness monster (we've heard about them, but we've never seen them) :D
There aren't a lot of reasons for developers/publishers to bother with HDTV (except 480p), IMHO, of course.

Today, while the Xbox indeed offers support for 1080i, only a few cross-platform games (such as Enter the Matrix) use it, for the obvious reason that the Xbox could run those cross-platform games at a higher resolution than the PS2 and GC.
A lot of "Xbox only" games (Halo 1&2, PGR2, Top Spin, Apex, etc.) trade 60fps for 30fps to have more raw power for the rendering of complex shader effects.

And when you see that XboxNext could be the first on shelves, and therefore could be the least powerful of the 3 machines, I think that Xbox "AAA" titles won't be high-res compatible, at least once the PS3 and N5 are released. They could make them HDTV compliant, but they would have to trade something for it (fps, shaders, polygons, textures...). If they do trade something it will be fps, because you can't see it on screenshots and videos.

I'll be satisfied if all games run at 60fps next gen; since HDTV is not the main priority 'round here, 30fps games should really disappear with this next gen (especially racing games). :D
 
Vysez said:
The 1080p support in the hardware is almost a given on the next-generation consoles, I'm sure of that.

I'm not, I won't be surprised if 1080i is the maximum supported.

There is no reason for 1080p from a commercial point of view; it would just cost MS, Sony and Nintendo to support a video encoder that can handle that level of bandwidth (from 4 MB per frame for 1080i to 8 MB per frame for 1080p) for those few people who have TVs that support it.

Video encoder chips aren't free; even a 50p increase in cost would probably stop it shipping, given the market penetration.

It also requires a minimum backbuffer size of 16 MB (32-bit colour and depth); that's probably more than a next-gen console will be able to use comfortably (all are likely to use eDRAM for backbuffers, and that makes it a very precious resource).
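The numbers quoted above are just raster arithmetic, and a quick sketch confirms them (assuming 4 bytes per pixel for colour and 4 for depth/stencil; the interlaced figure counts one 540-line field at a time):

```python
def frame_bytes(width, height, bytes_per_pixel):
    """Raw size of one buffer, in bytes."""
    return width * height * bytes_per_pixel

MB = 2 ** 20

# Interlaced output scans out one field (half the lines) per pass.
field_1080i = frame_bytes(1920, 540, 4)    # 32-bit colour, one field
frame_1080p = frame_bytes(1920, 1080, 4)   # 32-bit colour, full frame

# Backbuffer: 32-bit colour plus 32-bit depth.
backbuffer = frame_bytes(1920, 1080, 4) * 2

print(round(field_1080i / MB, 1))   # ~4 MB per 1080i field
print(round(frame_1080p / MB, 1))   # ~8 MB per 1080p frame
print(round(backbuffer / MB, 1))    # ~16 MB of precious eDRAM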

The XBOX was punching above its weight with regard to HDTV support; its PC heritage made high resolutions easy for ports (as you mentioned, it was mainly PS2 ports with spare fillrate and RAM that used very high resolutions). All NG consoles are likely to be 'proper' consoles, and supporting very high resolutions won't come for free.
 
DeanoC said:
Radiosity IS a form of PRT, radiosity is pre-calculating the diffuse energy per-vertex. Maybe you can optimise the pre-calculation to the point that you can do it real-time but its still a form of PRT.

I treat radiosity as a lighting solution based on secondary (and nth-degree) illumination as well as the primary light sources; precomputed radiance (like baked lighting) is what you are talking about.
(It will take a lot of passes to implement real 'realtime' radiosity, maybe more than other techniques such as photon mapping... but you want to have something to do with all of that CPU power :) )
 
Honestly, 720p and 1080i will be more than fine as far as resolution goes on the upcoming consoles. As long as we get high amounts of FSAA compared to this generation (often zero), games will look very sweet.


Oh, and next-gen games will make Far Cry look very, very dated since, as mentioned above, Far Cry isn't targeted at a fixed platform, unlike console games.
 
PRT can be quite nice for places where you might otherwise just use static lightmaps.

Anywho, some real time photon mapping would be great... well, any technique that could get us real time GI. Of course.
 
PRT needs better press :)

Unbounded (no resolution restriction) PRT is a complete solution to Kajiya's rendering equation; as such, it is as global as any GI can be.

Bad choice of name, I guess; it's no more pre-computed than any other GI (they all assume that the lighting data is valid only as long as the geometry doesn't change). Though it's not really a form of GI, as it doesn't care how you acquired your lighting data; it's just a way of storing it and playing it back under more general conditions than other GI techniques.

It's not just a replacement for static lightmaps in general (though the current real-time tech can only do fairly low-frequency work); in its high-frequency variants it's a replacement for any GI technique where you want to change the lighting without changing the geometry.
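The view-dependent case mentioned earlier in the thread can be sketched too. This is a toy, not any paper's method: in the diffuse case PRT stores a transfer *vector* per vertex, while glossy/view-dependent PRT stores a transfer *matrix* that maps incident-light coefficients to exit-radiance coefficients (which a real renderer would then evaluate in the view direction). The matrix values and basis size here are invented for illustration.

```python
N = 4  # coefficient count for a hypothetical low-order basis

# Toy glossy transfer matrix for one vertex: here it just attenuates every
# basis term by half, standing in for real precomputed transport data.
T = [[0.5 if i == j else 0.0 for j in range(N)] for i in range(N)]

def relight_glossy(T, light):
    """Exit-radiance coefficients = T @ light.
    New lighting environment -> new matrix-vector product, no re-bake."""
    return [sum(T[i][j] * light[j] for j in range(N)) for i in range(N)]

light = [2.0, 0.0, 0.0, 0.0]        # some lighting environment's coefficients
exit_coeffs = relight_glossy(T, light)

# Changing `light` requires no recomputation; changing the geometry
# invalidates T, exactly as the post says.
```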
 
Well, looking at what a beefed-up GF3 and PS2 can do today with just a bit of optimization, I think my belief that next gen will exceed by a fair margin whatever Far Cry is doing is more than justified. After all, from PSOne to PS2 there was what, a tenfold increase in polygon-pushing power, plus a myriad of other effects, so why will next gen be any different?

You forgot the Gekko & Flipper within the GC. The sheer amount of differing detail & number of simultaneous characters in RE4 surpasses anything I've yet seen on a console. (10-20 attacking at once)
 
RE4 does it all. That big ogre creature looked amazing, and there were motion blur and particle effects going at the same time too. That swamp water looked amazing as well.
 