Tim Sweeney interview over @ BeyondUnreal

RingWraith said:
Isn't he going a little overboard with the 1 GB video card???

We aren't even breaking the 128 MB limit yet. Anything over 512 MB, even 2 years from now, will be overkill.

If you want to be playing today's games in two years' time, you are correct. Otherwise, you are more than likely making an incorrect assumption.

As humans, our eyes can only see so much detail, especially in fast-moving sequences. Beyond a certain point, extra texture detail will be unnecessary and go unnoticed.

The human eye can only detect close to TRUE 32-bit color; anything higher wouldn't matter.

Well, I can certainly perceive the difference between in-game graphics and the real world, and if I were asked to put a number on it I'd say the difference between the game and reality was more than a factor of 4 in complexity (more like 10^4, I'd say). So 1 GB is nowhere near enough memory on a graphics card :)
 
nutball said:
RingWraith said:
The human eye can only detect close to TRUE 32-bit color; anything higher wouldn't matter.

Well, I can certainly perceive the difference between in-game graphics and the real world, and if I were asked to put a number on it I'd say the difference between the game and reality was more than a factor of 4 in complexity (more like 10^4, I'd say). So 1 GB is nowhere near enough memory on a graphics card :)
That doesn't really have anything to do with colour depth. The fact that computer graphics often don't look very realistic is due more to the quality of the art and the lighting techniques than to anything else.
 
Oy, hasn't this been discussed ad nauseam? The whole 32-bit color argument is moot in computer graphics. The extra precision is needed to ensure that in multi-pass situations, errors don't creep in because of a lack of color precision.
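To put a number on that, here's a quick sketch (the blend weights are made up purely for illustration) of how quantizing to 8 bits per channel after every pass lets error creep in, compared with keeping full precision throughout:

```python
# Sketch: error accumulation when every blend pass rounds the
# framebuffer back to 8 bits per channel. The specific blend
# (70% feedback plus a dim 0.01 term) is an arbitrary example.

def quantize8(x):
    """Round a [0, 1] value to the nearest 8-bit step, as an 8-bit framebuffer would."""
    return round(x * 255) / 255

value_float = 0.5   # accumulator kept at full precision
value_8bit = 0.5    # accumulator quantized after every pass

for _ in range(8):  # eight blend passes
    value_float = 0.7 * value_float + 0.01
    value_8bit = quantize8(0.7 * value_8bit + 0.01)

print(f"full precision: {value_float:.6f}")
print(f"8-bit passes  : {value_8bit:.6f}")
print(f"drift         : {abs(value_float - value_8bit):.6f}")
```

The drift per pass is tiny, but it compounds, and with more passes or darker values it shows up as visible banding.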
 
Reviving an old thread :p

All that old discussion about 32-bit being enough looks rather silly now that everyone knows the importance of HDR :)
 
I expected this. If we want to break through the bloom-and-ugly-low-res-textures barrier that afflicts almost every new game, it's going to cost memory (and bucks). Hail to Epic.
 
Well, a 2048×2048 texture in DXT1 format only takes up 2 MiB, so someone might think 256 of those textures are enough for one level/map. But that doesn't take into account what a high-res antialiased framebuffer plus several render targets take away, and you need much more than a single diffuse color map to simulate realistic surfaces. Even with compression, I think that with all those layers you can get beyond 32 bits per texel on average.

And for realistic environments you do want to reach at least 1 texel per cm², which is still low in places. With those numbers, 512 MiB of textures gives you only about 13,500 m² of uniquely textured surface.
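Working those numbers through explicitly (a quick sketch; the 32 bits per texel average and the 1 texel/cm² density are just the assumptions above):

```python
# Back-of-the-envelope texture budget, using the assumptions above.

# DXT1 stores 4 bits per texel, so a 2048x2048 texture is:
dxt1_bytes = 2048 * 2048 * 4 // 8
print(f"2048x2048 DXT1 texture: {dxt1_bytes / 2**20:.0f} MiB")  # 2 MiB

# Assume 32 bits (4 bytes) per texel averaged across all surface layers
# (diffuse, normal, specular, ...), and 1 texel per cm^2 of surface.
budget_bytes = 512 * 2**20      # 512 MiB of texture memory
texels = budget_bytes // 4      # texels at 4 bytes each
area_m2 = texels / 10_000       # 10,000 cm^2 per m^2
print(f"512 MiB at 32 bits/texel, 1 texel/cm^2: {area_m2:,.0f} m^2")
```

That comes out to roughly 13,400 m², i.e. a square only a bit over 100 m on a side.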
 
Just to throw an example into the mix...

I've been writing an HDRI demo in D3D9 for the last week or so. It's far from highly optimized for space (such things aren't important in this instance), but a 1024×768 display of a spinning cube can chew up close to 30 MB of VRAM just for the intermediate stages of the pipeline. That's not even beginning to count any actual art assets or geometry storage :D

From my point of view, I can easily see ways of chewing up any/every bit of VRAM I'm allowed ;)
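
For the curious, the arithmetic adds up quickly. Here's a sketch of where that kind of memory goes; the pass list is a generic HDR tone-mapping setup assumed for illustration, not necessarily what this demo actually allocates:

```python
# Rough VRAM tally for the intermediate surfaces of a 1024x768 HDR
# pipeline in D3D9. The surface list is an assumed, generic setup.

W, H = 1024, 768

surfaces = [
    ("back buffer (X8R8G8B8)",      W,      H,      4),
    ("depth/stencil (D24S8)",       W,      H,      4),
    ("HDR scene (A16B16G16R16F)",   W,      H,      8),
    ("post-process ping (FP16)",    W,      H,      8),
    ("post-process pong (FP16)",    W,      H,      8),
    ("bright pass (FP16, 1/2 res)", W // 2, H // 2, 8),
    ("bloom ping (FP16, 1/4 res)",  W // 4, H // 4, 8),
    ("bloom pong (FP16, 1/4 res)",  W // 4, H // 4, 8),
]

total = 0
for name, w, h, bytes_per_pixel in surfaces:
    size = w * h * bytes_per_pixel
    total += size
    print(f"{name:30s} {size / 2**20:6.2f} MiB")

print(f"{'total':30s} {total / 2**20:6.2f} MiB")
```

Even that conservative list lands in the mid-20s of MiB before a single texture or vertex buffer exists, and a luminance-measurement chain or multisampling pushes it higher still.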

Jack
 
Sweeney said the same stuff about Unreal Tournament 2003: that 128 MB cards could not display the full graphics and that 256 MB cards would be needed. At the time, though, his comments were very Nvidia-slanted. Later this proved to be a completely misleading statement, one that they "re-explained" a few times, if I remember right.

We'll see what a 256 MB card can do. Perhaps he knows that Nvidia is releasing a 1 GB card or something? :rolleyes:
 