Future Hardware

"64bit color (Billions of colors)"

Is there any point when most people are now using 24-bit LCDs (some are only 18-bit)?
 
"64bit color (Billions of colors)"

Is there any point when most people are now using 24-bit LCDs (some are only 18-bit)?

There was 4-bit, 8-bit, 16-bit, 24-bit, 32-bit.
I'm not sure where 18-bit comes from?

Yes, there is a point in going to 64-bit..... (Maybe at the professional level first)
 
I think we need to be able to display it before actually moving to a higher precision target. 10bpc is a fairly significant increase that is already there but I'm not sure how 16bpc would really help, besides sucking up bandwidth over the interface cables.
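For scale (my own back-of-the-envelope numbers, assuming a 2560x1600 panel at 60 Hz and ignoring blanking overhead), the raw cable bandwidth grows linearly with output depth, which is why 16 bpc mostly just eats interface bandwidth:

```python
# Rough cable-bandwidth estimate for a 2560x1600 display at 60 Hz.
# Blanking-interval overhead is ignored for simplicity.
WIDTH, HEIGHT, REFRESH_HZ = 2560, 1600, 60

def cable_gbps(bits_per_pixel: int) -> float:
    """Raw video payload in gigabits per second."""
    return WIDTH * HEIGHT * REFRESH_HZ * bits_per_pixel / 1e9

for label, bpp in [("8 bpc (24-bit)", 24), ("10 bpc (30-bit)", 30), ("16 bpc (48-bit)", 48)]:
    print(f"{label}: {cable_gbps(bpp):.2f} Gbit/s")
```

So going from 8 bpc to 16 bpc doubles the link rate for a gain that's hard to see.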

I'd say the programmable interpolators and a move towards rendering with NURBs would be a nice step. That should take polygon size down below that of a pixel. Move away from triangles towards 'point primitives'. That IMHO would be the biggest IQ gain to be had at this point in time.

That and to nicely raytrace for lighting and shadows some form of surface quicksort would be quite handy. Hardware accelerating some basic raytracing techniques would be extremely useful.
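For reference, the kind of "basic raytracing technique" that could be hardware-accelerated really is tiny per ray — little more than solving a quadratic. A minimal ray-sphere intersection sketch (my own illustration, not any actual hardware design):

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Return the nearest positive hit distance t along a normalized ray,
    or None on a miss: solve |o + t*d - c|^2 = r^2 for t."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * k for d, k in zip(direction, oc))
    c = sum(k * k for k in oc) - radius * radius
    disc = b * b - 4.0 * c  # a == 1 for a normalized direction
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# Ray from the origin straight down -z at a unit sphere centered at (0, 0, -5):
print(ray_sphere_t((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # nearest hit at t = 4.0
```

The hard part isn't this math, it's sorting/traversing millions of surfaces per frame — hence the wish for some form of surface sort in hardware.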
 
64-bit color (billions of colors) vs. the 32-bit (16.7 million) we have now; I know Linux has 64-bit in software, but I'd like to see it in hardware.

Something like this "resolution" 2560x1600x64bit

There are FP16 framebuffers and they're used in games today (but that's not really related to the "number of colours" shown on the screen).
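(To show what those FP16 channels actually buy you — a quick round-trip through half precision, my own illustration using Python's stdlib half-float support:)

```python
import struct

def to_fp16_and_back(x: float) -> float:
    """Round-trip a value through IEEE 754 half precision -- the storage
    format of one FP16 framebuffer channel."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# FP16 has a 10-bit mantissa, so values keep roughly 3 decimal digits,
# but the range extends far beyond [0, 1] -- that's the HDR part.
print(to_fp16_and_back(0.1))      # ~0.0999755859375
print(to_fp16_and_back(1000.0))   # 1000.0 (exactly representable)
```

So FP16 buys dynamic range for intermediate math, not more distinct on-screen colours.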
 
1024-bit external memory bus for 100s of GB/sec

plus ultra high-bandwidth EDRAM in the TB/sec range.

for lack of a better term "better true 3D motion blur"

cost-free 16x AA (à la Xenos' 4x AA) - a reasonable performance hit for 32x AA

better lighting

more geometry

etc.

:)
 
better lighting

more geometry
"Future" hardware won't give you either/both. Someone needs to write a better software algorithm for the first, while the second is simply too time-costly for artists (i.e. current HW triangle rates are safely ahead of software demands).

As for my wish, I'd vote for really good geometry shaders efficiently performing HOS tessellation.
 
64-bit color (billions of colors) vs. the 32-bit (16.7 million) we have now; I know Linux has 64-bit in software, but I'd like to see it in hardware.

Something like this "resolution" 2560x1600x64bit

How about this: high-precision internal framebuffer -> fast, error-dispersion dithering -> 24 bpp on screen? :)
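The dithering step is cheap enough to sketch in a few lines — a stripped-down, 1-D error-diffusion pass (my own toy version, simplified Floyd-Steinberg along one scanline):

```python
def dither_scanline(values, levels=256):
    """Quantize high-precision channel values in [0, 1] to `levels` steps,
    pushing each pixel's rounding error onto the next pixel so the average
    level survives quantization (1-D error diffusion, sketch only)."""
    out, err = [], 0.0
    for v in values:
        target = v + err
        q = round(target * (levels - 1)) / (levels - 1)
        err = target - q
        out.append(q)
    return out

# A shallow gradient that would band at 8 bpc keeps its average level:
ramp = [i / 4096 for i in range(16)]
print(dither_scanline(ramp, levels=256))
```

Real hardware would diffuse the error in 2-D, but the principle is the same: trade banding for fine-grained noise.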

IMHO output colour depth is least of problems when going photorealism (for now)

My pick: better (global) illumination algorithms with good looking shadows on top :)
 
How about this: high-precision internal framebuffer -> fast, error-dispersion dithering -> 24 bpp on screen? :)

Why not what we can have right now? FP16 framebuffer -> tone mapping algorithm -> 30 bpp on screen (on a CRT over VGA :p). Will the PC get the equivalent of the "HDMI 1.3 deep colour" buzz on the digital side?
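The tone-mapping step in that pipeline can be as simple as the classic Reinhard operator — one common choice, sketched here per-channel (my illustration, assuming linear HDR input):

```python
def reinhard(l: float) -> float:
    """Map an HDR luminance in [0, inf) into [0, 1): L / (1 + L)."""
    return l / (1.0 + l)

def to_10bit(l: float) -> int:
    """Quantize the tone-mapped value to one 10-bit channel (30 bpp total)."""
    return round(reinhard(l) * 1023)

# Arbitrary HDR values compress smoothly into the displayable range:
for hdr in (0.0, 0.5, 1.0, 4.0, 100.0):
    print(hdr, "->", to_10bit(hdr))
```

Note how even a luminance of 100 still maps below full white — that compression is what makes the FP16 buffer useful on a fixed-depth display.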
 
Why not what we can have right now? FP16 framebuffer -> tone mapping algorithm -> 30 bpp on screen

Heh, agreed! What I wanted to stress is that I'm yet to see annoying quantization artifacts at 24 bpp. When there is banding, it's usually a problem with the input data / lack of interpolation somewhere / a poor algorithm / anything but framebuffer depth.
Sure, it would be nice if IHVs forced 48 bpp and then said: "ok guys, this is a dead end, now we are working on the rest, and if anyone dares to ask for more we'll tell them to go to hell" :)

My dream hardware: a PetaHz serial processor with a quadruple-precision FPU and zero memory latency (cough... yes, I used to read fantasy books)
 
When I look at the new Crysis vids my answer would be... nothing.

Maybe a little more geometry shading power, so buildings can burst into even more debris particles when you drive through them with a tank...


But then again, back in the days of Unreal I also thought that one couldn't possibly desire any more visual beauty...
 