Do colors make a big difference in performance?

I noticed that the more technically impressive a game is considered, the smaller its color palette tends to be. Exactly how does coloration affect performance? I remember the creators of Mass Effect 2 saying they had a larger color palette than in the first game, as if this were a colossal achievement.

What exactly is the impact of colors on GPU performance and the graphics overall?
 
Couldn't the use of fewer colors mean the textures are smaller in size? Doesn't a texture get larger the more colors are used?

I could see this if the game were in black and white (right?), but with the palette just being muted I don't know that it would have much of an impact on performance.
 
Think of it like this: if you take a digital photo (think of the digital photo as a texture) of a room that is completely white and one of a room that is colorful, will one photo take up fewer MB than the other? :p
 
I think it would be possible to gain performance from games being close to monochromatic - e.g. use 1- and 2-channel textures instead of full 3- and 4-channels, thus reducing texture bandwidth. I'm not sure anyone bothers.

In the PS2 age, when the hardware had extensive support for paletted textures, developers most certainly did use it.
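To put rough numbers on both points above (a hypothetical, uncompressed 1024x1024 texture; Python only for the arithmetic):

```python
# Uncompressed storage for a hypothetical 1024x1024 texture at 8 bits per channel,
# plus an 8-bit paletted version (one byte index per texel + a 256-entry RGBA palette).
texels = 1024 * 1024

sizes = {
    "RGBA (4 channels)":      texels * 4,
    "luminance+alpha (2 ch)": texels * 2,
    "luminance only (1 ch)":  texels * 1,
    "8-bit paletted":         texels * 1 + 256 * 4,
}
for name, size in sizes.items():
    print(f"{name}: {size / (1024 * 1024):.2f} MB")
```

So dropping channels (or indexing into a palette) can cut the raw footprint substantially, which is the bandwidth argument; whether anyone designs their art direction around it is another matter.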
 
Everything would be a lot faster at 16 bit, too, I guess, but no-one in their right mind would want to go back from 32 bit...
 
The original post was asking whether aesthetics use fewer resources - whether the choice of a limited palette by the artists frees up resources elsewhere. The answer is a categorical 'no'. There can be talk of hardware designs using resources for colours, like supporting different bit-depth render modes, but those concern the total displayable colours, not the choice of colours. Likewise palettised textures have a limited number of colours, but that doesn't stop you using lots of different palettes for lots of different colours on screen. Then there's the case that an engine could work in 256 colours only, which would use fewer resources.

But for years, every GPU has supported 32 bit colour in hardware, and switching to lower colour resolutions (still not affecting the choice of colours by artists!) would just see this hardware sitting idle and gain no extra performance to use elsewhere.

Hence, when all is said and done, hardware resources are not a reason for a developer to pick a particular low-colour aesthetic. KZ2 wasn't grey-brown because that freed memory and processing power for the deferred rendering, but because that was the artists' choice to make it look gritty and miserable to match the world being invaded.
 
Switching to 16-bit framebuffers would still cut memory and bandwidth requirements in half at many places throughout the pipeline. Then again 16-bit imagery would be hideously ugly with lots of dithering and such (remember, Voodoo1-2 image quality)
 
Well, technically a lower color palette could reduce the memory footprint of textures, as the likelihood of compressible sections of the image increases, leading to greater compression ratios. Would it be significant or affect development goals? Hard to say.
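A quick way to see that effect with a generic lossless compressor (a toy sketch - zlib just stands in for "compressibility" here, it's not how game textures are compressed):

```python
import os
import zlib

# A flat, single-colour 256x256 RGB image vs. one filled with random noise:
# same uncompressed size, very different compressed size.
flat = bytes([90, 90, 90]) * (256 * 256)
busy = os.urandom(256 * 256 * 3)

for name, data in (("flat/muted image", flat), ("noisy/colourful image", busy)):
    ratio = len(zlib.compress(data, 9)) / len(data)
    print(f"{name}: compresses to {ratio:.1%} of original")
```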

Regards,
SB
 
Think of it like this: if you take a digital photo (think of the digital photo as a texture) of a room that is completely white and one of a room that is colorful, will one photo take up fewer MB than the other? :p
Actually, if the photo is compressed (e.g. JPG), the one with less contrast and fewer colors will take up fewer MB ;)

Personally, I'm not a fan of vivid colors in games. They make things look cartoony (think Halo, World of Warcraft etc.).
 
I noticed that the more technically impressive a game is considered, the smaller its color palette tends to be. Exactly how does coloration affect performance? I remember the creators of Mass Effect 2 saying they had a larger color palette than in the first game, as if this were a colossal achievement.

What exactly is the impact of colors on GPU performance and the graphics overall?
None in practice (except for "hyperintelligent shades of the colour blue" :) )
Couldn't the use of fewer colors mean the textures are smaller in size? Doesn't a texture get larger the more colors are used?
Not really. The texture compression schemes used in current gen systems are fixed rate (e.g. typically 2, 4 or 8bpp), so it's the range of colours and how rapidly they change that may affect the visual quality of the compressed result (see the sketch at the end of this post).
I think it would be possible to gain performance from games being close to monochromatic - e.g. use 1- and 2-channel textures instead of full 3- and 4-channels, thus reducing texture bandwidth. I'm not sure anyone bothers.
Actually, there were flight simulators from (IIRC) the early 1980s that used a monochrome texture map which was then used to vary the intensity of a colour. It probably worked quite ok for things like grass or concrete textures.
Everything would be a lot faster at 16 bit, too, I guess, but no-one in their right mind would want to go back from 32 bit...
Switching to 16-bit framebuffers would still cut memory and bandwidth requirements in half at many places throughout the pipeline. Then again 16-bit imagery would be hideously ugly with lots of dithering and such (remember, Voodoo1-2 image quality)
Well, not all systems were badly affected by using a 16bpp framebuffer, provided the shading was done (in an on-chip tile) at 32bpp.
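To illustrate the fixed-rate point above with the standard BC1/BC3 block sizes (8 and 16 bytes per 4x4 block) - note that the texture's content never enters the calculation:

```python
# Block-compressed texture size depends only on the dimensions, never on content.
# BC1/DXT1: each 4x4 texel block is 8 bytes (4bpp); BC3/DXT5: 16 bytes (8bpp).
def bc_texture_size(width, height, bytes_per_block):
    blocks_x = (width + 3) // 4
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * bytes_per_block

for w, h in ((1024, 1024), (512, 512)):
    print(f"{w}x{h}: BC1 = {bc_texture_size(w, h, 8) // 1024} KB, "
          f"BC3 = {bc_texture_size(w, h, 16) // 1024} KB")
```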
 
Actually, if the photo is compressed (e.g. JPG), the one with less contrast and fewer colors will take up fewer MB ;)

Personally, I'm not a fan of vivid colors in games. They make things look cartoony (think Halo, World of Warcraft etc.).

Ok, didn't consider using compression :p
 
Ok, didn't consider using compression :p
JPG would only be useful for application storage or transmission - it is of dubious use* for 3D rendering as it doesn't allow random access.


*Yes I know there were hardware decompressors but those had to fully decode the texture or use something like Microsoft's TREC format.
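The random-access point is easy to see with fixed-size blocks: the byte offset of the block containing any given texel can be computed directly, which a variable-rate stream like JPEG can't offer. A small sketch using BC1's 8-byte 4x4 blocks:

```python
# With fixed 8-byte 4x4 blocks (BC1), the GPU can jump straight to the block
# holding texel (x, y); a JPEG bitstream has no such direct byte offset.
def bc1_block_offset(x, y, width):
    blocks_per_row = (width + 3) // 4
    block_index = (y // 4) * blocks_per_row + (x // 4)
    return block_index * 8

print(bc1_block_offset(130, 66, 1024))  # offset of the block for texel (130, 66)
```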
 
Switching to 16-bit framebuffers would still cut memory and bandwidth requirements in half at many places throughout the pipeline. Then again 16-bit imagery would be hideously ugly with lots of dithering and such (remember, Voodoo1-2 image quality)
16 bit framebuffers aren't limiting the artists to low-key artwork. It's not like switching to 16 bit requires a palette of only 65536 colours, but you have the full range of colours and shades, only dithered. Picking only greys and browns, perhaps covering 65536 colours, isn't going to free up resources in modern games that are using 32 bit (minimum) data formats. Even less so when the pipelines are HDR. Greyscale is still greyscale, and though in theory you could go luminance only and ditch a load of data bits, no game is doing that, certainly none that the OP is thinking of.

At the end of the day, if you are writing a game for PS360 and/or PC, you aren't ever going to pick a colour scheme based on performance issues. Picking between a look like Halo 2 or KZ2 isn't going to make the difference of 5% less processing and BW for KZ2 just because it's low key, all greys and browns. Pipelines are 32 bits RGBA at least, in hardware, so at no extra cost to use than not, and choosing just a subset of available hues and intensities isn't going to free anything up for other parts of your game. If RAM is an issue, textures can be shrunk. If BW is an issue, resolutions can be shrunk. Models can be simplified. Limiting the range of hues doesn't help anything.
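A trivial way to see why the choice of hue can't save anything in a 32 bit pipeline (hypothetical RGBA8 pixel values):

```python
import struct

# In an RGBA8 framebuffer every pixel costs 4 bytes, whether it's a drab
# grey or a saturated magenta - the choice of hue changes nothing.
grey = struct.pack("4B", 70, 70, 70, 255)
magenta = struct.pack("4B", 255, 0, 255, 255)
print(len(grey), len(magenta))  # 4 4
```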
 
Shifty, a 16 bit frame buffer is exactly one half of a 32 bit frame buffer, and it takes about half the bandwidth to read and write into it. Even if the GPU is already equipped to work at high color precision internally, the VRAM and its bus would still be less of a limit if you'd cut the traffic nearly in half.

But of course it's an absurd idea to go for an R5G6B5 format buffer, as it would mean nearly unacceptable image quality loss, banding, dithering and such - I've only used it as an example to illustrate that every single game already pays a significant performance penalty for colour precision, and some are willing to pay even more (like Frostbite 2 with its 5 RGBA buffers for deferred rendering).
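Rough per-frame numbers at 1280x720 (hypothetical formats and render-target counts, not any particular engine's layout):

```python
# Framebuffer memory at 1280x720 for a 16-bit buffer, a 32-bit buffer,
# and a deferred-style set of several 32-bit render targets.
pixels = 1280 * 720

def mb(bytes_per_pixel, targets=1):
    return pixels * bytes_per_pixel * targets / (1024 * 1024)

print(f"R5G6B5, single buffer : {mb(2):.1f} MB")
print(f"RGBA8, single buffer  : {mb(4):.1f} MB")
print(f"4x RGBA8 G-buffer     : {mb(4, targets=4):.1f} MB")
```

Every one of those buffers also has to be read and written, so the byte counts translate fairly directly into bandwidth as well.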
 
I can't say how this works nowadays, but in antediluvian times when GPU dinosaurs ruled the earth, 16 bit and 32 bit colour depths made a huge difference in performance.

For instance, I had a Voodoo 3 (PCI-2 port) and the Matrox G400 MAX AGP.

I greatly preferred the Matrox G400, but switching between the two cards revealed a very jerky and slow framerate on it, which somewhat spoiled the G400's better image quality. Despite AGP being the superior bus, despite supporting 32 bit, despite being designed with Direct3D in mind, etc., the 16 bit output of the Voodoo 3 was orders of magnitude faster (Glide API performance was unbeatable, but Direct3D was also very fast).

There were comparisons showing the Quake 3 sky with lots of colour banding on the 16 bit Voodoo 3 compared to the 32 bit G400 MAX, so the G400 was clearly better visually, but the framerate on the Matrox was a pain in the ass.
 
Today's GPUs have internal ALUs that can deal with higher precision without performance hits, so a lot of the differences no longer apply. There's also framebuffer compression, so reading and writing a 32 bit frame buffer is not exactly twice as demanding as a 16 bit buffer; and of course the X360's EDRAM is equipped to deal with this on top of a lot of other things. Back with the Voodoo3 it was possible to measure frame rates nearly 2x as high for 16 bit; today that would not be the case, but there would still be some difference.
 
I can't believe I'm having this conversation. Laa-Yosh, when a developer picks a 16 bit FB, does that lock them out of the majority of colours available? Does an artist end up with only a limited range of colours to pick from, only low-key browns and greys? No. They lose colour fidelity, but not colour range. So choosing 16 bit doesn't affect the choice of colours and vice versa; wanting to save resources by going to 16 bit FBs isn't going to require a limited colour palette.

Yes, someone could choose a 16 bit FB format, but they can go primary-school colours with that, or psychedelic, or true-to-life, or next-gen brown. The colour aesthetic isn't compromised in any way whatsoever - it'll just be dithered or posterized. That's as per the OP and the question being asked: not "does lower colour fidelity have performance savings?" but "does a limited palette?"
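A small sketch of the "fidelity, not range" point: any 24-bit colour can be packed into R5G6B5 and expanded back (standard bit replication), it just comes back slightly quantized.

```python
# Pack an 8-bit-per-channel colour into R5G6B5 and expand it back via bit
# replication: every hue is still representable, only with less precision.
def through_rgb565(r, g, b):
    packed = ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)
    r5, g6, b5 = (packed >> 11) & 0x1F, (packed >> 5) & 0x3F, packed & 0x1F
    return ((r5 << 3) | (r5 >> 2), (g6 << 2) | (g6 >> 4), (b5 << 3) | (b5 >> 2))

print(through_rgb565(200, 120, 40))   # a muted brown survives, slightly off
print(through_rgb565(20, 240, 180))   # so does a vivid teal
```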
 