What is stopping Next-Gen developers from using High-Res textures?

I'd just like to support what cloudscapes is saying. I just tried the new version of BartPE, and after booting, "Windows" is using 17MB of commit charge (that's RAM + page file), and that includes the 8MB for BartPE's menu system (which wouldn't be needed in a true gaming OS).

A typical XP system is using around 150-200MB at boot (more if you run resident anti-virus programs etc.). This kind of memory usage doesn't scale with the hardware, so it's pure overhead: whether you're running an AMD FX-60 or a Pentium III 500, you're going to lose that much memory. So while a 64MB Xbox is roughly equivalent to a 256MB PC, you can't apply the same ratio when the PC has 512MB or 1GB of RAM. If anything, the Xbox 360 (for instance) is going to use a little more memory for its "gaming kernel" than the original Xbox did.

My own installation of Windows XP is tweaked to the hilt, and after boot it's using less than 100MB of commit charge (around 70MB of RAM).
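To put a number on "commit charge", here's a minimal Win32 sketch (purely illustrative) of reading the same figure Task Manager shows. GlobalMemoryStatusEx reports the commit limit and the available commit; their difference approximates the current commit charge (RAM + page file in use).

```cpp
#include <windows.h>
#include <cstdio>

int main()
{
    MEMORYSTATUSEX status = {};
    status.dwLength = sizeof(status);
    if (GlobalMemoryStatusEx(&status))
    {
        // ullTotalPageFile is the current commit limit, ullAvailPageFile the
        // commit still available; the difference is the system commit charge.
        DWORDLONG commitMB =
            (status.ullTotalPageFile - status.ullAvailPageFile) / (1024 * 1024);
        printf("Approximate commit charge: %llu MB\n",
               (unsigned long long)commitMB);
    }
    return 0;
}
```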
 
scooby_dooby said:
Windows resources will also rise, no? Windows Vista is around the corner, so PCs are again going to need even more RAM in the future; the cycle continues...

They will indeed. As new and more complex versions of Windows come out, the need for more RAM will only grow. I could even use that to answer the initial poster's question: if Windows XP were a 360 game with lower-res textures and Windows Vista were the PC port with higher-res textures, the same dynamic would apply.

Maybe I'm going off on a tangent here though. Things are definitely not as simple as black or white, or double the memory for double the textures. There are many factors, but I'm saying (mostly to !eVo!-X Ant UK and some others) that the main reason the texture resolution is lower is RAM limitations.

I have worked (and am working) on a few games that have been ported to both PC and console, and this is what I see. I have personally worked with resource managers that split source textures into a full-res PC version and a lower-res console version. It had to be done that way, or we'd run the console out of memory and crash.
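To make the split concrete: one common way a resource pipeline produces the lower-res console set from the same source art is simply to drop the top mip level of each texture, which halves each dimension and roughly quarters the memory. This is an illustrative sketch of the arithmetic, not any particular studio's tool (assuming uncompressed 32-bit textures for simplicity).

```cpp
#include <cstdio>

// Bytes for one mip level of an uncompressed 32-bit (RGBA8) texture.
unsigned mipBytes(unsigned w, unsigned h) { return w * h * 4; }

// Total bytes for a full mip chain, optionally skipping the top level(s),
// the way a console build might ship reduced assets.
unsigned chainBytes(unsigned w, unsigned h, unsigned skipTopLevels)
{
    for (unsigned i = 0; i < skipTopLevels && w > 1 && h > 1; ++i)
    {
        w /= 2;
        h /= 2;
    }
    unsigned total = 0;
    while (w > 1 || h > 1)
    {
        total += mipBytes(w, h);
        if (w > 1) w /= 2;
        if (h > 1) h /= 2;
    }
    return total + mipBytes(1, 1); // the final 1x1 level
}

int main()
{
    printf("1024x1024, full chain   : %u KB\n", chainBytes(1024, 1024, 0) / 1024);
    printf("same asset, top mip cut : %u KB\n", chainBytes(1024, 1024, 1) / 1024);
    return 0;
}
```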
 
cloudscapes said:
There are many factors, but I'm saying (mostly to !eVo!-X Ant UK and some others) that the main reason the texture resolution is lower is RAM limitations.

I'll go along with that, but then if a console and a PC had the SAME amount of RAM, the console would put out better results.
 
cloudscapes said:
They will indeed. As new and more complex versions of Windows come out, the need for more RAM will only grow. I could even use that to answer the initial poster's question: if Windows XP were a 360 game with lower-res textures and Windows Vista were the PC port with higher-res textures, the same dynamic would apply.

Maybe I'm going off on a tangent here though. Things are definitely not as simple as black or white, or double the memory for double the textures. There are many factors, but I'm saying (mostly to !eVo!-X Ant UK and some others) that the main reason the texture resolution is lower is RAM limitations.

I have worked (and am working) on a few games that have been ported to both PC and console, and this is what I see. I have personally worked with resource managers that split source textures into a full-res PC version and a lower-res console version. It had to be done that way, or we'd run the console out of memory and crash.

But with that said, what's your opinion of the texture quality we're seeing so far in a lot of 360 games? I mean, as far as I'm concerned, a lot of it is horrible compared to the potential these machines have.

For example, COD2 has worse textures than Splinter Cell 3, but in reality the 360 should be able to pull off effects far, far superior to anything ever done on Xbox, so I really don't think there's any excuse, not at this point in time.

I think it's just an issue of rushed games, or games developed for PC or last-gen systems that simply aren't coming close to exploiting the system.
 
!eVo!-X Ant UK said:
I'll go along with that, but then if a console and a PC had the SAME amount of RAM, the console would put out better results.

Yes. Because Windows would be chugging the RAM down so hard that the game probably wouldn't even get to the menu screen.

However, if you have a PC with exactly the amount of RAM Windows requires, plus exactly 64MB of RAM/VRAM shared memory, then in all likelihood it will run.
 
cloudscapes said:
Yes. Because Windows would be chugging the RAM down so hard that the game probably wouldn't even get to the menu screen.

However, if you have a PC with exactly the amount of RAM Windows requires, plus exactly 64MB of RAM/VRAM shared memory, then in all likelihood it will run.

Agreed
 
cloudscapes said:
Yes. Because Windows would be chugging the RAM down so hard that the game probably wouldn't even get to the menu screen.

However, if you have a PC with exactly the amount of RAM Windows requires, plus exactly 64MB of RAM/VRAM shared memory, then in all likelihood it will run.


But on a PC, Windows is always chugging RAM: people have antivirus and firewall software running, they're often logged into messaging systems, and they have other tasks open. On top of that there are all kinds of settings, such as page files and virtual memory, and it all adds up to a very unpredictable value for "exactly the amount of RAM Windows takes".

So while consoles can utilize every single last MB on the system, PC games are always working off a best-guess sort of setup.

For example, if your game has a minimum spec of 1GB of RAM, how much do you as a developer actually assume you have available? 600MB? 500MB?

So not only does Windows always chug RAM, but PC games can't accurately predict how much Windows will be using, AND that amount continually increases with each new version of Windows. So it's a very grey area, and the two definitely can't be directly compared.
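To illustrate the guesswork, here's a hypothetical startup heuristic (purely illustrative, not from any real engine) for the situation scooby_dooby describes: the title lists 1GB as its minimum spec but budgets against whichever is smaller, a fixed conservative fraction of that or what the OS reports as actually free. The 60% figure is an assumption, not a published rule.

```cpp
#include <windows.h>
#include <algorithm>
#include <cstdio>

int main()
{
    MEMORYSTATUSEX status = {};
    status.dwLength = sizeof(status);
    GlobalMemoryStatusEx(&status);

    const DWORDLONG minSpecMB      = 1024;                 // box-quoted minimum
    const DWORDLONG assumedFreeMB  = minSpecMB * 60 / 100; // the ~600MB guess
    const DWORDLONG reportedFreeMB = status.ullAvailPhys / (1024 * 1024);

    // Budget against the more pessimistic of the two figures.
    const DWORDLONG budgetMB = std::min(assumedFreeMB, reportedFreeMB);
    printf("Asset budget this run: %llu MB\n", (unsigned long long)budgetMB);
    return 0;
}
```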
 
scooby_dooby said:
But with that said, what's your opinion of the texture quality we're seeing so far in a lot of 360 games? I mean, as far as I'm concerned, a lot of it is horrible compared to the potential these machines have.

For example, COD2 has worse textures than Splinter Cell 3, but in reality the 360 should be able to pull off effects far, far superior to anything ever done on Xbox, so I really don't think there's any excuse, not at this point in time.

I think it's just an issue of rushed games, or games developed for PC or last-gen systems that simply aren't coming close to exploiting the system.

I haven't really played 360 games myself, just looked over the shoulders of people who were, so I might not be a good judge here. What I did see (especially in hi-def) was that the texture resolution was lower than it could have been on PC, in CoD2 and King Kong for example (by the way, I'm not putting these games down at all). I see and acknowledge the lower-resolution art, but I don't really question it; it's normal behavior from a developer's point of view if you consider the amount of memory. Anyway, I know I'm repeating myself.

Rushed deliveries do happen (more often than is publicly admitted, even). Risks are also taken less frequently on console ports, mostly because crashes look much worse on a console than on a PC (where they happen quite frequently). Because of that, developers are under pressure to make the console version as risk-free as possible, code-wise and art-asset-wise. Rushed deliveries are often the reason an engine hasn't been pushed to its full potential, so this can be a factor in lower texture detail. That said, it's far from the only one.

Example: detail may be lowered so that an Xbox game uses only 58-60MB of the available RAM, to allow that extra little buffer for the what-if situations that are sometimes impossible to test. I can't give an example game, but I've heard of that being done. Crashes reflect much more poorly on the developer/publisher on a console than they do on PC, so stuff is reduced to create that extra buffer.
 
cloudscapes said:
I haven't really played 360 games myself, just looked over the shoulders of people who were, so I might not be a good judge here. What I did see (especially in hi-def) was that the texture resolution was lower than it could have been on PC, in CoD2 and King Kong for example (by the way, I'm not putting these games down at all). I see and acknowledge the lower-resolution art, but I don't really question it; it's normal behavior from a developer's point of view if you consider the amount of memory. Anyway, I know I'm repeating myself.

The reason I question it is that I expect that in 1 or 2 years these consoles will be putting out textures that put current PCs completely to shame.

Maybe it's just a porting problem? Like, when porting from PC to console you have to downgrade your textures because of the way the game is built, whereas when devs make a game just for the console they're able to pull off visuals that put the original PC games to shame. Does that make any sense?
 
scooby_dooby said:
But on a PC, Windows is always chugging RAM: people have antivirus and firewall software running, they're often logged into messaging systems, and they have other tasks open. On top of that there are all kinds of settings, such as page files and virtual memory, and it all adds up to a very unpredictable value for "exactly the amount of RAM Windows takes".

So while consoles can utilize every single last MB on the system, PC games are always working off a best-guess sort of setup.

For example, if your game has a minimum spec of 1GB of RAM, how much do you as a developer actually assume you have available? 600MB? 500MB?

So not only does Windows always chug RAM, but PC games can't accurately predict how much Windows will be using, AND that amount continually increases with each new version of Windows. So it's a very grey area, and the two definitely can't be directly compared.

In my earlier example, I was just supposing that the computer with the exact-Windows-mem+64 was a PC free of all that excess stuff the Xbox doesn't have. In a real-world scenario it probably wouldn't work very well, because real-world people have firewalls and antivirus and all that. If the PC's Windows were tweaked (not stripped) in such a way that its behavior reflected the W2K kernel inside the Xbox (while still using the 150-200MB that regular Windows uses), then I think it would be fine.

It's not something that can be measured precisely, but once Windows has a lot of RAM at its disposal, like a GB, then we can easily run a game on the 1GB PC that will be nicer-looking than the 512MB console port. Windows won't need all of that 512MB difference. A couple hundred megs, definitely, but there will be 800MB + VRAM left for the game. More than enough for a considerable boost in visuals.
 
scooby_dooby said:
The reason I question it is that I expect that in 1 or 2 years these consoles will be putting out textures that put current PCs completely to shame.

Maybe it's just a porting problem? Like, when porting from PC to console you have to downgrade your textures because of the way the game is built, whereas when devs make a game just for the console they're able to pull off visuals that put the original PC games to shame. Does that make any sense?

I don't know. Possibly, but I can't see it. Unless some kind of new compression is developed that doesn't require people to update their 360's drivers (if such a thing can even be done), I don't see it. There will be a little improvement, yes, because devs will get a feel for the hardware and a clearer picture of its limits over time, but not by huge amounts. Another thing devs will need to achieve what you're saying is time. Sometimes there just isn't any.

I'm HOPING to see titles which are really optimized and use textures and art efficiently, and I hope I will! Maybe some of the much later games of this new gen will have textures, and art in general, which aren't too perceptibly different from a PC port designed to run on today's machines, but I'll still know (or assume) that they aren't using the full potential of the PC (like Halo PC) or something.
 
Well, you do make a lot of sense, and I'm sure you're right.

One more question though: why do you think the nicest textures we have ever seen (PGR3 and Kameo, for example) have come on the 360 and not the PC in 2004 or 2005? People have had 1GB or 2GB+ of RAM for years now, but then a 512MB console comes along and blows the doors off any PC game out there... (I've never seen a PC game that compares to Kameo or PGR.)
 
scooby_dooby said:
Well, you do make a lot of sense, and I'm sure you're right.

One more question though: why do you think the nicest textures we have ever seen (PGR3 and Kameo, for example) have come on the 360 and not the PC in 2004 or 2005? People have had 1GB or 2GB+ of RAM for years now, but then a 512MB console comes along and blows the doors off any PC game out there... (I've never seen a PC game that compares to Kameo or PGR.)

I admit I haven't quite figured that one out yet.

One reason may be that devs were (and are) timing their next-gen titles to be finished by the time the 360 came out. As a way to sell the console, Microsoft (though the same applies to Sony and Nintendo as well) lets devs know more or less when the console is coming out, so that they can develop a launch (or near-launch) game for it. If they finish a 360 game even months before the console's release, they sometimes (not always) have to hold onto the PC port so that both can be released at the same time. That may be part of it.

Another can be that when PC games are developed, they're developed for a variety of hardware. They need to be compatible with the maximum number of GPUs and systems, so they often don't implement features that aren't widely supported: HDR or heavy use of parallax mapping, for example. It's not a memory issue, just a feature and compatibility issue. However, this technology is enabled by default on every 360; either someone can play with HDR etc. or they don't have a 360 at all! Devs can immediately start implementing these features for 360 titles without worrying about backwards compatibility or hardware feature sets.
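For illustration, this is the kind of capability probe a D3D9 PC title has to make before turning on an "optional" feature like HDR (a minimal sketch; the display format and FP16 target here are assumptions, but CheckDeviceFormat is the standard call). A 360 title can skip the question entirely, because every unit has the same GPU.

```cpp
#include <d3d9.h>

// Ask the runtime whether the default adapter can create a 16-bit float
// texture and render into it, i.e. whether an HDR path is even possible.
bool canDoFp16Hdr(IDirect3D9* d3d)
{
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,            // assumed display/adapter format
        D3DUSAGE_RENDERTARGET,
        D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);      // FP16 render target for HDR
    return SUCCEEDED(hr);
}
```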

There are maybe other reasons, but that's my take on it, anyway.
 
cloudscapes said:
Just guessing though. I seem to remember that they enhanced the PC version, but I don't know by how much, nor do I have a link to back it up. :p

In terms of enhancements, all they did was rewrite some shaders to take advantage of SM2.0. No texture upgrades or anything.
 
Alstrong said:
In terms of enhancements, all they did was rewrite some shaders to take advantage of SM2.0. No texture upgrades or anything.

Makes sense. I don't remember being particularly impressed with the PC port, visually.
 
Most PC games are restricted to working with a maximum of 256MB of VRAM, which has to store both the front and back buffers and the texture and vertex data. This can limit the amount of textures used in a scene to 100-150MB.
Then, without streaming, the game also has to keep all the textures of a single level in the PC's main RAM at all times, along with all the other static and dynamic data (game world, sounds, music etc.). This limits the maximum amount of textures in a level, too. And finally, the lack of fixed hardware also limits the developers' options in how they handle data, compress textures and so on.
A console, on the other hand, can keep all the textures for a single level in VRAM at all times, and it can even stream in new data on the fly. With fixed hardware, developers can squeeze every bit of performance and memory space out of the system. This IMHO makes the X360 and PS3 able to stand up to a high-end PC with 256MB of VRAM and 1GB of main RAM. More time and money spent on a console game will also result in better art, both in terms of looks and optimization.
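A rough back-of-the-envelope version of that 100-150MB figure, with assumed, purely illustrative numbers for resolution, antialiasing and geometry (real budgets also lose space to extra render targets, alignment and driver overhead):

```cpp
#include <cstdio>

int main()
{
    const unsigned w = 1280, h = 1024, bytesPerPixel = 4;
    const unsigned msaa = 4;                              // 4x multisampled

    const unsigned front = w * h * bytesPerPixel;         // ~5 MB
    const unsigned back  = w * h * bytesPerPixel * msaa;  // ~20 MB
    const unsigned depth = w * h * bytesPerPixel * msaa;  // ~20 MB
    const unsigned geom  = 50u * 1024 * 1024;  // assumed vertex/index data
    const unsigned vram  = 256u * 1024 * 1024;

    const unsigned leftForTextures = vram - front - back - depth - geom;
    printf("Left for textures: ~%u MB\n", leftForTextures / (1024 * 1024));
    return 0;
}
```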
 
Laa-Yosh said:
Most PC games are restricted to working with a maximum of 256MB of VRAM, which has to store both the front and back buffers and the texture and vertex data. This can limit the amount of textures used in a scene to 100-150MB.
Then, without streaming, the game also has to keep all the textures of a single level in the PC's main RAM at all times, along with all the other static and dynamic data (game world, sounds, music etc.). This limits the maximum amount of textures in a level, too. And finally, the lack of fixed hardware also limits the developers' options in how they handle data, compress textures and so on.
A console, on the other hand, can keep all the textures for a single level in VRAM at all times, and it can even stream in new data on the fly. With fixed hardware, developers can squeeze every bit of performance and memory space out of the system. This IMHO makes the X360 and PS3 able to stand up to a high-end PC with 256MB of VRAM and 1GB of main RAM. More time and money spent on a console game will also result in better art, both in terms of looks and optimization.

IMHO that is half-true. Developers can squeeze every drop of performance out of a console in theory, but they don't always do it, at least not to the point where the console version can stand up to the PC port, and CoD2, Q4 and some others I've seen support that, I think. Sometimes there just isn't enough time for the dev to make two versions of the same game, each fully optimized for its platform. In many cases they'll opt for "still pretty kick-ass" rather than "fully optimized and unbelievably pretty" by reducing detail and saving their time.

I don't agree with the "without streaming..." statement. PC games can and *have* used background streaming. And while there are some differences in how data is managed between PC and console ports, they're usually very similar, since it's the same engine (save for the obvious hardware differences). Compression is also basically the same, at least in the Xbox/360/PC department: almost all DXT. GC and PS2 used indexed (palettized) textures, in my experience.
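For a sense of scale, the standard DXT block math: DXT1 packs each 4x4 pixel block into 8 bytes (1:8 versus raw 32-bit RGBA) and DXT5 into 16 bytes (1:4). A quick comparison:

```cpp
#include <cstdio>

// Compressed size of one mip level; block counts round up to whole blocks.
unsigned dxtBytes(unsigned w, unsigned h, unsigned bytesPerBlock)
{
    const unsigned blocksX = (w + 3) / 4;
    const unsigned blocksY = (h + 3) / 4;
    return blocksX * blocksY * bytesPerBlock;
}

int main()
{
    printf("1024x1024 RGBA8: %u KB\n", 1024u * 1024u * 4u / 1024u);
    printf("1024x1024 DXT1 : %u KB\n", dxtBytes(1024, 1024, 8) / 1024);
    printf("1024x1024 DXT5 : %u KB\n", dxtBytes(1024, 1024, 16) / 1024);
    return 0;
}
```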

I'm not saying we won't see PS3/360 games stand up to their PC ports; I'm saying we're unlikely to see them stand up to PC ports that fully utilize 1GB of RAM and 256MB of VRAM, i.e. ports whose texture resolution has been boosted to take advantage of the extra memory.
 
Laa-Yosh said:
Most PC games are restricted to working with a maximum of 256MB of VRAM, which has to store both the front and back buffers and the texture and vertex data. This can limit the amount of textures used in a scene to 100-150MB.
Then, without streaming, the game also has to keep all the textures of a single level in the PC's main RAM at all times, along with all the other static and dynamic data (game world, sounds, music etc.). This limits the maximum amount of textures in a level, too. And finally, the lack of fixed hardware also limits the developers' options in how they handle data, compress textures and so on.
A console, on the other hand, can keep all the textures for a single level in VRAM at all times, and it can even stream in new data on the fly. With fixed hardware, developers can squeeze every bit of performance and memory space out of the system. This IMHO makes the X360 and PS3 able to stand up to a high-end PC with 256MB of VRAM and 1GB of main RAM. More time and money spent on a console game will also result in better art, both in terms of looks and optimization.

The bus bandwidth is also higher on the PS3/X360; Cell especially has 25GB/s+ to itself. I don't know how much of a part this plays.
BTW, how big is the kernel running on the 360?
 
It's important to note that most (if not all, nowadays) PC games keep system-memory copies of the textures that are in graphics memory. By this I mean: if you have 100MB of textures in video RAM, you have those same 100MB duplicated in system RAM, in addition to any textures you have cached for later use. A console with a mostly unified memory system wouldn't have that problem.
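A sketch of where that duplication comes from in Direct3D 9: a texture created in D3DPOOL_MANAGED gets a system-memory master copy that the runtime uses to refill video memory (for example after a lost device), so 100MB of managed textures really does cost roughly 100MB of system RAM on top of the VRAM copy. The size and format here are just examples.

```cpp
#include <d3d9.h>

// D3DPOOL_MANAGED: the runtime keeps a system-memory shadow of the texture.
// D3DPOOL_DEFAULT would avoid the duplicate, at the cost of the game having
// to recreate the contents itself after a device loss.
IDirect3DTexture9* createManagedTexture(IDirect3DDevice9* device)
{
    IDirect3DTexture9* tex = nullptr;
    device->CreateTexture(1024, 1024,
                          0,               // 0 = full mip chain
                          0,               // no special usage flags
                          D3DFMT_DXT1,
                          D3DPOOL_MANAGED,
                          &tex, nullptr);
    return tex;
}
```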
 