Will next gen consoles finally put an end to ugly low res textures?

Still, it had an unheard-of 1GB of RAM.
Pretty sure I had 2GB in my box by late 2005, and 1GB was considered standard for a new box.
The 360 was many times more powerful than that... for £300... bargain!
"many times"? Well, maybe if you define "1.X" as many, sure :)
Also, what was the actual BoM for 360?

As for the AnandTech quote, they didn't take SLI systems into account. Yeah, they cost a fortune, but they were significantly faster as well, especially if you wanted to play with AA at higher resolutions.
 
Well, if you remember back to mid-2005 when this thing was launched, it was much more powerful than any PC of the time; 512MB of GDDR3 was a silly amount for the period and extremely expensive.

The Xbox 360 was very advanced from many standpoints, especially considering its £300 price point.
It brought a graphics-oriented, in-order, 64-bit, 3-core CPU at 3.2GHz (when PCs were only moving to 2.0GHz out-of-order dual cores).

It brought the largest amount of next-gen GDDR3 graphics memory seen at the time, an innovative (but restrictive) 10MB of eDRAM, and a superior unified memory architecture.
It brought a GPU at least twice as powerful as a 6800 Ultra, the most powerful PC GPU when the 360 was announced.
It also pioneered parts of the DX10 feature set, including unified shaders, a simple tessellator, and MEMEXPORT to main memory.

1. The 7800GTX was launched in July 2005 and packed over twice the power of a 6800 Ultra in many situations, and could be considered equal to the 360's GPU.

2. The 512MB GDDR3 version of the card launched in November of the same year, the same month the 360 was released in most parts of the world. So even at launch there were PCs around that were faster and packed more GDDR3 than the 360 had.

3. PCs could run SLI, so at the time a PC could have up to twice the power of the 360, but at a premium. And 13 months after the 360 released, Nvidia launched one of the most historic GPUs ever, the 8800GTX, which packed over twice as much power as the 360 in a single package.

4. Eight months after the 360 launched, Intel released the Core 2 Duo range, which greatly outperformed their then-flagship dual cores, the Pentium Ds. Capcom once said in an article that the 360's Xenon CPU is about as fast as a 3.2GHz Pentium dual core, making the 360's CPU greatly outdated with the release of the Core 2 Duos.

The 360 was a bargain from a price/performance standpoint too, but most of your claims are a little inaccurate.
 
:mad: No, most of my claims are not inaccurate at all. When it was announced in May 2005, per the AnandTech article I linked with my comments as supporting evidence, the most powerful GPU out there was a 6800 Ultra. The PS3's RSX was claimed to be twice the power of that, and in fact was very similar to the 7800 you mentioned... and Xenos was more powerful than the RSX, so you work it out. :rolleyes:

So my claim stands: in May 2005, the 360 was the most powerful gaming hardware seen, which Anand attested to in that article and the quote I provided... including against an SLI setup, which would have cost a small fortune compared to a £300 console in any case.

It's no use pointing to things like Core 2 Duos 8 months later and 8800s a year later, as that is not the time period I was talking about. And an 8800GTX, whilst twice the power a year later, cost about as much as a 360 itself, so they are not comparable.

Again, in May 2005 when it was announced, can you provide me with a link to a graphics card that had more than 512MB of GDDR3?

My point is not that the 360 pwns PCs; no, it doesn't. But for the time it was announced, it offered unseen performance for comparatively peanuts.

HOHO; I said I bought my PC in 2004, and we are talking about faster, higher-bandwidth graphics memory.

My point was regarding graphics use for games; that's what we are talking about, isn't it? Of course I'm not suggesting an Xbox could outperform a high-end PC of the time on every aspect of computing tasks... sheesh.

And the BOM has got nothing to do with me or you, has it? That's Microsoft that picks up the bill.

Yes, of course, if you went to the end of 2005 you could spend £1000+ and get a superior gaming rig, but that just proves my point about cost.

In May 2005 when it was announced, it was more powerful than even an SLI setup for GAMING, which is pretty incredible for £300, don't you think?
 
At release the Xbox 360 was as good as a very high-end PC, and didn't cost nearly as much...
(I think I had 2GB of RAM in those days, now I have 8, so I expect next gen to have 2GB.)

Now please try to stay on topic, which is about finding which technology could improve texture quality in games. (cost effective solution if possible)
 
At release the Xbox 360 was as good as a very high-end PC, and didn't cost nearly as much...
(I think I had 2GB of RAM in those days, now I have 8, so I expect next gen to have 2GB.)

Now please try to stay on topic, which is about finding which technology could improve texture quality in games. (cost effective solution if possible)

Personally I think they'll have a minimum of 4GB; I have 16GB and it cost me £65.

You can get 8GB 1600MHz DDR3 kits from around the £30 mark, so 4GB would be very cheap. Heck, throw in a bulk discount and 8GB could happen.
 
1. The 7800GTX was launched in July 2005 and packed over twice the power of a 6800 Ultra in many situations, and could be considered equal to the 360's GPU.

2. The 512MB GDDR3 version of the card launched in November of the same year, the same month the 360 was released in most parts of the world. So even at launch there were PCs around that were faster and packed more GDDR3 than the 360 had.

3. PCs could run SLI, so at the time a PC could have up to twice the power of the 360, but at a premium. And 13 months after the 360 released, Nvidia launched one of the most historic GPUs ever, the 8800GTX, which packed over twice as much power as the 360 in a single package.

4. Eight months after the 360 launched, Intel released the Core 2 Duo range, which greatly outperformed their then-flagship dual cores, the Pentium Ds. Capcom once said in an article that the 360's Xenon CPU is about as fast as a 3.2GHz Pentium dual core, making the 360's CPU greatly outdated with the release of the Core 2 Duos.

The 360 was a bargain from a price/performance standpoint too, but most of your claims are a little inaccurate.


Yeah, OK. How much would all that hardware have cost you back in 2005? Way more than a 360. The argument of PCs being more powerful at the time is a moot one, as computers continually get faster and do not go through generations in the same manner as consoles. With that being said: yes, the 360 was very powerful for its time, especially at its price point. RAM and speed issues aside, the fact that it is a closed environment means that devs are better able to take advantage of all the machine has to offer. They fit a shit ton into a $400 system and that's no small feat. And Capcom said that in an article years ago; I wonder what developers would say about the 360 CPU now.
 
Balanced systems aren't just about the most powerful single components or biggest numbers.

Good point.

However, if PS4/Xb720 follows this example (SSD/weak GPU), and the other doesn't (invest SSD budget into GPU), one of them will have developers taking full advantage of the hardware and the other will not.

Just as the components in a console aren't in a vacuum, neither are the consoles themselves in a vacuum. Multiplats will code to the baseline of the popular console(s), and port from there.

Also, as I said, don't SSDs (especially cheap ones) have issues with degrading performance over time? The heavier the use, the shorter that time is.

Yes?

I'm not sure that is an ideal alternative for achieving increased texture detail.

But if we are entertaining the concept of increased fixed costs, how about a 384-bit RAM interface to 3GB of GDDR5?
 
Personally I think they'll have a minimum of 4GB; I have 16GB and it cost me £65.

You can get 8GB 1600MHz DDR3 kits from around the £30 mark, so 4GB would be very cheap. Heck, throw in a bulk discount and 8GB could happen.

I agree we need 4GB minimum for things like complex textures, higher res, extra goodies turned on, etc.

But you have to look at the bigger picture: they are not going to use DDR3 because it's slow and provides poor bandwidth.
GDDR5 is expensive in quantities approaching 4GB due to the small density of the chips, and to get the required bandwidth you'd have to pair it with something like a 256-384-bit memory controller, which again adds to the cost and hampers board shrinkage in future, as the motherboard is wider.

Of course there are other paths to go down, like another eDRAM implementation, but from what I can gather eDRAM is also very expensive, doesn't scale down well for future cost reductions, and would require hundreds of MBs to make it worthwhile.

Another option is XDR2: it provides very high bandwidth and good power consumption, but nobody makes it or has used it, so it's very, very expensive in amounts approaching 4GB.

There seems to be no cheap answer to the RAM issue on future consoles, and we may end up getting some kind of compression technology or something that hinders the quality of the textures.
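As a rough sketch of why a 4GB GDDR5 pool was awkward: the capacity forces either a very wide bus or clamshell wiring. The 2Gbit-per-chip density and 32-bit chip interface below are illustrative assumptions, not figures from this thread:

```python
# Rough sketch: how many GDDR5 chips a 4GB pool would need, assuming
# 2Gbit (256MB) chips with 32-bit interfaces (illustrative assumptions).
target_mb = 4 * 1024          # 4GB target capacity, in MB
chip_mb = 256                 # one 2Gbit chip = 256MB
chip_bus_bits = 32            # assumed per-chip interface width

chips = target_mb // chip_mb              # chips needed for the capacity
bus_bits = chips * chip_bus_bits          # bus width if each chip has its own channel
clamshell_bus_bits = bus_bits // 2        # clamshell mode: two chips share a channel

print(chips, bus_bits, clamshell_bus_bits)  # 16 chips, 512-bit, or 256-bit clamshell
```

Either way you end up with a wide, expensive board, which is the cost problem described above.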
 
Another option is XDR2: it provides very high bandwidth and good power consumption, but nobody makes it or has used it, so it's very, very expensive in amounts approaching 4GB...

I'm not sure this is enough of a reason to ignore the advantages of XDR.

AFAIR, the N64 was the only mass product using Rambus at the time. The PS2 also used RDRAM... and the PS3 carried on the tradition. The only XDR implementation I know of is the PS3.

So I'm not sure why XDR2 would be any more of a risk/higher relative cost than in the past.
 
Yeah, OK. How much would all that hardware have cost you back in 2005? Way more than a 360. The argument of PCs being more powerful at the time is a moot one, as computers continually get faster and do not go through generations in the same manner as consoles. With that being said: yes, the 360 was very powerful for its time, especially at its price point. RAM and speed issues aside, the fact that it is a closed environment means that devs are better able to take advantage of all the machine has to offer. They fit a shit ton into a $400 system and that's no small feat. And Capcom said that in an article years ago; I wonder what developers would say about the 360 CPU now.

This was carried on by PM, I would think you would do the same. And for the bolded part? Pretty much the same thing as they said back then.
 
I agree we need 4GB minimum for things like complex textures, higher res, extra goodies turned on, etc.

But you have to look at the bigger picture: they are not going to use DDR3 because it's slow and provides poor bandwidth.
GDDR5 is expensive in quantities approaching 4GB due to the small density of the chips, and to get the required bandwidth you'd have to pair it with something like a 256-384-bit memory controller, which again adds to the cost and hampers board shrinkage in future, as the motherboard is wider.

Of course there are other paths to go down, like another eDRAM implementation, but from what I can gather eDRAM is also very expensive, doesn't scale down well for future cost reductions, and would require hundreds of MBs to make it worthwhile.

Another option is XDR2: it provides very high bandwidth and good power consumption, but nobody makes it or has used it, so it's very, very expensive in amounts approaching 4GB.

There seems to be no cheap answer to the RAM issue on future consoles, and we may end up getting some kind of compression technology or something that hinders the quality of the textures.

That's why I think that both Microsoft and Sony will use dedicated System and VRAM memory pools.

They can have slower main system RAM and use the fast expensive stuff for VRAM.

1375MHz GDDR5 (5500MT/s effective) on a 192-bit bus could be used for 1.5GB of VRAM with 132GB/s of bandwidth.

That, coupled with 2-4GB of system RAM, would be 3.5-5.5GB of total system memory, which in my eyes is not bad, as you can always use system RAM for caching.

I think using higher clocked RAM on the GPU instead of a wider bus would be a better option.
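The 132GB/s figure above can be sanity-checked with the usual formula: effective transfer rate times bus width in bytes.

```python
# Sanity check of the quoted figure: GDDR5 at 1375MHz moves data at an
# effective 5500MT/s, so bandwidth = transfers/s * bus width in bytes.
effective_mt_s = 5500e6        # 1375MHz base -> 5500 MT/s effective
bus_width_bits = 192
bus_width_bytes = bus_width_bits / 8

bandwidth_gb_s = effective_mt_s * bus_width_bytes / 1e9
print(bandwidth_gb_s)  # 132.0
```

Swapping in a 256-bit bus at the same clock gives 176GB/s, which shows why bus width and clock trade off against each other here.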
 
Do you guys think they're gonna low-ball the RAM again or did they learn a lesson this gen?

Personally, I think 2GB will be a hamstring if they are aiming for 1080p @ 60fps with high-resolution art. (Actual high-resolution art, not what the current gen of consoles thinks is "high res", where you walk up to a wall in MW3 and it's still a blurry mess... that's not a high-res texture, I'm sorry.)

I think 4GB should be the minimum for 1080p @ 60fps with high-res art.
 
Do you guys think they're gonna low-ball the RAM again or did they learn a lesson this gen?

Personally, I think 2GB will be a hamstring if they are aiming for 1080p @ 60fps with high-resolution art. (Actual high-resolution art, not what the current gen of consoles thinks is "high res", where you walk up to a wall in MW3 and it's still a blurry mess... that's not a high-res texture, I'm sorry.)

I think 4GB should be the minimum for 1080p @ 60fps with high-res art.

They didn't low-ball last time; 512MB was a decent amount when they were designed.

The PlayStation 3 had a 16x jump in RAM quantity over the PlayStation 2.

The Xbox 360 had an 8x jump compared to the first Xbox.

If you translate that to now, the PlayStation 4 would have 8GB of total system RAM and the Xbox 720 would have 4GB.

8GB would be a lot, perhaps even a little too much; 4GB in my eyes is not enough. But if you look at my thoughts above, 1.5GB of VRAM coupled with 4GB of system RAM would be 5.5GB total, which would be a nice amount: not too much and not too little.

We also don't know how complex their OSes will be and how much memory they will consume!
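The generational multipliers above work out like this (main RAM only, ignoring the PS2's separate VRAM):

```python
# Historical main-RAM jumps, and what the same multipliers would project
# for the next generation (figures from the post above).
ps2_mb, xbox_mb, current_mb = 32, 64, 512

ps_jump = current_mb // ps2_mb      # PS2 -> PS3 multiplier
xbox_jump = current_mb // xbox_mb   # Xbox -> 360 multiplier

ps4_projection_gb = current_mb * ps_jump / 1024    # same jump again for PS4
x720_projection_gb = current_mb * xbox_jump / 1024 # same jump again for Xbox 720

print(ps_jump, xbox_jump, ps4_projection_gb, x720_projection_gb)  # 16 8 8.0 4.0
```

Of course there's no law that says the multiplier repeats; it's just a way to frame the 4GB-vs-8GB guesses.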
 
I'm sorry, but I disagree about "512MB being a decent amount." They pushed the HD issue; they should have had the foresight to know what kind of specs they would need. As a result we have AAA titles like Call of Duty being rendered at a sub-HD resolution of 600p, even though they were trying to say games were going to be mandatory 720p.

It sounds to me like they didn't know what the hell they were doing; they just wanted people to buy HDTVs.
 
I'm sorry, but I disagree about "512MB being a decent amount." They pushed the HD issue; they should have had the foresight to know what kind of specs they would need. As a result we have AAA titles like Call of Duty being rendered at a sub-HD resolution of 600p, even though they were trying to say games were going to be mandatory 720p.

It sounds to me like they didn't know what the hell they were doing; they just wanted people to buy HDTVs.

Blip, that's you complaining about this in 2012. In 2005, 512MB was a good amount of RAM, and putting in more would have been very complicated. The jump in memory from the previous gen was also a big one. Your entire angle is flawed.
 
I'm sorry, but I disagree about "512MB being a decent amount." They pushed the HD issue; they should have had the foresight to know what kind of specs they would need. As a result we have AAA titles like Call of Duty being rendered at a sub-HD resolution of 600p, even though they were trying to say games were going to be mandatory 720p.

It sounds to me like they didn't know what the hell they were doing; they just wanted people to buy HDTVs.

You'll find games run at sub-HD due to a lack of shader/pixel power, and hardly ever because of a lack of memory.
 
I'm sorry, but I disagree about "512MB being a decent amount." They pushed the HD issue; they should have had the foresight to know what kind of specs they would need. As a result we have AAA titles like Call of Duty being rendered at a sub-HD resolution of 600p, even though they were trying to say games were going to be mandatory 720p.

It sounds to me like they didn't know what the hell they were doing; they just wanted people to buy HDTVs.
600p has nothing to do with memory. CoD runs at 60fps, so they have only 16.6ms to generate a frame (while 30fps games have twice that). 1024x600 has 33% fewer pixels than 1280x720. The lower resolution was selected purely to make the game reach 60fps (it saves 33% of the pixel shader cycles).

Nobody criticized the 512MB of memory 7 years ago when the console was launched (it was huge compared to all previous consoles). Memory was expensive and we were lucky to get 512MB (256MB was a realistic option).

Both current consoles could easily reach 60fps and 720p in all games. However, players demand more and more every year, and the hardware is still the same. To improve their gameplay, physics, and graphics quality further every year, developers had to compromise (as the hardware doesn't get any more powerful). Dropping the frame rate to 30fps basically doubles the CPU and GPU time you can use for each frame. That's what most developers have chosen. 30fps, however, degrades gameplay (more input lag, etc.), so some developers have chosen differently. 1024x600 drops the pixel shader load by 33% and thus allows those cycles to be used in a way that benefits the game more. It's all about compromising.
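The arithmetic behind the 600p-at-60fps trade-off is easy to verify (pixel counts and per-frame time budgets):

```python
# Pixel counts and frame budgets behind the 600p-at-60fps trade-off.
hd_pixels = 1280 * 720        # 921,600 pixels at 720p
sub_hd_pixels = 1024 * 600    # 614,400 pixels at 600p

savings = 1 - sub_hd_pixels / hd_pixels   # fraction of pixel work saved (~33%)
frame_ms_60 = 1000 / 60                   # per-frame budget at 60fps (~16.7ms)
frame_ms_30 = 1000 / 30                   # per-frame budget at 30fps (~33.3ms)

print(round(savings * 100), round(frame_ms_60, 1), round(frame_ms_30, 1))
```

So the two levers are the same trade-off seen from different sides: halve the frame rate to double the budget, or cut pixels to stretch the budget you have.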
 
4. Eight months after the 360 launched, Intel released the Core 2 Duo range, which greatly outperformed their then-flagship dual cores, the Pentium Ds. Capcom once said in an article that the 360's Xenon CPU is about as fast as a 3.2GHz Pentium dual core, making the 360's CPU greatly outdated with the release of the Core 2 Duos.

The 360's CPU "beats" the early Core 2 Duos in lots of games, but there's always an outcry from certain sectors about "crappy ports". Lost Planet? Crappy port. Crysis 2? Crappy port. Crysis 1? Crappy port. (Sorry, joke.)

Now with Battlefield 3, most early Core 2 Duos can't even meet the minimum requirement (the 2.4GHz parts squeeze in as the very bottom rung on the ladder), while the 360 goes well beyond it and offers really solid performance that's way above anything that could be described as a minimum playable standard. No doubt optimisation (and overheads) favour the 360, but given the developer and the title's history, hopefully we can now avoid the "crappy port" thing.

Here's a really interesting post from sebbi a couple of months back:

The Athlon 64 X2 (2GHz) was the most powerful PC CPU at the time the Xbox 360 was released. It was natural that some developers building the first generation of Xbox 360 games compared them (and found that the Athlon X2 performed better in some of their existing code). But you have to understand that the Xbox 360 hardware was brand new, and the developers didn't have much experience on it. Multithreaded game programming was just taking its first baby steps, and suddenly they had to program for a six-thread (SMT), in-order CPU (with powerful VMX128 vector units). Most games were single-threaded back then (all previous consoles were single-threaded, and the PC got its first dual-core CPUs in 2005). The Xbox 360, on the other hand, required developers to fully split their code across six threads if they wanted to get anywhere close to full performance out of it. It was a big change.

If you compare the Xbox 360 launch titles to the games we have now (Battlefield 3, Crysis 2, Rage), the difference is huge. A 2GHz dual-core Athlon would likely have resulted in slightly better launch titles... but for running the recent, fully optimized multithreaded games, it wouldn't have any chance of competing against the six-threaded XCPU. At the time, the XCPU also had a very forward-looking vector instruction set: VMX128 includes dot product (SSE4.1, 2008), FMA (Bulldozer / Sandy Bridge, 2011) and float16/32 conversion (Bulldozer / Ivy Bridge, 2011/2012) instructions.

It would be really hard to compare the XCPU directly to PC CPUs. In-order vs. out-of-order is the first difficulty; then there's SMT, the RISC instruction set, and different vector instructions. In-order execution hurts less if you optimize for it; SMT, however, helps less if the code doesn't have any stalls; vector instructions do help a lot, but only if the particular code can be vectorized. There isn't any in-order PC CPU (except for Atom, but it's not in any way comparable), there are some 3-core (AMD) CPUs, but those do not have SMT (6 threads), and none of the older SSE versions exactly match VMX128's capabilities. AVX does, but it has twice-as-wide vectors (much higher throughput).
 
I'm sorry, but I disagree about "512MB being a decent amount." They pushed the HD issue; they should have had the foresight to know what kind of specs they would need.
It's not about foresight, but budget. The consoles already cost a lot with 512MB; they had a good amount of RAM for the time, at a good bus speed, and a good increase on previous generations (over 12x for the PS3). Any more would have been ludicrously expensive and unrealistic.

It sounds to me like they didn't know what the hell they were doing; they just wanted people to buy HDTVs.
You've clearly got a lot of emotional involvement in this, given your angry tone towards sub-HD games. Sub-res games are common. They were common last gen too, but we didn't have websites counting pixels to tell us. As sebbbi says above, it's all about compromises and making what sells. The lack of resolution isn't due to specs, but due to devs tuning their code to win buyers' dollars. If Joe Gamer really hated sub-HD games that much, they wouldn't be buying COD, and developers wouldn't be making sub-HD games for fear of scaring away the gamers. That you feel cheated is a fault of your expectations and understanding of finite hardware. It was never possible in 2005/6 to build an affordable console that could run today's games at 1080p as long as developers keep pushing more complex pixels. That's why hardware starts to look dated and why we look forward to the next round of consoles!
 
The 360's CPU "beats" the early Core 2 Duos in lots of games, but there's always an outcry from certain sectors about "crappy ports". Lost Planet? Crappy port. Crysis 2? Crappy port. Crysis 1? Crappy port. (Sorry, joke.)

Now with Battlefield 3, most early Core 2 Duos can't even meet the minimum requirement (the 2.4GHz parts squeeze in as the very bottom rung on the ladder), while the 360 goes well beyond it and offers really solid performance that's way above anything that could be described as a minimum playable standard. No doubt optimisation (and overheads) favour the 360, but given the developer and the title's history, hopefully we can now avoid the "crappy port" thing.

Here's a really interesting post from sebbi a couple of months back:

You're comparing apples to oranges.
If a 360 CPU were running the same OS and drivers as that Core 2, it would have worse problems running those games.

The primary issues for PC CPUs are the lack of exclusive access to the GPU, a less-than-ideal driver architecture that made draw-prim calls ludicrously expensive, and drivers trying to fix broken code in games written over the last 10 years.

We actually ran various benchmarks on the XCPU and the high-end PC processors of the time before it was released, and for non-vectorized code, like Zip-type compression/decompression, it wasn't even close: the PC processors were MUCH faster. I don't remember the exact numbers, so I won't quote them. But even the PPC chips in the "alpha kits" were considerably faster than the shipped CPUs.
 