Codename Vejle: Xbox 360 S combined CPU/GPU

Why is it not allowed to have a faster connection? As long as game validation/testing takes place on older 360s, I don't see a problem. So what if some games tore less or dropped fewer frames on the slim? It'd be a nice incentive to upgrade, if you think about it.


My guess is that they are trying to avoid fragmentation of the platform. If you allow performance differences between SKUs, it is likely that some developers would choose to make use of the extra performance, which would make the other SKUs look bad.
 
My guess is that they are trying to avoid fragmentation of the platform. If you allow performance differences between SKUs, it is likely that some developers would choose to make use of the extra performance, which would make the other SKUs look bad.
Did you miss the part where I said "as long as validation/testing is on older 360s"? If the game won't run well on the older 360s, MS won't allow it, plain and simple.
 
It's not just about fragmentation of the platform, folks, or making the older 360 "look bad"; some games can depend on hardware timing tricks to eke out the performance.

You don't want to be changing latencies on a fixed hardware platform; doing so opens the door to compatibility issues. The last thing you want is games breaking on the new system, for Pete's sake.
 
It's not just about fragmentation of the platform, folks, or making the older 360 "look bad"; some games can depend on hardware timing tricks to eke out the performance.
Is that still true? Back in the days of simpler hardware and simpler software it was, but developing at that low a level now would be very costly and time-consuming, and only possible with a sophisticated in-house engine, I'd have thought. There's enough optimisation to be done with job balancing, managing workloads and data structures and whatnot, that I wouldn't think poking around with clock timings would even be considered!
 
Where have you been? You CAN install 99% of games directly to the HDD, which completely shuts off the DVD drive, and the new consoles have 250GB of storage, which is enough for 30+ installed games.
I haven't had a 360 for a couple of years now. I had borrowed a friend's back in 2007 (he went to China for a year) and the drive made me crazy. It's cool to hear that you can run games off the HDD finally. Maybe one day I'll grab a 360 again (probably when they've been forgotten lol).

I'm currently busy with my "new" Dreamcast and 3DO. :D
 
Double or triple the L2 cache, throw on a few more PowerPC cores, bump up the eDRAM so that it's big enough to accommodate a full 1080p or 720p w/ 2xMSAA framebuffer (~14MB, right?), and feed it with 1GB of high-end GDDR5 instead of 512MB of low-end GDDR3, and you've got yourself a very tasty little next-generation console. It should decimate the 360's performance.

Doesn't sound very tasty to me, more like a nightmare GameCube-to-Wii type of situation. You are basically describing an X360 x2... with what, 7-8 years between the consoles? That's pitiful. If it launches in 2012-13 it had better have 3-4GB of RAM and 8-10x the transistor budget of the X360.
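For what it's worth, here's a quick back-of-the-envelope check on the ~14MB figure in the quote above. It's a minimal sketch that assumes 32-bit colour plus 32-bit depth/stencil per sample (8 bytes/sample); the comparison to the 360's real 10MB of eDRAM is just for orientation.

```c
#include <stdio.h>

/* Rough render-target size: width * height * samples * bytes_per_sample,
 * assuming 32-bit colour + 32-bit depth/stencil (8 bytes) per sample. */
static double framebuffer_mb(int width, int height, int samples, int bytes_per_sample)
{
    return (double)width * height * samples * bytes_per_sample / (1024.0 * 1024.0);
}

int main(void)
{
    printf("720p, 2xMSAA : %.1f MB\n", framebuffer_mb(1280, 720, 2, 8));  /* ~14.1 MB */
    printf("1080p, no AA : %.1f MB\n", framebuffer_mb(1920, 1080, 1, 8)); /* ~15.8 MB */
    printf("Xenos eDRAM  : 10.0 MB (the actual 360 budget)\n");
    return 0;
}
```

So 720p with 2xMSAA lands right at ~14MB, while a plain 1080p target is closer to 16MB.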
 
Is that still true? Back in the days of simpler hardware and simpler software it was, but developing at that low a level now would be very costly and time-consuming, and only possible with a sophisticated in-house engine, I'd have thought. There's enough optimisation to be done with job balancing, managing workloads and data structures and whatnot, that I wouldn't think poking around with clock timings would even be considered!
Exactly, especially with Microsoft and their insistence on DX9 instead of a lower level. I don't think even on PS3 they go any lower than assembly level.
 
Doesn't sound very tasty to me, more like a nightmare GameCube-to-Wii type of situation. You are basically describing an X360 x2... with what, 7-8 years between the consoles? That's pitiful. If it launches in 2012-13 it had better have 3-4GB of RAM and 8-10x the transistor budget of the X360.

I'm talking about a 2011 launch for the successor to the Wii, not the next 360 in 2013. Expecting 4GB of RAM and 10x the 360's transistor count is a complete pipe dream. Delivering ~3x the performance of the current-generation consoles two years before their successors arrive seems about right, and those successors are also bound to be scaled back compared to this generation: die sizes will go down next generation, not up. The amount of cash Sony has bled and the success of the Wii have guaranteed that.
 
I'm not certain they will go down. Gamers still expect a certain level of jump with the next consoles, or they wouldn't be complaining about the X360/PS3 tech lasting so long. However, we can be fairly certain they won't go up, if only because process shrinks are slowing down and there is not that much cost cutting left to be had that way in future generations.
 
Exactly, especially with Microsoft and their insistence on DX9 instead of a lower level. I don't think even on PS3 they go any lower than assembly level.
The 360 permits much lower-level development than "DX9" as we know it on the PC.
 
I'm talking about a 2011 launch for the successor to the Wii, not the next 360 in 2013. Expecting 4GB of RAM and 10x the 360's transistor count is a complete pipe dream. Delivering ~3x the performance of the current-generation consoles two years before their successors arrive seems about right, and those successors are also bound to be scaled back compared to this generation: die sizes will go down next generation, not up. The amount of cash Sony has bled and the success of the Wii have guaranteed that.

Yeah I thought you were talking about the next Xbox. For the next Wii, that doesn't sound bad :)
 
corduroygt said:
Exactly, especially with Microsoft and their insistence on DX9 instead of a lower level.
There was never any such insistence from Microsoft. Even on XBox1, direct hw access was not limited, only GPU documentation was (because of NVidia, not MS).

rpg.314 said:
I am doubtful if it is possible to go any lower level than assembly
There's machine code - which usually boils down to writing DMA chains and direct register access code (both of which are at least somewhat accessible on PS3).
Of course PS3 is kind of pussified by the fact you can actually debug all this stuff.
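To make that concrete, here's a purely illustrative sketch of what "writing a DMA chain" looks like in the abstract. The register layout and descriptor format are invented for the example and are not the actual PS3/RSX or Xenos interface; the point is only the shape of the work: build descriptors in memory, then poke a couple of registers to set the engine walking the chain.

```c
/* Illustrative only: an invented DMA engine interface, NOT real console hardware. */
#include <stdint.h>

/* Hypothetical register indices inside a memory-mapped register window. */
enum { DMA_REG_HEAD = 0, DMA_REG_KICK = 1 };

/* Hypothetical descriptor format the engine walks as a linked list. */
struct dma_desc {
    uint32_t src;   /* physical source address */
    uint32_t dst;   /* physical destination address */
    uint32_t size;  /* bytes to copy */
    uint32_t next;  /* physical address of next descriptor, 0 terminates the chain */
};

/* Chain two copies together and kick the engine.
 * 'regs' is the mapped MMIO window; 'descs_phys' is the physical address of
 * the descriptor array as seen by the DMA engine. */
void dma_copy_chain(volatile uint32_t *regs,
                    struct dma_desc *descs, uint32_t descs_phys,
                    uint32_t src0, uint32_t dst0, uint32_t len0,
                    uint32_t src1, uint32_t dst1, uint32_t len1)
{
    descs[0] = (struct dma_desc){ src0, dst0, len0,
                                  descs_phys + (uint32_t)sizeof(struct dma_desc) };
    descs[1] = (struct dma_desc){ src1, dst1, len1, 0 };

    regs[DMA_REG_HEAD] = descs_phys;  /* point the engine at the first descriptor */
    regs[DMA_REG_KICK] = 1;           /* start walking the chain */
}
```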
 
In an age of deferred renderers, you're going to need a whole lot more for all the MRTs if you want to "solve" the tiling issue.
I was thinking a simple priority split for individual buffers could provide some benefit even when oversubscribing eDRAM. Say you have enough eDRAM to keep a forward renderer with 2xAA happy. Adding MRTs, you keep your MSAA Z-buffer and one RGBA in eDRAM but spill the rest to UMA memory. At 4xAA, all but the Z-buffer goes out to memory. Something like that.

It needs significant hardware support though, duplicating a lot of ROP functionality.
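A rough sketch of that priority split, assuming a hypothetical ~15MB eDRAM budget (enough for a 720p forward renderer with 2xAA) and 2xMSAA G-buffer targets. The sizes and the greedy spill policy are mine for illustration, not anything Xenos actually does.

```c
#include <stdio.h>

enum placement { IN_EDRAM, IN_UMA };

struct render_target {
    const char     *name;
    unsigned        size_kb;
    enum placement  where;
};

/* Walk targets in priority order (array order here) and spill once eDRAM is full. */
static void place_targets(struct render_target *rt, int n, unsigned edram_kb)
{
    unsigned used = 0;
    for (int i = 0; i < n; i++) {
        if (used + rt[i].size_kb <= edram_kb) {
            rt[i].where = IN_EDRAM;
            used += rt[i].size_kb;
        } else {
            rt[i].where = IN_UMA;  /* oversubscribed: this target goes to main memory */
        }
    }
}

int main(void)
{
    /* 1280x720, 2xMSAA, 4 bytes/sample => 7200 KB per target. */
    struct render_target rt[] = {
        { "Z/stencil (2xAA)",    7200, IN_UMA },
        { "RT0 colour (2xAA)",   7200, IN_UMA },
        { "RT1 normals (2xAA)",  7200, IN_UMA },
        { "RT2 material (2xAA)", 7200, IN_UMA },
    };
    place_targets(rt, 4, 15 * 1024);  /* hypothetical ~15MB eDRAM budget */
    for (int i = 0; i < 4; i++)
        printf("%-20s -> %s\n", rt[i].name, rt[i].where == IN_EDRAM ? "eDRAM" : "UMA");
    return 0;
}
```

With those numbers the Z-buffer and RT0 stay resident and the remaining MRTs spill to UMA, which is the split described above.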
 
Is this a new thing? Because EG reports from 2009 (that kagemaru linked) say that MS won't certify a game that goes outside the DX APIs.

Well, that's what always confused me. I've read about developers "coding to the metal" on 360 games, but then we see reports like the one I linked above.
 