choices, choices...

lentoPastel said:
-embrace Blu-ray or hit the market sooner?

Definitely hit the market sooner. Blu-ray is such a small factor in this compared to the Sony PlayStation 3 brand name. However, if I wanted to make a great gaming machine where money wasn't a concern, I would have waited for newer and better tech.

Also, Blu-ray is Sony's thing. MS doesn't need to promote it; Sony does. They didn't really have an option.


lentoPastel said:
-go RSX (more power) or go with a 10 MB framebuffer (where do you prefer those transistors?)

An RSX built more like Xenos.

lentoPastel said:
-in-order or out-of-order CPU (a little old topic)

I would have gone with an AMD CPU like the FX-57 or FX-60, depending on when I was supposed to release the thing and how much the extra cache would help games. Though I think it would've been a mistake, seeing how nice Cell is; if not for its power, then for its marketability.

lentoPastel said:
-512 MB or more? In this case, if you think 768 MB is better for the PS3, where would you add those extra 256 MB? For RSX or for Cell?

512 MB, and make an HDD standard. Or add 256 MB of cheaper, lower-bandwidth RAM just to act as a large cache (faster than the HDD, but slower than regular RAM) and keep its price low. If the first option isn't on the table, in a perfect world I would have split those 256 MB equally, but since it isn't a perfect world, I would have placed them wherever it's cheapest (with Cell?).

lentoPastel said:
- Free internet gameplay or invest in servers and charge for them?

Free Internet play, where people can set up their own servers (and play on them at the same time) and have them listed on the main servers. But I would charge for most other services: selling complementary stuff for games (avatars, levels, music, wallpapers, ringtones, t-shirts, coffee mugs, whatever...), and hosting free tournaments for the highest-ranked players, letting them win stuff as promotion.

I would also put advertising in online games while they load. "This session is sponsored by..."

Were I MS, I would have bought the Baldur's Gate name and made a lot of games based on it. I would also stop "converting" games between Xbox and PC.

Were I Sony, I would've started development of the OS and OS software a lot earlier. I don't want to patch my system like it's a freaking PSP (read: PC). Also, I like to deliver what I promise.
 
EndR said:
For me, the 360 is excellent in many ways, but some things would have helped it out more in the long run:

More cache - easier for devs, more possibilities
More MHz on the GPU - ATI, what happened with Fast14? A 600 MHz GPU would have been a nice little boost
HDD included on all units - helping devs out and giving more possibilities

I think the eDRAM was holding back the clockspeed.
 
Changes to the graphics chipsets:

Increase the level of multisampling beyond 4x while keeping the performance penalty low.

Make multisampling on high-precision floating-point (FP16+) render targets both possible and practical, for HDR + AA.

Keep all color blending at 128-bit accuracy.

Keep the depth buffer at or near 32-bit floating-point accuracy.

Free up the die space being used by the eDRAM.

Double the effectiveness of the shaders.

Employ tile-based deferred shading (a rough sketch of the idea follows below).
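That last item deserves a note, since it changes how the chip spends its work rather than just how much work it can do. Purely as an illustration (nothing below is from the thread; the tile size, the "primitives", and the shade function are all invented), here is a toy Python sketch of the idea: bin the geometry into small screen tiles, resolve visibility per tile in buffers small enough to stay on chip, and run the expensive shading step only once per visible pixel. Axis-aligned quads with constant depth stand in for real triangles to keep it short.

```python
# Toy tile-based deferred shading (illustrative only; not from the thread).
# Geometry is binned into small screen tiles, visibility is resolved per tile
# in tiny on-chip-sized buffers, and shading runs once per visible pixel.

TILE = 8                      # tile size in pixels (hypothetical)
WIDTH, HEIGHT = 32, 16        # tiny "screen" (hypothetical)

# Each "primitive" is (x0, y0, x1, y1, depth, material_id) -- all made up,
# with axis-aligned quads standing in for triangles.
primitives = [
    (0, 0, 31, 15, 0.9, 1),   # background quad
    (4, 2, 20, 10, 0.5, 2),   # mid-distance quad
    (10, 6, 28, 14, 0.2, 3),  # near quad
]

def overlaps(prim, tx, ty):
    x0, y0, x1, y1, _, _ = prim
    return x1 >= tx and x0 < tx + TILE and y1 >= ty and y0 < ty + TILE

def shade(material_id):
    # Stand-in for the expensive lighting pass.
    return material_id * 10

framebuffer = {}
shaded_pixels = 0

for ty in range(0, HEIGHT, TILE):
    for tx in range(0, WIDTH, TILE):
        # Binning: only primitives touching this tile are considered.
        binned = [p for p in primitives if overlaps(p, tx, ty)]

        # Per-tile depth/material buffers, small enough to live on chip.
        depth, material = {}, {}
        for x0, y0, x1, y1, z, mat in binned:
            for y in range(max(y0, ty), min(y1 + 1, ty + TILE)):
                for x in range(max(x0, tx), min(x1 + 1, tx + TILE)):
                    if z < depth.get((x, y), 1.0):
                        depth[(x, y)] = z
                        material[(x, y)] = mat

        # Deferred shading: only the surviving fragment per pixel gets shaded.
        for pixel, mat in material.items():
            framebuffer[pixel] = shade(mat)
            shaded_pixels += 1

print("pixels shaded:", shaded_pixels)  # equals covered pixels, not total fragments
```

The payoff is that overdraw no longer costs shading work, and the per-tile buffers never have to touch external memory, which is the same bandwidth pressure the eDRAM is there to relieve.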
 
Oh, and if I were Nintendo I would have gone with an Intel chip instead of an IBM-based PPC, after seeing the benchmarks of the new Mac Minis comparing the old G4 to the new Core Solos.
 
ninzel said:
Oh, and if I were Nintendo I would have gone with an Intel chip instead of an IBM-based PPC, after seeing the benchmarks of the new Mac Minis comparing the old G4 to the new Core Solos.
That would somewhat complicate backwards compatibility.
 
ninzel said:
Oh, and if I were Nintendo I would have gone with an Intel chip instead of an IBM-based PPC, after seeing the benchmarks of the new Mac Minis comparing the old G4 to the new Core Solos.
A comparison between a seven year old CPU fabbed at 130nm and a fairly modern 65nm processor is hardly fair; it's basically equivalent to running benchmarks between an original Radeon and a GeForce 7900 GTX.

Secondly, the wholesale price of the Core Solo will likely be higher than whatever the Revolution's retail price turns out to be at launch.
 
akira888 said:
The price of six USB ports, a CF I/II slot, et alia is quite likely an order of magnitude lower than that of 256 MiB of 700 MHz GDDR3 RAM. I would not be surprised if external memory was the single most expensive component in the PS3 by far.
A 256 MB Chrome S27 (from S3) is only $115 and has 700 MHz GDDR3 memory. That's for an entire graphics board. The memory can't possibly cost more than $50 or so, and it's probably less.
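A hedged back-of-envelope makes the same point. Apart from the $115 street price above, every number here is an assumption of mine rather than anything from the post, but even generous guesses for the other components leave the memory at roughly the $50 ceiling:

```python
# Illustrative only: all numbers except the $115 board price are guesses,
# used just to bound what 256 MB of GDDR3 can plausibly cost.

board_retail = 115.0                              # S3 Chrome S27 256 MB street price
retail_and_distribution = 0.25 * board_retail     # assumed channel margin
gpu_die = 20.0                                    # assumed
pcb_cooler_misc = 15.0                            # PCB, cooler, connectors, assembly (assumed)

memory_ceiling = board_retail - retail_and_distribution - gpu_die - pcb_cooler_misc
print(f"upper bound on memory cost: ~${memory_ceiling:.0f}")   # ~$51 with these guesses
```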
 
lentoPastel said:
Maybe this is a good time to ponder... now that the next-gen battle is undecided, what would you have done differently, were you a top MS, Sony, or Nintendo exec, based on what we know?

I can think of several items:

-embrace Blu-ray or hit the market sooner?
-go RSX (more power) or go with a 10 MB framebuffer (where do you prefer those transistors?)
-in-order or out-of-order CPU (a little old topic)
-512 MB or more? In this case, if you think 768 MB is better for the PS3, where would you add those extra 256 MB? For RSX or for Cell?
- Free internet gameplay or invest in servers and charge for them?
Microsoft had to launch before the PS3 or they'd be dead. I don't think HD DVD or Blu-ray were options.

For the GPU, we don't know that the raw shader power is weaker in Xenos. You can talk about the dual-issue MADD/DP3 all you want, but the same architecture on the PC gave a very small increase in performance over the single-MADD GeForce 6 series in pixel shader tests. Clock for clock and pipe for pipe, the single-MADD R520 is not much behind G70. An R520 shader unit isn't necessarily equal to a Xenos shader unit in capability; I'm just saying we don't know. Texturing ability is the only real power advantage I see in RSX.

Now, as to whether the eDRAM is worth it, remember that it saves tons of transistors on Z-compression, colour compression, and read/write buffers for pixels, and it reduces memory controller complexity, and who knows what else. A good chunk of the cost is offset right there. More importantly, the bandwidth savings are just too big to ignore. If you can't get data to it, your extra power often just sits idle.
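To put rough numbers on that bandwidth point, here is a quick back-of-envelope in Python. The fragment rate and bytes per sample are my own illustrative assumptions, not figures from the thread; the ~22.4 GB/s is roughly what a 128-bit, 700 MHz GDDR3 interface provides. It is only meant to show how quickly multisampled framebuffer traffic can swallow a bus of that class.

```python
# Rough, illustrative back-of-envelope (assumed numbers, not from the thread):
# framebuffer traffic at 720p-ish fill rates with 4x MSAA versus a main-memory
# bus in the ~22 GB/s class (roughly a 128-bit, 700 MHz GDDR3 interface).

fragments_per_second = 500e6   # fragments Z-tested/blended per second (assumed)
samples_per_fragment = 4       # 4x multisampling
bytes_per_sample = 16          # colour read+write (8 B) plus Z read+write (8 B), assumed

framebuffer_traffic = fragments_per_second * samples_per_fragment * bytes_per_sample
main_memory_bw = 22.4e9        # 128-bit bus at 1400 MT/s

print(f"framebuffer traffic: {framebuffer_traffic / 1e9:.1f} GB/s")        # ~32 GB/s
print(f"share of a 22.4 GB/s bus: {framebuffer_traffic / main_memory_bw:.0%}")
```

With guesses like these, colour and Z traffic alone already exceed the whole bus before a single texture fetch, vertex, or CPU access, which is why moving that traffic onto a dedicated eDRAM pool (or compressing it very aggressively) pays for itself.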
 
dukmahsik said:
eDRAM was in the PS2 and GC, and look what both consoles were able to do near the end of their lives, especially the PS2.

I'm not sure there's a logical implication there. Of course eDRAM was a big feature of the PS2's rendering system, a strength indeed, but if it hadn't been a strength, something else would have been. Late-cycle results are really a function of developer talent and the lessons learned from previous iterations of software on the system, all contributing toward very mature software on a platform. You see the software really mould around the hardware at that stage, whatever its strengths and weaknesses might be.
 
dukmahsik said:
eDRAM was in the PS2 and GC, and look what both consoles were able to do near the end of their lives, especially the PS2.


What?

I was most impressed with what the Xbox was doing at the end... as far as growth in graphics goes...
 
superguy said:
What?

I was most impressed with what the Xbox was doing at the end... as far as growth in graphics goes...

If you look at Jak 1 through Jak 3, or Eternal Darkness through RE4, versus Halo 1 to Halo 2, I thought the GC and PS2 showed a huge improvement.
 
superguy said:
I was actually thinking of Halo 2 as a poor example.

Look at games like Doom 3 or Half-Life 2 on Xbox.

I think the Xbox just started out so strong that the improvements seemed incremental at best, at least to my eyes.
 
Mintmaster said:
A 256 MB Chrome S27 (from S3) is only $115 and has 700 MHz GDDR3 memory. That's for an entire graphics board. The memory can't possibly cost more than $50 or so, and it's probably less.

And that's compared to the $3-4 those ports probably cost.
 