Going from 3MB to 6MB is not massive.
Sorry, my own math was off; luckily I didn't post numbers based on it :smile:
That said, I'll just stop guessing what's in there and silently wait for the truth to come out...
Going from 3MB to 6MB is not massive.
Ok. Excuse me, but what the fuck! A couple hundred pages about this stuff have already gone by! This is what we should know about Hollywood based on all the research gathered:
- It is not an overclocked Flipper (it's too big)
- It is custom built for the Wii (says ATI)
- It is in between the Xbox's and 360's GPUs in performance. Probably right in the middle.
- The 24MB of the GC's main memory is now located on the GPU's die.
- There is eDRAM on Hollywood, but no one is sure how much (it could be more than the 3MB on Flipper)
- There are 2 or more dies on Hollywood (Napa and Vegas)
- The GC's "ARAM" seems to have been replaced by 24MB of 1T-SRAM.
- Hollywood was made on the 90nm process
- Most likely clocked at 243MHz
Sonic and the Secret Rings shows a significant improvement over the GC's visuals. Try playing it.
- Same 6 bits per component RGBA framebuffer (you can tell from the banding, like 16-bit on PC; is that right?)
There is no way that Hollywood's fillrate, vertex, and shader performance are at the midpoint between Xbox and X360. Even a doubled-up Flipper at the known clockspeed would only be equivalent to the XGPU in fillrate and still have weaker shaders.
Ok. Excuse me, but what the fuck! A couple hundred pages about this stuff have already gone by!
*boil*
Correct from your list:
- It is not an overclocked Flipper (it's too big)
- It is custom built for the Wii (says ATI)
- Hollywood was made on the 90nm process
- Most likely clocked at 243MHz
- There is eDRAM on Hollywood, but no one is sure how much (it could be more than the 3MB on Flipper)
For the remaining stuff:
Vs 360's graphics chip, make it a quarter the raw performance, to fill one third of the pixels per frame. Write that on a piece of paper and flush it down the toilet because even though it's the closest you'll ever get, it still doesn't tell you anything about real-world game graphics. Done.
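The "one third of the pixels per frame" figure follows directly from the consoles' target resolutions (assuming 640x480 output for Wii and 1280x720 for the 360):

```python
# Pixel counts at each console's typical output resolution.
wii_pixels = 640 * 480      # Wii renders at 480p
x360_pixels = 1280 * 720    # Xbox 360 targets 720p

print(x360_pixels / wii_pixels)  # -> 3.0
```

So per frame, the 360's GPU has exactly three times as many pixels to fill at those resolutions.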
- There are 2 or more dies on Hollywood (Napa and Vegas)
There are exactly 2, as is patently obvious from the images with the heatspreader removed.
- The GC's "ARAM" seems to have been replaced by 24MB of 1T-SRAM.
False. It has been replaced with a single 64MB GDDR3 chip on a 32-bit-wide bus.
- The 24MB of the GC's main memory is now located on the GPU's die.
No. It is one of the two chips/dies on the multi-chip module known as Hollywood.
I.e. one of the two chips is 24MB of 1T-SRAM; the other contains all the logic plus an as-yet unknown (smaller) amount of embedded memory for the framebuffer.
http://www.beyond3d.com/forum/showpost.php?p=928306&postcount=4 (from the first page of right here!)
32-bit (8:8:8:8), 24-bit (6:6:6:6), and 16-bit (4:4:4:4) banding all look different. And yes, they all have banding. The more bits you have, the less banding you have, but 8 bits per component does not magically grant you the ability to render infinitely many colors, and you will still get visible color banding in certain situations.
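The banding claim is easy to illustrate by quantizing a smooth 8-bit gradient down to fewer bits per component (a toy sketch, not Wii-specific code):

```python
def quantize(value, bits):
    """Quantize an 8-bit channel value (0-255) to `bits` bits, then expand back."""
    levels = (1 << bits) - 1
    step = round(value / 255 * levels)   # snap to the nearest representable level
    return round(step / levels * 255)    # re-expand to the 8-bit range for display

# A smooth 0..255 gradient collapses to fewer distinct shades at lower depths;
# the visible jumps between the surviving shades are the banding.
for bits in (8, 6, 4):
    shades = len({quantize(v, bits) for v in range(256)})
    print(f"{bits} bits/component: {shades} distinct shades")
```

This prints 256, 64, and 16 distinct shades respectively: every depth bands on a long enough gradient, just with coarser steps as bits are removed.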
Dude, don't curse at me. BTW, how are you not banned for that? Anyway... I said the ARAM has been replaced because the 1T-SRAM is where the ARAM used to be in the GameCube. The 64MB of GDDR3 did not replace the ARAM's purpose. ARAM was never the GC's main memory; the 24MB of 1T-SRAM was.
While I'm not going to argue that it would be halfway between XGPU and Xenon, I think even a doubled-up Flipper at 243MHz would be significantly more powerful than XGPU. It would have twice the pixel fillrate compared to XGPU.
Do the math: 650 MP/sec / 162 MHz * 243 MHz = 975 MP/sec, only half of XGPU's fillrate. And that's without the hardware MSAA that XGPU has (XGPU does 3.7 GSamples/sec). You'd need to double the pipelines to achieve XGPU's fillrate.
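The scaling in that post is linear in clock: fillrate is pipelines times clock, so the same pipeline count moved from 162 MHz to 243 MHz gains exactly 1.5x. A quick check of the arithmetic (650 MP/s is the Flipper figure used in the post):

```python
flipper_fill = 650e6              # MP/s at 162 MHz (figure from the post)
scaled = flipper_fill * 243 / 162 # same pipeline count at Hollywood's clock
print(scaled / 1e6)               # -> 975.0 MP/s, matching the post
```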
We will see more real-world geometry, primarily because the T&L unit won't be used to run special vertex shader effects. We just...won't have those effects.
Dude, don't curse at me. BTW, how are you not banned for that? Anyway... I said the ARAM has been replaced because the 1T-SRAM is where the ARAM used to be in the GameCube. The 64MB of GDDR3 did not replace the ARAM's purpose. ARAM was never the GC's main memory; the 24MB of 1T-SRAM was.
Dude, I find it pretty offensive that you find it offensive! Seriously, 200 pages of all of us going in circles, and now the information is on the first page of this very aptly named thread. I'm not sure which is the bigger insult: asking for yet another repeat of the cycle of madness, maybe because it's more convenient than locating the summary on page one of the thread with "summary" in the title, or spelling out wtf. I do have an opinion, though.
Do the math: 650 MP/sec / 162 MHz * 243 MHz = 975 MP/sec, only half of XGPU's fillrate. And that's without the hardware MSAA that XGPU has (XGPU does 3.7 GSamples/sec). You'd need to double the pipelines to achieve XGPU's fillrate.
You're overestimating XGPU's fillrate by a factor of two. It's two billion texels per second, theoretically, not pixels.
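The pixel-versus-texel distinction comes down to texture units per pipeline. Using commonly cited NV2A figures (233 MHz, 4 pixel pipelines, 2 texture units each; these counts are assumptions, not from the thread):

```python
clock = 233e6          # NV2A core clock, Hz (assumed figure)
pipes = 4              # pixel pipelines (assumed)
tmus_per_pipe = 2      # texture units per pipeline (assumed)

pixel_fill = pipes * clock                  # ~932 MPixels/s
texel_fill = pipes * tmus_per_pipe * clock  # ~1.86 GTexels/s, the "two billion texels"
print(pixel_fill / 1e6, texel_fill / 1e9)
```

Quoting the texel throughput as if it were pixel fillrate is exactly the factor-of-two overestimate being called out here.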
Replacing the slow A-RAM wasn't a problem. Replacing the 1T-SRAM main memory though was a problem, because there are access patterns where 1T-SRAM has such low latencies that current mainstream memory can't match it.
Does that explain why certain emulated GameCube games run slightly faster?
Do the math: 650 MP/sec / 162 MHz * 243 MHz = 975 MP/sec, only half of XGPU's fillrate. And that's without the hardware MSAA that XGPU has (XGPU does 3.7 GSamples/sec). You'd need to double the pipelines to achieve XGPU's fillrate.
My mistake, I thought XGPU had 2 GTexels and 2 GPixels.
Blazcowicz, I am completely confused. You were complaining about the 6:6:6:6 banding on Wii, but you brought up a GameCube game. It will be interesting to see if Two Thrones on Wii has the same banding issues that the Cube version did.