Radeon 8500 "R200" compared to Wii's Hollywood

swaaye

This thought process randomly popped into my brain earlier today. Is the old Radeon 8500 more than a match for Wii?

  • pixel shader 1.4, with the useful phase instruction (dependent texture reads) among other things
  • vertex shader 1.1
  • effectively two geometry processors because it also has a DX7 TCL unit
  • TruForm (arguably worthless, perhaps)
  • 4x2 design at 250-275MHz gives somewhat more pixel fillrate but much more texture fillrate (rough numbers in the sketch after this list)
  • memory access is a wash: 64-128MB of dedicated memory with more bandwidth to that large bank than Wii has to its 64MB, but no equivalent of Wii's small ultra-fast embedded RAM bank
  • No color depth limitations (i.e. full 8-8-8-8 color)
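
A quick back-of-the-envelope sketch of that fillrate point, using the commonly quoted clocks and pipeline layouts (4x2 at 275MHz for R200, 4x1 at 243MHz for Hollywood). Hollywood's internals were never officially documented, so treat its figures as assumptions:

```python
# Rough theoretical fillrates. The clock and pipeline figures are the
# commonly quoted ones, not official specs; Hollywood's exact 4x1 layout
# is an assumption, since its internals were never published.
def fillrates(clock_mhz, pixel_pipes, tmus_per_pipe):
    pixel = clock_mhz * pixel_pipes                   # Mpixels/s
    texel = clock_mhz * pixel_pipes * tmus_per_pipe   # Mtexels/s
    return pixel, texel

for name, clock, pipes, tmus in [("Radeon 8500 (R200)", 275, 4, 2),
                                 ("Wii Hollywood",      243, 4, 1)]:
    px, tx = fillrates(clock, pipes, tmus)
    print(f"{name}: {px} Mpix/s, {tx} Mtex/s")
# R200: 1100 Mpix/s, 2200 Mtex/s; Hollywood: 972 Mpix/s, 972 Mtex/s --
# somewhat more pixel fillrate, roughly double the texture fillrate.
```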
Of course R200 wasn't without its quirks.

  • No multisample AA, and supersampling is horrible for performance (similar to Wii)
  • Anisotropic filtering cannot be combined with trilinear filtering (bilinear only). I'm not sure I've ever seen AF from Wii/Cube, however, and sometimes they even appear to skip mipmapping.
Overall, however, I think the results from R200 would be prettier and faster. Thoughts? ;)
 
Considering when it launched, you should really be looking at the Radeon R300 generation or even the R4x0 gen. The Wii launched in 2006. The 8500 was a 2001 part, I believe.

Hollywood would be very underpowered even compared to low-end, cost-reduced parts of the same era.

The 8500 was a really good card, btw. The GeForce Ti series was faster, but the 8500 was really ATI's first card able to compete.
 
Considering when it launched, you should really be looking at the Radeon R300 generation or even the R4x0 gen. The Wii launched in 2006. The 8500 was a 2001 part, I believe.
Well, I think it's rather apparent that Wii isn't much different from Xbox or Cube when it comes to effects or performance. They aren't really competitive with R300 or R4x0. I suppose that answers my original question, however.

The 8500 was a really good card, btw. The GeForce Ti series was faster, but the 8500 was really ATI's first card able to compete.
The 8500 has more interesting shader hardware than the GF4 Ti, but the GF4 Ti is faster almost without exception.
 
PS1.4 and VS1.1 are already going to buy you things that you're not going to get with Hollywood's fixed-function T&L (which IIRC, ERP said is actually weaker than a GF2's) or TEV. I think the fillrates are comparable, and of course there are some of those unique indirect texturing tricks you can do with the TEV, but other than that...well, you can run Doom 3 on an 8500.
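
For anyone unfamiliar with the term, here's a toy software sketch of what an indirect (dependent) texture read is: the result of one texture fetch perturbs the coordinates of a second fetch. This is the class of effect both PS1.4's phase instruction and the TEV's indirect unit enable (EMBM-style bump mapping, heat haze, and so on). The textures and scale factor below are made up purely for illustration:

```python
import numpy as np

H = W = 64
base   = np.random.rand(H, W, 3)   # color texture (made-up data)
offset = np.random.rand(H, W, 2)   # indirect map, e.g. a distortion texture

def sample_indirect(u, v, scale=4):
    # Pass 1: fetch an offset from the indirect map at (u, v).
    du, dv = offset[v % H, u % W]
    # Pass 2 (the dependent read): perturb the base-texture coordinates
    # by the fetched offset before sampling.
    return base[int(v + dv * scale) % H, int(u + du * scale) % W]

print(sample_indirect(10, 20))  # one distorted color fetch
```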
 
The GC was a 2001 console, eastmen.

Yeah, but the OP says

Is the old Radeon 8500 more than a match for Wii?

The Wii was a 2006 console. You could realistically put an X1900 or X1800 series based GPU in the Wii. But even a 9700 or X800 would be small on 90nm. I don't think the Wii as a whole would be a match for any of these GPUs; I would think even a 9700 would outclass it in every way, considering a 9700 Pro is built on 150nm with 107M transistors. The R200 is 60M transistors on the same process. The R480 is 160M transistors on 130nm. These were all built at least two years prior to the Wii and are all small chips that would be made even smaller on 90nm. I don't really know why Nintendo chose to go with Hollywood again, but it doesn't seem to matter. Though if they do try to compete next gen on graphics, their first-party teams may be at a disadvantage.
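
To put rough numbers on how small those chips would get: as a first approximation, die area scales with the square of the process-node ratio. Real shrinks never scale that cleanly (I/O pads and analog blocks don't shrink linearly), and the die areas below are the commonly quoted ones, so treat these as ballpark figures only:

```python
# Back-of-the-envelope optical-shrink estimate: area ~ (node ratio)^2.
# Real chips only approximate this, so the outputs are ballpark figures.
def shrunk_area(area_mm2, from_nm, to_nm):
    return area_mm2 * (to_nm / from_nm) ** 2

for name, area, node in [("R300 (9700 Pro)", 218.0, 150.0),
                         ("R420/R480 (X800)", 281.0, 130.0)]:
    print(f"{name}: ~{area:.0f} mm^2 at {node:.0f}nm -> "
          f"~{shrunk_area(area, node, 90.0):.0f} mm^2 at 90nm")
# R300: ~218 -> ~78 mm^2; R420/R480: ~281 -> ~135 mm^2 -- both small chips
# by 2006 standards.
```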
 
Forgive me if I'm wrong, but I seem to remember seeing 8500s in GC SDKs back when Luigi's Mansion was being tossed around.

Edit - Memory is coming back... I think it was an advertisement regarding FireGL 8500s, linking them with eeeearly Cube development (Luigi's Mansion), as if the two were, on some level, on similar ground.

Edit2 - Thinking that may have been a FireGL 7500 advert... hmmm, but whatever...
 
eastmen, you misunderstood my meaning with "more than a match for". I can see why that might happen. What I meant was "is the 8500 superior to Hollywood".

In retrospect, after considering my first hand experience with Xbox and Wii, I do think the answer to the above is clear.
 
If the Wii had more memory, maybe we could've had BF2 on it? :p

The Radeon 8500 is the minimal card for that game :D
 
Yup, the Radeon 8500 can run some later stuff fairly well. I've run Far Cry and Doom 3 with it, and at "SD" 640x480 it's definitely quite adequate.
 
The Radeon 8500 was no doubt more advanced in a lot of ways and would very likely have produced better-looking games. But it was also a very different design; for instance, it would have rendered Wii unable to play GC games. Not really sure what the point of the thread is, to be honest. We all know that there were plenty of GPUs available for Wii that were a lot more modern than Flipper/Hollywood, and so did Nintendo.
 
But it was also a very different design; for instance, it would have rendered Wii unable to play GC games.
True.

Not really sure what the point of the thread is, to be honest. We all know that there were plenty of GPUs available for Wii that were a lot more modern than Flipper/Hollywood, and so did Nintendo.
Sure. As you said, it comes down to their desire for 1) backwards compatibility, like MS & Sony, and 2) keeping it cheap.
 
Yeah, they had two options for backwards compatibility: a system many times more powerful than GC, or a system with an architecture very close to GC. Considering the risk they were taking with the new control system, they weren't prepared to go for the first option. Pity, because while graphics aren't what makes games great, I'd still have liked a much more powerful system.
 
I would compare Hollywood to an R580, since it can do HDR+AA in realtime graphics in games like Monster Hunter 3, Silent Hill, Resident Evil: Darkside Chronicles, etc. Let's not forget that HDR+AA was not possible until the ATI X1000 (R520 and R580) series came.
 
I would compare Hollywood to an R580, since it can do HDR+AA in realtime graphics in games like Monster Hunter 3, Silent Hill, Resident Evil: Darkside Chronicles, etc. Let's not forget that HDR+AA was not possible until the ATI X1000 (R520 and R580) series came.

Lol, you're mistaken, my friend.
 
The 8500 was fast - especially when overclocked with a good aftermarket cooler - but the filtering was awful.
 
eastmen, you misunderstood my meaning with "more than a match for". I can see why that might happen. What I meant was "is the 8500 superior to Hollywood".

In retrospect, after considering my first hand experience with Xbox and Wii, I do think the answer to the above is clear.

I'm just pointing out the other choices they had within ATI. I doubt ATI even showed them the 8500, which at that point was very old. Like I said, even the 9700 Pro would have been tiny on 90nm, 65nm, or 45nm, and would have destroyed not just the 8500 but also Hollywood.

Considering it was released in 2006, as I said earlier, they could have gotten away with a GPU all the way up to the Radeon X1900. I doubt Nintendo would even have considered it.

Of course an R300 or R400 wouldn't have matched Xenos, and the R400 could actually have been more powerful in the long run than what went into the PS3. It was Nintendo's bad that they didn't take advantage of the years of tech progress since the GameCube was made.
 
Of course an R300 or R400 wouldn't have matched Xenos, and the R400 could actually have been more powerful in the long run than what went into the PS3. It was Nintendo's bad that they didn't take advantage of the years of tech progress since the GameCube was made.

I don't think R400 would have been a match for RSX at all; it was in NV40 performance territory and didn't even support SM3. R520 would probably have been more than a match for RSX, though, and certainly highly competitive with Xenos. R580 would have put Wii ahead of both.
 