WiiGeePeeYou (Hollywood): what IS it?

Status
Not open for further replies.
I hope developers are able to push the system more once the second generation of games is out. I think a lot of developers may be spending more time and effort working on the new controller for their first round of games. Maybe that's why Nintendo hasn't worried about getting finalized specs out: they knew the developers would be busy working on new control ideas. I mean, I don't know exactly, but the dev kits haven't been out too long, have they?

Anyway, the games aren't suddenly going to start looking like Xbox 360 games, but I think they will start to look better. I think Mario Galaxy looks great, and better than what the GC could handle.
 
swaaye said:
Considering it's still in development, and stuff like Radeon 9700 is like 4 yrs old, uhhh lol.... I'm pretty sure it'll be neato.


I cannot believe that at this late point in the cycle Hollywood is not completed and taped out. It has to be. It's late May and Wii launches in November.

The Xbox 1's NV2A GPU was finished by Feb 2001, and had some teething problems going into production, but development was finalized in early 2001. Only clock-speed adjustments were made (from 250 MHz to 233 MHz).


it is the overall development kits for Wii that are being finalized for June release.

Hollywood has to have been completed, probably months ago.


So far there is no indication that Hollywood is even nearly as powerful as the R300 (Radeon 9700 Pro) from 2002, but I hope I'm wrong.
 
Megadrive1988 said:
it is the overall development kits for Wii that are being finalized for June release.

Hollywood has to have been completed, probably months ago.
That's what you'd have thought, but it doesn't seem that way. 1) No Wii hardware showing at E3. 2) Slide says 'GPU being developed by ATi' or somesuch.
 
IIRC nobody had a chance to get even near Xenos until June last year, although they'd had it since 11/2004; maybe the same thing is happening here.

Even if it isn't as "powerful" as (whatever), fixed-function HW, like the rumoured physics unit, can add a lot of performance at low transistor counts (e.g. like a PPU, or Cell), and, unlike raw power, there is no good way to emulate it or tune it down for E3. I even wonder if they really wanted good-looking games at E3 (given the games they had and the games they showed).

I agree the signs look bad so far, but I'll wait until I see final games on final HW.
 
Shifty Geezer said:
That's what you'd have thought, but it doesn't seem that way. 1) No Wii hardware showing at E3. 2) Slide says 'GPU being developed by ATi' or somesuch.



I'd forgotten about that already. We'll see; hopefully we'll all be pleasantly surprised once we see Wii games using what Hollywood is actually capable of.
 
I just recently noticed that a GF6800XT board with a 350MHz GPU and 256MB of GDDR3 only cost around $100 retail.:oops:

I also noticed that a RadeonX1600 Pro with 256MB of RAM is around $100 retail also.:oops:

This would mean the same board made using 90nm would run much cooler and be cheaper too. And that is less than half the price of a Wii console, including 256MB of RAM.:???:
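As a rough illustration of the die-shrink argument above, here is a sketch; the 110nm starting node and ideal geometric scaling are assumptions for illustration (real shrinks rarely achieve the ideal):

```python
# Ideal die-area shrink between process nodes: area scales with the
# square of the feature-size ratio. Real shrinks fall short of this,
# so treat the result as an upper bound on the saving.

def ideal_shrink(old_nm: float, new_nm: float) -> float:
    """Fraction of the original die area left after an ideal shrink."""
    return (new_nm / old_nm) ** 2

# e.g. a hypothetical 110nm GPU moved to 90nm
print(round(ideal_shrink(110, 90), 2))  # 0.67: roughly a third of the area saved
```

Smaller dies mean more chips per wafer and better yields, which is where most of the cost reduction comes from.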

If Hollywood ends up being less powerful than a GF6800 or X1600, I will be quite disappointed.:rolleyes:
 
That's the weirdness. What we've seen so far is so well below what we'd consider plausible price-wise, it defies belief. But then so does the idea that the final specs will be 5x (or whatever) what devs are currently targeting. :???:
 
NANOTEC said:
I just recently noticed that a GF6800XT board with a 350MHz GPU and 256MB of GDDR3 only cost around $100 retail.:oops:

I also noticed that a RadeonX1600 Pro with 256MB of RAM is around $100 retail also.:oops:

This would mean the same board made using 90nm would run much cooler and would be cheaper too. That is less than half the price of a Wii console including 256MB of RAM.:???:

If Hollywood ends up being less powerful than a GF6800 or X1600, I will be quite disappointed.:rolleyes:

Within the same architecture they could have up to 204M transistors in the GPU and 84M in the CPU, with 96MB of 1T-SRAM and 64MB of A-RAM, and all else being equal it would still come in at $99. With other architectures it could have 43mm² for the CPU and 120mm² for the GPU (e.g. something like a 7600) and keep the same $99 price (at moderate clock speeds).

So yes, it is easily overpriced.

Shifty Geezer said:
That's the weirdness. What we've seen so far is so well below what we'd consider plausible price-wise, it defies belief. But then so does the idea that the final specs will be 5x (or whatever) what devs are currently targeting.

Personally, I think there are few things they could do at this point that would surprise me.
 
Shifty Geezer said:
That's the weirdness. What we've seen so far is so well below what we'd consider plausible price-wise, it defies belief. But then so does the idea that the final specs will be 5x (or whatever) what devs are currently targeting. :???:

As I've said before, if you're making a game to show in playable form, then all you can target is the hardware you're working on. There is still plenty of time for visual upgrades to the games we saw once the devs spend more time with final hardware. Though 5 times as powerful is a bit over the top, I think: the dev kits used early on (likely used for the majority of these games) were already at least twice as powerful as GC. I don't think anyone's expecting the final system to be over 10 times as powerful as GC.
 
Shifty Geezer said:
That's the weirdness. What we've seen so far is so well below what we'd consider plausible price-wise, it defies belief.

That's the beauty of it, though. As the mantra of the gaming industry goes, "it's all about the games". Sure, it's a rip-off in comparison to what has been presented as "next gen" so far. But the price is so low that people who want to play the next Mario, the Zelda after next, the next Metroid, etc. will buy the system regardless, because the price isn't prohibitive.

It's friggin' genius on their part. Am I disappointed as a gamer? F&%^ yeah. Does that make a difference in the grand scheme of things? No. Because by Xmas next year I expect the price to be around $150 with a handful of "value" launch titles on the shelf. This Xmas the 360 should be able to keep my full attention.

We'll see how it works out.
 
It would be very interesting to know what 5x or 10x could do. As we saw with current ATI cards, 4x the power (e.g. X1600 → X1900) only brings about 3x the resolution in the same game, so if it has only 1/4 of the power (versus the 10-20x of 360/PS3) it should at least be able to play the same games.
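The power-versus-resolution point can be made concrete with a pixel-count sketch; the resolutions below (480p vs 720p) are illustrative stand-ins, not the benchmark settings of those cards:

```python
# Pixel counts at two common output resolutions, to show why a 4x
# increase in raw GPU power rarely buys a full 4x resolution jump:
# vertex work and bandwidth don't shrink along with the pixel count.

def pixels(width: int, height: int) -> int:
    """Total pixels per frame at a given resolution."""
    return width * height

sd = pixels(640, 480)    # 307,200 pixels (GC/Wii-class output)
hd = pixels(1280, 720)   # 921,600 pixels (360/PS3-class output)
print(hd / sd)           # 3.0: 720p is exactly 3x the pixels of 480p
```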

I personally think this is a lot like PDZ at last year's E3, and while I don't expect it to reach the 360's level, I think it will be enough for multiplatform games.
 
At $250, the Wii seems WAY overpriced for its CPU, what we know of the GPU, and the amount of RAM it's coming with. Even at $199 it seems overpriced.

But maybe the Wiimote / free-hand controller, Nunchuk attachment and sensor bar are somewhat costly items. Then again, maybe not.

Nintendo always makes a profit on its hardware anyway.


But we won't really know until we can at least see what Hollywood, Wii's GPU, can do, even if we never get the complete Wii GPU specs from Nintendo.
 
Megadrive1988 said:
At $250, the Wii seems WAY overpriced for its CPU, what we know of the GPU, and the amount of RAM it's coming with. Even at $199 it seems overpriced.

Besides RAM (and we don't even know if there is A-RAM), we know as much about the CPU as we know about the GPU, i.e. that they are extensions of the GC parts running 50% faster. For all we know, the CPU could be a tri-core Gekko with 2 VMX units per core, giving it ~22 GFLOPS (more than GC :LOL: ) while staying within Moore's law.

But maybe the Wiimote / free-hand controller, Nunchuk attachment and sensor bar are somewhat costly items. Then again, maybe not.

Nintendo always makes a profit on its hardware anyway.

Maybe the sensor bar, but they considered including 2 controllers per Wii, so they can't be that expensive.


[b]but we won't really know until we can at least see what Hollywood, Wii's GPU, can do, even if we never get the complete Wii GPU specs from Nintendo.[/b]

I think even the Flipper architecture could be extended to great effect; I imagine 3 upgraded TEV units could add a lot of FX to a game, so Hollywood could do some very nice things if they want it to, plus it could include non-graphics HW (like the rumoured physics unit).

BTW I am almost sure that we (at B3D) will get its specs.
 
There's a new podcast on IGN where Matt mentions rumours from devs saying Hollywood features no pixel-shader capabilities. Could this be a case of developers only associating "pixel shaders" with the form MS has created with DirectX?

Could it have the power of an X1600 but be designed around the architecture of Flipper's TEV?

Like someone mentioned, multiple TEVs, with an increase in pixel pipelines.
 
Ooh-videogames said:
There's a new podcast on IGN where Matt mentions rumours from devs saying Hollywood features no pixel-shader capabilities. Could this be a case of developers only associating "pixel shaders" with the form MS has created with DirectX?

Could it have the power of an X1600 but be designed around the architecture of Flipper's TEV?

Like someone mentioned, multiple TEVs, with an increase in pixel pipelines.

Matt is an idiot.

Obviously the Flipper architecture doesn't have "Pixel Shaders", since that is the commercial name for the fragment processors in DirectX. But the TEV is a pixel shader.

But Matt has lied plenty of times. I believe he has the clock speed and memory figures, but he doesn't know anything beyond that.

Thanks to Matt we know clock speeds and memory, but he isn't a developer.
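For context on the TEV-versus-pixel-shader debate, a single TEV colour-combiner stage can be sketched roughly like this (a simplification of the documented stage equation; real stages add input selection, swizzling and extra modes):

```python
# Sketch of one GameCube/Wii TEV colour-combiner stage, per channel.
# Inputs a, b, c, d are colour values in [0, 1]. The stage computes
# d +/- lerp(a, b, c), then applies bias and scale and clamps, which
# is why the TEV fills the same role as a DirectX-style pixel shader.

def tev_stage(a, b, c, d, bias=0.0, scale=1.0, subtract=False):
    """One TEV stage: d +/- lerp(a, b, c), then bias, scale, clamp."""
    blend = a * (1.0 - c) + b * c          # linear interpolation by c
    result = d - blend if subtract else d + blend
    return max(0.0, min(1.0, (result + bias) * scale))

# e.g. blending a texture sample (a=0.8) toward a light colour (b=0.5)
print(tev_stage(0.8, 0.5, 0.25, 0.0))   # 0.725
```

Chaining several such stages per pixel is how Flipper built up multi-layer effects without a DirectX-style shader program.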
 
Ooh-videogames said:
There's a new podcast on IGN where Matt mentions rumors from devs saying Hollywood features no pixel shader capabilities. Could this be a case of developers only associating pixel shader with the form that MS has created with DirectX?


Like Urian said, the GC does have pixel shaders, it just doesn't call them pixel shaders, so it's probably Matt who doesn't know what he's talking about (a recurring thing).

BTW does he say anything more about Wii HW?

Could it have the power of an X1600 but be designed around the architecture of Flipper's TEV?

If all you need is better ALUs, probably yes, but IIRC the Gamasutra article about shading on the GC talks about some limitations in the data it can read/write(?). Also remember that Flipper has no vertex-shading HW, so anything like that would be brand new.

Like someone mentioned, multiple TEVs, with an increase in pixel pipelines.

At least more FX could be done, and probably (given the new clock speeds) a few more complex ones.
 
Like Urian said, the GC does have pixel shaders, it just doesn't call them pixel shaders, so it's probably Matt who doesn't know what he's talking about (a recurring thing).

BTW does he say anything more about Wii HW?

The problem with Matt is that, on the GPU, he's only speculating, since he has only part of the information.

And the truth is that the TEV is different from pixel shaders, but they are similar.

If all you need is better ALUs, probably yes, but IIRC the Gamasutra article about shading on the GC talks about some limitations in the data it can read/write(?). Also remember that Flipper has no vertex-shading HW, so anything like that would be brand new.

The TEV can read a texture or apply an FX to it; it works differently from a traditional pixel shader, and more TEVs could be a waste of time, since you cannot operate on a texture at the same time you are reading it. A fatter, more complex TEV is a better idea.

And about the number of TMUs, I believe 4 TMUs are enough for 852x480p graphics.
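A back-of-envelope texel budget suggests why 4 TMUs sound plausible at that resolution; the 243 MHz clock below is the rumoured "1.5x Flipper" figure, used purely as an assumption:

```python
# Back-of-envelope texel throughput for 4 TMUs at 852x480, 60 fps.
# The 243 MHz clock is the rumoured 1.5x-Flipper speed (assumption);
# peak figures ignore cache misses and filtering costs.

clock_hz = 243_000_000
tmus = 4
texels_per_sec = clock_hz * tmus           # peak texel fetches per second

pixels_per_sec = 852 * 480 * 60            # pixels drawn per second at 60 fps

budget = texels_per_sec / pixels_per_sec   # texel fetches available per pixel
print(round(budget, 1))                    # ~39.6 fetches per pixel per frame
```

Even with heavy overdraw and multi-texturing, that is a comfortable margin for a 480p target.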


At least more fxs could be done and probably (given the new clock speeds) a few more complex ones.

No, not true, since the TEV works differently from a traditional pixel shader.

A more complex TEV, combined with better optimization in the application of effects, new graphical FX, FSAA in one pass instead of two, 32-bit colour support instead of 24-bit...

We could get better visual quality and more speed with all of this.
 
Urian said:
The problem with Matt is that, on the GPU, he's only speculating, since he has only part of the information.

And the truth is that the TEV is different from pixel shaders, but they are similar.

Yes, but in the end they can do (almost?) the same things.

The TEV can read a texture or apply an FX to it; it works differently from a traditional pixel shader, and more TEVs could be a waste of time, since you cannot operate on a texture at the same time you are reading it. A fatter, more complex TEV is a better idea.

And about the number of TMUs, I believe 4 TMUs are enough for 852x480p graphics.

I was actually thinking of more TEVs each working on different pixels, reading textures independently and doing ops on them, i.e. each TEV completely unrelated to the others. It would need some reworking of how it operates, but I think it should work.

By more complex I just mean being able to do more complex ops.

No, not true, since the TEV works differently from a traditional pixel shader.

A more complex TEV, combined with better optimization in the application of effects, new graphical FX, FSAA in one pass instead of two, 32-bit colour support instead of 24-bit...

We could get better visual quality and more speed with all of this.

With 2 TEVs (like I said) the same FX would take half the time, so they may not be able to do more complex ones, but they should at least manage more FX per frame. With higher clock speeds they may get more complex ones too.

Even more could be done, like new hard-wired FX on top of EMBM; personally, it seems an extension of Flipper should be able to do a nice job.
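The halving argument can be sketched with simple throughput arithmetic; the stage count, the 162 MHz Flipper clock, and perfect parallelism between units are all assumptions for illustration:

```python
# Sketch: frame-time effect of doubling independent TEV units that
# work on different pixels, assuming perfect parallelism (real HW
# would lose some of this to shared memory bandwidth).

def frame_time_ms(pixels, stages_per_pixel, tev_units, clock_hz):
    """Time to run all combiner stages for one frame, in milliseconds."""
    clocks = pixels * stages_per_pixel / tev_units  # one stage per clock per unit
    return clocks / clock_hz * 1000

# 852x480 frame, 8 combiner stages per pixel, Flipper-class 162 MHz clock
single = frame_time_ms(852 * 480, 8, 1, 162_000_000)
dual = frame_time_ms(852 * 480, 8, 2, 162_000_000)
print(round(single / dual, 1))  # 2.0: same effects in half the time
```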
 