WiiGeePeeYou (Hollywood) what IS it?

Status
Not open for further replies.
pc999 said:
They could do that with an X1400-level card and a low-end 970FX (with very low prices for hardware and dev cost: no HD, not too many polys, just "basic shading"...) too, so I doubt it.

No, a GPU in the league of the specs you're describing would at 640x480 be capable of all kinds of fancy shader stuff. I think you're underestimating how powerful even the last "generation" (i.e. X800 etc) of video cards was. Definitely not low-spec enough to make a statement. It's almost like they want to be significantly lower-spec.
 
hupfinsgack said:
I didn't think that they were prerendered. But it seems that everyone at UbiSoft (the Red Steel team and the Rayman team) was overly optimistic when estimating the hardware, and now they have to cut back.
No, no, no. What happened was Nintendo gave out bogus specs to their devs to produce better-looking trailers than the console can actually produce. That's what happens. Just remember: every time there's a trailer at a show that looks better than the game ends up being, it's always the console company's fault for lying to us. Or is it just Sony who's evil that way...:p

To be honest, I can't say I see the terrible downgrade. The new shot looks darker and grittier, and thus there's less to see. Up the gamma in a paint package and the quality of what is visible seems comparable to me in lighting, model, and texture resolution, but there's a lot less going on in the new pic to add pizzazz. I'd like to see more new vs. old pics to see if there really is a downgrade.
 
fearsomepirate said:
2. It's a political statement. Nintendo wants to prove you don't need lots of horsepower to be successful in the home console market. Likely.

I believe this is part of it. I think Nintendo is trying to say something about the over-reliance on technology and how it can shift focus away from the fundamentals of creating great entertainment. I'll repost something I said somewhere else.

"In terms of technology,it only takes a pencil and a peice of paper to write a great story and a creative mind to tell that story well. People make great games, not CPU's and VPU's so if a game fails to entertain it will not be due to a lack of power, but a lack of talent."
 
fearsomepirate said:
No, a GPU in the league of the specs you're describing would at 640x480 be capable of all kinds of fancy shader stuff. I think you're underestimating how powerful even the last "generation" (i.e. X800 etc) of video cards was. Definitely not low-spec enough to make a statement. It's almost like they want to be significantly lower-spec.


Then just a really good update/overclock of the GC hardware (about 10x). It wouldn't bring any new shading or tech, but it would still give higher-quality graphics that aren't photorealistic and are below 360 level, while staying cheap to develop yet giving games many more possibilities (the Rebirth demo is a good example: part real-time, part render target; according to them, time was the reason it isn't all real-time).

Like I said, I have my doubts this is the hardware's fault; many last-gen games looked better (even on GC), and they wouldn't make a brand-new GPU just for this either.

BTW, I found the Rebirth videos; that is what I would like the Wii to look like.

http://www.youtube.com/watch?v=MyQ9PyFTWcE
http://www.youtube.com/watch?v=mMKvMpegSXo
 
Last edited by a moderator:
Rebirth was a mix of prerendered animation and real-time CG. Anyhow, considering everything in those videos matched or exceeded the best examples of GameCube graphics years later (including a forest looking on par with the prerendered backgrounds in Resident Evil, which was nicknamed Rebirth, btw), I'd say it's not real-time on GameCube hardware, especially considering GameCube hardware wasn't finished at the time the video was made. Like the Zelda-Ganondorf fight, these graphics would require a system a generation more powerful than GameCube.
 
I guess what they mean is that if there were enough performance from the CPU/bandwidth/memory... the GPU would be able to do it. I wonder how much more power would be needed to do it with the GC's feature set.

Anyway, I gave it as an example that it is possible, without going for high-end tech, to get nice, strong visuals while still staying a lot below the 360 (much of the good-looking tech from UE3 (which only scratches the surface of the 360) wouldn't be possible).

That said, I would love it if Wii graphics at least resembled this and the other GC demos.
 
Fox5 said:
Like the Zelda-Ganondorf fight, these graphics would require a system a generation more powerful than gamecube.

Except that games like Soul Calibur III and Dead or Alive: Ultimate already have graphics as good as the Spaceworld demo. There's nothing special about the Zelda-Ganondorf fight except for people's emotional reactions to it.
 
Fox5 said:
Personally, I'd like nextgen (or is it this gen now) to look like this...
http://www.youtube.com/watch?v=_73CVfAvZ0A&search=zelda oracle commercial

Oh, and I'd like to see the WiiGPU turn out to have at least a 2gigapixel fillrate, and a quad core cpu to make up for the lack of vertex shaders.


I would've liked this new current generation to look like that also... there are certain aspects of that CG commercial that even the Xbox 360 and PS3 cannot reproduce: the image/rendering fidelity.

At one point, long before it became known that Revolution / Wii would only be a modest improvement over Gamecube, I had hoped Gamecube2 / Gamecube Next would be able to crank out visuals as good as if not better than that commercial.


This is an age-old discussion and an obvious point, but there is still a large gulf between the most advanced realtime graphics and even simplistic low-end prerendered CG. The overall look is completely different, even though there are some aspects where realtime graphics have surpassed very old (and low-end) prerendered CG (textures, shaders).

I would like to see emphasis shift from shaders to better image fidelity and consistent framerate in the NEXT generation.
 
The Wii Hardware

- Nintendo Wii's 'Broadway' CPU operates at 729 MHz with a maximum bandwidth of 1.9 GB/s.
- Nintendo Wii's 'Hollywood' GPU is clocked at 243 MHz; its internal memory includes 3 MB of embedded graphics memory and 24 MB of high-speed main memory.
- 64 MB of GDDR3 (MEM2) as the external main memory. Just like the internal memory, it can be accessed from the CPU and GPU with a maximum bandwidth of 4 GB/s, and programs can also be stored in MEM2.
- The GPU of the Wii is identical to the GC's, but it is on average 1.5x faster.

Wii’s Optical Disc Drive

- The Optical Disc Drive (ODD) supports single- and dual-layer Wii discs; discs eject via software or a button, and the maximum read speed is the equivalent of 6x DVD.
- Two main 12 cm disc types are supported: the single-layer 4.7 GB and the dual-layer 8.51 GB. Nintendo GC discs are also supported. Some of the disc capacity is used by the system, so games cannot use the full disc space.
- Inserting a disc will start the Wii console, even if it was in an off state. Pressing the eject button will also switch the console on to take out the disc.

General Overview

- An optional wired LAN adapter that connects to a USB port is in the pipeline for users who do not currently have a wireless LAN set-up.
- Internal non-removable 512 MB flash memory is used to store game save data and downloadable content, thus eliminating the need for a memory card.
- Both Wii discs and GameCube discs can be played via an intelligent mode swap. When running in GC mode, the Wii's CPU and GPU lower to the respective speeds of the GC, and part of MEM2 functions as ARAM.
- The software development environment is an upgrade of the 'Dolphin SDK' used with the GC; the same libraries are used, so developers can get up to speed easily, and ports should be easier too.
- The following interfaces are included with the Wii: SD card slot, wireless controller, two USB 2.0 ports, wireless LAN, 4x GC controller ports, 2x GC memory card slots, and an AV multi-output jack (analog only).
- Supports Wii discs (single-sided 12 cm) and GC discs (single-sided 8 cm); the console auto-switches depending on which disc is inserted.
- More than just the Nunchuk is planned as an extension. GC peripherals such as the DK Bongos can be used in both Wii and GC modes.
- Three power states: on, off, and unplugged. To prevent accidental shut-offs, the power button must be held for about a second.

The Wii Control System

- The Wii controller features: a direct pointing device, a three-axis accelerometer, a Wii power button (remotely turns the console on/off), buttons, wireless connectivity, indicator LEDs, rumble, battery power (two AA alkaline batteries), and the ability to connect an extension unit.
- The Wii controller supports three types of operation: by itself, with a Nunchuk extension, or with a Classic Controller. Classic Controllers will ship to developers during August 2006.
- The SYNCHRO button on the Wii controller exchanges wireless ID numbers when pressed at the same time as SYNCHRO on the Wii console. Wireless communication is only possible with consoles that have been authenticated.
- The rumble motor can be turned on and off, and its intensity can be changed.
- The Wii remote has a pointer for fine movements as well as a +/- 3.4 G motion sensor suitable for larger body movements; the Nunchuk attachment has a +/- 2 G sensor.
- The sensor bar must be placed above or below the TV set; the pointer measures coordinates between the ends of the bar, which are about 20 cm apart.
- The Wii remote has four states: disconnected, communicating, establishing connection, and pairing-wait.
- The pointer can measure coordinates within the bounds of a rectangle centered on the sensor bar, so it can also measure points beyond the screen. It also responds to strong light sources: windows, fluorescent lamps, fireplaces, mirrors, etc.
- Because players' hands shake while holding the controller, a ring buffer is used to hold and average accelerometer samples so that a precise direction can be derived.
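The averaging trick in that last bullet is just a small ring buffer over recent samples. A minimal sketch of the idea (class name, window size, and sample format are illustrative, not from any Nintendo SDK):

```python
class AccelSmoother:
    """Ring buffer that averages recent accelerometer samples to damp hand shake."""

    def __init__(self, size=8):  # hypothetical window size
        self.buf = [(0.0, 0.0, 0.0)] * size
        self.idx = 0

    def add(self, sample):
        # Overwrite the oldest sample; the buffer never grows.
        self.buf[self.idx] = sample
        self.idx = (self.idx + 1) % len(self.buf)

    def average(self):
        # Per-axis mean over the window gives a steadier direction vector.
        n = len(self.buf)
        return tuple(sum(s[axis] for s in self.buf) / n for axis in range(3))

smoother = AccelSmoother()
# Simulated jittery readings: hand shake flips the x axis back and forth.
for sample in [(0.1, 1.0, 0.0), (-0.1, 1.0, 0.0)] * 4:
    smoother.add(sample)
print(smoother.average())  # jitter on x cancels out: (0.0, 1.0, 0.0)
```

The fixed-size buffer keeps the cost constant per sample, at the price of a little latency (the average lags the newest reading by roughly half the window).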
 
Someone has taken GC specs and increased all of them by 50% based on the rumoured CPU/GPU clock speeds being 50% higher. Which just doesn't work given the other info they're claiming. For instance, the bandwidth of the supposed 64MB GDDR3 main memory is 4GB/s, 50% higher than the 24MB of 1T-SRAM in GC. The problem there is that GC's 1T-SRAM was single data rate, while GDDR3 is double data rate. So even if Nintendo used GDDR3 clocked 50% higher than GC's main memory and on the same 64-bit bus, it would still be 8GB/s, not 4GB/s :D Unless Nintendo are going to use 162MHz 64-bit GDDR3 memory, on a 0.66 multiplier :LOL:

Not to mention that GDDR3 is graphics memory, so it would make little sense to use it as main memory, and equally little sense to have almost 4 times more memory in Wii than GC yet only 50% more bandwidth. But probably the most insane claim there is that ATI would spend years designing a GPU identical to one it designed 6 years earlier.. just clocked 50% faster.
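The arithmetic behind that objection is easy to check. A quick sketch (GC's ~324 MHz / 64-bit / ~2.6 GB/s main-memory figures are my assumption here, used only for illustration):

```python
def bandwidth_gbs(bus_bits, clock_mhz, transfers_per_clock):
    # Peak bandwidth = bus width in bytes * clock * transfers per clock tick.
    # SDR memory does 1 transfer per clock; DDR (like GDDR3) does 2.
    return (bus_bits / 8) * clock_mhz * transfers_per_clock / 1000

gc_sdr  = bandwidth_gbs(64, 324, 1)        # ~2.6 GB/s, single-data-rate 1T-SRAM
wii_ddr = bandwidth_gbs(64, 324 * 1.5, 2)  # same bus, 50% higher clock, but DDR
print(gc_sdr, wii_ddr)  # ~2.6 vs ~7.8 GB/s: roughly 3x GC, not the claimed 1.5x
```

The point is that a 50% clock bump plus the SDR-to-DDR switch multiplies bandwidth by 1.5 x 2 = 3, so "4 GB/s GDDR3" and "50% higher clocks" can't both be true on the same bus.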
 
Probably fake, but those specs don't look too far off either.
Looking at the games, the Wii can't do nice shaders, self-shadowing, parallax mapping, normal mapping, or even AA. It's probably a DX7-level GPU.
 
Nightz said:
Probably fake, but those specs don't look too far off either.
Looking at the games, the Wii can't do nice shaders, self-shadowing, parallax mapping, normal mapping, or even AA. It's probably a DX7-level GPU.


GC can do shading; some pages ago there was even a link to a Gamasutra article about it.

About those "specs" it seems that some one did a bad "copy-past" from IGN.

The amusing thing is that they lowered the specs even further; now who claims 1.4x GC? :LOL:
 
fearsomepirate said:
Except that games like Soul Calibur III and Dead or Alive: Ultimate already have graphics as good as the Spaceworld demo. There's nothing special about the Zelda-Ganondorf fight except for people's emotional reactions to it.

not to mention that several of the RE4 cutscenes surpass that famous sword fight.

People often forget that, aside from hardware specs, devs' "specs" improve with time too. New, ingenious approximation algorithms for classic CG problems are being developed every day, so something that was previously possible only at offline speeds may become feasible on 'modest' hardware years later purely because of algorithmic breakthroughs.
 
Those new specs sound generic, even if half true. The 64 MB of GDDR3 memory is nonsense: it's already been stated that the 64 MB is MoSys 1T-SRAM, in addition to the original 24 MB of 1T-SRAM that was in the GameCube as main memory.


darkblu said:
not to mention that several of the RE4 cutscenes surpass that famous sword fight.

But those RE4 cutscenes were running at about half the framerate of the Link vs. Ganondorf sword fight.
 
Megadrive1988 said:
Those new specs sound generic, even if half true. The 64 MB of GDDR3 memory is nonsense: it's already been stated that the 64 MB is MoSys 1T-SRAM, in addition to the original 24 MB of 1T-SRAM that was in the GameCube as main memory.




But those RE4 cutscenes were running at about half the framerate of the Link vs. Ganondorf sword fight.

And much lower resolution and IQ. RE4 was even below the standard IQ for a top notch GameCube title.
 