WiiGeePeeYou (Hollywood) what IS it ?

not necessarily. depends entirely on the algorithms and approaches employed by the devs and artists. if we assume there's a set of 'sweet spot' visualisation techniques that yield good visuals at max rate, then i would venture to guess that the workload under such scenarios would be mainly on the GPU.

Thanks. I thought the work on the CPU would need to scale equally too; are those algorithms and approaches more at the level of shading and such (and less vertex creation)?

Although I would prefer a strong CPU and weak GPU, as most of the next-gen features that I like seem to come from the CPU (dynamic AI, dynamic animation, physics...).

of course if you write a sw raytracer for a wii title things may be a bit tough : )

I would be happy with fast raycasting for AI :LOL:
 
Teasy said:
You can't really compare the EE's GFLOP rating to Gekko or Broadway, considering the Emotion Engine always had to be used heavily for graphics such as T&L.

yes, I am aware of that. I was doing a CPU-to-CPU comparison, not comparing the way the PS2 as a whole works to how GameCube and Wii work as a whole. Yes, of course, the PS2's EE is part of the graphics rendering pipeline; it is the front-end, and the VUs are flexible programmable units that take the place of geometry engines / T&L units / vertex shaders.



I should've said that Gekko's, and even more so Broadway's, flops performance is more easily usable than that of the EE.
 
It's not like the IQ was bad in every area; the lava room with the stone dragons had me stuck in awe, and even the section where Salazar takes Ashley. And since there's an RE title in development for Wii, Capcom has an engine that can be enhanced, tweaked, and optimized.

I can't wait to see pics and video.
 
pc999 said:
Thanks. I thought the work on the CPU would need to scale equally too; are those algorithms and approaches more at the level of shading and such (and less vertex creation)?

yes. and in case you mean in-time vertex creation, that is a rather exotic approach, still to this day. we'll see what happens by the end of this generation, though, as some of the heavier contenders have bragging rights in this respect.

Although I would prefer a strong CPU and weak GPU, as most of the next-gen features that I like seem to come from the CPU (dynamic AI, dynamic animation, physics...).

yes, a strong CPU can go well with a weak GPU if the latter is just a strong rasterizer. if the GPU has its own vertex resources to rely upon, though, the strength of the CPU may not need to be focused that much on flops anymore. not that gekko is a measly flops performer, either; it's just that in comparison to the other two's peak numbers, gekko's are nothing to brag about.
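
To make the split concrete, here is a minimal sketch (plain C++, not GameCube/Wii GX code; all names are illustrative) of what a "strong CPU + plain rasterizer" arrangement implies: the CPU burns its flops transforming every vertex itself before the GPU ever sees it, which is exactly the work a GPU-side vertex resource would otherwise take off its hands.

```cpp
#include <cstddef>

// Illustrative CPU-side transform loop: the CPU multiplies every vertex by the
// model-view-projection matrix before handing the result to the GPU.
struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[4][4]; };     // row-major

Vec4 transform(const Mat4& M, const Vec4& v) {
    Vec4 r;
    r.x = M.m[0][0]*v.x + M.m[0][1]*v.y + M.m[0][2]*v.z + M.m[0][3]*v.w;
    r.y = M.m[1][0]*v.x + M.m[1][1]*v.y + M.m[1][2]*v.z + M.m[1][3]*v.w;
    r.z = M.m[2][0]*v.x + M.m[2][1]*v.y + M.m[2][2]*v.z + M.m[2][3]*v.w;
    r.w = M.m[3][0]*v.x + M.m[3][1]*v.y + M.m[3][2]*v.z + M.m[3][3]*v.w;
    return r;                        // ~28 flops per vertex, every frame
}

void transformAll(const Mat4& mvp, const Vec4* in, Vec4* out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = transform(mvp, in[i]);
}
```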

I would be happy with fast raycasting for AI :LOL:

then you'll be surprised how much casting/marching algorithms are susceptible to memory access limitations ; )
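
For anyone wondering why an AI raycast ends up memory-bound rather than flops-bound, here is a rough sketch of a grid line-of-sight query, assuming a simple 2D occupancy grid (the grid layout and names are made up purely for illustration): the arithmetic per step is trivial, but each step reads a different, hard-to-prefetch cell.

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Illustrative 2D occupancy grid; 1 = blocked cell, 0 = open.
struct Grid {
    int w, h;
    std::vector<uint8_t> cells;
    bool blocked(int x, int y) const { return cells[y * w + x] != 0; }
};

// DDA line-of-sight query: walk the grid cell by cell from (x0,y0) toward (x1,y1).
// Each iteration loads a cell that may be far from the previous one in memory
// (a new row on every vertical step), so performance is usually bounded by cache
// misses and memory latency rather than by the handful of adds and compares.
bool lineOfSight(const Grid& g, float x0, float y0, float x1, float y1) {
    int cx = int(x0), cy = int(y0);
    const int ex = int(x1), ey = int(y1);
    const int stepX = x1 > x0 ? 1 : -1;
    const int stepY = y1 > y0 ? 1 : -1;
    const float dx = std::fabs(x1 - x0), dy = std::fabs(y1 - y0);
    float tMaxX = dx > 0 ? (stepX > 0 ? cx + 1 - x0 : x0 - cx) / dx : 1e30f;
    float tMaxY = dy > 0 ? (stepY > 0 ? cy + 1 - y0 : y0 - cy) / dy : 1e30f;
    const float tDeltaX = dx > 0 ? 1.0f / dx : 1e30f;
    const float tDeltaY = dy > 0 ? 1.0f / dy : 1e30f;

    while (cx != ex || cy != ey) {
        if (g.blocked(cx, cy)) return false;   // ray hit a wall
        if (tMaxX < tMaxY) { tMaxX += tDeltaX; cx += stepX; }
        else               { tMaxY += tDeltaY; cy += stepY; }
    }
    return !g.blocked(ex, ey);                 // target cell visible
}
```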
 
darkblu said:
yes, a strong CPU can go well with a weak GPU if the latter is just a strong rasterizer. if the GPU has its own vertex resources to rely upon, though, the strength of the CPU may not need to be focused that much on flops anymore. not that gekko is a measly flops performer, either; it's just that in comparison to the other two's peak numbers, gekko's are nothing to brag about.


I thought that those things are flops-heavy (not the only requirement, but one of them): things like raycasting / pathfinding, and physics too. IIRC Fafalada said that the EE would be better suited for physics, Xenon's VMX units are also improved for that (and animation), and the best hardware for physics does well on the flops side, like Cell and the PPU (although the PPU being the best for it while having 1/4 of the flops shows that flops aren't the only requirement). I guess a good balance is best.

Anyway that is good news. Thank you very much for the info.

PS: Damn Sony and MS marketing


then you'll be surprised how much casting/marching algorithms are susceptible to memory access limitations ; )

I hope that, at least, third parties show us new and good AI on the Wii.
 
denis_carlin said:
I thought that the PC version was cancelled.

So far, only delayed.

remember HOTD2 on the dreamcast?

point being, bilinear with mipmapping is not the end-all-be-all of the looks artists may want when striving for a certain atmosphere.

Or the development of HOTD2 started on Model 2 hardware (or something similar) and was ported to Naomi in the end. I don't believe HOTD3 retains the same look.

Anyhow, considering RE4's "graphic style" caused the graphics to break down into an unrecognizable mess at times (ugh, don't play the game on an LCD screen, even with progressive scan; at least in my experience it looks much better on a CRT), incredibly blurry, jaggy, and with poor color accuracy, I'd say more traditional PC-style filtering would have been preferred.

It's not like the IQ was bad in every area; the lava room with the stone dragons had me stuck in awe, and even the section where Salazar takes Ashley. And since there's an RE title in development for Wii, Capcom has an engine that can be enhanced, tweaked, and optimized.

Is there really? Not Resident Evil 5 though, I'd assume.
Resident Evil 4 seems to be another one of those games (like Rebel Strike or Halo 2) that sacrifices PQ for the sake of stuffing more effects onto the screen. On a crappy display, you don't really notice the lower quality filtering/insufficient resolution/etc, but on a higher quality display it really does look assy compared to a simpler, cleaner design.
 
Fox5 said:
Anyhow, considering RE4's "graphic style" caused the graphics to break down into an unrecognizable mess at times (ugh, don't play the game on an LCD screen, even with progressive scan; at least in my experience it looks much better on a CRT)

Then the fault is with your display, not with the game. For one thing, being letterboxed, your LCD screen probably doesn't natively display a 480x270 image or a multiple, so the scaling will automatically make it look crappy.

In my opinion, if you have the same signal going to two displays, and one makes it look fine while the other display makes it look like crap, it's a real stretch to call the display that produces the worst image "higher quality." Just because it's more expensive and uses modern technology doesn't mean it's actually better.
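
As a back-of-the-envelope illustration of the scaling point (taking the 480x270 figure above at face value, and assuming a common 1366x768 panel purely for the sake of example), the active picture maps to the panel at a non-integer ratio, so the scaler has no clean 1:N pixel mapping to fall back on:

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical numbers for illustration only: the 480x270 active picture mentioned
// above, shown full-width on an assumed 1366x768 LCD panel.
int main() {
    const double srcW = 480.0, srcH = 270.0;   // letterboxed active picture
    const double dstW = 1366.0, dstH = 768.0;  // assumed panel resolution

    const double sx = dstW / srcW;             // ~2.846 horizontal scale
    const double sy = dstH / srcH;             // ~2.844 vertical scale
    std::printf("scale: %.3f x %.3f\n", sx, sy);

    // A non-integer scale means most output pixels straddle two source pixels,
    // so the scaler must either blur (bilinear) or duplicate lines unevenly (nearest).
    std::printf("integer ratio? %s\n",
                (std::floor(sx) == sx && std::floor(sy) == sy) ? "yes" : "no");
    return 0;
}
```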
 
Fox5 said:
Or the development of HOTD2 started on Model 2 hardware (or something similar) and was ported to Naomi in the end. I don't believe HOTD3 retains the same look.

i've never played HOTD3 but IIRC it originally started out as cel-shaded and then ended up like this - now, dunno about you, but judging by those shots i'd say art style definitely took precedence over IQ. as for HOTD2, it had its fair share of bilinear use, so although the look of some surfaces could have come as some sort of legacy, the choice of grainier sampling and assets over "cleaner" ones is pretty obvious in places.
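
For readers unfamiliar with the "grainy vs clean" distinction, here is a toy software sampler (nothing to do with Flipper/Hollywood hardware; the types and names are purely illustrative) contrasting point sampling with bilinear filtering:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Tiny single-channel software texture used only for illustration.
struct Texture {
    int w, h;
    std::vector<float> texels;
    float at(int x, int y) const {
        x = std::min(std::max(x, 0), w - 1);         // clamp addressing
        y = std::min(std::max(y, 0), h - 1);
        return texels[y * w + x];
    }
};

// Point (nearest) sampling: pick the single closest texel. Cheap, but the hard
// texel edges are what reads as "grainy" on screen.
float samplePoint(const Texture& t, float u, float v) {
    return t.at(int(u * t.w), int(v * t.h));
}

// Bilinear sampling: blend the four surrounding texels by their fractional
// distances. Smoother/"cleaner", at the cost of a slightly softer look.
float sampleBilinear(const Texture& t, float u, float v) {
    const float x = u * t.w - 0.5f, y = v * t.h - 0.5f;
    const int x0 = int(std::floor(x)), y0 = int(std::floor(y));
    const float fx = x - x0, fy = y - y0;
    const float top = t.at(x0, y0)     * (1 - fx) + t.at(x0 + 1, y0)     * fx;
    const float bot = t.at(x0, y0 + 1) * (1 - fx) + t.at(x0 + 1, y0 + 1) * fx;
    return top * (1 - fy) + bot * fy;
}
```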

Anyhow, considering RE4's "graphic style" caused the graphics to break down into an unrecognizable mess at times (ugh, don't play the game on an LCD screen, even with progressive scan; at least in my experience it looks much better on a CRT), incredibly blurry, jaggy, and with poor color accuracy, I'd say more traditional PC-style filtering would have been preferred.

allow me to give you some advice then - don't use poorly/inappropriately calibrated, subpar-scaling LCDs for consoles. i have a fairly high-end LCD at my desktop here and none of my consoles is hooked up to it (partially due to scaling issues, but much more due to the totally inappropriate gamma). even my good old dreamcast(s) have their own CRT VGA calibrated to TV gamma (as close as possible).
 
Acert93 said:
MS had launch devs running PC graphics cards and dual G5 processors up to about ~3-4 months before games had to go gold.
Exactly, because the 360 VPU was not just an overclocked version of the Nvidia chip from the first Xbox. It is much more advanced, thus the later ready date. This is the point of this discussion. If Hollywood were just an overclocked Flipper, it would have been ready and available a loooooong time ago.
 
fearsomepirate said:
Then the fault is with your display, not with the game. For one thing, being letterboxed, your LCD screen probably doesn't natively display a 480x270 image or a multiple, so the scaling will automatically make it look crappy.

In my opinion, if you have the same signal going to two displays, and one makes it look fine while the other display makes it look like crap, it's a real stretch to call the display that produces the worst image "higher quality." Just because it's more expensive and uses modern technology doesn't mean it's actually better.

Ok, well most of the time an SD signal will look better on an SDTV than an HDTV; does that now mean the SDTV is higher quality than the HDTV? For that matter, RE4 still has IQ problems on any display, but interlaced displays seem to help hide the visual problems.
Also, RE4 outputs its image as a standard 640x480 image. The LCD screen scaled it to 1280x960, with the borders still there. There shouldn't be any significant artifacts from exact-ratio scaling, though the natively low resolution of RE4 probably does mean it's going to suffer a bit when blown up. As an unstretched image (on any display) it looks much better, but still has IQ issues.

i've never played HOTD3 but IIRC it originally started out as cel-shaded and then ended up like this - now, dunno about you, but judging by those shots i'd say art style definitely took precedence over IQ.

HOTD3 may not be a technical triumph, but it still retains the crisp PC-like image quality. (So did HOTD2 for that matter, minus some point-sampled textures; RE4 and several other top games this generation do not.)
 
Fox5 said:
Ok, well most of the time an SD signal will look better on an SDTV than an HDTV; does that now mean the SDTV is higher quality than the HDTV? For that matter, RE4 still has IQ problems on any display, but interlaced displays seem to help hide the visual problems.
Also, RE4 outputs its image as a standard 640x480 image. The LCD screen scaled it to 1280x960, with the borders still there. There shouldn't be any significant artifacts from exact-ratio scaling, though the natively low resolution of RE4 probably does mean it's going to suffer a bit when blown up. As an unstretched image (on any display) it looks much better, but still has IQ issues.



HOTD3 may not be a technical triumph, but it still retains the crisp PC-like image quality. (So did HOTD2 for that matter, minus some point-sampled textures; RE4 and several other top games this generation do not.)


Well, I played RE4 on a PC monitor, and it looked great at 480p. You just can't run RE4 in widescreen format. You would have to switch your set (HDTV) to 4:3, then maybe use the zoom function, assuming your TV has this feature. I admit that the IQ could be an eyesore in some areas, but there were areas that stood out above others (eye candy).

Remember, this is what RE4 originally started out as:

[Screenshots: re4_e3trailer_051303_gcn.jpg, eclipsebio11.jpg, doorbio10.jpg]


What we have on our GC shows that the change to open environments caused a hit in overall texture quality. The Wii should be able to produce better IQ than the above images, even with open environments.
 
So if Nintendo never releases the specs themselves, how will we, the community, ever find out? Will we have to wait for some dev with a development system to leak them, or will journalists be able to analyze the system themselves?
And even more important, if we never get official confirmation, will we even believe it?
 
ninzel said:
So if Nintendo never releases the specs themselves, how will we, the community, ever find out? Will we have to wait for some dev with a development system to leak them, or will journalists be able to analyze the system themselves?

don't worry, we may never know every last detail of the system, but we'll know enough either through devs or through the hacking/modding/homebrew community.
 
Ooh-videogames:
I've run RE4 in several formats. Running it widescreen squishes everything vertically, but does make the image "sharper", though I would never play like that. Running in 4:3 mode keeps it decently sharp, but the image is still really small and bordered, and letterbox zoom makes the visual errors stand out more, though on a CRT it still looks better than on an LCD to me at least.
And running RE4 through an external scaler really messed up the image, though that's not too big of a surprise, my external scaler sucks.

RE4 looked good overall and had some really fantastic moments to it, but so did Zelda 64. That doesn't mean I'd have accepted a Zelda level of PQ for this generation of games, and I expect all next-gen games to have better PQ than RE4. The move from RE4's halfish 480p mode to full 720p alone should accomplish that. From my experience with PC games, most of the visual errors I see in RE4 look like the result of too low a resolution and improper filtering in PC games. It could also be due to a poor RAMDAC, but poor RAMDACs usually affect all games universally, yet most GameCube games do not have any visual problems.
 
Fox5 said:
Ooh-videogames:
I've run RE4 in several formats. Running it widescreen squishes everything vertically, but does make the image "sharper", though I would never play like that. Running in 4:3 mode keeps it decently sharp, but the image is still really small and bordered, and letterbox zoom makes the visual errors stand out more, though on a CRT it still looks better than on an LCD to me at least.
And running RE4 through an external scaler really messed up the image, though that's not too big of a surprise, my external scaler sucks.

RE4 looked good overall and had some really fantastic moments to it, but so did Zelda 64. That doesn't mean I'd have accepted a Zelda level of PQ for this generation of games, and I expect all next-gen games to have better PQ than RE4. The move from RE4's halfish 480p mode to full 720p alone should accomplish that. From my experience with PC games, most of the visual errors I see in RE4 look like the result of too low a resolution and improper filtering in PC games. It could also be due to a poor RAMDAC, but poor RAMDACs usually affect all games universally, yet most GameCube games do not have any visual problems.


It would be nice if all GC games running on Wii could run at 720p. The hardware compatibility and larger memory pool should make it easy.
 
Based on how little Nintendo improved the graphics of Ocarina of Time when emulated on the Cube, I doubt they'll do any enhancements. Forcing a higher resolution or AA would have been nice, though.
 
Ocarina of Time runs in 640x480 on the Gamecube. It's 320x240 on the N64. I can't tell if there's FSAA or not, though. It also seems to run at a better framerate. And the game was emulated, not ported. Further, it was just a nice pack-in to move Gamecubes, not a major game release.

RE4 would benefit hugely from FSAA, mip-mapping with AF, and running in 850x480. Mikami said he letterboxed it because he didn't want the game to be in 4:3. He never said it was for performance reasons. The Cube's pulled off plenty of fancy graphics at 640x480.
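
As a rough illustration of the mip-mapping-with-AF point, this is the standard textbook LOD selection from texture-coordinate derivatives, written as a hypothetical helper (not anything from RE4 or the Cube's hardware): picking a coarser mip for distant or oblique surfaces is what kills the shimmer, and anisotropic filtering then takes extra samples along the longer footprint axis to keep those surfaces sharp.

```cpp
#include <algorithm>
#include <cmath>

// Toy mip level selection from screen-space texture-coordinate derivatives.
// All parameter names are illustrative; this is the generic textbook formula.
float mipLevel(float dudx, float dvdx, float dudy, float dvdy,
               float texW, float texH, int mipCount) {
    const float lx = std::hypot(dudx * texW, dvdx * texH);  // texel footprint along x
    const float ly = std::hypot(dudy * texW, dvdy * texH);  // texel footprint along y
    const float rho = std::max(lx, ly);                     // widest axis of the footprint
    const float lod = std::log2(std::max(rho, 1e-6f));      // coarser mip as footprint grows
    return std::min(std::max(lod, 0.0f), float(mipCount - 1));
}
```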
 
fearsomepirate said:
Ocarina of Time runs in 640x480 on the Gamecube. It's 320x240 on the N64. I can't tell if there's FSAA or not, though. It also seems to run at a better framerate. And the game was emulated, not ported. Further, it was just a nice pack-in to move Gamecubes, not a major game release.

RE4 would benefit hugely from FSAA, mip-mapping with AF, and running in 850x480. Mikami said he letterboxed it because he didn't want the game to be in 4:3. He never said it was for performance reasons. The Cube's pulled off plenty of fancy graphics at 640x480.

The 640x480 doesn't make a large difference though since they seem to be using the bare minimum quality filtering. A PC, even at 640x480, can make Ocarina look so much better.
The framerate is better than the n64 version (though not without faults), and it is pretty much a perfect emulation, something rarely seen on a PC. Not a major game release, but it doesn't seem to take much to add graphical effects to an emulated 3d game.
Also, you were suggesting that all backwards-compatible games run at a higher res. Considering that Ocarina was a special one-time effort, I'd imagine it got more work put into it than the case-by-case effort needed to ensure each GameCube game could run flawlessly at a higher res. The only time Nintendo has ever enhanced a ported, emulated, or backwards-compatible game that wasn't a remake was Game Boy games getting a 4-color mode on the Game Boy Color.

And RE4 could have had an option for true widescreen, if it could handle it. I have doubts that it could have without framerate dips (considering the game already had some), though likely nothing that would make it unplayable. Heck, the majority of the game may have played at the same speed, but I could see a problem with some of the more intense scenes. The PS2 version did support true widescreen, but it also had so many graphical features cut out that it's not comparable.
 
Fox5 said:
The 640x480 doesn't make a large difference
You're joking, right? It's 4 times the pixels. It makes a huge difference in the sharpness and clarity of the picture. Just put in your N64 and your Cube and switch back and forth between the two pictures, and you'll see what I mean. Of course, it doesn't do anything for the texture resolution. And I really don't know why this matters--it was a freebie pack-in bonus so you could play an N64 classic on your Gamecube. They weren't selling it as a $49.99 stand-alone game. Obviously, if they wanted to port it and enhance the graphics, they could have, just like they did with Mario 64 on the DS.
Also, you were suggesting that all backwards-compatible games run at a higher res.
Huh? No I wasn't. Someone earlier said that they didn't even up the resolution on OoT on the Gamecube, and I pointed out that he was wrong. Seriously, who are you talking to? You shouldn't just make up random things and attribute them to people.
 
Ocarina of Time was *well* under 30fps on the Nintendo 64. I'd say 20fps, give or take a few frames/s.

On Gamecube, it is more or less the same framerate, well under 30fps. I didn't notice a higher framerate on Gamecube than on Nintendo 64; maybe there are fewer dips in framerate on Gamecube, I don't know. The last time I played the original N64 cartridge was 1998-1999, but I'm pretty certain the Gamecube port/emulation did not improve the framerate, and if it did, it was by a VERY insignificant amount.


I played the N64 OoT emulated on PC--there was an option to boost the framerate, which really worked; it seemed to be running at a good 30fps. I was impressed. I forgot the name of that N64 emulator though (dang!). Years later I used the Project64 emulator; the version I have runs the game at basically the original ~20fps, and I don't see any options to boost the framerate.


The only thing that the Gamecube port really improved was the resolution.
 