First pics of FarCry for Wii?

Well, remember that the Wii only has a 1 MB texture buffer

a mere 1MB texture cache?! versus how much in nv2a?

and if I recall correctly from the original Gamecube, some kind of virtual texturing.

yep. some kind of measly virtual texturing. versus a framebuffer sitting conveniently right in the middle of xbox' main mem.

This virtual texturing probably only reads from the 24 MB main memory pool

quite likely. from that dreaded low-latency edram pool. and while doing so managing to get much less in the way of the cpu than the miraculous, over-hyped and under-delivering yet freaking expensive cross-bar uma controller of the nv2a.

leaving developers to manually shuffle textures around between the 64 MB pool and the 24 MB pool

eek! that sounds nasty! gee, you have to really think in advance!

as well as whatever work may need to be done with the texture cache itself.

yes. it's one vicious, malicious cache. you really have to watch out with it.

In any event, I'd hardly describe it as more efficient, though it may yield higher performance.

wait.. so it was not more efficient, yet it somehow yielded higher performance?

hmm. so how is it not more efficient - does it use more transistors for control functions? is it more expensive to manufacture? does it not allow the respective memory clients (i.e. GPU, CPU and DMA controllers) to perform to their potential under a reasonable memory access load?
damn, it was going so nicely, why did you have to bring performance into this?

but i'm pretty sure you can make everything right again by shedding light on one simple fact. the million-dollar question:

* do you know what the average access latency to xbox's main memory was under normal load, from the POV of the different clients? how about cube's?
 
This looks a lot like the Xbox version to me. So I guess it's right on the mark then.
 
One thing I do notice is that they've added some light blooming just like Red Steel. The enemies also seem to have a little bit more detailed textures (look at the arms and the veins).

Compare it to the Xbox version...

far-cry-instincts-20050920061603211.jpg

far-cry-instincts-20050901015426120.jpg

far-cry-instincts-20050901015424979.jpg

far-cry-instincts-20050901015427823.jpg
 
I prefer the look of the Xbox version based on those screens, the textures don't turn to crap 5 feet away from the camera.
 
I prefer the look of the Xbox version based on those screens, the textures don't turn to crap 5 feet away from the camera.

That's why I think these are from an old build. Comparing these next to Xbox shots doesn't really say much about the hardware. Reports put this game at a Nov release; development probably began shortly after E3.
 
One thing I do notice is that they've added some light blooming just like Red Steel. The enemies also seem to have a little bit more detailed textures (look at the arms and the veins).

Compare it to the Xbox version...

far-cry-instincts-20050920061603211.jpg

far-cry-instincts-20050901015426120.jpg

far-cry-instincts-20050901015424979.jpg

far-cry-instincts-20050901015427823.jpg

That looks better than the Wii version. I think they are underestimating Wii's graphical capabilities, expecting that the game will sell anyway thanks to the original gameplay :???:
 
My main point here is that the Wii GPU is limited to directly accessing a smaller amount of total memory than the original Xbox (24 MB vs 64 MB). That's bound to introduce some porting issues into the equation when moving from the Xbox to the Wii. Now, maybe that smaller amount of memory has lower average latency, but GPUs are hardly sensitive to this kind of thing and bandwidth plays a much larger role.
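To make the porting pain a bit more concrete, here's a back-of-envelope sketch. The compression format, mip overhead and the "half the pool goes to textures" split are my assumptions, not measured figures:

```c
#include <stdio.h>

/* Rough texture-footprint estimate: CMPR/DXT1-style compression is
   4 bits per texel, and a full mip chain adds roughly one third on top.
   Ballpark assumptions, not profiled numbers. */
static double tex_mb(int w, int h, double bits_per_texel)
{
    double base = (double)w * h * bits_per_texel / 8.0;
    return base * (4.0 / 3.0) / (1024.0 * 1024.0);   /* + mips, in MB */
}

int main(void)
{
    double per_tex  = tex_mb(512, 512, 4.0);   /* ~0.17 MB per texture      */
    double gpu_pool = 24.0;                    /* Wii GPU-visible pool (MB) */
    double xbox_uma = 64.0;                    /* Xbox unified pool (MB)    */

    /* Neither pool is all textures: framebuffer copies, vertex data,
       display lists and CPU data eat into it too. Assume half for textures. */
    printf("512x512 CMPR + mips: %.2f MB each\n", per_tex);
    printf("24 MB pool, half for textures: ~%d such textures resident\n",
           (int)(gpu_pool * 0.5 / per_tex));
    printf("64 MB UMA,  half for textures: ~%d such textures resident\n",
           (int)(xbox_uma * 0.5 / per_tex));
    return 0;
}
```

Anything beyond that has to be streamed in and out, which is exactly the manual shuffling mentioned earlier.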

 
That looks better than the Wii version. I think they are underestimating Wii's graphical capabilities, expecting that the game will sell anyway thanks to the original gameplay :???:

Actually, I seem to remember that Nintendo was much later with getting devkits out than, say, Sony. What they did instead was release GameCubes with Wii-mote support, and the games were initially just developed for the GameCube. This would allow the developers to concentrate on the important part, which is making good use of the new controller.

No doubt with the promise that any graphics upgrades the Wii would receive would be easy to tap into; I think that's what developers aimed for initially.

We also shouldn't rule out the possibility that catering for this new kind of gameplay requires a bit of extra calculation at the cost of graphics. You also need to be more responsive, so framerates are going to be more important than graphical detail.

And finally, of course, apart from supporting 480p, the Wii will simply always look inferior compared to the other, more expensive next-gen machines. And we know from experience with previous generation shifts that old-gen graphics start to look really old really soon once the new gen arrives.
 
My main point here is that the Wii GPU is limited to directly accessing a smaller amount of total memory than the original Xbox (24 MB vs 64 MB). That's bound to introduce some porting issues into the equation when moving from the Xbox to the Wii. Now, maybe that smaller amount of memory has lower average latency, but GPUs are hardly sensitive to this kind of thing and bandwidth plays a much larger role.


Wasn't Gamecube 24MB and the Wii upgraded to something like 96MB?
 
My main point here is that the Wii GPU is limited to directly accessing a smaller amount of total memory than the original Xbox (24 MB vs 64 MB). That's bound to introduce some porting issues into the equation when moving from the Xbox to the Wii. Now, maybe that smaller amount of memory has lower average latency, but GPUs are hardly sensitive to this kind of thing and bandwidth plays a much larger role.

You put too much faith in those leaked specs; they didn't even include Hollywood.
 
My main point here is that the Wii GPU is limited to directly accessing a smaller amount of total memory than the original Xbox (24 MB vs 64 MB).

at least it seems so. and i got your main point from the get go, i believe. maybe i was being overly sarcastic in delivering my response.

That's bound to introduce some porting issues into the equation when moving from the Xbox to the Wii.

it takes texture management. something people have been doing since the dawn of 3d hw. is it easier not to do it? - of course. is it justified not to do it, though? i'm still waiting for the day when all the textures everybody will ever need go nicely over the system's main bus without constituting a severe hit to the rest of the system. last-minute porting frenzies aside, i'm fine with texture management, and nothing makes an architecture that requires texture management inefficient in my eyes if, once in place, said management makes my life easier. versus devs developing mem-access paranoia by the end of the project.
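for the curious, roughly what such a management loop could look like - purely a sketch, the 'dma' is just memcpy and the allocator is a naive bump allocator, none of it is any real SDK's API:

```c
/* hypothetical texture residency sketch -- not real SDK code.
   big pool = the large 64 MB pool (not GPU-visible),
   gpu pool = the 24 MB pool the GPU actually samples from. */
#include <string.h>

#define GPU_POOL_SIZE (24u * 1024u * 1024u)

static unsigned char gpu_pool[GPU_POOL_SIZE];   /* stand-in for the 24 MB pool */
static unsigned      gpu_pool_used;

typedef struct {
    const void *src_in_big_pool;   /* where the texels live long-term      */
    void       *dst_in_gpu_pool;   /* NULL while not resident              */
    unsigned    size;              /* bytes, mips included                 */
    unsigned    last_used_frame;   /* bookkeeping for an LRU-style evictor */
} texture_t;

/* naive bump allocator -- a real engine would evict and defragment */
static void *gpu_pool_alloc(unsigned size)
{
    if (gpu_pool_used + size > GPU_POOL_SIZE)
        return 0;                              /* full: caller must evict */
    void *p = gpu_pool + gpu_pool_used;
    gpu_pool_used += size;
    return p;
}

/* call before issuing a draw that samples 'tex' */
static int ensure_resident(texture_t *tex, unsigned frame)
{
    if (!tex->dst_in_gpu_pool) {
        void *dst = gpu_pool_alloc(tex->size);
        if (!dst)
            return 0;                          /* no room: evict something, retry */
        /* stand-in for a DMA transfer from the big pool into the GPU pool */
        memcpy(dst, tex->src_in_big_pool, tex->size);
        tex->dst_in_gpu_pool = dst;
    }
    tex->last_used_frame = frame;              /* keep the LRU info fresh */
    return 1;
}
```

nothing exotic - the point is simply that residency becomes an explicit, per-draw decision rather than something a unified memory controller quietly does for you.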

Now, maybe that smaller amount of memory has lower average latency, but GPUs are hardly sensitive to this kind of thing and bandwidth plays a much larger role.

as clever latency-hiding beasts as GPUs are, they're not immune to latency. nothing is and nothing ever will be. re their big bandwidth appetites - the GC met that demand hardly any worse than the xbox (actually i'd say way better: > 10GB/sec from texcache, and > 7GB/sec to the fb), and the wii will only be better in this regard. last but not least, we're discussing a GPU + CPU setup here - and latencies for the latter are far from insignificant.
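quick back-of-envelope behind those two figures, with assumed flipper-class numbers - the clock, pipe count and bytes-per-clock values are ballpark guesses on my part, not official spec:

```c
#include <stdio.h>

int main(void)
{
    /* assumed flipper-class figures -- ballpark, not official spec */
    double clock_hz    = 162e6;   /* GPU clock                     */
    double pixel_pipes = 4.0;     /* pixels per clock              */

    /* per-pixel fb traffic: 24-bit colour write + 24-bit Z read +
       24-bit Z write is ~9 bytes; round up for blending/RMW cases */
    double fb_bytes_per_pixel = 12.0;

    /* assume the 1 MB texture cache can feed ~64 bytes of texels
       per clock across the pipes                                  */
    double tex_bytes_per_clock = 64.0;

    double fb_bw  = clock_hz * pixel_pipes * fb_bytes_per_pixel;
    double tex_bw = clock_hz * tex_bytes_per_clock;

    printf("peak fb  bandwidth: ~%.1f GB/s\n", fb_bw  / 1e9);   /* ~7.8  */
    printf("peak tex bandwidth: ~%.1f GB/s\n", tex_bw / 1e9);   /* ~10.4 */
    return 0;
}
```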

fact of life is the xbox macro-architectural decisions ended up so bad that even overly-expensive components could not mend that. how much effort was wasted on the development of nv2a's 'omnipotent' cross-bar mem controller only to find out at the end of the day it was not that magic bullet*? how much did the high-specced DDR delivering the 'massive' bw cost, only to be crippled by insane latencies (from the POV of the CPU)?

* ok, it ended up being a fine local vidmem controller, so nv actually gained from it. at somebody else's expense, though.
 
Wasn't Gamecube 24MB and the Wii upgraded to something like 96MB?

It keeps the original 24 MB 1T-SRAM pool, with the GameCube's 16 MB "A-RAM" upgraded to 64 MB and given a faster interface. As Ooh-videogames has pointed out, this is based on the leaked specs, which, judging by everything that's been seen and reported, look pretty accurate. I would expect the Wii GPU to be clocked higher than Flipper, but I don't think we'll see any changes to the on-die memory architecture. I suspect the biggest reason the Wii doesn't support HD resolutions is the 2 MB framebuffer.
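To put a number on that last point (a sketch; the 24-bit colour + 24-bit Z layout is the commonly quoted GameCube eFB format, and I'm assuming the Wii keeps it):

```c
#include <stdio.h>

/* MB needed to keep a full colour + Z framebuffer on-die */
static double efb_mb(int w, int h, int color_bytes, int z_bytes)
{
    return (double)w * h * (color_bytes + z_bytes) / (1024.0 * 1024.0);
}

int main(void)
{
    /* assumed format: 24-bit colour + 24-bit Z (3 + 3 bytes per pixel) */
    printf("640x480  (480p): %.2f MB\n", efb_mb(640, 480, 3, 3));    /* ~1.76 MB - fits in ~2 MB     */
    printf("1280x720 (720p): %.2f MB\n", efb_mb(1280, 720, 3, 3));   /* ~5.27 MB - nowhere near fits */
    return 0;
}
```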
 
fact of life is the xbox architectural decisions ended up so bad that even overly-expensive components could not mend that. how much effort was wasted on the development of nv2a's 'omnipotent' cross-bar mem controller only to find out at the end of the day it was not that magic bullet*? how much did the high-specced DDR delivering the 'massive' bw cost, only to be crippled by insane latencies?

* ok, it ended up being a fine local vidmem controller, so nv actually gained from it. at somebody else's expense, though.

I don't know, I think you're being a bit too hard on the Xbox here. It is, after all, the console with the best graphics of the past generation. It's also the only last-generation console with games running at HD resolutions (GT4 1080i notwithstanding). A big factor in its ability to do that was in fact its UMA design.
 
I don't know, I think you're being a bit too hard on the Xbox here. It is, after all, the console with the best graphics of the past generation. It's also the only last-generation console with games running at HD resolutions (GT4 1080i notwithstanding). A big factor in its ability to do that was in fact its UMA design.

and a cray spiced up with a bunch of onyx stations would put an xbox to shame wrt HD rendition. how does that make such a config efficient, though? remember, we're talking efficiency here.

and since not every company can afford to waste billions on hw, nor is there such a thing as infinite resources, efficiency does matter. but maybe it matters more in my eyes than it does in yours *shrug*
 
It keeps the original 24 MB 1T-SRAM pool, with the GameCube's 16 MB "A-RAM" upgraded to 64 MB and given a faster interface.

But as long as the "A-Ram" has high BW and (at least in this case) low latency, what is the problem? wouldn't it even mean more BW than a single pool (as happens in the PS3)?

Correct me, but wouldn't this make the wii way better than the XB in terms of memory/BW/latency and general efficiency?


In regard to the FC images, it seems a bit raw in some parts, but that also looks like a memory-related problem, so I expect it to be much better in the final game, although I agree that the XB version looks better.


On a more interesting note, I wonder if (assuming this is really from the wii) they are using FC to test or use alternative controls, as, unlike RS or MP3, in all the shots the aim is in the center of the screen. maybe we will see mouse-like controls, I hope so.
 