First pics of FarCry for Wii?

But as long as the "A-RAM" has high bandwidth and (at least in this case) low latency, what is the problem? Wouldn't it even mean more bandwidth than a single pool (as happens in the PS3)?

Correct me if I'm wrong, but wouldn't this make the Wii way better than the XB in terms of memory/BW/latency and general efficiency?

The 64 MB memory pool is supposedly some form of DDR SDRAM, so the bandwidth will probably not be overly high, nor will the latency be exceptionally low. Again, I would point out that the GPU probably only has access to the smaller 24 MB pool, which means developers will have to use bandwidth to shuffle textures back and forth if they decide to use the 64 MB pool for additional storage.
 
Not according to those specs (probably the ones I guess you are using).



Anyway, the spec rumors, especially in regard to memory, are really bad: some say it is GDDR3, others say it is 1T-SRAM, and even the various IGN reports on the subject are contradictory (IIRC one of them says a total of 104 MB, while another says 88 MB).
 
The 64 MB memory pool is supposedly some form of DDR SDRAM, so the bandwidth will probably not be overly high, nor will the latency be exceptionally low.

I thought it was 1T-SRAM, with no latency issues? That was one of the reasons the GC was able to go head to head with the Xbox last generation.
 
Actually, I seem to remember that Nintendo was much later with getting devkits out than, say, Sony. What they did instead was release GameCubes with Wii-mote support, and the games were initially just developed for the GameCube. This would allow the developers to concentrate on the important part, which is making good use of the new controller.

No doubt with the promise that any graphics upgrades the Wii received would be easily tapped into; I think that's what developers aimed for initially.

We also shouldn't rule out the possibility that catering for this new kind of gameplay requires a bit of extra calculation at the cost of graphics. You also need to be more responsive, so framerates are going to be more important than graphical detail.

And finally, of course, topping out at 480p compared to the more expensive next-gen machines, the Wii will simply always look inferior. And we know from experience with previous generation shifts that old-gen graphics start to look really old really soon once the new gen arrives.

Yep that makes sense :)
 
Yep, I think it's an "open secret" that they sent out OC'd GC devkits with Wiimotes instead of Wii devkits.

Also, going by several statements from developers etc., I'm fairly certain that those "leaked specs" aren't the truth: Hollywood isn't just "Flipper 2" but a whole lot more, and different.

We always see 2nd-gen games look a whole lot better than 1st-gen launch games; everyone knows that. But I'm fairly certain that Wii games will get a huge jump in the gfx department, due to 2nd-gen titles being developed from the ground up on Wii devkits. The same should go for the XB360 in some respects too, since, after all, they DID develop those launch titles for quite a long time on dual G5s + X800s. On PS3 I'm not expecting as big a jump, though, since the hardware they're using they've essentially had all along: G7x parts have been around for quite a while and aren't that big a jump over the first NV4x parts.
 
The console specs you're using are off and fake. Broadway, I know, has higher cache specs, and the DDR memory has yet to be announced by anyone and goes against the MoSys announcements.

Here is how IGN got their numbers on RAM:

88 MB is the 24 MB of RAM on the GC combined with the 64 MB of RAM that is now added in.

107 or 104 MB is the 24 MB main pool, the additional 64 MB, the 3 MB framebuffer, and the 16 MB of embedded RAM on the GPU, which they don't really know the purpose of.

IGN has pretty much always listed the same specs.
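The arithmetic behind IGN's figures can be checked directly. A minimal sketch, assuming the rumored pool sizes discussed in this thread (none of which are confirmed specs):

```python
# Rumored Wii memory pools in MB, as discussed in this thread -- unconfirmed figures.
MAIN_1T_SRAM = 24    # GameCube-style main pool
EXTERNAL_POOL = 64   # newly added external pool
FRAMEBUFFER = 3      # on-chip framebuffer memory
GPU_EMBEDDED = 16    # embedded RAM on the GPU, purpose unknown

# IGN's 88 MB figure: the GC-style main pool plus the new external pool.
total_88 = MAIN_1T_SRAM + EXTERNAL_POOL
print(total_88)  # 88

# The 104 MB figure adds the 16 MB embedded RAM; 107 MB adds the framebuffer too.
total_104 = total_88 + GPU_EMBEDDED
total_107 = total_104 + FRAMEBUFFER
print(total_104, total_107)  # 104 107
```

So the "contradictory" totals are just different choices about which pools to count, not different underlying numbers.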
 
The console specs you're using are off and fake. Broadway, I know, has higher cache specs, and the DDR memory has yet to be announced by anyone and goes against the MoSys announcements.

Here is how IGN got their numbers on RAM:

88 MB is the 24 MB of RAM on the GC combined with the 64 MB of RAM that is now added in.

107 or 104 MB is the 24 MB main pool, the additional 64 MB, the 3 MB framebuffer, and the 16 MB of embedded RAM on the GPU, which they don't really know the purpose of.

IGN has pretty much always listed the same specs.

I consider their info more reliable than some anonymous dev's. Like you said, all of Nintendo's technical partners are known. None of them mention the use of GDDR3 memory, which would go against Nintendo's efficient design.

Hollywood is such a big mystery; devs (EA, Ubi) didn't get kits with Hollywood until shortly before E3 2006. I think Ubi may be the only third-party dev that decided not to start development from an old GC engine or PS2 port. I think the target renders are more than possible.
 
Well 1T-SRAM is a marketing term for a DRAM cell with an SRAM interface, so saying it has no "latency issues" is not quite sensible, though it certainly does have lower access latency than standard DRAM.

As to the other questions, I am assuming the 64 MB pool is some form of more traditional DDR SDRAM like memory that replaces the 16 MB SDRAM (that used an 8-bit interface). So the total system memory would be 88 MB, not including the 3 MB on-chip graphics memory.

I thought it was 1T-SRAM, with no latency issues? That was one of the reasons the GC was able to go head to head with the Xbox last generation.
 
Well, remember that the Wii only has a 1 MB texture buffer
"Only". How much do you think any other GPU has, 60MB? ;) FYI, it's considerably less than 1MB. Besides, the buffer stores compressed pixels, so that's 4-6MBs worth of texture data in reality. Add to that, the virtual texturing ability that only fetches texels actually visible (unless the programmer manually specifies otherwise and locks a texture permanently), and you'll see that that 1MB becomes quite roomy. This is NOT a point where Flipper should be criticized. You don't criticize a horse stable for not being big enough to hold an elephant when you can still fit a friggin horse in it...

This virtual texturing probably only reads from the 24 MB main memory pool
That's baseless speculation. Why do you even bother?

In any event, I'd hardly describe it as more efficient, though it may yield higher performance.
You're not exactly looking at things with the right perspective here, so maybe chill with the nonsensical statements, hm?
 
You still have to be able to store those textures somewhere, and whether that's in 24 MB or 88 MB, it's not much. Basically, the proof is in the pudding here, the screenshots show very low texture detail, and I'm just extrapolating from publicly released information (speculative, but pretty accurate I'd say) as to why this is the case.

"Only". How much do you think any other GPU has, 60MB? ;) FYI, it's considerably less than 1MB. Besides, the buffer stores compressed pixels, so that's 4-6MBs worth of texture data in reality. Add to that, the virtual texturing ability that only fetches texels actually visible (unless the programmer manually specifies otherwise and locks a texture permanently), and you'll see that that 1MB becomes quite roomy. This is NOT a point where Flipper should be criticized. You don't criticize a horse stable for not being big enough to hold an elephant when you can still fit a friggin horse in it...
 
You still have to be able to store those textures somewhere, and whether that's in 24 MB or 88 MB, it's not much. Basically, the proof is in the pudding here, the screenshots show very low texture detail, and I'm just extrapolating from publicly released information (speculative, but pretty accurate I'd say) as to why this is the case.

So what you're saying is, you have watched some Wii gameplay vids and viewed some images?

Therefore your opinion on the accuracy of those leaked specs (which have yet to be confirmed) is based on those videos and images?

How can you judge the images and the console's capabilities when you have no idea at what stage of development those images were taken?
 
So what you're saying is, you have watched some Wii gameplay vids and viewed some images?

Therefore your opinion on the accuracy of those leaked specs (which have yet to be confirmed) is based on those videos and images?

How can you judge the images and the console's capabilities when you have no idea at what stage of development those images were taken?

I'm not expecting radical improvements in asset quality or resolution. Based on the leaked specs, we are looking at a system in the same ballpark as the Xbox 1. The Wii game footage shown thus far seems to bear that out.
 
I'm not expecting radical improvements in asset quality or resolution. Based on the leaked specs, we are looking at a system in the same ballpark as the Xbox 1. The Wii game footage shown thus far seems to bear that out.

ban25, you don't determine the 'ballpark' of a system by judging the worst or mediocre-level work on it. if that were the case, xbox360 itself would be in the same ballpark as xbox1. and even at this early stage, there's been wii footage that suggests higher-than-xbox1 capabilities. have you seen all of the mp3 footage?

and for one last time - there's no technical reason that would cause a wii title to use fewer textures than an xbox title, with or without the GPU's direct access to the extended memory pool. there might be other factors, though, like a rush-job port or general time constraints that could preclude the devs from using the system properly. so if you really want to make a dent, i suggest you look elsewhere - texturing is not wii's weak spot.
 
ban25, you don't determine the 'ballpark' of a system by judging the worst or mediocre-level work on it. if that were the case, xbox360 itself would be in the same ballpark as xbox1. and even at this early stage, there's been wii footage that suggests higher-than-xbox1 capabilities. have you seen all of the mp3 footage?

and for one last time - there's no technical reason that would cause a wii title to use fewer textures than an xbox title, with or without the GPU's direct access to the extended memory pool. there might be other factors, though, like a rush-job port or general time constraints that could preclude the devs from using the system properly. so if you really want to make a dent, i suggest you look elsewhere - texturing is not wii's weak spot.

I've seen most of the publicly released footage for the Wii, including all the first-party stuff like MP3, SMG, etc. I stand by my statement that on every level, from raw technical specifications to first and third party games, it is essentially in the same category as the Xbox. Now, you may very well argue details, like how it has 30% more memory or some such, translating into increased texture variety or what have you, but when I look at all the games -- there's nothing I would say cannot be achieved on the Xbox with the exception of Wii's unique method of control.
 
I'm not expecting radical improvements in asset quality or resolution. Based on the leaked specs, we are looking at a system in the same ballpark as the Xbox 1. The Wii game footage shown thus far seems to bear that out.

I would agree with you if game development for Wii titles had not begun its life on GC devkits. Also, the amount of RAM available compared to the Xbox is, IMO, not the main benefit giving the Wii the edge. I don't believe the Wii uses GDDR3 memory at all; maybe it's included in old dev kits, but not the final kits devs have now. The texture cache bandwidth is probably double that of the GC, putting it at 20 GB/s (peak) or more.

We don't know enough about the specs to say Wii is in the same ballpark as Xbox.
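The bandwidth guess above is simple doubling arithmetic. A sketch, assuming the commonly cited ~10.4 GB/s peak figure for Flipper's texture cache as the GameCube baseline (both numbers are speculative):

```python
# Speculative estimate: Wii texture cache at double the GameCube's peak rate.
GC_TEXTURE_CACHE_GBPS = 10.4  # commonly cited peak for Flipper's 1 MB texture cache

# The post's "double that of GC" guess.
wii_estimate = GC_TEXTURE_CACHE_GBPS * 2
print(f"~{wii_estimate:.1f} GB/s peak")
```

Doubling the baseline lands right around the 20 GB/s the post mentions, which is where that round number likely comes from.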
 
I've seen most of the publicly released footage for the Wii, including all the first-party stuff like MP3, SMG, etc.

While I agree that most of the footage seen so far shows games more or less at the XB level, you must remember that:

1- The companies at E3 had had Wii SDKs for less than a month (some just two weeks); others got them even later.

2- Almost none of the companies that showed games is well known for pushing HW (or even going beyond PS2 HW), although we have seen some things like the target renders for RS, Disaster, or Pokemon, and if the games end up looking like that they will without doubt surpass the XB. In fact, some showed things that would look bad even on the N64 (e.g. Necro-Nesia), yet that does not put the Wii on the level of the N64. Plus, many of the ones that do try aren't using engines that are particularly well suited to GC-like HW (FC is an example).

3- Even XB360 launch games looked like XB games, and those didn't have to deal with a brand new controller; most of the effort is probably being spent on that.

Anyway, like I said, the only official thing we know is that it is better than the XB, so sooner or later we should see that.
 
You still have to be able to store those textures somewhere, and whether that's in 24 MB or 88 MB, it's not much. Basically, the proof is in the pudding here, the screenshots show very low texture detail, and I'm just extrapolating from publicly released information (speculative, but pretty accurate I'd say) as to why this is the case.
There are GCN games with way better texture resolution. So that's really a non issue. Just because a developer can't make proper use of the resources, doesn't mean the system is at fault.
 