WiiGeePeeYou (Hollywood) what IS it?

You need to wait years for the potential of the system to show up.

Not that I think any of these screenshots illustrate the best of what Wii has to offer, but if Wii is truly architecturally similar to GC, and as a developer you have experience with the latter, then it probably won't take too much time for you to discover the potential of Wii.
 
No, that's really not the case. There is zero proof. A dev, if they wanted to, could make a game look like an N64 game on the 360. What we are seeing are preproduction screenshots of games on the internet that likely started development on GC. Of course they will look like GC games, yet that says nothing about the potential of the system.
Perhaps you don't understand what colour dithering shows? Sure, you could make your XB360 game dithered, but it's not something anyone would choose to do. Dithering is the product of having less than 8 bits per channel for your colour. In this case, with the patent talking about framebuffers being low-colour, this screenshot offers evidence to support that.

You can get performance/resource savings using low-colour so it makes sense in some cases - PSP has quite a lot of dithering that I've seen. The question is, when low-colour was used in a 5 year old piece of hardware at the same time everything else was 24 bit, why have Nintendo not progressed their hardware to work in full 24 bit when alpha blending (if that's what is the case here)? Supporting these modes for BC makes sense of course, but why is a new Wii-specific game using low-colour modes? No other modern GPU is running like this AFAIK, and I can't imagine ATi creating a brand-new GPU whose alpha support is limited to 5/6 bit colour when using blending. I mean, we're looking at 16+ bit per channel HDR FP colour in the main, and Wii hasn't even got proper 8 bit per channel with alpha?! :???:
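For anyone who hasn't seen what quantising to fewer than 8 bits per channel actually looks like in code, here's a minimal sketch in C. It uses a generic 4x4 ordered-dither matrix and is purely illustrative - it's not how Hollywood (or any particular GPU) necessarily does it:

```c
#include <stdint.h>

/* Purely illustrative - not Hollywood code. Quantise an 8-bit colour channel
 * down to 5 bits using a 4x4 ordered (Bayer) matrix. The matrix spreads the
 * rounding error across neighbouring pixels, which produces the regular
 * speckle pattern being pointed at in the screenshots. */
static const uint8_t bayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

uint8_t dither_8bit_to_5bit(uint8_t value, int x, int y)
{
    /* One 5-bit step spans 8 input levels, so scale the 0..15 matrix entry
     * down to a 0..7 threshold before truncating. */
    int v = value + bayer4[y & 3][x & 3] / 2;
    if (v > 255) v = 255;
    return (uint8_t)(v >> 3);   /* 0..31, i.e. 5 bits per channel */
}
```

Without the threshold you just get flat banding; with it you get the characteristic checkered noise in smooth gradients and blended particles.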

Guden said:
Nobody CARES about dithering or "low-color" (whatever that might be), or anything about what's under the hood. The DS showed that, in case anyone really needed a reminder.
I dunno. People might not care, and I doubt it'll affect their purchasing decision, but sometimes it's pretty obvious and it irks me. In a few PSP and GC titles I've seen, I can't help but notice it in the particle effects. For me, I'm surprised that Nintendo are releasing a new piece of hardware that they've had 5 years to work on, and had ATi working on a GPU for a while, and it's selling for $250, and yet the graphics are of such low relative quality... I can't imagine anyone choosing to go with that! Why not go with 8 bit colour + 8 bit alpha + 24/32 bit Z like every other solution?

Well, at this point out comes the calculator...

Let's take NTSC SDTV resolution as 720x480. That's ~350,000 pixels. 96 bits per pixel gives 12 bytes per pixel, or ~4.2 MB. Given 4 MB eDRAM, if you don't want tiling you'd have to go with low-colour modes. That would explain the reasoning for including this option. It gives such a GPU 3x SSAA, but with colour dithering.

If you wanted the same SS in higher quality, 32 bit with alpha+16 bit Z, you'd need ~6 MB eDRAM. Alternatively you do the rendering in tiles. Would it be that much more expensive for Wii to have 6 MB eDRAM instead of 4? And why supersampling? Why not go with MSAA?
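If you want to poke at those numbers yourself, here's the same back-of-envelope arithmetic as a tiny C program. The 3 samples per pixel and 16 bit Z figures are taken from the reading of the patent above, not confirmed specs:

```c
#include <stdio.h>

int main(void)
{
    const int width = 720, height = 480, samples = 3;
    const int pixels = width * height;                 /* 345,600 */

    /* Assumed (not confirmed) per-sample formats: 16-bit Z plus either
     * 16-bit 5.6.5 colour or 32-bit 8.8.8.8 colour. */
    const int low_bits_per_sample  = 16 + 16;
    const int high_bits_per_sample = 32 + 16;

    double low_mb  = (double)pixels * samples * low_bits_per_sample  / 8.0 / 1e6;
    double high_mb = (double)pixels * samples * high_bits_per_sample / 8.0 / 1e6;

    printf("3x SSAA, low colour : %.1f MB\n", low_mb);   /* ~4.1 MB */
    printf("3x SSAA, true colour: %.1f MB\n", high_mb);  /* ~6.2 MB */
    return 0;
}
```

That gives roughly 4.1 MB for the low-colour case against 6.2 MB for true colour, which is where the 4 vs 6 MB eDRAM question comes from (at a 640x480 render the low-colour figure drops under 4 MB).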

Still, I have to admire Nintendo's convictions that there's no point investing in tech! They really do seem to be taking the 'give them a new experience and save on everything else' approach. Why put in 6 MB eDRAM and give better quality if you can get away with less eDRAM and lower quality? I guess on a lot of SD screens the dithering would be pretty smudged so many won't notice!

overall, i think we're a bit hasty jumping to pixel-level conclusions just from looking at tiny clips of jpeg that can be used as jpeg compression artifacts primers : )
A JPEG isn't going to be adding that sort of dithering if it wasn't there to begin with. It's safe to say that dithering is part of the rendering. That of course doesn't tell us everything about a system, but it gives us clues. Add to that the patent, likely specs, etc., and it adds up to a reasonable case that Wii has 4 MB eDRAM and a buffer which supports 3x SSAA with 5.6.5 bit colour when using alpha blending, which will mean any game rendering in that mode will exhibit dithering. In a thread asking what Hollywood is, that seems to give us some good material to consider! I do accept though that it doesn't show us what other features the GPU has and how it may or may not differ from GC. It could have all the features of a DX10 part, but just use low-colour supersampled framebuffers, for all we know.
 
Not that I think any of these screenshots illustrate the best of what Wii has to offer.
To make sure this is very clear, no-one's looking at this screeny and saying 'it's dithered, so Wii can only do dithered graphics.' The idea that Wii needs to use dithering comes from the patent, and this screenshot is validating that theory. It doesn't prove it, but it does show there's reason to think that 96 bit super sampled 5.6.5+alpha framebuffers are a technical design limit of Wii and it shares GC's 24 bit per pixel architecture rather than upping it to 32 bit per pixel.
 
The most likely scenario is that this game was built and completed on GC dev kits with the Wii controller, and once they got final kits they didn't bother to do any upgrades. Why would they, for this type of game aimed at this type of audience?
 
Perhaps you don't understand what colour dithering shows? Sure, you could make your XB360 game dithered, but it's not something anyone would choose to do. Dithering is the product of having less than 8 bits per channel for your colour. In this case, with the patent talking about framebuffers being low-colour, this screenshot offers evidence to support that.

actually no, a fully-dithered framebuffer looks different. the only thing that is apparently dithered in that shot is the fire-trail locale.

A JPEG isn't going to be adding that sort of dithering if it wasn't there to begin with.

and you're ready to personally guarantee that for all jpeg compressors out there? anyhow, that was not my point - it was that we try to judge too much at a pixel level with too little sound evidence.

It's safe to say that dithering is part of the rendering. That of course doesn't tell us everything about a system, but it gives us clues.

yes. that the fire-trail texture is dithered, most likely because it is of high-color. now, why it is high color and not true color is a totally different topic - my guess is somebody could have used that extra space saving. something no platform is immune from - if i wanted to cram, say, 2 min of a video texture i guarantee you i would not have done that with true color on any of the present platforms.
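to put a rough number on that, here's a back-of-envelope in C - the 256x256 / 15 fps figures are completely made up for illustration, and obviously any real game would also compress the frames:

```c
#include <stdio.h>

int main(void)
{
    /* made-up illustrative figures: 2 minutes of frames at 256x256, 15 fps */
    const int w = 256, h = 256, fps = 15, seconds = 120;
    const long frames = (long)fps * seconds;

    double mb_16bit = (double)frames * w * h * 2 / 1e6;  /* high colour */
    double mb_32bit = (double)frames * w * h * 4 / 1e6;  /* true colour */

    printf("16-bit: ~%.0f MB, 32-bit: ~%.0f MB\n", mb_16bit, mb_32bit);
    return 0;
}
```

dropping to high colour halves the footprint straight away, before compression even enters the picture.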

Add to that the patent, likely specs, etc., and it adds up to a reasonable case that Wii has 4 MB eDRAM and a buffer which supports 3x SSAA with 5.6.5 bit colour when using alpha blending, which will mean any game rendering in that mode will exhibit dithering.

again, a fully-dithered frame buffer does not look like that. and the alpha blending you're talking about here is destination alpha - normally alpha blending is done with the source alpha.
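to spell out the source vs destination alpha distinction for anyone following along, here are the two blend equations as plain C - a generic illustration, not the actual GX API:

```c
/* generic illustration, not the actual GX API:
 *   source-alpha blend      : out = src.rgb * src.a + dst.rgb * (1 - src.a)
 *   destination-alpha blend : out = src.rgb * dst.a + dst.rgb * (1 - dst.a)
 * only the second needs alpha bits stored in the framebuffer itself, which is
 * why the framebuffer's colour format matters for it at all. */
typedef struct { float r, g, b, a; } rgba;

static rgba blend_src_alpha(rgba s, rgba d)
{
    rgba o = { s.r * s.a + d.r * (1.0f - s.a),
               s.g * s.a + d.g * (1.0f - s.a),
               s.b * s.a + d.b * (1.0f - s.a),
               d.a };
    return o;
}

static rgba blend_dst_alpha(rgba s, rgba d)
{
    rgba o = { s.r * d.a + d.r * (1.0f - d.a),
               s.g * d.a + d.g * (1.0f - d.a),
               s.b * d.a + d.b * (1.0f - d.a),
               d.a };
    return o;
}
```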

shifty, you're walking into territory where you're trying to explain certain developers' decisions based solely on supposed hw limitations. with all due respect, i think you're getting ahead of yourself here.
 
To make sure this is very clear, no-one's looking at this screeny and saying 'it's dithered, so Wii can only do dithered graphics.' The idea that Wii needs to use dithering comes from the patent, and this screenshot is validating that theory. It doesn't prove it, but it does show there's reason to think that 96 bit super sampled 5.6.5+alpha framebuffers are a technical design limit of Wii and it shares GC's 24 bit per pixel architecture rather than upping it to 32 bit per pixel.

I know a lot of people are fighting this conclusion, but between Matt's insider info and how all the games currently look it appears to be the most obvious one. That said, there may be special sauce in the Hollywood chip which can be exposed later as the API developers and devs get final hardware and can begin digging--but since Nintendo is so closed off on this, that is a guess/conjecture/stretch that currently has no tangible evidence.

Shifty, you mentioned being disappointed. Count me in. I own a GCN, my only last gen console. While non-GCN owners may be tickled pink, my reaction is "Why not just give me the darned Wii-mote?"

Obviously that wasn't gonna happen for practical marketing purposes, but at $250 that is an absolutely fair question. At that price, $50 cheaper than the 360 Core at $299, I expect MUCH more. Just from the games shown I get the deja vu feeling--I got these games on my GCN in 480p via a component cable :???:

Technically I look at a solid DX9 class GPU like the Radeon 9700Pro/9800Pro and shake my head. The R3xx series was developed by the ATI West Coast (ArtX) team. The 9700 was manufactured on the 150nm process and had 8 SM2.0 Pixel Shaders and 8 TMUs and was released in Q3 2002.

Hollywood is on the 90nm process, and presumably on the fast track for 65nm in 2007. On the market we already have GPUs that are passively cooled in a PC with much higher performance (with SM3.0/32-bit precision pixel shader support), on the 110nm process, and retailing currently in the $100 range (i.e. PCB, GPU, Memory, Cooling Solution; e.g. the 6600GT).

I see no reason, at $250, that Nintendo could not have added the controller (even at $50) and could have taken 90nm Gekko/Broadway chips (overclocked) and added in a nice, modern GPU that performed EXCEEDINGLY well at 480p widescreen AND STILL MADE MONEY AT $250. The GCN has been selling at sub-$100 for years.

Now we can all hope this is the case, but is there absolutely ANY evidence of such?

Nope, not that I have seen. Although a DX9 class GPU would have been exceedingly affordable, the fact we have not seen any DX9 level software (e.g. HL2, Doom 3... 2004 games; mind you, newer DX9 level software on a 6600GT/9800Pro class GPU looks even better, and we are talking about 1280x1024 and 1024x768 resolutions, not 480p!!) is a pretty clear indication of what is in there IMO. I think we would see a lot of excited developers porting their DX9 PC engines over if Wii supported them.

Anyhow, the $250 price tag is a complete turn off. The Virtual Console may be a plus for some... but I already own a GCN, N64, SNES, GB, and NES. Why would I re-buy games I already own, especially on a budget console? So it all comes down to games: Do I want GCN+ level games (level = AI, Graphics, Sound, etc) and get a unique free-hand controller, or do I spend $50 more and get a platform with 10x+ performance, significantly more game support, and a proven online network?

Graphics are not the end all be all for me, but 480p DX7 class graphics have grown old and stale. If I felt I was getting a deal on Wii at $150 (a steal, and $199 with a pack-in is justifiable imo, especially for non-GCN owners) I would be excited, but I feel like I am getting a GCN+ and a Wand. The Wand is great, but so are other features the PS3 and 360 offer.

Now if I was getting a 6600/7600 class GPU in there for $250 I would feel pretty good about the purchase. Not as nice as RSX or Xenos, but competent, especially at 480p! "Good enough" graphics, some porting/asset downsizing, and my little Wii would hold its own plus offer an experience no one else had. But now I am pretty much being sold on: the same exact experience you already own, save for a new controller.

Anyhow, based on what is known, at $250 the Wii should have a MUCH better GPU. My guess: by Fall 2007 Wii is $149. $249 is a 2006 thing due to 1) demand and 2) limited supply (6M by the end of March). Might as well make money while you can!
 
shifty, you're walking into territory where you're trying to explain certain developers' decisions based solely on supposed hw limitations. with all due respect, i think you're getting ahead of yourself here.

The 3 previous times I asked in this thread if we had any solid info outside of Matt and the games shown, there has been nothing.

So I ask again: Is there absolutely ANY reason to believe that Wii has a GPU and CPU that is more than an overclocked/modified Flipper and Gekko?

If not, all your "solely" and "supposedly" comments can be flipped right back around. As far as I have seen/read, the HW limitations Shifty describes are the ONLY concrete tidbits we have outside of fan desire/hope that "there is more in there".

Not saying there isn't, but is there absolutely ANY credible information saying we should expect SIGNIFICANTLY more?

Or is it just fans grasping at straws?
 
it was that we try to judge too much at a pixel level with too little sound evidence.
If we stick to 'sound' evidence we won't have much discussion! All we have to fathom the question 'what is Hollywood?' are screenshots, patents and 'leaks'.

shifty, you're walking into territory where you're trying to explain certain developers' decisions based solely on supposed hw limitations. with all due respect, i think you're getting ahead of yourself here.
It's only ideas, and I'm happy for people to reason to the contrary :D

As for certain developers, we can cast our net wider to get more screenshots from different games. I've just buzzed through IGN and not found much. Here's a Metroid Prime shot that similarly shows what looks like a direct capture and a player image, as though from the same source as that Mario Strikers image...

[image: gc-2006-metroid-prime-3-image-20060823084519787.jpg]


Here's Elebits

[image: elebits-20060907095823847.jpg]


So this dithering isn't isolated to a single game or to GC conversions, but it's also not apparent on every Wii screen, or even a majority. At least, not at IGN. Annoyingly though, a lot of these are oversized or with super AA, so they look more like promo shots and aren't likely to show whether the real game is using dithering or not.

Anyhow, that's all the info we've got to hand. What are other people's theories? ;)
 
I think most first party titles are using 24-bit color and 24 depth (Z) bits, what Nintendo has titled 48-bit color. Red Steel may be using it as well.

Why do all the GameTrailers videos look better than those at other sites?
 

Please tell me that this is actually the PS2 version of CoD3. I mean, what does the Wii version bring that's new? The controls don't seem to get anything new or better, and the rest seems like just an expansion pack for some PS2 CoD game without online; at least on the others we get new things and online (although this is not N's fault).

Don't expect it, ooh-videogames. I luckily found a source I could trust on Broadway, and the two times the info they have has been made public they had to retract it simply because the info was too obvious. Whatever is in Hollywood, Nintendo really doesn't want it out, as I've never heard basic numbers from people who could easily spill the beans.

(Hypothetically speaking) Would it really be that hard to make a "safe" leak? E.g. a dev gives a leak to a friend, who gives it to a friend... who uses a public PC and puts it on Wikipedia, or a dev could do it directly; it only needs something to help it get some credibility on the forums.

Nobody CARES about dithering or "low-color" (whatever that might be), or anything about what's under the hood. The DS showed that, in case anyone really needed a reminder.

Still, everybody cares whether a game looks good or not, and there are some things that just look bad (e.g. the pop-up that appears in BTWii), and those kinds of things shouldn't be happening at this point on a $250 console in games that look like those.

The DS's best selling games are those that are N branded (i.e. would sell anyway) or that really use its features (barely any game uses Wii's features to the same extent), and that is fine for some games, but not for all. Also, the kinds of games one wants to play on a handheld are different, etc... I don't think that comparing any handheld directly to a console is a good idea.


You need to wait years for the potential of the system to show up.

Factor 5 had shown more than everyone else on GC/Wii (or almost anyone even on XB) in a launch game produced in 9 months.

Perhaps you don't understand what colour dithering shows? Sure, you could make your XB360 game dithered, but it's not something anyone would choose to do. Dithering is the product of having less than 8 bits per channel for your colour. In this case, with the patent talking about framebuffers being low-colour, this screenshot offers evidence to support that.

You can get performance/resource savings using low-colour so it makes sense in some cases - PSP has quite a lot of dithering that I've seen. The question is, when low-colour was used in a 5 year old piece of hardware at the same time everything else was 24 bit, why have Nintendo not progressed their hardware to work in full 24 bit when alpha blending (if that's what is the case here)? Supporting these modes for BC makes sense of course, but why is a new Wii-specific game using low-colour modes? No other modern GPU is running like this AFAIK, and I can't imagine ATi creating a brand-new GPU whose alpha support is limited to 5/6 bit colour when using blending. I mean, we're looking at 16+ bit per channel HDR FP colour in the main, and Wii hasn't even got proper 8 bit per channel with alpha?! :???:

But if they reuse the same engines (made on GC, or even the art) from the previous games on the new HW, wouldn't this (or any other problem) still happen anyway? If so, then there is the possibility that later games will not have this.

For me, I'm surprised that Nintendo are releasing a new piece of hardware that they've had 5 years to work on, and had ATi working on a GPU for a while, and it's selling for $250, and yet the graphics are of such low relative quality... I can't imagine anyone choosing to go with that! Why not go with 8 bit colour + 8 bit alpha + 24/32 bit Z like every other solution?

Me too, especially because it would be so cheap to put in better HW (even more so in a $250 console with $50 games) and please almost everyone.

Still, I have to admire Nintendo's convictions that there's no point investing in tech! They really do seem to be taking the 'give them a new experience and save on everything else' approach.

While I will not comment on this, sometimes the difference between courage and stupidity is really small (well, if they really paid and spent so much time with ATI and IBM for a GC 1.2 then I would take the second).

Personally I don't even ask for a better feature set (unless it is cheaper); ST:RS or Rebirth looks really good (especially due to the self-shadowing), but more raw power is needed to open up more possibilities for gaming.
 
Acert93, here's what we "know" so far re the GPU:

243MHz for Hollywood vs. Flipper's 162MHz. a supposed ramp up of 50%. please, pay real good attention to this last figure.

now from here on starts the 'straw grasping', but please bear with me.

remember flipper does the TnL on the cube? a really potent TnL by everybody's testimony, but when it comes to potency i'd rather let the devs' work do the talking. so, let's try to create a devs showcase here - let's take a well-respected cube developer that also has a wii project in the making. i think retro suits that well.

so what did retro achieve on the cube? mp1 & mp2. i suppose you're well familiar with those two games, but if not you can take my word for them as i am well familiar.

now, what little of mp3 i've seen until now demonstrates an upgrade in scene geometry complexity way above the dry 50% we could expect.

so unless retro got a whole lot better at designing rendition pipelines lately, what i see and what little the leaks say do not lead me to the conclusion that Hollywood is merely a 1.5x speed-factored Flipper. yes, i'm totally positive it's Flipper-based, otherwise the emulation part would have suffered, but performance has ramped up non-proportionally to those percents suggested by the clocks.
 
I think because the GC and Wii are so close we will see a lot of GC-developed stuff ported to Wii in the beginning.
Other devs who may have been privy to the actual specs may have gotten a head start developing to the theoretical specs (Retro makes sense). Some devs probably had limited info so played it safe and targeted GC specs.
That's my theory. It just seems like such a waste to just speed bump it. ATI has said it's a new architecture, lest we forget. They have also said E3 was the tip of the iceberg. ATI has no reason to lie. They have enough credibility in this business that if the graphics indeed do not improve on Wii, no one is going to doubt ATI's ability to make great VPUs. Nintendo doesn't seem to be flaunting the graphics, so I don't see them piling the pressure on ATI to make it look better than is true.
 
the fact is, there were quite a few GameCube games that had lower-end graphics than Dreamcast games. Gamecube was 3 to 4 times more powerful than Dreamcast, on average. so looking at certain "poor" graphical aspects of Wii games in no way gives us an idea of what Hollywood is capable of.



I think because the GC and Wii are so close we will see a lot of GC-developed stuff ported to Wii in the beginning.
Other devs who may have been privy to the actual specs may have gotten a head start developing to the theoretical specs (Retro makes sense). Some devs probably had limited info so played it safe and targeted GC specs.
That's my theory. It just seems like such a waste to just speed bump it. ATI has said it's a new architecture, lest we forget. They have also said E3 was the tip of the iceberg. ATI has no reason to lie. They have enough credibility in this business that if the graphics indeed do not improve on Wii, no one is going to doubt ATI's ability to make great VPUs. Nintendo doesn't seem to be flaunting the graphics, so I don't see them piling the pressure on ATI to make it look better than is true.


good argument ninzel. much agreed.




yet another well thought-out post. it gives me hope. With that said, I do agree with Acert93 that there is no evidence that Hollywood is on par with the 4 year old Radeon 9700 Pro (R300).

the R300 is probably between 4 and 8 times more powerful than Flipper. whereas Hollywood is probably around 2x Flipper in raw power (50% extra clock combined with greater efficiency, my speculation), plus Hollywood must have a few new features here and there, but almost certainly will not rival R300 in performance or features.


there's no way Wii would be able to reproduce the realtime Animusic pipedreams demo that R300 did

[image: animusic_demo.jpg]
 
When you read the Iwata interviews of Nintendo hardware and software development engineers, they make mention of the focus on achieving high performance and low power consumption. I would assume they achieved their goal; to Nintendo this aspect is worth keeping close to their chest.

Hollywood being shrouded in secrecy is, I think, more of a marketing tool. It leaves room for the element of surprise. Imagine the reactions of many gamers, particularly those of us who populate these message boards, if 2nd generation titles look considerably more advanced than what has been previously released. Right now, it would seem Nintendo's decision to use over-clocked GCs to start Wii development had its setbacks. Most titles, upon our first look during E3 06, were visually comparable to GC: 16-bit (low resolution) textures, flat lighting, low polygon counts, and the lack of any visible shader tech.

At this stage, we don't know anything. The console is two months away from release. There's something to be said about the fact that Hollywood was the last main component to be completed. Is the amount of main memory still 88 MB? How many pixel pipelines are there? Is there more to the patents about the "Shade Tree" recirculating shader? Did the clockspeeds get a boost? What is the memory bandwidth?

All these things we don't know. So we wait in anticipation of new leaked specs. Matt, our main source of Nintendo info, had specs IGN had to remove.
 
PC999, safe leaking I would assume would be very hard to get away with. I was privy to one leak that Matt at IGN did recently that got pulled in minutes. Someone could easily put up the info, but given Nintendo's reaction to the fake specs in some cases, it seems unlikely that something legit and big would be allowed to stay.

Acert, Broadway isn't a 750CXe, it's a 750CL (GX line), so wouldn't the theory of it being an overclock go out the window? I can't testify for Hollywood, but knowing what I know about Broadway, Nintendo wasn't content in any way to simply keep the specs status quo and go with a speed bump.

As for evidence of Hollywood's ability, I think a company like ATI stating we've only seen the tip of the iceberg is a lot (at least 8 times bigger underneath).
 
yet another well thought-out post. it gives me hope. With that said, I do agree with Acert93 that there is no evidence that Hollywood is on par with the 4 year old Radeon 9700 Pro (R300).

the R300 is probably between 4 and 8 times more powerful than Flipper. whereas Hollywood is probably around 2x Flipper in raw power (50% extra clock combined with greater efficiency, my speculation), plus Hollywood must have a few new features here and there, but almost certainly will not rival R300 in performance or features.


there's no way Wii would be able to reproduce the realtime Animusic pipedreams demo that R300 did

[image: animusic_demo.jpg]


I think you'll be proved wrong. Low Power and high performance.
 
I'll leave you guys a bigger tidbit as to what Nintendo is up to, though it's nothing you couldn't piece together yourselves.

Think of the PSP and the speed it runs at vs what it could
Think of Gekko's speed
Think of how power efficient Broadway is

Things should line up into place, but the final clue is that Nintendo isn't killing one bird with one stone, it's killing two.
 
Well all I know is that Cube's 24 bpp issue was really, really noticeable in RE4. Nothing like dark contrasting stuff to show things like banding. I would hope they've progressed past that by now.

Hell I've been running 32bpp on PC since I got a Radeon LE DDR in 2000! Wait, nm that, I was running 32bpp on my G400 MAX in '99, and even occasionally on my G200 in 1998! :)

Like Acert93 says, it's odd that N didn't go with even an available low-cost PC part. I mean, RV410 was cheap a year ago and for the most part wipes the floor with an R300. We have budget DX9 parts at Newegg with more power than Wii, apparently, going for <$50. It's strange and really cheap of N to do. But, if you look back on N's hardware other than Cube, you see they are typically very cheap on technology. SNES and N64 had so many curtailed areas. N64 especially IMO.
 
Hell I've been running 32bpp on PC since I got a Radeon LE DDR in 2000! Wait, nm that, I was running 32bpp on my G400 MAX in '99, and even occasionally on my G200 in 1998! :)

I was gonna say... G200 at 32bpp (Matrox Millennium) in 1998. Good times.
 
Like Acert93 says, it's odd that N didn't go with even an available low-cost PC part.
It's not odd if the prime concern was hardware BC with GC, although that decision might be considered odd if it sets Wii a few years behind the curve!

ooh-videogames said:
Hollywood and it being shrouded in secrecy, I think is more of marketing tool. It lends room for the element of surprise. Imagine the reactions of many gamers, particularly those of us who populate these message boards, if 2nd generation titles look considerably more advanced then what has been previously released.
That would be...peculiar marketing. 'Let's create hardware that can do this, but won't show it for the first year. Let them see worse games to begin with and yet not tell them that Wii can do better. That's bound to get us sales!' I don't see the sense in that. Tell people what your system can do, and give them every reason to think well of it. Do everything you can to break down reservations about buying it. If someone wants good graphics, and your system can do good graphics, shout about it. Don't hide that for a year to surprise them! What you're suggesting would be the same as releasing screenshots of a game without any AA or texture filtering, and then when the game's released, adding AA and filtering. It'd be a nice surprise for those who bought the game, but by then I expect most people will have stopped paying attention to the jaggie mess you were showing earlier.

To me, the reason for secrecy is that you know those who talk about such things will, comparatively, make a lot of noise about your less impressive technical details. If everyone else is using FP16 HDR, and you can't use HDR, that'd be reason to keep quiet and leave people guessing. Whereas if you have FP16 HDR but don't say so when present titles make it look like your system doesn't support it, keeping that info secret only hurts your image (to that 0.1% who care about such things ;))

Incidentally, that patent talked of YUV colour modes. I'd be very interested if that was a proper NAO32-like colour space supported in hardware. That'd be a real boon for the system IMO, and a first for the industry?
 