nVidia adopting 3Dc?

digitalwanderer

I saw this story posted by Hanners on EB's frontpage this morning and hadn't seen it here yet:

Hanners at Elite Bastards said:
3Dc was always an open source technology, so seeing NVIDIA jump on it at the first available opportunity is no great surprise, but it's good news for us consumers.

3Dc is a new compression marchitecture meant to increase the level of detail in games. All ATI cards from the X800 generation upwards have this capability, and it turns out that new Nvidia cards are going to use it as well.

We believe the upcoming G70 already has this support, as the chip has taped out. It is the first compression technique optimised for normal maps, allowing per-pixel control over light reflection from a textured surface. In other words, it can look very good.

The Inquirer has the full story.
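
For context, "per-pixel control over light reflection" just means each texel of a normal map carries its own surface normal, and lighting is evaluated per pixel from it. Here's a minimal CPU-side sketch in C++ of that idea; a real game would do this in a pixel shader, and the unpacking and Lambertian math below are the textbook version, not anything specific to 3Dc or a particular engine:

// Minimal sketch: per-pixel lighting driven by a normal map.
// Instead of one normal per vertex, each texel supplies its own normal.
#include <algorithm>
#include <cstdint>

struct Vec3 { float x, y, z; };

// Unpack an 8-bit-per-channel normal-map texel from [0, 255] to [-1, 1].
Vec3 unpack_normal(uint8_t r, uint8_t g, uint8_t b) {
    return { r / 127.5f - 1.0f, g / 127.5f - 1.0f, b / 127.5f - 1.0f };
}

// Lambertian diffuse term for one pixel: N dot L, clamped to zero.
// Both vectors are assumed to already be normalized.
float diffuse(const Vec3& n, const Vec3& light_dir) {
    float d = n.x * light_dir.x + n.y * light_dir.y + n.z * light_dir.z;
    return std::max(d, 0.0f);
}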
 
It is the first compression technique optimised for normal maps, allowing per-pixel control over light reflection from a textured surface. In other words, it can look very good.


Doesn't that just translate to: it's a normal map, but compressed?

They make it sound like they've never seen anything like it before.
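
For the curious, that's essentially right: a 3Dc block stores only the X and Y components of the normal as two independently compressed channels, each using (as far as I know) the same two-endpoint, 3-bit-index layout as the DXT5 alpha block, and the shader rebuilds Z. A rough decoding sketch in C++, purely illustrative; the function names are made up and real hardware does this in the texture units:

// Decode one 8-byte 3Dc channel block: two 8-bit endpoints followed by
// sixteen 3-bit palette indices for a 4x4 tile. A full 3Dc block is two of
// these, one for the X channel and one for the Y channel of the normal.
#include <algorithm>
#include <cmath>
#include <cstdint>

void decode_3dc_channel(const uint8_t block[8], uint8_t out[16]) {
    const uint8_t c0 = block[0], c1 = block[1];
    // Build the 8-entry palette from the two endpoints.
    uint8_t palette[8];
    palette[0] = c0;
    palette[1] = c1;
    if (c0 > c1) {
        for (int i = 1; i < 7; ++i)                       // 6 interpolated values
            palette[i + 1] = (uint8_t)(((7 - i) * c0 + i * c1) / 7);
    } else {
        for (int i = 1; i < 5; ++i)                       // 4 interpolated values
            palette[i + 1] = (uint8_t)(((5 - i) * c0 + i * c1) / 5);
        palette[6] = 0;
        palette[7] = 255;
    }
    // The remaining 6 bytes hold sixteen 3-bit indices, little-endian.
    uint64_t bits = 0;
    for (int i = 0; i < 6; ++i)
        bits |= (uint64_t)block[2 + i] << (8 * i);
    for (int i = 0; i < 16; ++i)
        out[i] = palette[(bits >> (3 * i)) & 0x7];
}

// Z isn't stored at all; the shader rebuilds it from X and Y, which is why
// 3Dc really is "a normal map, but compressed" down to two channels.
float reconstruct_z(float x, float y) {                   // x, y in [-1, 1]
    return std::sqrt(std::max(0.0f, 1.0f - x * x - y * y));
}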
 
Somewhere in the last week or so I read that NV told somebody that NV40 already supports 3Dc... but that they can't call it that, because ATI owns the moniker.

Yeah, I know that a few more details would be nice, but that's all I remember. :oops:
 
I don't believe ATi owns the term "3Dc"; I believe it's open source.

They might not have wanted to use the term though... ;)
 
NV4x can use the 3Dc format, but it takes twice the memory footprint compared to the X800, which kinda defeats the purpose of 3Dc.
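
The factor of two falls straight out of the texel sizes, assuming (as described later in the thread) that the NV4x driver expands 3Dc into an uncompressed 8-bit two-channel format such as V8U8. A back-of-the-envelope sketch in C++ for a hypothetical 1024x1024 normal map:

// Rough size comparison: 3Dc vs. an expanded two-channel 8-bit format.
#include <cstdio>

int main() {
    const int w = 1024, h = 1024;
    // 3Dc: each channel is 4 bits per texel (8 bytes per 4x4 block),
    // two channels -> 1 byte per texel.
    const long long size_3dc  = (long long)w * h * 1;   // 1 MiB
    // Uncompressed two-channel 8-bit (e.g. V8U8): 2 bytes per texel.
    const long long size_v8u8 = (long long)w * h * 2;   // 2 MiB
    std::printf("3Dc: %lld bytes, expanded V8U8: %lld bytes (%.1fx)\n",
                size_3dc, size_v8u8, (double)size_v8u8 / size_3dc);
}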
 
digitalwanderer said:
The hardware could support it, but nVidia might not have exposed the capability in their drivers yet.

Ah, that old dodge. I hate that straddle. Leads to lots o' frustrated forum posts. Better to keep your mouth shut until you support it rather than fart around with that.
 
geo said:
digitalwanderer said:
The hardware could support it, but nVidia might not have exposed the capability in their drivers yet.

Ah, that old dodge. I hate that straddle. Leads to lots o' frustrated forum posts. Better to keep your mouth shut until you support it rather than fart around with that.


Yeah, well, in the end they'd just be shifting the tomato-targets from their hardware staff to their driver staff.
From "we have crap hardware" to "we have crap drivers": either way, NVIDIA still ends up playing the part of the idiot.
 
london-boy said:
geo said:
digitalwanderer said:
The hardware could support it, but nVidia might not have exposed the capability in their drivers yet.

Ah, that old dodge. I hate that straddle. Leads to lots o' frustrated forum posts. Better to keep your mouth shut until you support it rather than fart around with that.


Yeah, well, in the end they'd just be shifting the tomato-targets from their hardware staff to their driver staff.
From "we have crap hardware" to "we have crap drivers": either way, NVIDIA still ends up playing the part of the idiot.

Yeah, but in my experience observing the scene, nothing drives normally sane folks batty over time (let's not even get into the effect on the marginally sane and the outright flaky!) like being told a feature they'd like to have is supported by the hardware but just hasn't been turned on in the drivers yet. Seen it many, many times.
 
NV40 doesn't support 3Dc in hardware; the driver converts it to an uncompressed two-channel format. That means a game can use it, but it would be better if the game used the two-channel format in the first place to avoid the compression artifacts.

It's quite possible that upcoming chips from NVidia have hardware support, though.

The first format optimized for normal maps was CxV8U8.
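
For anyone wondering what the fallback looks like from the application side, here is a D3D9-era sketch: the 'ATI2' FOURCC and the D3DFMT_V8U8 / D3DFMT_CxV8U8 enums are real identifiers from that API, but the probe order and the pick_normal_map_format helper are purely illustrative. Note that a driver can report the 3Dc FOURCC as supported and still expand it internally (as described above for NV40), so this check says nothing about the actual memory footprint:

// Sketch: probe for 3Dc ('ATI2' FOURCC) and fall back to an uncompressed
// two-channel format for the normal map.
#include <d3d9.h>

const D3DFORMAT FOURCC_ATI2 = (D3DFORMAT)MAKEFOURCC('A', 'T', 'I', '2');

D3DFORMAT pick_normal_map_format(IDirect3D9* d3d, UINT adapter,
                                 D3DFORMAT adapter_format) {
    if (SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL,
                                         adapter_format, 0,
                                         D3DRTYPE_TEXTURE, FOURCC_ATI2)))
        return FOURCC_ATI2;        // real (or driver-emulated) 3Dc
    if (SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL,
                                         adapter_format, 0,
                                         D3DRTYPE_TEXTURE, D3DFMT_CxV8U8)))
        return D3DFMT_CxV8U8;      // signed two-channel, Z derived
    return D3DFMT_V8U8;            // plain signed two-channel
}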
 