The Official G84/G86 Rumours & Speculation Thread

Doesn't have to be a problem. It all depends on pricing, and on whether AMD can get RV670 out fast enough; it's now scheduled for Q3.

Personally I don't think AMD intends for RV630 to go up against G84, seeing as we've been hearing about some name-shuffling (e.g. RV630 -> X2400 instead of X2600).

It's a huge problem: we're still in Q1, NV's performance cards are coming out in mid-Q2, and DX10 games are coming before Q3 too.
Sorry, but I don't believe what they say anymore. AMD needs to prove something: a good series of cards released on time with good price/performance. Then I'll trust them again.
After the biggest hype ever, I don't expect this.
 
Better that they position and price it according to its performance than repeat what happened with the X1600.

Though if they still can't compete head to head at $199-$249 when they've got an apparent 65nm vs 80nm advantage, that would most certainly not be a good thing. A process advantage should actually matter *more* in midrange and low-end parts than in high-end ones, because price-point flexibility and margins are lower in those segments.
 
I understand this, but why do they make the same mistakes again? They always (the last two times) miscalculate and underestimate NV's mainstream/performance cards; this rename story (if true) from X2600 -> X2400 shows that I'm right.

After the biggest hype ever, I don't know what more I can say if these things come true. :cry:
 
Well, you'd expect NVIDIA to have a new product to compete in the upper midrange by then too, positioned above G84, since the gap between G80 and G84 is pretty damn huge. The large amount of redundancy in the G80 GTS helps close it a bit, but it can't work miracles either.

Still, assuming RV610 costs much less than G86, NVIDIA has nothing to compete in that market segment except G72 and any potential 80nm or 65nm variants of that, which isn't exactly feature-competitive with UVD+DX10. We'll see. I'm really under the impression that NVIDIA doesn't care a lot about the super-low-end discrete DX10 market, for the very simple reason that management cares a lot about margins.

This might seem ridiculous, but from my point of view, NVIDIA seems to make certain decisions very much based on how the financial community would perceive them, rather than on how much gross profit they would deliver. And considering analysts nowadays tend to focus a lot on margins as an indicator of how healthy a business is, it only makes sense for NVIDIA to target these segments if it can keep respectable margins. In the end, this could leave AMD room to make some very nice profit there, though.


Just curious, but how are you connecting the dots that NVIDIA is making decisions to please the financial community? I don't think there is any logic at all there.
 
There might be a variety of factors involved in any such decision. However, I would tend to believe the financial community is one of those factors. How do you believe the financial community would react to margins dropping back down to 40% or below?

NVIDIA has already mentioned in conference calls that they were not interested in capturing the lowest parts of the market through sheer pricing pressure. I'm not inventing that, but I am speculating on the reason behind the decision. Without some extra insider perspective, it is indeed very difficult for me to be confident in that speculation, obviously...
 
Companies have finite resources. Every project has a cost, and there are opportunity costs as well; I don't think there are NVIDIA engineers sitting around idle. The question for NVIDIA is which projects will deliver the highest return on investment, though obviously there are long-term strategic factors to consider too. Margins are one of the two fundamental components of profitability; the other, obviously, is revenue. NVIDIA's financial success has allowed it to significantly boost headcount and hence investment rates: more and more projects.
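
To put rough numbers on that margins-versus-gross-profit tension, here's a minimal back-of-envelope sketch in Python. Every figure in it is hypothetical, picked only to show how entering a thin-margin segment can raise total gross profit while still dragging the blended margin down, which is exactly what a margin-watching analyst would flag:

```python
# Back-of-envelope: adding a low-margin segment raises total gross profit
# but lowers the blended margin. All figures are hypothetical.

core_revenue = 700e6     # existing business, per quarter (made up)
core_margin = 0.45       # 45% gross margin

lowend_revenue = 150e6   # hypothetical low-end discrete DX10 segment
lowend_margin = 0.25     # thin margin from pricing pressure

gross_profit_before = core_revenue * core_margin
gross_profit_after = gross_profit_before + lowend_revenue * lowend_margin
blended_margin = gross_profit_after / (core_revenue + lowend_revenue)

print(f"Gross profit: ${gross_profit_before/1e6:.1f}M -> ${gross_profit_after/1e6:.1f}M")
print(f"Blended margin: {core_margin:.1%} -> {blended_margin:.1%}")
# Gross profit: $315.0M -> $352.5M
# Blended margin: 45.0% -> 41.5%
```

So gross profit goes up, but the headline margin number analysts watch goes down, which is consistent with management declining to chase the segment.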
 
Who cares about HDCP/HDMI?
The average user cares about speed, IMHO.
Besides, even without "native" in-chip support, what's stopping them from offering an external solution?
 
Because these are the parts which are more likely to end up in an HTPC than their watt-guzzling larger brothers.
Yeah, tell me that the HTPC is the best-selling PC in the world...
How much more expensive would an external solution be? $5, $10?

Does anyone see OEM machines with S3 chips? They're low-power, compatible with all kinds of mumbo-jumbo, and?
If a product gets a reputation for being "bad, slow, late to market", it will always be less attractive.
 
That's not the point. You are trying to minimize the importance of HDMI/HDCP to the HTPC crowd when in fact it is as important as, or even more important than, gaming performance. You're also assuming that R6xx-based parts will be too "slow" for gaming at common HDTV resolutions, which is highly doubtful.
 
Maybe not R6xx, but the direct G84/G86 competitors are the RV6xx products, and I doubt any RV610 (perhaps even RV630) will be used for "serious" gaming at 1920x1080 (progressive scan, of course)...
So I can definitely understand why they only pushed for a mandatory HDCP key on the top-end model (8600 GTS).


Let's face it, most HDTVs have at least a component or D-Sub VGA connector (where I live, even most low-end, el-cheapo/no-name 26- and 32-inch LCD HDTVs have at least HDMI, VGA, and component), and most sub-$129 graphics cards have one or all of them (my old NV43-based 6200 PCIe has both, plus DVI).
"Low end" is about having the lowest possible upfront cost, and including an HDCP key is not a cost-effective measure, especially when the ICT flag isn't something we'll see enforced any time soon, unless the Sonys or Toshibas of the world are suicidal and don't want their respective HD optical disc formats to become popular enough to replace DVDs in consumers' minds.
And the HDCP key is mostly a fixed licensing fee per board, something that doesn't "go away" with process shrinks.

Getting back to my PCIe NV43 6200 (something of a rarity with a full 128-bit bus, purchased brand new for $39 just two years ago...): it has PureVideo and a component adapter, supports HDTV up to 720p/1080i over component (it can reach 1920x1200 via D-Sub), and is Vista Premium ready.
I can watch Blu-ray movies with it (via VGA), and I've also used it with a borrowed X360 HD-DVD drive, which worked flawlessly.
For the price at the time, it was a real bargain.


My point is, HDCP is not a required "feature" just yet, and by the time it's actually needed, we'll be able to buy an HDCP-enabled low-end card with much more 3D and video decoding/encoding performance than today's.
Of course, I have this "buy it when I need it" attitude, but I think it also applies to the high end. I really wouldn't care whether an 8800 GTS had DX10 compared to the X1950 XTX, because both are about equal in performance in DX9 games, and real DX10 software is still months away (not to mention the unknowns about how fully matured DX10 games will run on first-generation hardware, or whether they'll need to fall back to DX9 due to speed issues).
 
Yeah, meant RV6xx. And by common HDTV res I was thinking of 720p, which should be cake for that class of hardware.

I just don't think the HTPC crowd is too concerned with gaming performance. I mean, if you make all the sacrifices required to build/buy an HTPC in the first place, then gaming performance can't be that high on your list of priorities. IMO, HTPCs are moving ever closer to being STBs, and in that space features are paramount.

In terms of "buy it when I need it", sure, you can upgrade your HTPC card in the future, but I just think that crowd goes for the package and doesn't have the same piecemeal, short-term mentality that most of us adopt in the discrete GPU market. And what about OEMs: don't you think they want to be able to advertise HDCP/HDMI capability? I know that was a big deal for me when I bought my receiver recently.
 
But OEMs have the ability to include HDCP in their bulk GPU orders.
Just look at all those early Sony Vaio laptops with an HDCP-enabled GeForce 6600 and a Blu-ray drive, or the Toshibas with the same GPU decoding from an HD-DVD drive.
In that case, at least the option to include the key is there, should the OEMs decide they need it.

In a typical mid-sized business PC buying spree, would you rather have to pay for the key, or have the option to purchase, say, 1000 units without it?
If the card without it is even 50 cents cheaper, that's 500 bucks saved right there, simply by not including a feature that will never be used with MS Office or Vista Business, for instance.
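
Just to make the math explicit, here's a tiny sketch (Python). The $0.50 fee and the BOM figures are hypothetical, picked only to illustrate both the bulk savings and the earlier point about fixed fees not shrinking with the process:

```python
# Why a fixed licensing fee matters more at the low end: die cost falls
# with a process shrink, the per-board fee does not. All figures are
# hypothetical, for illustration only.

HDCP_FEE = 0.50  # hypothetical fixed fee per board

def board_cost(die_cost: float, other_bom: float, license_fee: float) -> float:
    """Total per-board cost: GPU die + rest of the bill of materials + fee."""
    return die_cost + other_bom + license_fee

before_shrink = board_cost(die_cost=12.0, other_bom=20.0, license_fee=HDCP_FEE)
after_shrink = board_cost(die_cost=8.0, other_bom=20.0, license_fee=HDCP_FEE)

# The fee's share of total board cost grows as everything else shrinks:
print(f"{HDCP_FEE / before_shrink:.1%} -> {HDCP_FEE / after_shrink:.1%}")  # 1.5% -> 1.8%

# And the bulk-order savings from omitting it scale linearly:
print(f"1000 units: ${1000 * HDCP_FEE:.0f} saved")  # the $500 from the example above
```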
 
On an HTPC you can use integrated video. That's lower cost and lower power.

To my knowledge, there are currently no IGPs with hardware-assisted video decoding capabilities above 720p/1080i.
Not to mention it would steal main memory capacity and bandwidth.
At least with a discrete card, the 3D performance is enough to play a few not-so-old games at reasonable settings.
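
For a rough sense of the bandwidth cost, here's a back-of-envelope sketch (Python). The traffic-per-frame multiplier is a loose assumption, since real decoders and scan-out behave differently:

```python
# Rough estimate of the shared-memory traffic 1080p decode imposes on an
# IGP with no local framebuffer. The traffic multiplier is an assumption
# (reference-frame reads, write-back, display scan-out, etc.).

width, height, fps = 1920, 1080, 30
bytes_per_pixel = 1.5                 # 4:2:0 planar YUV
frame_bytes = width * height * bytes_per_pixel

traffic_multiplier = 4                # assumed reads+writes per decoded frame
traffic_gb_s = frame_bytes * traffic_multiplier * fps / 1e9

print(f"~{traffic_gb_s:.2f} GB/s of main-memory traffic")  # ~0.37 GB/s

# Single-channel DDR2-667 is ~5.3 GB/s theoretical (much less sustained,
# and shared with the CPU), so that's a noticeable bite before any 3D work.
```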
 
But on the other hand, pretty much any low-end CPU these days can handle H.264 at 720p without any real problems. I suppose if you're using your HTPC as a PVR, you'll probably need a dual-core processor at present, however.

The sooner ATI's UVD (or an NVIDIA equivalent) is integrated into low-end GPUs (and IGPs), the better, as far as I'm concerned.
 
I'll give you my example.
My media center is home-made.
I knew a dual-core would be a key component, not because it would be faster, but because it would be more versatile.
So I got a great deal on a used Pentium D 805 and combined it with a Pinnacle Dual Hybrid TV card (PCI Express x1).
This card has some very interesting features.
It supports PiP, dual analog and/or DVB-T TV, analog and/or DAB radio, hardware MPEG-2 and DivX capture, time-shifting, etc., and it's Vista Premium ready.

The mere fact that it has two cores means I can encode a stream to MPEG-2/DivX while watching/time-shifting another without hiccups.
Actually (with the included Pinnacle software), I only need a single coaxial cable connected (it has two plugs available) to get PiP working.
This is not something I would recommend to anyone on a single-core system, much less with an integrated GPU.

I then purchased an X10-compatible RF remote control from Toshiba and a WiFi/BT card, and I'm using an old Audigy 2 ZS for sound (a temporary situation).
In the future, I plan to replace the PD 805 with an existing E6600, double the RAM to 2GB, and add two or three more hard drives.

Again, this is just my personal experience, but the bottom line is that hardware component choice needs to be balanced for this particular kind of system.
An Intel G35, NVIDIA 7050, or AMD 690 would definitely not be my first choice, even with their slightly lower power requirements in mind.
 