Nvidia Turing Product Reviews and Previews (Super, Ti, 2080, 2070, 2060, 1660, etc.)

From what I heard, it has no impact on clock speed potential.

I recently bought an RTX 2080 FE, and I've been following all the reviews, as I was curious about this. I had an MSI RTX 2080 Gaming Trio on preorder at Amazon, and it was even on sale for the same price as an FE, but at the last minute I went with the Founders Edition (I grabbed the last one in stock at my local Best Buy) when I saw how things were breaking regarding the upper limits of the GPU. I was psyched for the Trio, though I was put off a little by its weight. It even comes with a support brace, though supposedly the card is so stiff it doesn't sag even without the brace. But my not having one of those new-fangled steel-reinforced PCI-E slots was on my mind.

Anyway, as much as I lusted after the Trio, with its two eight-pin power connectors and extra phases vs. the FE's configuration, I saw the writing on the wall with the early reviews. And sticking with the FE meant I could confidently continue to pass along my cards to a good friend, who also lacks steel-reinforced PCI-E slots, after I upgrade.

Here's the Guru3D review that demonstrates what I mean. It's for the 2080 Ti Gaming X Trio, but it's analogous to the 2080 Gaming Trio: same beefed-up design, same increase in power consumption, same lack of any real difference.

Power consumption: https://www.guru3d.com/articles_pages/msi_geforce_rtx_2080_ti_gaming_x_trio_review,7.html

Overclocking: https://www.guru3d.com/articles_pages/msi_geforce_rtx_2080_ti_gaming_x_trio_review,28.html

I'm too lazy to link to other reviews, but what I saw indicated the same thing. Guru3D has already done several, and they're not out of line with the others I've read elsewhere. Somehow the FE cards boost/overclock as well as the best AIB cards. They're a little noisier when matching those cards' excellent cooling, but not bad at all noise-wise, and the cooling is good enough to sustain overclocks that match Nvidia's partner cards.

It's like there are several hard limits in the GPU's BIOS (or maybe the chip design itself, or both), and Nvidia's board circuitry was designed to fully tap those out. I use a UPS, and it has monitoring software that I was running on a second monitor while benchmarking. FWIW, it appears that even when I maxed out all the sliders, my 2080 FE wasn't using all the power available to it. It takes a lot of gaming to make it droop below 2040 MHz, to around 2025 IIRC, and that drop is temporary. I run the chip close to its limit, but not all the time, as I use V-Sync and that gives it some breathing room. When the card is hot and I run multiple benchmarks (V-Sync off), it will droop down to 2025 MHz more often. But I don't run the fans maxed out, though I do use a custom fan profile that's more aggressive than the default one. Fan speeds will hit 84% after an hour of gaming, but temps are kept below 75C.
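If anyone wants to sanity-check this sort of observation without UPS software, here's a minimal sketch that polls nvidia-smi for the SM clock, board power draw, and temperature once a second. It assumes nvidia-smi is on your PATH and uses its standard query fields; run it alongside a benchmark and watch how close the card sits to its limits.

```python
# Minimal sketch: poll nvidia-smi once per second and print the SM clock,
# board power draw, and GPU temperature. Ctrl-C to stop.
import subprocess
import time

QUERY = "clocks.sm,power.draw,temperature.gpu"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(1)
```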

I got Mafia III with all its DLC for a bargain price, and I've been pounding away at that using a custom resolution of 3200 x 1800. I've got AA either on low, or off if the Nvidia Control Panel is doing what I set it to; I enabled Nvidia's FXAA as a substitute. Everything else is maxed out, including 16x aniso in the control panel. It runs at 60 fps 99% of the time, though the card is close to its limit most of the time. I'm coming from a 1070 FE, so this is a very nice bump in performance, even though my 1070 was a surprisingly decent overclocker. Lol, it's now sitting in my friend's PC, and she's loving that she could bump Monster Hunter: World up to its maximum settings at 1080p.
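For a sense of how demanding that custom resolution is, here's some quick back-of-the-envelope arithmetic (my own numbers, just comparing raw pixel counts):

```python
# Rough pixel-count comparison: 3200x1800 sits between 1440p and 4K.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "3200x1800": (3200, 1800),
    "4K": (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>9}: {px / 1e6:5.2f} MP ({px / base:.2f}x 1080p)")
```

So it's pushing about 2.8x the pixels of 1080p, or roughly 70% of a 4K display's, which lines up with the card sitting near its limit at 60 fps.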

Edit: With my air conditioning off, and after gaming for a while, the GPU clock does dip down to 2025 MHz. The GPU temp hits the mid-70s, but I don't run the fans all out; they stay around 84%, or lower if the card isn't straining to hit 60 fps, or often can't maintain it anyway.

And another thing: it appears Rise of the Tomb Raider hits my system hard when it comes to power use, and arguably does use every drop of power Nvidia will let it have. I'm using an EVGA SuperNOVA P2 650 W Platinum PSU that Nvidia spooked me into upgrading to, lol, and my UPS's software reports the system sometimes drawing over 400 watts from the wall (though usually 380 watts or less). I have an i7 6700 (non-K), and my system at this moment is only using 55 watts while I'm typing this. This PSU should be close to 92% efficient when I'm gaming, so 400 watts at the wall translates to about 370 watts actually coming out of the PSU.
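The conversion is just wall draw times efficiency; a trivial sketch of the arithmetic (the 92% figure is my assumption from the P2 650's Platinum rating at this sort of load):

```python
# Back-of-the-envelope: DC watts delivered by the PSU for a given wall draw.
def dc_output(wall_watts: float, efficiency: float = 0.92) -> float:
    return wall_watts * efficiency

for wall in (380, 400):
    print(f"{wall} W at the wall -> ~{dc_output(wall):.0f} W from the PSU")
# 380 W -> ~350 W, 400 W -> ~368 W
```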

Afterburner, however, reports I'm usually a bit below the power limit of 124%. So take my impressions with a grain of salt, and bear in mind the software I'm measuring with may just be providing rough estimates of what's going on.

This card is a delight though. After adjusting my case fans and my card's fan curve, I can run games all out, mute the volume, and the noise from my case is just a mellow background hum. The PSU fan is set to hybrid mode and is usually, if not always, off.
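For anyone new to custom fan curves: it's just a piecewise-linear mapping from GPU temperature to fan speed. A sketch of the idea (the curve points are illustrative, not my actual Afterburner profile):

```python
# Sketch of a custom fan curve: linear interpolation between (temp C, fan %)
# points. These points are made up for illustration.
CURVE = [(40, 30), (60, 50), (70, 70), (75, 84), (85, 100)]

def fan_speed(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at 100% above the last point

print(fan_speed(74.0))  # ~81%, just below the 75 C mark
```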

Anyway, so yeah, maybe the AIB partner cards, with a BIOS like the one EVGA released that breaks Nvidia's power limit, can outperform the FE cards. Maybe. There's one other factor, and that's the voltage limit, and going by what that Nvidia head engineer said, I'm guessing we may not see vendors getting approval to push those limits. Afterburner shows it's the power limit that's constantly getting hit. But even with the voltage limit set to its maximum allowable adjustment, it too still occasionally gets hit. The question now is: will increasing the power limit mean the voltage limit starts getting hit instead, and thus becomes the new limiting factor?
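You can also watch which limiter is active without Afterburner. Here's a minimal sketch using nvidia-smi's throttle-reason query fields (again assuming nvidia-smi is on the PATH). Note the voltage cap isn't exposed as its own field here; tools like Afterburner and GPU-Z read the "voltage limit" perfcap through a different API.

```python
# Minimal sketch: log which throttle reasons the driver reports as active.
# sw_power_cap corresponds to the power-limit throttling discussed above.
import subprocess
import time

FIELDS = ",".join([
    "clocks_throttle_reasons.sw_power_cap",
    "clocks_throttle_reasons.sw_thermal_slowdown",
    "clocks_throttle_reasons.hw_slowdown",
])

for _ in range(60):  # one sample per second for a minute
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(1)
```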

I guess we'll get an indication of what's what once cards with EVGA's new BIOS (which should work with other cards) start getting put to use. In the EVGA forum it was reported, IIRC, that EVGA will switch to shipping cards with this new BIOS.

I'm guessing Nvidia was getting complaints and relented on the power limit.
 
https://techreport.com/blog/34136/g...-begin-playing-nice-with-tr-freesync-monitors

FreeSync apparently works natively with the latest drivers (411.70) on both Turing and Pascal.


Update:

Update 9/30/18 3:22 AM: After further research and the collection of more high-speed camera footage from our G-Sync displays, I believe the tear-free gameplay we're experiencing on our FreeSync displays in combination with GeForces is a consequence of Windows 10's Desktop Window Manager adding some form of Vsync to the proceedings when games are in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards. Pending a response from Nvidia as to just what we're experiencing, I'd warn against drawing any conclusions from our observations at this time and sincerely apologize for any misleading conclusions we've presented in our original article. The original piece continues below for posterity.
 

Yeah this doesn't work:

[attached screenshot: no_freesyncseijv.jpg]
 
 
https://techreport.com/blog/34136/g...-begin-playing-nice-with-tr-freesync-monitors

FreeSync apparently works natively with the latest drivers (411.70) on both Turing and Pascal.

Although it may be more Windows than NV behind this.

Update 9/30/18 3:22 AM: After further research and the collection of more high-speed camera footage from our G-Sync displays, I'm confident the tear-free gameplay we're experiencing on our FreeSync displays in combination with GeForces is a consequence of Windows 10's Desktop Window Manager adding its own form of Vsync to the proceedings when games are in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards. Pending a response from Nvidia as to just what we're experiencing, I'd warn against drawing any conclusions from our observations at this time and sincerely apologize for the misleading statements we've presented in our original article. The original piece continues below for posterity.

Whoops, I should have read further in the thread. :p

That said, it would be fantastic if NV started officially supporting VESA Adaptive-Sync (no need to use AMD's branding for it, FreeSync).

If they did, that might stop me from getting an AMD card for my next upgrade. Because I can assure you, if LG follows Samsung and starts releasing Adaptive-Sync TVs, especially OLED ones, I'm never buying a video card that doesn't support Adaptive-Sync ever again.

Regards,
SB
 
I still have a silly hope that HDMI 2.1 will force Nvidia's hand in the matter. I just can't justify it otherwise, myself. I would rather get a new bike frame.
 
But if they support FreeSync, they're killing G-Sync as we know it. Then again, they could "kill" RTG by taking back the gamers who are only staying with AMD because of FreeSync (hello there)...
 
But if they support FreeSync, they're killing G-Sync as we know it. Then again, they could "kill" RTG by taking back the gamers who are only staying with AMD because of FreeSync (hello there)...

If G-Sync was as good as they claim it is, then they wouldn't have to kill it, now would they?

G-Sync is like their own special form of HD DVD, and they won't let go. People keep buying it, so they don't really have any reason to drop it anyway, so yeah :p
 
So Kyle and Brent got their hands on an RTX 2070, and they just put up a review.

https://www.hardocp.com/article/2018/10/14/msi_geforce_rtx_2070_gaming_z_performance_review/2

The guy they kicked out for calling out GPP is the first one with a full-blown review on his site, almost 48 hours before everyone else.
Oh, the irony... Salt must be raining at Nvidia headquarters.



Regardless, if the RTX 2070 comes in at its MSRP, then this is the Turing GPU worth getting. Head and shoulders above the GTX 1080, especially in compute-intensive games.
 
Regardless, if the RTX 2070 comes in at its MSRP, then this is the Turing GPU worth getting. Head and shoulders above the GTX 1080, especially in compute-intensive games.
Discussion of value based on MSRP is irrelevant, and I really hope you don't actually expect that price. The 2070 falls in line where expected, just above the 1080, but it will be selling for $600-700. There's really no difference between this and the 2080 in terms of whether it has any value or not, since there's nothing to quantify the additional resources spent on RTX yet.
 
What did I miss? Aren't the Turing cards being sold at MSRP at the moment?
MSRPs:

2070 $499
2080 $699
2080 Ti $999

Now you tell me whether they're being sold at MSRP :)

With Pascal pricing, Nvidia wanting to still sell Pascals, a lack of competition in that market segment, and Nvidia being who they are, it's going to be a long time until Turing sells at those PR bullet-point prices.
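For what it's worth, here's what that premium works out to, using the MSRPs above and the $600-700 street range quoted earlier for the 2070 (rough arithmetic, nothing more):

```python
# Rough premium over MSRP for the 2070, using the figures quoted above.
msrp_2070 = 499
for street in (600, 700):
    print(f"${street}: {street / msrp_2070 - 1:.0%} over MSRP")
# $600: 20% over MSRP, $700: 40% over MSRP
```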
 