Swap PS5 for an RTX 4070?

Question: the Zotac Super Trinity OC 4070 Ti Super comes in two editions. The difference is that the white one is £20 more expensive and clocked 15 MHz higher. The black one comes with a 2x8-pin to 12VHPWR adapter, but the white one comes with a 3x8-pin to 12VHPWR adapter.
Which makes me wonder: is the black one borderline on power delivery, if the same card clocked 15 MHz higher needs 3x8-pin?

Another thing that's confusing me: I looked at the specs for the normal 4070
[attachment: 1705083715148.png]

Specs for the 4070 Ti Super
[attachment: 1705083811824.png]
 
Both of those cards list 285W TGP. A 2x8-pin adapter should be good for 300W from the connectors alone, with a comfortable safety margin above that once slot power is counted. A good PSU with thick cables could probably do it with a single 8-pin connector, but I'm not suggesting anyone try that.
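The arithmetic behind that margin can be sketched out. The 150 W and 75 W figures are the PCIe specification limits, not measurements of any particular card:

```python
# Rough power-budget check for a 285 W card fed by a 2x8-pin adapter.
# Connector figures are the PCIe specification limits.
SLOT_W = 75        # max draw from the PCIe x16 slot
EIGHT_PIN_W = 150  # max per 8-pin PCIe power connector
TGP_W = 285        # Total Graphics Power of the 4070 Ti / 4070 Ti Super

budget = SLOT_W + 2 * EIGHT_PIN_W  # total power available to the card
print(budget, budget - TGP_W)      # 375 W budget, 90 W of headroom
```

So even if the card pulled nothing from the slot, the two 8-pin cables' 300 W still clears 285 W; with the slot included there is roughly 90 W of margin.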
 
Am I right in thinking the card also draws power from the PCIe slot?
The slot can deliver up to 75W. I've read that NVIDIA cards don't pull much power from the slot anymore. That said, I just tested, and under FurMark my 4070 draws ~60W from the slot. This is because my 4070 has only a single 8-pin connector and a 200W TDP. Some 4070 cards have 12+4-pin connectors, and all NVIDIA cards 4070 Ti and above do as well. The Internet says those cards don't draw much from the slot. If anyone has a 4070 Ti or above, it'd be cool if you could test this. Everything you need is in FurMark (GPU stress test and GPU-Z); you can find PCIe Slot Power under Sensors in GPU-Z.
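For anyone without GPU-Z handy: GPU-Z is the tool that breaks power out per rail (slot vs. connectors), but total board power can at least be read from nvidia-smi and parsed like this. The `sample_output` string is a made-up reading standing in for the live query:

```python
# nvidia-smi reports total board power, not the per-rail split GPU-Z shows.
# A live query would be:
#   nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader,nounits
# sample_output below is a hypothetical reading under FurMark load.
sample_output = "198.4, 200.0"

draw, limit = (float(x) for x in sample_output.split(","))
print(f"{draw:.0f} W of {limit:.0f} W ({draw / limit:.0%} of power limit)")
```

If the card sits pinned at its power limit under FurMark, that is expected; FurMark is a worst-case load.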
 
Yeah, I still expect the 4070 Super to be closer to the 4070 Ti than to the 4070, but it will be interesting to see how much it loses from having 25% of the cache removed. With wafer prices so high, it's an odd decision to include so much cache that isn't necessary and now won't appear in any product (the 4070 Ti is EOL).
NVIDIA has updated the spec sheet for the 4070 Super: it has the full 48MB L2 (y)
 
Anyone have any info on the Supers not being VR ready (whatever that means)?
It's just a mistake. From the NVIDIA website:

GeForce RTX 4070 Family

| | GeForce RTX 4070 Ti SUPER | GeForce RTX 4070 Ti | GeForce RTX 4070 SUPER | GeForce RTX 4070 |
|---|---|---|---|---|
| GPU Engine Specs: | | | | |
| NVIDIA CUDA® Cores | 8448 | 7680 | 7168 | 5888 |
| Boost Clock (GHz) | 2.61 | 2.61 | 2.48 | 2.48 |
| Base Clock (GHz) | 2.34 | 2.31 | 1.98 | 1.92 |
| Memory Specs: | | | | |
| Standard Memory Config | 16 GB GDDR6X | 12 GB GDDR6X | 12 GB GDDR6X | 12 GB GDDR6X |
| Memory Interface Width | 256-bit | 192-bit | 192-bit | 192-bit |
| Technology Support: | | | | |
| Ray Tracing Cores | 3rd Generation | 3rd Generation | 3rd Generation | 3rd Generation |
| Tensor Cores | 4th Generation | 4th Generation | 4th Generation | 4th Generation |
| NVIDIA Architecture | Ada Lovelace | Ada Lovelace | Ada Lovelace | Ada Lovelace |
| NVIDIA DLSS | DLSS 3.5: Super Resolution, DLAA, Ray Reconstruction, Frame Generation | DLSS 3.5: Super Resolution, DLAA, Ray Reconstruction, Frame Generation | DLSS 3.5: Super Resolution, DLAA, Ray Reconstruction, Frame Generation | DLSS 3.5: Super Resolution, DLAA, Ray Reconstruction, Frame Generation |
| NVIDIA Reflex | Yes | Yes | Yes | Yes |
| NVIDIA Broadcast | Yes | Yes | Yes | Yes |
| PCI Express Gen 4 | Yes | Yes | Yes | Yes |
| Resizable BAR | Yes | Yes | Yes | Yes |
| NVIDIA® GeForce Experience™ | Yes | Yes | Yes | Yes |
| NVIDIA Ansel | Yes | Yes | Yes | Yes |
| NVIDIA FreeStyle | Yes | Yes | Yes | Yes |
| NVIDIA ShadowPlay | Yes | Yes | Yes | Yes |
| NVIDIA Highlights | Yes | Yes | Yes | Yes |
| NVIDIA G-SYNC® | Yes | Yes | Yes | Yes |
| Game Ready Drivers | Yes | Yes | Yes | Yes |
| NVIDIA Studio Drivers | Yes | Yes | Yes | Yes |
| NVIDIA Omniverse | Yes | Yes | Yes | Yes |
| Microsoft DirectX® 12 Ultimate | Yes | Yes | Yes | Yes |
| NVIDIA GPU Boost™ | Yes | Yes | Yes | Yes |
| NVIDIA NVLink™ (SLI-Ready) | No | No | No | No |
| Vulkan RT API, OpenGL 4.6 | Yes | Yes | Yes | Yes |
| NVIDIA Encoder (NVENC) | 2x 8th Generation | 2x 8th Generation | 1x 8th Generation | 1x 8th Generation |
| NVIDIA Decoder (NVDEC) | 5th Generation | 5th Generation | 5th Generation | 5th Generation |
| AV1 Encode | Yes | Yes | Yes | Yes |
| AV1 Decode | Yes | Yes | Yes | Yes |
| CUDA Capability | 8.9 | 8.9 | 8.9 | 8.9 |
| VR Ready | Yes | Yes | Yes | Yes |
| Display Support: | | | | |
| Maximum Resolution & Refresh Rate (1) | 4K at 240Hz or 8K at 60Hz with DSC, HDR | 4K at 240Hz or 8K at 60Hz with DSC, HDR | 4K at 240Hz or 8K at 60Hz with DSC, HDR | 4K at 240Hz or 8K at 60Hz with DSC, HDR |
| Standard Display Connectors | HDMI (2), 3x DisplayPort (3) | HDMI (2), 3x DisplayPort (3) | HDMI (2), 3x DisplayPort (3) | HDMI (2), 3x DisplayPort (3) |
| Multi Monitor | up to 4 (4) | up to 4 (4) | up to 4 (4) | up to 4 (4) |
| HDCP | 2.3 | 2.3 | 2.3 | 2.3 |
| Card Dimensions: | | | | |
| Length | Varies by manufacturer | Varies by manufacturer | 244 mm | 244 mm |
| Width | Varies by manufacturer | Varies by manufacturer | 112 mm | 112 mm |
| Slots | Varies by manufacturer | Varies by manufacturer | 2-Slot | 2-Slot |
| Thermal and Power Specs: | | | | |
| Maximum GPU Temperature (in C) | 90 | 90 | 90 | 90 |
| Idle Power (W) (5) | 12 | 12 | 11 | 10 |
| Video Playback Power (W) (6) | 17 | 20 | 16 | 16 |
| Average Gaming Power (W) (7) | 226 | 226 | 200 | 186 |
| Total Graphics Power (W) | 285 | 285 | 220 | 200 |
| Required System Power (W) (8) | 700 | 700 | 650 | 650 |
| Supplementary Power Connectors | 2x PCIe 8-pin cables (adapter in box) OR 300 W or greater PCIe Gen 5 cable | 2x PCIe 8-pin cables (adapter in box) OR 300 W or greater PCIe Gen 5 cable | 2x PCIe 8-pin cables (adapter in box) OR 300 W or greater PCIe Gen 5 cable; certain manufacturer models may use 1x PCIe 8-pin cable | 2x PCIe 8-pin cables (adapter in box) OR 300 W or greater PCIe Gen 5 cable; certain manufacturer models may use 1x PCIe 8-pin cable |
1 - Up to 4K 12-bit HDR at 240Hz with DP 1.4a + DSC. Up to 8K 12-bit HDR at 60Hz with DP 1.4a + DSC or HDMI 2.1 + DSC. With dual DP 1.4a + DSC, up to 8K HDR at 120Hz
2 - As specified in HDMI 2.1a: up to 4K 240Hz or 8K 60Hz with DSC, Gaming VRR, HDR
3 - DisplayPort 1.4a
4 - Multi Monitor:
  • 4 independent displays at 4K 120Hz using DP or HDMI
  • 2 independent displays at 4K 240Hz or 8K 60Hz with DSC using DP or HDMI
  • Other display configurations may be possible based on available bandwidth
5 - Idle power measured with GPU running at idle at the Windows desktop for 10 minutes
6 - Video playback power measured using AV1 codec
7 - Average gaming power is measured across 22 games at 4K, 1440p, and 1080p
8 - Minimum is based on a PC configured with a Ryzen 9 5900X processor. Power requirements can be different depending on system configuration.
Note: The above specifications represent this GPU as incorporated into NVIDIA's Founders Edition or reference graphics card design. Clock specifications apply while gaming with medium to full GPU utilization. Graphics card specifications may vary by add-in-card manufacturer. Please refer to the add-in-card manufacturers' website for actual shipping specifications.
It also lists the 4070Ti Super as optionally having 2x8pin power connectors. I find that much more appealing than a 12+4pin connector.
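As an aside, the table lends itself to a quick back-of-the-envelope comparison, e.g. CUDA cores per watt of Total Graphics Power (a crude density figure, not a performance measurement):

```python
# (CUDA cores, TGP in watts) pairs taken from the NVIDIA spec table above.
cards = {
    "4070 Ti Super": (8448, 285),
    "4070 Ti":       (7680, 285),
    "4070 Super":    (7168, 220),
    "4070":          (5888, 200),
}
for name, (cores, tgp) in cards.items():
    print(f"{name:13s} {cores / tgp:5.1f} CUDA cores per watt")
```

By this crude metric the 4070 Super comes out densest of the four (about 32.6 cores/W), which fits the expectation that it lands closer to the Ti than to the plain 4070.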
 
Thanks
It also lists the 4070Ti Super as optionally having 2x8pin power connectors. I find that much more appealing than a 12+4pin connector.
I don't think it does; it seems to say you need a 12+4-pin, or 2x8-pin using the supplied adaptor.
Technically it could be done, but I've not seen any model that uses 2x8-pin (it's early days).
 
Thanks

I don't think it does; it seems to say you need a 12+4-pin, or 2x8-pin using the supplied adaptor.
Technically it could be done, but I've not seen any model that uses 2x8-pin (it's early days).
Yeah, you're right; I misread that.
 
@Reynaldo, what did you decide to do now that the 4070 Super is out and the price of the 4070 has been reduced?
The 4070 Super reviews look great, but that 12GB of VRAM worries me a bit. I will wait a few months to try to save for a 4070 Ti Super, but most likely I will get a non-Ti.
 
The 4070 Super reviews look great, but that 12GB of VRAM worries me a bit. I will wait a few months to try to save for a 4070 Ti Super, but most likely I will get a non-Ti.
12GB has proven to be a bit of a limitation for my 4070. Hogwarts Legacy needs more even for 1080p. Granted, I modded DLSS to be native res (effectively DLAA) so I could also run framegen at native res, but I was surprised when performance tanked after playing for a while. Turning textures down to High fixes it, but it's a shame, because the card is good enough apart from the lack of memory.

Just saying this is likely to be more of a problem going forward. The 4070 Ti Super is the card I would've gotten back in March if it had been available. At this point I find it difficult to recommend a >$500 card with 12GB.
 
12GB has proven to be a bit of a limitation for my 4070. Hogwarts Legacy needs more even for 1080p. Granted, I modded DLSS to be native res (effectively DLAA) so I could also run framegen at native res, but I was surprised when performance tanked after playing for a while. Turning textures down to High fixes it, but it's a shame, because the card is good enough apart from the lack of memory.

Just saying this is likely to be more of a problem going forward. The 4070 Ti Super is the card I would've gotten back in March if it had been available. At this point I find it difficult to recommend a >$500 card with 12GB.
I am aiming for the Ti Super mostly because of its larger VRAM capacity, but $200 is a lot of money where I'm from.
 
I am aiming for the Ti Super mostly because of its larger VRAM capacity, but $200 is a lot of money where I'm from.
For sure. I just have a bad feeling about 12GB going forward. My opinion on this recently changed: I thought that since there were popular $600-$800 12GB GPUs on the market, devs would avoid going over that for a while. That doesn't seem to be the case. You can turn down textures, but that feels bad on a $600 card.
 
I bought a new monitor for my wife, a 34" ultrawide 1440p 165Hz, and was going to get her a 4070 Super, thinking it should be enough for her needs. She doesn't really care about maxing out graphics; she has no idea what all the settings do, and I configure every game for her. So I'm thinking that between DLSS and reasonable settings depending on the game, I should be able to get her a decent framerate compared to her current 3070.
 
Hogwarts Legacy needs more even for 1080p.
I played it at 3840x1200, high settings, no upscaling, no ray tracing, and was getting over 60fps on a 2070.
But I will probably get a 4070 Ti Super because I want a card that will last as long as possible, although the price is absolutely scandalous. At least Dick Turpin had the decency to wear a mask.

PS: one game my 2070 does struggle with is Immortals of Aveum
 
I played it at 3840x1200, high settings, no upscaling, no ray tracing, and was getting over 60fps on a 2070.
But I will probably get a 4070 Ti Super because I want a card that will last as long as possible, although the price is absolutely scandalous. At least Dick Turpin had the decency to wear a mask.

PS: one game my 2070 does struggle with is Immortals of Aveum
The Ultra textures are a killer for me. High looks basically the same, except sometimes certain textures don't load. I didn't notice much of this, but when I did, I was sad.

Also, I was using maxed-out ray tracing and framegen; it would probably be fine on VRAM without that. The point is that the 4070 is plenty powerful for this game maxed out at 1080p, except for VRAM. With framegen (an incredible feature) it's usually >100fps for the first few minutes, but it can't maintain that for an actual play session without turning textures down to High and seeing the occasional N64 texture.

When I see a game tested for VRAM and it's using something like 11GB, I assume 12GB will not be enough for actually playing the game. But it is enough for short benchmarks.
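That rule of thumb can be written down explicitly. The 1 GB reserve for desktop/overlay overhead and allocation growth over a long session is my own guess, not anything measured:

```python
def vram_verdict(card_gb: float, benchmark_peak_gb: float,
                 reserve_gb: float = 1.0) -> str:
    """Flag a card as risky when a short benchmark's peak VRAM use,
    plus a guessed reserve for the desktop/compositor and allocation
    growth over a long play session, reaches the card's capacity."""
    return "risky" if benchmark_peak_gb + reserve_gb >= card_gb else "ok"

print(vram_verdict(12, 11))  # a 12 GB card peaking at 11 GB in a benchmark
print(vram_verdict(16, 11))  # the 16 GB Ti Super at the same peak
```

By this reckoning a 12 GB card showing 11 GB in a benchmark run is flagged risky, while a 16 GB card at the same peak is fine, which matches the experience described above.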
 
I am aiming for the Ti Super mostly because of its larger VRAM capacity, but $200 is a lot of money where I'm from.

A concern is that the price gap might be more than $200 in practice, as the 4070 Ti Super doesn't have an FE model to anchor its MSRP.

For sure. I just have a bad feeling about 12GB going forward. My opinion on this recently changed: I thought that since there were popular $600-$800 12GB GPUs on the market, devs would avoid going over that for a while. That doesn't seem to be the case. You can turn down textures, but that feels bad on a $600 card.

But isn't that the case for the most part? There are always going to be exceptions. For instance, last gen, things like the Far Cry 5 HD texture pack would also overwhelm 6GB of VRAM (the same relative ratio), which was the VRAM of the most popular card by far (the 1060 6GB).
 