The Witcher 3: Wild Hunt revealed

I dunno how much it matters in reality, but your system may not support PCIe 3.0 (it was confusing on X79 IIRC, and for shame there was no X89). Maybe dual Titan Xs benefit from PCIe 3.0? Generally this hasn't been an issue, but as GPUs get stronger it will start to matter more (more data travelling over the PCIe bus).

Otherwise that CPU should be more than capable of maxing out modern games (and will be for some time, especially with D3D12 coming up). 6 SNB cores at 3.2-3.8GHz (stock) is just beastly.

BTW I never went anywhere! Although you and Malo may have to have a cagematch to determine who is my true B3D BFF. Al may join but be careful he will come at you from above when you least expect it and steal your booty.
 
Has anyone been able to prove that PCIe 1.1 x16 is a major bottleneck? Let alone 2.0....

4.0 is supposed to be finalized in 2016.

I have a hard time getting excited about faster PCIe when the bus itself is, in general, a major problem for PC game performance. GPU-to-CPU transfers can cripple performance. The consoles don't really have this issue. It also causes weird quirks, like making emulation of ancient consoles difficult.
 
I think the TW3 engine is very well threaded, as the developers themselves said in one of their old interviews, so the only worry with your CPU would be the DX11 serial submission pipe slowing down the whole engine. On the other hand, this is a bigger worry on AMD cards than on Nvidia ones, especially when running W8.1.

I'm on an i7 4770K and an R9 290X, so I have a bigger chance of performance issues, but I'm still hopeful for 60 FPS at Ultra in QHD. If not, then I'm upgrading my card to an R9 390X :p
 
Has anyone been able to prove that PCIe 1.1 x16 is a major bottleneck? Let alone 2.0....

4.0 is supposed to be finalized in 2016.

I have a hard time getting excited about faster PCIe when the bus itself is, in general, a major problem for PC game performance. GPU-to-CPU transfers can cripple performance. The consoles don't really have this issue. It also causes weird quirks, like making emulation of ancient consoles difficult.
It's not an issue in single-card setups for sure, and multi-card setups are generally fine with dual 3.0 x8 slots. However I haven't seen any testing of this on the new supercards like the Titan X. I imagine it could be more of an issue when pushing super high resolutions and/or framerates that those cards can deliver.
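For anyone who wants numbers, here's a quick sketch of the usual theoretical per-direction figures (standard signalling rates and encoding overhead; actual throughput is lower):

```python
# Rough theoretical per-direction PCIe bandwidth, in GB/s.
# Per-lane rates: 1.1 and 2.0 use 8b/10b encoding (80% efficient),
# 3.0 uses 128b/130b. Real-world throughput is lower still.
GBPS_PER_LANE = {
    "1.1": 2.5 * 8 / 10 / 8,     # 2.5 GT/s -> 0.25 GB/s
    "2.0": 5.0 * 8 / 10 / 8,     # 5.0 GT/s -> 0.50 GB/s
    "3.0": 8.0 * 128 / 130 / 8,  # 8.0 GT/s -> ~0.985 GB/s
}

def pcie_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth of a link, in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

for gen, lanes in [("1.1", 16), ("2.0", 16), ("3.0", 8), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {pcie_bandwidth_gbps(gen, lanes):.2f} GB/s")
```

So a 3.0 x8 slot is roughly a 2.0 x16 slot, bandwidth-wise, which is part of why dual x8 setups hold up fine.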
 
However I haven't seen any testing of this on the new supercards like the Titan X. I imagine it could be more of an issue when pushing super high resolutions and/or framerates that those cards can deliver.
I guess that depends on whether a beefier video card means additional PCIe transfers. I'm not sure that it does. You don't really want much going across the bus if you want minimum stuttering. You want as much as possible done locally on the GPU. Same story since the PCI Voodoo1.
 
I guess that depends on whether a beefier video card means additional PCIe transfers. I'm not sure that it does. You don't really want much going across the bus if you want minimum stuttering. You want as much as possible done locally on the GPU. Same story since the PCI Voodoo1.
Higher resolutions and framerates definitely mean more data across the PCIe bus. That's what I'm getting at.

Perhaps increased scene complexity as well? There is a lot of stuff going on in GTA5 at any given moment.
 
Higher resolutions and framerates definitely mean more data across the PCIe bus. That's what I'm getting at.

Perhaps increased scene complexity as well? There is a lot of stuff going on in GTA5 at any given moment.
Games tend to load up the video card with as much data as possible ASAP so PCIe transfers are minimized most of the time.

Sending anything from the GPU to the CPU (and back) tends to be bad. See GPGPU physics limitations. And we of course know how well PCIe texturing works. Nobody wants 2GB cards anymore. ;)
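As a toy model of why (the numbers below are illustrative assumptions, not measurements): the killer with a synchronous readback is usually not the copy itself but the pipeline flush, because the CPU has to wait out every frame the GPU has queued before the data even exists:

```python
# Toy model of a synchronous GPU -> CPU readback.
# All numbers are illustrative assumptions, not measurements.

PCIE_GBPS = 15.75      # assumed PCIe 3.0 x16, theoretical
QUEUED_FRAMES = 2      # frames of work the GPU has buffered ahead
FRAME_MS = 16.7        # per-frame budget at 60 fps

def readback_stall_ms(num_bytes: int) -> float:
    """Pipeline flush (drain queued frames) plus bus transfer, in ms."""
    flush_ms = QUEUED_FRAMES * FRAME_MS          # CPU waits for the GPU
    transfer_ms = num_bytes / (PCIE_GBPS * 1e9) * 1e3
    return flush_ms + transfer_ms

# Reading back a 1920x1080 RGBA8 buffer every frame:
fb_bytes = 1920 * 1080 * 4
print(f"~{readback_stall_ms(fb_bytes):.1f} ms stall vs a {FRAME_MS} ms budget")
```

With these assumptions the transfer itself is only ~0.5 ms; the two-frame flush is what blows the budget, which is why engines either avoid readbacks entirely or defer them a few frames asynchronously.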
 
Games tend to load up the video card with as much data as possible ASAP so PCIe transfers are minimized.
Absolutely true, but when you're running higher resolutions there is more framebuffer information that has to be sent between the cards. When the framerate increases there are more frames to be sent between the cards per second, hence more bandwidth used. I'm not really debating a point, just stating the obvious. :)
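To put rough numbers on that (assuming RGBA8 frames and 2-way AFR, where every other frame renders on the second card and has to cross over to the card driving the display):

```python
# Rough inter-card traffic for 2-way AFR: half the displayed frames
# come from the second card. Assumes 4 bytes per pixel (RGBA8).

def inter_card_gbps(width: int, height: int, fps: int) -> float:
    frame_bytes = width * height * 4
    return frame_bytes * (fps / 2) / 1e9   # second card's share

for w, h, fps in [(1920, 1080, 60), (2560, 1440, 60),
                  (3840, 2160, 60), (3840, 2160, 144)]:
    print(f"{w}x{h} @ {fps} fps: ~{inter_card_gbps(w, h, fps):.2f} GB/s")
```

By this estimate even 4K at high refresh is a couple of GB/s: modest next to PCIe 3.0 x8's ~7.9 GB/s, but exactly the kind of traffic that scales directly with resolution and framerate.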

FWIW I don't really think Dresden is limited by PCIe bandwidth. Something else entirely is probably causing the slowdowns; single-threaded CPU performance is a likely culprit.
 
Absolutely true, but when you're running higher resolutions there is more framebuffer information that has to be sent between the cards. When the framerate increases there are more frames to be sent between the cards per second, hence more bandwidth used. I'm not really debating a point, just stating the obvious. :)
Oh, you're talking multi-GPU. That didn't click for some reason. ;)
 
I'm currently using two Titan Xs, and I went out and bought GTA 5 for the sake of blasting it to see what the "next gen" had to offer. At completely maxed-out settings, including the shadow sliders, there were noticeable dips in performance, and I can't tell if it's the game's optimization or my CPU starting to show its age. That's when I started wondering if maybe my processor was the problem.

I'm pretty sure it's max settings being extremely CPU heavy, though I can see SLI having higher CPU sensitivity as well (drivers).
 
It's not an issue in single-card setups for sure, and multi-card setups are generally fine with dual 3.0 x8 slots. However I haven't seen any testing of this on the new supercards like the Titan X. I imagine it could be more of an issue when pushing super high resolutions and/or framerates that those cards can deliver.

I haven't kept up with SLI, but does it still require a physical bridge connector? If it does, then it shouldn't need PCIe bandwidth nearly as much as bridgeless Crossfire does.

Regards,
SB
 
It does require a bridge, I think, but I don't have a clue how much bandwidth it provides or what they send over it vs. the PCIe bus. Not even sure if that's public knowledge.

Has the bridge ever changed? If not, I would imagine the bandwidth it provides is somewhat pathetic compared with modern PCIe. SLI has been around since what, the 6000 series?

Or maybe they've been silently upgrading it whilst maintaining backwards compatibility? Or maybe each series of cards has its own bridge?! I know nothing!!
 
The HQ cinematic trailer rocks! I've watched it 3 times on YT, and only after watching the HQ version did I notice the arrow tip piercing her body and that there are two teardrops, not one!

Amazing attention to detail in that trailer! Great job Laa-Yosh and your colleagues!

PS. For those of us who are a bit sad that CDPR didn't have the resources to push an improved, deserving, monstrous PC version of TW3 - http://whatifgaming.com/developer-i...-from-2013-list-of-all-features-taken-out-why
 
Bleh, so confirmation that the consoles are why the graphics were dumbed down.

Even sadder to think that if they had stayed PC-only, we'd have gotten those other, better graphics. But the resource drain of developing on multiple platforms meant we got the console-gimped version.

Regards,
SB
 
The HQ cinematic trailer rocks! I've watched it 3 times on YT, and only after watching the HQ version did I notice the arrow tip piercing her body and that there are two teardrops, not one!

Amazing attention to detail in that trailer! Great job Laa-Yosh and your colleagues!

PS. For those of us who are a bit sad that CDPR didn't have the resources to push an improved, deserving, monstrous PC version of TW3 - http://whatifgaming.com/developer-i...-from-2013-list-of-all-features-taken-out-why

Yeah, Gamersyde is my new favorite website. Their videos are the best and cater to fidelity fiends like me.
 
Unbelievable. Kudos to that guy for telling it how it is, though.

But why the hell has the PC version got reduced Hairworks? That makes no sense; it's a frickin' Nvidia technology!

Anyway, I'm afraid this is a definite miss for me now (not that I was hugely interested in the first place, so probably not a lost sale).
 
Here we have a contradictory (and more verifiable) source stating that the versions will not be identical and that the PC version will have improvements over the console versions, including, but not limited to, Hairworks:

http://www.dsogaming.com/news/cd-pr...fferences-between-these-platforms/#more-77181

It's possible the previous source is also correct about things being cut from the PC version for the sake of the consoles, but that source would seem to be wrong about the equivalent draw distance and superior Hairworks on the consoles.
 
I did a quick visit to GAME today, and it looks like the PC version will come on 2 DVDs instead of just 1 for the consoles.

http://i.imgur.com/jK9MZgz.jpg - full size

http://i.imgur.com/dcFcVj7.jpg - full size
 