The Witcher 3: Wild Hunt revealed

Discussion in 'PC Gaming' started by Dresden, Feb 5, 2013.

  1. homerdog

    homerdog donator of the year Legend Subscriber

    I dunno how much it matters in reality, but your system may not support PCIe 3.0 (it was confusing on X79 IIRC, and for shame there was no X89). Maybe dual Titan Xes benefit from PCIe 3.0? Generally this has not been an issue but as GPUs get stronger it will start to matter more (more data travelling over the PCIe bus).

    Otherwise that CPU should be more than capable of maxing out modern games (and will be for some time, especially with D3D12 coming up). 6 SNB cores at 3.2-3.8GHz (stock) is just beastly.

    BTW I never went anywhere! Although you and Malo may have to have a cagematch to determine who is my true B3D BFF. Al may join but be careful he will come at you from above when you least expect it and steal your booty.
     
    Dresden likes this.
  2. swaaye

    swaaye Entirely Suboptimal Legend

    Has anyone been able to prove that PCIe 1.1 x16 is a major bottleneck? Let alone 2.0....

    4.0 is supposed to be finalized in 2016.

    I have a hard time getting excited about faster PCIe when bus transfers in general remain a major problem for PC game performance. GPU-to-CPU transfers can cripple performance. The consoles don't really have this issue. It causes weird quirks, like making ancient console emulation difficult.
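
    For reference, the theoretical gap between those generations is easy to work out from the published link rates and encoding schemes (2.5 GT/s with 8b/10b for 1.x, 5 GT/s with 8b/10b for 2.0, 8 GT/s with 128b/130b for 3.0, 16 GT/s for 4.0). A quick back-of-the-envelope script; real-world throughput is lower due to protocol overhead:

    ```python
    # Per-lane PCIe link rate (GT/s) and encoding efficiency per generation.
    GENS = {
        "1.1": (2.5, 8 / 10),     # 8b/10b encoding
        "2.0": (5.0, 8 / 10),     # 8b/10b encoding
        "3.0": (8.0, 128 / 130),  # 128b/130b encoding
        "4.0": (16.0, 128 / 130),
    }

    def x16_bandwidth_gbps(gen: str) -> float:
        """Theoretical one-way bandwidth of an x16 slot in GB/s."""
        rate, eff = GENS[gen]
        return rate * eff * 16 / 8  # GT/s -> GB/s across 16 lanes

    for gen in GENS:
        print(f"PCIe {gen} x16: {x16_bandwidth_gbps(gen):.1f} GB/s")
    # PCIe 1.1 x16: 4.0 GB/s
    # PCIe 2.0 x16: 8.0 GB/s
    # PCIe 3.0 x16: 15.8 GB/s
    # PCIe 4.0 x16: 31.5 GB/s
    ```

    So even a 1.1 x16 slot gives 4 GB/s each way, which is why nobody has really proven it to be a major bottleneck for gaming.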
     
    Last edited: May 15, 2015
  3. Lightman

    Lightman Veteran Subscriber

    I think the TW3 engine is very well threaded (the developers themselves said so in one of the old interviews), so the only worry with your CPU would be the DX11 serial pipeline slowing down the whole engine. On the other hand, that's a bigger worry if you're on AMD cards than nVidia ones, especially when running W8.1.

    I'm on an i7 4770K and R9 290X, so I have a bigger chance of performance issues, but I'm still hopeful for 60FPS on Ultra at QHD res. If not, then I'm upgrading my card to the R9 390X :p
     
  4. homerdog

    homerdog donator of the year Legend Subscriber

    It's not an issue in single card setups for sure, and generally multi-card setups are fine with dual 3.0x8 slots. However I haven't seen any testing of this on the new supercards like the Titan X. I imagine it could be more of an issue when pushing super high resolutions and/or framerates that those cards can deliver.
     
  5. swaaye

    swaaye Entirely Suboptimal Legend

    I guess that depends on whether a beefier video card means additional PCIe transfers. I'm not sure that it does. You don't really want much going across the bus if you want minimum stuttering. You want as much as possible done locally on the GPU. Same story since the PCI Voodoo1.
     
    Last edited: May 15, 2015
  6. homerdog

    homerdog donator of the year Legend Subscriber

    Higher resolutions and framerates definitely mean more data across the PCIe bus. That's what I'm getting at.

    Perhaps increased scene complexity as well? There is a lot of stuff going on in GTA5 at any given moment.
     
  7. swaaye

    swaaye Entirely Suboptimal Legend

    Games tend to load up the video card with as much data as possible ASAP so PCIe transfers are minimized most of the time.

    Sending anything from the GPU to the CPU (and back) tends to be bad. See GPGPU physics limitations. And we of course know how well PCIe texturing works. Nobody wants 2GB cards anymore. ;)
     
  8. homerdog

    homerdog donator of the year Legend Subscriber

    Absolutely true, but when you're running higher resolutions there is more framebuffer information that has to be sent between the cards. When the framerate increases there are more frames to be sent between the cards per second, hence more bandwidth used. I'm not really debating a point, just stating the obvious. :)

    FWIW I don't really think Dresden is limited by PCIe bandwidth. Likely there is something else entirely causing the slowdowns. Single thread CPU performance is a likely culprit.
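
    To put rough numbers on the multi-GPU point above: assuming a simple AFR setup where each finished RGBA8 frame crosses the bus (an illustrative sketch, not a claim about how the driver actually routes data), the required bandwidth is just width x height x bytes-per-pixel x fps:

    ```python
    def frame_transfer_gbps(width: int, height: int, fps: int,
                            bytes_per_pixel: int = 4) -> float:
        """GB/s needed to ship finished frames between GPUs (AFR, RGBA8)."""
        return width * height * bytes_per_pixel * fps / 1e9

    # 1080p at 60 fps vs 4K at 100 fps -- illustrative figures only
    print(frame_transfer_gbps(1920, 1080, 60))   # ~0.5 GB/s
    print(frame_transfer_gbps(3840, 2160, 100))  # ~3.3 GB/s
    ```

    Even the 4K/100fps case is a fraction of a PCIe 3.0 x8 link (~7.9 GB/s one way), which supports the hunch that something other than PCIe bandwidth is behind the slowdowns.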
     
  9. swaaye

    swaaye Entirely Suboptimal Legend

    Oh you're talking multi GPU. That didn't click for some reason. ;)
     
  10. TheAlSpark

    TheAlSpark Moderator Moderator Legend

    I'm pretty sure it's max settings being extremely CPU heavy, though I can see SLI having higher CPU sensitivity as well (drivers).
     
  11. Silent_Buddha

    Silent_Buddha Legend

    I haven't kept up with SLI, but does it still require a physical bridge connector? If it does, then it shouldn't need PCIe bandwidth nearly as much as bridgeless Crossfire.

    Regards,
    SB
     
  12. homerdog

    homerdog donator of the year Legend Subscriber

    It does require a bridge I think, but I don't have a clue as to how much bandwidth it provides and what they send over it vs the PCIe bus. Not even sure if that's public knowledge.

    Has the bridge ever changed? If not, I would imagine the bandwidth it provides is somewhat pathetic compared with modern PCIe. SLI has been around since what, the 6000 series?

    Or maybe they've been silently upgrading it whilst maintaining backwards compatibility? Or maybe each series of cards has its own bridge?! I know nothing!!
     
  13. Dresden

    Dresden Celebrating Mediocrity Veteran

    I folded and watched the launch video, even though I swore I wouldn't. If you haven't seen it, well, you should. It's the best launch video I've ever seen:

    http://www.gamersyde.com/news_the_witcher_3_launch_cinematic-16539_en.html

    That's the website I mentioned a while ago; they're the absolute best for watching videos. Plus, their screenshots are enormous if you need a nice background.
     
    Lightman likes this.
  14. Lightman

    Lightman Veteran Subscriber

    The HQ cinematic trailer rocks! I've watched it 3 times on YT, and only after watching the HQ version did I notice the arrow tip piercing her body and that there are two teardrops, not one!

    Amazing attention to detail in that trailer! Great job Laa-Yosh and your colleagues!

    PS. For those of us who are a bit sad CDPR didn't have the resources to push an improved, deserving, monstrous PC version of TW3 - http://whatifgaming.com/developer-i...-from-2013-list-of-all-features-taken-out-why
     
    Last edited: May 16, 2015
    Malo and Dresden like this.
  15. Silent_Buddha

    Silent_Buddha Legend

    Bleh, so confirmation that consoles are why the graphics were dumbed down.

    Even sadder to think that if they had stayed PC only, we'd have gotten those other, better, graphics. But the resource drain of developing on multiple platforms meant we got the console gimped version.

    Regards,
    SB
     
  16. Dresden

    Dresden Celebrating Mediocrity Veteran

    Yeah, Gamersyde is my new favorite website. Their videos are the best and cater to fidelity fiends like me.
     
    Lightman likes this.
  17. Malo

    Malo Yak Mechanicum Legend Subscriber

  18. pjbliverpool

    pjbliverpool B3D Scallywag Legend

    Unbelievable. Kudos to that guy for telling it how it is, though.

    But why the hell has the PC version got reduced HairWorks? That makes no sense, it's a frickin' Nvidia technology!

    Anyway, I'm afraid this is a definite miss for me now (not that I was hugely interested in the first place, so probably not a lost sale).
     
  19. pjbliverpool

    pjbliverpool B3D Scallywag Legend

    Here we have a contradictory (and more verifiable) source stating that the versions will not be identical and that the PC version will have improvements over the console versions, including, but not limited to, HairWorks:

    http://www.dsogaming.com/news/cd-pr...fferences-between-these-platforms/#more-77181

    It's possible the previous source is also correct about things being cut from the PC version for the sake of the consoles, but that source would seem to be wrong about the equivalent draw distance and superior HairWorks on the consoles.
     
    Lightman likes this.
  20. Lightman

    Lightman Veteran Subscriber
