A USB DAC to eliminate interference from GPUs -> a stereo receiver -> good stereo monitors and a subwoofer.
And no processor to accurately model sound
> ...and it all runs in software.

Except it doesn't! I'm not saying it can't; what I am saying is that once DSPs disappeared, properly modelled audio disappeared.
> Now go into outer space and fire the gun...

I didn't hear shit, the bullet didn't go anywhere, and I'm spinning away from it like a moron. (The downside of too accurate a physics model...)
> The bullet should travel, so the physics model isn't that accurate.

The physics are perfect, but it's another one of those games with stupid invisible boundary walls, and the gun just happened to be pointed right against one, unknown to me before firing.
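For what it's worth, the "spinning away from it" part is roughly what conservation of momentum predicts for a free-floating shooter. A quick back-of-the-envelope sketch in Python; the masses and muzzle velocity below are illustrative assumptions, not numbers pulled from any particular game:

```python
# Rough conservation-of-momentum estimate for firing a gun in a vacuum.
# All numbers are illustrative assumptions, not values from any game.

bullet_mass = 0.01       # kg, roughly a 10 g rifle round
muzzle_velocity = 900.0  # m/s
shooter_mass = 100.0     # kg, astronaut + suit + gun

# Total momentum starts at zero, so the shooter picks up momentum equal
# and opposite to the bullet's.
recoil_velocity = bullet_mass * muzzle_velocity / shooter_mass
print(f"Shooter drifts backwards at about {recoil_velocity:.2f} m/s")
# ~0.09 m/s of drift; an off-axis grip turns some of that into a slow spin.
```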
> Well, you can even cut a bit: USB DAC (I have an AKAI EIE Pro http://www.akaipro.com/product/eiepro) -> active Yamaha monitors (XLR or RCA input).

Absolutely. I just happen to have good quality passive monitors that I have no intention of replacing. They'll probably last a few decades.
But I have an old X-Fi Titanium HD PCI Express card (including stereo analog in/out) that I use more for gaming anyway. Old, but honestly, I have never had any problem with it. (Just for ASIO, I use different drivers.)
The reason you no longer need a DSP is that games no longer correctly model audio.
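To make "correctly model audio" a bit more concrete, here is a minimal sketch of the kind of per-source processing a game can run on the CPU these days, roughly the sort of work dedicated DSPs used to do. The function name and all the numbers are my own illustration, not any particular engine's API:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
SAMPLE_RATE = 48_000    # Hz

def render_point_source(dry: np.ndarray, distance_m: float) -> np.ndarray:
    """Apply inverse-distance attenuation and a propagation delay to a mono signal.

    A toy stand-in for the per-source processing that used to live on a
    dedicated DSP and now runs fine in software.
    """
    gain = 1.0 / max(distance_m, 1.0)                    # simple 1/r falloff
    delay_samples = int(distance_m / SPEED_OF_SOUND * SAMPLE_RATE)
    out = np.zeros(len(dry) + delay_samples, dtype=dry.dtype)
    out[delay_samples:] = dry * gain                     # delayed, attenuated copy
    return out

# Example: a gunshot-like click heard from 20 m away arrives ~58 ms late and quieter.
click = np.zeros(SAMPLE_RATE // 100)
click[0] = 1.0
distant = render_point_source(click, distance_m=20.0)
```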
> Some do, most don't.

Can you name a few so I can investigate?
> Can you name a few so I can investigate?

I don't play nearly as many games as I used to, but e.g. Wwise is used by lots of studios, is multi-platform, and has a convolution reverb plugin (among other things).
There's the silly issue of having to replace an HDMI 1.1 receiver with an HDMI 1.4 receiver and then an HDMI 2.0 receiver.
How is this any different from "having" to replace an HDMI 1.1 display with an HDMI 1.4 display, etc.?
AFAIK TrueAudio isn't supported by all AMD cards, and such DSPs would also have to be present on Nvidia cards in order to be relevant.
My impression is that AMD came up with TrueAudio mainly because they desperately needed something new to show off, since they were losing to Intel on the CPU front and to Nvidia on the GPU front.
"Wwise Convolution lets you create convincing reverberation based on samples of real acoustic spaces. Accurately reproduce any space from the smallest room to the largest cathedral."
You actually didn't use a great analogy, because display-wise the difference between HDMI 1.1 and 1.4 is really small (there's deep color support, but AFAIK content with 10-bit color didn't exist until HDR came along), but I get your point.
As I explained before, an AV receiver's price depends a lot more on the quality of its analog circuitry and auto-calibration system than on the supported standards and formats. When you pay $600-3000 for a receiver, very little of that goes toward getting the latest decryption chip or sound decoder. There are $230 receivers that support full-bandwidth HDMI 2.0a and HDCP 2.2, but their analog output just isn't good enough for decent floor-standing or bookshelf speakers.
So if you bought a $1000 AV receiver in 2015 that performs nicely with a $1500 set of speakers, and you buy a 4K HDR TV now, you'd have to buy another expensive receiver just to get something that has nothing to do with audio.
Again, I wouldn't mind this so much if someone out there made a cheap device that could extract the audio from an HDMI 2.0 source and put it onto an HDMI 1.4 cable for audio plus an HDMI 2.0 cable for video (though I could run into sync problems), but no one is doing that at the moment.