Who Killed PC Audio

A USB DAC to eliminate interference from GPUs -> a stereo receiver -> good stereo monitors and a subwoofer.
 
A USB DAC to eliminate interference from GPUs -> a stereo receiver -> good stereo monitors and a subwoofer.

Well, you can even cut the chain down a bit: USB DAC (I have an AKAI EIE Pro, http://www.akaipro.com/product/eiepro) -> active Yamaha monitors (XLR or RCA input).

But I have an old X-Fi Titanium HD PCI Express card (including stereo analog in/out) that I use more for gaming anyway. Old, but honestly, I have never had any problem with it. (Just for ASIO, I use different drivers.)
 
and it all runs in software.
Except it doesn't! I'm not saying it can't; what I am saying is that once DSPs disappeared, properly modelled audio disappeared with them.
The reason you no longer need a DSP is that games no longer correctly model audio.
Case in point: in an open environment, fire a gun and listen to the sound of the gunshot.
Now go into a small room, preferably a bathroom with tiled walls, and fire the same gun. Has the sound of the gunshot changed due to the size of the room and the materials it is constructed from?
Now go into a different but similarly sized room that is not covered in ceramic tiles. Is the sound of the gunshot different again?
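To put some rough numbers on the size-and-materials point, here is a quick back-of-the-envelope sketch using Sabine's reverberation formula (RT60 ≈ 0.161·V/A). The room dimensions and absorption coefficients below are illustrative assumptions, not measured data.

```python
# Rough Sabine-equation sketch of why the same gunshot should sound different
# in a tiled bathroom vs. a similarly sized furnished room.
# RT60 = 0.161 * V / A, where V is room volume (m^3) and A is total
# absorption (surface area * absorption coefficient).
# The absorption coefficients are illustrative assumptions, not measured data.

def rt60(length, width, height, absorption_coeff):
    volume = length * width * height
    surface = 2 * (length * width + length * height + width * height)
    total_absorption = surface * absorption_coeff
    return 0.161 * volume / total_absorption

# Same room dimensions, different surface materials.
tiled_bathroom = rt60(2.5, 2.0, 2.4, absorption_coeff=0.02)  # ceramic tile: very reflective
furnished_room = rt60(2.5, 2.0, 2.4, absorption_coeff=0.30)  # carpet, curtains, soft furniture

print(f"Tiled bathroom RT60:  {tiled_bathroom:.2f} s")  # long, "ringing" decay
print(f"Furnished room RT60:  {furnished_room:.2f} s")  # much shorter decay
```

Same room volume, wildly different decay times; that difference in tail length is exactly what the gunshot test is listening for.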
 
The bullet should travel, so the physics model isn't that accurate ;)
The physics are perfect, but it's another one of those games with stupid invisible boundary walls; the gun just happened to be pointed right against one without my knowing before firing.
 
Well, you can even cut the chain down a bit: USB DAC (I have an AKAI EIE Pro, http://www.akaipro.com/product/eiepro) -> active Yamaha monitors (XLR or RCA input).

But I have an old X-Fi Titanium HD PCI Express card (including stereo analog in/out) that I use more for gaming anyway. Old, but honestly, I have never had any problem with it. (Just for ASIO, I use different drivers.)
Absolutely. I just happen to have good-quality passive monitors that I have no intention of replacing. They'll probably last a few decades.
 
Except it doesn't! I'm not saying it can't; what I am saying is that once DSPs disappeared, properly modelled audio disappeared with them.
The reason you no longer need a DSP is that games no longer correctly model audio.

Some do, most don't. It's ten years now since Crackdown launched with ray-traced convolution reverb. I know it's a console title, but it was head and shoulders above anything else at the time.
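For anyone unfamiliar with what convolution reverb actually does, here is a minimal sketch: the same dry sound convolved with two different impulse responses comes out sounding like two different spaces. Both the source and the impulse responses below are synthesized stand-ins purely for illustration; a real engine would use measured or (as in Crackdown's case) ray-traced IRs.

```python
# Minimal convolution-reverb sketch: the same "dry" gunshot convolved with two
# different impulse responses sounds like two different spaces. Both the source
# and the IRs are synthesized here purely for illustration.
import numpy as np
from scipy.signal import fftconvolve

SR = 44100  # sample rate in Hz

# Synthetic "gunshot": a short burst of noise with a fast exponential decay.
t = np.arange(int(0.05 * SR)) / SR
dry_shot = np.random.randn(t.size) * np.exp(-t * 200)

def synthetic_ir(rt60_seconds):
    """Exponentially decaying noise as a stand-in impulse response."""
    n = int(rt60_seconds * SR)
    decay = np.exp(-6.91 * np.arange(n) / n)  # ~60 dB of decay over the IR length
    return np.random.randn(n) * decay

bathroom_ir = synthetic_ir(1.2)     # small, hard-surfaced room: long, bright tail
open_field_ir = synthetic_ir(0.05)  # outdoors: almost no reverberant tail

# Convolving source and IR gives the "wet" signal the listener would hear.
wet_bathroom = fftconvolve(dry_shot, bathroom_ir)
wet_field = fftconvolve(dry_shot, open_field_ir)

print(f"Dry shot length:     {dry_shot.size / SR:.2f} s")
print(f"Bathroom version:    {wet_bathroom.size / SR:.2f} s")
print(f"Open-field version:  {wet_field.size / SR:.2f} s")
```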

Sound quality is rarely discussed in reviews or on forums. Consequently it's not a priority, which is a pity. However, that wouldn't change if we still had dedicated sound hardware.

Cheers
 
There's the silly issue of having to replace an HDMI 1.1 receiver with an HDMI 1.4 receiver and then an HDMI 2.0 receiver.

How is this any different from "having" to replace an HDMI 1.1 display with an HDMI 1.4 display, etc. ? You either need/want the added functionality enough to upgrade, or you don't. For my part, I finally felt the need to upgrade from my 1.2 receiver to a 2.0 one after 9 years to enable 4k video switching and Dolby Atmos/DTS-X. I consider that a pretty good run.
 
Also, there were dedicated audio DSPs on AMD cards for a couple of generations that developers and gamers seemed largely indifferent to.
 
AFAIK TrueAudio isn't supported by all AMD cards. And such DSPs also have to be present on Nvidia cards in order to be relevant.
My impression is that AMD came up with TrueAudio mainly because they desperately needed something new to show off, since they were losing to Intel on the CPU front and to Nvidia on the GPU front.
 
How is this any different from "having" to replace an HDMI 1.1 display with an HDMI 1.4 display, etc. ?
You actually didn't use a great analogy because display-wise the difference between HDMI 1.1 and 1.4 is really small (there's deep color support but AFAIK content with 10bit color didn't exist until HDR came along), but I get your point.

As I explained before, an AV receiver's price is a lot more dependent on the quality of its analog circuitry and auto-calibration systems than the supported standards and formats. When you pay $600-3000 for a receiver, very little of that is going for getting the latest decryption chip or sound decoder. There are $230 receivers that support full bandwidth HDMI 2.0a and HDCP2.2, but their analog output just isn't good enough for decent floor-standing or bookshelf speakers.

So if you bought a $1000 AV receiver in 2015 that performs nicely with a $1500 set of speakers, right now if you bought a 4K HDR TV you'd have to buy another expensive receiver just to get something that has nothing to do with audio.

Again, I wouldn't mind this a lot if someone out there made a cheap device that could extract the audio from an HDMI 2.0 source and put it into an HDMI 1.4 cable for audio + HDMI 2.0 for video (though I could run into sync problems), but no one is doing it at the moment.
 
AFAIK TrueAudio isn't supported by all AMD cards. And such DSPs also have to be present on Nvidia cards in order to be relevant.
My impression is that AMD came up with TrueAudio mainly because they desperately needed something new to show off, since they were losing to Intel on the CPU front and to Nvidia on the GPU front.

If consumers had cared, it would have been a unique selling point for both the hardware and the games that supported it. They didn't, though.
 
@Gubbi I have lots of games that use Wwise, but I don't remember proper audio modelling.

From the Wwise website:
"Wwise Convolution lets you create convincing reverberation based on samples of real acoustic spaces. Accurately reproduce any space from the smallest room to the largest cathedral."

So it's pre-made reverbs that you can alter certain parameters of.

Edit: Just as a very quick and dirty test, I searched for a list of games using Wwise, picked one at random (BioShock Infinite), and did the gun test; it failed.
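To illustrate the distinction being drawn between sampled reverbs and genuinely modelled acoustics, here is a small sketch. This is not Wwise's actual API; the function names and numbers are made up for illustration. The point is that with a pre-baked impulse response, all the runtime can do is tweak parameters of a fixed IR rather than derive a new one from the room the player is actually standing in.

```python
# Sketch of the "pre-made reverb" limitation: with a sampled convolution reverb
# the engine ships a fixed impulse response and can only tweak it at runtime
# (trim the tail, scale the wet level, etc.). The IR itself does not change when
# the player walks from a tiled bathroom into a carpeted bedroom unless the game
# explicitly swaps in a different one. Function names and numbers are illustrative.
import numpy as np

SR = 44100  # sample rate in Hz

def adjust_prebaked_ir(ir, decay_scale=1.0, wet_level=1.0):
    """Apply extra decay to a fixed IR and scale its level (the only knobs available)."""
    n = ir.size
    extra_decay = np.exp(-np.arange(n) / (n * decay_scale))  # smaller scale = shorter tail
    return ir * extra_decay * wet_level

# A single pre-baked IR shipped with the game (synthesized stand-in here).
prebaked_ir = np.random.randn(SR) * np.exp(-np.arange(SR) / (0.3 * SR))

# At runtime the engine can only produce variants of that one IR...
short_tail_variant = adjust_prebaked_ir(prebaked_ir, decay_scale=0.5, wet_level=0.8)
long_tail_variant = adjust_prebaked_ir(prebaked_ir, decay_scale=2.0, wet_level=1.0)

# ...whereas properly modelled audio would rebuild the IR from the actual room's
# geometry and surface materials every time the environment changes.
print(short_tail_variant.size, long_tail_variant.size)
```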
 
You actually didn't use a great analogy because display-wise the difference between HDMI 1.1 and 1.4 is really small (there's deep color support but AFAIK content with 10bit color didn't exist until HDR came along), but I get your point.

As I explained before, an AV receiver's price is a lot more dependent on the quality of its analog circuitry and auto-calibration systems than the supported standards and formats. When you pay $600-3000 for a receiver, very little of that is going for getting the latest decryption chip or sound decoder. There are $230 receivers that support full bandwidth HDMI 2.0a and HDCP2.2, but their analog output just isn't good enough for decent floor-standing or bookshelf speakers.

So if you bought a $1000 AV receiver in 2015 that performs nicely with a $1500 set of speakers, right now if you bought a 4K HDR TV you'd have to buy another expensive receiver just to get something that has nothing to do with audio.

Again, I wouldn't mind this a lot if someone out there made a cheap device that could extract the audio from an HDMI 2.0 source and put it into an HDMI 1.4 cable for audio + HDMI 2.0 for video (though I could run into sync problems), but no one is doing it at the moment.

2.0 brings object-based audio along too, though, so you do get additional audio-related functionality. And if someone spent $1,000 on an A/V receiver in 2015 that didn't support 2.0, well, that was just a bad decision. The 2.0 -> 2.1 upgrade is probably going to force a quicker upgrade than I would prefer, but that's also going to require a new TV, a new video card, and even new cables. So the new receiver will just get budgeted in with the rest. Then the "old" stuff gets re-purposed, sold, or handed down to family or friends.

It's not ideal, I agree, but for me having an easy, high quality experience makes up for it.
 