I have the great GigaWorks S750 system. Back when I was living at my parents' house, I had the speakers nicely set into the ceiling and on bookshelves. My room was my man cave.
Now I have to share the office with the wife, so I can only set it up as a pseudo-5.1 on the desk.
Not a huge problem because the living room has a very decent AV Receiver with a Jamo 5.1 setup. But my HTPC isn't as up-to-date as my desktop in the office. The GTX 660 Ti definitely wasn't a great purchase at the time.
There's the silly issue of having to replace an HDMI 1.1 receiver with an HDMI 1.4 receiver, then an HDMI 2.0 receiver, and now an HDMI 2.1 one. Plus, no one out there seems able or willing to make a device that extracts the audio stream from HDMI into anything other than the 25-year-old S/PDIF standards. I now own a 4K HDR TV, which renders both multichannel PCM and all the high-resolution formats completely useless if I want to watch 4K content. All I can use is ARC, which outputs vanilla Dolby Digital, DTS, or stereo PCM.
The stupidest part here is that a receiver's price is heavily influenced by the quality of its analog components, which makes buying a good receiver the most frustrating purchase ever whenever a new HDMI standard comes up.
I guess the only solution left for me is to get an HDMI 2.0 graphics card that sends the audio through one HDMI port while sending the video through another (if such a thing is even possible).
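From what I can tell it should at least be possible at the OS level: the GPU exposes each connected HDMI output as a separate audio device, so it's just a matter of pointing the system at the port that feeds the receiver. A rough sketch of the idea, assuming Linux with PulseAudio (the sink name here is hypothetical; pactl will show the real ones for your GPU and driver):

```python
# A rough sketch, not a tested recipe: assumes Linux with PulseAudio running.
import subprocess

# Each connected HDMI output shows up as its own audio sink; list them all.
sinks = subprocess.run(
    ["pactl", "list", "short", "sinks"],
    capture_output=True, text=True, check=True,
).stdout
print(sinks)

# Point system audio at the second HDMI port while the first carries video.
hdmi_sink = "alsa_output.pci-0000_01_00.1.hdmi-stereo-extra1"  # hypothetical name
subprocess.run(["pactl", "set-default-sink", hdmi_sink], check=True)
```

On Windows it's the same idea: each HDMI output appears as a separate playback device, so you'd just set the one feeding the receiver as the default in the Sound settings.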
Is there a receiver that supports DisplayPort, by the way? A receiver with DisplayPort 1.3 and FreeSync?
Just pointing out a mere technicality here.
FreeSync will most probably be formally supported through HDMI 2.1's Game Mode Variable Refresh Rate.
Meaning, any receiver maker that wants to support HDMI 2.1 for that 8K bullet point will need to support FreeSync too.
As for DisplayPort, I don't really see the benefit of doing so. Are you going to hook up a high-end computer monitor to a home theater setup?
Of course that's the story that Creative will tell. They naturally wouldn't highlight their own incompetence.
You honestly believe that back in 2004/2005 "most of the OS crashes" with Sound Blaster users came from the audio driver?
This was the time of GeForce FX and the early days of GeForce 6. I had a 9700 Pro, then a pair of 6800 GTs in SLI, and then an X1900 XT.
All throughout this time I had an SB Live!, then an Audigy 2, and then an X-Fi XtremeGamer. I did see the drivers getting bloated and slow due to too much feature creep, just like he mentions, but I recall very well where most of my crashes came from, and it wasn't the sound cards. My ratio of graphics-related CTDs to sound-related CTDs was probably close to 10:1.
Kind reminder that this was also the age of Intel's NetBurst CPUs, and everyone and their dog was overclocking, even those with an Athlon 64 who didn't really need to.