Review: ASRock X670E Steel Legend motherboard

H83:

If I decide to save some money by "cornering" myself into another dead-end platform, then I'll buy a 13600KF and a decent Z690 board...
I think I'd do the same in your shoes. Though, I personally might opt for the 13500 since I don't care about the extra few hundred MHz and care more about efficiency.
As for the audio part, I had a dedicated sound card on my Q9550 build, but then I bought a Z270 board for my 7600K rig and found out that the onboard audio was very decent, as good as or even better than my old sound card, which really surprised me.
I think many would be surprised how good modern onboard audio is these days. We've come a long way. Wherever it isn't enough, you can get a decent external DAC for cheap.
So for my future build, I'm looking for a board that has a better onboard audio solution than my current build, or at least one as good.
Why not use HDMI for audio and spend a little extra on a good receiver? That way, whatever you do in the future, you'll have something reliably good.
The problem is that there's very little info about onboard audio solutions and their quality, so it's not easy to know what to look for...
If you can identify the audio chipset and see the capacitors used on the motherboard, you can get a rough idea of how good it'll be.
schmidtbag:

I think many would be surprised how good modern onboard audio is these days. [...] If you can identify the audio chipset and see the capacitors used on the motherboard, you can get a rough idea of how good it'll be.
Do you care to explain the bolded part? Because i know very little about audio hardware.
H83:

Do you care to explain the bolded part? Because i know very little about audio hardware.
Let's start with the chipset: if you look at the motherboard specs, it'll tell you which audio chipset is used (and if it doesn't, you should avoid that product). Using your example of the Strix B650-A, you can identify the chipset in the Audio section of the product page: https://rog.asus.com/motherboards/rog-strix/rog-strix-b650-a-gaming-wifi-model/ In this case it is the Realtek ALC4080. You can then look up that chip for reviews, issues, or various implementations of it.

Not every manufacturer gives the audio chip equal treatment, and this matters because it sits in an electrically noisy environment. So, just because the chipset yields poor performance on one motherboard doesn't mean it will on all of them. That's where looking at how the board is configured matters. In the case of that Asus board, it would probably work pretty well because it has a shield to protect the chip from EMI. At least, it's protecting the chip itself; EMI can still affect the unshielded traces on the motherboard leading to the audio jacks.

That's where the capacitors come in. One of the most common applications of capacitors is to help smooth out the signal (AKA noise filtering). If you see a board with just a couple of cheap electrolytic capacitors, you're probably going to get a lot of hissing and buzzing. In the case of that Asus board, they used quite a few decent ones, so it should sound pretty clean. As far as I understand, there is such a thing as too many capacitors, but I'm no expert on this stuff, so I'm not about to make claims I can't back up.

Anyway, the amplifier chip matters a lot too, but again, motherboards are noisy environments, so really all that matters is whether the amp isn't known to be garbage. If you're an audiophile and want to wear headphones, I don't think integrated audio is ever the way to go, simply because the amp can never be kept "clean" enough. The higher you push the volume, the more prominent the EMI noise will be. That's why I think integrated audio is perfectly fine for speaker users, since you can reduce a lot of the EMI noise by keeping the PC volume down and then use the speakers' own volume control when you want things quieter or louder.

Bear in mind, some of what I said could be a little outdated. I'm no audiophile and I don't really care that much about maximizing sound quality, because on my surround system 33% volume gets obnoxiously loud, so even if my integrated audio is a hissy mess (which frankly, it probably is) I'm not noticing it lol.

EDIT: Oh yeah, and I'm about to get a lot of flak for this, but don't buy into all the sample rate and bit depth BS. In case you're not aware, sample rate is basically the resolution of the X axis of an audio track, and bit depth is the resolution of the Y axis. When recording audio, you want these as high as you can possibly get. Whenever you hear a recording where there is clipping/distortion from something being too loud, that's sometimes because it was recorded in 16-bit rather than 24-bit. Whenever you listen to audio where drums sound muffled or almost like a hiss, that's because the sample rate was set too low. When it comes to playback, though, you really don't need much at all except in some very niche situations (particularly if you're a highly trained audio producer). Any audio track that is well suited for all audiences only needs 16-bit depth. Even in the very rare cases where you can perceive the difference between 16 and 24, it honestly doesn't matter. In a worst-case scenario, it'd be like listening to an orchestra where maybe the vibrato of a violin is ever so slightly less prominent, and only if you really focus on it back-to-back between 16 and 24. Don't get me started on 32...

For sample rate, it's quite simple: most humans can't hear much beyond 20 kHz, and for those who can, it's not exactly desirable. According to the Nyquist theorem, to reliably capture a signal you need to sample at at least double its highest frequency; that's why sample rates are commonly in the 40-50 kHz range. You can go ahead and increase it, but you won't hear a difference; anyone who says otherwise is just experiencing placebo. As with bit depth, it's highly recommended to record content at much higher sample rates. So for consuming content, just focus on a good amplifier. The crappiest of audio chipsets from 10 years ago can handle all the quality you need for consuming content, but none of that matters if it can't deliver the sound cleanly.
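To make the Nyquist point concrete, here's a minimal, illustrative Python sketch (plain standard library, not any real audio API): two tones that differ by exactly the sample rate produce identical samples once digitized. That folding is the aliasing the roughly 2x rule exists to prevent.

```python
import math

FS = 44_100  # CD sample rate, in Hz

def sample_tone(freq_hz, n_samples, fs=FS):
    """Sample a unit-amplitude sine at freq_hz, fs samples per second."""
    return [math.sin(2 * math.pi * freq_hz * n / fs) for n in range(n_samples)]

# A 10 kHz tone and a (10 kHz + 44.1 kHz) = 54.1 kHz tone become
# indistinguishable after sampling: the sample lists come out identical,
# so any content above fs/2 folds back into the representable band.
low = sample_tone(10_000, 64)
high = sample_tone(10_000 + FS, 64)
print(max(abs(a - b) for a, b in zip(low, high)))  # ~0, float rounding only
```

This is also why recording chains put an analog low-pass filter in front of the converter: once a too-high frequency has been sampled, no amount of processing can tell it apart from its alias.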
schmidtbag:

Let's start with the chipset: If you look at the motherboard specs, it'll tell you which audio chipset is used [...] The crappiest of audio chipsets from 10 years ago can handle all the quality you need for consuming content, but none of that matters if it can't deliver the sound cleanly.
First of all, many thanks for the lengthy explanation; this is one of the reasons why I love Guru3D and its members. So, I'm still trying to digest everything you wrote, but it seems the Asus boards are good enough in the audio department. As for headphones, I rarely use them, so no problem there. And like I wrote before, my Audioengine speakers are very nice and deliver great sound, so no problem there either. About the amp part, that's already integrated on the MB, right? Or is it something we need to buy to get better sound? Anyway, thanks again.
H83:

So, I'm still trying to digest everything you wrote, but it seems the Asus boards are good enough in the audio department.
Yes, there's a very good chance they are, at least the higher-end boards. I stopped trusting Asus for lower-tier products years ago (just in general, not specifically for audio). Nowadays, I'd argue it's pretty hard to find truly awful integrated audio for most people. I'd recommend an external DAC, S/PDIF, or HDMI audio for anyone with a crazy over-the-top home theater system, but if you keep your volume at normal levels that won't irritate the neighbors, just about any integrated audio will do.
About the amp part, that's already integrated on the MB, right? Or is it something we need to buy to get better sound?
If you've got an analog audio jack, there is an [op-]amp somewhere behind it. In the vast majority of cases you can't swap it out, though there are some high-end sound cards that let you do so. These chips are important because, simply put, they're what turns the signal into something you can actually listen to. More often than not, they're what determines the quality of your sound, assuming the audio chipset isn't lying about its capabilities. If your integrated audio has noise issues, there isn't really much you can do about it; just lower the volume and hope that your external amplifier (and if your speakers have a volume knob, they contain one) doesn't pick up on the noise. That's really where integrated audio becomes a bit of a gamble, but if the motherboard manufacturer actually tells you which amplifier they used, that's usually good news, since it means they're probably not using the cheapest unit available.
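To put a number on why the analog stage, not the digital format, is usually the bottleneck: the theoretical SNR of an ideal quantizer driven by a full-scale sine is about 6.02 dB per bit plus 1.76 dB. A quick sketch (the formula is standard; the framing here is mine):

```python
def dynamic_range_db(bits):
    """Ideal-quantizer SNR for a full-scale sine: 6.02 * bits + 1.76 dB."""
    return 6.02 * bits + 1.76

for bits in (16, 24, 32):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.0f} dB")
# 16-bit already gives ~98 dB; a typical onboard analog output stage is
# rated well below that, so the amp/DAC path limits quality long before
# the bit depth does.
```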
H83:

I'm looking at AM5 boards because I'm thinking about an AMD build, but the MBs are expensive and I don't know what to choose. This one seems nice, but I need a good audio solution.
Look into a USB DAC; they can be had in really high quality. They also get kinda expensive, though...
Great review as always, Boss. The price isn't bad compared to similar products. The Steel Legend has always had a great price-to-performance ratio, in my opinion.
tunejunky:

Not true. Every single next-gen board for either AMD or Intel has been significantly beefed up, a lot of which is unnecessary imho. Greed would be lower quality, or marking up w/o changes. But if you aren't going to OC (which the AIBs seem to assume most will), you can look carefully for a board with an eight-phase VRM. Plus, shortly (after Xmas most likely) the cheapest chipset will come out for both (what used to be a B series but now is less).
VERY TRUE. Beefed up with plastic parts and some paint to have a dragon, military, or some other sort of gamer look; these new motherboards haven't had big improvements at all over older generations, and certainly not worth the $400 extra price tag for the main models. Maybe you are trying to convince yourself that they are worth their price, or that "you" can afford it, but you are just playing into their scheme...
chainy:

VERY TRUE Beefed up with plastic parts and some paint to have dragon, military or some sort of gamerlook [...] but you are just playing into their scheme...
Look dude, I'm a supplier to AMD, Nvidia, and Intel (plus more you've never heard of); my company charges more for its parts, as do others. But my point is exactly what I said, and no bold caps will change facts. With only one or two exceptions, ALL B650 boards are built like X570 boards (excepting interfaces), NOT like B550 boards. ALL X670 boards (not counting interfaces) are built like premium boards, and yes, X670E is super-premium. Throw darts all you want, but first gain some knowledge.
schmidtbag:

Let's start with the chipset: If you look at the motherboard specs, it'll tell you which audio chipset is used [...] The crappiest of audio chipsets from 10 years ago can handle all the quality you need for consuming content, but none of that matters if it can't deliver the sound cleanly.
Oh man! Where to start when you are so wrong...
1) You understand little if any music theory at all. If you did, you would understand harmonics. You are virtually never talking about single-note, single-instrument reproduction, and even then you still have harmonic structure. This is science, not opinion.
2) I personally ran double-blind testing over the course of five years (avg. sample 300 people) re: digital vs. analog audio, digital audio with differing codecs (pre Red Book), and multichannel digital audio with differing rates. At that time I worked for Pioneer Electronics, and I know for a fact Sony/Philips/et al. were doing the same. Not only did 100% of the respondents hear the difference between analog and 16/44, 75% heard the differences between 16/44 and 24/48 (DAT). Furthermore, a deeper dive showed that the digital filters affected the sound more prominently (both good and bad), and 100% heard the differences in digital filter settings (which is playback).
3) The DAC (and the digital filters) makes a huge difference in listenability as well as sound quality, but that doesn't equate to spending more money (by itself), as the cost of high-end DAC chips is the lowest in history.
@tunejunky I was wondering when you were going to come tell me I was wrong.

1. How do harmonics have anything to do with what I said? At a data level, none of that matters, because all the chip cares about is the amplitude of the peak/valley at the specified time. So long as the chip doesn't lie about its specs, it will reproduce the recorded signal verbatim. The chip doesn't know which instruments are playing simultaneously; it's just playing back whatever data has been recorded, to the best of its ability. But where we can both agree is that it's what comes after the digital chip (in this case, the DAC) that determines whether we're hearing what we're supposed to, which is why I advise against integrated audio for headphone users. But here's the thing: the specs of the chip itself won't determine that. So, whether the chip supports a max of 16-bit/44 kHz or 32-bit/384 kHz, you won't hear a difference if the post-chip circuit is crap, or if the audio was recorded for the lowest common denominator. In other words, the science of the matter is: digital chips don't care about harmonics. Data is data. No matter how much soul you put into an audio track, no matter how much you increase the quality, it is nothing more than 1s and 0s to the machine. The machine just needs to be able to replicate them.

2. That also doesn't really refute anything I said. Of course there would be noticeable differences between analog and digital, especially if your testing was done decades ago, back when even the best of digital still had room for improvement. As for going from 16 to 24, in a controlled environment with the industry's best hardware and tracks picked for their dynamic range, I honestly would have thought more than 75% would hear a difference. But here's the thing:
a. Most recorded audio people consume is tuned for 16-bit, because it's trying to appeal to the masses.
b. Of those who have 24-bit compatible hardware and a system capable of playing back 24-bit audio, not everyone lives in an environment where they can actually appreciate it, and not everyone wants to wear headphones. You kinda need to crank the volume pretty high to make the most of a 24-bit recording, and that's a bit of a privilege for most people. I'm fortunate enough to have no close-by neighbors to annoy, but I know I'm lucky in that regard.
c. Of those who have the means and the privilege of hearing 24-bit in all its glory, according to your study, only 75% can hear the difference. That already starts to question whether it really matters. Which goes to my next point:
d. Of those who can hear a difference, how many of them have a reason to care? Just because you can perceive the difference doesn't make it desirable. Sure, maybe if I'm trying to zone out to some atmospheric music, 24-bit would be pretty cool, since I'm actively focusing on the subtle differences at that point. But most of the time, I just want to hear what I'm supposed to hear. I've watched movies with 24-bit audio at home and honestly... I'd rather they weren't. The quiet stuff was too quiet and the loud stuff was annoyingly loud. And yeah, 16-bit doesn't necessarily fix that, but audio optimized for 16-bit seems to go a little easier on the dynamic range. It's the same sort of problem as in photography: who cares about the authenticity of the dynamic range when most people don't have a display to see the difference? Of those who do have a proper display, not all of them can differentiate what's in the darkest/lightest spots. Of those who can differentiate, how does that improve the image in any way? Whether you're talking recorded audio or recorded images, you will never get a fully accurate re-creation, so why focus so much on minute details that otherwise contribute nothing toward the experience?

Anyway, I have no doubt whatsoever that digital filters impact the result. That's a given. But like I said, that doesn't really have much to do with what the digital chip is capable of, hence getting external filters and amps.

3. I agree, and again, that doesn't say anything about how I'm wrong. I'm not fluent in the terminology, but when I was speaking of amplifiers, I was in turn speaking of DACs as well as other components. Really, the point of my post was to say you can have the cheapest Realtek audio chip today and its specs are more than good enough, but what will really impact the sound quality is everything between the chip and your ears. It doesn't matter what the chip is capable of if the DAC or amplifier is crap. It doesn't matter how good those are if they're in an electrically noisy environment. It doesn't matter how good any of the system is if your drivers are made of paper, fridge magnets, and rusty iron wires. Audio quality is determined by the worst component. Realistically, the digital part will never be the source of poor quality, so as long as you have a chip covering the limits of human hearing (24-bit/44 kHz), all that matters is everything else that comes after the chip.
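The "data is data" point can be quantified: the only error the digital side itself introduces is quantization, and even at 16 bits it is tiny. A toy sketch (illustrative only, not any real codec or driver behavior):

```python
def quantize(x, bits):
    """Snap x in [-1.0, 1.0] to the nearest level of a signed `bits`-bit grid."""
    scale = 2 ** (bits - 1) - 1   # e.g. 32767 for 16-bit PCM
    return round(x * scale) / scale

sample = 0.123456789
for bits in (16, 24):
    err = abs(quantize(sample, bits) - sample)
    print(f"{bits}-bit round-trip error: {err:.1e}")
# The worst-case error is 0.5/scale: about 1.5e-5 at 16 bits and about
# 6e-8 at 24 bits; both are far smaller than the noise a typical
# onboard analog stage adds afterwards.
```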
schmidtbag:

@tunejunky I was wondering when you were going to come tell me I was wrong. [...] Realistically, the digital part will never be the source of poor quality, so as long as you have a chip capable of the limits of human hearing (24-bit/44KHz), all that matters is everything else that comes after the chip.
24/48 isn't related to the limits of human hearing; it refers to the bit depth and recording frequency. You can easily have 192/32 (etc...), and that also is unrelated to the limits of human hearing. Harmonics are vitally important as an indicator of realism, and they are the most affected by digital sampling. Remember, sounds are a collection of sinusoidal wave patterns, not stair-steps. The point of increasing the recording frequency (44 kHz+) is to more closely emulate the waveform; in playback, that frequency is oversampled to create a truer average. I'm sure you understand bit depth, so I'll leave that alone. And excuse me, but a well-executed digital recording IS a "fully accurate recreation" and is totally indistinguishable (by human hearing, which is NOT as sharp as vision) from the original.
i've been in many recording studios for film and music (and my main rig is a DAW), and as i've said, i've run dozens of double-blind tests with hundreds of people in each sample. everything i say is backed by science and the recording industry. digital filters are a key component of every DAC and are the single biggest difference in the perception of sound quality between DACs (even though many can be set aftermarket to similar if not the same settings). nowadays the quality of DACs (incl. board-based) is the highest it's ever been and the cost is the lowest. the fact of the matter is a person can make a reference-quality audio system for 10% of the cost of a similar system from 10 years ago. in other words, the DAC i have on my phone (LG V60) is equal to a $1200 separate component (still on the market). so "the chip" itself has those selfsame digital filters and any person willing to take the time can and will hear the differences. whether or not it is "worth it" is a totally different and highly personal question. p.s. you can hear the difference between 16 and 24-bit recordings at any volume. hearing deteriorates with age and that gets back to the "worth it" question.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
tunejunky:

24/48 isn't related to the limits of human hearing, it's related to the bit depth and recording frequency. you can easily have 192/32 (etc...) and that also is unrelated to the limits of human hearing.
I'm well aware, but there is an upper limit to what humans can perceive. Most people can't hear past 22kHz, and most people find frequencies above 5000Hz to be annoying. The sample rate is doubled for the sake of ensuring no Hz is left behind. So, when you consider that a CD-quality sample rate is at least 8x faster than the frequencies anyone cares to listen to, I find it impossibly difficult to believe that anyone can tell the difference with a sample rate higher than that, even in the most pristine of conditions. We're talking about the physical limits of the human body. Remember - the cilia within our ears are submerged in a liquid. We are limited to what they can do, even if a microphone can pick up more detail. As for bit depth, most people can hear the difference between 16 and 24, but like I said: who honestly cares? The difference is subtle and it doesn't really do anything to improve the audio experience in the vast majority of cases. In the few cases where it does make a difference, the vast majority of people don't have a system equipped to appreciate it. As for going to 32-bit, that really is just placebo. Your own study showed there were enough people not convinced by the jump from 16 to 24, and from 24 to 32 is reaching the limits of what the human brain is capable of differentiating. Remember: the point of the discussion is whether you can get a good experience with integrated audio when using speakers. We're not talking about the best experience, headphone users, or what works best in niche situations; we're just talking about the average gamer. It's also worth pointing out that games in particular mix their audio in realtime. Everything can be recorded in 16-bit because every voice and sound effect is recorded fairly normalized; there's hardly any dynamic range in game recordings.
However, having the game mix the audio in 24-bit would actually make sense, because you could have very quiet sounds in the distance while an explosion blasts in your face, where 16-bit could potentially make it less immersive. As for game music, that's also fine in 16-bit because the music is typically meant for the ambiance/background, where high dynamic range would go unnoticed.
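The dynamic-range numbers behind this 16-vs-24-bit argument are easy to check for yourself: each bit of linear PCM depth buys roughly 6 dB. A quick illustrative sketch (my own Python, not from the thread):

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of linear PCM: 20*log10(2^bits), ~6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

for bits in (16, 24, 32):
    print(f"{bits}-bit PCM: ~{dynamic_range_db(bits):.0f} dB of dynamic range")
```

16-bit's roughly 96 dB already spans something like the gap between a quiet room and painfully loud sound, which is part of why the extra headroom of 24-bit tends to matter more during mixing than during playback.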
harmonics are vitally important as an indicator of realism and they are the most affected by digital sampling. remember sounds are a collection of sinusoidal wave patterns - not stairsteps. the point of increasing the recording frequency (44kHz+) is to more closely emulate the waveform. in playback that frequency is oversampled to create a truer average.
Go ahead and zoom into your favorite audio track as closely as your editing software will let you. You will see stairsteps. That's what happens when you go from analog to digital. Perhaps those steps can be treated as vectors, but the fact of the matter is: it's nothing more than finely-placed data points. If you go to 32-bit/384kHz, that would definitely smooth it out a bit, but at that point we're going so far beyond what human hearing can pick up on that it's moot. Here's a challenge for you: take your favorite 32-bit/384kHz track and take a fully zoomed-in screenshot of the waveform at a place with complex harmonics. Down-sample the track to 24-bit/192kHz and take a screenshot at the exact same spot and zoom level. Do it again at 24-bit/44kHz (we can both acknowledge a difference can be heard at 16-bit; the point I'm trying to make here is about sample rate). Compare these 3 screenshots side by side. You will absolutely see some bigger steps, but what you're really looking for is the width of the steps. If they are wide enough that several entire spikes have been completely eroded, and if that erosion has a noticeable impact, I will eat my words.
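The stairstep argument can also be put in numbers: quantizing the same waveform at different bit depths and measuring the worst-case error shows the steps halving with every added bit. A minimal sketch (my own Python illustration, not from the thread; the 1 kHz/48 kHz values are arbitrary):

```python
import math

def quantize(sample: float, bits: int) -> float:
    """Round a sample in [-1, 1] to the nearest level of a signed PCM grid."""
    levels = 2 ** (bits - 1)          # e.g. 32768 levels per polarity at 16-bit
    return round(sample * levels) / levels

# One cycle of a 1 kHz sine sampled at 48 kHz
samples = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(48)]

for bits in (8, 16, 24):
    err = max(abs(s - quantize(s, bits)) for s in samples)
    print(f"{bits}-bit: worst-case quantization error = {err:.2e}")
```

The worst-case error is bounded by half a step (2^-bits of full scale), which is why the audible question quickly becomes one of noise floor rather than visible shape.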
and excuse me, but a well executed digital recording IS a "fully accurate recreation" and is totally indistinguishable (by human hearing - which is NOT as sharp as vision) from the origin.
The recording may be an accurate recreation, but playing it back will never sound the same. This is of course assuming the original performance wasn't heard through speakers in the first place. Obviously, if you're listening to a live broadcast in your car and later re-listen to a recording of that performance, of course it will sound exactly the same. But I'm talking about sitting there in front of the performance, without the use of mics or speakers, and then listening to a recording of that exact same performance. It will not sound the same. The mic could be tuned to perfectly capture what you would hear, but the manner in which a microphone captures audio is totally different from how your ears do, so it's not perceiving the sound the same way you do. Meanwhile, the speakers may be able to re-create the waveform, but being completely physically different from what produced the sound, they can never replicate it with 100% accuracy.
i've been in many recording studios for film and music (and my main rig is a DAW) and as i've said i've run dozens of double blind tests for hundreds of people in each sample. everything i say is backed by science and the recording industries.
And in your previous description, those differences were either obvious or didn't refute anything I said. Do your test from 44kHz to 192kHz (keep the bit depth the same, as well as the track and the hardware) with hundreds of blind tests and come back to me with the results.
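For the sample-rate half of such a test, the Nyquist point made earlier in the thread can be demonstrated directly: a sampled stream simply cannot contain anything above half the sample rate, because higher frequencies fold back (alias) onto lower ones. A small illustrative sketch (my own Python; the 30 kHz tone is an arbitrary example above CD's 22.05 kHz Nyquist limit):

```python
import math

FS = 44100                 # CD sample rate
NYQUIST = FS / 2           # 22050 Hz: highest representable frequency

f_high = 30000             # ultrasonic tone, above Nyquist
f_alias = FS - f_high      # 14100 Hz: where that tone folds back to

# At the sampled instants, the 30 kHz sine is identical to an inverted
# 14.1 kHz sine -- sampling cannot represent anything above Nyquist.
for n in range(20):
    hi = math.sin(2 * math.pi * f_high * n / FS)
    lo = math.sin(2 * math.pi * f_alias * n / FS)
    assert abs(hi + lo) < 1e-9     # hi == -lo at every sample point
print(f"{f_high} Hz sampled at {FS} Hz aliases to {f_alias} Hz")
```

This is why recording gear low-pass filters before sampling; it doesn't by itself settle whether 192 kHz is audibly different, only that 44.1 kHz covers the commonly cited ~20 kHz hearing limit.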
digital filters are a key component of every DAC and are the single biggest difference in the perception of sound quality between DACs (even though many can be set aftermarket to similar if not the same settings).
I agree. I didn't say otherwise.
nowadays the quality of DACs (incl. board-based) is the highest it's ever been and the cost is the lowest. the fact of the matter is a person can make a reference-quality audio system for 10% of the cost of a similar system from 10 years ago. in other words, the DAC i have on my phone (LG V60) is equal to a $1200 separate component (still on the market). so "the chip" itself has those selfsame digital filters and any person willing to take the time can and will hear the differences. whether or not it is "worth it" is a totally different and highly personal question.
I also agree with this, which raises the question of why you're arguing with me. I'm saying that a motherboard where some thought/effort/cost was put into the audio will produce a very good (again, not great or best) experience for those using speakers. We can both agree that an external DAC is better if you care to increase quality, and that it wouldn't cost much. As far as I understand, there are multiple ICs involved. I wouldn't be surprised if some overly cheap or poorly designed digital filters have noticeable results, regardless of what the specs of the sound card are. But that's where it gets to a point where you'd have to be willing to spend hours on research or hundreds (perhaps thousands) of dollars on sound equipment to really care. Since we're talking about integrated audio, this is not even worth a second glance.
p.s. you can hear the difference between 16 and 24 bit recordings at any volume. hearing deteriorates with age and that gets back to the "worth it" question.
If you've got a track with high dynamic range and you're young, sure, you don't need anything all that loud to hear the difference. We're talking some pretty niche situations though. Most of the time, a properly produced track isn't going to have such drastic changes, hence my photography example. And yes, "worth it" is highly personal, but when you're watching a sitcom and you're checking to see if it's 24-bit, is that really a matter of preference, or is it just principle at that point? If H83 shared your level of passion about audio quality, he wouldn't have asked the question he did. As a media enthusiast, you might wonder how people could possibly tolerate the compression quality that most streaming services provide, but it's a strong sign of how little the vast majority of people care. You and I can see a difference; most don't. Of those who do see a difference, they still fully enjoy the content, warts and all. As tech enthusiasts, we might wonder how anyone could possibly justify buying an Alienware, but the fact of the matter is, the brand has been successful enough to survive for almost 20 years, because most people just don't care - all they want is a "good enough" experience. A car enthusiast may question why anyone would buy a Toyota, as they're some of the most mundane and uninspiring cars you can get, but sometimes people just want something cheap and reliable that gets them from point A to point B in comfort. A master chef may question how people could possibly eat at McDonald's, but ask the average American why he/she eats there and they'll tell you it's delicious - after all, it ain't healthy and it's not the cheapest way to fill your stomach, so taste must be the draw. To each their own, but at the end of the day, enthusiasts (of any subject) tend to over-emphasize things that nobody else cares about, regardless of whether they notice. For what it's worth, I'm calling myself out on this too; there are things I care about that I know hardly anybody else does.