Review: ASRock X670E Steel Legend motherboard
schmidtbag
H83
schmidtbag
Let's start with the chipset:
If you look at the motherboard specs, it'll tell you which audio chipset is used (and if it doesn't, you should avoid that product). Using your example of the Strix B650-A, you can identify the chipset by going to the Audio section of the product page:
https://rog.asus.com/motherboards/rog-strix/rog-strix-b650-a-gaming-wifi-model/
In this case it is the Realtek ALC4080. You can then look up that chip for reviews, issues, or various implementations of it. Not every manufacturer gives the audio chip equal treatment, and this kinda matters because these chips sit in an electrically noisy environment. So, just because the chipset yields poor performance on one motherboard doesn't mean it will on all of them. That's where looking at how the board is configured matters. In the case of that Asus board, it would probably work pretty well because it has a shield to protect the chip from EMI. At least, it's protecting the chip itself - EMI can still affect the unshielded traces leading from the chip to the audio jacks. That's where the capacitors come in. One of the most common applications of capacitors is to smooth out a signal (AKA noise filtering). If you see a board with just a couple of cheap electrolytic capacitors, you're probably going to get a lot of hissing and buzzing. In the case of that Asus board, they used quite a few decent ones, so it should sound pretty clean. As far as I understand, there is such a thing as too many capacitors, but I'm no expert on this stuff, so I'm not about to make claims I can't back up.
Anyway, the amplifier chip matters a lot too, but again, motherboards are noisy environments, so really all that matters is that the amp isn't known to be garbage. If you're an audiophile and want to wear headphones, I don't think integrated audio is ever the way to go, simply because the amp can never be kept "clean" enough. The higher you push the volume, the more prominent the EMI noise becomes. That's why I think integrated audio is perfectly fine for speaker users: you can reduce a lot of the EMI noise by keeping the PC volume down and then use the speakers' own volume control when you want things quieter or louder.
Bear in mind, some of what I said could be a little outdated. I'm no audiophile and I don't really care that much about maximizing sound quality, because on my surround system, 33% volume gets obnoxiously loud, so even if my integrated audio is a hissy mess (which, frankly, it probably is), I'm not noticing it lol.
EDIT:
Oh yeah, and I'm about to get a lot of flak for this, but don't buy into all the sample rate and bit depth BS. In case you're not aware, sample rate is basically the resolution of the X axis of an audio track, and bit depth is the resolution of the Y axis. When recording audio, you want these as high as you can possibly get. When you hear a recording where something too loud clips or distorts, that's sometimes because it was recorded in 16-bit rather than 24-bit. When drums sound muffled or almost like a hiss, that's because the sample rate was set too low.
When it comes to playback, you really don't need much at all except in some very niche situations (particularly if you're a highly trained audio producer). Any audio track that is well-suited for all audiences only needs 16-bit depth. Even in the very rare cases where you can perceive the difference between 16 and 24, it honestly doesn't matter. In a worst-case scenario, it'd be like listening to an orchestra where maybe the vibrato of a violin is ever-so-slightly less prominent, and only if you really focus on it back-to-back between 16 and 24. Don't get me started on 32...
For sample rate, it's quite simple: most humans can't hear much beyond 20 kHz, and for those who can, it's not exactly desirable. According to the Nyquist theorem, to reliably reconstruct a signal, you need to sample at at least double its highest frequency. That's why sample rates are commonly in the 40-50 kHz range. You can go ahead and increase it, but you won't hear a difference. Anyone who says otherwise is just experiencing placebo.
As with bit depth, it's highly recommended to record content at much higher sample rates.
So for consuming content, just focus on a good amplifier. The crappiest of audio chipsets from 10 years ago can handle all the quality you need for consuming content, but none of that matters if it can't deliver the sound cleanly.
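The sample rate and bit depth argument above boils down to two textbook formulas, sketched here in Python (the 6.02N + 1.76 dB figure is the ideal dynamic range of an N-bit quantizer; nothing here is measured from any actual board):

```python
def nyquist_rate(max_hz: float) -> float:
    """Minimum sample rate needed to capture content up to max_hz (Nyquist)."""
    return 2.0 * max_hz

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit PCM quantizer, in dB."""
    return 6.02 * bits + 1.76

# Hearing tops out around 20 kHz, so ~40 kHz sampling suffices --
# hence the common 44.1/48 kHz rates.
print(nyquist_rate(20_000))            # 40000.0
print(round(dynamic_range_db(16), 2))  # 98.08 dB
print(round(dynamic_range_db(24), 2))  # 146.24 dB
```

The jump from 16 to 24 bits buys headroom far below the noise floor of any room you'd actually listen in, which is why it matters much more for recording and mixing than for playback.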
H83
schmidtbag
AlmondMan
chispy
Great review as always, Boss. The price isn't bad compared to similar products. The Steel Legend has always had a great price-to-performance ratio, in my opinion.
chainy
tunejunky
tunejunky
schmidtbag
@tunejunky
I was wondering when you were going to come tell me I was wrong.
1. How do harmonics have anything to do with what I said? At the data level, none of that matters, because all the chip cares about is the amplitude of the peak/valley at the specified time. So long as the chip doesn't lie about its specs, it will reproduce the recorded signal verbatim. The chip doesn't know which instruments are playing simultaneously; it's just playing back whatever data was recorded, to the best of its ability. Where we can agree is that it's what comes after the digital chip (in this case, the DAC) that determines whether we're hearing what we're supposed to, which is why I advise against integrated audio for headphone users. But here's the thing: the specs of the chip itself won't determine that. So, whether the chip supports a max of 16-bit/44 kHz or 32-bit/384 kHz, you won't hear a difference if the post-chip circuit is crap, and if the audio was recorded to the lowest common denominator.
In other words, the science of the matter is: digital chips don't care about harmonics. Data is data. No matter how much soul you put into an audio track, no matter how much you increase the quality, it is nothing more than 1s and 0s to the machine. The machine just needs to be able to replicate them.
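To put the "data is data" point in concrete terms, here's a toy 16-bit PCM quantizer in Python (a sketch of the principle, not any real codec's pipeline): it applies the exact same rule to a harmonically rich signal and a plain sine, and the round-trip error stays within half a quantization step either way.

```python
import math

BITS = 16
FULL_SCALE = 2 ** (BITS - 1)   # signed 16-bit codes: -32768..32767

def encode(x: float) -> int:
    """Quantize a sample in [-1.0, 1.0) to a signed 16-bit PCM code."""
    return max(-FULL_SCALE, min(FULL_SCALE - 1, round(x * FULL_SCALE)))

def decode(code: int) -> float:
    return code / FULL_SCALE

# A "rich" signal (fundamental plus harmonics) and a plain sine:
# the quantizer treats both identically, one sample at a time.
def rich(t):  return 0.5 * math.sin(t) + 0.3 * math.sin(2 * t) + 0.1 * math.sin(3 * t)
def plain(t): return 0.9 * math.sin(t)

for sig in (rich, plain):
    worst = max(abs(sig(t) - decode(encode(sig(t))))
                for t in (i * 0.001 for i in range(10_000)))
    # Round-trip error never exceeds half a quantization step,
    # regardless of how "musical" the waveform is.
    assert worst <= 0.5 / FULL_SCALE
```

Harmonics only start to matter once this stream of integers reaches the DAC and the analog stages after it; up to that point the chip is just shuttling numbers.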
2. That also doesn't really refute anything I said. Of course there would be noticeable differences between analog and digital, especially if your testing was done decades ago back when even the best of digital still had room for improvement. As for going from 16 to 24, in a controlled environment with the industry's best hardware and tracks picked for their dynamic range, I honestly would have thought more than 75% would hear a difference. But here's the thing:
a. Most recorded audio people consume is tuned for 16-bit, because they're trying to appeal to the masses.
b. Of those who have 24-bit compatible hardware and a system capable of playing back 24-bit audio, not everyone lives in an environment where they can actually appreciate it, and not everyone wants to wear headphones. You kinda need to crank the volume pretty high to make the most of a 24-bit recording, and that's a bit of a privilege for most people. I'm fortunate enough to not have close-by neighbors to annoy, but I know I'm lucky in that regard.
c. Of those who have the means and the privilege of hearing 24-bit in all its glory, according to your study, only 75% can hear the difference. That already calls into question whether it really matters. Which brings me to my next point:
d. Of those who can hear a difference, how many of them have a reason to care? Just because you can perceive the difference, doesn't make it desirable. Sure, maybe if I'm trying to zone out in some atmospheric music, 24-bit would be pretty cool since I'm actively focusing on the subtle differences at that point. But most of the time, I just want to hear what I'm supposed to hear. I've watched movies with 24-bit audio at home and honestly... I'd rather it not be. The quiet stuff was too quiet and the loud stuff was annoyingly loud. And yeah, 16 bit doesn't necessarily fix that, but audio optimized for 16-bit seems to take it a little easier on the dynamic range.
It's the same sort of problem with photography - who cares about the authenticity of the dynamic range when most people don't have a display to see the difference? Of those who do have a proper display, not all of them can differentiate what's in the darkest/lightest spots. Of those who can differentiate, how does that improve the image in any way? Whether you're talking recorded audio or recorded images, you will never get a fully accurate re-creation, so why focus so much on minute details that otherwise contribute nothing toward the experience?
Anyway, I have no doubt whatsoever that digital filters impact the result. That's a given. But like I said, that doesn't have much to do with what the digital chip itself is capable of, hence getting external filters and amps.
3. I agree, and again, that doesn't say anything about how I'm wrong. I'm not fluent in the terminology, but when I spoke of amplifiers, I meant DACs and the other downstream components as well. Really, the point of my post was that you can have the cheapest Realtek audio chip today and its specs are more than good enough; what will really impact the sound quality is everything between the chip and your ears. It doesn't matter what the chip is capable of if the DAC or amplifier is crap. It doesn't matter how good those are if they're in an electrically noisy environment. It doesn't matter how good any of the system is if your drivers are made of paper, fridge magnets, and rusty iron wire.
Audio quality is determined by the worst component in the chain. Realistically, the digital part will never be the source of poor quality, so as long as you have a chip that covers the limits of human hearing (24-bit/44.1 kHz), all that matters is everything that comes after it.
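A crude way to put numbers on the "worst component" point (a Python sketch with made-up SNR figures, not measurements; real noise sources combine in quadrature rather than a hard minimum, but min() captures the weakest-link idea):

```python
def effective_bits(snr_db: float) -> float:
    """ENOB: the resolution a chain actually delivers for a given SNR."""
    return (snr_db - 1.76) / 6.02

chip_snr   = 6.02 * 24 + 1.76   # ideal 24-bit converter: ~146 dB
analog_snr = 90.0               # hypothetical noisy onboard output stage
chain_snr  = min(chip_snr, analog_snr)

# The "24-bit" chain effectively delivers fewer bits than a clean
# 16-bit path would.
print(round(effective_bits(chain_snr), 1))  # 14.7
```

In other words, once the analog stage caps the chain at 90 dB, the headline spec of the digital chip stops mattering entirely.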
tunejunky
schmidtbag