It’s two dimensions. You need both good bit depth (kbps) and sample rate (Hz) for quality. But yeah 96 kHz is more than double 44 so of course it’s significantly better.
There is however a point of diminishing returns and I’d certainly say that’s in play beyond 320 kbps (or beyond 96 kHz for that matter).
Bit depth is not the same as bitrate, and there is no difference in the signals that can be reproduced within the range of human hearing between a 44.1 kHz sample rate and a 96 kHz one.
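(For anyone following along: the claim above is just the Nyquist–Shannon sampling theorem, which says a sample rate of fs captures every frequency below fs / 2, so 44.1 kHz already covers the entire ~20 kHz range of human hearing. The sketch below only illustrates that point; the 18 kHz test tone and one-second duration are arbitrary values picked for the example, not anything from the thread.)

```python
# Minimal sketch of the Nyquist point above: a sample rate of fs can
# represent any frequency below fs / 2, so both 44.1 kHz and 96 kHz fully
# capture an 18 kHz tone (well inside the ~20 kHz limit of human hearing).
# The tone frequency and duration are arbitrary illustration values.
import numpy as np

TONE_HZ = 18_000      # test tone, within the audible range
DURATION_S = 1.0

for fs in (44_100, 96_000):
    t = np.arange(int(fs * DURATION_S)) / fs
    x = np.sin(2 * np.pi * TONE_HZ * t)

    # Locate the dominant frequency in the sampled signal's spectrum.
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    peak_hz = freqs[np.argmax(spectrum)]

    print(f"{fs} Hz sampling -> peak at {peak_hz:.1f} Hz "
          f"(Nyquist limit {fs / 2:.0f} Hz)")
```

Both runs report the same 18 kHz peak; the higher rate only raises the ceiling on frequencies nobody can hear.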
Any audiophile would argue with you that the extra Hz help with harmonics that do influence the timbre and subtle qualities that are within hearing range. (/s, since someone needs it)
I personally don’t care, I’m happy with 44 kHz for nearly everything.
I am an audiophile, not an idiot. They don’t. The slim possibility that reproducing signals past 20 kHz causes audible changes to the signal within the audible range may technically exist, but you will never demonstrate the ability to detect a difference in a double-blind test.
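(If it helps, here is roughly how a double-blind ABX result gets judged: under the null hypothesis that the listener can’t hear a difference, every trial is a coin flip, and you ask how unlikely the score would be by pure guessing. The 16-trial figures below are made-up example numbers, not data from anywhere.)

```python
# Rough sketch of scoring an ABX listening test: if the listener truly
# cannot hear a difference, each trial is a 50/50 guess, so the p-value
# is a one-sided binomial tail. The trial counts are invented examples.
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of getting at least `correct` hits by pure guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(f"12/16 correct by chance: p = {abx_p_value(12, 16):.3f}")
print(f"8/16 correct by chance:  p = {abx_p_value(8, 16):.3f}")
```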
The only reason to use a sample rate higher than 44.1 kHz is to avoid resampling audio that is already at a different sample rate, e.g. video/DVD audio, which is usually 48 kHz, or potentially “hi-fi” sources that may be 96 kHz or higher. Resampling can theoretically introduce audible artifacts, although a modern CPU running a modern resampling algorithm can very easily perform transparent resampling in real time.
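(To make the resampling point concrete, below is a rough sketch of a 48 kHz to 44.1 kHz conversion with a polyphase resampler. Using scipy’s resample_poly and a 1 kHz test tone are my own choices for illustration; the comment above doesn’t name any particular tool.)

```python
# Minimal sketch of the resampling case described above: converting 48 kHz
# material to 44.1 kHz with a polyphase filter. scipy is assumed to be
# available; the rates and the 1 kHz test tone are illustrative values.
import numpy as np
from scipy.signal import resample_poly

SRC_RATE = 48_000
DST_RATE = 44_100

# One second of a 1 kHz tone at the source rate.
t = np.arange(SRC_RATE) / SRC_RATE
x = np.sin(2 * np.pi * 1_000 * t)

# 44100 / 48000 reduces to 147 / 160, so resample by that rational factor.
y = resample_poly(x, up=147, down=160)

print(f"input:  {len(x)} samples at {SRC_RATE} Hz")
print(f"output: {len(y)} samples at {DST_RATE} Hz "
      f"(expected ~{SRC_RATE * 147 // 160})")
```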
Ok, fine, whatever, I don’t really care. I almost never have a reason to resample anything nor the equipment to tell the differences in any of this. You keep having fun and correcting people if that’s what gets you off.
Ok, go ahead and continue posting misinformation and getting mad about being corrected instead of just learning
You’re aware of who this meme was mocking in the first place, right?
Being proud of ignorance is a really cool trait
Get off your damn horse man, this thread is for poking fun at twats who have to correct everybody. If you’re triggered by it, well good I guess. Maybe you’ll learn, or go someplace where technical accuracy matters. Because here it doesn’t.