Post by RockHard on Jul 24, 2016 3:49:31 GMT -5
A lot of worthless hype to me
I've listened to a few 192's & 96's & converted them to 44.1 for a side-by-side comparison in my editor, & in my opinion the 44.1's sounded fuller & smoother. To me the hi-rez rips had a thinner body to them. I wondered if my hearing just didn't appreciate the extra clarity the hi-rez rips supposedly impart, but then I found an article arguing against them that made sense @ xiph.org/~xiphmont/demo/neil-young.html
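For anyone who wants to repeat the comparison outside their editor, here's roughly how I'd do the conversion (a minimal sketch, assuming Python with the soundfile & scipy packages installed; the file names are just placeholders):

# downsample a hi-rez file to 44.1k for side-by-side listening
from math import gcd
import soundfile as sf
from scipy.signal import resample_poly

data, rate = sf.read("track_96k.flac")      # placeholder input file
target = 44100
g = gcd(target, rate)
# polyphase resampling applies the anti-alias low-pass for us
converted = resample_poly(data, target // g, rate // g, axis=0)
sf.write("track_44k.flac", converted, target)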
...Basically the article states the obvious reality that no one can discern an audio signal beyond 20k in the 1st place, no argument there, & second that any content in the ultrasonic range (up to 96k in a 192k file, since a given sample rate can only capture up to half its value) can create intermodulation distortion havoc in the audible spectrum. There was supporting evidence I'm not going to elaborate on, but basically it confirmed a possible reason I found the hi-rez less pleasurable: artifacts above the hearing range damaging what I was actually hearing. And unless you can drive the audio into a spectrum analyzer to check for artifacts, there's no way to know whether any are there degrading your audio, so why take a chance in the 1st place when 44.1k already samples at more than twice the 20k hearing limit, which is all that's needed to capture it?
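Just to make the intermodulation point concrete (a toy sketch in Python with numpy; the two tone frequencies & the 2% nonlinearity are numbers I made up, not measurements of any real gear):

import numpy as np

# two ultrasonic tones hitting a slightly nonlinear stage
fs = 192000                       # hi-rez sample rate
t = np.arange(fs) / fs            # one second of samples
ultra = np.sin(2*np.pi*26000*t) + np.sin(2*np.pi*29000*t)   # both inaudible
distorted = ultra + 0.02 * ultra**2    # mild 2nd-order nonlinearity

# the squared term contains a 29k - 26k = 3k difference tone: dead audible
spectrum = np.abs(np.fft.rfft(distorted)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1/fs)
print("level near 3k:", spectrum[np.argmin(np.abs(freqs - 3000))])

Neither input tone is audible on its own, but the 3k product lands right in the middle of the hearing range.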
Then there's the data space: is a file so huge you couldn't fit an album on a single CD worth the minute theoretical improvement? I don't know anyone more rabid than me about audio quality & that just doesn't compute for me, so my conscience is pretty clear that I'm not depriving my ears of anything I should be concerned over.
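The storage arithmetic, if anyone wants to sanity-check me (a quick sketch for raw uncompressed stereo PCM, so FLAC would shave some of this off):

# megabytes per hour of raw stereo PCM
def megabytes_per_hour(rate_hz, bits, channels=2):
    return rate_hz * (bits / 8) * channels * 3600 / 1e6

print(megabytes_per_hour(44100, 16))    # ~635 MB -- an album fits a 700 MB CD
print(megabytes_per_hour(192000, 24))   # ~4147 MB -- roughly 6.5x the size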
But I do have an issue I can't resolve in my mind, from an experiment converting a 44.1k file down to 22k. If my understanding is right that 44.1k is capable of encoding signals across double the audible range, then 22k should still sit well past our hearing range & there should have been no loss of fidelity, but I heard substantial degradation. From the article I referenced above I had the impression that shouldn't have been possible, so I'm left wondering if there's something I'm missing.
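If anyone wants to poke at the same question, here's the test I keep meaning to run (a sketch assuming Python with numpy & scipy; the 15k tone is an arbitrary pick, chosen because it's audible but high):

import numpy as np
from scipy.signal import resample_poly

fs = 44100
t = np.arange(fs) / fs
tone = np.sin(2*np.pi*15000*t)     # 15k: audible, & below 44.1k's 22.05k ceiling

down = resample_poly(tone, 1, 2)   # 44.1k -> 22.05k
print("peak after downsample:", np.max(np.abs(down)))   # prints near zero

If the ceiling is half the sample rate rather than the whole rate, a 22k file can only carry content up to about 11k, & the near-zero result above would line up with the degradation I heard.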
I have noticed 48k files do seem to have more clarity & dynamics, especially as mp3s, but beyond that I can't tell any difference.
Your thoughts anyone?