if you have a musical piece with the lowest volume being 1 dB above total, dead silence (which is really not possible except out in space)
Oooo I think you might be on to something...we must start recording in space!!!!
In all seriousness though, the sampling rate conundrum is the most interesting one, I think, for two reasons:
1. There seem to be differing opinions about whether higher sampling rates are destructive. One camp says they are, for the reasons mentioned in the article; the other says that sampling at higher rates pushes aliasing, artifacts, etc. outside of the audible range, thus improving audio quality. Which is the exact opposite of the first camp's claim that high rates create aliasing and artifacts inside the audible range.
2. When a trumpet or cymbal is heard live, the frequencies we cannot hear are still physically present and hitting our bodies. Ergo, it may sound more 'natural' if these inaudible frequencies are included, even if we cannot consciously hear them. Plus, keeping them preserves high-order harmonics that would otherwise be absent; although we cannot hear those directly, they could affect the waveform and give a slightly different sound (perhaps in a similar way that the exact same note played through a tube amp and a solid-state amp will have a different color to it).
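For what it's worth, the fold-down behavior the first camp worries about is easy to sketch. This is a toy calculation, not a model of any real converter (real ADCs have anti-alias filters in front of them): it just shows where a pure tone above Nyquist lands after sampling.

```python
def alias_frequency(f, fs):
    """Frequency (Hz) a pure tone of f Hz appears at when sampled at fs Hz
    with no anti-alias filtering: tones fold back around multiples of fs."""
    f_mod = f % fs
    return min(f_mod, fs - f_mod)

# A 30 kHz ultrasonic partial sampled at 48 kHz folds into the audible band...
print(alias_frequency(30_000, 48_000))  # -> 18000
# ...while at 96 kHz it is captured where it actually is, above hearing range.
print(alias_frequency(30_000, 96_000))  # -> 30000
```

So in this idealized picture, the higher rate keeps the ultrasonic content out of the audible band; whether real playback gear then turns that content into in-band intermodulation is exactly the part the two camps disagree about.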
I didn't bother clicking the article link, but I am assuming it is the same one I read a month or so ago... at some point it compared the situation to a videophile who wants great microwave, IR, UV, X-ray, and gamma performance from his TV or something. It even says to use an IR remote in pitch black to see if you can see it (you can't... obviously).
This is a broken argument. Why? Have you ever hung out around infrared light that you can't see? Yeah, that light is friggin' hot, and you feel its presence even though you cannot see it. So if, during a desert scene in a movie, your TV started putting off IR light to simulate the heat you would feel in the desert, would that not be even more immersive and 'accurate'? I vote yes. Similarly, we cannot see UV light, but we can see its effects when it hits UV-reactive artwork. If, during a black-light scene in a movie, UV light were to come from the TV and hit your UV poster, would that not also feel more accurate and realistic? Again, I vote yes. This whole 'we can't see it so it doesn't matter' argument is a little lost on me, because we encounter IR and UV all the time in our lives and NOTICE quite obviously that they are present. So if we have the ability to preserve these effects, why not? (Disclaimer: yeah, you could get cancer or something; meh, that's not the point now, is it?)
Does this correlate to audio? Probably, at least as an analogy. Sound is a pressure wave rather than an electromagnetic one, but the 'present and felt even though it's outside our perception' argument is the same. Although, in theory, the effect would likely be much less obvious than with light.
About bit depth:
Recording in space aside... (come see me in 15-30 years; hopefully I'll have a studio up there by then, lol)
If mastering in 24-bit and distributing in 16-bit helps lower the noise created during mixing and editing, then mastering in 32-bit and distributing in 24-bit would only seem to increase this effect and hopefully provide an even cleaner-sounding track. I don't think that part is really a debate. I think that would be a fact.
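The textbook numbers back this up: each bit of depth buys roughly 6 dB of theoretical dynamic range for an ideal quantizer (20·log10(2) ≈ 6.02 dB per bit). A quick check of the usual figures:

```python
import math

def dynamic_range_db(bits):
    """Theoretical quantization SNR of an ideal fixed-point converter:
    20 * log10(2**bits), i.e. about 6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

for bits in (16, 24, 32):
    print(bits, round(dynamic_range_db(bits), 1))
# 16 -> 96.3 dB, 24 -> 144.5 dB, 32 -> 192.7 dB
```

So every processing step's rounding noise sits that much further below the signal at the higher depth, which is exactly why the extra headroom during production is cheap insurance even if the release is truncated later.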
Now the question is, would that same 32-bit master sound the same in a 16-bit release as it did in 24-bit? I'm not sure, but I would think (especially with clever techniques that may or may not exist yet) that the extra bit depth could be used specifically to eliminate noise, something like keeping a large dynamic gap between the music and the noise. For example, use 18 bits for the music, 4 for noise, and 2 for the gap: the loudest noise level tops out at 0x00000F, the quietest music level sits at 0x000040 (just above the gap), and the highest music level is at 0xFFFFFF. I have no idea if that's even possible or if it would have any effect, but hey, it's at least some food for thought.
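Just to make my own hypothetical concrete (this partition scheme is pure speculation on my part, no real format works this way), here is the 24-bit layout described above spelled out in code:

```python
# Hypothetical 24-bit split: low 4 bits for noise, a 2-bit guard gap,
# and the top 18 bits for music. Purely illustrative.
NOISE_BITS, GAP_BITS, MUSIC_BITS = 4, 2, 18

noise_ceiling = (1 << NOISE_BITS) - 1                             # loudest noise sample
music_floor   = 1 << (NOISE_BITS + GAP_BITS)                      # quietest music level
full_scale    = (1 << (NOISE_BITS + GAP_BITS + MUSIC_BITS)) - 1   # loudest music level

print(f"{noise_ceiling:06X} {music_floor:06X} {full_scale:06X}")
# -> 00000F 000040 FFFFFF
```

Note that the quietest music level lands at 0x000040, one step above the top of the guard gap (0x00003F).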
Basically, what I am saying is that it could have some positive effects that we are only just now getting to a point in hardware to be able to realize. To simply brush it off as being at inaudible frequencies, or to say that a recording studio can't even get quiet enough to make use of the bit depth, is a bit naive and kind of like saying "color TV? why would anyone in the world want that?" or "HDTV? why would someone want that? You are just going to see a lot more pores and wrinkles on that older actress you thought was smoking hot."
I am not saying that Hi-Res is the real deal either; I'm just playing devil's advocate so we can have a proper discussion rather than all jumping on our fanboy bandwagons and saying junk like "oh yeah, I totally hear a massive difference between 96 and 192" or "no way man, it sounds the same to me." That just isn't productive and gets nobody anywhere.
Not to mention most of our hearing is quite damaged in one way or another, for one reason or another...
Those darn kids have those text-message tones that their teachers and parents can't hear... so would a 2-year-old have a better chance at spotting differences? Hell yeah, one would! But they are two, not exactly the most articulate or experienced bunch, are they? So it would be pretty hard to test that, too.
Basically, the only real way to know is to record in space with the best possible equipment, master in 64/768, publish in 32/384, and let a 3-5 year old who has been raised with the sole goal of not damaging his or her ears listen to it through the world's greatest source/DAC/amp and phones, also in space.
And who knows? With all the genetic data we are generating and the advances in bioengineering, maybe before too long we can splice some dog or bat DNA into our own and actually be able to hear these frequencies. So perhaps we will all be able to hear past 20 kHz in the future. Maybe then it will be the obvious choice?
I'll shut up now