I clipped the quote because it was getting terribly long......
2)
I can't speak for other people's speakers or their capabilities.
I'm also not claiming that I choose my speakers based on their response above 20 kHz.
I'm simply disproving the claim that "speakers don't have useful response above 20 kHz".
(The claim is wrong because, clearly, at least SOME speakers do.)
3)
Offhand I have no idea who is filtering what where... nor what the response limitations are on the specific microphones they may have used (or their synthesizer's frequency response if it's electronic).
I simply prefer for my equipment to be able to reproduce the entire musical spectrum if it's there in the recording.
(I consider that to be the definition of "accurate reproduction".)
4a) - 5a)
I agree that most of us don't actually need to accurately reproduce CD-4 content....
Likewise, I agree that most of us will probably never need the erase-carrier tone leakage (and, yes, special tape heads and electronics were used)....
My point, which I stick by, is that we don't always know our future needs in advance....
(And, if someone says "if you turn the volume up at 14:27 you can hear the musician cough" or "a bat got into the studio and you can see his squeak on a spectrum display at 34:10" I would prefer to be able to try it.)
And, yes, I would prefer a full-spectrum copy of the tape rather than one that has been limited to "what I need" - based on someone else's ideas of what that comprises.
Again, I simply see the goal as an absolutely perfect and complete rendition of the original.
And I see ANY AND ALL LIMITATIONS as a compromise.
And I prefer to avoid compromises unless absolutely necessary (or unless I agree to them).
Someone asked, jokingly, whether videophiles would prefer it if their TV could reproduce gamma ray frequencies so accurately they could be burned by the video of a nuclear explosion.
While the example is absurd, I would say that the answer is technically yes.
We would be better off if our video display could reproduce frequencies from DC to gamma rays - and then we or the producer of the video could DECIDE which ones to limit or omit.
(I'm sure the government would cheerfully add some sort of safety standard to cover that aspect of things.)
However, in that case, both the safety and technical limitations JUSTIFY setting a standard that falls short of that lofty goal.
As a counter-example I would offer the color purple.
Most NTSC-standard TV sets don't reproduce a deep yet bright saturated purple color very well at all.
Therefore, most SD videos rarely show anything in bright purple - like a shiny amethyst necklace.
We have a situation where "the equipment can't play the signal" and "there's no point in including the signal because nobody's TV will be able to play it".
It has reached a point where set designers avoid using certain colors because they know those colors will be poorly reproduced.
(And, when you play those few videos that ignore the limitation on a full-spectrum monitor, you see an immediate difference. Now, interestingly, delivering the full color gamut - or more of it - is a major selling point of 4k HDR.)
6)
I do have an easy question for you......
If "quality assurance is admirable" (your words), then why are you arguing against it?
7a)
I'm not concerned with fakes... or of how to distinguish them from legitimate copies.
In either case, a properly provenanced legitimate original that is undamaged will virtually always be worth more than one that is damaged.
Likewise, even an undamaged copy will be worth more than a damaged copy.
Notice that you said "VISIBLY" repaired..... while I didn't include that qualifier......
While visible repairs are surely worse than invisible ones.... damage is still damage.
If someone buys your expensive vase, and finds a repaired crack when they x-ray it, they probably WILL sue you if you claimed it hadn't been repaired.
"Visible" and "nonexistent" are not the same thing at all...... (otherwise a perfect forgery of a Rembrandt would really be just as good as the original).
Yes, if you can find a lossy compression method whose output will be INDISTINGUISHABLE from the original, using our ears, or any test we can devise, then it will be perfect.
(Except, of course, by definition, it WON'T be lossy at that point.)
The fact that some people find it easier to identify imperfect originals than perfect copies is irrelevant....
I am WELL aware of how lossy compression works.......
And it ALL amounts to "discarding content that someone else has decided I won't notice is missing".
(And I have very little faith in the choices made by other people - especially when those choices are often made based on "what 95% of people won't notice" rather than specifically on "what ***I*** won't notice".)
8)
You're entirely incorrect in one regard......
While the compression level used in JPG is variable - THERE IS NO SETTING IN THE JPG STANDARD THAT EXACTLY REPRODUCES THE ORIGINAL.
Even if you set a compression level that results in a file that is larger than the RAW file it is still lossy (you cannot retrieve the original pixels exactly).
A RAW file contains all of the information that was available from the camera, which is why it gives you the most flexibility and retains the most information.
An "uncompressed TIFF file" contains most but not all of the information; but is a much more standard format - which justifies the slight loss.
A JPG file discards a significant amount of information.... a lot of it is approximated, and some is discarded outright.
I also suspect you haven't edited many image files.... especially JPGs.
The method of compression used by JPG is applied to square zones of the image (each 8-by-8 pixel block is processed separately).
The compression is applied with certain constraints that ensure that the seams between adjacent squares won't be visible.
However, those constraints are based on several assumptions, including the conditions under which the image will be displayed, and the characteristics of the monitor or printer that will be used.
As a result, even though a given JPG may look "visually perfect" under very specific conditions, its failings tend to become unpleasantly obvious when you change the conditions.
Specifically, when you adjust the brightness, contrast, or color saturation, the sharp discontinuities at block boundaries become visible, and you get that "JPG blockiness artifact" that so many people find annoying.
(I'm ignoring the fact that virtually every pixel has been changed from its original color even though the net overall difference may be "imperceptible" to most people.)
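To make that concrete, here's a rough Python sketch of the transform-and-quantize step JPG performs on each 8x8 block. (The uniform quantization step here is made up for illustration - real JPG uses per-frequency quantization tables - but the mechanism is the same: the rounding of DCT coefficients is exactly where the pixels get changed.)

```python
import math

N = 8  # JPG compresses the image in 8x8 pixel blocks

def c(k):
    return math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)

def dct2(b):
    """Orthonormal 2-D DCT-II of an 8x8 block (the transform JPG uses)."""
    return [[c(u) * c(v) * sum(b[x][y]
                 * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                 * math.cos((2 * y + 1) * v * math.pi / (2 * N))
                 for x in range(N) for y in range(N))
             for v in range(N)] for u in range(N)]

def idct2(F):
    """Inverse of dct2."""
    return [[sum(c(u) * c(v) * F[u][v]
                 * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                 * math.cos((2 * y + 1) * v * math.pi / (2 * N))
                 for u in range(N) for v in range(N))
             for y in range(N)] for x in range(N)]

# A block with a sharp vertical edge - like a letter on a bright screen.
block = [[0] * 4 + [255] * 4 for _ in range(N)]

# "Quality" boils down to a quantization step applied to the DCT
# coefficients; the rounding here is where information is destroyed.
step = 16
quantized = [[round(v / step) * step for v in row] for row in dct2(block)]
recon = idct2(quantized)

# Even with this generous step size, the decoded pixels are not the
# original pixels - the error shows up as ringing around the edge.
err = max(abs(recon[x][y] - block[x][y])
          for x in range(N) for y in range(N))
assert err > 0.4
```

Run it on a smooth gradient instead of the edge and the error is far smaller - which is the "continuous tones" optimization mentioned below.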
Of course, there is an "original loss" because, while many cameras may exceed the performance of the human eye in specific ways, none so far exceeds the human eye in ALL regards simultaneously.
Take a picture of this posting, on a really sharp screen, and save it as RAW and as JPG.
Blow it up so you can see the individual letters.
I would be very surprised if you fail to see odd ghosts and echoes around the edges of the letters.
(Because JPG was optimized to compress "pictures with continuous tones", which works well for photographs, but does a relatively poor job on sharp edges and narrow lines.)
And, yes, there is a parallel in audio.....
If the audio recording was originally recorded in MP3, then, yes, the most accurate rendition of the original would be a copy of that MP3.
And, if it was recorded at 24/96k, then that file would be the most accurate version.
(Or, assuming it was mixed, that would be the final output file that was sent to the production house.)
BUT, if you convert that MP3 file to 24/96k, you will be able to get back a very close approximation of the original MP3 file.
HOWEVER, if you convert that 24/96k file to an MP3 file, you will NOT be able to get back a close approximation of the original 24/96k file.
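Here's a toy illustration of that asymmetry (coarse rounding stands in, very loosely, for the detail MP3 discards; the sample values and step size are made up):

```python
# Coarse quantization as a stand-in for a lossy encoder.
def to_lossy(samples, step=0.125):
    return [round(s / step) * step for s in samples]

hires = [0.1234, -0.5678, 0.9012]  # pretend these are hi-res samples
lossy = to_lossy(hires)

# Lossy -> hi-res container -> lossy again: nothing further is lost,
# because the coarse values are representable exactly at the higher
# resolution. That's why an MP3 converted to 24/96k "round-trips".
assert to_lossy(lossy) == lossy

# But hi-res -> lossy is one-way: the fine detail is gone for good.
assert lossy != hires
```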
It all comes back to your original assertion that "none of the stuff we're throwing away really matters"...
I will agree that, for most people, on the equipment they'll be using, they probably won't notice the difference.....
But, if I was hoping to use the signature of that background noise to determine the brand of recorder the original was recorded on, then you have ruined my project if you discard it.
YOU have decided that the background noise is NOT "a meaningful part of the recording" - but perhaps we don't all agree.
And, yes, if you're the recording engineer, then, by definition, any such decision you've made is "correct".
9)
We seem to be in perfect agreement on the final conclusion.
Personally, I want the unlimited version, and am willing to pay a little extra for it.
However, I do absolutely agree that many people may find no benefit to high-resolution files, and shouldn't buy them.
Incidentally, as for your other comment...... ALL audio circuitry has bandwidth limitations.....
They may be inherent in the circuit design or specifically imposed using an external filter - but they are necessary.
Essentially ALL circuitry produces noise... and, in general, that noise has an infinite bandwidth.
So, even though your microphone may not pick up 30 kHz, the active devices in your microphone preamp are putting out SOME LEVEL of 30 kHz noise (and some level of 10 MHz noise).
If you're planning to digitize that output, you must filter out any significant noise above the Nyquist frequency to prevent audible aliasing and other errors.
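A small Python check of why that filtering matters: a 30 kHz tone sampled at a hypothetical 48 kHz produces exactly the same samples as an (inverted) 18 kHz tone - and once it's in the samples, nothing downstream can tell the two apart:

```python
import math

fs = 48_000              # hypothetical ADC sample rate
f_ultra = 30_000         # a tone above the Nyquist frequency (fs/2 = 24 kHz)
f_alias = fs - f_ultra   # it folds down to 18 kHz - squarely audible

for n in range(64):
    t = n / fs
    s_ultra = math.sin(2 * math.pi * f_ultra * t)
    s_folded = -math.sin(2 * math.pi * f_alias * t)  # inverted 18 kHz tone
    # Sample for sample, the two signals are identical, so unfiltered
    # 30 kHz noise becomes audible 18 kHz noise in the recording.
    assert abs(s_ultra - s_folded) < 1e-9
```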
You also MIGHT have issues, as some have suggested, with intermodulation distortion caused by high-frequency noise (or content).
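A quick sketch of that mechanism (the tone frequencies and the mild square-law nonlinearity are made up for illustration): two ultrasonic tones passing through a slightly nonlinear stage create a difference tone right in the audible band.

```python
import cmath
import math

fs, N = 192_000, 96          # sample rate, and a whole number of cycles
f1, f2 = 24_000, 26_000      # two ultrasonic tones (both above 20 kHz)
f_diff = f2 - f1             # 2 kHz - squarely audible

def bin_mag(samples, freq):
    """Magnitude of the single DFT bin at `freq`."""
    k = round(freq * N / fs)
    return abs(sum(s * cmath.exp(-2j * math.pi * k * n / N)
                   for n, s in enumerate(samples))) / N

x = [math.sin(2 * math.pi * f1 * n / fs) + math.sin(2 * math.pi * f2 * n / fs)
     for n in range(N)]

# A mildly nonlinear stage (think: an amplifier running out of headroom
# at ultrasonic frequencies): y = x + 0.1 * x^2.
y = [s + 0.1 * s * s for s in x]

# The clean signal has essentially no energy at 2 kHz...
assert bin_mag(x, f_diff) < 1e-6
# ...but the nonlinearity creates an audible 2 kHz difference tone.
assert bin_mag(y, f_diff) > 0.01
```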
And you really wouldn't want to die because your amplifier accurately amplifies the leaked noise it picks up from your microwave oven or cell phone.
There's also another technical issue: MOST (probably all) amplification circuits exhibit phase shift that increases as the frequency increases.
As a result, at some very high frequency, the phase shift is such that the circuit's negative feedback becomes positive feedback, and the circuit oscillates.
(And virtually all modern audio circuitry employs negative feedback.)
Because of this, audio circuits are designed so that their gain falls at high frequencies..... which is another way of describing "a bandwidth limiting filter".
With older equipment, this limitation was often more or less a random result of the overall design.... or simply a matter of luck.
However, with modern design, it is assumed that each piece of equipment has been designed to "protect itself from anything it would have a problem with".
Basically, with analog equipment, the circuit is designed so, as you look at higher and higher frequencies, the gain reaches unity or lower before the phase shift reaches a dangerous point.
(Most modern amplifiers have a frequency response that starts rolling off significantly around 80 kHz or so - as that is considered to be "well outside the audible frequency range".)
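To sketch why the rolloff and the stability requirement are the same thing (the corner frequency and pole count here are made up for illustration): with three identical poles, the loop phase reaches -180 degrees at fc x sqrt(3), where each pole has also cut the gain in half.

```python
import cmath
import math

fc = 100_000  # hypothetical corner frequency of each stage

def loop_response(f, dc_gain):
    """Open-loop gain: a flat gain followed by three identical poles at fc."""
    pole = 1 / (1 + 1j * f / fc)
    return dc_gain * pole ** 3

# Each pole contributes up to -90 degrees of phase shift; three poles hit
# -180 degrees (negative feedback turns positive) at f = fc * sqrt(3).
f180 = fc * math.sqrt(3)
phase = math.degrees(cmath.phase(loop_response(f180, 1.0)))
assert abs(abs(phase) - 180) < 1e-6

# At that frequency each pole has also halved the gain, so the loop gain
# is dc_gain / 8. Keeping it below 1 there - i.e., letting the gain fall
# before the phase gets dangerous - is exactly a bandwidth-limiting filter.
assert abs(loop_response(f180, 4.0)) < 1.0   # gain margin left: stable
assert abs(loop_response(f180, 16.0)) > 1.0  # would oscillate
```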
9. I have no issue with capturing the original in as complete a representation as possible, and distributing that version. I have big problems with the general assumption that high sampling rates and bit depths result in categorically better sound, and question strongly the value of ultrasonic content over 20 kHz and the ability of the average or even high-end user to get that energy to his eardrums (much less to actually hear it).

I do think there are cases where ultrasonic energy, real or distortion products, can cause problems in devices not capable of handling them without distortion. I do believe there are cases where the process of band-limiting audio by the use of certain devices results in intermodulation distortion that can be folded down into the audible region, making wideband audio sound better for a reason other than the ultrasonic content.

I also believe that those cases are fewer today than, say, 25 or 30 years ago, and the real solution is to test for high-frequency intermodulation and deal with the cause rather than to band-aid a solution either by passing a wider bandwidth or limiting ultrasonics by filtering. Only a few decades back there were audio products that included ultrasonic filters to prevent those signals from wreaking havoc in other devices. Hopefully today those devices are few, but I doubt they're completely gone.