You see, portable and streaming digital formats killed it all, because of the lack of standards and the poor quality of the standards we did have.
The loudness war was already in full swing before portable DMPs, MP3, and streaming were more than a dream. So they can't be the cause.
We have several people to blame for this: computer manufacturers (cheap laptop and desktop speakers), the MP3 format itself (limited bandwidth and resolution), the adoption of low resolution as a consumer standard (instead of 96 kHz), the semi-pro audio market forcing a norm of low-res multitrack audio (44.1-48 kHz), the RIAA/EBU/ISO for not establishing a standard reference level for recording and reproduction, and converter manufacturers with sub-par signal-to-noise ratios at established capture levels (-16 dBFS through -10 dBFS, with maybe 85 dB S/N if you're lucky).
RBCD was standardized in late 1981 and released in 1982. The original PCM adapter made for recording digital audio on a video deck, the Sony PCM-1600, released in 1978, and its successors, the PCM-1610 and PCM-1630, all had clear reference markings on special true-peak meters. Analog tape references were only standardized if you consider three separate reference fluxivities "standardized", but in reality recordings were made with averages and peaks well outside those levels all the time. Except the resulting S/N ratios were 0-20 dB worse than using any of your cited PCM reference levels. In fact, PCM levels were initially more critical because of what happens when you exceed 0 dBFS. Setting reference levels with PCM was therefore much different from how levels were set on analog machines, mostly out of concern for headroom. The original CD mastering standards allowed for absolutely no peaks above 0 dBFS.
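The headroom arithmetic behind those S/N numbers is easy to sketch. A minimal illustration, using the standard theoretical SNR of an ideal N-bit quantizer with a full-scale sine (6.02·N + 1.76 dB) and the -16 dBFS reference level mentioned above; real converters land a few dB below these ideal figures:

```python
# Theoretical SNR of an ideal N-bit PCM quantizer driven by a
# full-scale sine wave: SNR = 6.02*N + 1.76 dB.
def ideal_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

# Recording with headroom lowers the signal, not the quantization
# noise floor, so the usable S/N shrinks by the headroom amount.
def snr_at_reference(bits: int, ref_dbfs: float) -> float:
    return ideal_snr_db(bits) + ref_dbfs  # ref_dbfs is negative

print(f"16-bit at   0 dBFS: {ideal_snr_db(16):.1f} dB")        # ~98.1 dB
print(f"16-bit at -16 dBFS: {snr_at_reference(16, -16.0):.1f} dB")  # ~82.1 dB
```

Which is roughly where the "85 dB S/N if you're lucky" figure comes from once you leave 16 dB of headroom on a 16-bit system.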
The problem is that when recording with proper headroom, the signal's original harmonic response is attenuated by the virtual ground (Vref) established in the ADC. There are very few converters engineered to avoid high-impedance virtual grounds (RCF and BURL are the two I know of).
Huh? What “virtual ground”? Any professional ADC of quality runs on a bipolar supply, no virtual ground needed. Regardless, harmonic structure is always preserved or the system couldn’t have flat response. Even worse, if you mess with harmonic structure you mess with the very character of the sound. So that's not possible.
Also, 20 Hz to 20 kHz is 80% of the range that used to be high fidelity (5 Hz-50 kHz), and professional analog mixer and processor circuits typically have a design bandwidth of 10 Hz-100 kHz.
Whoa, a minute there. Who told you that? High fidelity was never 5 Hz-50 kHz, and professional analog mixers rarely could pass 100 kHz flat. An analog mixer is a very complex device, with the signal passing through many stages of amplification, attenuation, filtering, mixing, etc. Each of those stages has a practical maximum 3 dB down point, and when you cascade them all, it gets worse. Then toss in a few audio transformers, which they did have, and typically you're making it flat to 25-30 kHz. Very few analog "processors" (not sure what you mean, so generically) operated to 100 kHz either; in fact, I know of only one. No idea where you heard that, but as someone in that industry since the early 1970s, I can tell you first hand that it's wrong. And, BTW, an analog recorder, even the best, optimized for the specific pancake of tape on it at the time, running at 15 ips, cannot record 5 Hz-100 kHz, or 5 Hz-50 kHz. It's more like 30 Hz-20 kHz, slightly higher if you don't mind an increase in distortion.
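The cascading argument is simple to check numerically. A rough sketch, assuming every stage behaves like an identical first-order low-pass filter with the same cutoff (a simplification; real stages have different cutoffs and orders):

```python
import math

def cascaded_cutoff(fc_hz: float, n_stages: int) -> float:
    """Overall -3 dB point of n identical first-order low-pass
    stages, each with individual -3 dB cutoff fc_hz:
    f_overall = fc * sqrt(2**(1/n) - 1)."""
    return fc_hz * math.sqrt(2 ** (1 / n_stages) - 1)

# Ten stages, each individually "flat to 100 kHz":
print(f"{cascaded_cutoff(100_000, 10):.0f} Hz")  # roughly 27 kHz
```

So a signal path of ten 100 kHz stages ends up in the 25-30 kHz ballpark cited above, before you even add transformers.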
The reduction of audio's dynamic range, done so those cheap reproduction systems can play the recorded audio loudly, caused the quality to drop well below radio standards (lo-fi).
Hold on again... that's not right at all. That implies a general "dumbing down" of all material to fit the lowest-quality reproducing system. That's just not done! That's not the cause of the loudness war at all. And what's this about radio standards being lo-fi? Flat from 20 Hz to 15 kHz, separation in the 40 dB range... what the heck is lo-fi about that?
In essence, a cassette tape has better fidelity than digitally recorded material (which, according to my teacher/mentor, is the worst format to come out since analog).
Oh brother. So, no hands-on experience with either, then? Cassettes are fatally hobbled in noise, distortion, and, most obviously, frequency response by their very nature. They are never mistaken for a live source. PCM 16/44 has been mistaken for live thousands of times.
The media itself was never the issue with Red Book audio CDs: the source, and how that source was transferred to the digital domain, is the issue. Most PCM ADC IC manufacturers don't remind designers that the impedance of Vref should be as low as possible.
Vref low impedance? Why? It's just a reference, it's not sinking current. Unless you're confusing Vref (necessary for an ADC) with a virtual ground. They are not the same at all, and there are many ways to create a virtual ground. As I said above, though, pro ADCs don't need a virtual ground because they have a real one, and that's the case in many semi-pro ADCs as well. Where on earth do you dig up this mythology?
So all those PCM ADC converters that use a simple cheap SMT capacitor to ground have the poorest capture capabilities (S/N and bandwidth are affected).
Now you're equating "cheap SMT" with a poorly performing capacitor... dead wrong. They can be some of the best. Then you claim that cap limits bandwidth and increases noise? Have you never tested, or seen tests of, even the cheapest ADCs? There's no noise or bandwidth limit caused by capacitors of any kind.
Then we used to have this thing called pre-emphasis that attenuated bandwidth above 15 kHz quite regularly, and of course modern comb-filtering methods have been implemented…
OK, I'm getting that you're talking out of your hat now. Pre-emphasis in PCM did NOT attenuate bandwidth... at all. It emphasized high frequencies by boosting them so they would modulate the ADC harder; then they were attenuated by an exactly complementary amount on reproduction. The result is flat! And comb filtering has nothing whatever to do with any of this.
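The "exactly complementary" part is easy to demonstrate. CD pre-emphasis is a first-order shelf defined by two time constants, 50 µs and 15 µs; the sketch below computes the emphasis boost at a few frequencies and confirms that emphasis times de-emphasis is unity gain everywhere:

```python
import math

# Red Book pre-emphasis: first-order shelf with T1 = 50 us, T2 = 15 us.
# Magnitude response of the emphasis filter:
#   |H(f)| = sqrt((1 + (2*pi*f*T1)^2) / (1 + (2*pi*f*T2)^2))
T1, T2 = 50e-6, 15e-6

def emphasis_gain(f_hz: float) -> float:
    w = 2 * math.pi * f_hz
    return math.sqrt((1 + (w * T1) ** 2) / (1 + (w * T2) ** 2))

def deemphasis_gain(f_hz: float) -> float:
    return 1.0 / emphasis_gain(f_hz)  # the exact complement

for f in (1_000, 10_000, 20_000):
    boost_db = 20 * math.log10(emphasis_gain(f))
    net = emphasis_gain(f) * deemphasis_gain(f)
    print(f"{f:>6} Hz: boost {boost_db:+.1f} dB, net gain {net:.3f}")
```

The boost rises toward roughly +10 dB at the top of the band, but the net gain through the record/playback pair is exactly 1.0 at every frequency: flat, not band-limited.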
But still, 44.1-48 kHz should never have been implemented for audio recording, because the recording equipment is typically limited to 25 Hz-16 kHz in bandwidth due to the filtering needed to keep the sampling clock's harmonics from bleeding back into the audio.
Well now, let me introduce you to the real world. One of the most rudimentary PCM devices, the Sony PCM-1600, released in 1978, had flat response to 20 kHz at 0 dBFS. The first "consumer" PCM converter for video recorders, the Sony PCM-F1, also had flat response from below 20 Hz to above 20 kHz, at any level up to and including 0 dBFS, with or without emphasis on. And that's the really early stuff, with analog filters. Everything since was better. There was never a PCM system released commercially with a bandwidth limit of 25 Hz-16 kHz. You need to study the facts, man.
Just because the "format" is mid-fi doesn't account for the capturing. But I will let you guys know: the norm for professional recording in major studios is 96 kHz; it used to be 88.2 kHz when the CD format was the norm for audio delivery. Of course, when someone records at that rate all those filters are not enabled, and the only weak link then is the ADC's Vref.
Oh come on! You clearly have no idea what pros do. We record at various sampling frequencies depending on the project and its release format. Sure, we now record higher than 44 or 48, but it's primarily to please the client, not for technical reasons. In fact, it causes a problem, because the release format with the largest number of copies/streams/downloads is still 16/44! Even though no CDs are involved! If it's a video project, then 48 kHz is the release standard, and there's no point in working higher. There's no weak link in an ADC's Vref. If you think so, supply proof.
I think 44.1 recording is mainstream because of the array of cheap computer setups that can record it.
Not even slightly close to the reality. 44.1...now doesn't that impress you as an odd frequency? It's not a nice round number at all, it's a strange choice. Strange, when you consider that in the mid 1970s Dr. Tom Stockham was recording digitally at 16/50 using custom ADCs and data recorders. So why 44.1?
Well, it pre-dates computers of any kind, so that can't be the reason. Sony, when co-developing the CD, envisioned a complete end-to-end recording, post-production, mastering, and release system. The hard part was recording economically. Data recorders were bloody expensive and temperamental, and the tape was expensive too. But Sony was already making these video recorders, the U-Matic series, that had enough bandwidth to handle the bit-stream coming out of an ADC, but the data needed to be formatted into a standard video frame, which was composed of lines and fields at a particular group of frequencies. Using 16-bit quantization, in stereo, with a significant block of data on each line reserved for error correction/concealment, the sampling frequency that harmonized with the existing video system was 44.1 kHz, which works with 30 fps video running 60 fields per second. Now, those are monochrome numbers, and NTSC color video is different, 29.97 fps and 59.94 fields/sec, so consumer units like the PCM-F1 ran a tad slower, 44.056 kHz, to match up with NTSC color consumer video recorders. A small but interesting point.
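The arithmetic behind that history is worth writing out. A sketch using the commonly cited breakdown (3 stereo samples per usable video line, 245 usable lines per field; the exact line count varied by format, so treat those two numbers as the standard textbook derivation rather than gospel):

```python
# How 44.1 kHz falls out of the video frame structure.
samples_per_line = 3    # stereo sample pairs stored per usable line
lines_per_field = 245   # usable lines per field (commonly cited figure)
fields_per_sec = 60     # monochrome field rate

fs = samples_per_line * lines_per_field * fields_per_sec
print(fs)  # 44100

# NTSC color runs 1000/1001 slower (59.94 fields/s), which is why
# consumer units like the PCM-F1 ran at about 44.056 kHz:
fs_ntsc = fs * 1000 / 1001
print(round(fs_ntsc, 1))  # 44055.9
```

So 44.1 kHz wasn't picked as a round number; it was the rate the U-Matic frame could carry, and the 44.056 kHz consumer variant is just the same math at the NTSC color field rate.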
Now, if you try a typical song (24 tracks) on consumer computers, it may or may not keep up, because of the storage technology involved. In the studio we used to run Ultra-SCSI RAIDs, then we switched to SAS RAIDs to handle the throughput and upgrade out of the SCSI architecture.
Yeah, I remember SCSI; it went all the way to UltraSCSI II Fast/Wide, then died. Haven't seen a SCSI HDD in a computer or recording system in 10 years or more.
Back to DACs:
BTW, the Achilles' heel of all DACs is the ability to reproduce the original signal's slew rate. This is commonly caused by the Miller capacitance of the following stage.
Well, I gotta say, you're consistent. No, wrong again. First, if the ADC or DAC can pass a sine wave at half the Nyquist frequency at 0 dBFS, there's no slew limit at all. Second, the buffer amp following a DAC is typically low gain, and in the Miller effect equation Cm = C(1+Av), where C is the feedback capacitance and Av is the amp gain, the entire effect depends on C, the feedback capacitance, which is also very low. Again, you're barking up the wrong tree here. No slew limit to begin with.
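The "no slew limit" claim is easy to put numbers on. The maximum slew rate of a sine wave is SR = 2·π·f·A; a sketch below, assuming a 2 V-peak output level (a typical line-level figure, chosen here for illustration):

```python
import math

def required_slew_v_per_us(f_hz: float, peak_v: float) -> float:
    """Peak slew rate of a sine wave, SR = 2*pi*f*A, in V/us."""
    return 2 * math.pi * f_hz * peak_v / 1e6

# Full-scale 2 V-peak sine at 22.05 kHz (Nyquist for 44.1 kHz):
sr = required_slew_v_per_us(22_050, 2.0)
print(f"{sr:.3f} V/us")  # ~0.277 V/us
```

Even the entire audio band at full scale demands well under 1 V/µs, while even ordinary audio op-amps slew several V/µs or more, so the output stage is nowhere near a slew limit regardless of its Miller capacitance.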