
Creating examples of "Loudness Wars" effect

Discussion in 'Sound Science' started by Elgrindio, Jul 11, 2017.
  1. Zapp_Fan
    In what aspects other than absolute theoretical bandwidth is a RBCD file lower-fidelity than a cassette tape? And a quick reminder: PCM formats are not band-limited on the low end. The real frequency response of a 16/44 file is 0 Hz to 22.05 kHz; the response of real DACs is a little narrower, of course.
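    The theoretical limits Zapp_Fan is citing follow directly from the sampling theorem and quantization math. A minimal sketch for a 16/44.1 file:

```python
import math

sample_rate = 44_100          # Hz, Red Book CD standard
bits = 16

# Nyquist: the highest representable frequency is half the sample rate.
nyquist_hz = sample_rate / 2  # 22_050.0 Hz

# Dynamic range of an ideal n-bit quantizer: 20*log10(2**n),
# commonly approximated as "6.02 dB per bit".
dynamic_range_db = 20 * math.log10(2 ** bits)  # ~96.3 dB

print(f"Nyquist limit: {nyquist_hz} Hz")
print(f"Theoretical dynamic range: {dynamic_range_db:.1f} dB")
```

    Real converters fall a little short of these numbers, as noted above, but the format itself imposes exactly these two ceilings and nothing narrower.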
  2. drtechno
    The media itself was never the issue with Red Book audio CDs: its source, and how that source was transferred to the digital domain, is the issue. Most PCM ADC IC manufacturers don't remind designers that the impedance of Vref should be as low as possible, so all those PCM ADC converters that use a single cheap SMT capacitor to ground have the poorest capturing capabilities (S/N and bandwidth are affected). Then we used to have this thing called pre-emphasis, which quite regularly attenuated bandwidth above 15 kHz, and of course modern comb filtering methods have been implemented. But still, 44.1-48 kHz should never have been implemented for audio recording, because the recording equipment is typically limited to 25 Hz-16 kHz in bandwidth due to the filtering needed to keep the sampling clock's harmonics from bleeding back into the audio.

    Just because the "format" is mid-fi doesn't account for the capturing. But I will let you guys know the norm for professional recording in major studios is 96 kHz; it used to be 88.2 kHz when the CD format was the norm for audio delivery. Of course, when someone records at that rate all those filters are not enabled, and the only weak link then is the ADC's Vref.
    Last edited: Jun 25, 2018
  3. Zapp_Fan
    Oh, I understand. Yeah, I would agree that recording at 44.1 (assuming you have anything approaching a decent mic) is iffy depending on the instrument / part you're recording. And recording at 96 (or even 192 or 384, screw it) is cheap enough in terms of modern CPU / RAM / Disk that one might as well do it, on the off chance that it improves something and assuming it doesn't cause other gear to misbehave.

    Now, that doesn't necessarily mean your mic or pre has significant ultrasonic performance anyway, but that is a separate issue.
  4. TheSonicTruth
    Just remember: mp3 and other lossy codecs do not affect the dynamic range of something encoded to them.
  5. drtechno
    I think 44.1 recording is mainstream because of the array of cheap computer setups that can record it. Now if you try a typical song (24 tracks) on a consumer computer, it may or may not keep up because of the storage technology involved. In the studio we used to run Ultra SCSI RAIDs, then we switched to SAS RAIDs to handle the throughput and upgrade out of the SCSI architecture.

    Back to DACs:
    Btw, the Achilles' heel of all DACs is their ability to reproduce the original signal's slew rate. The limit is commonly set by the Miller capacitance of the following stage.
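    For reference, the Miller-effect arithmetic drtechno is invoking: a feedback capacitance C between an amplifier stage's input and output appears at the input multiplied by (1 + Av). The capacitor and gain values below are purely illustrative, not taken from any specific DAC buffer:

```python
def miller_capacitance(c_feedback_farads: float, gain: float) -> float:
    """Effective input capacitance due to the Miller effect: Cm = C * (1 + Av)."""
    return c_feedback_farads * (1 + gain)

# A few picofarads of feedback capacitance at low gain stays small...
c_low_gain = miller_capacitance(5e-12, 1)     # 10 pF
# ...and only becomes significant at high voltage gain.
c_high_gain = miller_capacitance(5e-12, 100)  # 505 pF
print(c_low_gain, c_high_gain)
```

    Whether this matters for a DAC output stage depends on the gain of that stage, a point taken up later in the thread.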
  6. drtechno
    I agree, it never did affect the dynamic range; it's just that producers and mastering engineers had to reduce the dynamic range so that their corporate OEMs' cheap portable devices would sound good. You have to know that there are "ties" through multi-corporate conglomerations in the entertainment industry, and the labels bend to their equipment manufacturers' whims. This is why portable formats killed the music industry.
    TheSonicTruth likes this.
  7. castleofargh Contributor
    does any of this relate to the so called loudness war?
  8. drtechno
    Yes, because that is what caused it to go wrong from the beginning: having to reduce dynamic range to overcome the inefficient speakers and other cheap consumer products in the mainstream, so that the recording sounded good on that cheap equipment. This is what is taught in recording schools, btw... The capturing equipment has obstacles of its own, so that's why I shared that.

    I want to point out that amplitude does not always follow loudness, because loudness is the distance between loud and soft.
    TheSonicTruth likes this.
  9. RRod
    Even if it doesn't, I'd give a guinea to be shown a concrete example where some of these issues caused an audible problem in a recording.
  10. TheSonicTruth
    No: Dynamic Range is the distance between loud and soft.
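    One simple way to put a number on "the distance between loud and soft" is the crest factor: peak level minus RMS level in dB. Heavily limited "loudness war" masters show small crest factors. This is a toy metric for illustration, not the formal DR-meter algorithm used by the Dynamic Range Database:

```python
import math

def crest_factor_db(samples):
    """Peak-to-RMS ratio in dB; smaller values mean a more compressed signal."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(peak / rms)

# A pure sine has a crest factor of ~3.01 dB (peak/RMS = sqrt(2)).
sine = [math.sin(2 * math.pi * k / 100) for k in range(100)]
print(round(crest_factor_db(sine), 2))
```

    A brickwall-limited pop master can measure only a few dB by this kind of metric, while an uncompressed recording of the same material measures far higher.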
  11. TheSonicTruth
    All the consumer complaints about Metallica's 'Death Magnetic' as released originally. Google it yourself.
  12. RRod
    The claim in question was about inabilities of ADCs/DACs operating at 44.1. It's not always about you.

    I am well acquainted with Death Magnetic. Seems like the band really liked the sound, so maybe take it up with James and Lars eh?
  13. Glmoneydawg
    Unfortunately "audiophile" has become a dirty word around here... but that is what we are after: "audiophile" recordings. Recordings that take full advantage of our systems' resolution, dynamics, soundstaging and imaging abilities. Congratulations, you are an audiophile. Consumer recordings have never been made for us... they are made for the lowest common denominator... just the economic truth... sigh.
    Last edited: Jun 25, 2018
  14. pinnahertz
    The loudness war was already in full swing before portable DMPs, mp3 and streaming were more than a dream. So they can’t be the cause.
    RBCD was standardized in late 1981 and released in 1982. The original PCM adapter made for recording digital audio on a video deck, the Sony PCM-1600, released in 1978, and its successors, the PCM-1610 and PCM-1630, all had clear reference markings on special true-peak meters. Analog tape references were only standardized if you consider three separate reference fluxivities "standardized", but in reality recordings were made with averages and peaks well outside those levels all the time. Except the resulting S/N ratios were 0-20 dB worse than using any of your cited PCM reference levels. In fact, PCM levels were initially more critical because of what happens when you exceed 0 dBFS. Setting reference levels with PCM was therefore much different from how levels were set on analog machines, mostly out of concern for headroom. The original CD mastering standards allowed for absolutely no peaks above 0 dBFS.
    Huh? What “virtual ground”? Any professional ADC of quality runs on a bipolar supply, no virtual ground needed. Regardless, harmonic structure is always preserved or the system couldn’t have flat response. Even worse, if you mess with harmonic structure you mess with the very character of the sound. So that's not possible.
    Whoa, wait a minute there. Who told you that? High fidelity was never 5 Hz-50 kHz, and professional analog mixers could rarely pass 100 kHz flat. An analog mixer is a very complex device, with the signal passing through many stages of amplification, attenuation, filtering, mixing, etc. Each of those stages has a practical maximum 3 dB down point, and when you cascade them all, it gets worse. Then toss in a few audio transformers, which they did have, and typically you're flat to 25-30 kHz. Very few analog "processors" (not sure what you mean, so generically) operated to 100 kHz either; in fact, I know of only one. No idea where you heard that, but as someone in that industry since the early 1970s, I can tell you firsthand that it's wrong. And, BTW, an analog recorder, even the best, optimized for the specific pancake of tape on it at the time, running at 15 ips, cannot record 5 Hz-100 kHz, or 5 Hz-50 kHz. It's more like 30 Hz-20 kHz, slightly higher if you don't mind an increase in distortion.
    Hold on again... that's not right at all. That implies a general "dumbing down" of all material to fit the lowest-quality reproducing system. That's just not done! That's not the cause of the loudness war at all. And what's this about radio standards being lo-fi? Flat from 20 Hz to 15 kHz, separation in the 40 dB range... what the heck is lo-fi about that?
    Oh brother. So, no hands-on experience with either then? Cassettes are fatally hobbled in noise, distortion and most obviously, frequency response by their very nature. They are never mistaken for a live source. PCM 16/44 has been mistaken for live thousands of times.

    Vref low impedance? Why? It's just a reference, it's not sinking current. Unless you're confusing Vref (necessary for an ADC) with a virtual ground. They are not the same at all, and there are many ways to create a virtual ground. As I said above, though, pro ADCs don't need a virtual ground because they have a real one, and that's the case in many semi-pro ADCs as well. Where on earth do you dig up this mythology?
    Now you're equating "cheap SMT" with a poorly performing capacitor....dead wrong. They can be some of the best. Then you claim that cap limits bandwidth and increases noise? Have you never tested, or seen tests of even the cheapest ADCs? There's no noise or bandwidth limit caused by capacitors of any kind.
    OK, I'm getting that you're talking out of your hat now. Pre-emphasis in PCM did NOT attenuate bandwidth... at all. It emphasized high frequencies by boosting them so they would modulate the ADC harder; then they were attenuated by an exactly complementary amount on reproduction. The result is flat! And comb filtering has nothing whatever to do with any of this.
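    The complementary-pair arithmetic is easy to check. CD pre-emphasis is a first-order shelf defined by 50 µs / 15 µs time constants; de-emphasis is its exact reciprocal, so the product is gain 1 at every frequency. A sketch of the continuous-time transfer functions (ignoring converter filtering):

```python
import math

T1, T2 = 50e-6, 15e-6  # CD emphasis time constants, seconds

def emphasis_gain(f_hz: float) -> complex:
    """First-order emphasis shelf H(jw) = (1 + jw*T1) / (1 + jw*T2)."""
    w = 2 * math.pi * f_hz
    return complex(1, w * T1) / complex(1, w * T2)

def deemphasis_gain(f_hz: float) -> complex:
    """Exact reciprocal applied on playback."""
    return 1 / emphasis_gain(f_hz)

for f in (100, 1_000, 10_000, 20_000):
    boost_db = 20 * math.log10(abs(emphasis_gain(f)))
    net = abs(emphasis_gain(f) * deemphasis_gain(f))
    print(f"{f:>6} Hz: emphasis {boost_db:+5.1f} dB, net gain {net:.6f}")
```

    The boost reaches roughly +10 dB at the top of the band during recording, but the net record/playback response is flat to within rounding error, which is the point being made above.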
    Well now, let me introduce you to the real world. One of the most rudimentary PCM devices, the Sony PCM-1600, released in 1978, had flat response to 20 kHz at 0 dBFS. The first "consumer" PCM converter for video recorders, the Sony PCM-F1, also had flat response from below 20 Hz to above 20 kHz, at any level up to and including 0 dBFS, with or without emphasis on. And that's the really early stuff with analog filters. Everything since was better. There was never a PCM system released commercially with a bandwidth limit of 25 Hz-16 kHz. You need to study the facts, man.
    Oh come on! You clearly have no idea what pros do. We record at various sampling frequencies depending on the project and its release format. Sure, we now record higher than 44 or 48, but it's primarily to please the client, not for technical reasons. In fact, it causes a problem, because the release format with the largest number of copies/streams/downloads is still 16/44, even though no CDs are involved! If it's a video project, then 48 kHz is the release standard, and there's no point in working higher. There's no weak link in an ADC's Vref. If you think so, supply proof.

    Not even slightly close to the reality. 44.1...now doesn't that impress you as an odd frequency? It's not a nice round number at all, it's a strange choice. Strange, when you consider that in the mid 1970s Dr. Tom Stockham was recording digitally at 16/50 using custom ADCs and data recorders. So why 44.1?

    Well, it pre-dates computers of any kind, so that can't be the reason. Sony, when co-developing the CD, envisioned a complete end-to-end recording, post-production, mastering, and release system. The hard part was recording economically. Data recorders were bloody expensive and temperamental, and the tape was expensive too. But Sony was already making these video recorders, the U-Matic series, that had enough bandwidth to handle the bit-stream coming out of an ADC, but the data needed to be formatted into a standard video frame, which was composed of lines and fields at a particular group of frequencies. Using 16-bit quantization, and stereo, with a significant block of data on each line reserved for error correction/concealment, the sampling frequency that harmonized with the existing video system was 44.1 kHz, which works with 30 fps video running 60 fields per second. Now, those are monochrome numbers, and NTSC color video is different, 29.97 fps and 59.94 fields/sec, so consumer units like the PCM-F1 ran a tad slower, 44.056 kHz, to match up with NTSC color consumer video recorders. A small but interesting point.
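    The video-frame arithmetic described above works out exactly. With 3 stereo 16-bit samples stored per usable video line, both TV standards land on the same sampling frequency:

```python
samples_per_line = 3

# Monochrome NTSC-derived numbers: 60 fields/s, 245 usable lines per field.
ntsc_rate = 60 * 245 * samples_per_line   # 44_100 Hz

# PAL: 50 fields/s, 294 usable lines per field.
pal_rate = 50 * 294 * samples_per_line    # also 44_100 Hz

# NTSC *color* runs ~0.1% slow (59.94 fields/s), giving the consumer
# PCM-F1's 44.056 kHz variant mentioned above.
ntsc_color_rate = 59.94 * 245 * samples_per_line

print(ntsc_rate, pal_rate, round(ntsc_color_rate, 1))
```

    That both 525/60 and 625/50 video produce exactly 44,100 samples per second is why the "strange" number stuck for the CD.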
    Yeah, I remember SCSI, it went all the way to UltraSCSI II Fast/Wide, then died. Haven't seen a SCSI HDD in a computer or recording system in 10 years or more.
    Well, I gotta say, you're consistent. No, wrong again. First, if the ADC or DAC can pass a sine wave at .5/Nyquist, at 0 dBFS, there's no slew limit at all. Second, the buffer amp following a DAC is typically low gain, and per the Miller effect equation Cm = C(1+Av), where C is the feedback capacitance and Av is the amp gain, the entire effect depends on C, the feedback capacitance, which is also very low. Again, you're barking up the wrong tree here. No slew limit to begin with.
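    The "no slew limit" point can be checked with one formula: the steepest slope a band-limited output can contain is that of a full-scale sine at the band edge, whose maximum slew rate is 2*pi*f*A. The 2 V peak amplitude below is an illustrative assumption, not a value from any particular DAC:

```python
import math

def max_slew_rate_v_per_us(freq_hz: float, amplitude_v: float) -> float:
    """Peak slope of a sine A*sin(2*pi*f*t), converted to volts per microsecond."""
    return 2 * math.pi * freq_hz * amplitude_v / 1e6

# Worst case for the audio band: 20 kHz at 2 V peak.
audio_worst_case = max_slew_rate_v_per_us(20_000, 2.0)
print(f"{audio_worst_case:.3f} V/us")
```

    That works out to roughly a quarter of a volt per microsecond, while even modest op-amps slew at several V/µs, so an audio-band DAC buffer has slew-rate headroom of an order of magnitude or more.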
    gregorio, danadam, bfreedma and 2 others like this.
  15. old tech
    Far out, drtechno - I haven't read so much nonsense since some of those analogsurvivor posts.

    One positive, though: I am beginning to understand digiphobes a bit better. It is all this nonsense they believe about digital, and the extraordinary claims about analog media they push, even to the extent of claiming cassettes are superior to digital.

    What I have yet to understand, though, is which is cause and which is effect when developing a digiphobia.

    One thing I am absolutely certain of, though, is that my kids (well, they're not kids anymore) listening to their MP3s on iPods with earphones are still enjoying higher-quality sound than a cassette Walkman on earphones gave in my day.
