Feb 3, 2019 at 6:08 PM Post #12,466 of 19,084
The noise and distortion in the inner groove of an LP are more than an order of magnitude greater than at the outer groove. It's clearly audible on many records. Whenever you see specs quoted, it's always the outer groove they're talking about - a best-case scenario that progressively degrades little by little with each rotation of the record. If they cited the specs for the inner groove instead of the outer groove, a cassette tape would have better specs than an LP.
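To put rough numbers on that (a back-of-the-envelope sketch; the radii are typical figures for a 12-inch LP, not taken from any spec in this thread):

```python
# Rough sketch: groove linear velocity and recorded wavelength on a 12" LP at 33 1/3 rpm.
# Radii are approximate/typical values, not taken from the posts above.
import math

rpm = 100.0 / 3.0                    # 33 1/3 rpm
omega = rpm / 60.0 * 2.0 * math.pi   # rotational speed in rad/s

for label, radius_m in [("outer groove (~146 mm)", 0.146),
                        ("inner groove (~60 mm)",  0.060)]:
    v = omega * radius_m                      # groove speed past the stylus, m/s
    wavelength_um = v / 15_000 * 1e6          # wavelength of a 15 kHz tone, micrometres
    print(f"{label}: {v*100:.1f} cm/s, 15 kHz wavelength ≈ {wavelength_um:.1f} µm")
```

The same high-frequency content has to fit into less than half the groove length per revolution near the label, which is why noise and distortion climb as the side plays out.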

It doesn't matter if a cartridge can reproduce super audible frequencies because those frequencies DON'T EXIST on LP records. If they did, records would degrade into a mush of distortion in just a few plays. It is standard practice in LP mastering to roll off the frequencies starting at about 16kHz to prevent premature record wear. The only super audible frequencies present on LP records are nothing more than noise and distortion.

Even if there were inaudible frequencies in the grooves of LP records, they would be as useless as teats on a bull hog. You can't hear them. They add nothing to the perception of sound quality in music. The performance of LPs in the audible range where it counts is so far below that of CDs, there is no contest when it comes to sound quality. Why would anyone trade compromised sound fidelity in the range they CAN hear, for theoretical sound they know they CAN'T? That is just plain dumb.

LPs can sound good. Open reel tape can sound great. CDs can sound perfect. Perfect is all you need. Anything beyond that is just a waste of time, money and effort.
 
Feb 3, 2019 at 9:32 PM Post #12,467 of 19,084
My acoustics teacher at university (this was back in the mid-90's) said during a lecture about audio formats that the idea of vinyl records is horrible: you scratch a plastic plate with a small rock and try to get good sound out of that! :scream:
 
Feb 3, 2019 at 11:41 PM Post #12,468 of 19,084
Since you ask.....
And, I'm sorry, but "the need" to transfer content using NTSC tape was a matter of economics and convenience.
In every other discussion we've had you seem to have insisted that studios are generally willing to spend a lot of money to achieve the best possible sound quality.
Are you suggesting that, in this case, they would have been UNWILLING to abandon the outdated U-matic format and replace it with one that would have worked better?
I'm being a bit facetious, but it's in defense of my point, which is that, when the CD format was developed, "low cost" and "convenience of implementation" were prioritized far above "best performance".
(They did NOT choose 44.1k because it "worked perfectly"; they chose it because it was easy to put on the tape equipment they already had.)

In the early 80's, what other options did they have? U-matic was a robust broadcast standard at the time. A U-matic tape machine was not exactly cheap, about the price of a luxury car at the time, plus the cost of the tape itself. A 1 GB hard drive was 550 pounds and cost around $90,000, and chances are if you shipped it you would lose the data. My first 1 GB drive cost $1,100 in the mid 90's. Within a few years you could get ten times the storage at one third the price. With it I could edit one album: I would transfer it in real time off of DAT, edit it, and transfer it back.
 
Feb 3, 2019 at 11:59 PM Post #12,469 of 19,084
There is a fly in your ointment.. - the usual mistake.


The S/N of an analog record is 60 dB (a couple of dB up or down), referenced to 0 dB, which is 3.54 cm/sec recording velocity.

Unlike digital, analog can - and DOES - go over the nominal 0 dB level. An analog record goes to at least +18 dB, which is usually the limit for tracking-ability testing, typically at 300 Hz and in the lateral (mono) direction, and which in turn corresponds to an amplitude of between 89 and 90 micrometers at this frequency.

There are test records - and cartridges - that can extend this to 110, sometimes even 120 micrometers, without mistracking. That is about +20 dB over the 0 dB reference of that 60 dB S/N figure - or a dynamic range of between 78 and 80 dB, prior to any noise reduction system.

Add to that 20 or so dB from a noise reduction system - and you DO arrive at 100 dB or so.

Truth be told, most commercially available recordings NEVER reach the +18 dB level - because only a handful of perfectly aligned cartridges would be able to play such levels back without gross mistracking/severe distortion that cannot be expressed in %.

Deduct 10 dB (for noisier vinyl and a reduced cutting level combined) - and there is still 88-90 dB of dynamic range available.

In other words : enough.
Isn't a signal going above the original limit just evidence of non-linear behavior? To make things worse, this will manifest differently at different frequencies (for purely mechanical reasons). To me your argument is like saying that we can get 130 dB of dynamic range on a CD if we just include an expander DSP as a standard feature. It's not false, but the concept of fidelity goes out the window.
 
Feb 4, 2019 at 12:27 AM Post #12,470 of 19,084
You're quite right.... using a 1 GB hard drive to ship content for a CD would have been impractical back in those days.

The practical solution would have been to use the same workflow as they used for vinyl...
Ship the master tape to the location of the mastering lathe (or CD mastering equipment) and convert it there.
In fact, you can pretty well substitute "CD mastering device" for "record mastering lathe" in the description, and you're done.

There's something else that really needs to be mentioned.
It is true that the U-Matic recorders used by studios in the 1980's were designed to work at certain sample rates.
That is because they were designed to store a specific amount of digital audio data per video line and per frame.
HOWEVER, tape drives were available at that time that were designed to record and store computer data.
Those drives were still expensive - but less so than hard drives.
(It would still have made the most sense to send the analog master tape to the CD mastering facility for conversion.)

Performing the conversion at the studio - and sending a U-Matic tape to the CD mastering plant - was surely the most convenient way...
But sending an analog master tape instead would not have been prohibitively complex or expensive.

In the early 80's, what other options did they have? U-matic was a robust broadcast standard at the time. A U-matic tape machine was not exactly cheap, about the price of a luxury car at the time, plus the cost of the tape itself. A 1 GB hard drive was 550 pounds and cost around $90,000, and chances are if you shipped it you would lose the data. My first 1 GB drive cost $1,100 in the mid 90's. Within a few years you could get ten times the storage at one third the price. With it I could edit one album: I would transfer it in real time off of DAT, edit it, and transfer it back.
 
Feb 4, 2019 at 2:05 AM Post #12,471 of 19,084
Isn't a signal going above the original limit just evidence of non-linear behavior? To make things worse, this will manifest differently at different frequencies (for purely mechanical reasons). To me your argument is like saying that we can get 130 dB of dynamic range on a CD if we just include an expander DSP as a standard feature. It's not false, but the concept of fidelity goes out the window.

Oh dear... 0 dB in the case of an analog record is a REFERENCE - most definitely NOT a LIMIT.
There is no cartridge that can track the greatest amplitude any cutter system in use today can put on a master - and that is even above +18 to +20 dB ref 0 dB, or 3.54 cm/sec @ 1 kHz.
And all of that is LINEAR - a cartridge does not compress the signal; it can only mistrack at levels above its tracking capability.

I actually REALLY suggest you read the basics of record mastering/cutting. The world did not start with zeros and ones.
 
Feb 4, 2019 at 2:42 AM Post #12,472 of 19,084
You're quite right.... using a 1 GB hard drive to ship content for a CD would have been impractical back in those days.

The practical solution would have been to use the same workflow as they used for vinyl...
Ship the master tape to the location of the mastering lathe (or CD mastering equipment) and convert it there.
In fact, you can pretty well substitute "CD mastering device" for "record mastering lathe" in the description, and you're done.

There's something else that really needs to be mentioned.
It is true that the U-Matic recorders used by studios in the 1980's were designed to work at certain sample rates.
That is because they were designed to store a specific amount of digital audio data per video line and per frame.
HOWEVER, tape drives were available at that time that were designed to record and store computer data.
Those drives were still expensive - but less so than hard drives.
(It would still have made the most sense to send the analog master tape to the CD mastering facility for conversion.)

Performing the conversion at the studio - and sending a U-Matic tape to the CD mastering plant - was surely the most convenient way...
But sending an analog master tape instead would not have been prohibitively complex or expensive.

We did use the same workflow. You brought the stereo masters to the mastering studio. The mastering engineer would give it the final polish to maximize fidelity for each of the release formats. After the introduction of the CD you would do the 1630 master first, as it was the highest fidelity - the 1630 encoded onto the U-matic tape. I cannot find any data tape format in the early 80's that could store this amount of data. The IBM 3480 only stored 200 MB, and didn't come out until 1985... five years too late. DEC's formats were even smaller. DLT could store it by 1989, however it used hardware data compression.
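To put a rough number on "this amount of data" (a back-of-the-envelope sketch, assuming standard 16-bit/44.1 kHz stereo and typical album lengths):

```python
# Rough size of an album's worth of CD audio vs. early-80s data tape capacity.
SAMPLE_RATE = 44_100        # Hz
BYTES_PER_SAMPLE = 2        # 16-bit
CHANNELS = 2

def cd_audio_megabytes(minutes: float) -> float:
    bytes_total = SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS * minutes * 60
    return bytes_total / 1e6

for minutes in (45, 60, 74):
    print(f"{minutes} min of 16/44.1 stereo ≈ {cd_audio_megabytes(minutes):.0f} MB")

print("IBM 3480 cartridge (1985): ~200 MB")   # capacity figure quoted in the post above
```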

The cassette master would be made by rolling off the low frequencies and compressing the dynamic range; it would go to 1/4" open reel, and for some reason I think it might have been recorded at lower speeds like 7-1/2 IPS.

Then you would cut the lacquers: 7", LP and EP, the 12" EP having the highest fidelity since it runs at 45 RPM with wider grooves. The helium-cooled Neumann cutting heads had a response to 16 kHz, and later models to 20 kHz. Ortofon had one that could go to 25 kHz, however it was not as robust as the Neumanns; it is likely there are no working Ortofons left. To cut the lacquers, the master had the LF rolled off and summed to mono below 100 Hz. You can cut up to 50 kHz running at half speed, however the trade-off is even worse low-frequency response. You have to compress the dynamic range, and you have hard limiters to keep from cutting too wide or too deep - and don't forget it is EQ'ed for the RIAA pre-emphasis curve or the IEC curve, which are not the same. Recording for LP release or CD release changes your production style; you can't expect an LP to perform like a CD. For example, on records you duck the bass level of the kick drum beats - you can't have that much LF and keep it in the groove.
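Since the RIAA and IEC curves come up here, a minimal sketch of the standard RIAA playback (de-emphasis) response from its published time constants; the cutting pre-emphasis is the inverse of this, and the later IEC variant adds an extra low-frequency rolloff on the playback side:

```python
# Minimal sketch of the RIAA playback (de-emphasis) curve from its published
# time constants; the record-side pre-emphasis is the reciprocal of this.
import math

T1, T2, T3 = 3180e-6, 318e-6, 75e-6   # RIAA time constants in seconds

def riaa_playback_db(f):
    w = 2 * math.pi * f
    mag = abs((1 + 1j * w * T2) / ((1 + 1j * w * T1) * (1 + 1j * w * T3)))
    return 20 * math.log10(mag)

ref = riaa_playback_db(1000)          # curve is normalised to 0 dB at 1 kHz
for f in (20, 100, 1000, 10_000, 20_000):
    print(f"{f:>6} Hz: {riaa_playback_db(f) - ref:+.1f} dB")
```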

You have a heated stylus cutting the lacquer, driven by helium-cooled drive coils with as much as 500 watts each going into them. The lacquer that chips off as it is cut is extremely flammable, and if the chips get back onto the lacquer it is ruined. If you want to hear what it might sound like, you have to cut an acetate, which can only be played a few times, and you hope the lacquer sounds the same. The lacquer is sent to the pressing plant, where it is plated to make the first plate; this process destroys the lacquer. Pretty much the Rube Goldberg of audio. All that work, and it doesn't sound anything like the 1/2" analog stereo master you started with. The 1/2" master doesn't sound anything like the mix from the console off the 2" 24-track, which in turn barely resembles what came out of the microphones.
 
Feb 4, 2019 at 2:46 AM Post #12,473 of 19,084
[1] There is a fly in your ointment.. - the usual mistake.
[2] Unlike digital, analog can - and DOES - go over the nominal 0 dB level. An analog record goes to at least +18 dB ...
[3] Truth be told, [3b] most commercially available recordings NEVER reach the +18 dB level - because only a handful of perfectly aligned cartridges would be able to play such levels back without gross mistracking/severe distortion that cannot be expressed in %.

1. No, the fly is in YOUR ointment ... and it's a very big fly! However, it's not "the usual mistake" because there is NO professional recording engineer who would EVER make that mistake, which is why I'm calling BS on your claim of being a recording engineer!!!!

2. And here we have it, the mistake that not even a rookie student should make: You are comparing two DIFFERENT and UNRELATED dB scales: the dBFS scale of digital audio with the dBVU scale of analogue. There is/was no standard in music recording for aligning these two scales but in film and TV there is: -20dBFS is calibrated to 0dBVU. Therefore, your +18dB(VU) would be roughly equivalent to -2dB(FS) and consequently, the rest of your post is utter nonsense! The SNR of vinyl is NOT 60dB +18dB, there is no +18dB! Not to mention that peak or quasi-peak level has NOTHING to do with signal to noise ratio anyway.
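A minimal sketch of that alignment arithmetic (the -20 dB offset is the film/TV convention described above; other alignments, e.g. -18 dBFS, are also in use, so treat the default as an assumption):

```python
# Sketch: converting an analogue VU-referenced level to dBFS, given a calibration offset.
# The -20 dB default is the film/TV alignment cited above (0 VU = -20 dBFS);
# other facilities use different offsets, so it is only an assumption here.

def vu_to_dbfs(level_dbvu: float, alignment_dbfs_at_0vu: float = -20.0) -> float:
    return level_dbvu + alignment_dbfs_at_0vu

print(vu_to_dbfs(+18.0))   # +18 dBVU -> -2 dBFS with the -20 dB alignment
print(vu_to_dbfs(0.0))     #   0 dBVU -> -20 dBFS
```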

3. What truth are you telling, the truth based on you not understanding the fundamental basics of the scales used for measuring signal amplitude in analogue and digital recordings?
3a. So you're saying that only a handful of cartridges would be able to play back a level of -2dB(FS) without severe distortion. As digital can go up to 0dBFS (2dB higher than vinyl) with no audible distortion at all, how is vinyl better than digital? Are you saying that severe distortion is "better" than no audible distortion?

Unfortunately, even in this sub-forum, there is no rule against posting utter ignorant nonsense and claiming that it's the "truth". So we have no grounds to call for you to be banned, the best we can do is demonstrate/explain that it is utter ignorant nonsense!

G
 
Feb 4, 2019 at 2:58 AM Post #12,475 of 19,084
Oh dear... 0 dB in the case of an analog record is a REFERENCE - most definitely NOT a LIMIT.
There is no cartridge that can track the greatest amplitude any cutter system in use today can put on a master - and that is even above +18 to +20 dB ref 0 dB, or 3.54 cm/sec @ 1 kHz.
And all of that is LINEAR - a cartridge does not compress the signal; it can only mistrack at levels above its tracking capability.

I actually REALLY suggest you read the basics of record mastering/cutting. The world did not start with zeros and ones.
All right, my bad, I misunderstood your post. I thought you were comparing playback amplitudes to the maximum amplitude from the source, which would be our reference, and I imagined that was where you saw your extra dynamic range. Now I'm wondering what you meant. Is it about being able to record with a VU meter above 0? If so, that's utterly inconsequential, as you can just change your reference - the total actual dynamic range remains the same. Or am I again misunderstanding what you're talking about?
 
Feb 4, 2019 at 3:18 AM Post #12,476 of 19,084
All right, my bad, I misunderstood your post. I thought you were comparing playback amplitudes to the maximum amplitude from the source, which would be our reference, and I imagined that was where you saw your extra dynamic range. Now I'm wondering what you meant. Is it about being able to record with a VU meter above 0? If so, that's utterly inconsequential, as you can just change your reference - the total actual dynamic range remains the same. Or am I again misunderstanding what you're talking about?

I just saw @dprimary actually has experience with analog record cutting.

Just above he said (quote):

0dBu is about -18 dB down from 0dBFS depending on your calibration

That covers the recording levels for normal cutting. Normal record maximum S/N is therefore 60 dB ref 0 dBu; overall, with a maximum level of +18 dB, that amounts to 78 dB.

Absolute levels can remain (in theory) the same for noise-reduction-encoded records. Add about 20 dB to the 78 already present - voila, 98 dB S/N.
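For readers following along, the arithmetic claimed in the last two paragraphs boils down to this (a sketch using the post's own figures, not a verification of them):

```python
# Sketch of the dynamic-range arithmetic in the post above, taking its figures at face value.
snr_ref_0dbu = 60.0    # claimed S/N in dB, referenced to the 0 dBu cutting level
headroom_db  = 18.0    # claimed maximum level above that reference
nr_gain_db   = 20.0    # claimed improvement from a noise-reduction system (dbx, CX, ...)

print(f"without NR: {snr_ref_0dbu + headroom_db:.0f} dB")               # 78 dB
print(f"with NR:    {snr_ref_0dbu + headroom_db + nr_gain_db:.0f} dB")  # 98 dB
```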

As for CX, the CBS circuit from the patent hard clips at just above +15 dBu - meaning that optimally cut CX-encoded records are unfortunately distorted/compressed in the last 3 dB of dynamic range. The ICs readily available in the USA at the time did not allow for the greater voltage amplitude required - and they obviously either did not know of, or did not want to use, the Japanese ICs that, even back then, had been developed specifically for phono preamps running on more than +/-15 VDC rails.
 
Feb 4, 2019 at 3:28 AM Post #12,477 of 19,084
We did use the same workflow. You brought the stereo masters to the mastering studio. The mastering engineer would give it the final polish to maximize fidelity for each of the release formats. After the introduction of the CD you would do the 1630 master first, as it was the highest fidelity - the 1630 encoded onto the U-matic tape. I cannot find any data tape format in the early 80's that could store this amount of data. The IBM 3480 only stored 200 MB, and didn't come out until 1985... five years too late. DEC's formats were even smaller. DLT could store it by 1989, however it used hardware data compression.

The cassette master would be made by rolling off the low frequencies and compressing the dynamic range; it would go to 1/4" open reel, and for some reason I think it might have been recorded at lower speeds like 7-1/2 IPS.

Then you would cut the lacquers: 7", LP and EP, the 12" EP having the highest fidelity since it runs at 45 RPM with wider grooves. The helium-cooled Neumann cutting heads had a response to 16 kHz, and later models to 20 kHz. Ortofon had one that could go to 25 kHz, however it was not as robust as the Neumanns; it is likely there are no working Ortofons left. To cut the lacquers, the master had the LF rolled off and summed to mono below 100 Hz. You can cut up to 50 kHz running at half speed, however the trade-off is even worse low-frequency response. You have to compress the dynamic range, and you have hard limiters to keep from cutting too wide or too deep - and don't forget it is EQ'ed for the RIAA pre-emphasis curve or the IEC curve, which are not the same. Recording for LP release or CD release changes your production style; you can't expect an LP to perform like a CD. For example, on records you duck the bass level of the kick drum beats - you can't have that much LF and keep it in the groove.

You have a heated stylus cutting the lacquer, driven by helium-cooled drive coils with as much as 500 watts each going into them. The lacquer that chips off as it is cut is extremely flammable, and if the chips get back onto the lacquer it is ruined. If you want to hear what it might sound like, you have to cut an acetate, which can only be played a few times, and you hope the lacquer sounds the same. The lacquer is sent to the pressing plant, where it is plated to make the first plate; this process destroys the lacquer. Pretty much the Rube Goldberg of audio. All that work, and it doesn't sound anything like the 1/2" analog stereo master you started with. The 1/2" master doesn't sound anything like the mix from the console off the 2" 24-track, which in turn barely resembles what came out of the microphones.

Sounds about correct.

I am happy to record 2-channel direct to DSD128 - next to no difference, though I still want to close the gap between the mike feed and the recording...
 
Feb 4, 2019 at 3:45 AM Post #12,478 of 19,084
1. No, the fly is in YOUR ointment ... and it's a very big fly! However, it's not "the usual mistake" because there is NO professional recording engineer who would EVER make that mistake, which is why I'm calling BS on your claim of being a recording engineer!!!!

2. And here we have it, the mistake that not even a rookie student should make: You are comparing two DIFFERENT and UNRELATED dB scales: the dBFS scale of digital audio with the dBVU scale of analogue. There is/was no standard in music recording for aligning these two scales but in film and TV there is: -20dBFS is calibrated to 0dBVU. Therefore, your +18dB(VU) would be roughly equivalent to -2dB(FS) and consequently, the rest of your post is utter nonsense! The SNR of vinyl is NOT 60dB +18dB, there is no +18dB! Not to mention that peak or quasi-peak level has NOTHING to do with signal to noise ratio anyway.

3. What truth are you telling, the truth based on you not understanding the fundamental basics of the scales used for measuring signal amplitude in analogue and digital recordings?
3a. So you're saying that only a handful of cartridges would be able to play back a level of -2dB(FS) without severe distortion. As digital can go up to 0dBFS (2dB higher than vinyl) with no audible distortion at all, how is vinyl better than digital? Are you saying that severe distortion is "better" than no audible distortion?

Unfortunately, even in this sub-forum, there is no rule against posting utter ignorant nonsense and claiming that it's the "truth". So we have no grounds to call for you to be banned, the best we can do is demonstrate/explain that it is utter ignorant nonsense!

G

Unfortunately, manufacturers still get this wrong as well. I expect analog line-level inputs to handle at least +24 dBu. From what I measured years ago, 0 dBFS is over 15 volts peak to peak. They look at the output of a consumer CD player, which specifies some nonsense like 2 V nominal output (which nobody knows what that is supposed to mean). Hit that input with a professional CD player or digital console and you have a distortion wonderland. At least with consoles you can reduce the master fader; with a CD player, if you don't have a 20 dB pad you're stuck.
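For anyone checking those voltage figures, dBu converts to volts directly (a quick sketch; 0 dBu is 0.7746 V RMS, the voltage that gives 1 mW into 600 ohms):

```python
# Quick sketch: dBu to volts (0 dBu = 0.7746 V RMS, i.e. 1 mW into 600 ohms).
import math

def dbu_to_vrms(dbu: float) -> float:
    return 0.7746 * 10 ** (dbu / 20)

for dbu in (0, 4, 18, 24):
    vrms = dbu_to_vrms(dbu)
    vpp = vrms * 2 * math.sqrt(2)          # peak-to-peak for a sine wave
    print(f"{dbu:+3d} dBu = {vrms:5.2f} V RMS ≈ {vpp:4.1f} V peak-to-peak")
```

A digital output aligned so that 0 dBFS lands at +18 dBu or above will indeed swing more than 15 V peak to peak, which lines up with the measurement mentioned above.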
 
Feb 4, 2019 at 4:55 AM Post #12,480 of 19,084
As an aside, do you know why the amplifiers had to be capable of so much output?

To be able to cut the high-frequency information correctly - uncompressed, not bandwidth-limited. In fact, the calculated requirement to cut at 100 cm/sec (above "theoretical", but below the actual recorded velocities found on real-world top-quality records) goes up to just below 1 kW per channel - meaning 2 kW of peak power is required to drive the cutting stylus.

https://pubs.shure.com/view/guide/V15-Type-4/en-US.pdf

It takes great skill - and cojones - to work with the cutting head at the very upper limit; exceed it and the coils burn out... Back in the day, that meant a big check to Neumann or Ortofon for a replacement head (if you wanted it ASAP), or a not-much-smaller check if you were willing to wait to have the damaged head repaired.

Today, neither Neumann nor Ortofon supports cutting heads - or record mastering equipment in general - any longer. There are no NOS cutting heads left anywhere in the world. Some former employees, not yet too old to still be able to support the market, have taken over this task. And they are overbooked like crazy...

No wonder today's record mastering engineers have to be more conservative in their approach - burning out the cutter head can mean six months (!) or more of waiting before the repaired head comes back. Therefore they use compression, peak limiting, frequency-response limiting, and any other sound-quality-degrading unmentionables of the trade that will, first and foremost, make sure the cutting head is NEVER jeopardized.

No one can afford half a year of doing nothing...

Half-speed mastering has the benefit of requiring only 1/4 of the power needed for mastering in real time. That means 250 W/ch amps will suffice - and for any disc-cutting amp, that means loafing. Also, there is no danger to the cutting head, nor any requirement to use helium for cooling.
The drawback is a problematic low end - it requires the entire chain to work flawlessly down to at least 10 Hz, which proved to be too tough a nut to crack. Even the pioneer of half-speed mastering, the sadly late Stan Ricker, resorted to 2/3-speed mastering in the end - the best compromise under the given circumstances.

Mastering is nowhere better explained than here (there are 3 parts): http://www.enjoythemusic.com/magazine/rickerinterview/ricker1.htm
 
