We seem to have a sort of disconnect here......
I don't disagree at all about all the ridiculous complications involved in mastering to vinyl. (And I would be very interested to see, for example, THD measurements of the entire process.... measured from the output of the microphone preamp in the studio to the output of the listener's phono preamp.... with a clean sine wave and no processing.... compared to those for RBCD. I'm quite sure there are good reasons you never see THD and IMD specs for cutting lathes, phono cartridges, or the mechanical accuracy of the shape of the wiggles in the vinyl.)
I was specifically addressing the claim that "the 44.1k sample rate was chosen for CDs because the audio masters HAD to be able to be recorded on U-Matic tape systems - which only supported 44.1k". I was simply pointing out that there was no specific need to ever store the data on U-Matic tape.
Excluding all the details about the equipment involved: before digital media came along, you took the ANALOG MASTER TAPE to the place where they used the cutting lathe to produce the lacquer master. After the switch to CDs, you could have just as easily taken the same ANALOG MASTER TAPE to the facility where the CD masters were produced. There was no specific necessity to convert from analog master to digital master, on U-Matic tape, AT THE STUDIO before transporting it.... and I'm not aware of any reason why it would have had to be transferred to U-Matic tape after the conversion either. (I'm guessing that the CD mastering studio could have used one of those 550 pound hard disc drives if they wanted to.) Also, unlike with vinyl production, there was no tweaking for the engineer to do at the point of conversion. Other than, possibly, making minor adjustments to the ADC, there is no reason for the mastering engineer to be involved in the physical process of converting from analog to digital.
Incidentally, unlike analog data, digital data isn't so fussy about being stored in one continuous segment on a single piece of media. Once the digital audio data for the CD was created, and assembled on the "CD cutting equipment itself", you could have easily stored one CD on four of those 200 MB digital data tapes. There's no issue whatsoever with allowing a digital data backup set to extend across multiple tapes. It is routinely done and there is no loss of quality or reliability. (And since, with digital, there is no generational loss of quality, once the CD data stream has been created, there is no issue whatsoever with making multiple master copies, and multiple backup copies, and changing between formats when convenient.)
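The arithmetic behind "one CD on four of those 200 MB tapes" is easy to check. Here's a quick sketch (my own back-of-the-envelope numbers, not from the thread) using the Red Book figures of 44.1 kHz, 16-bit, stereo, and a nominal 74-minute program:

```python
# How much raw PCM a full Red Book CD holds, and how many
# 200 MB data tapes (IBM 3480-class) that payload would span.

SAMPLE_RATE = 44_100      # Hz, Red Book
CHANNELS = 2              # stereo
BYTES_PER_SAMPLE = 2      # 16-bit
MINUTES = 74              # nominal max CD program length
TAPE_BYTES = 200_000_000  # one 200 MB tape

total_bytes = MINUTES * 60 * SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE
tapes = -(-total_bytes // TAPE_BYTES)  # ceiling division

print(f"{total_bytes / 1e6:.0f} MB of audio -> {tapes} x 200 MB tapes")
# -> 783 MB of audio -> 4 x 200 MB tapes
```

So the audio payload alone is roughly 783 MB, which indeed lands on exactly four 200 MB tapes (ignoring any tape-format overhead).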
When the CD standard was first written - there were no CD disc recorders, and no CD mastering facilities.... so, at that point, they could just as easily have built it to a standard of a 48k sample rate on 5" discs for 45 minutes.... or 96k on a 12" disc for an hour. I was simply pointing out that there were no significant technological reasons that made other choices impractical. The reasons were more at the level of: "Most studios already have U-Matic machines, and know how to use them, so they'll be happier if they don't have to purchase and learn new hardware, or have to ship their precious master tapes to a CD mastering facility after they leave the vinyl mastering facility". The reality is that "audibly perfect audio quality with a reasonable safety margin" was not the primary consideration when the format was chosen... it was just one of many factors considered.
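For scale, here's what those hypothetical alternatives would have meant in raw data terms (my own illustration, assuming 16-bit stereo PCM for every variant - the disc sizes and playing times above are the thread's hypotheticals, not real specs):

```python
# Raw PCM payload for the Red Book format vs. the thread's
# hypothetical alternative formats (all assumed 16-bit stereo).

def pcm_megabytes(sample_rate_hz, minutes, channels=2, bytes_per_sample=2):
    """Raw PCM payload in MB (decimal) for a given format."""
    return sample_rate_hz * channels * bytes_per_sample * minutes * 60 / 1e6

formats = [
    ("44.1k / 74 min (Red Book)",    44_100, 74),
    ("48k / 45 min (hypothetical)",  48_000, 45),
    ("96k / 60 min (hypothetical)",  96_000, 60),
]

for name, rate, minutes in formats:
    print(f"{name}: {pcm_megabytes(rate, minutes):.0f} MB")
```

The 96k/hour disc would need roughly 1.4 GB of payload - bigger than a CD, but nothing about the optical technology itself ruled it out; it was a capacity/diameter trade-off.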
They did NOT conduct extensive listening tests, in a huge and impartial scientific study, at a wide variety of different sample rates and bit depths, to determine which one sounded better... or whether higher sample rates were audibly better than 44.1k. And they most certainly did not test whether recording onto a CD would audibly degrade content from a high-quality master, recorded and converted on the best equipment available in 2018, and played back on the best quality playback equipment available in 2018 - for obvious reasons. What they did was to develop a standard that met or exceeded all of what they considered to be the practical requirements, then conduct some listening tests to confirm that it was adequate for the market, and produced no obvious audible problems when tested on the equipment available at the time.

(Please note that there is nothing terrible, or even unusual, about doing this. You are NOT driving the safest car, or the fastest car, or the most efficient car, that could be built using current technology either - and we all know that there will be "better" new models next year. Yet, when it comes to audio, there seem to be a lot of people determined to believe that "the game is over, we now have the best possible, and there is simply no point in looking for, or hoping for, anything better". I am quite convinced that RBCD was audibly transparent, when tested with the available audio content, available audio equipment, and listening acumen available at the time.... but the time was the 1970's.)
We did use the same workflow. You brought the stereo masters to the mastering studio. The mastering engineer would give it the final polish to maximize fidelity for each of the release formats. After the CD introduction you would do the 1630 master first, as it was the highest fidelity. The 1630 encoded onto the U-Matic tape. I cannot find any data tape format in the early 80's that could store this amount of data. The IBM 3480 only stored 200 MB, and didn't come out until 1985... 5 years too late. DEC's formats were even smaller. The DLT could store it in 1989; however, it used hardware data compression.
The cassette master would be made by rolling off the low frequencies and compressing the dynamic range. It went onto 1/4" open reel, and for some reason I think it might have been recorded at a lower speed, like 7-1/2 IPS.
Then you would cut the lacquers: 7", LP, and EP. The 12" EP had the highest fidelity, since it ran at 45 RPM with wider grooves. The helium-cooled Neumann cutting heads had a response to 16 kHz, and later models to 20 kHz. Ortofon had one that could go to 25 kHz; however, it was not as robust as the Neumanns, and it is likely there are no working Ortofons left. To cut the lacquers, the master had its low frequencies rolled off and was summed to mono below 100 Hz. You can cut up to 50 kHz running at half speed; however, the trade-off is even worse low-frequency response. You have to compress the dynamic range, and you have hard limiters to keep from cutting too wide or too deep - and don't forget it is EQ'ed for the RIAA pre-emphasis curve or the IEC curve, which are not the same. Recording for LP release or CD release changes your production style; you can't expect an LP to perform like a CD. For example, on records you duck the bass level off the kick drum beats - you can't have that much LF and keep it in the groove.
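Since the RIAA pre-emphasis curve came up: here's a quick sketch (my own illustration, not from the thread) of the standard RIAA playback (de-emphasis) curve, built from its three published time constants of 3180 µs, 318 µs, and 75 µs. The cutting side applies the inverse, boosting highs and cutting lows, which is part of why the lathe and phono stage must agree on the same curve:

```python
import math

# RIAA playback de-emphasis: poles at 3180 us and 75 us,
# zero at 318 us, normalized to 0 dB at 1 kHz.
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def _mag(f_hz):
    """Unnormalized magnitude of the playback transfer function."""
    w = 2 * math.pi * f_hz
    return math.sqrt(1 + (w * T2) ** 2) / (
        math.sqrt(1 + (w * T1) ** 2) * math.sqrt(1 + (w * T3) ** 2)
    )

_REF = _mag(1000.0)  # 1 kHz reference

def riaa_playback_db(f_hz):
    """Playback gain in dB relative to 1 kHz."""
    return 20 * math.log10(_mag(f_hz) / _REF)

for f in (20, 100, 1000, 10_000, 20_000):
    print(f"{f:>6} Hz: {riaa_playback_db(f):+6.1f} dB")
```

The playback boost is about +19.3 dB at 20 Hz and the cut about -19.6 dB at 20 kHz - a roughly 40 dB swing across the audio band, all of which the cutter head and cartridge have to track mechanically.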
You have a heated stylus cutting the lacquer, driven by helium-cooled drive coils with as much as 500 watts each going into them. The lacquer that chips off as it is cut is extremely flammable, and if the chips get back on the lacquer it is ruined. If you want to hear what it might sound like, you have to cut an acetate, which can only be played a few times - and you hope the lacquer sounds the same. The lacquer is sent to the pressing plant, where it is plated to make the first plate. This process destroys the lacquer. Pretty much the Rube Goldberg of audio. All that work, and it doesn't sound anything like the 1/2" analog stereo master you started with. The 1/2" master doesn't sound anything like the mix from the console off the 2" 24-track, which in turn barely resembles what came out of the microphones.