Testing audiophile claims and myths
Jul 31, 2019 at 9:26 PM Post #13,381 of 17,336
1. Actually, no it wasn't! The first recognised theatrical movie mixed for and released in discrete 7.1 was Last Action Hero in 1993!
2. And, far less recently (than 2010), many theatres did upgrade to 7.1. There were over 6,750 by 1999!
3. No, Hellboy (2004) was originally mixed in 7.1 (and 5.1) but was reversioned for Blu-ray, as most/all major films are, as I've already stated.
4. DD and DD EX are both 5.1 audio channels. DD+ is not Dolby Digital; it is "Dolby Digital Plus" and does not employ the Dolby Digital AC-3 codec.

You do seem to like arguing about film sound but it's bizarre you choose to do so with someone who's actually worked in film sound for over 20 years!!

I seem to argue with you because you claim to be an expert in everything sound, even resorting to refusing to acknowledge terminology Dolby uses! An expert who thought only UHD discs support TrueHD? For example, you're now conflating 7.1 SDDS (which uses five screen channels and two surround channels) with Dolby Surround 7.1 (now granted, the distinction is channel layout, not who was first with discrete channels). Last Action Hero was not the first movie mixed for and released in Dolby 7.1: it was SDDS. AS ALL SOURCES SAY, TOY STORY 3 WAS THE FIRST MOVIE MIXED FOR DOLBY 7.1. So bearing that in mind, how many theaters in your statistic had more than two discrete surround channels in 1999? None. If you're going to claim the SDDS mix in Hellboy (2004) was the same 7.1 mix used on Blu-ray, then you're clearly wrong. The surround formats listed for Hellboy are DD, DTS, and D-Cinema 5.1: they had to upmix for the 7.1 surround on BD (which doesn't utilize multiple center channels, but has multiple surround channels).

Another example: look up Dolby Digital and it lists all the "versions". Dolby Digital Plus uses E-AC-3, a codec based on the earlier AC-3. Previously you've said I don't know what a track is because, for some reason, it would be different in an audio program vs. a video production program. That analogy is like refusing to acknowledge that h.263 and h.264 are separate specs that still fall within MPEG-4.
 
Aug 1, 2019 at 1:38 AM Post #13,382 of 17,336
I seem to argue with you because you claim to be an expert in everything sound, even resorting to refusing to acknowledge terminology Dolby uses! An expert who thought only UHD discs support TrueHD? For example, you're now conflating 7.1 SDDS (which uses five screen channels and two surround channels) with Dolby Surround 7.1 (now granted, the distinction is channel layout, not who was first with discrete channels). Last Action Hero was not the first movie mixed for and released in Dolby 7.1: it was SDDS. AS ALL SOURCES SAY, TOY STORY 3 WAS THE FIRST MOVIE MIXED FOR DOLBY 7.1. So bearing that in mind, how many theaters in your statistic had more than two discrete surround channels in 1999? None. If you're going to claim the SDDS mix in Hellboy (2004) was the same 7.1 mix used on Blu-ray, then you're clearly wrong. The surround formats listed for Hellboy are DD, DTS, and D-Cinema 5.1: they had to upmix for the 7.1 surround on BD (which doesn't utilize multiple center channels, but has multiple surround channels).

Another example: look up Dolby Digital and it lists all the "versions". Dolby Digital Plus uses E-AC-3, a codec based on the earlier AC-3. Previously you've said I don't know what a track is because, for some reason, it would be different in an audio program vs. a video production program. That analogy is like refusing to acknowledge that h.263 and h.264 are separate specs that still fall within MPEG-4.

You didn't specifically call out Dolby Surround 7.1; when I read it, I had the same reaction as Gregorio: that 7.1 has been around for decades.
From a mixing perspective for film, I consider all speakers except the center and LFE to be part of the surround. In non-Atmos surround formats the channels are all discrete; it doesn't really matter as long as they can downmix to the release encodings. Atmos could in theory output any format, even SDDS. Atmos systems with screens wider than 12 meters (40 feet) use five front speakers, as SDDS did. Five front speakers don't make any sense on smaller screens.
 
Aug 1, 2019 at 7:06 AM Post #13,383 of 17,336
You didn't specifically call out Dolby Surround 7.1; when I read it, I had the same reaction as Gregorio: that 7.1 has been around for decades.
From a mixing perspective for film, I consider all speakers except the center and LFE to be part of the surround. In non-Atmos surround formats the channels are all discrete; it doesn't really matter as long as they can downmix to the release encodings. Atmos could in theory output any format, even SDDS. Atmos systems with screens wider than 12 meters (40 feet) use five front speakers, as SDDS did. Five front speakers don't make any sense on smaller screens.

I was assuming we were talking about 7.1 surround (as defined by Dolby) based on the context of the posts: nowhere were we talking about SDDS. Instead it was the evolution of 5.1 to Atmos, and Gregorio claimed 7.1 (in the context of surround) had been around for years, even going so far as to claim a 7.1 cinema mix could be the basis of the Hellboy BD 7.1 track (where, whether SDDS or other formats, the theatrical release had two-channel surround). Sorry for the confusion of discrete vs. surround config: I myself wasn't thinking of SDDS, as the topic was 7.1 surround on Blu-ray (and I now realize Gregorio was referencing SDDS with his claim of theatrical 7.1 in the 90s).
 
Aug 1, 2019 at 10:22 AM Post #13,384 of 17,336
The problem is that, when you start talking about surround sound, that term becomes even more vague.
(Even past the fact that what's "audibly transparent" to one person may not be to another... and that what's "audibly transparent" on certain content may not be on other content.)

Here's an easy example.....

Some of the early compression CODECS saved a lot of space by making the assumption that we aren't especially perceptive about the details of high frequency content.
It was particularly assumed that, while the amount of high frequency spectral content may be quite audible, we aren't very sensitive to the details.
As a result, at least one or two of the early surround CODECS simply didn't bother to store the information in the surround channels in the upper frequency bands.
During the encoding process, when they were analyzing what was present in each frequency band, if they detected what seemed to be "decorrelated noise" in the surround channels, they would simply discard it.
All they stored was a very generalized piece of data about "how much sound was present in that band".
Then, when decoding that content, they would simply "fill in" the upper bands in the surround channels with "about the right amount of decorrelated noise".
(I seem to recall that one particular CODEC would simply save that band from one channel - then duplicate it to all of the channels when decoding the content.)
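A minimal numeric sketch of that band-energy trick, assuming a made-up codec: the encoder keeps only the measured energy of a high-frequency surround band, and the decoder fills the band back in with decorrelated noise scaled to that energy. The band size and levels here are invented for illustration, not taken from any real codec spec.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_band(samples):
    """Hypothetical encoder step: instead of storing the samples in a
    high-frequency surround band, store only their RMS energy."""
    return float(np.sqrt(np.mean(samples ** 2)))

def decode_band(energy, n):
    """Hypothetical decoder step: fill the band with decorrelated noise
    scaled to match the stored energy."""
    noise = rng.standard_normal(n)
    noise *= energy / np.sqrt(np.mean(noise ** 2))
    return noise

original = rng.standard_normal(1024) * 0.25  # stand-in for one band of one surround channel
stored = encode_band(original)               # the only thing that survives encoding
rebuilt = decode_band(stored, len(original))

# The energy matches, but the actual waveform is unrelated noise:
print(round(stored, 3), round(encode_band(rebuilt), 3))
```

The point the toy example makes concrete: the decoded band has "about the right amount" of sound, yet shares essentially no sample-level correlation with what was originally there.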

This is NOT unprecedented in VOICE reproduction.
Many advanced telephone CODECs don't actually store the voice at all.
They break it down into small bits of sound - which can then be stored as "coefficients", and then "rebuilt" later from that information (in general terms this is called "tokenizing").
So, for example, they might find that a certain spoken sound in my voice is equivalent to "a 50 msec burst of noise in band 2 at level 5 mixed with a 50 msec burst in band 5 at level 7 followed by a 100 msec burst in band 3 at level 2".
They would then store this information - and identify it (for example as "sound #24").
Then, at the receiving end, when told to do so, the decoder would "play a copy of sound #24".
(The process is somewhat similar to MIDI.)
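The tokenizing idea above can be sketched in a few lines. This is a toy model only: the burst descriptions, band names, and codebook scheme are invented for illustration and don't correspond to any real telephone codec.

```python
# Toy "tokenizing" codec: a sound is described as a sequence of
# (band, level, duration-in-ms) bursts rather than stored as a waveform.

codebook = {}  # token id -> tuple of burst descriptions

def tokenize(bursts):
    """Sending end: store a description once, return a compact token id."""
    token = len(codebook) + 1
    codebook[token] = tuple(bursts)
    return token

def play(token):
    """Receiving end: rebuild the sound from its stored description."""
    return codebook[token]

# "Sound #1": two 50 ms bursts, then a 100 ms burst, as in the example above.
sound = tokenize([("band2", 5, 50), ("band5", 7, 50), ("band3", 2, 100)])
print(sound, play(sound))
```

Only the small token travels over the wire; the receiver resynthesizes from its copy of the codebook, which is also why sounds with no good codebook match (background noise) come out as chirpy artifacts.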

(Imagine a video transport system where a person at one end watches the action - then describes it over a phone to a remarkably fast artist at the other end.
The artist at the other end then DRAWS what the sender sees based on their description. You would end up with the equivalent of "a cartoon that looks very much like the original".)

You may have experienced this if you've ever had a cell phone conversation where the voice was quite intelligible but any background noise came through as odd electronic chirping noises....
Some of these CODECs, especially the early ones, actually handled voice quite well, but were confused by unusual sounds they were unable to "understand" and "deconstruct" - like background noise.
This was quite noticeable on many early "Internet phone systems".

Similar "decisions" are made all the time when applying compression to VIDEO content... with similar questions about whether they are "visible" or not.

I'm going to regale you with an example I saw on a DVD, which demonstrates the question very well.
In a certain very old disaster movie about a tornado.... one scene takes place in front of a background of very dark rapidly swirling clouds.
In the original VHS tape versions of this movie the clouds could very clearly be seen to swirl throughout the entire scene... along with a significant amount of tape background noise.
However, in the DVD version of the same movie, in that same scene, the clouds DO NOT SWIRL (they change once or twice but essentially remain stationary).
This example is striking because, if you'd never seen the movie before, you would have said that "the DVD looked quite good"... and never missed the movements in the clouds.
However, if you were familiar with the tape version, or the original movie, it was obvious that the DVD version did not reproduce it accurately at all.
(And, apparently, even though there is a lot of random tape noise, we humans can easily discern the difference between swirling clouds and tape noise.)

The reason this happened is obvious (if you're familiar with video encoding for DVDs).
Because noise, like tape noise, is in fact random, it doesn't compress efficiently, so accurately recording tape noise requires a lot of bandwidth.
In the CODEC used for DVDs, bandwidth is allocated intelligently.
And, in general, noise is something that most people prefer not to see, so you normally want to remove it anyway.
So, as part of the process, noise is filtered out before compression is applied, so as to preserve more bandwidth for useful information by avoiding "wasting bandwidth on noise".
(The choice of what to filter out can be controlled manually - but can also be done automatically in many encoders.)
In this particular scene, because the swirling clouds are very dark, and contain little information, the algorithms have "decided" that the swirling is "noise" and filtered it out.

Another way of looking at it would be to say that the encoder has substituted static clouds for the original "unimportant" swirling clouds in order to save space for more "important" information.
(It is performing "priority based perceptual encoding".)
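How faint motion can get "denoised" away is easy to demonstrate numerically. This toy example (invented frame sizes, brightness values, and noise levels, not any real encoder's pre-filter) averages each frame with its neighbours, which suppresses the tape noise and the dim drifting detail together:

```python
import numpy as np

rng = np.random.default_rng(1)

# Eight frames of a very dark sky: a faint drifting blob (the "swirl")
# plus analog tape noise. Values are 8-bit-style luma, purely illustrative.
frames = []
for t in range(8):
    frame = np.full((16, 16), 20.0)            # near-black background
    frame[4 + t % 4, 4:8] += 6.0               # dim cloud detail that drifts
    frame += rng.normal(0, 3.0, frame.shape)   # tape noise
    frames.append(frame)

# A crude temporal denoiser: average each frame with its neighbours.
denoised = [(frames[max(t - 1, 0)] + frames[t] + frames[min(t + 1, 7)]) / 3
            for t in range(8)]

def motion(seq):
    """Mean absolute frame-to-frame change: a rough 'how much moves' score."""
    return float(np.mean([np.abs(seq[t + 1] - seq[t]).mean() for t in range(7)]))

# The filter removes noise, but it also flattens the faint drifting detail,
# so the filtered scene "moves" far less than the original did.
print(motion(frames), motion(denoised))
```

Because the dim blob's amplitude is of the same order as the noise, any filter aggressive enough to kill the noise kills the swirl too, which is essentially the trade-off the DVD encoder (or operator) made.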

It is in fact possible that, in this case, rather than the encoder, a human operator CHOSE to set the filtering at a level that would wipe out the swirling in the clouds.
However, the result is the same.....

Even though many viewers may PREFER the smoother filtered version....
We cannot reasonably claim that "the encoding is 'visibly transparent' to the original"....
(And it's quite obvious that "the original artistic intent" called for "ominously swirling dark clouds".)
So, if you were an aficionado of bad old disaster movies, would you prefer to see the encoded version or an ACCURATE reproduction of the original?
(Unfortunately, in this case, unless you were to acquire a theatrical master copy, you would be forced to choose between the tape noise from the VHS version, and the "smoothing errors" on the otherwise excellent DVD transfer.)

TO BRING THE CONTEXT BACK TO THIS DISCUSSION....

Unless you have the lossless copy of a file, encode it yourself, and compare the two, can you TRUST the encoding process to never make similar "editorial decisions"?
(And, even if you confirm that ten files you encode and carefully compare are "audibly transparent", are you willing to believe and trust that EACH AND EVERY FILE encoded by someone else will be audibly transparent?)

Personally, not being a major aficionado of old movies, I'm willing to concede that "most DVDs look as good or better than the VHS version", and that's plenty good for me... so I'd rather have the DVD.
However, I'm simply not willing to make a similar concession for music.

I've got no real use for "beyond audible transparency".

What I meant about data rate was overall data rate. Usually, the data rate is sufficient to achieve transparency. But transparency for 2 channels is a different data rate than transparency for 7.1. More pieces cut from the pie mean a bigger pie is needed.
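The "bigger pie" point is just arithmetic: if a lossy codec needs roughly some fixed rate per channel to sound transparent, total rate scales with channel count. The 128 kbps/channel figure below is an assumption for illustration, not a published transparency threshold.

```python
# Assumed per-channel rate for transparency (illustrative, not a standard).
PER_CHANNEL_KBPS = 128

def total_kbps(channels):
    """Total bitrate needed if each channel gets the same slice of pie."""
    return PER_CHANNEL_KBPS * channels

print(total_kbps(2))  # stereo -> 256
print(total_kbps(8))  # 7.1 (seven full-range channels + LFE) -> 1024
```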
 
Aug 1, 2019 at 12:13 PM Post #13,385 of 17,336
I only know what sort of mixes are on DVD and blu-ray. Since I got my home theater, I don't go to movies any more. The experience is better at home. Better picture, better sound, cheaper, more convenient.

Sometimes I feel like Gulliver watching the Kings argue about which end of the egg to break.
 
Aug 1, 2019 at 12:50 PM Post #13,386 of 17,336
I absolutely agree - about theater movies.

We have a local "big iMax theater".

The picture is really impressive, and large, and they do 3D, and the audio is really powerful, and plays really loud....
But it doesn't sound as good as what I have in my living room.

I only know what sort of mixes are on DVD and blu-ray. Since I got my home theater, I don't go to movies any more. The experience is better at home. Better picture, better sound, cheaper, more convenient.

Sometimes I feel like Gulliver watching the Kings argue about which end of the egg to break.
 
Aug 1, 2019 at 3:23 PM Post #13,387 of 17,336
Similar "decisions" are made all the time when applying compression to VIDEO content... with similar questions about whether they are "visible" or not.

I'm going to regale you with an example I saw on a DVD, which demonstrates the question very well.
In a certain very old disaster movie about a tornado.... one scene takes place in front of a background of very dark rapidly swirling clouds.
In the original VHS tape versions of this movie the clouds could very clearly be seen to swirl throughout the entire scene... along with a significant amount of tape background noise.
However, in the DVD version of the same movie, in that same scene, the clouds DO NOT SWIRL (they change once or twice but essentially remain stationary).
This example is striking because, if you'd never seen the movie before, you would have said that "the DVD looked quite good"... and never missed the movements in the clouds.
However, if you were familiar with the tape version, or the original movie, it was obvious that the DVD version did not reproduce it accurately at all.
(And, apparently, even though there is a lot of random tape noise, we humans can easily discern the difference between swirling clouds and tape noise.)

The reason this happened is obvious (if you're familiar with video encoding for DVDs).
Because noise, like tape noise, is in fact random, it doesn't compress efficiently, so accurately recording tape noise requires a lot of bandwidth.
In the CODEC used for DVDs, bandwidth is allocated intelligently.
And, in general, noise is something that most people prefer not to see, so you normally want to remove it anyway.
So, as part of the process, noise is filtered out before compression is applied, so as to preserve more bandwidth for useful information by avoiding "wasting bandwidth on noise".
(The choice of what to filter out can be controlled manually - but can also be done automatically in many encoders.)
In this particular scene, because the swirling clouds are very dark, and contain little information, the algorithms have "decided" that the swirling is "noise" and filtered it out.

Another way of looking at it would be to say that the encoder has substituted static clouds for the original "unimportant" swirling clouds in order to save space for more "important" information.
(It is performing "priority based perceptual encoding".)

It is in fact possible that, in this case, rather than the encoder, a human operator CHOSE to set the filtering at a level that would wipe out the swirling in the clouds.
However, the result is the same.....

Even though many viewers may PREFER the smoother filtered version....
We cannot reasonably claim that "the encoding is 'visibly transparent' to the original"....
(And it's quite obvious that "the original artistic intent" called for "ominously swirling dark clouds".)
So, if you were an aficionado of bad old disaster movies, would you prefer to see the encoded version or an ACCURATE reproduction of the original?
(Unfortunately, in this case, unless you were to acquire a theatrical master copy, you would be forced to choose between the tape noise from the VHS version, and the "smoothing errors" on the otherwise excellent DVD transfer.)

TO BRING THE CONTEXT BACK TO THIS DISCUSSION....

Unless you have the lossless copy of a file, encode it yourself, and compare the two, can you TRUST the encoding process to never make similar "editorial decisions"?
(And, even if you confirm that ten files you encode and carefully compare are "audibly transparent", are you willing to believe and trust that EACH AND EVERY FILE encoded by someone else will be audibly transparent?)

Personally, not being a major aficionado of old movies, I'm willing to concede that "most DVDs look as good or better than the VHS version", and that's plenty good for me... so I'd rather have the DVD.
However, I'm simply not willing to make a similar concession for music.

Home video has vastly improved since whatever reference you're comparing for DVD vs. videotape. DVD matured over the years with improved encoding techniques and film scanning (telecine). There also isn't much inherent "noise" in digital authoring, since the only ADC is the scanner: there is grain from the source film, and there can be artifacting from the encode you're using. Early DVDs suffered more from compression artifacts and from Digital Noise Reduction (where, during telecine, algorithms try to reduce grain, which can also remove detail present in the film and add contrast ringing around edges). For some time now, restorations have scanned 35mm film at 4K and 70mm at 8K: maximizing scan resolution also means better grading (and since videophiles complain to studios, studios have been less heavy-handed with DNR, even re-issuing Blu-ray titles). The main advantage of 4K for home applications isn't so much resolution as greater dynamic range: more colors and tonality in scenes with higher peak brightness, while retaining shadow detail. Blu-ray and 4K UHD discs are also possible thanks to more efficient video codecs than DVD's: DVD was MPEG-2, while BD is MPEG-4 AVC (h.264), which allows greater compression with minimal artifacts. As a physical medium BD can have multiple layers, but most home movie releases seem to be BD-50 for 1080p movies and BD-66 for UHD discs. UHD 4K can be compressed further still, as it utilizes a newer codec, h.265 (MPEG-H HEVC).
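As a rough sanity check on those disc sizes, here's the capacity arithmetic. These use nominal marketing gigabytes (10^9 bytes) and pretend the whole disc holds one video stream; real discs also carry audio, extras, and filesystem overhead, so treat these as ceilings rather than actual authoring bitrates.

```python
def avg_video_mbps(disc_gb, runtime_minutes):
    """Average video bitrate if the entire disc held one stream."""
    bits = disc_gb * 1e9 * 8            # nominal GB -> bits
    seconds = runtime_minutes * 60
    return bits / seconds / 1e6         # -> megabits per second

# For a two-hour film:
print(round(avg_video_mbps(8.5, 120), 1))  # dual-layer DVD (MPEG-2)
print(round(avg_video_mbps(50, 120), 1))   # BD-50 (h.264)
print(round(avg_video_mbps(66, 120), 1))   # UHD BD-66 (h.265)
```

The ratios show why codec efficiency matters as much as capacity: a BD-50 has roughly six times the DVD's budget, yet carries about five times the pixels plus higher bit depth, so h.264's better compression is doing real work.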

I get better picture quality from my calibrated OLED TV at home than from my local theaters... so, for me, both my picture and sound are better at home than at the cinema (and it's great that older movies and some TV shows look and sound better than they originally did).
 
Aug 1, 2019 at 4:32 PM Post #13,388 of 17,336
I have DVDs that look almost as good as blu-rays. When DVDs went anamorphic, it was a huge improvement. I buy as many DVDs as I do blu-rays nowadays.
 
Aug 1, 2019 at 4:50 PM Post #13,389 of 17,336
I have DVDs that look almost as good as blu-rays. When DVDs went anamorphic, it was a huge improvement. I buy as many DVDs as I do blu-rays nowadays.

That means you can buck the trend of having to replace your DVD or BD titles with the latest and greatest :blush:. I'm a shutterbug and videophile, so I like collecting 4K now. Some movies that have been remastered also get a new encode from that source to BD (possibly making a better picture than a previous release). The merits of high-resolution/high-dynamic-range audio may be debatable, but video can still mature (as displays get bigger and the photography process can more easily capture the entire dynamic range of a scene). Also, I've noticed prices on all disc formats seem to be dropping (probably because demand is shifting to streaming).
 
Aug 1, 2019 at 5:28 PM Post #13,390 of 17,336
I have a projection system with a ten foot screen. Using the THX standards, if I sat close enough to the screen to see the difference between 1080 and 4K, the edges of the screen would be in my peripheral vision. To me, the main advantage of blu-ray and 4K is color accuracy, but again, with a projection system, I'm not likely to be able to see the finer points of that. However seeing films projected in the dark on a big screen is a huge benefit. I don't watch TV any more. I just screen movies... even if they are TV episodes.
 
Aug 1, 2019 at 5:41 PM Post #13,391 of 17,336
I have a projection system with a ten foot screen. Using the THX standards, if I sat close enough to the screen to see the difference between 1080 and 4K, the edges of the screen would be in my peripheral vision. To me, the main advantage of blu-ray and 4K is color accuracy, but again, with a projection system, I'm not likely to be able to see the finer points of that. However seeing films projected in the dark on a big screen is a huge benefit. I don't watch TV any more. I just screen movies... even if they are TV episodes.

The HD remastering for Star Trek: The Next Generation was pretty impressive. The original VFX were filmed in passes on VistaVision film and then scanned for analog SD editing. For the digital HD remaster, they re-scanned the original film and digitally composited the passes (making the quality as good as a movie's). I find broadcast cable pretty bad by today's standards; no wonder people watch most TV shows streaming. Some services like Netflix even have original programming in Atmos and Dolby Vision. The main disadvantage of streaming a lot of 4K content is running into your internet provider's data caps.
 
Aug 1, 2019 at 5:51 PM Post #13,392 of 17,336
I'm lucky. Here in Los Angeles, Netflix and Criterion Channel streaming are rock solid and zippy. (Those are the streaming services I use.) I've never liked cable. I haven't had it since the early 90s. I just took the hundred dollars a month cable cost and bought discs with it. I buy a ton of DVDs of TV shows from British TV at Amazon UK. There are fantastic crime shows there and the quality of the DVD authoring is really good.
 
Aug 1, 2019 at 6:06 PM Post #13,393 of 17,336
I'm lucky. Here in Los Angeles, Netflix and Criterion Channel streaming are rock solid and zippy. (Those are the streaming services I use.) I've never liked cable. I haven't had it since the early 90s. I just took the hundred dollars a month cable cost and bought discs with it. I buy a ton of DVDs of TV shows from British TV at Amazon UK. There are fantastic crime shows there and the quality of the DVD authoring is really good.

Oh, I have no issues with internet speeds in Atlanta: it's just that if you're watching a lot of 4K, you're using more data and are likely to hit your data allocation for the month (and then I'd be charged an additional $10 for every 50 GB). If you like European shows, there are also Acorn and MHz.
 
Aug 1, 2019 at 7:03 PM Post #13,394 of 17,336
I have a projection system with a ten foot screen. Using the THX standards, if I sat close enough to the screen to see the difference between 1080 and 4K, the edges of the screen would be in my peripheral vision. To me, the main advantage of blu-ray and 4K is color accuracy, but again, with a projection system, I'm not likely to be able to see the finer points of that. However seeing films projected in the dark on a big screen is a huge benefit. I don't watch TV any more. I just screen movies... even if they are TV episodes.
I have a 110" screen and the Epson 5050UB. I'm now going to (attempt to) sell my LG OLED65C7 and the LG HU80KA. I have the screen mounted on an aluminum frame, and I just push it aside when I want to watch TV on my 65". These last few weeks, though, I've decided to just stick with the projector.
It's not as good as the OLED, but it's gorgeous enough that I love the size of the screen more than the better specular highlights I get on the OLED. The Epson's black levels are divine (I have a high-contrast grey screen), and it resolves grain like a champ. I've gone full projector now!
 
Aug 1, 2019 at 7:39 PM Post #13,395 of 17,336
The other thing I've noticed with higher-resolution displays is that it's easier to see what's in focus (one big contributor to perceived detail): with my 4K computer monitor, I see a shallower depth of field in my photos. With 4K movies, I can see if shots are slightly out of focus (or, more regularly, wide shots filmed with anamorphic lenses that show soft lens distortion at the top and bottom of the frame). I was surprised to see a few scenes of The Dark Knight that were slightly out of focus.
 