Is Your S/PDIF Bit Perfect? Here's a Test!

Jan 1, 2010 at 7:46 PM Post #16 of 33
Quote:

Originally Posted by grokit
Not to hijack or be disrespectful, but what's the deal with bit perfect? Does upsampling increase latency that much? If you are just listening and not trying to capture live sound, does it really make a difference?

I'm asking because I cannot find the term "bit perfect" in FallenAngel's "Before asking about digital audio: READ this Digital Audio Primer"



In some cases it can seriously degrade the sound.

If the upsampling algorithm does a bad job it will reduce sound quality.

On my original Audigy card years ago, the difference was huge. The day I started using the upsampling in Foobar instead of letting the card do it was a very good day. I'd spent a lot of time trying to figure out why my PC sounded so much worse than my Toshiba DVD player for audio. Digital is digital, right?

If the upsampling is done well, the effect may be imperceptible. Personally, it's something I'd rather not have to worry about.
 
Jan 1, 2010 at 11:52 PM Post #17 of 33
Quote:

Originally Posted by grokit
Not to hijack or be disrespectful, but what's the deal with bit perfect? Does upsampling increase latency that much? If you are just listening and not trying to capture live sound, does it really make a difference?

I'm asking because I cannot find the term "bit perfect" in FallenAngel's "Before asking about digital audio: READ this Digital Audio Primer"



It's not so much that upsampling is always bad.

Back in the bad old days, Microsoft decided that PC audio devices should run at 48 kHz, and if the codec supported no other clock, everything else would be resampled to 48 kHz. I've heard that the reasoning behind this was that the old Dolby Digital 5.1 compressed streams use the 48 kHz clock. Supporting only the one clock for the DAC also made the hardware cheaper and the driver programming easier.

That was the AC97 spec. A lot of hardware vendors read it not just as an excuse but as a reason to fix their audio clock at 48 kHz.

Creative Labs was among them. Even their high-end cards had this 'feature'. The big difference between the SB Live and the Audigy was that the upsampler in the Audigy was slightly better.

In the ensuing 13 years, most hardware vendors have realized that strict adherence to AC97 is a dumb idea.

These days even most cheap parts at least support multiple clocks for S/PDIF output, and many support multiple clocks in the DAC as well.

The "test" in the linked article, from a QA perspective (and I'm a QA guy), is a bad test. What it basically tells you to do is push a 1.5 Mbit/s DTS stream over S/PDIF and see if it works.

What that actually proves, as near as I can tell, is whether your audio device can do the right clock for 96 kHz S/PDIF.

There are some parts that support the S/PDIF clocks necessary for surround-sound output but don't support 44.1 kHz. Not many, but they're out there. People can also get confused and push a Dolby 5.1 stream that uses the 48 kHz clock, which would give a false positive as well.

Aside from outputting to a device that will tell you what it's receiving, there isn't a quick and dirty test.

If you know what codec part your computer has, you can look up the datasheet and see whether it supports the 44.1 kHz S/PDIF clock that almost all of your audio needs.

As for why upsampling is bad: it's more that short jumps are bad. Most hardware upsampling is done quick, dirty, and cheap, and it introduces aliasing into the waveform. To avoid aliasing in a jump between 44.1 kHz and 48 kHz, you'd have to resample up to 192 kHz or so and then back down.

Plenty of good DAC parts use upsampling in hardware, running the DAC at some crazy high rate and resampling all audio to that rate. Theoretically the aliasing is unavoidable, but the high sample rate means it's very slight, and the high rate also reduces the noise inherent in the DAC itself, so the end result is lower distortion.
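The "up and then back down" picture can be made concrete with a little stdlib arithmetic. A clean 44.1 kHz to 48 kHz conversion reduces to the rational ratio 160/147, and the common-multiple rate a textbook upsample-filter-decimate chain conceptually passes through is far above 192 kHz (real converters use polyphase filters, so they never compute that intermediate stream explicitly). A sketch of the numbers, nothing hardware-specific:

```python
from math import gcd

fs_in, fs_out = 44100, 48000

# Reduce the conversion to a rational ratio L/M:
g = gcd(fs_in, fs_out)
L, M = fs_out // g, fs_in // g     # upsample by L, decimate by M

# The conceptual intermediate rate a textbook L/M converter works at:
intermediate = fs_in * L           # == fs_out * M

print(L, M)                        # 160 147
print(intermediate)                # 7056000 Hz, about 7.06 MHz
```

The awkwardness of that ratio (versus, say, 48 kHz to 96 kHz, which is a clean 2/1) is exactly why cheap hardware cuts corners on the anti-aliasing filter for this particular jump.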
 
Jan 2, 2010 at 1:06 AM Post #18 of 33
Quote:

Originally Posted by ericj
It's not so much that upsampling is always bad.

Back in the bad old days, Microsoft decided that PC audio devices should run at 48 kHz, and if the codec supported no other clock, everything else would be resampled to 48 kHz. I've heard that the reasoning behind this was that the old Dolby Digital 5.1 compressed streams use the 48 kHz clock. Supporting only the one clock for the DAC also made the hardware cheaper and the driver programming easier.

That was the AC97 spec. A lot of hardware vendors read it not just as an excuse but as a reason to fix their audio clock at 48 kHz.

Creative Labs was among them. Even their high-end cards had this 'feature'. The big difference between the SB Live and the Audigy was that the upsampler in the Audigy was slightly better.

In the ensuing 13 years, most hardware vendors have realized that strict adherence to AC97 is a dumb idea.

These days even most cheap parts at least support multiple clocks for S/PDIF output, and many support multiple clocks in the DAC as well.

The "test" in the linked article, from a QA perspective (and I'm a QA guy), is a bad test. What it basically tells you to do is push a 1.5 Mbit/s DTS stream over S/PDIF and see if it works.

What that actually proves, as near as I can tell, is whether your audio device can do the right clock for 96 kHz S/PDIF.

There are some parts that support the S/PDIF clocks necessary for surround-sound output but don't support 44.1 kHz. Not many, but they're out there. People can also get confused and push a Dolby 5.1 stream that uses the 48 kHz clock, which would give a false positive as well.

Aside from outputting to a device that will tell you what it's receiving, there isn't a quick and dirty test.

If you know what codec part your computer has, you can look up the datasheet and see whether it supports the 44.1 kHz S/PDIF clock that almost all of your audio needs.

As for why upsampling is bad: it's more that short jumps are bad. Most hardware upsampling is done quick, dirty, and cheap, and it introduces aliasing into the waveform. To avoid aliasing in a jump between 44.1 kHz and 48 kHz, you'd have to resample up to 192 kHz or so and then back down.

Plenty of good DAC parts use upsampling in hardware, running the DAC at some crazy high rate and resampling all audio to that rate. Theoretically the aliasing is unavoidable, but the high sample rate means it's very slight, and the high rate also reduces the noise inherent in the DAC itself, so the end result is lower distortion.



What a pleasure to read a post that has been well written and so easy to understand.
 
Jan 2, 2010 at 10:28 AM Post #19 of 33
Quote:

Originally Posted by ford2
What a pleasure to read a post that has been well written and so easy to understand.


Yes, 2nd that, and thanks to everyone else who contributed as well


Coming from someone who has evolved using Apple machines since before the iPod/iTunes, or even the Mac for that matter, I am blown away by what Windows users have to bypass in the audio stream to avoid latency and distortion and so forth. I knew about the 44.1 kHz CD vs. 48 kHz DVD issue, but the explanation of the aliasing issue was very helpful; is "dirty and cheap" aliasing and its resultant "short jumps" the cause of "jitters"?

Also, what the other more anecdotal posts have me "grokking" so far is that if I have lossless files and a quality external DAC, up-conversion is not as much of an issue; if my analog conversion is an internal sound card or suspect in any way, then bit-perfect is more desirable, and even more so if my music files are lossy.

I have even read people arguing that a good up-sampling 24/192 DAC can improve on a 16/44.1 audio file, but there are usually those who beg to differ, citing "snake oil"


I have definitely learned about using external DACs and am actively comparing different flavors, but I skipped the whole sound card thing (for the most part), as Apple's iPod/iTunes is what led me back into active listening, and ironically back to my turntable, before I came back to the Mac for music again.


I have used many Windows machines of various flavors and have owned a few over the years, but I always worked with a Mac as well and stuck with it for personal stuff and entertainment. I don't mind using Windows, now that it has a GUI, but I can't stand driver issues.


I saw a flowchart-type graphic recently illustrating plainly what is avoided by downloading external ASIO drivers and exactly what they bypass in the various Windows OSes, but I can't seem to find it again.

 
Jan 5, 2010 at 2:23 AM Post #20 of 33
I'm really not sure about the explanation the OP provided for knowing whether an output is bit perfect.

What you get at the output of Foobar is a 5-channel PCM signal; I don't see why the Creative card would not be able to resample that.

On the Creative website, the specs of the Audigy 2 ZS say:
"Playback: 24-bit Digital-to-Analog conversion of digital sources at 96 kHz to analog 7.1 speaker output, 192kHz for Stereo DVD-A"

It's able to handle a 7.1 AC-3/DTS compressed signal, which means that after decompression the sound card works with a 7-channel PCM signal. It may resample the 7 channels separately.

So logically, unless there's a bit of information I missed, this little test proves absolutely nothing.
 
Jan 5, 2010 at 3:02 AM Post #21 of 33
Quote:

Originally Posted by khaos974
I'm really not sure about the explanation the OP provided for knowing whether an output is bit perfect.

What you get at the output of Foobar is a 5-channel PCM signal; I don't see why the Creative card would not be able to resample that.

On the Creative website, the specs of the Audigy 2 ZS say:
"Playback: 24-bit Digital-to-Analog conversion of digital sources at 96 kHz to analog 7.1 speaker output, 192kHz for Stereo DVD-A"

It's able to handle a 7.1 AC-3/DTS compressed signal, which means that after decompression the sound card works with a 7-channel PCM signal. It may resample the 7 channels separately.

So logically, unless there's a bit of information I missed, this little test proves absolutely nothing.



No, you are playing a 2-channel PCM signal. You can't send more than 2 channels of PCM audio over S/PDIF anyway.

To Foobar, it's playing a stereo PCM signal just like from a CD, but it is encoded as DTS, so the receiver recognizes it and decodes it as such. This "trick" is how DTS put their codec on standard audio CDs.

The bit you quoted is about the sound card decoding DTS to several analog channels. But the test involves S/PDIF, not analog, and there is nothing in Foobar that would instruct the sound card to decode the DTS to stereo beforehand. I thought it went without saying that you need to make sure there isn't a DSP decoding DTS in Foobar; there should be no DSPs used at all. Besides, we're checking that the DTS light comes on to confirm it's really DTS being sent.
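The numbers behind the DTS-on-CD trick are easy to check. A stereo 16-bit/44.1 kHz PCM stream carries a fixed payload, and any DTS bitstream smuggled inside it has to fit under that ceiling. A quick back-of-the-envelope check (plain arithmetic, nothing S/PDIF-specific):

```python
# Payload of a stereo 16-bit / 44.1 kHz PCM stream -- the "carrier"
# that DTS frames are packed into on a DTS-encoded audio CD.
channels = 2
bits_per_sample = 16
sample_rate = 44100

payload_bps = channels * bits_per_sample * sample_rate
print(payload_bps)   # 1411200 bits/s, i.e. about 1.41 Mbit/s
```

That ceiling is why any resampling anywhere in the chain breaks the trick: the receiver is parsing those exact bits as DTS frames, and a resampler rewrites them into something that is no longer a valid bitstream.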
 
Jan 5, 2010 at 6:04 PM Post #22 of 33
Quote:

Originally Posted by grokit
I knew about the 44.1 kHz CD vs. 48 kHz DVD issue, but the explanation of the aliasing issue was very helpful; is "dirty and cheap" aliasing and its resultant "short jumps" the cause of "jitters"?


Well, it's not really jitter. Jitter is when a bit within a word or a word within a stream is slightly delayed.

Judder is when it's so bad that you can hear gaps in the sound, and this is really my consternation with Windows audio: I haven't heard any of my Linux machines kick the audio in the nuts just to hit the page file or update the graphics in a window in about 10 years, but I can't get my current Windows machines to completely stop doing it.

The resampling problem is both that short jumps in sample rate are hard to do well, and that when you resample cheaply you're not resampling as well as you could.

There are a lot of ways to resample digital audio, and a low-end sound chip will use the method that takes the fewest transistors.

I tried to get away with bad motherboard S/PDIF for a Linux HTPC system years ago, and I found that the hardware-resampled S/PDIF had a harsh, almost screechy sound to it. mplayer's simple '-srate' argument for fast resampling sounded somewhat better, and if I got fancy with the '-af lavcresample' arguments it could sound acceptable, but I couldn't bring myself to put up with the hack and got a better motherboard.
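For a sense of scale on jitter (as opposed to judder): the textbook worst-case bound says clock jitter of t_j seconds limits the SNR of a full-scale sine at frequency f to 20·log10(1/(2π·f·t_j)). A small sketch of that formula (standard math, not a measurement of any particular device):

```python
import math

def jitter_snr_db(freq_hz: float, jitter_s: float) -> float:
    """Worst-case SNR limit for a full-scale sine at freq_hz
    sampled with RMS clock jitter of jitter_s seconds."""
    return 20 * math.log10(1 / (2 * math.pi * freq_hz * jitter_s))

# 1 ns of jitter on a 20 kHz tone still leaves roughly 78 dB of SNR,
# which is why small amounts of jitter are hard to hear.
print(round(jitter_snr_db(20e3, 1e-9), 1))
```

Judder is a different animal entirely: dropped or late buffers, which no amount of DAC quality can hide.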

Quote:

Originally Posted by grokit
I have even read people arguing that a good up-sampling 24/192 DAC can improve on a 16/44.1 audio file, but there are usually those who beg to differ, citing "snake oil"



Ehh, you can't put back anything that was already taken away. It's a bit like arguing that you can interpolate a hamburger to be more like a steak.

It's not the stream that is improved; it's the noise floor of the DAC.
 
Jan 6, 2010 at 7:49 AM Post #23 of 33
Quote:

Originally Posted by ericj
It's a bit like arguing that you can interpolate a hamburger to be more like a steak.


More like a Salisbury steak, perhaps!
 
Jan 6, 2010 at 5:28 PM Post #25 of 33
Quote:

Originally Posted by grokit
More like a Salisbury steak, perhaps!


More like cubesteak.

Nothing wrong with cubesteak, as long as you batter it and fry it and serve it with sausage gravy.
 
Jan 6, 2010 at 8:48 PM Post #26 of 33
Quote:

Originally Posted by SirDrexl
No, you are playing a 2-channel PCM signal. You can't send more than 2 channels of PCM audio over S/PDIF anyway.


You can send more than 2 channels of audio over S/PDIF if you use the Dolby Digital Live or DTS-C encoder.
 
Jan 7, 2010 at 5:09 AM Post #27 of 33
Quote:

Originally Posted by ROBSCIX
You can send more than 2 channels of audio over S/PDIF if you use the Dolby Digital Live or DTS-C encoder.


Yes, but those would not be PCM. You can't send more than 2 channels of PCM under the IEC 60958 spec. IEC 61937 describes how you can send compressed audio over IEC 60958 (aka S/PDIF or TOSLINK) connections.

(IEC 60958 superseded IEC 958.)

HDMI lets you send 8 channels of PCM, but it has far higher bandwidth.

FWIW, don't believe the nonsense about glass or quartz S/PDIF links. Plastic fiber is good for 5 MHz of bandwidth over a kilometer; it can handle your 1.5 Mbit/s, 1-meter streams just fine.
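To put those rates side by side: an S/PDIF frame is 64 bits (two 32-slot subframes) sent once per sample period, and biphase-mark coding uses two line cells per bit. A quick sketch of the arithmetic from the IEC 60958 framing (no vendor specifics assumed):

```python
# S/PDIF framing: one frame = 2 subframes x 32 time slots = 64 bits,
# sent once per sample period.  Biphase-mark coding uses two line
# cells per bit, so the cell rate on the wire is 128 x the sample rate.
def spdif_rates(sample_rate_hz: int):
    bit_rate = sample_rate_hz * 64   # framing + payload, bits/s
    cell_rate = bit_rate * 2         # biphase-mark cells/s
    return bit_rate, cell_rate

for fs in (44100, 48000, 96000):
    bits, cells = spdif_rates(fs)
    print(fs, bits, cells)
```

Even the 96 kHz case is a few megabits per second over a meter of fiber, which is why exotic cable materials buy you nothing at these lengths.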
 
Jan 7, 2010 at 6:05 AM Post #28 of 33
I tested bit-perfect output and lossless compression at the same time by encoding a DTS-encoded WAV file into an ALAC file. The receiver still picked up the DTS stream straight out of the optical on my MacBook Pro.
 
Jan 7, 2010 at 7:47 PM Post #30 of 33
Quote:

Originally Posted by ericj
Yes, but those would not be PCM. You can't send more than 2 channels of PCM under the IEC 60958 spec. IEC 61937 describes how you can send compressed audio over IEC 60958 (aka S/PDIF or TOSLINK) connections.

(IEC 60958 superseded IEC 958.)

HDMI lets you send 8 channels of PCM, but it has far higher bandwidth.

FWIW, don't believe the nonsense about glass or quartz S/PDIF links. Plastic fiber is good for 5 MHz of bandwidth over a kilometer; it can handle your 1.5 Mbit/s, 1-meter streams just fine.



Yes, I realize that, as they would be encoded; it's just that some don't know about the digital encoders available. Although they use lossy compression, they still allow multiple channels over S/PDIF.
Actually, I think HDMI allows more than 8 channels, as there is a very wide bandwidth there. I don't think I mentioned anything about glass or quartz cables, but good to know!
Do you have any opinion on transformer-coupled S/PDIF?
 
