Benchmark DAC1 now available with USB
Apr 7, 2008 at 8:55 PM Post #1,471 of 3,058
While I have seen just about every aspect of the DAC1 discussed here, I haven't seen any discussion of the USB error rate. The DAC1 and most other USB DACs do not perform any error correction (please correct me if I'm wrong, Elias), so USB errors could be detrimental to your DAC's performance.

I've found claims that the USB error rate is very low - meaning you can expect to see one error over several days of continuous use.

I am wondering if Elias and Wavelength can share with us what error rates they have been able to measure with their devices in the lab, and what factors can affect the error rate (cable quality, cable length, an overloaded CPU, USB drivers, etc.). I noted that Wavelength has measured significant errors when using some USB extension cables.

Thanks,
Ross

Quote:

Originally Posted by Wavelength
I did have a USB analyzer there and we did verify that no USB errors occurred. While this is true at the line level, it is possible that an error occurred at the controller (TAS1020).


 
Apr 8, 2008 at 12:29 AM Post #1,472 of 3,058
"kmixer truth" if one notice that it is using floating-point operation might assume that kmixer can never be bit trasparent, Incorrect kmixer is bit-trasparent when used properly Elias Gwinn tests are correct, setting Volume control on kmixer Fully open will mathematically not change the sample values you just need to disable Microsoft software synth not to affect the results.

I know, I know - then why does direct kernel streaming sound better? Playing files in WMP, iTunes, or any other software that uses the DirectSound interface will sound bad.
DirectX is designed for low-latency situations like video playback, video games and similar needs; it can skip, alter, or predict samples, and that has a coloration signature similar to jitter-induced distortion - even if it doesn't show up on test gear, it is still there.
The cure? Use the Multimedia Extensions interface ("MME") in your player. It is absolutely bit-transparent in the majority of software; it bypasses the player's internal volume control and ties volume to the kmixer volume control.

You cannot evaluate USB-DAC-type gear on a Windows Vista system. Vista is unable to disable its internal resampling, so you get two anti-alias filters in a row - one from the OS, one from the DAC. Of course, the best setting in Vista is the maximum sample rate the DAC can support (for the non-computer-geeks, the setup dialog is in Control Panel > Sounds).

About the duel between computer and disc transport: sorry to disrupt your myth, but most CD transports, even the cheap ones, make no mistakes, and you can be sure that computers don't make mistakes (1+1 will always be 2, never 3). Everybody screams now: but different cables sound different, different transports sound different, even different computers sound different - and that is a FACT.
This has been said a million times, and I'm going to say it again: it is all about transmission timing problems and high-frequency noise shifting the clock frequency. Take a relatively cheap CD transport and an expensive one (it has to be a native CD transport, not a modern DVD/SACD device), record the S/PDIF output of each into a PC with a high-end professional-grade sound card while playing the same track from the same disc, and you will get two bit-for-bit identical tracks. So why is there a difference? Timing errors, of course. Now connect that high-end studio PC to the DAC and it will sound even worse - even though the high-end sound card has a very stable clock and a low-jitter S/PDIF output. Why does it sound worse?
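For anyone who wants to repeat that comparison, here is a minimal sketch of the bit-for-bit check, assuming the two S/PDIF captures have already been trimmed to start on the same sample and saved as plain 16-bit PCM WAV files (the file names are placeholders):

```python
# compare_captures.py -- check whether two PCM WAV captures are bit-identical.
# Minimal sketch; assumes both files are plain 16-bit PCM WAV at the same rate,
# already trimmed so they start on the same sample.
import wave

def read_frames(path):
    """Return (format parameters, raw sample bytes) for a WAV file."""
    with wave.open(path, "rb") as w:
        return w.getparams(), w.readframes(w.getnframes())

# Placeholder file names for the two S/PDIF captures.
params_a, data_a = read_frames("transport_cheap.wav")
params_b, data_b = read_frames("transport_expensive.wav")

if params_a[:3] != params_b[:3]:          # channels, sample width, sample rate
    print("Formats differ - captures are not comparable.")
elif data_a == data_b:
    print("Bit-for-bit identical.")
else:
    diff = sum(a != b for a, b in zip(data_a, data_b))
    print(f"Captures differ in {diff} of {min(len(data_a), len(data_b))} bytes.")
```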
Much of the bad sound quality of a computer is tied to high-frequency noise: millions of switches kicking in and out at a few gigahertz create noise that radiates out of every cable connected to the computer and disturbs the DAC clock. Even if the DAC is completely isolated, it still receives the signal - remember, this is noise in the GHz range.
This brings us to the question of why lossless files sound bad when decoded in real time, but sound the same as the original WAV rip when decoded to a WAV file first and then played - as previously mentioned by J. Gordon Rankin of Wavelength Audio. Of course, J. Gordon is not imagining things; the difference is audible.
It has also been said that a 2 GHz system playing lossless sounds worse than the same system playing the WAV/AIFF file decoded from that lossless file, and that as processing power climbs, the sound gets closer to the original file. I think this phenomenon is caused by the CPU power supply (not the main power supply that you connect to the AC) that sits on the motherboard and stabilizes and delivers the CPU current. It is a very efficient switching power supply, built for economy rather than high-end use, so when the CPU puts pressure on it, it spits out excessive VHF noise - the weaker ones even emit a high-pitched audible whistle that can be heard in a quiet room. When you use a system with a higher-frequency CPU, the overhead on the CPU is lower, the reduced power drain relieves the strain on the CPU power supply, and since faster CPUs are usually paired with better motherboards, the radiation effect is less apparent.

My recommendation to J. Gordon Rankin is to test a machine playing an uncompressed file while the CPU is loaded with mathematical operations, to see whether there is an effect; to try different types of motherboards (2-phase, 3-phase and 8-phase types); and to try shielding the DAC from these emissions. I think the problem comes from the CPU power supply, not from the CPU itself.
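A minimal sketch of the kind of CPU-loading test suggested above (my illustration, not Wavelength's procedure): spin up one busy worker per core doing pointless math while the uncompressed file plays in the usual player, then compare against an unloaded listening pass. The duration is an arbitrary placeholder:

```python
# cpu_load.py -- generate sustained CPU load for an A/B listening test.
# Run this while playing the uncompressed WAV/AIFF file in your normal player.
import math
import multiprocessing as mp
import time

def burn(seconds):
    """Keep one core busy with floating-point math for the given time."""
    end = time.time() + seconds
    x = 0.0001
    while time.time() < end:
        x = math.sin(x) * math.cos(x) + 1.0001  # pointless but steady work

if __name__ == "__main__":
    duration = 120  # seconds of load; arbitrary placeholder
    workers = [mp.Process(target=burn, args=(duration,)) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print("Load finished - compare with the unloaded listening pass.")
```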

Greetings from Europe cheers.



Note: Again, the cpu power supply is the one that lies on the motherboard not the main power supply in the metal case.
 
Apr 8, 2008 at 1:08 AM Post #1,473 of 3,058
Quote:

My recommendation to J. Gordon Rankin is to test a machine playing an uncompressed file while the CPU is loaded with mathematical operations, to see whether there is an effect; to try different types of motherboards (2-phase, 3-phase and 8-phase types); and to try shielding the DAC from these emissions. I think the problem comes from the CPU power supply, not from the CPU itself.



Do you think your theory still holds when using an RJ45 connection > Ethernet cable > RJ45 > DAC (SB3/Transporter)?
Maybe the DAC1 should have an RJ45 connection?
 
Apr 8, 2008 at 1:51 AM Post #1,474 of 3,058
Squeezebox is not a DAC, it's a whole system. Think of it like this:

The Transporter is an iPod on steroids.

The Receiver/Duet is a low-power notebook with a nice sound card.


The problem J. Gordon was talking about when playing lossless files will be constant on Slim Devices gear using the internal DAC, and its existence will go unnoticed because you can't hear the effect it has on the sound - you just accept it as part of the sound.
Using the SB's digital output into a DAC is theoretically supposed to be better than PC-to-DAC, considering that ARM processors draw less power and create less noise in the system than Intel CPUs. I've never really tried it myself; people I have spoken to who tried it with a Benchmark Media USB DAC have not noticed any difference, so I'll borrow one and try it myself. Besides, it doesn't seem logical to power up both a PC and an SB to listen to music when I can just power up the PC.

Cheers .
 
Apr 8, 2008 at 11:53 PM Post #1,475 of 3,058
If this is really the case, then galvanic isolation together with good shielding and proper filtering at tracks crossing the shield should take care of it.
 
Apr 9, 2008 at 3:30 AM Post #1,476 of 3,058
Opinions please:

if I go Envy24HTS (Toslink) > DAC1 > amp, will that sound markedly better than Envy24HTS (line out) > amp, all things considered? The Wolfson is a good DAC, and I don't really see the DAC1 making much of a difference coming from a noisy computer.

music_man
 
Apr 9, 2008 at 1:49 PM Post #1,477 of 3,058
Quote:

Originally Posted by Crowbar
I'm just trying to understand what can cause a difference. An asynch interface is no different conceptually than say transferring the audio file over the Internet. Yet I'm sure no one will claim if it goes over servers ABC it will sound different than if it goes over servers XYZ. If it's not the bits, then what? Perhaps somehow the input timing jitter is cross-coupled in a non-obvious way to other circuits. Or just RF crap coming in over the USB cable is not perfectly filtered.

What's the point of asynch USB to S/PDIF? The whole benefit of USB is that it avoids S/PDIF's problem. This only makes sense if there's a clock line from the DAC to the converter driving its timing.



Crowbar,

Really, at this point I am not sure where the problem lies. The good news is that hard drives are cheap, so save or convert your lossless to WAV/AIFF and avoid the problem.
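For anyone who wants to follow that advice in bulk, here is a minimal sketch (an illustration, not a Wavelength tool) that converts every FLAC file under a folder to WAV by calling the ffmpeg command-line tool. The folder names are placeholders, and ffmpeg is assumed to be installed and on your PATH; any similar converter would do:

```python
# flac_to_wav.py -- batch-convert lossless FLAC files to WAV ahead of playback.
# Assumes the ffmpeg command-line tool is installed; folder names are placeholders.
import pathlib
import subprocess

SRC = pathlib.Path("Music/flac")    # placeholder source folder
DST = pathlib.Path("Music/wav")     # placeholder destination folder

for flac in SRC.rglob("*.flac"):
    out = DST / flac.relative_to(SRC).with_suffix(".wav")
    if out.exists():
        continue  # already converted
    out.parent.mkdir(parents=True, exist_ok=True)
    # ffmpeg decodes the FLAC and writes a plain PCM WAV file.
    subprocess.run(["ffmpeg", "-i", str(flac), str(out)], check=True)
```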

But remember, with USB there is no jitter because there is no clock. Well, at least there is no jitter in the interface itself like there is with SPDIF. Jitter in SPDIF happens because the data is wrapped inside the clock. In USB, and especially in ASYNC mode, the jitter becomes only that which is "intrinsic" to the controller, clock and DAC.

Actually, running a clock from the DAC to the transport does not solve the jitter problems. The effects of slew rate and cable distance on clocks exceeding 10 MHz mean this will not resolve it. Anyway, the SPDIF still wraps the data around a clock. Guido Tent of TentLabs has a novel approach to the problem: since sending the clock is a bad idea, he puts a VCXO (variable clock) inside the transport, puts the PLL in the DAC, and only sends a control voltage.

But really, if you want to fix SPDIF you have to get rid of the idea altogether and pull the data off the disc with ATAPI like a computer does, then work out something like USB, Ethernet or FireWire to the DAC - which is basically back to the computer in the first place.

Thanks
Gordon
 
Apr 9, 2008 at 2:07 PM Post #1,478 of 3,058
Quote:

Originally Posted by Ross MacGregor
While I have seen just about every aspect of the DAC1 discussed here, I haven't seen any discussion of the USB error rate. The DAC1 and most other USB DACs do not perform any error correction (please correct me if I'm wrong, Elias), so USB errors could be detrimental to your DAC's performance.

I've found claims that the USB error rate is very low - meaning you can expect to see one error over several days of continuous use.

I am wondering if Elias and Wavelength can share with us what error rates they have been able to measure with their devices in the lab, and what factors can affect the error rate (cable quality, cable length, an overloaded CPU, USB drivers, etc.). I noted that Wavelength has measured significant errors when using some USB extension cables.

Thanks,
Ross



Ross,

Here are some of the things that I would consider variables for errors on the USB link:

1) Cable length: best to use a good USB 2.0 cable less than 3 m long. The Belkin Gold 2 m is what I ship with most of my DACs. It is a very good cable for the money.

2) USB port: On PCs, run USBView and see where the port is connected. On the Mac, run System Profiler. Ports connected directly to the USB host controller are your best bets. NEVER use a USB port on the front of your computer; those are wired through internal extension cabling and give the worst results. Most host controllers have 2, 4 or 7 ports. Remember, your keyboard, mouse and Bluetooth are running off the USB host controller. We found on the new iMac that one of the ports is shared by all the internal devices and does not sound as good as the other three. (A small script for checking the port topology is sketched after this list.)

3) Computer accounting: I have designed seven motherboards for PCs over the years. Believe me... there is an accountant sitting next to every engineer who designs and lays out (PCB layout) a motherboard. The use of cheap parts and poor execution can have a drastic effect on how things work.

Take, for instance, the HUSH multimedia PC that I use for testing and some development. Understand that this was developed for audio/video use. There are no fans, which makes it quiet. BUT it uses a single 64 V switching power supply feeding a board with eight DC-DC converters for the +/-12 V drives, +5 V drives, +5 V logic, +3.3 V, +1.8 V core and +/-12 V PCI rails. All of this makes it not the greatest unit for audio/video use; only the lack of fans and its cool shell make it worthy of that.

4) Data rate: the higher you run the interface, in bytes or bits per second, the higher the error rate. I disagree with the idea of upsampling on the computer side to the highest rate; it causes more errors than just leaving it at Red Book.
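As a supplement to point 2 above, here is a minimal sketch (my illustration, not a Wavelength utility) that shells out to the stock topology tools so the USB tree can be saved and compared across ports. It assumes `system_profiler` is available on Mac OS X or `lsusb` on Linux; on Windows, you would simply run USBView itself:

```python
# usb_topology.py -- dump the USB port topology so you can see which ports hang
# directly off the host controller. Assumes the stock OS tools are present.
import platform
import subprocess

system = platform.system()
if system == "Darwin":
    cmd = ["system_profiler", "SPUSBDataType"]   # Mac USB tree
elif system == "Linux":
    cmd = ["lsusb", "-t"]                        # Linux USB tree view
else:
    raise SystemExit("On Windows, run USBView instead.")

print(subprocess.run(cmd, capture_output=True, text=True).stdout)
```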

~~~~~~

Anyway... I ran some tests using an error-flagging counter and my USB analyzer, and found that, typically, running Red Book over a 2 m cable into a port directly on the host USB controller, I received no errors over a 24-hour period of sending a 1 kHz sine wave into my TAS1020 emulated system. I used this as the basis because I could also count internal packet errors that the analyzer did not see.

I have run 24/96 on the same interface and setup and have not seen errors either.

But I have on occasion plugged my dac into the front of my G5 quad and seen errors right off the bat at 24/96.

The good thing is this... follow these rules and you may really never have a problem. I did have a customer who actually had a bad motherboard and would never have even known it. The DAC sounded bad; he went to the Apple Store, they looked at it, plugged in an iPhone, and it complained about port speeds.

Later,
Gordon
 
Apr 9, 2008 at 2:15 PM Post #1,479 of 3,058
Quote:

Originally Posted by Wavelength
Actually, running a clock from the DAC to the transport does not solve the jitter problems. The effects of slew rate and cable distance on clocks exceeding 10 MHz mean this will not resolve it.


It does if you synchronously reclock the signal at the DAC--which would be done as a matter of fact in any system that sends the clock to the transport.
 
Apr 9, 2008 at 2:40 PM Post #1,480 of 3,058
Quote:

Originally Posted by KarateKid
... you did not recommend using balanced headphones straight out of the DAC1, is it better to use a balanced headphone amp, even if it's still not ideal as you've explained?


A properly designed balanced headphone amp will drive balanced headphones better than driving them directly from the XLR outputs of the DAC1. This is because the output impedance of a headphone amp should be as close to 0 (zero) ohms as possible. The output impedance of the XLR outputs on the DAC1 is at least 60 ohms. The output impedance of the headphone output of the HPA2 headphone amp, which is built into each DAC1, is less than 0.1 ohms. Since balanced headphones have two outputs per channel (+ and -), the output impedance will NECESSARILY be twice (2x) as much as it would be if it were unbalanced. This is one of several problems with balanced headphone topologies.

As far as eliminating crosstalk, most high-quality headphones will not have common return conductors and, consequently, will not have problems with crosstalk. The HD650, for example, has separate return conductors for each driver. Balanced headphone lines will not provide any additional separation.

Thanks,
Elias
 
Apr 9, 2008 at 3:08 PM Post #1,481 of 3,058
Quote:

Originally Posted by joijwall
3) If everything is indeed bit-perfect from the source all the way to the DAC1, the DAC1's analog output quality is what could affect the sound. As I understand it, Benchmark suggests removing the headphone-mute function for best quality?


First of all, we do not suggest removing the headphone-mute function. Where did you hear that?

The quality of the sound will be a result of many things:
- The information encoded within the data
- The quality of the time information accompanying the data
- The competency of the DAC in handling any timing errors
- The quality of the circuit-board layout in minimizing EMI that may interfere with the converter clock (i.e., optimized shielding, traces, and filtering)
- The ability of the power supply to filter noise and regulate consistent power delivery
- The quality of the D/A chip, and the topology in which it is implemented within the DAC
- The quality of the analog circuit design, especially employing a thorough understanding of the strengths and weaknesses of the various components and designing sympathetically according to those strengths and weaknesses
- Spending time and resources on performance rather than aesthetics and marketing

Quote:

Originally Posted by joijwall
4) I don't know much about ground loops, but I could try optical or run on batteries and listen. I've also moved my music files to a separate HDD not long ago, maybe that has some influence also.


I don't recommend running audio equipment on batteries. Don't worry about ground loops unless you hear a 'hum' or 'buzz'.

The external HDD should not be a problem, unless it is connected via USB simultaneously with a USB audio device. This scenario may lead to drop-outs, etc.

Quote:

Originally Posted by joijwall
5) Regarding CD-players, are there other important things involved than reading from the disc, D/A-converting and the electronic components (including analog outputs)? I thought perfecting these three gave better sound (and higher price). How does DAC1 perform in these areas? I listened to Linn Klimax DS ($30000 HDD/Streamer/DAC-combo) and it sounded fantastic. Which led me in on the DAC1 path by the way.


All of the things mentioned above will influence a transport's analog outputs as much as a DAC's.

Thanks,
Elias
 
Apr 9, 2008 at 3:15 PM Post #1,482 of 3,058
I'd like to know what happens when a USB device, such as a DAC1 USB, encounters an error from the USB. Is there any error correction in the data stream that allows the device to recover the word? Does the device interpolate a new word?
 
Apr 9, 2008 at 3:16 PM Post #1,483 of 3,058
Quote:

Originally Posted by joijwall
By the way, how is it possible to test bit-transparency at all on the Mac? When directing the output to the DAC1 in Audio/MIDI Setup, I can choose the sample rate, but I cannot set the word length; it is always 2ch-24bit.
Doesn't this mean the bits sent over USB are changed? I mean, the original CD is 44.1/16, but the USB output is 44.1/24, so we cannot compare the bits unless we convert them back after the USB?
If I understand Benchmark's wiki correctly, the 16->24 conversion is done by iTunes. Will a 16->24 conversion change the "music" information in any way, resulting in the DAC1 presenting a different analog output than if the original 44.1/16 were feeding it?
In my case, I guess the DAC1 gets 44.1/24 from my MacBook, but 44.1/16 from the optical output on my Linn Majik CD.
In short: since the DAC1 wants the 24 bits, and the setup forces iTunes to convert 16->24, don't we need to check for bit-transparency somewhere inside the DAC1?
/joijwall
/joijwall



Establishing a 24-bit connection will not affect 16-bit audio. Think of it like lanes on a highway: if we want 24 cars to be able to travel side-by-side, we need 24 lanes. 16 cars will travel down that 24-lane highway just as they would if it had only 16 lanes. However, a 16-lane highway will not allow 24 cars to travel side-by-side. This is why a 24-bit connection is always encouraged, even in 16-bit applications.

This is different from over-sampling the audio. Over-sampling, or any re-sampling for that matter, involves major DSP (Digital Signal Processing). This DSP will change the audio information. Sometimes it will result in major sonic degradation, sometimes it will be unnoticeable.

However, a 24-bit path will never change a 16-bit signal.
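To make the highway analogy concrete, here is a minimal sketch (an illustration, not anything from Benchmark) showing that carrying 16-bit words in a 24-bit container is lossless: the original samples come back exactly when the padding is stripped, whereas resampling would actually change the values:

```python
# pad16to24.py -- show that carrying 16-bit samples in a 24-bit container is lossless.
samples_16 = [0, 1, -1, 12345, -32768, 32767]   # example 16-bit sample values

# Pad: shift each word up by 8 bits, leaving the 8 new LSBs at zero.
samples_24 = [s << 8 for s in samples_16]

# Strip the padding again: an arithmetic shift right restores the original words.
restored = [s >> 8 for s in samples_24]

assert restored == samples_16
print("16-bit words survive the 24-bit container unchanged.")
```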

Thanks,
Elias
 
Apr 9, 2008 at 3:19 PM Post #1,484 of 3,058
Quote:

Originally Posted by Lord Chaos
I'd like to know what happens when a USB device, such as a DAC1 USB, encounters an error from the USB. Is there any error correction in the data stream that allows the device to recover the word? Does the device interpolate a new word?


No, there is no error-correction in the DAC1 USB. The DAC1 USB will convert all audio data as it sees it. However, we have seen very few errors (usually none) in our testing.

Thanks,
Elias
 
Apr 9, 2008 at 4:11 PM Post #1,485 of 3,058
Quote:

Originally Posted by EliasGwinn
A properly designed balanced headphone amp will drive balanced headphones better than driving them directly from the XLR outputs of the DAC1. This is because the output impedance of a headphone amp should be as close to 0 (zero) ohms as possible. The output impedance of the XLR outputs on the DAC1 is at least 60 ohms. The output impedance of the headphone output of the HPA2 headphone amp, which is built into each DAC1, is less than 0.1 ohms. Since balanced headphones have two outputs per channel (+ and -), the output impedance will NECESSARILY be twice (2x) as much as it would be if it were unbalanced. This is one of several problems with balanced headphone topologies.


To elaborate (and not to be argumentative)... Some people in this thread were talking about driving balanced phones from the DAC1. I explained why the frequency response would be boosted by about 3 dB in the bass and 1 dB in the treble. This post explains why, and this post gives some numbers.

In my opinion, the frequency-response distortion from the 60-ohm output impedance will swamp any extra noise, distortion or other problems due to two op-amps per channel. The bass boost at around 80 Hz will make the phones sound more rhythmic, while the treble boost will make them sound more detailed.
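For anyone curious where figures like that come from, here is a minimal sketch of the underlying voltage-divider arithmetic. The impedance values are illustrative placeholders (a 300-ohm nominal headphone whose impedance rises at its bass resonance), not measured HD650 data, so the printed number will not match Eric's figures exactly:

```python
# zout_response.py -- level shift caused by a non-zero source impedance driving a
# headphone whose impedance varies with frequency. Impedance values are placeholders.
import math

def level_db(z_source, z_load):
    """Level at the headphone relative to an ideal 0-ohm source, in dB."""
    return 20 * math.log10(z_load / (z_load + z_source))

z_source = 60.0        # ohms; single-ended XLR output (roughly doubled if balanced)
z_nominal = 300.0      # ohms; assumed headphone impedance in the midband
z_bass_peak = 500.0    # ohms; assumed impedance at the bass resonance

midband = level_db(z_source, z_nominal)
bass = level_db(z_source, z_bass_peak)
print(f"Bass region sits {bass - midband:.1f} dB above the midband.")
```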

- Eric
 
