CHORD ELECTRONICS DAVE
Jan 23, 2017 at 7:01 AM Post #7,187 of 25,868
From what I can hear, the solution to the digital problem is what Rob talked about: his WTA filters and pulse array DAC - 16 bits is a good place to stop.


I think you could be right, for those of us who didn't jump to high-definition recordings, but it won't stop me checking out further work done by Rob just in case. I guess I am just a hi-fi junkie. :stuck_out_tongue:
 
Jan 23, 2017 at 7:33 AM Post #7,188 of 25,868
I think we are in danger of misunderstanding bit depth here. When I talk about 16 bits on the coefficients of the ideal sinc function interpolation filter, this has no bearing on the actual bit depth of the recording.
 
Let me explain. A 16-bit 44.1 kHz recording has no timing errors innately (if it's been properly decimated, which none are, but that's another story). Likewise, a 24-bit 44.1 kHz recording has no timing errors innately. It's only when you convert from a sampled waveform (the original data) to a continuous signal via the interpolator (the filter, or in my case the WTA filter) that you get the timing errors - and if you use an ideal sinc function filter with coefficients accurate to 16 bits (this is only possible with > 1M taps), then you know for sure that all errors against the ideal are less than 16 bits. It makes no difference whether the original data is 16 bit or 24 bit - you will get the same timing errors, because that is down to the coefficient accuracy against the ideal, not the bit depth you start with.
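To illustrate the coefficient-accuracy point with a toy sketch (my own NumPy example, nothing to do with the actual WTA code; kernel length, window and test tone are arbitrary): quantise a windowed-sinc interpolator's coefficients to 16 bits and the reconstruction error this introduces comes out the same whether the source samples are 16 bit or 24 bit.

```python
import numpy as np

M = 512                                      # taps per side (illustrative)
k = np.arange(-M, M) + 0.5                   # interpolate half-way between samples
h_ideal = np.sinc(k) * np.hanning(2 * M)     # double-precision windowed-sinc kernel
h_16bit = np.round(h_ideal * 2**15) / 2**15  # same kernel with 16-bit coefficients

n = np.arange(16384)
x = np.sin(2 * np.pi * 997 * n / 44100)      # band-limited test tone
for name, bits in (("16-bit source", 15), ("24-bit source", 23)):
    sig = np.round(x * 2**bits) / 2**bits    # quantise the samples themselves
    err = np.convolve(sig, h_ideal - h_16bit, mode="same")
    print(f"{name}: peak error from coefficient accuracy = "
          f"{20 * np.log10(np.max(np.abs(err))):.1f} dBFS")
```

Both lines print essentially the same figure, because the residual is set by how far the coefficients sit from the ideal sinc, not by the bit depth of the data.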
 
In practice the classical recordings that give the most improvements to me seem to be 16-bit 44.1 kHz recordings from the 1960s - and I am sure this is down to the recording techniques and optimised equipment that Decca and Mercury used at the time, rather than the file format. Modern recordings don't have the timbre variations or the ability to portray the speed, power and sheer impact of the best of the 1960s recordings.
 
Rob
 
Jan 23, 2017 at 7:34 AM Post #7,189 of 25,868
And don't forget the trade-offs: more taps mean a longer delay, and more processing also means more power.
 
The ~4 second delay with the max buffer setting on the DAC64 and QBD was just about bearable.
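A rough back-of-envelope sketch of the delay side of that trade-off (my own numbers and assumptions, not Chord figures - in particular the assumption that the quoted tap counts apply at a 16FS filter rate): the group delay of a linear-phase FIR filter is about half its tap count at whatever rate it runs.

```python
# Group delay of a linear-phase FIR filter: (taps - 1) / 2 samples at the
# rate the filter runs at. Tap counts and the 16FS rate are illustrative.
def fir_delay_ms(taps: int, filter_rate_hz: float) -> float:
    return 1000 * (taps - 1) / 2 / filter_rate_hz

rate_16fs = 16 * 44100   # 705.6 kHz, i.e. 16FS on 44.1 kHz material
for taps in (164_000, 500_000, 1_000_000):
    print(f"{taps:>9,} taps -> ~{fir_delay_ms(taps, rate_16fs):.0f} ms of latency")
```

Every doubling of taps roughly doubles both the latency and the multiply-accumulate work per output sample, which is exactly the trade-off being described.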
 
Maybe Rob's later iterations will have a user-selectable tap length...
 
Jan 23, 2017 at 12:51 PM Post #7,191 of 25,868
Looks like no M scaler with a USB connection for a couple of years; that should help people decide. I'm still on the fence - I want to be able to stream with the Blu MK II, which would add a lot to this purchase.

M-Scaler with USB input = DAVINA (and about the price: the Chord Choral / Reference series will not be for everyone).
You just do not need to use the ADC function =)
 
Jan 23, 2017 at 1:08 PM Post #7,193 of 25,868
I must admit that if I wasn't so keen to get my hands on the Blu 2 M scaler, I would be quite happy with Davina and would enjoy recording music again.
 
Jan 23, 2017 at 1:57 PM Post #7,194 of 25,868
256 million taps would take me years to code! And I am sure it won't be necessary, but I have been wrong before about expectations.

Now with all this fuss about the M scaler being (at the moment) limited to BNC and CD on the Blu 2, we need to take a step backwards in time. When half a million taps became possible in the spring of 2016, my expectation was that it would NOT make a great deal of difference - and this was based on previous experience. When I went from 26,000 to 164,000 taps in Dave the improvement was there, but I was disappointed; it had taken me 9 months to design, with a lot of problems on the way, and it merely sounded better. So in my mind I had expectations that more taps would give better sound, but it would be a small if still worthwhile improvement. Now this was before I re-worked the WTA algorithm and moved from 8FS to 16FS - both of which were much more important in SQ terms than the tap length. So my expectation was that going to more taps (0.5M initially) would give a worthwhile improvement for sure - but only a worthwhile one.

Now when I first heard the 0.5M taps I questioned my own hearing, as it was transformational in SQ - certainly not a mere improvement. It was because it was so good that I pushed the boat out and squeezed 1M taps out of a reluctant FPGA. Had I known in advance that 1M taps was hugely important, there is no way we would have launched M scaler technology as a CD player. But what people fail to realise is that developing a product for manufacture is not a simple process; things take considerable time, and a lot of boring background work goes on, together with detailed planning and scheduling. Parts need to be ordered, and sometimes delivery schedules run a year in advance - Mojo production was planned a whole year in advance, and that was just from the parts-procurement point of view.

Getting back to expectations - the way I work is to identify an error, then work on reducing that error until you can no longer hear a change. Imagine a biscuit (cookie) barrel, and you just keep picking out biscuits, getting better and better SQ, until no more biscuits come out. The problem with audio is that you can't see the barrel, because it is buried in the earth, and you can't see inside it. The only way of finding out how deep the barrel is is to keep taking biscuits out. That's fine if the biscuits are a foot deep - but how do you pull out biscuits that are a mile deep? It gets harder to do, and the biscuits may go just 13 inches deep or they may go over a mile deep - you don't know for sure. So you often don't know how small the error needs to be before it becomes inaudible - and experience has taught me not to make assumptions as to whether something is audible or not. You can only find out after doing rigorous and careful listening tests - and it may have taken many months to design the new module.

Now we know that 1M taps is much better than 0.5M taps - so how much further can we go; how deep is the barrel? What excites me about Davina is that I shall know for sure how many biscuits are left, as I will be able to listen to 768k, then hear what the decimation does, then hear how good the 1M tap interpolation actually is. So I will be able to measure how deep the barrel actually is; hopefully 1M taps is very close to the original, as I really, really do not want to spend years coding for ultra-long tap lengths.

Rob
Crumbs! He's not crackers honestly :wink:
 
Jan 23, 2017 at 3:18 PM Post #7,196 of 25,868
Crumbs! He's not crackers honestly :wink:

Far from it. Rob follows the scientific method, by proposing hypotheses (1 million taps will be enough (or not enough), 16 bits is sufficient (or not sufficient)), and then developing the equipment to test those hypotheses.
 
I presume part of his mind is already thinking of the next incremental steps, 2 million taps/17 bits etc, just in order to explore if he has reached the 'good enough' stage.
 
Somehow I think that Rob will never reach the ultimate 'good enough' - maybe 'good enough' for a commercial product, but creative minds will always identify interesting avenues to explore, in the search for scientifically 'good enough and understood'.
 
 
Jan 23, 2017 at 3:28 PM Post #7,197 of 25,868
So, after Blu 2 has been released, what's next in the pipeline? The Davina or the digital amp? (and does it have a name?)

The Davina receives the most mentions, so I presume that it is the next in the pipeline.
 
The digital amp receives occasional mentions, but maybe that is because Rob has less involvement in the development (I think that @Mojo ideas was the key actor in previous amps, so maybe he has the greater input into the digital amp - sorry if my memory has been fallible on this topic). 
 
Jan 23, 2017 at 4:01 PM Post #7,198 of 25,868
M-Scaler with USB input = DAVINA (and about the price: the Chord Choral / Reference series will not be for everyone).
You just do not need to use the ADC function =)


Sure "You do not need to use the ADC function" of the proposed Davina. But you do need to pay for it. :wink:
 
Unless most of your customers are professional users, nearly every Davina you might sell will have a completely wasted ADC section. Do you think that is sensible? Elegant? Respectful of resources?  
 
Jan 23, 2017 at 4:19 PM Post #7,199 of 25,868
Regarding the premature proclamation of DAVE's obsolescence, there is one thing I am hoping will happen with DAVE in the not-so-distant future that will further enhance its value and appeal, both for those of us who own a DAVE and for those who are looking for a DAC and are wondering if Hugo2 is good enough or whether they should spend more for DAVE. Rob has stated in the past that DAVE is capable of being upgraded via a code update, but that he would never do it unless it resulted in a significant improvement to DAVE. Perhaps that time is now, for the following reasons:
 
1)  When paired with the M-scaler, many of DAVE's DSP cores will sit idle, and I have wondered whether these cores could be re-purposed for greater things. Certainly, it would be ideal for DAVE users who don't plan to upgrade to the M-scaler to have Hugo2's improved filters. Some (Beolab, BMichels, and Jelt2359) have proposed filter options to tailor DAVE's sound signature (warmer, cooler, neutral, etc.) to help balance the tonal deficiencies in one's system. dCS has such options.
 
2)  Improving the SPDIF input.  As Rob has stated, SPDIF is not synchronous to DAVE's clock (only USB is) and must go through DPLL first.  Since M-scaler must use the SPDIF inputs, it would probably make some difference in SQ to make the SPDIF inputs as good as USB.
 
3)  Code for better remote functionality.  Rob stated that he ran out of time to properly code for DAVE's remote and so only a few of the remote's buttons actually function.  It would be great if DAVE had a fully functioning remote.
 
I'm sure there would be some nominal cost for such an upgrade, which most of us would probably be happy to pay.


These are good points. I also notice that Rob has said
 
"One of the curious things was switching on the HF filter with Dave - with 44.1 it should not sound better - and this immediately told me that I needed to improve the WTA filter stop-band performance, and this was done by increasing the bit depth on the quantised coefficients. This worked; now M scaler sounds better with the HF filter off (exactly as it should do)."
 
which makes me wonder whether this improvement in WTA stop-band performance could also be incorporated into the DAVE code.
 
(Not sure that I want different filters, though. There should only be one setting, right?)
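As a toy illustration of the quoted point about increasing the bit depth of the quantised coefficients (my own NumPy sketch under assumed parameters, not the WTA filter itself): quantising a windowed-sinc filter's coefficients raises its stop-band floor, and adding coefficient bits brings it back down.

```python
import numpy as np

taps = 2001                                       # illustrative, far short of 1M
n = np.arange(taps) - (taps - 1) / 2
h = np.sinc(n / 16) / 16 * np.kaiser(taps, 16)    # 16x oversampling low-pass prototype

def quantise(h, bits):
    step = np.max(np.abs(h)) / 2**(bits - 1)      # grid scaled to the largest tap
    return np.round(h / step) * step

def stopband_peak_db(h):
    H = np.abs(np.fft.rfft(h, 1 << 18))
    H /= H[0]                                     # normalise to DC (passband) gain
    return 20 * np.log10(H[len(H) // 8:].max())   # look well beyond the cutoff

print(f"unquantised coefficients: {stopband_peak_db(h):.0f} dB stop-band peak")
for bits in (16, 20, 24):
    print(f"{bits}-bit coefficients: {stopband_peak_db(quantise(h, bits)):.0f} dB stop-band peak")
```

With this toy filter the 16-bit coefficients set a clearly higher stop-band floor than the 24-bit ones, which is consistent with the quoted change.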
 
Jan 23, 2017 at 6:25 PM Post #7,200 of 25,868
Now this was before I re-worked the WTA algorithm and moved from 8FS to 16FS - both of which were much more important in SQ terms than the tap length.

This seems to be a strong indication that the first-stage WTA filter would benefit from being at a higher rate than 16FS, e.g. 64FS.

Which then leads to the question of whether the compute capability of the M scaler FPGA would be better used at, say, 64FS? Or even 256FS? Where's the knee of this curve?

At some point WTA and sinc() should produce the same coefficients within the constraints of your processing precision, shouldn't they?

When talking about the accuracy of coefficients, there's a severe problem with the sheer count of coefficients involved, since finite-precision addition is not generally associative without carrying a huge number of sacrificial bits. An irony here is that the higher the upsampling factor used, the more accurately each sample is computed from the same count of coefficients - since each sample is the result of fewer multiply-adds, the loss of accuracy due to the lack of associativity is lower :D

Which makes me wonder whether this accuracy improvement is a factor in the better sound of 16FS versus 8FS that you discovered in DAVE development.

Performing the multiply-adds in ascending order of coefficient magnitude should improve accuracy. Or there is this slower algorithm, with built-in correction tracking at each addition:

http://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html#1262
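For reference, a minimal sketch of the compensated (Kahan) summation scheme that the linked Goldberg article describes - the data here is just an illustrative case of many small terms accumulating into a large total:

```python
import math

def kahan_sum(values):
    total = 0.0
    c = 0.0                    # compensation: low-order bits lost so far
    for v in values:
        y = v - c              # add back what was lost on the previous step
        t = total + y          # big + small: low-order bits of y can vanish here
        c = (t - total) - y    # recover exactly what vanished
        total = t
    return total

def naive_sum(values):
    total = 0.0
    for v in values:
        total += v             # plain running sum: rounding error accumulates
    return total

data = [0.1] * 10_000_000      # ten million small terms (illustrative)
print("naive:", naive_sum(data))
print("kahan:", kahan_sum(data))
print("fsum :", math.fsum(data))   # correctly rounded stdlib sum, for comparison
```

Sorting by ascending magnitude helps for the same reason: the running total never dwarfs the term being added, so fewer low-order bits are lost at each step.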

Now playing: Leslie Winer - 5
 
