Dec 22, 2021 at 6:38 AM Post #5,896 of 7,175
Have you tested/measured HQPlayer's filters and concluded that they are all based on wrong algorithms?
I think he probably has far better things to do with his time... like having a massive input into quality HiFi products :relaxed::relaxed:
 
Dec 22, 2021 at 8:46 AM Post #5,898 of 7,175
Fair point, but in that case he should not talk about things he hasn't tested, let alone call them "wrong".
Read the many posts he has made explaining elements of the mathematics of the algorithms, and also his explanations of the disadvantages of using PCs for upscaling: you are at the mercy of the operating system and CPU settings as to whether all the streams of digital data are kept in sync, or whether some are buffered for a while and end up out of sync. Once that happens, you cannot accurately reconstruct the start of music transients. He has 30 years' experience of studying and analysing the various types of algorithms that DAC designers use, so when other developers of DAC software describe their work, he has a good idea of how close they are getting to the Shannon mathematical theory of DACs. If the maths used is wrong, then the audio output will have errors - you don't need to physically test the DAC to know/prove that.

https://www.head-fi.org/threads/chord-electronics-qutest-dac-official-thread.869417/post-16696978
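
To make that Shannon theory concrete, here is a minimal sketch of ideal Whittaker-Shannon reconstruction (textbook maths, not Chord's or HQPlayer's actual code; numpy is assumed). Every practical interpolation filter is a finite-length approximation of this infinite sum:

```python
# Ideal (infinite-support) Shannon reconstruction - illustrative only.
import numpy as np

def shannon_reconstruct(samples, fs, t):
    """Evaluate x(t) = sum_n x[n] * sinc(fs*t - n) at arbitrary times t."""
    n = np.arange(len(samples))
    # np.sinc is the normalised sinc, sin(pi*x)/(pi*x)
    return np.array([np.sum(samples * np.sinc(fs * ti - n)) for ti in t])

# Sample a 1 kHz tone at 44.1 kHz, then evaluate it between the samples
fs = 44100
n = np.arange(64)
x = np.sin(2 * np.pi * 1000 * n / fs)
t_fine = np.arange(20, 44, 0.125) / fs   # an 8x denser time grid
y = shannon_reconstruct(x, fs, t_fine)   # close to sin(2*pi*1000*t_fine)
```

A real DAC or upscaler has to truncate and window that sum, which is exactly where the design choices - and the arguments about them - live.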
 
Dec 22, 2021 at 9:09 AM Post #5,899 of 7,175
Read the many posts he has made explaining elements of the mathematics of the algorithms, and also his explanations of the disadvantages of using PCs for upscaling: you are at the mercy of the operating system and CPU settings as to whether all the streams of digital data are kept in sync, or whether some are buffered for a while and end up out of sync. Once that happens, you cannot accurately reconstruct the start of music transients. He has 30 years' experience of studying and analysing the various types of algorithms that DAC designers use, so when other developers of DAC software describe their work, he has a good idea of how close they are getting to the Shannon mathematical theory of DACs. If the maths used is wrong, then the audio output will have errors - you don't need to physically test the DAC to know/prove that.
I'm not questioning Mr. Watts's competence - that's not the topic here. I'm only asking if he's certain that all of HQPlayer's filters/noise shapers are "wrong". That would mean there's proof of that, right? Otherwise, it wouldn't be fair to be so blunt about a competitor's solution. Your other arguments are irrelevant. I'm not discussing which solution is better. I'm only intrigued by him calling these alternatives "wrong".
 
Dec 22, 2021 at 9:24 AM Post #5,900 of 7,175
I'm not questioning Mr. Watts's competence - that's not the topic here. I'm only asking if he's certain that all of HQPlayer's filters/noise shapers are "wrong". That would mean there's proof of that, right? Otherwise, it wouldn't be fair to be so blunt about a competitor's solution. Your other arguments are irrelevant. I'm not discussing which solution is better. I'm only intrigued by him calling these alternatives "wrong".
So you regard how the algorithms are implemented in the flow of data inside a PC as irrelevant.
You clearly don't understand that there are two key contributors involved:
  • Using the correct algorithms
  • How those algorithms are implemented in the PC - a poor implementation inevitably leads to mathematical errors.
You can't ignore either of them.
 
Dec 22, 2021 at 9:25 AM Post #5,901 of 7,175
Have you tested/measured HQPlayer's filters and concluded that they are all based on wrong algorithms?
I concur with this sentiment. I'm no math major, but if the ultimate goal of any of these upsampling approaches (hardware- or software-based) is to reconstitute transient information, HQPlayer seems to be doing fairly well using similar (but not identical) algorithmic implementations (e.g. million-tap sinc functions). Don't get me wrong - if Chord gets access to newer silicon and ups the taps on a Hugo TT2-sized M Scaler implementation AND shrinks the existing 1-million-tap implementation into a Qutest-sized (and priced) chassis, I'm all in for the latter. Until then, the value-to-cost ratio of a $5K hardware implementation versus a similar, but admittedly not identical, $300 software implementation is not persuasive.
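
For anyone wondering what a "tap" is in this context, here is a toy windowed-sinc upsampler (a textbook Kaiser-windowed design, assuming numpy/scipy - emphatically not the WTA or HQPlayer's filters). The `taps` argument is the number the marketing counts:

```python
# Toy 16x upsampler built from a Kaiser-windowed sinc FIR - illustrative only.
import numpy as np
from scipy.signal import upfirdn

def upsample_sinc(x, ratio, taps, beta=14.0):
    """Upsample x by an integer ratio using a windowed-sinc lowpass."""
    n = np.arange(taps) - (taps - 1) / 2
    h = np.sinc(n / ratio) * np.kaiser(taps, beta)  # cutoff at the old Nyquist
    h *= ratio / np.sum(h)                          # compensate zero-stuffing loss
    return upfirdn(h, x, up=ratio)                  # zero-stuff, then filter

fs = 44100
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 1000 * t)
y = upsample_sinc(x, ratio=16, taps=8191)           # 44.1 kHz -> 705.6 kHz
```

Going from 8,191 taps here to a million is "just" a much longer h; the debate is about how audible the difference in the discarded tail is.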
 
Dec 22, 2021 at 10:06 AM Post #5,902 of 7,175
So you regard how the algorithms are implemented in the flow of data inside a PC as irrelevant.
No, you misunderstood me. I didn't say that the implementation is irrelevant. My only concern is Mr. Watts's comment, and that comment concerned only the algorithm itself, not the implementation. That's why I said you're taking the discussion beyond the topic. Mr. Watts said that HQPlayer's algorithm is wrong; I want to know how he came to that conclusion. That's all. I'm not here to discuss which approach is better in terms of implementation.
 
Dec 22, 2021 at 10:59 AM Post #5,903 of 7,175
No, you misunderstood me. I didn't say that the implementation is irrelevant. My only concern is Mr. Watts's comment, and that comment concerned only the algorithm itself, not the implementation. That's why I said you're taking the discussion beyond the topic. Mr. Watts said that HQPlayer's algorithm is wrong; I want to know how he came to that conclusion. That's all. I'm not here to discuss which approach is better in terms of implementation.
Rob has posted many times explaining the pros/cons of various algorithms and implementations. Search for his posts; they will help you understand where he is coming from when he makes these comments.
 
Dec 22, 2021 at 11:24 AM Post #5,904 of 7,175
Rob has posted many times explaining the pros/cons of various algorithms and implementations. Search for his posts; they will help you understand where he is coming from when he makes these comments.
The issue is not the algorithms, but the platform on which they are implemented. You cannot seriously defend the FPGA platform that Rob and Chord use when it needs £1500 of cables to ameliorate its deficiencies.
 
Dec 23, 2021 at 1:50 AM Post #5,905 of 7,175
Rob has posted many times explaining the pros/cons of various algorithms and implementations. Search for his posts; they will help you understand where he is coming from when he makes these comments.
Possibly my best post is this one.

As to whether my WTA is better than other algorithms I leave to others to decide, and of course there is no accounting for taste. But I will say that in my opinion the only subjectively important function of an interpolation filter is to reconstruct the timing of transients accurately; this is absolutely vital and accounts for all of the sound quality and musicality of a properly designed interpolation filter. No other designer recognises the importance of transient reconstruction at all in that nobody talks about it - I have been a sole voice on this issue for the past 23 years.

Transient timing reconstruction is a huge (and in reality extremely complex) problem with interpolation filters - and if all other designers do not even accept that this issue is a problem, what are the chances of an optimum solution? Secondly, thousands of listening tests went into fine tuning the WTA algorithm - I am not aware of any other design process for algorithms being so intense.
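
To see the truncation issue in isolation, here is a small numpy experiment (textbook sinc interpolation, nothing to do with the WTA itself): place a band-limited transient exactly between two sample instants and reconstruct it with truncated sinc filters of increasing length.

```python
# Reconstruction error of truncated-sinc interpolation vs tap count.
import numpy as np

N = 1 << 17
c = N // 2
n = np.arange(N)
x = np.sinc(n - c - 0.5)            # band-limited impulse between two samples

def reconstruct(t, taps):
    """Truncated-sinc interpolation using `taps` samples centred on t."""
    k0 = int(round(t)) - taps // 2
    k = np.arange(max(0, k0), min(N, k0 + taps))
    return np.sum(x[k] * np.sinc(t - k))

t_eval = c + np.linspace(-2.0, 3.0, 81)   # fine time grid around the transient
for taps in (256, 4096, 65536):
    err = max(abs(reconstruct(t, taps) - np.sinc(t - c - 0.5)) for t in t_eval)
    print(f"{taps:6d} taps -> max error {err:.1e}")
```

The error falls only in rough proportion to the tap count - one way of seeing why very long filters can still matter for a transient that lands between samples.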
 
Dec 23, 2021 at 7:17 AM Post #5,906 of 7,175
Possibly my best post is this one.

As to whether my WTA is better than other algorithms I leave to others to decide, and of course there is no accounting for taste. But I will say that in my opinion the only subjectively important function of an interpolation filter is to reconstruct the timing of transients accurately; this is absolutely vital and accounts for all of the sound quality and musicality of a properly designed interpolation filter. No other designer recognises the importance of transient reconstruction at all in that nobody talks about it - I have been a sole voice on this issue for the past 23 years.

Transient timing reconstruction is a huge (and in reality extremely complex) problem with interpolation filters - and if all other designers do not even accept that this issue is a problem, what are the chances of an optimum solution? Secondly, thousands of listening tests went into fine tuning the WTA algorithm - I am not aware of any other design process for algorithms being so intense.
Thanks Rob.
 
Dec 23, 2021 at 11:36 AM Post #5,907 of 7,175
Just starting to lean into the new Qutest...

1. Even with my least resolving headphones (Modhouse Argons) details are impressive...clear/clean.

2. Great fun/synergy pairing with both WooAudio (WA6 not in photo) and Burson 3XP.

3. Output voltage remains at default (2V). Will probably test 3V at some point to see what impact/benefits it brings.

4. While all-around Qutest performance is awesome...drums...drums...drums.... love the way hi-hats sound/feel.

5. Running stock power to get a sense of the baseline -- will likely upgrade at some point in '22.

Look forward to tinkering with the filters in the weeks ahead. :sunglasses:


[Attached photo: IMG_1741.jpeg]
 
Dec 28, 2021 at 12:49 AM Post #5,908 of 7,175
I plan to continue to use my USB chain of MP-U1 battery, intona and 15 ferrites with Qutest. I don't think you can ever have enough protection from RFI/EMI.

What are an Intona and an MP-U1?
 
Jan 4, 2022 at 1:50 PM Post #5,909 of 7,175
Possibly my best post is this one.

As to whether my WTA is better than other algorithms I leave to others to decide, and of course there is no accounting for taste. But I will say that in my opinion the only subjectively important function of an interpolation filter is to reconstruct the timing of transients accurately; this is absolutely vital and accounts for all of the sound quality and musicality of a properly designed interpolation filter. No other designer recognises the importance of transient reconstruction at all in that nobody talks about it - I have been a sole voice on this issue for the past 23 years.

Transient timing reconstruction is a huge (and in reality extremely complex) problem with interpolation filters - and if all other designers do not even accept that this issue is a problem, what are the chances of an optimum solution? Secondly, thousands of listening tests went into fine tuning the WTA algorithm - I am not aware of any other design process for algorithms being so intense.

Although I (and most others) can't fully understand the maths and formulas that create the WTA, I do understand that its purpose is to make a higher-sampled digital version of a real, analogue-captured audio signal out of a lower-sampled version of the same signal - with 44.1 kHz Redbook showing the biggest step up and therefore the greatest advantage.

The M Scaler takes it as far as 768 kHz, which - to test the WTA's accuracy - you would want to compare against a recording made by Davina doing a 768 kHz ADC capture of the same analogue signal.

With the knowledge you've gathered so far, what are we looking at in the future in terms of the sampling rate needed before the benefits of higher sampling diminish - say, reaching the human ear's timing-resolution limit?

Sadly, our current Chord DACs won't be able to decode more than 768 kHz once the next-generation xMS comes out.
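
On synthetic data you can already run a toy version of that comparison (scipy's stock polyphase resampler standing in for the WTA, a generated tone standing in for a Davina capture - assumptions for illustration, not the real test):

```python
# Toy "upsampled 44.1k vs native 705.6k" comparison on a synthetic tone.
import numpy as np
from scipy.signal import resample_poly

fs_lo, ratio = 44100, 16
fs_hi = fs_lo * ratio                      # 705.6 kHz
dur = 0.2
x_lo = np.sin(2 * np.pi * 997 * np.arange(int(dur * fs_lo)) / fs_lo)
x_hi = np.sin(2 * np.pi * 997 * np.arange(int(dur * fs_hi)) / fs_hi)

y = resample_poly(x_lo, ratio, 1)          # 16x upsample of the 44.1k capture

m = len(x_hi) // 2                         # compare the middle, skip edge effects
seg = slice(m - 1000, m + 1000)
print("max |difference|:", np.max(np.abs(y[seg] - x_hi[seg])))
```

With real music and a real ADC, the interesting differences would presumably sit in the transients rather than a steady tone - which is exactly what the Davina comparison is meant to expose.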
 
Jan 5, 2022 at 5:32 AM Post #5,910 of 7,175
An awful lot has been learnt over the past couple of years - with some extraordinary findings, which I will (hopefully) be able to talk about later in the year. But one thing I have definitely confirmed is that as you go higher in sample rate, the benefits from increased tap length diminish markedly; 768 kHz does seem to be the sweet spot for recordings, so future, more advanced and costly M Scalers will still be at 705/768.
 
