Dali's Soft Magnetic Composite Driver
Apr 2, 2024 at 10:27 PM Post #46 of 231
@MiffyRabbit
Have you tried the tests on klippel.de and audiocheck.net? Trying to test your listening skills on music right away is like running before learning to walk IMO. I'm doing critical listening training ATM and it's not easy at all, breaking it down into components like THD and frequency discrimination in test tones is helpful.
 
Apr 3, 2024 at 12:36 AM Post #47 of 231
@MiffyRabbit
Have you tried the tests on klippel.de and audiocheck.net? Trying to test your listening skills on music right away is like running before learning to walk IMO. I'm doing critical listening training ATM and it's not easy at all, breaking it down into components like THD and frequency discrimination in test tones is helpful.
Thank you Kin! No, I have not! That must be why I was performing poorly haha. Thank you, I will save those two sites.
 
Apr 3, 2024 at 3:35 AM Post #48 of 231
I think it's better to apply critical listening to music rather than test tones. Learn how music works. You'll get a lot more out of it.
 
Apr 3, 2024 at 3:57 AM Post #49 of 231
I think it's better to apply critical listening to music rather than test tones. Learn how music works. You'll get a lot more out of it.
The didactic method I'm advocating here is to isolate and learn to identify specific artifacts in isolation, then learn how these artifacts sound when combined together. Music complicates this initial phase because music has varying rates of intentional distortion and harmonics combined with artifacts that shouldn't be there.

It's not necessary to do it this way of course, I just think it's a systematic way to build the skill set in a controlled manner so the student learns not just the how of that skill, but the why as well.
 
Apr 3, 2024 at 4:57 AM Post #50 of 231
Your hearing is probably perfectly fine. In real blind tests, many people who say they hear all these (big) differences between different codecs, bitrates, lossy vs lossless, CD vs hi-res, wired vs wireless, or between several DAPs, amps, DACs, cables, etc., fail those tests again and again.

The two most important things, which are absolutely real and don't require golden ears, are: a) differences in tonal balance/frequency response, and b) differences in recording/mastering quality. A well-mastered 128kbps track/file will sound far better on a good Bluetooth headphone than a poor master in hi-res on a summit-fi wired headphone.

Unfortunately, there is a tremendous amount of exaggeration and BS that people post in audio forums. People with little or no experience in audio often believe a lot of this BS and often spend a lot of money that isn't necessary on different aspects of audio. But even in these cases, many people want to believe that they're hearing (big) differences between A, B and C when in reality they don't, because they have already spent a lot of money. The name for this is 'confirmation bias'. With a little training you can learn to hear some real, very small (but often insignificant) differences, but not on the big scale that many, many people describe/exaggerate in these forums.
I agree with every word of this.
 
Apr 3, 2024 at 5:12 AM Post #51 of 231
Do you mean that LDAC SQ = SBC SQ?

Bluetooth codecs

SBC
- Bit depth: 16 bit
- Sample rates: 16 kHz, 32 kHz, 44.1 kHz, 48 kHz (max 48.0 kHz)
- Max bitrate: 320 kbps
- Introduced: 2003
- Latency: ~200-300 ms

LDAC
- Bit depth: 24 bit
- Sample rates: 44.1 kHz, 48 kHz, 96 kHz (max 96.0 kHz)
- Max bitrate: 990 kbps
- Introduced: 2015
- Latency: ~200-400 ms
Do the math, it’s very simple: 24 x 96,000, multiplied by 2 because it’s stereo and divided by 1,024 to get kbps, gives 4,500kbps. But with LDAC you’ve got a maximum of 990kbps, so obviously it cannot support 24/96 audio. LDAC cannot even support 16/44, because 16 x 44,100 x 2 / 1,024 = 1,378kbps. So both SBC and LDAC have to apply a lossy codec, and as audible transparency occurs with lossy codecs at rates lower than 320kbps, audibly LDAC = SBC. This obviously assumes the bitrate of SBC is not falling too far below 320kbps.
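That back-of-the-envelope calculation can be sketched in a few lines (just the arithmetic above, nothing codec-specific is assumed):

```python
def pcm_kbps(bit_depth, sample_rate_hz, channels=2):
    """Raw PCM bitrate in kbps (dividing by 1,024, as in the post above)."""
    return bit_depth * sample_rate_hz * channels / 1024

# 24/96 stereo needs ~4,500 kbps and 16/44.1 stereo ~1,378 kbps.
# Both exceed LDAC's 990 kbps ceiling, so LDAC must compress lossily.
print(pcm_kbps(24, 96_000))   # 4500.0
print(pcm_kbps(16, 44_100))   # 1378.125
```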
I know I am not an audio engineer lol. But damn I was shocked, my hearing isn't as great as others on this site
No, your hearing probably isn’t significantly worse than others’ on this site, and audio engineers generally don’t have particularly good hearing. Your listening skills are probably significantly worse than a music/sound engineer’s, but even having that level of listening skill still won’t help: codecs at higher bit rates are audibly indistinguishable by engineers as well!

Many audiophiles claim they can hear the difference between lossless and MP3s at 320kbps but, as with so many other things (cables, etc.), give them a controlled listening test and they can’t, just like everyone else! Obviously, the codecs have improved over the last 30 years, but for more than a decade now audible transparency has occurred at about 170kbps, and even 128kbps is audibly transparent with quite a high proportion of recordings.
The didactic method I'm advocating here is to isolate and learn to identify specific artifacts in isolation, then learn how these artifacts sound when combined together. Music complicates this initial phase because music has varying rates of intentional distortion and harmonics combined with artifacts that shouldn't be there.
Generally that is a good approach, and pretty much every hearing threshold I can think of, off the top of my head, is more sensitive to test signals than to music, because obviously we can design the test signal to occupy the most sensitive hearing range while also isolating and maximising the specific artefact/threshold being investigated. In most cases though, even sometimes in formal scientific studies, it is wise to also test with music recordings, as we can choose recordings which exhibit that artefact to a greater or lesser extent and thereby provide more data.

However, this isn’t really the case with lossy codecs, because what we’re actually testing is the efficacy of a complex set of algorithms applying “perceptual models”: e.g. splitting the signal into a number of bands, analysing the content in each band and reducing the number of bits each band requires by eliminating freqs we wouldn’t be capable of hearing due to “auditory masking” and other hearing limitations. Testing this process therefore requires complex signals, covering the entire freq spectrum of human hearing and providing a diverse range of scenarios. So your choice in this instance is either to spend many years designing a set of hundreds/thousands of diverse complex test signals, or simply to choose from the millions of commercial audio recordings available. Of course, you should try out some simple test tones/signals to satisfy any personal doubts.

G
 
Apr 3, 2024 at 5:37 AM Post #53 of 231
I'm just curious if the deficiencies in SBC I heard were related to defaulting to low bitrates and resampling vs aptX HD, I'll see if I can find my DSR9BT to check it out again.
Resampling isn’t much of an issue as far as the evidence I’m aware of is concerned, or at least not as much of an issue as is commonly assumed. Low bitrates have been an issue though: some devices had a fixed bitpool below the recommended value, and I heard of devices which stepped down the bitrate (due to, say, a weak signal or interference) but didn’t step back up again when the signal improved, without rebooting the Bluetooth connection. I don’t know if there are any current devices which demonstrate such behaviour; if so, I would consider them defective. TBH though, it’s becoming less of an issue as the more recent Bluetooth specs effectively deprecate SBC in favour of LC3.

G
 
Apr 3, 2024 at 5:57 AM Post #54 of 231
Do the math, it’s very simple: 24 x 96,000, then multiply that by 2 because it’s stereo and then divide by 1,024 to get the kbps, the result is 4,500kbps but with LDAC you’ve got a maximum of 990kbps, so obviously it cannot support 24/96 audio. LDAC cannot even support 16/44 because 16 x 44,100 x 2 / 1024 = 1,378kbps.
Does this mean that this is simply a commercial manipulation of numbers on the part of bluetooth codec developers (consumer deception)?
Do the math, it’s very simple: 24 x 96,000, then multiply that by 2 because it’s stereo and then divide by 1,024 to get the kbps, the result is 4,500kbps.
If I understand correctly, to get 4,500kbps in DALI IO-12 you only need to use a USB connection. This will be the best sound quality (SQ) from DALI IO-12.
 
Apr 3, 2024 at 6:41 AM Post #55 of 231
Does this mean that this is simply a commercial manipulation of numbers on the part of bluetooth codec developers (consumer deception)?
Effectively yes. Although, as with many “consumer deceptions”, there is some element of truth to it. AFAIK, you cannot feed a 24/96 signal to the SBC codec; unless the data/signal is reduced to 16/48 or 16/44 it will simply not function. You can feed a 24/96 signal to LDAC though; however, it will be lossy compressed (as will 16/44), so you won’t get the 24/96 (or 16/44) out that you put in.
If I understand correctly, to get 4,500kbps in DALI IO-12 you only need to use a USB connection.
The connection either has to provide sufficient data bandwidth or reduce the amount of data it’s transmitting (with lossy compression for example). USB 2 provides a data bandwidth of up to 480Mbps, over a hundred times more than two channels of 24/96 require (4,500kbps / 1,024 ≈ 4.4Mbps).
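The headroom works out to roughly a hundredfold (a quick sketch of the arithmetic above; note that 480Mbps is USB 2.0's nominal high-speed rate, not its real-world throughput):

```python
pcm_mbps = 24 * 96_000 * 2 / 1024 / 1024   # ~4.39 Mbps for stereo 24/96 PCM
usb2_mbps = 480                            # USB 2.0 high-speed nominal bandwidth
print(usb2_mbps / pcm_mbps)                # ~109x headroom over the audio data
```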
This will be the best sound quality (SQ) from DALI IO-12.
That very much depends on exactly what you mean by SQ. If you’re talking about audible SQ, then the best SQ is already reached using standard 16/44 through SBC. If you’re talking about absolute sound quality, i.e. what can be measured but not heard (for example music content/frequencies above about 20kHz), then it *might* theoretically be the best SQ under certain circumstances, although you won’t be able to hear it of course.

G
 
Apr 3, 2024 at 7:05 AM Post #56 of 231
Great sound quality on the Dali has nothing, or very little, to do with inaudible frequencies, codecs and numbers, and everything to do with its drivers' capability (within the spectrum of what humans can hear, of course) and their ability to pick up detail and provide a spacious sound. The hardware, basically.
 
Apr 3, 2024 at 8:47 AM Post #57 of 231
Great sound quality on the Dali has nothing, or very little, to do with inaudible frequencies, codecs and numbers, and everything to do with its drivers' capability (within the spectrum of what humans can hear, of course) and their ability to pick up detail and provide a spacious sound. The hardware, basically.
Receipts? I can't find anything on these outside of fluff pieces and a THD measurement that places it as average in fidelity.
 
Apr 3, 2024 at 9:21 AM Post #58 of 231
Receipts? I can't find anything on these outside of fluff pieces and a THD measurement that places it as average in fidelity.
What I’m trying to say is I agree with you. They sound great, in my opinion, not because of the codecs used or anything that falls outside of the realm of human hearing (to your points above). The receipts? I had the IO-12. But again to your point, I just returned it because, to me, it wasn’t $1000 better than my Sennheiser Momentum 4. Not even close, actually. It was even a downgrade in battery life, which is important to me. There should be no compromises at that price point imo. Either way, what I’m trying to say is that, in general, it’s mostly the hardware that makes something sound good, not codecs or snake oil.
 
Apr 3, 2024 at 9:25 AM Post #59 of 231
Effectively yes. Although as with many “consumer deceptions” there is some element of truth to it. AFAIK, you cannot feed a 24/96 signal to the SBC codec, unless the data/signal is reduced to 16/48 or 16/44 it will simply not function. You can feed a 24/96 signal to LDAC though, however it will be lossy compressed (as will 16/44), you won’t get the 24/96 (or 16/44) out that you put in.
From what I can read, it depends on how recent the codec implementations are. Neither of these Bluetooth codecs is typically lossless (given the nature of wireless transmission, it's more efficient to broadcast lossy). SBC is the "fallback" standard codec for Bluetooth audio (with standard profiles covering 16-48kHz). It looks like LDAC is still becoming a standard in the Android community, while Apple uses the ISO-backed AAC (which is also capable of lossy 96kHz). AAC would be the more popular lossy audio codec, as it's also widely used as the audio codec for movies ("audio transparency" being 128 kbit/s for stereo, and 384 kbit/s for 5.1 audio). At maximum it supports up to 48 channels at 96kHz plus 16 LFE channels.

Since LDAC is capable of a lossy 96kHz rate, I don't think it's a "deception" to say it's hi-res. There may be a deceptive miscommunication in the marketing if it's stated to be "lossless hi-res", though. To make things more complicated, apparently it also has a CD-quality "lossless" mode that's open to developers.

https://en.wikipedia.org/wiki/LDAC_(codec)

LDAC utilizes a type of lossy compression[2][3] by employing a hybrid coding scheme based on the modified discrete cosine transform[4] and Huffman coding[5] to provide more efficient data compression. By default, LDAC audio bitrate settings are set to Best Effort, which switches between discrete bitrate steps (CBR) 330/660/990 kbps depending on connection strength;[6] however, audio bitrate and resolution can be manually adjusted on Linux (when using PipeWire[7]), some Android platforms (which generally requires access to the "Developer Settings" menu), and Sony's own smartphones and Walkman devices at the following rates: 330/660/990 kbps at 96/48 kHz and 303/606/909 kbps at 88.2/44.1 kHz with depth of 32, 24 or 16 bits.[6] Lossless audio transmission can be achieved by manually configuring the codec's resolution to 44.1 kHz, 16 bits and selecting 'Sound quality preferred' for high bitrate streaming at 909 kbps. This setup is identical to a wired audio or an Audio-CD sound quality.
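As a quick sanity check on that lossless claim: raw 16/44.1 stereo PCM runs at 1,411,200 bits/s, so fitting it losslessly into 909 kbps implies roughly a 1.55:1 compression ratio from the entropy-coding stage, which is achievable on much material but presumably not guaranteed for every recording:

```python
raw_bps = 16 * 44_100 * 2   # 1,411,200 bits/s for stereo 16/44.1 PCM
ldac_bps = 909_000          # LDAC's 'Sound quality preferred' tier at 44.1 kHz
print(raw_bps / ldac_bps)   # ~1.55 compression ratio needed for lossless
```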
 
Apr 3, 2024 at 9:49 AM Post #60 of 231
@gregorio
I agree, I was talking more in general about didactic theory.

I'm just curious if the deficiencies in SBC I heard were related to defaulting to low bitrates and resampling vs aptX HD, I'll see if I can find my DSR9BT to check it out again.
You mention you were listening on a phone a few years ago. BT implementations have improved and can possibly negotiate a higher SBC bitpool (maybe the signal back then wasn't quite optimal). Also, I wonder if the BT device processes SBC vs aptX the same way after the audio is decoded. It's possible it's still treating them differently in its audio/EQ settings.
 
