Is bluetooth aptx perfect for 320kbps mp3?
Jan 7, 2020 at 5:28 PM Post #46 of 62
Make sure no one in your house is watching the new Dracula on Netflix while you are overloading the wifi. They'll get mad at you!
 
Jan 7, 2020 at 7:03 PM Post #47 of 62
Make sure no one in your house is watching the new Dracula on Netflix while you are overloading the wifi. They'll get mad at you!

The actual home network is separate from the lab network for exactly that reason. Might tick off a few neighbors though. It’s hard to find completely unused channels with an apartment building within Wi-Fi range.

I’ll have to use my neighbor’s street address as my lab network SSID to throw any angry mob off the trail. :)
 
Jan 8, 2020 at 3:10 AM Post #48 of 62
[1] I'm extrapolating from that and making a bit of an assumption that the transmission through Bluetooth isn't costing so much that it's turning a 320kb mp3 into a 192kb mp3...
[2] I mean I guess we could ask the alternative question too - is there any science showing that the difference is audible under reasonable listening conditions?
[3] Just for reference, this was an article I read a few months back which sort of convinced me I didn't really need to worry a whole lot about the matter...even SBC is good enough ftmp.
1. I have no idea, but the data rate for standard bluetooth is 328kbps. However, a significant portion of that data rate is taken up by error correction code, and the simplistic SBC codec also represents a loss of equivalent bits, so in effect it could easily be turning a 320kbps MP3 into a 192kbps MP3, and I'm not convinced your assumption is valid.

2. Yes, we could ask that question but I'm not making the assertion that the difference is audible and therefore it does matter, I'm saying I don't really know and it might be audible under some reasonable conditions, while bigshot is saying that "it flat out doesn't matter".

3. That article highlights the problem: different manufacturers implement the codec options differently, even between different models by the same manufacturer. For example: "The situation with AAC is ambiguous: on one hand, theoretically, the codec should produce quality that is indistinguishable from the original, but practice, judging by the tests of the SoundGuys laboratory on different Android devices, is not confirmed. Most likely, the fault is on low-quality hardware audio encoders embedded in various phone chipsets. It makes sense to use AAC only on Apple devices; with Android you'd better stick with aptX/HD and LDAC." - In fact the article states that there is an audible difference with headphones, although it asserts that this is due to DSP within the chipsets. Another potential issue is the test tracks they've used, heavy rock/metal, which is just about the easiest genre to lossy encode transparently (128kbps MP3 is typically sufficient), while other genres, some electronic genres for example, are more difficult and can require a significantly higher data rate.
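On the data-rate point above: the 328kbps figure for standard SBC follows from the A2DP specification's frame-length formula. A minimal sketch (Python, parameter names mine) for the common "high quality" joint-stereo settings of 44.1kHz, 8 subbands, 16 blocks, bitpool 53:

```python
import math

def sbc_bitrate(sample_rate=44100, subbands=8, blocks=16, bitpool=53):
    """Approximate SBC bit rate for joint-stereo mode, per the
    A2DP specification's frame-length formula."""
    # Frame length in bytes for joint stereo:
    # 4-byte header + scale factors + ceil((subbands + blocks * bitpool) / 8)
    frame_bytes = (4 + (4 * subbands * 2) // 8
                   + math.ceil((subbands + blocks * bitpool) / 8))
    # One frame carries (subbands * blocks) PCM samples per channel.
    return 8 * frame_bytes * sample_rate / (subbands * blocks)

print(round(sbc_bitrate() / 1000))  # prints 328
```

Dropping the bitpool to 35 (a common "middle quality" setting) lands around 229kbps, which is where the 320-to-192-class comparison starts to look plausible once encoder quality is factored in.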

[1] You want me to prove a negative?
[2] Wouldn't it be easier for you to do a test and see if you could hear it?
[3] Like I said, when bluetooth was first introduced, I wasn't impressed. But the current implementation is transparent for my purposes ...
[4] I'm not the one trying to be dominant here. I'm just saying what my experience with bluetooth is.

1. No, I do not! I never asked for proof, I just asked for reliable supporting evidence. Nice try using a typical audiophile ploy, one that YOU criticise them for!!!!

2. How on earth would it be easier for me to test a bunch of different bluetooth transmitting and receiving devices that I don't own in a range of circumstances? I could easily test my personal bluetooth transmitter and headphones, under my personal listening circumstances but all that would reveal is my personal experience. I certainly couldn't make a blanket "it flat out doesn't matter" assertion here in the sound science forum unless I wanted to be a hypocrite!!

3. Which current implementation? AptX for example is exactly the same now as it was when bluetooth was first released.

4. But this is not the "bigshot's experience" forum, it's the sound science forum. Sure, you can provide your personal experience but you CANNOT extrapolate your personal experience to blanket assertions of fact for everyone else without, just as a minimum starting point, some reliable supporting evidence. How many times have you criticised visiting audiophiles for doing just that?

So YET AGAIN, your attempted defence of your position achieved the exact opposite and just confirmed the criticism levelled against you!!

G
 
Jan 8, 2020 at 6:20 AM Post #49 of 62
I'm not really trying to convince anyone that my assumptions are valid. I'm looking around at the evidence I can find, knowing my use case situation, and making a decision about what I want to do. The fact that it might be possible to manufacture some edge case scenario where I might be able to possibly hear some difference doesn't worry me any more than does the fact that it might under very special conditions be possible to hear a difference between high br lossy and lossless. I am honestly not sure what you are arguing here. Sometimes, even in the sound science forum, good enough is good enough. If as you say the transmission through bluetooth does in fact cause notable sound degradation, you show me the evidence. I've been using BT for a long time and can't hear it myself.

It seems to me that if there was some audible loss of sound quality related to current BT implementations, there would be evidence (as in simple frequency spectrum graphs) of that all over the place. The audiophile community has such a negative view towards BT that if there was to be found a notable, repeatable SQ issue that could be easily displayed (as any notable repeatable issue should be able to be) it would be trumpeted all over the various sound sites with glee.

As far as different codec option implementations go, I'd just say anyone who plans to use BT a lot should make an effort to ensure that the codec is implemented suitably on any device they are considering. That is sort of SOP for the buying of any gear really...
 
Jan 8, 2020 at 12:11 PM Post #50 of 62
Do you have any components that you use bluetooth on, Gregorio?
 
Jan 11, 2020 at 9:03 AM Post #51 of 62
[1] I'm not really trying to convince anyone that my assumptions are valid. I'm looking around at the evidence I can find, knowing my use case situation, and making a decision about what I want to do.
[2] The fact that it might be possible to manufacture some edge case scenario where I might be able to possibly hear some difference doesn't worry me any more than does the fact that it might under very special conditions be possible to hear a difference between high br lossy and lossless.
[3] I am honestly not sure what you are arguing here.
[3a] Sometimes, even in the sound science forum, good enough is good enough.
[3b] If as you say the transmission through bluetooth does in fact cause notable sound degradation, you show me the evidence.
[3c] I've been using BT for a long time and can't hear it myself.
[4] It seems to me that if there was some audible loss of sound quality related to current BT implementations, there would be evidence (as in simple frequency spectrum graphs) of that all over the place. The audiophile community has such a negative view towards BT that if there was to be found a notable, repeatable SQ issue that could be easily displayed (as any notable repeatable issue should be able to be) it would be trumpeted all over the various sound sites with glee.

As far as different codec option implementations go, I'd just say anyone who plans to use BT a lot should make an effort to ensure that the codec is implemented suitably on any device they are considering. That is sort of SOP for the buying of any gear really...

1. Which is absolutely fine and indeed, recommended practice but again, this isn't the "What Sgt. Ear Ache wants to do" forum, it's the sound science forum.

2. According to the evidence you yourself linked to, they could NOT confirm the absence of an audible difference with AAC bluetooth on Android devices and recommended using a different bluetooth codec. Android is hardly "some edge case scenario"!

3. I've tried to make it very clear, I'm arguing that we cannot make blanket assertions of fact in a science forum based ONLY on our own individual experience and a particular interpretation of the evidence which supports our personal experience. If we do, how are we any different to any other subforum here? People come here for answers based on science/the facts, NOT yours, mine or bigshot's personal experiences or what we would do/want. And continuing:
3a. Not just sometimes but often! There are virtually always differences but typically they are inaudible under reasonable listening conditions, we have a body of reliable evidence to support the assertion that it "flat out doesn't matter", plus a rational explanation of why it should be inaudible. However, that is NOT the case here! The problem we have here is: A large number of variables, relatively little reliable evidence of the audibility of different combinations of those variables and the evidence I've seen is somewhat inconclusive. If we are to be HONEST here in the sound science forum, we therefore MUST at least leave the door open that under some reasonable listening conditions there could be audible differences, because the science does NOT support an absolute assertion and we would be hypocrites and no different to any other forum if we did!!
3b. But I did NOT say there would be "notable sound degradation", I've been consistently VERY CLEAR on that point!! I explained the mechanism by which there will be degradation, which as far as I'm aware is uncontested but I've clearly stated that I do not know if this degradation can sometimes be notable/audible (under reasonable listening conditions). So what evidence is it that you want from me?
3c. Which is fine for you but obviously doesn't define a sound science answer!

4. No, when dealing with perceptual lossy codecs a freq spectrum graph is largely useless. We typically find large/obvious freq content differences when perceptual lossy codecs are involved, these differences on their own are easily audible but in practice they're inaudible due to a quirk of our perception, "masking", which renders certain freqs inaudible in the presence of certain other freqs. Hence why they're called perceptual codecs and why the actual freq content doesn't tell us anything about audibility.
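To illustrate with a toy example (a sketch, not a listening test; the tone levels and masking figures here are purely illustrative): a quiet tone sitting 40 dB below a strong nearby tone is typically masked and inaudible, yet removing it produces an enormous measured difference on a spectrum plot:

```python
import numpy as np

fs = 44100
t = np.arange(fs) / fs  # one second of audio

masker = np.sin(2 * np.pi * 1000 * t)         # loud 1 kHz tone
masked = 0.01 * np.sin(2 * np.pi * 1100 * t)  # nearby tone, 40 dB down

with_tone = np.abs(np.fft.rfft(masker + masked))
without_tone = np.abs(np.fft.rfft(masker))

bin_1100 = 1100  # with N = fs samples, bin index equals frequency in Hz
diff_db = 20 * np.log10(with_tone[bin_1100] / (without_tone[bin_1100] + 1e-12))
print(f"Spectral difference at 1.1 kHz: {diff_db:.0f} dB")
```

The spectrum analyser reports a difference of well over 60 dB at 1.1 kHz, while psychoacoustics says a listener likely hears the two signals as identical. That asymmetry is exactly why frequency plots can't settle audibility questions for perceptual codecs.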

Do you have any components that you use bluetooth on, Gregorio?

Well done, another typical audiophile response and one that you yourself condemn in other threads! What difference does it make what bluetooth equipment I've got and what I've experienced? This isn't the "What Gregorio's got and experienced" subforum any more than it's the Hugo and $5,000 USB cable impressions subforum or the "what bigshot's got and experienced" subforum, isn't this the sound science forum?

Again, you're just digging your hole even deeper, just what you criticise others for!!!

G
 
Jan 11, 2020 at 11:02 AM Post #52 of 62
1. Which is absolutely fine and indeed, recommended practice but again, this isn't the "What Sgt. Ear Ache wants to do" forum, it's the sound science forum.

2. According to the evidence you yourself linked to, they could NOT confirm no difference with AAC bluetooth on Android devices and recommended using a different bluetooth protocol. Android is hardly "some edge case scenario"!

3. I've tried to make it very clear, I'm arguing that we cannot make blanket assertions of fact in a science forum based ONLY on our own individual experience and a particular interpretation of the evidence which supports our personal experience. If we do, how are we any different to any other subforum here? People come here for answers based on science/the facts, NOT yours, mine or bigshot's personal experiences or what we would do/want. And continuing:
3a. Not just sometimes but often! There are virtually always differences but typically they are inaudible under reasonable listening conditions, we have a body of reliable evidence to support the assertion that it "flat out doesn't matter", plus a rational explanation of why it should be inaudible. However, that is NOT the case here! The problem we have here is: A large number of variables, relatively little reliable evidence of the audibility of different combinations of those variables and the evidence I've seen is somewhat inconclusive. If we are to be HONEST here in the sound science forum, we therefore MUST at least leave the door open that under some reasonable listening conditions there could be audible differences, because the science does NOT support an absolute assertion and we would be hypocrites and no different to any other forum if we did!!

OK

:ok_hand:
 
Jan 11, 2020 at 4:38 PM Post #53 of 62
This is a “what Bigshot wants to say” post. That’s all I want to say.

Edit: Oh! One more thing... Bluetooth works doggone good for the purposes of informal listening to music.
 
Jan 12, 2020 at 8:20 AM Post #54 of 62
Edit: Oh! One more thing... Bluetooth works doggone good for the purposes of informal listening to music.

Well done, yet another repeat of the same unsubstantiated personal use and personal impressions of bluetooth, which is UNACCEPTABLE here as you yourself would argue in pretty much every other thread! Unless of course my browser has failed to update and this forum's name really has been changed to "Bigshot's Impressions". Is there no end to how deep a hole you want to dig for yourself?

G
 
Jan 13, 2020 at 3:23 AM Post #55 of 62
Do you have any current bluetooth components? Have you been able to hear a difference? I sure haven't. Let me know if you have. I think you might be operating on old info. I thought bluetooth sounded mediocre when it first came out, and I avoided it for years. I revisited it and changed my mind.
 
Jan 13, 2020 at 4:34 AM Post #56 of 62
[1] Do you have any current bluetooth components?
[2] Have you been able to hear a difference? I sure haven't. Let me know if you have.
[3] I think you might be operating on old info. I thought bluetooth sounded mediocre when it first came out, and I avoided it for years. I revisited it and changed my mind.

1. Yes I do but that's irrelevant.

2. I haven't tested them under the full range of what could constitute "reasonable/common listening conditions" and even if I had (and couldn't hear a difference), all that would tell me is that my particular combination of audio format and bluetooth equipment is transparent, not that everyone else's is!

3. What old (or new) info/evidence? There's been very little evidence presented here and that which has been presented appears somewhat inconclusive but YOU haven't presented any reliable evidence whatsoever, only your personal opinions/impressions. This isn't the "What bigshot changed his mind about" forum!

The answer to my question "Is there no end to how deep a hole you want to dig for yourself?" - is apparently "NO", just as it is with so many of the audiophiles that you criticise them for!!!

G
 
Jan 13, 2020 at 11:52 AM Post #57 of 62
Have you heard artifacting using bluetooth in your bluetooth components?
 
Jan 13, 2020 at 12:39 PM Post #58 of 62
I have used Bluetooth in the past and still do for some fraction of use cases. And I have used lossy encoding extensively for decades.

The thing I worry about is where any kind of DSP or transcoding is applied. First of all yes, in my use cases, for me, the results have always been highly pleasing with lossy and Bluetooth, but this is not the what does Steve999 find highly pleasing forum. . . ; )

Over time I have migrated to WiFi with lossless, as the technology has made it easier and cheaper (particularly over the last year, as cheap lossless streaming services, for example Amazon HD and now Qobuz, have become available) and wifi audio streaming in the home has started to come into the mainstream. With my home audio receiver, I almost always go full lossless into the receiver, as I am fond of upmixing to 5.1 from stereo, and want the receiver to have the full lossless signal to work with for upmixing.

Finally, hats off to Harold999 on his handle.

--Steve999
 
Jan 13, 2020 at 1:59 PM Post #59 of 62
I've dealt with this a number of times - it usually happens in businesses that have outgrown their startup small biz network infrastructure.

It's not likely to occur in a home - there just aren't enough devices to saturate either the infrastructure or the channels. Most home networks will also automatically channel-hunt on both 2.4/5 GHz if the signal is sub-optimal - which could happen if BT conflicts with the channel, or if your neighbors happen to use the same channel for wifi. In a business office, self-healing of that type generally isn't enabled, as the WiFi network layout needs to be more controlled.
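The 2.4 GHz overlap is easy to quantify: classic BT hops across 79 one-MHz channels from 2402 to 2480 MHz, while a single Wi-Fi channel's nominal 22 MHz of bandwidth (the 802.11b/g figure) sits on top of a couple dozen of them. A quick sketch:

```python
def wifi_center_mhz(channel):
    """Center frequency of a 2.4 GHz Wi-Fi channel (channels 1-13)."""
    return 2407 + 5 * channel

def bt_channels_hit(wifi_channel, width_mhz=22):
    """Classic Bluetooth hop channels (2402 + k MHz, k = 0..78) that
    fall inside a Wi-Fi channel's nominal bandwidth."""
    center = wifi_center_mhz(wifi_channel)
    lo, hi = center - width_mhz / 2, center + width_mhz / 2
    return [k for k in range(79) if lo <= 2402 + k <= hi]

hit = bt_channels_hit(6)
print(f"Wi-Fi channel 6 overlaps {len(hit)} of 79 BT hop channels")  # 23 of 79
```

That still leaves plenty of clear spectrum, and BT's adaptive frequency hopping will steer around the busy block, which is part of why one congested Wi-Fi network rarely audibly disturbs BT on its own.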

I'll experiment a bit this weekend to see if I can create a scenario where BT starts to be impacted by WiFi. Will set 8-10 Wi-Fi SSIDs to use the same channel and will try to saturate the network with 2.4/5 GHz and BT traffic to see what shows up in network monitoring and also subjectively on a BT headphone.


Short version of the story:

While this is hardly conclusive, as I was not able to saturate enough Wi-Fi or BT channels to ensure BT wasn't able to find an unused/unsaturated connection, I was unable to impact BT audio at a detectable level by generating a significant amount of network traffic. Despite fully saturating several Wi-Fi networks and SSID-based channels (to the point where issues were measurable on Wi-Fi), BT continued to operate without creating artifacts I found audible.

When I have more time, I'll see if I can improve how I'm measuring BT transmission to see if anything shows, and if I can force issues with specific configuration. I think a reasonable assumption would be that unless there are other impacting elements, it's unlikely that Wi-Fi traffic is going to audibly degrade BT under normal operating conditions and configuration. Maxing out this test would require more hardware and setup than I'm likely to be able to accomplish in the home lab.
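For anyone wanting to reproduce the traffic-generation side of a test like this, a real saturation run would point a tool like iperf3 at a host across the actual wireless link; purely as a sketch of the shape of a generator (loopback only, so it exercises nothing wireless), here's a toy UDP blaster:

```python
import socket
import time

def blast_udp(packets=2000, payload_bytes=1400):
    """Toy UDP traffic generator: fires packets at a local receiver and
    reports the offered rate. Loopback only - a real Wi-Fi saturation
    test needs the traffic to cross the wireless link itself."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))          # OS picks a free port
    addr = rx.getsockname()
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    payload = b"\x00" * payload_bytes
    start = time.perf_counter()
    for _ in range(packets):
        tx.sendto(payload, addr)
    elapsed = time.perf_counter() - start

    tx.close()
    rx.close()
    sent_bytes = packets * payload_bytes
    return sent_bytes, sent_bytes * 8 / elapsed / 1e6  # bytes, Mbit/s

sent, mbps = blast_udp()
print(f"sent {sent} bytes at ~{mbps:.0f} Mbit/s (loopback)")
```

The 1400-byte payload keeps each datagram under a typical Ethernet MTU; the packet count and port here are arbitrary. Monitoring whether BT audio artifacts appear while traffic like this crosses the 2.4 GHz link is the subjective half of the test described above.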
 
Jan 13, 2020 at 3:56 PM Post #60 of 62
Interesting. I'm wondering if the implementation of bluetooth has improved in the past two or three years. I remember not being impressed with it when it was first launched, but now I see no problems at all with it. I recently got a set of bluetooth speakers that claims a 100 foot range. Something there must be different from when I first tried it.

I even have a bluetooth Alexa for the car that allows me to do just about everything in my car that I can do at home. I use that all the time and I never have any problems at all with it. My phone is turned off in my pocket, and it connects and streams music through it with voice commands. I have wifi at home and work, but everywhere else I use bluetooth.
 
