What are head-fi members views on apt-x lossless codec (over bluetooth)?
Mar 11, 2015 at 11:18 AM Post #166 of 460

Giogio

Bluetooth Guru
Joined
Nov 13, 2014
Posts
1,419
Likes
348
  It isn't just a bigger antenna, you'll need bigger output power as well.
Ok, but it amounts to the same thing: you need an antenna and power, but that still does not imply any negotiation between the two devices, unlike the aptX thing.
So, why must both devices be Class 1?
 
But no, you don't need to play AAC-only files. It only means the encoding and decoding between transmitter and receiver (the actual Bluetooth transmission) is AAC.
So, you mean that if I play an MP3 file on an iPhone it will be encoded on the fly to AAC and transmitted to the AAC headphones with the AAC codec?
Are you absolutely sure, and could you possibly also provide some official info, links, etc.?
I could not find anything on the Web...

 
 
  https://miccusblog.wordpress.com/2014/08/13/bluetooth-music-and-latency/
I would be careful with what those guys say. They do not mention that in Dual Link mode (two headphones receiving the same signal, so that two people can listen to the same music at the same time) aptX is switched off and SBC is used instead.
And I know from my research with other products (like Avantree and Telme2) that this is how it works.

But it is still interesting to know that they are planning long-distance aptX.
Considering their prices, though, I am more interested in the similar plan from Avantree, which in addition will also be Low Latency aptX.
But, we'll see.

 
 
@Class D, yes, this is what I was trying to explain, but I did not really explain it, just implied it, because I took for granted that everybody knows that AAC is now also a transmission codec, used by iPhones and several headphones.
About the Class, can you explain why the range is not additive? I mean, if a receiver can "listen" to what is being transmitted within a range of x meters, and a transmitter can send a signal into this range of x meters, why shouldn't that work? Why must the start of the signal (not the end of it), i.e. the transmitter itself, also be within this range of x meters?
 
Mar 11, 2015 at 2:00 PM Post #167 of 460

Class D

New Head-Fier
Joined
Dec 9, 2014
Posts
27
Likes
13
Hi Giogio,
 
Well, not everyone knows that AAC is a BT transmission CODEC.  I found this interesting, so I did some research on Apple and their implementation of BT on their devices: iPhones, iPods, iPads, etc.  There is a specification for Apple developers that provides guidelines for the BT interface =>
 
https://developer.apple.com/hardwaredrivers/BluetoothDesignGuidelines.pdf
 
Although SBC is the default codec for interoperability between sink and source devices, the guidelines allow for A2DP enhancement by using the MPEG-2 or MPEG-4 AAC-LC codec.  This approach is similar to aptX while circumventing royalties to CSR for their codec.
 
Knowing this, anyone considering the purchase of a BT sink device, like a headphone, should look to see that it supports both the aptX and AAC codecs.  This will give you excellent sound quality and low latency across a variety of BT sources.  Additionally, these codecs require BT 4.0 for low power consumption, Enhanced Data Rate (EDR) for a wide data path, and an advanced microprocessor to support these features.  These things will be advantageous to sound quality even if the default SBC codec is used.
 
Quote:  Giogio
About the Class, can you explain why the range is not additive?

 
The only way I can explain this is to say that the BT radio signal attenuates to a dB value below the noise level of the transmission frequency in use at x meters.  For a receiver (BT sink) beyond x meters from the BT source to recognize the signal amongst the noise, it would have to have a really good sensitivity spec.  This is certainly possible.  In fact, this is how GPS receivers work: by the time the GPS signal reaches earth, it is so weak it is lost in the noise, and a good GPS receiver is sensitive enough to dip down into the transmission noise and recognize the signal from the satellite.  With a BT Class 1 radio receiver, you would expect it to be more sensitive to weak signals than BT Class 2 and to have the capability to dip into the noise and recognize the BT signal coming from the BT source.  Nevertheless, it would not be sensitive enough to recognize signals beyond its rated x meters.
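The attenuation argument above can be put into rough numbers with a free-space path loss estimate. This is only an idealized sketch (no walls, bodies, or interference), and the transmit power and sensitivity figures in the comments are typical class values, not any particular device's spec:

```python
import math

def fspl_db(distance_m, freq_hz=2.4e9):
    """Free-space path loss in dB at the 2.4 GHz ISM band."""
    c = 3e8  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

def received_dbm(tx_dbm, distance_m):
    """Signal level at the receiver: transmit power minus path loss."""
    return tx_dbm - fspl_db(distance_m)

# A Class 2 source (+4 dBm) at 10 m arrives at roughly -56 dBm,
# comfortably above a typical BT receiver sensitivity of -70 to -90 dBm.
# At 100 m the same signal is roughly -76 dBm: still physically present,
# but only a receiver with a very good sensitivity spec can decode it.
print(round(received_dbm(4, 10)), round(received_dbm(4, 100)))
```

This is why the signal "continues on" past the rated range but gets ever harder to pull out of the noise, exactly as the GPS comparison suggests.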
 
Mar 11, 2015 at 11:49 PM Post #168 of 460

ClieOS

IEM Reviewer Extraordinaire
Joined
May 11, 2004
Posts
20,525
Likes
10,359
Location
Mid Johor, Malaysia
   
Ok, but it is the same, I mean, you need antenna and power, but still this does not imply any negotiation between two devices, unlike for the Aptx thing.
So, why must both devices be Class1?
 
 
So, you mean that if I play an MP3 file on an iPhone it will be encoded on the fly to AAC and transmitted to the AAC headphones with the AAC codec?
Are you sure sure sure and could you eventually also provide some official info, links etc?
I could not find anything on the Web...
 

 
Q1: as Class D said, you will probably need a Class 2 receiver sensitive enough to pick up a Class 1 signal at distance.
 
Q2: Class D already listed the official Apple doc. BT.org actually allows any extra implementation on top of SBC as long as it still conforms to the spec and bandwidth. This is true for aptX as well as AAC. The good news is, of course, that most of the current aptX-supporting devices seem to support AAC as well, likely because CSR wants to benefit from Apple's big iDevice market, even though Apple has been avoiding aptX as much as possible.
 
Mar 12, 2015 at 1:03 AM Post #169 of 460

Giogio

Bluetooth Guru
Joined
Nov 13, 2014
Posts
1,419
Likes
348
   
Q1: as Class D said, you will probably need a Class 2 receiver sensitive enough to pick up a Class 1 signal at distance.
 
Q2: Class D already listed the official Apple doc. BT.org actually allows any extra implementation on top of SBC as long as it still conforms to the spec and bandwidth. This is true for aptX as well as AAC. The good news is, of course, that most of the current aptX-supporting devices seem to support AAC as well, likely because CSR wants to benefit from Apple's big iDevice market, even though Apple has been avoiding aptX as much as possible.

There is no information in that Apple link. No information about whether the AAC codec is also used when an MP3 file is played.
Anyway, Macs with OS X support aptX.
 
Mar 12, 2015 at 2:04 AM Post #170 of 460

ClieOS

IEM Reviewer Extraordinaire
Joined
May 11, 2004
Posts
20,525
Likes
10,359
Location
Mid Johor, Malaysia
There is no information in that Apple link. No information about whether the AAC codec is also used when an MP3 file is played.
Anyway, Macs with OS X support aptX.

 
That's because Apple doesn't list any requirement to indicate whether AAC is used or not. It is all done automatically - meaning it will use AAC when both transmitter and receiver support it, or fall back to SBC when one of them doesn't. That is different from the aptX implementation, which usually displays an icon or something to tell you whether aptX is in use or not.
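That automatic fall-back can be sketched as a simple preference list. This is a hypothetical illustration, not the actual A2DP capability-exchange code (the real negotiation happens over the AVDTP signaling channel):

```python
# Codec preference, best first; SBC is mandatory in A2DP, so it is
# always available as the common denominator.
PREFERENCE = ["aptX", "AAC", "SBC"]

def negotiate(source_codecs, sink_codecs):
    """Pick the best codec supported by both ends, else fall back to SBC."""
    for codec in PREFERENCE:
        if codec in source_codecs and codec in sink_codecs:
            return codec
    return "SBC"

# iPhone-style source (AAC + SBC) with an aptX/AAC headphone -> AAC
print(negotiate({"AAC", "SBC"}, {"aptX", "AAC", "SBC"}))  # AAC
# Same source with an aptX-only headphone -> silent fall-back to SBC
print(negotiate({"AAC", "SBC"}, {"aptX", "SBC"}))         # SBC
```

Note that the fall-back is silent, which is exactly why Apple devices give no visible indication of which codec is in use.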
 
On the other hand, whether you are playing mp3, FLAC, m4a or ALAC has nothing to do with whether AAC is used over BT. When you play a file, regardless of format, it always gets decoded to PCM first (that's the universal lossless language for audio) before it is sent to the DAC to be converted to analog sound. With Bluetooth, the same PCM signal is sent to the BT chip to be re-encoded to SBC / AAC / aptX (depending on which is supported) for the transmission, because PCM itself is too big to be sent over. That means the BT chip will choose whatever compression format is available to it regardless of what file format you are playing - because at the basic level, the BT chip only sees PCM on the input and nothing else.
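The chain described above can be written out as a toy pipeline (the stage labels are purely illustrative, not real DSP):

```python
def play_over_bt(file_format, bt_codec):
    """Trace the stages: any file format funnels through PCM first."""
    pcm = f"PCM(decoded from {file_format})"  # universal intermediate
    return f"{bt_codec}({pcm})"               # BT chip re-encodes the PCM

# The BT codec chosen is independent of the source file format:
print(play_over_bt("mp3", "aptX"))   # aptX(PCM(decoded from mp3))
print(play_over_bt("FLAC", "aptX"))  # aptX(PCM(decoded from FLAC))
```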
 
Mar 12, 2015 at 2:47 AM Post #171 of 460

james444

Headphoneus Supremus
Joined
Aug 25, 2004
Posts
7,085
Likes
2,325
 
About the Class, can you explain why the range is not additive? I mean, if a receiver can "listen" to what is being transmitted within a range of x meters, and a transmitter can send a signal into this range of x meters, why shouldn't that work? Why must the start of the signal (not the end of it), i.e. the transmitter itself, also be within this range of x meters?

 
The only way I can explain this is to say that the BT radio signal attenuates to a dB value below the noise level of the transmission frequency in use at x meters.  For a receiver (BT sink) beyond x meters from the BT source to recognize the signal amongst the noise, it would have to have a really good sensitivity spec.  This is certainly possible.  In fact, this is how GPS receivers work: by the time the GPS signal reaches earth, it is so weak it is lost in the noise, and a good GPS receiver is sensitive enough to dip down into the transmission noise and recognize the signal from the satellite.  With a BT Class 1 radio receiver, you would expect it to be more sensitive to weak signals than BT Class 2 and to have the capability to dip into the noise and recognize the BT signal coming from the BT source.  Nevertheless, it would not be sensitive enough to recognize signals beyond its rated x meters.

 
Q1: as Class D said, you will probably need a Class 2 receiver sensitive enough to pick up a Class 1 signal at distance.

 
As I posted earlier, the A2DP specification contains negotiation procedures. I'm not an expert in any way, but from my understanding that would require bidirectional communication between source and sink. Meaning, not only must the source be able to reach the sink, but also vice versa. Not comparable to GPS imo, which is just a one-way signal transmission.
 
Here's an A2DP technology overview for developers; note how the communication arrows point both ways:
 

 
(Just my 2c, and I may be wrong... like I said, I'm not an expert.)
 
Mar 12, 2015 at 3:28 AM Post #172 of 460

Giogio

Bluetooth Guru
Joined
Nov 13, 2014
Posts
1,419
Likes
348
   
When you play a file, regardless of format, it will always get decoded to PCM first

That sounds kind of idiotic.
Why do they do that?
Why can't they send the MP3 directly to the BT chip and encode it to SBC/aptX/AAC?
It is a waste of time, energy, and signal.
 
Mar 12, 2015 at 3:34 AM Post #173 of 460

Giogio

Bluetooth Guru
Joined
Nov 13, 2014
Posts
1,419
Likes
348
   
As I posted earlier, the A2DP specification contains negotiation procedures. I'm not an expert in any way, but from my understanding that would require bidirectional communication between source and sink. Meaning, not only must the source be able to reach the sink, but also vice versa.
 

I think I understand what you mean; it sounds like what Class D said about the two devices with 50 m range placed 100 m from each other, and being able to meet in the middle, so that in my fantasy the receiver should be able to take what the transmitter sent.
But he means that the receiver must reach the transmitter where it physically is, not its waves somewhere in the middle.

But I still cannot understand.
I mean, I cannot understand why; what are the technical and scientific reasons why this is so?
To me it seems so clear: if transmitter A, placed 100 m from receiver B, sends its waves to point C, and receiver B is able with its range to pick up those waves at point C, why should that not work?
 
Mar 12, 2015 at 3:50 AM Post #174 of 460

ClieOS

IEM Reviewer Extraordinaire
Joined
May 11, 2004
Posts
20,525
Likes
10,359
Location
Mid Johor, Malaysia
   
As I posted earlier, the A2DP specification contains negotiation procedures. I'm not an expert in any way, but from my understanding that would require bidirectional communication between source and sink. Meaning, not only must the source be able to reach the sink, but also vice versa. Not comparable to GPS imo, which is just a one-way signal transmission.
 
  (Just my 2c, and I may be wrong... like I said, I'm not an expert.)

 
 
Yes, that makes sense to me as well. BT is inherently bidirectional at its core.
 
  That sounds kind of idiotic.
Why do they do that?
Why can't they send the MP3 directly to the BT chip and encode it to SBC/aptX/AAC?
It is a waste of time, energy, and signal.

 
Because mp3 is not actually a free codec to use; you need to pay a license for commercial use. So adopting a low-fi codec like SBC, which BT.org holds the license to, is a lot cheaper and, more importantly, gives more control in the long run. Not to mention mp3 doesn't really offer much better compression than SBC. AAC is also not free, but Apple probably picked it because that's what Apple uses everywhere (again, it is more about control over their own market); plus AAC is a better compression codec, so it offers better SQ at the same bitrate than SBC and mp3. aptX is pretty much the same story, except it is even better than AAC at compression.
 
Mar 12, 2015 at 4:10 AM Post #175 of 460

james444

Headphoneus Supremus
Joined
Aug 25, 2004
Posts
7,085
Likes
2,325
 
Because mp3 is not actually a free codec to use; you need to pay a license for commercial use. So adopting a low-fi codec like SBC, which BT.org holds the license to, is a lot cheaper and, more importantly, gives more control in the long run. Not to mention mp3 doesn't really offer much better compression than SBC. AAC is also not free, but Apple probably picked it because that's what Apple uses everywhere (again, it is more about control over their own market); plus AAC is a better compression codec, so it offers better SQ at the same bitrate than SBC and mp3. aptX is pretty much the same story, except it is even better than AAC at compression.

 
Moreover, A2DP transmits audio data in small packets; it's not like an mp3 file of several MB can be transmitted as a whole. Chopping mp3 audio data into blocks and recreating a seamless audio stream from these blocks is not a trivial task, especially when the file in question is variable bitrate. I'd think that's one more reason why they first decode everything to PCM and encode / decode for A2DP on a per-block basis.
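The fixed-block framing mentioned above is trivial for PCM, where every block carries the same number of samples, unlike a VBR mp3 whose frames vary in size. A sketch (the block size here is hypothetical; real A2DP media packet sizes are negotiated per link):

```python
def packetize(pcm_bytes, block_size=512):
    """Chop a PCM byte stream into fixed-size blocks for transmission."""
    return [pcm_bytes[i:i + block_size]
            for i in range(0, len(pcm_bytes), block_size)]

blocks = packetize(b"\x00" * 2048)
print(len(blocks), len(blocks[0]))  # 4 blocks of 512 bytes each
```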
 
Mar 12, 2015 at 5:00 AM Post #176 of 460

Giogio

Bluetooth Guru
Joined
Nov 13, 2014
Posts
1,419
Likes
348
Because mp3 is not actually a free codec to use; you need to pay a license for commercial use. So adopting a low-fi codec like SBC, which BT.org holds the license to, is a lot cheaper and, more importantly, gives more control in the long run. Not to mention mp3 doesn't really offer much better compression than SBC. AAC is also not free, but Apple probably picked it because that's what Apple uses everywhere (again, it is more about control over their own market); plus AAC is a better compression codec, so it offers better SQ at the same bitrate than SBC and mp3. aptX is pretty much the same story, except it is even better than AAC at compression.

So, let me understand: you mean that when the AAC codec is used and an AAC file is played, this file is not converted to PCM but sent directly to the BT chip, while if another kind of file is played, then it is converted into PCM first?
Or are AAC files also converted into PCM (which would be even more of a pity, considering the chance to directly stream something totally untouched)?

And, for example, if the aptX codec is used and an AAC file is played, would the AAC then be converted into PCM first?
 
Mar 12, 2015 at 8:49 AM Post #177 of 460

ClieOS

IEM Reviewer Extraordinaire
Joined
May 11, 2004
Posts
20,525
Likes
10,359
Location
Mid Johor, Malaysia
So, let me understand: you mean that when the AAC codec is used and an AAC file is played, this file is not converted to PCM but sent directly to the BT chip, while if another kind of file is played, then it is converted into PCM first?
Or are AAC files also converted into PCM (which would be even more of a pity, considering the chance to directly stream something totally untouched)?

And, for example, if the aptX codec is used and an AAC file is played, would the AAC then be converted into PCM first?

 
Regardless of what file format you use, it will always get decoded to PCM first. Formats like mp3, AAC, FLAC or ALAC are just containers: they define how the data is stored in a particular way, but underneath, that data is PCM. So basically we have one way of saving PCM called mp3, another way called AAC, another called FLAC, etc. When these files (mp3 / AAC / FLAC / etc.) are played, the computer 'restores' them back to their purest form first, which is PCM; then it is fed to the DAC (or to the BT chip for further re-encoding for transmission).
 
Taking your example:
AAC file played -> convert to PCM -> send to BT chip -> re-encode to aptX -> send the aptX to the receiver -> receiver takes the aptX and converts it to PCM -> fed to the DAC and becomes analog sound.

(the aptX step can be SBC or AAC instead, depending on what is supported by both transmitter and receiver)
 
If you think it is a pity that AAC gets converted to PCM, then you have the wrong idea. PCM is one of the two most basic forms of digital audio data (the other one is DSD). But PCM is too big to store, so we process it with a codec like mp3 or AAC in order to compress it down to a file size (and quality) that we can live with. So converting AAC to PCM is basically a process of restoration. It is like unzipping a file to reveal the original data: if you never unzip the file, how would you have any access to the data inside? Therefore the process is a MUST.
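The zip analogy can be demonstrated literally with any lossless compressor; here zlib stands in for a lossless audio codec like FLAC/ALAC (same principle, not the actual audio codecs):

```python
import zlib

pcm = bytes(range(256)) * 64          # stand-in for raw PCM samples
packed = zlib.compress(pcm)           # like encoding PCM to FLAC/ALAC
restored = zlib.decompress(packed)    # the "unzip": restore before use

assert restored == pcm                # lossless means bit-identical
print(len(pcm), len(packed))          # the packed form is smaller
```

Lossy codecs like mp3, AAC or aptX differ only in that the restored PCM is an approximation rather than bit-identical; the decode-before-use step is the same.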
 
Mar 12, 2015 at 1:26 PM Post #178 of 460

Class D

New Head-Fier
Joined
Dec 9, 2014
Posts
27
Likes
13
ClieOS,
 
Thank you for your exhaustive explanation of how audio coding and decoding are performed over a wireless connection such as BT.  This is how I understand it, also.
 
Giogio,
 
Your ABC analogy of the physical layer of a BT connection is correct.  In the case of radio waves, Point C is actually the end of the universe, 450 billion light years away; this is where they terminate.  This is also true of wires: the signal terminates at the end of the wire (it is then reflected back into the wire unless terminated properly, with resistance).  Regulatory authorities determine how far Point B can be from Point A while a radio connection is still possible, because it's the antenna output power that is regulated.  For instance, a 30 dBm (1 watt) radio signal will go for x distance and then attenuate rapidly (the radio signal is dropped, or lost in the noise).  Point B must be located within x distance from Point A, as the signal strength is too weak beyond that point, except for a highly sensitive radio receiver.  Since the radio signal continues on to the end of the universe, it is theoretically possible with a highly sensitive receiver to make a BT connection at the edge of the universe.  The latency would be approximately 450 billion years and lip sync would be terrible, although sound quality would stay intact when decoded at the end of the universe.
 
When it comes to Radio Frequency transmissions, it is governmental regulatory authorities that control the characteristics of the physical layer of the OSI model. 
 
 
 
OSI Model

Host layers:
  7. Application (data unit: Data): High-level APIs, including resource sharing, remote file access, directory services and virtual terminals. Examples: HTTP, FTP, SMTP
  6. Presentation (data unit: Data): Translation of data between a networking service and an application, including character encoding, data compression and encryption/decryption. Examples: ASCII, EBCDIC, JPEG
  5. Session (data unit: Data): Managing communication sessions, i.e. continuous exchange of information in the form of multiple back-and-forth transmissions between two nodes. Examples: RPC, PAP
  4. Transport (data unit: Segments): Reliable transmission of data segments between points on a network, including segmentation, acknowledgement and multiplexing. Examples: TCP, UDP

Media layers:
  3. Network (data unit: Packet/Datagram): Structuring and managing a multi-node network, including addressing, routing and traffic control. Examples: IPv4, IPv6, IPsec, AppleTalk
  2. Data link (data unit: Bit/Frame): Reliable transmission of data frames between two nodes connected by a physical layer. Examples: PPP, IEEE 802.2, L2TP
  1. Physical (data unit: Bit): Transmission and reception of raw bit streams over a physical medium. Examples: DSL, USB
 
 
Since BT operates in the ISM (Industrial, Scientific, Medical) frequency band, which has the Effective Isotropic Radiated Power (EIRP) limited to 4 watts, this limits the range of transmission.  Here's how power output is specified for the ISM band (in the US):

Maximum Transmit Output Power in the ISM bands

Several of the FCC part 15 rules govern the transmit power permitted in the ISM bands.  Here is a summary of those rules:
 
  • Maximum transmitter output power, fed into the antenna, is 30 dBm (1 watt).
  • Maximum Effective Isotropic Radiated Power (EIRP) is 36 dBm (4 watts).
    You can obtain the EIRP by simply adding the transmit output power, in dBm, to the antenna gain in dBi (if there is loss in the cable feeding the antenna you may subtract that loss).
  • If your equipment is used in a fixed point-to-point link, there are two exceptions to the maximum EIRP rule above:
    1. In the 5.8 GHz band the rule is less restrictive. The maximum EIRP allowed is 53 dBm (30 dBm plus 23 dBi of antenna gain).
    2. In the 2.4 GHz band you can increase the antenna gain to get an EIRP above 36 dBm, but for every 3 dBi increase of antenna gain you must reduce the transmit power by 1 dBm. The table below shows the combinations of allowed transmit power / antenna gain and the resulting EIRP.

  Transmit Power (dBm)   Antenna Gain (dBi)   EIRP (dBm)
           30                    6                36
           29                    9                38
           28                   12                40
           27                   15                42
           26                   18                44
           25                   21                46
           24                   24                48
           23                   27                50
           22                   30                52

The responsibility for staying within these power limits falls on the operator (or, if professionally installed, on the installer).
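The EIRP arithmetic in the FCC rules above is just addition in dB; a quick sketch:

```python
def eirp_dbm(tx_power_dbm, antenna_gain_dbi, cable_loss_db=0.0):
    """EIRP = transmit power + antenna gain - feed-line loss (all in dB/dBm)."""
    return tx_power_dbm + antenna_gain_dbi - cable_loss_db

print(eirp_dbm(30, 6))    # 36 dBm: the 4-watt ISM limit
print(eirp_dbm(22, 30))   # 52 dBm: allowed only for fixed 2.4 GHz point-to-point links
```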
 
This is what limits your reception of a BT signal in your home.  If you have the Plantronics BackBeat Pro BT headphones, which advertise "Stream audio from up to 330 feet* away from your smartphone or tablet", then you have BT headphones with a highly sensitive radio receiver that will give you extended range (in most cases), even with a BT Class 2 smartphone.
 
Mar 14, 2015 at 12:28 PM Post #179 of 460

gixxerwimp

100+ Head-Fier
Joined
Feb 21, 2015
Posts
405
Likes
95
Just thought I'd share my aptX experience with my Samsung phones and NAD D 3020 DAC/amp. I can't hear the difference between streaming FLAC files from a Samsung S3 (aptX) to the NAD and optical in to the NAD (ZIN-5005HD > HDMI > Sharp TV > TOSLINK), when played through B&W M1 speakers plus a Velodyne Impact-Mini subwoofer. Nor can I hear the difference between the aptX and optical sources when listening to DT 1350s out of the NAD's headphone jack. The test results are the same with my Note 3, which also has aptX support.
 
Having the BT aptX on this amp makes it very convenient as the UI on the ZIN-5005HD isn't very friendly, and I can stream from my NAS over Wi-Fi to the phone, which then pumps it to the amp over BT. Song selection and volume control are in the palm of my hand. I just have to remember to leave the phone in the living room when I go to the bathroom, otherwise the signal drops out.
 
