That's a misunderstanding of how audio works over BT. A true lossless 16-bit 44.1kHz stereo stream requires 1.411 Mbps of PCM data (16 x 44100 x 2 channels = 1411.2 kbps = Redbook standard aka CD quality). BT, since BT 2.0+EDR or so, has a total bandwidth of about 2 Mbps, but a big chunk of that is eaten by protocol overhead and basic BT communication, so you only get around 1 Mbps or so that audio can actually use. And that's only in an ideal situation. Given BT operates on the open and busy 2.4GHz band, you typically won't get the whole 1 Mbps but a bit less, depending on how noisy the environment is. The more interference, the less bandwidth is available.
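If you want to sanity-check the math yourself, it's just bit depth x sample rate x channels. A quick sketch (the function name is mine, not any real API):

```python
# Raw (uncompressed) PCM bitrate in kbps: bit depth x sample rate x channels.
def pcm_bitrate_kbps(bit_depth, sample_rate_hz, channels=2):
    return bit_depth * sample_rate_hz * channels / 1000

print(pcm_bitrate_kbps(16, 44100))  # 1411.2 kbps -> Redbook / CD quality
print(pcm_bitrate_kbps(24, 96000))  # 4608.0 kbps -> "Hi-Res" 24/96
```

Note that even plain CD quality at 1411.2 kbps already overshoots the ~1 Mbps that BT audio realistically gets.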
Now let's talk about LDAC - while the “L” in the name implies “Lossless”, it is actually a lossy compression codec. Since we already know even 16/44.1 is too big for BT, a true 24-bit 96kHz lossless PCM stream is simply impossible. So the only way to fit it into the limited BT bandwidth is lossy compression. That's why the highest LDAC mode (SQ/quality priority) tops out at just 990 kbps, barely fitting into the ~1 Mbps BT limitation, and why LDAC's highest setting is often unstable in a noisy environment.

So why do we want 24/96 anyway? Well, because a little bit of digital headroom allows for easier digital volume control and filter implementation in the long run, plus it helps companies advertise their “newer and better” technology, a la “Hi-Res”. Tests have shown that LDAC can be audibly transparent at the 16/44.1 level but not quite at the 24/96 level - again, because it is a lossy codec meant to get the best out of the limited BT audio bandwidth, not one that can actually stream the full lossless 24/96 PCM data.
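To put numbers on how lossy LDAC has to be: here's the compression ratio each of its three fixed bitrate modes (990/660/330 kbps) needs to achieve against raw 24/96 stereo PCM:

```python
# Raw 24-bit/96kHz stereo PCM bitrate vs LDAC's three fixed bitrate modes.
pcm_24_96_kbps = 24 * 96000 * 2 / 1000   # 4608.0 kbps uncompressed
ldac_modes_kbps = [990, 660, 330]        # quality / normal / connection priority

for mode in ldac_modes_kbps:
    ratio = pcm_24_96_kbps / mode
    print(f"{mode} kbps mode needs about {ratio:.1f}:1 compression")
```

Even the best-case 990 kbps mode has to throw away roughly four-fifths of the raw data, which is why "lossless 24/96 over LDAC" is a marketing impossibility.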