This is absolutely not how Bluetooth audio codecs should be evaluated. Bitrate has some correlation with audio quality, but only when comparing different bitrates of the exact same codec.
These codecs vary greatly in how exactly they compress data. aptX is actually the least sophisticated of them all. While this is no big deal in concrete terms, it means, for example, that regular aptX has a built-in, sometimes audible, raised noise floor at higher frequencies that you can't get rid of:
https://www.soundguys.com/the-ultimate-guide-to-bluetooth-aptx-and-aptx-hd-19914/
But even more important is how exactly each of these codecs is implemented on each individual device, and how Bluetooth is implemented in general.
Soundguys evaluated AAC's implementation on various emitting devices, and the results vary from crap to possibly the best Bluetooth audio codec we currently have overall (given that it only needs 256 kbps to push noise floor and distortion down to inaudible levels, and can reach high enough frequencies):
https://www.soundguys.com/the-ultimate-guide-to-bluetooth-headphones-aac-20296/
LDAC's actual bitrate varies depending on the device:
https://www.soundguys.com/ldac-ultimate-bluetooth-guide-20026/
A good implementation of SBC can be just as excellent as aptX. And so on.
So this is a device-by-device situation.
But beyond codecs, some Bluetooth implementations simply aren't that good and can introduce distortion/artefacts regardless of codec. A little test I like to do is simply to play single tones at various frequencies, particularly high ones (above 10,000 Hz), for example from a website such as this one:
https://www.szynalski.com/tone-generator/
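If you'd rather script the test than use the website, here's a minimal sketch in Python (numpy and sounddevice are assumptions on my part; any clean tone generator does the job just as well):

```python
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 48000  # Hz
DURATION = 3.0       # seconds per tone

def play_tone(freq_hz, amplitude=0.5):
    # Pure sine: anything you hear on top of it comes from the playback
    # chain, not from the signal itself.
    t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
    tone = amplitude * np.sin(2 * np.pi * freq_hz * t)
    # Short fade-in/out so start/stop clicks aren't mistaken for artefacts.
    fade = int(0.02 * SAMPLE_RATE)
    tone[:fade] *= np.linspace(0.0, 1.0, fade)
    tone[-fade:] *= np.linspace(1.0, 0.0, fade)
    sd.play(tone, samplerate=SAMPLE_RATE, blocking=True)

# Step through the treble region where the spurious tones tend to show up.
for freq in (8000, 10000, 12000, 14000, 16000):
    print(f"Playing {freq} Hz...")
    play_tone(freq)
```

The fades matter: without them the start/stop clicks can sound like exactly the kind of artefact you're listening for.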
So far very few headphones I've tried pass this test without producing spurious tones at higher frequencies, regardless of the codec that's selected. This year I've tried quite a few of the supposedly higher-end ANC BT headphones and most of them (M3, PX7, Bose 700, etc.) failed it (coincidentally, most of them use the exact same SoC family from Qualcomm, so perhaps the problem comes from there).
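If you want to go beyond listening, the same test can be made measurable, assuming you can capture the headphones' output with a measurement mic. A rough sketch (the file name and the threshold are made up for illustration):

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import get_window

# "h95_12khz_capture.wav" is a made-up name: a recording of the headphones
# playing a single 12 kHz tone, captured with a measurement mic.
rate, data = wavfile.read("h95_12khz_capture.wav")
if data.ndim > 1:
    data = data.mean(axis=1)               # mix down to mono
data = data.astype(np.float64)
data *= get_window("hann", len(data))      # reduce spectral leakage

spectrum = np.abs(np.fft.rfft(data))
freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)

tone_bin = int(np.argmax(spectrum))        # the test tone should dominate
level_db = 20 * np.log10(spectrum / spectrum[tone_bin] + 1e-12)

# Anything well away from the test tone yet above the threshold is a
# candidate spurious tone (aliasing, intermodulation, codec artefacts...).
THRESHOLD_DB = -60.0                       # tune to your capture's noise floor
for i in np.where(level_db > THRESHOLD_DB)[0]:
    if abs(freqs[i] - freqs[tone_bin]) > 100.0:  # skip the tone's own skirt
        print(f"{freqs[i]:8.1f} Hz at {level_db[i]:6.1f} dB re. tone")
```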
And finally, the way these codecs are implemented may introduce other variables that produce an audible difference yet aren't related to the codec per se. For example, the decoding of one codec might happen on the main SoC, while another is handled by a dedicated chip.
So the gist of it is quite simple: you absolutely cannot read a spec sheet, much less just look at a codec's bitrate, and expect to know with any exactitude how audio over Bluetooth will sound.
This is a case-by-case situation, and the whole Bluetooth audio chain needs to be taken into account, from the emitting device to the receiving one. You may have a pair of headphones that can handle LDAC at 990 kbps, but your smartphone may be limited to LDAC at 330 kbps, and even then none of this tells you how exactly the SoC or electronics will handle the encoding/decoding process.
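To make the "weakest link" point concrete, here's a toy illustration (not real LDAC code; the function and the names are mine, only the 330/660/990 kbps tiers are LDAC's actual public quality modes):

```python
# Toy model: both ends advertise what they support, and the link can only
# run at a mode both sides can handle.
LDAC_MODES_KBPS = (330, 660, 990)  # LDAC's public quality tiers

def negotiated_bitrate(source_max_kbps, sink_max_kbps):
    """The effective rate can never exceed the more limited device."""
    ceiling = min(source_max_kbps, sink_max_kbps)
    usable = [rate for rate in LDAC_MODES_KBPS if rate <= ceiling]
    return max(usable) if usable else None

# Headphones handle 990 kbps, but the phone caps out at 330 kbps:
print(negotiated_bitrate(source_max_kbps=330, sink_max_kbps=990))  # -> 330
```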
A pretty good primer on Bluetooth codecs IMO:
https://habr.com/en/post/456182/
In the case of using the H95 with an iOS device, at the very least the emitting device's output should be totally fine. The big question is: how good exactly is the H95's AAC implementation? How good are its electronics at handling Bluetooth audio in general?