Sure, we sadly live in a post-factual world, but we try to keep alive something of the better past, when facts meant something.
I am bad at remembering which study said what. My mind absorbs the core information, adds it to my knowledge and understanding, and that's it. Other members here might know better.
We can ask: why would 24-bit sound different from 16-bit? We know exactly what the difference is: take the 24-bit version and subtract the 16-bit version from it. This can easily be done in Audacity, for example. The result is noise at the level of 16-bit dither, assuming the 24-bit version has much more real dynamic range. Can you hear this noise? At reasonable listening levels you should not. You can maybe just hear it if you amplify it by 20-30 dB, but at that gain the music itself would make you deaf and your gear would blow up. So you never listen to music that loud, and even if you did, the hearing damage would make it impossible to hear much of anything. This means you don't hear this noise even when nothing else is masking it. Now add the music on top of it and you'll understand why hearing a difference between 24-bit and 16-bit is a ridiculous idea, barely worth a scientific study.
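Here is a minimal sketch of that null test in Python/NumPy rather than Audacity; the test signal, the quantize_tpdf helper, and all parameters are my own illustration, not taken from any particular tool:

```python
import numpy as np

def quantize_tpdf(x, bits):
    """Quantize a float signal in [-1, 1] to the given bit depth,
    with +/-1 LSB TPDF dither applied before rounding."""
    q = 2.0 ** (bits - 1)  # quantization levels per unit amplitude
    dither = np.random.rand(len(x)) - np.random.rand(len(x))  # triangular PDF, +/-1 LSB
    return np.round(x * q + dither) / q

t = np.arange(48000 * 5) / 48000.0             # 5 seconds at 48 kHz
signal = 0.5 * np.sin(2 * np.pi * 440.0 * t)   # stand-in for the hi-res master

# "24-bit minus 16-bit": what is left is just the dither/quantization noise.
residual = signal - quantize_tpdf(signal, 16)
rms_dbfs = 20 * np.log10(np.sqrt(np.mean(residual ** 2)))
print(f"residual noise: {rms_dbfs:.1f} dBFS")  # about -96 dBFS at 16 bits
```

A residual sitting near -96 dBFS is exactly the "raise it 20-30 dB before you can maybe hear it" situation described above.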
However, we can hear the difference between 16-bit and 8-bit, because the dither noise floor of 8-bit is quite high. Certain music genres can mask this noise well, but not all. That's what I have concluded myself from testing it.
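For comparison, running the same sketch at 8 bits (reusing quantize_tpdf from above) puts the residual roughly 48 dB higher, which is why 8-bit hiss pokes through unless the music masks it:

```python
# Same null test at 8 bits: the noise floor rises by ~48 dB compared to 16 bits.
residual_8 = signal - quantize_tpdf(signal, 8)
print(f"{20 * np.log10(np.sqrt(np.mean(residual_8 ** 2))):.1f} dBFS")  # about -48 dBFS
```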
It really comes down to the question: why would 24-bit sound different from 16-bit? How many bits do we need, in your opinion, to make the sound so transparent that more bits don't matter? 20 bits? 24 bits? 32 bits? 235 bits? Infinite bits? In my opinion the threshold is about 13 bits to be on the safe side, but that's just me. Some people may say 12 bits, some 14 bits, but it is somewhere around there, and clearly 16 bits is ENOUGH.
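As a back-of-the-envelope check of those numbers, the textbook figure for an N-bit system is a signal-to-noise ratio of about 6.02 * N + 1.76 dB for a full-scale sine over the quantization noise floor (TPDF dither raises the floor by a few dB):

```python
# Theoretical full-scale-sine SNR per bit depth: ~6.02 * N + 1.76 dB.
for n in (8, 13, 16, 24):
    print(f"{n:2d} bits: ~{6.02 * n + 1.76:.0f} dB")
# prints: 8 bits ~50 dB, 13 bits ~80 dB, 16 bits ~98 dB, 24 bits ~146 dB
```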
Why do you think we spread misinformation? If I didn't have a good education, I might think bigger numbers must mean better fidelity, but I do have an education that allows me to understand well what bit depth means for audible fidelity. For people with the proper education it is clear that 8 bits is not enough, but 16 bits is. The threshold is somewhere between those numbers.
There are countless YouTube videos about this. There are countless articles about this. All just misinformation? For what reason? What are these people (including us) selling? If 16 bits weren't enough, then of course I would admit it, but luckily it is. Sometimes technology simply becomes good enough. That's the whole point of digital audio: it makes good enough easy to achieve. If we needed to keep adding more and more bits, digital audio would be just like analog audio: good quality for billionaires who can afford the best and bad quality for normal people who can't.