You are wrong and people already tried to explain why a higher bit depth won't yield a "more accurate" recording after a certain point.
I will link this one last time, although you have always ignored it in the past. Maybe read and digest it this time:
At least pretend to try to understand digital audio before you keep posting your misconceptions in this forum. I'm not expecting you to understand bit depth, quantization error, dither and how they relate to each other (you never did, or made any effort to), so I will give you one last example that you might be able to grasp before I accept that some people just never learn.
You have a device that can be set to output any voltage between 0 and 9 volts with perfect accuracy. This device is completely ideal: if you set it to output 5 V, it will generate exactly that, not something between 4.9 V and 5.1 V. It's 5.000... volts to infinity and beyond (like a perfect microphone combined with a perfect sound source).
You also have an ideal multimeter, not bound by physics, that can measure voltage perfectly. However, it still has to show the measured voltage on a typical seven-segment LCD display, effectively quantizing the result. Let's say you set the ideal voltage generator to output exactly 5.4321 V. How many digits (bits/steps) would you need if you weren't allowed to lose precision? You would need to quantize to 5 digits (steps of 0.0001) because 5.4321 fits exactly on a 5-digit display. Having more digits would not make the result more accurate, as 5.432100 is still the same value.
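Since you apparently won't do the arithmetic yourself, here is the display analogy as a few lines of Python. The `quantize` helper is mine, purely for illustration; it counts decimal digits, so 4 decimal digits corresponds to the 5-digit display above:

```python
# Quantizing a value to a fixed number of decimal digits, like a
# finite LCD display would. Once the display has enough digits to
# hold the value exactly, adding more digits changes nothing.

def quantize(value, digits):
    """Round value to the nearest step of 10**-digits."""
    step = 10 ** -digits
    return round(value / step) * step

exact = 5.4321  # the generator's perfectly accurate output, in volts
for digits in (3, 4, 5, 6):
    shown = quantize(exact, digits)
    print(f"{digits} decimal digits -> {shown:.6f} (error {abs(shown - exact):.2e})")
```

With 4 decimal digits the display already holds 5.4321 exactly; 5 or 6 digits just append zeros.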
What happens if you pick a trickier output voltage, like the square root of 2, or pi? The voltage measurement before the display is still perfect, but unless you want an ...extremely large LCD display, you will have to settle for a finite number of digits, and some precision will be lost due to that, because the display quantizes the result. In this case, adding more digits always makes the result more accurate.
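The irrational case is just as easy to check. A quick sketch (again, the `quantize` helper is mine, for illustration only):

```python
import math

def quantize(value, digits):
    """Round value to the nearest step of 10**-digits, like a finite display."""
    step = 10 ** -digits
    return round(value / step) * step

# For an irrational value like pi, every extra display digit cuts the
# worst-case quantization error by 10x; the error never reaches zero,
# it just keeps shrinking.
for digits in range(1, 8):
    err = abs(quantize(math.pi, digits) - math.pi)
    print(f"{digits} digits: error {err:.1e}")
```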
However, voltage measurements are bound by physics, so let's say you can measure the voltage with a certain amount of precision, like +/- 0.1%.
So if you set the perfect voltage generator to output pi volts (3.14159265359 and so on), the measured voltage can be anywhere between ~3.1385 and ~3.1447.
What do you think, how many of those digits are meaningful in this case, i.e. how many digits do you need so the display doesn't throw away any of that ±0.1% precision? The value of the 10th digit is about 0.00000029% of pi (I might be off by one or two zeros; it does not matter at all). Combined with the fact that the measurement's precision itself is limited to ±0.1%, this makes displaying anything beyond the fifth or so digit completely pointless. The reading does not get more accurate; the numbers will be completely random after a certain point (called "noise" in certain contexts) because the measurement itself already lacks precision. The only way to make the result more precise is to improve the measurement itself from ±0.1% to something better; displaying more digits won't help.
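If you don't believe the trailing digits are random, simulate it. A rough sketch, assuming the ±0.1% error is uniformly distributed (my assumption; the exact distribution doesn't matter for the point):

```python
import math
import random

random.seed(1)  # fixed seed so the output is repeatable

true_value = math.pi  # the generator's perfect output, in volts
# five measurements, each with an independent ±0.1% error
readings = [true_value * (1 + random.uniform(-0.001, 0.001)) for _ in range(5)]
for r in readings:
    print(f"{r:.10f}")
# only the leading digits are stable between readings; the trailing
# digits wander all over the place, exactly like noise
```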
The same rules apply when recording (measuring) a signal and recreating (generating) a signal. Both of these steps have finite precision. Past a certain point, more bits add nothing of value: on the recording side, the extra bits just encode random fluctuations coming from the studio, the mic preamps, even the ADC itself (noise).

Likewise on the generating side: it doesn't matter if the bits tell the DAC to create 1000.001 mV when the DAC has a noise floor of 0.1 mV; everything from the first decimal digit on will be random. On top of that, the driver itself might not even respond to such a small change, and even if it did, the tiny movement of air it creates would still have to reach your ear instead of getting obliterated by some other random sound coming from PC fans, fridges, passing cars, even people.

24-bit is pointless for listening because, all in all, neither step comes even close to utilizing the last couple of bits.
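You can put rough numbers on that with back-of-the-envelope arithmetic. The full-scale voltage and the noise floor below are values I picked for illustration; real devices vary, but not by enough to change the conclusion:

```python
full_scale = 2.0              # assumed DAC output range in volts (illustrative)
step16 = full_scale / 2**16   # size of one 16-bit step
step24 = full_scale / 2**24   # size of one 24-bit step
noise_floor = 2e-6            # assumed ~2 uV RMS analog noise floor (illustrative)

print(f"16-bit step: {step16:.2e} V")
print(f"24-bit step: {step24:.2e} V")
print(f"noise floor: {noise_floor:.2e} V")
# the 24-bit step is already buried below the analog noise floor, so
# the bottom bits of a 24-bit stream mostly encode random noise
```

Under these assumptions the 16-bit step sits above the noise, while the 24-bit step is roughly an order of magnitude below it: those last bits are toggling inside the noise.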