wilyodysseus
I've just spent the last couple of hours listening to Apple Lossless files in both iTunes and sbooth.org Play. Is it just me, or is there a subtle but audible difference in sound quality? I'm not sure which one sounds better or worse, but they do sound different to my ears.
I haven't poked around the source code yet, but the only thing that jumps out in the feature list is that "Play processes all audio using 32-bit floating point precision, providing the highest possible playback quality for files sampled at all bit depths."
Does anyone have a theory as to why casting a 16-bit source file to 32-bit floats would change the sound quality? I'm presuming that such a type conversion would just pad out the extra bits with zeros. Am I off base there?
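For reference, here's a minimal sketch of what I assume the int16-to-float conversion looks like inside a typical player. This is just my own illustration, not Play's actual code, and the 1/32768 scale factor is the common convention rather than something I've confirmed from the source:

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical int16 -> float32 sample conversion, as a typical player
   might do it. It's a rescale rather than literal zero-padding, but every
   16-bit value fits exactly in a float's 24-bit mantissa, so the
   conversion itself is lossless. */
static float sample_to_float(int16_t s)
{
    return (float)s / 32768.0f; /* maps [-32768, 32767] into [-1.0, 1.0) */
}

int main(void)
{
    int16_t samples[] = { 0, 1, -1, 32767, -32768 };
    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++)
        printf("%6d -> %.10f\n", samples[i], sample_to_float(samples[i]));
    return 0;
}
```

If that's really all the conversion does, the cast itself shouldn't change the sound at all, which is exactly why I'm puzzled by what I'm hearing.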
Thanks.