isn't the apple encoder supposed to reduce the gain after checking for intersample clipping, so that everything is cool both for encoding and playback? I'm still using mp3 like a caveman, but I thought I read something like that once. or maybe it was about a specific AAC encoder? IDK.
personally I got pretty paranoid about intersample clipping and clipping in general, so I have 2 different points along the digital chain telling me if I happened to clip. that way I can mess around with EQ and DSP without panicking.
a free VST that does a fine job for me is DPmeter. the latest version, DPmeter III, might require you to RTFM to avoid messing up, so those who know they'll never read the PDF should maybe get the first or second version of that VST if it's still somewhere on the web. I like that VST because when it clips, the little indicators stay lit even after several songs, so you don't actually have to keep it open and stare at a VU meter all the time like a crazy person.
my EQ also shows when something has clipped, but it's a pricey app, so better to stick to the free stuff.
@Steve999 more than apple vs windows, the difference for you could come from the players and audio path settings. in foobar for example, samples are 32-bit floating point, which doesn't really care if a value goes a little above 0dB. but after that, if the signal gets converted to integers (which is likely), anything above 0dB becomes 0dB. oops. or it could be something super dumb like a digital gain or an EQ somewhere, either creating the clipping issues, or saving you from them with an attenuated gain by default so that you won't clip even when you boost a frequency in the EQ, effectively giving that much digital headroom to all signals. there are even DACs built with a few dB of headroom for intersample clipping (not the norm though).
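to illustrate that float-vs-integer point, here's a minimal numpy sketch (numpy and the specific sample values are just my own toy example, not anything foobar actually does internally):

```python
import numpy as np

# three samples; the middle one is about +0.5 dB over full scale
x = np.array([0.5, 1.06, -1.02], dtype=np.float32)

# in a 32-bit float pipeline the overshoot just passes through
print(x.max())  # still above 1.0, nothing lost

# convert to 16-bit integer PCM, like most drivers/DACs receive:
# anything above 0 dBFS gets pinned to full scale. oops.
pcm16 = (np.clip(x, -1.0, 1.0) * 32767).astype(np.int16)
print(pcm16)  # the over and the under are both stuck at +/-32767
```

same numbers, two different outcomes, purely depending on where in the chain you look.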
ultimately, creating a loop from your DAC's output into the input of a soundcard and checking that some full-scale sines don't clip from your player would be a decent way to check if something weird is going on with gain and/or oversampling. in foobar, when using replaygain, you have the option to prevent clipping according to peak. and if you scan with some oversampling (very CPU hungry!), the accuracy of those peak values will improve. or you could just set something like -3dB in the "preamp" and forget about 99.9% of intersample clipping issues. this is really one of those things where we have 20 ways to skin a cat.
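for the curious, the oversampling point can be shown with a short numpy sketch (my own toy demo, not foobar's actual scanner): a sine at fs/4 with a 45-degree phase offset lands every sample on ±0.707, so after "normalizing" the sample peak to 0 dBFS the continuous waveform actually swings about +3 dB over full scale, which a plain peak meter never sees until you oversample:

```python
import numpy as np

fs = 44100
n = np.arange(1024)

# sine at fs/4 with a 45-degree phase: every sample lands on +/-0.707
x = np.sin(2 * np.pi * (fs / 4) * n / fs + np.pi / 4)
x /= np.abs(x).max()                  # "normalize" sample peak to 0 dBFS
print(np.abs(x).max())                # 1.0 -- the plain meter says we're fine

# 4x oversampling via zero-padding the spectrum (FFT interpolation)
X = np.fft.rfft(x)
X = np.concatenate([X, np.zeros(3 * len(X) - 3)])  # 513 bins -> 2049
x4 = np.fft.irfft(X) * 4              # *4 compensates the length change
true_peak_db = 20 * np.log10(np.abs(x4).max())
print(round(true_peak_db, 2))         # about +3.01 dB over full scale
```

that +3dB figure is also roughly why a -3dB preamp setting buys you so much safety margin.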
now if the clipping is already encoded into the file, well, that's that.
I always wonder how many people got into purchasing ludicrous gear, apps, and file resolutions thinking everything else sounded like crap, when really they were clipping the signal without knowing it. after all, the "bit perfect" frenzy on audio forums only makes clipping more likely. I have no doubt that a good number of the people saying lossy formats sound bad would change their mind if they were simply more aware of digital levels and how things work. because it is absolutely true that a lossy file is likely to have higher intersample peaks than higher resolution files. but nobody is forced to actually convert that potential into real clipping.