Let's stick to facts here. Ripping a CD to FLAC is a digital-to-digital transfer, and error detection/correction is built into the process to ensure a bit-perfect copy. FLAC and ALAC are both lossless compression formats, so both are bit perfect. Furthermore, the CD medium itself carries redundant data (Reed-Solomon error correction) to deal with surface scratches. The bits are physical pits pressed into the disc and completely immune to the kind of bit flips that stray subatomic particles cause in semiconductors.
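If you want to verify this yourself, decode the FLAC back to raw PCM and compare it sample for sample against the untouched extraction. A minimal Python sketch, assuming the numpy and soundfile packages, with placeholder filenames:

import numpy as np
import soundfile as sf  # pysoundfile; decodes both WAV and FLAC

# 'original.wav' and 'rip.flac' are placeholder names for the raw
# CD extraction and its FLAC encode.
wav_data, wav_rate = sf.read("original.wav", dtype="int16")
flac_data, flac_rate = sf.read("rip.flac", dtype="int16")

# Bit perfect means every sample is identical, not just "sounds the same".
assert wav_rate == flac_rate, "sample rates differ"
assert np.array_equal(wav_data, flac_data), "samples differ: not bit perfect"
print("Decoded FLAC matches the original PCM exactly: bit perfect.")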
If you do hear a difference, it would be due to whatever "extra enhancement" processing AK has built in during ripping. Assuming it does any, that processing would be adding or deleting bits, and the result can no longer be labeled bit perfect.
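Even a "harmless" tweak like a small gain change lands every sample on a different integer. A quick illustration in Python (the sample values are made up):

import numpy as np

# A few 16-bit PCM samples as they might come off the disc.
original = np.array([1000, -2000, 30000, -15000], dtype=np.int16)

# Any enhancement DSP, even a modest -0.5 dB gain, rewrites the samples.
gain = 10 ** (-0.5 / 20)  # -0.5 dB as a linear factor, about 0.944
processed = np.round(original * gain).astype(np.int16)

print(processed)                            # [944 -1888 28322 -14161]
print(np.array_equal(original, processed))  # False: the bits changed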
An example of this was early versions of Apple iTunes. It was long suspected that, during ripping, iTunes normalized the volume by tossing out what it considered insignificant bits and artificially raising the overall level. Since AAC is lossy to begin with, this was perfectly legitimate to do. However, people complained, so iTunes now stores the volume-matching data separately (the Sound Check tag) and only applies it during playback in iTunes, leaving the original rip unaltered.
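Mimicking that playback-time approach is straightforward: keep the gain as a tag next to the file and apply it only when building the output buffer, never when writing the rip. A rough sketch with an invented track_gain_db field (not Apple's actual iTunNORM format):

import numpy as np

# The rip's samples stay untouched on disk; only a per-track gain
# value (a la Sound Check / ReplayGain) is stored alongside them.
track = {"samples": np.array([1000, -2000, 30000, -15000], dtype=np.int16),
         "track_gain_db": -3.5}  # hypothetical stored volume match

def playback_buffer(track):
    # Apply the stored gain in floating point at playback time only.
    gain = 10 ** (track["track_gain_db"] / 20)
    return track["samples"].astype(np.float32) * gain

out = playback_buffer(track)  # gain-adjusted audio headed to the DAC
print(track["samples"])       # the file's data itself remains bit perfect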