I can't say for sure, but I would expect the higher bitrate to make the biggest difference: not only will more system resources such as RAM and storage be needed, but the higher bitrate will tax the CPU more than any compression algorithm possibly could.
Besides, FLAC and MP3 are both forms of compression, so I don't really understand the comparison. Sure, FLAC is lossless and MP3 is lossy, but FLAC is still compressed; unless we are talking about a raw, uncompressed PCM stream (which is what a WAV file usually contains), every one of these formats is a form of compression.
To get down to the small details: a bit is the smallest unit of data, so 320 kbps (320,000 bits per second) is not going to require as much CPU throughput as, say, 1,400 kbps (1,400,000 bits per second). That isn't even counting upsampling and the extra bit depth, which will stress the system further. It's really the same as asking whether 3GPP at 320x240 / 288 kbps will use more power than AVCHD at 1920x1080 / 24 Mbps; an extreme example, but food for thought.
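Just to put rough numbers on that, here is a quick sketch (the helper name is just for illustration; 1,411 kbps is the exact rate for CD-quality 16-bit/44.1 kHz stereo PCM, which the post rounds to 1,400):

    # Rough back-of-the-envelope: how much data the decoder has to chew through.
    def data_per_minute_mb(bitrate_kbps):
        bits_per_second = bitrate_kbps * 1000
        bytes_per_minute = bits_per_second / 8 * 60
        return bytes_per_minute / 1_000_000

    print(data_per_minute_mb(320))    # ~2.4 MB per minute for 320 kbps MP3
    print(data_per_minute_mb(1411))   # ~10.6 MB per minute for CD-quality PCM

So the lossless stream pushes roughly four to five times as much data through the system every second.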
I would also like to mention that CPU cycles are not what uses the most power. Take my 1090T for example: it has 6 cores clocked at 4.2 GHz, and it is held at that clock constantly for stability. Idling, it uses around 30 W, and the voltage is also constant, so 4.2 GHz at 1.475 V. So what is the factor that actually makes a system consume power? CPU cycles are one thing, but not the main one. Once it is stressed to 100% load, the clock stays constant and the voltage stays constant, yet the power usage climbs to over 280 W, and that is down to the current being drawn.
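Put as a formula, that is just P = V x I: with the voltage fixed, the current is the only thing left to change. A rough sketch using the figures above (pretending the whole draw happens at the 1.475 V core rail, which it doesn't, since the PSU, board and the rest are in there too; it's only meant to show the relationship):

    # Power = voltage x current, so at constant voltage the current scales with power.
    VCORE = 1.475  # volts, held constant

    def current_amps(power_watts, voltage=VCORE):
        return power_watts / voltage

    print(current_amps(30))   # ~20 A at idle
    print(current_amps(280))  # ~190 A at 100% load: same clock, same voltage, far more current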
In computing, with all power-saving features disabled (so clocks and voltages are fixed), the only thing that will increase power usage is the current drawn.