Computer audio has one significant disadvantage: you have to use a computer.
If you are a PC wizard or like to deal with all the troubleshooting necessary to get things running smoothly, then be my guest. I like to listen to music. I press play on my old-fashioned disc player and enjoy 16/44 and some SACDs. If people feel the hi-res formats are the best thing since the invention of electricity, then again, be my guest.
I'm also into photography a little bit, and you have a similar phenomenon there: the megapixel and ISO race. More and more MP, flagships are around 50 MP with ISO up to 125,000, and no new camera has less than 20 MP or a top ISO below 12,500. And does it really matter? If you blow up the files big enough, yeah, you can prove there is a difference; as always, "moar is bettar" ... on the spec sheet. In real-world applications and viewing sizes, a little more than 10 MP gives excellent results. And yeah, you can take pictures in complete darkness, only choosing the subject and framing might be an issue.
Why have TVs grown to the size of half the living-room wall? Because on a regular 46-inch screen you won't recognize any difference. Of course 4K resolution makes it possible to identify the sort of bug on the blade of grass on a football field. Does it make the game any more interesting?
Back to audio ... there is no equivalent option of increasing the screen size. Just positioning the speakers farther apart is not really going to help.
And since this is Head-Fi, we're screwed anyway. With all the hair-splitting over formats, I just have to listen to some Mercury Living Presence recording, made with three microphones and 35 mm magnetic tape more than 50 years ago, and I realize that the ability of the recording engineer to listen to the space of the concert hall, pick the right microphones for a specific task, and position them in the best possible locations: THIS is what makes a fascinating recording and an artistic and audiophile document.
And btw, I think for a processing chip in a computer it doesn't really matter whether the multiplication or division is by 2.000000000, 4.000000000, 8.000000000 or by 1.89512345678. It's mostly humans who like round numbers, as our brains handle them more easily; about the only thing a chip notices is that powers of two come out exact in binary while arbitrary ratios have to be rounded. So yes, theoretically there are tiny differences far down in the last digits when you shovel the numbers back and forth. On an 80-inch screen you might see it, on a 46-inch, who cares? Oops, wrong medium again.
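To put a rough number on that "theoretically": in binary floating point, scaling by a power of two is just an exponent shift and round-trips exactly, while an arbitrary ratio gets rounded at every step. Here's a quick sketch in plain Python (the sample value and the 1.89512345678 ratio are just the illustrative numbers from above, not anything from a real player or resampler):

```python
# Rough sketch using plain Python floats (IEEE 754 doubles).

sample = 0.123456789  # a made-up sample value in the -1.0 .. 1.0 range

# Scaling by a power of two only shifts the binary exponent,
# so it comes back exactly.
print((sample * 2.0) / 2.0 == sample)   # True

# An arbitrary ratio must be rounded at each step, so a tiny error can remain.
roundtrip = (sample * 1.89512345678) / 1.89512345678
print(roundtrip == sample)              # often False
print(abs(roundtrip - sample))          # zero, or on the order of 1e-17

# For scale: the smallest step a 16-bit file can encode is about 3e-5,
# roughly a trillion times larger than the rounding error above.
print(1.0 / 2**15)                      # ~3.05e-05
```

Whatever error is left over sits a dozen digits below the quietest level a 16/44 file can even represent, which is rather the point of this whole rant.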