KillingTime
New Head-Fier
- Joined
- Oct 21, 2007
- Posts
- 15
- Likes
- 1
Hi,
Not sure if this is the right sub-forum, but it is to do with source components.
I have a large music collection, and all my CDs were becoming a hassle to manage. I thought I might get around this by ripping the whole lot to a hard drive and using a media player with a digital out to play my selections. I'd then use an outboard DAC to get the quality I was used to from my hi(ish)-end CD player.
I could have used a PC as the media player, but PCs need a lot more attention than dedicated players (they need shutting down properly, re-installing every so often, etc.), so I opted for a netmedia 101 media player. This is designed to play movies from its internal hard drive, but it has an SPDIF out and can play MP3s and WMAs. It's a low-end (£50) player, but that's really irrelevant here. The point is it has a digital audio out, so it's being used as a digital transport of sorts.
I swapped the CD player for the netmedia 101 and kept the rest of my system the same (amp, headphones, DAC, cables, etc.). There was a difference. It took me a while to pin down, but the 101 sounded less clear, and the soundstage was gone too.
Given that all I've changed is the digital transport, how can this be?
The whole point of SPDIF is that 1s and 0s are sent over the cable. If both transports are doing their job properly, then the DAC sees the same 1s and 0s from either, so the sound should be the same(?).
I looked at SPDIF & TOSLINK via the wiki, and it says:
"data is sent using Biphase mark code, which has either one or two transitions for every bit, allowing the original word clock to be extracted from the signal itself."
I assume this means (correct me if I'm wrong):
If the DAC uses the timing info in the SPDIF stream to reassemble the bits, then the quality of the sound will be dependent on the quality of the clock in the transport, because the timing info is embedded in the stream (?).
If the DAC buffers the data and then clocks it out using its own (more accurate) clock, then there's no problem.
I don't know enough about digital audio to know which of the above is correct, or whether both are, the answer depending on the implementation of the DAC.
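To get my head around the wiki quote, I sketched biphase mark coding myself. This is just my own toy illustration (not the full SPDIF frame format, which also has preambles and status bits): every bit cell starts with a transition, and a 1-bit adds a second transition mid-cell, which is why the receiver can pull a clock out of the signal itself.

```python
def bmc_encode(bits, start_level=0):
    """Biphase mark coding: the line level flips at the start of every
    bit cell, and flips again mid-cell for a 1. Returns two half-cell
    levels per bit, so there is a guaranteed edge at each cell boundary
    for the receiver to recover the clock from."""
    level = start_level
    halves = []
    for b in bits:
        level ^= 1              # transition at the start of every cell
        halves.append(level)
        if b:
            level ^= 1          # extra mid-cell transition encodes a 1
        halves.append(level)
    return halves

def bmc_decode(halves):
    """A cell whose two halves differ had a mid-cell transition: a 1."""
    return [int(halves[i] != halves[i + 1]) for i in range(0, len(halves), 2)]

bits = [1, 0, 1, 1, 0, 0, 1]
assert bmc_decode(bmc_encode(bits)) == bits
```

The decode step only compares the two halves of each cell, so the data survives regardless of jitter; what jitter corrupts is *when* each edge arrives, which only matters if the DAC derives its conversion clock from those edges.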
More accurate digital clocks are sold as upgrades for CD players, advertising a wider soundstage and cleaner sound as a result, so I'm assuming the stability of the timing signal also has a bearing on the quality of the sound that comes out of the same DAC.
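As a back-of-the-envelope check on why clock stability could matter at all, here's a toy calculation (my own numbers, not any specific DAC or transport): if a sample is converted at slightly the wrong instant, the timing error turns into an amplitude error, and the error grows with signal frequency because the waveform is changing faster.

```python
import math

fs = 44100.0        # sample rate
f = 10000.0         # a 10 kHz tone, where the waveform slews fastest
jitter = 2e-9       # 2 ns of clock timing error (illustrative figure)

# Ideal sample instants vs. instants shifted by the jitter amount.
ideal = [math.sin(2 * math.pi * f * n / fs) for n in range(100)]
late = [math.sin(2 * math.pi * f * (n / fs + jitter)) for n in range(100)]

# Worst-case amplitude error caused purely by the timing shift.
worst = max(abs(a - b) for a, b in zip(ideal, late))
print(f"worst-case amplitude error: {worst:.6g}")
```

This says nothing about audibility, only that a jittery conversion clock produces a nonzero amplitude error even when every bit arrives intact, which seems to be the mechanism the clock-upgrade adverts are appealing to.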
Anyone shed any light on this?
Thanks.