cerbie
1000+ Head-Fier
- Joined
- Mar 12, 2005
- Posts
- 1,219
- Likes
- 12
Quote:
Originally Posted by Solitary1
Damn, you hit the nail on the head.
It claims to do a great deal of good, but does not say what's wrong that it fixes, nor how it fixes it, in any useful detail. Also, it's not $50, or even $100... Thus: skepticism.
This is not a new claim. It's going to be treated like everything in the past that made a similar claim, until it is proven and someone figures out what it's doing (not necessarily their specific algorithms, but in general).
I'm not keen on OS X, hate iTunes, and probably don't have a future in pro audio, so I'll likely never hear it. I've gone through enough crap on different hardware and software platforms not to bash it out of hand, though [size=x-small](most of our common audio problems can be heard with anything better than ibuds or average PC speakers--and worse, they can't be unheard, once you learn to hear them)[/size]. I also find this an interesting topic of discussion, if going from practically lurking to a page-long post is any indication.
Quote:
Originally Posted by audioengr
1) jitter reduction of iTunes 2) avoidance of damaging code in Core Audio
1) What "jitter" does iTunes add? It takes data, decodes it, and sends it to a buffer well before it is needed. There's no place for jitter to be introduced. Now, if it can't always get the data out in time, then there may be other latency problems. But if that's the case, how can it do that and also succeed at providing bit-perfect output?
A paradox, have we?
2) I can certainly believe that there are hidden traps here and there within the OS' main audio subsystem. I've been using Windows and Linux, after all.
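Point 1 is easy to demonstrate with a toy model (Python; every name here is mine, and nothing in it is iTunes' actual code): a producer delivers decoded samples at wildly irregular times into a FIFO, while a consumer clocks out exactly one sample per tick. As long as the buffer never runs dry, the output is bit-identical to the source and its timing is set entirely by the consumer's clock, so the producer's scheduling jitter never reaches the output.

```python
from collections import deque
import random

random.seed(0)
source = list(range(1000))   # stand-in for the decoded samples, in order
buffer = deque()             # playback buffer between decoder and "DAC"
played, underruns = [], 0

idx = 0
for tick in range(1200):     # each tick = one consumer clock edge
    # Producer: delivers a burst of samples at irregular intervals,
    # like a decoder thread getting scheduled whenever the OS feels like it.
    if random.random() < 0.3 and idx < len(source):
        burst = random.randint(1, 8)
        buffer.extend(source[idx:idx + burst])
        idx += burst
    # Consumer: clocks out exactly one sample per tick, from its own clock.
    if buffer:
        played.append(buffer.popleft())
    elif idx < len(source):
        underruns += 1       # only here could timing actually break

# FIFO order means the output is always bit-identical to the source;
# timing only degrades if underruns > 0 (the buffer starved).
print(played == source[:len(played)], underruns)
```

The only failure mode in this model is an underrun, which is a dropout, not "jitter" in the clocking sense.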
Quote:
Without Amarra, converters that are affected by jitter sound fairly bad using iTunes, worse than the same converters used on a PC.
A-ha! A symptom! Now what is the cause, I wonder...
Also, how many of the affected devices implement a dedicated clock recovery stage, internally?
Are these devices being fed the same sample rate and bit-depth as before this software was in use?
If it's the same rate and bps, what happens if you record the same song without and with it, and negate one against the other in an audio editor (i.e., how close to a null do they get)?
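That null test is simple enough to sketch (Python with NumPy; the signals below are synthetic stand-ins, not real captures, and real recordings would need sample-accurate alignment and gain matching first): subtract one capture from the other and measure what survives. Anything left in the residual is what the software actually changed.

```python
import numpy as np

def null_test(a: np.ndarray, b: np.ndarray) -> float:
    """Trim to a common length, subtract, and return residual RMS
    in dB relative to full scale (1.0). -inf means a perfect null."""
    n = min(len(a), len(b))
    residual = a[:n].astype(np.float64) - b[:n].astype(np.float64)
    rms = np.sqrt(np.mean(residual ** 2))
    return float('-inf') if rms == 0 else 20 * np.log10(rms)

# Toy example: a 1 kHz tone "captured" twice, once with a tiny level change.
t = np.arange(48000) / 48000.0
without = np.sin(2 * np.pi * 1000 * t)
with_sw = 0.999 * without            # pretend the software altered gain by 0.1%

print(null_test(without, without))   # perfect null -> -inf
print(null_test(without, with_sw))   # residual at about -63 dB
```

Even a 0.1% level change leaves a residual around -63 dB, well above the floor of a 16-bit capture, so any real processing should be hard to miss this way.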
Quote:
With Amarra, [hardware alleged to show jitter symptoms with vanilla iTunes] sound better than on a PC.
Is iTunes (or some deep OS X subsystem) not sending out the stream properly, leaving you with occasional gaps of a few samples here and there (which could turn into jitter and varying perceived pitch?), or... what? Is the difference tied to any particular kind of interface*? I.e., what kind of "jitter"? Jitter is awfully broad, and it clearly can't refer to the hardware interface signals here (not as a cause, anyway).
Or, is it something entirely different, like a light psychoacoustic filter that masks what you are calling effects of jitter?
[size=x-small]* For instance, "driver-free" USB uses a hideous clock and data transfer mechanism, and we commonly use software to deal with it (kernel streaming, ASIO, JACK, OSS4, real-time kernels... dunno what you Apple guys do). Networking is often under similar constraints (wireless from your PC is under basically the same ones). Even with non-USB, our OSes have gone so far from the ideals of Amiga and BeOS (instant user I/O = godliness) that such measures can be needed even for CPU-misers, like FireWire- and PCI-based devices, when the wrong circumstances arise.[/size]
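On the "light psychoacoustic filter" hypothesis: that would be trivial DSP, and it would show up immediately in the kind of null test I described. A sketch of what I mean (Python; the cutoff, boost amount, and function name are all invented for illustration): a gentle one-pole high shelf that nudges the top octaves up by under a dB, which many listeners would describe as "cleaner" or "more detailed".

```python
import math

def subtle_high_shelf(x, cutoff_hz, fs, boost=0.1):
    """Pass the signal through unchanged, plus a small fraction of its
    high-frequency content. boost=0.1 is roughly +0.7 dB at the top."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / fs)
    lp = 0.0
    out = []
    for s in x:
        lp += a * (s - lp)                 # one-pole lowpass state
        out.append(s + boost * (s - lp))   # add back a little of the highs
    return out

fs = 48000
tone = [math.sin(2 * math.pi * 10000 * n / fs) for n in range(fs // 10)]
shelved = subtle_high_shelf(tone, 2000, fs)
# The 10 kHz tone comes out slightly louder than it went in.
```

A stage like this is far cheaper and far more plausible than anything that could touch interface clocking from user space.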