Schiit Happened: The Story of the World's Most Improbable Start-Up
Sep 20, 2022 at 7:11 PM Post #100,051 of 109,972

brooksgraham

New Head-Fier
Joined
Oct 4, 2019
Posts
24
Likes
122
Location
Portlandia
Just got back from a short trip and I thought I'd see if Roon ARC was as buggy as some predict.

I'm actually quite impressed. I started streaming while in my garage and verified that it was using my local WiFi for data, then I drove away to see it fail when my WiFi was no longer accessible. It didn't fail. It transitioned over to LTE without a (literal) missed beat. Impressive. Especially considering that it's a 1.0.0 release.

All this while playing seamlessly via my MX-5's built-in infotainment system (wired USB). I've verified that Roon ARC doesn't support CarPlay, but as I recall, neither did the Qobuz client initially. (Memory is fuzzy on that, however.) When in the car, I tend to start an album or playlist and leave it alone anyway. CarPlay control will definitely be a nice-to-have, but it's not a show-stopper for me right now. (And don't get me started on how deficient the Qobuz CarPlay experience is.)

My biggest complaint so far is that running the Roon ARC app on an iPad resizes the UI to iPhone dimensions - as in, it's written for iPhone screens only. I hope that gets fixed in version 1.0.1. :wink:

So in summary, I'm not only cautiously optimistic, I'm optimistically optimistic.
 
Sep 20, 2022 at 8:43 PM Post #100,053 of 109,972

ArmchairPhilosopher

1000+ Head-Fier
Joined
Jul 24, 2021
Posts
1,127
Likes
10,772
Location
California
A bit misleading; Microsoft's and Xamarin's CLR libraries themselves are not native to any given OS, but the runtimes are. We could say the same thing about PHP, Swift, the JRE, or even Flash

https://learn.microsoft.com/en-us/dotnet/maui/what-is-maui
Fair enough. We are probably applying different interpretations of what should be considered native.

The .NET runtime for every platform but Windows is Mono. Mono is an abstraction layer that sits between your code and the runtime environment the respective operating system provides to the application.

I don't consider an abstraction layer that isn't actually necessary to be "native."

Here's why:
As far as I am aware, in the case of iOS' implementation of Mono, that abstraction layer is essentially a massive pile of wrapper classes and functions. Some of them simply forward function calls and their parameters to the OS' runtime environment and return the return values unchanged; others have to do additional work to translate given data into different formats/encodings before it can be forwarded or returned. In the case of the pure wrapper classes and functions, that's at best a nuisance, wasting a few CPU cycles for each call and a megabyte or five of extra memory for the additional library code — especially because they can't be optimized away by the compiler, since Mono ships as a precompiled library. In cases where given data needs to be translated first, however, it's outright wasteful every way you look at it.
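To make the overhead concrete, here's a minimal sketch of the kind of pure forwarding wrapper described above, written in C++ purely for illustration (all names are hypothetical; the real Mono wrappers are C# over Objective-C):

```cpp
#include <cstdint>

// Hypothetical stand-in for a function in the precompiled OS runtime.
namespace os_runtime {
    int64_t file_size(int64_t handle) { return handle * 512; } // dummy body for the sketch
}

// A pure forwarding wrapper: no logic of its own, just an extra call
// frame per invocation. Because a layer like this ships as a separate
// precompiled library, the compiler can't inline it away at the call site.
namespace wrapper_layer {
    int64_t FileSize(int64_t handle) {
        return os_runtime::file_size(handle); // forward unchanged
    }
}
```

Multiply that per-call cost, plus the wrapper code itself, across an entire API surface, and you arrive at the payload and CPU overhead described above.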

On every computing platform except Windows PCs, but especially on mobile devices like phones and tablets, performance and energy consumption are absolutely crucial for a great user experience. As a software developer, you are obligated to respect your users and their devices. Don't waste their devices' bandwidth and persistent storage with unnecessarily large app payloads. Don't waste their devices' RAM with unnecessary abstraction-layer libraries. Don't waste their devices' battery life with tons of unnecessary CPU cycles and memory allocations. And maybe most crucially: don't waste their time by slowing your app down unnecessarily.

So it simply doesn't matter whether your C# code gets compiled ahead of time into native ARM machine code, JIT-compiled, or merely interpreted as byte code, and it simply doesn't matter whether the Mono runtime library itself is native ARM machine code or not. It's still a massive waste of memory and performance.

And don't get me started on the memory and performance benefits of ARC over C#'s garbage collection nonsense.
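For readers unfamiliar with the distinction: ARC (automatic reference counting) releases an object deterministically the instant its last reference goes away, while a tracing garbage collector reclaims it at some later, unpredictable collection pass. A sketch of the counting side, using C++'s `shared_ptr` as a stand-in for ARC semantics (names hypothetical):

```cpp
#include <memory>

// Counts how many "buffers" are currently alive, so we can observe
// exactly when deallocation happens.
static int live_buffers = 0;

struct Buffer {
    Buffer()  { ++live_buffers; }
    ~Buffer() { --live_buffers; }
};

// With reference counting, destruction is deterministic: the Buffer
// dies the instant the last reference releases it -- no collector
// pause, no unpredictable delay as a tracing GC would introduce.
int buffers_alive_after_scope() {
    {
        auto a = std::make_shared<Buffer>();
        auto b = a;          // refcount is now 2
    }                        // refcount hits 0 here; freed immediately
    return live_buffers;     // already 0: memory reclaimed at scope exit
}
```

That deterministic release is why reference counting tends to keep peak memory lower: a GC, by contrast, lets garbage accumulate between collection cycles.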

But from where I'm sitting, that's not even the worst part. The worst part is the use of cross-platform UI frameworks like Xamarin.

I mentioned above that it's your obligation to—among other things—not waste a user's time. Meeting that obligation doesn't just mean that your app should run as performantly as possible. It also—and much more crucially—means that you should not force users to learn and navigate a non-native user interface.

You want to ship an Android app? Then make it look, feel, and behave like an Android app. Wanna ship something for iOS? Make sure it looks, feels, and behaves like an iOS app. Windows? macOS? Linux? Same thing.

No, that doesn't mean that every app needs to look alike within each platform's environment. There's plenty of room for creativity and ample opportunity to set your branding apart from all the others out there.
But what this DOES mean is that gestures should always work the same and lead to the same results. It means that a scroll container should look and feel the same in your app as it does natively in every other app. It means that your app should be true to each platform's (view) navigation concept and structure. View modality should behave the same. Keyboard shortcuts. Menu placements and labeling. Etc. etc. etc.

Xamarin et al. make this utterly impossible. And they do so by design, because the whole point of these frameworks is to enable the developer to code once and deploy everywhere. That can only really be accomplished by reimplementing an entire UI framework that's independent of each targeted platform's native UI framework. And not only are those reimplementations yet another massive waste of payload space and RAM, they're also not optimized for the platform they're supposed to run on, and they don't feel "at home" there, either.
Most banal example? Open this new Roon ARC app on your iPhone. Tap on the settings icon, an album cover, anything, really, that takes you to a new screen. See that transition animation? It's much too abrupt. The animation duration is a tad too short, and the animation curve that is applied has an entirely different "ramp" than the kind of damping that Apple's native navigation controller uses. That makes the transition appear hectic and robotic, much less organic than Apple's native transition animation. And the next screen doesn't slide over the previous one as an opaque card, as is standard on iOS, but with an entirely transparent background, sliding over the existing content while that content fades into the background. The result is that, for the duration of the transition, your eye and brain have to process one single screen of intermingled text and images instead of the usual, visually clearly separated cards.
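To illustrate the "ramp" point (with made-up functions, not Apple's actual curve, which is a damped spring): the shape of the easing function, not just the duration, determines how organic a transition feels. A linear ramp moves at constant speed and stops dead; a smooth ease-in-out accelerates and decelerates gently:

```cpp
// Illustrative easing functions mapping normalized time t in [0, 1]
// to normalized progress. Purely a sketch; not Apple's transition curve.
double linear_ease(double t) { return t; }                       // constant speed, abrupt stop
double smooth_ease(double t) { return t * t * (3.0 - 2.0 * t); } // gentle start and stop
```

At t = 0.25 the smooth curve has covered noticeably less distance than the linear one, which is exactly the gentle wind-up that makes a native transition feel organic rather than robotic.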

This—and countless other issues I could list—creates a user experience that feels out of place on the platform. Every deviation from a platform's widely established UI/UX norms creates a form of cognitive dissonance for the user. Muscle memory no longer applies. Known gestures lead to unexpected results. The brain has to invest more effort to locate information and functionality. All this creates frustration, subconscious at best, conscious at worst. The user will connect that first impression and the resulting frustration with your brand, and you'll spend a lot of time and money down the road dealing with the results.

This can easily be avoided by using frameworks that are native to the platform you want to deploy to. But this of course also means that you can no longer "develop once, deploy everywhere," negating the entire raison d'être for these frameworks.

If you care for your own brand, you want a quality product. A quality product means that you'll have to go purely native. That will cost extra time and money. But from the user's perspective, and that of your business' bottom line down the road, the result will speak for itself.
The alternative is to save time and money upfront by going "develop once, deploy everywhere," but with lasting negative impacts for your product, your brand, and user loyalty.

Yes, I obviously feel very passionate about this issue. So it may not be immediately clear that I also think everybody should decide for themselves what technologies to base their products on. Although I would argue quite vehemently that one will be hard-pressed to find good objective arguments for it, there are plenty of subjectively valid reasons to use Xamarin and similar frameworks as the basis for a product. To each their own; use whatever you think works best for you.

But native it is not, and a great user experience it does not make.
Good enough for some businesses?
Apparently.
But not as good as it could be; and that was the entire point I wanted to make with the original post.



As an aside:
You compared Swift to .NET, PHP, the JRE, and Flash.
That honestly surprised me quite a bit, because Swift is just a language, while the others are frameworks or runtimes.

When Swift was first introduced, a lot of folks in the industry misinterpreted Apple's presentation and assumed that Swift would come with a separate framework sitting on top of the Objective-C runtime environment. Kinda like C# is the language you use to write code against .NET, which in turn sits on top of Mono, which in turn sits on top of the Objective-C frameworks. (At least in the case of Mono for macOS and iOS.)

That is not the case. Swift is merely a language.

Wherever code that's written in Swift needs to work together with code that's written in Objective-C—and that includes precompiled libraries—bridging headers provide LLVM with the necessary means to interpret how it has to handle each side's respective calls. It works similarly with mixed codebases that are partly written in C or C++ and partly in Objective-C, except that there, bridging headers aren't necessary because those languages already come with header files. Swift doesn't use headers, so they have to be supplied by the developer when needed.
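The mechanism is analogous to how C++ calls into C code: a declaration gives the compiler the callee's exact signature, so it can type-check the call and emit it directly, with no wrapper layer at run time. A self-contained sketch with hypothetical names (in a real project the declaration would come from a header and the definition from a separately compiled C file):

```cpp
// The declaration plays the role of the header: it gives the compiler
// the exact signature so the call can be type-checked and emitted
// directly, with no runtime wrapper in between.
extern "C" int add_samples(int a, int b);

// Normally this definition would live in a separate, C-compiled file;
// it's included here only to keep the sketch self-contained.
extern "C" int add_samples(int a, int b) { return a + b; }

int call_through_declaration() {
    return add_samples(20, 22); // direct call, resolved at link time
}
```

A bridging header does for Swift what that declaration does here: it describes the foreign symbols so the compiler can call them directly.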

In the case of macOS and iOS applications, LLVM compiles Swift into CPU instructions for each and every CPU that the application should support. Yes, that means that it actually produces different binaries for different iPhones and Macs, even though they technically share the same operating systems. This machine code can optionally be unified into a single binary, usually for debug and ad hoc builds, or stripped into separate binaries that then get deployed to the respective target devices through the App Store's app bundling process. When you download an app for your iPhone 14, it gets a different binary than an iPhone 14 Pro will get, which gets a different binary than an iPhone 12, and so forth.
That's as close to the actual hardware as you can get. LLVM is even able to optimize the instructions better than you could ever do by hand in assembler. The only things that are precompiled are the system frameworks like Foundation, UIKit, AVKit, StoreKit, CloudKit, etc. — and those are all precompiled into the same CPU-specific binaries as the application eventually will be.
There are no wrappers, no additional abstraction layers whatsoever.

So I'm not sure why exactly you mentioned Swift alongside these frameworks, but I did find it a bit surprising.
 
Last edited:
Sep 20, 2022 at 9:34 PM Post #100,055 of 109,972

rfernand

100+ Head-Fier
Joined
Oct 20, 2019
Posts
266
Likes
1,046
Location
Kirkland, WA
Just got back from a short trip and I thought I'd see if Roon ARC was as buggy as some predict.

I was impressed at how it “just worked”.

It didn't fail.

Pretty solid user experience for sure.

Roon ARC doesn't support CarPlay, but as I recall, neither did the Qobuz client initially.

I’d argue Qobuz doesn’t support CarPlay yet — their app just sucks there :) At least the “Now Playing” built-in worked as expected with ARC.

My biggest complaint so far is that running the Roon ARC app on an iPad resizes the UI to iPhone dimensions - as in, it's written for iPhone screens only. I hope that gets fixed in version 1.0.1. :wink:

Yeah, I hope a true iPad app comes soon.

For the curious: it seems Roon ARC is doing on-the-fly re-encoding to Opus - so it is lossy. But who cares, my library travels with me :-D
 
Sep 20, 2022 at 9:48 PM Post #100,056 of 109,972

bcowen

Headphoneus Supremus
Joined
Jan 3, 2018
Posts
12,958
Likes
40,797
Location
North Carolina
Sep 20, 2022 at 10:43 PM Post #100,058 of 109,972

brooksgraham

New Head-Fier
Joined
Oct 4, 2019
Posts
24
Likes
122
Location
Portlandia
For the curious: it seems Roon ARC is doing on-the-fly re-encoding to Opus - so it is lossy. But who cares, my library travels with me :-D

I found that in the settings you have control over how it handles WiFi vs. cellular data in that regard. If you have an unlimited data plan, you can set it to stream the original file - i.e., lossless.
 
Sep 21, 2022 at 12:08 AM Post #100,059 of 109,972

earnmyturns

Headphoneus Supremus
Joined
Aug 1, 2015
Posts
2,506
Likes
5,855
Location
SF Bay Area
Yep but REL said the best location was behind the speakers so I compromised. I have two identical subs. The Sonus Fabers have the bass reflex openings toward the center at this time but I may point those outward later on just to see how that sounds.
When I had 2 of those subs (with KEF Ref 1s), I worked out a wavelength-based placement with the subs against the back wall and the speakers quite a bit forward from the back wall (that particular room called for it). I was very happy with that setup, but then we moved to this glasshouse and that gear gave way to Linn space optimization; it was just too hard to get the old gear to sound good here (house aesthetics fighting audio performance).
 
Sep 21, 2022 at 12:17 AM Post #100,060 of 109,972

earnmyturns

Headphoneus Supremus
Joined
Aug 1, 2015
Posts
2,506
Likes
5,855
Location
SF Bay Area
I know how much better something like Roon could be done, for considerably less money than they ask for. And that just angers me.)
Hey, what's stopping you? I'd sign up for alpha testing :ksc75smile:
 
Last edited:
Sep 21, 2022 at 12:22 AM Post #100,061 of 109,972

earnmyturns

Headphoneus Supremus
Joined
Aug 1, 2015
Posts
2,506
Likes
5,855
Location
SF Bay Area
Sep 21, 2022 at 3:43 AM Post #100,063 of 109,972

Plautus001

1000+ Head-Fier
Joined
Sep 27, 2019
Posts
1,238
Likes
5,696
Location
Vancouver, BC
Sep 21, 2022 at 4:50 AM Post #100,064 of 109,972

tin-ear

100+ Head-Fier
Joined
Feb 19, 2018
Posts
229
Likes
919
Location
South Pasadena, CA


Seems Corpus Christi isn't getting any traffic relief any time soon, and steel will take longer to go from mill to the assembly shop.

Thank you! The YouTube channel that video is on describes itself as:
"Practical Engineering is all about infrastructure and the human-made world around us. It is hosted, written, and produced by civil engineer Grady Hillhouse."
I have been watching his videos for years. Recommended - at least give some a look.
 
Last edited:
Sep 21, 2022 at 5:44 AM Post #100,065 of 109,972

Derrick Swart

1000+ Head-Fier
Joined
Jul 16, 2015
Posts
1,420
Likes
2,319
Location
Sintra, Portugal
Interesting. While I like to keep the audio chain as simple as possible, that device might certainly have merit with the digital signal coming from my computer. I have my subwoofers tuned in and I am very happy with my speaker adjustments. Within a few days, friends are going to analyze my speaker setup and we will most likely end up in a friendly argument. And so it goes. :ksc75smile:
Just to share: I used the analog input on the miniDSP to keep the 'extra' juice the Freya S gives me in the x4 gain setting. I could use the digital input, but I never liked the volume control on my DAC and/or computer, so I did not even try it.
 
