「Official」Asian Anime, Manga, and Music Lounge
Jul 7, 2017 at 1:11 AM Post #173,896 of 177,750
I mean I'd rather have 3rd party support than no support on the Apple side of things after 2-3 years depending on the pace of iOS.

2-3 years?

The iPhone 5S was released in September 2013, and it will be getting iOS 11, which means it'll receive official updates until iOS 12 arrives in late 2018. That's five years of official updates, and for all we know it might get iOS 12 too. (iOS 11 is moving to 64-bit-only apps, and the 5S was the first smartphone with a 64-bit processor, so at least there is some "technical" reason why the older iPhone 5 isn't supported.)
 
Jul 7, 2017 at 3:49 AM Post #173,897 of 177,750
...
 
Jul 7, 2017 at 8:22 AM Post #173,898 of 177,750


2-3 years?

The iPhone 5S was released in September 2013, and it will be getting iOS 11, which means it'll receive official updates until iOS 12 arrives in late 2018. That's five years of official updates, and for all we know it might get iOS 12 too. (iOS 11 is moving to 64-bit-only apps, and the 5S was the first smartphone with a 64-bit processor, so at least there is some "technical" reason why the older iPhone 5 isn't supported.)
Nowhere to go but up. It's gonna take a while for Android OEMs to implement desktop-class hardware in phones.
 
Jul 7, 2017 at 9:39 AM Post #173,899 of 177,750
Camera performance "suffers" if the phone relies really heavily on post-processing to clean everything up. Capable cameras tend to have no issues, especially since manual control is quite easy to access.
Post-processing is more important than you think. It's where raw image data from the sensor is converted/compressed into the easy-to-handle JPEG format. The difference in post-processing tech/algorithms is why iPhone, Samsung, and Lumia cameras that use Sony sensors can take consistently better photos than actual Sony phones with nearly double the megapixel count. Oh, and Sony's default "Superior Auto" mode topping out at 8 MP also helps. I bet 95% of users don't even know their 20+ MP Sony phone camera is taking photos at a measly 8 MP.
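
Roughly, the raw-to-JPEG idea looks something like this toy sketch (purely illustrative; a real phone ISP does demosaicing, lens correction, multi-frame noise reduction, etc. in dedicated hardware, and these numbers are just assumptions):

```python
# Toy raw -> JPEG post-processing sketch (illustrative only; skips
# demosaicing and everything else a real phone ISP actually does).
import numpy as np
from PIL import Image

def process_raw(raw, black_level=64, white_level=4095, gamma=2.2):
    # Normalize 12-bit sensor counts to 0..1 linear light
    linear = np.clip((raw.astype(np.float32) - black_level)
                     / (white_level - black_level), 0.0, 1.0)
    # Very crude denoise: 3x3 box blur (real pipelines are far smarter)
    padded = np.pad(linear, 1, mode="edge")
    denoised = sum(padded[dy:dy + linear.shape[0], dx:dx + linear.shape[1]]
                   for dy in range(3) for dx in range(3)) / 9.0
    # Gamma / tone curve so mid-tones don't look murky
    toned = denoised ** (1.0 / gamma)
    return (toned * 255).astype(np.uint8)

# Fake a noisy 12-bit sensor readout and write the "finished" JPEG
raw = np.random.randint(0, 4096, (480, 640), dtype=np.uint16)
Image.fromarray(process_raw(raw), mode="L").save("out.jpg", quality=90)
```

The quality gap between phones comes almost entirely from how much smarter the denoise/tone-mapping steps are than this.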

Stock ROMs contain all the proprietary drivers and blobs that let the camera in the phone perform to its full potential; custom ROMs generally don't have any of that. I've had first-hand experience of this with my OPO on LineageOS: SultanXDA's LineageOS build uses the OxygenOS camera binaries, which greatly improved photo quality, especially in low light.

Manual controls in a smartphone camera are just an extra, unnecessary layer of complexity that shouldn't be needed to take good photos. They're in the same league as having to root your phone just to keep random wakelocks at bay for decent standby battery life.

Edit: on this topic, I can't wait to see HMD Nokia phones with Carl Zeiss optics! https://www.engadget.com/2017/07/06/nokia-phones-return-to-using-zeiss-camera-tech/

On another side note, I love the hardware build of Symbian and Lumia-era Nokias. They just ooze quality. Back in the Symbian days, Nokia phones with stainless steel parts were not uncommon, and Lumia plastic was gorgeous. And that thick, green and gold PCB! HMD Nokia mainboards look kinda flimsy in comparison (although I've never worked with one in person before).
 
Jul 7, 2017 at 11:24 AM Post #173,900 of 177,750
2-3 years?

The iPhone 5S was released in September 2013, and it will be getting iOS 11, which means it'll receive official updates until iOS 12 arrives in late 2018. That's five years of official updates, and for all we know it might get iOS 12 too. (iOS 11 is moving to 64-bit-only apps, and the 5S was the first smartphone with a 64-bit processor, so at least there is some "technical" reason why the older iPhone 5 isn't supported.)

How fast is it still running? My last experience was with a 4S on iOS 9 and it was a pretty garbage experience. We'll see how quickly Apple pushes their microarchitectures, since that ultimately determines how long the device stays usable, not just how long it's officially supported.

Post-processing is more important than you think. It's where raw image data from the sensor is converted/compressed into the easy-to-handle JPEG format. The difference in post-processing tech/algorithms is why iPhone, Samsung, and Lumia cameras that use Sony sensors can take consistently better photos than actual Sony phones with nearly double the megapixel count. Oh, and Sony's default "Superior Auto" mode topping out at 8 MP also helps. I bet 95% of users don't even know their 20+ MP Sony phone camera is taking photos at a measly 8 MP.

Stock ROMs contain all the proprietary drivers and blobs that let the camera in the phone perform to its full potential; custom ROMs generally don't have any of that. I've had first-hand experience of this with my OPO on LineageOS: SultanXDA's LineageOS build uses the OxygenOS camera binaries, which greatly improved photo quality, especially in low light.

Manual controls in a smartphone camera are just an extra, unnecessary layer of complexity that shouldn't be needed to take good photos. They're in the same league as having to root your phone just to keep random wakelocks at bay for decent standby battery life.

Edit: on this topic, I can't wait to see HMD Nokia phones with Carl Zeiss optics! https://www.engadget.com/2017/07/06/nokia-phones-return-to-using-zeiss-camera-tech/

On another side note, I love the hardware build of Symbian and Lumia-era Nokias. They just ooze quality. Back in the Symbian days, Nokia phones with stainless steel parts were not uncommon, and Lumia plastic was gorgeous. And that thick, green and gold PCB! HMD Nokia mainboards look kinda flimsy in comparison (although I've never worked with one in person before).

Oh no, I'm quite aware that post-processing can make a large difference. That's already apparent in even expensive cameras, and its impact is amplified on small-sensor cameras since differences in noise reduction are very apparent at that scale. The problem happens mostly in low light, where these cameras are already abysmal; not having the proprietary algorithms only raises the minimum flux/exposure at which the camera is useful. In daylight it isn't nearly as apparent, if it's even noticeable at all.

I'm personally not a fan of manual controls, but auto on smartphones isn't that great either. It's effectively just a programmed mode since there's not much in the way of aperture or ISO to adjust. As far as I'm aware, DoF adjustment basically doesn't exist outside of simulated versions, since basically everything will be in focus with a small sensor and low-radius-of-curvature lenses.
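
You can see why from the hyperfocal distance; here's a quick back-of-envelope check (rough, assumed numbers for a typical small phone module versus a full-frame camera, not specs of any particular device):

```python
# Why small-sensor phone cameras have near-infinite depth of field.
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance: focus here and everything from half this
    distance to infinity is acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# Assumed values: ~4.2 mm f/2 phone lens (tiny circle of confusion)
# vs a 50 mm f/2 lens on full frame.
phone = hyperfocal_mm(focal_mm=4.2, f_number=2.0, coc_mm=0.006)
fullframe = hyperfocal_mm(focal_mm=50.0, f_number=2.0, coc_mm=0.030)

print(f"Phone:      ~{phone / 1000:.1f} m")      # ~1.5 m -> basically everything in focus
print(f"Full frame: ~{fullframe / 1000:.1f} m")  # ~41.7 m -> real background blur possible
```

With a hyperfocal distance of about a metre and a half, there's just nothing for a manual focus/DoF control to do.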
 
Jul 7, 2017 at 12:10 PM Post #173,901 of 177,750
How fast is it still running? My last experience was with a 4S on iOS 9 and it was a pretty garbage experience.

Yeah I totally agree it's garbage, but let's keep in mind that the 5S is 4 times faster than the 4S.

When's the last time we've seen a 4-fold increase in performance in CPUs? I think the first dual-core Athlon X2s are just about 4 times slower than our wimpy 15 W dual-core Kaby Lake parts, and it took a whole decade to get here, which is an eternity in computing.
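
For perspective, that works out to a pretty modest yearly gain (toy math, assuming roughly a decade between the first Athlon X2s and 15 W Kaby Lake):

```python
# Rough math on what "4x in a decade" means per year (illustrative only).
speedup, years = 4.0, 10
per_year = speedup ** (1 / years) - 1
print(f"~{per_year:.1%} average improvement per year")  # ~14.9%
```

Meanwhile the 4S to 5S jump was about two years.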

I was at the hardware store earlier and took a look at the lighting section (as you do, being a flashaholic and all). It seems quality LED bulbs have become very affordable now: 10.5 W, CRI > 80 @ 6500 K for only $12 or so. Picked one up and it's so much better than the CFL energy-saver bulbs I've been using for the past decade.

I'm sick of the "warm white 3300 K" coloured stuff, but that's all I had in stock at home because there was an energy-saving initiative some time ago and the government gave every household some bulbs to use.
 
Jul 7, 2017 at 12:52 PM Post #173,902 of 177,750
Yeah I totally agree it's garbage, but let's keep in mind that the 5S is 4 times faster than the 4S.

When's the last time we've seen a 4-fold increase in performance in CPUs? I think the first dual-core Athlon X2s are just about 4 times slower than our wimpy 15 W dual-core Kaby Lake parts, and it took a whole decade to get here, which is an eternity in computing.

I was at the hardware store earlier and took a look at the lighting section (as you do, being a flashaholic and all). It seems quality LED bulbs have become very affordable now: 10.5 W, CRI > 80 @ 6500 K for only $12 or so. Picked one up and it's so much better than the CFL energy-saver bulbs I've been using for the past decade.

I'm sick of the "warm white 3300 K" coloured stuff, but that's all I had in stock at home because there was an energy-saving initiative some time ago and the government gave every household some bulbs to use.

I'll see if I can play with a 5S. I do still think the lack of fragmentation hurts app support lifespan though, which is always an annoyance.

4x0 is still 0. Jk, but that's a very different class of processor. RISC architectures benefit more visibly from clock speed and transistor count increases and are easier to implement than CISC architectures, which are ridiculously expensive and complex. CISC's speed is dictated by its slowest instructions; RISC only relies on shorter, simpler ones. Both are still basically waiting on process node shrinks, since you're only going to get so much performance at such a low power draw for mobile devices. I'm quite sure RISC is technically more dependent on node size than CISC, just because of how well it scales with more transistors, so RISC is probably going to hit that wall very soon, but whatever node shrink it gets, it'll still see bigger performance increases than CISC.

I also still kind of hate how Apple does SoC design. They're very efficient (same with iOS), but I don't see dual-core designs (the A10 uses a big.LITTLE-style design with 2 high-performance and 2 low-power cores, and we're not sure if they can all be used together) aging well, mostly because we're moving towards parallelism. It gets even worse with AI and more rigorous camera software.

I remember Cree LED lightbulbs being very popular for being affordable and good quality. I think they're the company that did the lighting for the Water Cube in Beijing for the 2008 Olympics?

http://engt.co/2uvKsbP

Isn't this kind of a big target for antitrust? I wouldn't be surprised to see Qualcomm sue MediaTek so they end up like Samsung with Exynos in the US. Unless Intel still somehow counts as an SoC competitor.
 
Jul 7, 2017 at 2:35 PM Post #173,904 of 177,750
I also still kind of hate how Apple does SoC design. They're very efficient (same with iOS), but I don't see dual-core designs (the A10 uses a big.LITTLE-style design with 2 high-performance and 2 low-power cores, and we're not sure if they can all be used together) aging well, mostly because we're moving towards parallelism. It gets even worse with AI and more rigorous camera software.

Isn't this kind of a big target for antitrust? I wouldn't be surprised to see Qualcomm sue MediaTek so they end up like Samsung with Exynos in the US. Unless Intel still somehow counts as an SoC competitor.

AFAIK it's similar to big.LITTLE designs: it's totally transparent to app developers, and we're not able to run both the high-power and low-power cores at the same time; the scheduler(?) takes care of that stuff.
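
Conceptually it's something like this toy cluster-switching model (purely illustrative, with made-up task names and a made-up threshold; the real decision lives in the kernel/SoC performance controller and apps never see any of it):

```python
# Toy model of cluster-switching big.LITTLE-style scheduling.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    load: float  # 0.0 (idle-ish) .. 1.0 (flat out)

def pick_cluster(tasks, threshold=0.4):
    """Only one cluster runs at a time: switch to the big cores when any
    runnable task is demanding enough, otherwise stay on the little cores."""
    demanding = any(t.load > threshold for t in tasks)
    return "big (high-performance)" if demanding else "LITTLE (low-power)"

print(pick_cluster([Task("mail sync", 0.1), Task("music", 0.2)]))    # LITTLE
print(pick_cluster([Task("game render", 0.9), Task("music", 0.2)]))  # big
```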

You know, it's interesting you bring up the 2-fast vs 4-slower-core argument; it's kind of similar to the old CPU days where people had to pick between the Intel 2500K and the FX-8320, right? We've seen that the 2500K holds up extremely well today. I think it's a similar story today with the A10 vs the 835: the A10 is only about 15% slower in multicore benchmarks but a whopping 60% faster in single-core performance. 15% might as well be a rounding error at this point; it's not significant enough to say it'll age worse because of that, but I think the 60% extra single-threaded performance takes the cake.
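
If you plug those rough numbers into a quick Amdahl-style estimate (treating them as relative speeds, which is a big simplification and an assumption on my part), the fast-single-core chip wins unless the workload is heavily parallel:

```python
# Toy Amdahl-style estimate of "fewer fast cores vs more slow cores".
# Chip A: 1.6x single-thread speed, 0.85x fully-parallel throughput vs chip B.
def run_time(parallel_fraction, single_speed, multi_speed):
    serial = 1.0 - parallel_fraction
    return serial / single_speed + parallel_fraction / multi_speed

for p in (0.3, 0.5, 0.7, 0.9):
    a = run_time(p, single_speed=1.6, multi_speed=0.85)
    b = run_time(p, single_speed=1.0, multi_speed=1.0)
    winner, margin = ("A (fast single-core)", b / a) if a < b else ("B (more cores)", a / b)
    print(f"{p:.0%} parallel work -> {winner} wins by {margin:.2f}x")
# Roughly: unless ~70%+ of the work is parallel, the single-core advantage wins.
```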

Besides, if they ever need the extra performance, they can shove another core in like on the A10X; that thing is a force of nature, and I wonder if the A11 will be a triple-core design too. What's interesting is that the A10X was built on a 10nm process, and the A11 will be too, so it's anyone's guess how they're gonna play that one out. Perhaps still a dual-core design with improved IPC, but with a massive GPU shoved into the space savings to feed the possible 120 Hz display? Or maybe they'll move to triple-core as well, because they seem to be going whole hog on the AR thing and might need the horsepower.

Speaking of lawsuits, was Qualcomm or someone else working on implementing the x86-64 instruction set on a mobile ARM processor or something? Intel was trying real hard not to let it happen. It'd be really neat if we could one day just use our phones as a portable desktop workstation of sorts, kind of like what Samsung did with the S8, where you could plug in a monitor/keyboard/mouse and work away... sort of. XD

The anime about the otherworld restaurant was nice; I like the ED. Violet Evergarden ep 1 was screened at some expo... wonder if there'll be a broadcast somewhere before it airs next year.
 
Jul 7, 2017 at 4:16 PM Post #173,905 of 177,750
Speaking of lawsuits, was Qualcomm or someone else working on implementing the x86-64 instruction set on a mobile ARM processor or something? Intel was trying real hard not to let it happen. It'd be really neat if we could one day just use our phones as a portable desktop workstation of sorts, kind of like what Samsung did with the S8, where you could plug in a monitor/keyboard/mouse and work away... sort of. XD

Curious how that will turn out. When I first read about it at the beginning of the year on a Chinese news site, I thought it was either made up to grab attention or going to turn into a war. :p
 
Jul 7, 2017 at 6:05 PM Post #173,906 of 177,750
AFAIK it's similar to big.LITTLE designs: it's totally transparent to app developers, and we're not able to run both the high-power and low-power cores at the same time; the scheduler(?) takes care of that stuff.

You know, it's interesting you bring up the 2-fast vs 4-slower-core argument; it's kind of similar to the old CPU days where people had to pick between the Intel 2500K and the FX-8320, right? We've seen that the 2500K holds up extremely well today. I think it's a similar story today with the A10 vs the 835: the A10 is only about 15% slower in multicore benchmarks but a whopping 60% faster in single-core performance. 15% might as well be a rounding error at this point; it's not significant enough to say it'll age worse because of that, but I think the 60% extra single-threaded performance takes the cake.

Besides, if they ever need the extra performance, they can shove another core in like on the A10X; that thing is a force of nature, and I wonder if the A11 will be a triple-core design too. What's interesting is that the A10X was built on a 10nm process, and the A11 will be too, so it's anyone's guess how they're gonna play that one out. Perhaps still a dual-core design with improved IPC, but with a massive GPU shoved into the space savings to feed the possible 120 Hz display? Or maybe they'll move to triple-core as well, because they seem to be going whole hog on the AR thing and might need the horsepower.

Speaking of lawsuits, was Qualcomm or someone else working on implementing the x86-64 instruction set on a mobile ARM processor or something? Intel was trying real hard not to let it happen. It'd be really neat if we could one day just use our phones as a portable desktop workstation of sorts, kind of like what Samsung did with the S8, where you could plug in a monitor/keyboard/mouse and work away... sort of. XD

The anime about the otherworld restaurant was nice; I like the ED. Violet Evergarden ep 1 was screened at some expo... wonder if there'll be a broadcast somewhere before it airs next year.

I don't remember, but can big.LITTLE schedule tasks to both the high- and low-speed cores at the same time? (e.g. the game runs on the fast cores while background tasks run on the slow ones)

Eh, I don't really think I was arguing cores per se. You can kind of view it that way, since whatever Apple is doing with what Jim Keller left behind (or it's a clean slate) is amazing; single-core performance is just smashing everything Qualcomm has to offer, and I think only Samsung's Exynos chips can hold a candle to it, and that's only in a few scenarios. Enough so that it can basically look like fewer fast cores versus more slow cores, but I digress. I'm just worried that as parallelism becomes prevalent, iOS will also move in that direction, and then dual-core designs would phase out relatively quickly. Yes, two faster cores can handle tasks quickly enough, but that's still rapid switching rather than 4 cores genuinely running simultaneously.

The 2500K holds up well today because general desktop software doesn't really know how to effectively use more cores/threads, and Intel is basically sitting on their ass doing nothing right now. I can also take a guess as to why A10 single-core performance is absolutely disgusting (in a good way): I haven't looked at die size measurements, but given the way Qualcomm prioritizes parallelism on mobile and Apple doesn't, Qualcomm has to use smaller cores (and fewer transistors per core as a result). Your SoC can only be so large, so if you have to fit twice as many cores onto the same square of silicon, each core gets a much smaller slice of the area and transistor budget, and you quickly see why. It's still impressive by all means though. The main reason Apple can get away with fewer cores and less memory is how efficient and streamlined everything is, since they control effectively everything from hardware to software.

The A9X and A10X are effectively just two A9s and two A10s shoved together, respectively. The nice thing about RISC architectures is that they're a lot more modular than CISC architectures. I know AMD is trying hard for modularity on Zen with the core complexes (CCXs) that they can add and remove with some pipeline/fabric adjustments.

Microsoft was working on Windows (full Windows, not RT or Windows 10 S) on ARM, and they were making an x86-64 to ARM compatibility layer or something (emulator? Don't know what they call it). Intel threatened to sue them because Intel holds the x86 rights. I doubt Intel would care as much if it were slow, but any RISC can emulate any CISC at high speed (well, not any, but for the most part the logic makes sense: RISC instructions are much simpler, so short sequences of them can emulate the more complex CISC instructions).
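
The basic idea is that each complex CISC instruction gets broken down into a short sequence of simple RISC ops. Here's a completely made-up illustration (hypothetical mnemonics and mappings, nothing to do with how Microsoft's actual layer works):

```python
# Made-up sketch of "one complex instruction becomes several simple ones".
X86_TO_RISCISH = {
    # x86 can add a register straight into memory; a load/store RISC
    # has to split that into load -> add -> store.
    "ADD [mem], reg": ["LDR tmp, [mem]", "ADD tmp, tmp, reg", "STR tmp, [mem]"],
    "PUSH reg":       ["SUB sp, sp, #8", "STR reg, [sp]"],
    "MOV reg, imm":   ["MOV reg, imm"],  # simple ops map roughly 1:1
}

def translate(block):
    out = []
    for insn in block:
        out.extend(X86_TO_RISCISH.get(insn, [f"; unhandled: {insn}"]))
    return out

for line in translate(["MOV reg, imm", "ADD [mem], reg", "PUSH reg"]):
    print(line)
```

The hard (and legally touchy) part is doing that translation fast enough that the result doesn't feel slow.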
 
