「Official」Asian Anime, Manga, and Music Lounge
Jun 6, 2017 at 1:33 PM Post #173,701 of 177,735
They don't really sell that kind of music in my country, so I usually sneak into Mora and buy it
And why bother with the lossy format when the hi-res version is only about 500 yen more?
More expensive, takes up way more storage, drains your battery faster, etc.
 
Jun 6, 2017 at 1:37 PM Post #173,702 of 177,735
More expensive, takes up way more storage, drains your battery faster, etc.
I can put up with the expense if it's my favorite artist though

Storage-wise, I've got way more than enough (256 GB), and both my DAC and my phone support fast charging, so I don't really think battery should be a major problem either way
 
Last edited:
Jun 6, 2017 at 1:45 PM Post #173,703 of 177,735
I can put up with the expense if it's my favorite artist though

Storage-wise, I've got way more than enough (256 GB), and both my DAC and my phone support fast charging, so I don't really think battery should be a major problem either way
Just suggesting that you shouldn't support pseudoscience, but hey, more power to you.
 
Jun 6, 2017 at 2:13 PM Post #173,704 of 177,735
Flossing doesn't help your teeth per se; rather, flossing strengthens your gums. That's what all of those newer reports on flossing are about. The dental community has known that for the longest time already.
 
Last edited:
Jun 6, 2017 at 9:35 PM Post #173,707 of 177,735
Speaking of high-res, apparently the iOS 11 beta supports FLAC natively. Glad to see them supporting a format that they originally shunned. No need to convert to ALAC anymore. Perhaps we'll finally get the long-rumored option to purchase 24/192 tracks from iTunes as well. Wishful thinking, I know.
I never got ALAC to work on my iPod Classic anyway, and since then I stopped caring about getting lossless music onto Apple devices. For me it's a placebo effect at most anyway.

I can put up with the expense if it's my favorite artist though

Storage-wise, I've got way more than enough (256 GB), and both my DAC and my phone support fast charging, so I don't really think battery should be a major problem either way
Same, I've only used 70 GB. And the 7 Plus' battery life is plenty adequate for a full day's use that includes lossless listening. Surely it wouldn't use up more battery than HQ Apple Music over mobile data.

Flossing doesn't help your teeth per se; rather, flossing strengthens your gums. That's what all of those newer reports on flossing are about. The dental community has known that for the longest time already.
Exactly, gum disease can lead to loose teeth and other complications like infection.



 
Jun 6, 2017 at 9:53 PM Post #173,708 of 177,735
Human eye doesn't see in fps. It's continuous. (-_-")
I think 120 Hz will really help lines/elements drawn with the Apple Pencil keep up with the rapid movement of the pencil tip. Based on what they revealed at WWDC, I think iOS will stay at 60 fps, but there might be some interpolation.
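Rough numbers, just my own back-of-the-envelope (the refresh rates here are assumptions, not anything Apple quoted): halving the refresh interval halves the worst-case wait between a stroke sample and the first frame that can show it.

```python
# Back-of-the-envelope: worst-case wait before a pen stroke sample can appear
# on screen, at two refresh rates (illustrative only, not Apple's figures).
for refresh_hz in (60, 120):
    frame_ms = 1000.0 / refresh_hz  # length of one refresh interval
    print(f"{refresh_hz:>3} Hz: up to ~{frame_ms:.1f} ms until the next frame can show it")
```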

 
Jun 6, 2017 at 10:27 PM Post #173,709 of 177,735
Flossing doesn't help your teeth per se; rather, flossing strengthens your gums. That's what all of those newer reports on flossing are about. The dental community has known that for the longest time already.
here is sensei showing how to make the gum strong with flossing
surprise-blood-cough.jpg

Alphonse is so jealous.


Human eye doesn't see in fps. It's continuous. (-_-")
https://en.wikipedia.org/wiki/Persistence_of_vision
Of course there are many conditions where people can perceive more than about 20 images per second. For starters, on a screen we have the number of new images per second and the number of times the screen refreshes per second, and those can be different. And just because a screen can do 60 frames per second doesn't mean the switch between two frames is instantaneous; on some screens it's a glorious mess. The most-used trick is still to insert a dark or grey image every 4 or 5 images to reduce ghosting, but to do that without dropping the number of actual frames in whatever we're looking at, we of course need more frames per second. So the 20 images per second figure is true under conditions that often don't apply to how we use screens.
Then for a moving scene, if an object travels too big a distance between frames, our brain will call BS on what it sees. That's one of the reasons so many gamers absolutely notice low fps when turning rapidly. It has nothing to do with the eye and everything to do with ensuring some kind of spatial continuity between images, so that the brain can assume it's the same object moving to the left and not fracking teleportation or the image changing on a PowerPoint slide. I remember when I was young and still had hair on my head: with CRT monitors I could tell 75 Hz and below, but above 85 Hz I couldn't. That was another tech with other requirements, though.
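Quick sketch of the "you need extra frames for the dark ones" point, with made-up numbers:

```python
# If one dark frame is slotted in after every k real frames, the panel needs
# to refresh at content_fps * (k + 1) / k to keep every real frame on screen.
def required_refresh(content_fps, k):
    return content_fps * (k + 1) / k

for k in (4, 5):
    print(f"60 fps content, dark frame every {k} frames -> ~{required_refresh(60, k):.0f} Hz panel")
```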

CRT monitor master race! suck it Apple! much gaming, very wow.
 
Jun 7, 2017 at 12:02 AM Post #173,710 of 177,735
here is sensei showing how to make the gum strong with flossing
surprise-blood-cough.jpg

Alphonse is so jealous.



https://en.wikipedia.org/wiki/Persistence_of_vision
Of course there are many conditions where people can perceive more than about 20 images per second. For starters, on a screen we have the number of new images per second and the number of times the screen refreshes per second, and those can be different. And just because a screen can do 60 frames per second doesn't mean the switch between two frames is instantaneous; on some screens it's a glorious mess. The most-used trick is still to insert a dark or grey image every 4 or 5 images to reduce ghosting, but to do that without dropping the number of actual frames in whatever we're looking at, we of course need more frames per second. So the 20 images per second figure is true under conditions that often don't apply to how we use screens.
Then for a moving scene, if an object travels too big a distance between frames, our brain will call BS on what it sees. That's one of the reasons so many gamers absolutely notice low fps when turning rapidly. It has nothing to do with the eye and everything to do with ensuring some kind of spatial continuity between images, so that the brain can assume it's the same object moving to the left and not fracking teleportation or the image changing on a PowerPoint slide. I remember when I was young and still had hair on my head: with CRT monitors I could tell 75 Hz and below, but above 85 Hz I couldn't. That was another tech with other requirements, though.

CRT monitor master race! suck it Apple! much gaming, very wow.
Now I want to see a smartphone or tablet with a CRT display.
 
Jun 7, 2017 at 12:14 AM Post #173,711 of 177,735
https://en.wikipedia.org/wiki/Persistence_of_vision


Of course there are many conditions where people can perceive more than about 20 images per second. For starters, on a screen we have the number of new images per second and the number of times the screen refreshes per second, and those can be different.

And just because a screen can do 60 frames per second doesn't mean the switch between two frames is instantaneous; on some screens it's a glorious mess.

The most-used trick is still to insert a dark or grey image every 4 or 5 images to reduce ghosting, but to do that without dropping the number of actual frames in whatever we're looking at, we of course need more frames per second. So the 20 images per second figure is true under conditions that often don't apply to how we use screens.

Then for a moving scene, if an object travels too big a distance between frames, our brain will call BS on what it sees. That's one of the reasons so many gamers absolutely notice low fps when turning rapidly. It has nothing to do with the eye and everything to do with ensuring some kind of spatial continuity between images, so that the brain can assume it's the same object moving to the left and not fracking teleportation or the image changing on a PowerPoint slide. I remember when I was young and still had hair on my head: with CRT monitors I could tell 75 Hz and below, but above 85 Hz I couldn't. That was another tech with other requirements, though.

CRT monitor master race! suck it Apple! much gaming, very wow.
*snip*

Not sure how that's relevant, but I said human vision is continuous, not instantaneous. Phototransduction takes an appreciable amount of time relative to the speed of light, so it's a given that this happens. Different intensities cause neurons to fire at different frequencies because the stimulus/source creates different chemical gradients. As long as the appropriate chemical gradient exists, neurons will keep firing until the concentration drops below a certain threshold and/or equilibrium is reached. Stronger intensities create stronger gradients, meaning traces of the image will "stay" around longer, since having more of a reactant means it takes longer to reach equilibrium or for all of the reactant to be processed. It's common sense for anybody who has taken basic chemistry or biology. I'm also not sure where this mythical 20 fps number is coming from; just from experience, most if not all of us know that number is way too low. The difference threshold (just noticeable difference, whatever you want to call it) is dynamic and depends on both the intensity magnitude and the size of the change; larger magnitudes require larger differences for us to recognize an appreciable change, since humans perceive things logarithmically.
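To put a toy number on the "larger magnitudes need larger differences" bit, here's a rough Weber's-law-style calculation (the 2% constant is invented, purely for illustration):

```python
# Toy Weber's-law illustration: the just-noticeable difference grows roughly
# in proportion to the baseline intensity (the 2% constant is arbitrary).
WEBER_FRACTION = 0.02

for baseline in (10, 100, 1000):
    jnd = WEBER_FRACTION * baseline
    print(f"baseline {baseline:>4}: need a change of about {jnd:g} to notice it")
```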

I don't really see the point of mentioning screen tearing, but frame syncing and adaptive refresh rates are becoming more common, so we won't really have to worry about tearing as this kind of technology settles in (it should become standard within the next 5 years since it increases the efficiency of the electronics). In that case the handoff between frame delivery and refresh is effectively instantaneous.
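Here's a toy sketch of what adaptive sync changes, with invented frame times (not modeled on any particular implementation):

```python
# Toy comparison with made-up frame-completion times: a fixed 60 Hz display
# holds a finished frame until the next refresh tick; an adaptive-sync display
# refreshes as soon as the frame is ready.
import math

REFRESH_MS = 1000.0 / 60.0
frame_ready_ms = [0.0, 21.0, 39.0, 62.0]  # hypothetical render-complete times

for ready in frame_ready_ms:
    fixed_tick = math.ceil(ready / REFRESH_MS) * REFRESH_MS
    print(f"frame ready at {ready:5.1f} ms -> fixed: {fixed_tick:5.1f} ms, adaptive: {ready:5.1f} ms")
```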

What you're describing as "ghosting" isn't really correct. Ghosting refers to physical afterimages on displays, especially modern LCDs, caused by pixels that can't transition fast enough or by pixel overdrive (which is used to decrease pixel response time, but overaggressive implementations make pixels overshoot the target value). Black frame insertion, which is what you're describing, is used to reduce perceived blur by removing the image source, so the gradient created by a still image isn't maintained and instead has time to move toward equilibrium. This makes motion appear smoother and was a tactic often used by early high-refresh-rate TVs. Nowadays I'm not sure it's used as often; it's most likely found in inexpensive televisions that advertise 120 Hz (the more expensive ones might manage true 120 Hz at certain resolutions), while 240 Hz and 480 Hz modes rely on inserting blank frames or on frame interpolation to generate frames. As for computer monitors, virtually all currently sold that advertise 120 Hz or 144 Hz actually run at that refresh rate.
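And a crude sketch of why slow pixels ghost while aggressive overdrive overshoots (the time constant and drive levels are all invented):

```python
# Crude first-order pixel model: slow transitions leave traces of the old frame
# (ghosting); overdriving toward a value past the target speeds things up but
# can overshoot it. Numbers are invented for illustration.
import math

def pixel_level(start, driven_to, t_ms, tau_ms=4.0):
    """Exponential step response toward the driven value."""
    return driven_to + (start - driven_to) * math.exp(-t_ms / tau_ms)

start, target = 0.0, 200.0
for label, drive in (("normal drive", 200.0), ("overdrive", 240.0)):
    level = pixel_level(start, drive, t_ms=8.3)  # roughly one 120 Hz frame
    print(f"{label:>12}: pixel at {level:.0f}, target {target:.0f}")
```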

Of course it has nothing to do with the eye. Your eyes only send sensory input (my wording in the previous post was wrong, so apologies for that); they do no processing. That's obvious from basic physics: your eye only contains a variable-focal-length convex lens that gives you a real, inverted image, which, for most of us at least, is clearly not what we see; rather, our brain flips the image. Continuity has been a long-debated topic among different schools of psychology. Whether we're naturally wired to perceive things this way or whether it's something we learn from exposure to motion that is only ever continuous (pick whichever school of thought you prefer) is up to you, but the end result is that we prefer to perceive motion as continuous. Long story short, we love Taylor series (as much as we hate them in Calculus 2, and I guess Fourier series too) and other approximations up to a certain degree of accuracy.

tl;dr yes we can perceive frame rate increases above 60 Hz (or 75 Hz in your case) in motion-heavy applications (less so with less motion, no difference with stills for...obvious reasons). Yes we can tell the difference between 60 Hz and 120 Hz (actual 120 Hz, not gimmicky garbage like black frame insertion or frame interpolation).

And **** CRTs.
 
Last edited:
Jun 7, 2017 at 3:22 AM Post #173,713 of 177,735
Testing out video editing on my XPS with HitFilm 4 Express for the first time. A bit choppy at times, but not too bad. Maybe an external USB drive with UASP will do the trick. Some say they do very well for 4K editing with their laptops, though I suspect the occasional stutter is more of an optimization + CPU power thing. :/

In the future, I'll probably get an i7 version of a laptop just for the small bump in clock speeds. Although it's pretty smooth for most of my work, quite a few of Photoshop's tools still eat CPU power like no tomorrow.
Might need to upgrade to 32 GB of RAM too. My scratch disk, which had about 60 GB of free space left, nearly filled up the other day and Windows sent me a warning. An unusual project, yes, but I expect more to come, and 32 GB will be welcome over a lowly 8 GB.

As for other programs that are a bit less optimized... yeah. Single-core performance for the win. The laptop performs very well for almost everything, though. It's just those few rare moments when lag gets a bit annoying, and I know that just a bit more oomph from the CPU would do the job.

Running most of my reasonably complex brushes at 2000 px wide over a 7200 px wide image is fairly acceptable in Photoshop. Not so much in other, cheaper programs. Smudge brushes, on the other hand...

*Waits full minute for accidental scribble to finish*

------------------------

Wanted an iPad for its mobility, as there are some places I need to go where I just can't bring a laptop or safely store one nearby, but I can bring an iPad. Being able to move from that place to somewhere else without having to go home to pick up my laptop would have been a huge time saver. However, the old iPad Pro simply couldn't cover my daily needs, even though I know quite a few directors who pretty much work fully on their iPads, both for work and for personal stuff.

The newly announced one seems quite interesting. The compromises are more acceptable for me personally, and I can see many people who can take advantage of the Apple Pencil loving it.

Oh, and 120 Hz for smooth brush strokes is nice~

 
Last edited:
Jun 7, 2017 at 4:04 AM Post #173,714 of 177,735
Not sure how that's relevant, but I said human vision is continuous, not instantaneous. Phototransduction takes an appreciable amount of time relative to the speed of light, so it's a given that this happens. Different intensities cause neurons to fire at different frequencies because the stimulus/source creates different chemical gradients. As long as the appropriate chemical gradient exists, neurons will keep firing until the concentration drops below a certain threshold and/or equilibrium is reached. Stronger intensities create stronger gradients, meaning traces of the image will "stay" around longer, since having more of a reactant means it takes longer to reach equilibrium or for all of the reactant to be processed. It's common sense for anybody who has taken basic chemistry or biology. I'm also not sure where this mythical 20 fps number is coming from; just from experience, most if not all of us know that number is way too low. The difference threshold (just noticeable difference, whatever you want to call it) is dynamic and depends on both the intensity magnitude and the size of the change; larger magnitudes require larger differences for us to recognize an appreciable change, since humans perceive things logarithmically.

I don't really see the point of mentioning screen tearing, but frame syncing and adaptive refresh rates are becoming more common, so we won't really have to worry about tearing as this kind of technology settles in (it should become standard within the next 5 years since it increases the efficiency of the electronics). In that case the handoff between frame delivery and refresh is effectively instantaneous.

What you're describing as "ghosting" isn't really correct. Ghosting refers to physical afterimages on displays, especially modern LCDs, caused by pixels that can't transition fast enough or by pixel overdrive (which is used to decrease pixel response time, but overaggressive implementations make pixels overshoot the target value). Black frame insertion, which is what you're describing, is used to reduce perceived blur by removing the image source, so the gradient created by a still image isn't maintained and instead has time to move toward equilibrium. This makes motion appear smoother and was a tactic often used by early high-refresh-rate TVs. Nowadays I'm not sure it's used as often; it's most likely found in inexpensive televisions that advertise 120 Hz (the more expensive ones might manage true 120 Hz at certain resolutions), while 240 Hz and 480 Hz modes rely on inserting blank frames or on frame interpolation to generate frames. As for computer monitors, virtually all currently sold that advertise 120 Hz or 144 Hz actually run at that refresh rate.

Of course it has nothing to do with the eye. Your eyes only send sensory input (my wording in the previous post was wrong, so apologies for that); they do no processing. That's obvious from basic physics: your eye only contains a variable-focal-length convex lens that gives you a real, inverted image, which, for most of us at least, is clearly not what we see; rather, our brain flips the image. Continuity has been a long-debated topic among different schools of psychology. Whether we're naturally wired to perceive things this way or whether it's something we learn from exposure to motion that is only ever continuous (pick whichever school of thought you prefer) is up to you, but the end result is that we prefer to perceive motion as continuous. Long story short, we love Taylor series (as much as we hate them in Calculus 2, and I guess Fourier series too) and other approximations up to a certain degree of accuracy.

tl;dr yes we can perceive frame rate increases above 60 Hz (or 75 Hz in your case) in motion-heavy applications (less so with less motion, no difference with stills for...obvious reasons). Yes we can tell the difference between 60 Hz and 120 Hz (actual 120 Hz, not gimmicky garbage like black frame insertion or frame interpolation).

And **** CRTs.
OK, I'll agree on continuous for the general process. It's just that, given how easy it is to get stuck "seeing" the same thing for a while once the stimulus is intense, "continuous" somehow feels strange.
I mentioned the added dark frame because it's the easiest mechanism to understand (like how I always use R2R as an example of a digital DAC). Indeed, now we have other "tricks," including actions taken by the graphics card, and in general all flat-screen techs have progressed a good deal in... everything. Even my slow IPS panel doesn't murder video or gaming as much as the one I had 2.7 eternities ago. Sorry if I made it look like adding dark frames was the whole story ^_^

about ghosting, I'll go to the supreme court like in the movies.
f6avy7nzgvi22sh6pwff75hlhwvxasfr_hq.jpg
<= reaction to ghosting.
 
