PC Enthusiast-Fi (PC Gaming/Hardware/Software/Overclocking)
May 15, 2016 at 8:42 AM Post #8,986 of 9,120
   
Laptops never had this problem XD
 
Well... 
 
A Clevo P870 laptop runs an OC'd 6700K + an OC'd 980 (200W) + the display + all the other components from a 330W power brick. I thought that 450W in a Node 202 would be overkill.
 

 
Nvidia recommends a minimum 500W power supply with the GTX 980, assuming a desktop machine. Now, because you don't have an integrated battery in your portable PC build, your PSU will have to do all the work. 

 
On the Clevo, there are situations where 330W won't be sufficient depending on the workload, which is why it will draw power from the battery as well. When this happens, your battery will be draining even with the charger plugged in. 
 
The 6700K at stock is known to draw up to 130W during video decoding, and the stock GTX 980's power consumption does go up to 200W... add the display which is probably around another 20-30 watts, the motherboard, the high rpm fans, the RAM, the speakers, M.2 storage... sheesh that's a lot to power with a measly 330W charger lol.
 
If you are looking to get the GTX 1080, that will draw more power than the GTX 980 too. Judging by the specs, it should draw somewhere between the GTX 980 and the GTX 980 Ti, but closer to the GTX 980. The stock GTX 980 Ti is known to peak at up to 280W, BTW. 

 
Needless to say, if you overclock your CPU and GPU and/or overvolt them on top of this, they will draw significantly more watts because of diminishing returns and inefficiency.
 
And the 450W power supply the Node 202 comes with is only 80 Plus Bronze certified, which runs a bit hotter and is not the best. Power supplies are most efficient when delivering around 50% of their rated power, so if your PC typically draws 300W, it helps to get a 600W unit for a healthy system. 
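That 50% rule of thumb is easy to sanity-check with a few lines of Python. This is just a sketch of the sizing logic above; the 50% target and the 50W rounding step are my own illustrative choices, not anything from a PSU spec:

```python
# Sketch of the PSU-sizing rule of thumb: pick a unit whose rating puts
# your typical load near 50%, where efficiency tends to peak.
def recommend_psu_watts(typical_load_w, target_load_fraction=0.5, step=50):
    """Suggest a PSU rating so typical_load_w sits near target_load_fraction."""
    ideal = typical_load_w / target_load_fraction
    # Round up to the next common retail size (50W steps).
    return int(-(-ideal // step) * step)

print(recommend_psu_watts(300))  # a 300W typical load suggests a 600W unit
```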

 
Also, power supplies degrade over time, so after a year or so of heavy use at or near their rated wattage they can no longer deliver it, which is another reason to get a bigger PSU: it gives you more headroom for aging as well...  
 
 
May 15, 2016 at 12:44 PM Post #8,987 of 9,120
   
...

 
For one reason or another, the Clevo P870 does not tap the battery when the charger can't supply enough power. 
 
The only laptops that do that are MSI's, and that is why I don't buy from them; I dislike the concept. 
 
That being said, do you know of another case that fits everything, can be carried in a backpack, and is not much bigger than the Node 202? 

I need something that is not cube-shaped, since I need to carry it in a backpack. 
 
May 15, 2016 at 8:18 PM Post #8,988 of 9,120
   
Nvidia recommends a minimum 500W power supply with the GTX 980, assuming a desktop machine. Now, because you don't have an integrated battery in your portable PC build, your PSU will have to do all the work. 

 
On the Clevo, there are situations where 330W won't be sufficient depending on the workload, which is why it will draw power from the battery as well. When this happens, your battery will be draining even with the charger plugged in. 
 
The 6700K at stock is known to draw up to 130W during video decoding, and the stock GTX 980's power consumption does go up to 200W... add the display which is probably around another 20-30 watts, the motherboard, the high rpm fans, the RAM, the speakers, M.2 storage... sheesh that's a lot to power with a measly 330W charger lol.
 
If you are looking to get the GTX 1080, that will draw more power than the GTX 980 too. Judging by the specs, it should draw somewhere between the GTX 980 and the GTX 980 Ti, but closer to the GTX 980. The stock GTX 980 Ti is known to peak at up to 280W, BTW. 

 
Needless to say, if you overclock your CPU and GPU and/or overvolt them on top of this, they will draw significantly more watts because of diminishing returns and inefficiency.
 
And the 450W power supply the Node 202 comes with is only 80 Plus Bronze certified, which runs a bit hotter and is not the best. Power supplies are most efficient when delivering around 50% of their rated power, so if your PC typically draws 300W, it helps to get a 600W unit for a healthy system. 

 
Also, power supplies degrade over time, so after a year or so of heavy use at or near their rated wattage they can no longer deliver it, which is another reason to get a bigger PSU: it gives you more headroom for aging as well...  
 

It doesn't. The 6700K does not draw up to 130W at stock. Nvidia RECOMMENDS 500W to account for the garbage power supplies that used to be the norm.
 
Stock 980s don't pull up to 200W either. High-RPM fans? Those are tiny: 1 or 2W each. M.2 drives: 1 or 2W under modest loads (come on, they're SSDs). RAM? What do you think DDR4 was designed for, if not lower power consumption?
 
May 16, 2016 at 3:33 AM Post #8,989 of 9,120
  It doesn't. The 6700K does not draw up to 130W at stock. Nvidia RECOMMENDS 500W to account for the garbage power supplies that used to be the norm.
 
Stock 980s don't pull up to 200W either. High-RPM fans? Those are tiny: 1 or 2W each. M.2 drives: 1 or 2W under modest loads (come on, they're SSDs). RAM? What do you think DDR4 was designed for, if not lower power consumption?

 
I am confused by all these details. 
 
It seems that with desktop computers, it is not like laptops? 
 
For laptops, you just add the numbers up. With that configuration, a Clevo P870 can draw 450W from the wall if you really OC it and have AC cooling. But most people don't keep a laptop on an AC bench, nor OC it to that point, so the max draw is around 330W, which is enough even in very heavy workloads and with a moderate OC. 
 
Should I just stick to laptops, or buy a prebuilt computer with a 1080 when it comes out? 
 
May 16, 2016 at 2:12 PM Post #8,990 of 9,120
  It doesn't. The 6700K does not draw up to 130W at stock. Nvidia RECOMMENDS 500W to account for the garbage power supplies that used to be the norm.
 
Stock 980s don't pull up to 200W either. High-RPM fans? Those are tiny: 1 or 2W each. M.2 drives: 1 or 2W under modest loads (come on, they're SSDs). RAM? What do you think DDR4 was designed for, if not lower power consumption?

 
Oops, I do need to make corrections after some more thorough research. The benches I've been looking at with stock 6700Ks consistently draw over 130W because I didn't account for the power draw of the motherboard itself (and sometimes the inefficiency of the power supply, since they measure from the wall), even without any graphics cards, sound cards, expansions/auxiliaries or extra storage installed in their test setups. Sorry about that. 

 
A more accurate, system-compensated peak load for the stock i7 6700K is actually 110W, and the GTX 980's card-only power consumption (measured directly from the power pins and slot) maxes out at 190W.
 
 

 
It's quite possible to need more than 330W DC from the adapter, but unlikely, since you would need to fully load the CPU and GPU at the same time. Even if that condition is met, the other two determining factors are how much power that 17" 4K 350-nit display uses and how much power the motherboard draws. 
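Adding those corrected figures up shows how close a simultaneous CPU+GPU peak could get to the 330W brick. A rough Python tally, where the display and motherboard numbers are my own guesses; only the CPU and GPU peaks come from the measurements above:

```python
# Back-of-the-envelope tally against the 330W brick. CPU/GPU peaks are the
# corrected figures from the post; the last two entries are guesses.
components_w = {
    "i7-6700K (stock peak)": 110,
    "GTX 980 (card-only peak)": 190,
    "17in 4K display (guess)": 25,
    "motherboard, fans, RAM, SSD (guess)": 30,
}
total = sum(components_w.values())
print(total, "W vs 330W brick ->", "over budget" if total > 330 else "within budget")
```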
 
Also, thanks for reminding me about DDR4 and its nominal standard of 1.2V. I guess I'm still living in 2009, when DDR3 came in 1.5V or 1.65V 

 
I find myself using online power supply calculators at times as a rough estimate of worst-case scenarios when actual system benchmark power measurements aren't available, but even these calculators seem keen to con you into buying a bigger power supply than you need. Still, I guess it doesn't hurt to have headroom for PSU aging and/or upgrading... 
 
 
   
One reason or another, Clevo P870 does not use battery when not having enough power. 
 
The only laptops that do that are MSI, that this is why I don't buy from them, I dislike the concept. 
 

 
That is quite odd, actually. In my experience, most higher-end electronic devices without an adequate charging current to sustain their load will usually tap into their batteries to make up the difference. Many mobile devices, like PDAs, tablets, phones/smartphones and Bluetooth devices, do this, for example. Slightly puzzling that laptops don't work the same way?
 
It makes me wonder why something like MSI's Hybrid Power is a bad thing to have, since it switches itself off if your battery gets low anyway, and having the feature prevents your laptop from throttling down even while plugged in. 
 ... or they could just give us more powerful charging bricks lol.
 
 
Still, there aren't many cases where a manufacturer doesn't provide a sufficient power brick for a laptop, unless you're talking about high-end gaming/work laptops that ship with the smallest charger that can get the job done in most situations, for the sake of portability. 
 
 
 
   
I am confused by all these details. 
 
It seems that with desktop computers, it is not like laptops? 
 
For laptops, you just add the numbers up. With that configuration, a Clevo P870 can draw 450W from the wall if you really OC it and have AC cooling. But most people don't keep a laptop on an AC bench, nor OC it to that point, so the max draw is around 330W, which is enough even in very heavy workloads and with a moderate OC. 
 
Should I just stick to laptops, or buy a prebuilt computer with a 1080 when it comes out? 

 
Hmm, not sure what you mean... perhaps you are talking about power supply efficiency?
 
Assuming the Clevo's power brick does draw up to 450W AC from the wall, it can't deliver that same 450W DC to the laptop, because nothing is 100% efficient. Since the charger is only rated to provide a continuous 330W to the laptop, that would make the power supply 73.3% efficient at that draw (330W out of 450W in). The missing 120W is wasted as heat. Desktop power supplies work the same way. 
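The same arithmetic as a tiny Python sketch; note the 450W wall figure is the assumption from the post above, not a measurement:

```python
# Adapter efficiency: 330W DC delivered from an assumed 450W AC wall draw.
dc_out_w = 330   # rated continuous DC output of the brick
ac_in_w = 450    # assumed worst-case draw from the wall
efficiency = dc_out_w / ac_in_w
waste_heat_w = ac_in_w - dc_out_w
print(f"{efficiency:.1%} efficient, {waste_heat_w}W lost as heat")
```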
 
 
May 16, 2016 at 3:03 PM Post #8,991 of 9,120
My 980 Tis pull more than 190W each, nearer 275W each on a decent overclock
 
May 16, 2016 at 4:16 PM Post #8,992 of 9,120
My 980 Tis pull more than 190W each, nearer 275W each on a decent overclock

 
That is quite a lot 

 
 
   
Oops, I do need to make corrections after some more thorough research. The benches I've been looking at with stock 6700Ks consistently draw over 130W because I didn't account for the power draw of the motherboard itself (and sometimes the inefficiency of the power supply, since they measure from the wall), even without any graphics cards, sound cards, expansions/auxiliaries or extra storage installed in their test setups. Sorry about that. 

 
A more accurate, system-compensated peak load for the stock i7 6700K is actually 110W, and the GTX 980's card-only power consumption (measured directly from the power pins and slot) maxes out at 190W.
 
 

 
It's quite possible to need more than 330W DC from the adapter, but unlikely, since you would need to fully load the CPU and GPU at the same time. Even if that condition is met, the other two determining factors are how much power that 17" 4K 350-nit display uses and how much power the motherboard draws. 
 
Also, thanks for reminding me about DDR4 and its nominal standard of 1.2V. I guess I'm still living in 2009, when DDR3 came in 1.5V or 1.65V 

 
I find myself using online power supply calculators at times as a rough estimate of worst-case scenarios when actual system benchmark power measurements aren't available, but even these calculators seem keen to con you into buying a bigger power supply than you need. Still, I guess it doesn't hurt to have headroom for PSU aging and/or upgrading... 
 
 
 
That is quite odd, actually. In my experience, most higher-end electronic devices without an adequate charging current to sustain their load will usually tap into their batteries to make up the difference. Many mobile devices, like PDAs, tablets, phones/smartphones and Bluetooth devices, do this, for example. Slightly puzzling that laptops don't work the same way?
 
It makes me wonder why something like MSI's Hybrid Power is a bad thing to have, since it switches itself off if your battery gets low anyway, and having the feature prevents your laptop from throttling down even while plugged in. 
 ... or they could just give us more powerful charging bricks lol.
 
 
Still, there aren't many cases where a manufacturer doesn't provide a sufficient power brick for a laptop, unless you're talking about high-end gaming/work laptops that ship with the smallest charger that can get the job done in most situations, for the sake of portability. 
 
 
 
 
Hmm, not sure what you mean... perhaps you are talking about power supply efficiency?
 
Assuming the Clevo's power brick does draw up to 450W AC from the wall, it can't deliver that same 450W DC to the laptop, because nothing is 100% efficient. Since the charger is only rated to provide a continuous 330W to the laptop, that would make the power supply 73.3% efficient at that draw (330W out of 450W in). The missing 120W is wasted as heat. Desktop power supplies work the same way. 
 

 
I dislike the idea of hybrid power. 
 
On a Clevo, if you want to do serious OC, you need to buy two power bricks and connect them together through a converter, then connect the converter to the laptop: 660W of usable power in that situation. 
 
May 18, 2016 at 2:07 AM Post #8,993 of 9,120

 
It's about time my day-one preorder got fulfilled! And I gotta say, it falls pleasantly in line with my expectations.
 
It works. No screen problems or "No HDMI detected" errors, everything looks fine, chromatic aberration is considerably reduced at the cost of those light shafts/"god rays" that come with Fresnel lenses, and the built-in headphones nearly blew my ears out at default volume. You might want to turn it down when setting it up!
 
And for that matter, this being Head-Fi and all, I don't notice anything wrong about the sound quality to warrant detaching them and using my own cans. They're surprisingly good for supra-aurals and quite convenient to have built right in.
 
Oh, and it's comfortable, MUCH more comfortable than the Gear VR ever was. It still heats up my face a bit after a while, but at least the lenses aren't fogging up like crazy the way the Gear VR's do. That always annoyed the hell out of me: trying to use the Gear VR and having to spend a few minutes letting the lenses warm up 'til they stopped fogging.
 
I'm gonna need a beefier rig for DCS World, though. That or Eagle Dynamics needs to do way more optimization. I mean, async timewarp is there and it does its job for the head-tracking, but I can still tell that the framerate tanks hard and I really need to scale back some settings compared to conventional monitor viewing.
 
Wouldn't dream of going back to monitor + TrackIR for it, though, even if I really have to work my neck when checking my six compared to TrackIR. It's just that immersive, and it reveals just how cramped it really is inside the P-51D or whatever other cockpit it is you're sitting in. I want all my flight/space sims in VR now. Falcon BMS VR when?
 
I still have a lot to try out with it; work ate up most of my time today after I got it, and it's getting late. Gonna be testing Elite Dangerous and maybe FSX next if FlyInside works out.
 
May 18, 2016 at 10:45 AM Post #8,994 of 9,120
My 980 Tis pull more than 190W each, nearer 275W each on a decent overclock

Nvidia revised the 980s. These 980s aren't exactly the same 980 you get on a desktop card; they're made for efficiency, even though it's still the full GM204.
 
And a newer bin, like what AMD did to drop the 7970 GHz Edition to sub-200W on the 280X's Tahiti XTL chips.
 
May 18, 2016 at 11:55 AM Post #8,995 of 9,120
 
 
It's about time my day-one preorder got fulfilled! And I gotta say, it falls pleasantly in line with my expectations.
 
It works. No screen problems or "No HDMI detected" errors, everything looks fine, chromatic aberration is considerably reduced at the cost of those light shafts/"god rays" that come with Fresnel lenses, and the built-in headphones nearly blew my ears out at default volume. You might want to turn it down when setting it up!
 
And for that matter, this being Head-Fi and all, I don't notice anything wrong about the sound quality to warrant detaching them and using my own cans. They're surprisingly good for supra-aurals and quite convenient to have built right in.
 
Oh, and it's comfortable, MUCH more comfortable than the Gear VR ever was. It still heats up my face a bit after a while, but at least the lenses aren't fogging up like crazy the way the Gear VR's do. That always annoyed the hell out of me: trying to use the Gear VR and having to spend a few minutes letting the lenses warm up 'til they stopped fogging.
 
I'm gonna need a beefier rig for DCS World, though. That or Eagle Dynamics needs to do way more optimization. I mean, async timewarp is there and it does its job for the head-tracking, but I can still tell that the framerate tanks hard and I really need to scale back some settings compared to conventional monitor viewing.
 
Wouldn't dream of going back to monitor + TrackIR for it, though, even if I really have to work my neck when checking my six compared to TrackIR. It's just that immersive, and it reveals just how cramped it really is inside the P-51D or whatever other cockpit it is you're sitting in. I want all my flight/space sims in VR now. Falcon BMS VR when?
 
I still have a lot to try out with it; work ate up most of my time today after I got it, and it's getting late. Gonna be testing Elite Dangerous and maybe FSX next if FlyInside works out.

Cool! Can you still spot enemy airplanes in the distance?
 
May 18, 2016 at 1:59 PM Post #8,997 of 9,120
  With the Oculus did you get any motion sickness?
 
I tried a dev model when Facebook brought it into work and it made me ultra queasy.

What did you demo? FPS experiences where you walk in-game while standing still seem to be an issue, which is why most use teleportation. Adrift, where you are literally adrift, also seems to cause serious problems, whereas car games etc. ironically don't cause motion sickness :)
 
May 18, 2016 at 3:10 PM Post #8,998 of 9,120
Cool! Can you still spot enemy airplanes in the distance?

 
In all fairness, DCS is the sort of sim where I have trouble spotting planes in the distance on a conventional monitor if the model enlargement/visibility feature is turned off!
 
But with it turned on, I managed to actually keep track of the Fw-190 I was tangling with for a few quick one-on-one dogfights. I still need to test more modern aircraft that are more likely to engage BVR at first.
 
A friend of mine's actually had a few choice posts about aircraft visibility in sims:
 
http://why485.tumblr.com/post/144473497183/dcs-in-vr-i-had-a-cv1-last-weekend-and-spent-most
"The other major limitation is visibility. With model visibility off (which pretty much any MP server forces), you can’t see a thing out the window. I was able to see a big ass A-10 as a tiny flickering dot at about 4km, and it didn’t come into view as a discernible object until roughly 2km. You need to play with model visibility or labels on if you intent to play with VR, and because I fly DCS almost exclusively in MP, that means that I’m going to be sticking to my monitor and TIR. At the end of the day, I just cannot accept the dramatic loss in combat effectiveness for the novelty of VR.
 
I do want to say though, that the model visibility setting works very well in VR. I’m convinced that the only reason ED even implemented it at all is because they tried DCS with VR and ran into this very problem. The system has always seemed like it was designed to work with low pixel density screens, of which all VR headsets are right now. So if you plan on playing SP or hosting your own MP games with model visibility on, then it’s not the worst thing in the world to play with VR."
(Thankfully, I mostly fly offline, so this isn't a problem.)
 
http://why485.tumblr.com/post/143713696893/after-coming-across-a-really-interesting-paper
"WWII and Korea players in DCS they lose sight of things they really shouldn’t be when fighting WVR. Even in modern jets after the merge, it shouldn’t still feel like BVR when the other guy is only 2 miles away from you. I can’t count the number of times in DCS where I make it to the merge, and never visually re-acquire a target that can’t be more than 1 mile away from me. It’s awful, and this problem is only worse when you’re flying a guns only Mk1 eyeball reliant machine like a MiG-15 or Bf-109."
(Note that he means with DCS model enlargement/visibility enhancement OFF there, and he's not kidding. It's why I lose sight of my targets on a traditional monitor.)
 
With the Oculus did you get any motion sickness?
 
I tried a dev model when Facebook brought it into work and it made me ultra queasy.

 
I've suffered no motion sickness in the Rift so far, which is great. I may just have a higher tolerance for it.
 
I didn't really have to worry about it much on my Note 4 Gear VR either, but Omega Agent did give me some very mild headaches while jetting around for a while, as if that's the sort of nausea the folks at Oculus are trying to minimize. Motion sickness is clearly a very real concern with a VR HMD that takes up most of your natural FOV as opposed to looking at a monitor.
 
I'd like to try Omega Agent on the Rift CV1 to compare, since I'd have full-blown translational (positional) tracking and all, but of course, Oculus is about as bad as Nintendo when it comes to cross-buy on the Oculus Store - it's nonexistent! I'd rather not have to buy the same game twice!
 
May 18, 2016 at 3:46 PM Post #8,999 of 9,120
Thanks. I am used to the labels in IL-2. I've done one dogfight in DCS with the P-51 on my 2560x1080 ultrawide, and despite TrackIR it was incredibly hard to spot anything, so I can see how it could be next to impossible in VR. I should never venture online; I am so rusty I would just get slaughtered and frustrated, so hopefully the cheats are working :)
 
May 27, 2016 at 10:36 PM Post #9,000 of 9,120
No need for desktops anymore guys..........
 
http://www.tweaktown.com/news/52215/nvidia-rumored-launch-geforce-gtx-1080m-notebooks-computex/index.html
 
