PS4 vs XBOX ONE (What would you buy and why? No fanboy like comments please)
Oct 21, 2013 at 7:12 PM Post #376 of 1,094
IDK, I'm an Xbox fan; I even used to play Halo competitively. Never owned a PS3. I'm getting a PS4 for Killzone (never played any of the previous KZ games, but it looks pretty cool) and BF4. The Xbox One was always in the cards for me; I was just annoyed about the specs/raw power compared to the PS4 and having to buy it with Kinect. I really want to play Forza at launch
 

 
I personally cannot justify buying the ONE but that's me.
I am paying $100 more for a console that comes bundled with weaker specs and a camera...
I mean, I understand you're in it for the exclusives, but I think I can assure you you're gonna like Sony's first parties as much as you liked MS's.
The ONE's price is bound to drop by $100 soon enough, so I'd just wait it out if you want to buy the ONE at a later time.
By then, you should know better anyway.
 
Oct 21, 2013 at 8:32 PM Post #377 of 1,094
I always played PlayStation....1, 2, and now 3, after selling my Xbox 360, which was stupid because the Xbox controller is so much better. And Xbox Live is better and more fun than PlayStation Plus, so if I get a new console it will be the new Xbox......
 
Oct 22, 2013 at 1:08 PM Post #378 of 1,094
I've always been one to wait a bit and see where new consoles go. I'm a huge fan of PlayStation Plus, but I'm a bit worried about the quality of the games it will give away now that it's a requirement to play online.
 
Oct 22, 2013 at 6:17 PM Post #379 of 1,094
Dunno - the sad little geek in me really wants to see what a good FPS looks like on a PS4 hooked up to a 4K TV, complete with a decent surround system pumping out the audio. All up, I reckon it won't cost more than 20K to make that dream a reality  :D
 
Oct 25, 2013 at 2:17 AM Post #381 of 1,094
Shots fired. NeoGAF insiders say that Call of Duty: Ghosts will run at only 720p on the XBone, compared to the PS4's 1080p.

http://www.neogaf.com/forum/showthread.php?t=702055


Wow. The XBone is already looking aged...

More expensive, and running games at 2006 resolutions.

Congratulations, Microsoft. You're pulling a Nintendo.

Yes, graphics don't make a game, but it suggests the system is not gonna age well.
 
Oct 25, 2013 at 11:02 AM Post #382 of 1,094
M$: "But...but...but our system is balanced for the optimal gaming experience!"
Sony: "Our console is made for gamers, it's the most powerful, and it's $399"

Hmmm... Which one to choose?...
 
Oct 25, 2013 at 11:34 AM Post #384 of 1,094
You've got an extremely long wait for native 4K gaming. The new systems will eventually struggle with 1080p once games start really taxing their power. The only thing that can do native 4K gaming is an extremely powerful PC with multiple high-end graphics cards.

You'll have to wait for a PS5 or Xbox 1080 for 4K gaming. By then, PC gaming will probably be attempting 8K...
 
Oct 25, 2013 at 12:17 PM Post #385 of 1,094
The new generation of PC graphics cards, AMD's Hawaii and NVIDIA's Titan, will do 4K on a single card, though the first releases of that generation are only hitting 25 FPS or so.  But on the PC side it won't be too long a wait for 4K if you're willing to spend $600 on a graphics card.  (Also the CPU to support it; 4K can get CPU-bound on anything less than a top-end i7.)
 
I think the PS4 does 4K video decoding, so while games won't be an option, watching movies at 4K might be. 
 
That's assuming that any content companies will distribute their movies in 4K.  Hollywood is really paranoid at this point about distributing 4K, because that's what movie theaters use and they don't want bootleg theaters.  Also, doing 4K without artifacts takes about 40 megabits per second of bandwidth, so prices would probably need to be higher to pay for it.
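A quick back-of-the-envelope check of that bandwidth figure, assuming a steady 40 Mbit/s stream and a 2-hour movie (both numbers just illustrative):

```python
def stream_size_gb(mbit_per_s: float, hours: float) -> float:
    """Total download size in gigabytes for a constant-bitrate stream."""
    bits = mbit_per_s * 1_000_000 * hours * 3600  # seconds in the movie
    return bits / 8 / 1_000_000_000               # bits -> bytes -> GB

size = stream_size_gb(40, 2)
print(f"{size:.0f} GB")  # a 2-hour movie at 40 Mbit/s is about 36 GB
```

At ~36 GB per movie, it's easy to see why distribution costs would be a concern.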
 
Oct 25, 2013 at 11:37 PM Post #386 of 1,094
I feel that resolution isn't all that important. I would much rather have higher frame rates and more powerful lighting engines. I'd kind of like devs to stick to 1080p rather than go 4K and lose a lot of the beauty.
 
I downloaded a 1 GB, 6-minute video of Killzone gameplay. I wanted to see how good the PS4's quality gets. Did you know that Killzone was going to be 290 GB? They had to completely rework the compression algorithms to figure out how to get it down to 40 GB. This game has pretty amazing graphics. Still not heavily-modded-Skyrim levels, but amazing nonetheless.
 
Edit: Something I'm wondering about: within the next 4 years or so, Oculus wants to release a 4K Rift. Sony is also designing an HMD for the PS4, so the PS4 will have to deal with rendering 2 images. I wonder how the graphics will look for PS4 VR games? Also, definitely another huge reason to get a PS4 over an Xbox One. One will have VR, one won't.
 
Oct 25, 2013 at 11:54 PM Post #387 of 1,094
I feel that resolution isn't all that important. I would much rather have higher frame rates and more powerful lighting engines. I'd kind of like devs to stick to 1080p rather than go 4K and lose a lot of the beauty.
 
I downloaded a 1 GB, 6-minute video of Killzone gameplay. I wanted to see how good the PS4's quality gets. Did you know that Killzone was going to be 290 GB? They had to completely rework the compression algorithms to figure out how to get it down to 40 GB. This game has pretty amazing graphics. Still not heavily-modded-Skyrim levels, but amazing nonetheless.
 
Edit: Something I'm wondering about: within the next 4 years or so, Oculus wants to release a 4K Rift. Sony is also designing an HMD for the PS4, so the PS4 will have to deal with rendering 2 images. I wonder how the graphics will look for PS4 VR games? Also, definitely another huge reason to get a PS4 over an Xbox One. One will have VR, one won't.


The current Rift is one display split optically.  If they do a 4K Rift, it will probably be the same; it will probably be 4K for both eyes, so roughly 1920x2160 per eye.  Splitting a single display gets around lots of nasty synchronization issues.  John Carmack covers it in detail in his 2012 keynote, if you care that much about the details.  I watch them every year because I think he's fascinating.
 
A couple of other notes about rendering for the Rift: Since it's only one display split in half, rendering for it is basically just putting a left eye image on the left half of the screen, and a right eye image on the right half of the screen.  It's a little more complicated because the Rift uses fisheye lenses between your eyes and the display.  That increases the field of view.  Without the special lenses, it would be "like looking at the world through a pair of toilet paper tubes".
 
But since it's really just split screen, the rendering could be done on normal HDMI with software, just like rendering split screen multiplayer.  The part that's special as far as I/O is the head tracking.  When you are wearing the rift and move your head, your view of the world moves too.  That requires getting the head tracking info from the Rift back to the console.
 
Overall though, I think if they want the Rift to work on the XBone, they could accomplish it without a significant amount of special hardware on the console end.
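The "just split screen" idea above can be sketched without any real graphics API: one framebuffer, two viewports, and a camera offset per eye. The viewport struct and the half-IPD value here are made up for illustration, not from any SDK:

```python
from dataclasses import dataclass

@dataclass
class Viewport:
    """A rectangular region of the framebuffer, in pixels."""
    x: int
    y: int
    width: int
    height: int

def stereo_viewports(fb_width: int, fb_height: int):
    """Return (left_eye, right_eye) viewports covering each half of one framebuffer."""
    half = fb_width // 2
    return (Viewport(0, 0, half, fb_height),
            Viewport(half, 0, half, fb_height))

def eye_positions(head_pos, half_ipd=0.032):
    """Offset the head position sideways by half the inter-pupillary
    distance (in meters) to get the two camera positions."""
    x, y, z = head_pos
    return (x - half_ipd, y, z), (x + half_ipd, y, z)

# Render the scene twice, once per viewport, from each eye position:
left_vp, right_vp = stereo_viewports(1920, 1080)
left_eye, right_eye = eye_positions((0.0, 1.7, 0.0))
```

The head-tracking data coming back from the Rift would simply update `head_pos` every frame before the two renders.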
 
Oct 26, 2013 at 12:51 AM Post #388 of 1,094
 
The current rift is one display split optically.  If they do a 4K rift, it will probably be the same; it will probably be 4K for both eyes, so roughly 1980x2160 per eye.  Splitting a single display gets around lots of nasty synchronization issues.  John Carmack covers it in detail in his 2012 keynote, if you care that much about the details.  I watch them every year because I think he's fascinating.
 
A couple of other notes about rendering for the rift: Since it's only one display split in half, rendering for it is basically just putting a left eye image on the left half of the screen, and a right eye image on the right half of the screen.  It's a little more complicated because the rift uses fish eye lenses between your eyes and the display.  That increases the field of vision.  Without the special lenses, it would be "like looking at the world through a pair of toilet paper tubes".
 
But since it's really just split screen, the rendering could be done on normal HDMI with software, just like rendering split screen multiplayer.  The part that's special as far as I/O is the head tracking.  When you are wearing the rift and move your head, your view of the world moves too.  That requires getting the head tracking info from the Rift back to the console.
 
Overall though, I think if they want the Rift to work on the XBone, they could accomplish it without a significant amount of special hardware on the console end.

It's not 4K for both eyes individually, it's 4K split between 2 eyes. Yes, I do watch them, too.
 
Also, it's not just split screen; the rendering is a LOT more complicated than that. Otherwise, there wouldn't be the decrease in FPS you see. What you are referring to is called "Z-buffering", and even that will still take a good amount of power to pull off right. It is a common technique for creating the 2 images, but it makes for some very lousy 3D. Sony will not use that very often for the Rift, maybe for a few effects demos, but not for actual games. If they want to give people the full shock of 3D, they have to do geometric 3D rendering, which takes tons more processing power.
 
One of the more common, native ways it's done in-game is by using 2 cameras instead of 1, placed the same distance apart as your eyes. This is actually the method I'll be using, because it's natively supported in Unity. Also because I'm not good enough to make use of the other 2 methods XD
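To see why the depth-based ("Z-buffer") shortcut mentioned above looks poor, here's a toy one-scanline version: render once, then shift each pixel horizontally by a disparity inversely proportional to its depth. The disparity constant `k` is invented for the example:

```python
def reproject_row(colors, depths, k=4.0):
    """Fake a second eye's view of one scanline by shifting each pixel
    by disparity = k / depth (nearer pixels move more).  Pixels with no
    source stay None: these are the disocclusion holes that make this
    kind of reprojection look bad."""
    out = [None] * len(colors)
    for x, (c, z) in enumerate(zip(colors, depths)):
        nx = x + int(round(k / z))
        if 0 <= nx < len(out):
            out[nx] = c
    return out

# Two far pixels (depth 4) and two near pixels (depth 2):
row = reproject_row(["a", "b", "c", "d"], [4.0, 4.0, 2.0, 2.0])
```

The near pixels shift right out of the row and the left edge is left empty, so the second view has gaps that true two-camera (geometric) rendering would fill correctly.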
 
Oct 26, 2013 at 7:59 PM Post #389 of 1,094
It's not 4K for both eyes individually, it's 4K split between 2 eyes. Yes, I do watch them, too.
 
Also, it's not just split screen; the rendering is a LOT more complicated than that. Otherwise, there wouldn't be the decrease in FPS you see. What you are referring to is called "Z-buffering", and even that will still take a good amount of power to pull off right. It is a common technique for creating the 2 images, but it makes for some very lousy 3D. Sony will not use that very often for the Rift, maybe for a few effects demos, but not for actual games. If they want to give people the full shock of 3D, they have to do geometric 3D rendering, which takes tons more processing power.
 
One of the more common, native ways it's done in-game is by using 2 cameras instead of 1, placed the same distance apart as your eyes. This is actually the method I'll be using, because it's natively supported in Unity. Also because I'm not good enough to make use of the other 2 methods XD


I did actually say that it was 4K split between 2 eyes, though I may have worded it badly.  Full 4K is 3840x2160, so half of 4K would be 1920x2160.
 
Since every frame that goes through the device undergoes the same geometric transformation for the same lens, it seems like something you could build a lookup table for.  It would take up some RAM, but it would probably be faster than re-computing the transformation every time.  To create the lookup table you'd just run the transformation once, keep track of where every pixel came from and went to, and store that in the table.  It's possible the table would have to be 3D, though, since the lens effect may vary by distance.
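A minimal sketch of that lookup-table idea: precompute, for every output pixel, which source pixel the lens warp samples from; warping a frame then reduces to table lookups. The radial barrel-distortion formula and the coefficient `k1` here are a generic textbook model, not the Rift's actual lens parameters:

```python
def build_distortion_lut(w, h, k1=0.22):
    """Map each output pixel to a source pixel via the radial model
    r' = r * (1 + k1 * r^2), with r measured from the image center in
    normalized coordinates.  Pixels whose source falls outside the
    image are simply omitted from the table."""
    cx, cy = (w - 1) / 2, (h - 1) / 2
    lut = {}
    for y in range(h):
        for x in range(w):
            nx, ny = (x - cx) / cx, (y - cy) / cy   # normalize to [-1, 1]
            scale = 1 + k1 * (nx * nx + ny * ny)    # radial distortion factor
            sx = int(round(cx + nx * scale * cx))
            sy = int(round(cy + ny * scale * cy))
            if 0 <= sx < w and 0 <= sy < h:
                lut[(x, y)] = (sx, sy)
    return lut

lut = build_distortion_lut(64, 64)  # one lookup per output pixel at warp time
```

Pixels near the center map to themselves while the corners pull from outside the frame, which is the characteristic barrel shape; a per-eye table like this trades memory for per-frame math, as suggested above.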
 
Just a wacky idea from the peanut gallery.  I'm not actually going to be writing any games for VR...
 
Oct 27, 2013 at 3:54 AM Post #390 of 1,094
I loved the PS2 and then the Xbox 360. I felt the Xbox controller and Xbox Live were perfect for FPSs like COD, which is why, pre-reveal, I really wanted to like the Xbox One and thought it was a no-brainer. But then MS screwed a lot of things up, e.g. all the policies/restrictions that have now been removed, and the mandatory Kinect. So now I'm leaning towards the PS4, because the launch model already looks slim enough for me despite future revisions, and the new DualShock 4 looks quite impressive (hopefully it plays as well as it looks). With the Xbox One, I feel they could make some improvements, such as a slimmer model and *crossing fingers but highly doubtful* a Kinect-less model.
 
