Next generation of consoles come in - Page 339 of 402

  • monkman76 8 Oct 2013 14:42:15 3,913 posts
    Seen 3 hours ago
    Registered 6 years ago
    You're not taking this seriously. :evil:
  • Deleted user 8 October 2013 14:43:40
    Zomoniac wrote:
    monkman76 wrote:
    Zomoniac wrote:
    I'd much rather have 1080p30 with shiny effects.
    You presumably have a 1080p display then?

    There must be plenty of us considering a next-gen console despite only having 720p TVs, just as plenty of people (me included) played this gen on SD TVs for a good while.
    I do. But even if I didn't, a lot of the time, everything else being equal, I prefer 30fps to 60 anyway.
    Um... :S
  • Armoured_Bear 8 Oct 2013 14:44:33 10,240 posts
    Seen 2 hours ago
    Registered 2 years ago
    How can you prefer 30fps to 60fps you loon?

    XBL : ecosse011172
    PSN : ecosse_011172
    NNID : armoured_bear

  • Zomoniac 8 Oct 2013 14:48:27 7,785 posts
    Seen 3 hours ago
    Registered 10 years ago
    monkman76 wrote:
    Hm, never heard anyone say that before. Why?
    Just looks wrong to my eyes. Play might be a fraction more responsive, but it often feels strange and unnatural. In story games you lose the cinematic feel. Something like TLoU at 60fps would be really jarring.

    In racing games, in things like WipEout I like 60fps, but playing Forza 2 or 3, using bumper cam, there's an unnaturally low (by which I mean zero) effect of the road surface on the car, so at 60fps it's so smooth that the cars appear to be gliding on the road, and it feels very strange.
  • Deleted user 8 October 2013 14:54:43
    30fps hurts my eyes and plays like shit. Dull response times coupled with half the frames being fed to you per second results in an overall less engaging and immersive experience for me, by a long way.
  • beastmaster 8 Oct 2013 15:00:59 11,144 posts
    Seen 37 minutes ago
    Registered 10 years ago
    Armoured_Bear wrote:
    Pinky_Floyd wrote:
    I would have thought a single 670 wouldn't be that far ahead of the new consoles, given how super efficient the consoles are compared to the crusty old pc architecture. Can't see it being a huge gulf at any rate.
    I think there's still quite a gap but it'll narrow considerably with time.
    I'm sure Watch Dogs, AC4 etc. will run much better on a 670 and a decent CPU than a PS4.
    I've not done PC gaming so I'm a tad unfamiliar with all those graphics cards etc. My question, though: doesn't the PS4 having 8GB on the graphics card pretty much match (or better) what's out there on PC? I don't think I've seen many PC graphics cards with that much RAM.

    Edited by beastmaster at 15:01:41 08-10-2013

    The Resident Evil films. I'm one of the reasons they keep making them.

  • Widge Moderator 8 Oct 2013 15:02:04 13,243 posts
    Seen 5 hours ago
    Registered 6 years ago
    I can't say I have problems with 60fps. Then again, neither do I with 30fps. What I like is consistency and no tearing really.

    _ _ _

    www.unpaused.co.uk - electronic noise adjective salad

  • bitch_tits_zero_nine 8 Oct 2013 15:04:07 6,654 posts
    Seen 1 day ago
    Registered 3 years ago
    It's unified GDDR5. But developers have said that only 5GB is available to games, which is still more than most PC graphics cards.

    But the PS4 GPU is supposedly comparable in rendering performance to a mid range card, much, much weaker than a 670.
  • Deckard1 8 Oct 2013 15:05:29 27,049 posts
    Seen 1 hour ago
    Registered 5 years ago
    Anyone who says they can feel a massive difference in response time between 30 and 60fps while using a game pad is kidding themselves.
  • Widge Moderator 8 Oct 2013 15:06:01 13,243 posts
    Seen 5 hours ago
    Registered 6 years ago
    Pinky_Floyd wrote:
    I would have thought a single 670 wouldn't be that far ahead of the new consoles, given how super efficient the consoles are compared to the crusty old pc architecture. Can't see it being a huge gulf at any rate.
    Calling in to play the unreliable world of teraflops. I went and dug out what floppage my 660 was running at, to compare to the 1.8 of the PS4. Allegedly it is 2.1, and this is the card that is regarded as a notch below the recommended 660ti and 670's.

    Of course you have to consider dedicated hardware/OS vs. PC, SSD with 8GB ram and all that sort of variable, but I would imagine what we are going to see is going to be not a hell of a lot of different to what I have experienced in the last year.

    The only thing I can imagine changing over the next few years is the structure of games no longer being hampered by older platforms, which will in turn translate to what PC gamers get too.


  • Deleted user 8 October 2013 15:06:53
    beastmaster wrote:
    Armoured_Bear wrote:
    Pinky_Floyd wrote:
    I would have thought a single 670 wouldn't be that far ahead of the new consoles, given how super efficient the consoles are compared to the crusty old pc architecture. Can't see it being a huge gulf at any rate.
    I think there's still quite a gap but it'll narrow considerably with time.
    I'm sure Watch Dogs, AC4 etc. will run much better on a 670 and a decent CPU than a PS4.
    I've not done PC gaming so I'm a tad unfamiliar with all those graphics cards etc. My question, though: doesn't the PS4 having 8GB on the graphics card pretty much match (or better) what's out there on PC? I don't think I've seen many PC graphics cards with that much RAM.
    RAM is only one part of the equation. It allows more textures/higher res textures, more geometry etc at any one time. That still has to be backed up by everything else, fillrate, shader rate, vertex rate, cpu power etc etc.

    You could give a PS1 8 gigs of RAM, it's still going to have PS1 level visuals, just with high res textures and more of them.

    It does help in a big way though obviously, we should be seeing bigger worlds with far more stuff in them, more texture variety, more texture detail etc.
  • bitch_tits_zero_nine 8 Oct 2013 15:07:48 6,654 posts
    Seen 1 day ago
    Registered 3 years ago
    Which is why it's batshit crazy that COD recommends a 780.
  • Deleted user 8 October 2013 15:09:32
    Deckard1 wrote:
    Anyone who says they can feel a massive difference in response time between 30 and 60fps while using a game pad is kidding themselves.
    Anyone who says they can't is a heathen.

    The 360 pad's response time is a few milliseconds. I've tried the same games on my PC running at 30fps and 60fps, and the difference in response times and level of immersion is huge.

    Have you ever tried a 30fps fighting game? The difference is night and day. There is a reason every fighting game worth its salt runs at 60.

    Or even the same damn game, when something like Outrun 2 drops to 30fps on the PS2, the response times suddenly feel far more sluggish.
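The latency argument in the posts above comes down to frame intervals. A back-of-envelope sketch (frame time only; real input lag also includes controller polling and display processing, which this deliberately ignores):

```python
# Worst-case wait before an input is reflected on screen is roughly one
# frame interval: 1000 ms / fps.
def frame_time_ms(fps):
    return 1000.0 / fps

print(round(frame_time_ms(60), 1))  # 16.7 ms per frame at 60fps
print(round(frame_time_ms(30), 1))  # 33.3 ms per frame at 30fps
print(round(frame_time_ms(30) - frame_time_ms(60), 1))  # 16.7 ms extra
```

So halving the frame rate roughly doubles the worst-case wait, an extra ~17ms, which is the gap the two camps here either can or can't feel on a pad.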
  • Widge Moderator 8 Oct 2013 15:10:07 13,243 posts
    Seen 5 hours ago
    Registered 6 years ago
    Also the PS4 RAM is kept aside for some OS stuff, so it starts to bring it more in line with PC cards... remembering that they also can have huge wadges of RAM for other usage.


  • Zomoniac 8 Oct 2013 15:10:21 7,785 posts
    Seen 3 hours ago
    Registered 10 years ago
    bitch_tits_zero_nine wrote:
    It's unified GDDR5. But developers have said that only 5GB is available to games, which is still more than most PC graphics cards.

    But the PS4 GPU is supposedly comparable in rendering performance to a mid range card, much, much weaker than a 670.
    Never works out like that though, if you only have one hardware target and no OS it's much easier to optimise. Yes, you could run TLoU on a PC, but you'd need five times the graphics power to do it. There's no PC GPU with 256MB RAM that will even come close to that level of performance, so comparing theoretical performance doesn't really serve a purpose.
  • Pinky_Floyd 8 Oct 2013 15:14:42 7,343 posts
    Seen 58 minutes ago
    Registered 5 years ago
    Widge wrote:
    I can't say I have problems with 60fps. Then again, neither do I with 30fps. What I like is consistency and no tearing really.
    Pretty much. Also one of the reasons I like playing on the Wii U: none of the games tear and the framerate is usually pretty stable, whether at 30 or at 60. Tearing is the big scourge of the last gen. Well, that and 20fps gaming.
  • bitch_tits_zero_nine 8 Oct 2013 15:16:12 6,654 posts
    Seen 1 day ago
    Registered 3 years ago
    Still, with that taken into account you're still talking about a considerable disparity in cost of silicon.

    Be interesting to see what this AMD API for direct hardware access turns up in terms of performance.
  • L0cky 8 Oct 2013 15:35:10 1,498 posts
    Seen 9 hours ago
    Registered 11 years ago
    Zomoniac wrote:
    Just looks wrong to my eyes. Play might be a fraction more responsive, but it often feels strange and unnatural. In story games you lose the cinematic feel. Something like TLoU at 60fps would be really jarring.
    I know what you mean, but couldn't it be a case that you're just not used to it? I.e. you wouldn't think it was 'less cinematic' if movies had always been 60fps.

    The reason movies were originally 24-30fps was technical limitations and (literal) film costs. I think we've continued that in digital because people are not used to 60fps and therefore think it looks wrong (see the responses to The Hobbit, for example).

    I wonder if they had always been 60fps we'd think 24fps was crazy (and jarring) as a directorial choice.

    I remember watching LotR in the cinema, and in the scenes where the camera is sweeping over a large battle, the stuff on the outside of the view as it turns always looked really choppy because the characters are moving too far between frames. That's the first time I thought we need to move to 60fps for movies. Haven't seen The Hobbit yet, so can't say if I think it benefited (does it even have scenes like that?).
  • Deleted user 8 October 2013 17:46:15
    rumblesushi wrote:
    Deckard1 wrote:
    Anyone who says they can feel a massive difference in response time between 30 and 60fps while using a game pad is kidding themselves.
    Anyone who says they can't is a heathen.

    The 360 pad's response time is a few milliseconds. I've tried the same games on my PC running at 30fps and 60fps, and the difference in response times and level of immersion is huge.

    Have you ever tried a 30fps fighting game? The difference is night and day. There is a reason every fighting game worth its salt runs at 60.

    Or even the same damn game, when something like Outrun 2 drops to 30fps on the PS2, the response times suddenly feel far more sluggish.
    I agree with you that games between 20-30fps feel sluggish compared to games at 60fps (or even when they dip to 40ish fps).

    Thinking back to when we all got on the back of the Force Unleashed 2 developer for suggesting frame interpolation upscaling of 30fps games up to 60fps: that suggestion might need revisiting.

    With the advent of games possibly targeting 45fps (natively) I'm now thinking frame interpolation upscaling might actually be a really good solution if accelerated by a quality hardware scaler.

    I'm somewhat indifferent to the difference between 40 and 60fps in all but driving/fighting games, but very conscious of the sluggish running-in-water feel (of the AC/Batman type 25-30fps game).

    So a 40fps upscaled to 60fps would probably be a good middle ground for performance and zero-tearing in most UE powered games.
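Frame interpolation of the kind described above synthesises in-between frames from adjacent rendered ones. A minimal illustrative sketch using plain linear blending (real interpolators use motion vectors and dedicated scaler hardware; the function names and flat-list frame format here are hypothetical):

```python
def blend(prev_frame, next_frame, t):
    """Linearly blend two frames (flat lists of pixel values) at fraction t."""
    return [(1.0 - t) * a + t * b for a, b in zip(prev_frame, next_frame)]

def upsample(frames, src_fps, dst_fps):
    """Resample a rendered sequence to a higher frame rate.
    Output frame k sits at time k/dst_fps; blend the two source frames
    either side of that instant."""
    n_out = (len(frames) - 1) * dst_fps // src_fps + 1
    out = []
    for k in range(n_out):
        pos = k * src_fps / dst_fps        # fractional source-frame index
        i = min(int(pos), len(frames) - 2)
        out.append(blend(frames[i], frames[i + 1], pos - i))
    return out

# 3 frames rendered at 40fps -> 4 frames shown at 60fps
# (every 2 source frames become 3 output frames)
print(len(upsample([[0.0], [1.0], [2.0]], 40, 60)))  # 4
```

The 40-to-60 case is the cheap one: two rendered frames per three displayed, so only every third output frame is synthetic, which is presumably why it looks like a workable middle ground.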
  • cubbymoore 8 Oct 2013 18:54:33 36,468 posts
    Seen 3 hours ago
    Registered 10 years ago
    vizzini wrote:
    Thinking back to when we all got on the back of the Force Unleashed 2 developer for suggesting frame interpolation upscaling of 30fps games up to 60fps: that suggestion might need revisiting.
    Those were the days.
  • dpezzer 8 Oct 2013 18:57:24 34 posts
    Seen 4 weeks ago
    Registered 2 years ago
    Apologies if this has been asked before, but will the new consoles finally see off screen tear for good? Deffo my number one annoyance of the current gen!
  • cubbymoore 8 Oct 2013 19:00:17 36,468 posts
    Seen 3 hours ago
    Registered 10 years ago
    Nope.
  • bitch_tits_zero_nine 8 Oct 2013 19:09:54 6,654 posts
    Seen 1 day ago
    Registered 3 years ago
    Yeah, they'll just ask the renderer to do more, so the game engine will be constantly stressed to capacity, which is what produces shit like poor fps and tearing.

    Always a compromise between image quality, fps and graphical anomalies.

    Edited by bitch_tits_zero_nine at 19:10:45 08-10-2013
  • Ryze 13 Oct 2013 14:29:05 3,121 posts
    Seen 1 hour ago
    Registered 8 years ago
    I'm glad that this happened, anyway:

    What I'd like to see from streaming games, is games that can stream content without using a 'video terminal' approach.

    I'd like to see the game begin to download, and be organised such that when the title screen, options and intro/tutorial are ready, then the game can be loaded. Options can be selected, and the 'attract sequence' streamed in, but nothing more until enough data for the first level arrives.

    If your bandwidth is high enough, then by the time you've set any options, and watched the intro, the tutorial / first cutscenes are ready to go, and so on.

    So the intro sequence and title screen / options may be 200MB, and the first level or so might require up to 1GB worth of executable code and assets. The game may well be launchable after 700MB has been downloaded, depending on the download speed, because the intro and tutorial take around the same amount of time that the final 500MB of the 1st level takes to stream into the HDD.

    The tutorial could remain unskippable until enough data arrives for the first level to be playable without any (down)loading pauses. As soon as the 1st 1.2GB arrives, then the rest of the game continues to download in the background until the entire game arrives.

    There's no reason why this couldn't happen for some types of game.

    Something like Street Fighter or an open world game would be a challenge, as the pace of these games wouldn't lend themselves very well to this approach. For example, you could get through probably 20 characters and 10 backgrounds in SFvTekken within a matter of a few minutes, and there'd be little chance of many people having a fast enough download speed to be able to keep up without pretty much downloading the whole thing before starting the game.

    With a GTA type game, again, it'd be a challenge due to the ability to sprint across the map immediately in a fast car.

    Saying that, the option for streaming/caching could depend on the user's bandwidth. Those with slower bandwidth could be offered a download.

    For example, Star Wars: Rebel Assault and Silpheed on Mega CD back in the early 90s ran as streaming games, from a 1x CD-ROM @ 150KB/s, with nothing but the RAM to cache data into. GTA and Burnout games, plus loads of others stream the map/tracks from the DVD. An 8Mbit/s Internet connection is only 1MB per second, so huge games would be out of the question.

    With a cache (a HDD) available, it should be possible to pull enough data down from a server to start playing an XBL-sized game like Limbo, Braid or Bastion within a few short moments, if the game data is organised for such a scheme.

    We'd have a long way to go in terms of Broadband speeds and penetration before we could easily do this with dual-layer DVD-sized games or bigger.

    Edited by Ryze at 15:20:43 18-07-2012
    http://www.eurogamer.net/forum/thread/238900

    Edited by Ryze at 14:30:08 13-10-2013
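Ryze's launch-gating idea reduces to one rate calculation: the game can launch once the data still missing from the first level will finish arriving during the unskippable intro/tutorial. A sketch with illustrative figures (the function name and the numbers are hypothetical, not any console's actual installer logic):

```python
def prefetch_needed_mb(level_mb, intro_seconds, line_mbits):
    """MB that must be downloaded before launch so the rest of the first
    level streams in while the intro/tutorial plays."""
    mb_per_second = line_mbits / 8.0       # an 8 Mbit/s line is ~1 MB/s
    streamed_during_intro = intro_seconds * mb_per_second
    return max(0.0, level_mb - streamed_during_intro)

# Ballpark from the post: a 1000 MB first level, roughly five minutes of
# intro/tutorial, on an 8 Mbit/s connection.
print(prefetch_needed_mb(1000, 300, 8))   # 700.0 -> launchable at ~700 MB
```

On those assumptions the game becomes launchable at around the 700MB mark, matching the figure in the post; on a slower line the prefetch threshold simply grows toward the full level size, which is why the scheme degrades to a plain download for low-bandwidth users.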
  • Pinky_Floyd 15 Oct 2013 23:31:58 7,343 posts
    Seen 58 minutes ago
    Registered 5 years ago
    Watch Dogs delayed, and it looks like a wise decision. Even less reason to jump in at launch. Mind you, it isn't the only next-gen title that looks like it could do with being delayed.
  • -cerberus- 17 Oct 2013 20:24:19 2,402 posts
    Seen 1 hour ago
    Registered 3 years ago
    Gametrailers posted an old episode of the Bonus Round where the panel discussed the future of the industry. Scary stuff...

    "You see it too? For me, it's always like this..."
    (Angela Orosco - Silent Hill 2)

  • Chopsen 17 Oct 2013 20:30:53 15,699 posts
    Seen 39 minutes ago
    Registered 9 years ago
    36 minutes? tl;dw. Summary?
  • MrTomFTW Moderator 27 Oct 2013 10:50:42 37,300 posts
    Seen 4 hours ago
    Registered 11 years ago
    So that there rumour mill has been grinding about... something over night. And whatever it is, it's not Xbox One based. Instead what I can make out is that Sony is only letting a very select number of people at the hardware - just one North American video producing outlet so far. Plus when they give "direct feed" footage out they're also telling people not to edit the footage.

    So here's Kevin Dent's take on it. The relevant quote from that text:

    I am 100% sure that Microsoft will get a ton more coverage than Sony. There is no conspiracy theory here, the simple truth is that Sony are providing one debug unit to one video outlet in North America (I don’t know about the rest of the world) and only a select few “wordy/written/text” based outlets.

    And before the question is asked, I don’t actually know why Sony doesn’t want established video outlets to have access to video footage of their console before or after launch. It has never been done before.
    Although I admit, the only time I've ever heard of Kevin Dent before is when he had that spat with Phil Fish over Twitter, which was partly the cause of Phil cancelling Fez 2.

    Also, here's Marcus Beer's (YouTube's "AnnoyedGamer") post on the matter.

    It's not just about the games, it's reviewing the actual system 2 days before the damn thing ships.
    Something feels off right now to be honest, and a lot of people are pissed about it.
    If you have seen anything I have done over the last 8 months, you will know I have been very happy with the way Sony has been doing things, but now it's time for them to shit or get off the pot and they are being damn evasive about allowing serious access to the system before launch.
    I have mine preordered (ditto XboxOne) & if that is the only way I will get to have said access to the PS4 then so be it.
    When people delay access to something, game or hardware, it's never for a good reason in my experience.
    Arthur Gies of Polygon:

    You'll know answers to certain resolution rumors before the 12th. That's not why embargoes are separate for Xbox one. Think harder eyeroll.
    Here's Adam Sessler (ex-G4 presenter, now editor-in-chief of Internet TV channel Rev3) getting upset about it. Twitter links - 2

    My concerns are about my livelihood being dramatically affected by corporate decisions. This will have a nominal effect on you as a consumer
    This only affects myself and a handful of my colleagues who practice a particular form of coverage of the industry.
    And just for laughs here's Albert Penello tweeting after being woken up by an email because people assumed the problem was Xbox One related. But it's not.

    So what's going on? Are Sony being arseholes about letting people actual get their own footage for some reason, or is it just they haven't disabled the copy protection over HDMI? Probably the latter to be fair :D

    Edited by MrTomFTW at 11:05:08 27-10-2013

    Follow me on Twitter: @MrTom
    Voted by the community "Best mod" 2011, 2012 and 2013.

  • MrTomFTW Moderator 27 Oct 2013 10:57:40 37,300 posts
    Seen 4 hours ago
    Registered 11 years ago
    By the way that's a lot of words to say "LOL Twitter, LOL NeoGAF, they sure do get themselves wound up over very little".

