The Budget Gaming PC thread Page 117


  • Bremenacht 26 Jul 2013 15:26:16 17,614 posts
    Seen 13 hours ago
    Registered 7 years ago
    bitch_tits_zero_nine wrote:
    The software hasn't been designed yet, so you don't know; that's the point.
    That's how the argument started.
  • bitch_tits_zero_nine 26 Jul 2013 15:26:48 6,654 posts
    Seen 26 minutes ago
    Registered 3 years ago
    You're still speculating regardless, because the systems haven't been designed yet. That's a fact.
  • bitch_tits_zero_nine 26 Jul 2013 15:27:06 6,654 posts
    Seen 26 minutes ago
    Registered 3 years ago
    oops
  • graysonavich 26 Jul 2013 15:27:10 7,309 posts
    Seen 2 weeks ago
    Registered 4 years ago
    bitch_tits_zero_nine wrote:
    Games aren't using the grunt of PC GPUs and haven't required more than an 8800gt for years.
    :D
  • bitch_tits_zero_nine 26 Jul 2013 15:27:37 6,654 posts
    Seen 26 minutes ago
    Registered 3 years ago
    Bremenacht wrote:
    bitch_tits_zero_nine wrote:
    The software hasn't been designed yet, so you don't know; that's the point.
    That's how the argument started.
    Whose side am I on? :D
  • Roddles 26 Jul 2013 15:27:40 1,780 posts
    Seen 1 hour ago
    Registered 4 years ago
    bitch_tits_zero_nine wrote:
    Games aren't using the grunt of PC GPUs and haven't required more than an 8800gt for years.
    Sorry, but that's complete nonsense. Try running any modern 3D game on an 8800GT.
  • Bremenacht 26 Jul 2013 15:28:38 17,614 posts
    Seen 13 hours ago
    Registered 7 years ago
    bitch_tits_zero_nine wrote:
    Bremenacht wrote:
    bitch_tits_zero_nine wrote:
    The software hasn't been designed yet, so you don't know; that's the point.
    That's how the argument started.
    Whose side am I on? :D
    Vizzini!!
  • bitch_tits_zero_nine 26 Jul 2013 15:29:06 6,654 posts
    Seen 26 minutes ago
    Registered 3 years ago
    graysonavich wrote:
    bitch_tits_zero_nine wrote:
    Games aren't using the grunt of PC GPUs and haven't required more than an 8800gt for years.
    :D
    Not for a decent play experience. People are playing Skyrim on low on laptops.
  • Bremenacht 26 Jul 2013 15:34:48 17,614 posts
    Seen 13 hours ago
    Registered 7 years ago
    mazty wrote:
    you are simply speculating they won't use the video memory when the devs are almost literally saying the opposite. Devs + DF vs guy who can't read a quote. Not a compelling argument I have to admit.
    No, he didn't. You're trying hard to twist his argument into something else.
  • Roddles 26 Jul 2013 15:35:34 1,780 posts
    Seen 1 hour ago
    Registered 4 years ago
    mazty wrote:
    Roddles wrote:
    Avalanche software haven't really made anything visually spectacular have they? All I see on Wikipedia is:

    So Avalanche software and Guerilla Games. Anybody else?
    1. How the hell did you fail to read a quote twice?
    2. Avalanche Studios, not software. They made Just Cause 1 & 2, the latter being DX10 only on PC and used frequently for benchmarking.

    As bitch tits is saying, you are simply speculating they won't use the video memory when the devs are almost literally saying the opposite. Devs + DF vs guy who can't read a quote. Not a compelling argument I have to admit.
    Spending an extra £20 on a GPU because devs recommend it seems a no-brainer.
    My bad, indeed Avalanche Studios. I can happily put my hands up and admit I misread that. However, it hardly negates anything I've said just because I misread a single word. Not quite sure why that's the clincher in the overall point you're trying to prove. It's still just two studios.

    And now you've gone all personal and nasty when I've resisted judging your personality this entire discussion. Baffling as to why you're treating this like some kind of argument and competition. It's a shame really; a good technical discussion spoiled by internet posturing.

    Edited by Roddles at 15:36:27 26-07-2013
  • Bremenacht 26 Jul 2013 15:36:30 17,614 posts
    Seen 13 hours ago
    Registered 7 years ago
    mazty wrote:
    Spending an extra £20 on a GPU because devs recommend it seems a no-brainer.
    It's certainly advice aimed at you.
  • Bremenacht 26 Jul 2013 15:38:04 17,614 posts
    Seen 13 hours ago
    Registered 7 years ago
    Roddles wrote:
    And now you've gone all personal and nasty when I've resisted judging your personality this entire discussion.
    Hence 'cockroach', squirting noxious fluid when he feels threatened.
  • Deleted user 26 July 2013 15:46:18
    This mazty must be some sort of serious ninja warlock on the autistic spectrum with these social skills.
  • Roddles 26 Jul 2013 15:47:01 1,780 posts
    Seen 1 hour ago
    Registered 4 years ago
    Well sure, I'm frustrated that you can't see my point, but I'm just fine talking about the subject. We're only talking about memory on a gaming forum; no need to get worked up about it.

    It frustrates me to repeat this line yet again, but I'll keep cool about it:
    Sure, doubling the memory might help in x years' time; however, your 760 will sure as shit not be up to the task in 3 years regardless of memory - and even more so with consoles now catching up beyond their ATI X1800 and Nvidia 7800.

    I'm certainly not going to take developers' words over anything at this point, especially as it's only two of them. Developers can talk a lot of fluff when it comes to thinking their product is a special snowflake. I prefer to make decisions on an array of real-world data.

    Edited by Roddles at 15:52:25 26-07-2013
  • Roddles 26 Jul 2013 15:58:30 1,780 posts
    Seen 1 hour ago
    Registered 4 years ago
    while the PlayStation 4's chip fits in nicely midway between the higher-end 7850 and 7870. Just to match next-gen console from a core processing perspective, we're looking at investing anything between £130 to £180 in a graphics card
    From Digital Foundry no less.

    There have been plenty of games that tax the GPU and make it run at 100% load.

    Sure, 3 years is an arbitrary figure; however, I merely chose it to represent the timeframe in which we usually see Nvidia or ATI/AMD release two product ranges beyond the one you currently own. Replace "3 years" with "2 generations ahead".

    Edited by Roddles at 15:59:59 26-07-2013
  • gamingdave 26 Jul 2013 16:04:38 4,187 posts
    Seen 8 minutes ago
    Registered 10 years ago
    I can see the argument that a top-tier card today will be a lot less impressive in 4 years' time than it is now, regardless of its onboard RAM. But who is to say it will be 4 years? If it's more like 2, then it becomes a lot more appealing.

    However, if we are talking about spending an extra £40 on something that already costs £300+, then it really is a minimal increased outlay. For me, looking at the specs of the next-gen consoles, and going on what some devs have said, it really does seem the obvious choice.
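    A rough back-of-the-envelope of the relative outlay being described here, using only the figures quoted in the thread (an extra £40 on a £300+ card); a quick sketch, not a price comparison.

        # Premium as a share of the spend, using the thread's own figures.
        base_card = 300   # £, approximate cost of the card being discussed
        premium = 40      # £, extra for the higher-memory variant

        print(f"Premium vs card alone:  {premium / base_card:.0%}")              # ~13%
        print(f"Premium vs total spend: {premium / (base_card + premium):.0%}")  # ~12%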
  • Roddles 26 Jul 2013 17:27:03 1,780 posts
    Seen 1 hour ago
    Registered 4 years ago
    http://www.eurogamer.net/articles/digitalfoundry-ps3-system-software-memory

    Perfect timing :)

    4.5GB of total memory available under normal conditions. Somehow I don't think we're going to be seeing 4GB of that dedicated to VRAM. If you use the 3GB example that Killzone is asking for, that only leaves 1.5GB of system memory for the game to use, which will be ludicrously tight. I'm skeptical and am sure that was a figure they used in development.

    Edited by Roddles at 17:37:54 26-07-2013
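    A rough sketch of the memory split being described above, using only the figures quoted in the thread (DF's 4.5GB available under normal conditions and the 3GB Killzone allocation); the numbers are illustrative, not confirmed platform specs.

        # Back-of-the-envelope for the unified-memory split discussed above.
        total_ram = 8.0        # GB, PS4's unified GDDR5 pool
        available = 4.5        # GB, reported as available to games under normal conditions
        graphics_alloc = 3.0   # GB, the Killzone figure cited in the thread

        os_reserve = total_ram - available
        left_for_game = available - graphics_alloc

        print(f"OS reserve:                 {os_reserve:.1f} GB")     # 3.5 GB
        print(f"Left for game logic/system: {left_for_game:.1f} GB")  # 1.5 GB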
  • Deleted user 26 July 2013 17:33:25
    @mazty And I now have a law degree. You're not very good at this, are you? You don't appear to be very good at anything really; you even suck at being a social retard, going by your reception on EG. Probably time to end it and leave.

    Edited by mowgli at 17:34:07 26-07-2013
  • Phattso Moderator 26 Jul 2013 17:34:55 13,212 posts
    Seen 2 hours ago
    Registered 10 years ago
    @Roddles you've said "in three years" many, many times in this thread, but the software is being made now. It will be released by the end of this year. By the end of next year most, if not all, engines (both the multiplatform ones and the in-house custom ones) will be optimised for the next-gen consoles. And I'm fairly sure they're going to make use of as much memory for textures, effects and post-processing as they can.

    I think it's fair to say that "some" software could make use of > 2GB by the end of this year, and "most" by the end of next. If I built a PC today, I'd expect its shelf life to be 2-3 years. Certainly I wouldn't want to be upgrading it in the first two years unless I'd deliberately bought cheaper parts with the intention of replacing them.

    I'd spend a little extra on that future proofing today.

    Same argument goes for > 4 cores on a CPU. There was never a compelling reason to use even the 4 cores that are now commonplace. But now there is a compelling reason. It's happening and it's happening relatively soon.

    Apparently it's not cool to have the same opinion as User Mazty, but on this one it's broadly how I see it. I think I'd concede that 4 cores running at more than twice the clock will beat out the 8 shitty cores on the next-gen consoles though, so perhaps not so much on the i7 side of things. But on the video memory? I'd pay a £50 premium to double up, I think.
  • Roddles 26 Jul 2013 17:43:21 1,780 posts
    Seen 1 hour ago
    Registered 4 years ago
    As it's a unified pool of memory, they can't allocate as much to the GPU as they'd like, as the game needs a hefty amount of system memory too. The amount of memory the OS locks away, as revealed today, is pretty massive.

    I'd say that *some* PC games might go over 2GB by the end of next year, but only when ramping up everything to the max, which would make today's midrange cards cry like a baby regardless of how much memory they have.

    In regards to cores, Hyper-Threading isn't the same as real cores, if you're referring to the i7 from a few pages back. Only 6 of the Jaguar's 8 cores are available to games, and right now Haswell only supports 4 physical cores. Its successor Broadwell isn't out till 2015, and we don't know whether it will have 6/8 physical cores or not. The expensive LGA2011 platform that does support 6 physical cores (and more) is just that: crazy expensive. Developers aren't going to make that the target for your typical PC setup.

    But yeah, the clockspeed difference is massive anyway.

    Edited by Roddles at 17:55:28 26-07-2013
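    A quick way to see the logical-versus-physical core distinction mentioned above; a minimal sketch assuming the third-party psutil package is installed. On a quad-core i7 with Hyper-Threading it would typically report 8 logical CPUs but only 4 physical cores.

        # Logical (hardware-thread) count vs physical core count.
        import os
        import psutil  # third-party; pip install psutil

        logical = os.cpu_count()                    # includes Hyper-Threading threads
        physical = psutil.cpu_count(logical=False)  # physical cores only

        print(f"Logical CPUs:   {logical}")
        print(f"Physical cores: {physical}")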
  • superdelphinus 26 Jul 2013 17:44:07 8,035 posts
    Seen 1 day ago
    Registered 9 years ago
    Who gives a fuck lads
  • Roddles 26 Jul 2013 17:45:31 1,780 posts
    Seen 1 hour ago
    Registered 4 years ago
    We do, it's a PC gaming hardware thread!
  • superdelphinus 26 Jul 2013 17:49:14 8,035 posts
    Seen 1 day ago
    Registered 9 years ago
    Oh right, as you were then
  • graysonavich 26 Jul 2013 17:53:20 7,309 posts
    Seen 2 weeks ago
    Registered 4 years ago
  • Sharzam 26 Jul 2013 17:55:27 2,727 posts
    Seen 16 minutes ago
    Registered 5 years ago
    Good lord, I didn't see this argument coming when I posted this morning about changing to a 4GB GPU. I understand people are trying to be helpful, but surely at some point you have to accept that everyone is different, with different views and different thoughts about the value of money. As I mentioned above, personally I think a little bit more money for double the VRAM is worth it, as I'm already spending over £300.

    I don't have children; I have a full-time job and an OK level of disposable income, whereas the view might be different for a family man with a mortgage and 2 children. Anyway, can we please all calm down and get back to the topic at hand, which is meant to be how to upgrade a PC or build one on a particular budget, rather than just fighting.

    Known as 'Sharzam' in 98.5% of games

