Getting back into the PC building game Page 2


  • WoodenSpoon 26 Jan 2013 22:41:12 12,306 posts
    Seen 8 hours ago
    Registered 12 years ago
    Assuming you can just pull an HDD out of a Mac like you can with a PC, I reckon it'd be better to get a cheaper SSD now instead of that Barracuda, and just use your old HDD in the new PC if you need the extra storage space for music and the like.

    Buying the SSD later means you're gonna have to reinstall Windows.

    Edited by WoodenSpoon at 22:41:37 26-01-2013
  • Carlo 26 Jan 2013 22:48:58 18,227 posts
    Seen 1 hour ago
    Registered 9 years ago
    Dirtbox wrote:
    Migrating from an HDD to an SSD is a huge pain in the arse, you're better off with getting the boot/program drive to begin with and expanding with an HDD later.
    Or just enable RAID on the motherboard when you build it, before you install the OS (make sure the motherboard has Rapid Storage Technology), and then when you do eventually buy an SSD, pick one that's no larger than 60GB and enable RST in your Intel drivers in the OS.

    The SSD will then cache the HDD data you most frequently access, which is better than installing the OS on an SSD and your apps and data on an HDD.

    Plus, you can do this without reinstalling the OS, or even changing drivers.

    ;)

    PSN ID: Djini

  • Deleted user 27 January 2013 00:10:54
    @redneon

    Sadly, I very much doubt you can get the features you are looking for to work in your budget.

    As a starting point, the case and PSU you've chosen aren't any good. The case is old technology (AC'97/USB front panel; it should be HD Audio/USB3), making it unsuited to the newer motherboard. The PSU isn't 80 Plus certified and only has one PCIe power connector for powering SLI graphics cards. The case is also too small for the 10.7-inch GTX 660 card you selected.

    http://www.coolermaster.co.uk/product.php?product_id=6696

    http://www.coolermaster.com/product.php?product_id=30

    http://www.asus.com/Graphics_Cards/GTX660_TIDC2O2GD5/#specifications


    Personally, I doubt any dual-channel-memory Ivy Bridge system works well with SLI/Crossfire, so I'd abandon that plan, as the quad-memory-channel Sandy Bridge-E systems would eat £600 out of your budget.

    As a side observation: looking at the technical specs of the CPU you selected, it only provides 25.6GB/s of main memory bandwidth to its 8GB of RAM, while the graphics card has 2GB of GDDR5.

    http://ark.intel.com/products/68315

    If the current next-gen rumours are to be believed, that value is less than half the memory bandwidth of the slower DDR3 setup used in the Durango (64GB/sec).
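    For reference, peak figures like these fall straight out of channel count × bus width × transfer rate. A minimal sanity-check sketch (assuming dual-channel DDR3-1600 for the socket 1155 board, which is what the ark page quotes):

```python
def peak_bandwidth_gbs(channels, mt_per_s, bus_bytes=8):
    """Peak theoretical DRAM bandwidth in GB/s (decimal GB, as vendors quote it).

    channels  -- number of populated memory channels
    mt_per_s  -- module transfer rate in megatransfers/second
    bus_bytes -- bus width per channel (64-bit = 8 bytes for DDR3)
    """
    return channels * bus_bytes * mt_per_s / 1e3  # MT/s x bytes -> GB/s

# Dual-channel DDR3-1600: 2 x 8 bytes x 1600 MT/s = 25.6 GB/s
print(peak_bandwidth_gbs(2, 1600))  # 25.6
```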

    If you've read the coming systems' technical specs under NDA, then that value will either not bother you at all, or it might concern you about your new PC's shelf life for gaming: you'd either need a new GPU with at least 4GB of GDDR5, or a system based around socket LGA2011 rather than the socket 1155 setup.

    http://en.wikipedia.org/wiki/LGA_2011
  • richardiox 27 Jan 2013 00:33:23 5,738 posts
    Seen 5 hours ago
    Registered 10 years ago
    Fuck me, Vizzini somehow uses this thread for a sideswipe at Durango. What a fucking pissweasel.

    Also, always ignore his advice. You're after a sub-£800 mid/high-range machine. He's recommending a GPU with 4GB of GDDR5.
  • Deleted user 27 January 2013 00:44:55
    @richardiox

    No, I'm actually complimenting the rumoured 64GB/s memory bandwidth. My i7-3820 on an X79 chipset doesn't provide that amount.

    Durango and PS4 are rumoured to have a minimum of 4GB of main memory at a minimum of 64GB/sec bandwidth. So an Ivy Bridge system might need a GPU with 4GB of GDDR5 to subsume the consoles' specs (to be able to run next-gen cross-platform games with any advantage).

    Now do you understand?

    Edited by vizzini at 00:52:05 27-01-2013
  • Pinky_Floyd 27 Jan 2013 00:45:15 8,388 posts
    Seen 2 hours ago
    Registered 5 years ago
    richardiox wrote:
    Fuck me, Vizzini somehow uses this thread for a sideswipe at Durango. What a fucking pissweasel.

    Also, always ignore his advice. You're after a sub-£800 mid/high-range machine. He's recommending a GPU with 4GB of GDDR5.
    Games at the moment don't see any real benefit from the extra memory. Even using 3 x 1080p monitors.

    http://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154

    Basically, even if you are gaming in surround, 2GB is plenty for the foreseeable future. By the time you do need 4GB on your GPU, it will be time for a new one anyway.
  • Sandbox 27 Jan 2013 01:08:37 340 posts
    Seen 2 hours ago
    Registered 8 years ago
    http://www.youtube.com/watch?v=4et7kDGSRfc
  • Dirtbox 27 Jan 2013 03:10:21 79,216 posts
    Seen 17 hours ago
    Registered 12 years ago
    I got a better idea.

    Buy a Cell processor and use that for everything, because you don't need anything else if you've got a cell because the cell is the best because sony told me it was the best one for playing games and it can do loads of stuff that no one even knows about yet because they said so at E3 that year and it's too powerful for using for games but you can use it for playing games because it's cool but you can make nuclear bombs with it instead if you wanted because sony were talking about that once and saddam wanted to make a giant robot to crush the west and he was going to put a cell into it's brain because it is the best processor ever made.

  • DodgyPast 27 Jan 2013 03:21:14 8,541 posts
    Seen 40 minutes ago
    Registered 9 years ago
    vizzini wrote:
    @richardiox

    No, I'm actually complimenting the rumoured 64GB/s memory bandwidth. My i7-3820 on an X79 chipset doesn't provide that amount.

    Durango and PS4 are rumoured to have a minimum of 4GB of main memory at a minimum of 64GB/sec bandwidth. So an Ivy Bridge system might need a GPU with 4GB of GDDR5 to subsume the consoles' specs (to be able to run next-gen cross-platform games with any advantage).

    Now do you understand?
    The next console generation isn't really going to get going for another year or so anyway. I'm pretty confident my overclocked i5-2500K and 2GB 660 Ti will at least be in a similar enough ballpark to keep me going for another couple of years.

    Keeping up with the consoles isn't going to be that difficult... Reasonably budget kit will be powerful enough to iron out low frame rates and tearing on console versions that are over ambitious within a year of the console's launch.
  • Andee 27 Jan 2013 05:09:42 716 posts
    Seen 5 minutes ago
    Registered 8 years ago
    @redneon, is the budget excluding or including VAT? (Since the link was to the business site.) If it was excluding, I just thought it might be worth mentioning that Chillblast still have a system for possibly less than the cost of the parts: no monitor, but a 3570K, GTX 670, 240GB SSD, inc OS, for £1k inc VAT.
  • Dirtbox 27 Jan 2013 05:15:40 79,216 posts
    Seen 17 hours ago
    Registered 12 years ago
    Vizzini is in cloud cuckoo land again. All consoles will be doing is playing catch-up to the current level of PCs for the first few years, after which all the developers and publishers will be going out of business because they can't afford the development costs of the games they'll be trying to make to raise the already-too-high ceiling.

    Edited by Dirtbox at 08:45:53 27-01-2013

  • redneon Programmer, SUMO Digital 27 Jan 2013 08:09:33 46 posts
    Seen 1 week ago
    Registered 9 years ago
    @Sandbox Now, that's interesting. I have to admit, I hadn't even considered getting an AMD CPU but maybe I'll look into the FX8350...
  • redneon Programmer, SUMO Digital 27 Jan 2013 08:35:41 46 posts
    Seen 1 week ago
    Registered 9 years ago
    If I were to go with an AMD CPU would I be best off getting an ATI GPU too? I just wondered, with them being from the same company, if there's any benefits? I'd rather have an NVidia GPU though, if I'm honest, as the Linux drivers are much better than the ATI ones.
  • Dirtbox 27 Jan 2013 08:44:02 79,216 posts
    Seen 17 hours ago
    Registered 12 years ago
    No, you can use whatever; there are no benefits. Nvidia has sparklies like PhysX, which are worth having imo.

  • Sandbox 27 Jan 2013 09:56:03 340 posts
    Seen 2 hours ago
    Registered 8 years ago
    redneon wrote:
    If I were to go with an AMD CPU would I be best off getting an ATI GPU too? I just wondered, with them being from the same company, if there's any benefits? I'd rather have an NVidia GPU though, if I'm honest, as the Linux drivers are much better than the ATI ones.
    I build PCs for a living, supplying a number of businesses and schools. I have based this on AMD stuff, as that is what I know; they work very well and tend to offer a few more features at a given price. The computing industry is going multi-core, and as software catches up, AMD's stuff just gets better.
  • Dirtbox 27 Jan 2013 10:23:28 79,216 posts
    Seen 17 hours ago
    Registered 12 years ago
    I've owned a heap of AMD GPUs and CPUs over the last 15 years, and while the CPUs have all been great, I've never once had an AMD GPU that didn't have some annoying little problem with it, whether it's a dodgy driver installation or not changing resolution properly or fast enough, silly stuff like that. Still easily the best bang for buck, though, and definitely great cards. Just be aware that at least one stupid OCD-twitching annoyance is par for the course with them.

    Edited by Dirtbox at 10:25:33 27-01-2013

  • Deleted user 27 January 2013 10:39:54
    Sandbox wrote:
    redneon wrote:
    If I were to go with an AMD CPU would I be best off getting an ATI GPU too? I just wondered, with them being from the same company, if there's any benefits? I'd rather have an NVidia GPU though, if I'm honest, as the Linux drivers are much better than the ATI ones.
    I build PCs for a living, supplying a number of businesses and schools. I have based this on AMD stuff, as that is what I know; they work very well and tend to offer a few more features at a given price. The computing industry is going multi-core, and as software catches up, AMD's stuff just gets better.
    @Sandbox

    Watching the video you posted and looking at the comparative technical specs, the AMD CPU does compare more favourably on features and memory bandwidth to Intel's LGA2011 systems, and it is still cheaper than the lesser Intel Ivy Bridges.


    http://www.amd.com/us/products/desktop/processors/amdfx/Pages/amdfx-key-architectural-features.aspx

    The only reservation I'd have about recommending AMD/ATI stuff is reliability and driver support. In my experience, Windows does tend to degrade performance quicker on AMD CPUs; hotfixes/service packs don't tend to cripple ageing Intel CPU systems the same way, for some inexplicable reason.
  • Dirtbox 27 Jan 2013 12:00:09 79,216 posts
    Seen 17 hours ago
    Registered 12 years ago
    Yeah, well that's bullshit. There hasn't been any degradation of that type since winxp, the problem is that the drivers are generally shit.

  • redneon Programmer, SUMO Digital 27 Jan 2013 17:26:25 46 posts
    Seen 1 week ago
    Registered 9 years ago
    Ok, so I think I'm going to go AMD. There are just more pros than cons. So, should I get a GTX 660 or an HD 7850?
  • Deleted user 27 January 2013 17:27:48
    I'd get NVidia.
  • superdelphinus 27 Jan 2013 18:55:38 8,127 posts
    Seen 6 hours ago
    Registered 9 years ago
    7850
  • Dirtbox 27 Jan 2013 18:57:52 79,216 posts
    Seen 17 hours ago
    Registered 12 years ago
    Another nvidia call.

    There's not enough between the cards performance-wise, and the Nvidia card has PhysX, which is being featured in games more and more.

    Edited by Dirtbox at 18:58:09 27-01-2013

  • redneon Programmer, SUMO Digital 27 Jan 2013 21:01:54 46 posts
    Seen 1 week ago
    Registered 9 years ago
    Ok, so I've built an AMD 8350 spec on eBuyer (without monitors) which looks like this: http://www.ebuyer.com/lists/list/136497. What do you think?

    I could probably stretch to getting this motherboard too: http://www.ebuyer.com/393530-asus-sabertooth-990fx-r2-0-socket-am3-8-channel-audio-atx-motherboard-sabertooth-990fx-r2-0. You may have noticed my budget is being broken a little :)

    Edited by redneon at 22:10:27 27-01-2013

    Edited by redneon at 22:10:50 27-01-2013
  • Carlo 27 Jan 2013 21:18:58 18,227 posts
    Seen 1 hour ago
    Registered 9 years ago
    redneon wrote:
    Ok, so I've built an AMD 8350 spec on eBuyer (without monitors) which looks like this: http://www.ebuyer.com/lists/list/136497. What do you think?
    Doesn't open :/

    PSN ID: Djini

  • redneon Programmer, SUMO Digital 27 Jan 2013 22:11:15 46 posts
    Seen 1 week ago
    Registered 9 years ago
    Try now. Looks like it was considering the period as part of the URL.
  • FutileResistor 28 Jan 2013 11:54:20 1,239 posts
    Seen 22 hours ago
    Registered 5 years ago
    Sandbox wrote:
    http://www.youtube.com/watch?v=4et7kDGSRfc
    redneon wrote:
    @Sandbox Now, that's interesting. I have to admit, I hadn't even considered getting an AMD CPU but maybe I'll look into the FX8350...
    I've just watched this. You have to take their results for Arma II and Far Cry 3 with a lorry load of salt.

    Their results show that at 1080p you will get double the framerate with the 8350 vs the 3570K. The idea that even a stock 3570K is struggling with the processing demands of Far Cry 3 or Arma II is not credible. However, it gets weirder than that: at 1440p the GPU load has increased significantly and the CPU should matter less, but according to these results, switching from the 3570K to the 8350 will get you triple to almost quadruple the fps.

    In fact, it's worth highlighting just how ridiculous the Arma II figures are:

    Stock
    AMD 1080p: 51.92
    Intel 1080p: 25.56

    Overclocked
    AMD 1080p: 59.96
    Intel 1080p: 29.87

    Stock
    AMD 1440p: 37.8
    Intel 1440p: 13.12

    Overclocked
    AMD 1440p: 54.48
    Intel 1440p: 15.92

    According to these results at 1080p stock we get 51.92fps but at 1440p if we overclock the FX8350 to 5GHz we get 54.48fps. By these results an overclocked FX8350 is making more difference to the framerate than a GTX670 GPU.
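    Running the ratios on the quoted figures makes the oddity explicit; this is just arithmetic on the numbers listed above:

```python
# (FX-8350 fps, i5-3570K fps) pairs, copied from the Arma II figures above
results = {
    "stock 1080p": (51.92, 25.56),
    "OC 1080p":    (59.96, 29.87),
    "stock 1440p": (37.80, 13.12),
    "OC 1440p":    (54.48, 15.92),
}

for name, (amd, intel) in results.items():
    # A GPU-bound test should pull these ratios towards 1.0 at 1440p,
    # yet in these results the gap widens instead.
    print(f"{name}: AMD/Intel = {amd / intel:.2f}x")
```

    The AMD lead grows from roughly 2x at 1080p to over 3x at 1440p, which is the opposite of what a GPU-limited benchmark should show.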

    I would love for it to be true that a couple of Windows fixes for AMD FX CPU scheduling have made GPUs redundant for gaming, and I look forward to multi-CPU AMD motherboards in the future.

    I suspect the fact that their results disagree with logic, and with the benchmarks we have seen from every other testing site, means that they've screwed up in their testing somewhere.
  • Deleted user 28 January 2013 12:14:46
    @FutileResistor

    I suspect the results just illustrate that the Ivy Bridge systems are memory-bandwidth bottlenecked in certain games compared to the AMD system (25GB/sec versus 30GB/sec + 8GB/sec HT).

    This is no different than a computer program that needs more RAM than is physically available in the system, with the bottleneck being the speed of the HDD that the pagefile lives on.

    If they are also offloading the PhysX stuff to the CPU, then this might have an even bigger impact on reducing the GPU workload and getting the best out of the memory bandwidth and the overclock.

    They should add a Sandy Bridge-EP system to the tests as a point of reference.
  • Bremenacht 28 Jan 2013 12:15:55 19,665 posts
    Seen 2 days ago
    Registered 8 years ago
    \o/
  • Bremenacht 28 Jan 2013 13:23:10 19,665 posts
    Seen 2 days ago
    Registered 8 years ago
    @vizzini Something just popped up on an RSS for you:

    Does Memory Performance Bottleneck Your Games?

    I'll quote the take-home message at the end

    Getting back to the games that were affected by memory performance, only one title exhibited differences significant enough to be noticeable during real-world play. Even then, the average frame rates were so high that your eyes (and displays) would need to be about twice as fast as ours to realize the real-world benefits of faster RAM.

    The game in question, F1 2012, consistently averages more than 100 FPS, yet also scales well with memory improvements. Really, that's only important to sustain if you're using AMD's HD3D and Eyefinity technologies at the same time, encouraging frame rates two times the 60 Hz refresh rate of most monitors. If you don't have a trio of stereo-enabled screens, large performance bumps above and beyond already-high frame rates are really only good for bragging rights.
    ed - from THG.

    Edited by Bremenacht at 13:24:44 28-01-2013
  • Deleted user 28 January 2013 13:57:06
    @Bremenacht

    It sounds like you don't understand how each subsystem works in a computer, and how a bottleneck in one impacts the others.

    Memory module speed and memory channel bandwidth are not the same thing. Faster modules would make virtually zero difference if they are already bottlenecked by channel bandwidth.
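    To put rough numbers on that distinction (a sketch assuming a 64-bit bus per channel, not exact figures for any particular board):

```python
def peak_gbs(channels, mt_per_s):
    # Peak channel bandwidth: channels x 8-byte (64-bit) bus x transfer rate
    return channels * 8 * mt_per_s / 1e3

# Swapping in faster modules barely moves a dual-channel board...
print(peak_gbs(2, 1600))  # dual-channel DDR3-1600: 25.6 GB/s
print(peak_gbs(2, 1866))  # dual-channel DDR3-1866: ~29.9 GB/s

# ...while slower modules on more channels still win comfortably
print(peak_gbs(4, 1333))  # quad-channel DDR3-1333: ~42.7 GB/s
```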

    The test video that Sandbox linked, and FutileResistor brought into question, was on frame-rates below 60fps. Memory bandwidth can be an important bottleneck in modern-day computer games if the engine is able to scale to the CPU, memory size, and memory/HDD bandwidth, in addition to using the GPU. Adding an irrelevant link to something different doesn't change the point.
