kirankara Comments

  • We built a PC with PlayStation Neo's GPU tech

  • kirankara 28/07/2016

    @justerthought

    I wasn't arguing resolution makes no difference. Quite the opposite, in fact.

    I disagree about 1080p being a sweet spot, though.

    I'd say 1440p is much sweeter. Upscale that to 4K and that's a pretty sweet image with much less aliasing and pixel shimmer (rough pixel counts sketched below).
    Reply 0
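
    A quick back-of-the-envelope check of the pixel counts behind that claim, as a minimal Python sketch. The resolutions are the standard values for 1080p/1440p/4K rather than figures taken from the comment itself:

        # Pixel counts for common 16:9 resolutions, and how much of a 4K
        # panel each one fills before upscaling.
        resolutions = {
            "1080p": (1920, 1080),
            "1440p": (2560, 1440),
            "4K":    (3840, 2160),
        }

        pixels = {name: w * h for name, (w, h) in resolutions.items()}
        for name, count in pixels.items():
            print(f"{name}: {count / 1e6:.2f} million pixels")

        # 1440p renders ~44% of a 4K panel's pixels (vs ~25% for 1080p),
        # and the upscale is a clean 1.5x per axis, which is why it holds up
        # so much better than 1080p when shown on a 4K screen.
        print("1440p / 4K:", round(pixels["1440p"] / pixels["4K"], 2))
        print("1080p / 4K:", round(pixels["1080p"] / pixels["4K"], 2))
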
  • kirankara 28/07/2016

    @BadFlounder

    I disagree, and again, as I said originally, distance from the screen plays a part.

    I play on a 4K 27" monitor, and when I'm sat right next to it playing and then switch to 1440p or 1080p, there's a gulf in quality.
    Sit back 6-8 ft and it's hard to tell the difference between 1440p and 4K, although 1440p is still slightly softer (a quick angular-resolution sketch follows below).
    Reply 0
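
    To put rough numbers on the viewing-distance point, here is a minimal Python sketch of pixels per degree for a 27" 16:9 panel. The screen size and distances come from the comment above; treating roughly 60 pixels per degree as the limit of normal visual acuity is an assumption of mine, not something stated here:

        import math

        def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=(16, 9)):
            """Horizontal pixel count divided by the horizontal field of view in degrees."""
            w, h = aspect
            width = diagonal_in * w / math.hypot(w, h)          # physical screen width
            fov = math.degrees(2 * math.atan(width / (2 * distance_in)))
            return h_pixels / fov

        # Roughly "sat next to it" (about 2 ft) vs "6-8 ft back" (about 7 ft).
        for distance_ft in (2, 7):
            for name, h_pixels in (("1440p", 2560), ("4K", 3840)):
                ppd = pixels_per_degree(h_pixels, 27, distance_ft * 12)
                print(f"{name} at {distance_ft} ft: {ppd:.0f} px/deg")

        # Up close, 1440p lands well under ~60 px/deg while 4K sits above it;
        # several feet back, both are far beyond it, so the gap largely vanishes.
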
  • kirankara 28/07/2016

    @Beva78

    Nobody needed 1080p.

    Nobody needed a PS4 either; we could have stuck with PS3-level visuals.

    Technological advances give us improvements that make visual experiences better. We don't need any of them; they're just a better experience.
    Reply +2
  • kirankara 28/07/2016

    @Pirederas

    4K isn't a fad. That's a daft comment. Was 1080p a fad too?

    Eventually it will become the standard as technology catches up.

    It's a huge improvement in image quality. Once you experience it, going back is hard. That said, for many people who buy a 4K TV, with screen size and viewing distance taken into account, 1440p upscaled would probably be good enough.
    Reply +3
  • kirankara 28/07/2016

    @rob

    It's really not that radically different.

    Yes, using a low-level API and optimisations will have a positive impact on performance figures, but this gives a decent insight into how the Neo will perform from a GPU perspective.

    The more worrying aspect of the article is the use of an i7, which far exceeds the CPU in the Neo. That may be a bigger hindrance than the GPU will be.
    Reply +1
  • Nintendo NX is a portable console with detachable controllers

  • kirankara 28/07/2016

    @jabberwoky

    They don't have to turn things around in the sense of doing something drastic.

    They can't and won't ever sell the figures they sold before. The whole market has changed considerably.

    Their financial results reflect the reality that the Wii U has sold to everyone who will buy it, and they're now awaiting the new hardware that's on the horizon.

    Fairly standard for this point in a console's life.

    The last figures I saw had the Wii U having sold about as many units as the Xbox One, which has probably changed now, with the Xbox One in the lead.

    They will simply sell their 20 million units and then do the same again next time.

    They are also entering the phone game business to supplement their income.

    This is not new territory for Nintendo. The Wii was actually a freak success and was not in line with previous hardware sales. The DS and 3DS will become irrelevant, and the NX will be their combined mobile and home gaming device, meaning only one piece of hardware to develop for and resources used efficiently.

    It paints a potentially brighter future for them.
    Reply +1
  • kirankara 28/07/2016

    @jabberwoky

    You don't get it, do you?

    They don't really need to turn it around, in reality.

    They've accepted their place in the ecosystem, and that's not doing what Sony or MS do.

    Mobile gaming is big in Japan, and people there don't like having many devices around the home, so here you have a device that brings home and portable gaming together. They also don't care so much for flashy visuals and don't want the AAA games the western world clamours for.

    This will keep development costs down for financially strained Japanese developers producing the games their Japanese audience likes, Nintendo will produce their usual first-party games, and it will be easy for indie developers to port to.

    They're simply not aiming for the same markets.

    I would have loved it if the Wii U could have been a mobile device, and I love the idea that you can go home, whack it onto the stand by the TV and play some Mario on the TV too whilst the device charges.

    Sure, it's not revolutionary, but it's a more refined version of previous implementations.

    They probably won't sell 50 million units, but they will sell 15-20 million plus a load of first-party games, which means they will keep earning money both ways.
    Reply +6
  • kirankara 27/07/2016

    @Subzero87

    Speak for yourself, I love this concept.
    Reply +4
  • kirankara 27/07/2016

    I personally quite like the concept. Nintendo clearly isn't going to get into the rat race with Sony and MS and doesn't care about the big publishers bringing their efforts to its consoles. They make their money selling their hardware and through their own titles.

    I love the concept. I hope it has a bit more oomph than the X1 Tegra, but frankly I wouldn't care if it doesn't, as they will churn out great-looking first-party games I can take on the move and then hook up to the TV later at home. Plus I can grab some indie games to play on the move. Everything I wanted the Wii U to be really, tbh.
    Reply +2
  • AMD Radeon RX 480 4GB vs 8GB review

  • kirankara 25/07/2016

    @FMV-GAMER

    That's true, but on the flip side there have been suggestions that 3GB is required just for console-level textures.

    It would be an utter waste having a 1060 unable to run higher-res textures, better effects and resolutions just because they put in 3GB of VRAM.
    Reply 0
  • kirankara 25/07/2016

    @dogmanstaruk

    He's not coming across as a prick... he is a prick.
    Reply +1
  • kirankara 25/07/2016

    @dipje

    Try AC Unity or Arkham Knight with ultra textures on a 2GB card.

    It won't be pretty.
    Reply 0
  • kirankara 25/07/2016

    @henrylee

    What you mean is that a 480 8GB, when OC'd, can beat a 980 in some games.

    But then you can heavily OC a GTX 980 too.
    Reply 0
  • kirankara 24/07/2016

    @Megaknight

    "My old Radeon 7970 has 3 GB of RAM and can prove you wrong, about 3 GB not being enough, period."

    Cobblers. You can rinse 3GB with ease at 1080p in many games now. Your 7970 wouldn't be able to run half of those with enough settings maxed out at stable frame rates to hit that issue anyway, so it probably doesn't affect you that much.

    There are a number of games whose texture settings alone push usage above 3GB.

    This card is meant for 1080p and even 1440p, and there's no way on God's green earth Nvidia will release a 3GB version of it. You would be unable to run it at the settings or resolution it deserves with 3GB.

    I also stated it's not suitable for MODERN gaming. VRAM usage has rocketed, and if you don't want to play with substandard textures now or in the near future, you need 4GB of VRAM going forward, even for 1080p.

    Read this article and you can see several games start hitting above 3GB, sometimes even at 1080p, and above 4GB at 1440p (or check for yourself; a quick way of polling usage is sketched below):

    http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x/2
    Reply 0
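
    If you'd rather check this on your own machine than take the article's word for it, a minimal sketch along these lines works on Nvidia cards. The nvidia-smi query flags used here are the tool's standard ones; note the reported figure is memory allocated, which can overstate what a game strictly needs:

        import subprocess
        import time

        # Poll VRAM usage once a second while a game is running and track the peak.
        QUERY = ["nvidia-smi",
                 "--query-gpu=memory.used,memory.total",
                 "--format=csv,noheader,nounits"]

        peak = 0
        for _ in range(60):
            # First GPU only; values are reported in MiB.
            line = subprocess.check_output(QUERY).decode().splitlines()[0]
            used, total = map(int, line.split(","))
            peak = max(peak, used)
            print(f"VRAM: {used} / {total} MiB (peak so far: {peak} MiB)")
            time.sleep(1)
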
  • kirankara 24/07/2016

    @MadchesterManc

    I wouldn't bother with the Nano at that price.

    The 1060 seems to match it in most games:

    http://www.guru3d.com/articles-pages/geforce-gtx-1060-review,1.html
    Reply 0
  • kirankara 24/07/2016

    @Sober-Si

    I find it hilarious that Nvidia is charging more for their reference coolers now.

    They're decent, tbf, and look great, but not worth the premium they're putting on them.
    Reply +3
  • kirankara 24/07/2016

    @pomi

    The 1060 seems the best bet overall, imo.

    On balance there's little in it, but the 1060 will run cooler and quieter and overall has a slight performance advantage in the majority of games. Async compute may show an advantage for the 480 in some games in future, though.
    The 1060 SC should OC slightly too, plus it gives you access to GameWorks features in titles. They get a bad name, but they can be a nice touch.
    Reply 0
  • kirankara 24/07/2016

    @dipje

    The 970 had 4GB.

    3.5GB of faster VRAM and 0.5GB of slower RAM.

    They only have the option of 3 or 6GB due to the memory setup of the card (rough arithmetic below).

    Nvidia are not insane. 3GB isn't enough for modern games. Period.
    Reply 0
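
    For anyone wondering why the options work out to 3 or 6GB rather than 4GB, the arithmetic below is a rough sketch. The 192-bit bus width and 32-bit-per-chip GDDR5 channel figures are standard published specs for this class of card, not numbers from the comment:

        # A 192-bit bus with one GDDR5 chip per 32-bit channel means six chips,
        # so capacity comes in multiples of six times the per-chip density.
        bus_width_bits = 192
        bits_per_chip = 32
        chips = bus_width_bits // bits_per_chip              # -> 6

        for chip_gb in (0.5, 1.0):                           # 4Gb vs 8Gb GDDR5 parts
            print(f"{chips} chips x {chip_gb}GB = {chips * chip_gb:.0f}GB")

        # 4GB doesn't divide evenly across six chips at standard densities,
        # which is the constraint being pointed at; the 970's 3.5GB + 0.5GB
        # split mentioned above is what an uneven arrangement looks like.
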
  • kirankara 24/07/2016

    @dipje

    There won't be a 3GB 1060.

    It's too much of a mid-range gaming GPU to have 3GB of VRAM with the way things are moving. 4GB really is pretty much the baseline for this level of GPU (this is a Pascal 970, effectively).
    Reply 0
  • kirankara 24/07/2016

    @Bloodios

    The story is that 4GB memory modules weren't ready at launch, and AMD wanted to ensure it could launch at the $199 price point, so it took the hit.

    Whether the cards still on sale come with 8GB modules might be purely pot luck.
    Reply +1
  • kirankara 24/07/2016

    @televizor

    Just google that shit... there have been benchmarks around for weeks now.

    Obviously CrossFire isn't available for all games (Gears of War 4 won't support it, for example), so personally I'm not going dual-GPU again until the situation stabilises.

    https://youtu.be/cVVJPbFRDEc
    Reply +2
  • kirankara 24/07/2016

    It's just great to see that anyone wanting a card around this price range will get great value whichever card they go for.

    I can't really see the necessity of 8GB, but 4GB does seem short-sighted.

    6GB seems ideal, but if you're going AMD, then the 8GB seems the best choice.
    Reply +1
  • Tech Interview: Gears of War 4

  • kirankara 24/07/2016

    @SaberEdge

    Aye, I don't see any harm in extra settings as long as the basic game already runs well. I'm a relatively high-end PC owner myself.
    Things like PCSS will one day become standard and are indeed great additions.

    I don't get why people feel the compulsion to max out either. I've learned the hard way that it's an almost impossible dream, as something always changes and suddenly you can't max out a game.
    Reply +1
  • kirankara 24/07/2016

    @SaberEdge

    Whilst I kinda agree with you, sometimes they develop games and simply tack on certain settings that ultimately offer little to image quality but are very taxing on hardware.

    I'd prefer they get games running well and looking good overall, with extra settings as a bonus really. I don't want settings that are super intensive but add little to image quality. I want a great-looking game as the starting point and smart extra settings that enhance it.
    Reply -1
  • Is this the best cheap phone for Pokémon Go?

  • kirankara 24/07/2016

    @carmagainagain

    Sure, let's make this personal. Why don't you insult my mom too?

    The concept did nothing for me. It was a free demo and I tried it, but it's not my thing.

    I don't need to have a woman piss and shit on me to know it's not my thing. I didn't think I would like it, and I didn't.

    Don't like it? Blow me.
    Reply +1
  • kirankara 24/07/2016

    @ronorra

    The whole notion of "gameplay" is so subjective.

    I tried Life is Strange yesterday and deleted it within a few minutes as I was bored to tears, frankly. Yet others rave about it.

    Different strokes for different folks, I guess.
    Reply -1
  • kirankara 24/07/2016

    @beanchance

    I'm hoping that was an attempt to be funny and not serious.
    Reply -1
  • Gears of War 4's PC-exclusive features detailed

  • kirankara 24/07/2016

    @togan

    Wtf have the GPU and CPU you own, along with when you purchased them, got to do with not upgrading to Windows 10?

    If anything, you are doing them a disservice by not doing so, as DX12 will make better use of them.
    Reply +4
  • Spec Analysis: Nvidia's next-gen Titan X

  • kirankara 23/07/2016

    @Sober-Si

    You're missing my point, which is pretty much that it's early days and that Vulkan and DX12 aren't purely about async compute. Not every game will rely on async compute, and other games will benefit from Nvidia's architecture and strengths.

    I was simply throwing it at him to illustrate that one game performs better on one GPU vendor and another on the other, and to show a "grey" area in his black and white world.

    I am personally annoyed that Nvidia didn't see async compute as relevant and plan for its use better, but I am also sure it's not the be-all and end-all.

    Also, his choice of data is, as always, cherry-picked.

    http://www.gamersnexus.net/game-bench/2510-doom-vulkan-vs-opengl-benchmark-rx-480-gtx-1080

    Clearly AMD benefits more from Vulkan, but here we see the 970 is actually not that far behind a 480 on average fps with Vulkan anyway in these tests.

    Furthermore, Vulkan even hinders the Nvidia card here, so maybe they haven't worked so hard to get Vulkan running well on Nvidia hardware?

    Who knows?

    Same with Total War, but that one actually had AMD working alongside the developers and it's still losing out (kinda worrying in itself), so you'd expect it to come out better.

    So what went wrong?

    Maybe Nvidia's architecture suited it better? Maybe it's early days and AMD hardware requires more work to get to that point, due to needing to leverage lower levels of the API?

    I don't know. I've just learnt that in these matters things are rarely as black and white as this nob-ed tries to make out.
    Reply +1
  • kirankara 23/07/2016

    @stephenrundle

    It's not about being uneducated or a kid; it's simply that they have the disposable income and want the best product they can afford now.

    SLI brings issues (Gears of War 4 won't work with it, for example), and they don't want to wait till whenever the 1080 Ti comes out.
    Reply +2
  • kirankara 23/07/2016

    @kintama

    Oh look, you're repeating stuff people already know. Big surprise.

    Now, back to Total War in DX12 being outperformed by Nvidia in DX11.
    Reply +1
  • kirankara 23/07/2016

    @Sober-Si

    I was exaggerating slightly for dramatic effect, tbh.

    But if you start whacking settings up to ultra on games, 4GB can easily be hit.

    However, you won't get decent frame rates on most cards with a 4GB limit anyway if you do start maxing settings across the board.
    Reply 0
  • kirankara 23/07/2016

    @FMV-GAMER

    I forgot it was a 4GB card too.

    That 4GB is going to be perfect for 4K... oh wait... it's barely enough for 1080p now ;)

    It has 8GB shared across two GPUs, I'm guessing?
    Reply 0
  • kirankara 23/07/2016

    @kintama

    I'd also like to point out that the 295X2 and Pro Duo will both perform pretty shitty on the new Gears of War with no CrossFire or SLI support, and as I've pointed out before, DX12 isn't simply async compute as some believe.

    Total War is an AMD-sponsored game wherein the DX12 code is optimised for their cards and... it still performs worse than Nvidia's DX11 performance.

    http://www.pcgamer.com/total-war-warhammer-dx12-boost-for-amd-still-cant-match-nvidias-dx11-performance
    Reply +3
  • kirankara 23/07/2016

    @kintama

    Broken record much?

    That Pro Duo won't do shit if there's no CrossFire support, will it?

    That card also sells for well over a grand, requires more space, uses significantly more power and frankly is a far less elegant solution.

    But let's compare apples to oregano (this autocorrect error seems even more appropriate than oranges, tbh).

    AMD seem to have given up on producing elite cards; they want to promote CrossFire of high-end cards as the answer.

    Until CrossFire becomes as reliable as it once was, for many people it's simply not a solution they're interested in.
    Reply +2
  • Nvidia GeForce GTX 1060 review

  • kirankara 21/07/2016

    @kintama

    His scientific methods are slightly questionable.

    He doesn't clarify what settings he's using. I know the Nightmare settings need over 4GB of VRAM due to the textures used.

    If he's using Nightmare settings, then his issues could lie there.

    My 980 Ti at 4K didn't have this issue, that's for sure.
    Reply 0
  • kirankara 20/07/2016

    @PCMaestro

    " The "async drivers" were announced months ago lol"

    Took AMD years to even admit issue of micro stutter.

    Nuff said.

    Have you ever even seen an AMD driver bug fix list....Ive seen encyclopaedia's that are shorter.

    They also have fixes for games from years ago.

    Nuff said.

    Now, I must point out that Ive owned both, and would happily swap over again IF I had faith in AMD, which I just don't.

    I gave away my r9 290 and it was a decent card, but I always felt like I was dealing with a bunch of boys in their bedroom fixing stuff with AMD.

    Until AMD have truly caught up in all respects, including driver support and software support, then simply having better async compute isn't going to sway me.

    Your hilarious attempts to trash Nvidia are frankly pathetic. The reason AMD have a minority market share, is because theyve been sub standard for long time.

    They are heading in right direction for sure, and Nvidia need to also take heed and not be arrogant, but the reality is AMD will just push Nvidia to be stronger again.

    They have been able to rest on laurels too long. They don't even have to try to match AMD with more efficient architecture.
    Reply -3
  • kirankara 20/07/2016

    @PCMaestro

    We might get that async driver around the same time AMD gets their next driver out.

    Which is why AMD are where they are now.

    How long did it take them to address micro-stutter? How long to address the lack of adaptive vsync? Etc, etc.

    The answer was usually years, btw.

    Now, talking of disabling things to stop PCs blowing up, AMD were actually recently responsible for blowing up GPUs:

    "AMD has confirmed the existence of a bug in its latest Radeon Software Crimson driver bundle which causes fans to spin too slowly, but not before users have reported complete GPU failures as a result."

    So, whilst async compute may well be impossible on Nvidia cards, I'd still rather put my trust in them right now than in AMD.
    Reply -3
  • kirankara 19/07/2016

    @dogmanstaruk

    Palit were a little sketchy when they first arrived on the scene, but they quickly established themselves and their cards became very solid. I've seen some Palit versions of cards review better than the more esteemed competition.
    Reply 0
  • kirankara 19/07/2016

    @IronSoldier

    Ooooh, that's clever... right up there with Phony.

    I wonder how they come up with this genius material.

    On a side note: Nvidia are greedy bar stewards, but as market leaders, this should be no shock to anyone.

    Apple are no different. Sony were no different in their era of dominance in the audio and visual markets.
    Reply +1
  • kirankara 19/07/2016

    @Sober-Si

    Unless you're going to drop some serious wedge, there's no point upgrading from that card yet.

    I just don't see the value in flogging your card cheap for the gain you'd get from upgrading to, say, a 1060, and probably having to spend £100 at least on top.

    I've made these impulse upgrades before, and unless you've got cash to spare, you're better off riding it out a bit longer.
    Reply +1
  • kirankara 19/07/2016

    @IronSoldier

    No need, it will just be blah blah about Ashes of the Singularity, async compute, etc.
    Reply +2
  • kirankara 19/07/2016

    @LargeStyle

    You've already had the answer to that question, really.

    For someone who had, say, a 760, a 1070 would still be a huge jump.

    Or a 660 Ti to a 960.

    There are rarely huge jumps from one generation to the next. My 980 to 980 Ti was a similar jump to 980 Ti to 1080.
    Reply +1
  • kirankara 19/07/2016

    @LargeStyle

    Well, technically, no.

    Larger amounts of VRAM, a new architecture with better (though still seemingly dubious) async capability and a lower TDP, as well as better pricing than what you would have paid for the previous equivalent card.

    What more do you want, in reality?
    Reply +6
  • kirankara 19/07/2016

    @The_shlaaaag_returns

    Vulkan and DX12 seem mainly to bring AMD cards in line with their Nvidia equivalents, whilst offering advantages in some other games. However, this works both ways, as is often the case. Vulkan and DX12 still won't become the norm for a while yet.

    I'd still currently choose Nvidia for Nvidia's features, which, whilst not game-changing, are a nice touch.

    For the first time in a long time, though, I'm seeing AMD cards as genuine competitors, although I don't expect this to translate into actual sales and eating into Nvidia's market share.
    Reply 0
  • kirankara 19/07/2016

    Whilst it's not game-changing in my opinion, and it's more expensive than the 480, it's clearly going to mop up and sell like hot cakes.

    They've delivered a card that replaces the 970 whilst bringing 980 levels of performance, and it's cheaper than both.

    It would have been better if they'd dropped the price slightly, but third-party cards might be cheaper and offer better cooling options anyway.

    I don't see the 480's 8GB as a real advantage at 1080p and 1440p, but it could possibly become relevant 3-4 years down the line.

    A good, if unspectacular, GPU release.
    Reply +2
  • Video: Star Wars Battlefront's offline multiplayer feels like the game's best addition yet

  • kirankara 21/07/2016

    @Saul_Iscariot

    George can go on a bit, but you only have yourself to blame with that comment, and your response continued to belittle PC gaming (I'm guessing mainly because you felt provoked).

    You can't really espouse tolerance of others' opinions and the subjectivity of value, criticise narrow-mindedness, and then slag off PC gaming.
    Reply +1
  • Rise of the Tomb Raider PS4 includes a PSVR chapter when it launches

  • kirankara 19/07/2016

    @Gemini73

    Rise is a good game. I think the scripting is again pretty crap, and in that respect it won't come close to Uncharted, but the game itself is pretty good taken as an action game rather than an actual tomb-raiding game.
    Reply +2
  • kirankara 19/07/2016

    @IronSoldier

    Surely even if we were specifying the definitive console version, the Neo would still fall short of the Scorpio version, so it can't remotely be considered definitive on Neo.
    Reply -2
  • Pokémon Go: Type locations, and discovering Pokémon types by habitat

  • kirankara 19/07/2016

    http://metro.co.uk/2016/07/19/nintendo-is-now-worth-more-than-sony-thanks-to-pokemon-go-6015465/

    Random, unrelated news, but Nintendo is now worth more than Sony thanks to this game.
    Reply 0