That price is absolutely obscene.
I insult people when they make - and crucially REPEAT - stupid arguments, not, as you so loosely assume, in order to "put others down to feel superior about themselves."
How old were you when you were finally allowed to dress yourself - twenty, perhaps?
Was this 'licensed'? Otherwise this guy has a right to sue.
Hold on, Vin Diesel is cool?
I want to watch it come down.
I think the problem for oldies like me is that we often remember Nintendo's golden age through rose-tinted glasses. They haven't progressed with the times; it all seems to have been a downward spiral since the N64 days.
The sudden art-style change was a clone of Codehunters... it would be interesting to hear Randy's views on that.
So many people have already dismissed him outright that it didn't make a difference what he had to say.
Gearbox CEO Randy Pitchford on gamer criticism: 'some people are sadists'
“If you’re making entertainment on a grand scale, if you’re reaching millions, there will be tens of thousands of people who absolutely hate us, and some percentage of those will take it upon themselves to let us know how they feel,” he said.
“I read it in this way: we moved those people, we touched them – even the person who hates [your game] so much, you’ve affected them. That’s why we fight, we’re creating emotion and experience - and some people thrive on that type of feeling, some people are sadists.”
“There is always the person who’s got to stand on the sandcastle, they must crush it,” he said. “That’s their way of relating to that. It’s typically a less sophisticated mind. There’s a dark part of us all that likes the idea of crushing a sandcastle, but most of us will respect it and let it be. That’s why we like playing video games where we can blow stuff up and no one gets hurt.”
Payback for the NGE, bitch!
The law has no fucking clue what it's doing these days, does it.
You do realise that you have played right into their hands, yeah?
thanks for replying!
Better than anything announced for the PS4.
Too bad the guy went after Smedley's family, but I'm glad he's finally getting what he deserves. The lying asshat.
Get Moving Shadow back on the soundtrack, and I'm all over this =D
He's a 17-year-old child who hurt nobody and stole nothing; some of you are fucked up for calling for him to be raped in prison or forced to fight terrorists.
It's your problem, not mine...
Media critics Anita Sarkeesian and Jonathan McIntosh of Feminist Frequency took issue with both the graphic violence and the vocal pleasure of the audience.
"Our mantra of '60FPS 60FPS 60FPS!' would all be for nothing if we had horrible input lag," says Infinity Ward's Drew McCoy. "It is extremely helpful being able to see the physical, measurable, result of what is going on in our game - especially if things change or if someone in the office complains that things 'don’t feel right'. If anyone cares about the end user experience of their game, they should be heavily invested in their input latency."
Criterion senior engineer Alex Fry concurred in our expansive Burnout tech interview. "We try to get the latency down to the lowest possible, because it's just a better experience. It's one of the reasons Burnout runs at 60FPS."
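In practice, measurements like the ones McCoy and Fry describe are typically taken by filming the pad and the screen together with a camera and counting the frames between the button press (often wired to an LED) and the first visible on-screen response. A minimal sketch of that frame-counting step, assuming OpenCV; the file name, the regions of interest and the 60FPS capture rate are all placeholders of ours, not anything from the interviews:

```python
import cv2
import numpy as np

def first_bright_frame(video_path, roi, threshold=60.0):
    """Return the index of the first frame whose region of interest
    exceeds `threshold` mean brightness. `roi` is (x, y, w, h) in pixels."""
    x, y, w, h = roi
    cap = cv2.VideoCapture(video_path)
    idx = 0
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                return None  # event never appeared in the footage
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            if np.mean(gray[y:y + h, x:x + w]) > threshold:
                return idx
            idx += 1
    finally:
        cap.release()

# Placeholder ROIs: the LED wired to the fire button, and the muzzle-flash area.
led_roi, screen_roi = (10, 10, 20, 20), (300, 200, 80, 80)
press = first_bright_frame("capture.mp4", led_roi)
response = first_bright_frame("capture.mp4", screen_roi)
if press is not None and response is not None:
    camera_fps = 60.0  # assumed capture rate of the camera
    print(f"Input lag: {(response - press) * 1000.0 / camera_fps:.0f}ms")
```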
In basic terms, controller latency is very easy to define. It's the time, usually measured in frames or milliseconds, between pressing the button on your controller and the appropriate action kicking in on-screen during gameplay. The longer the delay, the less responsive the controls, and the more unsatisfying the game can feel.
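To put numbers on that definition, converting frames of lag into milliseconds is a one-liner - a quick sketch, with the function name and sample values ours for illustration:

```python
def lag_ms(frames_of_lag: float, fps: float) -> float:
    """Convert input lag measured in rendered frames into milliseconds."""
    return frames_of_lag * 1000.0 / fps

print(lag_ms(4, 30))  # ~133ms: four frames of lag at 30FPS
print(lag_ms(4, 60))  # ~67ms: the same four-frame pipeline at 60FPS halves the wait
```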
In-game latency - the level of response in our controls - is one of the most crucial elements in game-making, not just in the here and now but for the future too. It's fair to say that players today have become conditioned to what truly hardcore PC gamers would consider almost unacceptably high levels of latency, to the point where cloud gaming services such as OnLive and Gaikai rely heavily upon that tolerance.
The average video game runs at 30FPS and appears to have an average lag in the region of 133ms. On top of that comes additional delay from the display itself, bringing the overall latency to around 166ms. Assuming that even the most extreme PC gaming set-up has a latency of less than one third of that, this is good news for cloud gaming: there's a good 80ms or so of window for inputs to reach the server and game video to be streamed back to the client.
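Making that arithmetic explicit - a sketch using the figures above. Note that the raw subtraction gives a little more than the quoted 80ms; we'd assume the quoted figure leaves headroom for video encoding and decoding:

```python
FRAME_MS = 1000.0 / 30.0                 # one frame at 30FPS = ~33.3ms

game_lag = 4 * FRAME_MS                  # ~133ms: average in-game lag at 30FPS
display_lag = 1 * FRAME_MS               # ~33ms: typical extra delay from the display
console_total = game_lag + display_lag   # ~166ms end-to-end

pc_baseline = console_total / 3.0        # "less than one third of that": ~55ms
window = console_total - pc_baseline     # ~111ms raw; quoted as "a good 80ms or so"
print(f"Streaming window: {window:.0f}ms")
```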
But in the meantime, while the overall "ping" between console and gamer remains rather high, the bottom line seems to be that players are now used to it - to the point where developers like Infinity Ward, focused on achieving the very lowest possible latencies, can use that as an edge over the competition.
The more frames, the less "weight" it has. 60Hz doesn't present any real gameplay improvement; it's just a pissing-match item. Locked frame rates are far more important. And given, as you point out, all the moaning about "limited console hardware", it seems to me that good graphics and quality gameplay are far more important - especially since a very small percentage of people give a toss about frame rate, but they do care that it looks super pretty.
A properly made game doesn't have latency issues. Don't make excuses for bad design.
Are you trying to say that Nintendo didn't save gaming following the game crash?
The Crash killed the American home console market for two years: video game sales dropped from $3B in 1982 ($7.13B in 2012 dollars) to as low as $100M ($213M in 2012 dollars) in 1985, which caused a majority of game companies to go out of business.
When the market returned to prominence in 1985, it largely rode on the success of the Nintendo Entertainment System; at that point the console's native Japan overtook America as the leader of the video game industry. The Crash thus never came close to killing video games as a medium (think of it as a condensed version of the type of American business hubris that led to the financial meltdowns of 1929 and 2008).
The home gaming market suffered a huge blow from the temporary death of the dedicated console, but the growing PC base (especially the Commodore 64) provided a viable replacement for game production by the small number of companies left alive. While the American arcade scene began its slow descent into obscurity, arcade games still stood near the height of their popularity. Minor arcade classics like Paperboy, Punch-Out!!, Space Ace, Karate Champ, and Gauntlet saw release during this period — and many of them would end up ported to home consoles (with varying degrees of success) after that market's revival.
Outside North America, though, the Crash made little impact. In Europe, eight-bit home microcomputers (predominantly the Sinclair ZX Spectrum and the C64) already dominated the gaming market. An outrageous number of one-person coders wrote and released games for the far cheaper tape-distribution system, which helped those machines flourish and become the backbone of Europe's gaming industry for the next decade. These "bedroom coders" received status labels ranging from "cult hero" (Jeff Minter, Matthew Smith et al) to "legend" (Bell and Braben, the Oliver Twins) from their fans - but that didn't prevent a number of talented developers from making enough stupid decisions to snatch defeat from the jaws of victory, Imagine Software being the most notorious example. Even with the missteps, the European gaming industry remained solid. (Nevertheless, a similar crash hit the UK home computer hardware market in 1984, wiping out less popular machines like the Dragon 32 and Jupiter Ace entirely and leading to Sinclair and Acorn being taken over by Amstrad and Olivetti respectively.)
The Crash had little discernible effect on the Japanese market, either. Though home to a massive arcade base that grew out of Pachinko parlors and Mahjong dens, Japan didn't adopt home gaming consoles at first; people deemed American imports curiosities at best. The massive discounts on computer technology that followed the Crash provided the perfect storm for domestic development - and the release of both the Famicom console and the MSX computer in 1983 didn't hurt, either. Both systems dominated the Japanese gaming industry for the rest of the decade, though the latter would eventually fall to increased PC competition. (Near the start of 1983, Atari had entered the early stages of negotiating rights for the Famicom's US release. The Crash eventually ended those plans, but oh, What Could Have Been...)
Nintendo exported the Famicom two years later as the "Nintendo Entertainment System" and achieved near-monopoly status because of the American console market's weakened state. Nintendo's "Seal of Quality" system and a cartridge design that no one could manufacture without Nintendo's approval provided a degree of protection against the low-quality shovelware that plagued the Atari. To assuage concerns of American shopkeepers burned by the Crash, Nintendo designed the NES with a front-loading cartridge slot to make it look more like a VCR and bundled its largest NES set with the Robot Operating Buddy (R.O.B.) and Zapper light gun peripherals (which looked much more like conventional toys). The former only worked with two games, and the latter didn't fare much better in the long run.
Nintendo (the only decent dev around nowadays)
I now exclusively game on console and it's just plug in and play...
Rocksteady's superb work doesn't deserve to be overshadowed by the poor PC version.
I am a wind-up merchant
Guild Wars 2