ikkei Comments

  • Xbox sales decreased 20 per cent year-over-year in Q1

  • ikkei 24/04/2015

    @Bauul I think you're on the right track.

    With the virtualization trend of the next decade (ubiquity, multi-OS/app, etc.; already built into the Xone, for that matter), I think it makes no sense to keep making dedicated console hardware.

    All you need is a big black box somewhere in the house to do the computing (kinda like any recent nVidia GPU can stream to the Shield), and then stream those OSes and apps to any device in the house (thin clients, though local horsepower can help with computation as well in a grid model).

    These concepts are not new, but now they're becoming mainstream in IT, and will therefore become mainstream at the consumer level within the next decade. Beyond that is the cloud (think not "hard drive online" but rather "GPUs and CPUs online, streaming their processing to your device"; think apps, not just data; that is the real cloud), and most people won't even need to invest in low-ROI, high-TCO, vastly underused home computers (we need much more bandwidth to make games seamless from the cloud, though; we're not quite there yet, except in modern cities, usually Asian these days, or with Californian infrastructure goodness).

    I'm thinking the next cycle of "consoles" will be PCs, and soon thereafter datacenters and clouds. Simply because it's cost-effective and will cut everyone's computing bill by half if not more, for the same or better end-user quality.
    Reply -2
  • BioWare: Mass Effect 4 may not relate to Shepard's story "at all, whatsoever"

  • ikkei 15/10/2013

    Walters' writing feels like that of a man who dreams of writing for the Deus Ex series but ended up signing a contract with the ME franchise. It would be like writing in a Lord of the Rings fashion for the Game of Thrones universe: totally out of touch. It's like what J.J. Abrams did to Star Trek, essentially stripping it of anything Roddenberry and making it feel more like Firefly, or heroic fantasy. Seriously, how can these writers be so unable to continue an existing IP these days? And how hard can it be for a studio to recruit writers who actually know the context they're about to write in and expand? Reply +3
  • Warcraft movie gets an official release date

  • ikkei 02/10/2013

    @CaptainTrips The problem was that TSW was what Square does too often: a two-hour-long CG demonstration lacking any real depth. The scenario was non-existent. Had it been filmed live, it would have been so empty I'm not sure anyone would have shown it in theaters. Great for machinima, though. Likewise, Advent Children certainly caters to FF fans but isn't much better by any "real movie" standard. It's not Woody Allen directing, you know. Not even Brian De Palma. It's just awesomely pretty.

    Now, I didn't like Avatar (too obvious and cheesy for me), but at least it had a story, and I'm pretty sure the kind of money it generated is a strong indication that CG movies can fare pretty well.

    WoW filmed seems pretty "meh" to me, though I concede it may be a great movie in and of itself. Just not what core gamers might really long for, considering no filmed character can ever exactly resemble what we had in the games.
    Reply 0
  • Is the most disturbing scene in GTA 5 justified?

  • ikkei 20/09/2013

    I totally agree with Tom here. On a broader note, I'm reminded of a torture scene in Star Trek: The Next Generation.

    ****** WARNING!!! SPOILER!!!!! ****** (obfuscated as much as I could) (you may read below however)

    I'm talking about that moment when one of our beloved Captains is tortured by a well-known Reptilian individual.

    - How many lights are there?
    - There... are... four... lights!

    That moment was, in my opinion, justified. It helps give even more strength to a character we've all come to really appreciate over the course of 5 seasons (maybe 4 or 6, I don't remember exactly). No one is surprised by the intent and process of the torturer in this scene, but it's nevertheless the first time we see it in so graphic and dramatic a way, and inflicted on such an iconic character -- up until then, torture had only been talked about, never shown explicitly. And it gives much meaning to this specific context: it engraves the horror of what the torturer's species has done in the past to a much more ape-like humanoid species (colonization, slavery and so on), one whom we identify with very much as humans.


    The point of that scene was not to indulge in gratuitous violence. The real Star Trek just never does that, so one never even wonders, unlike with GTA; there's just this sense of respect in these shows, both for the topics depicted and for the viewers, that makes crossing some lines actually meaningful rather than attention-grabbing. And that scene, in my humble opinion, is a perfect illustration of that.

    So it is my contention that torture depicted in passive forms of entertainment can be put to good use, namely to build characters and to criticize the process itself. In a video game, I have yet to see it done the right way, and GTA's attempt this time is anything but commendable. I'm not so sure I'll recommend that game to anyone now, and certainly not to sensitive people. As for people under 18, it's a definite "no way" as far as I'm concerned (maybe 16-17 if I know that person has the solid mind and maturity to handle such things).
    Reply +1
  • Blizzard's Titan unlikely to be a subscription MMO

  • ikkei 02/08/2013

    From the get-go, I thought this "Titan" could be a multiplatform title, aimed at all these casual players on mobile, and tapping into the console market as well. Not necessarily an MMO"RPG" per se, more like a blend with some action, some strategy or so, and with a strong character-building system so that players would spend hours loving their characters. Kinda echoes the past declaration that this Titan "would appeal to fans of the Sims" somehow. So far, Morhaime's words fit this bill, I believe. Time will tell. Reply 0
  • Xbox One's new reputation system detailed

  • ikkei 02/08/2013

    @darkmorgado agreed 100%, that was my initial issue with this reputation system, but you kinda convinced me that algorithms were fine to "do a fairly decent job of the basics as long as they are done right". how is it that it would work (at a very basic level, of course, not "advanced psychometrics") to label players as "good or bad" (a concept not even definable, and damn questionable in itself, ref. Shakespeare) and not for specific traits? once again, the mass is what it is, but the edges of a Gaussian curve are quite identifiable, I believe (if 90% of people call you something, they might not all be wrong; there must be something about you). nope? Reply 0
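    The "edges of the Gaussian curve" idea can be sketched in a few lines. This is purely illustrative (invented player names and report rates, an arbitrary sigma threshold), and not how Microsoft's actual system works: only players whose negative-report rate sits far outside the population average get flagged, while the bulk of the curve is left alone.

```python
from statistics import mean, stdev

def flag_outliers(report_rates, sigmas=2.0):
    """Return player names whose negative-report rate exceeds
    the population mean by more than `sigmas` standard deviations."""
    mu = mean(report_rates.values())
    sd = stdev(report_rates.values())
    return {p for p, r in report_rates.items() if r > mu + sigmas * sd}

# Invented data: seven ordinary players, one extreme outlier.
rates = {"p1": 0.01, "p2": 0.02, "p3": 0.02, "p4": 0.03,
         "p5": 0.01, "p6": 0.04, "p7": 0.02, "p8": 0.90}
print(flag_outliers(rates))  # only the extreme outlier, p8
```

    Note that if everyone's report rate is identical, the standard deviation is zero and nobody gets flagged, which is the point: such a scheme only acts on clear outliers, not on the mass.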
  • ikkei 02/08/2013

    @darkmorgado oh yes, it does. Very much so.

    I've just nailed a good way to explain what I fear with this reputation system. This idea of "good" or "bad", a single scale to rule them all.

    From our millennia-old Judeo-Christian roots, we took a turn two or three centuries ago that bolstered individualism (19th century) and the emergence of less holistic visions of the world, followed by complexity and systems approaches a century later. But as of late, starting with a die-hard two-party political system in most countries (certainly convenient for media debates but often flawed at its core), up to "(Dis)Likes" and "+/-1s", we basically went backwards two centuries over the course of two decades on some level, namely that of Manichaeism, a good-or-bad vision of the world, negating the very complexity of things (not least of which human nature) that we took so long to uncover.

    Our societies have become borderline in the strictest psychological sense.

    And shaped by them, by our peers, by our environment, we become more and more borderline ourselves. It might very well become the curse of this 21st century. Though Shakespeare knew and told us that "nothing is either good or bad, but thinking makes it so", we've never been so adamant about simplifying the reality around us. Way too much. Far from the elegance of simplicity, we took Ockham's razor a bit too literally: not simple but simplistic. We basically negate complexity where it is due. If it's somewhat detrimental to our individual sanity, at the very least making us too impulsive and prone to wrong assessments and thus wrong choices, it has proven catastrophic on global issues, from finance to climate to social interaction: we're lost in analyzing our world and the issues that face us. How could simplistic minds tackle modern complexity? They can't. So they fail. We fail, collectively.

    For an optimist, I've never been so pessimistic about our future. It's OK to fail when we try, because we learn, and other times we'll succeed; but I fail to see us really, intelligently trying anymore on a collective scale, because we don't really learn. We just "like" or "dislike" on an unprecedented level, praise or bash, and then move on without a second thought, as if everything, people included, were ethereal as you say, ephemeral. The trouble is that it's not, but we don't really wake up to that.

    Back on topic, it would be so much better to use a "flavor" system, with a very few simple yet meaningful scales: "skilled", "chatty", "arrogant", "helpful", "bullying"... That would be a better way, I think, to match people based on who they are and what they're looking for. I don't know if it could be practically implemented, but I'm sure MS's cloud can handle it, and you could just speak to Kinect to describe how you feel about others using a few keywords (very few to make it efficient, or more but clustered).
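    Just to make the "flavor" idea concrete, here's a toy sketch. All tag names, data and scoring here are invented for illustration and reflect no real Xbox Live mechanism: peers tag each other with a handful of descriptive keywords, and matchmaking scores a profile by how much the tags a player wants outweigh the ones they want to avoid.

```python
from collections import Counter

# A small fixed vocabulary, as suggested in the comment above.
TAGS = {"skilled", "chatty", "arrogant", "helpful", "bullying"}

def add_feedback(profile: Counter, tags):
    """Record peer feedback as tag counts, ignoring unknown tags."""
    profile.update(t for t in tags if t in TAGS)

def match_score(profile: Counter, wanted, unwanted):
    """Higher when wanted tags dominate the profile, lower when
    unwanted ones do; normalized by total feedback received."""
    total = sum(profile.values()) or 1
    return (sum(profile[t] for t in wanted)
            - sum(profile[t] for t in unwanted)) / total

p = Counter()
add_feedback(p, ["helpful", "chatty", "helpful"])
add_feedback(p, ["bullying"])
print(match_score(p, wanted={"helpful"}, unwanted={"bullying"}))  # 0.25
```

    The design choice matches the comment's argument: instead of collapsing everyone onto one good/bad axis, each player keeps a multi-dimensional profile, and "good match" becomes relative to what each person is looking for.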
    Reply +1
  • ikkei 02/08/2013

    @darkmorgado oh, ok, then you get it ;)

    Well, I do hear all these constraints and, arguably, you're spot on. But people aren't data. So while I totally agree that it may work generally to identify the edges of a Gaussian curve (bullies & "good" players), my concern is less about the efficiency of said system(s) than its consequences in the long run. Hence my mention of kids. Humans tend to adapt to their environment, and think for a second how people might adapt to such systems. Certainly Facebook, for instance, didn't make us more human, rather more detached. I'm a "technology optimist" who might have been inclined to welcome a future with AIs as friends and Vinge's vision of a post-Singularity world (so I'm certainly biased in a good way on such topics); but realistically, there's a worrying trend in how we evolve, socially, in the most connected countries, because of technology. The fact is it didn't help people with intimacy issues, nor did it lessen socially-induced depression, let alone bolster global cohesion. Wikipedia is a marvel, but all it can do is help knowledge, our minds, not our hearts. Matchmaking is great and certainly necessary, but it's just a tool.

    There's a great book by Sherry Turkle, "Alone Together", which finally convinced me to rethink my whole approach to technological connectivity. All of these things are in their infancy (one human generation or so), and I'd wager we'll need a lot of trial and error before we get them somewhat right. In my opinion, the constraints and objectives you were talking about should not blur our long-term approach, nor deter us from cautious and positive criticism (as in being constructive but "not letting our guard down", trying to be more aware than ever that the tools we make have deep consequences for the generations that use them). If you know about intelligent design, the idea that we now shape the environment that shapes us, especially kids, I think you'd agree that caution and wisdom are of the essence.
    Reply 0
  • ikkei 02/08/2013

    @darkmorgado I do agree with you, but it seems you didn't get my point. I'll try to put it more clearly. It requires a bit of background in sociology, anthropology, or psychology, though. And it's not a rant; it's more of a social theory.

    Algorithms are good at assessing global trends and categories. You can characterize a demographic group fairly well, for instance. But this is "macro": whereas it tells you how an individual might think or react on average, it tells you very little about a given person. Macro trends just do not apply to the individuals contained in that group; any psychology or sociology graduate will tell you that. We're not gods with statistics, we're observers, and as of yet human behaviors aren't equations. We don't even try to talk about individuals in this field of research; that fringe idealism has been shut down for a good 30 years now.

    Simply put, you can predict how "democrats" or "FPS players" may react on average, but facing an individual, you can't predict that person. That's what a century of anthropology and psychology taught us, and this is why these fields don't rely solely on numbers and statistics: to build a deeper understanding of any social phenomenon, they need to approach matters from a qualitative standpoint as well. Something an algorithm cannot do, or at least not as we speak. I can't go deeper into this since it's fairly complex, but it's common knowledge for graduates in these fields. This is why your shrink can't predict you, though he can know a lot about your group. This is why you may assess how ants behave, but not each of them individually.
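    To put a toy number on the macro-versus-individual gap described above (entirely simulated data, not a real study): the group mean can be estimated very precisely from a large sample, yet predicting any single individual with that mean still leaves a large per-person error.

```python
import random
import statistics

random.seed(0)  # reproducible illustration

# Simulated "trait" scores for a large group (mean 50, spread 15).
population = [random.gauss(50, 15) for _ in range(10_000)]

# Macro level: the group mean is recovered almost exactly.
mu = statistics.mean(population)

# Micro level: predicting each individual with the group mean
# still misses by roughly a standard deviation's worth of error.
per_person_error = statistics.mean(abs(x - mu) for x in population)

print(round(mu, 1))                # close to 50
print(round(per_person_error, 1))  # around 12, nowhere near zero
```

    The macro estimate gets sharper as the sample grows, but the per-person error does not shrink at all; it is set by the spread of the population, which is the comment's point about groups versus individuals.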

    We tend to rely on systems theory these days, blending holistic and individualistic approaches, and the big challenge is of course Big Data. But Big Data isn't Big Brother; these are two quite different beasts that may complement each other, but not substitute for one another.

    Now, this isn't fixed in stone; this is relatively new research, and I certainly wouldn't claim to know the endgame. But my intuition and knowledge scream "wrong approach".

    Matching isn't the same thing, and it works quite well with algorithms, but we're talking about something else entirely here. Something fairly more complex, and it is my contention that this MS take on labeling people is flawed at its core. It's interesting to start with an iterative process, and certainly crowdsourcing it is the way; but there's just no denying that tweaking it is in no way, shape or form a done deal. Behavioral studies on big data just aren't there yet, and you might be surprised that, by science standards, that methodology is much more "1980s" than 2010s. But hey, time will tell. It might work decently enough for a bunch of rabid gamers. I don't see any recent papers hinting that it's a smart way to approach big data, though.
    Reply +1
  • ikkei 02/08/2013

    PhD? PhD in what exactly? Certainly not psychology or sociology! This system has one fundamental flaw: it tries to put an objective mark on something that is inherently subjective and meant to be so, i.e. how much we appreciate others. It falls under the wrong, immature idea that "you're either popular or a jerk or something in between", like there's just one scale to evaluate people and it's all that matters. Well guess what... Human nature doesn't work like that. We're a bit more complex (psychology 101).

    That guy whom 90% see as a jerk? He might be your best friend, because his cynicism just makes you laugh and you know it's just a facade. That miss-everybody-likes-me? You hate her, or you couldn't care less about her superficial cravings for attention and likes. We're not statistics. Numbers are good for characterizing groups and trends, certainly not individuals per se (anthropology 101). MS is doing what decent people pretty much try to avoid in life: labeling others. There's no such thing as an objective score even for a game, so for people? Certainly it's a matter of preferences.

    Just like it took Facebook years to understand that anything "social" has degrees and nuances, that you don't talk to everyone you know in the same way, that human communication is waaaay more subtle than that (something Google+ tries to do better from scratch, with the notion of circles embedded at its core), and that anything "all-around" is just superficial image-building or sheer flattery, MS is just way out of touch on social relationships. The worst thing is that we grown-ups know that, or so we should; but what of kids born and raised in such an environment? It's worrying, to say the least. They might never learn how to properly build intimacy, nor how to behave publicly at all. That's a recipe for psychological disaster, because human biology and nature haven't suddenly changed in 20 years just because the Internet appeared. Our tools (technology) should cater to our natural needs, making life easier, not more awkward; they should not force us to become weird in order to use them. I wager that such freakish tools are doomed to failure, notwithstanding the driving power of technology itself, which might emerge later on in more mature forms, shapes and uses.

    After Zuckerberg, an anti-social personality conceiving the current standard for online social interactions (talk about paradoxes... and obvious shortcomings...), we now have a bunch of anti-consumer suits saying that labeling others is fine and can substitute for proper reporting and human moderation. This is damn pathological.
    Reply +5
  • Sony developing Gran Turismo film

  • ikkei 24/07/2013

    Franchises are one thing when it comes to video games. As TotalBiscuit recently pointed out in a Content Patch (commenting on Ubisoft's public acknowledgement that they don't even consider developing titles that aren't suited to multiple sequels), there are a number of business considerations that actually justify the need for franchises, not least being the fact that people buy them: 19 of the top 20 best-selling titles in 2012 (in the USA or UK, I don't remember) were sequels (#20 was a standalone title).

    But adapting a game for the movie screen... that's something else entirely. Be it from/to books/games/movies, adaptations usually are a mess. Great adaptations are the exception, and a very rare one at that. Most of them are just easy cash grabs. I don't see how this GT movie won't be one; unless they have some serious script underlying the project, in which case there's no link to GT whatsoever, since GT has absolutely no story, meaning the GT license would be PR and nothing else, like blatantly putting a "gamers, come spend $10" label on a movie.

    Reply 0
  • Sony tested PlayStation 4 controller that sensed how much you sweat

  • ikkei 17/07/2013

    Guys, I really need you to enlighten me. Here's my question: regardless of the controller, how can anyone feel "decently in control" when playing aiming shooters on console?

    I'm a PC gamer who formerly played consoles for 10+ years (still do, just way less than I used to), and if there's one genre I could just never handle with an analog stick, it's anything that requires precision aiming. I just can't get the same level of precision (thus reactivity) as I do with a simple mouse. Basically put, it's easy to point your mouse at something (that the camera is linked to the aim makes no difference). With a console controller, however, I find it obnoxiously difficult to "move to a certain spot and then stop the camera from going too far", so I have to go back and forth until the aim is right where it needs to be... and I lose half a second or more precisely aiming each time I shoot. With a mouse, on the contrary, I simply aim and shoot; there's no inertia, if you will, and precision is flawless...

    Not only is it hard as hell for me to be a good shot with a controller, it's not even remotely fun! I feel handicapped, like a rookie who needs several seconds to aim... It's a pain. A good friend of mine who works at Capcom tells me that many good shooters on console ("good" meaning appreciated by gamers, at least in Japan) actually have a semi-auto-aim mechanism in place to compensate for poor precision, and thus give the player a perfect shot even if it was, technically, a bit off when the player fired. That, to me, is very poor in terms of satisfaction; it literally takes the player's skill away as I see it. I generally activate it (in MGS, for instance), but that takes the fun out of shooting things.

    So is it me? Am I doing something wrong? Or is it generally accepted that shooters are most precise when played with a mouse? (Without any scripted help, of course; I mean strict precision aiming, where you fire within exactly the time it takes to move from point A to point B.)

    Also, a subsidiary question, unrelated: on my Xbox-like Windows controllers, pressing up on the D-pad almost always triggers the right or left direction as well (probably a diagonal, I don't know, but it's just a pain in menus). Are my controllers broken (I have two, both poor on the D-pad), or is it me? I've never had such issues with any of Nintendo's or Sony's D-pads... I feel very poorly skilled when I hold these Microsoft controllers, and I keep ranting about how hard it can be to make a simple D-pad, a 30-year-old design... It's too bad, because the MS controllers are indeed quite comfortable to hold, but I just can't play properly with them, from shooters to sports games to menu-based games...

    Thanks for any answer you might provide me. I've been struggling with this issue for a good ten years now.
    Reply 0
  • Cerny Computer Entertainment

  • ikkei 14/07/2013

    @Alcifer Please allow me to elaborate a bit on your comment. You're totally right that better audio hardware can make an audible difference in sound quality. Of course, your speakers (and mics at the recording stage) are the most important part (in the audio chain, from a real live guitar to your ear, it's these membrane-based analog devices that are always somewhat imperfect). The digital part of the audio chain may be "perfect" as in "no human ear can make a difference", and times have changed greatly recently, for the better. The most crucial part is the A/D-D/A (analog to digital then back to analog).

    Back in the 1980s, you'd be lighter by tens of thousands of dollars to get that perfect A/D-D/A conversion (it happened only in studios and in very rich individuals' homes). It got better as processing costs gradually decreased. Now for about $100 you can get a practically perfect D/A converter (typically hooked up via USB to your PC, outputting classic stereo analog RCA plugs). Not five years ago you'd flush $1,000 to your retailer to get that kind of quality (these famous "DACs", Digital-to-Analog Converters, really just a dedicated chip). If you buy a big receiver these days, chances are you've got state-of-the-art DACs inside, because it's their main trade, if I may say; but such HQ chips are harder to find in PCs, let alone consoles. Your X-Fi is a perfect example of upping the ante and giving your speakers the best (unaltered, non-distorted) sound material. Remember that the result can only be as good as your source, though (if your sound file is badly recorded, no way it can get better; the best you can get is what's on the source, not more).

    Which means that a truly great sounding game needs:
    - a good DAC (I'd wager we can nowadays make that for as little as $10 for integrators such as Sony/Apple/etc.)
    - a good source (that would rest upon the devs and their sound studio)
    - good speakers (these will make a world of difference to your ears... probably much more than any other aspect, since they're analog and therefore never perfect; one can get pretty mind-numbingly natural-sounding speakers for a good $12,000 or so, but under that, you have to settle for "some" imperfection).

    Overall, I've never heard any consumer device sound as good as my PC (a Mac plus a dedicated player called "Audirvana") + DAC + speakers (a good pair of columns) for a comparable price, and I don't expect a PS4 or Xbone to do better; but I can certainly see them sounding just about as good. Anyway, if game sounds are compressed as MP3, a trained ear will hear the difference from a lossless encoding, and that's the real difference you'll hear if comparing sources. But who compares? We just fire up one device and roll with it. I may still use my Mac to play music, but I don't expect to be disappointed by my PS4's sound output quality, especially considering it doesn't handle the D/A conversion, since HDMI is digital (you need a good DAC in your receiver, or in whatever processes your sound from HDMI to your speakers). As for digital operations on sound... any computing device nowadays can process sound without a single byte of loss or distortion (of course it requires good, well-coded software... I think we can trust Sony on that, given their experience in audio, from pro to consumer). The PS4 GPU is more than qualified. There will be no shortage of quality for devs on this front. What will matter, once again, is the quality of the source. 128 kbps MP3 just doesn't cut it... even 320 kbps is just OK... lossless FLAC or equivalent at 16-bit/48 kHz would be the best we could hope for, at least for the musical parts of the soundtrack. My receiver demands it! Will it get those lossless soundtracks? We shall see... to be continued. :)
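    For a rough sense of the numbers behind the MP3-versus-lossless comparison above (simple arithmetic; the FLAC figure is a typical rule-of-thumb ratio, not a measured value): uncompressed 16-bit / 48 kHz stereo PCM works out to 1,536 kbps, nearly five times the data of a 320 kbps MP3.

```python
# Bitrate of uncompressed PCM audio: sample rate x bit depth x channels.
def pcm_bitrate_kbps(sample_rate_hz: int, bit_depth: int, channels: int) -> float:
    return sample_rate_hz * bit_depth * channels / 1000

pcm = pcm_bitrate_kbps(48_000, 16, 2)
print(pcm)        # 1536.0 kbps for 16-bit / 48 kHz stereo
print(pcm / 320)  # 4.8x the data of a 320 kbps MP3
# FLAC typically keeps roughly 50-70% of the PCM rate while staying
# lossless, i.e. somewhere around 800-1,100 kbps for this format.
```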
    Reply +3
  • Apples and oranges

  • ikkei 14/07/2013

    @Smill110 I can see that as a "possible reality". For 10+ years, the "revolution of casual gaming" has been Nintendo's domain (PR, dev, at every level), and quite a successful endeavour at that. All these women and older guys who now play games didn't think they would prior to 2000. Nintendo did a lot in that respect. I've been arguing for all these years that, from a consumer standpoint, Nintendo and Apple are very comparable in that they cater to mass audiences with "ease of use, simplicity of concepts, and immediate retribution of purchase". Basically, it just works, simple and easy. There's really not much distance between their approaches, and they both rock the market each and every time, no matter how sceptical "core consumers" remain (from Linux fans to PS boys to Windows/Steam/Android/etc.). These are facts, even though they puzzle us sometimes. So yeah, definitely, a joint venture between Apple and Nintendo could be the high road for both companies. It carries a lot of meaning. Reply 0
  • Sony's E3 victory was a PR stunt, but PlayStation 4 is still the one to watch

  • ikkei 13/06/2013

    This article sums up many of my thoughts. Great piece IMHO. Reply 0
  • Ouya review

  • ikkei 08/06/2013

    "Plex playback was excellent and the app worked beautifully with our in-house Plex server"
    That's better than an AppleTV3, in 1080p. How about sound? Can I send 5.1 to my receiver? Is there a better, cheaper alternative for a Plex media player? Considering neither PS3/4 nor XBox360/1 supports Plex, I'd assume Ouya is still a better choice for Home Cinema enthusiasts.
    Reply +2
  • Microsoft insists Xbox One Kinect doesn't record your conversations as privacy concerns build

  • ikkei 08/06/2013

    @ubergine I think you mistook my point. I do agree with you. :)

    Just saying that the cloud in itself can be great; just look at what app makers can do with it (iOS/Android/PC convergence, etc.; as a user, more precisely as a writer who needs to put down a few words every now and then, I can tell you it empowered me). In gaming, though, the cloud has been nothing but an abuse of players' trust and a shady way to enforce DRM under the hood.
    Reply +2
  • ikkei 07/06/2013

    @ubergine I can think of a hundred ways in which cloud computing could actually make for better games, just haven't seen many in real life. So far, save files excepted, cloud computing in gaming is another case of using great tools to achieve very poor and shortsighted goals, and utterly under-achieving in terms of engineering cleverness, to put it mildly.

    Oh, heck, why remain mild when facing such nonsense... with privacy concerns thrown in the debate, considering Prism and foreign equivalents, I should say the new xb live cloud/kinect is a potential threat against democracy. Borderline suicidal for private entities (companies and individuals) to either sell or buy such a device if you ask me.
    Reply +3
  • ikkei 07/06/2013

    This is a dire issue. As Rebecca MacKinnon pointed out in her book "Consent of the Networked", our governments in "free countries" are not evolving towards better protecting our online privacy and preserving the neutrality of the networks, but rather imitating more and more the practices of authoritarian governments: basically, they tend towards using private corporate means, forcing these companies into "collaborating" with surveillance policies and handing over information about suspect behaviors (see the last chapter of the aforementioned book).

    It's often shadily conducted, too, as Google's annual transparency report shows, as (some) ISPs often complain, and as neutral observers increasingly worry. Companies are regulated and cannot pull 1984-style BS without risking heavy sanctions; but governments tend to operate, well you know, above the law when it suits their goals. We all know there's a good side to this, namely combating real dangers, but we also know these dangers are marginal; the bulk of this "not really legal but government-backed" intrusion is more about IP protection and the like. From ACTA to the NSA to Patriot Act-related surveillance of all private communications, this may be one of the biggest issues for freedom and privacy in this century. It may very well be that, if we don't act and oppose these policies, a few decades from now political freedom and freedom of speech will be severely hindered. What we gained these last two centuries, no less, is at stake, people. We need to wake up and take these issues where they can be solved. That's not the private sector, which just does business.

    So corporate means (cameras and mics on every device, Kinect and so on) are a perfect tool, but just a tool, and one we may like using, if the iPhone's share of pictures taken says anything about it. Like any tool, a camera or a mic is neither good nor bad; what matters is what we make of it. I don't fear corporations themselves on this issue; I fear that governments can use corporate means without any form of transparency, without answering to anyone. I fear that hackers can use and steal these tools when companies fail to secure them properly (the PSN debacle is a very good recent example, but don't be fooled by its magnitude: it's not a one-time thing, it happens every day at much lower scales to a lot of companies in the world. When you add it all up, it's quite a lot.)

    Kinect may very well be listening only for "xBox off/on" words. But what if the NSA asks MS to include other keywords, such as "bomb" or "terrorist" and whatnot?

    Can MS say 'no'? I don't think so. Google doesn't say no; instead, they divulge publicly how many requests governments have made. Apple is silent on the subject, yet Siri is very much "aware" of many things we do and say already. So I don't fear Microsoft, nor Apple or Google (the "worst" they can do is make more money). I fear what others, moved by other motives, could do with the technological means at our disposal. I think it's time we ask these questions publicly and come up with real legislation for online activities, especially as the line between the physical and the online blurs more and more. Because I, for one, would prefer that the definition of some things, for instance a "terrorist" or a "political opponent", be what the people say it is -- not what some obscure men in black suits far from the public sphere decide. A rightful State is one that answers to the people; otherwise it's authoritarian, and it sets us back to the dark ages.

    This is a much bigger issue than gaming. It's about freedom in a new world where the means, thus the rules, have changed. And in seeking collective solutions, we must never forget past lights:
    - "any society that trades a little liberty to gain a little security deserves neither and will end up losing both." (Benjamin Franklin).
    - Another way of saying this is that "an end doesn't justify any means", for the very reason that, if we lose liberty, then what have we gained? What's left to preserve if we lose our freedom for the sake of protection? Protecting what, exactly?

    I'll leave you with a few figures. Terrorism kills about 25,000 people worldwide each year, out of 7,000,000,000 people on this earth. That's so small it barely shows in statistics. In France, for instance (~1% of the world population, 65,000,000 people), alcohol kills 70,000 people each year, tobacco another 60,000, and random accidents (poisonings, car crashes, falls, etc.) a bit more than 10,000. 75,000 women are raped each year in this country alone. That's huge. That shows in statistics, very much so. These are real, genuine threats. Terrorism is horrible, and we must act against it for sure, but it's not what we can call a "major threat". We won't disappear because of terrorism; that is mathematically impossible. However, consider what we have already lost, at the hands of our own governments, in terms of freedom, in the name of protecting us...
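
    To put those proportions side by side, here's a quick back-of-the-envelope check (using only the rounded figures quoted above, taken at face value):

```python
# Back-of-the-envelope comparison of annual death rates,
# using the rounded figures quoted in the comment above.

WORLD_POP = 7_000_000_000
FRANCE_POP = 65_000_000

TERRORISM_WORLDWIDE = 25_000  # deaths/year, worldwide
ALCOHOL_FRANCE = 70_000       # deaths/year, France
TOBACCO_FRANCE = 60_000       # deaths/year, France

def rate_per_100k(deaths, population):
    """Annual deaths per 100,000 people."""
    return deaths / population * 100_000

print(f"terrorism (world):  {rate_per_100k(TERRORISM_WORLDWIDE, WORLD_POP):.3f} per 100k")
print(f"alcohol   (France): {rate_per_100k(ALCOHOL_FRANCE, FRANCE_POP):.1f} per 100k")
print(f"tobacco   (France): {rate_per_100k(TOBACCO_FRANCE, FRANCE_POP):.1f} per 100k")
```

    Run it and the gap is obvious: roughly 0.36 deaths per 100,000 people worldwide for terrorism, versus over 100 per 100,000 for alcohol in France alone.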

    MS, Google or Apple are just tools. You don't fight a tool. You fight its misuse, you fight misconceptions, and ultimately the individuals responsible for misinformation and shady practices.
    Reply +14
  • Deus Ex: The Fall is an iPhone and iPad game out soon

  • ikkei 05/06/2013

    I wish we could plug a Bluetooth gamepad into these tablets; that would make games both playable and actually enjoyable... Reply +1
  • Final Fantasy 7 retrospective

  • ikkei 02/06/2013

    I often imagined, or rather re-imagined, how popular masterpieces would look in another format. By "format" I mean movies, series, games, theater, albums, novels. You know, what if... Wish You Were Here were a movie before it was an album? Or Dexter a stage play, or a musical? What if 24 were an album? A book? Some would look awkward and unenticing, others could be great. I'm sure we all have these evasive, blurry thoughts once in a while.

    Since there's always the tricky technical matter of adaptation, my observation is less about technical aspects (game systems, movie cuts, album recording/mastering and whatnot) than it is about storytelling, ambiance, character development and world exploration. Every fiction says something about human nature, and that's what I, for one, usually "retain" from stories, no matter how they were initially presented to me. What I mean, simply put, is that Akira, Battlestar Galactica, Final Fantasy VI, OK Computer, Mellon Collie and the Infinite Sadness, or Dune all had an impact on my mind that, eventually, contributed to how I view the world from an emotional standpoint. If science is food for thought, then I dare say fiction is food for emotions, and that the stories we know shape us just as reality and "truths" do. Probably not as much, but definitely along the same principle of assessing the world around us based on what we carry inside our minds, simply put, knowledge and experience.

    Based on this, I find FFVII to be a "very good story", arguably even more so considering how few games reach such heights; but overall I personally don't think of this particular tale as "mind-blowing" or "earth-shattering" compared to the wider offering, that is, thousands of years of stories written and told by humans in countless ways and languages. FFVII's story and characters convey a truly epic setting and grandiose events that nevertheless remain absolutely "unreal" or "surreal" to me, as in: I can pretend to be, to play, the hero in FFVII, but I can't find ways, beyond obvious fantasy "qualities", to grow a better understanding of these things in my real world from this game. It's great when I'm in it, but I don't, I can't, take much from it into my psychological reflection.

    Basically, if it were a book, thus stripping all gaming elements from it, FFVII would be a great epic tale for those who like fantasy tales (LOTR and Star Wars fans may apply, Terminator and Batman fans probably as well). It's a category of fiction that stems from classical myths, including religious ones, and tends to depict idealized, typically Manichaean characters; a little bit too much for me to forget that there's a "fourth wall", that it's just a fiction, and one that may never happen in our real world, not even the detailed states through which characters pass: too simplified, too black-or-white, too immature if I were to mildly exaggerate.

    FFVI, however, is a much different beast. Beyond the fact that both are J-RPGs in which the world is at stake and technology has a role to play, they have very little in common. Where you'd likely compare VII to Star Wars or Lord of the Rings, because it's just epic and huge, FFVI comes from a deeper look at human nature, a more subtle or intellectual take on things; you might think Kafka (obviously...), Kundera, Nietzsche, perhaps Anne Rice for her romantic take on vampire tales, or even Tolstoy. Certainly, as far as comparisons go, VI is closer to Star Trek than it is to Star Wars, whoever gets what that means; as for the others, let's just say that ST and FFVI are less a show-off of technology than a reflection about it, and that both are less about glorifying greater beings than about depicting the sheer burden of greatness and asking, first and foremost, what "betterness" is. And as you might have guessed, there may not be a definitive answer to such a subjective question.

    This might seem vague, especially if none of these references speak to you, but I certainly don't want to spoil FFVI for anyone. And I could go on about it for days, and this is probably not the place to do so. So I'll leave it at that, hoping that people will try it; if it ever comes to iOS/Android: DO. GET. IT. Don't mind that it's a 2D RPG: consider what books, with mere words, can do to you.

    Also worth mentioning: whereas the FFVII soundtrack is excellent, and Sephiroth's theme "one of the best I ever composed" according to Uematsu, many of his fans and FF fans alike think that, as a whole, it's not necessarily his best. And that's because, as a whole, the FFVI soundtrack may be even greater; it is a masterpiece in itself that I would deem worthy of comparison with the greatest classical symphonies. The FFVII OST is one of the greatest OSTs ever; the FFVI OST is simply one of the best compositions ever... It goes beyond traditional movie/game OSTs; it's a work of musical art that may, a century from now, make our children write "Uematsu" next to Chopin or Mozart.

    I could actually generalize: FFVII is a likely contender for "best game ever", VI certainly not (excellent, but not the best); however story-wise, the situation seems quite reversed to me.

    I'll just leave you with this remark: if FFVII deserves an HD remastering (I agree, a remake is too hazardous and not required), FFVI should perhaps just become a book, ideally a movie, or perhaps a short series of 6-12 hours. That's how, in my humble opinion, it could best be known and remembered by our century for the amazing, once-in-a-lifetime masterpiece of fiction that it is to me.
    Reply +2
  • FIFA 14 on PS4 and Xbox One uses the fancy new Ignite Engine - but the PC version doesn't

  • ikkei 25/05/2013

    @kosmosagamemnon It might be a bit late, but here goes.

    You're right that the term 'alien' was an exaggeration, also you're right that sharing DX APIs between Windows and X1 platforms is a strong factor of code convergence and cost reduction. I totally agree. I just don't think it's equivalent to "insta-port" nor anything hassle-free of the sort.
    In short, we agree that ports will be easier this gen, more so between PC and X1; but easier doesn't mean costless, and our main point of contention might be different assertions on "how much" that cost weighs on EA sports division budget and workforce, on a yearly basis.

    Also, we all assume that the "normal" stance for EA would have been to port the engine to each and every platform from year 1 (implying that they scrapped the PC port); but perhaps even one platform was a lot, two was a serious endeavor, and three was just impossible or too risky (do we want two 'great' engines or three 'average' ones? Quality must be considered at some point, especially given EA's recent failures). I would love to hear from a dev about these issues, but I guess we'll have to wait until NDAs are lifted.

    And thank you indeed for the discussion. :)
    Reply 0
  • ikkei 24/05/2013

    I would also add that I could bash EA for more valid reasons than I can count right now, but this doesn't appear to be one of them. It is my humble opinion that people disappointed by this news are not totally aware of the real cost of these things, and are essentially asking for more than is possible.

    Of course I could be wrong, and EA might just be spreading costs over several years to avoid too big a loss in the 2012 or 2013 fiscal year, which may not be game-related and is a bit lacking in consideration for us, but nevertheless is something that most big companies out there do. I don't think we should bash EA for more reasons than they deserve; there are already plenty. If we have a problem with how companies are managed, then maybe the real criticism should come through one's wallet (by not buying), or through regulations (by voting).

    But really, in this case, I'm afraid there's no agenda other than what a team of core 3D programmers can do within a certain timeframe, even with EA's means. There are human beings at work behind this, teams of people like you and me, and we know they're not particularly well-paid nor lazy in the videogame industry (rather the opposite: long hours, and underpaid compared to other sectors for jobs requiring their skillset). So if they can suffer a little less pressure by developing two ports of their engine instead of three this year, then so be it, more power to them. I'm a PC gamer, and that gives me many other games to play.
    Reply +1
  • ikkei 24/05/2013

    @kosmosagamemnon Dear Sir, I respectfully disagree. We should not confuse the physical and logical configuration of computers, nor should we overlook the fact that a 3D engine sits at the very intersection of the two, where "hard code" meets "soft and apps"; more precisely, it is a complex and optimized layer between the physical layer and the logical application.

    1. The fact that they're both x86 has to do with processing instructions, but it doesn't say anything about how they're physically programmed, i.e. the real architecture. This "hard-coding" (taking the real architecture into account) is mandatory to achieve great performance on consoles; it's part of the reason why first-party developers used to obtain fantastic results on PS3 (by fine-tuning and optimizing the code through millions of tests to tailor it exactly to the Cell). This time, the PS4 is much more straightforward (8GB of GDDR5 and that's it) while the X1 goes the complicated way (for gaming and performance applications) with its ESRAM step in the processing chain, something that will undoubtedly be tricky to master.

    The current consensus is thus that the PS4 will be easier to code games for, while the Xbox One is more suited to general purposes.

    But in both cases, it's very different from a PC architecture in which you have a northbridge, a separate CPU and GPU, distinct RAM and so on and so forth. The fact is, both in the PS4 and X1, CPU-GPU-RAM are fused on a single die, something totally alien to PC architectures, much closer to what you find inside tablets actually.
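
    As an aside, the practical difference for engine programmers can be caricatured with a toy cost model. This is purely illustrative; the bandwidth figure is an assumption of mine, not a real spec:

```python
# Toy cost model: discrete PC (system RAM -> bus -> GPU VRAM) versus
# a unified-memory console (CPU and GPU sharing one pool on the same die).
# The bus bandwidth below is an illustrative assumption, not a real spec.

BUS_GBPS = 16  # assumed CPU-to-GPU transfer bandwidth, GB/s

def discrete_upload_ms(megabytes):
    """Time to copy a buffer from system RAM to VRAM over the bus."""
    return megabytes / 1024 / BUS_GBPS * 1000

def unified_upload_ms(megabytes):
    """On a shared pool, the GPU reads the same memory: no copy at all."""
    return 0.0

FRAME_ASSETS_MB = 256  # hypothetical per-frame upload volume
print(f"discrete: {discrete_upload_ms(FRAME_ASSETS_MB):.2f} ms of uploads")
print(f"unified:  {unified_upload_ms(FRAME_ASSETS_MB):.2f} ms (zero-copy)")
```

    The point isn't the numbers; it's that on a unified design a whole class of copies simply disappears, and an engine's data flow has to be rethought to exploit that.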

    2. You also have to factor in that we're not talking about a simple "port" of an abstract application; we're talking about an engine, something "on which" 3D imagery relies as a framework, a specific "set of methods" if you will. It's as close as game programmers get to the machine, its intricacies, its specific architecture (an engine needs to be heavily optimized). This is so parallel to game development itself that most studios don't do it; they just pay fees to use an existing commercial engine (Unreal, etc.). All of this, in layman's terms, sheds light on the daunting task that designing an engine for the next 5 years or so represents, in terms of work and resources (R&D cost). It's huge, guys, really.

    So I stand by my argument that such an endeavour, which only the major studios undertake once or twice every decade, is by no means "just another development" or "simple to port"; I also stand by the fact that, despite their x86 core architecture, which makes them closer to PCs, especially in terms of tools and instructions (logically), programming for these consoles will require time to master their physical configuration. That's how you make a $300 piece of hardware perform about as well as general-purpose PCs twice as expensive.
    Reply +1
  • ikkei 23/05/2013

    I don't like it because I'm a PC gamer, but I think I get EA's strategy.

    They have to choose where to invest, and this year they chose to focus on consoles, for the obvious reason that they're beginning a new cycle.

    1. All other things being equal, it requires more money to develop a new engine for a new platform, so focusing on new consoles may have shrunk the sum of what they could do. Evidently, they scrapped porting their new engine to Windows platforms. It will probably come with an early 15th installment of these franchises (next summer?)

    2. Consoles take time to master, development-wise, so you want to start early. Consoles are more profitable for developers when the installed base is wide, and EA is the kind of publisher that makes a console sell well: they know they have this responsibility to the manufacturers (medium- and long-term profit) and to the whole market, to "contribute" (still profitably, while maneuvering strategically) to a fast and wide adoption of this 8th gen.

    3. On the other hand, the PC gaming market is stable (a slow but steadily rising trend, I believe) and heavily challenged by other, more profitable form factors (smart-mobile-touch things), into which EA dives deeply as we all know. Their PC share will not change much no matter what they do, especially for the sports series, which sell much less on PC than on console (it must have something to do with the target audience for these games not matching core PC gamers, plus the fact that casual PC gamers don't often bother connecting a gamepad; it's still a bit geeky to play FIFA or NFL on a PC in many a household...)

    All these reasons, driven by the fact that a business has to make choices to keep some focus, make a solid case, IMHO, for EA choosing to roll out their new engines exclusively on consoles this year, and on PC and other platforms next year. And actually, saying this very word, "exclusive", screams of another argument: money shelled out by console manufacturers to make sure EA is well aware of all these arguments, enough to go all-in on a pair of consoles.

    I don't like it personally as a PC gamer, but if I were EA, and assuming my assumptions in this post were correct, I would have done the exact same as they did.
    Reply -1
  • Spec Analysis: Xbox One

  • ikkei 23/05/2013

    One... one... Could it be that MS is considering moving to an iDevice business logic, with a refreshed version every year or so, maybe every other year for consoles (Xbox Two, or New Xbox, and so on), a backwards-compatible family akin to iPads and the like? If Glass can stream games from Xbox and PC (as nVidia Shield does from a Windows PC), and Windows 8 proves to be truly seamlessly cross-platform, MS may have a solid basis for a user-centric, ubiquitous ecosystem.

    On an ironically "side-related" note, this perspective couldn't be further from core gaming, insofar as a console making strategy goes.
    Reply 0
  • SimCity: Maxis says core problem "behind us", crashes down 92%

  • ikkei 11/03/2013

    I haven't actually played, let alone bought, a Sim City game since Sim City 3000. (I watched a friend play a bit of 4, and I have Deluxe on iOS but never actually played it because it felt much too simplified.) Hell, I wasn't even really thinking about Sim City before this reboot entered its PR cycle.

    Everything I read or saw on videos had two effects on me:
    - It sparked my interest anew in spending some time making a city. Or an empire. Well, something without combat, anyway.
    - Considering the options available (as of March 2013), I'm hesitating between spending €5 to get Tropico 3 or 4, or spending €10 to get Sim City 4, both from Steam of course.

    In fact, at this price I could buy both and just try them side by side. Sim City 2013 is a distant fourth on the list, and that doesn't even include other strategy games that I really want to replay or try.
    Reply +3
  • Valve-backed Xi3 Piston console starts at $1000

  • ikkei 11/03/2013

    I think it boils down to this.

    Basis: currently (under current economic circumstances, interest rates, blablabla), in rich countries, here are the usual price points we expect when we buy computing hardware:
    1. €0-150: not much, older generation consoles and lower-end/older PC. Some kid/school initiatives.
    2. €150-300: good living-room machine (typically media-oriented, such as current-gen console, or "cheap" HTPC likely not at ease with HD), low-end desktops/netbooks.
    3. €300-600: decent computing machine (regardless of form factor), including PC (good desktop, average laptop), HTPC (low-end), smartphones, tablets, high-end living-room consoles and TV boxes.
    4. €600-1000: very good gaming rig (desktop), good HTPC, decent gamer's laptop.
    5. €1000-2000: high-end gaming rig (desktop or laptop), high-end HTPC.
    6. €2000+: dedicated machine (workstation, servers, manufactured HTPC and A/V components, etc.)

    I've put numbers for easy refs to each machine type/price.
    These price-point expectations should, of course, rise with looks and additional features.
    HTPC includes Windows/Linux/OSX (it's just a PC tailored to home-cinema / hi-fi rendering, which is quite compatible with high-end gaming, with the notable exception of 3D calculations).
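
    For what it's worth, the tiers above boil down to a simple lookup. A small sketch (band boundaries in euros as listed; the labels are just my shorthand for the types above):

```python
# Price bands from the list above, in euros (upper bound exclusive).
TIERS = [
    (150,  "1: older-gen consoles, low-end/older PCs"),
    (300,  "2: living-room machine, low-end desktop/netbook"),
    (600,  "3: decent all-round machine, current consoles, TV boxes"),
    (1000, "4: very good gaming rig, good HTPC"),
    (2000, "5: high-end gaming rig or HTPC"),
]

def tier(price_eur):
    """Return the tier label for a given price in euros."""
    for upper, label in TIERS:
        if price_eur < upper:
            return label
    return "6: dedicated machine (workstation, server, A/V component)"

print(tier(1000))  # a $1000-ish Piston lands in tier 5 pricing
```

    Which is the crux of my complaint below: the Piston is priced like a type 5 machine while competing for type 3 and 4 customers.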

    What I can say, from this observation of the market, is that I don't see how customers looking for a type 4 machine to put under their telly would go for a $1000 Piston, when they can get a fully silent (passive, except the GPU, and a cold one when idling at that) and more powerful HTPC-like machine for as much or less money. When not playing 3D-intensive games, I'd expect my living-room machine to be perfectly silent, and I don't see how a small form factor can achieve that. It's why living-room devices in hi-fi or home cinema are usually over-sized and over-powered: it has to do with heat management (fanless designs) and stability of performance; a miniaturized device is physically nonsensical for achieving decent, silent living-room performance. And such a machine is static: "living room" and "TV-plugged" imply that it does not need to move, it's not a smartphone... so why go mini, except for the looks?

    The fact is, the Piston aims at a sub-group of customers (a particular crowd within high-end gamers) that either already has the equipment, or the skills to build it, or both; and if they don't personally, then they have friends who do; it's not like the gaming scene isn't "social" these days. This is a tech-savvy demographic, quite capable of turning any Windows computer into essentially anything, including a decent HTPC/gaming machine. So why, why would they go for the Piston when they can do as much, for less money, with other more flexible, upgradeable solutions?

    Type 3 machines (in this arbitrary price list) would be bought by much more "mass-market" customers, these are the real streamlined machines: imo, this is the only market segment where TV boxes can effectively compete with consoles, and ultimately win by increasing the usability/reachability of users' media content in the living room (to achieve less fragmentation on the OS/software in particular).

    In the near future, as a gamer who most probably already has a desktop PC gaming rig, you'll probably even be streaming from that PC to a hardware-video-decoding-capable device plugged into your TV, as nVidia demonstrated with Shield, and I'm sure AMD will follow. Hardware streaming all the way from your PC's monster GPU to your TV, from your desk to your living-room sofa (or any portable device on your Wi-Fi...), is essentially the best of both worlds. Even internet-based cloud gaming should be a decent alternative to owning a device in a few years! That calls for type 2 priced machines, simple terminals, which can probably run under Linux or even Android (in any case: no license fees for the OS).

    I even see dedicated gamers, and resource-hungry users in general, buying not desktop rigs in this future, but essentially "black computing boxes" full of chips that they put in a corner of the house, that do all the computing and then stream it over Wi-Fi to any device, anywhere in the home. Such an integrated approach of "one power unit, lots of cheap terminals" would probably lower the overall cost of our individual computing power (purchase, use and maintenance). It could run under Linux and whatnot. It could be gridded with other black boxes, physically close or not. I can't even begin to see all the implications of cloud computing (LAN/Wi-Fi and Internet/optical fiber), grid computing, dedicated servers for every home, and cheap terminals in multiple form factors, from smart devices to 100" screens, passing by tablets and you-name-it.

    Where is the Piston in this world? What purpose does it serve? Is it just an old PC, from the old world, with a fancy shape? If anything, the next best thing to win the living-room market (under current economic circumstances, blablabla) will be under or within the type 3 price range; certainly not above it.

    I just don't get Piston's strategy. I certainly don't see why, in any scenario, I would need, let alone buy, such a device.
    Reply +9