Alcifer Comments

  • Digital Foundry: Hands-on with DriveClub

  • Alcifer 06/09/2014

    I'm amazed he made it through a whole interview on real time lighting without once saying "It's mathematically correct." :D

    Looks really stunning; it shows just how much a good lighting model can bring in terms of atmosphere. Still, it would have been interesting to know whether they could have hit 60fps at 720p, or by using Guerrilla's interlaced 960x1080 method.
    Reply +2
  • First footage of From Software's Project Beast

  • Alcifer 30/05/2014

    @marcofdeath except Hynix weren't producing HBM in 2013; they're only supposed to start ramping up production in the second half of this year, so it doesn't look like either console is using it.

    Besides, Chipworks have done a teardown of the XB1. Apparently it's using H5TQ4G63AFR DDR3 SDRAM.

    In that light, I'm really looking forward to Nintendo's next console :D
    Reply +3
  • Hardware Test: PlayStation 4

  • Alcifer 13/11/2013

    Looks like the PS4 has AMD's TrueAudio tech, the same as the new Radeon cards.
    Reply +6
  • Digital Foundry vs. Dead Rising 3

  • Alcifer 12/11/2013

    @MeBrains
    holy bedjezus... that looks like a boring game... where's the tension?
    In the dramatic pauses between frames ;)
    Reply +8
  • Eurogamer's guide to system swansongs

  • Alcifer 10/11/2013

    @dogmanstaruk, you usually see the system's preowned section in Game getting clogged up with 3-4 iterations of FIFA as the system dies. Like some terminal degenerative disease. Not really what I'd call a swansong.
    Reply +27
  • Face-Off Preview: Battlefield 4 next-gen vs. PC

  • Alcifer 30/10/2013

    COD confirmed to be 720p on XB1 and 1080p on PS4.

    There may be a pattern appearing here...
    Reply +4
  • Alcifer 29/10/2013

    @Suarez07, there's a tweet about the XBone AO on the NeoGAF thread.

    You can actually see it quite well in the last comparison shot in the DF article. Look at the gun of the guy on the left and you'll see it shadowing the elevator doors in the PS4 and PC images. This comparison here also shows it pretty well.
    Reply 0
  • Alcifer 29/10/2013

    @Suarez07, DICE have already commented that the XB1 version is not final anyway, it doesn't have AO enabled.

    For a proper comparison of the game I'd wait for DF to compare final builds anyway. Enabling AO, and possibly AA, will most likely reduce the framerate further. With any luck DF will get better captures to work with when they don't have limited access; I read the article as saying they captured each platform on separate days.

    As it stands the comparison confirms what most people have been saying for a while, that the PS4 is more powerful and the XB1 versions of games are lagging behind in development.
    Reply +4
  • Editor's blog: Battlefield 4 Face-off Preview Q&A

  • Alcifer 29/10/2013

    @blarty,
    The big question is - if ambient occlusion affects light as it hits a surface based on the surface itself and adds a softening effect similar to ambient light, does this combined with the limited RGB of the XB1, mean that when AO is added to the X1 version, we'll see similar softening as is appearing in the PS4 version?
    While it is true that AO tends to give a smooth shading effect, it should not impact the clarity of texture detail in the final image, just darken areas of it.

    It's more likely that the PS4 version is applying an additional shader based AA or motion blur pass that blurs the image. In the case of character faces in particular this could be part of the sub-surface scattering applied to skin.

    Fixing the RGB output of the XB1 will certainly reduce the contrast in the images; at the moment the DF captures make many textures appear "sharper" than those in the PC exemplar.
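
    To illustrate the distinction, here's a minimal sketch of how an AO term is normally applied (my own toy lighting equation, nothing from DICE). The occlusion value only scales the ambient light down per pixel; it never reads neighbouring pixels, so it can't blur texture detail:

    ```cpp
    #include <cstdio>

    struct Vec3 { float r, g, b; };

    // ao = 1.0 on fully open surfaces, tending towards 0.0 in creases.
    Vec3 shade(Vec3 albedo, Vec3 ambient, Vec3 direct, float ao)
    {
        // AO scales the ambient term only; no neighbouring pixels are read,
        // so texture detail is darkened at worst, never blurred.
        return { albedo.r * (ambient.r * ao + direct.r),
                 albedo.g * (ambient.g * ao + direct.g),
                 albedo.b * (ambient.b * ao + direct.b) };
    }

    int main()
    {
        Vec3 lit = shade({0.8f, 0.6f, 0.4f}, {0.2f, 0.2f, 0.25f},
                         {1.0f, 0.95f, 0.9f}, 0.5f);   // a crevice pixel
        printf("%.2f %.2f %.2f\n", lit.r, lit.g, lit.b);
    }
    ```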
    Reply +4
  • Watch Epic explain Unreal Engine 4's fancy pants visual effects

  • Alcifer 24/10/2013

    @dogmanstaruk, it's pretty ironic considering all the "PS4 can't do 1080p 60" comments after the initial XBox reveal.

    I'm beginning to suspect they designed a system that could render 360 games at 1080p and didn't expect developers to do anything beyond that.
    Reply +1
  • Introducing Games of the Generation

  • Alcifer 21/10/2013

    I think the current gen still has a year or two in it, at least on some platforms. Will certainly be keeping an eye on these lists to see if there is anything really awesome that I've missed though.

    Not sure what my top 5 would be but I think Nier, Uncharted and Assassin's Creed 2 are definitely up there.
    Reply 0
  • Kingdom Hearts 3 PS4, Xbox One gameplay video

  • Alcifer 15/10/2013

    @Rogueywon, the cutscenes for Re:Coded and 358/2 Days are included in the HD remakes on PS3, so there's at least an option to avoid playing those two.

    No idea what they're planning to do with 3D regarding the HD remakes. Maybe a digital Final Mix closer to KH3 release...
    Reply 0
  • Jedi Knight 2: Jedi Outcast retrospective

  • Alcifer 13/10/2013

    I put a ridiculous amount of time into this game, and Jedi Academy, mostly just reloading checkpoints to replay fights over and over again; it never got dull. I think only Bushido Blade came anywhere near these games for visceral melee combat.

    That LucasArts could make such a mess of The Force Unleashed when these games existed is probably proof enough that they had no idea what they were doing. For all its gritty and dark atmosphere, watching Starkiller try to beat a stormtrooper to death with his Fisher Price Padawan's first lightsabre was just sad.
    Reply +5
  • Assassin's Creed 4 director wants to take the series to ancient Egypt

  • Alcifer 12/10/2013

    @The-Bodybuilder,
    I'm no historian, but could someone enlighten me on this "recent evidence"?
    I'm no historian either, but I think he might be referring to the discovery of towns built to house the workers who built the pyramids. These apparently show that the pyramids were built by gangs of seasonal labourers, rather than slaves as previously thought.

    I don't think this is evidence that slaves weren't used in ancient Egypt, just that the pyramids weren't built by armies of slaves but instead large numbers of normal Egyptians.

    http://www.bbc.co.uk/history/ancient/egyptians/pyramid_builders_01.shtml
    Reply 0
  • Drakengard 3 will be a digital-only title on PS3 next year

  • Alcifer 10/10/2013

    Looks like it will be available at retail in the US. Will be importing this one then.

    http://blog.us.playstation.com/2013/10/09/drakengard-3-coming-to-ps3-in-2014/

    http://store.na.square-enix.com/store/sqenixus/en_US/pd/productID.288520500#.UlZpDiTzw7D
    Reply 0
  • Digital Foundry: the complete Xbox One architects interview

  • Alcifer 08/10/2013

    @khanthony, the main issue is that newer techniques may not be picked up by third parties who will want their games to be playable on all major platforms.

    In the previous generation, full deferred shading was largely limited to Sony first-party titles because it was not suitable on the 360, which could not support a G-buffer larger than 10MB.
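
    Some rough arithmetic shows the problem (I'm assuming a typical four-target 720p layout here; real engines varied):

    ```cpp
    // Why a full G-buffer didn't fit in the 360's 10MB of eDRAM.
    #include <cstdio>

    int main()
    {
        const double mb = 1024.0 * 1024.0;
        const int w = 1280, h = 720;
        const int bytesPerPixel = 4;   // one RGBA8 render target
        const int targets = 4;         // e.g. albedo, normals, depth, material

        double oneTarget = w * h * bytesPerPixel / mb;   // ~3.5MB each
        printf("%d targets: %.1fMB vs 10MB eDRAM\n",
               targets, oneTarget * targets);            // ~14.1MB - too big
    }
    ```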
    Reply +2
  • Alcifer 08/10/2013

    @Kretenn, there is a difference between making a subjective comparison of Forza 5's graphics to other next gen titles and looking critically at the techniques they're using. Aside from an increase in resolution and quality, they don't appear to be doing a great deal that they didn't already have implemented in some form on Forza 4. The results look good, but they don't say anything about the graphics capabilities of the XB1.

    I would take Goossen's figures regarding ROPs with a pinch of salt. There is only a direct relationship between bandwidth and fillrate if none of the pixels are being rejected (depth/alpha testing). 16 ROPs will be suitable for scenes with little overdraw, like those in Forza 5 or Ryse, which, to the best of my knowledge, is running at 900p 30fps. The fact that the game with the highest rate of overdraw, Dead Rising 3, is using dynamic resolution suggests that they are already having problems with fillrate or shading.

    Considering that for the latter part of the current generation PC graphics have been largely crippled by the capabilities of both consoles, it would be a shame if PC and PS4 were held back by the limitations of the XB1's GPU and RAM setup.

    Even the architects sound like they were disappointed with some of the compromises they had to make, rejecting GDDR5 and then a potentially much larger eDRAM pool. They certainly made the best system they could within the constraints they had, but they were pretty unlucky in what they had available to work with.
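
    For a sense of how much fill a resolution drop buys back, quick arithmetic on the figures above (plain pixel counting, nothing platform-specific):

    ```cpp
    #include <cstdio>

    int main()
    {
        const double p1080 = 1920.0 * 1080.0;   // ~2.07M pixels
        const double p900  = 1600.0 *  900.0;   // ~1.44M pixels
        printf("900p needs %.0f%% of 1080p's fill and shading\n",
               100 * p900 / p1080);             // ~69%
    }
    ```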
    Reply +1
  • Alcifer 07/10/2013

    @Ninja_skill, the cloudlight examples are really interesting and the latency figures might suggest the calculations could also be done locally across multiple frames.

    I'm not sure I'd expect games to be using the cloud for these sorts of purposes any time soon though. It would require GPU-based servers, like Nvidia's GRID, and would probably consume hundreds of times more computing power than Titanfall uses for its servers. Even at a reduced rate, and sharing data across multiple users, that doesn't sound viable.
    Reply +1
  • Alcifer 06/10/2013

    @khanthony, not to downplay Azure, which is a very interesting resource, but I do think Sony also has a strength in their online offering.

    At least for the present, PSN is a much more open platform than Live, allowing developers to use their own servers and to offer cross-platform play, which is why games like FF:XIV, Timesplitters and War Thunder are being released on PS4 but not XB1.

    As with the hardware architectures, these are two very different approaches, so it should be interesting to see how things pan out.
    Reply 0
  • Alcifer 06/10/2013

    @SF1, we've heard about hardware for encoding and decoding audio and video on PS4, so it certainly does have some fixed function hardware, though most of what has been confirmed comes from interviews soon after the reveal. There are also some nice details about their hUMA setup, but they're pretty technical.

    I agree it would be very nice to have Sony publish an overview of the system architecture as a whole to be able to make more direct comparisons.
    Reply +4
  • Alcifer 06/10/2013

    @CMcK, the distinction I'd make between the Cell on the PS3 and the GPU on the PS4 is that there is nowhere near as much requirement to use it for any particular purpose.

    The Cell chip effectively gave the PS3 only 33% of the CPU of the 360, the rest of it being made up of SPUs which required very specific program architecture. In the case of the PS4 (based on current clockspeed and accounting for SHAPE) it has maybe 80% of the CPU power of the XBox as well as having 50% additional GPU compute.

    To balance the comparison of the two next gen consoles: I'd probably say that if the PS4 uses brute force, it also has more flexibility in how those resources are applied. The XB1 is perhaps more efficient in the resources it has, but it is much more rigid in how those can be used.

    A good comparison of these from launch titles would be Forza 5 and KZ:SF. Forza 5 makes excellent use of the XBox system capabilities, but it could probably be implemented with only minor changes on the PS4. KZ:SF pushes heavy graphics workloads for its lighting and rendering, making use of a huge G-buffer (5 floating point buffers), which could not be replicated on the XBox.
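
    To put a rough number on that G-buffer (assuming five FP16 RGBA targets at 1080p - my guess at the format, Guerrilla's actual layout may differ):

    ```cpp
    #include <cstdio>

    int main()
    {
        const double mb = 1024.0 * 1024.0;
        const int w = 1920, h = 1080;
        const int targets = 5, bytesPerPixel = 8;   // FP16 RGBA per target

        printf("G-buffer: ~%.0fMB\n",
               double(w) * h * targets * bytesPerPixel / mb);   // ~79MB
        // Far more than the XBox's 32MB of eSRAM could hold.
    }
    ```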
    Reply +3
  • Alcifer 06/10/2013

    @CMcK
    Ultimately this generation of consoles are very similar and it will be fascinating to see which approach works well most consistently. The brute force approach Sony have taken or the efficient approach Microsoft have chosen.
    Not picking on you in particular, but why do people think the XBox is more efficient than the PS4, or that the PS4 approach is brute force?

    You could equally say that overclocking the whole system and tacking on multiple fixed function processors is brute force, and that adding extra CUs and ACEs with flexible memory access is a more elegant/efficient solution.

    These are clearly two very different solutions to enhancing what is effectively the same base hardware, it just seems odd that they're characterised in this way.
    Reply +7
  • Alcifer 06/10/2013

    @Ninja_skill, I think you're right, virtual memory on the GPU goes far beyond just more efficient use of texture memory, although that is a big improvement.

    Sony have already presented an example of sparse voxel cone tracing for lighting using PRT in their GDC presentation on PSSL. It should be interesting to see what developers are going to do with this stuff and hUMA.
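
    A toy model of the PRT idea (purely illustrative - real PRT is handled by the GPU's page tables, not code like this): a huge virtual texture is split into pages, and only pages that actually get sampled are ever given physical memory.

    ```cpp
    #include <cstdio>
    #include <unordered_map>

    struct SparseTexture {
        static const int PAGE = 64;                      // 64x64 texel pages
        std::unordered_map<long long, bool> resident;    // page -> loaded?

        float sample(int x, int y)
        {
            long long key = ((long long)(x / PAGE) << 32) | (unsigned)(y / PAGE);
            if (!resident[key]) {
                resident[key] = true;                    // "fault": stream page in
                printf("page (%d,%d) made resident\n", x / PAGE, y / PAGE);
            }
            return 0.5f;                                 // stand-in for a texel fetch
        }
    };

    int main()
    {
        SparseTexture t;           // think of a 16K x 16K virtual texture
        t.sample(100, 100);        // only the two touched pages get memory
        t.sample(8000, 4000);
    }
    ```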
    Reply 0
  • Alcifer 05/10/2013

    @khanthony, the article you posted a link to earlier suggests that GNMX, the high level API, is actually closer to D3D11 than OpenGL. That, along with PSSL being very similar to HLSL but with lower level access to the hardware, helps developers port games from PC to PS4.

    "Most people start with the GNMX API which wraps around GNM and manages the more esoteric GPU details in a way that's a lot more familiar if you're used to platforms like D3D11.
    - Reflections' expert programmer Simon O'Connor

    http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4
    Reply +2
  • Alcifer 05/10/2013

    @khanthony
    Sony already went out and said it uses custom OpenGL
    Do you have a source for this?
    Reply +1
  • Alcifer 05/10/2013

    @OrbitScant, interesting. The first slide about the different heaps and buses puts me in mind of the interview with The Crew's developers.

    There's a lot more detail here:
    http://www.vgleaks.com/playstation-4-includes-huma-technology/
    Reply +2
  • Alcifer 05/10/2013

    @deesmith, I think a lot of misconceptions arise from people automatically assuming OpenGL is the only alternative to D3D, although Sony did implement OpenGL ES as a high level alternative to GCM on PS3.

    The "DirectX on PS4" idea comes from DirectX versions being used to describe GPU feature sets. Both consoles use the same Sea Islands architecture and both support the DX11.2+ feature set, but only the XBox supports the actual D3D11.2 API, with the PS4 accessing the same features through GNM and GNMX (although GNMX could be a clone of the D3D11.2 interface with their own implementation).
    Reply +6
  • Alcifer 05/10/2013

    @deesmith,
    Sony's reliance on OPENGL actually may limit optimization on some levels because they do not own the OpenGL codebase which by its nature cannot cater to any specific hardware. This may be why they have less actual AAA games at launch for exclusives - just a guess.
    There isn't actually any one OpenGL code base. It's an open interface standard that platform holders have to implement themselves in order to support it.

    This is pretty moot as the PS4 doesn't support OpenGL. The developers of The Crew have stated that Sony are using their own low level graphics API called GNM (possibly similar to Mantle) and a high level API wrapper with source code called GNMX (possibly similar to D3D11).
    Reply +7
  • Alcifer 05/10/2013

    The more I read about the lengths to which Microsoft went to build multimedia functions into the XBox, the more it sounds like a big gamble. They're betting on the strength of their gaming brand to get these features into homes and then trying to expand that market.

    However, a lot of this stuff sounds like it would be better built into a TV than a console (snap, Kinect, etc.). The TV already has multiple HDMI inputs, it's the main device used for all these features, and it's the first thing to be turned on.

    They might have been better off partnering with TV manufacturers to push Windows OS, Kinect and SmartGlass functionality than creating a hybrid set-top box/console.
    Reply +2
  • Alcifer 05/10/2013

    @OrbitScant
    It's interesting that MSoft always talk about cache coherent memory bandwidth and never mention the fact that PS4 can bypass the L1 & L2 caches completely.
    Indeed, Cerny was also very specific about adding extra flags to allocations so they could selectively flush GPU cache lines to maintain coherence. Without this the GPU would stall completely each time coherence needed to be enforced, flushing all cache data regardless of whether it was GPGPU or graphics related.

    I assume Microsoft would have the same problem but they don't go into that much detail.
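
    A toy of that selective-flush idea (my illustration, not Sony's actual mechanism): allocations carry a coherence flag, and a sync point only writes back lines from flagged allocations rather than emptying the whole cache.

    ```cpp
    #include <vector>

    struct CacheLine {
        bool dirty    = false;
        bool coherent = false;   // inherited from the allocation's flag
    };

    // Stand-in for the actual write to shared memory.
    void writeBack(CacheLine& line) { line.dirty = false; }

    void syncForCompute(std::vector<CacheLine>& cache)
    {
        for (auto& line : cache)
            if (line.dirty && line.coherent)
                writeBack(line);   // graphics-only lines stay cached: no stall
    }

    int main()
    {
        std::vector<CacheLine> cache(1024);
        cache[3] = {true, true};    // one GPGPU line to flush
        cache[7] = {true, false};   // graphics line, left alone
        syncForCompute(cache);
    }
    ```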
    Reply +4
  • Alcifer 05/10/2013

    My take on it is that when you buy a graphics card, do you go by the specs or do you actually run some benchmarks?
    Does he mean two AMD Radeon cards in the same family?

    If so, it's more of a price/performance trade-off than a question of which is more powerful; I wouldn't expect a 7850 to outperform a 7870, but I would expect it to cost less...
    Reply +16
  • Alcifer 05/10/2013

    Reading the interview in full was much better than the articles it spawned; I appreciate seeing the points put across in order and in their full context, though I can understand that so much text is daunting.

    I get the feeling that Microsoft have been very insular, developing the console based solely on their own internal goals and experiences, where Sony have made a point of reaching out to third parties.

    There's a lot of emphasis on multi-tasking behind the earliest decisions (8GB DDR3, virtualisation), clearly looking to expand their market and feature set beyond gaming.

    Their gaming-related choices seem to be informed largely by the design of the 360 (eSRAM, CPU, fixed function offload, Kinect) and its strong market position, while ignoring some of its inherent weaknesses (difficulty with deferred shading).

    When they talk about tweaking their setup (overclocks and CU numbers) with regard to existing titles, they seem to be doing so with exclusives: software that has already been balanced to run on their architecture. Obviously, if a game places little emphasis on shader processing (is not pushing the limits of 12 CUs), then an increase in the number of CUs will have virtually no impact on performance.

    The performance of multi-platform titles will say an awful lot about how well Microsoft's own priorities and balance fit with the industry at large, and by extension how well Sony have delivered too.
    Reply +26
  • Assassin's Creed Heritage Collection bundles five Assassin's Creed games

  • Alcifer 04/10/2013

    @dogmanstaruk, totally agree, Ezio really made the AC series for me, especially with the mess they made of Desmond's part. They should have included Embers with this, although I'm sure people can find it on Youtube if they want.
    Reply +1
  • Microsoft to unlock more GPU power for Xbox One developers

  • Alcifer 04/10/2013

    @nathanottenson
    I'm a fairly technically minded person but I can't figure out whether or not this is true.
    It's a fairly laboured example designed to prove a point about balance.

    It assumes that developers would enable depth testing without having it reject any pixels (no overdraw), which defeats the point of depth testing.

    If any pixels were rejected, by their own calculations, they'd be ROP bound with only 16 ROPs. If they turned off depth testing they'd still be eSRAM bandwidth limited because they'd hit the 109GB/s limit for writes (even if they had more than 16 ROPs).

    Also, as RL points out, they're completely ROP bound when rendering 8 bytes per pixel (4 read, 4 write), such as when rendering shadows or a Z-pre-pass.

    Edit: As Goossen says:
    parts of the frame can be fill-rate bound, others can be ALU bound, others can be fetch bound, others can be memory bound, others can be wave occupancy bound, others can be draw-setup bound, others can be state change bound, etc. To complicate matters further, the GPU bottlenecks can change within the course of a single draw call!
    The bigger the number, the less likely a part of the GPU will be the bottleneck. Hence bigger numbers, better performance for those parts of the pipeline.
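
    The arithmetic behind those balance claims, using the reported 853MHz clock and 16 ROPs (the byte counts are the usual RGBA8 cases, my assumption):

    ```cpp
    #include <cstdio>

    int main()
    {
        const double fill = 853e6 * 16;   // ~13.6 Gpix/s peak pixel rate

        // blended RGBA8: 4 bytes read + 4 bytes written per pixel
        printf("read-modify-write: %.0f GB/s\n", fill * 8 / 1e9);   // ~109
        // write-only RGBA8: 4 bytes per pixel
        printf("write-only:        %.0f GB/s\n", fill * 4 / 1e9);   // ~55

        // At 8 bytes/pixel the 16 ROPs already demand roughly the eSRAM's
        // 109GB/s write figure, so raising either number alone just moves
        // the bottleneck elsewhere in the pipeline.
    }
    ```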
    Reply +2
  • Alcifer 03/10/2013

    @marcofdeath
    Just like our friends we're based on the Sea Islands family.
    In the last article Goossen confirmed both consoles were using the same GPU architecture.
    Reply +3
  • Alcifer 02/10/2013

    @DopeBoy3010, yes, apparently the PS4 has a similar chip for scaling and compositing, although only for two layers rather than three.

    http://www.vgleaks.com/orbis-displayscanout-engine-dce/
    Reply +2
  • Could AMD's Mantle revolutionise PC gaming?

  • Alcifer 26/09/2013

    @_tangent, the fully programmable graphics pipeline was a future direction, so I don't think there will be much information available yet.

    At the moment graphics pipelines are split up into various stages, some of which are fully programmable through shaders and others of which offer only limited fixed functionality (like blending, raster operations, etc.). I think what AMD were talking about is writing C++-like code that compiles for the GPU, and actually writing libraries to implement the whole rendering process in software.

    It sounds like a future that Intel and Sony et al were reaching for with Larrabee and Cell.
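
    For a flavour of what "the whole rendering process in software" means, here's a toy rasteriser - the coverage test a fixed-function rasteriser does in hardware, reduced to plain code (nothing to do with AMD's actual plans):

    ```cpp
    #include <cstdio>

    struct P { float x, y; };

    // Signed area test: which side of edge ab the point p falls on.
    float edge(P a, P b, P p)
    {
        return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
    }

    int main()
    {
        const int W = 24, H = 12;
        P a{2, 1}, b{22, 4}, c{8, 11};        // one triangle in pixel space
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x) {
                P p{x + 0.5f, y + 0.5f};
                bool in = edge(a, b, p) >= 0 && edge(b, c, p) >= 0
                       && edge(c, a, p) >= 0;
                putchar(in ? '#' : '.');      // "shading" a covered pixel
            }
            putchar('\n');
        }
    }
    ```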
    Reply +2
  • Alcifer 26/09/2013

    @_tangent, I really don't know enough about architecture at that level to know how similar they are between manufacturers, although they must have similarities due to supporting all the same graphics pipeline stages on existing APIs.

    That said, I think it should be possible to go lower level than D3D and OpenGL. Compute languages are already exposing features like memory and thread management at a much lower level so you could add interfaces for fixed function graphics elements too.

    AMD have already talked about doing away with specific graphics features altogether in their GCN/HSA presentations, just using programmable units.
    Reply +3
  • Alcifer 26/09/2013

    @_tangent, yeah, it goes to show just how much more compute power is available in GPUs.

    I'd imagine CUDA would be much more efficient than using old style GPGPU tricks, both in terms of performance and in terms of developing algorithms without having to work around the fixed function elements of the pipeline.

    DirectX now has DirectCompute as an equivalent to CUDA, but that may well be built on top of CUDA for Nvidia GPUs anyway, so it may not have any benefits other than portability.
    Reply +3
  • Alcifer 26/09/2013

    @MattEvansC3, yeah, an API is an Application Programming Interface, it exposes certain functions to a program independent of their implementation.

    OpenGL and Direct3D are both graphics APIs while the CUDA APIs allow developers to use the CUDA platform in their program.

    Edit: CUDA isn't a language as such but a platform that can be used to extend existing languages to support running programs on Nvidia GPUs.
    Reply +2
  • Alcifer 26/09/2013

    @MattEvansC3, I'm sure CUDA is NVidia's language for GPGPU, similar to OpenCL, rather than a graphics API like Mantle.
    Reply +3
  • Alcifer 26/09/2013

    @funkateer,
    It seems to me the similarity to libGNM is more on a conceptual level rather than being directly similar from a development pov.
    Will be interesting to find out more about this.

    I was about to point out that PS4 and XBO both use GCN. The recent Edge article suggested PS4 APIs and drivers were ahead of the XBox's; that might be because Sony were working collaboratively with AMD on the API.

    Personally I'd be happy with conceptually close APIs anyway. If there are equivalent structures and functions available, it's pretty simple to smooth out any syntactic differences.
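
    A trivial sketch of what that smoothing looks like in practice (both "APIs" here are made up for illustration):

    ```cpp
    #include <cstdio>

    namespace ApiA { struct CmdBuffer {}; inline void Submit(CmdBuffer*) { puts("A submit"); } }
    namespace ApiB { struct CommandList { void Submit() { puts("B submit"); } }; }

    // One thin wrapper per platform hides the syntactic differences...
    #ifdef USE_API_A
      using CommandList = ApiA::CmdBuffer;
      inline void submit(CommandList& cl) { ApiA::Submit(&cl); }
    #else
      using CommandList = ApiB::CommandList;
      inline void submit(CommandList& cl) { cl.Submit(); }
    #endif

    // ...so engine code is written once against the shared concepts.
    int main()
    {
        CommandList cl;
        submit(cl);
    }
    ```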
    Reply +3
  • Alcifer 26/09/2013

    Mantle appears to have much in common with the GNM API used in PlayStation 4
    If true, this could benefit from Sony's push to get indies onto their platform, and from AAA support driven by PS4 development.

    -GNM/Mantle on PS4 + AMD Windows/Linux/Steam OS
    -DirectX on XBox + Windows
    -OpenGL on Windows/Linux/Steam OS/Android/iOS/Web
    Reply +5
  • Digital Foundry vs. the Xbox One architects

  • Alcifer 25/09/2013

    @sazse, sadly I think I agree.

    In many articles there are comments that can prove to be more enlightening than the article itself, but recently it seems every comment is being judged through the lens of decades of fanboy warfare instead of on its own merits and context.
    Reply +1
  • Alcifer 24/09/2013

    @Machiavellian
    The area where the X1 excels may be the key to levelling the field, because you do not have to be at the same resolution as the PS4 if scaling and variable scaling can make the on-screen picture look no real different to the user playing the game.
    Apparently the PS4 has a similar hardware scaling and composition chip, although with 2 channels instead of 3. Not surprising considering they're both supplied by AMD and the PS4 has no TV pass through or snap.

    http://www.vgleaks.com/orbis-displayscanout-engine-dce/
    Reply +3
  • Alcifer 24/09/2013

    @Murton
    For example, the memory bandwidth. They've just revealed that it's not 204GB/s but actually more like 140GB/s. So why did MS previously claim 204GB/s? Were they adding the 68GB/s of the DDR3 to achieve that figure? Because that's just plain wrong.
    They explain this pretty well, I thought. The maximum theoretical value is 204GB/s just for the eSRAM; with real code the maximum they achieve is around 150GB/s, either because the GPU cannot consume more than that or because it doesn't produce the optimal pattern of reads and writes to make full use of a read and a write in the same cycle. The GPU can access both eSRAM and DRAM at the same time, getting about 50GB/s maximum from the DRAM in practical code, so they say that their system has a peak of around 200GB/s in total.

    The compute units thing, there's 14 in there and only 12 are being used
    Every XBox APU is built with 14 CUs, but to keep yields acceptable they allow for 2 of those CUs to be faulty. Thus any particular console has between 12 and 14 working CUs, but only 12 active. They found one with 14 working CUs for their test, but don't elaborate on why enabling them didn't increase performance as expected; it has been speculated that the GPU hits another bottleneck, and RL doesn't push them on this even though it directly contradicts his previous article.
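
    Putting the reported figures side by side (109GB/s per direction, 204GB/s theoretical when reads and writes pair up, ~150GB/s measured - the percentages are just my arithmetic on those numbers):

    ```cpp
    #include <cstdio>

    int main()
    {
        const double oneWay = 109.0, peak = 204.0, measured = 150.0;

        // Fraction of cycles that must carry a second, opposite-direction
        // transfer to reach the theoretical peak:
        printf("cycles needing a paired read+write: %.0f%%\n",
               100 * (peak - oneWay) / oneWay);          // ~87%
        printf("measured vs theoretical peak: %.0f%%\n",
               100 * measured / peak);                   // ~74%
    }
    ```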
    Reply +3
  • Steam announces its own Linux-based operating system

  • Alcifer 24/09/2013

    @enkiduaruru, there may well be an OpenGL driver for the PS4 but it won't be used for many games. The big engines will all be using the low level GNM API, in the same way they did GCM on the PS3.

    GCM was much faster when using the RSX, not because it used the Cell but because it was a better fit for the hardware than OpenGL, thus using less CPU time and memory, although it did expose some really annoying details that had to be worked around.

    Also, if you are developing a game for PS3/4 you will most likely use an off-the shelf engine which is rendering API-agnostic.
    Each to their own really, but on PS4 those engines will use GNM if they expect to be competitive.

    Edit: The developers of The Crew mentioned both GNM and GNMX in their interview but no OpenGL. I think it's safe to assume there is no OpenGL support on PS4 as GNMX is there if a high level API is required to help porting from D3D.
    Reply 0
  • Alcifer 24/09/2013

    @enkiduaruru, GNM like GCM on the PS3 is much lower level than OpenGL. It's somewhere between the level of OpenGL/D3D and writing directly to the driver. Reply 0
  • Alcifer 24/09/2013

    @Liuwil, PS4 doesn't support OpenGL, it apparently uses its own graphics API called GNM with a higher level wrapper called GNMX that more closely resembles DirectX.

    OpenGL has gained a lot more traction in recent years but a lot of that is down to OpenGL ES being used on Android and iOS.
    Reply 0
  • Alcifer 23/09/2013

    @darkmorgado, I think he means that they weren't being lazy when they weren't working on HL3; they were busy building their own OS instead.
    Reply +1