|telboy007 1 hour ago||Wow this comment thread is fucking dull.|
|mateau 2 hours ago||
"Game studios were just not taxed very hard with Directx11 you say, well we know otherwise"
Dx11 can ONLY process 4 light sources, 6000 draw calls, 10,000 AI objects.
DX12 can process 1000's of light sources, 600,000 draw calls and 100,000 AI objects.
A game written to the max spec of DX11 does not come close to a game written to the DX12 spec.
DX12 will allow for higher concept gaming, incredible visuals, and frame speeds unheard of with DX11.
DX12 will also DEMAND better programmers.
This debate is over.
I'm not here to argue with you.
You can have the last word I really don't care. This has become tedious and banal.
|FMV-GAMER 2 hours ago||
@mateau I believe DX12 will make a massive difference to PC, but only because PC has limitations that the consoles do not, namely the draw call limitation.
I don't believe DX12 will make a massive difference to XB1, simply because the consoles already have a low level API in place, they can use more than one CPU core for draw calls, and lastly the consoles are more GPU limited.
For example we see a great many games already where the GPU is the limiting factor and in these cases upping the amount of instructions the CPU could send isn't going to make much difference.
I believe DX12 will make the difference on PC as developers won't be making dumb decisions that hurt GPU performance just so they don't hammer their draw call limit, and this is what they are currently doing.
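The GPU-limited argument above can be reduced to a simple pipeline model: when the CPU and GPU work in parallel, frame time is set by whichever side is slower, so cutting CPU submission cost only helps when the CPU is the bottleneck. A minimal sketch of that reasoning (illustrative Python with made-up timings, not a real profiler):

```python
def frame_time_ms(cpu_submit_ms, gpu_render_ms):
    """With CPU and GPU working in parallel, the frame time is set by
    whichever side is slower (a simplified pipeline model)."""
    return max(cpu_submit_ms, gpu_render_ms)

# GPU-bound case: GPU needs 33 ms/frame, CPU needs 20 ms to submit draw calls.
before = frame_time_ms(cpu_submit_ms=20, gpu_render_ms=33)
# A faster API (DX12-style) cuts CPU submission cost to 5 ms...
after = frame_time_ms(cpu_submit_ms=5, gpu_render_ms=33)
print(before, after)  # 33 33 -- no gain: the GPU was already the bottleneck

# CPU-bound case: the same API change helps dramatically.
before_cpu = frame_time_ms(cpu_submit_ms=40, gpu_render_ms=16)
after_cpu = frame_time_ms(cpu_submit_ms=5, gpu_render_ms=16)
print(before_cpu, after_cpu)  # 40 16
```

This is exactly the commenter's point: a console already running sub-1080p is in the first case, so extra draw-call headroom changes little.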
|mateau 2 hours ago||
"Also link please for your theory that XB1 could only use two CPU cores "
I did not say that. In fact MS tweaked the Xbox API, which is basically DX11 plus, to use 7 cores for gaming and 1 core for O/S housekeeping. But that is probably not accurate either, as MS is so closed-mouthed regarding Xbox technicals. And while I have some curiosity about consoles I don't really care one way or the other.
What I did say is that DX11 in general can only use one to two cores, as those cores can ONLY send data to the GPU serially AND sequentially.
On the other hand DX12 enables ALL CPU multithreaded and multicore assets to send data to the GPU Pipelines.
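To illustrate the distinction being argued here: under DX11 one thread effectively feeds the GPU, while DX12 lets several threads record command lists in parallel that are then handed to the queue in one batch. The following is a conceptual Python simulation of that submission model only — not actual Direct3D code, and the function names are invented for illustration:

```python
import threading

def record_commands(thread_id, num_draws):
    """Each worker records draw commands into its own command list,
    mimicking DX12-style per-thread command list recording."""
    return [f"draw(thread={thread_id}, call={i})" for i in range(num_draws)]

def submit_parallel(num_threads, draws_per_thread):
    """All threads record concurrently; the recorded lists are then handed
    to a single queue in one batch (conceptually like a batched submit)."""
    lists = [None] * num_threads

    def worker(tid):
        lists[tid] = record_commands(tid, draws_per_thread)

    threads = [threading.Thread(target=worker, args=(t,)) for t in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # One submission of all recorded lists to the "GPU queue"
    return [cmd for cl in lists for cmd in cl]

def submit_serial(draws):
    """DX11-style: one thread issues every draw call in order."""
    return record_commands(0, draws)

if __name__ == "__main__":
    parallel = submit_parallel(num_threads=4, draws_per_thread=250)
    serial = submit_serial(1000)
    print(len(parallel), len(serial))  # both 1000, but recording was spread over 4 workers
```

The total work submitted is the same; the difference is that the recording cost no longer piles up on a single core.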
And in the case of AMD APUs, Asynchronous Compute Engines and Asynchronous Shader Pipelines give AMD a 100% performance advantage over ALL Intel IGP: i3, i5 and i7. This is undisputed.
DX12 will allow much more data to be processed and sent to the GPU than DX11 currently allows, even with MS DRAM memory tweaks and XB1 API tweaks.
But what difference does that make NOW?
We'll see soon enough.
|FMV-GAMER 2 hours ago||
@mateau Yes but the Directx11
API that Microsoft used for XB1 even at launch was not the
same DX11 API that PC uses.
Also link please for your theory that XB1 could only use two CPU cores because I haven't read that anywhere. What I have read was that even the Directx9 360 was able to handle more draw calls than Directx11 PC API.
Of course draw calls are only one piece of the puzzle; if you have a weak sauce GPU then you won't ever be able to take advantage of peak draw calls. That doesn't change the fact that I had read all about 360's higher draw call throughput than DX11 PC.
Game studios were just not taxed very hard with Directx11, you say? Well, we know otherwise, seeing the number of Directx11 games where XB1 has to run sub-1080p or struggles to hand in even 30fps performance. That's struggling to me.
So if developers are already showing signs of being GPU limited, then why are XB1 gamers choosing to believe that something which increases the amount of instructions a CPU can send to the GPU is going to benefit them with massive performance gains? See, the GPUs are already limiting performance, otherwise games wouldn't be running sub-1080p, so how is being able to send more instructions to an already taxed GPU going to help?
No, I don't believe DX12 will make much of a difference to the performance of XB1, and if I am wrong then fair enough, I will admit so. However, if I am truly wrong then I will be in good company, as people smarter than me who have bachelor's degrees in computing have opined on this, which is how I am forming my opinion.
Lastly, I do own an XB1, so any performance gain in so-called exclusives would be welcome, but Halo 5 is running at 60fps, yet when taxed the resolution drops well below 720p; that's the GPU's shader prowess, not CPU draw call inefficiency.
|mateau 2 hours ago||
The implication is that DX12 will not improve DX11 games.
And that is absolutely true, not only on Xbox but on PCs as well.
For DX12 to have an impact the game needs to support DX12.
And in that regard, how can you say that support for 1000's of light sources in DX12 vs 4 in DX11, 600,000 draw calls in DX12 vs 6000 in DX11, and 100,000 AI objects in DX12 vs only 10,000 in DX11 will not have a huge impact on improved gameplay? On the face of it, DX12 allows for gaming that is light years beyond anything that DX11 can process.
Games will become truly epic.
And you still say that DX12 will have no impact?
Even games ported from DX11 to DX12 will be processed at a much faster rate, simply by virtue of 100x draw call performance. You cannot render an object until you draw it.
When Xbox supports DX12 and DX12 games are written for it, then you will see 1080p at 45fps or better.
It just stands to reason.
But rather than debate it all we have to do is wait and see!
|FMV-GAMER 2 hours ago||
@SuperShinobi The PS4
isn't competitive with a gtx680 or 7970 mate, it's not my
intention to start a pissing contest but if you haven't been
keeping up with the Digital Foundry articles the last 20 or
so months an i3 paired with 750ti is enough to match the PS4
and sometimes exceed it. The 750ti has less than half the
compute performance of a 680 despite it at first glance
looking like a superior card seeing as 7xx is a higher
overall number than 6xx.
Also, I don't see any evidence of PC developers struggling with RAM issues. Like I said, the 750ti with only 2gb of GDDR5 on a very modest 128 bit bus has matched if not marginally outperformed the PS4 over the last 20 months, sometimes considerably so.
If anything the consoles have struggled with memory issues seeing as they use APU's so bandwidth has to be shared between CPU and GPU which is one consideration for why so many games don't even have texture filtering.
There may come a time 4-5 years from now when PS4 performs as well as a 7970 or 680, but that will only happen if lots of PC gamers replace their GPUs for something better. Kind of like how the 8800gtx was a Directx10 GPU 2.5x faster than PS3 but didn't run Directx11 games like Crysis 3, as there wasn't a big enough market to optimise for GPUs as old as the 8800gtx once most enthusiast PC gamers had upgraded to Directx11 GPUs. The same thing will happen this generation: PC gamers will upgrade en masse to Directx12, 13 or 14, and PC versions of games will only support Directx13 or 14 while the console versions will be Directx12. If that happens, it's conceivable that PS4 could perform better than a 680 simply because PC gamers aren't using 680's anymore, so developers won't waste time optimising for it, but that's a long way off yet.
You are right about consoles having their own advancements over time, but I don't think 4k gaming will be one of them, as 4k isn't an advancement per se; it comes from having a GPU with lots of memory bandwidth and ROPs. In other words, you can't optimise towards that goal.
|mateau 3 hours ago||
Personally I've no dog in the Xbox hunt. So I really don't care. Also, as you know, MS "tweaked" the Xbox API to send data to the GPU pipeline non-sequentially, as did Sony for that matter.
As you also know, DX11 cannot send data to the GPU non-serially or non-sequentially, and in fact usually with DX11 not more than 2 cores are enabled.
This is why DX12 is such a game changer for PCs: not only does it allow ALL cores to process data, those same cores in AMD APUs can send that data asynchronously, and the GPU stores it and uses it when ready. The CPU does not wait and the GPU does not wait. Also, that is why benching silicon in single core mode is a lie. DX12 is multicore gaming. Just because Intel IGP IP is so poor, that doesn't mean writers should cripple AMD silicon with hacked benchmarks just to lie to the consumer.
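The "CPU does not wait and the GPU does not wait" claim describes a producer/consumer arrangement: the CPU queues work and moves on, and the GPU drains the queue whenever it is ready. A toy Python sketch of that pattern (purely conceptual, not driver code; names are invented for illustration):

```python
import queue
import threading

def cpu_producer(work_queue, num_items):
    """CPU side: enqueue work without waiting for the GPU to finish it."""
    for i in range(num_items):
        work_queue.put(f"job-{i}")   # put() returns immediately; CPU keeps going
    work_queue.put(None)             # sentinel: no more work this frame

def gpu_consumer(work_queue, results):
    """GPU side: drain queued work whenever it is ready to."""
    while True:
        job = work_queue.get()
        if job is None:
            break
        results.append(job.upper())  # stand-in for actually executing the job

if __name__ == "__main__":
    q = queue.Queue()
    results = []
    gpu = threading.Thread(target=gpu_consumer, args=(q, results))
    gpu.start()
    cpu_producer(q, 5)   # finishes enqueueing without blocking on execution
    gpu.join()
    print(results)       # all five jobs processed, in submission order
```

Neither side stalls on the other as long as the queue has room, which is the decoupling being described.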
The biggest issue that DX12 faces is not adoption; as MS points out, the adoption rate of DX12 is staggering. The biggest issue is competent programming.
Game studios just were not taxed very hard when dealing with DX11.
Witcher 3 and Arkham Knight are prime examples.
Whatever you are buying now for GPU tech is for the future of DX12, not for the obsolete DX11 past.
|FMV-GAMER 3 hours ago||
@mateau Well I don't think it will, and I have stated my reasons why, and anyone in here that has argued against my opinion hasn't been able to give me any sort of reason why other than insisting I am wrong.
Either consoles have low level APIs or they don't. All the performance gains shown have been from upping the number of CPU cores that can send draw calls, which the consoles can already do, unless console gamers are really arguing that modern games are having all the draw calls supplied by just one of the 8 Jaguar CPU cores, which would also mean that "next-gen" games are running with less CPU resources than last gen.
Also, if all the performance gains come from having more than one CPU core send draw calls to the GPU, and the XB1 will also benefit in the exact same way, then people need to ask why Microsoft took eight years to design their next console with 8 tiny Jaguar CPU cores but were somehow not able to get an API ready that takes advantage of more than one of them until another two years after the console launches. In other words, Microsoft (a software company) took over a decade to make software that exposes more than one CPU core, despite their hardware having more than one CPU core for over a decade?
You see the title of the article? You notice how there is no mention of consoles in the title?
Also, how many developers have gone on record now and said DX12 will make very little difference to XB1?
Feel free to believe what you want, but it's not long now until it launches.
|Grantelicious 3 hours ago||Shame it's so boring.|
|LoccOtHaN 3 hours ago||
@MattEvansC3 Here bro,
the remedy for your problems with auto update :D
Just turn-off some of'em ;-)
-> https://support.microsoft.com/en-us/kb/3073930
|mateau 3 hours ago||
Quite right. Most folks are gaming with their TV and likely that is 1080P.
I bought a Samsung 22" HDTV monitor, largely as a laptop monitor, and I don't need to spend several hundred dollars more for higher res.
Even for cad that monitor is perfect.
That is why my earlier point about DX12 opening new markets for game developers is spot on.
At 1080p an AMD APU will give you better than 45fps!!!
That is good gaming for the rest of the world who can not afford the bleeding edge!!
We tend to forget the rest of the world is NOT the United States or Europe but they are folks who like the same toys that we do!!
|TheDarkSide 3 hours ago||@mateau Good point. I vaguely remember reading an article somewhere, claiming that 1080p was the res of choice for many PC gamers. Of course, I emphasise the word choice. I'd imagine 60fps at least, and all the graphical bells and whistles turned up to 11 would be the reason for it.|
|XbDf 3 hours ago||Can't wait for the DF verdict on DX12 on X1, now that will be a good read.|
|mateau 3 hours ago||
"It's an enthusiast thing for people who want to experience the bleeding edge of gaming right now. Nothing more, nothing less."
The point that I am making is quite simple. AIB GPUs are not cheap. Why are consumers given information that doesn't help them make informed choices?
I am not interested in how a dGPU card performs at 45fps at 4k, I want to know how the performance is at 65fps or more at 2k.
Which handles better at 200mph, a Ferrari or a Maserati? Who cares? I don't drive at 150mph, much less in a Ferrari!!
Pieces like this one here are completely irrelevant. The whole issue of single core benchmarking is a huge lie, as the single greatest benefit of DX12 is that it enables all CPU multithreaded and multicore assets!!
To test at single core is a big lie, as under DX12 single core performance DOES NOT EXIST!!!
Also, AMD APUs CRUSH ALL Intel i3, i5 and i7 in draw call performance by 100%. The author of this piece did not tell you that now, did he?
|mateau 4 hours ago||
Exactly my point!!!!
4k is certainly a stress test but when almost ALL gaming uses at the most a 2k monitor, 4k doesn't give consumers a relevant read on performance.
It's like track testing a car at 150mph as the relevant indicator of handling.
I would even say that most folks do not game at better than 1080p!!!
|Nibbles 4 hours ago||i know bugger all bout WarHammer but colour me impressed and interested|
|giapel 4 hours ago||So not really a remake then, eh?|
|riceNpea 4 hours ago||
@man.the.king thanks for
the recommendations. Somewhat embarrassingly I've not read
anything by Asimov and Clarke, my choices have been more
contemporary, authors like Peter F Hamilton and Iain M Banks. Such a loss is his death. His Culture novels I'm especially fond of because of their wonderful mix of hard sci-fi and understated British humour.
I really need to get round to reading the classics though. Goodness knows I've wasted enough time reading bad sci fi hoping to find a gem.
I had a little look at Flowertown on Amazon to see what it's about and thanks for recommending it but it's really not my taste. I'm quite fussy, the sci fi I read has to be set in the future, preferably far future. As wonderful as it is being alive in this age of discovery I'm impatient to know the answers so I often find myself fervently wishing that I was born at least a century from now. So the closest I get to knowing is reading the brilliant imaginings from people that can make me believe they have a window to the future.
|mdeneuve 5 hours ago||
Um, I don't think I've ever seen anyone claiming 4k is "necessary", let alone *everybody*.
It's an enthusiast thing for people who want to experience the bleeding edge of gaming right now. Nothing more, nothing less.
|bad09 5 hours ago||
I would hardly say "everybody" claims it is necessary. Less than 1% actually game at 4K :)
|DarkSeptember 5 hours ago||
I have always given new tech a chance. I can remember when Kinect was released, I actually said it was quite good. I own many Kinect games (Child Of Eden being the best). I remember picking up this game for a mere £5 and I still felt ripped off!!! The game just ignored every gesture you made and did whatever it liked!
Many years later and Kinect V2 is just as bad. If you've ever played 'Fighter Within' you know what I'm talking about !
|mateau 5 hours ago||
Single core DX12 benchmarking is a pathetic attempt to hide
the fact that ALL AMD APU's A6 - A10 outperform Intel i3, i5
and i7 IGP by 100% in draw calls at 30fps.
The best that an Intel i7 4690 can do is 2.2 million draw calls at 30fps. The AMD A6 processes 4 MILLION draw calls and the A10 processes 4.4 MILLION draw calls at 30fps.
Intel IGP is SOOOOOOOOOOOO pathetic.
Go here and read!!
www.anandtech.com/show/9112/exploring-dx12-3dmark-api-overhead-feature-test
Starswarm is also a mature game simulation metric that is also very telling!! Go here:
www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm
|mateau 5 hours ago||
Actually MS is going to release about 100 legacy games from Xbox 360.
DX 12 will give a huge performance boost to games that get ported to DX12.
|mdeneuve 5 hours ago||
"I will have upgraded in a week or two and will let you know. I don't have any unusual drivers on my PC but will let you know how my 5.1 surround sound headset works out as that's one of those add on "peripheral" drivers that could go flaky."
Thanks mate, I appreciate it :)
I have this Tascam Audio/MIDI device that I had some trouble with getting it to work on my Win8 PC (especially the MIDI part), since it didn't officially support Win8 (luckily the Win7 drivers turned out to work fine in the end).
But it's quite an expensive device that I use a lot, so I'm a bit nervous about the Win10 upgrade.
|mateau 5 hours ago||
What I find laughably ABSURD and almost sublimely BS is 4k.
Everyone "claims" that 4k is necessary while most are still only using 2k monitors.
98% of ALL gamers still have 2k monitors!!! LOL
|Khanivor 5 hours ago||So Clicker Heroes is all gaming stripped of any and all pretense and illusory padding?|
|FMV-GAMER 5 hours ago||
@wicked_uk I think you
should read my posts again and why DX12 won't make a massive
difference to XB1.
When they announced that they had no plans for backwards compatibility, they actually meant that at the time. Going back on your word, and explaining why something technically won't make a difference and the reasons why it won't, are two different things.
|BellyFullOfHell 5 hours ago||To be fair, the original game was shit as well once the novelty of the controller wore off. It was a bloody nice setup though.|
|mdeneuve 5 hours ago||
"The only way they raised bar against high end PC was through amount of ram available for textures."
Even that is somewhat debatable considering the amount of memory available in the PS4, and all the bandwidth constrictions.
|FMV-GAMER 5 hours ago||
@mdeneuve I myself wonder
exactly how much use these features will get if any.
For example, NVIDIA won't want to share its software tech with AMD; we have already seen how they disable PhysX if the driver detects you have an AMD GPU in your PC.
I imagine tasks would be broken up and hived off to each GPU, kind of like how they propose to do with the Intel integrated GPUs lying dormant on the Intel CPUs.
I am hoping that when you SLI or Xfire two of the same GPU, VRAM will double though, as this is one area that would be very beneficial, seeing as currently you are doubling GPU prowess but VRAM stays stationary.
I am just excited for more draw call efficiency and the enhanced visuals that will bring to the table.
I haven't upgraded to Windows 10 yet. Remember I told you that I did a clean install of Windows 7 a while back and things went wrong? Well, one of the things that went wrong was that I lost my CD key for Windows 7, so as far as Microsoft is concerned my copy of Windows isn't legit.
So while everyone else gets a free upgrade, I need to pay for a legit copy for my main PC, and of course I have two other PCs that need a copy of Windows 10 as well, so I won't upgrade until I put more cash into the "spending bank". However, I don't see the hurry just yet anyway, as I don't think there is anything in gaming terms that I can test that uses Windows 10 specifically, so no rush.
I will have upgraded in a week or two and will let you know. I don't have any unusual drivers on my PC but will let you know how my 5.1 surround sound headset works out as that's one of those add on "peripheral" drivers that could go flaky.
|shehzaanshazabdulla 6 hours ago||@a-double-vodka Can't wait for Deus Ex later this year. First taste of DX12 in a game I'm interested in.|
|man.the.king 6 hours ago||
Both of us must be discussing something that's really offensive to some individual, as they chose to neg us for discussing SF and religion.
"I'm a science fiction nut and that's extended to a fascination about how the Universe works, so I read a lot about cosmology and quantum physics."
You too?
That's my favorite genre. For books, movies, you name it.
Asimov and Clarke are still my favorites. A recommendation if you haven't read this: try Nightfall, by Asimov and Robert Silverberg. I presume you've already read stuff like The Martian, Wool, Sand, etc. Maybe also try some more contemporary SF like Flowertown.
"I've just realised that I'm in serious danger of just waffling such is my love of this subject. There's so much happening right now, it's a wonderful time to be alive. We're developing theories for what happened before the 'big bang', we're finding exoplanets comparable to Earth in habitable zones of other stars, we're working on the nature of the arrow of time, what it is and how there was, counter-intuitively, more entropy in the past than now, there's so much happening, it's a constant source of wonder."
Oh I know the feeling. I know very few people personally who are as much into SF as me, so not many topics of commonality when it comes to face-to-face conversation regarding genres of interest.
"I started the reply to you mentioning religion then I've just realised I didn't say a word about it. That probably reflects how I feel about it quite well."
I get what you feel. I used to be like that, circa 1998-99. I think what makes some people contemptuous of religion is:
1) Equating of a religion and its underlying philosophy (which is pretty much the same for all major religions) with some of its fanatic "followers". Ironic, considering that fanaticism and intolerance means the opposite of being religious.
2) Not understanding the underlying philosophy or goal of religion (hint: goal is definitely not utilitarian).
3) A preconception that Science and Religion are two different things. From what I have seen, they are the same (if you actually dig deep and find out the underlying goals, NOT take the surface myths at face value). But... I guess it's easier to discount rather than perform deep introspective exercises. :D . IMO, "scientists" are too often guilty of that particular mistake. It's always easier to look outside than inside.
|a-double-vodka 6 hours ago||
"I still have no idea of what kind of real world gains I'm going to see with DX12... and no one seems to even have ballpark figures."
Early days. We will find out, hopefully soon.
|Hxy3000 6 hours ago||@Suarez07 I'm aware what he meant. That still doesn't make it a Xbox exclusive, just an exclusive among consoles.|
|a-double-vodka 6 hours ago||
"deployed the term 'cunt-hive'"
Agreed, this is a good term. One I shall use from now on.
|Carbon_Altered 6 hours ago||
I flippin' PRE-ORDERED this game, I was that excited about
it. Got some free fancy pants armour DLC for my troubles.
Then I tried to set it up. Got my chair in front of the Kinect, and the point in the article about chair arms getting in the way was spot on. Got it going eventually, but it was sooooooooooooooooooo fiddly.
Almost lost it on a level where I had to crawl out of the mech to do something or other, but the game never made it clear that I was supposed to be ducking down. Just a bit of direction from the UI would have made it so much less painful.
Just too much hard work to get it working right. Then the actual gameplay/graphics were crap on top of that. Mech Warrior back in the day was more involving.
|ghostgate2001 6 hours ago||
Charlie Brooker deployed the term "cunt-hive" and I'm happy
to defer to his wisdom on that. He certainly knows a cunt
when he sees one, and his definition of what constitutes one
pretty much matches mine.
For the collective noun, though, I find "a compendium of cunts" to be compelling, in an alliterative kind of way.
|mdeneuve 6 hours ago||
Those are some impressive features that you mention.
I wonder how this will work. For example will it be able to do this transparently, or does the game have to be developed to support this?
What will be the performance impact of this? I mean I can imagine copying memory between two totally different GPUs will be quite slow.
Personally I think the biggest benefits of DX12 will be improved performance and better drivers across the board.
BTW, have you already upgraded to Win10?
If so, was it smooth sailing for you?
I have it waiting to be installed, but I'm a bit nervous to do so right now. Will all my drivers still work? (I have some specialized audio drivers for music for example).
|StooMonster 6 hours ago||
"The PS4 does support 4K output. Firmware update 1.5 already enabled 4K photo viewing. Lately there has been quite a lot of talk about 4K video, HEVC decoding and 4K Blu-Ray support on the PS4. Those are expected to become available fairly soon via firmware updates."
As with XbOne, the PS4's HDMI hardware can only support 4K at 30Hz or 24Hz (for movies), which isn't good enough for gaming.
You need HDMI 2.0 for 4K at 60Hz.
Also, the new standard for 4K video is Ultra HD Blu-ray discs, these offer 66GB or 100GB storage and are different to regular old single layer 25GB and dual-layer 50GB Blu-ray discs ... you need new hardware/player for the new optical discs.
You cannot update the PS4's Blu-ray drive to an Ultra HD one with a firmware update.
Although maybe Sony will release a new SKU with the new hardware in it, and update HDMI too.
|blarty 6 hours ago||@liveswired Yeah possibly, the only advance would be in an update that utilises more concurrent cores for draw calls, although how close the current XB1 libraries are to DX12, I don't know.|
|shehzaanshazabdulla 6 hours ago||
TBH after all the reading I've done I still have no idea of
what kind of real world gains I'm going to see with DX12...
and no one seems to even have ballpark figures.
In any case it's going to be interesting come the end of this year when DX12 titles like Deus Ex start shipping.
|StooMonster 7 hours ago||
"He also said xbox one wouldn't have backwards compatibility, and yet it now does."
Marketing calls it 'backwards compatibility', but it's not really: you cannot put any 360 disc in or simply run any old 360 download.
It's done entirely through a software emulation/wrapper of selected 360 titles, and moreover it doesn't run at 100% speed.
So he wasn't lying.
|frazzl 7 hours ago||@avluis Thanks a lot for the info dude. Will definitely be getting this. Enjoy the game :)|
|clockworkzombie 7 hours ago||
If your motherboard is still in its warranty period then you are fine and MS will activate your licence.
|wicked_uk 7 hours ago||@FMV-GAMER He also said xbox one wouldn't have backwards compatibility, and yet it now does. I think he is keeping things close to his chest until it is time to announce them....|
|TheDarkSide 7 hours ago||
@FMV-GAMER I have to say
mate, Gemini started this nonsense with his very first post
since the thread reopened. I was glad to see @Suarez07, a PC
gamer (and a thoroughly decent one too), call him out for
his unnecessary trolling, which I can only assume was aimed
at months old posts. Two wrongs don't make a right, etc.
I've noticed he's since deleted his account, probably in
embarrassment at his comments. People like that aren't good
for the sites console/PC gamer relations, he comes across as
someone who spits the dummy whenever his opinion isn't met
with unanimous praise TBH.
EDIT: *unnecessary. WTF was the word I typed first? I don't even think it was a word. In my defence, I'm suffering from the Mother of all hangovers, which doesn't seem to be going. Middle age isn't good for you. :D
|kevboard 7 hours ago||
@Centrale1 that is according to everyone who understands what good game design is! If you are good you can get through the turbo tunnel first try, it is just hard to do, since every obstacle gets this blinking warning thingy before it enters the screen.
Some Sonic games on the other hand throw shit at you that you could never have anticipated coming, making it almost impossible to dodge. That is due to the high speed you move at, and the view in the 2D games being extremely limited for a guy of his speed.
and this is a Sonic fan speaking...
It is bad game design to put in stuff you can't react to fast enough, making it necessary to learn the level.
2D Sonic compensates for this problem by providing multiple paths: if you learn the level you can fly through it; if not, you will probably fall to the lowest path, but you get a chance to do the level slow.
|kevboard 7 hours ago||@saulxtigh for training you can rewind the game by holding LT. works for all the 2D games I think... well except Jetpack Refuled since it is a 360 game|
|Suarez07 8 hours ago||
4k pong doesn't count.
Sure it can run simplistic games in 4k, but that was not what was being implied at the time.
The console doesn't manage most modern games at 1080p 60fps, which it was also implied was going to happen.
High end PC has indeed come further since the PS4's release, but the distinctly middle-end 7870 (probably low end now) still holds its own against the PS4, as does an i3, and going forward so will even lower-end AMD CPUs.
The reality is, the PS4 was never a high end PC competitor. The only way they raised the bar against high end PC was through the amount of RAM available for textures. Outside this, they remain very much at the lower end of discrete GPUs.