If the 290X has so much compute power, why is it behind the 980?

You are currently reading a thread in /g/ - Technology

Thread replies: 72
Thread images: 16
File: images-42.png (11 KB, 489x301)
If the 290X has so much compute power, why is it behind the 980 in performance?

Can someone explain to me the other aspects of a GPU that affect performance?

Is the difference in compute power why people say AMD cards age better?
>>
File: images-43.png (11 KB, 489x301)
>>51482969
>>
>>51482969
Depends on what you mean by performance.

If you mean gaming, then I'd say gaming isn't all about hardware or even the drivers. It's about controlling the market and making the most of it. Nvidia has lots of ties with game developers and gets them to use what's best suited to Nvidia hardware. That means Nvidia can cut back on the areas where AMD excels and steer games toward what Nvidia does best. AMD has some ties to game devs too, but they don't usually get traction. Vendor-neutral software gives a more accurate picture of which GPU is more powerful in a neutral setting, but it's rare.
>>
>>51483026
So is compute power relevant to gaming performance then?
>>
>>51483174
More relevant is the game engine/developer and their ties to a GPU vendor. Those ties predict whether the game will run better on AMD or Nvidia.

For example, games that run on Unreal Engine 4 or CryEngine (Crytek) will run worse on AMD GPUs. Games that use the Frostbite engine will run worse on Nvidia. There are other game engines, but you have to find out where their loyalties/money flow.
>>
>>51482969
>Is the difference in compute power why people say AMD cards age better?

No, it's because AMD drivers are so shitty that it takes months or years for them to mature enough to give their cards the performance they should have launched with.
>>
Did you read that graph, by any chance?

The 290 and 290x are nearly 50% better than the 980....
>>
File: cute anime pic 0579.png (186 KB, 307x315)
>>51483591
>....
go back to reddit
>>
>>51483591
Are you talking to me?
290x tends to trade blows with a 970 in games.
>>
Nvidia has bad OpenCL drivers.
AMD has bad DX11 drivers.

The 290X beats the Fury X in many OpenCL tests as well.

Back when GPU bitcoin mining was a thing, slightly different firmware was giving 50% more performance on certain brands.
>>
>>51482969
But what is compute power good for?
>>
>>51484302
Computing things
>>
File: 1448058579234.jpg (122 KB, 600x450)
>>51484568
Gee, thanks.
Never would have figured that out on my own.
>>
>>51484302
>>51484590
OpenCL is what makes AMD cards good for crypto mining.

Compute is good for scientific research, network hashing (i.e. buttcoin), and just basic number crunching, rather than only the texture mapping, shading, vector, and lighting work of 3D rendering. A compute-oriented card can handle general, non-graphics tasks better than a normal graphics card could.

The idea behind GCN being able to do compute shit better was sort of a happy accident for AMD. GCN was originally pitched for HSA, back when AMD's tagline in the mid 2000s was "The Future is Fusion". They wanted a graphics core whose shading units could double as compute units, so they could help a traditional CPU with serial and parallel workloads alike. That's why they were interested in ATI's technology in the first place, and why we have the APU today.

But during the crypto booms of 2011 and 2013, AMD decided to slide their graphics line further in that direction 'by accident' to capitalize a little more. It's probably one of the only reasons they are afloat today.
>>
>>51484897
So compute power isn't really related to gaming performance?
And bitcoins saved AMD?
Neat.
>>
>>51485054
No. You should always pick a graphics card based on what you're going to do with it. If you don't need compute-heavy performance, then it shouldn't be a factor.
>>
>>51482969
GPUs are a massively parallel array of ALUs plus assorted other good things. The ALUs are the little execution units that do the actual processing. GPUs have tons of them, and they're controlled through command processors. The ratio of command logic to ALUs can play a big role in how well those ALUs are utilized in a super heavy static workload. Such a workload is represented by the compute bench you posted, and by the video rendering bench done in Sony's editing software.

The arrangement AMD went with for its GCN architecture is 4x16 SIMD lanes per compute unit, so effectively every compute unit has 64 ALUs with their own control logic, and those 64 ALUs can each be fed instructions at a pretty high rate.
Nvidia's Kepler and older architectures used 192-ALU-wide building blocks (the SMX); Maxwell moved down to 128 per SMM if I recall.
Obviously other factors play into this - caches, scratch space, registers, etc. - but those only accentuate the heart of the matter. If you want high ALU utilization in compute workloads you need more command logic per ALU, and GCN does that. That's why AMD's cards these past few generations have had fairly high FP32 performance in some particular metrics.

Double precision, particle simulation ops that hit memory harder, and other things tell a different story, however. It all goes to show how incredibly complex modern GPU architecture is.
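
To put rough numbers on that (a back-of-the-envelope sketch in Python, assuming reference clocks and the usual 2 FLOPs per ALU per clock from fused multiply-add; not taken from the benchmark in the OP):

# Theoretical FP32 throughput ~= ALU count x clock x 2 (an FMA counts as two FLOPs).
# Reference-clock figures; real cards boost and throttle.
cards = {
    "R9 290X (44 CUs x 64 ALUs)": (2816, 1.000e9),
    "GTX 980 (16 SMMs x 128 ALUs)": (2048, 1.126e9),
}
for name, (alus, clock_hz) in cards.items():
    print(f"{name}: ~{alus * clock_hz * 2 / 1e12:.1f} TFLOPS FP32")
# ~5.6 TFLOPS for Hawaii vs ~4.6 TFLOPS for the 980 on paper - before drivers,
# caches, and how well the command logic keeps the ALUs fed get involved.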
>>
>>51485145
More to the point: no, compute performance is not indicative of gaming performance in most cases.
There are some games that use tons of compute functions for lighting, shadows, post processing effects and things like that, but they aren't exceedingly common yet.
>>
Because it is an ASIC designed to do synthetic benchmarks
>>
>>51482969
Let me break it to you: compute is useless for gaming. The only thing compute is good for is niche, obscure calculations. CUDA is the standard at the enterprise/HPC level for computing, and even AMD is adopting it on their FirePro cards to catch up - except their cards don't really support CUDA, so they have to do software translation.

For computing, CUDA is much faster than OpenCL.
>>
>>51485201
>Let me break it to you, compute is useless for gaming.
This statement is 100% patently false. Tons of things in games use compute, its just a matter of how high the utilization is.
The PS4, Xbone, and even the WiiU were all designed to increase the utilization of parallel compute in gaming.
Better frustum culling, path finding, post processing, physics, all of these are best suited to the GPU.

http://www.slideshare.net/naroon2/amd-2012-hsa-in-gaming
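
To make "compute in games" concrete, here is a tiny sketch of the kind of pass a post-processing step boils down to; it uses PyOpenCL only because it's short, and the kernel, buffer size, and gain value are made up for the example:

# One work-item per pixel, a trivial brightness kernel over a luminance buffer -
# the same shape of work as many post-processing / culling / physics passes.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

pixels = np.random.rand(1920 * 1080).astype(np.float32)  # fake 1080p luminance buffer

prog = cl.Program(ctx, """
__kernel void brighten(__global float *px, const float gain) {
    int i = get_global_id(0);               /* one work-item per pixel */
    px[i] = clamp(px[i] * gain, 0.0f, 1.0f);
}
""").build()

mf = cl.mem_flags
buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pixels)
prog.brighten(queue, pixels.shape, None, buf, np.float32(1.2))
cl.enqueue_copy(queue, pixels, buf)
print(pixels[:4])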
>>
>>51485226
>Tons of things in games use compute, its just a matter of how high the utilization is.

That's the key here: the amount of compute GCN cards have is unnecessary and poorly thought out.
>>
>>51485266
Not at all.
The latest Maxwell cards are holding their own against AMD. They've had a better cache subsystem for a few generations now, and they're improving their architecture specifically for compute performance. The 980 outperforms the Fury X in a number of metrics in CLBench.
AMD, Intel, and Nvidia all agree on this. Compute performance absolutely is the future, and it is vital to gaming performance in the coming years, among other things. Everyone across the board is doing everything possible to push on this one area.
>>
File: 1448236996563.png (72 KB, 980x720)
>>51485266
Did you not see the ashes benchmark?
>>
File: nvidia cannot into compute.jpg (83 KB, 1026x839)
It is worth pointing out that the ACEs that give GCN (especially Hawaii onwards) such enormous compute power are not exposed to developers in DX11, which is partly why you don't see the 290X murdering everything short of a 980 Ti. The other major factor is that, as a rule, games are shader limited.

Will this change? Yes, but not to a huge degree - game engines are moving towards more compute-based workloads, but at the end of the day shaders (and their ability to be kept fed by other components) are the deciding factor in performance.
>>
>>51486925
But if games are moving towards compute, doesn't that mean GCN is a better choice?
>>
>>51486997

Perhaps, perhaps not. My limited understanding of how GCN's ACEs work is that there is effectively a predefined latency they will never go below (i.e. get faster), but above that you can simply keep piling on tasks and the time to compute will not change. Saturating a card to that degree is unlikely to ever be something games do.

Kepler and especially Maxwell are lightning quick at serial compute loads, but they really, really do not like switching between compute and shader workloads, as this tends to stall the pipeline (hence why Nvidia said don't do it).

I personally feel GCN is the superior architecture, but I don't have any inside knowledge of how likely it is that we'll see GCN's strengths shine in the PC space. Ironically, this enormous parallel compute ability is what lets the PS4 punch above its weight. Off the top of my head, Killzone Shadow Fall leans moderately heavily on the ACEs the console has, and you can be damn sure first-party PS4 titles will use even more as time goes on.
>>
>>51487066
We haven't seen a single console port yet that went directly to DX12; I guess that's when you're likely to see something close to ACE saturation. Hawaii is going to age pretty darn well.

PC titles really don't use compute a lot. Nvidia's clutch on PC gaming means we have useless tessellation fucking everywhere, and Elite Dangerous is apparently already pushing it really hard just because they use compute shaders and spend a non-zero amount of time in them or something. Can't have people buying fewer Intel CPUs and Nvidia GPUs, that would be terrible!
>>
File: maxresdefault-3.jpg (139 KB, 1920x1080)
>>51487232
>nvidias clutch on pc gaming
It's why this time consoles are winning.
When you can have PC graphics on a console, performing the same and costing less...
I was hoping DX12 might give AMD the edge they needed. I'd love an AMD Zen ITX APU system that can give decent performance in newer DX12 titles, but that hope is slowly going away.
>>
>>51487288
Why? No one is using DX12 yet. No one even knows how to use it (properly). That's going to take years, much like it takes years for consoles to be used properly, and it's going to change with each new architecture of graphics cards; the hunt for the best performance is going to start anew.

AMD winning the console war is HUGE, that means there ARE going to be lots of lazy ports that just 1:1 map as much as possible from their respective console API to DX12 or Vulkan. Those are going to be optimized on and for AMD and that’s why you can fucking bet Pascal is going to have ACEs. No matter how much Nvidia pays devs, consoles make more money.
>>
>>51487360
The ARK team said back in August that they were going to release a DX12 patch, promising a 20% FPS boost. After the AotS bench results were released they said they were holding the patch back, and in an AMA they said DX12 is a complicated matter.
ARK is a GameWorks title.
UE4 doesn't even support AC; Epic said it's a nice feature, but they won't implement it yet.
Nothing will change.
>>
File: cloth.png (375 KB, 960x540)
>>51484302
Rendering
>>
>>51487360

It is worth considering that DX11 is fucking any and all hardware not designed for it - dual-core CPUs haven't really been a thing for the best part of a decade (hyperthreading is a decent replacement on the low end) and GPU hardware has scaled far, far beyond what the API was ever written to cope with.

>>51487424

UE4 might as well be an Nvidia engine given how close those two are. I do find it highly suspicious that just as the whole async compute story broke, ARK had their DX12 patch delayed... just as they announced GameWorks support.

My gut feeling is consumer Pascal cards are just Maxwell turbo edition, so Nvidia is scrambling to use what leverage it has to try and prevent the widescale adoption of async compute in games, simply because they have no answer to it - it really is free performance for AMD, and given how close (performance-wise) most of AMD's cards are to Nvidia's, they can smell change on the wind. GameWorks is just one part of their efforts to keep the status quo.
>>
>>51487424
ARK doesn't matter jack shit on consoles, and neither does which features UE4 does or doesn't have. Console games to this day often use bastardized UE3 versions with tons of shitty console hacks to MAEK SHIT GO FAST. And this MAEK SHIT GO FAST is going to translate well to all GCN hardware and terribly to all Nvidia hardware until Pascal.
>>
>>51486735
>>51486925
The 290x really is one impressive card. How does it compare to the Titans when it comes to GPU compute?
>>
>>51487467
Nah, if the past has taught us anything, Nvidia is going to use AC harder than AMD and it's going to be their new tessellation. AMD had tessellation first too. Of course no one used it until Nvidia had it too, and then Nvidia went all in on it, overdid it as fuck, and now uses it against AMD.
Look forward to Nvidia GameWorks titles and maybe even console ports cranking their AC usage up to eleven so that it runs like absolute dogshit on AMD and older Nvidia hardware.
>>
>>51487360
Nvidia will be producing GPUs for the Xbox 2.
>>
>>51487529
No they won’t. Tegra is still shit, consoles are going to continue using x86 anyway and you won’t ever see dedicated GPUs again in a console. The console market belongs to AMD and the only one who could compete here, Intel, has zero fucking interest in it.
>>
>>51487480

This and the following are the only charts I have.
>>
>>51487564
>>
>>51487476
it does matter though. PS4 code won't run on PC.
MS themselves told small indies not to use DX12. It's too complicated; the things the AotS devs do just scratch the surface. No one will develop a new engine and assemble a team just to get AC support. This is how DX12 will work: there will be engine providers and content providers, i.e. game devs. If the engine of choice doesn't support a feature, for most devs it might as well not exist, and UE is the most popular engine right now.

W10 has been out for 3 months and over 25% of the Steam userbase uses it. Where are those easy-to-do ports from consoles with AC support? They require time and effort. And not everyone is sold on low-level APIs; Ubi said they have no plans for it in the near future, because it requires time and effort. The only DX12 titles are AMD- and MS-sponsored titles. Most GameWorks devs discredit all these rumors about DX12 ports: Witcher, GTAV, BamHam, now ARK.
And inb4 Vulkan, it's the same shit. Even worse in some regards. MS already provided DX12 Xbone tools to devs to use on PC.
>>
File: Screenshot_20151123-082656.png (711 KB, 1440x2560)
>>51487564
>>51487569
I like how this loser stalks out these threads just to post this image. Most likely he's the op.

There's literally pages of his shit.

http://rbt.asia/g/image/n_G-P2c0u-hMPESm4IUuXg
>>
>>51482969
i saw some compute benchmark where fucking Titan X was behind a 400 dollar amd gpu
>>
File: Screenshot_20151123-082842.png (830 KB, 1440x2560)
>>51487599
the cringe is real
>>
>>51487573

Watch this space for Frostbite games - that engine is fucking ready to go for DX12 support given the legwork DICE put in for Mantle (which DX12 shares a lot of code with). I would bet proper money EA becomes one of the earliest adopters out of the major players.
>>
>>51487608

Man, you really don't like it when there is evidence showing Nvidia being terrible, do you?
>>
>>51487602
you mean this? >>51487569

>>51487599
>>51487608
Your point? I (>>51487480) asked.
>>
>>51486735
why doesnt amd use this in marketing
>>
>>51487609
DICE's lead programmer said he wants DX12 to become a minimum requirement for all their games. Techies love that shit, but it's not about them, it's about suits.

Battlefront is a DX11 game. It runs well, but it's still a DX11 game; I don't think it even has Mantle support. I'm pretty sure early 2016 EA games won't jump ship and shift to DX12 suddenly.
Guys from SE like DX12 as well. DX:MD is a DX12 title, which is already more than what DICE did for low-level APIs. I'm talking about DICE as a studio, not the people working at DICE, some of whom worked on Mantle.
But does the next DICE and/or SE game support AC? AC is not part of the core spec.
>>
>>51487573
>PS4 code won't run on PC
Yes it will. You don’t understand porting. Porting a game means touching as little code as possible, not rewriting everything.
>indies
Yeah, that’s where the best console graphics come from, right.

>W10 is out for 3 month and over 25% of Steam userbase use it.
Which is not a lot for games that target as big an audience as AAA titles do.

>Were are those easy to do ports from consoles with AC support?
Well, where are the console ports at all?
>fallout
Yeah, that sure is pushing the tech on consoles.

Wait until next year, that's when a lot of Gaming Evolved titles are going to drop, Frostbite is ready, and as I said, most devs don't even use consoles properly yet. NO ONE KNOWS HOW TO USE DX12. It is going to come, but it is going to take time.
And just shut up about ARK; ARK is a low-effort UE4 thing that doesn't do any significant rendering code on its own. No, Vulkan is not the same thing; Vulkan is going to have a much broader userbase and great support from LunarG.
>>
>>51487569
didn't the removal of hardware scheduling, or the limiting of it in CUDA, fuck up Nvidia?
>>
>>51487608
and? raising awareness of AMD's good side is bad? wat
>>
>>51487657

>AC is not a part of the core spec.

No, but it relies on core specs, and from that point it's only a small hop, as it were, to implement it for GCN cards - and what amounts to free performance is hard to turn down.

>>51487671

My knowledge is a bit limited on the subject, so all I can say is it took quite a while for Maxwell-based professional cards to be released - even now (iirc) most of their top-end Quadros are still Kepler-based.

>>51487687

Knowledge is the enemy of fanboys.
>>
i own a 980ti for gaymen and movies, i hope i can switch to amd in two years
>>
>>51487664
>Well, where are the console ports at all?
AC, BamHam, FO4, Battlefront, CoD: just open a recent release chart; if it's not indie, it's a console port.
>~30% of market is not much
It's more than enough to warrant a DX12 mode if it's as freaking easy as "changing a few lines of code". Linux gets ports, and it has even less marketshare, and porting from DX11 to OpenGL is not that easy.
>Muh Vulkan
I'm talking about the challenges it puts before devs, not marketshare.
>LunarG
What does it have to do with anything? Aren't they making a debugger/profiler for Vulkan?
>>
>>51487635
>why doesnt amd use this in marketing

They would be sued for using misleading benchmarks

The Ashes benchmark is a synthetic benchmark that auto-identifies AMD cards and bumps them up a bit.
>>
>>51487696
i just want to discuss computer tech without shitposting, too hard in here.

im >>51487698 and i've really hated nvidia for a while now, but it's the best for my use; like i said, i hope amd can change this in two years' time

wanted a fury x so bad but it couldn't keep up with how well this 980ti overclocks
>>
File: 1446929530464.png (106 KB, 1008x1216)
>>51487728

As shocking as it is, the latest CoD is actually quite impressive for its hardware scaling.
>>
>>51487733
and nvidia with bullshit tessellation ain't misleading?
>>
File: 1446930114072.jpg (92 KB, 523x440)
>>51487733

>Ashes benchmarks is a synthetic benchmark that auto-identifies AMD cards and bumps them up a bit.

Source?
>>
>>51487696
>free performance is hard to turn down.
Not using GameWorks and making sure your game runs fine on everything, not only last-gen Nvidia hardware, is also rather hard to turn down, especially considering there are free alternatives.
And devs still do it. It's about suits.
>>
why can everyone discuss CPUs like gentlemen but GPU discussion is shit flinging
>>
File: 1419206769559.jpg (328 KB, 810x587)
>>51487789

/g/ can't discuss CPUs like gentlemen because it always ends up as pic related. In fact /g/ is even more ignorant of how CPUs work than GPUs (see: the lack of understanding of how Intel jews you on secondary features across their lineup, or the ins and outs of what AMD FX chips are good at).
>>
>>51487733
do you mean the fact it enables async for AMD, but disables it for Nvidia?
Nvidia asked for that. Their async compute performance is not stellar, or at least wasn't stellar back in August. Oxide worked closely with Nvidia to make sure AotS runs well on their hardware.
>>
>>51487564
The original Titan still raping everything means Nvidia has the capability to produce fast compute cards, but they have consciously chosen not to in an effort to optimize for games. They're probably thinking of splitting up their consumer and enterprise cards entirely to save money. No reason to include useless features on Quadros if they're just gonna sit in some datacenter doing some highly specialized task.

It's not that it can't. They have chosen not to. I'm sure those hardware engineers making 500k/yr have better foresight than any of us.
>>
>>51487602
FP64 compute performance has limited real-world applications, one of the more popular ones being crypto-currency, but before the boom happened you literally never heard anyone talk about this figure in detail.

Compute performance and rendering performance aren't intrinsically tied to one another, stuff like >>51487465 only needs raw computational power at moderate precision levels, just like games, so the only thing that matters is core count x frequency, minus software optimizations. For many rendering solutions, a Titan X or similar Maxwell chip is still the fastest option.

Nvidia doesn't want to waste wafer space on adding FP64 units back into their chip designs, so starting from Pascal the card can run in mixed precision mode, only executing chunks of code in FP64 that absolutely require it, instead of going all-or-nothing. This was all explained at GTC 2015.
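
For scale, a back-of-the-envelope comparison (the FP32 figures and FP64:FP32 ratios below are the commonly quoted ones for these chips, not measurements from this thread):

# FP64 throughput ~= FP32 throughput x the chip's FP64:FP32 ratio.
# GK110 Titan: 1/3 (with its double-precision mode enabled), Hawaii 290X: 1/8,
# Maxwell GTX 980: 1/32 - which is why the old Titan still wins at FP64.
cards = {
    "GTX Titan (GK110)": (4.5, 1 / 3),
    "R9 290X (Hawaii)": (5.6, 1 / 8),
    "GTX 980 (GM204)": (4.6, 1 / 32),
}
for name, (fp32_tflops, ratio) in cards.items():
    print(f"{name}: ~{fp32_tflops * ratio:.2f} TFLOPS FP64")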

>>51487805
There seems to be a mixed understanding of what is applicable to games vs development.
>>
File: 900x900px-LL-5aa8b374_71452.png (23 KB, 650x400)
>>51487815

>It's not that it can't. They have chosen not to.

It does make sense, given GCN is sort of one design to rule them all, which, as current games show, leaves a shitload of power-hungry components sitting around mostly doing nothing.

Maxwell is pretty great for games on the power consumption front when there are stalls and whatnot, but when you truly push it balls to the wall it isn't much better than Kepler or GCN for power draw.
>>
>>51482969
Because Nvidia has highly optimized drivers.

AMD has always made more powerful cards; however, while "no drivers" is an epic may may, they never focus on optimization.
>>
>>51487728
>I'm talking about challenges it puts before devs
It sure would be nice if that mattered at all, but it doesn't. This is politics, and if the higher-ups don't want devs to waste their time on a Win10-only mode, it won't happen. A DX11 mode is currently absolutely needed; it takes a lot of work, and any work, no matter how minuscule, that would go into DX12 would be better spent on DX11 in their eyes.
In case you didn’t notice, Linux ports are mostly done externally and aren’t even profitable yet either. And a Win10 mode isn’t about making the game playable on Win10 at all, it’s just about a bit more performance. That isn’t going to make a lot of money unless your name is Microsoft and you really want to push your shitty OS.

Things are however going to get very interesting once there is no need for a DX11 renderer any more. Currently all those console ports would be Console → DX11 → DX12 ports, ripping out what little modern optimization there is on consoles, and all of the ones you’ve mentioned have been brewing well before any decent Win10 adoption or even its release, or are plain garbage to begin with. (Lol Fallout.)
Just starting over from scratch for DX12 is of course not *that* cheap, since the game is also optimized to just run on one hardware configuration and has to be made to thread better across unknown cores etc., but I expect more direct ports to happen pretty soon. DX12 and Vulkan as the "performance mode" that gets attention, with DX11 being kept around for compatibility, will happen a lot sooner than DX11 getting dropped entirely.

>Aren't they making a debugger\profiler for Vulkan?
That and the LunarXchange platform. If there’s one thing Valve is great at, it’s catering to devs, this should be nice.
>>
>>51487748
That's not shocking considering that this was the case for Advanced Warfare as well. CoD games perform well in general; it's just that Ghosts was unforgivably bad and left a taste of fermented poop in everybody's mouth. For Christ's sake, a Titan couldn't even handle the menu, it only looked about as good as the first round of Unreal Engine 3 games or worse, and it wouldn't even let you play the game with 4GB of RAM.
>>
>>51488066

Equally, consider that in that chart the Gainward 780 Ti - a very highly clocked model - is as fast as the 280X, which should be borderline impossible.
>>
It's more difficult to program games to use compute shaders alongside graphics shaders. On top of having more compute power, AMD cards have proper async compute. So as long as developers keep trying to get more raw power out of the cards with smart scheduling to overlap render-bound and compute-bound tasks, AMD will keep on giving and Nvidia will fizzle out.
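
Roughly what that overlap looks like from the host side (an illustrative PyOpenCL sketch with made-up kernels; real async compute is a D3D12/Vulkan queue feature, and whether two queues actually run concurrently is up to the hardware and driver):

# Two command queues on one device: one stands in for the graphics queue, one
# for an async compute queue. Independent work on separate buffers can overlap
# if the scheduler allows it - the idea behind async compute.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
gfx_queue = cl.CommandQueue(ctx)
compute_queue = cl.CommandQueue(ctx)

prog = cl.Program(ctx, """
__kernel void shade(__global float *a)   { int i = get_global_id(0); a[i] = a[i] * 0.5f + 0.1f; }
__kernel void physics(__global float *b) { int i = get_global_id(0); b[i] = b[i] * b[i]; }
""").build()

mf = cl.mem_flags
a = np.ones(1 << 20, dtype=np.float32)
b = np.ones(1 << 20, dtype=np.float32)
buf_a = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=a)
buf_b = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=b)

prog.shade(gfx_queue, a.shape, None, buf_a)        # "render-like" work
prog.physics(compute_queue, b.shape, None, buf_b)  # independent "compute" work
gfx_queue.finish()
compute_queue.finish()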