Nvidia Pascal or AMD Greenland GPU?

You are currently reading a thread in /g/ - Technology

Thread replies: 135
Thread images: 16
File: NVIDIA-GeForce-GTX-980-17.jpg (269 KB, 1600x900)
which do you think will be better for long-term use? (as in 3+ years)

I hear Nvidia gimps older cards to make newer ones look better, but AMD struggles to keep up with Nvidia in terms of game frame rates.
>>
>>53096685
In my admittedly limited experience, the performance gap between the two has never been huge (around 10fps), but there's no doubt NVIDIA has had the power efficiency crown for a few generations

For Pascal and Greenland (I thought it was Polaris?), give it a month or so, see what deals come out, wait for newer stable drivers, and make an informed decision.
>>
>>53096685
For games, it's anyone's guess at this point.
For pro use, the answer is kind of obvious.
>>
File: 1429547011132.png (619 KB, 767x621)
>>53096685
>AMD struggles to keep up with Nvidia in terms of game frame rates
We talking crown, halo, per watt, or per dollar?
Because the answer is yes, no, yes, no.

>>53096857
>a few generations
one
Kepler used just as much power as GCN

OP is dumb, Greenland is a beef ass server APU. The series was Arctic Islands, but was renamed Polaris.
>>
>>53096685
Whatever performs better (for the price)
>>
>>53097747
But none of the new gpu/apus are out yet
>>
>>53096685
greenland sounds cooler
>>
There is almost no difference now between top end GPUs except maybe 5-10 FPS. Both will probably last you the same amount of time. AMD recently pulled ahead because of good drivers.
just get what you think has more bang for your buck. Personally I choose AMD over Nvidia because Nvidia has shoddy business practices and AMD is friendlier to open source
>>
in this case, the word "gimp" is a buzzword with no meaning.
i beat Fallout 3 with my 9800 GT that i had to cool manually because its fan was broken, it regularly would overheat and it still works.
i beat Skyrim on the GTX 650 and was able to play Company of Heroes 2 on the 650.

what does the word "gimp" mean here?
>>
when are the new gens hitting the shelves? people keep saying to hold off on buying a card cos they're right around the corner, but how far away is it really?
>>
>>53099092
I'd expect E3 to bring at least a launch date announcement, if not an official launch
>>
>>53099092
4-6 months from now for Polaris I think
>>
>>53096685
>hear Nvidia gimps older cards to make newer ones look better
You might want to extend your research to include more than AMD fanboys.
>>
>>53099045
it means that old cards get worse fps with new drivers to push users to upgrade to the next gen. There isn't a lot of evidence for this, but there's enough to make it a /g/ meme.
>>
File: couldnt handle the zone.gif (1 MB, 225x265)
>>53099236

>Another 6 months of having to use my GTX 780
>>
>mfw ive been using a gtx670 for 4 years
>still runs pretty much everything with good settings at good framerates

im just waiting for pascal to come out, so the 970/80 fags upgrade and then i snatch a cheap used 970

or i might buy a pascal but not likely, since those will probably be expensive as shit.
>>
>>53099262
it's not that per se, it's that nvidia locks many optimizations to newer cards. They won't magically get worse at games you already enjoy, but a few years down the line, an amd equivalent will likely be faring better with new games.
>>
>>53096685
it's anyone's guess. DX12 changes the game quite significantly.
>>
There are a lot of things Nvidia did with Maxwell that gave them a major advantage on the 28nm planar process that they will not have on FinFET. Nvidia has also pigeon-holed themselves with some bullshit proprietary features that are very undesirable to consumers. Gameworks sabotage, which resulted in bad performance for Nvidia cards as well as AMD cards, along with massive instability in game engines because the devs can't see the fucking code they are using, has caused a huge amount of ill will toward Nvidia from the game-making community. Meanwhile AMD has launched their own open-source version of Gameworks that anyone can use without licensing fees or hidden code.

Same with features. G-sync adds hundreds of dollars to the cost of monitors while AMD's Freesync adds very little cost. And theoretically Freesync is better than G-sync and only requires a monitor manufacturer to implement a high quality version on their panels to beat the performance curve of G-sync. G-sync is basically going to be dead very soon and Nvidia will be forced to make their future cards Freesync compatible.

There is a lot of money-gouging, game-breaking bullshit that Nvidia does that AMD simply does not do, and the vast majority of software and hardware advancement in gaming is the direct result of AMD.

So even if AMD hardware were inferior, which it most certainly is not, you should feel compelled to buy them anyway simply because of their commitment to open source.

There are a lot of hardware features on current-gen AMD cards that are going to be a real eye-opener for consumers when the first VR headsets hit the shelves.

AMD is miles ahead of Nvidia when it comes to VR. Their Crossfire implementation is fundamentally superior to Nvidia's physical-bridge SLI and allows AMD GPUs to scale almost 100% in Crossfire running LiquidVR, while Maxwell cards can only scale about 50% at best. Asynchronous compute also plays a major role in AMD having lower latency than Maxwell in VR.
>>
>>53100340
Is this true? Can anyone confirm or give links? I've been an Nvidia shill for a while but would consider switching if shit like this is all true
>>
Can somebody tell me when the pascal cards will debut?
>>
>>53100474

Most of the reports I've heard/read/seen say Q3 or more generally second half 2016. nVidia reports they're on track to make that deadline, but we'll see. Aiming that late could mean it'll be pushed to 2017 for just about any reason.
>>
>>53100603

Just to append an inb4 this post, I'm aware of the reports claiming Q2 and even as early as April, but I don't subscribe to them simply because I feel that it's too close to the heels of the 900s. I imagine nVidia will soak up as many sales as possible before moving on.
>>
>>53100460

It's very true. Oculus Rift devs complained about the latency problems on Maxwell. Nvidia will almost certainly add asynchronous compute to Pascal and use a new SLI setup, but Maxwell is completely fucked for VR.

AMD has often had better hardware, but it was held back by shitty drivers. This has changed drastically: AMD's driver development has improved over the last 2 years, they are rolling out drivers that are more stable than Nvidia's now, and they are getting those drivers out faster. They still are not as fast on release as Nvidia, but they are catching up quick.

The reason AMD Crossfire scales 100% for VR is because it runs directly through the PCIe bus and allows AMD to simply assign each GPU to one of the two screens in the VR headset. So they take no penalties from alternate frame rendering, which Nvidia has to run across both VR screens because they have old physical-bridge SLI.

We are even starting to see crazy situations for alternate frame rendering on AMD GPUs now. Rise of the Tomb Raider was benchmarked recently; the guys benching ran two 390Xs in Crossfire and found they scaled with 99.7% efficiency, which is fucking insane. They re-ran the test over and over again because they thought it was a fluke or a glitch. This is unheard-of performance from any dual-GPU setup running alternate frame rendering. It stuttered like a motherfucker of course, so it wasn't playable, but nevertheless the efficiency was undeniable, and if they can fix the stutter through driver patches then we will have the first situation where two GPUs scaled 99% for AFR in a game. And this Tomb Raider game is a next-gen graphics game with huge GPU requirements, so getting that scaling on this game is a major deal.

If they can figure out what magic was worked in the Tomb Raider game engine to allow AMD to scale 100%, then that can be combined with LiquidVR to allow four AMD GPUs to scale 400% for a VR headset, which would be absurd.
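
To put a figure like 99.7% in context: scaling efficiency is just the dual-card fps divided by twice the single-card fps. A rough sketch of the arithmetic in Python, with placeholder fps values rather than the actual benchmark numbers:

# Rough sketch of how a multi-GPU "scaling efficiency" figure is derived:
# fps with two cards divided by (2 x fps with one card).
# The fps values below are placeholders, not real benchmark results.

def scaling_efficiency(fps_single_gpu: float, fps_dual_gpu: float) -> float:
    """Return scaling efficiency as a percentage (100% = perfect scaling)."""
    return fps_dual_gpu / (2 * fps_single_gpu) * 100

if __name__ == "__main__":
    single = 41.0   # hypothetical single-390X average fps
    dual = 81.8     # hypothetical CrossFire average fps
    print(f"Scaling efficiency: {scaling_efficiency(single, dual):.1f}%")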
>>
Which one will debut first?
>>
Polaris comes out with 14nm node size
Pascal is coming in 2 flavors, one 28nm (current size) when Polaris releases, and the real 14nm size ones sometime in the Fall.

Buy AMD. At the very least they allow you to do shit like reduce the tessellation for EVERYTHING EVEN ROCKS UNDERWATER from 64x to 16x or 8x which results in a huge boost in performance. NVIDIA won't let you do that.
>>
>>53100816
>Pascal is coming in 2 flavors, one 28nm (current size) when Polaris releases, and the real 14nm size ones sometime in the Fall.

I suspect that at least on the low end Pascal would be Maxwell rebadges. But at the very least I expected Maxwell on 16nm FinFET; if they rebadge Maxwell on 28nm... I have no words... AMD Polaris GPUs would eat them alive.

Polaris is a complete overhaul of the GCN architecture specifically optimized for the new FinFET node. Polaris will range from 2 to 2.5 times the power efficiency of Maxwell and will likely even be noticeably more efficient than Pascal on FinFET.

>Buy AMD. At the very least they allow you to do shit like reduce the tessellation for EVERYTHING EVEN ROCKS UNDERWATER from 64x to 16x or 8x which results in a huge boost in performance. NVIDIA won't let you do that.

A lot of people don't realise this. AMD has been dealing with Gameworks tessellation for a long time, and after Crysis 2 they said fuck this shit and added a limiter in their Catalyst driver software.

You can force 16x tessellation on your AMD GPU and it will run Gameworks games like Witcher 3 better than Nvidia cards, because it won't take the 64x tessellation penalty while looking identical to Nvidia.
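
To see why capping the factor helps so much: the number of triangles generated per patch grows roughly with the square of the tessellation factor, so going from 64x to 16x cuts the geometry load by roughly 16x. A back-of-the-envelope sketch under that simplification (real tessellators have inner/outer factors and partitioning modes, ignored here):

# Back-of-the-envelope: triangle count per patch grows roughly with the
# square of the tessellation factor (a simplification of how the DX11
# tessellator behaves; inner/outer factors and partitioning modes ignored).

def approx_triangles(tess_factor: int) -> int:
    return tess_factor ** 2

for factor in (8, 16, 64):
    print(f"{factor:>2}x tessellation ~ {approx_triangles(factor):>5} triangles per patch")

# Dropping from 64x to 16x cuts generated geometry by roughly 16x, which is
# why the driver tessellation override can recover fps with little visible change.
print("64x vs 16x ratio:", approx_triangles(64) / approx_triangles(16))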
>>
AMD lets you make use of FreeSync monitors if you are interested in the 144Hz meme. Much cheaper than the equivalent Nvidia G-Sync monitors.
>>
>>53100147
>muh dx12


muh vulkan
>>
>>53100996
Freesync enables 75-90 Hz on a 60 Hz panel.
>>
>>53099045
>>53100070
This. Nvidia doesn't make GPUs perform worse with new drivers. They simply don't support GPUs older than their latest generation, GTX 700 series and below in this case. Those are only getting placebo drivers to make autists feel great about their "game ready" drivers.
>>
File: Jesus_fuckin__Christ.jpg (6 KB, 281x200)
>>53101068
>retardation:The Post
>>
>>53101316
http://wccftech.com/amd-freesync-hack-expands-refresh-rate-range/

Jesus, you are behind the times, grandpa
>>
>>53100959
>gameworks games like Witcher 3 better than nvidia cards because it wont take the x64 tesselation penalty while looking identical to nvidia.
i actually thought about this. my tessellation option was "best for amd" and geralt's hair seemed really fine to me. if i turned off tessellation, would i squeeze out a few fps?
>>
>>53096685
I don't know, I bought my wife's bull a GTX 970 for his birthday and he's liking it so far.
>>
>>53096685
AMD's cards are faster in the mid range
>>
Can someone explain this NVLINK shit to me?

Are people going to have to buy proprietary motherboards just to use Pascal cards?
>>
>>53101499
How far above 60 you can go depends on the monitor. Mine can go up to 66 without pooping.
>>
>>53096685
it depends if you believe dx12 will be the next big api. it seems to fix a lot of the problems with amd's hardware so it could lead to a better experience with amd. that said i learned a long time ago through personal experience to never buy amd products. i'd also much rather vulkan become the next big api because i really want to avoid windows 10. amd has always sucked at opengl so it will be interesting to see if they have the same problems with vulkan.
>>
>>53096685
>AMD struggles to keep up with Nvidia in terms of game frame rates.
Hahahaha no. The only difference is a negligible amount of power draw.
>>
File: 1452708645547.gif (2 MB, 350x258)
>>53099032
>top end gpus used to be $200
>>
>tfw your current GPU is borked and you have to buy a new one now instead of waiting for Pascal.
GTX 960, here we go
>>
>>53100340
Posts like these make me tolerate the remaining 99% of jew/trump/shill propaganda on /g/. Thanks.
>>
>>53101899
How far back are we going? My first GPU was a Radeon 9800, and that was definitely more than $200. Also, you expect a lot more from a mid range GPU now than you did back then.

I'm all for nostalgia but let's stay realistic.
>>
>>53100460
It's not true. You're seeing an AMD shill in the wild. Buy what fits your budget and performance needs. Never buy something just because you want to support a giant corporation, that's just silly.
>>
>>53102108
You fucking moron. Anon explained in detail why AMD is an objectively better choice based on the current ethics and practices of both companies.

Basically, buying Nvidia because it is more performant means you are a myopic piece of shit who lacks the IQ to see the long-term repercussions of supporting Nvidia's vulture capitalism in its very essence.

Yes I buy AMD products. Yes I know it isn't always the best performance-wise. But I support a brighter future without having my shit deprecated by the end of the year.
>>
>>53100340
who let the amd PR in
>>
>>53096685

I don't know, but if I had to bet money? Pascal.

Tired of AMD's "it will totally get better next year guyze!" meme overhyped by its fanboys and shills, especially here, and they fail to deliver every single time. Look at every single product they've released since the pre-Bulldozer era. It's always "it will totally kill competition! It will be X times better than Y! It will be a miracle!", then the thing comes out, it's a rebrand or a plain-out turd, and the fanboys here start backpedalling and damage controlling saying "n-nobody expected t-them to b-be any b-better. T-that was u-unreasonable" like I've seen COUNTLESS times before here.

AMD has long lost any kind of benefit of the doubt whatsoever. I'm tired of "wait for the next amd's X thing!!!" and then it turns out to be garbage. Plus I've suffered enough for 3 long years with an fx 8120 in my hands until I made the switch.

Like any sane individual, I'll believe AMD's stuff when I see it, and I don't mean shitty synthetic benchmarks, I mean the actual real-world performing thing.

What I've heard about Polaris is that it is better than Pascal in performance/watt. I don't particularly care about that, because I believe that even if there's a difference, both of them will still consume way less energy than our current GPUs for anyone to care, especially at the enthusiast market. But it's an advantage nonetheless and possibly leads to better OCing, if it doesn't turn out to be yet another "overclocker's dream" flat-out lie. I'll wait and see though; AMD and their fanboys lie way too much for me to care about their promises anymore. I hope they deliver and give us the competition we need, but I'll only believe it, or spend any money whatsoever, when I see it.

>inb4 buttblasted amd fanboys
>>
File: 1393186638514.gif (1020 KB, 300x300)
>>53100070
>>53101094
they actually do. there are jewtube videos and graphs floating around of before/after patch or driver updates for some games, specifically fallout 4 I think but I don't really care enough about this shit.
Novidya has done a lot of shit just as intel has and amd probably would do the same if their positions in the foodchain were switched. fuck those corporations and their agendas. I'll just buy whatever has the best performance, compatibility (e.g. freesync) and price.
>>
>>53100070
>>53101094

Completely false.

What happens is that Gameworks is heavy on the tessellation side, and Kepler has always had very poor tessellation performance, which is why you always see it struggling in benchmarks where all the effects are turned on. If you turn off the Gameworks features, those cards behave normally.

This driver gimping myth has already been busted.
>>
>>53102339
Nice samefag.
If you choose to intentionally cripple your performance to support a multi-million-dollar company, that is your decision, but don't try to convert others to your idiotic ideology.
>>
>>53101588
No, because of the way Witcher was made, the textures won't render correctly if you go below 16x tessellation.

Gotta run at least 16x, but for a lot of games you can run 8x and get very high fps while losing little to no visual quality.
>>
I want Gaymurrtard children to leave.
>>
>>53100340
>And theoretically Freesync is better than G-sync and only requires a monitor manufacturer to implement a high quality version on their panels to beat the performance curve of G-sync

lolno

Freesync is fucking garbage, has a tiny working gap and is full of issues, such as some monitors having green lines when it's on.
>>
>>53102339
>Yes I buy AMD products. Yes I know it isnt always the best performance wise. But I support a brighter future without having my shit deprecated by the end of the year.
>deprecated by the end of the year
Nvidia has better legacy support. Pajeet plz. I can smell your curry shit smut wafting off your shillpost from here.
>>
>>53102339
>AMD
>legacy support
HAHAHAHAHA
>>
>>53102339
>>53100340

The funny thing about these posts is that being this much of a walking ad for AMD is A-OKAY, but the moment someone mentions Intel or Nvidia it's shill this, shill that.

You've done nothing but recite the little fantasy babble you AMD shills regurgitate at one another, the same kind of shit that has no real-world verification whatsoever and relies only on FUD.
>>
>>53101094
The thing is they are worse than placebo drivers. There are benchmarks in which previous gen. cards perform worse with newer drivers.
>>
>>53102941
go back to your mature desktop ricing threads
>>
File: 1447019165341.jpg (60 KB, 604x604)
>>53103087

Except there are not.
>>
>>53103174
how much do they pay you?
>>
>>53103216

You're welcome to prove me wrong, AMD marketer.
>>
>>53103222
Do you even understand how differently nvidia drivers work from amd?
>>
>>53103232

Still waiting on the benchmarks.
>>
>>53102966

You literally just proved his argument correct, anon. The varying quality of FreeSync is entirely due to the wide range of quality control among manufacturers. G-Sync modules are made elsewhere, where QC can be kept up, and then sold to those companies to attach to their panels. Ergo, if monitor manufacturers get their shit together and up the QC of FreeSync, then it gets better; the theoretical refresh rate range of FreeSync is 8 Hz to 240 Hz.

You shouldn't troll anymore, anon, you are bad at it. Also, the new DP connectors allow for 240 Hz panels, which FreeSync will be able to refresh, whereas 240 Hz panels with G-Sync will only do 144-160 Hz.
>>
File: AMD Vs Nvidia.webm (2 MB, 960x540)
>>53097747
>Greenland
>Arctic Islands
>Polaris
it's confusing as fuck anon
>>
>>53103248
>better quality control
>for things that are built-in in fucking laptop screens
>frame rate in charge of dictating quality control when the thing has a tiny working gap and tons of issues whereas g-sync does not

Yeah sure.

Look, I'm not supporting the g-sync proprietary bullshit with their DRM chips. But it's plain out superior. Stop being an amd flying bot.
>>
>>53103288

>But it's plain out superior.

actually you are shilling. g-sync quality is more consistent due to the way they build and ship separate modules, as opposed to manufacturers implementing freesync themselves, but it's not superior by any stretch of the imagination

you are just being a retarded fanboy faggot while pretending not to be one
>>
>>53102033
Please don't buy the 960, it is a pile of shit. If you're going for mid-level graphics cards get AMD. If you are going high end get Nvidia
>>
>>53096685
AMD (Polaris), since the company will supply the important hardware for the consoles. Next gen will be on the 14nm node.
>>
>>53101899
>>53102104
My Imagine 128 II was roughly $600 in today's money.
>>
>>53104866
But it gets pretty much the same results in vidya as the 380X, with a TDP lower by 70W. And it's cheaper too. Why would I get the Radeon in that case?
>>
>>53102033
960 is the shittiest card out there. It has a very low memory bandwidth.
Get a 380/x for that price.
>>
>>53106610
>It has a very low memory bandwidth.
Yeah, and next you're gonna tell me to get an AMD processor because Intel has fewer cores for the same price. The 960's memory bandwidth is low, yeah, but it's still high enough to compete with whatever AMD sells at that price point.

The only reasons that matter when buying a GPU are:
>actual in-game performance
>TDP
>noise levels (though since the same coolers are being put on AMD and Nvidia cards, this is pretty much a moot point)
>price
The 380X loses in two categories and ties in the other two. How come it's supposed to be a better purchase?
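
For what it's worth, that "which categories does it win" comparison boils down to a couple of ratios. A toy sketch with placeholder numbers; plug in whatever benchmark average, street price, and board power you actually trust:

# Toy comparison of two cards on fps-per-dollar and fps-per-watt.
# All numbers below are placeholders, not measured or quoted figures.

cards = {
    "GTX 960": {"avg_fps": 52.0, "price_usd": 200.0, "board_power_w": 120.0},
    "R9 380X": {"avg_fps": 54.0, "price_usd": 230.0, "board_power_w": 190.0},
}

for name, c in cards.items():
    per_dollar = c["avg_fps"] / c["price_usd"]
    per_watt = c["avg_fps"] / c["board_power_w"]
    print(f"{name}: {per_dollar:.3f} fps/$   {per_watt:.3f} fps/W")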
>>
>>53101899
>big mac used to cost $2.70
>>
>>53106751
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-960-vs-AMD-R9-380X/3165vs3532
>>
daily reminder that the 9800 GTX was better than the entire 400 series
>>
>>53101899
That's a bit optimistic, some high-end cards were $350 in the early 00's, that would be $480 in today's money.
There is also the key point that GPUs used to pretty much double in speed every six months back then, whereas now we're lucky to get 30~50% more speed every year. Hardware being more expensive is offset by the fact that you don't need to upgrade all the damn time anymore.
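
Both of those points are just compounding arithmetic. A quick sketch with rounded placeholder figures (not exact CPI data or measured speedups):

# Two pieces of compounding arithmetic, using round placeholder numbers
# rather than exact CPI data or benchmark measurements.

# 1) Inflation adjustment: a $350 card from the early 2000s at an assumed
#    ~2.2%/yr average inflation over ~14 years.
price_2002 = 350.0
years = 14
inflation = 0.022
print(f"~${price_2002 * (1 + inflation) ** years:.0f} in today's money")

# 2) Performance compounding: doubling every 6 months vs ~40% per year.
old_per_year = 2 ** 2        # two doublings a year -> 4x
modern_per_year = 1.4        # assumed ~40% a year
print(f"Old cadence: {old_per_year}x per year; now: {modern_per_year}x per year")
print(f"Over 3 years: {old_per_year ** 3}x vs {modern_per_year ** 3:.1f}x")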
>>
File: graph.png (76 KB, 582x2384)
>>53106891
That's not actual in-game performance. That's a benchmark. In-game the cards have virtually the same performance, depending on the game. For example, the 960 is faster in Crysis 3 and GTA V, while the 380X is faster in Witcher 3. The differences are usually very low though.
>>
>>53102785
>Tired of AMD's "it will totally get better next year guyze!" meme overhyped by its fanboys and shills, specially here, and they fail to deliver every single time.
AMD's CPUs and GPUs have aged quite well; however I've only been following things for two years.
>>
>>53107374
>aged quite well
except they can't tessellate to save their life.
>>
>>53102875
But can you turn down the tessellation on 700 series cards?
>>
>>53102033
>>53106751
>actual in-game performance
>>53107032
Anon, you what?
>>
File: 1453752485646.jpg (102 KB, 650x434)
>>53107488
>280X beats Titan
>>
>>53100340
I smell bullshit.

>Nvidia has also pigeon-holed themselves with some bullshit proprietary features that are very undesirable to consumers.
Are you still butthurt about Hairworks? Most of their proprietary *bullshit* like CUDA/Car works has given them billions in industry contracts. Nvidia doesn't make gaming GPUs any more. They make ML cards, and sell lower-binned chips to us.

>because the game devs can't see the fucking code
They've never been able to. DirectX decompilers are a pain to use. Most game devs stop giving a shit at the engine level. Past that, a.k.a. the real 3D code, nobody really reads it. So the generic engine code is what you're concerned about seeing de-compiled dumps of?

>G-sync adds hundreds of dollars to the cost of monitors while AMD's Freesync adds very little cost.
100% True.

>G-sync is basically going to be dead very soon and Nvidia will be forced to make their future cards Freesync compatible.
100% true, give it 1-2 generations.

>So even if AMD hardware were inferior, which it most certainly is not, you should feel compelled to buy them anyway simply because of their commitment to open source.
The fact that AMD has an open ISA doesn't mean they're more committed to open source. Both companies are OSS-hostile.

>There are a lot of hardware features on current-gen AMD cards that are going to be a real eye-opener for consumers when the first VR headsets hit the shelves.
They stamped VR on the box, so yeah, paste eaters like yourself get a hard-on.

>AMD is miles ahead of Nvidia when it comes to VR. Their Crossfire implementation is fundamentally superior to Nvidia's physical-bridge SLI and allows AMD GPUs to scale almost 100%
I'll need a source on this. Also, the SLI bridge is being deprecated with future iterations of Nvidia cards moving to card-to-card communication across the PCIe interface.
>>
>>53107396
Yeah they can, just not at 64x. ATI invented tessellation
>>
>>53096685
>I hear Nvidia gimps older cards to make newer ones look better, but AMD struggles to keep up with Nvidia in terms of game frame rates.

AMD is easily the better solution for the long term; Nvidia has no issue throwing its old cards under the bus.

"oh look, Maxwell handles tessellation well, let's max this shit out. what, Kepler struggles to be better than AMD? but it IS better by a small amount... fuck it, reason to upgrade"

as for the next gen of GPUs... Nvidia did node-level optimizations to get Maxwell as good as it is, while AMD assumed they would be going to 20nm, so the move to 16/14nm is going to strip most of the Maxwell advantage, whereas games are going to start being made to take advantage of async compute, which AMD has had since the 7000 cards.

but everything is really a wait-and-see kind of deal. that said, my money is on AMD, as Nvidia is losing its node-level optimizations and is going to have a first-gen HBM (HBM2 in this case) implementation.
>>
>>53107488
Finally an argument from the AMD side. Thank you Anon. Finally somebody gave me a sensible reason to think about the 380X.
>>
>>53101866
Vulkan is built on Mantle as well, if I remember correctly.
>>
>>53107488
I'm assuming this must be for the previous beta and not the current open beta. I'm seeing performance that matches what is shown with my 780 Ti but my friend who just got a 960 is getting a solid 60FPS outdoors on ultra when I get low 50's. We both have overclocked 2500k's but mine is at 4.6GHz and his is at 4.2GHz.

Kepler fuckery strikes again. Hopefully a few weeks or a month later Kepler will see a driver release with optimizations like what happened with The Witcher 3.
>>
>>53107396
tessellate to save their lives... ok, what is the point of adding so much geometry to a cube that the wireframe literally becomes a solid color?

Nvidia bitched and moaned to Microsoft about tessellation because they would not have had a working card for DX10 if it was put in, and got it cut as a result, so AMD's money spent on a tessellation engine in DX10 cards got pissed away unused. By DX11 Nvidia got a fairly good engine going, so what do they do? Let's gimp AMD hardware by having everyone tessellate far beyond what is visually noticeable.

It got to the point that AMD put ways to limit tessellation into their drivers, something Nvidia's don't have, so have fun with a Maxwell when the next gen tessellates better and Nvidia pushes it to 128x or 256x.
>>
>>53100070
Nvidia pushed a driver that had the Heaven benchmark running around 10% worse on older cards... took them around a month to fix it.

We noticed the gimp that time.
>>
>>53107651
The 4GB 960 and the 380x are mostly neck and neck in current DX11 games, but the 380x will likely have a significant advantage in DX12 games.
>>
File: nvidia driver optimization.jpg (347 KB, 1936x2733)
>>53099045
>what does the word "gimp" mean here?

It means that Nvidia optimizes stuff backwards, forcing you to buy new cards if you want a good framerate.
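
The only way to settle a claim like this is to benchmark the same card and game across driver versions. A minimal sketch of the number-crunching half, assuming you've already captured frametime logs with one value in milliseconds per line (the file names below are hypothetical):

# Compare average fps between two frametime logs captured on different
# driver versions (same card, game, settings). File names are hypothetical;
# each file is assumed to contain one frametime in milliseconds per line.

def average_fps(path: str) -> float:
    with open(path) as f:
        frametimes_ms = [float(line) for line in f if line.strip()]
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

old = average_fps("frametimes_old_driver.log")
new = average_fps("frametimes_new_driver.log")
print(f"old driver: {old:.1f} fps, new driver: {new:.1f} fps "
      f"({(new - old) / old * 100:+.1f}%)")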
>>
>>53099307
dude, you have a higher-end card, turn a few of the graphics settings down and it's perfectly fine. no need to jump on a 3XX or a 9XX card when in less than 6 months a massive fuck-off die shrink generation is happening, likely bringing $500-700 performance down to the $200-300 range.
>>
>>53107591
CUDA only got them contracts because it was first to the party, because it has better support (both dev support and backwards compatibility support), and because it WORKS.

The ATI/AMD equivalent is a total pain in the ass with random-as-shit bugs, card lockups, shit support, and needing different program kernels per card.

AMD could easily win that battle but their implementation is just awful to the point of being nearly useless.
>>
>>53106567
try overclocking a 960 to 380(x) performance levels and you'll find that it uses almost as much power as hawaii. nvidia really lowballs their TDP whereas intel and AMD are at least honest about the power draw

and the 970 is about as power efficient as the 290x (using upwards of 300w) if you bother to overclock it, which everyone does.

it's really down to personal preference and whichever resolution you want to play, if you want better handling at high resolutions, go AMD, if you want sick framerate, go nvidia
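
The reason overclocking erases most of the efficiency gap is that dynamic power scales roughly with frequency times voltage squared. A rough sketch under that textbook approximation, with illustrative (not measured) baseline numbers:

# Rough dynamic-power scaling under the textbook approximation P ~ f * V^2.
# Baseline numbers are illustrative, not measurements of any specific card.

def scaled_power(base_power_w: float, freq_ratio: float, volt_ratio: float) -> float:
    return base_power_w * freq_ratio * volt_ratio ** 2

base = 120.0   # assumed stock board power in watts
oc = scaled_power(base, freq_ratio=1.25, volt_ratio=1.10)
print(f"stock ~{base:.0f} W, +25% clock / +10% voltage ~ {oc:.0f} W")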
>>
>>53107891
He has a 780, which now performs on par with a 960 in most games due to shit drivers.

Compare that to a 7970 which is almost 2 times as fast today as it was on launch day, and it is still getting major driver improvements due to GCN architecture.
>>
>>53108007
it's still a turn-your-settings-down situation. i have played games on hardware gaps worse than a 780 to a 980 Ti, and if it's just 5-6 months, fucking stick it out.
>>
>>53103384
g-sync is objectively worse than freesync; the problem is that g-sync has a quality requirement to implement, whereas freesync does not, so manufacturers skimp.

there are several freesync monitors that get, i believe, a 20-120/144 range, but there are far more that have a shit range.
>>
File: 1455927433863.jpg (64 KB, 455x628)
>>53108007
It's pretty funny that the HD 7970 and GTX 680 were roughly the same on release and now the 7970 destroys it. AMD seems to be the FAR better investment if you plan on keeping your card for more than a gen.
>>
>>53100657
a fire is under their ass due to AMD gearing up to be first out with the next gen of cards, and as we saw with the 980 Ti, Nvidia hates not being the fastest so much that they will sacrifice their $1000 card in the process.
>>
>>53107991
>using upwards of 300w

lol no, I have a G1 GTX 970 overclocked to 1519 MHz with a 4690K, two 1TB Seagate hard disks, and two fans, and I've never seen my PC go over 330 watts from the wall.

This GTX 970 can pull 75 watts from the motherboard, 105 watts from the 6-pin PCIe cable, and another 150 watts from the 8-pin, with a maximum TDP of 310 watts in the BIOS; it can't pull more than that or it will drop the GPU core clock to keep a stable overclock.

It has never gotten over 70% power; if you enable vsync it stays below 50% most of the time.

Those cards just don't pull that much power unless you are running some retarded application or some really badly written game.
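
For reference, the connector budget described above is just a sum, and a wall-meter reading includes the rest of the system plus PSU losses. A quick sanity check using the figures quoted in this post; the PSU efficiency and non-GPU draw are assumptions for illustration:

# Sanity check on the power figures above. The per-connector limits are the
# ones quoted in the post (the 6-pin figure is above the 75 W ATX spec, so
# treat it as that card's BIOS limit); PSU efficiency and the non-GPU system
# draw are assumptions.

pcie_slot_w = 75
six_pin_w = 105
eight_pin_w = 150
gpu_budget_w = pcie_slot_w + six_pin_w + eight_pin_w
print(f"GPU connector budget: {gpu_budget_w} W")  # 330 W; the quoted 310 W BIOS cap sits just under it

gpu_draw_w = 190           # assumed typical draw for an overclocked 970
rest_of_system_w = 90      # assumed CPU + drives + fans
psu_efficiency = 0.88      # assumed
wall_w = (gpu_draw_w + rest_of_system_w) / psu_efficiency
print(f"Estimated wall draw: {wall_w:.0f} W")  # in the ballpark of the ~330 W wall reading above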
>>
>>53108099
Blame Jewvidia for that one.
>>
>>53100959
Polaris will only have 2 FinFET GPUs: the Fury 2, and the 470 and 470X
>>
Do AMD cards have an equivalent to shadowplay yet?
>>
File: 1448536820456.png (250 KB, 487x600)
>>53108609
http://raptr.com/amd

Same performance penalty, and isn't compressed to shit.
>>
>>53108609
>>53108658
It also works for Nvidia, for the record.
>>
So what's going on with Polaris, only the flagship $550ish card and some low end card will be getting 14nm?
>>
>>53108906
Nah there is clearly at least one midrange card as well; probably two.
>>
>>53108609
Nope.
>>
>>53096911
Ah, so you mean FirePro, since they're going to support CUDA soon. Got it, thanks
>>
Can't decide.

Wait till GP100 or buy a 980 now.

Running 670 SLI.

Want to get ready for VR
>>
>>53109445

Fuuuck

Whichever company offers a mid range card with HBM2 will have my shekels
>>
>>53107680
It is, and like DX12 it should provide an advantage to AMD cards if implemented correctly
>>
File: hehe.png (158 KB, 454x404)
>AMD

https://youtu.be/8lTGk0R12gs
>>
>>53111924
isn't DirectX 12 not so much built on Mantle as using all of its ideas, whereas Vulkan was handed all of the Mantle code base?
>>
>>53102785
>buys an 8120
>bitches
meanwhile my 8350 and 970 can run anything at 60 fps at 1080p
>>
How does the fact that Polaris is built on 14nm and Pascal is built on 16nm affect expectations?
>>
>>53114094
better power consumption, potentially better clocks, better density, possibly cheaper chips.
>>
>>53099262
Is it for sure the drivers? Or just the card worsening over time naturally? Or just accidental gimping as Nvidia deprioritizes optimizing older cards in order to ensure the best optimization of newer cards? I doubt they assign the same amount of manpower to improving drivers for all existing cards equally.
>>
>>53100759
No idea, but I know I won't buy shit until I can compare both. I have been with Nvidia, but unless G-Sync gets cheaper I really hope AMD does well.
>>
>>53100816
Source on this tessellation thing? Basically you can reduce the quality of specific superfluous shit?
>>
>>53114094
It barely matters.
>>
>>53102339
your brighter future is about to go out of business. mission not accomplished.
>>
>>53114003
>2016
>1080p

Get with the times faggot, anything can run 1080p now.
>>
I own an R9 380 SOC, yet not a single benchmark includes it.
I love my video card, not sure if I'll ever upgrade for a while.

It's funny, but the 380X is cheaper in my country than the 380 SOC, even though the 380 SOC is still slower than the 380X lel
>>
>>53114460
Here is basically the reason that option exists in AMD drivers: https://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing
>>
File: sorcery.jpg (179 KB, 1440x900)
>>53107488
>mfw 7870 > 680
>>
>>53101899
This has never been the case.
>>
>>53114541
this, what's with still using 1080p these days
my 960 is good for 1440p at some sort of playable level, and that's a 960!
>>
>>53103262
The only name they ever published officially to the Public was Polaris.

>that pic
This kills the user.
>>
>Nvidia gimps older cards to make newer ones look better

the opposite.

Nvidia cards get slightly better with newer drivers. Not by anything worth making your decision (protip nvidia*) based on but they definitely don't gimp older cards. That wouldn't be a rumour, it'd be an outright scandal. People who care about that shit would notice.

*nvidia cards have better coverage overall for performance. AMD is much better at getting their drivers out with game specific optimisations now, but you can't beat having agents inside studios.
>>
>>53100687
>still no sources

c'mon mate.
>>
>>53101899
I remember paying around $350 for a Ti 4600 14 or so years ago. If you adjust for the inflation caused by greenspan and bernanke that would be a couple million dollars.