AMD under-rated in this wave of GPUs

You are currently reading a thread in /g/ - Technology

Thread replies: 104
Thread images: 13
File: download.jpg (9 KB, 244x206)
Does anyone else feel like the current generation of AMD cards is secretly superior, but everyone's so used to the prior eras of AMD fuck-ups in both the CPU and GPU markets that they're too far down the Nvidia rabbit hole to admit it?
>>
I think ur a gey fegget
>>
>>52485889
>360
Rebranded 260 with higher clock speeds and 2GB of VRAM, priced similarly at release, with street prices closer to what the superior 260X used to cost.

>370
Rebranded 265/7850 with higher clocks - again, similar MSRP, but with the superior 270 drying up and no replacement released, its prices haven't gone down as much as they should have.

>380
285 rebrand at a cheaper MSRP with 4GB available. Good budget card, especially against NV's 960, albeit with a cut-down memory bus compared to the 280 and barely better performance. It also has fairly high power consumption compared to Nvidia's offering.

>390
290 with higher clocks, a lower MSRP and more (actually as advertised :^)) VRAM. Nothing bad here either, but again, power consumption is an issue against the 970, especially when overclocked.

>390X
290X with higher clocks, the same story as the 390. Can match or get close to a 980 at a much lower price point, but in contrast has significantly higher power consumption, which with heavy use not only adds to your power bill but might also require a new PSU purchase, which may void its price advantage in the first place. It also OCs worse on average, even considering Maxwell's shitty scaling with clock speed.

>Fury
Cut-down Fiji; can beat a 980 but is a good leap behind the 980 Ti, whilst the difference in price is rather marginal. Likely outclassed by the 390X and Nano.

>Nano
Good card, especially after the price drop. Fully-fledged Fury X at lower clocks, which unfortunately leaves it hovering around 980 level, barely even beating a 390X, but at least it's pretty efficient - until you overclock it. Good for small form-factor builds, not so good as a budget GPU.

>Fury X
Can get close to or match a 980 Ti at a similar price point, but has less VRAM, uses more power and doesn't overclock as well. Quite a disappointment, at least at its price.

Overall, AMD's offerings are decent, but definitely not outshining Nvidia's. We'll see about Polaris.
>>
>>52486563
you forgot the 380x
>>
>>52486640
Wouldn't have fit into 2000 characters anyway, but here we go :^)

>380X
Tonga in full. Performance lies at Tahiti level, just like the 380 - here against the 280X/7970 - and it's only about 10-20% faster than the 380. Doesn't seem worth the additional $50+ over a 380 right now; you'd be better off dropping another $50 into a 390. But it's relatively new, so things might change (driver optimization, price drops).
>>
AMD cards are competitive, especially in the lower price segment (let's say up to 300 €).

I upgraded to an R9 390 from a HD 7950 and sold my HD 7950 for 110 € on ebay last summer.

Best price/performance at the moment is undoubtedly the Radeon R9 290 with a custom cooler. You can get those for around 200 € if you're patient and wait for a deal.
>>
>>52486563
>>52486750
save for future pasta
>>
>>52486563
>It also has fairly high power consumption compared to Nvidia's offering.
The end. Polaris had better deliver for us low-enders.
>>
>ITT: Caring about pennies worth of extra power draw
>>
File: energy.png (7 KB, 470x275)
>>52487751
As someone who uses AMD and lives alone, it amuses me how much faggots who live with their parents worry about this.
I live completely solo, working a low-tier job (retail butcher), and pay just a smidgen over half of my monthly income on rent alone.
However, I don't give a fuck that the Fury X at stock sucks down slightly more than the 980ti (at stock - though the 980ti EATS power when OC'd).
I DO turn off lights I don't need and stuff like that, and I always make sure to buy energy-efficient bulbs and do things the efficient way (use a kettle instead of the water boiler to do the washing up, for example).

Proof is in the pudding; using an AMD card doesn't mean jack shit to your power bill.
>>
>>52487922
I think it's less about that and more about them being worried about their PSUs, actually.
>>
>>52487957
>Being worried about the PSU
I thought everyone here was smart enough to get an absolute minimum of 600W 80+ Bronze.
The only exception would be SFF ITX builds.
>>
>>52488147
Yeah, but they get spooked by shills on random first-Google-result forums saying you NEED a 750W+ Gold PSU at bare minimum to run a 290, and that you should just go with a 970.
>>
>>52488157
That's true; fuck, I hate morons who do that kind of shit. When I bought my 750W PSU, it was because the prices were fuck-all different and IF I decided to go CrossFire one day, I'd be able to.
This whole LEL NVIDIA SOOOO MUCH MORE EFFICIENT XDDDD meme needs to die. The differences are tiny.
>>
>390 is power hungry shit
>970 is missing 0.5GB of VRAM
We need another player in the market.
>>
>>52487922
>980ti EATS power when OC'd
That's just wrong. Maxwell doesn't require any voltage increase to reach its max OC, so power consumption increases only negligibly.


Also, it is not about the power consumption, it is about the heat output.

Higher heat output increases the room temperature.

Increased room temperature turns on the central AC.

Central AC consumes 3000W and increases ambient noise.

A 50-100W increase will heat up a room over the hours and cause a domino effect.
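For anyone who wants to sanity-check that, the rough conversion looks like this (a minimal sketch; the session length and the simplification that all GPU power ends up as room heat are my assumptions, not the poster's):

```python
# How much heat an extra 50-100 W of GPU draw actually dumps into a room.
# Essentially all of the electrical power a card draws ends up as heat.
extra_watts = 100      # assumed worst-case extra draw vs. the competing card
hours = 4              # assumed length of one gaming session

extra_kwh = extra_watts * hours / 1000    # 0.4 kWh of extra heat per session
extra_btu = extra_kwh * 3412              # ~1365 BTU (1 kWh ~ 3412 BTU)

print(f"~{extra_kwh:.1f} kWh (~{extra_btu:.0f} BTU) of extra heat per session")
# For scale, a small window AC removes roughly 5000 BTU per hour, so whether
# this actually trips a 3000 W central AC depends heavily on the room.
```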
>>
>>52488237
Nice meme m9, but I've heard otherwise from a lot of people who actually own one.
>>
>>52485889
That happens every gen.
>>
>>52488237
Oh, also? >MUH HEAT OUTPUT :'(((((((((((
Jesus christ you're a faggot.
>>
>>52488246
I own one as well. And it's watercooled.

Between stock clocks (1200) and overclocked (1500), the difference is about 30w at the absolute most. It probably averages about 10w more overclocked.

EVGA said the same thing also, I can link the article if you don't believe me.
>>
>>52487922
>>52488237
>>52488259
>my situation
>no, my situation
>wow ur a faget
lel
>>
>>52488277
Don't you know? That's how internet discussions work :^)
>>
>>52488237
Fuck me, you think running an AMD card increases the room temperature that dramatically? Try dual GTX 580s before I get my dual Fury Xs.

I'm going to need some proof on that 30W - got an AX PSU?
>>
>>52488147
Which means you're paying too much compared to an XFX 550W or Seasonic 520W which can run Nvidia just fine.
>>
>>52488299
and an overclocked 5820k/Fury X just fine too?
>>
File: perfwatt_1920_1080.png (35 KB, 500x850)
>>52488277
From this chart alone there's a 26% efficiency difference between a STOCK Fury X and a non-reference OVERCLOCKED (1400) 980 Ti.
>>
>>52488299
520w can run AMD just fine too.
http://www.guru3d.com/articles-pages/msi-radeon-r9-390x-gaming-8g-oc-review,8.html
>>
>>52488290
>>52488309
You're the guy I asked for an OCCT bench a few days ago

http://rbt.asia/g/thread/S52378380#p52381141

Did your Fury blow up yet?
>>
>>52488315
What makes you think I give a shit?
>>
>>52488328
Well, you've participated twice so far.
>>
>>52488337
>Laugh at retarded conversation
>You're participating!
You think shitposting is participating?
>>
File: Untitled.png (1 MB, 2560x1440)
>>52488327
Yeah, but why exactly is it blowing up?
>>
File: perfwatt_3840.gif (41 KB, 400x615)
>>52487922
Also take a look at the pic.

Most cards decrease perf/watt when you OC them.

Maxwell cards are the only ones that increase perf/watt when you OC them.

>>52488367
Did you use the correct settings this time?
>>
>>52488362
>trying to offend people on 4chan

really dude?
>>
File: Untitled.png (382 KB, 2560x1440)
>>52488394
>>
>>52488402
>taking offense this easily

really tumblr?
>>
>>52488403
Can't see shit from that.

- need GPU-Z to confirm clocks/clock history
- screenshot taken while the bench is running, with the OCCT settings shown
- 1080p only
>>
>>52488416
No, I'm asking why you're desperately trying to get a reaction out of someone, on 4chan of all places, where desensitized people congregate.
>>
Maybe if you have a 1000W PSU.
>>
>>52488455
I actually never set out with the intent to offend anyone.

Oh and, welcome to 4chan.
>>
>>52488443
Ugh... forget it. Point is, that's the power an OC'd 5820K and OC'd Fury X use.
>>
File: Screenshot_20160118-051635.png (339 KB, 1440x2560)
>>52488474
lol, why did it end up consuming 800W like I predicted? Or are you scared of blowing up your VRMs? (I would be, tbqh)

The last 290X I tested consumed 600W+, wayyyy more than my 980 Ti overclocked. The card was squealing like a pig.
>>
>>52485889
I bought 2 new R9 290s for 200 bucks each, I'm a happy dingo right now. The Nvidia alternative at the time was a 760.
>>
>>52488157
You can run pretty much any high-end single GPU on an efficient 500W PSU, unless you also have a shitload of fans, HDDs and expansion cards.
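As a rough sanity check on the 500-600W claims in this thread (a sketch only; the component figures below are generic ballpark peak draws I'm assuming, not measurements from anyone here):

```python
# Back-of-the-envelope PSU sizing for a single high-end GPU build.
# All component figures are assumed ballpark peak draws, for illustration only.
components = {
    "CPU (quad-core, stock)": 90,
    "GPU (R9 290 class)":     290,
    "Motherboard + RAM":       40,
    "Drives, fans, misc.":     30,
}

worst_case = sum(components.values())      # ~450 W if everything peaks at once
print(f"Worst-case draw ~{worst_case} W")
# In practice the CPU and GPU rarely peak simultaneously while gaming, so a
# quality 500-600 W unit leaves reasonable headroom; CrossFire/SLI is another story.
```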
>>
>>52488416
>>52488402
>>52488362
>>52488337
>>52488328
>>52488315
>Come back from /gif/, see this
Fucking hell, 10/10.
>>
>>52488503
>290x
>600w+
HOW?
>>
>>52488605
My educated hypothesis is that AMD is shit.
>>
>>52488624
No, you were obviously doing something horrendous to it. NO WAY should ANY single-GPU card be drawing anywhere near that much.
>>
>>52488605
He's probably talking about the entire system, because people are still too retarded to separate the two. A 290X, without a heavy OC, will draw roughly 300W from the wall.
>>52488624
You're retarded, see: >>52488629
>>
>>52486563
>>52486750

When will people learn the difference between a rebrand and a refresh?

Yes, those cards are tired as fuck already, and we should have gotten Polaris instead of the 300 series, but those are not rebrands, they're refreshes.

The fact is that the only thing they really improved with the 300 series is power consumption, though it is still higher than Nvidia's.

The 200 series are still heat-emitting, power-eating monsters, but they perform the same fucking way, and are currently the better buy because most people have a 750W PSU anyway and can OC that 290 to eternity with it, reaching the level of a $320 970 while being priced at like $200.

Also, AMD should fire their whole PR team, who outright lied about Fiji's performance.
>>
I bought a 390 for Christmas. Very happy with it, honestly, especially when I see that I'm exceeding 3.5GB of VRAM in a game.
>>
>>52488642
That's a good point, something I hadn't expected. But still, even if we assume he was using a basic Intel CPU... that's more power than should've been going into it unless you're literally using a custom VRM board.

An i7 4790K at stock is 88W; even if it was an AMD CPU, stock is something around 120W. That would still mean that, if the CPU wasn't overclocked, the GPU was drawing around 200W more than it should. Again, no way should people be pushing THAT much power unless they're going for world records.
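Spelling out that arithmetic (a sketch; the 290X board-power figure is my assumption, the other numbers come from the posts above):

```python
# Rough breakdown of the disputed 600 W reading.
total_watts = 600         # the figure being argued about
cpu_stock_watts = 120     # generous stock-CPU figure from the post
gpu_spec_watts = 290      # assumed typical board power of a stock 290X

implied_gpu_watts = total_watts - cpu_stock_watts        # ~480 W left for the GPU
excess_over_spec = implied_gpu_watts - gpu_spec_watts    # ~190 W above spec
print(f"Implied GPU draw ~{implied_gpu_watts} W, ~{excess_over_spec} W over spec")
```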
>>
>>52488629
The numbers stared me right in the face; I have no reason to fabricate lies.

Maybe the archive stores webms; try visiting the rbt.asia link above.

>>52488642
Must be nice to be ignorant
>>
What are the chances that Pascal will be another GeForce FX / Fermi?
>>
>>52488675
Bring the proof here, otherwise I'm calling your claims complete bullshit lies and shillery.
>>
File: untitled-1.png (83 KB, 636x1041)
>>52488669
If you heavily OC the CPU and GPU then it's not totally unreasonable to see 600W from a system like that, especially if you measure it from the wall. At 90% efficiency that's just 540W for the actual system, which sounds reasonable.
>>52488675
Pic related, that's for the GPU alone. How exactly did you measure the power consumption?
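For what it's worth, the 540 W figure above is just the wall reading scaled by the assumed PSU efficiency, roughly (numbers taken from the posts, efficiency is an assumption):

```python
# Wall ("at the plug") reading vs. DC power actually delivered to the parts.
# The difference is lost as heat inside the PSU.
wall_watts = 600         # plug-in meter reading quoted above
psu_efficiency = 0.90    # assumed efficiency at this load (80+ Gold territory)

dc_watts = wall_watts * psu_efficiency
print(f"~{dc_watts:.0f} W delivered to the system for {wall_watts} W at the wall")
```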
>>
>>52488686

We don't know anything about Pascal atm, but we could sure use a bit of fluctuation in the market. Nvidia is shitting up the industry.
>>
>>52488694
From the wall is another matter, but in terms of what the system itself is using? A reliable PSU will be able to supply at least what is stated on the label. Sometimes you're lucky and it can exceed that slightly, but it's never encouraged to try.
What the PSU takes from the wall is different altogether, and THIS is the bullshittery, because such arguments try to obscure the truth behind hidden context.
>>
>>52488687
You saw my return receipt and you know I don't have the card on hand to retest it.

It's up to you whether you believe me or not. Either way, I don't really care, because I don't have control over any of this.
>>
>>52488721
It's how most people measure it, and honestly it's not a terrible measurement, since most PSUs are pretty damn efficient nowadays. So as long as you're not using a really shitty PSU on purpose, it's fine for getting a general ballpark of what the entire system is using, and then you can measure it at different states to work out the numbers for the GPU itself. Like the pic I posted - it's done like that, so at the very worst it's 290W for a stock 290X. That's a number you can work with as long as you know what it stands for.
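The "different states" subtraction presumably works something like this (a sketch only; the idle and load readings below are made-up illustrative numbers, not guru3d's actual data; the ~10W GPU idle figure is the one mentioned elsewhere in the thread):

```python
# Estimate GPU-only power from whole-system wall measurements:
# measure the system at idle, then under a GPU-only load, and take the difference.
idle_wall_watts = 90          # assumed: system idling at the desktop
gpu_load_wall_watts = 380     # assumed: same system running a GPU-only stress test
gpu_idle_watts = 10           # assumed GPU idle draw, added back in

gpu_load_estimate = (gpu_load_wall_watts - idle_wall_watts) + gpu_idle_watts
print(f"Estimated GPU load power: ~{gpu_load_estimate} W")
# Caveats: wall readings include PSU losses, and the CPU works a bit harder
# feeding the GPU under load, so this only gives a ballpark figure.
```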
>>
>>52488745
Or maybe, just maybe, you're a fucking moron who tried overclocking the card at 2x the power limit and are quoting whole-system wall draw with an OC'd CPU as well, to obscure how retarded you are.

Either that, or you got a one-in-a-million card that was built so shittily, and slipped past QA, that it couldn't regulate its power consumption at all.
>>
>>52488749
It's not so bad when a tech firm does it, AS LONG AS no variables change other than the GPU.
Literally the way to do it would be:
>Build rig, no CPU. Power it on, test power usage
>Change nothing about the rig, implant the GPU, test power usage
>A - B = C
However, I highly doubt anon does this.
>>
>>52488754
If you understood the nature of OCCT you wouldn't be saying such things.

Either way, the power was calculated from the total amperage through the PCIe connectors. Take a look at >>52488367 for example.
>>
>>52488749
See: >>52488694
That's the general figure you'll find everywhere; 600W is more than double that. You're either bullshitting or your methodology was stupid.
>>52488754
A 290X consuming 600W would blow the power delivery on the card and overheat very fast; it would be very clear that the card was faulty. He has to be bullshitting, or he has no clue how to measure power consumption.
>>
>>52488749
>>52488721
The most I've drawn from the wall was 800W, and that was with a fully loaded 3.2GHz E6420 and HD 4870.

OCZ, not even once.
>>
>>52485889
>Cut-down Fiji; can beat a 980 but is a good leap behind the 980 Ti, whilst the difference in price is rather marginal. Likely outclassed by the 390X and Nano.

>whilst the difference in price is rather marginal

Strix Fury is £430
Strix 980ti is £600
>>
>>52488770
You can't put any kind of real load on the GPU without a CPU, so that wouldn't work. The testing methodology that guru3d uses is mostly fine; it's not perfect by any means, but it works and is fairly consistent. With proper lab equipment you can obviously do a better job, but barely anyone does that.
>>
>>52488503
>>52488624
If you try to kill any chip, it'll consume massive amounts of power. Using the cards for what they're intended for will net you less than 500W. If you want to make the card do whatever it takes to chew as much wattage as possible, then go for it. Just don't give that shit to normal people who use GPUs normally.

>>52488629
No doubt.
>>
File: crying-kid.jpg (133 KB, 1600x1200)
>>52488704
>Nvidia is shitting up the industry.
>>
>>52488783
I hit 750W spikes with 2 R9 290s and an i5 at 4.4GHz, but they're incredibly short, so my 850W PSU is comfy as fuck. Thanks EVGA!
>>
>>52488846
When I was testing my current 980 system it was drawing like 350W from the wall tops, which is about what I'd expect.
>>
>>52488886
Makes sense; at stock the 980 should consume roughly 200W, and the R9 290 roughly 250W.
>>
File: Nvidia Support.jpg (46 KB, 741x404)
I honestly don't know if I will buy another AMD card after they decided to stop supporting my CrossFire 6950s, when one of them still performs as well as their current mid/low-end offerings.

I understand not supporting older hardware, but I wouldn't be as mad about it if the "stable" driver they have for Windows 7 hadn't been released in a broken state (HDMI audio doesn't work).
I've tried the beta Crimson driver and it fixes the audio issue but breaks CrossFire. It might have been updated since I tried it, so I might try it again soon.

For a little comparison, look at when the last driver update was released for some Nvidia cards that came out in 2004.
>>
>>52488920
Fuck off Nvidiot shill
>>
>>52488798
That's why I said you test the CPU first...
>>
>>52488920
You can literally buy a 7850 for like $80 that is loads better than your 6950, or shit, you can even get a 7950 for fucking ~$100, even some of the better models. You can sell your 6950 and pretty much just make a straight swap.
>>
>>52485889
That's because they are superior. The only reason Nvidia performs better in gaymes is because their proprietary cancer that is GameWorks is already injected everywhere. The Fury X, for example, has so much throughput it should be able to crush any Nvidia card, but nooooooo.

Anyway, go back to /v/, gaymen.
>>
>>52488941
That's literally what guru3d is doing, except they take a general GPU idle power consumption into consideration, something around 10W, which is a good number for 90+% of GPUs out there.
>>
>>52488945
>7850
>loads better

How much longer is the 7950 going to be supported? Going by what's already happened to me, probably not much longer.
>>
>>52488937
Keep telling yourself that
>>
>>52488991
>How much longer is the 7950 going to be supported?

It's GCN so at least for another couple of years.
>>
>>52488991
The move to 14nm will halt support for all current cards except for Fiji and Tonga.
>>
Is getting a 290 instead of a 380/960 a good idea when they're the same price?
>>
>>52489293
Get the 290; best performance of the options listed.
The 380 isn't as good as the 290, and the 960 is only about on par with the 380.
>>
>>52485889
that's what /g/ says every generation.
>>
File: 1449835934524.jpg (14 KB, 220x220)
The R7 370 is slower than the 270
The 380 is slower than the 280
The 380X is slower than the 280X, but muh
> +1GB VRAM
enjoy your upgrade
>>
>>52489312
It's only been this way for two Nvidia generations, the 700s and 900s.
The 400 and 500 series weren't that great; it started when AMD had some restructuring to do.
I want a 60-40 market so much, ideally 50-50.
>>
It seems to me that it's mostly in online marketing that AMD is underrated. Personally, I was a little let down by the recent GPUs AMD released, but it's still undeniable that a lot of what AMD offered before the 300 cards was outstanding (i.e. the HD 7970 and R9 290X). I'm not saying that the 300 cards are bad; actually, they are pretty kick-ass for non-gaming applications, like OpenCL acceleration in Adobe programs. I feel that AMD gets an underrated reputation partially from the YouTube community (i.e. Linus Tech Tips), but that could be attributed to Nvidia's superior marketing skills, however damaging they may be to the overall market. With all this said, come at me, Nvidia fanboys, I can't wait to see you all make fools of yourselves.
>>
Everyone says "power consumption" and "heat" but those are retarded.

Extra 50-70w isn't going to change anything in your billing. "heat" can either be referred to as heat generated by the card's power or the temperature of the card itself. The later is addressed by aftermarket coolers. It was only an issue for the reference 290/x cards with blower design. The initial doesn't even affect you in the slightest. The amount of heat it generates vs a similar card is so microscopic, its just hilarious.

Others I've heard is drivers issue. Couple of years ago, that might have been the case, I very much doubt this is the case. Last couple of years, AMD's drivers have been pretty stellar. I've had my share of both nVidia cards and AMD cards (AMD cards being my current setup). I've had some minor issues with both companies, however overall, I'd say they were pretty good. I've heard the current driver issues lies with the nvidia cards and windows10. Scanning over the reddit nvidia official page, that seems to be the case.

Current gen nVidia I think are overvalued and AMD cards undervalued. If you value your money, you'd choose wisely. Heresy and opinions will have to be discarded if you want whats best for yourself both for current games and for future games.
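For reference, the actual arithmetic behind the billing claim (a sketch; the daily hours and electricity price are assumptions of mine, not figures from the thread):

```python
# Yearly electricity cost of an extra 50-70 W of GPU power draw.
extra_watts = 70               # upper end of the claimed difference
hours_per_day = 4              # assumed daily gaming time
price_per_kwh = 0.13           # assumed electricity price in USD per kWh

extra_kwh_per_year = extra_watts * hours_per_day * 365 / 1000
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year extra, roughly ${extra_cost:.2f}/year")
```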
>>
No, the 300 series really was disappointing.
>>
>>52490372
Nvidia equivalents cost less.

To prove it, name an AMD card and I'll find an Nvidia alternative that costs less.
>>
>>52485889
Radeons are the better-designed cards in an ideal world where DX11 driver efficiency was never an issue, GameWorks never existed, and games were limited only by available shader power.

In reality, however, you have one vendor with slightly beefier tessellation/ROP throughput paying developers to flood scenes with tons of 1-3 pixel triangles that don't improve visuals but choke the other vendor's GPUs.

If Polaris doesn't reverse this trend, AMD is probably finished, no matter how much they might consider themselves to be taking the high road or whatever.
>>
>>52490409
Radeon HD 5450 ($28 new)
>>
>>52486563
I was looking at the 970 and a 390 for folding; both are about the same for gaming and folding speed, but the 390 takes double the wattage of the 970.
>>
>>52490409
r9 380 2gb @ 1080p

good luck
>>
>>52490409
The R9 390 is a cheap Nvidia equivalent to the GTX 970. :)
>>
>>52489457
To be fair they stepped each down in the product line. A 380 is the same price tier as a 270 was last generation.
>>
>>52489457
Nice proof you got there
>>
You're actually going to see a bit of a reversal in power consumption/heat and performance with Pascal and Polaris.

AMD is building brand-new core designs radically different from current GCN, and they are putting them on a superior process to the 16nm node Pascal will be on.

The speculation is that Nvidia is refreshing Maxwell for the smaller node, which means they will likely say fuck it to power efficiency and heat and scale up Maxwell CUDA cores in order to drive core clock speeds even higher than they currently are, to beat Polaris in synthetic benchmarks.

So Pascal will be very inefficient, possibly even a housefire meme waiting to happen, in exchange for possibly being the first 2GHz GPUs in the world,

while Polaris will be further behind on clock speed but with an order of magnitude more efficient shaders, and more of those shaders packed into the die.

If my assessment is correct, Pascal will cost an arm and a leg but be technically faster on desktops simply by pushing insanely dangerous clock speeds to make up for the architecture's other shortcomings, and AMD's Polaris will totally outclass it in the mobile variants that go into gaming laptops.
>>
>>52491559
To be fair, AMD compares the 380X with the GTX 960 in their promo material,
and with a non-OC GTX 960 at that.
>>
>>52491770
>radically different from current GCN
no
>Polaris will totally outclass it in the mobile variants that go into gaming laptops
Pascal will be everywhere, but it's designed for FP64 computing, which will nevertheless be blocked on every non-professional card. SURPRISE!
>>
>>52488237
This post added $0.021 to your Nvidia savings account (tm).

I wonder why no Nvidia shill cared about power or heat when the GTX 480 was released :^)

It is literally 30-50W, or around 10-15% more power usage for the whole PC, 970 vs 390.

The same applies to Fury vs 980 Ti.

It would cost maybe $10 more a year in electricity, and considering that for half the year the extra heating is actually a positive thing anyway...

So nice try, Rajeed. Go POO in LOO
>>
>>52488991
The 69xx series can't really be overclocked more than 10%.

The 78xx goes to 1.2GHz core and 1.4GHz memory and still uses less power.

A 7850 can get a nice 35-40% bump in real performance relatively easily, and is then on par with or even better than a stock 7950.

t. some guy who ran 1260MHz core and 1450MHz memory on an Asus DC2 HD 7850
>>
>>52490409
A 280x. Same generation.