>hurr durr 480 is as good as an 1080 for only 500$ nVidia

You are currently reading a thread in /g/ - Technology

Thread replies: 122
Thread images: 13
File: mykRASw.jpg (131 KB, 960x720)
>hurr durr 480 is as good as an 1080 for only 500$ nVidia is over
Is this really how retarded you have to be to be an AMDrone?
It's an undeniable fact that it will draw almost twice the power of an 1080. This means in only 2 years, the 1080 will be cheaper at average electricity prices.
Secondly, CF/SLI sucks compared to single GPU. It just does. Unsupported games, bad frametimes, bad airflow, more noise, bad G/freesync... If CF/SLI was a viable replacement, nobody would even buy high end cards to begin with, whether they're from AMD or nVidia
Thirdly, this is even what AMD are saying themselves, you retards! They've officially stated that polaris is going to compete in low-mid end. We can all be glad that AMD has released a competitive card for the mid-end market. This is what most people buy anyways, and it will put pressure on nVidia to lower their high-end prices. But that's about it.
>>
>>54854790
the way it's meant to be shilled™
>>
>>54854803
Point out anything that's not fact
>>
There is no pressure, GP106 will be $199 and AMD simply can't compete with Nvidia in performance or power efficiency when they chose a bad Samshit 14nm LPP process
>>
File: 1460229130135.png (89 KB, 1272x1152)
>>54854790
K E K
E
K
>>
>>54854790
First of all 2 rx480 cost only 400 dollars and consume the same amount of power. Yes CF sucks but it's also 400 dollars cheaper than a gtx 1080 and you get the same performance
>>
>>54854858
Meant to say 300 cheaper
>>
>>54854858
>consume the same amount of power
>400 dollars cheaper than a gtx 1080
desperately lying on an anonymous anime forum because your favourite video game brand is losing
>>
>>54854882
still desperately lying (or just completely ignorant)
>>
>>54854790
CF/SLI might suck, but that's why the 480 is probably going to be properly configured for DX12 multi-gpu support instead.
>>
>>54854887
>>54854899
1 gtx 1080 costs 699 and consumes 240 watts
2 rx 480 cost 399 and consume 250 watts
CF does suck and those points are valid, but the performance is almost up there with a 1080
>>
>>54854790
OP is in so much pain!!!!!
>>
>>54854900
That doesn't solve the majority of problems. Also, even IF you could just "hurr just DX12 it lmao" then nVidia would have done it too. DX12 isn't magic that's going to give you compatibility and 99% efficiency
>>54854935
>gtx 1080 costs 699 and consumes 240 watts
wrong
>rx 480 cost 399 and consume 250 watts
also wrong, you don't even know what you're shilling for, drone. That would mean it consumes 500 watts in CF.
>the performance is almost up there
no, the frame rate is.
>>
>>54854790

>This means in only 2 years, the 1080 will be cheaper at average, electricity prices.

Just how fucking expensive is electricity where you live? Assuming 150w per card, for 2 years I'd need to have them work at full blast 24/7, all the time at the highest demand pricing. And electricity in southern Europe is already expensive as fuck.
>>
>>54855038
I used the american average of 12c per kWh. I was also more generous towards AMD than you. With your "full blast 24/7" @ 10c/kWh it's 150$/year more expensive to own an AMD
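For anyone who wants to check that math, here's a minimal sketch (Python). The ~150 W draw difference and the 10-12 c/kWh rates are the thread's assumed figures, not measurements:

```python
def extra_cost_per_year(extra_watts, hours_per_day, price_per_kwh):
    """Yearly cost of an extra power draw at a flat electricity rate."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# "Full blast 24/7" worst case at an assumed 150 W difference:
print(round(extra_cost_per_year(150, 24, 0.12), 2))  # 157.68 at 12c/kWh
print(round(extra_cost_per_year(150, 24, 0.10), 2))  # 131.4 at 10c/kWh
```

At normal gaming hours instead of 24/7 the gap shrinks to single-digit dollars per year, which is the counterargument made further down the thread.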
>>
>>54854991
Could you do me a favor and re-read my post before commenting
>>
>>54854858
>2 rx480 cost only 400
>when nvidia announce MSRPs they're bullshitting but when AMD does it's fact
lol drone
>>
>>54855081
Ok, I thought you were listing things, since you usually don't write small numbers with a digit like a preschooler.
Anyway, your watts are still wrong for both cards. You're desperately lying. Third party 1080s are also less than 699, and we don't even know what the actual prices for the 480 will be.
>>
Sigh... I can't believe I thought I was having a conversation with a normal person, but alas, he was a shitposter.
I guess sage and goodbye
>>
>>54855134
>facts are shitposting
>>
>>54854790
CRAPSMANSHIP
>>
>>54855399
>>54855134
>>54854954
>>54854803
This is literally all AMD cucks have to offer
>>
File: 1449176214601.gif (89 KB, 256x256)
>all these butthurt nvidia faggots
1
9
9
>>
>>54854790
Who the fuck said it's as good as the 1080?
Who the fuck said it's 500 dollars?
It's 200 dollars you dumbshit and it's meant to be a good price/performance mid end card.
Why are you so retarded OP?
>>
File: memes.jpg (160 KB, 895x790)
>>
>>54855528
>beats a 980 at half the price

B A S E D
>>
>>54855079

Ok, redid the math with a 50/50 split of our 0.19 and 0.10€ per kWh for high and low demand hours; turned out I botched the math and got roughly 175€/year. I stand corrected.
Still, it's a worst-case scenario. Hell, I have my pc on 6 hours a day, and that's being generous. Assuming I turn it on every day, I'd need 5 years at least.
>>
>>54855515
/g/ said both of those. It's for CF
read again.
>>
>>54855130
200$. Are you too retarded to even listen when people tell you the price? And the 1080 can only be had for 700$ now, not some future number that may or may not happen; we will see if retailers jack up the price.
>>
>>54855087
they weren't lying about the 700$, in fact we have people posting receipts of 750$ here.
>>
>>54856332
>>54856341
>200$
>not some future number that may or may not happen
TOP KEK defusing your own argument there
>1080 can only be had for 700$ now
I've seen 630$ EVGA ACX on amazon
>>
>>54854803
>it will draw almost twice the power of an 1080. This means in only 2 years, the 1080 will be cheaper at average, electricity prices.

>480 has same power draw as 1080
>thinking an electricity bill will outpace a high end card's depreciation
>ever

NVIDIOTS, everyone. Also, nice strawman greentexting in the OP faggot
>>
>>54857433
see >>54855651
>>
>>54854790
if it only takes 2 mid range Polaris GPUs to beat a 1080 then nvidia is in trouble once vega rolls around
>>
>>54857530
>>hurr durr 480 is as good as an 1080 for only 500$ nVidia is over
>Is this really how retarded you have to be to be an AMDrone?
>It's an undeniable fact that it will draw almost twice the power of an 1080.

Never says anything about CF until later in the post, dumbass, and even then it's just general commentary about CF/SLI. Shut the fuck up if you can't even explain yourself clearly
>>
>>54854790
Pretty much.
They're just retarded slavs and nigs happy a card they can buy with the money they've been slaving for years can match a GeForce in a synthetic bench when CFed.
>>
>>54857598
>Never says anything about CF until later in the post
So you're one of those guys who finds it hard to read more than 3 sentences?
>>
i don't get the appeal of the GTX 1070 and GTX 1080. 95+% of gamers aim for 1080p and 60fps. the 1070 and 1080 are indefensibly overkill for that.
>>
>>54857582
2 mid range nvidia cards also get higher framerates than their top models. So you're not really making a point.
Also how do you go from "roughly equal framerate but with all the drawbacks of a dual gpu" to "beat a 1080"?
>>
>>54857697
95% of gamers won't buy them either, retard. Or buy one and keep it for 5 years
>>
>>54857697
And what about the appeal of a card that is an equivalent of a mid range card from 2 years ago?

>it's as good as a fury for 200$
Yeah right. AMD will gimp their 500$ cards because of you niggers. Stop being borderline retarded.
>>
>>54857697
>overkill

at least for now
It's worth it if my 1070 lives for 4 years and lets me play games on high or max settings most of the time.

My last expensive card was a 9800GX2 at the same price level anyway; it lasted me 5 years before giving out.
>>
>>54855565
>Overclock that shit and you'll have Fury X/980 ti beating performance for $200

Based AMD, they could have followed nvidia and put this nearer to $300
>>
>>54857754
>And what about the appeal of a card that is an equivalent of a mid range card from 2 years ago?
because it can play every game at 1080p 60fps? for $200? a $200 card that can play every game at max at the resolution and framerate that quite literally every game on PC aims for?

every game released is built on the ps4's and xbox's weak 6 year old gpu. blowing $600 on a gpu is just dumb, you're wasting money on wasted resources.
>>
>>54857697
>1070 and 1080 are indefensibly overkill for that.
For now.
>>
>>54857777
If you're buying a videocard for videogames in 2016 you're retarded anyway.
Videogaming is fucking dead. Wait for 2-3 years for VR to pick up.
>>
Holy fucking shit mostly everyone in this thread should kill themselves.

Also, where do half of you faggots reside? This thread started too early to have burgers in it... Oh wait, it is summer after all.

Fuck off to /v/ with this gay shit. Seriously, read half the posts in here and tell me this doesn't feel like a school playground with a bunch of faggot fucking kids arguing about useless stuff and trying to sound the least bit intelligent.

This is all coming from a tech illiterate NEET gaymur who at least respects the integrity of this board enough to just lurk.
>>
>>54857796
>Videogaming is fucking dead
You've spent too much time on /g/
>>
Are hbm2 cards due next year or in 2018?

I might purchase an rx480 as a cheap fix for my shitty gt630 and wait for the release of hbm2 cards.
>>
File: 1419789079656.gif (844 KB, 200x150)
>>54857796
>VR

nice joke, it's never gonna take off
>>
>>54857796
>VR

But VR is just the Wiimote/Kinect/PSMove for PC. It's little tech demos and minigames and you can't make elaborate games for it.
>>
>>54857822
Then even less of a reason to buy a video card ain't it?
>>
>>54857809
Most likely next year unless they get delayed again

480 looks like the perfect card if you're still on 1080p. I'm picking one up for my current i5 2500k/7970/1080p 120hz build.

In 2017/2018 I'll probably make a new build with a zen/hbm2 amd card and a 1440p freesync 144hz ips monitor
>>
Will the 480 or 1060 be better for 1080p csgo?

I have a 144hz monitor, and right now I get 250~fps with a 7870 and 6300. I would like to get 325+fps.

I'll be getting an i5.
>>
>>54857928
You obviously just wait for the benchmarks and see for yourself you fucking retard
>>
>>54857928
This is pretty much the "logic" of buying a 1080 card in 2016.
Considering most game engines are optimized for 1080 and last gen of video cards it's fucking ridiculous to even imply it makes sense to buy a 1080 card now.
>>
>>54857964
>game engines optimized for 1080
>droolinghomer.tiff
>>
>>54857928
It's gonna be damn near impossible for Nvidia to beat the 480 at this level of performance/price. Then again, the 960 wasn't the best in terms of price/performance in its segment either.

The 60 variant has not exactly been a strong point for Nvidia since the 460/560.
>>
>>54857964
Except for, y'know, future games instead of 4 year old games
>>
>>54854790
>being this retarded
>amd and nvidia both draw more power than rated

>amd is maybe 10 percent more than nvidia over draw

>i do not research shit and draw conclusions based on that one website i like that proves my point versus those that go against me. I am kinda like a SJW
>>
>>54857951
>>54857964
>>54858001

Thanks


Hopefully I can squeeze it all into a pink sg13 :^)
Speaking of which, is there any reason to buy expensive noctua fans nowadays? It seems that cheap fans are just as good as expensive fans
>>
>>54857994
But they are retard.
Most of the technology they use for LODs, tessellation, texture filtering, pretty much everything they've been doing so far to mooch a bit of fps from has been aimed at 1080 resolutions and bellow and for the old generation of cards and specifically with the limitations of DX11 in mind.

Buying a 480 for 1080p gaming thinking you're future proofing yourself or for the added 20 fps in a current title you'll get is fucking retarded thing to do.

If you have a card that already does 1080p around the 60 fps mark keep it and don't be a mouth breather.
>>
>>54858010
By future games you mean console ports that are designed to run on 5 year old hardware
>>
>>54858107
So what would you recommend to a 144hz gamer with a 7870 then?

The 1080?
>>
>>54858107
You can't even spell or form a coherent sentence.
And none of the techniques you describe are resolution dependent
>>
>>54858140
You left out the "terribly unoptimized" part of console ports
>>
>>54858401
A better card won't help you much with that.
I've seen PS3 ports lagging on a Titan.
>>
>>54857697
>60fps

lol it isn't 2010
>>
>>54854991
>That doesn't solve the majority of problems. Also, even IF you could just "hurr just DX12 it lmao" then nVidia would have done it too. DX12 isn't magic that's going to give you compability and 99% efficiency

Can you explain to me what the "majority of problems" are? The multi-GPU support that DX12 has requires no agency from nvidia/AMD for CF and sli, you can run a lesser card paired up with a high-end one now without the high-end one being bogged down by the lesser card, for example. 1080 will still probably hold a lead in DX11, but DX12 gives AMD the edge. We've seen the effect it has already in AotS.

Nvidia hasn't made good design choices as of late because they're out of touch. Like, why do they keep shipping GPUs with tiny buses?
>>
>>54854935

2 480s would consume ~300W afaik
>>
So, if I want to end up running a 144hz 1440p monitor, I should probably go with a 1080 right?
>>
>>54858558
>Can you explain to me what the "majority of problems" are?
literally every drawback mentioned in OP's post isn't solved by DX12
>The multi-GPU support that DX12 has requires no agency from nvidia/AMD
MultiGPU still requires per-game implementation. Which is something nvidia/AMD usually help devs with, so yes, it does involve them. Also it doesn't make a single GPU engine magically able to run several, you retard; only maybe 30% of a game runs through DX

>why do they keep shipping GPUs with tiny buses?
Why are you a specfaggot? The important part is how it performs in video games and how much it costs.
>>
>>54854790
Poojeet, pls.
>>
>>54857801
>Fuck off to /v/ with this gay shit
>gay shit
Are you literally 12?
>>
>>54858798
>literally every drawback mentioned in OP's post isn't solved by DX12
The only one applicable is the energy costs.
>MultiGPU still requires per-game implementation. Which is something nvidia/AMD usually help devs with, so yes, it does involve them. Also it doesn't make a single GPU engine magically be able to run several you retard, only maybe 30% of a game runs through DX
Everything is on the dev's implementation. Why do you think Crossfire/sli isn't achievable on DX12?
>only maybe 30% of a game runs through DX
Did you really just say games make 30% use out of an API?
>Why are you a specfaggot? The important part is how it performs in video games and how much it costs.
There's a reason why AMD has outperformed Nvidia at high resolutions for generations.
>>
>>54858954
>The only one applicable is the energy costs.
Read again retard. Literally every one. Or do you think DX12 will make the physical airflow in your computer better, are you actually THAT retarded?
>Everything is on the devs implementation.
Like it is now
>Did you really just say...
no and I don't even know how you fucked up reading a simple sentence so bad
>There's a reason...
Like 5% run at those resolutions. And they've been pretty much neck and neck anyways.
>>
>>54858798
big engines like UE4, Unity, Source, CryEngine, I bet, will all get multi GPU support, even if just the basics.

As far as I'm aware multi GPU support should be fairly trivial for most devs since it's built into the API rather than an extension, which was the case with SLI and Crossfire and why most of the time it was never implemented.

Most engines are also multi threaded in this day and age, so it's just a matter of working out which render threads can be offloaded onto another GPU while the other is running its task. Hell, you could also ignore rendering and just use the compute shaders to compute physics and particle effects.

The only difficult bit will be ensuring both cards are correctly sync'd, as I expect putting in a super low spec GPU could cause slowdown if a single task was just too much for it in the first place. In DX12 I believe you can request feature support from the GPU, so you could work out what it might be best used for, or whether to ignore it altogether.

In my mind not using multi GPU would be retarded for any dev and would be the equivalent of not running multiple threads on a multi-threaded CPU. Since the support is built into the API now, there is no excuse not to use it if you know how to.
>>
>>54859105
1. All of these are done to varying degrees and take time and resources to do well. There's no magic DX12 button that makes everything work perfectly.
2. You're still focused solely on DX retard.
3. Even if UE4 ships with multi GPU support, many devs will implement their own systems that can only run on single on top of that
>>
>>54857694
>even then it's just general commentary about CF/SLI
So you're one of those guys who can't even read a complete sentence?
>>
>>54854991
>gtx 1080
It's 50 watts less than the 480
>>
>>54859899
Actually 100w more than a single 480
>>
File: 77670d34_gqxc.jpg (334 KB, 1024x640)
>>54859105
>big engines like UE4, Unity, Source, CryEngine, I bet will all get multi GPU support even if just the basics.

Crossfire has been pretty well implemented by developers since 2012.

I remember getting a second 5770 to put into crossfire, and holy shit that shit worked amazingly well.
>>
>hi i'm a poorfag so i have to sacrifice performance for money
>hi yes amd for life!
amdfans
>>
>>54854790
Just wait and see
What's the point of this speculation?
I have heard so many conflicting rumors the last 48 hours
>>
>>54854790
>mid-end
Middle range cards can't be at the end.
The end is either high or low.

Oh wait I forgot its a pajeet card so mid-end performant all the way.
>>
>>54860025
Cuck. I have a gtx 960 btw
>>
>everyone says crossfire sucks
>meanwhile my dual 390x setup runs great with no problems in any game

get good
>>
At least they have a monopoly on the Housefire market.

>owns a 7870
>can confirm hot as fuck at default clocks
>sometimes smells like burning plastic
>>
>>54854790
Be honest OP, you live in your mom's basement and don't even pay the electric bill. You don't care about that shit.

RX480s have 1 6-pin each anyway. Power draw is fucking nothing. I don't have much brand loyalty if any, but holy crap you nvidiots sure are butt blasted about this shit.
>>
File: 1464636629638.png (67 KB, 616x596)
>>54854790
I don't pay the electricity bill
>>
Nvidiot kids are in this level of damage control because their moms will get them the 480 instead of the 1080.
>>
>>54859929
How can you be this much in denial?
>>
>>54860418
see >>54855434
>>
>>54860427
Hey, don't kill the messenger, I'm only stating facts here.
>>
>>54860449
No, you're not. You're either an idiot or cucked into denial.
http://lmgtfy.com/?q=1080+power+draw
>>
>>54860491
Thanks for confirming that the 480 draws 100w less than the 1080.
>>
>>54857697
>1080p
Ahahahaha poor fucking pajeets
>>
File: 1445158139862.gif (2 MB, 500x500)
The average price for a kilowatt hour of electricity in the united states is twelve cents.

The 1080 and 480 are both 150 watt cards.

If you use two 480s, that's 150 watts more than a single 1080. Over the course of a full year, at 90 minutes of use per day, that equates to 6.75 extra kilowatt hours per month, which totals a whopping $9.72 per year.

If you have any sort of balance in your life, you're not gaming for more than 10 hours per week. Even if you treat gaming like a full time job and do it 40 hours a week, the difference is still less than $40 per year.

There is absolutely no argument to be made here on power consumption.
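The arithmetic in that post checks out. A quick sketch (Python) using the post's own figures (150 W extra, $0.12/kWh, 30-day months):

```python
EXTRA_WATTS = 150      # two 480s vs. one 1080, per the post above
PRICE_PER_KWH = 0.12   # cited US average

def extra_yearly_cost(hours_per_day):
    """Dollars per year for the extra draw, using 30-day months as the post does."""
    monthly_kwh = EXTRA_WATTS / 1000 * hours_per_day * 30
    return monthly_kwh * 12 * PRICE_PER_KWH

print(round(EXTRA_WATTS / 1000 * 1.5 * 30, 2))  # 6.75 kWh/month at 90 min/day
print(round(extra_yearly_cost(1.5), 2))         # 9.72 dollars/year
print(extra_yearly_cost(40 / 7) < 40)           # True: 40 h/week stays under $40/yr
```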
>>
>>54860533
You have single handedly lowered my view of AMDrones

>>54860593
It still draws extra power when you're not gaming, tard. And not just idling either; even shit like Facebook has GPU accelerated parts
>>
File: 1462820406673-b.gif (112 KB, 255x231)
>>54854790
>ass blasted /thread
>>
>>54860663
>He browses Facebook on his gaming computer
>He leaves his gaming computer on all the time
>His gaming computer is his main computer
>He has a gaming computer

This is an 18 and up board, young anon.
>>
>>54858107
This guy knows what he's talking about.
>>
>>54860663
If you leave your desktop computer on 24/7/365, yes, that would add up to an additional $360 over two years. But why would you do that? It's 2016, do you really use your desktop computer for everything still?

Assuming you sleep your computer while you're asleep, you're looking at an additional $118 per year (running 18 hours per day), or $59 per year if you don't leave it running while you're at work.

So basically the cost difference only matters to people who don't have jobs, in which case they're not paying for any of this anyway, so it still doesn't matter.
>>
>>54860688
Non-retards understood that Facebook was a generic example of how GPU acceleration is everywhere and that your calculation of hours was completely invalid.
Also /g/ is 95% focused on consumer hardware. For AMD and nvidia that means cards for video games (and low end CPUs). Browsing a video game card thread in a video game focused board and complaining about video games is retarded.

>>54860766
If you buy a high end video card it's not at all uncommon to keep it for 4 years. So we're talking ~200$. Meaning you get a worse performing solution for the same price with AMD
>>
File: 1462740188215-0-b.gif (44 KB, 133x240)
>>54860663
>It still draws extra power when you're not gaming tard
No it doesn't. Idle load is way less. 2 smaller cards in idle can draw less than a more powerful card. Maximum Load ≠ Idle Load. Power isn't the discussion.

For ≈400$, 2xRX480's displace the entire gpu market.

I told nvidia lovers: don't be fooled by the rushed 1080 launch. Nvidia knew what amd had in store. Yes nvidia has powerful cards. Amd now has an extremely cheap 980. (RX480 is possibly equivalent to anything from a 980 up to a 980ti).
>>
>>54860864
>literally doesn't finish the sentence before replying
>>
>>54860864
>2xRX480's displace the entire gpu market.
Even AMD's own cherrypicked marketing figures put it just on par with a 1080.
But it's only gullible when you swallow nvidias stuff, isn't it?
>>
File: 1462514903755-b.jpg (97 KB, 634x392)
>>54860951
Nvidia overhyped the 1080. It's great for certain applications. For gaming it was way overhyped.
>>
>>54861073
>Nvidia overhyped the 1080 but AMD isn't overhyping 480 I know this because of the 0 reviews I've read
it's only gullible when you swallow nvidias stuff, isn't it?
>It's great for certain applications.
It's 100% a video game card. They sell other stuff for "certain applications"
>For gaming it was way overhyped.
It pretty much delivered as promised (70% faster than a 980). And it won't have any competition for performance at all for at least 6 months
>>
>>54861176
Bullshit. They were saying the 1080 was faster than 2 980's. They lied. Continue sucking off your daddy.
I'll be ditching my 970 for 2xRX480's unless the 1070 gets a huge price reduction.
>>
>>54861344
>faster then 2. 980's.
Yes, this is a shitty marketing formulation by nvidia. BUT to anyone with basic knowledge in these things (not you) it's obvious that "2 980's" is 980 SLI, otherwise they would just have said "twice as fast".
+70% is the usual benefit you get from running SLI. And that's what they delivered, except in a single card. Read the reviews yourself. Not a single one said Nvidia claimed it would be twice as fast. Also have fun with all the multiGPU problems you're going to have. There's a reason people don't just buy 3 960s
>>
>>54860593
>90 minutes of use per day
What are you, some kind of casual?
>>
File: 1462513245228-b.jpg (12 KB, 244x206)
>>54861531
So if the RX480 is the equivalent of a 980, then 2xRX480's in crossfire = a 1080 for ≈400$.
>>
>>54860766
How are you even calculating costs?

My gaming computer uses 72w at idle with the monitor turned off and about 300w under load playing games. This is with an old i5 and r9 290 housefire.

Leaving my gaming PC on is more along the lines of $20/yr.
>>
File: 1464809723733-g.jpg (882 KB, 3264x1836)
>>54861531
>>
>>54857697

People buying these cards are buying hard into the 144Hz meme.

They don't realize that the real limiting factor for 144 FPS is their CPU.
>>
>>54856460
I've also seen OEM 1080s for $899. What's your point?

Keep being blind to the AMD slaying of nvidia and spending your money on bullshit. You've deflected the facts--2 Crossfired AMD 480s are about the same as a single 1080, it produces similar results, both in power consumption and in output. But keep deflecting, paid shill. Keep it up
>>
>>54854807
>draws twice the power
>cfx sucks

Fact of the matter is, you're insecure about a $199 card from the competition performing this close to your favorite company's $899 card.
Get over yourself, there's no way anyone will fall for your stupid shill thread.

Unless this was a bait thread, in which case, fuck you.
>>
>>54854935
HOW THE FUCK DO YOU KNOW? I only have seen a retarded graph showing a 3 frame difference in a benchmark. How does that translate into real power? People just seem to pull numbers out of their asses. You can't assume stuff because you saw some numbers somewhere. I thought the rx 480 used 150 Watts btw but could be wrong.
>>
>falling for amds sketchy marketing

https://www.youtube.com/watch?v=04ITA1_XoqM
>>
>>54862921
It's ok when AMD does it.
>>
>>54862921
>if I keep posting it, it will become true
The buttmad is hurt.