[Boards: 3 / a / aco / adv / an / asp / b / biz / c / cgl / ck / cm / co / d / diy / e / fa / fit / g / gd / gif / h / hc / his / hm / hr / i / ic / int / jp / k / lgbt / lit / m / mlp / mu / n / news / o / out / p / po / pol / qa / r / r9k / s / s4s / sci / soc / sp / t / tg / toy / trash / trv / tv / u / v / vg / vp / vr / w / wg / wsg / wsr / x / y ] [Home]
AMD GCN 4.0 Leak

You are currently reading a thread in /g/ - Technology

Thread replies: 255
Thread images: 35
File: AMD-Polaris-Architecture-8.jpg (80 KB, 900x506)
Here we go, the AMD meme marketing start

>http://videocardz.com/58031/amd-announces-polaris-architecture-gcn-4-0
>>
>>52227357
Wasn't the last AMD-related leak we got out of VideoCardzzzzzzzzzzzz the fake FAD slides?

Why even bother with those cucks.

They didn't even put any effort into these ones. Not even trying to mimic past AMD colour schemes, poorly written, and poorly structured.

No. If this was a "Leaked" nVidia slide, I would be more convinced.
>>
>>52227357
For once it's not fucking wcctech.
>>
>>52227393
It's fake, too much detail
>>
>>52227446
>It's fake
*not
>>
I think AMD's only goal should be power consumption, because without it, their Zen APUs aren't gonna be good at all.
>>
File: amd-radeon-split-2015-09-10-03.jpg (110 KB, 960x655)
>86w
>140w

This is not the 2x performance per watt we were promised.

More like 1.85x.

Also GCN 4.0?
I thought we were still on 1.3
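(For anyone checking the arithmetic: the slide's own numbers can be run directly. This assumes, as the slide implies, that both systems rendered the same 60 FPS capped workload, so performance cancels and the ratio is just wall power; the 1.85x figure above doesn't follow from the wall numbers alone.)

```python
# Perf/watt ratio implied by the slide's wall-power figures, assuming both
# systems render the same 60 FPS capped workload (performance cancels out).
polaris_system_w = 86   # total system draw with the Polaris card (slide)
gtx950_system_w = 140   # total system draw with the GTX 950 (slide)

ratio = gtx950_system_w / polaris_system_w
print(f"{ratio:.2f}x")  # 1.63x
```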
>>
Oh boy now we have a '40%' for AMD's gpus too.

From someone who's seen the last 10 years of videocard/CPU releases in real time along with their marketing and 'pre release information', nothing that comes out before the third party reviews of the product means a god damn thing.
>>
>>52227482
>last 10 years
Underage please go
>>
>>52227357
So AMD's Polaris, which is on a new 16nm node and uses HBM, consumes 50W less than a GTX 950?

mmm, let's see, HBM alone is like a 20 to 30 watt advantage.

If we go with 20W, and since Pascal will use HBM too, that puts the new Polaris card at a ~40W difference, and that's with a node shrink. And they think this is impressive?
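(Spelling that arithmetic out with the slide's figures: the wall gap is actually 54 W, and subtracting the assumed 20 W HBM saving leaves ~34 W rather than 40 W. Both the 20 W figure and the idea that the demo card uses HBM are the post's own guesses.)

```python
# Wall-power gap from the slide, minus the post's guessed HBM-vs-GDDR5
# saving, to estimate the part attributable to the node shrink alone.
gap_at_wall = 140 - 86   # 54 W between the two demo systems
hbm_saving = 20          # post's low-end guess for HBM's power advantage
node_gap = gap_at_wall - hbm_saving
print(node_gap)  # 34
```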
>>
>>52227470
>on early HW/SW
>early
>>
>>52227357
these marketing materials are complete utter bullshit whether they are real or not
>>
>>52227357
>no performance improvement
>guys how the hell do we market this?
>performance... per watt?
>brilliant!
>>
>>52227446
Not even that.
But just LOOK at it.

Hell, the embargo lifts in 2 hours. Do you honestly think AMD would dare show up to a conference to announce a new version of GCN with a slide deck like THIS?

Hell, "GCN 4.0" is the real kicker here. Arctic Islands, the yet-to-be-released 16nm/14nm GPU due this year, is based on the equally unreleased GCN 1.3. So whatever happened to the other 27 versions, I don't know.

This is WCCF tier clickbait right here.

And if ANY of this was true, even OP's pic alone, do you think AMD would have been able to keep a lid on it? Do you not think it would have found its way out to some source or another? SemiAccurate are usually the first to report accurately on AMD leaks.
>>
>>52227487
whoops, it's actually 14nm.
>>
>>52227357
which nvidia card is 140w?
>>52227487
maybe it's the tdp
>>
>>52227505
It's okay when Nvidia does it.
>>
>>52227505
Worked for nVidia.
>>
just bought an r9 380
how did i do?
>>
>>52227521
GTX 950 apparently.
>>
>>52227486
Didn't really pay attention to computer tech that much when I was 15
And why would I, didn't have the money to buy any of it.
>>
>>52227521
>which nvidia card is 140w?
GTX 950 at factory power targets.
Read the small print.
>>
>>52227521
>which nvidia card is 140w?
980
>>
>>52227505
Please, one of the biggest complaints about the 200/300 was performance per watt.
Also, if you realise that they have increased P/W, they can stick more stuff on a chip to result in more performance overall.
>>
>>52227521
>>52227535
>>52227542
How much does the 960 use? Shouldn't be far from 140, if not less.
>>
>>52227535
>>52227542
hmmm, thats a lot
>>
>>52227582
It's system power consumption, not the card.
>>
>>52227553
>>52227582
it is, considering the mobile GTX 980 (the full chip, not the GTX 980M) uses 180W and has the same performance as the desktop version.
>>
File: perfwatt_3840.gif (47 KB, 400x717)
>>52227357
So uh, slightly more efficient than maxwell?

Can't wait for amd users to suddenly go from not giving a shit about efficiency to bragging about it all the time.
>>
>>52227596
What about the CPU? Even Intel's need 80 watts.
>>
>>52227553
According to Tom's Hardware, who have a pretty sound testing methodology for power consumption, the 960 at reference with factory power targets will pull 119 watts at most. So that's the slide BTFO'd right there with its claim that a 950 uses 140 watts.
It's all well and good having Maxwell so efficient, but that's only at factory power targets, and in games at those targets the performance is seriously lacking.
>>
>>52227596
no way
>>
>>52227608
Probably a Core M. No idea why you'd ever calculate the entire system power draw when comparing cards.
>>
My toaster can run Battlefront at 60fps.
>>
>>52227619
it says right there that it is a 4790K
>>
4790k and ddr4 memory. Amd engineers ladies and gentlemen
>>
>>52227630
yes but how the fuck do you run a gpu + cpu at 86w.
And if this is only the power draw of the GPU, how do you manage to make a GTX 950 draw 140 watt?
>>
File: 1427249871634.gif (836 KB, 286x204)
>>52227638
>>
>>52227608
See, this is where the trickery comes in.

What happened here is AMD capped the framerate at 60FPS on their high-end cards in the driver settings; this ends up making the CPU do very little work, and thus low power consumption.
So basically it looks like 30-40W for the CPU, and 50-60W for the GPU
>>
>>52227532
You did good
>>
>>52227638
wew lad
>>
Why don't they use their own fucking processors for testing things like this?
>>
>>52227649
My God are you insinuating AMD is lying???! Good sir I believe your posts are certainly sponsored by Israeli agencies.
>>
File: 1449595999555.jpg (57 KB, 640x813)
>>52227357
>under embargo until january 4 9AM EST


YESSSSSSSSSSSSSSSSS
>>
File: gtav_vhigh_cpu.png (53 KB, 582x1968)
>>52227666
next question pls
>>
>>52227675
Yes it is I, Chaim Schlomo Israel Goldbergstein.
>>
File: amd_quantum_pc_2_w_600.jpg (34 KB, 600x340)
>>52227666
Because it sucks, pic very related
>>
>>52227666
They are advertising the gpu, not the cpu. They know that an intel cpu will remove bottlenecks, they admit that.
>>
>>52227699
>cool things that will never happen
>>
>>52227701
it's very bad brand advertisement though.
Why would you buy a gpu from a company that won't use their own hardware?
>>
isn't battlefield 4 still the only game with GCN support?
>>
>>52227470

There is no such thing as GCN 1.X, an Anandtech writer invented this.
>>
>>52227720
I think there will be an RTS game soon, something with ashes in it. At least the shills keep posting benchmarks from that game.
>>
>>52227666
Because the CPUs suck.
They don't want to be in denial about the CPUs and gimp their GPU marketing in the process.
>>52227714
There are things more important than 'brand'.
I'd buy a GPU from a company that made good GPUs.
Would you buy a GPU from intel?
>>
>>52227731
>>52227720
>I have no idea what GCN is

Get out.
>>
>>52227735
I would definitely buy a gpu from Intel if they made one.
>>
>>52227741

Mantle is dead guys

http://www.pcworld.com/article/2891672/amds-mantle-10-is-dead-long-live-directx.html
>>
>>52227735
>more important than 'brand'
This attitude is what made AMD the amazing company it is today. They accepted being the 'cheap shit alternative' and now look at their sales.

Brand is EVERYTHING a company has.
>>
>>52227750
Way to prove the point you tech illiterate retard.
>>
What is 9AM EST in GMT? Fuck this shit, why doesn't everyone use GMT? What's the point of UTC, EST, EDT, COT, CST, ECT

WHAT THE FUCK
>>
>>52227768
Cali uses gmt right?

Est is gmt+3
>>
>>52227766
why? it's not like there's a difference between gcn and mantle. And if there is, what's so good about this 'GCN' that's even worth discussing?
>>
>>52227757
Brand isn't as important in a world of objective measurements.
You say 'cheap shit alternative', AMD says 'better value'.

If you get 90% of the performance for half the price that's called a good brand.
>>
>>52227784
Mantle is an API.
GCN is the name of hardware architecture.

Jesus christ go kill yourself
>>
>>52227782
No wait I think cali uses pst

Whatever you mountain niggers don't matter
>>
>>52227768
>>52227782
a 1 second google search shows that EST is GMT-5, so considering you're too stupid to do a 1 second google search I assume you're in continental Europe (CET = GMT+1), so you have to add 6.
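(The conversion is easy to verify; a quick sketch with Python's stdlib zoneinfo, using the January 4 embargo time quoted upthread:)

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

# 9 AM US Eastern on January 4, 2016 -- EST (GMT-5) at that time of year
embargo = datetime(2016, 1, 4, 9, 0, tzinfo=ZoneInfo("America/New_York"))

for zone in ("UTC", "Europe/Paris", "America/Los_Angeles"):
    print(zone, embargo.astimezone(ZoneInfo(zone)).strftime("%H:%M"))
# UTC 14:00 / Europe/Paris 15:00 / America/Los_Angeles 06:00
```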
>>
here is the nvidia CES link: http://www.nvidia.com/object/ces2016.html
Monday, January 4 at 6:00 PM PST


I don't think I will be able to watch AMD's, I'm sleepy.
>>
>>52227784
>it's not like there's a difference between gcn and mantle.
Stop posting please.
>>
>>52227795
>a good brand
a product does not define a brand; the market defines a good brand.
AMD can make the greatest GPU on earth, but if they pair it with Intel products, and thus 'admit' their own products are inferior, they damage themselves a lot in the process.
>>
>>52227804
So the NDA lifts in 1 hour and 30 minutes?
>>
>>52227814
same here, let's wait the anandtech/techpowerup summary
>>
>>52227822
For fuck's sake what's the problem with admitting they don't make the best CPUs in the world? And that in one particular situation, they chose an Intel.
I'm a consumer in the CPU and GPU market. To me, this choice means they can make an objective unbiased choice for a competitor's offering instead of using their own out of pride. I like that, I value that sort of vision in a brand, it's something that attracts me to the brand. So why is my opinion as a consumer disregarded?
>>
>>52227839
2h 30min
>>
>>52227814
Nvidia is doing something with self driving cars.
>>
>>52227822
No they don't.

Their market is NERDS.
>>
>>52227848
I'm talking marketing here.
BMW is never going to use an Audi engine for the same reason: it makes their brand look weak, which would be stupid. Even if the competition is better you must never admit it like that; you discredit your own company.
>>
>>52227784
Would you be so kind as to get a trip?
>>
>>52227859
Doesn't look like it
>>
File: 1440748328147.jpg (15 KB, 255x248)
>>52227848
>consumer
Please don't use wrong words.
https://www.gnu.org/philosophy/words-to-avoid.html#Consumer
>>
>>52227868
These STEM guys think advertising doesn't work on them. I love flogging shit to these cretins.
>>
so is amd going to be at the ces?
>>
>>52227887
I'm retarded
>>
>>52227877
Don't worry BMW will get bought by VW soon and they'll have no choice but to use their engines.
>>
>>52227885
How about you get a trip so I can filter your stupid ass asking people to get a trip in every single thread for the past week
>>
>>52227905
Totally irrelevant, also I'm not an arab so I'll never drive a German car.

Anyhow can you wrap your mind around the concept of brand value and the idea that by admitting the competition in the CPU market to be better AMD is also hurting their GPU division?
>>
>>52227877
>BMW is never going to use an Audi engine for the same reason

Comparing car industry to PC hardware is retarded.
That's like comparing prebuilt PC company like Alienware and Dell.

Even Apple iPhone used Samsung parts and processor in the past.
>>
>>52227921
>Anyhow can you wrap your mind around the concept of brand value
Yes, and to me the brand adds value when their CEOs can make unbiased decisions like that. They devalue by having inferior products, and value by being honest about it.
>>
>>52227921
Everybody knows amd cpus are shit, there's no point hiding it. It's not the first time they've done this either.
>>
>>52227910
I'm honoured that you should think I am others. But no. This thread is the first I've posted in for weeks on /g/.

Should the mods do their jobs and remove shitposters, then you wouldn't have people asking others to get a trip so that they may filter them.
>>
>>52227925
Are you autistic? I'm not talking about the actual hardware here, I'm talking about the idea of companies using competitors' stuff as a substitute for their own, which is a sign of weakness and damaging to their brand.

Imagine Stallman giving a presentation in M$ PowerPoint; why the hell would you use his software if he himself doesn't even use it?
>>52227939
>>
>>52227925
>in the past
I guarantee that 60% of the components in an iPhone are produced by Samsung.
>>
>>52227961
None of those components were produced by Apple either, though.
There's little wrong in saying 'we don't make screens so we let Samsung do it'.

However, saying 'we make CPUs but they are shit so we use Intel ones' is rather stupid.
>>
>>52227945
Get fucking real, nobody is going to get a trip; you might just as well call him names and it'll end up with the same result.
>>
>>52227956
>using competitors stuff as a substitute to their own which is a sign of weakness and damaging to their brand.

And Apple Macs and iPhones had their own CPUs too, until they switched to Intel and Samsung.

Sorry that most big companies are not retarded like you.
>>
>>52227877
Mercedes uses PSA diesel engines. No one cares about these intricacies.
>>
>>52227925
>in the past
not only in the past, right now
>>52227877
>>52227905
Off topic fun fact: the Audi R8 and the Lamborghini Gallardo are the same car.
>>
>>52227956
I've explained my point twice, and you keep repeating "Imagine X would use its competitor Y's product, that devalues the brand". I don't need analogies, I got your point and already replied to it: it devalues the product, but values the management, to me the brand image stays the same.
>>
>>52227956
>Imagine Stallman giving a presentation in M$ powerpoint
What is it with you and these strawman metaphors?

The fact of the matter is, if you want to display the best possible performance of the product you are trying to market, you use the best possible hardware alongside it in order to alleviate any limiting factors.

The fact that you fail to comprehend this simple concept is amazing.
>>
>>52227638
underrated
>>
>>52227989
>Engine == Car
No
>>
>>52227998
The fact of the matter is that AMD is not only selling their GPU, they are also selling a piece of AMD.
When you buy a phone you also buy the brand attached, when you buy a car you buy the brand attached and when you buy a GPU you buy the brand attached.
And AMD just says 'the brand attached is shit'
>>
>>52227980
It will.
But putting 1 shitpost in a thread will do more for overall post quality than 10 shitposts.
>>
>>52227999
>>52227638
Nuffing new

AMD are incompetent
Nvidia are evils
>>
>>52228004
also the same structure m8
>>
>>52228009
So what do you want AMD to do?
Use a 4-year-old processor and harm the impact of the new GPU they are marketing?
Will them making themselves seem shit by shitting on themselves make you happy, you shitposting urchin?
>>
File: 1451471487313.png (357 KB, 397x402)
The 4790k doesn't support DDR4.

AMD is so incompetent they can't even make slides properly. The results are obviously rigged too. 140w 950? Come on.
>>
>>52228009
They're selling GPUs.
They aren't fucking samsung.
>>
>>52228009

So it's not fine for a car company to use another brand's engine, but it's alright for a phone to use another brand's CPU?

Your logic is retarded.
>>
>>52228013
>>52227638

You both seem to think that AMD actually made these slides.
>>
>>52228049
Yep, it's typical. No joke.
>>
>>52228041
It's NOT fine for BMW to use the engine of a direct competitor. That is very damaging to their brand.

However for example Skoda using VW engines is no problem whatsoever, since they
1) don't make their own engines
2) are part of the same company
>>
why are amdretards are so angry all the time? something announced by intel - lol jew shit good goy, something announced by amd - lol it's shit anyways i don't play games tips fedora mom, nvidia announces something - ayy 3.5 all over again, 200$ for a-vsync, wood screws lmao.
>>
>>52228049
Fuck off Haswell can use DDR4 with a BIOS update retarded shitposter
>>
>>52228073

And Apple had their own CPUs and still used other brands' CPUs and mobile CPUs.
Did that damage their brand or stop them from being the best-selling phone/PC company in the world?
>>
>>52228085
>A bios update can give a CPU a new memory controller compatible with DDR4
k den
>>
>>52228092
I'm quite sure Apple uses their own processors whenever possible.
But in case they don't, then yes this is damaging to their brand but not as badly as AMD having to use the CPU of Intel, read up on 'core business'.
>>
>>52228092
>>52228104
And I was right; since the iPhone 4 at least, Apple has been using their own processors.
>>
>>52228104

Core business has meant nothing in the tech industry since forever.
If it really mattered, then all the CPU companies in the past would not have paid money to license x86 from Intel, and would have made their own architectures instead.
>>
>>52228153
>means nothing
It means everything. There's a reason Samsung spends more money on their mobile phones than on their keyboards.
>>
>>52228136

Designed =/= made.

That's like saying EVGA makes their own GPUs.
>>
>>52228073
https://en.wikipedia.org/wiki/Prince_engine
>>
>>52228165
>There's a reason Samsung spends more money on their mobile phones than on their keyboards.

Mobile phones are not Samsung's core business; they own a whole country.
>>
>>52228155
/thread
>>
File: intel btfo.png (10 KB, 710x423)
omg, check out page 37 cpu leak
>>
Why does power efficiency/performance per watt actually matter for consumers?

A 30-40 watt difference has no real effect on consumers; it's not like it will magically fuck the power supply.

It only matters for servers/workstations.
>>
File: graph.png (8 KB, 600x463)
>>
>>52227357
will this work with an r270?
>>
>>52228248
thanks, doc
>>
in 45 minutes guys
>>
So is this a bad time to have bought an R9 390?
>>
File: 1451658505993.png (27 KB, 580x276)
OMFG
>>
File: p.jpg (26 KB, 826x435)
that's it, intel is done
>>
File: 1450717893361.png (24 KB, 1095x805)
>>
>>52228306
>>52228337
>>52228364
Can you stop shitting up the thread?
>>
>>52228290
Not really. This new stuff is almost a year away.
>>
>>52228393
>This new stuff is almost a year away.

heh, and here I am being paranoid that next gen will come out in the next six months.
>>
>>52227603
Fury x already has a higher performance per watt than the 980 ti though.
>>
>>52228290
>actually buying a gpu at the start of q1
i think the wasted money is not the biggest problem you're dealing with
>>
>>52228281
ces AMD press release?
>>
>>52227603
Just like YOU Nvidia faggots, making the same arguments for your Fermi and Kepler housefires?

Fuck off and kill yourself. You are obviously senile, because you can't even remember 5 years ago.
>>
>>52228384
>amd shill on suicide
>>
>>52228419

I need something to replace my broken HD4850, what's wrong with buying a GPU in Q1 of the year?
>>
>>52228306
>dem bars
Take all my money.
>>
>1.2 straight to 4.0

HOLD ON TO YOUR FUCKING SEATS LADS, THE RIDE JUST GOT A WHOLE LOT STEEPER.
>>
>replace my AMD CPU with Intel
>replace my AMD card with Nvidia
>Winter comes
>suddenly I'm freezing because my computer doesn't get as hot as it used to
>>
>>52228419
>implying Pascal will come out q1 or q2
>>
>>52228474
There is no such thing as GCN 1.1, 1.2 or 1.3
>>
>950
>140w

AMD trying so hard!
>>
File: image.jpg (71 KB, 640x457)
>>
10 min

http://www.hwbattle.com/bbs/board.php?bo_table=hottopic&wr_id=1316
>>
Will there be a stream?
>>
>>52228695
Of course not.
>>
File: graph.png (13 KB, 600x463)
>>
http://www.anandtech.com/show/9886/amd-reveals-polaris-gpu-architecture

http://www.anandtech.com/show/9886/amd-reveals-polaris-gpu-architecture

http://www.anandtech.com/show/9886/amd-reveals-polaris-gpu-architecture

http://www.anandtech.com/show/9886/amd-reveals-polaris-gpu-architecture
>>
HERE WE GO BOYS
>>
http://www.guru3d.com/articles-pages/radeon-technologies-group-january-2016-amd-polaris-architecture,1.html

http://www.tomshardware.com/news/amd-polaris-14nm-finfet,30823.html


http://www.anandtech.com/show/9886/amd-reveals-polaris-gpu-architecture
>>
Fuck AMD, this is actual news
http://www.anandtech.com/show/9894/nvidia-announces-the-geforce-gtx-vr-ready-program
>>
>>52227543
Kek
You are retarded
>>
>>52228920
GTX980 has a 140W TDP you stupid AMDKEK
>>
>>52228949
GTX980 has a 165w TDP, and to achieve the typical benchmark numbers the card has to run much closer to 200.

0/10 trolling though
>>
>>52228803
>>52228841
>4790k with ddr4
>>
>Meanwhile RTG has also disclosed that the first Polaris parts are GDDR5 based.

IT'S OVER, AMD IS FINISHED & BANKRUPT
>>
>>52228977
That means Pascal ones are too.
>>
File: 1451745925975.gif (4 MB, 340x340)
>>52228977
>>52228949
>MEMEMEMEMEMEMEMEMES

>MEEEEEEEEEEEEEEEEEEEMS

>REV UP THOSE MEMES BOIZ
>>
ITT: Nvidikek's get triggered
>>
>>52227750
That's because Khronos' Vulkan, Mantle's successor, is alive.
>>
>4790k with ddr4
>>
>>52227357

>Comparing 14nm to 28nm power consumption
>LOOK HOW ADVANCED WE ARE

This is high class bait from AMD right here
>>
https://www.youtube.com/watch?v=5g3eQejGJ_A
;/\)
>>
File: 1426099677708.jpg (52 KB, 660x555)
>>52227357
>caring about perf/w on consumer grade hardware
>>
>using frame limiter to reduce power consumption
>not even a 100% power efficiency improvement
>fabrication is literally half the size
AMDshills on suicide watch

And
>i7-4790k
>DDR4
They're liars too
>>
>>52229153
So the power consumption is the entire system? That's pretty neat!
>>
>AMD using outdated Nvidia driver for the comparison
>361.43 WHQL available since last year
>Cherry picked AMD biased game because DICE is AMD asskisser
>>
>>52229168
t. 500w 390x
>>
>>52229153
DESIGNATED
>>
>>52228155
That has been independently tested and confirmed to be accurate.
It's just that independent reviewers used settings that didn't favour AMD as much.
>>
I'd buy a AMD card if their GNU/Linux drivers weren't utter shit.
>>
>>52229511
AMD products would be alright if they had decent drivers and released games weren't lopsided toward working better on nvidia cards.
>>
>>52229168
it's a mobile chip, smartass
>>
>>52229511
Catalyst is being modified to use the open source gallium kernel drivers rather than having to rely on a half-working kernel-module blob. When that happens, stability will improve a lot, and the open source driver will have at least twice as many devs working on it as before, so performance will improve too.
>>
>>52227357
It's a 120mm^2 GPU, so yeah, that's total system wattage.
>>
File: AMD_Polaris_Daten__14_-pcgh.png (223 KB, 1593x894)
>>
I like how they still write competition instead of Nvidia.
>>
>To that end RTG also plugged each system into a power meter to measure the total system power at the wall. In the live press demonstration we saw the Polaris system average 88.1W while the GTX 950 system averaged 150W. Meanwhile in RTG’s own official lab tests (and used in the slide above) they measured 86W and 140W respectively.


HAHAHAHAHAHAHA
>>
File: AMD_Polaris_Daten__10_-pcgh.png (335 KB, 1606x907)
>less variation
will overclocking die with FinFET?
>>
>>52229674
Intel are very close competition on the IGP market now.
>>
>>52229674
I like how they think they're still competing.

They're only playing catch up now.
>>
>>52229713
The further lithography shrinks, the more leakage there is when overclocking.
OCing won't be a thing in a few years, when chips will turbo up to their OC max on their own.
>>
File: 1448462676116.png (433 KB, 1780x1408) Image search: [Google]
1448462676116.png
433 KB, 1780x1408
>>52227543
>>
>>52227357

Don't you retards ever learn?

Seriously, has anyone noticed that whenever we start hearing about AMD's next lineup of a product you people hype it up and say how it's gonna dominate and whatnot, then the thing comes out, it looks absolutely NOTHING like what you retards were hyping up and then you change your speech to "n-nobody s-said it was g-gonna be g-g-g-good" by trying to revise history for damage control purposes.

Stop this shit, jesus.
>>
>>52229699
Given that we know the 950 draws 90W on average, they are implying the entire rest of that i7 system is drawing 60W and their card is pulling only 26. That doesn't sound very realistic.
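(Working that subtraction through with the slide's wall figures: a 90 W card against the 140 W system leaves 50 W for the rest of the i7 box, which would put the Polaris card at 36 W rather than the 60/26 split above. The 90 W card-only draw is the post's assumption, not a slide figure.)

```python
# Back out the card-only draws from the slide's wall numbers, assuming
# (per the post) the GTX 950 itself pulls ~90 W under load.
system_950 = 140                  # slide: wall draw of the GTX 950 system
card_950 = 90                     # assumed GTX 950 card-only draw
platform = system_950 - card_950  # rest of the i7 system: 50 W

system_polaris = 86               # slide: wall draw of the Polaris system
card_polaris = system_polaris - platform
print(platform, card_polaris)  # 50 36
```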
>>
>>52229751
It's not all that weird for a mobile chip.
>>
>>52229761
Why are they testing a mobile chip on a desktop i7 system?
Oh right, marketing.
>>
>>52229736
Except everyone agrees that AMD's 7000 series was better than Nvidia's 600 series and AMD's Rx 200 series was better than Nvidia's 700 series.
>>
>>52229782
It's a 120mm^2 chip, it's a mobile chip or very low-end desktop chip, no difference.
>>
So what will they be calling their recycled 7970 this year?
>>
>>52229820
GOAT
>>
>>52229820
>Radeon 7970 so good it remained competitive with three generations of Nvidia cards
>>
No one mentions that AMD will be dual-sourcing from both TSMC and GloFo?
>>
>>52229872
I have doubts about GloFo making large dies, so it'll probably go like this:
CPU/APU - GloFo
low-midrange GPUs - GloFo
Highend GPUs - TSMC
Server chips/MCM - GloFo
>>
>>52229820
amd R5/R3 440
cost less than $80

>fugg
>>
the 7990 is the best video card out there right now

prove me wrong
>>
>>52229893
>dual GPU
Next.
>>
>>52227505
Heat and Power Usage are the only reason why I dont actually buy Radeon
>>
File: 1451033018609.jpg (41 KB, 396x382)
>>52229967
>>
>>52229978
I dont even understand the layout of that shit site
>>
>>52229893
295x2/395x2
What now?
>>
>>52230014
>I dont understand Reddit
Okay, maybe Nvidia is right for you after all.
>>
>>52229914

what's your point?

you can buy one used in good condition for under $300 usd
>>
>>52227470
It's total system consumption. At the wall. Compared to maxwell.

It's more than 2x for earlier GCN then.
>>
>>52230022
Fury X2 when?
>>
>>52229861
This
>>
>>52227357
Fuck you AMD.
Give me PERFORMANCE, I don't give a fuck about TDP.

It's fine if the card eats 140-180-200W, just make it FAST.
>>
>>52230169
This is a byproduct of a die shrink. AMD didn't do anything.
>>
>>52228474
>GCN 1.0 = GCN 1
>GCN 1.1 = GCN 2
>GCN 1.2 = GCN 3
>GCN 1.3 = GCN 4

cmon, it isn't that hard to understand. gcn 1.x never existed in AMD's naming; they just use the codename, like Volcanic Islands or whatever.
>>
>>52228977
Oh wait. If this is true, and the showcased cards are indeed GDDR5, it means that the 2x efficiency shown is purely due to the node shrink and the improvements to the architecture. Imagine Fury X performance with 100-120 W consumption. Nice.
>>
>>52229479
I internally scream >DESIGNATED whenever I see/hear something about Indians, but I make an exception for Raja
>>
>>52229153
>>52229479
>>52230355
> guy at the end
> FAIZ

> YOU ARE BEAUTIFUL MADDY
> THANK YOU FAIZ
>>
>>52230014
1. Read the links
2. Click the link that interests you
Repeat

Optional:
1. Click "comments" to read what redditors have to say about it
2. Create an account if you want to participate

How retarded can you be?
>>
>>52230014
>>52230406
reddit userscripts (similar to 4chanx) also help a lot.
though 99% of the site is junk, so unless you live forever and don't mind wasting your time, don't try.

spent a few days finding sub plebbits and shit, and it's all reposts, karma whoring, etc.
>>
>>52230437
The defaults are garbage, mostly the ones that are supposed to be funny and entertaining.

The site for me only works as a news aggregator and some very specific small subs, but even so I feel like fucking slashdot posts on facebook give me more tech news than reddit
>>
>>52230471
>but even so I feel like fucking slashdot posts on facebook give me more tech news than reddit

That's what I meant.
You can make it less shitty, but you cannot fix the 'content'.
>>
>>52227521
>which nvidia card is 140w?
>GTX 950 apparently.

Are you guys autistic? Can you even do a damn google search?

GTX 950 TDP is 90 watts......
>>
>>52229820
Fuck yeah, Tahiti GOAT.
>>
Will this give me a competitive advantage in professional eSports gaming?
>>
>>52230563
Less heat generated so you can keep your fat ass cool and prevent the mouse from slipping out of your sweaty hands.
>>
>>52230517
Can you read? It's in the small print
>>
>>52230563
Yes, pair it with your 144hz monitor for best result, goy.
>>
File: 1408585355687.png (13 KB, 150x222)
So many slides, and yet not a single bit of useful information besides:

>Will be bettar then 28nm hurhurhur

Might as well be WCCF

No shader counts, memory bandwidth, memory capacity, ETC ETC
>>
>>52230231
1.0/1.1/1.2 is mentioned on driver information releases, IIRC.
>>
>>52230666
go away turbonerd
>>
>>52230517
They did a live demonstration with regard to power consumption. Stop being retarded.
>>
File: 1434657915009.png (19 KB, 600x463)
>>52229861
Pitcairn and Tahiti will live forever.
>>
>>52230794
>rapeplay
what
>>
>>52230822
oh, it's a joke graph

nvm
>>
>>52230794
>Rapelay is the only game that actually exists
>>
>>52229523
On Windows this hasn't been true for almost 3 years now. There has literally never been anything wrong with AMD video drivers in the time I've been using them
>>
>>52230927
this.

I've been using Nvidia for over a decade, and especially the early 300-series drivers were a shit show: constant crashing, plus that driver that fried GPUs.

heck, I had a laptop from 2008 where the nvidia GPU would overheat so badly it melted its own solder; it did this repeatedly and I had the mobo replaced 7 times (best buy warranty ftw)
>>
tldr when will the next cards be released?
>>
File: rapelay.jpg (82 KB, 1024x768)
>>52230834
but the game is real
>>
>>52229883
>Highend GPUs - TSMC
No, they wrote them off completely.
>>
>>52231012
summer most likely
but nvidia is somewhat silent about pascal recently, and TSMC has its hiccups; one can hope for a 300 series/20nm disaster again
>>
>>52227357
>all that fast H.265/HEVC encoded animu!
>>
>>52227470
if the cpu consumes 40w then it's 46w vs 100w, which is more than 2x the perf per watt of maxwell and more than 3x the perf per watt of hawaii
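(That arithmetic, with the 40 W of CPU-plus-platform draw as a hypothetical figure shared by both test systems:)

```python
# Card-only perf/watt ratio after subtracting a hypothetical 40 W of
# CPU/platform draw common to both demo systems.
platform = 40                  # assumed CPU + rest-of-system draw
polaris_card = 86 - platform   # 46 W
gtx950_card = 140 - platform   # 100 W
print(f"{gtx950_card / polaris_card:.2f}x")  # 2.17x
```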
>>
>>52227487
Are you saying the 2GB of low-clocked GDDR5 consumes more than 50W? Cause that is retarded.
>>
>>52227666
because if they put an AMD cpu in there you'd be looking at 180w vs 240w and it looks way less impressive
>>
>>52227487
Low end cards will be gddr5
>>
>>52229140
>pascal
>28nm
Step it up m80
>>
>>52236050
>GTX 950
>pascal
>>
>>52236050
are you retarded?
>>
>>52227357
>i7 4790k
>DDR4-2600
It might just be me but something doesn't add up
>>
>>52227487
The showcased card is probably GDDR5, as they said the low-tier cards will still be GDDR5. Also, it's the total power consumption, so it should be safe to say that the card consumes 50-60% of the power of the 950.
>>
>>52236732
>Also, it's the total power consumption
It has an i7-4790K CPU, which alone has an 88W TDP. It can't be the total power consumption.
>>
>>52236792
I doubt it ran with every core at 100%
>>
>>52236792
No. Its TDP is 88W, but that's only if it's running at 100%. I doubt it ran at 100% with a low/mid-range VSync-capped card.
>>
>>52236792
Read anandtech; both systems' wall wattage was measured from the mains in front of the press.
Before that he showed the 120mm^2 chip that AMD is using.