Nvidia Pascal GP104 GPU Leak

You are currently reading a thread in /g/ - Technology

Thread replies: 255
Thread images: 21
File: Leakity leak.jpg (86 KB, 635x523)
Have you seen this?
Have you heard about this?

>NVIDIA’s next-gen Pascal GP104 has once
>again been leaked but this time we are looking
>at the full fat version which will be featured on
>the fastest GeForce graphics card that will be
>available to consumers in the coming months.
>The latest Pascal GP104 GPUs will be built
>on TSMC’s industry leading 16nm FinFET
>process which will allow NVIDIA to drastically
>increase the transistor count of their GPUs
>while retaining a small and efficient die
>package.

http://isitvr.com/2016/04/24/gp104-gpu-leak/
>>
Fuck off.
>>
>>54201711
b..but new GPUs Anon kun
>>
>No Hardware Async
>Will get shit on by Polaris
If nvidia doesnt improve by Volta i dont know what will happen.

Fucking shit drivers too
>>
Are they just gonna be more "efficient" cards ?
Oh god im gonna be so pissed if that happens.
>>
>>54201784
Nothing will happen. Nvidiots will keep buying Nmemedia but lots
>>
IT'S OVER, AMD IS FINISHED & BANKRUPT
>>
>Nvidia house fire meme
>AMD no driver meme
>>
>>54201627
Does it have hbm2? If not you can fuck off
>>
>>54201627
Ok, this is good info... but what's the clock speed on those? We're seeing roughly 1200mhz on 9xx cards on average..
So what's 10xx card's clock speed?
>>
>>54202145
kek'd hard bro, I like being in on memes.
>>
>>54201962
i fell for the nvidia meme before but at least they were ok cards back in the day.

Never again though ive been burnt too many times.
>>
>>54202332
no chips until at least Q4 will have high bandwidth memery
>>
>>54202609
>Clockspeed
>important
So what? My 770 runs at 1215mhz and it's still slow as shit because of no hardware async and a bunch of other factors.

Nvidia GPU's are just cheap shit charged at 2x what they are worth
>>
There is literally no incentive to buy an AMD card over a similarly priced Nvidia card. The only reason AMD exists is so that there's "competition" but they can't put out good products. Cute that you retards think AMD will be able to compete with Pascal. This is even cuter than your Zen memes.
>>
>>54203148
>implying i can't work out how to put a fan in my computer to stop it running at the temperature of the sun.
gee it feels like the red and green shills are on overtime this weekend
>>
>>54202722
There's a far higher risk of getting burned with AMD, both literally and figuratively.
>>
>>54203216
But anon, I never said anything about temps, why are you such an autist?
>>
>>54201784
>Async being important

Topkek, it's just a handful of games that even make use of it, due to it being so difficult to implement.
And just to gain about 5% more performance, as stated by developers.

Nvidia doesn't care about such marginal gains.
>>
>>54203287
because almost everything else is an incentive; AMD to Nvidia is the strange switch.
Nvidia: better driver optimisation for the 2-3 years a card stays relevant.
more stable and reliable, possibly because of better operating temperatures
usually higher cost for the equivalent card

AMD:
Better spec hardware, so once the super driver support dries up you still get better performance (so better ageing)
Usually a little cheaper
better computational capabilities
but if there is a bad series you'd do well to skip it

so shillary, if there are no benefits, explain the improved ageing of amd and the increased compute
>>
>>54201627
>Polaris 11 will be an R7 460X at best, a wimpy GTX 1050 competitor
>Polaris 10 will be an R9 480X at best, will barely compete with the GTX 1070 at the top end
>no R9 490/X to take on the GTX 1080
>still using GDDR5
AMDone, amirite ;^)
>>
>>54203525
>I know nothing but will pretend because I'm paid to :)
>>
>>54203525
AMDefunct, AMDestroyed, AMDead etc
>>
>>54203344
>DX12
>Handful of games
Even DX11 games run better and the gpu's cost a shitload less than nvidias crap

How blind can you be? This is literally 2009 all over again; it will take nvidia until 2018 or later to fix this
>>
>>54201627
>Why did you
>quote it
>like this
>you mongoloid
>>
>>54203542
Let's look at the facts, shall we
>Polaris 10
>40 CU 2560 Stream Processors
>sub 1000MHz core clock

>GTX 1070
>at least 2300 CUDA cores (and CUDA cores are generally stronger than AMDead's Stream Processors, see GM200 vs Fiji for details)
>rumored to have an even greater base frequency than GM204, so greater than 1400MHz core clock
Not looking good for AMDead
>>
>>54201627
images were leaked here like 2 days ago.
Don't live in that hole, it's dangerous with a weather like this.
>>
>>54203666
>>sub 1000MHz core clock
>pulling facts
>out of my ass
it will likely be 1300mhz+
>>
>>54203687
And run at 90+ degrees again, which is supposed to be "perfectly safe".
>>
>>54203696
So? A lot of nvidia cards run at 80 degrees+

GPU's run HOT
>>
Async compute is irrelevant

Even the Hitman dev writes in a GDC slide it's not worth it, lots of work for very little performance, 5%

What a good laugh
>>
File: 1458130322628.png (23 KB, 400x400)
>>54203714
>Async compute is irrelevant
>Literally every DX12 game on nvidia stutters like shit
Its literally the foundation of DX12 but ok sure it doesnt matter because nvidiots cheaped out and fucked up their designs and made GPU even worse than kepler
>>
>>54203687
Nope. A Polaris 10 chip was caught in a benchmark with a sub-par 800MHz core clock. That means that its most powerful version will only have between 100-200MHz more, at best.
At worst, that 800MHz IS its limit.
Meanwhile, Pascal is designed to run at much higher clockrates than Maxwell.
>>
>>54203707
Sure, but those NVidia cards are reference cards.

Now there's AMD non-reference cards too, but they need coolers literally twice the size of NVidia cards to run at the same temperatures because they draw twice as much power.
>>
>>54203734
Nigga please
Async is a one-time feature that AMD has been pushing for DX12 because they have nothing else to stand on once Nvidia starts optimizing their hardware for DX12.
Literally no one is going to be using Async computing when it will harm over 80% of their customer base.
>>
>>54203737
Still doesnt have Hardware ayysync
>muh clocks
not going to matter since nvidia cards are still single threaded most of the time
>>54203741
So? Most serious builds already have decent cooling.

It's not rocket science; if you want performance, cooling is the first thing you want.
>>54203750
>Nvidia
>optimizing hardware for dx12
>hardware
>dx12
HOW?

You can't do Async Compute in software, it runs and scales like absolute shit. Haven't the numerous DX12 games already demonstrated this?
>>
RIP AMD
>>
>>54203764
>rip nvidia
>>
>>54203714
Meanwhile the consoles are encouraging devs to implement async processing (the ps4's gpu can even directly address the cpu l2 cache)

Its also a huge part of dx12
Nvidia is being left behind
>>
>>54203759
>Still doesnt have Hardware ayysync
Not even relevant for DX12. IT'S JUST A FEATURE THAT DEVS WOULD HAVE TO SUPPORT, NOT A MANDATORY FEATURESET FOR DX12. Get that through your tiny, AMD-addled brain.
>You cant do Async Compute in software it runs and scales like absolute shit
Then why is a 980Ti still kicking a Fury X's ass when it comes to the most recent DX12 benchmarks, AotS and Hitman? Looks like software async is plenty powerful enough as it is. Wait until they start refining the firmware.
>>
>>54203778
Shh they think 'drivers' will save them.

There is a reason nvidia havent jumped on vulkan/mantle and its because their gpu's wont be hardware async until volta
>>
>>54203789
Go look at quantum break and ashes of the singularity benchmarks then

Go look at hitmans or any other DX12 game or even Vulkan

nvidia gets rekt
>>
http://www.extremetech.com/extreme/213519-asynchronous-shading-amd-nvidia-and-dx12-what-we-know-so-far
>Final result (for now): AMD GPUs are capable of handling a much higher load. About 10x what Nvidia GPUs can handle. But they also need about 4x the pressure applied before they get to play out their capabilities.

So basically NVIDIA is fucked from the get-go and AMD is only going to get better over time.
>>
>>54203803
...Until Nvidia releases a new driver where *gasp* the 980Ti once again stomps the Fury X. It's a constant cat-and-mouse game, but Nvidia always claws ahead once they put the time to further optimize the drivers.
You AMDniggers keep using outdated benchmarks where the 390X kept up with the 980 Ti, but those days are months behind us.
>>
>>54203848
>paying 2x for 5fps more because of 'drivers'
>'drivers' being nvidia bribing the devs to nerf amd cards
WEW
e
w
>>
>>54203789
>Then why is a 980Ti still kicking a Fury X's ass when it comes to the most recent DX12 benchmarks, AotS and Hitman?
It isn't.
>Not even relevant for DX12. IT'S JUST A FEATURE THAT DEVS WOULD HAVE TO SUPPORT, NOT A MANDATORY FEATURESET FOR DX12. Get that through your tiny, AMD-addled brain.
Why wouldn't they? Async Compute hasn't even reached full fruition, it's a hardware feature that can be taken advantage of in both consoles and PCs. Why would we continue to use scalar, single pipeline compute which requires context switching to run seemingly "simultaneous" rendering? Explain how this technology is inferior to nVidia's by design in any way.
>>
>>54203864
>wasting $400 on a rebrand of a housefire card that's going to die within a year from heat stress
Enjoy your money "well" spent
>>
>>54203778
This, devs are lazy so they are going to optimize for amd hardware. Nvidia has nothing to stand on this generation except their drivers that kill cards and brick screens. Their cards are so locked down to dx11 gaming they fall apart when hit by new standards.
>>
>>54203923
>It isn't.
Prove it
With benchmarks from this month.
>>
>>54201627
OMG OMG ITS A NEW CHIP AND ITS MADE BY NVIDIAAAAAAAAAAAAA
I LOVE THIS COMPANY <<<<<<<<3
TRUE GAYMERS UNITE :^)
>>
>>54203992
http://www.hardocp.com/article/2016/04/01/ashes_singularity_day_1_benchmark_preview/4

https://www.youtube.com/watch?v=VbodVlc8htA

Not a lot of benchmarks from April but Google is your friend. You should probably do your research.
>>
AMD is for poorfags who don't change their GPU(s) every 2 years.
>>
>>54201784
I honestly think the Nvidia cards will be faster again. While the AMD cards will be more power efficient this time.

AMD has been shouting so much about performance per watt that I'm thinking the cards might disappoint in raw power compared to Pascal
>>
>>54204059
>Quantum Break
Really? You're using that buggy, unoptimized piece of shit game as a benchmark? WEW LAD
>AotS
>AMD SPONSORED GAME
>NO GAMEWORKS
Hmmm, I wonder why the 980 Ti is falling behind...
>>
>>54204073
>only changing your gpu every 2 years
>>
>>54204101
Okay, sure, burden of proof is on you then.
>>54204097
I don't think that's a bad thing for AMD though, because GPUs are becoming more and more scalable. Also, GPUs tend to be the most energy guzzling part of your machine, so I think it's going to be an exciting time to build 4x CF flagship set ups with less than 1kW.
>>
>>54203794
Mate, nvidia has already added support for vulkan, but if Pascal really has no async at all, I'll be switching to Amd.
>>
>>54204155
You'll be back on the green team within 6 months.
>>
>>54204162
Why?
>>
>>54204162
enjoy your sinking ship and shit drivers friend. don't forget to buy from amazon or newegg so you can get a full refund for your pascal purchase.
>>
>>54204073
you're right, except my 290x is literally faster than when i bought it in January 2013... a card that operates at 95 degrees celsius will last four years, playing games at 1440p at their highest settings as well. AMD is a great company bruh, more than i can say about 780ti users, the card the 290x was originally competing against.
>>
>>54204172
Because Nvidia will always deliver more consistent and better support than AMD can afford. They spend the most money on making sure drivers work and release drivers far more frequently than AMD. AMD has only recently started optimizing for individual games, but it's not going to last. Meanwhile, optimizing for titles is part of Nvidia's core business strategy.
>>
>>54203750
>80% of customer base
What is xbone
What is ps4
What is Nintendo
What is phones
What is Intel igpu

Nigga, you dumb
>>
>>54201784
I would go for amd but i want to save money on my electric bill. Amd cards use too much power just to stay in the game.
>>
File: poo.jpg (16 KB, 300x206)
Why does every one of these threads turn in to a shit flinging contest?
>>
>>54204308
DESIGNATED
SHITTING
THREADS
>>
>>54204206
>your right, except my 290x is literally faster than when i bought it in January 2013

That's only a testament of how long it takes AMD to get their drivers in order.
>>
I wonder if we're in for another Geforce FX/Radeon 9800 situation
>>
File: 1434824294009.jpg (117 KB, 774x809)
>Every leak shaves off widely touted feature sets from both AMD and Nvidias new GPU lineup

This shit is the new incremental trash that will "revolutionize PC gaming?

We went from substantial performance gains to partial to practically incremental and these products are not even out yet.

I am tired of the constant back-and-forth from fervent supporters on both sides who are defending what have amounted to be overblown press releases and marketing.
>>
See AMD's master plan. Nvidia are finished. FINISHED!
>>
>>54205368
We are seeing the same thing happening with GPU's as with CPU's. The game is now about making smaller, more power efficient processors and saving money on silicon die area. AMD are going to make dual GPU's (to begin with) using smaller chips and, in conjunction with Vulkan and DX12, take over the market from the top down (they hold roughly 75% of the market because of consoles, compared to Nvidia's PC and business high-end strategy).
>>
>>54205368
I would say having a single gpu that can drive games at 4k high settings would be quite a big leap for pc gaming.

The 980ti came close so the top dogs of this generation might finally be able to do that
>>
>>54203631
>amd
>shitloads less

No. Every nvidia card is significantly cheaper than the amd competitor over here. Burgerfat amerilards think they live in the only country in the world.
>>
>>54204269
>consoles
>mobiles
>discrete gpu market

Found the retarded ayymdrone.
>>
>>54206354
>discrete gpu market
Oh, you mean the 15% of desktops, themselves 12% of the gaming market?

You mean that 80% of 5% of the total market? Yeah, nah, nigga you dumb
>>
>>54206444
>b-but

Damage control harder.
>>
>>54203714
>Even the Hitman dev writes in a GDC slide it's not worth it

Got a source on that?
>>
>>54206479
I'm failing to see when the goal posts were defined as only including discrete graphics on Windows desktops and high-end laptops, you insufferable chode.
>>
>>54206444
>le amd owns the whole market meme

Amd is getting btfo left right and centre by nvidia and Intel. Nvidia owns 80% of the discrete gpu market and there are more people using Intel graphics than every generation of console combined. Your >le amd owns the whole market because consoles xD meme is factually incorrect. Go back to watching more adoredtv delusion.
>>
>>54203666
>GM200 vs Fiji for details

Doesn't Fiji destroy GM200 in compute tasks? It only sucks in games because it did not have enough ROPs to keep up with the shaders.
>>
>>54206556
If you'll read here >>54204269, I explicitly mention Intel. Try reading next time.
>>
>>54206556
>Nvidia owns 80% of the discrete gpu market

Meanwhile, AMD owns 100% of the console market and 95% of the VR market.
>>
>>54201627
Smartphones SoC's are already at 14nm FinFET
> Not impressed
>>
>>54206591
>AMD owns 100% of the console market
>what is the ps3
>what is nvidia RSX

there have been more ps3's manufactured and sold than both next gen consoles put together. so no, amd doesn't own 100% of the market. the ps3 still sells fairly well. even the 360 with its ibm chip sells decently.
>>
>>54206568
It doesn't matter if Fiji does slightly better in compute tasks. What matters for a GAMING GPU is gaming performance, and even a cut-down GM200 will monster a full Fiji.
>>
>>54201784
>async being important outside of 5 games on dx12
>dx12 being important literally ever

i know as an amd fag it's all you have to cling on so i won't pull the raft from underneath you, but eventually you're going to have to swim.
>>
>>54206867
>but eventually you're going to have to swim
Yeah, when AMD finally goes under.
>>
>>54206867
Async is gonna get real important when EMA becomes normal. It's also gonna be important on constrained devices like phones and igpus, which need all the help they can get. Async is also already supremely important in VR.

Finally, async is an optimization path that will eventually become the norm. Pretending to support it by stalling the pipeline and switching the program like nvidia does not only isn't async, it severely degrades performance. Nvidia would simply be better off saying they don't support it in current hardware, which would allow their gpus to perform as best they can, but would limit the capabilities for VR.

But you'll never see nvidia admit to not being able to do something. They always turn to software to support their unimpressive architecture
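
To put toy numbers on that stalling argument (every figure below is made up purely for illustration, not measured on any real GPU):

# toy timing model, made-up numbers purely for illustration
gfx_ms, compute_ms, switch_ms = 12.0, 4.0, 1.0
serialized = gfx_ms + compute_ms + 2 * switch_ms   # stall the pipeline, context switch in and out
overlapped = max(gfx_ms, compute_ms)               # ideal async: compute hides under the graphics work
print("serialized:", serialized, "ms   overlapped:", overlapped, "ms")

Same work, but the serialized path pays the compute time plus two switches on top of the frame instead of hiding it.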
>>
>>54203737
>flashbacks to the Pentium 4 shilling again
>but it's 3.8ghz!!!11!!
>>
>>54206980
>async
>becoming the norm

no. devs don't like async and they've come out and said this. async is very difficult to implement and the only games to do this so far are ashes and hitman, both of which are coincidentally sponsored by amd. no games other than amd sponsored ones will utilize it since it takes too much work, and devs already have too much work optimizing for 3 different platforms (ps4, xbox, pc). its literally a balancing act. async is such a bandwidth heavy process that it can prevent any form of AO or AA being used at all. its not some simple thing built into dx12 that devs tick a box to activate which amd fanboys have been led to believe by adoredtv.
>>
>>54206556
>80%
even by sales numbers it's only 70%, by install base it's 60%

>>54206654
Yeah and the RSX was a pile of shit, just a fuckin 7800GS slapped in there. ATI went with a prototype Terascale chip for the 360, and it showed in multiplats.
>>
>>54207099
This is like devs in 2004 talking shit about multithreading, and how it'll never ever happen.

Did you know that multithreading is also known as asynchronous threading?
>>
>>54207115
>RSX was a pile of shit

that's not the point. >>54206591 said amd owns 100% of the console market and was proved wrong. everyone knows the 360 was miles better than the ps3 in terms of literally everything anyway.
>>
>>54207147
Programming for several threads is much easier than programming for several thousands of threads.
>>
>>54207147
>This is like devs in 2004 talking shit about multithreading, and how it'll never ever happen.

games still don't utilize multithreading so they're correct in a sense. devs have too much work to do already so they're not going to optimize for i7's or other multi threaded cpu's. the same goes for async.
>>
>>54201784
>Polaris

''AMD demonstrated its “Polaris” 10 and 11 next-generation GPUs, with Polaris 11 targeting the notebook market and “Polaris” 10 aimed at the mainstream desktop and high-end gaming notebook segment''

http://www.amd.com/en-us/press-releases/Pages/press-release-2016apr21.aspx
>>
>>54204225
>"Because Nvidia will always deliver a more consistent and better support than AMD can afford"
>unironically saying this about the company responsible for the 3.5 GB disaster

>AMD has only recently started optimizing for individual games
Nigga, I've been getting individual gaming optimizations since I bought my first AMD card, a HD 4850, in late 2008. Where the hell did you get that retarded idea from?
>>
>>54207099
new anon to thread.

when you say devs don't like async, it's more than just the team behind the new Hitman?
>>
>>54207227
wow i read >>54207147 as hyper threading. my point still stands though. very few games actually utilize more than 1 or 2 cores hence why an i5 is the recommended choice for top gaming pc's and i3's are recommended for budget gaming pc's. it's all about that performance per core rather than core count.
>>
>>54207248
>Polaris 11 targeting the notebook market and “Polaris” 10 aimed at the mainstream desktop and high-end gaming notebook segment

AMD BTFO
>>
Anything below R9 390 performance paired with a typical heavy gaming consumption higher than 140 Watts is a massive disappointment.
>>
>>54207248
What the actual fuck?

So Polaris means fuck all for gaming?
>>
>>54207306

This is where most of the money is though right?

High end cards' volume is pretty low...
>>
>>54207406
I blame games like Halo for making games mainstream and causing the casuals to drive the reduced need for high end graphics because they're all console babies
>>
>>54207266
check out beyond3d. they have technical discussions about this stuff and many devs post there.

the devs there didn't favor it very much since its time consuming. if they don't need to then they won't. just like devs won't waste that extra time implementing gameworks if nvidia didn't require them to in the sponsorship deal. the same goes for amd.
>>
>>54207359
Polaris means fuck all, as is to be expected of AMD.
>>
>>54207482

What does mainstream mean in 2016?

1080p 60fps
1440p 60fps
?
>>
>>54207262

nvidia doesn't have any 3.5gb cards.
>>
>>54207611
iGPU
>>
>>54207611
It means 1440p 60fps in new games according to amd's chilli demo. This is some shills latching onto the mainstream meaning league of legends tier shit which runs on toasters. It's weird because people in places are referring to the 960/380 as entry level cards for gaming these days.
>>
>>54207611
1080p 30fps

Not kidding
>>
File: Polaris_11_10.png (30 KB, 663x534)
>polaris

WHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

http://wccftech.com/amd-polaris-10-desktop-polaris-11-notebook-gpu/
>>
>>54201627
>no HBM
>AMD already 50% up in the market

NSHILLDIA BTFO´D

HOW WILL THEY RECOVER
>>
>>54207611
950 and 360 tier cards or lower.
>>
>>54207689

Can we assume performance of Polaris 10 Ellesmere XT to be close to a current gen card using 275 to 375W?

Based on the 2.5x perf per watt improvement claimed by AMD.
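
Rough arithmetic behind that range, assuming (my guess, not anything from the leak) a board power around 110-150W for Polaris 10:

# back-of-envelope only; the 110-150 W board power is an assumption, not from any leak
perf_per_watt_gain = 2.5                 # AMD's claimed improvement
for polaris_tdp in (110, 150):
    print(polaris_tdp, "W Polaris 10 ~ a", polaris_tdp * perf_per_watt_gain, "W current-gen card")

Which is exactly how you get the 275 to 375W bracket.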
>>
>>54207890
>Based on the 2.5x perf per watt improvement claimed by AMD.
>believing poo-in-loo lies
kek
It will be OC'd 380X performance at best and maybe the same power consumption as a R7 360
>>
>>54208251
Judging by the fact that they actually uprated that from a general 2x perf/w improvement to 2.5x in the last year, it's probably fairly close to true. Lying about things like that to investors is rather frowned upon.
>>
>>54207771
>HBM2
>no HBM
>>
>>54208371
It wouldn't be the first time, though.

http://www.theregister.co.uk/2015/11/06/amd_sued_cores/
>>
>>54208251
So performance of a 250W chip at 100W?
Sounds good to me. 2.5x power reduction confirmed.
>>
>>54208800
lel, i hope the people suing them got blown the fuck out in court.
>DURR WE DONT KNOW WHAT A CORE IS
>LETS SUE AMD BECAUSE WE THINK THIS ISNT AN ACTUAL CORE

Literally just ask them to define what a CPU core is and watch them get fucking annihilated.
>>
>>54201811
Pretty sure 980 ti was both more efficient and faster than 780 ti.

Just like fury x was more powerful and faster than 390x
>>
>>54209938
>more powerful and faster

What did he mean by this?
>>
>>54210047
Gotta go fast
>>
File: GP104-200-A2.jpg (23 KB, 534x326)
>>54201627
rumors of a 300-310mm2 chip are all but confirmed
>>
>>54201784
>Fucking shit drivers too
kek

If their drivers are shit, I imagine what's your opinion of AMD always-late-to-the-party drivers.
>>
>>54201811
What it seems to me is that they're releasing mainstream cards with current gen's high end performance but low-end power draw and replacing the whole low-end line, then releasing high end cards with mainstream power draw later?

So, low-end power draw, high-end performance, sweep out all the old R7 cards
Later, release high-end power draw, extreme-end performance, replace all the R9 cards

Maybe even later, release some monolithic dies to replace the Fury set of chips but I don't think so. I'm expecting both teams to push harder for power draw reductions and maybe more towards 4k60FPS performance, but I think that DX12 and Vulkan will pick up most of the slack on that.

We're looking towards one of the biggest performance increases and power reductions in one generation of chips.
>>
>>54203737
clock rates aren't everything, and having a bad core that clocks to the moon is not a good thing.

>pentium 4
>bulldozer
>>
>>54203789
>why is 980 ti still kicking fury x ass
Only in GoW iirc, other games even put a 290 within spitting distance of the 980 ti
>>
File: 61846.jpg (132 KB, 768x432)
>>54204101
>no gameworks
>I wonder why no cards are getting gimped for insignificant gains in visual detail

Hairworks might look good, but it takes a very keen eye and top tier display to see the difference between 16x and 64x tessellation.

I personally don't see much difference going from 8x to 16x 2bh. Only reason to push 64x tessellation must be gimping.
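
Rough idea of why 64x smells like gimping: triangle count per patch grows roughly with the square of the tessellation factor (ballpark rule, ignores culling and adaptive factors):

# triangles per patch scale roughly with the square of the tessellation factor (rough rule)
for factor in (8, 16, 64):
    print("factor", factor, "-> ~", factor * factor, "triangles per patch")

So 64x pushes around 16 times the geometry of 16x for a difference you can barely see.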
>>
>>54211426
maybe if the physics ("gameworks") effects were offloaded to a dedicated unit like the old physX days I could handle it.

The only thing Nvidia got right with Maxwell was global voxel illumination, except for the sub-pixel tessellation. I don't care for anything else; hairworks is ahead of its time in accuracy, believability, and required resources.
>>
>>54211426

Is Gameworks just a blur filter?
>>
>>54211494

Supposedly it's a software physics rendering module for things like hair, wetness, etc.

But because of the way it actually works, it's just a way for nVidia to keep Gameworks games only running on nVidia cards because AMD cards will get between 20-45% lower frames on the same game with stuttering and dropped frames.
>>
>>54211654
what the fuck
>>
>>54211720

The fuck got removed?
>>
>>54206980
VR is falling well below projected sales. It was a gimmick, a cool gimmick, but still a gimmick. I will continue to buy the top tier NVidia cards until AMD releases a card superior in every single aspect including drivers and quality.
>>
>>54211874
Yeah, I don't know either what this was doing here.
>>
>>54207689
REEEEEEEEEEEEEEEEEEEEEEEEEEEEee

ITS A 390 WITH 14nm

2 months of waiting for nothing.

2017-2018 it is then, i still need a new gpu

guess im going back to the yearly gpu upgrade meme
>>
>>54211874
Some stupid weeb shit that some retard accidentally posted after he got BTFO for not posting a source on his bullshit.
>>
Tempted to get a Fury Nano but 4gb isnt enough vram
>>
File: 1389595308853.jpg (87 KB, 588x437)
>>54211904
>>54212042

Fair enough.

Anyway, back to the topic.

Jesus Christ, people. Both nVidia and AMD have problems. They fling shit at each other and get a boner when their fans throw shit at opposing fans.

Got money to spend? Do your fucking homework, buy the best card. I literally give not one fuck what goes in my PC. If it runs, great. If it runs well, even better. I just want my game to look good and I have $XYZ to spend. If that means I need AMD's XYZ priced card, then I'll buy AMD. If that means I need nVidia's XYZ priced card, then I'll buy fucking nVidia.

To say one is inherently better than the other is just flinging more shit. AMD is a housefire waiting to happen. nVidia is fucking your wallet straight up the seam. You're never going to completely win one way or the other.

Just shut up and buy a goddamn card if you're gonna. The significance of Pascal ahead of launch is literally "This is moving the industry forward."

>mfw
>>
File: 1457223954420.png (1 MB, 557x605)
>>54201627
>NVIDIA announces it's using FinFET
>Massive performance increases inbound; wow this will be amazing; AMD is finished lol!

>AMD announces it's using FinFET
>Who cares, it's going to under deliver; AMD has no market share and they're going to lose even more; FinFET is a shit process and they're only using it because they're stuck with global foundries
>>
>>54212119

Even the nano has HBM, so that 4GB is more than enough to do 4K on
>>
>>54212119
for memory there's two things to think about:
1. how big is the cache
2. how fast is the cache

You can (almost) make up for having half the cache if it's twice as fast, and HBM is /very/ fast. Effectively, you need to see it as 6-7GB of GDDR5.
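
Rough peak-bandwidth numbers if you want them, using the commonly quoted configs for Fury-class HBM1 and a 980 Ti-class GDDR5 setup (the 6-7GB equivalence above is just a rule of thumb, not a real conversion):

# peak bandwidth = bus width x per-pin data rate / 8; both configs are the commonly quoted ones
hbm1_bw  = 4096 * 1e9 / 8 / 1e9     # 4096-bit bus at 1 Gbps/pin (500 MHz DDR)  -> 512 GB/s
gddr5_bw = 384 * 7e9 / 8 / 1e9      # 384-bit bus at 7 Gbps/pin                 -> 336 GB/s
print(hbm1_bw, "GB/s vs", gddr5_bw, "GB/s, ratio", round(hbm1_bw / gddr5_bw, 2))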
>>
>>54212213
im only going to be doing 2.5k
>>54212270
B-but muh AA and textures
>>
Is this a good deal?

$765 posted is like what $600usd lol

https://www.pccasegear.com/products/34559/sapphire-radeon-r9-nano-4gb
>>
File: amd mistake.png (284 KB, 2068x1460)
>>54212119
4GB is the biggest issue like 3.5 was with the 970.
>>
>>54212460
NOPE

Waiting for the 8gb and 16gb versions
>>
Nshitya
>>
>>54212492
>8gb and 16gb versions of HBM1

Thats impossible
>>
>>54207689
Seriously, fuck Polaris.
>>
>>54212460
>3.5gb meme

This isn't an issue anymore.
>>
>>54212695
@1080p it's not an issue.
@4K it is with a few rare games.
>>
>>54212731
It's still an issue at 1080p

Gta v uses 4GB, shadows of mordor uses 6GB, at max settings 1080p.

There are other games but I can't remember them all off the top of my head
>>
>>54212782
Even if you hit 3.5, modern games have adjustable VRAM hardcaps and failing that, just lower some damn settings.

It's a non-issue.
>>
>>54212782
>970
>suck with gta V

Rrrright...
>>
>>54212460
>asscreed
still, looks like >>54212270 was pretty much right, works about like having 6-7GB of GDDR5.
>>
>>54212867
>far cry 4
>gta 4

Why doesn't Polaris use HBM1?
>>
>>54212899
Cost, probably.
>>
>>54212654
Why?

Its already in the leak
>>
>>54212867
>because the vram is faster, you need less vram
>>
>>54201784
>Will get shit on by Polaris

Kek, you AMD drones said the same thing about the Fury X.
>>
>>54212966
>HBM1
>+4GB
Stop ridiculing yourself, only HBM2 can...

>Its already in the leak
sauce
>>
>>54212899
Cost to implement + Yield problems
You need to get yields from the GPU die + the VRAM dies + the MCM you mount them to. 14nm is brand new for GPUs, the yields aren't going to be there yet. HBM has been around for a little longer, but still has yield issues.

>>54212980
Because the VRAM is faster it can do transfers in and out of memory faster, swapping unused shit for active shit faster. Not quite as good as just having the memory there, but better than having much slower but more VRAM. If you have twice as much memory but it transfers at less than half the speed, you might as well not have bothered putting more of it on in the first place.

I take it you've never read up on memory structures and interfaces? This is pretty basic stuff for CPU cache structures, and extends pretty easily to other memory structures as well.
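
As a toy comparison of that trade-off, here is how long a full refill of the frame buffer takes at peak VRAM bandwidth (back-of-envelope only, and it ignores PCIe, which is the real bottleneck when swapping from system RAM):

# time for a full refill of the frame buffer at peak VRAM bandwidth (toy numbers)
for label, size_gb, bw_gbs in [("4 GB HBM1", 4, 512), ("8 GB GDDR5", 8, 336)]:
    print(label, "->", round(size_gb / bw_gbs * 1000, 1), "ms")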
>>
>>54212782
https://youtu.be/COEwgUI6D3w

https://youtu.be/hEmQZCsXcV0

https://youtu.be/p6MDUjFiUR4

It really isn't an issue. You can exceed 3.5gb without any consequence 90% of the time. It's only if a game remaps all the textures/resources when over 3.5gb vram that you'll see stuttering because of the slower bandwidth. People seem to think that you can't access the last 0.5gb at all when in actual fact you can 100% of the time. It's only when the contents of that last 0.5gb vram change drastically will you see any problems. It really is a non issue for the most part and will almost never affect you because only a select few games actually use more than 3.5 gb vram at 1080p with a respectable framerate. Some people actually try to exceed the limit but this normally results in them turning up everything to max settings which ends up tanking the fps to like 20 fps and then they stupidly blame the vram for the low fps. Hell, even some retard kid on YouTube ran 2 games at once on the same pc and was getting massive stuttering because of the cpu being unable to handle both games and he blamed it on the vram.

A lot of people choose to be uneducated on the subject either because they don't care due to their love of shitposting memes or they actually choose not to because they love an opposing company too much so they delude themselves into thinking everything said about nvidia (or any company for that matter) is true. I see this shit all the time in the console wars on /v/.
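
Crude sketch of why spilling past 3.5gb mostly doesn't hurt, using the widely reported ~196 GB/s and ~28 GB/s partition figures for the 970 (the even-spread access assumption is mine and is close to a worst case; real drivers keep the hot data in the fast partition):

# weighted-average bandwidth if accesses were spread evenly over allocated VRAM (crude assumption)
fast_bw, slow_bw, fast_gb = 196.0, 28.0, 3.5      # GB/s, GB/s, GB
def effective_bw(used_gb):
    if used_gb <= fast_gb:
        return fast_bw
    slow_frac = (used_gb - fast_gb) / used_gb
    return (1 - slow_frac) * fast_bw + slow_frac * slow_bw
for gb in (3.0, 3.5, 3.8, 4.0):
    print(gb, "GB used ->", round(effective_bw(gb), 1), "GB/s")

Even at a full 4gb the blended figure only drops about 10%, which lines up with the videos above.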
>>
>>54213389
Wrong mirrors edge video

Meant to post this:
https://youtu.be/IW-H0qNw38M

The vram is over 3.5gb most of the time
>>
>>54201742
Fuck off, nobody gives a shit.
>>
>>54202145
AMD has a driver now? Holy shit. That's news.
>>
>>54213389
I have owned a 970 and I never once had an issue. Beautiful card, handles everything I throw at it. Using the nvidia game application whatever it's called automatically sets the game settings to something reasonable, never really had to worry.

I managed to sell it, and middle of last year stuck two 980 ti's on an 800 W psu, I don't see myself needing anything new in the next year or two. The 4790k is still going strong, all in all I'm pretty glad I fell for the Intel & nvidia meme. Even fell for the SSD meme as well, 3 in a raid 0 config, barely anything takes more than a second.

All the things /g/ told me to stay away from have really worked out great, I don't know what everyone else's problem is besides not having a few thousand bucks to spend. Last I checked I can sell the GPU's and recoup 75% easy.
>>
>>54201784
They won't, nvidia has proven they don't give a single shit about the mainstream consumer market when they've made so many mistakes in the past decades. The only reason why nvidia is still a thing is because they completely BTFO AMD in the professional market, where CUDA is better than GCN.
>>
>>54212997
See, you buy AMD you always play the waiting game.

Doesn't matter, you buy your shit, you have to wait for it to start working.
>>
>>54201627
Nigger, do you even know how to greentext?
>inb4 newfag shill detected
>>
>>54211948
Get a 980ti and quit worrying about the future.
>>
>>54213688
that's because the mainstream just buys consoles.
>>
>>54213736
>the mainstream buys consoles

that and the fact that nvidia literally cannot run directx 12 no matter how hard you try. it can't use any of the features like asynchronous compute, shaders, any of those. They also lied about it instead of flat out saying they can't do it. I don't know how you can possibly support a company that lies to you time and time again like this, it's like nvidia users love to be cucked on a daily basis.
>>
>>54213732
Yeah man, let's get a card that can't do 60fps on UHD or at least 120fps on 1440p. Why get a middle ground 970/390 that can run 1080p fine, when I can buy a card that's practically double in price?
>>
>>54213813
if all you want is cheap 1080p buy a console
>>
>>54213851
>cheap
>consoles

consoles now cost about the same as a 1080p pc (at least here). I'd rather get the pc, since you can do more than just play games on it.
>>
>>54213851
You fucking moron.
>>
>>54213877
well a console here costs about the same as gtx970, so it'd be cheaper
>>
>>54212199
>nvidia announces that maxwell will be extremely power efficient
>"i would never use amd.. it's too big of a hit on my power bill"

>AMD announces that polaris will be even more power efficient than maxwell
>"i just love nvidias cock in my mouth give me that 7.5gb senpai"
>>
>>54213648
Lesson 1: never listen to /g/

Everyone round here is a rampant amd fanboy who thinks Intel and nvidia are plotting a conspiracy to take over the world or some bullshit. Fucking War of the Worlds shit.
>>
>>54213789
Why would I support a company who promises to give me good price/performance when I can buy a gtx 970 for $40 (equivalent) cheaper than a r9 390? >muh amd da best price/performance meme is so untrue. Also why would I support a company who tried to jew me with all their bulldozer bullshit? I'm never going back to amd ever again.
>>
>>54202738
>because no hw async
You have no idea what you are talking about.
>my 770
yeah no only AMD shills bring up hw async
>>
>>54212695
It isn't?

Please explain.
>>
>>54214521
>who tried to jew me with all their bulldozer bullshit

??? If you're posting about the performance of AMD cpus it's only because AMD releases cpus at a much slower rate than intel does. This is good for some reasons and bad for others, it's just a different business model than what intel follows.

About price/performance, AMD wins in this criteria hands down. Aside from the fact that AMD cards consume way less power than nvidia cards do (maxwell consumed about 600w tdp as far as I remember, which is literal housefire tier), they also have more vram (very good for 4k or multiple screens) and support async compute (decent performance increase). That's a way better bang for your buck than a 3.5 gb card which is barely relevant today. Let's not forget about nvidia's colossal failure recently, the titan z. The card which nvidia promised to be the fastest card in the world, priced at $3000. Meanwhile it was actually outmatched by a lot of single gpu cards, and the r9 295X2 (priced at $1500) completely beat the titan z. Nvidia later on dropped support for this card too, if I remember correctly.

>I'm never going back to amd ever again

Ok fine, that's your choice. I'd rather side with a company that doesn't lie to me about its card specifications and actually respects my consumer rights. To each his own though.
>>
>>54201784
10 games currently use dx12.

10 bad games, dx11 will be a standard for a year more or so.
>>
File: power_average.gif (73 KB, 400x1227)
>>54214692
>About price/performance, AMD wins in this criteria hands down.

Except nvidia is way cheaper here. Not everyone is a burgerfat American. Try again.

>Aside from the fact that AMD cards consume way less power than nvidia cards do

This is top tier delusion now. You must be some hardcore amd shill to say that because no other amd shill I've ever encountered has said this. It's a given that maxwell is way more efficient than anything amd has had.

>they also have more vram

More vram? I didn't know the fury has more vram than the 980ti or titan x. I didn't know the 380 has more than the 4gb vram found in the 960.

The only cards to have more vram is the 390 and 390x which aren't even powerful enough to utilize more than 4gb vram since they're literally a refresh of the 290 and 290x which had 4gb vram. The only time it can use more is when it's cacheing resources. It can't effectively use 8gb vram in any situation.

See:
http://www.techspot.com/review/1114-vram-comparison-test/

Give up your blatant fanboyism and hate for everything that isn't amd. I've tried amd, Intel and nvidia and have come to my conclusion based on my own experiences. You quite obviously have based your views on your buyer's remorse and deluded amd ubershills like adoredtv who has been proven wrong every single time.
>>
>>54214865
>gtx 970 4gb

well memed.
>>
>>54214865
>apparently a gtx 780 ti consumes more power than a 980 titan x with 12 gb of vram

Who the fuck made up this bullshit, lmao.
>>
>>54214926
>implying vram has any relation to average power usage

Found the tech newbie.
>>
>>54214865
>More vram? I didn't know the fury has more vram than the 980ti or titan x. I didn't know the 380 has more than the 4gb vram found in the 960.

I thought we were talking about the r9 300 series? Nonetheless, fury has hbm which is way better than gddr5 but had a limitation of 4gb (it's fixed now, in hbm2).

>buyer's remorse

no, I've actually tried both. I just went amd this year and never felt happier.
>>
>>54214975
GDDR5 is almost 20% of the power draw of modern GPUs.

To say that VRAM doesn't have any impact on average power draw is just ignorant maximum.
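
Taking that ~20% figure at face value for a typical 250W-class board (both numbers are ballpark):

# rough split; the 250 W board power and ~20 % memory share are both ballpark figures
board_power_w, mem_share = 250.0, 0.20
print("memory subsystem ~", board_power_w * mem_share, "W, rest of the card ~", board_power_w * (1 - mem_share), "W")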
>>
>YOU NEED 999999999 GIGABYTES OF VRAM OR YOU CANT PLAY ON 4K

When will this meme die, holy shit. I've been using a 4gb card just fine for years now.
>>
>>54215019
Vram chips stay in a low power state until they're actually needed. No game would have pushed that titan x to use 12gb vram hence why it has similar power usage as the 980ti which is essentially a slightly cut down titan x.
>>
>>54214988
>I thought we were talking about the r9 300 series?

>Aside from the fact that AMD cards consume way less power than nvidia cards do (maxwell consumed about 600w tdp as far as I remember, which is literal housefire tier), they also have more vram

>aside from the fact AMD cards.... they also have more vram

No. You were clearly talking about amd as a whole here. There wasn't any mention of the 300 series.

>I just went amd this year and never felt happier.

It's funny you mention this because everyone I know who has gone amd this year regrets it fully. I'm honestly not joking, they are all upgrading to Intel and nvidia next time round. Amd just doesn't work well with the software they use. The shitty driver meme is actually real when it comes to complex software.
>>
File: 1448423384306.png (1 MB, 1360x1176)
>>54214865
>The 390 and 390X are really graphics cards we never wanted. At the time of their release the Radeon R9 290 and 290X were exceptional buys. The 290X cost just $330, while today the 390X costs around $100 more for no additional performance and it is no different with the 290 and 390.

>We see plenty of gamers claiming that the 390 and 390X are excellent buys due to their 8GB frame buffer ensuring that they are "future proofed," and well, that simply isn’t the case, as neither GPU has the horsepower to efficiently crunch that much data. Perhaps the only valid argument here is that the larger frame buffer could support Crossfire better, but we haven’t seen any concrete evidence of this yet.

holy shit ive been saying this for ages. ayymd are robbing their own customers with the new gpu

>pic related
>>
>>54215269
So if I'm doing 4k then 290x will be the same as the 390x?
>>
>>54215421
except the 290x will explode but yeah.

Save your money for the new Fury; the current one is handicapped by 4gb. No matter what /g/ says, a lot of games use over 4gb, especially on AMD, even at 2k resolution with AA
>>
>>54215421
Not him but yes they're literally exactly the same card. The additional 4 gb vram on the 390X is all marketing. As the article stated, it can't even use that 8gb of vram and uses at max 4gb vram, except in a single game which had no performance impact anyway. Save $100 and get a 290x.
>>
>>54215474
Shame polaris is still the exact same fucking card

God Nvidia and AMD are useless these days.
>>
File: Big-Pic-2.jpg (333 KB, 1600x801)
>>54215421
A 970 is miles better than a 290x at 1080p and has similar performance at 4k whilst being cheaper.

>inb4 old games and drivers

These games are from last year and still commonly used as benchmarks today. The drivers are from literally a month ago. I think there's only been one more beta driver released by amd and a couple by nvidia in that same time so if anything nvidia is the one at a disadvantage here.
>>
>>54215573
Actually I was wrong. The latest amd driver is 16.3 and the fury is seen to be using it in the pic. It must obviously be only for fury line up.

I think nvidia are on 364.72 so they've only released one driver since. That 970 is at a disadvantage here even though it's smoking that 290x. Who knows what improvements this latest driver could have provided.
>>
>>54215573
>Cherrypicked games
>BabelTechShills
>Project CARS
>Gameworks
>Not posting the version where the 390x outperforms the 290x at identical clocks because that would kill your argument
>>54215632
>16.3
>Latest driver
>>
>>54213813
Then there is no helping you, no product will ever be the right fit.
>>
>>54215632
The 361 branch is old as fuck.

The 364 branch put the GTX 970 on par as the 290X on Far Cry Primal already.

Also Fallout 4 had some issues that gave poor performance that have been fixed already.
>>
>>54201627

So is this the kind of thing that we buy and get excited about or watch with amusement?
>>
File: Big-Pic.jpg (589 KB, 1122x1828)
>>54215730
>Not posting the version where the 390x outperforms the 290x at identical clocks because that would kill your argument

>implying I wasn't the original poster of the 290x vs 390x babel benchmarks months ago
>implying the argument is about 290x vs 390x
>actually trying to justify someone buying a 390x which is well over $150 more than a 970 and 290x just to beat the 970 by 5 fps on average (kek)

>b-but cherrypicked gaems
Using the most popular games is cherry picking now huh? Why don't we use ashes of singularity since its sooo popular right? Wouldn't you just love that.
>>
>>54215906
>$150 more than a 970 and 290x just to beat the 970 by 5 fps on average (kek)
You are describing a 980. 390x is sub-$400 now and only 2fps slower than a 980, which still sells for close to $500.
>>
File: 1440p.png (149 KB, 1299x3103)
>>54216022
>2 fps slower

U wot? The stock 980 steamrolls the 390x in games that aren't amd sponsored (hitman, ashes) or the 2 recent dx12 titles as seen in >>54215906


Also there is a £20/£30 difference here between a 980 and a 390x and I'd spend that extra money for the better average performance and much better overclocking potential. I'd like to see anyone attempt an oc on a 390x that brings it to fury x levels of performance whereas a measly 1450+ mhz clock on a 980 brings it to around gtx 980 ti stock levels of performance.
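
Quick shader-throughput sanity check on that OC claim, using approximate stock boost clocks and FP32 FMA rate only (ignores the 980 Ti's extra ROPs and memory bandwidth):

# FP32 throughput = cores x 2 (FMA) x clock; a very blunt proxy for game performance
def tflops(cores, mhz):
    return cores * 2 * mhz / 1e6
print("980 @ 1450 MHz:", round(tflops(2048, 1450), 1), "TFLOPS")
print("980 Ti @ ~1075 MHz stock boost:", round(tflops(2816, 1075), 1), "TFLOPS")

Roughly 5.9 vs 6.1 TFLOPS, so the claim is at least in the right ballpark on raw shader grunt.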
>>
>>54216147
>Broken japanese console port
>Gameworks
>>
>>54215903
Watch with great amusement.

Its going to flop with no ayylmaosync
>>
>>54216191
>dark souls 3
>gameworks
>having any proprietary software at all

Is this the only excuse amdrones have left? If they get trashed in any benchmark all they scream is gameworks even if the game doesn't actually have it lmao. All signs of someone losing a battle I guess.
>>
>>54207689
From now on im calling it Poolaris.

Go Vega or go Home.
>>
File: r6siege_2560_1440.png (28 KB, 500x570)
>>54216242
meanwhile in games not developed by imbeciles
>>
File: acs_1920_1080.png (27 KB, 500x530)
>>54216324
>ubisoft
>not imbeciles

So you accept any performance from any ubisoft game then? They're not 'imbeciles' after all.
>>
>>54216147
>tfw still using 750Ti because it is still the most powerful card with passive cooling
>>
>same engine
>not gameworks sponsored >>54216324
>gameworks sponsored >>54216488
notice a pattern here?
>>
>>54216564
Are you actually this fucking stupid?

http://www.geforce.co.uk/whats-new/guides/tom-clancys-rainbow-six-siege-graphics-and-performance-guide
>>
>>54216147
>The stock 980 steamrolls the 390x in games that aren't amd sponsored (hitman, ashes) or the 2 recent dx12 titles as seen in

Maybe, just maybe it's because the 390x wasn't built to compete with the 980 in the first place, ever thought of that. Try comparing it alongside the fury x, which was meant to be AMD's competitor to the 980 series. What you're doing is the equivalent of me comparing a fury x against a gtx 480.
>>
File: 1441278936414.gif (646 KB, 512x481)
>>54216147
>680 which scores lower, placed above 2GB 960/380 which scores higher
>tested 4GB 960 but not a 4GB 380

And the kicker
>950 matches 680
>>
>>54216630
You could maybe not bring the example to such an extreme and simply say that the 290/X (390X) series was meant to compete with the 780/Ti series

Maybe.
>>
>>54216630
http://www.trustedreviews.com/amd-radeon-r9-390x-review

>The R9 390X is built using older hardware that's been upgraded, and is designed to take on the Nvidia GeForce GTX 980 – one of the most popular cards around.

http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/

>Expected to compete with the $480 GeForce GTX 980, the R9 390X is boldly priced at $429.

http://m.hardocp.com/article/2015/10/23/xfx_r9_390x_double_dissipation_core_edition_video_card_review_ready_for_edit/#.Vx3N9hnTXqA

>we will pit up against its green team competition of a reference 980

Every review you'll read about the 390x will tell you it was brought out to compete with the 980. And that's because what they're saying is true.

The 3xx series was brought out to compete with the 970 and 980 whereas the fury x was brought out to compete with the 980ti. The air cooled fury isn't a competitor for the 980. It's more expensive and was only brought out by amd because not everyone favoured the idea of having a water cooled gpu in their rig and is still essentially a competitor for the 980ti.
>>
>>54216692
>tested 4GB 960 but not a 4GB 380

Are you blind?
>>
File: 1445543391004.webm (351 KB, 606x340)
>>54216781
yes
>>
>>54216756
Except Fury is selling for 980 prices while performing significantly better so it is in fact the 980 competitor.
390x is now selling for $400 or even less, so it's not a direct competitor to the 980.
>>
>>54210279
950gtx here
honestly I'd prefer late-to-the-party drivers than drivers that kill processes

had to downgrade to 362.x because the newest one kept crashing games and stuff
and I'm not a beta tester
nvidia should get their shit together and test those fucking drivers on another channel, not the release one
>>
>>54216822
Here we go again with Americans and their prices. Not everything is priced like it is in America. A 980 here is equivalent to $130 less than a fury. The fury is even more expensive.
>>
>>54216873
The fury X is even more expensive*
>>
>>54216873
>>54216884
you act like it's our fault you agree to your politicians fucking you all in the ass for your "value added" tax, import tariffs, or whatever else you commie cucks bend over and take with a smile.
>>
http://www.guru3d.com/news-story/nvidia-pascal-gp104-gpu-photo-surfaces-and-shows-gddr5x-memory.html

AMD BTFO !!
>>
>>54216873
Sapphire Fury Nitro 449€ at multiple German dealers
Cheapest 980 I can find is 485€. Most are 500€+
>>
>>54216934
GDDR5X is not available in June or July. If they launch it it will be a pure paper launch.
>>
>>54216941
Cheapest 980 (zotac) is equivalent to $547.82 here and a fury (xfx) is $648.73.

Nvidia is much cheaper even in local stores here.
>>
>>54216985
If you are not in the USA or Europe you are irrelevant
>>
>>54216934
>gddr5x
>meanwhile, titan, which is their $1000 card is the only one to get HBM2
>vega is probably going to get HBM2 at $600

If nvidia doesn't try to competitively price their titan cards, it's over for them, nobody's going to pay a thousand dollars just for HBM2.
>>
>>54216992
I do live in europe. I just gave USD prices so amerilards like you can understand the pricing. Stop getting assblasted because amd's >'best price/performance ratio' is literally a meme here.
>>
>>54216996
They're predicting this chip to be the 1080 so that means the 1070 and 1080 will most likely have gddr5x since the 70 and 80 cards are always based on the same chip. If the next titan has hbm2 that pretty much confirms the 1080ti will have hbm2 since it'll be a cut down titan.
>>
>>54215068

I pass 4gb daily at 1440p.... I guess you probably have to turn down settings to play at 4k though so maybe you use less.
>>
>>54217069
sure I'd like to believe that, but knowing nvidia and how they just love gimping their customers for cash I'm pretty sure it's only the titan card. Nvidia doesn't really come off as the kind of company to be so nice as to put hbm2 in both their ti and titan cards. Especially since it's new technology.
>>
>>54217033
if you live in the EU you can just buy from the EU countries where it's cheaper, dumbfuck.
>>
>>54217129
Cheaper or not, I'd rather not buy electronics from Polskiland.
>>
>>54217129
>importing just to buy amd

I'll buy whichever is cheaper for me in the country I live in and thats nvidia. Across Europe the most popular card the 970 is generally cheaper than the 390 anyway.
>>
>>54217150
>Germany
>Polskiland

>>54217160
>Supporting local greedy merchants when you could be supporting superior German PC gamer merchants
>>
>>54217171
The 970 is still cheaper in Germany compared to the 390 though.
>>
>>54217171
German import + exchange rate doesn't make much difference famalamadingdong
>>
>>54217196
It's good to know germany is pro AMD.
>>
>>54217820
Where the fuck do some of you people live, mars or something? Nvidia was more expensive than amd everywhere I looked.
>>
>>54217171
>Supporting local greedy merchants when you could be supporting superior German PC gamer merchants

If I wanted to support Syrian doctors I'd go down to fucking Calais myself m8 get fuckt
>>
>>54214822
I remember there was a time when there was almost no dx9 games.
>>
>>54203789
Ahaha this mad nvidia fan.
>>
>>54203789
>Then why is a 980Ti still kicking a Fury X's ass when it comes to the most recent DX12 benchmarks, AotS and Hitman?

Because nvidia is throwing money at devs to fuck with AMD cards?

>wat is nvidia goyworks
>>
>>54203555
Three point five words
>>
>>54218468
ur a cheeky fuckn cunt m8 I'll giv u tht
>>
You can tell that an Nvidia launch is near because the shillbots are out in full force
>>
>>54217839
You're not looking hard enough. I've found nvidia to be cheaper everywhere I've looked, which also includes Germany and the UK. For the 970 vs 390 anyway. I haven't checked other cards.