IT'S HAPPENING - AMD Polaris 10 Pictured - NVIDIA Pascal

You are currently reading a thread in /g/ - Technology

Thread replies: 169
Thread images: 28
File: AMD_Polaris_10_Back.jpg (3 MB, 3024x2953)
http://wccftech.com/amd-polaris-10-gpu-pictured/

http://wccftech.com/nvidia-pascal-specs/
>>
>>53544096
>I did get to see AMD's Polaris 11 GPU running PASSIVELY while playing back 4K VR content last night. Pretty impressive.

Nice
>>
File: 1454781310577.jpg (112 KB, 630x576)
>>53544120
>pretty cool

That's more like incredible
>>
>>53544106
>twice the shaders
>>53544120
>literal insanity

This generation will be the most exciting of the decade. It's unfortunate that we aren't going to see widespread adoption of HBM, but the new node alone is insane
>>
>>53544106
Fake. There is no way the GP100 will have both HBM and GDDR5 versions
>>
>>53544152
>>53544120
VR is barely something more than video with detection of head movement

no?

so we live in a time where the integrated GPU on a CPU can do your 4K video playback; a discrete card should be able to do the bit of extra work to allow for head movement/turning
>>
>>53544190
>there's no way that video cards had GDDR3 and GDDR5 versions
you new?
>>
i want to replace my fire-starter 290 already

please hurry
>>
>>53544120
>>53544152
>inb4 pic related

>>53544096
>ib4 4GB HBM1
>>
>>53544196
You need to render at 2K minimum, and at least at 120 FPS, for decent VR (i.e. no motion sickness and an overall decent experience), and you need to render it twice
That's why VR is the new performance meme
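Quick back-of-the-envelope in Python on that, assuming 2560x1440 per eye at 120 FPS (the figures above) versus plain 4K60 playback; numbers are illustrative only:

# Rough pixel-throughput comparison (illustrative numbers only).
def pixels_per_second(width, height, fps, views=1):
    return width * height * fps * views

vr = pixels_per_second(2560, 1440, 120, views=2)   # "2K" per eye, rendered twice, 120 FPS
video_4k = pixels_per_second(3840, 2160, 60)       # plain 4K60 playback

print(f"VR:    {vr / 1e6:.0f} Mpixels/s")          # ~885 Mpixels/s
print(f"4K60:  {video_4k / 1e6:.0f} Mpixels/s")    # ~498 Mpixels/s
print(f"ratio: {vr / video_4k:.2f}x")              # ~1.78x, before any shading cost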
>>
https://www.zauba.com/import-baffin-xt-hs-code.html

>4GB

TOPPEST LELKEK, AMDPOORFAGS
>>
File: colorful-1.jpg (45 KB, 500x318)
>>53544226
>>
>>53544297

>3.5GB

You were saying, Nvidiot?
>>
>>53544106

>nvidia confidential slapped at the bottom

kek'd, at least fucking try
>>
>>53544096
finally a GPU that I can fit in my butt.
>>
>>53544096
if it's 4GB HBM1, don't want.
>>
>>53544785
every NDA has this shit
>>
>tfw just bought a r9 390 a month ago.

sigh.
>>
File: 1458241639970.png (652 KB, 600x600)
>>53544106
>How Accurate Are These Specs?
Any person with basic knowledge of chip design will tell you that this will NOT happen. Each GPU will only feature one memory standard, either GDDR5/X or HBM/2, not both. You will not see a GPU with two completely different memory interfaces supporting two completely different memory technologies. It will only support one or the other.
>>
>>53544871
skylake supports ddr3/ddr4

every nvidia series had new low end gpus that used gddr3
>>
>>53544871
>>53544190

*Maybe* GP104 and GP100 with GDDR controller and a GP100 with HBM controller with the same names.
>>
stopped reading at
>The GTX 1080 will allegedly
>>
>>53544152
Holy shit that's the most abstract rayquaza I've ever seen
>>
>>53544991
Yeah, I don't see why this shouldn't be possible.
>>
>>53544871
That depends on the profits they would extract by fragmenting the market. You have a point though; it costs more to have different ASICs.
>>
>>53544871
I've seen budget cards like the r7 250x offered with ddr3 and gddr5, so why not offer a budget gddr5 alternative in the high end for people looking only for 1080p?

In fact, the ti cards could be HBM while the non ti could be gddr5/x. That actually makes sense.
>>
File: 1458019772030.png (650 KB, 512x800)
>>53544766
it just works man!
>>
>passively cooled
You have to be a special kind of retard to think that this is relevant to a desktop graphics card. It's possible to play back 4K video with passive cooling, but there's no way in hell it's doing that while rendering a video game at 4K@60fps
>>
>>53544297
Isn't that the mid range one?
>>
>>53544798
There are going to be GDDR5X cards that have similar performance but consume more power. You aren't even going to need more than 4GB of VRAM until 8K gaming becomes a thing, and aside from that you could just turn down the textures
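For scale, the framebuffers themselves are tiny next to textures; a rough Python sketch assuming 32-bit color and triple buffering (assumed figures; real VRAM use is dominated by textures and render targets):

# Raw framebuffer size at a few resolutions (4 bytes/pixel, 3 buffers; illustrative only).
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}.items():
    print(f"{name:>5}: {framebuffer_mib(w, h):6.1f} MiB")   # 8K comes to roughly 380 MiB for three buffers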
>>
File: HR 22.jpg (150 KB, 602x635)
>>53545273
>Implying it doesn't have one of these strapped to it

Also it could have fins on both sides of the PCB and be meant to work with a water cooled CPU
>>
This year is going to be very interesting.

GPUs are going to jump up in power like crazy.
Storage is going apeshit with its speed increases at the moment and the prices are going to fall like a rock during summer.

Can't wait to see these things hitting the market.
>>
>>53545471
>Zen, Broadwell-E, Kaby Lake, Pascal, Polaris all in the same year
A year unlike any in this decade
>>
>>53545366
Not even that, AMD improved their framebuffer efficiency so they don't actually need gobs of vram like nvidia does.

>>53545605
>tfw I will finally have enough money for a completely new system
Finally everything's coming up Milhouse!
>>
>>53545605

Considering how much of a letdown progress has been during these last few years, it's about damn time we get a major performance jump.

Now if only the borderline-stagnant battery tech would go through a radical change like the HDD-to-SSD shift happening now.
That would be the tits.
>>
>>53544096
fake, op a faggot
only children think this is real
>>
>>53545369
GDDR5X isn't ready
>>
>>53545721
>polaris demo
>fake
>>
>>53545605
>Kaby Lake
Wow it's fucking nothing.
>>
>>53544833
>>tfw just bought a r9 390 a month ago.
Don't worry friend, I'm in the same boat
>>
>>53544833
>>53545910
Why do you say that? Will there even be an upgrade in the $300-400 range with Pascal/Polaris?
>>
>>53544798
>>53545369
Seeing the Polaris 10 in an mITX case, it's likely that they went with HBM to save PCB space.

Was it a spec limitation that HBM couldn't fit more than 4GB, or a space constraint within the interposer, though? A smaller 14nm die would leave space for more memory chips on the interposer compared with the large 28nm Fury die.
>>
>>53546524
4GB is still probably the limit, but DX12/Vulkan is going to let both cards' VRAM be utilized in a dual-GPU setup
>>
>>53546548
>That moment when people with 2-3 cards are in heaven
>>
File: amd-fijigpuvc-3_674_4c0f1.jpg (98 KB, 1000x562)
>>53546524
The limitation is four 256MB memory chips per stack IIRC. Obviously Fury used four of them, giving us 4GB. Unsure if they're able to place 6 down onto the interposer or whether they'll need more interposer space.

An AMD rep mentioned that the interposer is hitting close to the limits of what can be made into silicon. They're making things smaller and smaller but couldn't go any bigger. I could be wrong however.
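The arithmetic the post above describes (four 256MB dies per HBM1 stack, times however many stacks fit on the interposer), spelled out in Python:

# HBM1 capacity as described above: 4 x 256 MB dies per stack (Fiji/Fury used 4 stacks).
DIES_PER_STACK = 4
MB_PER_DIE = 256

def hbm1_total_gb(stacks):
    return stacks * DIES_PER_STACK * MB_PER_DIE / 1024

print(hbm1_total_gb(4))   # 4.0 GB -> the Fury's 4GB ceiling
print(hbm1_total_gb(6))   # 6.0 GB, if six stacks could actually be placed on the interposer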
>>
>>53544106

So only the Titan has HBM2? Why? Why isn't the x80 also at least 8GB? Even the 390 has that shit
>>
>>53546623
>tilted mem
>that mem chip alone in the bottom
TRIGGERED
>>
>>53546642
I somehow doubt that the Titan will get HBM2. If AMD's flagship won't, then the Titan most likely will not either. However, let's imagine that it does. They charge a premium already for the Titan series. What will the fucker cost WITH HBM2?! You think $999USD is pricey for a single chip card? Just you fuckers wait.
>>
>>53546723

I don't really care if titan gets it or not.

The Titan is a niche within a niche.

Most people can only buy mid range. And only a few can opt for the Ti, which here isn't even HBM2, in order to shill their Titan.

What the fuck does HBM2 even do? It's just more VRAM bandwidth or something for textures, right? Isn't 8GB of GDDR5 VRAM enough already?
>>
>>53546688
>has never checked his pcb before
welcome to 1992
>>
>>53546752
Currently there's no benefit on 28nm; the GPUs simply aren't powerful enough to utilize all that bandwidth. Now if you, for argument's sake, double the power of what a GPU can do, then 300-500GB/s will become a bottleneck when you're asking the GPU to push massively high resolution textures.
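A crude way to see when bandwidth starts to bite; the bytes-of-traffic-per-pixel figure below is an assumption, only there to show how doubling throughput eats into a 300-500 GB/s budget:

# Very rough bandwidth-demand model; per-pixel traffic is an assumed figure, not a measurement.
def bandwidth_gbs(width, height, fps, traffic_bytes_per_pixel):
    return width * height * fps * traffic_bytes_per_pixel / 1e9

baseline = bandwidth_gbs(3840, 2160, 60, traffic_bytes_per_pixel=300)   # ~149 GB/s
doubled = 2 * baseline                                                  # ~299 GB/s

print(f"baseline: {baseline:.0f} GB/s, doubled: {doubled:.0f} GB/s")
# the doubled demand already sits at the bottom of the 300-500 GB/s range quoted above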
>>
File: 9000_hours.png (17 KB, 1177x1064)
>>53546623
>An AMD rep mentioned that the interposer is hitting close to the limits of what can be made into silicon

yeah, the fiji interposer is supposedly right at the mask/reticle size limit for the photolithography.

seems like a BS reason though, since I can't see why you couldn't just print 2 or 4 adjacent sub-interposers and just saw out the whole block from the finished wafer.
as long as the sub-interposer edges didn't cross the boundaries of an HBM interface area, I don't see why routing power, ground, etc. lanes straight through vertically to the package/card would be an issue.
>>
File: retarded.jpg (424 KB, 1280x647)
>space efficiency

who genuinely cares?
>>
>>53547228
They're much smarter than us and probably already considered that. DESU, I'm not fussed about what form of memory they use, so long as it's not hindering the performance of the chip.
>>
>>53547269
To the extent that GPUs are generally power limited, using GDDR5(X) does hinder performance slightly, since you can save something like 20-30 Watts on a high-end GPU by moving to HBM.

FWIW, given that they were talking about "Navi" in 2018 as being scalable and using a next-next-gen memory, I strongly suspect that they are working with multi-GPU-on-interposer CrossFire as the basis for future chips, so they only need to make a small number of die sizes and just Voltron them together in different counts to make different SKUs.
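Under a fixed board-power budget those 20-30 W matter; a toy Python calculation, where the 250 W limit and the 190 W core share are assumptions, not specs:

# Toy power-budget reallocation: memory power saved by HBM handed back to the GPU core.
BOARD_LIMIT_W = 250        # assumed board power limit
CORE_W_GDDR = 190          # assumed core budget when paired with GDDR5(X)
MEM_SAVING_W = 25          # midpoint of the 20-30 W figure quoted above

core_with_hbm = CORE_W_GDDR + MEM_SAVING_W
print(f"core budget: {CORE_W_GDDR} W -> {core_with_hbm} W at the same {BOARD_LIMIT_W} W board power "
      f"(+{100 * MEM_SAVING_W / CORE_W_GDDR:.0f}% headroom for clocks/shaders)")   # ~13% more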
>>
>>53547370
Jesus, multiple GPUs on an interposer... How much PCB will there be on one of those, heh
>>
>>53547416
no, just imagine cutting Fiji in half, and putting 1, 2, 3, or 4 skinny rectangular slices side-by-side with an HBM module hanging off the top and bottom end of each.

you basically need one "master" die with the display and PCIe controllers, multimedia engine, etc., but the other slave ones can be basically pure shader and memory controller blocks.

if you're eating the cost of an interposer anyway, you might as well slice up the GPU so you get better yields and less binning.
>>
File: 9000_hours_2.png (6 KB, 740x701)
>>53547480
as long as sub-interposer edges didn't cross HBM or GPU-to-GPU interconnect area boundaries, I think you could get away with something like this.
>>
>>53546559
never, because multi-GPU is now explicit and there is no way to hack the driver to make it work like the implicit mode in DX11; it is completely up to the developer to add multi-GPU support to their game from now on, and it must also take into account the different hardware configurations.
>>
>>53547241
Anyone that custom water cools would appreciate it

Otherwise it could be smaller, but then you'd need faster fans to keep it cool; they make it larger to make it quieter
>>
>>53547241
If the PCB was half its size we might be able to go back to single-slot coolers. Or for flagships, two fan cooling.
>>
>>53547695
No, anyone who watercools is obviously also interested in aesthetics and having a manlet gpu shorter than your mobo is ugly as fuck.

Also waterblocks are standardized in that the gpu ports are always aligned with the ports on the cpu, and tiny pcbs break that completely.
>>
File: 1419225286564.gif (1015 KB, 485x200)
>Raja Koduri Promises Sweet Performance Per Dollar
LITERALLY HIS NAME IS RAJEET
HE LITERALLY HAD MONEY DEPOSITED INTO HIS ACCOUNT FOR SAYING THAT
>>
>>53546623
Kek, I knew that one on the left was familiar. I put a custom cooler on my 290x and that's exactly where I had to put the cooling pads and shit
>>
>>53544955
>skylake
>DMI 3.0
Ahaha
>>
>>53547665
the number of high-end engines isn't exactly skyrocketing.

as long as the top 10 Unreal, Source, Cry, Unity, etc. engines write explicit multi-GPU balancing code, most games will support it.
>>
>>53547370
>multi-GPU-on-interposer CrossFire as the basis for future chips
This is how 3DFX died.

Thinking like this is what led to Bulldozer and its general failures.

I really, really doubt AMD can pull off some kind of generalized kernel/controller/ISA/what have you which processes code meant for single-GPU operation a) as efficiently and b) as reliably as just having a "monolithic" processor. I highly doubt Nvidia will follow the same route.

If this is the path AMD takes I will hedge my bets on Nvidia dominance
>>
>>53548794
you are retarded
>>
>>53548794
3DFX had more problems than just their on-card SLI and its scaling.
The Voodoo 5 didn't even have a hardware vertex transform & lighting engine, much less DX8 pixel shaders like the competition.
It didn't even have MPEG2 offload for watching DVDs, etc.
>>
>>53548794
anon, the reason that current GPUs have shit multi-ASIC parallelism is that they communicate at a very high level through PCIe messages routed through a PLX PCIe switch chip or the host root complex.

there's not nearly enough bandwidth and far too much latency to coordinate shader units between chips, so work has to get shared at a very coarse level.

It wouldn't be easy of course, but it's entirely possible to have a multi-ASIC architecture with a single GCP driving ACEs on slave chips, so long as you can issue them instructions with no more than a few clocks' latency.
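To put "far too much latency" in cycle terms; the latency figures here are order-of-magnitude guesses, not measurements:

# How many 1.3 GHz GPU clocks fit into a PCIe-class round trip vs. an on-interposer hop.
GPU_CLOCK_HZ = 1.3e9
clock_ns = 1e9 / GPU_CLOCK_HZ               # ~0.77 ns per cycle

pcie_round_trip_ns = 1000                   # guess: PCIe hop through a switch or root complex
interposer_hop_ns = 2                       # guess: short on-package link

print(f"PCIe round trip: ~{pcie_round_trip_ns / clock_ns:.0f} cycles")   # ~1300 cycles
print(f"interposer hop:  ~{interposer_hop_ns / clock_ns:.1f} cycles")    # a few cycles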
>>
File: 1452183276987.gif (3 MB, 200x150)
>>53544096
>wccftech
>>
File: 1440977713326.gif (549 KB, 245x138)
>>53547719
>single slot gpu's coming back
>>
>>53544096
>passively cooled
nvidia on suicide watch
>>
File: 1376379239450.gif (1017 KB, 500x500)
>>53544096
>No analog output
Dropped. The RAMDACs cost almost nothing, take up a negligible amount of space, and don't generate enough heat to be relevant. There's no excuse for including a DVI port that isn't DVI-I.
>>
>>53549380
why would you need an analog output on a modern card?
>>
>>53549380
that is annoying, but what's to stop a 3rd party AIB vendor from slapping on a DAC ASIC and DVI-I port?

>>53549390
the <1% of customers still using CRTs who don't want to pay for an external DAC I guess
>>
>>53549425
>extremely small niche
>you want them to make EVERY card have additional hardware nearly no one will use
>>
>>53549265
https://twitter.com/ryanshrout/status/709761872778514433

http://www.guru3d.com/news-story/nvidia-geforce-x80-and-x80-ti-pascal-specs.html
>>
>>53544120
>any gpu
>4k VR workload
>running passively

yeah nah, absolute fucking lies.
>>
>>53549390
I have a CRT that I use occasionally.

>>53549425
Not much is stopping them, unless there's some manufacturer agreement, but no way in hell will they actually do that.

>>53549440
I'd like it, yes. It's cents in components and solder, and has no ill effect on the form or function of the card.
>>
File: 20160313_003201.jpg (1 MB, 2560x1920)
>>53547791
>also interested in aesthetics
there's nothing cuter than a short card that packs a punch. having to move my reservoir to the right side of my case because my 980 is too big just fucked my entire build up. so i'm looking forward to small form factor next gen cards.
also hoping polaris 10 is strong enough to the point where it won't make me want to wait for vega 10.
>>
>>53549625
>right side
i mean left side.
>>
tfw since I still have a 560ti, basically anything is a great upgrade.
>>
>>53548907
k

>>53549103
it was a generalization

>>53549218
explain to me how you're going to get "a few clocks" latency from physically separate chips, operating at (I assume) 1.3GHz, where a clock has less than a billionth of a second to propagate.
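Rough distance check on that objection, assuming signals propagate at about half the speed of light on package traces (a rule-of-thumb assumption):

# How far a signal can travel in one 1.3 GHz clock period at ~0.5c.
C_M_PER_S = 3.0e8
PROPAGATION_FRACTION = 0.5
GPU_CLOCK_HZ = 1.3e9

period_s = 1 / GPU_CLOCK_HZ
reach_mm = C_M_PER_S * PROPAGATION_FRACTION * period_s * 1000

print(f"~{reach_mm:.0f} mm per clock")   # ~115 mm; dies on an interposer sit a few mm apart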
>>
>>53544096
Where is the analog output?
>>
File: gta5memuseage.png (4 KB, 289x112)
AHHH! Need more memory!
>>
If I'm running on a USD 350 budget, will it be worth it to wait for July? I'm interested in the 980, but some are saying there won't be a >10 drop in price.
>>
>>53551817
If you're willing to buy used, probably worth the wait.
>>
>>53544096
time for an upgrade boiz
>i5 2500k & 780
>>
File: 1456763857132.jpg (52 KB, 494x426)
what's with nvidya and amd getting rid of two DVI ports? i need them for my korean monitors ...
>>
>>53552390
Your monitors don't have displayport?
>>
>>53552390
this tbqh
i don't like having to pay another $150 for a stupid dp->dl-dvi adapter that can't even do full 120hz
>>
>>53547936
He's the Jim Keller of GPUs, you fucking retard
>>
>>53552524
Agreed. He's world class when it comes to making video cards.
>>
Why do you feel the need to buy new graphics cards when most new CPUs have powerful enough iGPUs to run games at a decent frame rate? Seems like a dick-measuring contest.
>>
>>53549562
polaris 11 handheld when?
>>
>>53553327
no new CPU will be able to handle a VR GPU load, mr. intel
>>
>>53552390
>>53552469
>Falling for the korean monitor meme
You paid for a tech that wasn't ready for prime time and now you're gonna suffer the consequences
>>
>>53544096
so then which card is likely to win this generation on specs alone?
>>
>>53553327
>to run games at decent frame rate?
define which games
define decent
>>53552440
not him, but the fan fave qnix 2710 requires dual link dvi. Shit is a 1440p PLS/IPS display that can overclock to 96hz. Costs $300 too on a bad day
>>53553792
we haven't seen polaris specs yet, but since polaris is 14nm and Nvidia is 16nm I'm putting my money on polaris

I won't build any new parts until Zen comes out though
>>
>>53553839
>we haven't seen polaris specs yet, but since polaris is 14nm and Nvidia is 16nm im putting my money on polaris
is that largely down to reduced tdp and potential lower temps? What if Nvidia comes along with more powerful cards again?
>>
>>53553890
AMD will also have HBM1 for the first round
Nvidia's titan and ti will probably outpace AMD
Polaris 11 will probably BTFO the 1070 and Polaris 10 will give the 80 a run for its money

pricing is the most important part. I think AMD is waiting for Nvidia pricing so they can undercut them
>>
>>53553923
>Nvidia's titan and ti will probably outpace AMD
the question is how soon will they? This might help form my decision
>>
>>53553939
both are releasing this summer
>>
>>53553953
I may still be tempted with a ti based on polaris' temp output and power requirements
>>
>>53553953
>he thinks there will even be a titan and Ti in the same way that there was with the previous maxwell and kepler

Even if there are, there is no way in slob hell they will release this summer.

This summer we'll get the 1070 and 1080, and if we're lucky a 1060.

Big Pascal not coming till 2017.
>>
>>53545203
This image sums up kiddies who use mommy's credit card to buy a 980ti SLI setup for "streaming"
>>
The market will be flooded by 970s and 980s, and anything below is scrap metal

980 ti barely runs vr
>>
>>53544200

you dumb?
>>
>>53544955

nigger. how is that the same thing? stop being stupid on purpose.
>>
Why do people care that much about performance/watt? I just want more performance, I don't care if it runs hot or if it requires a lot of power.
>>
>>53547241

i care since my 4-liter case can't take anything larger than a mini card or the r9 Nano.

and the r9 nano needs a 500-watt PSU, and those 500W Flex ATX PSUs don't exactly grow on trees.
>>
>>53555107
>muh VR

Nobody is going to care about that gimmicky shit after half a year.
>>
>>53555191
Because drastically increasing performance is pretty much over. Performance/watt is the new yardstick because you aren't going to see huge generational leaps anymore. At least not until there's another breakthrough.
>>
>>53555235
Fuck. I'm going to guess they can't just make it take more watts for more performance either. Aside from just crossfiring it, obviously.
>>
http://wccftech.com/intel-talks-cross-licensing-amd-gpu-ip/
Where were you when nVidia was cucked by Intel?
>>
>>53555176
I should be asking you that.
>>
>>53555250
You would end up with a graphics card that weighs a ton, sounds like a hairdryer at idle and is too big to fit in anything but the biggest monster sized case.
>>
>>53555331
Why do you think that? Supposedly these Polaris graphics cards are pretty small; adding shit to them wouldn't make them extraordinarily huge.
>>
>>53555191
heat basically
the less heat produced the higher overclocking potential as well
>>
>>53555285
>intel laptops with integrated polaris 11 gpus and thunderbolt 3
I'd cum
>>
>>53555285
I tell you now, this will 100% happen as long as Polaris doesn't flop harder than Bulldozer; and I really don't see Polaris being Bulldozer in any regard.
Intel KNOWS AMD needs their help, and they DO NOT want to be facing yet more U.S. court demands to either shell out a load of money to AMD directly or, even worse, to be told that they have to split into different companies.
As well as this, Intel knows that it wouldn't be wise for Nvidia to have the same kind of grip, because if Nvidia goes to 95% marketshare, I can see AMD crumbling, and then Intel would be complicit in helping Nvidia become a monopoly - which would come down on Intel also if it could be proven that AMD would've provided the same service for less.
So, it's in Intel's best interests to support AMD with this. Not only for their own financial interests, but also for genuine innovation; the freedom of use regarding FreeSync for but one example.

Can't fucking wait.
>>
>>53544106
Fuck. I don't wanna buy a Titan. Guess my 980 Ti's will last me until next gen unless DX12 cucks them.
>>
>>53544190
>http://wccftech.com/nvidia-pascal-specs/
have you read the article?
>>
File: 1453038915717.png (144 KB, 1279x710)
can't wait, still running an i3-2120 and a gtx 660, but it's getting a little weak for the games I like as they keep getting shitty updates; can't keep a solid 144fps, and more performance-intensive games have to be turned down pretty low to get 60.

Hoping zen and polaris will be great and I can go full AMD. Anyone know if the linux AMD GPU drivers will be good enough to use by then, or will they still be cucked by nvidia?
>>
>>53557806
>that graph
>wanting to go full AMD
can i have some of your koolaid
>>
>>53557806
Dat graph

far as linux goes, why don't you just do a GPU passthrough to a windows VM?
>>
File: benchmarks.png (415 KB, 646x438)
>>53557853
>>
File: 1448477646744.png (244 KB, 629x796)
>>53557862
well quite a lot of my games run perfectly fine on linux, as they're not really gpu intensive, like tf2, cs, and other indie games, but if I did gpu passthrough I believe I'd have to have a gpu just for windows, so I'd have to run all my games under windows? I'm going to look into it but it seems a little inconvenient.
>>
It's going to be interesting what this means for consoles.
The performance gap is growing so big that it's getting ridiculous.

I'm pretty sure we'll see the rise of external GPUs for them in the very near future.
>>
>>53557907
https://www.youtube.com/watch?v=16dbAUrtMX4

Linux would run off your iGPU, and windows would run off the GPU

but it does give you other advantages like running 2 games at once
>>
>>53557930
SUPPOSEDLY
Sony have plans for a "PS4.5", whatever the fuck that means

http://wccftech.com/nvidia-pascal-3dmark-11-entries-spotted/
>>
>>53555219
I bet you are big in the tech world.
>>
>>53557930
>I'm pretty sure we'll see the rise of extrernal GPUs for them in the very near future.
Jaguar does not support Thunderbolt 3, which is the only way you can connect external GPUs due to its bandwidth (roughly equivalent to PCIe 3.0 x4 lanes).
Keep dreaming.
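The bandwidth comparison behind that "roughly equivalent" claim, using raw link rates and ignoring protocol overhead:

# Raw link-rate comparison; real-world throughput is lower due to protocol overhead.
def pcie3_gb_per_s(lanes):
    # PCIe 3.0: 8 GT/s per lane with 128b/130b encoding
    return lanes * 8e9 * (128 / 130) / 8 / 1e9

tb3_gb_per_s = 40e9 / 8 / 1e9                               # Thunderbolt 3: 40 Gbit/s total
print(f"PCIe 3.0 x4:  {pcie3_gb_per_s(4):.1f} GB/s")        # ~3.9 GB/s
print(f"PCIe 3.0 x16: {pcie3_gb_per_s(16):.1f} GB/s")       # ~15.8 GB/s
print(f"TB3 raw:      {tb3_gb_per_s:.1f} GB/s")             # 5.0 GB/s before overhead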
>>
>>53558640

ITS ACTUALLY FUKIN HAPPNNNING

http://wccftech.com/nvidia-pascal-3dmark-11-entries-spotted/
http://wccftech.com/nvidia-pascal-3dmark-11-entries-spotted/
http://wccftech.com/nvidia-pascal-3dmark-11-entries-spotted/
http://wccftech.com/nvidia-pascal-3dmark-11-entries-spotted/


jesus SAVE ME BABY
>>
>>53558737
>doesn't even read the bullshit that he posts
Quit spamming that article. It has nothing to do with the PS4 and is just plain shilling for the currytech site.
>>
File: yepitstrash.png (450 KB, 454x600)
>>53544096
>wccftech
>>
>>53558640

I don't even own consoles or play games, no dreaming here.

And actually you can connect a GPU through Thunderbolt 2.
GTX 780 Ti was connected through it here:
http://www.anandtech.com/show/7987/running-an-nvidia-gtx-780-ti-over-thunderbolt-2
>Based on squinks' own tests and GTX 780 Ti reviews posted online, the performance seems to be around 80-90% of the full desktop performance
>PCIe bandwidth may cause little to no difference in some games while in others the drop can be close to 50%.

So throw in the normal 780 and you'll pretty much get 100% of the performance, while getting performance dips in some games.
After all, Sony already announced the external VR processor, so external GPUs aren't really that impossible of a concept.
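For reference, the link the card gets squeezed through in that test, again raw rates only:

# Thunderbolt 2 vs. the PCIe 3.0 x16 slot a 780 Ti normally gets (raw rates, overhead ignored).
tb2_gb_per_s = 20e9 / 8 / 1e9                               # Thunderbolt 2: 20 Gbit/s -> 2.5 GB/s
pcie3_x16_gb_per_s = 16 * 8e9 * (128 / 130) / 8 / 1e9       # ~15.8 GB/s

print(f"TB2:          {tb2_gb_per_s:.1f} GB/s")
print(f"PCIe 3.0 x16: {pcie3_x16_gb_per_s:.1f} GB/s")
print(f"fraction:     {100 * tb2_gb_per_s / pcie3_x16_gb_per_s:.0f}%")
# ~16% of the bandwidth, yet 80-90% of the FPS in many of the tested games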
>>
File: 1440534781565.webm (1 MB, 1280x720)
>>53558737
>The first entry we have is of an unidentified Nvidia GeForce graphics card with 7,680 MB of graphics memory, 512MB short of 8GB,

>512MB short of 8GB,
>7.5GB

Ayy
>>
>>53558963
>So throw in the normal 780 and you'll pretty much get 100% of the performance
>hurr, if it's only 80% of the performance, maybe it won't get bottlenecked by the connection!
But that's not how bandwidth works. If that were the case, then a Fury X in a PCIe x4 would perform the same as an R9 380X sharing the same lanes.
>>
File: 1439758196744.png (23 KB, 213x252)
I want a Polaris, but here in Brazil it will be too expensive; I'll have to sell both of my kidneys.
>>
So, what will the Polaris cards that release this summer be? 390 replacements or Fury replacements?
>>
>>53559185
Polaris 11 will be a 360 replacement and Polaris 10 will be a 380 replacement.

The new high end cards will be Vega in 2017.
>>
File: Cry.gif (142 KB, 500x375)
>>53559258
That was what I was fearing. Fucking "wait for Polaris". I just want a new high end graphics card.
>>
>>53559185
They said they want to lower the entry-level price for VR with Polaris, and I think it has been confirmed there are GDDR5 chips, so I'm guessing the former.
>>
>>53559258

If Polaris 10 really is a 380 replacement then it better be at a sub-$300 price tag
>>
File: vr-capable-video-cards-645x301.jpg (33 KB, 645x301)
>>53559324
It has to have more performance than a 290 and cost less than $350 if you go by >>53559292
>>
>>53559258
People said in previous thread that 11 is 960/370 and 10 is 980 tier.
>>
Is HBM a meme? DDR5 is pretty fast as it is.
>>
>>53559610
From what i understood, Polaris 11 covers the 370 and 380 tiers, while Polaris 10 covers the 390 and 390x tier.
>>
>>53559657
ddr5 doesn't exist yet, shitlord.
>>
>>53559720
>>
If Zen can deliver too, I'll be ready to do a new build.
>>
>Just sold AMD stock
it's going to tank when Q1 numbers hit, right?
>>
>>53559829
I hope youre right. I need a 380 tier card.
>>
>>53559185
The only certain information is that Vega 10 is the Fury/Fury X replacement.
Where Polaris 10 and 11 fit in is pure guesswork based on rumors and die size measurements

Polaris 11 is 120-150mm2 based on journalists who got a close look at the chip. Polaris 10 is rumored to be 232mm2 based on some LinkedIn profile of some AMD engineer.
>>
>>53560082
Fair enough, I still think I don't want to settle for anything less than Fury-tier. In any case I'd have to wait anyway.
>>
>>53560177
>TFW I knew my day 0 purchase of a Fury X was a good investment back then and never doubted myself
'Drivers won't mature' faggots said, 'Fury X is weak as shit' they said, 'My 980Ti has better DX12 support than your card' they claimed.
Now look at the field. Fury X drivers are maturing and only bound to get better.
'But next gen cards!' many cried; yes, next gen cards, yet we won't see a successor to the Fury X until 2017.
I'm glad I made the choice I did.
>>
>>53559324
>bought a 380 less than half a year ago

Fug
>>
>>53549704
Did you happen to get that card for free fám? ;)
>>
>>53560010
>not holding
huee
>>
So the long awaited 16nm was literally the best thing that could happen to GPUs.

No fucking wonder both companies were milking 28nm last year so hard.
>>
Welp, I'll be waiting till Polaris comes out and then I'll buy an r9 nano and ek water block to replace my 960 itx
>>
>>53560238
>we won't see the successors to the Fury X until 2017
No, not in terms of the lineup, but we will see cards that are faster, cheaper, use less power and come with more VRAM before that though.
>>
>>53561020

Yeah, what better way to market than have everyone buy the previous gen cards and then release the Kraken.
Everyone is going to see their flagship 1k cards get matched by the next gen mid range cards.
They're going to upgrade the hell out of their machines when that happens.
>>
>>53545645
depending on gpu cost, i may reuse my 280X for a while, but otherwise, 100% new system for me too.
>>
>>53558337
it's so easy to spot the people who never had vr of any kind on their head.
>>
>>53554625
pfft, 980Ti? You ain't no pro streamer unless you got 3 TITAN's, bro.