
You are currently reading a thread in /g/ - Technology

Thread replies: 255
Thread images: 43
File: fanboi.pngZ.png (117 KB, 746x388)
>The GTX 1080 presentation slides have already been edited.
> Nvidia claimed 9TFLOPS, but that number has dropped to 8.2TFLOPS based on official leaks and now official reviews.
>Ashes of the Singularity - GTX 1080 Benchmarks Missing
>Nearly all of them are using the Rise of the Tomb Raider DX12 benchmarks in their reviews. The issue with this is that everyone knows that RotTR DX12 benchmark is horrendously broken.
>2-Way SLI bridge- $29.99
>Need to register card to get key to unlock 3-4 way SLI
>In Doom AMD GPUs are running an older version of OpenGL [4.3] than Nvidia GPUs [4.5].
>>
>>54610739
Forgotto linku senpai
http://www.overclock-and-game.com/news/pc-gaming/46-gtx-1080-what-s-not-being-discussed
>>
>amd 4 year old rebrands literally (as in LITERALLY) cannot run opengl newer than 4.3 because it lacks the hardware
>this is somehow Nvidia's fault

and then a second later the same faggot starts talking about async
>>
my friend, I wish you would delete this, thank you friend
>>
>>54610814
I don't see how that's a problem though. That'll be like me bitching about my gtx 260 not being able to utilize dx11 and above
>>
>>54610822
Why pajeet?
>>
>>54610814
If I remember correctly the multiplayer beta ran on opengl 4.5, only to be gimped on release.
>>
can this thread get deleted
>>
>>54610829
That's exactly what's going on. Idiots buy extremely old rebrands and blame Nvidia when only the newer GCN cards (285, 380X, Fury) support OpenGL 4.5. All those cards perform fine.
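If you want to sanity-check what your own driver actually exposes, here's a minimal sketch (assumes the glfw and PyOpenGL Python packages are installed; on Linux `glxinfo | grep "OpenGL"` tells you the same thing):

[code]
# Minimal probe of the OpenGL version/renderer the installed driver reports.
# Purely illustrative; assumes the "glfw" and "PyOpenGL" packages are present.
import glfw
from OpenGL.GL import glGetString, GL_RENDERER, GL_VERSION

def report_gl_version():
    if not glfw.init():
        raise RuntimeError("GLFW init failed")
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # no visible window needed, just a context
    window = glfw.create_window(64, 64, "probe", None, None)
    if not window:
        glfw.terminate()
        raise RuntimeError("could not create an OpenGL context")
    glfw.make_context_current(window)
    print("Renderer:", glGetString(GL_RENDERER).decode())
    print("Version: ", glGetString(GL_VERSION).decode())
    glfw.terminate()

if __name__ == "__main__":
    report_gl_version()
[/code]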
>>
>>54610847
Mad nvidicuck?
>>
>>54610847
>>54610865
Nice try AMDrone
>>
Hi, first of all there is nothing about the new Nvidia GTX 1080 that Nvidia has misled customers on.

Second of all I find it very offensive to the thousands of hard working engineers that have put so much love into this exceptional hardware.

It's obvious that the GPU performs very well in all api OpenGL and DX12.

AMD is not good they are always slower it is not the fault of Nvidia.

So i've flagged this topic to be deleted by a moderator.

Thank you my friend.

Have a nice day.
>>
>>54610739
>NVIDIA LIED BECAUSE AMD SUCKS
Go home pajeet.
>>
File: oy vey.png (116 KB, 351x500)
DELETE THIS
>>
>>54610896
>maximum damage control
relax, you will still get paid, even if the thread doesn't go down
>>
File: Screenshot_2016-05-18-16-31-13.png (416 KB, 1080x1920)
>>54610907
>>54610896
>>54610847
>>54610822
>>
>unique posters not going up
>amdpoorfags ley fanny samefagging

DELET THIS
>>
>>54610951
>nVidia doing the same false marketing bullshit
>nVidicucks damage control mode

get rekt.
>>
>>54610814
>Game is developed on Vulkan
>Doesn't release with Vulkan support
>Vulkan patch delayed until Pascal release to make AMD look bad right before a new GPU launch
genius marketing
>>
Nvidia - The Way it's meant to be Gimped
>>
>>54610896
Thanks Pajeet, here's your $0.01
>>
File: pooinloo.jpg (566 KB, 2100x1500)
>>
Still the best GPU money can buy
>>
>>54610896
>MAXIMUM
>DAMAGE
>CONTROLE

Your sheckels will be deposited Mohinder.
>>
>>54610766
Nah, but I'd fuck the fox guy.
>>
So since all the new cards are dirty as fuck with shady business practices, what should I look into since I need to replace my aging r9 270x?

I'm expecting a pricedrop on the 980s when the 1080 hits the shelves, but I've heard the 980 is a meme from several sources.
>>
>>54613141
The 980 is a solid card and would be a great purchase if you can get 2nd hand one at a very good price.
Alternatively, if you can hold out for it, Polaris will either be amazing, or at least drop the prices of 2nd hand 390's and furies.
>>
>>54613163
I got a fantastic philips screen with nvidia gsync, 3dvision and a bunch of other features really, really cheap from a buddy a couple of weeks back, so I figure I'll pick up an nvidia card to mess around with those. The screen is excellent without bothering with them, but since they're there, may as well use them.

I've heard a ton of bad stuff about the 980s though, fake benchmarks, frequent overheating, low lifespan, buggy drivers, overall crappy performance. Is it all memes spawned by the less than stellar release, or is it related to the non-ti 980s?
>>
>>54613214
Its all amd fanboy lies.
Much like every other thing this shithole bitches about.
>>
>>54610814
>game runs 4.5 on beta/alpha
>release version somehow broken as fuck and only runs 4.3
>game is supposed to run on vulkan
>game dev has deep ties with nvidia

Coincidence or not, there is a strong correlation and history between nvidia partnered games and release fuck ups on AMD cards. Its almost always patched within a week, but this skews the impression of broken drivers.
>>
File: i-hate-goyim.jpg (41 KB, 350x200)
DELETE THIS
>>
>>54610739
WILL DEFEND
>>
>card not even released yet
>already getting gimped
LMAO
>>
Let's not forget the Doom drivers were fixed by AMD and they run buttery smooth within a 1-2fps difference (higher or lower). Thank you based amd.
>>
>>54610739
Alright, tell me another card with this much raw power.

>m-muh 980ti
wrong, at least 20% weaker
>radeon pro duo
kek, no game can make use of 2 gpus efficiently

nvidia wins again
>>
>>54610739
The 8.2 TFLOPS are with the base clocks
SLI is irrelevant

1060 announcement when?
>>
>>54613440
>at least 20% weaker

Overclock vs overclock it's more like 10-15%. And you're paying 15% more for a 1080 vs a 980 Ti. For the mainstream card on the small die.

Anybody who buys a 1080 is a fucking moron.
>>
>>54613452
They dont have enough cards available to even launch either of the cards yet, why are you expecting 1060 so soon? 1070 is completely a paper launch. 1080 is semi-paper launch, you'd only see the founders edition right now. This is a complete sham by nvidia but fanboys eat it up.


>>54613440
>20%
over reference 980ti

OC 980ti will probably get close to 1080, and 980ti goes cheaper than 1080 FE too.

How the fuck does this get so much positive attention?
>>
File: 1451960673722.jpg (8 KB, 325x325)
>>54610896
REEEEEEEEEEEEEE GET OUT OF HERE
BOYS ONLY CLUB
NO SHILLS ALLOWED
>>
>>54610739
There is no editing

Pascal is 9 TFLOPs based on Boost clock speeds which is what the card runs at, OP IS A FAGGOT that can't even read
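For reference, the FP32 figure is just 2 FLOPs per core per clock (one fused multiply-add) times core count times clock, so both numbers people are arguing about fall out of NVIDIA's published specs (2560 CUDA cores, 1607 MHz base, 1733 MHz boost). Quick sketch:

[code]
# FP32 throughput = 2 FLOPs (one fused multiply-add) x shader cores x clock
CORES = 2560                     # GTX 1080 CUDA cores (published spec)
BASE_MHZ, BOOST_MHZ = 1607, 1733

def tflops(cores, mhz):
    return 2 * cores * mhz * 1e6 / 1e12

print(f"base clock:  {tflops(CORES, BASE_MHZ):.2f} TFLOPS")    # ~8.23 -> the "8.2" figure
print(f"boost clock: {tflops(CORES, BOOST_MHZ):.2f} TFLOPS")   # ~8.87 -> rounded up to "9"
[/code]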
>>
>>54613452
No 1080 even reached the advertised 2.1ghz. 1.8 at most. Also ran at 88c at a lower clock than advertised. I.e. lied about their "efficient" """cooler""" """"""design"""""" AND the clocks AND the temperatures. They tried to impress with doom by gimping amd cards but the latter promptly released a new driver and fixed things up nicely. The 20% improvements from a theoretical 2x is not impressive at all. Just a superclocked maxwell at this point. Let's hope they don't fuck up the 1080ti
>>
File: 1403837987365.png (48 KB, 400x389)
>video games
>say "TFLOPS" to sound smart, will in fact only use it for video games
>"overclock vs overclock"
>"benchmarks"
>"outperforms"
>"rig"
>>
>>54613472
>why are you expecting 1060 so soon?
I'm not
I just hope there's one soon because I want a new GPU around €200
>>
>>54613485
>boost clock is 2.1ghz running at 60c
>no benchmark reached 2.1ghz
>all benchmarks hover around 1800mhz
>all benchmarks are above 80c usually around 87c
What now shill
>>
>>54613467
>15% more power for 15% more money
Sounds fair to me.

>>54613472
I can oc the 1080 too.
>>
>>54612418
So you're saying you'd yiff the thicc nicc dicc?
>>
>>54613496
Polaris 10/11 is probably your best bet.
>>
>>54613496
That's about where Polaris 10 will land.
>>
>>54613504
>>54613505
After Fast Sync was announced yesterday I only want Nvidia
I don't think AMD will have something comparable soon even though it's just a software thing
>>
>>54613499
Theres a smaller headroom for 1080 oc than 980ti.

Its already throttling itself and hitting 104 C temp.
>>
Sir, I'm kindly asking you to delete this thread. Please.
>>
>>54613524
probably the ref version with the tiny ass cooler
I plan on hooking it up to water cooling
>>
File: REEEEEEE.png (116 KB, 1250x609)
>>54613523
REEEEEEEEEEEEEEEEEEEE

MUH FAST SYNC
>>
>>54613498
http://www.geforce.com/hardware/10series/geforce-gtx-1080

The Boost clock is 1733MHz

Like I said, you can't read and you shouldn't post on /g/
>>
>>54613214
Shame you won't be using either vsync or gsync since the 1080 deprecated them.
>>
File: sip1.png (165 KB, 450x600)
>Nvidia being lying sneaky cunts
Wow, what a surprise. Stop the fucking presses.
>>
>>54610739
>Stock reference cooler is hellfire

FERMI V2.0 EVERYONE!
>>
what the fuck is async even
I only know vsync and freesync from which I haven't heard in years now
>>
>>54613545
No need to apologize for your retardation friend, you are a fanboy after all. Nvidia on release advertised the 1080 as having a 2.1ghz boost clock. This is edited horseshit. Now shut the fuck up.
>>
>>54610766
Is this movie worth watching assuming I'm an adult who doesn't want to fuck the rabbit? I enjoyed the recent CG Disney princess movies.
>>
File: nvidia jew.png (194 KB, 603x802)
>>54610739
I see people bitching about the 1080, but what about the 1070?
>>
>>54610846
It's very likely there were bugs with 4.5 on AMD, forcing a downgrade. Not everything is a conspiracy.
>>
>>54613545
explain the card running at 60c on 2100MHz in their demo
>>
>>54613544
I want Fast Sync for the minimal input lag
Triple Buffering adds lags
>>
>>54613576
Amd released a driver update yesterday that fixed the bug. Look it up.
>>
>>54613559
Nvidia never announced a 2.1GHz boost clock, that shows how much a liar you are and how much an AMDPOORFAG you are

It always has been 1733MHz boost clock
>>
>>54613576
I heard AMD doesn't properly support 2 OpenGL extensions that DOOM uses
Don't know which or if it's true though
>>
>>54613570
Nobody fucking knows anything about it. It's likely going to be popular, just because it's way more affordable and it occupies the 970 space, but that's pretty much all we know.
>>
>>54613553
Nothing to do with monitor syncing. It's an internal resource allocation thing.
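Roughly: the graphics queue has passes that don't saturate the shader array, and async compute lets an independent compute job be scheduled into that idle capacity instead of waiting its turn. A toy time-accounting sketch with made-up numbers (nothing like real driver or GPU code):

[code]
# Toy model of why async compute helps: independent compute work can fill the
# idle shader capacity left over by graphics passes. All numbers are invented.
graphics_phases = [(2.0, 0.4),   # (length ms, utilisation) e.g. shadow/depth pass
                   (6.0, 0.9),   # main shading pass
                   (1.5, 0.5)]   # post-processing / UI
compute_job_ms = 3.0             # independent compute work (particles, SSAO, ...)

serial_frame = sum(t for t, _ in graphics_phases) + compute_job_ms

absorbed = 0.0
for length, util in graphics_phases:
    idle_capacity = length * (1.0 - util)            # shader time left unused
    absorbed += min(idle_capacity, compute_job_ms - absorbed)
overlapped_frame = (sum(t for t, _ in graphics_phases)
                    + max(0.0, compute_job_ms - absorbed))

print(f"serial (no async compute):  {serial_frame:.1f} ms/frame")
print(f"overlapped (async compute): {overlapped_frame:.1f} ms/frame")
[/code]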
>>
>>54613588
Kill yourself cock sucking dense faggot.
>>
AMDPOORFAGS, MAD & BUTTHURT NVIDIA CRUSHING THEM IN PERFORMANCE AND POWER EFFICIENCY WITH PASCAL GTX 1080

SPREADS FUD EVEN WHEN THEY'RE WRONG
>>
>>54613600
Good, let the butthurt flow through you as your lies are exposed for everyone to laugh at
>>
>>54613581
Fast Sync is literally Triple Buffering. Both exist to work around VSync's problems.

Triple buffering will get much less latency than vsync on, but more than vsync off.

Same exact thing for Fast Sync. The details may differ, and Fast Sync may be better at reducing input lag than triple buffering, but essentially it's just different specifics.

It's the same function as Triple Buffering.
>>
>>54613609
Yawn. You're literally too stupid to understand just how incredibly retarded you actually are. I maintain that you kill yourself asap.
>>
>>54613613
Fast Sync is not triple buffering, stop posting if you don't know anything
>>
>>54613626
Look at lying faggot, got caught posting lies about Boost clock and now attacking anyone like a mad dog with rabies
>>
>>54613613
Fast Sync uses 3 buffers
But otherwise it has nothing in common with Triple Buffering
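The practical difference, as I read NVIDIA's slide (a toy model only, not how any driver actually implements either): classic triple buffering queues finished frames and displays them in order, so a fast renderer builds up latency, while Fast Sync lets the renderer run unthrottled and the display only ever takes the most recently completed frame, dropping stale ones.

[code]
# Toy simulation of the two display policies: a 200 fps renderer feeding a
# 60 Hz display; we measure how old the frame shown at each vsync is.
from collections import deque

REFRESH = 1 / 60        # scanout interval (s)
RENDER  = 1 / 200       # time to render one frame (s), faster than the display

def average_frame_age_ms(policy, seconds=1.0):
    back_buffers = deque()          # completion times of frames waiting to be shown
    t, next_vsync, ages = 0.0, REFRESH, []
    while next_vsync < seconds:
        while t + RENDER <= next_vsync:          # render until the next vsync
            t += RENDER
            back_buffers.append(t)
            if policy == "triple" and len(back_buffers) > 2:
                back_buffers.pop()               # queue full: renderer stalls
                t = next_vsync
        if back_buffers:
            if policy == "fastsync":
                shown = back_buffers[-1]         # newest finished frame wins
                back_buffers.clear()             # stale frames are discarded
            else:
                shown = back_buffers.popleft()   # FIFO: oldest queued frame shown
            ages.append((next_vsync - shown) * 1000)
        next_vsync += REFRESH
    return sum(ages) / len(ages)

print(f"triple buffering: ~{average_frame_age_ms('triple'):.1f} ms behind")
print(f"fast sync:        ~{average_frame_age_ms('fastsync'):.1f} ms behind")
[/code]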
>>
>>54613588
lmfao u are fucking retarded
>>
>>54613639
* Also Triple Buffering increases latency since 3 extra frames are being buffered instead of 2
>>
I'm still on the 500 series. When I finally decide to upgrade, it will be glorious.

Also, grammar people, did the previous sentence need the comma, is it optional, or should I not have used one?
>>
>>54613648
>BUTTMAD AMDPOORFAGS LIAR GOT CAUGHT LYING ABOUT BOOST CLOCK DETECTED
>>
>>54613592
>Nobody fucking knows anything about it. It's likely going to be popular, just because it's way more affordable and it occupies the 970 space, but that's pretty much all we know.

I have a GTX 760 currently. Think it may be a viable upgrade?
>>
File: bender-club.jpg (113 KB, 600x336)
But! Will it be loud as fuck?
>>
>>54613658
>using a gtx760
>2016

there is literally NO excuse you fucking poverty amd poorfaggot

fucking buy a new nvidia gpu, you literally do not deserve to use a fucking computer if youre just going to run shit like that

honestly fuck off idiot
>>
>>54613658
Probably. If it's anything like the 970, it'll be pretty much where you want it on the price/performance ratio. It's a new architecture and it doesn't have any obvious flaws that I see.
Then again, there could be something like the 3.5 bullshit again. Personally I'm not going to buy anything from Nvidia anymore because of all their bullshit, but you can't deny that it'll likely be a pretty good card for the money.
>>
File: fastsyncgraphujuud.jpg (133 KB, 2090x1154)
>>54613639
>>54613628
Are you sure?

Because the slides and the nvidia presentation talks about the same function as triple buffering.


Fast Sync creates its own buffer as well, so its own input lag as well.


Pics related, its nvidia official graph. Note the difference between vsync off and fast sync on.

I think you guys are retarded.
>>
>>54613689
>Running Crysis 3 at 102 degrees celcius
>Not an obvious flaw

Lmao the mental gymnastics that Nvidicucks go through
>>
>>54613588
I planned on getting the 1080 and kinda still do, but even I'll admit that it originally was said to be 2.1
>ITT fags laughing because nvidia said this card had a 14 inch dick but it only has a 12 inch dick and it's still only 14 years old
>>
>>54613315
Fuck Doom wish i could pirate it.
>>
>>54613708
Well, I haven't seen that then. I haven't been following Pascal's news that closely because I know I'll buy an AMD card anyway. I thought the only benchmarks available were of the 1080.
>>
>>54613702
And?
Fast Sync adds like half a frame of lag
That's well worth it for what it does
>>
>>54613724
I got it for 40$ at cdkeys. Pretty cool.

>>54613708
>>54613740
I don't see how he did any "mental gymnastics". He literally said he just didn't see anything personally. If you have some sources, then throw 'em up. Faglord.
>>
>>54613774
>I got it for 40$ at cdkeys. Pretty cool.
its gone up in price everywhere now fucking sucks.
>>
>>54613774
Thank you for your comment Pajeet. 0.00001 has been deposited into your account
>>
>>54613702
Seeing this graph, I'm sure it's renamed triple buffering. Triple buffering has only 1 frame of lag before you see the result, so this graph confirms it.
>>
>>54613765
>worthiness of fast sync
That's a different issue altogether.

For the current discussion fast sync works the same as triple buffering. It serves the same function: to reduce input lag and smooth out the tearing issues of vsync.

Whether it's "half a frame of lag" or "1 frame of lag", the function is the same.

>>54613807
That's what I've been saying, but this could either be an improved version, or simply marketing tactics. FreeSync vs G-Sync etc.
>>
>>54613816
The function is not the same you AMD drone. It's way fucking better than triple buffering, watch a video for fucks sake and stop your shilling.
>>
>>54613830
-t nvidiots

Can't even see reason or facts. Jesus christ.
>>
>>54613816
No, it doesn't
Fast Sync is almost as good as FreeSync/G-Sync
It basically makes the old laggy VSync redundant
>>
>>54613724
What is stopping you?
>>
File: 1405808635781[1].jpg (57 KB, 350x350)
>>54613816
enjoy your laggy vsync you cuck
>>
>>54613846
>Fast Sync is almost as good as FreeSync/G-Sync
not even close. Fast sync will also have shuttering, but it wont be that noticeable. Freesync/Gsync totaly eliminates that effect and has no lag.
>>
>>54613890
>shuttering

Pajeet will you please fuck off
>>
File: triple buffering.jpg (201 KB, 2090x1154)
>>54613889
>>
>>54613801
>Unemployed white guy living in the south.
You honestly couldn't BE more wrong.
>>
>>54613913
>he got so booty bothered he made an edit to post on an anonymous image board
>>
>>54613902
>a simple mistake thanks auto correction
>pajeet
Are Nvidiots this desperate?
>>
>>54613948
Thank you for your comment Pajeet. 0.00001 has been deposited into your account
>>
>>54613961
lmao. Did you just get rekt and had to repeat?... Holy shit.
>>
>>54613874
An excellent piece of protection software called Denuvo has, at least for now, solved the PC gaming piracy problem. If you want to play Doom, RotTR, JC3, Hitman, Mirror's Edge Catalyst, or FC: Primal, you have to actually pay for them. No way for freeloaders to defraud the developers. It's great news and should hopefully allow publishers to invest more in PC ports.

I just wish it had been ready in time for GTA5- that would have been very interesting.
>>
>current year
>illiterates still throwing up frequency numbers
>monkeys fighting each other over peanuts are smarter
modern gpus are all about cache policy, reordering and predictions. Exactly like cpus, you plebs.
>>
File: 1425693507736.gif (398 KB, 500x460)
>>54613979
Totally Rekt xD
>>
>>54613989
it doesn't matter to be 4 times faster if you still take O(n^x) to the face for just mapping a fucking texture
>>
>>54614022
Says the faggot who can't offer a better solution.

Trying to casually pull out formulas does not make you look smart, you pathetic pseudo-intellectual manchild.
>>
>>54614055
>Says the faggot who can't offer a better solution.
I make what runs your faggot magnet smartphone.
>>
File: 56586721.png (8 KB, 493x402)
>>54613986
> No way for freeloaders to defraud the developers
Nice bait. The only people who benefit from this are already rich publishers and corrupt game journalists.
>>
>>54614081
t. "Java" ""Developer""
>>
>>54614055
>>54614081
Optionally it also runs your mom's dildo, and it's backdoored as fuck.

(this one is stupid I admit, but hilarious to me after seeing that "connected" sextoys startup)
>>
GTX 1070 has 1920 cores
from a defective 1080

1920x1080
Pls meme more Novidya.
>>
>>54614127
>faggot phone
>not nesignated phones
dude it's cpus, and don't blame us for the shit some people decide to run on them
>>
>>54613570
http://www.guru3d.com/news-story/nvidia-geforce-gtx-1070-specifications-surface.html

worse.
>>
>>54613996
>anime pic XD
>>
>>54613141
Do you need to replace it? I have a 270x and want something better, but can't justify upgrading at today's prices. I'm still able to get 60fps in pretty much everything without turning settings down much using a 2gig 270x and I haven't bothered pushing the card past its factory overclock. Sure there are cards that would let me run everything at top settings with minimums of 60fps, and would let me kick up to 4k even at ok settings, but then I'm looking at either a few hundred dollars for a minimal improvement of eye candy or the price of a monitor and a video card (also like it or not most games are made for 1080p not higher resolutions). Frankly from a gaming standpoint I'd be better off spending the money on storage so I could have all my games installed than upgrading. Hopefully that'll change with the other cards released this year.
>>
Why did nvidia call it a 1080? Because it's barely better at 1080p and can't do 4k
>>
Well 760, looks like we're going to have to hold out a little longer before you retire the dying 4890 in the backup machine.
>>
>>54610865
>>54610847
Samefagging this hard on a thread about a gpu. You sad fuck
>>
>>54611164
>Vulkan patch delayed until Pascal release to make AMD look bad right before a new GPU launch

AMD's Vulkan support is behind Nvidia's as well.

Doubtless you have some other conspiracy to explain this.
>>
>>54613559
Not on Nvidia's side personally, but all they said was they had it clocked at 2.1ghz at the reveal. They never specified whether or not that was going to be a boost clock from factory or an overclock. But I thought it was fairly obvious it was an overclock considering they were using a fucking overclock utility to monitor the specs. Still doesn't explain the bogus temps. Though a large portion of people have repeatedly said it was in a controlled climate, open air. Simply closing the side panel alone can make a significant difference in temps.
>>
>>54614413
>AMD's Vulkan support is behind Nvidia's as well
no actual game to test it with yet
and after the driver fix the 390 gains 10fps on the 970
>>
Reported this thread

Please deleet
>>
>>54610739
AMD GPU drivers don't have support for 4.3+ IIRC
>>
>>54613658
depends

Is there anything your 760 can't do currently that you need it to do?
How much are you willing to spend for what sort of improvement?

IMO gtx 660 or better is still viable and on the amd side 7850 or better is still viable.

Personally I'd only upgrade a 760 if the card was dying, I was using the computer for paying work where a faster card would help (relatively small time savings add up quickly for one's livelihood) or I wanted to upgrade monitors to higher than 1080p 60hz.

Most people are using something worse than a 760.
>>
>>54614127
I'll end you you little faggot
>>
>>54610739
>In Doom AMD GPUs are running an older version of OpenGL [4.3] than Nvidia GPUs [4.5].
as if I needed more reason to put Zenimax-companies on my shitlist
>>
>>54614484
Beta did. Alpha did.
>>
>>54613675
I'm using GTX 650 Ti 1GB, r8 & h8
>>
I don't understand discussing computer components before a third party gets to benchmark them in many different scenarios.

It's all just useless speculation and calling people indians. Just let them roll those cards out and talk shit about them after it's proven that Nvidia plays you guys the way you are meant to be played by releasing another 3.5gb card or doing something
>>
How does nvidia get away with blatantly shit products?

>die shrink
>barely 10% performance increase with OC
>market it at $700

nvidiots!
>>
>>54614533
mad stackoverflow copypaster detected
>>
>>54614517
You seem knowledgeable about these things. How long would you expect my 4GiB 960 to remain viable?
>>
File: 1315703783414.jpg (42 KB, 350x256)
The only people who win here are those getting 980s and 980tis cheap from idiots offloading them on the second hand market to buy new 1080s.
>>
>>54615851

First of all your 4GB isn't a 4GB 960, they too are 3.5GB and you can google proof for that, I think it's still the first search result.

Secondly, your GPU is already outdated. Yeah, sure it has 4GB but not enough actual processing power to utilize the memory, basically you are getting the same performance as with a 2GB card, you can look up those benchmarks as well.

Your GPU is outdated already. It basically has the same power as an overclocked Radeon 7870, which is a minimum requirement for most new games. That you can google as well.

You should've bought a GTX970, even with the gimped VRAM I think you can use it for a few years to come. R9 390 if you actually want something for your money, but most people prefer getting played the way they are meant to be played and don't even consider an alternative.

Basically you wasted $250 unless you want to play simple shit like LoL or CS:GO, for which a 750ti would've been better because it already gets like 100 FPS in those games.
>>
File: mfw.png (145 KB, 531x640)
>nshills
>>
>>54615851
>>54616201
960 is basically a 770 which is literally a 680. so really, you were out of date when you bought it. but really it doesn't matter if you don't have high standards. of course newer cards are more powerful, but do you really NEED it? if you have a specific game you actually want to play for more than 50 hours I'd say it's a reason to upgrade if you can't play it comfortably with your setup now.
>>
>>54610896
>Shills in full damage control
I want to believe
>>
>>54615998
yeah, will see in two drivers
>>
File: 1463167572918.jpg (81 KB, 640x480)
>>54610739
>Nvidia has to lie to make their products appear better

Oh no way! Their marketing team is full of truth telling contest winners I hear.
>>
>>54610814
OpenGL is deprecated

only Vulkan matters now
>>
>>54616201
>>54616404
>First of all your 4GB isn't a 4GB 960, they too are 3.5GB
Google actually contradicts that.
>not enough actual processing power to utilize the memory
I got the 4GiB version simply as a defense against the ballooning size of textures.
>Your GPU is outdated already. It basically has the same power as an overclocked Radeon 7870, which is a minimum requirement for most new games.
Eh. As an example, I'm running RotTR at High settings, Very High textures, and easily syncing to 30fps. Same at 1440p if I knock things down a bit. And this is apparently an obscenely demanding game. 1080p60 is easily obtainable in most.

>AMD
That was straight out. I hackintosh and the recent R9 cards simply don't work reliably.

>970
Wasn't in my budget.

Anyways thanks for the replies.
>>
File: 1426846247110.png (306 KB, 680x544)
>>54610739
Wait a second.

You seriously need to "register" your card to use tri or quad SLI?

That's fucking hilarious! GO TEAM GREEN.
>>
>>54613589
Do those 2 extensions start with GL_EXT_NVIDIA_*?
>>
>Maximum GPU Temperature (in C) 94
WEW 2ghz housefires here we cooooooooomeeeeeeeeeeeeeeeee


Fuck Maxwell kepler fermi trash
>>
>>54616729
They want a database of idiots with disposable money to jew directly to them.
>>
>>54616710
lol, you paid money for a 960, gross.
>>
>>54616772
It was a) this, b) gamble and buy some used card, or c) go without a GPU and wait months for the 1070.
>>
>>54616756
kaek

In fact AMD has always been OpenGL friendly

It's listed right there in the GCN specs, it's fully OpenGL compatible all the way back to GCN1
>>54616772
i paid $450 for a windforce 770 4gb in mid 2014

good card.
>>
>>54616791
I don't see how the 970 wouldn't be an option.
>>
The truth is if the 1080p was any faster, it would blow up your monitors left and right.
>>
OP is one seriously upset AMD owner.
>>
>>54611164
>>54613315

you guys are also forgetting how nvidia ruined the witcher 3

the needless tessellation added is actually direct evidence of sabotage, but what you don't know is how TW3 was delayed because nvidia literally had cdprojekt RED scrap all those high-end graphics at the last moment.
because nvidia GTX cards perform much better when the graphics are made 'simpler'

nvidia really thought TW3 was going to be a generation defining game because their marketing team are idiots and none of them are actually gamers. so nvidia really went the extra mile in jewing this game back to the stone age.

they paid cdprojekt RED over 2 million just for the last moment graphical downgrade alone
>>
>>54614413
False. AMD Vulkan support was slow to get out the gate but it's actually very good. I develop with Vulkan on AMD and NVidia and both work fine. AMD's driver is actually more strict and produces more helpful information when debugging.
>>
>>54610739
Every time I see a Rise of the Tomb Raider benchmark I instantly know it's more Nvidia lies and bullshit.

Crystal Dynamics doesn't know how to implement DX12 correctly, even IO barely does with HITMAN. It's going to be a year or two at least until any standard of DX12 quality is established.

DX12 is insanely complicated compared to DX11.
>>
>>54616700
Nope. Vulkan is the new hotness but OpenGL will live on. Vulkan is not friendly for new developers or small projects. Literally no reason to use Vulkan for a 2D indie game unless you like wasting time.
>>
>>54616873
got a source on that?

Witcher 3 is shit because of consoles
>>
>>54616940

>he thinks developers are still pissy little bitches stuck in the past who cant learn new things

this is why we have pajeet providing low cost competition in the software field
so exactly this doesn't happen
>>
I'm not even sure why I follow these threads considering I live in brazil, have a 280x and play at 1080p

it's still kinda interesting though
>>
>>54617012
ausfag here with no interest in tech besides gaming and VR

i know that feel
>>
File: wtf4.png (61 KB, 235x235)
Are the 980ti's supposed to go down in price when these drop?
>>
>>54610739

It's a 5-20% increase over the last generation in some games, not even all.

It's not a major splash, pretty much exactly what everyone should have expected.

Yes the hype is half bullshit, it always is, why is anyone surprised?
>>
>>54616729
>>54616760

I assumed they were trying to fuck with people who use the cards for network Photoshop or neural networks. How will they sell their massively overpriced neural network boxes if people can just do it themselves for half the price?
>>
>>54617105

Also the biggest increases are in VR because of software gains. All the hype about how much better it was than the 980ti was in vr stuff.
>>
>>54613580
They did explain that that card was overclocked, not just running the normal boost clock, and it was a heavily air conditioned venue, not your mama's hot and sweaty basement.
>>
>>54615146
hey me too

i'm buying a 1070 soon though
>>
>>54610814
they literally do though
>>
File: 1462120863059.jpg (71 KB, 960x768)
Did Nvidia gimp the 670 in Doom?
It's like 60 FPS until action starts or i look at something more than a corridor.
Im getting like 30 fps in 1080p when that happens, on LOW.
>LOW
IS THIS THE GIMPING YOU TALKED ABOUT /G/!?
>>
>>54617263
even with a heavily air-conditioned venue it would never run at 60C at that clock, looking at the benchmarks. Not with a blower cooler.
>>
>>54617355
Yes. Same happens in TW3 with my 780Ti. I also can run Battleborn in 4K until the effects show up, then it drops, mostly to 30, sometimes to 10 FPS. PhysX is off.
>>
>>54613724
Yea man fr I dont wanna pay for it.
>>
>>54617355
670 is old af senpai
>>
Its hilarious that Nvidiots aren't even getting the 1080 until at LEAST october because its a paper launch. Vega with HBM2 will be out by then and will outperform it by 2x while costing less
>>
>>54617049
Prices are already going down.
>>
File: 1.jpg (291 KB, 1920x1277)
>>54613675
What's your problem man, did mommy not buy you a 760 when it was new? I'm running a 760 4gb and it runs nearly everything I play fine at 1080p at 60+fps. Only reason I'd upgrade my 760 is because of a few AAA titles I want to play and I could use the 760 to replace the old 460 in my second desktop friends use when they come over. No issues with the 760 really, but the 460 needs replaced and I may as well upgrade the newest system.
>>
>>54617587
A friend of mine still runs a 560Ti. I told him to wait for the new ones, as the 970 is pure dogshit. He only plays Warframe and older games so its still usable.

I used a 670 2 GB to play ME3 in 4K last year, it is the minimum requirement for that but it can handle it in 60Hz without problems.
>>
>>54617587
>top end mobo
>top end cpu
>shit gpu
how can you build a rig like this? Did you pick parts because a salesman recommended them?
>>
>>54613949
>tumblr
12 year old detected
>>
>>54617587
>>54617637
>Alienware
>>
>>54617587
Why doesnt speccy tell what case you use?
>>
>le delete this

do i fit in yet lol
>>
File: 1.jpg (177 KB, 2372x818)
>>54610739
It's fast
It uses very low power
It handles DX 12 just fine
>see movie
https://www.youtube.com/watch?v=RqK4xGimR7A
The vram is plentiful and fast.
It stomps the fury x which is close to it in price

This card basically stomps on all the memes AMDrones spout and then some.

Face it, AMD won't have an answer to this or the 1070 for the remainder of the year.

Polaris 10 will most likely be a cheap mainstream card.

>muh vega

Vega will be good not denying that, but by that time the 1080ti will be out to compete with it.
>>
>>54617587
>I'm running a 760 4gb and it runs nearly everything I play fine at 1080p at 60+fps

Then you must play old shit at low resolutions
>>
>>54617727
Nvidia can't even do DX12, and the 10paperlaunch80 won't even be available till vega gets released retard
>>
>>54617727
Thank you pajeet for do the kindly needful..
is 5 rupees in account sir.
>>
>>54617736
Even a GTX570 maxes 99% of modern games at 1080p desu senpai.
>>
>>54617746
>Nvidia can't even do DX12
Watch the movie mongloid
> the 10paperlaunch80 won't even be available till vega gets released retard
Holy fucking shit you, do you honestly believe that? Fucking brainless idiot.
>>
>>54617768
the 570 is actually overkill for most games at 1080p maxed out
>>
>>54617727
>Rise of the Tomb Raider

Stopped watching there.
Rise of the Tomb Raider actually performs WORSE in DX12 than it did in DX11.
That shit is broken and has yet to be fixed, why the fuck do people still use this broken shit in benchmarks?
>>
>>54617776
NO ASYNC COMPUTE

and you won't be able to get a pascal card till AUGUST at the earliest, nvidiot.
>>
>>54617803
>>Rise of the Tomb Raider. Stopped watching there.

What's wrong, are two cherrypicked games that are known for running better on AMD hardware not enough?
>>
>>54617816
>no async compute
Source? Doesn't seem so important anyway when we look at how hard the 1080 owns all the other cards that do have it
>>
>>54617837
read his fucking post retard
>>
>>54617855
dude just read /g/ it has no async compute
>>
>>54617879
Read my post you fucking infant
>>
>>54617837
Because it is outright broken and still broken.
That game is in a similar situation to the new Gears of War, but without any fix.

How can a benchmark be fair when it uses a broken tool?
>>
>>54617900
What a fucking comparison, Gears of war is hardly playable you tard.

This game runs perfectly on DX11 and DX12.

You idiots seem to think that if a game does not offer a performance benefit in dx12 over dx11 it is 'broken'.

It might be a poor implementation of dx12 as it's new, but we don't know. Maybe the effects of dx12 aren't really that big and in some cases even negative.

The way you people dismiss anything that does not validate your crazy idea that dx12 will cause AMD to have massive gains over Nvidia as 'broken' is so funny.

>But what about this benchmark?
>doesn't count!
>and this one?
>It's broken!
>>
>>54615825
>How does nvidia get away with blatantly shit products?

it just werks

and their only competition is barely competent, always playing catch-up.
>>
>>54614337
>Why did nvidia call it a 1080?
Because you turn 1080 degrees and walk away.
>>
>>54614457
more like 2x performance in most cases
>>
>>54615825
What does AMD have to offer that's better?
>>
>>54618004
That's not accurate. Nvidia just werks, but AMD isn't playing catch-up, they're just media'd out of the picture. The main issue with AMD is that their software fucking sucks ass. Nvidia caters to multiple scientific and vidya industries by providing very high-quality software that relies on their proprietary technology. This is very complicated software that is very expensive to develop in-house. On the other hand, AMD has nothing.
>>
>>54618051
they're desperately trying to catch up on the software front, hence my point.
>>
>>54616940
And for those games you don't need huge performance anyway
>>
>>54617736
>>54617680
>>54617637
Didn't want to buy a 980 with the 1080 being within 6-9 months of release when I built it, and I had the 760 sitting around; when a friend upgraded he paid me with that card.


The Alienware monitor I got LNIB at a yard sale for $60 and I couldn't pass it up. 2ms response time and an antiglare/fatigue screen with great color. Same as the 21.5" dell of the same year which was a $250+ monitor. Like that it has 2x HDMI, a USB hub and an audio pass thru as well because that monitor is used by my older spare system when friends come over to game. I test every new build with it.

Games wise, I play D3, some BF3, ESO/ES5, Fallout 3-4, and more recently Forza MS. Sure everything isn't at ultra, but the FPS is solid for my needs and works. Really solid card desu.
>>
>>54613675
GTX 550 Ti here, it serves for my purposes.

Fuck you.
>>
>>54617988
>What a fucking comparison, Gears of war is hardly playable you tard.
It's the same because both of these titles are not true DX12 titles, but DX11 with a tacked-on DX12 api.
It is also known that Nvidia has higher feature level support in this game compared to AMD, when we all know AMD is capable of exposing the same features.

The Rise of the Tomb Raider benchmark has been included in the majority of the tests to inflate the numbers.
Even in all the Nvidia presentation charts they use this particular benchmark to showcase the power differences.

It's dubious as fuck since the performance in this single particular game does not carry over to the other titles.
>>
>>54618149
You're an idiot.

Literally two out of the three games in that benchmark are games that are known for heavily favoring AMD's architecture, Hitman and Ashes of the Singularity.

But still you people will bitch and whine that it isn't fair. I find it really funny when Nvidia owners are often labeled as fan boys here when the most blatant fanboyism and ignoring of facts I see pretty much always comes from AMD supporters.

Face it, the card is fast, it beats anything AMD has into the ground and there is no alternative on the AMD side.
>>
>>54610739
>Ashes of the Singularity - GTX 1080 Benchmarks Missing
https://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1080_review,15.html
I found the missing benchmarks.
>>
>>54617306
I've been waiting for the 1070 as well, yet it looks like in my poorfag country it's going to be about 40% more expensive than its predecessor, which is way too much for me. Looks like I will have to wait for the 1060(Ti) or get a 970 when the prices drop. Fortunately, I don't feel the urge to upgrade asap, because so far only The Witcher 3 and the Doom beta were unplayable for me in Full HD, yet playable in 720p, so I still can play whatever I want.
>>
>>54618348
yeah I just read up on the 1070, seems it's going to be a shitty deal just like the 970
>>
>>54618254
This is a DX12 benchmark anon, it favors a GPU that has a good DX12 implementation.
Rise of the Tomb Raider is not a good representation of a DX12 game as it is barely one.

Answer this question: Why are the GTX 1080's results in RotTR DX12 not reflected in the other 2 titles? Or ANY OTHER TITLE?
>>
File: 1.jpg (88 KB, 934x823)
>>54618323
>https://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1080_review,15.html

Last bastion of AMemeD's destroyed
>>
>>54618323
It was missing from the AoS benchmark page, not from the review website.
>>
>>54618380
>This is a DX12 benchmark anon, it favors a GPU that has a good DX12 implementation.
>Rise of the Tomb Raider is not a good representation of a DX12 game as it is barely one.

>dx12 benchmark runs better on amd cards
Oh wow what a proper DX12 game
>dx12 game runs better on Nvidia cards
HURRR its broken

Then look at the other games, the performance hits/gains on Hitman are pretty much equal between the fury x and the 1080
>>
>>54618382
What do you mean? Vega comes out in October
>>
>>54613675
When I find a game that actually runs poorly on my GTX 670 I'll upgrade, but since I play GTA 5 at an average 50FPS I'm ok, thanks.
>>
>>54618429
>Then look at the other games, the performance hits/gains on Hitman are pretty much equal between the fury x and the 1080

Yes, it trades blows.
Is that good news now?
But you still haven't answered my question; why don't the Hitman and AoS benchmarks reflect the significant gain it got in the Rise of the Tomb Raider benchmark?

Where did that 40~50% difference go anon? Why does this one single title favor Nvidia so much? Why is this broken title still being used in all of the benchmarks? Why is a title that isn't true DX12 being used to gauge DX12 performance? Why does Rise of the Tomb Raider fail to do what DX12 is meant to do: bring optimization and performance gains from the low level API?
>>
File: 1080nvodja.jpg (195 KB, 949x1126)
DX11 to DX12 changes everything.
Just look at the performance increase on the 1080.
>>
>>54618382
>1 day old flagship gpu with newest manufacturing process beats a 1 year old gpu
>nvidiots think this is an impressive achievement
>>
>>54618507
Only to be shat on by the 1080ti

And what does AMD do until then?

> Why don't the Hitman and AoS benchmarks reflect the significant gain it got in the Rise of the Tomb Raider benchmark?

Because it obviously favors AMD's architecture, like other games favor Nvidia's architecture, it's not that hard
>>
File: 1458323979356.jpg (88 KB, 650x650)
>>54616970

this isn't wikipedia you fucking idiot

we know CDprojekt RED had alpha versions with much higher quality graphics.
and we know how CDprojekt RED can be good at fooling idiots into believing its games are epic masterpieces, when they are basically cheap dime-novel hack developers who are good at marketing their own bullshit

and we know nvidia also has a very large marketing team full of industry type business faggots who suck cock and don't actually know shit about gaming, and would seriously believe TW3 would be an epic game of the entire planet. desperate to pour resources into this game to make it favorable for nvidia.

what do you propose the reason for the downgrade really was?
there is no reason to downgrade the PC version. games with heavy graphics which overcome the capability of current GPUs get released all the time.
you really can't blame consoles for this either because there is literally no reason.
we know much higher quality alpha versions existed on PC, so there is literally no reason.

we already have evidence of foul play by nvidia with the needless tessellation mesh added. what i'm proposing really isn't a far stretch.
internal testing would have revealed that AMD runs the game very well. and the only real way to counter this was to play to the maxwell architecture's specific core advantage of running basic poly-type calculations faster.
basically focusing on the few things GTX does better than Radeon

the witcher 3 totally got ruined by nvidia
if the witcher 3 would have at least kept those graphics, it might have been worth playing for people who actually have decent taste in vidya games
>>
>>54618803
>AMemeDrones always spouting that Nvidia can't into dx12
>ashes of the singularity literally the benchmark they always come up with
>card comes out by Nvidia that shits all over the top card of AMD in the game that literally sees AMD cards perform best compared to Nvidia cards.
>hurrrr it's a new card it make sense durrr

Keep shifting that goalpost
>>
File: shot-20160518-902-147xhnx.jpg (207 KB, 676x1269)
>>54618941
>Because it obviously favors AMD's architecture, like other games favor Nvidia's architecture, it's not that hard

It's not just those 2 other titles anon, the RotTR results for the GTX 1080 were not reflected in any other title.
>>
>>54619051
Just like the Ashes of the Singularity gains of older AMD cards over newer Nvidia cards are not reflected in any other game
>>
>>54618969
I'm not sure if your image is supposed to be irony, because damage control is exactly what you're doing for AMD's shit tessellation performance.
>>
>>54618507
AMD has not said when. That October date was traced back to a forum post.
>>
There is no card better than the 1080. Case closed you faggots. Nvidia is the market leader with the highest performance card available. They will continue to shit all over amd because amd will never catch up.
If you got the money and want the highest fps, you gotta buy nvidia.
If you got food stamps you can keep using them on amd trash.

Sorry poor FAGS, you'll never experience the fps that an nvidia card has in every game.
>>
>>54619108

>adding gigantic, needless tessellation meshes just for the sole purpose of targeting AMD cards is not direct evidence of sabotage

>b-but its AMDs fault!!!

yeah ok bro
>>
>>54618801
I dead.
>>
>>54619265
They make it prettier.
>>
Really sad to see just how hard the economy problems have hit people where they can't afford cards that pump out huge fps and smooth beautiful gameplay.

Is this nvidia privilege? I guess amd FAGS get so triggered by our privilege. Very sad.

The 1080 is a class leader. Nothing amd has can come close. Hell nothing on their roadmap will come close. Start saving your pennies up, one day you might be able to afford a second hand 980ti and get just a glimpse of the power and fps 1080 users will have.
>>
File: 1448702541401.jpg (36 KB, 400x460)
>>54619255
>>54619520
All this banter
>>
>>54618969
>This isn't wikipedia you fucking idiot
AKA I have no evidence Im just making up bullshit because Im an AMDrone
>>
>>54613580
>2100MHz in their demo

I swear /g/ is retarded. They announced the stock clocks during the presentation if you bothered to watch the whole thing. http://www.techspot.com/news/64736-nvidia-announces-geforce-gtx-1080.html
They probably overclocked the founders edition christ you guys have autism.
>>
>>54619446
>tessellating an ocean that never appears onscreen at any time makes it prettier
>>
>>54617727
>Polaris 10 will most likely be a cheap mainstream card.

They already showed Polaris running Hitman at 1440p/60 2 months ago which is 20% faster than both Fury X and Titan.
>>
>>54617880
The 1080 does
>>
There was a webinar about AMD a few hours ago, but only for "partners". Did any of you find any info? I've read comments about GDDR5X confirmed for the full Polaris 10, but I would take it with a pinch of salt.

Shit, the wait for Computex is going to kill me
>>
>>54620098
It makes the ocean prettier, yes.
>>
>>54620225

*A webinar about Polaris by AMD
>>
>>54619924
actually, 2.1 ghz gained only 10% performance from stock in real world tests
>>
>>54620225
probably true, Micron has gone full throttle on GDDR5X production
>>
>>54613807
It's actually two extra, hence the name triple buffering. The first buffer is the one you see; the next two are being rendered in the background
>>
File: Capture.png (42 KB, 687x838)
>>54620157
>They already showed Polaris running Hitman at 1440p/60 2 months ago which is 20% faster than both Fury X and Titan
They did show it, but it doesn't need to be 20% faster than a Fury X/Titan to do that. A 390X gets 60FPS just fine, which is exactly where rumors are placing Polaris 10.
>>
>>54620383
Polaris fags rekt
>>
>>54617306
Are you me?
>>
>>54620290
It was more like 10-15% and the stock card boosts up to 1850+ MHz in practice. If you actually do the math, that's a core clock boost around the same 10-15% range, depending on what the exact clocks are. So you're getting an actual performance boost that is very close to the core clock boost, which is exactly as expected.

What did you want, 15% clock boost and 50% extra performance? This shit is a graphics card, not a magical dream fulfilling machine.
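The arithmetic, using the clock figures thrown around in this thread:

[code]
# Clocks quoted in this thread: advertised boost, typical observed boost,
# and the 2.1 GHz overclock shown in the reveal demo.
advertised_boost = 1733
observed_boost   = 1850
demo_overclock   = 2100

print(f"demo vs advertised boost: +{(demo_overclock / advertised_boost - 1) * 100:.0f}%")  # ~+21%
print(f"demo vs observed boost:   +{(demo_overclock / observed_boost - 1) * 100:.0f}%")    # ~+14%
[/code]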
>>
>>54620409
More like everyone rekt if AMD fucks up, NVIDIA is selling a midrange card at $700, that's pretty fucking sad.