For a shitty looking game

You are currently reading a thread in /g/ - Technology

Thread replies: 255
Thread images: 38
File: fallout 4.png (52 KB, 913x899)
How come AMD is getting rekt
>>
>R9 290
>Less than 1/4th the price of the 980 ti
>capable of 60fps
>half the performance of a 980 ti at 1/4th the price
>>
Because Bethesda is horrid at PC development.
>>
>>51264230
https://youtu.be/15wOp7_dD8E

Selling my 290x for a 980 ti on Friday. Fuck AMD and its piss-poor ability to release decent drivers.
>>
>>51264230
Expect 15.12 or some shit a week from now.
>>
>>51264230
>3.5+0.5
>>
What is gameworks

If nvidia didn't pay companies to put this shit in, this would look different
>>
>>51264230
>over 100 fps
Explain to me what the point of that is? To shitpost on a Chinese origami image website?
>>
Gee I wonder why a nvidia sponsored game runs better on nvidia at launch?
>>
>>51264302
>not playing the gay man on 1440p 144 Hz
>>
>>51264230
Feels good having a GTX 960 which apparently beats the 380 and 280x
>>
>gameworks title
>absolutely massive frame rate drops everywhere
>it looks like shit
>a current gen i7 and $300 GPU can't even maintain 60fps at 1080p and high settings

Wow, good job Nvidia. The way it's meant to stutter.
>>
AMD HAS NO DRIVERS
>>
File: 1367107700390.png (96 KB, 329x313)
>Game is unoptimized piece of shit
>Blame AMD
Why are you playing Bethesda games, OP?
>>
>>51264367
I will always buy Nvidia™ because I only play games The Way It's Meant to be Played™. Nvidia also pioneers innovative new technologies like PhysX™, Gameworks™ and the highest quality driver to ever grace Windows.
When I boot up with a brand new Nvidia™ Geforce™, I can experience the game just like it's meant to be played. Nvidia™ also delivers a far more silky smooth experience.
Nvidia Geforce™ is also very power efficient. A graphics card is the most power hungry device in your house. Refrigerators, air conditioners, water heaters, dishwashers, lights, etc. all use significantly less power than a graphics card. Which is why Nvidia™ puts gamers first by ensuring that their gaming experience is of the highest quality while looking out for gamers by giving them the most value in their electrical bill.
At this point in time, there's really no reason to consider an AMD graphics card at all. I tried one one time; it caused so much heat that it exploded. It also consumed so much power that it gave off an EMP and destroyed the rest of my computer.
Nvidia™ also pioneered how useless GPGPU is with CUDA™. Years ago, everyone thought GPGPU, CUDA™, and OpenCL were the future. Now, Nvidia™ has removed those useless features from their GPUs and increased efficiency. Now you can save thousands a year in electricity thanks to Nvidia™ ensuring that useless features like GPGPU are "optimized" for gamers.
It's quite clear that OP's an AMD shill trying to convince you to settle for something less than The Way It's Meant to be Played™. Nvidia™ is the only real way to play games. We have seen recently that they offer incredible libraries for software developers like Nvidia Gameworks. He is probably too poor to afford the Nvidia Geforce Experience and cannot afford to play any games The Way It's Meant To be Played™.
Don't be a poor gamer with bad drivers and a huge power bill. Play games with the Geforce™ Experience™: The Way It's Meant To Be Played™
>>
>>51264392
Why are you so mad? It's just games
>>
>>51264392
/thread/
>>
File: 1440977153744.jpg (65 KB, 824x720)
>muddy textures even worse than Skyrim
>less render distance than Skyrim
>actually recycles Skyrim animations unironically
>doesn't even support 21:9 resolutions without an ini edit
>HUD stretches at wide resolutions
>>
>>51264413
Was it last week or this week you started using 4chan?
>>
Remember when Skyrim released and GTX 580 would get 30fps due to the game running pretty much solely on CPU?
Good times, keep up the good work Betshitsda!
>>
>>51264230
>>51264268
>>51264367
Fallout 4 uses Gameworks.
if this isn't enough of an answer for you, you don't belong on /g/.
>>
File: perfrel_2560_1440.png (52 KB, 500x1050)
>>51264230
It isn't. AMD wins in most benchmarks.
Nvidia sponsored games on release day are not a good metric for GPU performance.
>>
>>51264526
>lying on the internet
>>
It's the Nvidia godrays - the difference from low to high is almost impossible to see but the performance hit is enormous.
>>
>>51264575
>being this jelly
>>
People just can't optimise things for a shitty looking game
>>
File: 1436153909471.png (2 MB, 1117x1208)
I bet you don't even force supersampling AA from the control panel.
>>
File: ss+(2015-11-09+at+11.13.55).png (91 KB, 651x970)
>>51264537
I'm sorry you're too retarded to google.
>>
>>51264526
>>51264389
>>51264378

If I remove the tessellation will Nvidia die?

I don't even know what the fuck tessellation does, considering Fallout 4 looks like some FPS from the early 2000s except for the vegetation
>>
>Gameworks
Neverbuying/10
>>
File: 1363287143441.jpg (87 KB, 487x659)
>>51264230
a-atleast amd is better for VR r-right?
>>
>>51264650
AMD is better in almost all respects. They're only losing at the very top end.

This is a single benchmark using unoptimized drivers from both sides. It only favors Nvidia at the moment because it uses newer gameworks code.
>>
>>51264650
But I was told Pascal will have like x2 on 3D whatever compared to Haswell

I guess AMD will get BTFO next year
>>
>>51264230
because certain developers tend to optimise their games for certain cards.

crysis, far cry, etc. etc.
>>
>>51264666

Flip it the other way, the same site has a 390x rivalling a 980ti in black ops III.

http://www.pcgameshardware.de/Call-of-Duty-Black-Ops-3-Spiel-55478/Specials/Test-Benchmarks-1176980/
>>
>>51264622
>his GPU doesn't support Gameworks
TOP KEK
>>
File: jews.png (87 KB, 539x488)
>>51264679
>x2

Dont you mean X10
>>
>>51264666
>AMD is better in almost all respects. They're only losing at the very top end.
But the chart shows that the GTX 960 is better than a 380 and 280x
>>
>>51264757
Please to reading comprehension.

If you don't reply to the whole post, don't reply at all.
>>
>>51264757
Only in this Nvidia-optimized game so far.
See
>>51264535

15.11 is a certified shitwrecker in Black Ops 3 and older.
>>
>>51264743
>supporting bad proprietary code
why would i do this?
>>
>GoyWorks

Every time. :^)
>>
File: old man huang.jpg (25 KB, 475x324)
>>51264788

'Cuz I fucking said so.
>>
AMD always wins in the long run. The 280X is already better than the 780, and they were released for $200 and $600 respectively.

Nvidia always wins because of early game ready drivers, which usually end up getting fucked up and crashing every other hour.

AMD takes a bit longer, but at least their drivers are stable, despite popular memes. They always end up performing better in the long run, no matter the game or card. It's been proven true for the last 5 or 6 years.

The owners of 780s must be thrilled to have less relative performance than a card that cost one third of their card's price when they bought it. And it's not even a new card, it's a card that existed back then too.
>>
>>51264808
>280X
>Released at $200

Might wanna check that again captain.
>>
>Buy inferior AMD junk with no drivers
>Blame Gameworks which pushes state of the art gaming forward

Typical AMDPOORFAGS
>>
>>51264848
Stale bait
>>
gameworks title? guess i'll pirate it. ;)
>>
>>51264624
>I dont even know what the fuck tessellation does considering Fallout 4 looks like some FPS on early 2000s cept for the vegetation
Well, it does nothing. An AMD dev complained that nvidia games (like Fallout 4) feature models with an inordinate amount of tessellation.

Yes, even UNDERGROUND models that you CANNOT see. This is done to sabotage AMD. This plus gameworks.
>>
>>51264917
Proof: http://hothardware.com/news/indepth-analysis-of-dx11-crysis-shows-highly-questionable-tessellation-usage

>Using an AMD tool called GPU PerfStudio, it's possible to see which objects and surfaces have been tessellated and where the GPU is spending the bulk of its rendering time
>there's compelling evidence that certain surfaces in Crysis 2--like concrete blocks--are rendered in extremely high detail
>Water is another issue. At present, there are multiple game areas where water is only visible in one small area or isn't visible at all. Despite this, the game is still rendering (and tessellating) an invisible ocean underneath the player's feet.

>Scott notes: "Unnecessary geometric detail slows down all GPUs, of course, but it just so happens to have a much larger effect on DX11-capable AMD Radeons than it does on DX11-capable Nvidia GeForces... The guys at Hardware.fr found that enabling tessellation dropped the frame rates on recent Radeons by 31-38%. The competing GeForces only suffered slowdowns of 17-21%."
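To put those percentages in frame-rate terms, here's a back-of-the-envelope sketch (the 60 fps baseline is illustrative, not from the article; the slowdown figures are Hardware.fr's worst cases quoted above):

```python
def fps_after_tessellation(base_fps, slowdown_pct):
    """Frame rate left after losing slowdown_pct percent to tessellation."""
    return base_fps * (1 - slowdown_pct / 100)

# Worst-case figures quoted above, applied to an illustrative 60 fps baseline:
radeon = fps_after_tessellation(60, 38)   # ~37.2 fps
geforce = fps_after_tessellation(60, 21)  # ~47.4 fps
```

Same scene, same "feature", but the asymmetric hit turns a playable frame rate on one vendor into a marginal one on the other.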
>>
>>51264970
>>51264808
>nvidiots fled thread after being presented good and well founded arguments

What a surprise
>>
>>51264970
Evil and Incompetent.
>>
File: cod-bo3.png (50 KB, 1004x958)
>>51264230
How come NVIDIA is getting rekt
>>
I've been playing the game, "God Rays" murders performance and I see no difference between Ultra and Low, but using Low nearly doubles my performance over Ultra. And those are my results even on a 980 Ti.
I think over the next few days you will see a recurring theme: Turn OFF God Rays or put them on LOW if possible. This applies to everyone pretty much regardless of hardware.
>>
AMD can't handle tessellation or DX11 because of their high CPU overhead

That's AMD's problem, nothing to do with Nvidia or Gameworks
>>
>>51265105
>AMD can't handle tessellation or DX11 because of their high CPU overhead
>That's AMD's problem, nothing to do with Nvidia

So how do you explain inordinate amounts of tessellation being used to render OCEANS underneath the character that you LITERALLY cannot see?

Refer to >>51264917 and >>51264970
>>
>>51265121
AMD's problem, if they can't handle tessellation then don't advertise hardware as DX11 compliant
>>
>>51265139

GCN tessellates just fine - you really don't need to crank it as high as Gameworks does for anything.
>>
>>51265139
It's made to gimp AMD on purpose though, how do you explain that?

They can handle it, and they do comply with DX12, just not as well as the opposing brand. That does not justify abusing its usage, which is common practice in games that Nvidia usually "cooperates" with.
>>
IMO this is blatant performance smothering in order to drum up GPU sales.
Tell me if losing an additional 30% of your frame rate is worth this difference:

http://images.nvidia.com/geforce-com/international/comparisons/fallout-4/fallout-4-god-rays-quality-interactive-comparison-003-ultra-vs-low.html
>>
>>51265158
>DX12
Meant 11, obviously.
>>
>>51265139
Nope. They can handle tessellation, but the performance penalty is 40%, compared to Nvidia's 20%, so Nvidia bribes the devs to use inordinate amounts of it to further cripple AMD cards.

You're a dumb fuck brand loyalist. If Nvidia wasn't anti-competitive both cards would benefit from higher frame rates, although AMD's line up would be on top more often than they are now.
>>
>>51265155
>>51265158
This is PC gaming, it's meant to push the limits and state of the art, otherwise go play on your AMD consoles

If your inferior hardware can't handle high tessellation which is valid usage of the GPU, too fucking bad
>>
>>51265139
AMD GPUs have absolutely no problem with tessellation. Nvidia knew they couldn't directly compete against AMD, so they put larger geometry processors in their GPUs and have all their sponsored games crank tessellation up to unreasonable levels, using x64 when anything past x16 is impossible to see on screen. It just wastes hardware cycles.
Nvidia also uses custom game profiles to have the driver interpret a call for X and instead render Y. They use driver-level shortcuts to get an edge in benchmarks rather than natively rendering what a game requests the GPU to do.

They would have been absolutely slaughtered in BF3 performance without doing so.
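Rough arithmetic on why the factor matters (assuming, as a simplification, that integer tessellation of a patch produces on the order of factor² sub-triangles; exact counts depend on the partitioning mode):

```python
def approx_subtriangles(factor):
    # A patch tessellated at integer factor n yields roughly n**2
    # sub-triangles (a simplification; real partitioning modes vary).
    return factor ** 2

# x64 vs x16: no visible difference past x16, but ~16x the geometry work.
work_ratio = approx_subtriangles(64) / approx_subtriangles(16)  # 16.0
```

So jumping from x16 to x64 roughly sixteen-folds the triangle load for detail nobody can see, which is the whole complaint.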
>>
>>51265182
See
>>51264611
>>
>>51265182
But it's not pushing anything. You can't even see it, as proved in the sources above.

I know you're just baiting now, but you're just looking like a dense motherfucker
>>
>>51265182

>If your inferior hardware can't handle high asynchonus compute which is valid usage of the GPU, too fucking bad
>>
>>51264230
>>>/v/
>>
>>51265182
>it's meant to push the limits and state of the art
So rendering invisible objects is okay? Fuck off shill.
>>
>>51265215
>not preloading objects
AMD fags
>>
>>51265168
I don't even see a difference.
>>
>>51265206
Maxwell handles async compute just fine, just look at the latest Ashes benchmark, it outperforms your MEME async compute

Stay mad though, AMDPOORFAGS and enjoy your console level graphics while Nvidia users enjoy beyond console level graphics with highest level of details
>>
>>51264525
>Betshitsda
I applaud your uncreativeness
>>
>>51265226
>>not preloading objects
You STILL have to render them, despite preloading the textures.
>>
What resolution is this at?
>>
>>51265231
I would have said they were the same image, but the angle of that tree's branches change.
10/10 effect
>>
>>51265270
Most likely 1080p considering the maxed settings.
>>
https://scalibq.wordpress.com/2011/01/19/catalyst-hotfix-11-1a-amd-admits-defeat-in-tessellation/

AMD can't handle tessellation, hence the driver cheats
>>
File: nvidia cannot into compute.jpg (83 KB, 1026x839)
>>51265239

>Maxwell handles async compute just find, just look at the latest Ashes benchmark

Ashes barely touches async, as per the words of Oxide themselves.

Still, enjoy your gimped compute abilities.
>>
File: 67232.png (34 KB, 650x450)
>>51265309
>some 4 year old bullshit

Look at how fucking desperate you are
>>
>>51265309
>AMD can't handle tessellation, hence the driver cheats
* that you have to manually enable on a per game basis

But no, it's okay that developers literally render invisible oceans beneath your feet with inordinate amounts of tessellation, gimping both cards in the process, but gimping AMD 40% and Nvidia 20%.

If you can't understand why this is bad, you should kill yourself, you worthless brand loyalist shill.
>>
>>51265309

>2011
>prior to GCN

Wew lad, are we gonna say the Nvidia TNT is shit because it doesn't support DX11?
>>
>>51265344
>it's okay that developers literally render invisible oceans beneath your feet with inordinate amounts of tessellation

Crysis did this, right?
I vaguely remember that little controversy.
>>
It's based on Skyrim's engine. Why would you want to go above 60 fps? A 980 ti is literally a waste of money on this game.
>>
>>51265321
Luxrender devs are well known AMD asslicking shills that can't write good compute code
>>
>>51265365
Not just Crysis, the article mentions 'other titles'.

>>51265382
>Why would want to go above 60 fps?
Not everyone has a 60hz monitor, dipshit.
>>
>>51265392
>desperate shitposting

How adorable.
>>
>>51265405
No one uses Luxrender junk

Everyone uses CUDA to do real work, but keep up your shitposting, it's amusing
>>
>>51265392
>>51265415


Whatever helps you sleep at night.
>>
File: 74799.png (31 KB, 650x380)
>>51265430
>Cherry picking benchmarks

Anyone can do that
>>
>>51265415
It's a hardware-agnostic compute benchmark using OpenCL. Trying to shit talk OpenCL is just beyond ridiculous.
CUDA is a proprietary compute language.

Your desperate tech illiterate shitposting really is adorable.
>>
File: 74801.png (35 KB, 650x380)
>>51265430
U MAD?

Nvidia CRUSHES AMD in compute despite not even optimizing at all for OpenCL junk
>>
>>51265449
>using single precision for any kind of scientific work
nobody is stupid enough to do this.
>>
>>51265449
Not him but
>Single Precision
>>
>>51265239
>maxwell handles async compute fine, guise, I swear
>even though maxwell does not have the hardware to perform async compute

>the latest Ashes benchmark, which had async compute significantly reduced after nvidia bitched about it to the developers, proves it!

>nvidia pushes the state of the art with gameworks(tm)
>except when AMD is ahead, in which case they pressure the developers to remove features and set graphics rendering advancement back so their inferior hardware doesn't have to suffer
>>
>>51265470
Single precision is all F@H uses

Shows what an ignorant moron you are
>>
File: 75495.png (33 KB, 650x400)
>Radeon HD 7970 is beating the R9 Fury X here.

>AMD
>double precision

Choose none
>>
>>51265465
the only thing this shows is Nvidia getting shitslapped at every level but the ultra-high-end.

Doesn't even have the Fury or Fury X, and it's obviously not using latest drivers. 295x2 doesn't even have proper support, since it's evidently running in single mode.
>>
>>51264230
>GameWorks™ by Nvidia®
>>
>>51265529

Fury is gimped down to 1/32 rate iirc - hawaii runs at 1/8 and tahiti 1/4 (again, iirc).
>>
>>51265571
Is the 290X using FirePro drivers here?
>>
>>51265571
This benchmark is fake

The 290/290X has its double precision performance gimped to 1/8, while only the FirePro is fully 1/2 enabled

Tomshardware can't benchmark for shit with fake results
>>
>>51264340
The physics shits itself past 60fps with Gamebryo.
>>
File: 59314.png (30 KB, 650x500)
>>51265571
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/18

>Meanwhile double precision performance also regresses, though here we have a good idea why. With DP performance on 290X being 1/8 FP32 as opposed to ¼ on 280X, this is a benchmark 290X can’t win.

>AMD
>double precision

CHOOSE NONE
>>
>>51265654
>stuck on same node for three years
>die sizes increase, try to control power draw
>reduce dp unit functionality to offset

>LOL WOW AMD A SHIT
>NEVERMIND NVIDIA WAS SHIT AT DP COMPUTE FROM THE START
>>
GameWorks (tm)
>>
970 owner here. Ofc amd is btfo, like usual
>>
>>51265571
>Double memecision
Who cares
>>
>>51267346

People who spend considerably more money on gpus than /g/.
>>
>>51267312
970 owner here, fuck off. I hate Nvidia, but a 970 for $220 is the best price to performance by far.
>>
Reposting from other thread so others know to put Godrays on low

http://images.nvidia.com/geforce-com/international/comparisons/fallout-4/fallout-4-god-rays-quality-interactive-comparison-006-ultra-vs-low.html

Literally no difference, and the Godrays are over-tessellated just like the Hairworks in Witcher 3, which hits AMD harder. This is what happens when you combine Bethesda's incompetence with Nvidia's bloated as fuck Gameworks.
>>
>>51264535
Tfw not including sli 980 and 980ti. R9 295x2 is like the best of the best of the 200 series in crossfire. I'm sorry but-
AMD SHILL BTFO
>>
The real problem is driver overhead with DX11 or lower.

The same tests that show AMD destroying Nvidia on DX12 (draw calls) show Nvidia destroying AMD on DX11 and lower.

The AMD 15.x drivers get better results but are far from perfect.
>>
>>51268075
You are going to be ass blasted as hell when nvidia games come out for dx12.
>>
>>51267511
The only difference I notice is that it's a bit blurrier in the center
>>
>>51268104
I'm not a fanboy or anything, I'm just presenting the facts as they are right now.

DX12 draw calls are higher on AMD, but on DX11 and below they're just crap.

DX11 will be around for some time, so AMD needs to lower the overhead while Nvidia needs to do the same on DX12 (Nvidia will probably need new hardware for this).
>>
>>51268075
Full dx12 games won't be out till next fall earliest. By then the new nvidia cards will be out with dx12 in mind. They're predicted to come out in Q2.
>>
>>51264230
money
exclusive contracts
>>
>>51268455
that's just what i said.. nvidia will probably need new hardware to increase draw calls, I'm not sure how pascal will work with dx12 so it's too early to say something about it.

otoh amd haven't reduced the overhead on the driver enough on dx11 and there's no dx12 games out there to shine.

In my opinion this is a good time to just wait for the next gen (I'm waiting with a 7970).
>>
>AMD fags
>M-MUH TESSELATION!!!!!!!!!!!!!!!!!!

>NVIDIA fags
>M-MUH ASYNC!!!!!!!!!!!!!!!!!!!!!!!
>>
>>51264848
>state of the art

Project Cars
>not even nvidia shills use this game for benchmarks
Witcher 3
>gets pushed 6 months, a week after this, nvidia gameworks logo appears
>x64 tessellation from hairworks cripples anything below the 980. Including Kepler cards
Bamham
>for maybe the first time in history, a large publisher unreleased their game from Steam due to bugs
>Nvidia trailer is a render of the game that is sped up
Crysis 3
>users find objects in the game that the player never sees, but has ridiculous tessellation
>under foliage, the non visible ground is heavily tessellated too

And the list goes on and on
>>
>>51268670
Works on my machine
>>
>gaming
>>
>>51264230
>jewvidia sabotaging every big title to run like shit on AMD
>full of low IQ idiots supporting jewvidia and claiming its just "faster"
>meanwhile everyone buys nvidia thanks to nvidia's jew tricks

I don't want to live on this planet anymore
>>
How well do you guys think this game will run on a laptop gtx 960m?
>>
game is shit, game is optimized like shit

runs like shit on both amd and nvidia cards

whats the problem here
>>
File: gamewreks.png (21 KB, 983x248)
> Turn gamewrekker off
>performance gap between 3yo 290x and 980ti supercharged buttmad edition drops by 5%
>>
>>51264230
>optimized by nvidia
>sponsored by nvidia
no idea
>>
>>51264230
Gameworks always screws over AMD cards

also is that x16 AA I see?
>>
File: Jewvidia.jpg (2 MB, 1200x7888)
>>51264917
>>51264970
>>51264624
>>
>>51265182
>If your inferior hardware can't handle high tessellation which is valid usage of the GPU, too fucking bad

You can't even see a reasonable difference after a short while; just render the game at 4k and have it downsampled to your monitor for a better overall effect
>>
>>51269389
Seems like an advert to nvidia more than anything

>vote with your wallet! buy an AMD card because they run modern games like shit!

vs

>buy nvidia card
>your games are automatically optimized
>>
>>51269389

>3200x1800
>All settings ultra+HairWerks
>60+ FPS


>...On two 290x

That jew gook from NVIDIA can suck a dick. I don't buy Nvidia simply because of their dirty-ass business practices, not because NVIDIA doesn't tend to offer higher performance, tricks or no.
>>
>>51269441
But if you buy an nvidia card you have to pay $100 more on average for an adaptive sync display, and you have to give up your personal info just to get drivers

besides, AMD cards are better at nearly every price point, aside from the super low end with the 750ti and super high end with an overclocked 980ti
>>
File: 1406670379171.jpg (41 KB, 493x309)
>Gameworks
>Game doesn't work
>>
>>51268740
>Nvidiots
>>
>>51269470
>But if you buy an nvidia card you have to pay $100 more on average for an adaptive sync display

Nobody is pointing a gun to your head and telling you to buy a gsync monitor.

Gsync monitors cost more because Nvidia has an actual standard/prerequisite for them and will not allow any random no-name company to tarnish the brand. Gsync monitors need a custom ASIC and a special backlight to ensure things like 3D, 3D surround, and ULMB work without issues. Gsync is a suite of features while Freesync is a single feature.
>>
>>51264230
What do you mean? a 390x is on par with a 980 at 4k and 1440p

http://www.gamersnexus.net/game-bench/2177-fallout-4-pc-video-card-fps-benchmark-all-resolutions
>>
>>51269606
I'm pretty sure no one actually gives a shit about 3D, and it's never a largely advertised feature

aside from that, if you're serious about gaming, then you're going to get an adaptive sync display at some point

and right now there's only 1 free-sync display that doesn't perform: the 1440p 144hz IPS display

but there's a number of 144hz TN panels that function fine up to 144hz for the free-sync range


You also have a much wider variety of free-sync displays to choose from, and intel is going to support free-sync as well

G-sync is going to die within the next year or so, as open source always wins in the end
>>
>>51269658
No, if you're serious about gaming you would get a gsync monitor for ulmb/lightboost.

http://www.blurbusters.com/zero-motion-blur/lightboost/
>>
>>51269658

>If you're serious about gaming

>You'll buy a new placebo faggatron monitor that costs 10x as much as the one you're using cause it has some buzzword "new" "life enhancing" technology!

>Curved

>Swords
>>
File: motion-blur-graph.png (43 KB, 576x460)
>>51269699
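For context on graphs like this: on a sample-and-hold panel, perceived motion blur is roughly eye-tracking speed times frame persistence, which is why strobed backlights (LightBoost/ULMB) cut blur so dramatically. A minimal sketch with illustrative numbers (the 1000 px/s speed and 2 ms strobe are assumptions, not measured values):

```python
def perceived_blur_px(speed_px_per_s, persistence_ms):
    # A tracked object smears across roughly speed * persistence pixels
    # on the retina of a sample-and-hold display.
    return speed_px_per_s * persistence_ms / 1000

# Illustrative: tracking an object moving at 1000 px/s.
hold_60hz = perceived_blur_px(1000, 1000 / 60)  # full persistence at 60 Hz, ~16.7 px
strobed = perceived_blur_px(1000, 2)            # ~2 ms strobe (ULMB-style), 2 px
```

Shorter persistence, not higher refresh alone, is what collapses the smear.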
>>
File: 4k free-sync.png (14 KB, 825x177)
>>51269709
the LG is comparably priced to other 4k IPS displays, and it has free-sync

also, adaptive sync refreshing gives you a noticeable gain in smoothness
>>
>>51269699
>>51269721
The first 2 comments of that article are people having trouble getting it working correctly

also
Currently, G-SYNC and ULMB is a mutually-exclusive choice – you cannot use both simultaneously (yet), since it is a huge engineering challenge to combine the two.

G-SYNC: Eliminates stutters, tearing and reduces lag, but not motion blur.
LightBoost/ULMB: Eliminates motion blur, but not stutters or tearing.

So, ya, can't use both at the same time, or did you just forget about that detail?
>>
>>51269760
>60hz
>5ms

That's god fucking awful. Pay extra and buy a PG279Q.

>>51269800
No shit, it even says so on the box. You can't do 3D and have it suck your dick at the same time either.

Switch to ULMB for competitive games such as csgo and Gsync for newer stressful games.


>The first 2 comments of that article are people having trouble getting it working correctly

Lightboost was an unofficial hack. Nvidia acknowledged it, refined it, and is now calling it ULMB.

I still can't believe you said to get a freesync monitor for serious gaming with a straight face, not knowing about ULMB. Ignorant turd.
>>
>>51269903
>>51269800
So whats the difference between freesync and gsync?
>>
>>51270600
freesync is like six piece chicken nuggets and gsync is like the happy meal. only good boys get happy meals. the rest get unhappy meals.
>>
>>51270640
>>51270600

The difference is Gsync lost because Intel (which is bigger than AMD and Nvidia combined) is supporting freesync.
>>
>uses gamewerks
good
AMD poorfags BTFO.
>>
>>51270600
one is for poor people and one is premium quality
>>
>>51270712
>>51270640
Is it unrelated to 144hz? Should I get a freesync monitor if I game a lot?
>>
>>51270768
Only if you can't afford a gsync monitor.

What kind of games do you play?
>>
>>51264392
top kek, thanks for new pasta
>>
Are there ever any good deals on cards during black Friday?
>>
>>51270791
Not answering my questions at all

Any games
>>
>>51270847
You sound like a retard so you know what go ahead and buy that freesync monitor
>>
>>51270890
You sound like a dumb faggot shill
>>
>>51270768

At any refresh rate above 40 or so gsync and freesync are basically identical - it's the really low refresh rates (i.e. 30 and under) where the differences matter.

Even then both technologies work best when you have a massively fluctuating framerate.
>>
nvidia trollworks and fucking stuttering, the game is a fucking mess like every other pc game released in the last 2 years. gonna keep playing old games from like 2007 on my r9 390 lmao
>>
>>51270791
Different anon

Fighting:Street Fighter, Dead or Alive
FPS: CSGO, COD, Crysis 3
RPG: Witcher 3, FF 15 (soon)
Sports: NBA 2k16, WWE 2k16
Moba: Dota 2 Reborn
>>
>>51270908
Is 144hz unrelated to x-sync? How do they function with 144hz? And what if you never even reach the refresh rate in frames?
>>
>>51270935
def gsync then. make sure to get one with ulmb.

buy one off amazon if you're unsure. they have an extended holiday return policy atm and you can try it out until feb 1st. at that point you can return if you don't like it or get a price match if the price drops.
>>
>>51270953

Each sync will have a range within which it works (let's say 30-144). Basically, any time you are inside that window your screen effectively acts as if vsync is engaged, i.e. no screen tearing regardless of what your framerate is doing.

Off the top of my head, gsync caps to the refresh rate of the panel whereas freesync simply disengages if you run above or below it. Gsync does some crazier stuff at really low framerates (i.e. under 30) but it's still not pretty.
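The behaviour described here can be sketched as simple logic (a simplification: real drivers offer vsync-on/off choices above the range, and newer G-Sync modules do frame doubling below it; the range numbers are illustrative):

```python
def sync_behavior(fps, range_min=30, range_max=144, gsync=True):
    """Simplified model of adaptive sync at a given frame rate."""
    if range_min <= fps <= range_max:
        return "variable refresh, no tearing"
    if fps > range_max:
        # G-Sync caps to the panel's max refresh; FreeSync just disengages.
        return "capped to max refresh" if gsync else "sync disengaged"
    return "below range: frame doubling or tearing, depending on hardware"
```

This is also why the supported range on a given FreeSync panel matters so much: outside it you get none of the benefit.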
>>
>>51270956
>gsync shill
>>
>>51270990
shoo shoo poverty ghost
>>
>>51270986
so does that mean i can run games as smooth as back when i had a CRT?
>>
>>51270986
Is a 144hz monitor strictly beneficial? Is it worse in any games?

Is it a safe bet if not every game will run at 144fps?
>>
>>51271005
Buying a $500-1000 monitor doesn't make you rich or luxurious, you delusional retard.

You're like Apple retards who think $1000 phones make them look rich.
>>
i hate niggers like amd is gud so is intel u nigger
>>
File: bf2ec9dbcf[1].png (58 KB, 996x662)
ok guys, here's the ultimate proof that shitworks is cancer
>>
>>51271008

Most likely, but my memory of how CRT works is vague.

>>51271013

>Is a 144hz monitor strictly beneficial?

It feels smooth but you get diminishing returns over about 80fps or so. The bigger issue is actually running games at this sort of framerate.

>Is it worse in any games?

Unless the game ties its physics to framerate, no; at face value more fps is always better.

>Is it a safe bet if not every game will run at 144fps?

It's just a number - as long as you are within the window of *sync your screen provides you get the full benefit. Some freesync screens for example only have a freesync range of 45-75hz.

Side note: not all freesync panels are identical, thus researching the freesync range is important (I really want a 4k screen that has a freesync range of 30hz to whatever).
>>
these graphs are useless u retarded niggers
>>
>>51271080
you should check out whether somebody made an EDID hack for the monitor you're interested in. i was able to push my freesync range from 47-75 to 32-75 on my LG 29UM67-P. and it worked with other freesync monitors as well.
>>
>>51271022
wow all that projecting

middle class and above wouldn't have reacted this way
>>
>>51271136
You're oozing raw faggotry
>>
File: 107.jpg (14 KB, 300x246)
>>51270956
>takes the time to explain that ULMB and GSYNC won't go hand in hand
> tries to convince him to get a gsync monitor with ulmb
>>
>>51271080
Wouldn't a 144hz monitor with freesync have it working at that range?
>>
>>51271175

Most likely but it never hurts to check.
>>
>>51271174
You didn't even know about ULMB before I told you about it. Now you're the leading expert on it trying to educate people (including the guy who told you about it) after I linked you a page to it?

Isn't this what they call mount stupid?
>>
>>51271216
except that i knew about ultra low motion blur mode on those asus panels..

it's just that it's not necessary if you don't own a sli setup to begin with
>>
>>51271225
If you knew about it you wouldn't have recommended freesync in the first place.

Just let it go dude, you got rekt like 2 hours ago and you're still mad, I know, but you gotta let it go.
>>
>>51271255
except that i just got in the thread 5 minutes ago. i'm not that same anon
>>
>>51270600
Free-sync functions on its own using the panel's built-in tech over DisplayPort

G-sync requires a little module to function, which adds about 100 bucks to the price, and it has a few other features

But when both are working correctly there's no difference, and free-sync has less latency, as seen here:

https://www.youtube.com/watch?v=MzHxhjcE0eQ
>>
>>51271271
>lying to save face

lol amdfaga
>>
>>51270768
>>51270935
You should be getting a free-sync display because it supports freedom and saves you money at the same time; unless you're going to buy one of the 1440p 144Hz IPS displays that run 500-700 dollars, free-sync and G-sync function the same

so basically buy a wasabi mango that can run at 120Hz 1080p, or 4k

https://www.youtube.com/watch?v=SPXdOpaNUtg
>>
>>51271396
Which 1440p 144hz freesync is good?
>>
>>51271426
This one, Linus said this is the greatest monitor he's ever used

https://drive.google.com/file/d/0B2b1_NkKXyhoODRrODF6RTNTTmM/view?pli=1
>>
>>51264230
>gameworks
nuff said
>>
>have 560ti
>college student, so don't have a lot of game time or money
should I just get a 290 and hope that memeworks sorts itself out or wait another year for 970s to drop enough so that I can afford one?

budget is $300 CAD, willing to buy used
>>
i can confirm the game is completely fucked on AMD, i have a r9 290 and get drops into the 30s in town areas while 970s run 60 solid

lowering the graphic options seriously only gets me like 5 fps more at best
>>
why do you care about benchmarks for a shitty game
>>
>>51264230
How come the 970 is beating the 780 ti?
>>
>>51271396
If I get Nvidia Pascal + Freesync, do I get less input lag at a very high 144fps?
>>
>>51271645
You can't use nvidia cards with free-sync until nvidia starts caring about your freedom, or the drivers get hacked so it thinks it's a mobile G-sync display (which is just free-sync)

In either case a 390 or a Fury is going to be able to push competitive games like CS:GO at 144fps no issue
>>
>>51271557
Are you retarded? Why would you get a 970 after having a 290?
>>
>>51271665
fugg

jews win
>>
>>51271593
drivers and architecture; look out for the 650€ 980 Ti being beaten by a 350€ GTX 1070 or whatever it will be called by then
>>
>>51271683
are you retarded? I said I have a 560ti and want to upgrade to either a 290 OR a 970, but as is, 970s are out of my budget, so it's either 290 NOW or 970 LATER
>>
>>51271738
You could get a 290 for $200 from newegg, last i checked
>>
>>51271713

No shit old gen would get cuucked so badly this time around. I hope the price stated holds, considering all the new parts that need to be bought.

Wonder how much the mobo would cost that supports NVLink on initial release
>>
>>51271776
cheapest 290 new in Canada is $320 and that's only the reference version. if I buy used I can get a tri x 290 for $250 (probably haggle to $225), which is what I'm looking at
>>
>>51271713
But shouldn't the 980 trade blows with the 780 ti?
Not have the 970 beating it?

Are AMD cards the same?
Does the 280x get beat by the 370?
>>
>>51271843
Get used, if it's not reference
>>
>>51271843
Don't buy used 290s, they were probably bitcoin mining cards. Just go for the 290 now: you get a full 4GB of VRAM and better DX12 support, plus you save money on the free-sync display down the road
>>
>>51264281
>Implying there aren't always day 1 drivers for AAA releases
>Implying Crimson isn't later this month
>Implying you own anything AMD
>>
>>51270792
>new
>>
>>51271912
>bitcoin mining
Didn't that meme die a couple of years ago?
>>
>>51271965
Why do you people just call everything and anything a meme?

It was viable for a while if you weren't paying your own electricity

but now the difficulty is too high so you'd have to move onto other coins
>>
>>51271593
780 Ti is Kepler, and nvidia gutted Kepler to run slower with their drivers so the 970 looks like a good card worth buying. If you install older drivers from before the gutting, your 780 Ti is gonna be faster again.
Nvidia, the way it's meant to be playedTM
>>
>>51271912
honestly I was never going to go nvidia. when 970s drop to $250, 290s will be $150, so there'd still be no reason to go nvidia over amd

and I think sapphire has a 2 year warranty so even a mined 290/290x should most likely be covered
>>
>>51272026
>when 970s drop to $250, 290s will be $150,

970s are currently $240, 290s are $330
>>
>>51272016
But wouldn't the newer driver still be more optimised for newer games than the old one?

Maybe they're just optimising Kepler less than Maxwell?
>>
>>51272181
Because 290 are discontinued.
And 970 at 240$ ? Not happening yet.
>>
>>51272259
He means b stock.
Not brand new.
>>
>>51272259

$240 on evga b-stock and $270 for the same new gpu with a game that can be sold to make up the extra $30 in the price

imo unless the purchaser is stupidly brand loyal like most idiots on /g/ are, the 970 is the best value for 1080p
>>
>>51265093
because nvidia is the apple of gpus
>>
>>51272310
290 is $200 though. And it's not gimped.
>>
>>51272310
This. I really hate Nvidia, but I managed to sell my R6 code to some friend of a friend for $40, so I got the cúck card for $230 until Greenland or Pascal come out. A 290 for $200 sounds great, but I don't dig the reference design, and consuming so much power idling with multiple monitors isn't good for me.
>>
>>51272346
>$200

maybe for a reference gpu that was used for bitcoin mining

>And it's not gimped.

no compression in gcn 1.0 and 1.1 so the 4gb of vram on a 290 is useless and will be consumed faster than the 3.5 on a meme card, this is why AMD only ships 8gb 390s now
>>
>>51272310
But isn't the 780 Ti the best value for 1080p @ $347, considering most benchmarks show it performing faster than a 970?
>>
File: powerdraw (1).png (114 KB, 602x363)
>>51272366
Well, if you're using nvidia i hope you're not on a 144hz screen.
>>
File: Newegg_meme_970.png (25 KB, 741x368)
>>51272377
No nigger, you could get new reference 290s on newegg several days ago. This was after my 970 purchase though.
>>
>>51272377
Brand new r9 290 is $199 you cock.
>>
>>51272394
Are you the same guy every time? I point out multi monitors and there's always someone bringing up 144Hz.

Keep at it, I actually didn't know before someone replied. It's hilarious how Nvidia brute forces it like AMD, but Fiji can stay static with such monitors.
>>
>>51272398
>>51272409

>reference

yeah, not worth it even for $200.
>>
>>51272377
>no compression in gcn 1.0 and 1.1
They've literally been using compression on textures for 10+ years now...
>>
I have a 280 X, am I fucked?
>>
>>51272431
I actually just found out about it an hour ago.
There's advantages and disadvantages to picking either brand.
>>
>>51272448
At least it's not a 970.

>>51272533
280x is a fantastic gpu, just wait for the new drivers, should he here in a couple of weeks.
>>
>>51272564
>At least it's not a 970.

the 970 wipes the floor with a 290
>>
>>51272366
>consuming so much power idling multi monitors isn't good for me
Ironically, it's one of the main things that made me quit nVidia. The GTX 470 went full load with 2 screens, meaning 90+°C on desktop. And a freaking jet engine noise.
>>
>>51272550
Yup. On load it doesn't really matter who you pick. Idling or doing shit with BluRays however is a different story.
Nvidia might save you a couple of bucks if you idle your computer enough with multiple monitors, but high refresh monitors make Nvidia go crazy and there's the stupid premium for g-sync monitors since nobody is getting freesync right with IPS panels.
>>
>>51272591
Niggers don't know previous housefires. As expected of Fermi.
>>
>>51272587
970 is a stuttering piece of shit with no async.
Falsely advertised as having more cache and rops, vram limited to 3.5gb, it is the worst card of this generation.

Fuck off shill
>>
tfw won't be until next year til I build a new PC.

I only have a 7850, it ran MGSV and GTAV well, hopefully I can run FO4.
>>
>>51272625
>muh false advertising

how it was marketed (which was shitty) has nothing to do with the performance characteristics of the card or the maxwell architecture

the 970 is a demonstrably better value and performer over AMD's offerings
>>
>>51272607
Is that multimonitor issue on their older cards too? Does it affect the gpu under load?

My 5870 idles around 55 degrees and 75 under load.

For comparison my cpu is 34 degrees idle.
>>
>>51272640
Good choice desu
>>
>>51272655
http://youtu.be/k9cKZiJw6Pk
>>
>>51272625
>async
>dx12
>botnet

I thought /g/ was against using Windows 10

So all these AMD fags wants Win 10 for their async meme.
>>
>>51264230
Its not just AMD getting rekt. Nvidia seems to be getting destroyed too. Whatever those guys at Palit are doing seems to be killing it for everyone else.
>>
>>51272781
Wait I thought Palit is the shittiest manufacturer
>>
>>51272448

Never got why Reference 290 is so hated, it just gets loud when you turn up settings while gaming. Never really bothered me that much. And hey...it doesn't sag at least.
>>
>>51272715
I only use my desktop for gaming though, why wouldn't I want dx12 support?

Everything else i do on my laptop.
My current gpu is a hd 6870, when I buy something, I buy it to last.

If you want a gtx 970 and will upgrade to pascal or 400 series amd then go for it, but if you want a good investment that will last a few years the 970 isn't something I can recommend.
>>
>>51272891
>loud

Not even that loud, it's the reference 290x you gotta look out for
>>
>>51272891

i went through amd reference hell for a few years with a 4870 and a 6970, can't imagine what it would be like with a 290 which uses considerably more power at load

compared to aftermarket coolers they really are a joke, my meme card isn't even audible to my amd damaged ears now
>>
>>51272891
It sounds like a jet engine when gaming. Unless you live in a rural area, gaming at night can become a police case.
>>
>>51272655
>the 970 is a demonstrably better value and performer over AMD's offerings
The 390 kicks its shit in m8
>>
>>51273289

the 390 draws twice the power, costs $100 more and only has 5% more performance over a REFERENCE 970.
>>
>>51273338
It only costs $100 more if you're in India, under a gaming load it hardly uses more power, and it gives you 125% more VRAM, and better DX12 support

the 390 has a 50% higher TDP, but that's only going to matter during stress testing
>>
>>51273338
>the 390 draws twice the power
I hope you're memeing
>>
>>51273383

AMD is like those CRT monitors: cheap up front yet expensive on the monthly electricity bill
>>
>>51264268
>Because Bethesda is horrid at development.
ftfy
>>
So do I get 390 or 970?
>>
File: 1444411359281.jpg (66 KB, 720x733)
>>51264289
hahahaha
>>
>>51265168
Is this what PC gaming has come to? Games that look like garbage with random caltrops shoved in the background that happen to tank your competitor's framerates more than yours?