
You are currently reading a thread in /g/ - Technology

Thread replies: 255
Thread images: 41
File: x7kgqrm4unv3549n88bn_bigger.jpg (16 KB, 73x73)
https://twitter.com/PellyNV/status/739890789208055808

nVidia purposely gives the press specific drivers that lower image quality to improve framerate scores. Then gives the public something else
>>
Is this supposed to be news or something?

Everyone knows this.
>>
Nothing new, everyone knows nvidia is ran by a bunch of kikes.
>>
/Nvida/ - Anti consumer lies and deceit.
>>
>>54949166
>>54949175
>you don't need proof when EVERYBODY knows it!
>>
>>54949137
hahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahahaha

this is Intel levels of jewry
>>
>>54949225
Nvidiots will deny the proof anyways, everyone with a brain has been aware of it for years now.

Thanks for the post I guess?
>>
>>54949137
>press only drivers

lol this fares well for their new products
>>
Yet they still have a crushing majority of the market and there is nothing you are going to do about it.
>>
>>54949137
yeah, I've seen memes about this, where some special effects are not seen in nvidia benchmarks.

but implying this news will deter the nvidiots.
>>
this has been standard practice on the green camp for a couple of "releases" now
i find it amusing that people still believe anything that the chink guy says in those """presentations"""
>bah, say the sheep
>>
>>54949137
I love how every idiot just jumps on the bandwagon that confirms his beliefs without any proof of them.
>>
File: 1442439906168.gif (3 MB, 300x252)
>>54949269
>>
>>54949366
>http://imgur.com/a/pyC3r
>>Much better texture detail on the right demo (GTX 1080) than on the left one (RX 480)
>no snow coverage on the 1080, only terrain visible
nvidiots are on full damage control
>>
>>54949468
You know you are comparing screenshots of a stream, right?
>>
>>54949493
GTX 1080 CAN'T STREAM, IT'S OFFICIAL, NVIDIA IS FINISHED, IMAGINATION MASTER RACE,
>>
>>54949493
you know im not blind and can see that THERE IS NO SNOW ON THE 1080 right?
>>
>>54949529
How do you know it's not randomly generated?
>>
>tfw those people thinking their drivers are gimped just get fixed drivers which render the games properly
>tfw no fac
>>
>>54949554
>drivers get fixed
>game runs as it should
>15% reduced performance
>buyer's remorse for not buying AMD or Maxwell.
>>
>>54949549
im not a stupid nvidia sheep and have run that game benchmark on my own
wtf nigga, take your damage control to >>>/reddit/

its a crappy game btw
>>
>>54949575
who the fuck can trust those 1080/1070 benchmarks now?

from now on, benchmarks need to show side-by-side game rendering, so we can make sure nvidia doesn't cheat.

incredible
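The side-by-side check being asked for is mechanical enough to automate. A minimal sketch of just the comparison step, assuming two same-sized grayscale captures as flat pixel lists and a made-up noise threshold (toy data, not a real capture pipeline):

```python
def mean_abs_diff(frame_a, frame_b):
    """Mean absolute per-pixel difference between two equally sized
    grayscale frames given as flat lists of 0-255 ints."""
    assert len(frame_a) == len(frame_b)
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def renders_differ(frame_a, frame_b, threshold=2.0):
    """True if the two captures likely differ in rendered content.
    The threshold is an invented tolerance for compression noise."""
    return mean_abs_diff(frame_a, frame_b) > threshold

# Toy 100-pixel frames: identical terrain except a patch of "snow"
# that one card renders and the other does not.
card_a = [200] * 90 + [255] * 10   # snow present
card_b = [200] * 90 + [120] * 10   # same scene, snow missing

print(renders_differ(card_a, card_b))  # True
print(renders_differ(card_a, card_a))  # False
```

With real screenshots you would compare decoded frames from the same timestamp and a threshold calibrated against re-encodes of the same source, so stream compression alone doesn't trip it.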
>>
>>54949632
1080 is probably between fury and 980 ti performance irl.
>>
>>54949632
>who the fuck can trust those 1080/1070 benchmarks now?
Who the fuck trusted them before?
>from now on, benchmarks need to show side-by-side game rendering, so we can make sure nvidia doesn't cheat.
Those have always been the only kinds to trust, Nvidia straight up cheats and lies, AMD presents an ideal situation and pretends that's how it will always be.
>incredible
Classic misdirection.
>>
>>54949632
Just install the press drivers. You won't notice any difference in image quality :^)
>>
>>54949549
Even Nvidia admitted it
>>
>>54949752
>Classic misdirection.
I just didn't even think this could be happening. I have to buy a GPU soon and almost fell for the 1070.

Now I need to wait for some trustworthy benchmarks.
>>
>>54949779
They should offer 2 different versions to download on their site:
>Nvidia p(erformance) drivers
>Nvidia q(uality) drivers
>>
>>54949804
Kek.
>>
File: out.webm (1 KB, 73x73)
>>54949137
>>
>>54949137
>amd shills at work

get out of /g/, we are nvidia here
>>
>>54950047
Nvidiafags belong on /v/. Your shilling is more effective over there anyway.
>>
>>54950047
lol
>>
File: original[1].jpg (357 KB, 1920x1080)
GTX1070 can only utilize 6,784 MB out of the advertised 8 GB of VRAM. HAHAHAHHA FK BY NVIDIA AHAHAHAHAHAH
>>
>>54950047
>we
just fuck off back to >>>/v/ kid
>>
>>54950099
6.7? you don't need 6.3. anything more than 5.8 is worthless!
>>
>>54950118
Oy vey, what do you need 5.3 GB vram for?
>>
>>54950099
is this a 6gb 980 Ti being benchmarked?
>>
File: 50591440.png (260 KB, 424x508)
>>54950118
>>54950135
>>
>>54950160
GTX 1070.
>>
>>54949137
who do we blame for this? the jews or the chinks?
>>
>>54950171
Oy vey, goy, you somehow got a faulty card. Other cards have 8 GB. Call customer support goy, we'll send you a new one.
>>
>>54949313
Lisa Su earned multiple degrees at MIT

Huang is a business guy!
>>
>>54950194
>you must pay shipping of course
1 month later
>it seems your card went above stock speed and as such is not covered by our warrantee, goy
>we will return the card and invoice you for shipping and labor
>have a nice day :^)
>>
>>54949632
We really do, it's that sad. I never thought Nvidia would be that cheap.
>>
Most people in here have known that Nvidia cheats its way through for years now.
What baffles me is why /g/ even bothers to defend them.
>>
>>54950099
pfffft srsly?
>>
>AMD never lies, their intentions are positive, please ignore how they lied about the multi GPU utilization % in their announcement to show a single card would perform better
>Nvidia is the only one motivated by being malicious, honest to god bugs in their drivers or in third party games that get patched quickly represent Nvidia acting maliciously
Both companies are shitty, face it.
>>
>>54950309

Who are you quoting?
>>
>>54950321
It's "whom" not "who".
>>
>>54950309
See
>>54949752

Fucking moron.
>>
>>54950309

Whom are you quoting?
>>
>>54949256
Heh, you were right after all-> >>54950309
>>
>>54950309
the nfagya ghost in his mind
take your pills, philip
>>
>>54950344
>AMD always presents an ideal condition
Are you trying to prove my point? That you guys are so tribal you will see someone calling both companies shit as primarily targeting your company because you have the preconceived notion that your company has more positive intentions than the other one? I'm not denying Nvidia does shitty things, but like when games that are literally patched in a day to fix performance issues are used as proof of gimping, it's clear Nvidia is being judged unfairly by AMDrones (and vice versa).
>>
File: 1194647542923.jpg (33 KB, 400x293)
>>54950402
>when games that are literally patched in a day to fix performance issues are used as proof of gimping,

You mean like how they "fixed" the first Assassin's Creed DX10 issue, right.
>>
>>54949632
The site I use the most for benchmarks, sweclockers.com, did this until a few years ago and has since seemed to be more pro-nVidia without being outright and obvious about it. Maybe they are bribed like everyone else.

Why does Huang ruin everything?
>>
>>54950428
I was referring to the Witcher thing actually.
>>
>>54949632
The 1080 is literally out now, you can already find benchmark videos from amateur youtubers (I guess that's what you'd call them). https://youtu.be/u3oGAxFtlug
>>
>>54950468

The Town memory leaking glitch wasn't fixed for almost 8 months.
>>
>>54950428
>You mean like how they "fix" the first Assassins Creed DX10 issue, right.

Hahahah, I remember that. They put DX10.1 support in it, leading to better framerates, but then they patched it out because Nvidia hardware had no 10.1 support.
>>
>>54950533

They patched it to remove the 10.1 support since it made Nvidia-sponsored games look bad, because AMD GPUs were the only ones with 10.1 support at the time.
Now neither of them has 10.1 support because it already got "fixed".
>>
File: Screenshot_20160606-175201.png (289 KB, 1440x2560)
>>54950525
The update to fix Kepler performance in the Witcher 3 came much quicker after launch. Like less than a month. It might not have been a day and I remembered that wrong. But Nvidia did improve the performance of their Kepler cards with an update very soon after launch.
>>
File: 1233441231492s.jpg (83 KB, 255x255)
>>54950664

So which textures did they lower in quality to improve the framerate with said driver?
>>
>>54950747
>AMDoofuses are this paranoid when anyone can upload to YouTube benchmarks for a game that's been out for a year
>>
>>54950780

>taking a re-used OP joke seriously

I thought /g/ was joking when they said Nvidia zealots were already infesting this place.
>>
>>54950747
>>54950816

texture quality doesn't even affect framerate like that
>>
>>54950099
Oh shit, it's DOUBLE 3.5.
>>
AYYMDPOORFAGS still trying to post lies about a winning company, sad pathetic people
>>
>>54951023
2*3,5 is 7. more like double 3,35 lol
>>
>>54951061
>when Nvidia jews out on 3.5
It's literally "what do you need these 3GB for?" at this point
>>
File: slide4.gif (12 KB, 550x168)
http://techreport.com/review/6754/ati-radeon-x800-texture-filtering-game

>full trilinear . . . all of the time

AMD, cheating & lying blatantly
>>
>>54951097
Did you seriously have to reach back a fucking decade to "prove" your point? It wasn't even AMD at that point.
>>
>>54951095
arent there laws against false advertising? the fucking EU has laws for everything, but when you get jewed, they let it slide
>>
https://blogs.nvidia.com/blog/2010/11/19/testing-nvidia-vs-amd-image-quality/

>Getting directly to the point, major German Tech Websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs instead of the default “Quality” setting in order to provide image quality that comes close to NVIDIA’s default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. AMD obtains up to a 10% performance advantage by lowering their default texture filtering quality according to ComputerBase.

>AMD’s optimizations weren’t limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA’s “default” driver settings.

>Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to ”High”, not the default “Quality” mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).

AMD got busted by German websites for cheats, doing 1 thing but saying another, pathetic company defended by pathetic AYYMDPOORFAGS blinded by their bias
>>
>>54951167
People tried to organize a class action lawsuit but Nvidia not so subtly threatened to sue every single person involved for defamation if they tried it. Not worth the risk to try and sue a megacorporation anymore.
>>
>>54950099
No way. No FUCKING way.
>>
>>54950099
HAHAHAHHAHAHAHAHAHAHAHHAHAHAAHAHAHAHAHAHHAAHAHA
>>
File: .gif (176 KB, 279x240)
>>54951304
Fucking what?
>>
>>54951304
> we're gonna sue you
> well WE'RE gonna sue you for suing us
I can't believe that's a thing
>>
>>54951210
>>54951097
Your examples are years old, stop grasping at straws.
>>
>>54950402
Don't be retarded. The issue was there during the review period. You know, when much of the marketing hype was generated. It'd be different if the bug was in a driver release a few months down the road and promptly got fixed, but that's not what happened.

Moreover, do you HONESTLY think that AMD noticed an Nvidia driver issue before Nvidia did?
>>
>>54951507
It's literally all they've got at this point.
> nVidia does something dodgy now
> BUT LOOK AMD DID SOMETHING DODGY YEARS AGO LMAO AMD BACKRUPT
>>
>>54950099
that test is worthless. it shows 3200MB on a 980
>>
>>54951548
lol, getting cucked by nvidia on all cards?
>>
>>54951558
same on amd
>>
>>54950099
DELETE THIS
>>
>>54951658
DELETE THIS
>>
>>54950099
holy shit
>>
>>54951518
I'm pretty sure it wasn't even AMD back then, Just ATI.
>>
>>54950099
hello sir this is patel from wccftech i am the sensior editor and i wud like to licnese ur post for our content pls contact at [email protected]
>>
File: GeFuccboiXXXGTR.png (944 KB, 1920x1080)
>>54950099
>>
>>54952488
EIGHT POINT FIVE GIGABYTES?
WHAT A STEAL!
AMD IS BANKRUPTURED AND FINNISH
>>
File: 1459612860710.jpg (33 KB, 432x576)
>>54950047
made me reply.
>>
>>54950047
Wut
>>
File: 111111.jpg (3 KB, 160x160)
>>
>>54951097
That's ATI dumbass, they merged with AMD in 2006
>>
>>54953309
No they didn't
>>
>>54953357
You're right, they got bought out by AMD. ATI is as dead as cyrix and 3dfx are.
>>
>>54950509
>increase clock speeds by 35%
>get less than 10% gains

Fucking top god damn kek.
>>
File: 1463909434051.jpg (229 KB, 647x1326)
>>54953414
>>
>mfw working at nvidia
>mfw driver version is way higher than that
>>
File: not an argument.jpg (44 KB, 225x253)
>>54950309
>>
When a developer uses the most inefficient way to render terrain (why do they need to use multiple shaders instead of a texture?!), I would trade it off for more performance. It is funny that Oxide doesn't give me this option. But I guess they are trying very hard to do so much work in every frame to put AMD in the number one spot.

And yes: even as a Maxwell user I want this option. Snow or no snow, it doesn't matter because graphics are not important here.
>>
File: image.jpg (285 KB, 2067x1602)
>>54950099
>what is memory being reserved for Windows UI elements
>all parts available for the test operating at full speed
What kind of dumbass do you have to be to think it's 3.5GB all over again? You're supposed to run that test headless, you fucking mong.
>>
>>54950099

>literally double 3.5gb

POTTERY
>>
>>54950664
Oh I remember that, I had a 770 at the time and it was a nightmare; that was the turning point to go for AMD.
>>
File: not an argument.jpg (24 KB, 225x253)
>>54953772
>>
>>54953832
A statement doesn't need to be an argument. I'm pointing out the flaw in his testing leading to the results he obtained.
>>
>>54949804
>Kekforce drivers
>Jewdro drivers
>>
>>54951510
AMD isn't even allowing reviews for their cards to be released until the cards are out. AMD depends on creating hype by keeping people in the dark.

And Nvidia probably knew about the issue, but hadn't finished the fix yet. Things don't get fixed the instant the bug is discovered.
>>
>>54951507
To be fair, AMDs current cards are just rebrands from that time period :^)
>>
>>54949137
>https://twitter.com/PellyNV/status/739890789208055808
CLASS ACTION WHEN
>>
>>54953772
Yes
>>
>>54955050
>embargo date
Doesn't matter since the benchmarking for the reviews is done with the fraudulent drivers. Thus consumers get falsified results on release.

Honestly, if you're planning to buy Nvidia, at least wait a few weeks for newer benchmarks and memory tests etc. God knows what they'll pull next.
>>
>>54949137

Calm down people, surely it's nothing, nvidia probably noticed what happened at amd's demo quickly, fixed it in the 368.25 driver and-

>368.25 was released before amd's demo

...well then
>>
>>54949137

>twitter posts
>pulling on straws
>random screenshots
>not a single living proof of anything

rarely seen AMDegenerates this poor and envious at the same time

Nvidia obviously won
>>
>>54950099
IT KEEPS HAPPENING
>>
>>54956677

>fanboy damage control
>>
>>54950099
THIS IS TOO MUCH :D:D:D THEY CAN'T GET AWAY WITH THIS
>>
>>54953730
Word for word, same reply as one of the resident nvidia shills and apologists from the anandtech forums. I bet you do it for free too.
>>
>>54950099
oops, they did it again
>>
AMD did the same with the 390 drivers, doing optimizations but leaving the 200 series out until people on the guru3d/overclockers forums called them out.
>>
>>54950099
HUEHUEHUEHEUEHUEHUEHEUHEUHEUHEUE
>>
>>54949137
>https://twitter.com/PellyNV/status/739890789208055808

#NVIDIAFAGS-KEKED-AGAIN
>>
>>54950047
shoo shoo
back to your cuckshed
>>
>No more performance improvements for Maxwell
>Only rendering half the scene to get more FPS for Pascal
This is how they get the "70% faster" bullshit claims when in reality it's only 40%.
>>
>>54950099
WEWEWEWEWEWEWEW

holy fuck
>>
File: cuckedagain.jpg (1 MB, 2582x1940)
>>54949137
ayy lmao
>>
>>54950099

>this is a 1070 test
>no source
>>
File: 1460290911172.jpg (67 KB, 591x960)
>>54959130
This you fucking shills
>>
>>54950099

now run this on any AMD card hang yourself

trying so hard to find a flaw its pathetic
>>
https://forum.beyond3d.com/threads/nvidia-pascal-reviews-1080-and-1070.57930/page-17#post-1919727

>Just checked it this morning on a GTX 1070. The average performance difference between 368.19 and 368.39 is 0.6%, well inside the margin of error.

AYYMDPOORFAGS BTFO
B
T
F
O
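The quoted Beyond3D check is just a paired before/after comparison of the same games on two driver versions. A minimal sketch of the arithmetic; every FPS number here is invented for illustration, not the poster's data, and the noise figure is an assumption you would estimate from your own reruns:

```python
from statistics import mean

# Hypothetical paired FPS results for five games on the two driver
# versions named in the post (invented numbers).
fps_368_19 = [88.1, 61.4, 120.7, 74.9, 95.2]   # review-period driver
fps_368_39 = [88.6, 61.9, 121.3, 75.1, 96.0]   # later public driver

# Per-game percentage deltas, new driver relative to old.
deltas = [100.0 * (new - old) / old
          for old, new in zip(fps_368_19, fps_368_39)]

avg = mean(deltas)
print(f"average delta: {avg:+.2f}%")  # ~+0.60% for these toy numbers

# Assumed run-to-run noise; in practice, rerun one benchmark several
# times on a single driver and use the spread you observe.
RUN_NOISE_PCT = 1.0
verdict = ("inside margin of error" if abs(avg) <= RUN_NOISE_PCT
           else "real difference")
print(verdict)
```

A sub-1% average delta sitting inside run-to-run noise is exactly what "well inside the margin of error" means in the quote.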
>>
>>54959218
6.7 GB
.
7
G
B
>>
>>54950099

>windows 10 year 2559
>no source
>names it original[1].jpg

literal negro
>>
>>54959542
https://twitter.com/RyanSmithAT/status/737042686683713536

>And yes, I checked. There are no ROP/L2 shenanigans going on this year.

You are stupid dumb faggot and it really shows

What other lies & slander desperate AMD defense force trolls will try next? :^)
>>
>>54959542

I feel bad for your existence, I really do

The 1070 has been tested and uses the full 8192 MB maxed out, so you kinda have to reconcile that with the 6.7 GB
>>
File: 1464811632311.png (67 KB, 616x596)
>>54959589
>>54959602
>>
>>54959620
>BUTTMAD AYYMPOORFAG WITH NO BALLS AFTER GETTING KICKED BETWEEN THE LEGS DETECTED
>>
>>54959602
The 970 also uses all 4GB.
>>
File: 1235.gif (2 MB, 200x177)
>>54959620

Classic suicidal AMDrone response. Oh my fucking sides. Nvidia won so fucking eassy without even trying.
>>
>>54949137
The nvidia logo looks like the fish from the bait meme in that picture.
>>
>>54959637
*3.5GB
>>
>>54959669
The 970 uses 4GB of VRAM, idiot.
It's called 3.5 because the last .5 is like a tenth of the speed and almost worthless
>>
>>54959674
If you utilize VRAM beyond the 3.5 GB, all the memory is forced to the slowest speed, effectively making all 4 GB useless
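The tools that exposed the 970's slow segment work by timing copies chunk by chunk and looking for a bandwidth cliff. A sketch of only the analysis step; the per-chunk readings are invented for illustration (real measurement needs a GPU API such as CUDA or OpenCL), and the 0.5 drop ratio is an arbitrary choice:

```python
from statistics import median

def find_bandwidth_cliff(bandwidths_gbps, drop_ratio=0.5):
    """Index of the first chunk whose measured bandwidth falls below
    drop_ratio * median of all earlier chunks, or None if the curve
    stays flat. Analysis only; no GPU code here."""
    for i in range(1, len(bandwidths_gbps)):
        if bandwidths_gbps[i] < drop_ratio * median(bandwidths_gbps[:i]):
            return i
    return None

# Invented per-chunk readings (GB/s) for a 4 GB card tested in
# 256 MB chunks: the last two chunks are an order of magnitude slower.
chunks = [150, 152, 149, 151, 150, 148, 151, 150,
          149, 150, 151, 149, 150, 148, 22, 20]

cliff = find_bandwidth_cliff(chunks)
print(cliff)                                          # 14
print(f"fast segment = {cliff * 256 / 1024:.1f} GB")  # 3.5 GB
```

On a card with uniform memory the function returns None, since the curve never drops below half its running median.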
>>
>>54959620

>NVIDIA USES 6.7 GB LOOK LOOK
>I HAVE NO SOURCE OR ANYTHING BUT LOOK GUYS LOOK
>go away retard this doesn't proof anything
>GUYS LOOK HAHA LOOK
>go away retard
>GUYS OH OK GUYS WAIT GUYS WAIT PLS LOOK
>guys ?? ?

You pathetic inbred. I can only imagine what your PC specs are like.
>>
File: 51144.png (43 KB, 550x450)
AYYMDPOORFAGS, they fell for the Turdozer 8 cores meme, it's pretty obvious they are using 8 half-cores Turdozer or Pileshitter HOUSEFIRES in their systems with outdated chipset that doesn't even support PCIe 3.0 16x
>>
File: h-heh.gif (2 MB, 300x241)
>>54959761
>Turdozer or Pileshitter

thx anon. i got a good laugh at that
>>
>>54959641
>Ignoring the rest of the thread
All the nvidia fans can do is claim OP is a liar, plus one retarded anon using 10-year-old reports to try and bash AMD.
>>
>>54950223
This has happened to me; never had the problem with Sapphire Radeons, always got covered.
>>
File: so ez.jpg (55 KB, 817x322)
>>54950099


GET SHIT ON KID
>>
>>54959761

>294.3 watts

what the fucking fuck burn my house down already holy shit
>>
>>54959761
As if anything currently out can utilize more than 2.0 at 8x. This is literally why Intel only puts 20 PCIe lanes on their mainstream i7s, 8 years after initially putting the controller on-chip.
>>
>>54959863
Yeah, the card has 8GB but can't utilize it all.
That's the point we're trying to make, thanks for helping.
>>
>>54959761
Awesome, huh? I bet you're jealous of those 4.8GHz; my 8300 can go up to 5.2GHz though, with a shitty AIO cooler.
>>
>>54960020
Who's jealous of 220W HOUSEFIRES that run SLOWER than Intel CPUs at STOCK speeds?

Clearly your brain can't even comprehend how stupid you sound
>>
>>54959927
That's the entire system's power consumption.
>>
>>54959761
Also, this chart is false. The FX-6300 never draws over 120W no matter what the load is; part of my job is to test hardware for market, checking whether it complies with what the distributor claims.
>>
>>54960034
wow, stop being so jealous anon. fuck
>>
>>54960046
My dad works at Nintendo and he said they're going with Intel on the NX because they're afraid if they went with AMD it'd cause too many house fires.
>>
>>54960034
>SLOWER than Intel CPUs at STOCK speeds
Are you sure? Back in 2012 when I had the 8300, even running at stock speed Intel had NOTHING to challenge it, not to mention at 4.8GHz, which an Intel CPU back then could not even achieve without liquid nitrogen.
>>
>>54960063
Actually Apple, Sony and Microsoft all are with AMD. Fuck Nintendo and you weeaboos.
>>
>>54960085
>weeaboos
Nintenshits aren't like that at all. They are mostly /v/tards and /co/tards and prefer the uglier american cartoon style.
>>
>>54960085
Apple uses Intel processors and switches between AMD and Nvidia graphics depending on what suits their needs better. Consoles use AMD because they're cheap, but look at how big the xbone is to accommodate cooling for the AMD apu kek.
>>
>>54960136
>but look at how big the xbone is
It's at that size to make it practically soundless.

The Xbox is lower wattage and lower temperature than PS4.
>>
File: 1385245882307.jpg (263 KB, 1432x1052)
>>54960136
>but look at how big the xbone is to accommodate cooling for the AMD apu kek

Nah, that's just shitty engineering
>>
>>54960136
Apple will go full APU, haven't you read the news?

The Xbone has almost the same chip as the PS4, and the PS4 has no problems; but thinking back on all the Xboxes, they always had cooling problems. It's not AMD's fault.
>>
>>54960162
Wat, a VCR is very organised, it's like clockwork.
You must be like 16 and not even know what a VCR is.
>>
>>54960162
You're wrong.

The Xbox 360 got a lot of criticism, so microsoft took that criticism to heart and made the system with no compromises in terms of cooling.

Cooling had higher priority than small form factor on the Xbox One.
>>
File: 1385054033867.jpg (8 KB, 300x225)
>>54960172
>>
>>54959998

[sources and citations needed]

last warning
>>
>>54960193
ANOTHER AYYMD GPU HOUSEFIRES
>>
>>54960078
You know the press conference where they showed off the new AMD card (480)? The PC they used had an i7 in it, as they render faster.
Imagine that, a company having to use a rival's product to boost their GPU.
>>
>>54960193
Nice, and as I said, all Xboxes have shitty cooling; that's just crap engineering
>>
File: you won't get the joke.jpg (94 KB, 551x360)
>>54960180
>You must be like 16 and not even know what a VCR is.

K. I am 16 and have no idea what a VCR is.
>>
>>54960156
Maybe they should have let it perform a bit better, because it's getting absolutely destroyed by Sony at the moment.
>>
>>54960162
The XB1 and PS4 have the same Jaguar based APU don't they? Just the XB1 APU is lower clocked? Why the fuck does design matter if it cools well, and is quiet while doing so? Just look at the cooler differences. A Large probably 120mm fan pulling air out of the case for the XB1, or a loud as fuck (they always are) blower cooler in the PS4. What does slimmer more "sleek" design mean if cooling/noise takes a dump?
>>
>>54960205
Yes I know, but I'm saying that the FX series was serious shit back when it came out, till like mid 2014.
>>
>>54960213
Im 41 & thus I get the joke. (we had beta at first)
>>
>>54949256
I'm an Nvidiot, and I didn't need proof for this. Of course they do that.

But what's the alternative? AMD? AHahaha. seriously now.
>>
>>54960172
>>54960221
>Xbone has almost same chip then PS4

The Xbox One chip is physically larger.
But the Xbox One chip also has fewer graphics compute units, so the APU runs cooler at lower wattage.
You guys are so ignorant.
>>
File: leo.jpg (9 KB, 219x239)
>>54960228
>Im 41 & thus I get the joke
>>
>>54960218
They can't.

They were stuck with slow DDR3 RAM and had to compensate with a large pool of on-die ESRAM.
The ESRAM took up the real estate that could have been used for graphics power.

So the Xbox One ended up with less power despite being the larger chip.
>>
>>54959733

He is probably one of those fags that runs a game at high settings with 2xaa and tells people on forums it runs "maxed out"
>>
>>54960242
http://www.extremetech.com/gaming/156273-xbox-720-vs-ps4-vs-pc-how-the-hardware-specs-compare

>The PS4 and Xbox One CPUs are virtually identical, except the Xbox One is clocked at 1.75GHz, while the PS4 is at 1.6GHz.

The only thing different is their GPU. That is where the compute units come into play as well as Die size. I was talking pure architecture. They're both Jaguar 28nm based.

And if anything that makes it worse on Sony's part. Faster clocked, and higher end GPU. Shouldn't they have been the ones who had overkill cooling? My buddy bought the PS4 Destiny edition or some shit. In a 19C room, the PS4 started complaining of heat after 2 hours. He had to point a fan directly at it to stop it from kicking him back to menu due to heat.
>>
>>54960240
>But what's the alternative? AMD? AHahaha. seriously now.
Oh god I don't think you understand how sad and pathetic that sounded.
>>
File: 1464818485425.gif (3 MB, 200x200)
>>54960329
Almost as sad and pathetic as AMD's Linux drivers
>>
>>54960345
You mean the public ones or the pre...
>>
File: 1463754466780.png (296 KB, 908x531)
damn Nvidia defense force out in full effect too bad it's a bit late m8s
>>
File: 1463785863226.png (1 MB, 1186x635)
Foundry Edition FTW
>>
>>54960318
>The only thing different is
If one chip has 50% more GPU then you can't say they "have almost the same chip".
It would be like saying Polaris 11 and Polaris 10 are the same chip.
It's just wrong.

>I was talking pure architecture.
On the CPU you would be right, but 33% of the Xbox One chip consists of ESRAM, which is nowhere to be found on the PS4 chip.

Hence I feel like pulling out all the hair on my head when people say they are the same.
>>
>>54960355
>bleeding alpha on nvidia's side
Top fucking kek
>>
File: 1459614894918.png (393 KB, 512x542)
GFexperience.exe
>>
>>54960329
Why, because he doesn't want to buy from a specific company? I had many AMD setups: a single R9 290, R9 290 crossfire, then a single R9 390X (Sapphire Nitro OC, etc). While the cards ran fine on games that actually played nice with AMD, they ran like dogshit in 90% of new titles. It would take a week or more to get a patch that allowed gameplay without severe FPS dips or outright crashes. My friend was having trouble with Dark Souls 3 on Low settings, 1600x900, no AA, because of some bullshit about the bonfires casting light. On an R9 380X.

Meanwhile Nvidia is plug n play most of the time. Have they made mistakes in the past? Of course. What company hasn't? (Their "card killing" drivers were quite something) But overall since getting my 980ti, the vast majority of problems have disappeared. Not to mention stuff like shadowplay. Nothing even touches it.
>>
File: 3.5.png (358 KB, 600x888)
>>
>>54960406
Enjoy your gimped graphics for an fps boost.
>>
>>54960318
>>54960374
The XB1 uses that extra die space for the eSRAM that's being used to make up for the GPU performance deficit stemming from the use of regular DDR3 instead of GDDR5. However, the GDDR5 with its much slower access times does cripple the PS4's CPU, so it's not just the higher clock speed that sets the XB1's CPU apart from the PS4's.
>>
>>54960431
>I would rather have an unplayable game than a playable one

AMDcucks in a nutshell
>>
>>54960438
>However the GDDR5 with it's much slower access times does cripple the PS4's CPU
It doesn't.
Games run at far higher framerates on PS4 than on Xbox.

The memory-timing thing was just bullshit spread by anti-console /v/tards.
>>
>>54960431

LOL ?
>>
>>54960406
Just wait for the new driver to fix the new title; it has always been like that. Nvidia just does it earlier because they cooperate with game devs.
>>
File: 1386952836995.jpg (409 KB, 1272x721)
>>
>>54960355
>>54960369
>>54960390
>>54960413

>lost every argument got educated and told. Better post some NVidia images that I created in my spare time to show off my hate against NVidia. Truly new level of autism.
>>
>>54960431
>Says the guy who is banking on DX12 or Vulkan

Yup, I sure will enjoy my close-to-40% better performance with less heat and noise. When the newest latest and greatest comes out, and I feel my 980ti is getting too slow for my liking, I'll buy something new, whether it be AMD or Nvidia. AMD can make the absolute BEST monster of a GPU and I'd still buy nvidia if the driver support is still terrible. What fucking use is all that power if you can't utilize it?

Before my 980ti I had a Fury. Sapphire Nitro Fury. Plagued with problems. I gave AMD a chance. I'll give them another when the RX490 hits. But I'm not exactly confident in AMD.
>>
>>54960521
So why the fuck doesn't AMD? It costs money, yes, but MUCH less money than they stand to lose because their card can't play Timmy's new Call of Battle Duty 7 on release day. Swear to god, it would sound like an infomercial.

>Timmy couldn't wait to play the new game when he got home with his friends
>They all have Nvidia and were online in minutes
>Timmy's PC was AMD, and kept black screening
>Remember kids, don't cut corners. Don't buy AMD, buy Nvidia.

When the 390X came out it was like a $500 card. I should NOT have to wait a week or more for basic functionality. Tweaks to the driver to squeeze an extra few FPS out of it sure. But I remember stuff like COD BLOPS 3, GTA 5 (until they launched their game day beta driver which was still garbage for xfire), and Fallout 4 ran like dog shit or not at all. This is inexcusable.
>>
>>54960542
I'm confident in them, because I have no fucking problems. I won a 980 Ti a while back, great card, I had been running a HD 6970 before that and a 9800 GTX before that.

Sold the 980 Ti and got myself a R9 390X for half the price, works great, I run a 1080p@75Hz anyways.
>>
>>54960531
>Dat low resolution grass on the Xbox

Anyway, Microsoft is taking the underpowered-hardware criticism to heart, and the rumours say the Xbox Scorpio is stronger than the PS4 Neo.

Not that it matters though, the PS4 Neo will still win over the Xbox, because it's built on the Polaris architecture which is available here and now.

The Xbox won't arrive until a year later, so it's kind of fucked even if it's stronger.
>>
>>54960613
Nvidia gets that because it has the market; companies don't want to cooperate with AMD because of the low market share.

Then go with Nvidia and don't wait. I had no problem waiting 4 days for the Fallout 4 driver and enjoying it then.
>>
>>54960613

>so why the fuck doesn't AMD?

Sadly exclusivity contracts are a thing. And Nvidia is a company that does excel at them.
>>
>>54960613
>>54960656
Also, Nvidia provides more support to developers who choose to cooperate with them.

That's why people say "nvidia gimps games".
>>
>>54960613
>>>Timmy couldn't wait to play the new game when he got home with his friends
>>They all have Nvidia and were online in minutes
You fucking idiot.

Timmy and his friends play Call of Duty on the PS4.
>>
File: 1465117201854.png (66 KB, 400x400)
>>54960669
They don't pay shit to the devs. They send engineers to help the devs. That's the help devs actually need. Do your fucking research.

AMD never really helps devs, and when they do, they usually don't help much.
>>
>>54960700
They send in engineers to fuck up the video games with gameworks cancer.
>>
>>54960700
Yeah, to gimp the games in Nvidias favor.
>>
>>54960678
This. Lonely neckbeards play on PC.
>>
File: 1384351926086.gif (2 MB, 772x640)
>>54960743
>>
>>54950099
>6.5gb

the meme continues
>>
>>54960775
Yeah, console games look so much more fun, PC gaming will totally die.
>>
>>54960731
>>54960720
Please be more specific about how they accomplish this.
>>
>>54960678
that's true. Good point. Was just trying to make an example.
>>54960656
Then you and I wouldn't see eye to eye. I paid the premium price tag of the 290X when it first launched: $500 + tax and shipping, close to $550. I paid for an item or service; I require that this item or service do what I want. When it doesn't, I complain, as a consumer is wont to do. While you're willing to wait, many others are not. And this is what causes loss of sales.

I'd much rather have a company that takes longer to make quality drivers than a company that keeps pumping out half assed ones that barely work. Although the only thing I will admit is that AMD cards seem to get stronger and stronger as time goes on. That's the problem. They should start as strong as possible. Not plague ridden. This is why AMD gets known as the "bargain brand"
>>54960663
>>54960669
This much I know, but even so, AMD can play the same game. Look at how many AMD sponsored games there are compared to Nvidia's "the way it's meant to be played" garbage. AMD keeps cranking out cards that look amazing on paper but fail to deliver until a month later.
>>
>>54960775
What movie is the left one from?
>>
File: 1464351997346.jpg (29 KB, 500x509)
>>54960700
My post

>>54960720
>>54960731
Well amd should fucking fight back and add their own shit too. Maybe if amd worked more with devs, then we wouldn't have this problem.

>gimp works wah
>amd iz life
>devs are gay cuz it runs like shit on amd
>not being mad at amd for working closer with devs.

Kill yourselves please. Getting mad at the wrong crowd. Blame amd, you cucks.

>pic related on what amd is too busy doing
>>
>>54960794
Gameworks, it's using CUDA. Nothing more.
>>
>>54960034
RANDOMLY capitalising words IS so fucking ANNOYING you fucking reTARD.
>>
>>54958455

Where do you think it was pasta'd from? The scary thing is nobody else on /g/ has called it out - shows the state of this board huh?
>>
>>54960821
>>54960798
>Well amd should fucking fight back and add their own shit with
That would hurt Nvidia customers.

Did you ever think maybe AMD doesn't want to hurt innocent bystanders just because they chose to buy Nvidia?
>>
>>54960798
>>54960821
Yeah, nope, AMD won't cuck its customers, even if they wanted to, they can't afford it.

If you like to be cucked, if your wife fucks other guys and you don't care, if you pay extra for shit while your older hardware gets no support and you are constantly being lied to, then Nvidia is your game!
>>
>>54960834
Why would I take the time to call it out when it's pretty clear that whoever wrote it was fishing for (You)s?
>>
>>54960846
>>54960872
The only reason I admire AMD.
>>
>>54960846
I own 980 Tis in SLI but even I know that
Nvidia would do anything for the $$$
Fucking jews....
>>
>>54960895
There are plenty other reasons.

The Radeon Technology Group is overall far smaller than Nvidia, and they have fewer financial resources too, but they still manage to bring out cards like Polaris 10 which are highly competitive.

That alone is admirable.
>>
>>54960846
>muh corporations have morals, only Nvidia is bad
They can still send engineers to help game devs optimize for AMD without gimping Nvidia.
>>
File: 1464777195097.webm (943 KB, 1280x720)
>>54960821
This. You fags are all communist cock sucking fags who don't understand business

>but fairness

This is why you are poor. The world isn't fair. Fucking losers, this is a cruel world and only the strongest survive.

Rip amd
>>
>>54960798
>I'd much rather have a company that takes longer to make quality drivers
That would be AMD. That's why you get polished drivers that support the game.

You notice it when you play with an older driver from before the game was released and then with the new one. That's exactly what a GOOD driver is.

>inb4 Nvidia gives good drivers from the start
they are OK drivers, they polish them too, you just don't notice much of the crap at first
>>
>>54960941
>They can still send engineers to help game devs optimize for AMD
The PS4 is already doing that for AMD.

The game engines for all the larger publishers are being optimised for GCN.
>>
File: 1460192331019.jpg (10 KB, 238x211)
>>54960941
>>54960821


/Thread fags now stfu
>>
>>54960846
>>54960872
yea great. While AMD takes the moral high ground, Nvidia takes more and more of the market share. I admire AMD, I really do, but business is business. Who's to say AMD sponsoring the game means they HAVE to make Nvidia run like shit? I'm simply talking more press. Super popular games would get people considering AMD if their logo was splashed on the title screen or loading screen when the game first started up.

Something. Anything. It just seems like AMD launches stuff and never follows through and never commits. Maybe if Zen is a huge success and brings AMD money, they can market better?
>>
File: image.png (214 KB, 750x1334)
Reminder that the 480 is equivalent to a 290, and gets BTFO by a 1070
>>
AMD offers a shit APU for this console.
>>
>>54960985
$200 card gets beat by a $400 card? no shit?

you heard it here first
>>
>>54960957
Sponsoring games requires money.

The smart way is to get your GCN architecture into the PlayStation and the Xbox, and tell the developers their game is going to suck ass if they don't optimise for the consoles.
>>
>>54960998
Everything is relative. Nvidia offers even shittier APUs.

Just wait until you see the Nintendo NX, it's going to be the laughingstock of the entire industry.
>>
>>54960985
why do you keep posting this? why did you cut the 1440p benchmark?
>>
AMD went 10 months without releasing a new driver last year and when they finally released one it killed cards.
>>
>>54961031
It's a phone screenshot; it's not that hard to find the rest of the benchmarks.
>>
>>54959761

>lets compare 2012 amd vs 2008 intel
>>
Here's some proof of nvidia gimping:
https://www.reddit.com/r/buildapc/comments/2one2z/discussion_has_nvidia_forsaken_kepler_cards_has/
>>
>>54960775

orange shirt kid knows this is retarded
>>
>>54961144
>buy Nvidia
>play games

>buy amd
>immediately get on social media to tell others how shit Nvidia is
>>
>>54961055

Nvidia has done that half a dozen times
>>
>>54961144
>tfw AMD is going to start doing the same thing since it works so well for Nvidia
>>
>>54960944
I take away property and money from single moms without work. Am I fair? NO. Do I use AMD? YES.

Seriously, people these days.
>>
>>54950099
> Implying Mb == MB == MiByte
> Implying Gb == GB

God damn is /g/ really this retarded?
>>
>>54961236
Probably yeah

Thank fuck CPUs aren't reliant on drivers, so Intel couldn't do the same.

>Haswell, now 40% faster than Ivy!
>>
>>54961288
6650 MiByte ain't 8GB, shitlord. Nice try though.
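For anyone genuinely confused by the prefixes being argued about here: Mb (megabit), MB (decimal megabyte), and MiB (binary mebibyte) are all different units. A quick sketch of the arithmetic, using the 6650 MiB figure from the post above:

```python
# Binary vs decimal prefixes: 1 MiB = 1024**2 bytes, 1 MB = 1000**2 bytes,
# and Mb (megabit) counts bits, not bytes (8 bits per byte).
mib = 6650                   # MiB figure quoted in the thread
nbytes = mib * 1024 ** 2     # total bytes in 6650 MiB
gb = nbytes / 1000 ** 3      # decimal gigabytes (GB)
gib = nbytes / 1024 ** 3     # binary gibibytes (GiB)

print(f"{mib} MiB = {gb:.2f} GB = {gib:.2f} GiB")
# → 6650 MiB = 6.97 GB = 6.49 GiB
```

Either way you slice it, 6650 MiB is roughly 6.5–7 GB, nowhere near 8 GB.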