W3680 Fag Who Knows PCs Better Than You Reporting In

You are currently reading a thread in /g/ - Technology

Thread replies: 46
Thread images: 6
File: 1447761540489.gif (372 KB, 499x281)
Hi everyone. In addition to this being your daily reminder that 90% of PC gamers don't know what the fuck they are talking about when it comes to building a cost-effective gaming system, I have come back from a weekend of stress testing my garbage build, which many of you were apt to make fun of due to your limitless ignorance and utter lack of initiative.

Here's the system spec for reference:
CPU: Intel Xeon W3680
CPU Cooler: Cooler Master Seidon 120V AIO Water Cooler
Motherboard: EVGA FTW3
RAM: 4x4GB (16GB total, dual channel configuration due to my motherboard being a repair job and not all slots working)
GPU: 2x EVGA GTX 760 SSC 2GB in SLI
HDD1: Patriot Blaze 120GB SATA III SSD on a SATA II connection (don't ask)
HDD2: 250GB 7200rpm SATA III HDD
HDD3: 500GB 7200rpm SATA II HDD
Cooling Subsystem: 5x 120mm fans + 1x 40mm
Case: Antec GX500 modded for frontal water cooler mounting
PSU: EVGA SuperNEX 750W Gold rated modular PSU
OS: Windows 10 Pro 64-bit (upgraded from Windows 7 Pro)

In the last thread, I had people request that I benchmark my system with the following three titles: Witcher 3, DayZ Standalone, and Assassin's Creed 4: Black Flag. It was theorized that DayZ and AC4:BF especially would make my processor weep, since they demand high single core performance.

Here are the results, and a question from me to the unwashed masses that will follow:
>>
Witcher 3
Settings: 1080p, all settings maximum except for vsync and hair, Nvidia SLI settings enabled
Avg FPS (as measured by PlayClaw 5): 45-60fps depending heavily on area (locked at 60)
Avg. CPU usage: ~30-50% across all 12 threads
Avg. GPU usage: ~100% on both cards
Avg. RAM usage: 8gb

DayZ
Settings: 1080p, all settings maximum except for AA, Nvidia SLI DISABLED
Avg FPS (as measured by PlayClaw 5 and Steam Overlay): ~24 (never went lower than 14, never went higher than 50 - cities were the worst)
Avg. CPU usage: Very uneven and jumpy usage across all threads; usually about 4-5 threads picking up 80% of the work; one thread in particular hovering between 50-91% utilization, with an average of around ~75% utilization. Rampant thread switching.
Avg. GPU usage: ~50%, never saw it go to 100% or really even close. Maybe 80% a few times?
Avg. RAM usage: 6gb

Assassin's Creed 4: Black Flag
Settings: 1080p, all settings maximum, Nvidia SLI settings enabled
Avg FPS (as measured by PlayClaw 5): solid 60fps (locked at 60)
Avg. CPU usage: ~40-90% usage on a single thread at all times; definitely the bottleneck at times; minor usage across up to five additional threads
Avg. GPU usage: ~80-100% on both cards
Avg. RAM usage: 6gb
>>
>>53378069
Witcher 3 was pretty much as expected.

As you can see, DayZ was really something of an anomaly. Based on usage stats alone, there's no bottleneck - yet the FPS experienced was extremely, noticeably low. HOWEVER, gameplay did not feel as choppy as that FPS would normally feel in other games; in fact the game was pretty much playable, though it did have some very undesirable choppiness when moving quickly in large urban environments. I really don't know what to think of this benchmark, so I'd appreciate it if someone more experienced with DayZ would speak up and tell me what's going on.

AC4:BF was unusually good. Using the Nvidia GeForce Experience SLI preset settings not only made my GPU performance very good, but it (somehow?) seemed to lower CPU usage slightly as well. In the past I had been very CPU limited in Havana and other busy areas, with a single core sitting at 90-100% usage pretty commonly, but not now. Either Nvidia has some magic to offload processing work onto the GPU, or perhaps the game has been optimized since my last benchmark. Ultimately it's no longer a good benchmark for this system, as it was easily playable.

Hatred, admissions of inferiority, and any other commentary on that spectrum of honesty are appreciated.
>>
>AIO Water Cooler
:^)
>>
>>53378084
Oh yeah, I forgot to mention CPU temps for all the games! You're gonna like this:

CPU Clock: 4ghz (670mhz OC)
CPU TDP: 130W stock
CPU temp across all cores, at all times: <40C
Decibels: <30 at all times
Swag: >than u at all times
>>
>>53378062
>it's a "hey give men attention because I'm insecure about my aging rig" thread.
>>
Please at least try when you are baiting
>>
>westmere
>2016

meh

A 4.6ghz westmere is like a stock 3.3ghz 5820k.
>>
>>53378342
>>53378358
>insecure about all that money they wasted
lel
>>
>>53378390
OP here, I wouldn't go that far. Anyways, I can't get my rig up that high without massive and unsafe overvolting.

I'm not here to say the chip is the most powerful thing on the block. I'm here to say that something with this IPC, frequency, and core count works really well for today's games.
>>
>>53378166
I don't know what that w3680 uses stock, but I took the liberty of calculating your OCd power draw:

130 (stock tdp) * (4000 / 3330) * sqrt(1.325 / 1.1125) = 170 watts

I am assuming your voltage settings as well as the stock VID at stock speeds. I am also assuming you have turbo disabled.
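For anyone who wants to sanity check that, here's the same arithmetic as a throwaway Python snippet (numbers are the ones quoted above; the stock VID is an assumption, and note that the more common rule of thumb scales power with f * V^2, which lands higher than 170W):

# Rough OC power-draw estimate for the W3680, values as quoted in this post.
stock_tdp = 130.0                 # W, stock TDP
f_stock, f_oc = 3330.0, 4000.0    # MHz
v_stock, v_oc = 1.1125, 1.325     # assumed stock VID vs. OC Vcore

as_posted = stock_tdp * (f_oc / f_stock) * (v_oc / v_stock) ** 0.5
f_v_squared = stock_tdp * (f_oc / f_stock) * (v_oc / v_stock) ** 2

print(f"as posted (sqrt of voltage ratio): {as_posted:.0f} W")    # ~170 W
print(f"f * V^2 rule of thumb:             {f_v_squared:.0f} W")  # ~220 W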

>>53378390
only with FP operations, really.

General performance improvements are roughly as follows
Nehalem > SB = ~13%
SB > IB = ~5%
IB > HW = ~7%

3.3 * 1.3 = ~4.3
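Compounding those three uplifts gives a factor of about 1.27, which is where the ~4.2-4.3 figure comes from; quick sketch in Python:

# Compound the per-generation IPC gains listed above.
gains = [0.13, 0.05, 0.07]   # Nehalem->SB, SB->IB, IB->HW
factor = 1.0
for g in gains:
    factor *= 1.0 + g

haswell_clock = 3.3          # GHz, roughly a stock 5820k
print(f"cumulative IPC factor: {factor:.2f}")                         # ~1.27
print(f"equivalent Westmere clock: {haswell_clock * factor:.1f} GHz") # ~4.2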

>>53378428
My x5650 hits 4Ghz with 1.225 on the rails, 1.21 to the chip.

I need to get a 5670 for that 24x (all core) turbo multi, so I can push 4.4 at 1.25-1.28 instead of doing 1.3625 like the 5650 wants.
>>
>>53378494
thanks for the calculation. It's a bit off because my voltages and VID are all stock; nothing changed there. I also have kept turbo enabled, I think (I actually don't remember since it so rarely matters anymore past 4ghz).

what would a safe voltage for my chip be, do you think?
>>
>>53378062
This seems like a super rig to me. Water cooled, bridged GTX 760s, too bad about the 2nd dual channel being wack, but this rig seems awesome to me (rookie).
How is win 10 working for you? I'm hesitant to upgrade. What's the max RAM win 10 can utilize?
>>
>>53378428
>I wouldn't go that far
I upgraded from an x5650 at 4.5ghz to a 5820k. Those results were what I got running cinebench.

I did feed the chip 1.46v to get there, but it was stable. Too bad the motherboard took a shit on me.
>>
File: b 9068.png (27 KB, 401x407)
>>53378574
you can clock to 4Ghz on the stock voltage?

My chip sets itself up to run 2.66Ghz/2.926Ghz at 1.005Vcore, no way in hell I can clock that thing on stock volts any higher than an extra 200Mhz. I can do 3.6Ghz at 1.0875 volts though.

All Westmeres have a hard safe limit of about 1.35Vcore, but you need to remember that the voltage you set in your BIOS is going to be about 25mV higher than the voltage actually fed to the CPU

pic related, 4.1Ghz (turbo) at 1.2 to the CPU
>>
>>53378667
OP here. I really don't put much stock in cinebench as a benchmark. But I won't deny that what you accomplished was pretty cool.

>>53378670
I will confess that I didn't know my chip could dynamically change its voltage without my knowledge; all I know is that I haven't touched a thing in terms of voltage. I can get as high as 4.2ghz with rock solid stability, and I can get to 4.5ghz and still maintain a very good thermal signature, but at that point something in the chip goes wonky and I start getting CTDs.

PS: my W3680 is an ES chip; not sure if that makes a difference or not.
>>
>>53378062
>Xeon
>gaming
Goodbye, sides.
>>
>>53378755
Why would a server grade chip be bad for gaming?
>>
>>53378785
Westmere Xeons were the last overclockable server chips Intel produced.

The only viable upgrade path would be Haswell-E, Broadwell/Skylake-E, or Zen
>>
>>53378785
>using a chip designed for "embarrassingly parallel" tasks for tasks that, well, aren't at all
If DayZ and arma haven't taught you anything then emulation surely will. Prior to dx12 and Vulkan, games couldn't even make draw calls asynchronously.
>>
>>53378810
Ahhhh I see. No real scalability without dishing out beaucoup bucks on one of those chips or buying a new mobo/cpu combo. Is that about right?
>>
>>53378838
I haven't played those games. H1Z1 looks enjoyable though.
>>
>>53378810
>Westmere Xeons were the last overclockable server chips Intel produced

Haswell-E 1xxx chips (such as the 1660 v3) are overclockable and upcoming Broadwell-E chips will be overclockable too.
>>
>>53378864
Honestly, PC gaming doesn't even require a good rig. The majority of games with good graphics are extremely dumbed down. It's all about conceptual or competitive shit nowadays. Check out SUPERHOT.
>>
Surprisingly not a shit thread even though it's about PC gaming, props OP.
>>
>>53378755
you're a retard

>>53378785
they aren't, at least on sockets 771, 1366, 1155, and 1150.

>>53378838
I highly suggest you reread the reasons I did these benchmarks and then reread the weirdness of the DayZ results. Nobody doubts the W3680 wins at parallel compute. Everyone doubted it could handle shitty single core games, even at 4ghz. They were proved, by and large, wrong.

On the flipside, there are a whole bevy of games I own which use most or all of my threads evenly and to great effect, most of which I did not test because no one would really deny the W3680 could handle such games:

- Planetside 2
- Satellite Reign
- Witcher 3

Those stand out to me from memory, but there are certainly others which do it as well, both indie and AAA.
>>
>>53379002
It's because the DayZ devs are actually shit and can't program a game to save their miserable lives, so it's an unoptimized piece of shit and doesn't take advantage of your computer the way it should. I can name countless games with that issue.
How much did you spend on this build anyways?
>>
File: 1444568701972.jpg (18 KB, 257x276)
>>53378952
>>
>>53379002
>reread the weirdness of the DayZ results
I told you the reason right there, your CPU can't effectively load your GPU. That one high load thread is the one that sends draw calls, but it's doing something else because the devs are retarded.
>>
>>53379047
....rust. fuck that game
>>
>>53378952
Kek
The game is 2 hours long and requires a good computer to not drop below 30fps in the slow motion parts
>>
>>53379063
*something else besides that
>>53379070
Played it on my laptop perfectly. Check your drivers or something.
>>
>>53379063
How can you tell? That bigass thread doesn't even top off at 100% or anywhere near what I see when I do get bottlenecked by my CPU in literally any other circumstance.

>>53379047
With monitors included and everything (keeping in mind that I am a garbage digger), I spent around 800 USD
>>
>>53379161
>top off at 100%
It works in spikes, you simply might not see it.
>>
>>53378867
It seems that not many people know the 16xx v1, v2, and v3 Xeons are unlocked, not even certain Intel employees you'd think should know. Most everyone assumes that all Xeons beyond Nehalem/Westmere are locked down.

I guess I'll be holding onto this dusty old Westmere for a couple years until/unless IB-EP chips are bottom of the barrel, if a Zen or Kaby machine isn't in my budget.
>>
>>53379204
I understand that, but when you have spikes that fast, you don't expect FPS to be CONSTANTLY hovering around 24fps

I have the polling rate set pretty high and just to be sure I monitored it simultaneously with the following system info suites:

- PlayClaw 5 overlay
- Windows 10 task manager (detailed view)
- Process Lasso Pro

PlayClaw 5 reports CPU usage as a percentage for each core, Windows 10 reports it as a line graph for each core, and Process Lasso reports it as a bar graph for each core. They each have varying (high frequency) polling rates; a rough sketch of that kind of per-core polling is below.

I'm really not seeing how my CPU is possibly limiting me here. At this point my best guesses would be:
1. Really shitty texture caching to and from HDD
2. FPS counter was not actually accurate (kinda makes sense, kinda doesn't based on my results; but there's no reason for this I can think of)
3. Game is heavily dependent on RAM bandwidth, since my RAM is only 1600mhz and dual channel. I've never seen that bottleneck in any other game, but at this point I'm grasping at straws.
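For reference, the per-core polling those overlays do boils down to something like this; a minimal Python sketch assuming psutil is installed, not any of those tools' actual code:

# Print per-core utilization a couple of times a second, like an overlay samples it.
import time
import psutil

def poll(duration_s=30, interval_s=0.5):
    end = time.time() + duration_s
    while time.time() < end:
        per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
        print(" ".join(f"{p:5.1f}" for p in per_core))

if __name__ == "__main__":
    poll()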
>>
>>53379302
You can check the first guess by playing with settings. But I'm telling you, the game is notoriously CPU-bound.
>>
>>53379333
I'm sure it is, since at 75-80% load spikes (which I DID see) on this CPU, you'd expect even an FX-8350 to start choking at stock speeds. Indeed, I think my CPU may have noticeably choked on those spikes. But they were rare, and don't really explain the overall shittiness of the FPS result.

Something truly nefarious is at work in the DayZ code, I assure you of that. In any case, it doesn't seem to really be a good game to benchmark systems with. The results are just too wonky.

On a side note, I did begin to test other games just to see if I could find one that limited itself on my CPU in any way. Right away I found that in multiplayer games of Insurgency with max graphics settings, I did see 100% CPU utilization on 1-2 threads pretty much constantly... but then my FPS never dropped below 130, so that wasn't exactly a good measure either.
>>
File: log file.png (118 KB, 868x653)
>>53379302
>>53379478
Why are you running all of those conflicting monitors? It's just going to eat a bit of CPU time and mess with the results.

Use a single one with integrated graph logging and viewing. I prefer HWiNFO64, but you need a separate program to view logs and it (viewing logs) isn't as user friendly.

For the second post:

CPU load is not wholly indicative of the work required by any part of the CPU. Your integer units might be getting completely slammed, while the FPU is idle most of the time, and bad programming might mean the data is being tossed around and everything is waiting on cache access, making it look like CPU load is lower.
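A toy illustration of that, with made-up hardware counter numbers (the kind of thing you'd get from perf stat -e instructions,cycles on Linux): two workloads can both read as ~100% "utilization" while one retires a fraction of the instructions per cycle.

# Hypothetical counter samples; both taken while the core shows ~100% load.
samples = {
    "compute-bound loop": {"instructions": 8.0e9, "cycles": 4.0e9},
    "cache-miss heavy":   {"instructions": 1.2e9, "cycles": 4.0e9},
}
for name, s in samples.items():
    ipc = s["instructions"] / s["cycles"]
    print(f"{name:20s} IPC = {ipc:.2f}")
# Same "busy" time, roughly 6-7x difference in useful work per cycle.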

>pic
CS:S with no frame cap on maximum settings, 4xMSAA. GPU load was never higher than 85% but I was still limited to 325-350 FPS
>>
>>53379574
When you have 12 threads, you don't really worry about little programs eating up CPU time.

But, to actually answer your question, I started off only using PlayClaw 5 for all these benches. I only threw in the other two when I started to realize how fucked up my DayZ results were. Seriously, I might make a video of the gameplay. Shit's weird, yo.

>CPU load is not wholly indicative of the work required by any part of the CPU. Your integer units might be getting completely slammed, while the FPU is idle most of the time, and bad programming might mean the data is being tossed around and everything is waiting on cache access, making it look like CPU load is lower.

That makes sense to me, but then I guess that makes me ask: how do CPU utilization programs estimate CPU usage? What part of each pipeline are they actually looking at?

I will say that one notable feature of my time "enjoying" DayZ was that the big thread kept jumping around from one core to another... which kind of indicates what you're saying about the data being tossed around. However, the Xeon has pretty damn good cache. I think the real problem might be the overhead of bouncing the data from one pipeline to the next, and whatever delay is inherent to that process.

That being said, Process Lasso (which was running in the background the whole time for all of these tests) does apply the core unparking fix that Windows systems sometimes need for CPUs like mine, eliminating the most probable source of performance reduction from jumping cores. It's still a little mysterious to me.
>>
>>53379683
Taken from StackExchange:
>"There's a special task called the idle task that runs when no other task can be run. The % usage is just the percentage of the time we're not running the idle task. The OS will keep a running total of the time spent running the idle task"
This is a simplified explanation; in practice the operating system tracks how much time is spent running a special no-op idle task and works out utilization from that. The implementation is different for every OS, but the function is fundamentally the same.
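On Linux you can reproduce that idle-time bookkeeping by hand from /proc/stat; a minimal sketch of the same idea (not literally what Task Manager does):

# Overall CPU % over one second, top-style, from idle time in /proc/stat.
# First line of /proc/stat: cpu  user nice system idle iowait irq softirq ...
import time

def cpu_times():
    with open("/proc/stat") as f:
        fields = [float(x) for x in f.readline().split()[1:]]
    idle = fields[3] + fields[4]   # idle + iowait
    return idle, sum(fields)

idle_a, total_a = cpu_times()
time.sleep(1.0)
idle_b, total_b = cpu_times()

busy = 1.0 - (idle_b - idle_a) / (total_b - total_a)
print(f"CPU usage over the last second: {busy * 100:.1f}%")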

>core unparking
complete meme. This only affected AMD FX 4/6/8xxx CPUs for a short time until Microsoft released a small patch for Win7
HPET timer/changing your timer is also a meme.

>big thread kept jumping cores
What was happening was, the main thread was stalling waiting for another thread on a different CPU to finish. It also could indicate pipeline stalls (cache misses/bad code)
Program threads do not "jump cores", they are loaded into the CPU and they stay where they were put until/unless you stop and start the program again.

DayZ is just horribly coded, it's a piece of shit.
>>
>>53379860
>What was happening was, the main thread was stalling waiting for another thread on a different CPU to finish. It also could indicate pipeline stalls (cache misses/bad code)
>Program threads do not "jump cores", they are loaded into the CPU and they stay where they were put until/unless you stop and start the program again.


This makes a lot of sense. OK, so I was CPU limited in that game. Why did it feel so (relatively) fluid?

>core unparking
>complete meme.

My own experiences, and the experiences of others, have shown otherwise. In any case there's no harm in unparking the cores; performance certainly doesn't go down; efficiency only goes down slightly or not at all with the right manager. Yes, it affected FX chips the most, but not exclusively. It also was not completely fixed with that patch.

I have no experience/knowledge of HPET timer etc., so I can't comment on that.
>>
>>53379860
Anon, threads CAN change cores on their own. They don't do that often on desktop though.
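If anyone wants to take the scheduler out of the equation while testing, you can pin the game's process to a fixed set of cores and see whether the behaviour changes; rough sketch assuming psutil, with a placeholder PID:

# Restrict a running process to specific logical cores (works on Windows and Linux).
import psutil

def pin_process(pid, cores):
    p = psutil.Process(pid)
    print("affinity before:", p.cpu_affinity())
    p.cpu_affinity(cores)          # e.g. [0, 1, 2, 3]
    print("affinity after: ", p.cpu_affinity())

# pin_process(1234, [0, 1, 2, 3])  # 1234 is a hypothetical PID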
>>
>>53379090
Your laptop is..Super...HOT
>>
>>53380268
Super
Hot
Super
Hot
>>
File: 1457137704968.jpg (195 KB, 1424x842)
>>53378062
do all x58 non-server boards work with the 1366 xeons?