I'm starting a CNTK neural network project with a friend. What would /g/ recommend for hardware specs? I was looking at a couple of S1070 units, but I'm concerned they wouldn't be much more powerful than just using two 960s, since the only advantage would be the huge memory boost. My other, more viable candidate at the moment would be as many M2090s as I can budget, with more added later.
My biggest issue is the motherboard: I'm not sure whether having enough PCIe lanes to support full x16 bandwidth on every card is going to make that much of a difference, or whether I could even slot 8 prospective cards into a single machine. If I'm going to be using older PCIe 2.0 hardware, I'll definitely want to squeeze as much speed out of the interconnect as possible.
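The x16 vs x8 worry can be put in rough numbers. A quick sketch of theoretical per-direction link bandwidth; the per-lane figures come from the PCIe 2.0/3.0 specs (5 GT/s with 8b/10b encoding vs 8 GT/s with 128b/130b), and real-world throughput will land somewhat below these:

```python
# Theoretical per-direction PCIe bandwidth, after encoding overhead.
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding  -> 500 MB/s usable per lane.
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~985 MB/s usable per lane.
PER_LANE_MB = {2.0: 500, 3.0: 985}

def pcie_bandwidth_gb_s(gen, lanes):
    """Theoretical usable bandwidth in GB/s for one direction of a PCIe link."""
    return PER_LANE_MB[gen] * lanes / 1000

for gen, lanes in [(2.0, 8), (2.0, 16), (3.0, 8), (3.0, 16)]:
    print(f"PCIe {gen} x{lanes}: ~{pcie_bandwidth_gb_s(gen, lanes):.1f} GB/s")
```

So dropping from x16 to x8 on PCIe 2.0 halves you from ~8 GB/s to ~4 GB/s per card, which is why lane count matters more on older gear.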
I know CNTK can use two networked computers, but for efficiency the plan for now is to stick it all into one machine, or use ePCIe to link some external housing into one central computer. Most of this is flexible at the moment, so any suggestions are appreciated. Definitely going to need Intel though, something about math libraries. Estimated budget at time of posting: $800 USD, flexible; prefer to use fewer, better parts and add more over time.
http://timdettmers.com/2015/03/09/deep-learning-hardware-guide/
Having just gone through this process, my recommendation is to wait for Pascal if you can. You can rent cloud services in the meantime.
>>53696677
I don't get why so many deep learning guides recommend GTX cards instead of Tesla. I get that some projects only need ~two gigs per card and GTX cards are more easily available, but there are so many wasted resources on a gaming card when it's used for something like this.
>>53696677
Here are more links cuz I'm feeling helpful
http://timdettmers.com/2014/08/14/which-gpu-for-deep-learning/
http://graphific.github.io/posts/building-a-deep-learning-dream-machine/
https://blogs.nvidia.com/blog/2015/03/17/pascal/
https://www.quora.com/I-would-like-to-build-a-PC-for-deep-neural-networks-experiments-and-a-maximum-budget-of-2000-2500-What-are-some-suggestions
Really tldr of all of this:
40 PCIe lane CPU for a multi-GPU setup.
More than 3, maybe 4, GPUs doesn't do much.
Pascal will be better.
Make sure your PSU can supply the watts and amps for a multi-GPU system.
Deep learning doesn't use SLI, just PCIe.
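To put the PSU point in numbers, here's a back-of-envelope sizing sketch. The TDP figures are my assumptions (an M2090 is rated around 225 W; a 40-lane HEDT CPU around 140 W), so plug in the real numbers for whatever parts you end up with:

```python
# Rough PSU sizing for a multi-GPU box. All wattages are assumptions;
# check the actual TDPs of your cards and CPU.
GPU_TDP_W = 225   # e.g. a Tesla M2090 is rated ~225 W
CPU_TDP_W = 140   # 40-lane HEDT CPU
BASE_W = 100      # motherboard, RAM, drives, fans
HEADROOM = 1.2    # 20% margin so the PSU isn't running flat out

def psu_watts(n_gpus):
    """Suggested PSU capacity in watts for n_gpus cards."""
    return (n_gpus * GPU_TDP_W + CPU_TDP_W + BASE_W) * HEADROOM

for n in (2, 3, 4):
    print(f"{n} GPUs: ~{psu_watts(n):.0f} W PSU")
```

Four 225 W cards already push you toward a ~1400 W supply, which is also why going past 3-4 GPUs in one box gets awkward.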
>>53696748
https://www.quora.com/Would-you-build-a-multi-GPU-system-for-deep-learning-with-GTX-Titan-X-or-Tesla-K40-K80-What-are-the-pros-and-cons
Go down to Tim Dettmers' answer
>>53696842
>>53696903
Thanks guys, this isn't getting built until May or June so I'll definitely wait for Pascal. Any leads on what motherboards/generation I should be looking at?
>>53697436
Cheapest board that works with Pascal and supports multiple GPUs
Try /r/buildapc
They have some good existing builds for this
>Inb4 plebbit
How well would PLX chips work for this kind of application? Anyone have experience with them?