
You are currently reading a thread in /sci/ - Science & Math

Thread replies: 17
Thread images: 1
How come neural nets get so much more attention than support vector machines these days?
>>
>>8199555
Can a SVM make trippy pictures? I thought so.
>>
DUDE IT'S NEURAL LMAO

but both are shit desu.
>>
>>8199555
Because of deep learning which lets you solve harder problems. It's also worth mentioning that there are a fuckload of huge frameworks for working with deep neural nets as well as nvidia libraries and shit.
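A minimal sketch of what those frameworks automate (framework-free, plain numpy, not any specific library): a two-layer neural net trained by hand-written backprop to learn XOR, a problem a linear model can't solve. Real frameworks just let you stack many more of these layers without writing the gradients yourself.

```python
import numpy as np

# Tiny two-layer net learning XOR with hand-derived backprop.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                  # hidden layer (learned features)
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))      # sigmoid output probability
    dp = p - y                                # grad of binary cross-entropy wrt logits
    dW2 = h.T @ dp; db2 = dp.sum(0)
    dh = (dp @ W2.T) * (1 - h ** 2)           # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.1 * grad                   # plain gradient descent step

preds = (p > 0.5).astype(int).ravel()
print(preds)
```

With a framework you'd declare the two layers and call a built-in optimizer; the gradient bookkeeping above is exactly the part the libraries (and the nvidia GPU kernels underneath) take off your hands.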
>>
>>8199565
an**
retard
>>>/b/
>>
>>8199606
a Support Vector Machine
retard
>>>/b/
>>
>>8199615
>a ess-vii-em
>not an ess-vii-em
>>>/b/
>>
>>8199555
flavor of the month.
>>
>>8199643
>a ess-vii-em
>not a support vector machine
>>>/b/
>>
>>8199647
That's not how initialisms work.

Also I think either works.
>>
>>8199654
retard
>>>/b/
>>
>>8199602
AMD on suicide watch!!!
>>
$$$ and loads of dick-around applications for quick + cool results
>>
>>8199555
1. DUDE BRAINS LMAO
2. neural nets scale better if you throw more computational power at them. That's literally all deep learning is: people realized that with enough nodes you can use all the stuff from the 80s, and started to continue the old lines of research
>>
>>8199555
because ANNs find solutions literally by magic, and because math is hard, the result is that most of the literal "researchers" don't know what they're actually doing
>>
Neural nets have built-in feature learning between layers.

You would have to jury-rig some crazy SVM to get similar results, although I believe it would theoretically be possible.
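The contrast in a few lines (an illustrative sketch, not a full SVM): an SVM's "features" are fixed up front by the choice of kernel, e.g. the RBF kernel below, while a neural net's hidden layers learn their feature map during training.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # K[i, j] = exp(-gamma * ||x_i - z_j||^2); this similarity function
    # is chosen by hand and stays fixed while the SVM trains.
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

X = np.array([[0.0, 0.0], [1.0, 1.0]])
K = rbf_kernel(X, X)
print(K)  # diagonal is 1 (each point matches itself), off-diagonal is exp(-2)
```

To get NN-like behavior out of an SVM you'd have to engineer the kernel by hand for each problem, which is the jury-rigging the post is talking about.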
>>
>>8199783
Actually, as it turned out, the old stuff from the 80s doesn't work when it comes to large networks. There were breakthroughs in the last decade that gave us a new way to deal with large networks.

Deep learning is actually more complex than "just very large neural networks". You have to use special types of layers and training techniques.