
You are currently reading a thread in /sci/ - Science & Math

Thread replies: 17
Thread images: 1
How come neural nets get so much more attention than support vector machines these days?
>>
>>8199555
Can a SVM make trippy pictures? I thought so.
>>
DUDE IT'S NEURAL LMAO

but both are shit desu.
>>
>>8199555
Because of deep learning, which lets you solve harder problems. It's also worth mentioning that there are a fuckload of huge frameworks for working with deep neural nets, as well as nvidia libraries and shit.
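To give a rough idea of what those frameworks buy you (a minimal sketch, assuming PyTorch; the layer sizes are arbitrary): a small deep net is a few lines, and handing it to an NVIDIA GPU via CUDA is one call.

# minimal sketch, assuming PyTorch; layer sizes are arbitrary
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 10),
)

device = "cuda" if torch.cuda.is_available() else "cpu"  # the NVIDIA libraries kick in here
model = model.to(device)

x = torch.randn(32, 784).to(device)  # fake batch of 32 flattened 28x28 images
print(model(x).shape)                # torch.Size([32, 10])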
>>
>>8199565
an**
retard
>>>/b/
>>
>>8199606
a Support Vector Machine
retard
>>>/b/
>>
>>8199615
>a ess-vee-em
>not an ess-vee-em
>>>/b/
>>
>>8199555
flavor of the month.
>>
>>8199643
>a ess-vee-em
>not a support vector machine
>>>/b/
>>
>>8199647
That's not how initialisms work.

Also I think either works.
>>
>>8199654
retard
>>>/b/
>>
>>8199602
AMD on suicide watch!!!
>>
$$$ and loads of dick-around applications for quick + cool results
>>
>>8199555
1. DUDE BRAINS LMAO
2. Neural nets scale better if you throw more computational power at them. That's literally all deep learning is about: people realized that with enough nodes you can use all the stuff from the 80s, and started picking up the old lines of research again.
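To illustrate the scaling point (a sketch only, assuming PyTorch; make_mlp and the widths/depths below are made up for illustration): the recipe is the same 80s-style fully connected net, and the only thing that changes is how much of it you can afford to train.

# sketch: the same 80s-style MLP recipe; width and depth are just compute knobs
import torch.nn as nn

def make_mlp(in_dim, out_dim, width, depth):
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.ReLU()]
        d = width
    layers.append(nn.Linear(d, out_dim))
    return nn.Sequential(*layers)

small = make_mlp(784, 10, width=64, depth=2)    # roughly 80s-hardware territory
big = make_mlp(784, 10, width=4096, depth=8)    # same algorithm, far more compute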
>>
>>8199555
because ANNs find solutions literally by magic, and because math is hard. As a result, most of the so-called "researchers" don't know what they're actually doing
>>
Neural nets have built-in feature learning between layers.

You would have to jury-rig some crazy SVM to get similar results, although I believe it would theoretically be possible.
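For a concrete contrast (a sketch, assuming scikit-learn and its bundled digits dataset): the SVM's feature map is fixed up front by your kernel choice, while the net's hidden layers learn intermediate features from the data.

# sketch, assuming scikit-learn: fixed kernel vs. learned hidden-layer features
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)    # feature map chosen by hand (RBF kernel)
net = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)       # hidden layers learn the features

print("SVM accuracy:", svm.score(X_te, y_te))
print("MLP accuracy:", net.score(X_te, y_te))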
>>
>>8199783
Actually, as it turned out, the old stuff from the 80s doesn't work when it comes to large networks. There were breakthroughs in the last decade that gave us new ways to train them.

Deep learning is more complex than "just very large neural networks": you have to use special types of layers and techniques.
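As an example of what "special types of layers and techniques" can mean in practice (a sketch, assuming PyTorch; the architecture is arbitrary): things like convolutions, ReLU, batch normalization and dropout, rather than the plain sigmoid stacks of the 80s.

# sketch, assuming PyTorch: post-2006 ingredients vs. a plain 80s sigmoid MLP
import torch.nn as nn

deep_block = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, padding=1),  # convolution: weight sharing across the image
    nn.BatchNorm2d(32),                          # keeps activations well-scaled in deep stacks
    nn.ReLU(),                                   # avoids the vanishing gradients of sigmoid/tanh
    nn.MaxPool2d(2),
    nn.Dropout(0.25),                            # regularization that helps big nets generalize
    nn.Flatten(),
    nn.Linear(32 * 14 * 14, 10),                 # assumes 28x28 single-channel input
)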