Is it possible for a decision tree to split a data set into subsets where the weighted sum of the subsets' entropies is greater than the entropy of the original set?
I know it's a bit contrived, but could there be a scenario where I'd need to worry about negative information gain?
Or is this not a problem at all, so I can just plug and chug and minimize the post-split entropy without worrying about how it compares to the pre-split entropy?
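Concretely, here's the quantity I'm asking about. A toy sketch, plain numpy, made-up labels and split:
[code]
import numpy as np

def entropy(labels):
    # Shannon entropy of a label array, in bits
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

parent = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
left, right = parent[:6], parent[6:]  # some candidate split
weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / len(parent)
print(entropy(parent), weighted)  # can weighted ever come out bigger?
[/code]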
c'mon /g/
let's stop shitposting and actually answer a good question
I don't really understand OP, but I'm curious about this 'negative information gain'.
How would that work exactly?
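it wouldn't, because it can't happen. the weighted child entropy is just the conditional entropy of the labels given the split, and information gain = parent entropy - conditional entropy is a mutual information, which is never negative. equivalently: the parent distribution is a mixture of the child distributions, and entropy is concave, so the weighted average of the child entropies can't exceed the parent's.
toy numbers: parent is 5+/5-, so H = 1 bit. split into {4+, 2-} and {1+, 3-}:
H_left = -(4/6)log2(4/6) - (2/6)log2(2/6) ≈ 0.918
H_right = -(1/4)log2(1/4) - (3/4)log2(3/4) ≈ 0.811
weighted = 0.6 * 0.918 + 0.4 * 0.811 ≈ 0.875 ≤ 1, so the gain is ≈ 0.125. the worst case is gain = 0, when both children keep the parent's class ratio.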
entropy always increases the more shit you do / add to it
so yes
>>52182488
>>52182428
most retarded answers I have ever read
Comic name?
>>52182531
Shithead. I didn't understand something, so I asked about it.
sauce
>>52182488
it's the second law of thermodynamics, noob
>>52182569
>>52182776
Citrus. It's okay.
>>52182810
oh my fucking god, I want to slap you right now, you are so dumb.
>>52182899
Looks gay.
>>52182899
The only thing that's even remotely okay about Citrus is the art. Quite literally everything else is fucking dogshit.
>>52183268
"I-I have a good personality"
>>52181860
Implement this shit in pseudocode.
What does she look like inside?
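not pseudocode, but here's a quick python sketch of the split search (plain numpy, the names are mine, not from any library):
[code]
import numpy as np

def entropy(y):
    # Shannon entropy (bits) of the labels in y
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def info_gain(y, mask):
    # parent entropy minus the weighted entropy of the two children
    n = len(y)
    kids = (mask.sum() * entropy(y[mask]) + (~mask).sum() * entropy(y[~mask])) / n
    return entropy(y) - kids

def best_split(X, y):
    # greedy ID3/CART-style search: best (feature, threshold) by info gain
    best = (0.0, None, None)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            mask = X[:, j] <= t
            if mask.all() or not mask.any():
                continue  # degenerate split, skip it
            g = info_gain(y, mask)
            if g > best[0]:
                best = (g, j, t)
    return best
[/code]
note the 0.0 floor in best: the gain can't go negative, so the worst that happens is every candidate split has gain 0 and you just stop splitting. which answers OP too.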
>>52182921
yea. It starts off hetero I guess, but goes full gay after a few chapters.
>>52183151
meant to quote
not >>52182921