So I'm trying to compute a metric of phase synchronization between electric brain signals.
I'm trying to use the method in pic related.
I got the Hilbert transform down and computed mutual information ([math]I[/math]). But I cannot for the life of me figure out what the fuck this step is to arrive at the measure they denote with gamma.
So far I have an n x n matrix of mutual information values, where the diagonal is the mutual information of each signal with itself and the off-diagonal entries are the mutual information between any two brain electrodes. The values exceed 1. What do I have to do to arrive at the gamma measure in pic related?
Please, this is driving me fucking nuts.
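For reference, here's a minimal sketch of my pipeline so far in MATLAB (variable names are mine; X is the samples-by-channels data matrix, and the 64-bin histogram is an arbitrary choice since the paper doesn't specify one):

% X: [samples x channels] matrix of EEG signals (my naming, not the paper's)
phi = angle(hilbert(X));                  % instantaneous phase of each channel
nBins = 64;                               % arbitrary bin count
edges = linspace(-pi, pi, nBins + 1);     % bin edges over the phase range
nCh = size(X, 2);
I = zeros(nCh);
for a = 1:nCh
    for b = 1:nCh
        pxy = histcounts2(phi(:, a), phi(:, b), edges, edges);  % joint histogram
        pxy = pxy / sum(pxy(:));                                % joint distribution
        pxpy = sum(pxy, 2) * sum(pxy, 1);                       % product of marginals
        nz = pxy > 0;                                           % skip empty bins
        I(a, b) = sum(pxy(nz) .* log(pxy(nz) ./ pxpy(nz)));     % MI in nats
    end
end

That produces the n x n matrix I described, with the marginal entropies on the diagonal; since it's in nats, values above 1 are expected.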
>>8092238
gib definition of gamma and I'll help
>>8092255
Thanks. The definition of gamma is listed in pic related. Second to last equation on the right. This is all the information that the authors provide.
>>8092254
butt ugly face tho
>>8092275
>>8092294
that one is nice :)
Just to clarify, I'm stuck at the red part.
>>8092329
Also, I'm doing all this in MATLAB, in case that makes a difference.
>>8092329
the guy is just bullshitting, that's all I can say.
You would expect him to use differential entropy, [math]h = -\int p(\phi) \ln p(\phi) \, d\phi[/math], for a continuous quantity like phase, not the discrete sum.
In general, when someone is vague like that, it means they don't understand it themselves.
Try looking at what he cited; maybe you'll have better luck.
>>8092339
Yeah I figured as much. This description leaves much to be desired and gives the impression that it was written to sound smart rather than to explain what was done.
Anyway, the reason I'm stuck is that there's no reference for the gamma measure at all. All I have to go on is this piece of shit paper right here.
>>8092329
>[math]H = -\sum_{i=1}^{N} p_i \ln(p_i)[/math]
This is just the definition of Shannon entropy.
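Numerically it's two lines in MATLAB on a binned phase vector (phi and nBins here are placeholders of mine, not anything from the paper):

p = histcounts(phi, linspace(-pi, pi, nBins + 1), 'Normalization', 'probability');
H = -sum(p(p > 0) .* log(p(p > 0)));   % Shannon entropy in nats, empty bins skipped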
>>8093660
for a discrete distribution, which is not even what this is about.
Even worse, the guy says he "invokes Shannon's entropy" as if it were some super theorem.
How does gamma relate to I?
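If I had to guess, gamma is just [math]I[/math] rescaled to [0, 1], e.g. by the geometric mean of the marginal entropies. In MATLAB, with I being your n x n MI matrix (a guess on my part, not the paper's definition):

% Guess: gamma as normalized mutual information, bounded to [0, 1].
% The diagonal of a binned MI matrix equals the marginal entropy of each channel.
H = diag(I);                 % per-channel phase entropies
gamma = I ./ sqrt(H * H');   % gamma(a,b) = I(a,b) / sqrt(H_a * H_b)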
>>8092329
I think what he's trying to say is that another way of finding out whether they are synced is to use the Shannon entropy equation quoted there. He basically wants you to compare the entropy of the two signals, with what I assume is an optimized value for N. I took Shannon in an information theory course, but I don't know how it translates to your application.
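If that's the idea, it sounds like the entropy-based synchronization index from the phase-locking literature (Tass et al.), which bins the phase difference and measures how far its entropy falls below the maximum ln(N). A sketch of that reading, with a and b as channel indices and phi the phase matrix from before; whether this is actually the paper's gamma is anyone's guess:

% Guess: Tass-style entropy index of the phase difference (not confirmed by the paper)
N = 16;                                           % bin count; also a tuning choice
edges = linspace(-pi, pi, N + 1);
dphi = angle(exp(1i * (phi(:, a) - phi(:, b))));  % phase difference wrapped to (-pi, pi]
p = histcounts(dphi, edges, 'Normalization', 'probability');
H = -sum(p(p > 0) .* log(p(p > 0)));              % entropy of the difference distribution
rho = (log(N) - H) / log(N);                      % 0 = no sync, 1 = perfect locking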