
You are currently reading a thread in /sci/ - Science & Math

Thread replies: 18
Thread images: 1
File: eigenvectors.jpg (31 KB, 499x349)
Hello /sci/, I have the following problem:
Given a d-dimensional array M and an integer n, find vectors [math]V_{i,j}[/math] such that
[eqn] \sum_{i=1}^{n} V_{i,1} \otimes \dots \otimes V_{i,d} [/eqn]
approximates M as well as possible (in the least-squares sense).
For d=2 the problem is easy: M is just a matrix and we find the vectors by looking at the eigenvectors of [math]MM^T[/math] that belong to the largest eigenvalues.
But for d>2 I am completely clueless.
Is this problem familiar to anyone? Maybe knowing tensor algebra would help to solve this?
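For reference, the d=2 recipe in numpy (a sketch using the truncated SVD, which gives the same vectors as the eigenvector approach; the function and variable names are mine):
[code]
import numpy as np

def best_rank_n(M, n):
    # Best rank-n approximation in the least-squares (Frobenius) sense.
    # The columns of U are the eigenvectors of M M^T mentioned above.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :n] @ np.diag(s[:n]) @ Vt[:n, :]

M = np.random.randn(6, 4)
print(np.linalg.norm(M - best_rank_n(M, 2)))
[/code]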
>>
Maybe this is a bit clearer: I want to minimize
[eqn] \sum_{i_1=1}^{m_1} \cdots \sum_{i_d=1}^{m_d} \left( M_{i_1,\dots,i_d} - \sum_{i=1}^{n} V_{i,1,i_1} \cdots V_{i,d,i_d} \right)^2 [/eqn]
with respect to the Vs
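Written out in numpy, the objective looks like this (a sketch; I store the vectors as a list V of d arrays, where V[k] has shape (n, m_{k+1}) and the row V[k][i] plays the role of [math]V_{i,k+1}[/math] above; that layout is my own choice):
[code]
import numpy as np

def cp_objective(M, V):
    # Squared error between M and the sum of n outer products.
    approx = np.zeros(M.shape)
    for i in range(V[0].shape[0]):
        term = V[0][i]
        for k in range(1, len(V)):
            # Build up the rank-one tensor one mode at a time.
            term = np.multiply.outer(term, V[k][i])
        approx = approx + term
    return np.sum((M - approx) ** 2)
[/code]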
>>
>>7924862
take the derivative, set equal to zero, solve, profit.
>>
>>7924899
Yes, that is the obvious approach, but it results in a system of polynomial equations which doesn't really tell you much... in the 2d case these equations reduce to finding the eigenvectors of a matrix, which is a well-studied problem. I was hoping something similar would show up for higher dimensions, but I have no idea how you would even define eigenvectors of higher-dimensional arrays (in a way that solves this problem!)
>>
>>7924832
I'm wondering if you can solve it iteratively: assume n=1, get the best rank-one term, then go after the residual M - best.

For the d=2 case it seems it would work if the eigenvectors are orthogonal. Or is it more general?

Is it easy for n=1, i.e. when you don't have a sum of VxVxV... terms?
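The deflation idea as a sketch (best_rank_one is a placeholder for any rank-one fitter, e.g. the power iterations sketched further down; note the greedy scheme is exact for matrices but only a heuristic for d>2):
[code]
import numpy as np

def greedy_deflation(M, n, best_rank_one):
    # Fit n rank-one terms one at a time, each against the current residual.
    residual = np.array(M, dtype=float)
    terms = []
    for _ in range(n):
        term = best_rank_one(residual)  # hypothetical rank-one fitter
        terms.append(term)
        residual = residual - term
    return terms, residual
[/code]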
>>
>>7924930
I believe for a matrix of the form [math]MM^T[/math] the eigenvectors are always orthogonal (it's symmetric), so the iterative process should work

The case n=1 doesn't really make it easier. The difference is just that the number of eigenvectors/eigenvalues we have to look at is smaller (only the very largest one)
>>
>>7924918
Are you looking for theory or an algorithm? If the latter, you could hold all but one of the V's constant and minimize over the remaining one. Move on to the next V and repeat. This will converge to a minimum, I think. Whether or not it's a global minimum would depend on the convexity of the problem, I guess.
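Here is what that alternating scheme looks like in numpy for d=3 and general n (a sketch; the mode-unfolding/Khatri-Rao formulation is the standard way to write the per-factor least-squares step, all names are mine, and the loop only finds a local minimum in general):
[code]
import numpy as np

def khatri_rao(B, C):
    # Column-wise Kronecker product, shape (J*K, n).
    J, n = B.shape
    K, _ = C.shape
    return (B[:, None, :] * C[None, :, :]).reshape(J * K, n)

def cp_als(M, n, iters=100):
    # Hold two factor matrices fixed, solve a linear least-squares problem
    # for the third, and cycle.  Term i is outer(A[:,i], B[:,i], C[:,i]).
    I, J, K = M.shape
    rng = np.random.default_rng(0)
    A, B, C = (rng.standard_normal((s, n)) for s in (I, J, K))
    M1 = M.reshape(I, J * K)                     # mode-1 unfolding
    M2 = np.moveaxis(M, 1, 0).reshape(J, I * K)  # mode-2 unfolding
    M3 = np.moveaxis(M, 2, 0).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(iters):
        A = M1 @ np.linalg.pinv(khatri_rao(B, C)).T
        B = M2 @ np.linalg.pinv(khatri_rao(A, C)).T
        C = M3 @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C
[/code]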
>>
>>7924964
I'm looking for a theory, because I'm interested in the global minimum (and I would love to have a beautiful connection to eigenvectors, but I only get it for d=2)
>>
Here is what happens for n=1, d=2 (note that I omit transposition stuff):
Taking derivatives gives the equations
[eqn]
MU = V|U|^2 \\
MV = U|V|^2
[/eqn]
Then multiply the second equation by M and substitute the first:
[eqn] MMV = V|U|^2|V|^2 [/eqn]
Now we know V must be an eigenvector of MM. Choosing the maximum eigenvalue gives maximal [math]|U|^2|V|^2[/math], which gives the best approximation: at a stationary point the squared error works out to [math]|M|^2 - |U|^2|V|^2[/math], so the bigger [math]|U|^2|V|^2[/math], the smaller the error.
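Side note: iterating these two equations (with the transposes restored) is exactly power iteration on [math]MM^T[/math]; a numpy sketch:
[code]
import numpy as np

def rank_one_power(M, iters=100):
    # Alternate the fixed-point equations MV = U|V|^2 and M^T U = V|U|^2;
    # converges to the top singular pair, i.e. the best U V^T.
    rng = np.random.default_rng(1)
    V = rng.standard_normal(M.shape[1])
    for _ in range(iters):
        U = M @ V / (V @ V)
        V = M.T @ U / (U @ U)
    return U, V
[/code]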
>>
For n=1, d=3, the notation falls apart a bit, and I also can't separate the vectors:
[eqn]
MUV = W|U|^2|V|^2 \\
MVW = U|V|^2|W|^2 \\
MWU = V|W|^2|U|^2
[/eqn]
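You can still iterate the coupled equations numerically, though (a sketch of the same alternating fixed-point idea; it only finds a stationary point in general, not necessarily the global optimum):
[code]
import numpy as np

def rank_one_tensor(M, iters=200):
    # n=1, d=3: cycle through the three coupled equations, e.g.
    # MUV = W|U|^2|V|^2 becomes W = (M contracted with U, V) / (|U|^2 |V|^2).
    rng = np.random.default_rng(2)
    U = rng.standard_normal(M.shape[0])
    V = rng.standard_normal(M.shape[1])
    W = rng.standard_normal(M.shape[2])
    for _ in range(iters):
        U = np.einsum('ijk,j,k->i', M, V, W) / ((V @ V) * (W @ W))
        V = np.einsum('ijk,i,k->j', M, U, W) / ((U @ U) * (W @ W))
        W = np.einsum('ijk,i,j->k', M, U, V) / ((U @ U) * (V @ V))
    return U, V, W
[/code]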
>>
>>7925003
So for the simplest case, where M is a matrix, you want column vectors u and v to minimize

|| M - u v' ||^2

where the norm is the Frobenius norm, i.e. the sum of squares?
>>
>>7925072
right
>>
>>7925072
Simple case is easy with SVD

http://math.stackexchange.com/questions/1307836/approximating-matrix-with-n1-rank-as-outer-product-of-vectors
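In numpy that's just a few lines (a sketch of the rank-one case from the link):
[code]
import numpy as np

M = np.random.randn(5, 7)
U, s, Vt = np.linalg.svd(M)
u, v = s[0] * U[:, 0], Vt[0]  # best u v' in the Frobenius sense
print(np.linalg.norm(M - np.outer(u, v)))
[/code]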
>>
>>7925082
General case discussion

http://www.cs.cornell.edu/cv/tenwork/Slides/Drineas.pdf

Interesting problem OP. I think I'm the only one commenting on this thread with you though. :)
>>
>>7925082
Yes, the full SVD basically gives exact equality; truncating it to the largest singular values gives the approximation
>>
>>7925092
Thanks, that's exactly what I'm looking for! Let's hope they have some nice results
>>
>>7925092
>"For r=3, computing the minimal k such that A is exactly equal to the sum of rank-one components is NP-hard"
very interesting
>>
>>7925112
Interesting indeed. Good luck to you OP.