Adonalsium's Eigenbasis

Finally made an account

9 posts in this topic

Hello everyone; 

I just finished a re-read of almost all of the Cosmere; I finally decided that instead of searching the forums/WoBs to see if various theories/questions I came up with had already been answered, I might as well just post the theories here and get more discussion going! 

I've been reading the books for almost exactly 10 years, and have lurked on these forums since maybe around 2012? 

 

Anyways looking forward to actually interacting with all of you!


Welcome from lurkerland, person with a math name I will never understand! :P


Welcome to the Shard! Ugh, algebra name! I'm being banished! Aughhhhhhhhhhh!


Welcome!

I have a master's degree in chemistry, and know I passed at least one graduate course in linear algebra.  I was supposedly able to do eigenvalue/eigenvector calculations... and I'm STILL not clear on what they are LOL

Do you have a single favorite Cosmere novel?


Welcome to the Shard! Who's your favorite Cosmere character?


Thanks! 

 

My favorite Cosmere novel is probably either Final Empire or Way of Kings -- I really like the presentation of the worldbuilding in both, with perhaps a slight edge to WoK. When I recently re-read WoK, I really focused on the alienness of the world, and it gave me a very interesting and different vibe compared to the first time I read it, which really enhanced my enjoyment of the novel. Secret History is very fun though. I think the real strength of Cosmere novels, however, tends to be in aggregate, so it's harder to pinpoint specific novels as favorites. 

Favorite Cosmere character: Hmmmmmm...this is hard. At one point I would probably have said Hoid, but I don't think that's true anymore. I like Navani, Raboniel, Kelsier, Vin, Taln (assuming he's similar in character to WoKP), Kaladin (I think his WoK arc is amazing in particular), Vasher, Shai for more main characters.  For an out-there pick, Handerwym. 

Haha, I thought of the name because [RoW Spoilers]


the different rhythms and lights really remind me of Fourier analysis. IMO, the Shattering was basically the Vessels applying a "Fourier transform" to Adonalsium, treating him as a function that is sparsely representable in the Fourier basis, and picking out the sparse components (k = 16). Note that if this interpretation is correct, it also suggests that if all the other coefficients for Adonalsium are small but nonzero, there should be a small amount of "free floating" Investiture that isn't associated with any particular Shard (but with the other, infinitely many, elements of the "Fourier basis"). The elements of the Fourier basis turn out to be eigenfunctions of the shift operator. Since Adonalsium is presumably not literally a function sparsely representable in an orthonormal eigenbasis of the shift operator, we instead speak of "Adonalsium's Eigenbasis" as the stand-in: 16 of the elements of this basis are the Shards (which probably account for the bulk of the eigenvalues), and the Shards are eigenfunctions of some operation which we don't know! Of course, this interpretation means that Adonalsium is a member of some function space whose elements are "gods like Adonalsium", so "Adonalsium's Eigenbasis" really refers to some orthonormal basis of that function space in which the element Adonalsium has a sparsity of approximately 16, where the 16 basis elements that capture the bulk of Adonalsium are the Shards, and where all basis elements are eigenfunctions of some interesting operation (like the shift operation for the Fourier basis). 

 

Here's an explanation of eigenvectors/eigenvalues that I find much more intuitive than the way it is typically initially presented (at least in my experience) in linear algebra classes: 


 

Regarding eigenvalues and eigenvectors, this is the way I think about it (though keep in mind I study theoretical machine learning and computer science, not pure math): one main idea is that you want to decompose the effect of an operator (something that takes a function from a space of functions F and spits out a new function) into manageable components. We define "manageable" as "it just multiplies the function by a constant", since that is a very simple operation. This is therefore how we choose to define eigenvalues and eigenvectors: T(f) = lambda * f, where T is the operator, f is a function from F, and lambda is a real-valued constant. Here (f, lambda) is an eigenfunction (eigenvector) / eigenvalue pair for T.
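To make the definition concrete, here's a tiny numpy sketch (my own toy matrix, nothing canonical) checking T(f) = lambda * f for a small symmetric matrix:

```python
import numpy as np

# A small real symmetric matrix playing the role of the operator T.
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy returns the eigenvalues (ascending) and eigenvectors (columns of V).
lams, V = np.linalg.eigh(T)

# Check T v = lambda * v for each eigenpair.
for lam, v in zip(lams, V.T):
    assert np.allclose(T @ v, lam * v)

print(lams)  # prints [1. 3.]
```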

Why would we care about doing this? Well, it turns out that for certain operators (real symmetric finite-dimensional matrices, which you might see in linear algebra courses, and in more generality compact self-adjoint operators acting on certain nicely behaved function spaces), it is possible to write the operator T as a linear combination of operators generated by the eigenvectors/eigenfunctions, where the coefficients in the linear combination are the eigenvalues, and moreover, the eigenvectors/eigenfunctions are orthogonal to each other! This is the famous spectral theorem (see Thm. 3 here: http://individual.utoronto.ca/jordanbell/notes/SVD.pdf for a relatively general version). 

Ok, that was a lot of math babble -- what does it actually mean, and why do we care? Let's stick to the assumption that T is a finite-dimensional real symmetric matrix (a linear operator; this means that in its matrix representation, entry (i, j) = entry (j, i), and all entries are real). Its eigenvectors then have two special properties: 1) together they form a basis for the input space of T (the finite-dimensional space of vectors we apply T to), meaning linear combinations of the eigenvectors can describe any vector in the space; 2) the eigenvectors form an orthonormal set: if we form a matrix V whose rows are the eigenvectors (say row i is eigenvector x_i) and apply V to an eigenvector x_j, we get the standard basis vector e_j (all 0s except a single 1, in position j). In other words, applying V to x_j zeros out the contribution of every eigenvector other than x_j. 

These properties are extremely useful. The spectral theorem says that in this special case, T = sum_{i = 1}^d lambda_i * x_i x_i^T. (Here, x_i x_i^T is also a real symmetric matrix -- this is the operator generated by eigenvector x_i that I mentioned previously.) 

Since the eigenvectors form a basis, we can write any other vector in the space as a linear combination of eigenvectors: y = sum_{j= 1}^d c_j x_j. 

Then we can try applying T to y, and use the fact that the eigenvectors are an orthonormal basis: 

Ty = (sum_{i = 1}^d lambda_i * x_i x_i^T)(sum_{j = 1}^d c_j x_j) = sum_{i = 1}^d lambda_i * c_i * x_i 

Wow that's way simpler -- in an orthonormal eigenbasis, we can see the effect of T as multiplying by lambda_i in coordinate i!
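The whole derivation above can be checked numerically. A minimal numpy sketch (random toy matrix of my own, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random real symmetric matrix T.
A = rng.standard_normal((4, 4))
T = (A + A.T) / 2

lams, V = np.linalg.eigh(T)  # columns of V are orthonormal eigenvectors

# Spectral theorem: T = sum_i lambda_i * x_i x_i^T.
T_rebuilt = sum(lam * np.outer(x, x) for lam, x in zip(lams, V.T))
assert np.allclose(T, T_rebuilt)

# Action on y in the eigenbasis: coefficients c = V^T y, and then
# T y = sum_i lambda_i * c_i * x_i, i.e. multiply coordinate i by lambda_i.
y = rng.standard_normal(4)
c = V.T @ y
assert np.allclose(T @ y, V @ (lams * c))
```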

This is particularly useful if a lot of the eigenvalues are small: we can approximate the effect of T by throwing those terms out of the description of T (approximating them by 0), thus simplifying it. This can be quite useful in data analysis. PCA (principal component analysis), for instance, is exactly this idea: you look at the covariance matrix of some data, re-write it in an orthonormal basis, and throw out the components corresponding to small eigenvalues. This simplifies the description of the data and gives you some insight into which directions in the data space explain most of its variation, which is helpful for visualization and interpretation, among many other applications. 
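As a sketch of the PCA idea (the synthetic data here is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Fake 2-D data stretched along one axis: n samples x d features.
X = rng.standard_normal((500, 2)) @ np.array([[3.0, 0.0],
                                              [0.0, 0.3]])
X -= X.mean(axis=0)  # center the data

# Eigendecomposition of the covariance matrix.
cov = (X.T @ X) / (len(X) - 1)
lams, V = np.linalg.eigh(cov)  # eigenvalues in ascending order

# Keep only the top component (largest eigenvalue) -- this is PCA.
# The small eigenvalue carries little of the variance, so we lose little.
top = V[:, -1]
X_reduced = X @ top  # 1-D description of the data

print(lams)
```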

The effect of the Fourier transform on the space of functions generated by linear combinations of eigenfunctions of the shift operation (shift_{x0}(f)(x) = f(x - x0)) is analogous: it re-writes functions in terms of their eigenfunction components. The Fourier basis is also orthonormal (with respect to the Hilbert space inner product). 

This gives us similar interpretive power for the effect of shift operations on functions in Hilbert space. More generally, eigenfunctions of other interesting linear operators (for instance, the operator in the wave equation -- here's where the relevance to chemistry comes in!) are very helpful in solving differential equations: they let you decompose the effects of the operators involved and combine them back together. This is exactly the same intuition as for real symmetric matrices for why it's useful to look at the eigenvalues and eigenvectors (the spectrum). 
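The shift/Fourier story can also be seen numerically with the discrete Fourier transform: circularly shifting a signal corresponds to multiplying each Fourier mode by a phase (the mode's eigenvalue under the shift operator). A quick numpy check:

```python
import numpy as np

n = 64
f = np.random.default_rng(2).standard_normal(n)

# Circular shift by x0 samples: shift(f)[x] = f[x - x0].
x0 = 5
shifted = np.roll(f, x0)

# In the Fourier basis the shift acts diagonally: mode k is just
# multiplied by the eigenvalue exp(-2*pi*i*k*x0/n).
k = np.arange(n)
F = np.fft.fft(f)
F_shifted_via_eigenvalues = F * np.exp(-2j * np.pi * k * x0 / n)

assert np.allclose(np.fft.fft(shifted), F_shifted_via_eigenvalues)
```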

 

 


Welcome to the Shard. Though I didn't understand half of that, I think it's really interesting. What's a particular piece of lore that you enjoy?

 


Welcome to the Shard!

So, it has been a while, but an eigenbasis is a basis in which every vector is an eigenvector, correct? And eigenvectors help make linear transformations easier. Eigenvectors and eigenvalues put together, in nonlinear motion dynamics, can help us better understand data by transforming and representing it in manageable sets. They can be used to decouple three-phase systems through the symmetrical component transformation.

It has been a while, and that is all I remember, was I close?


Thanks!

My favorite piece of lore... [RoW Spoilers]


right now I'm really curious about Foil deep in his ocean

@Chinkoln I wrote a brief description of what an eigenbasis is in my previous post in this thread. It's a situation where you can find a basis in which every basis vector is an eigenvector. It's a very pervasive concept that shows up any time you are dealing with linear operators. I don't know what a three-phase system is, but what you say sounds likely! You can certainly apply the concept of eigenfunctions etc. to nonlinear systems (one recent approach is based on approximating the Koopman operator, which can be viewed as an infinite-dimensional linear operator, with a finite-dimensional linear operator fit from trajectory data).
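For what it's worth, here's a toy sketch of that finite-dimensional approximation idea (essentially the simplest flavor of dynamic mode decomposition; the toy system here is secretly linear, so the fit recovers it exactly):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy trajectory data from an (unknown to us) linear system x_{t+1} = A x_t.
A_true = np.array([[0.9, 0.2],
                   [0.0, 0.8]])
snapshots = [rng.standard_normal(2)]
for _ in range(50):
    snapshots.append(A_true @ snapshots[-1])
X = np.array(snapshots).T  # columns are snapshots x_0, ..., x_50

# Fit a finite-dimensional linear operator from snapshot pairs
# (least-squares: map each x_t to x_{t+1}).
A_fit = X[:, 1:] @ np.linalg.pinv(X[:, :-1])

# Its eigenvalues recover the dynamics' modes.
print(np.sort(np.linalg.eigvals(A_fit).real))  # close to [0.8, 0.9]
```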

