I think if you want a better answer, you need to tell us more precisely what you have in mind: are you interested in theoretical aspects of eigenvalues, or do you have a specific application in mind? Matrices by themselves are just arrays of numbers, which take on meaning once you set up a context. Without the context, it seems difficult to give you a good answer. If you use matrices to describe adjacency relations, then eigenvalues/vectors may mean one thing; if you use them to represent linear maps, something else, etc.

One possible application: in some cases you may be able to diagonalize your matrix $M$ using the eigenvalues, which gives you a nice expression for its powers $M^k$. Specifically, you may be able to decompose your matrix into a product $SDS^{-1}$, where $D$ is diagonal with the eigenvalues as its entries and $S$ is the matrix whose columns are the corresponding eigenvectors; then $M^k = SD^kS^{-1}$, and $D^k$ is just the diagonal matrix of $k$-th powers of the eigenvalues. I hope it is not a problem to post this as a comment. I got a couple of Courics here last time for posting a comment in the answer site.
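
If it helps, here is a small numerical sketch of that idea using numpy (the matrix below is just an example I made up, not anything from the question):

```python
import numpy as np

# A small symmetric (hence diagonalizable) example matrix, chosen only for illustration
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: eigvals are the diagonal entries of D,
# and the columns of S are the corresponding eigenvectors
eigvals, S = np.linalg.eig(M)
D = np.diag(eigvals)

# Check the decomposition M = S D S^{-1}
print(np.allclose(M, S @ D @ np.linalg.inv(S)))       # True

# Powers become cheap: M^k = S D^k S^{-1}, and D^k only requires
# raising the diagonal entries (the eigenvalues) to the k-th power
k = 5
Mk = S @ np.diag(eigvals ** k) @ np.linalg.inv(S)
print(np.allclose(Mk, np.linalg.matrix_power(M, k)))  # True
```

For this particular $M$ the eigenvalues are $3$ and $1$, so the entries of $M^k$ come out in closed form as $\frac{3^k+1}{2}$ on the diagonal and $\frac{3^k-1}{2}$ off it.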

Mr. Arturo: interesting approach! This seems to connect with the theory of characteristic curves in PDEs (who knows whether it can be generalized to dimensions higher than 1), which are curves along which a PDE becomes an ODE, i.e., as you so brilliantly said, curves along which the PDE decouples.
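
To spell out the standard one-dimensional example of what I mean (my own illustration, not something from Mr. Arturo's argument): for the transport equation $u_t + c\,u_x = 0$, along a characteristic curve $x(t) = x_0 + ct$ one has $\frac{d}{dt}u(t, x(t)) = u_t + c\,u_x = 0$, so along that curve the PDE collapses to the ODE $\frac{d}{dt}u = 0$, i.e., $u$ is constant on characteristics.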

I included a reply to Mr. Arturo's observation, which I thought was interesting. Better to add a comment than to start a new reply.
