Matrices are a thing that people use

When I was in the ninth grade, I was taught how to find the determinant of a matrix.  For the uninitiated—or perhaps the spared—this procedure involves writing a bunch of numbers in rows and columns, and then meticulously multiplying, adding, and subtracting in a very specific order.  The larger the matrix, the more tedious this calculation becomes, and the more likely you are to make a mistake.

I didn’t really learn what a determinant was until my second year of college.  In fact, I wasn’t really sure what the purpose of a matrix was until college, either.  I knew they vaguely had something to do with solving systems of equations (another nightmare of a problem to do by hand).

I did not realize that they are actually one of the most useful tools in all of the applied sciences.

This disconnect stems from the very purpose of matrices clashing with how they were introduced to me: the reason we write data in matrices—the reason we define these strange multiplication rules—the reason we have these algorithms for finding determinants—is because it is convenient.

“But!” interjects Suzie, “You just finished complaining about how annoying it was to do anything with matrices by hand.”

That’s because no one in their right mind does anything involving individual matrix entries by hand unless they absolutely have to.  Matrices, and the broader formalism surrounding their theory—Linear Algebra—are extremely powerful tools that allow the efficient analysis of abstract objects called vector spaces.  The convenience of using matrices is inextricably tied to the fact that so many physical systems are vector spaces.  Essentially all of the intuition for what defines a vector space is in this image:

[Image: two arrows added tip-to-tail, the second starting where the first ends]

That is, vector spaces are those sorts of objects that can be added together in this “tip-to-tail” way, with some additional rules describing how to stretch or shrink the little arrows (vectors) when multiplied by a constant—rules that behave essentially how you would expect them to.  If you multiply a vector by 3, it becomes three times as long, while pointing in the same direction, etc.
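To make that concrete, here is a minimal sketch in Python with NumPy (my choice of tooling, not anything from the ninth-grade lesson) of those two rules in action:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])

# Tip-to-tail addition: walk along u, then along v.
print(u + v)  # [4. 1.]

# Multiplying by 3 triples the length without changing the direction.
w = 3 * u
print(np.linalg.norm(w) / np.linalg.norm(u))  # 3.0
```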

Seen in this way, matrices are just rows (or columns) of vectors.  Matrix multiplication, then, is just a procedure for acting on vectors with other vectors.  Finding a determinant is just a procedure for checking whether a set of vectors has a particular property (linear independence: a nonzero determinant means no vector in the set is redundant).
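Here is the same idea in code, again just an illustrative NumPy sketch:

```python
import numpy as np

# Two vectors stacked as the rows of a matrix.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Multiplying by a vector blends the columns of A into a new vector.
x = np.array([1.0, 1.0])
print(A @ x)             # [3. 7.]

# A nonzero determinant means the rows are linearly independent
# (equivalently, A is invertible).
print(np.linalg.det(A))  # -2.0

# Make the second row a multiple of the first and independence is lost.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(B))  # 0.0
```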

But no one told me this.  Until college.

Probably no one reading this with an education that included a differential equations class would be surprised by any of my claims; such a reader will likely have already used linear algebra to analyze a large dataset (with a computer, hopefully).  Indeed, trying to convince such readers of the importance of matrices is philosophically identical to trying to convince them of the usefulness of addition and subtraction.

Fewer people have an intuition for the sheer number of systems that follow these rules, or know that there are actually open problems pertaining to things as fundamental as matrix multiplication.  For example: the fastest possible algorithm for multiplying two matrices together is not known (a new record for the fastest asymptotic algorithm was actually set last year).
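For the curious, here is a sketch of the schoolbook algorithm those asymptotic records are measured against: three nested loops, roughly n³ multiplications for two n-by-n matrices.  The check against NumPy’s built-in multiply is just a sanity test.

```python
import numpy as np

def naive_matmul(A, B):
    """Schoolbook matrix multiplication: three nested loops, O(n^3).
    Strassen-style algorithms beat this asymptotically, and the optimal
    exponent is still an open problem."""
    n, m = A.shape
    m2, p = B.shape
    assert m == m2, "inner dimensions must match"
    C = np.zeros((n, p))
    for i in range(n):
        for j in range(p):
            for k in range(m):
                C[i, j] += A[i, k] * B[k, j]
    return C

A = np.random.rand(3, 3)
B = np.random.rand(3, 3)
print(np.allclose(naive_matmul(A, B), A @ B))  # True
```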

Few people who aren’t computer scientists know that graphics cards are essentially designed to optimize matrix and other vector-related operations, since the computationally intensive part of most modern video games amounts to tracing vectors through space.
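As a toy stand-in for what a GPU does millions of times per frame, here is one transformation matrix applied to a whole batch of points at once (the points and the 90-degree rotation here are invented for illustration):

```python
import numpy as np

# Rotate a batch of 2D points by 90 degrees with a single matrix multiply.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

points = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])  # one point per row

rotated = points @ R.T           # transform every point at once
print(np.round(rotated, 3))      # [[ 0.  1.] [-1.  0.] [-1.  1.]]
```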

Others probably don’t know that Google’s breakthrough search algorithm, PageRank, is an extremely clever matrix operation.
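The sketch below is not Google’s actual implementation, just the textbook idea behind it: build a link matrix, blend in a damping factor, and iterate until the importance scores settle.  The four-page link structure is invented for the example.

```python
import numpy as np

# A page's importance is the stationary vector of a link matrix,
# found here by power iteration on a made-up 4-page web.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n, d = 4, 0.85                   # d is the usual damping factor

# Column-stochastic transitions: M[j, i] = 1/outdegree(i) if i links to j.
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

G = d * M + (1 - d) / n          # blend in random jumps
rank = np.full(n, 1.0 / n)
for _ in range(100):             # power iteration converges quickly
    rank = G @ rank

print(np.round(rank, 3))         # importance scores summing to 1
```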

Few people other than physicists and chemists know that understanding linear algebra is 90% of the battle that is understanding Quantum Mechanics (Feynman might suggest that no one really wins the last 10%).
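If you want the flavor of that claim in a few lines: states are unit vectors, observables are Hermitian matrices, and measurement outcomes are eigenvalues.  A tiny sketch (the spin setup here is the standard textbook example, not anything from this post):

```python
import numpy as np

# The Pauli-Z "spin" observable: a Hermitian matrix.
Z = np.array([[1, 0],
              [0, -1]], dtype=complex)

# A superposition state: equal parts spin-up and spin-down.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

print(np.linalg.eigvalsh(Z))       # possible outcomes: [-1.  1.]
print(np.vdot(psi, Z @ psi).real)  # expected value: 0.0
```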

When divorced from their innumerable real world applications, matrices look silly, are a headache to write down, and seem pointlessly arcane.

But linear algebra is as important to the engine of modern scientific innovation as the digital computer.

1 comment

  1. The usefulness of Linear Algebra is something I didn’t appreciate until too late — halfway through grad school, when I didn’t want to take the time to sit through a proper class. So instead, I try to learn what I can on the side, while occasionally stumbling across problems that make me regret not understanding LA better (I’m a biologist).

    Some friends had told me that there is an excellent quantum mechanics book that introduces the concepts through LA. I forgot the name, might you know which one they were talking about?
