AlphaEvolve (an AI agent developed by Google) recently improved algorithms for matrix multiplication. So what are these algorithms, exactly?

Tensor Decompositions (Algorithms for Faster Matrix Multiplication)

Goal: explain the algorithms behind faster matrix multiplication. Don't worry, it won't involve any complex concepts.


1. First, what’s “matrix multiplication”?

Think of a matrix as a neatly arranged table of numbers—like rows and columns on a piece of squared paper.

|           | Column 1 | Column 2 |
| --------- | -------- | -------- |
| Row 1     | 2        | 3        |
| Row 2     | 4        | 1        |

If you have two such tables, “multiplying” them means:

  1. Pick a row from the first.
  2. Pick a column from the second.
  3. Multiply & add the matching pairs (like 2 × number + 3 × number).
  4. Write the answer in a new table.

For a 2 × 2 example you must do 8 tiny multiplications (4 cells × 2 numbers each).
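If you like seeing that in code, here is a tiny Python sketch of exactly this recipe for two made-up 2 × 2 tables (the numbers are arbitrary, chosen only for illustration):

```python
# Schoolbook multiplication of two 2x2 matrices, spelled out by hand.
A = [[2, 3],
     [4, 1]]
B = [[5, 7],
     [6, 8]]

C = [[0, 0], [0, 0]]
multiplications = 0
for row in range(2):          # 1. pick a row of A
    for col in range(2):      # 2. pick a column of B
        for k in range(2):    # 3. multiply & add the matching pairs
            C[row][col] += A[row][k] * B[k][col]
            multiplications += 1

print(C)                # [[28, 38], [26, 36]]
print(multiplications)  # 8
```

The innermost line runs exactly 8 times, which is where the count of 8 tiny multiplications comes from.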


2. What’s a “tensor”?

A tensor is just a fancy word for an array that can be 1-D (a list), 2-D (a matrix), or even 3-D and beyond (like stacking many matrices into a cube).
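In code terms this is just a question of how many directions the numbers extend in. A minimal NumPy sketch (with made-up numbers):

```python
import numpy as np

vector = np.array([1, 2, 3])             # 1-D: a list
matrix = np.array([[2, 3], [4, 1]])      # 2-D: a table (rows and columns)
cube   = np.zeros((2, 2, 2))             # 3-D: two 2x2 matrices stacked into a cube

print(vector.ndim, matrix.ndim, cube.ndim)  # 1 2 3
```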


3. Decomposition: smashing a big thing into bite-size blocks

“Tensor decomposition” means breaking that cube of numbers into simpler bricks that are easy to handle.

| LEGO Picture | Math Name         | Idea in kid words                                           |
| ------------ | ----------------- | ----------------------------------------------------------- |
| 🟦           | **Rank-1 piece**  | A single sheet of numbers made from one column × one row.   |
| 🟦+🟩+🟥    | **Sum of pieces** | Build the whole cube by stacking just a few colored bricks. |

If you can rebuild the original cube with fewer bricks than expected, you’ve found a shortcut!
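Here is a hedged NumPy sketch of what one brick looks like: a rank-1 sheet is one column times one row, and a rank-1 brick of a 3-D cube uses three small vectors instead of two (every number below is made up for illustration):

```python
import numpy as np

# A rank-1 "sheet": one column times one row fills a whole 2x2 matrix,
# yet it is described by only 2 + 2 = 4 numbers.
col = np.array([2, 4])
row = np.array([3, 1])
sheet = np.outer(col, row)                  # [[6, 2], [12, 4]]

# A rank-1 "brick" for a 3-D cube: three small vectors, multiplied in every combination.
u, v, w = np.array([1, 0]), np.array([0, 1]), np.array([1, 1])
brick = np.einsum('i,j,k->ijk', u, v, w)    # a 2x2x2 cube from just 6 numbers

# A tensor decomposition writes the original cube as the SUM of a few such bricks.
```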


4. How does that speed up matrix multiplication?

The rules for multiplying two 2 × 2 matrices can be written as one special 3-D tensor (think “instruction cube”).

  • Normal way: needs 8 little multiplications.
  • Strassen’s clever way (1969): notices that the instruction cube can be split into just 7 bricks, so only 7 multiplications are needed!

| Method     | # of little multiplications | What changed?               |
| ---------- | --------------------------- | --------------------------- |
| Schoolbook | 8                           | No shortcuts                |
| Strassen   | 7                           | Uses a smarter brick layout |

For big tables (256 × 256, 1000 × 1000 …) these saved steps add up to huge time savings on computers.
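Below is a small Python sketch of Strassen’s seven-product recipe for a single 2 × 2 multiplication, using his standard combinations (often labelled M1 through M7); the function name and the test numbers are just for illustration:

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices with 7 multiplications instead of 8 (Strassen, 1969)."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    # The 7 "bricks": each one is a SINGLE multiplication of two sums.
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # The answer cells are just additions and subtractions of those 7 products.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

print(strassen_2x2([[2, 3], [4, 1]], [[5, 7], [6, 8]]))  # [[28, 38], [26, 36]]
```

It gives the same answer as the schoolbook sketch above, but with one fewer multiplication per 2 × 2 block; applied recursively to big matrices, that saving compounds.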


5. A pocket-size picture of the idea

Big-Cube  =  Brick 1
           + Brick 2
           + Brick 3
           + …

If each Brick is simple (just a row multiplied by a column), the computer can reuse tiny results instead of starting from scratch for every cell.
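To connect the pictures, here is a hedged NumPy sketch that builds the 4 × 4 × 4 “instruction cube” for 2 × 2 multiplication and checks that Strassen’s seven bricks really do add back up to it (flattening the matrix entries in the order [11, 12, 21, 22] is just a convention chosen here):

```python
import numpy as np

# Instruction cube T for 2x2 matrix multiplication:
# T[i, j, k] = 1 when (entry i of A) * (entry j of B) contributes to entry k of C.
T = np.zeros((4, 4, 4), dtype=int)
for r in range(2):
    for c in range(2):
        for k in range(2):
            T[2 * r + k, 2 * k + c, 2 * r + c] = 1

# Strassen's 7 bricks: row r of U, V, W gives the three small vectors of brick r.
U = np.array([[1, 0, 0, 1], [0, 0, 1, 1], [1, 0, 0, 0], [0, 0, 0, 1],
              [1, 1, 0, 0], [-1, 0, 1, 0], [0, 1, 0, -1]])
V = np.array([[1, 0, 0, 1], [1, 0, 0, 0], [0, 1, 0, -1], [-1, 0, 1, 0],
              [0, 0, 0, 1], [1, 1, 0, 0], [0, 0, 1, 1]])
W = np.array([[1, 0, 0, 1], [0, 0, 1, -1], [0, 1, 0, 1], [1, 0, 1, 0],
              [-1, 1, 0, 0], [0, 0, 0, 1], [1, 0, 0, 0]])

rebuilt = sum(np.einsum('i,j,k->ijk', U[r], V[r], W[r]) for r in range(7))
print(np.array_equal(rebuilt, T))   # True: 7 simple bricks rebuild the whole cube
```

Finding an even smaller pile of bricks for bigger cubes (3 × 3, 4 × 4, …) is exactly the game that faster matrix-multiplication algorithms, and searches like AlphaEvolve’s, are playing.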


6. Why should anyone care?

  • Faster phone cameras (they use matrices for image tricks).
  • Quicker training of AI models (they multiply gigantic matrices).
  • Less electricity used in data centers (fewer calculations).

7. Key take-aways

  • Matrix multiplication = lots of “multiply then add.”
  • The instructions for that job live inside a 3-D tensor.
  • Tensor decomposition = rewriting those instructions with fewer, simpler bits.
  • Clever decompositions (like Strassen’s 7-brick trick) make computers finish the job faster.

You’ve just met an idea at the heart of modern super-speed math—without leaving grade-school arithmetic!