Einstein summation or einsum

Today I was studying the paper dEFEND: Explainable Fake News Detection and its code, where I came across a line using einsum, short for Einstein summation.

What is einsum?

Basically, you give it an equation, and it produces output according to that equation. For example, suppose we have two matrices m1 and m2, where m1 has shape ij and m2 has shape jk. After multiplication the result has shape ik, so the equation becomes:

ij,jk → ik

In TensorFlow, this becomes:

tf.einsum('ij,jk->ik', m1, m2)
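
To make this concrete, here is a minimal sketch (the tensor names and the shapes 2x3 and 3x4 are just made-up values for illustration, not from the paper) showing that this einsum equation is ordinary matrix multiplication:

import tensorflow as tf

# Two example matrices: m1 is (2, 3), i.e. shape "ij"; m2 is (3, 4), i.e. shape "jk"
m1 = tf.random.normal((2, 3))
m2 = tf.random.normal((3, 4))

# Matrix multiplication written as an einsum equation
out = tf.einsum('ij,jk->ik', m1, m2)

print(out.shape)  # (2, 4), i.e. shape "ik"
# This should match tf.matmul(m1, m2) up to floating-point rounding.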

But what if we chain three matrices together, as in the paper's code?

tf.einsum('btd,dD,bDn->btn', m1, m2, m3)

In the line above, the first matrix multiplication is performed between m1 of shape btd and m2 of shape dD, producing an intermediate of shape btD. That intermediate is then multiplied (batch-wise over b) with m3 of shape bDn, and the output ends up with shape btn. If you want to read further about it, please see the TensorFlow documentation for tf.einsum.
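
Here is a small sketch of how the three-operand version behaves. The concrete sizes (b=2, t=5, d=3, D=4, n=6) are my own example values, not from the paper:

import tensorflow as tf

b, t, d, D, n = 2, 5, 3, 4, 6

m1 = tf.random.normal((b, t, d))   # shape "btd"
m2 = tf.random.normal((d, D))      # shape "dD"
m3 = tf.random.normal((b, D, n))   # shape "bDn"

# All three contractions in a single einsum call
out = tf.einsum('btd,dD,bDn->btn', m1, m2, m3)
print(out.shape)  # (2, 5, 6), i.e. shape "btn"

# The same computation, written step by step:
step1 = tf.einsum('btd,dD->btD', m1, m2)      # intermediate of shape "btD"
step2 = tf.einsum('btD,bDn->btn', step1, m3)  # batched multiply with m3
# step2 should match out up to floating-point rounding.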