Recent Posts

Attention module in JAX

9 minute read

The attention module is the key ingredient of a transformer layer. In this blog post we will show how to implement it from scratch in JAX alongs...

Multi-chip performance in JAX

4 minute read

As models grow larger, it becomes increasingly necessary to train machine learning models across multiple chips. In this blog po...