In 2017, eight machine-learning researchers at Google released a groundbreaking research paper called "Attention Is All You Need," which introduced the Transformer AI architecture that underpins almost all of today's high-profile generative AI models. The Transformer has made a key component of the modern AI boom possible: using a neural network to translate (or transform, if you will) input chunks of data called "tokens" into a desired form of output.
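To make "tokens" concrete, here is a toy sketch of tokenization, the step that turns text into the integer IDs a Transformer actually consumes. This is a hypothetical whitespace scheme for illustration only; real models use learned subword tokenizers with vocabularies of tens of thousands of entries.

```python
def toy_tokenize(text, vocab):
    """Split text on whitespace and map each word to an integer ID.

    Unknown words fall back to the "<unk>" ID. This is a simplified
    stand-in for the subword tokenizers real Transformers use.
    """
    return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]


# A tiny hypothetical vocabulary mapping words to IDs.
vocab = {"<unk>": 0, "attention": 1, "is": 2, "all": 3, "you": 4, "need": 5}

ids = toy_tokenize("Attention is all you need", vocab)
print(ids)  # [1, 2, 3, 4, 5]
```

The model never sees raw text, only sequences of IDs like these; its job is to transform that input sequence into an output sequence, which is then decoded back into text.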