1830s Transformer and Embedding Model
I've been watching Josh Starmer's videos about transformers and wanted to apply what I was learning! I trained a transformer model on religious and historical texts from the 1830s, particularly the Book of Mormon. The project demonstrates how to train a language model from scratch and then extract its embedding layer into a standalone embedding model.
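For anyone curious what "extracting the embedding layer" looks like in practice, here is a minimal PyTorch sketch of the idea. The model name, dimensions, and vocabulary size (TinyTransformerLM, d_model=128, vocab of 8,000 tokens) are illustrative assumptions, not the actual architecture or settings from the repo:

```python
import torch
import torch.nn as nn

class TinyTransformerLM(nn.Module):
    """Hypothetical minimal language model, not the exact architecture from the repo."""

    def __init__(self, vocab_size, d_model=128, n_heads=4, n_layers=2, max_len=256):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)   # the layer we'll extract later
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, token_ids):
        seq_len = token_ids.size(1)
        positions = torch.arange(seq_len, device=token_ids.device)
        x = self.token_emb(token_ids) + self.pos_emb(positions)
        # Causal mask so each position only attends to earlier tokens.
        mask = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool, device=token_ids.device),
            diagonal=1,
        )
        x = self.encoder(x, mask=mask)
        return self.lm_head(x)

model = TinyTransformerLM(vocab_size=8000)
# ... training loop over the 1830s corpus would go here ...

# After training, pull out the learned embedding table on its own and save it.
embedding_layer = model.token_emb
torch.save(embedding_layer.state_dict(), "embeddings.pt")

# The extracted layer maps token ids to vectors without the rest of the model.
vectors = embedding_layer(torch.tensor([[1, 2, 3]]))
print(vectors.shape)  # torch.Size([1, 3, 128])
```

The nice part of this approach is that the embedding table is just an nn.Embedding, so once it's saved it can be reused for similarity search or other downstream tasks without carrying the full transformer around.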
I learned a lot about PyTorch, transformers, embedding models, and HuggingFace through completing this project.
Check out the GitHub Repo to learn more!