The landing pages of oerum.org are a growing archive of online experiments and prototypes that don’t belong anywhere else - work that sits between artistic practice and research, not quite either. Local Trainer is the current landing page: the current sum of a (mis)understanding. Programming a primitive version of a language model makes its principles and material reality a little easier to grasp than reading about them does.
It explains and enables the training of a small language model in the browser, using the same underlying principles as large language models - tokenisation, embeddings, gradient descent, temperature sampling - with annotations tracing the mathematics and linguistics behind each step. You can train a model on your own text and see the results.

It is also part of an ongoing attempt to understand these tools in practice: as an artist working with them, and as a citizen in a world where they are apparently going to shape both daily life and political arguments about what kind of society we want. Hands-on and DIY. Training locally on a specific corpus, on modest hardware, with no subscription and no infrastructure dependency, is a different relationship to the technology than most people have access to. Not a solution to anything, but a way of staying close to the material while trying to reduce the damage.
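To give a flavour of one of the steps named above: temperature sampling is small enough to fit in a few lines of browser JavaScript. This is a minimal sketch, not Local Trainer's actual code - the function names are mine. The model's raw scores (logits) are divided by a temperature, turned into probabilities with a softmax, and a token is drawn; low temperatures sharpen the distribution toward the most likely token, high temperatures flatten it.

```javascript
// Turn logits into probabilities, scaled by temperature.
// Subtracting the max before exponentiating keeps the numbers stable.
function softmaxWithTemperature(logits, temperature) {
  const scaled = logits.map(l => l / temperature);
  const max = Math.max(...scaled);
  const exps = scaled.map(s => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);
}

// Draw one token index from the distribution.
function sampleToken(logits, temperature, rand = Math.random) {
  const probs = softmaxWithTemperature(logits, temperature);
  let r = rand();
  for (let i = 0; i < probs.length; i++) {
    r -= probs[i];
    if (r <= 0) return i;
  }
  return probs.length - 1; // guard against floating-point leftovers
}
```

At temperature 1 the logits speak for themselves; pushing the temperature toward 0 makes sampling approach a deterministic argmax, which is why low-temperature output reads as repetitive and high-temperature output as erratic.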