Turning my local model output into study material ...
This desktop app for hosting and running LLMs locally is rough in a few spots, but still useful right out of the box.
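To give a rough idea of the workflow in the title: most local hosting apps like this expose an OpenAI-compatible HTTP endpoint on localhost, so a short script can ask the model to rewrite raw notes as flashcards and save them as a CSV for a spaced-repetition app. This is only a sketch under that assumption; the endpoint URL, port, and model name below are placeholders, not details confirmed for this particular app.

```python
# Minimal sketch: send notes to a locally hosted model and save Q/A flashcards as CSV.
# Assumptions (placeholders, adjust for your setup): the app serves an
# OpenAI-compatible chat endpoint on localhost; ENDPOINT and MODEL below are made up.
import csv
import json
import urllib.request

ENDPOINT = "http://localhost:1234/v1/chat/completions"  # placeholder URL/port
MODEL = "local-model"                                    # placeholder model name

def make_flashcards(notes: str) -> list[tuple[str, str]]:
    """Ask the local model for 'Q: ... | A: ...' lines and parse them into pairs."""
    payload = {
        "model": MODEL,
        "messages": [
            {"role": "system",
             "content": "Turn the user's notes into flashcards. "
                        "Output one card per line as 'Q: <question> | A: <answer>'."},
            {"role": "user", "content": notes},
        ],
        "temperature": 0.2,
    }
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)["choices"][0]["message"]["content"]

    # Keep only lines that match the requested 'Q: ... | A: ...' shape.
    cards = []
    for line in reply.splitlines():
        if line.startswith("Q:") and "| A:" in line:
            question, answer = line.split("| A:", 1)
            cards.append((question[2:].strip(), answer.strip()))
    return cards

if __name__ == "__main__":
    notes = open("notes.txt", encoding="utf-8").read()
    with open("flashcards.csv", "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(make_flashcards(notes))
```

The CSV output is just question,answer rows, which Anki and most other flashcard tools can import directly.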