I was at a hackathon in Austin last weekend and saw a team use a local LLM to run their whole demo without any internet.
I mean, it was just a laptop with Ollama on it, but it totally changed how I think about needing cloud APIs for every single project idea I have.
sarahsullivan · 1h ago
What model were they running, and how fast was it?