
Honestly, Ollama (posted by someone earlier) was surprisingly simple to use. If you have WSL (on Windows) or Linux/OSX, it's a one-line install and a one-line use. I was up and running on a lowly 6GB VRAM GPU in about 3 minutes (the time it took for the initial download of models).
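For reference, roughly what those two lines look like (a minimal sketch; the install script path is assumed from the ollama.ai docs, and mistral is the model tag from the link below):

    # one-line install on Linux / WSL (fetches and runs the official install script)
    curl https://ollama.ai/install.sh | sh

    # one-line use: downloads the Mistral model on first run, then opens an interactive prompt
    ollama run mistral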

Installing WSL (on Windows) is similarly straightforward nowadays. In your search bar, look up the Microsoft Store, open the app, search for Ubuntu, install it, run it, then follow the one-liner for installing Ollama.
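If you'd rather skip the Store, a rough terminal equivalent (assuming a reasonably recent Windows 10/11 build where wsl supports the --install flag):

    # from an elevated PowerShell or Command Prompt; sets up WSL2 and installs Ubuntu
    wsl --install -d Ubuntu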

https://ollama.ai/library/mistral




But will that WSL Ubuntu be able to access my GPU card? Do I need to install drivers for that, at untold CLI pain?
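One quick way to check, assuming an NVIDIA card with a recent Windows driver (which is what exposes the GPU to WSL2, rather than a separate driver installed inside the distro):

    # run inside the WSL Ubuntu shell; if passthrough works, this lists the GPU
    nvidia-smi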



