
I look more to Google for efficient, inexpensive LLM APIs, and similarly to Groq Cloud for fast, inexpensive inference with open models.

ChatGPT is a nice consumer product, and I like it too.

Google gets a bad rap on privacy, but if you read the documentation and configure the privacy settings, I find them reasonable. (I read OpenAI’s privacy docs for a long while before experimenting with their integrations for the Mac terminal, VSCode, and IntelliJ.)

We live in a cornucopia of AI tools. Occasionally, just for the hell of it, I do all my research work for several days using only open models running on my Mac via Ollama. I notice a slight hit in productivity, but it is still a good setup.
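
For anyone curious what that local setup looks like in practice, here is a minimal sketch of querying a locally running model through Ollama's REST API. The model name ("llama3") and the prompt are just placeholders, and it assumes Ollama is running on its default port with that model already pulled:

    # Minimal sketch: query a local Ollama model over its REST API.
    # Assumes Ollama is serving on the default port (11434) and that a
    # model such as "llama3" has already been pulled with `ollama pull`.
    import json
    import urllib.request

    def ask_local_model(prompt: str, model: str = "llama3") -> str:
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # ask for one complete response instead of a stream
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask_local_model("Summarize the trade-offs of local vs. hosted LLMs."))

Because everything stays on localhost, nothing leaves the machine, which is the whole appeal of the offline-for-a-few-days experiment.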

Something for everyone!
