
If anyone wants to eval this locally versus codellama, it's pretty easy with Ollama[0] and Promptfoo[1]:

  prompts:
    - "Solve in Python: {{ask}}"

  providers:
    - ollama:chat:codellama:7b
    - ollama:chat:codegemma:instruct

  tests:
    - vars:
        ask: function to return the nth number in fibonacci sequence
    - vars:
        ask: convert roman numeral to number
    # ...
YMMV depending on your coding tasks, but I notice codegemma is much less verbose by default.

[0] https://github.com/ollama/ollama

[1] https://github.com/promptfoo/promptfoo
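For anyone trying this, a rough sketch of the commands involved (assuming Ollama is installed and running locally, and the config above is saved as promptfooconfig.yaml):

  # pull both models locally
  ollama pull codellama:7b
  ollama pull codegemma:instruct

  # run the eval against the config in the current directory
  npx promptfoo@latest eval

  # browse side-by-side results in a local web UI
  npx promptfoo@latest view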



