
That could end up with significant false positives when updating the thing without updating the estimated memory usage (especially if slow, bounded fragmentation accumulates over a long period of time). You might also have some expected ratio of memory usage (e.g. process B uses 3x the memory of process A), but want to allow the absolute usage to grow as more data is processed.
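A minimal sketch of what that ratio-based check might look like, assuming hypothetical names (`EXPECTED_RATIO`, `TOLERANCE`, RSS figures in MB) not taken from the comment: it compares the two processes' usage against the expected ratio rather than a fixed cap, so the absolute numbers are free to grow with the data.

```python
# Sketch: flag on the *ratio* of two processes' memory use instead of
# a fixed absolute limit, so absolute usage can grow as data grows.
# EXPECTED_RATIO and TOLERANCE are illustrative assumptions.

EXPECTED_RATIO = 3.0   # process B expected to use ~3x process A
TOLERANCE = 0.5        # allowed relative drift before flagging

def ratio_alert(rss_a: float, rss_b: float) -> bool:
    """Return True when B deviates from the expected ratio to A,
    regardless of how large the absolute numbers have become."""
    if rss_a <= 0:
        return False  # nothing meaningful to compare against yet
    ratio = rss_b / rss_a
    return abs(ratio - EXPECTED_RATIO) / EXPECTED_RATIO > TOLERANCE

# Absolute growth passes: 100/300 MB and 4000/12000 MB are both ~3x.
assert not ratio_alert(100, 300)
assert not ratio_alert(4_000, 12_000)
# A stale fixed limit might have killed the 12 GB process; the ratio
# check only fires when the relationship itself breaks:
assert ratio_alert(100, 600)
```

The trade-off is that a ratio check can't catch both processes leaking in lockstep, so it complements rather than replaces an absolute ceiling.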



Having the wrong limits will cause the wrong thing to die, regardless of whether they kick in via the OOM killer or the orchestrator. Better to find out during your planned rollout window, if you ask me.



