
The sad thing is that everyone pushes so hard on understanding Big O, and yet in my 23 years of working since learning it, it's never come up beyond "you have some nested loops, that's gonna be a bad time." I'm glad it was introduced in college and that there is general awareness of it, but people seem to stress it way too hard, especially in interviews.

Like, once you write a few functions you sort of just intuitively know the O runtime of the algorithm. Once you store a few things in memory, you intuitively know the O space of the storage. And when it breaks, you look for somewhere that you introduced n^2 and fix it.
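As a rough sketch of what I mean (made-up names, Python just for illustration): the classic accidental n^2 is a linear scan hiding inside a loop, and the "fix" is usually a one-line change of data structure, not any formal analysis.

    # accidental O(n^2): list membership check inside a loop
    def dedupe_slow(items):
        seen = []
        out = []
        for x in items:
            if x not in seen:  # O(n) scan on every iteration
                seen.append(x)
                out.append(x)
        return out

    # the fix: a set makes the membership check O(1) on average
    def dedupe_fast(items):
        seen = set()
        out = []
        for x in items:
            if x not in seen:
                seen.add(x)
                out.append(x)
        return out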

Unless you're optimizing search algorithms, it almost never matters in practice. Even that doubly nested loop won't matter until you have a big dataset.



