
I am always wondering how much we’re expected to understand of the code we’re working on.

It is very easy to just say people should know everything, but in practice people don’t review every single library, don’t defend against every possible edge case, and certainly don’t keep everything in memory while working on any single problem.

I really can’t imagine someone “fully understanding” how the Linux kernel works, for instance, or even a full graphics card driver, or how the software on a POS terminal works, as working knowledge.

So yeah, people cause bugs because they don’t know it all, but I’m not sure that’s saying much.

PS: real progress is probably made by starting from the assumption that people don’t understand everything, and building frameworks that keep the system manageable under that premise.

Total newb here, but according to Stroustrup, the whole point of programming is so you don't have to know everything. You're responsible for:

A. Understanding relevant computing fundamentals

B. Understanding the interfaces to code you use

C. Writing reliable implementations and documenting them well

D. Writing useful and appropriate interfaces for your implementations

In a perfect world, this creates a nice chain where every programmer only needs to look after these four points (see the sketch below) and then everything works seamlessly.

Too bad we don't live there.
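For what it's worth, here's a minimal C++ sketch of points B, C, and D. The IntStack class and its methods are made up for illustration, not anything from Stroustrup: the idea is just that callers depend on the small documented interface, never on the container hiding behind it.

    #include <iostream>
    #include <stdexcept>
    #include <vector>

    // D: a small, documented interface. The contract callers need:
    // push() stores a value, pop() removes and returns the most
    // recent one, and pop() on an empty stack throws std::out_of_range.
    class IntStack {
    public:
        void push(int value);
        int  pop();           // throws std::out_of_range if empty
        bool empty() const;

    private:
        // C: the storage choice (a std::vector) is an implementation
        // detail; it could change without affecting any caller.
        std::vector<int> data_;
    };

    void IntStack::push(int value) { data_.push_back(value); }

    int IntStack::pop() {
        if (data_.empty())
            throw std::out_of_range("pop on empty IntStack");
        int v = data_.back();
        data_.pop_back();
        return v;
    }

    bool IntStack::empty() const { return data_.empty(); }

    // B: this caller relies only on the documented interface above,
    // not on how IntStack stores its elements.
    int main() {
        IntStack s;
        s.push(1);
        s.push(2);
        while (!s.empty())
            std::cout << s.pop() << '\n';  // prints 2, then 1
    }

The chain holds exactly as long as the documentation (e.g. "pop() throws on empty") matches what the implementation actually does; the bugs the parent comment describes live in that gap.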
