
When I was taking my CS degree, the definition of big data for data-intensive applications was exactly that: you cannot have it all in memory, and you will have to use the disk.

If it fits in memory, you can honestly apply normal algorithmic analysis and optimize for memory access and CPU cycles. Once it no longer fits in memory, you become severely limited by IO.
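As a minimal sketch of that shift, here is an external merge sort in Python: each chunk is sorted with an ordinary in-memory sort while it fits in RAM, and once the data exceeds that budget the work becomes spilling sorted runs to disk and merging them, where IO dominates. The function name `external_sort` and the `max_lines_in_memory` parameter are illustrative, not from the comment above.

    import heapq
    import itertools
    import tempfile

    def external_sort(input_path, output_path, max_lines_in_memory=1_000_000):
        # Phase 1: read chunks that fit in memory, sort each one in RAM,
        # and spill the sorted runs to temporary files on disk.
        chunk_files = []
        with open(input_path) as src:
            while True:
                chunk = list(itertools.islice(src, max_lines_in_memory))
                if not chunk:
                    break
                chunk.sort()  # ordinary in-memory sort while the data fits
                tmp = tempfile.TemporaryFile(mode="w+")
                tmp.writelines(chunk)
                tmp.seek(0)
                chunk_files.append(tmp)

        # Phase 2: k-way merge the sorted runs with a heap. Memory stays at
        # roughly one line per run, but every byte is read from and written
        # back to disk, so the cost is now dominated by IO, not CPU cycles.
        with open(output_path, "w") as dst:
            dst.writelines(heapq.merge(*chunk_files))

        for tmp in chunk_files:
            tmp.close()

Assuming newline-terminated input lines, the merge phase streams the runs through `heapq.merge`, so total memory use is bounded by the chunk size regardless of how large the input file is.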




