
Binaries need to target a minimum subset, but can use run-time feature detection to query for differences and opt into extra instructions.

If a binary does that, the OS can just not migrate that binary across cores.




How would the OS know?

The only mechanism I can come up with is to detect illegal instruction traps on small cores, then flag the thread as big-core-only and restart execution at the faulting instruction. That's not ideal but maybe workable.

(If it traps on the big core, too, it's just a bad instruction and SIGILL is raised to userspace like usual.)



