Milepost GCC is interesting. Regehr does not believe in it, though: "machine learning in compilers will not end up being fundamental."

His argument is flawed, though. He says "I could always get a good optimization result by running all of my optimization passes until a fixpoint is reached," but unfortunately there is no such fixpoint. Many optimizations reverse each other (e.g. loop fusion vs. loop splitting), or just arbitrarily pick one normal form among equivalent ones (e.g. 2*x vs. x+x vs. x<<1).
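
To make the ping-pong concrete, here is a toy sketch (Python, with two made-up string-rewriting "passes" standing in for real compiler passes): each undoes the other's preferred normal form, so a run-until-fixpoint driver never settles.

    # Toy sketch: two hypothetical passes whose preferred normal forms
    # conflict, so "run all passes until a fixpoint" never converges.
    def strength_reduce(e):      # prefers shifts
        return e.replace("2*x", "x<<1")

    def canonicalize(e):         # prefers multiplies
        return e.replace("x<<1", "2*x")

    passes = [strength_reduce, canonicalize]
    e = "2*x"
    for i in range(6):
        before, e = e, passes[i % len(passes)](e)
        print(before, "->", e)   # oscillates: 2*x -> x<<1 -> 2*x -> ...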

You can build a superoptimizer that constructs all variations (e.g. equality saturation, http://portal.acm.org/citation.cfm?doid=1480881.1480915), but that is not a fixpoint search; it is an optimization problem of choosing the least-cost alternative. And you cannot construct all variations anyway: for a simple example, consider unrolling an infinite loop.
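
The selection step, stripped of the e-graph machinery described in the paper, amounts to picking the cheapest member of an equivalence class under some cost model. A minimal sketch with invented costs:

    # Minimal sketch of cost-based selection over one equivalence class
    # (the operator costs are made up for illustration).
    COST = {"*": 3, "+": 2, "<<": 1}

    def cost(ops):
        return sum(COST[op] for op in ops)

    variants = {"2*x": ["*"], "x+x": ["+"], "x<<1": ["<<"]}
    best = min(variants, key=lambda v: cost(variants[v]))
    print(best)                  # x<<1 under this cost model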

Hence, unlike Regehr, I would not write off machine learning. I would not bet on it either, though. ;)
