
Most escape analysis these days is interprocedural, and as long as it can see all the things that happen to the variable, it is fine.

At the very least, most compilers support doing interprocedural escape analysis, whether it's on by default at basic optimization levels or not.

In that sense, this is not a very hard problem - it was solvable in the '70s.

If you give the compiler all the source, or have libraries compiled as fat lto objects, etc, it should work fine.



Interesting. Thin LTO wouldn't be enough?


It should be, actually, as long as you give it all the source code or give it object files with the summary data in them.

The summary mod/ref info should be enough to get it right, because it should say the function does not modify the argument, and that should get propagated through.

The tricky case is where it sometimes modifies the argument - the summary data is not flow sensitive, IIRC.

Flow-sensitive algorithms are still really expensive in the worst case (N^3 minimum, sometimes exponential), but could be made practical here using BDDs. Most compilers still do not use BDDs to represent this sort of large binary-decision data, even though they are very good at it. It's an area where research always felt more advanced than production (i.e., even if you don't go whole-hog on Datalog, the data structures are still very useful for this sort of otherwise memory/time-intensive data).

In any case, fat LTO should definitely be enough.




