Selective Specialization for Object-Oriented Languages
Jeffrey Dean, Craig Chambers, and David Grove
Dynamic dispatching is a major source of run-time overhead in
object-oriented languages, due both to the direct cost of method
lookup and to the indirect effect of preventing other optimizations. To
reduce this overhead, optimizing compilers for object-oriented
languages analyze the classes of objects stored in program variables,
with the goal of bounding the possible classes of message receivers
enough so that the compiler can uniquely determine the target of a
message send at compile time and replace the message send with a
direct procedure call. Specialization is one important technique for
improving the precision of this static class information: by compiling
multiple versions of a method, each applicable to a subset of the
method's possible argument classes, the compiler obtains more precise
static information about the classes of the method's arguments.
Previous specialization strategies have not been selective about
where this technique is applied, and have therefore tended to increase
compile time and code space usage significantly, particularly for large
applications. In this paper, we present a more general framework for
specialization in object-oriented languages and describe a goal-
directed specialization algorithm that makes selective decisions to
apply specialization to those cases where it provides the highest
benefit. Our results show that our algorithm improves the
performance of a group of sizeable programs by 65% to 275% while
increasing compiled code space requirements by only 4% to 10%.
Moreover, when compared to the previous state-of-the-art
specialization scheme, our algorithm improves performance by 11%
to 67% while simultaneously reducing code space requirements by
65% to 73%.
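
The core idea can be illustrated with a small, hand-written Java sketch (the paper itself targets Cecil and uses a compiler-driven transformation; the Shape hierarchy and method names below are purely hypothetical). The general method sees only that its argument is "some Shape", so the send inside its body must be dynamically dispatched; a version specialized to one concrete argument class lets that send be statically bound and inlined.

```java
// Minimal sketch of method specialization, assuming a hypothetical Shape hierarchy.
abstract class Shape {
    abstract double area();
}

class Circle extends Shape {
    final double r;
    Circle(double r) { this.r = r; }
    double area() { return Math.PI * r * r; }
}

class Square extends Shape {
    final double side;
    Square(double side) { this.side = side; }
    double area() { return side * side; }
}

class SpecializationSketch {
    // General version: s.area() is a dynamically dispatched send, because the
    // static class information for s only says "some subclass of Shape".
    static double scaledArea(Shape s, double factor) {
        return factor * s.area();
    }

    // Hand-written analogue of a compiler-generated specialization for receivers
    // known to be Circles: the send is statically bound and inlined away.
    static double scaledAreaForCircle(Circle c, double factor) {
        return factor * (Math.PI * c.r * c.r);  // direct computation, no dispatch
    }

    public static void main(String[] args) {
        Shape s = new Circle(2.0);
        System.out.println(scaledArea(s, 3.0));                     // dispatched
        System.out.println(scaledAreaForCircle(new Circle(2.0), 3.0)); // specialized
    }
}
```

A selective strategy, as described in the paper, would generate such specialized versions only where profile or analysis information indicates a high payoff, rather than for every method and argument class combination.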
Proceedings of PLDI '95, La Jolla, CA, June 1995.
An earlier version of this paper appeared in PEPM '94.