SLIDE 37
There is an interesting problem with mark-and-sweep, already in the mark phase. Marking is a depth-first traversal, and if done recursively it needs a call stack for that. Even without recursion, with an explicit stack, the marking needs dynamic memory to do its job! In the worst case the size of the stack corresponds to the size of the heap. Appel calls that unacceptable.
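To make the problem concrete, here is a minimal sketch (my own, not from the book; the Node layout is hypothetical) of marking with an explicit stack. Note how the stack itself consumes dynamic memory, in the worst case proportional to the number of heap nodes:

```python
class Node:
    """Hypothetical heap node: a mark bit plus outgoing pointers."""
    def __init__(self, *children):
        self.marked = False
        self.children = list(children)

def mark(roots):
    # Explicit DFS stack replacing recursion. In the worst case (e.g. long
    # chains of yet-unvisited nodes pushed before being popped) the stack
    # grows to O(size of the heap).
    stack = list(roots)
    while stack:
        node = stack.pop()
        if node is None or node.marked:
            continue
        node.marked = True
        stack.extend(node.children)
```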
Pointer reversal (the Deutsch-Schorr-Waite algorithm) is a clever technique to deal with this “stack-memory” problem.
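A sketch of the idea (my own; assuming binary nodes with two pointer fields, and a small per-node state counter standing in for the tag bits a real collector would pack into the object header): the traversal stores its “return path” in the reversed pointer fields themselves, so no extra stack is needed, and the pointers are restored on the way back up.

```python
class Node:
    # Hypothetical two-pointer heap node. `state` plays the role of the
    # header tag bits; a real collector would also reset them afterwards.
    def __init__(self, left=None, right=None):
        self.left, self.right = left, right
        self.marked = False
        self.state = 0   # 0: new, 1: left ptr reversed, 2: right ptr reversed

def dsw_mark(root):
    """Deutsch-Schorr-Waite marking: constant extra space."""
    prev, cur = None, root
    while cur is not None:
        cur.marked = True
        if cur.state == 0:
            cur.state = 1
            if cur.left is not None and not cur.left.marked:
                # descend left, reversing the pointer to remember the way back
                cur.left, prev, cur = prev, cur, cur.left
        elif cur.state == 1:
            cur.state = 2
            if cur.right is not None and not cur.right.marked:
                cur.right, prev, cur = prev, cur, cur.right
        else:
            # both children done: retreat, restoring the reversed pointer
            if prev is None:
                return
            if prev.state == 1:
                prev.left, cur, prev = cur, prev, prev.left
            else:
                prev.right, cur, prev = cur, prev, prev.right
```

The mark bits double as visited-flags, so sharing and cycles in the heap graph are handled: a pointer to an already-marked node is simply not followed.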
Presentation in [Cooper and Torczon, 2004]
1. Overview

This book refers very little to the notion of run-time environment. It’s all contained in [Cooper and Torczon, 2004, Chapter 6: the procedure abstraction]. They presumably concentrate on the treatment of procedures (at run-time), i.e., the stack as the most important part of the run-time environment. They cover, however, also other aspects like the heap etc., but the focus is on procedures, on achieving “the procedure abstraction”, and on the famous “Algol-like” languages, which are basically exactly that: languages with procedures as the primary abstraction mechanism and a stack-based organization of the run-time environment. So basically, no closures or higher-order functions. They seem also to count object-oriented languages among those, but not everyone would agree (see for instance [Gelernter and Jagannathan, 1990]), though they concentrate on other aspects. Standard object-oriented languages still have a stack-based run-time environment.
The intro stresses the importance of the notion of procedures as an abstraction mechanism, interpreted both “theoretically” and “practically”. It’s a nice angle, so that the material is not immediately about “forms of memory organization”. The goal is to provide a meaningful abstraction, and that concerns name spaces and procedure-local scope. Procedures are a central way to organize code. They are also the core of “components” and their “interfaces”, and a unit of compilation. The chapter is about how that can be realized, i.e., how the requirement of a proper abstraction mechanism is translated into memory management, allocation of addresses, and stack frames. Furthermore, being a unit of compilation, the issue of separate compilation is also important (and calling conventions etc., even if perhaps not covered by the book). Separate compilation is the only way to make compiled software systems scale. What are proper interfaces for the programmer are calling conventions at a lower level of abstraction. An important part of the procedure abstraction is parameter passing (i.e., parametricity). Being a central abstraction, it’s the key for the programmer to “conceptually” develop large programs. The fact that the compiler treats the procedure as a unit of compilation is the key to compiling and maintaining large systems. For terminology: there’s the distinction between caller and callee.

The book repeatedly states that procedures create a “controlled execution environment”, but it’s a bit unclear what they mean. Perhaps that procedures are supposed to provide an “abstraction” in the form of “encapsulation” (hence “controlled environment”). The text mentions three ingredients for the procedure abstraction:

(a) procedure call abstraction
(b) name space
(c) external interface: control abstraction, linkage convention, linkers, and loaders

It’s non-trivial to maintain such an abstraction mechanism, and the compiler must “provide” it, i.e., generate code and data structures so that at run-time the abstraction is maintained. It does so relying also on the operating system.
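As a toy illustration of the caller/callee split in a linkage convention (my own sketch, not the book’s; the frame layout is made up): the caller allocates the activation record and deposits the arguments, the callee works only inside its own frame.

```python
stack = []  # run-time stack of activation records

def call(callee, *args):
    # Caller's side of the convention: allocate the frame, pass the
    # actual parameters, transfer control, deallocate on return.
    frame = {"args": args, "locals": {}}
    stack.append(frame)
    result = callee(frame)
    stack.pop()
    return result

def square(frame):
    # Callee's side: it only sees its own activation record.
    x = frame["args"][0]
    frame["locals"]["sq"] = x * x
    return frame["locals"]["sq"]

print(call(square, 5))  # prints 25
```

The point of the convention is exactly the “controlled execution environment”: as long as both sides keep to their half of the protocol, separately compiled procedures can call each other.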
The book does not explicitly define what Algol-like languages are, but implicitly they seem to be lexically-scoped, block-structured languages with procedures (but not higher-order functions). The text states that ALLs have a simple call and return discipline. Fair enough, but it’s a bit unclear which languages do not. A call and return discipline for me is synonymous with a LIFO treatment of the control flow, something that also applies to higher-order languages (only the memory allocation there does not work in a LIFO manner, which is perhaps what they mean). The book shows a Pascal program with nested procedures (but none with procedure variables...) which is used to illustrate the notions of call graph and of execution history. The program is non-recursive (which can be seen in the call graph). The execution history is the “history” or “trace” of calls and returns. The different “occurrences” of function bodies (the area from call to return of a particular function or procedure) are here called instances or activations. There is a connection between the call graph and the execution history. They state that the call and return behavior of ALLs may be modelled with a stack, but again, it’s unclear what the alternatives were. Perhaps it is like this: the control flow, the calls and returns, is definitely stack-organized. ALLs are languages where also the local memory can be realized accordingly. As far as the call- and return-behavior is concerned, object-oriented languages are analogous to ALLs. What goes beyond ALLs are closures, where the stack discipline no longer works. It’s a general phenomenon: a stack only works if a variable does not outlive the activation of its procedure.
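The phenomenon fits in a few lines (sketched here in Python, which does support closures): the local variable count outlives the activation of make_counter, so its slot cannot live in a stack frame that is popped on return; the compiler/run-time must heap-allocate (at least part of) the activation record.

```python
def make_counter():
    count = 0            # a local of make_counter ...
    def inc():
        nonlocal count   # ... captured by the closure, so it must
        count += 1       # outlive make_counter's activation
        return count
    return inc           # the activation record escapes with inc

c = make_counter()
print(c(), c(), c())     # prints 1 2 3
```

Each call of make_counter yields an independent environment, which is precisely what a purely LIFO memory organization cannot express.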
A scope seems synonymous with a name space (at least in ALLs; again it’s difficult to see alternatives). Actually, if the alternative is higher-order functions (but with lexical scope), I don’t see that the situation is different there. Perhaps not even in higher-order languages with dynamic scoping.

(a) Name spaces in ALLs

Algol was a, actually the, pioneer here, so many languages follow in the footsteps of Algol. In particular it’s about nested lexical scopes. Scopes are connected to procedures. The book illustrates this by (another) Pascal program. They discuss the notion of static coordinate, which was already introduced in connection with the symbol tables.
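A minimal sketch (my own; the names and scope representation are made up) of what a static coordinate is: a pair (lexical level, offset), computed at compile time from the stack of currently open scopes, as a symbol-table walk would produce it.

```python
def static_coordinate(name, scopes):
    # `scopes` is the compile-time stack of open scopes, outermost first;
    # each scope maps a name to its offset in that scope's activation
    # record. The static coordinate of a use of `name` is the pair
    # (lexical level, offset) of its innermost visible declaration.
    for level in range(len(scopes) - 1, -1, -1):
        if name in scopes[level]:
            return (level, scopes[level][name])
    raise KeyError(f"undeclared variable {name}")

# nesting: program (level 0) > procedure p (level 1) > procedure q (level 2)
scopes = [{"x": 0}, {"x": 0, "y": 1}, {"z": 0}]
print(static_coordinate("x", scopes))  # prints (1, 0): the inner x, in p
print(static_coordinate("z", scopes))  # prints (2, 0)
```

At run time the level component tells the generated code how many static links to follow; the offset locates the variable inside the frame found there.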