[------ The Types Forum ------- http://www.dcs.gla.ac.uk/~types ------]
|---------------------------------|
|REMINDER: PROGRAMME AND ABSTRACTS|
|---------------------------------|
|-----------------------------------------------------------------------|
|Programme of the International School on Type Theory and Term Rewriting|
|-----------------------------------------------------------------------|
|--------------------|
|12-15 September 1996|
|Glasgow University |
|--------------------|
Note that some talks are 45 minutes followed by 30 minutes of discussion,
and others are 30 minutes followed by 15 minutes of discussion.
|-----------------------------------| |-----------------------------------|
|Thursday 12 September | |Friday 13 September |
|-----------------------------------| |-----------------------------------|
| 9:00- 9:45 Henk Barendregt | | 9:00- 9:45 Bob Constable* |
| 9:45-10:15 Discussion | | 9:45-10:15 Discussion |
| | | |
| 10:15-10:45 Gilles Barthe | | 10:15-10:45 Zurab Khasidashvili |
| 10:45-11:00 Discussion | | 10:45-11:00 Discussion |
| | | |
| 11:00-11:30 COFFEE BREAK | | 11:00-11:30 COFFEE BREAK |
| | | |
| 11:30-12:15 Michael Fourman | | 11:30-12:15 Joe Wells |
| 12:15-12:45 Discussion | | 12:15-12:45 Discussion |
| | | |
| 12:45-14:00 LUNCH | | 12:45-14:00 LUNCH |
| | | |
| 14:00-14:45 N.G. de Bruijn | | 14:00-14:45 Jean-Jacques Levy |
| 14:45-15:15 Discussion | | 14:45-15:15 Discussion |
| | | |
| 15:15-15:45 Fairouz Kamareddine | | 15:15-15:45 Judith Underwood |
| 15:45-16:00 Discussion | | 15:45-16:00 Discussion |
| | | |
| 16:00-16:30 COFFEE BREAK | | 16:00-16:30 COFFEE BREAK |
| | | |
| 16:30-17:15 Roberto Di Cosmo | | 16:30-17:15 Gordon Plotkin |
| 17:15-17:45 Discussion | | 17:15-17:45 Discussion |
| | | |
| 17:45-18:15 Alejandro Rios | | 17:45-18:15 Paul-Andre Mellies |
| 18:15-18:30 Discussion | | 18:15-18:30 Discussion |
|-----------------------------------| |-----------------------------------|
|-----------------------------------| |-----------------------------------|
|Saturday 14 September | |Sunday 15 September |
|-----------------------------------| |-----------------------------------|
| 9:00- 9:45 Rod Burstall* | | 9:00- 9:45 Giuseppe Longo |
| 9:45-10:15 Discussion | | 9:45-10:15 Discussion |
| | | |
| 10:15-10:45 John Glauert | | 10:15-11:00 Adolfo Piperno |
| 10:45-11:00 Discussion | | 11:00-11:30 Discussion |
| | | |
| 11:00-11:30 COFFEE BREAK | | 11:30-11:45 COFFEE BREAK |
| | | |
| 11:30-12:15 Therese Hardin | | 11:45-12:30 S. Ronchi Della Rocca|
| 12:15-12:45 Discussion | | 12:30-13:00 Discussion |
| | | |
| 12:45-14:00 LUNCH | | 13:00-14:00 LUNCH |
| | | |
| 14:00-14:45 Jan-Willem Klop | | 14:00-14:45 Richard Kennaway |
| 14:45-15:15 Discussion | | 14:45-15:15 Discussion |
| | | |
| 15:15-16:00 Assaf Kfoury | | 15:15-16:00 Jos Baeten |
| 16:00-16:30 Discussion | | 16:00-16:30 Discussion |
| | | |
| 16:30-17:00 COFFEE BREAK | | 16:30-17:00 COFFEE BREAK |
| | | |
| 17:00-17:45 Pierre Lescanne | | 17:00-17:45 Steffen van Bakel |
| 17:45-18:15 Discussion | | 17:45-18:15 Discussion |
| | | |
|-----------------------------------| |-----------------------------------|
* Tentative speaker
The registration fee is 150 UK Pounds (75 UK Pounds for full-time students).
Accommodation is available at the following rates per person per night:
Staff: 20.50 UK Pounds (bed and breakfast) or 11.50 (bed only).
Students (bed only): 8.50 UK Pounds (UK students) and
11.50 UK Pounds (overseas students).
If you are interested in attending, send the registration fee (with evidence
of full-time studentship if applicable), together with the accommodation fee
for the relevant number of nights and type of accommodation, to:
|--------------------------------------------------------------------|
|Fairouz Kamareddine, University of Glasgow, Department of Computing |
|Science, 17 Lilybank Gardens, Glasgow G12 8QQ, Scotland, UK, |
|email fairouz@dcs.gla.ac.uk |
|--------------------------------------------------------------------|
SPEAKERS AND TITLES (in alphabetical order):
---------------------------------------------
Jos Baeten: Sequential Data Algebra
Steffen van Bakel: Normalisation and approximation results for term rewriting
systems (with abstraction and \beta-rule) typeable using intersection types.
Henk Barendregt: Efficient Computations in Formal Proofs
Gilles Barthe: Congruence and oracle types
N.G. de Bruijn: Features of Automath
Rod Burstall: Tentative speaker
Bob Constable: Tentative speaker
Roberto Di Cosmo: A brief history of rewriting with extensionality
Michael Fourman: Indeterminate structures
John Glauert: Neededness and Relative Normalization
Therese Hardin: What about freeness?
Fairouz Kamareddine: Type Theory and Term Rewriting at Glasgow, past and future
Richard Kennaway: Transfinite Rewriting
Assaf Kfoury: A Linearization of the Lambda-Calculus and Applications
Zurab Khasidashvili: Comparing stable models of deterministic computation
Jan-Willem Klop: Decreasing Diagrams
Pierre Lescanne: EXPLICIT SUBSTITUTIONS AND EXPLICIT ALPHA CONVERSION
Jean-Jacques Levy: Dependency calculus for the lambda calculus.
Giuseppe Longo: SUBTYPES, GENERICITY AND NORMALIZATION
Paul-Andre Mellies: Rewriting in Normalisation Proofs
Adolfo Piperno: Normalization by developments and subtyping
Gordon Plotkin: Compositionality and Universal Type Theory
Alejandro Rios: Two styles of explicit substitutions: a survey
Simona Ronchi Della Rocca: The operational content of intersection types
Judith Underwood: Classical logic and nonlocal control,
the case against confluence
Joe Wells: Generalized Reduction, Let-Floating, and CPS
----------------------------------------------------------------------------
Sequential data algebra
Jos Baeten
Eindhoven University of Technology
---------------------------------
(joint work with Jan Bergstra (Univ. of Amsterdam and Utrecht) and Alex
Sellink (Univ. of Amsterdam))
We propose a style of data type specification based on four-valued logic.
Four-valued logic has constants for true, false, meaningless and divergent;
this allows us to deal with error values and non-terminating computations. The
word sequential indicates that all operators come with a definite order of
evaluation, and thus with a normalization strategy. The aim is to avoid endless
duplication and reinvention in the practice of algebraic specification.
As an example, we study the specification of floating-point numbers.
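A rough illustration of the flavour of four-valued sequential operators (a
Haskell sketch of our own; the names are invented and the specification in the
talk may differ): a "left-first" conjunction inspects its left argument before
its right one, which fixes an order of evaluation.

  -- Four truth values: true, false, meaningless, divergent.
  data Four = T | F | M | D
    deriving (Eq, Show)

  -- Sequential conjunction: the left argument is evaluated first and decides
  -- whether the right argument is looked at, so errors and divergence on the
  -- left dominate the result.
  seqAnd :: Four -> Four -> Four
  seqAnd F _ = F      -- a false left argument settles the result
  seqAnd M _ = M      -- a meaningless left argument propagates
  seqAnd D _ = D      -- a divergent left argument propagates
  seqAnd T y = y      -- only a true left argument makes the right one matter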
----------------------------------------------------------------------------
Normalisation and approximation results for term rewriting systems
(with abstraction and \beta-rule) typeable using intersection types.
Steffen van Bakel
Imperial College, London
-------------------------------------------------------------------
Research performed in cooperation with Maribel Fernández, of École Normale
Supérieure, Paris, and Franco Barbanera, University of Turin, Italy.
The talk will start with a short discussion of intersection type assignment
for Lambda Calculus, together with its major properties:
- a filter model,
- a strong normalization result,
- a head-normalization result,
- an approximation theorem.
We are interested in defining a filter-like semantics for other computational
models, and our starting point was to have a look at first order term rewriting
systems. We start with the definition of a notion of type assignment for
first order rewrite systems, extended with a notion of application, for
which we proved a subject reduction result.
We will present the restrictions put on recursive definitions in order to
obtain the strong normalization and head-normalization results. We will
then define a notion of approximants for terms in our rewrite system; the same
restrictions as before on recursion are also sufficient for the approximation
result.
We then look at a calculus that combines term rewriting and beta-reduction,
for which similar results are obtainable.
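As a reminder of the flavour of intersection types (a standard example, not
taken from this abstract), the self-application term, which has no simple
type, becomes typeable once an intersection is allowed:

  $\lambda x.\,x\,x \;:\; ((\sigma\to\tau)\wedge\sigma)\to\tau$,

since assuming $x : (\sigma\to\tau)\wedge\sigma$ gives both $x : \sigma\to\tau$
and $x : \sigma$, and hence $x\,x : \tau$.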
-------------------------------------------------------------------------
Efficient Computations in Formal Proofs
Henk Barendregt
Catholic University of Nijmegen
-------------------------------------------
Formal proofs in mathematics and computer science are being
studied because these objects can be verified by a very simple computer
program. An important open problem is whether
these formal proofs can be generated with an effort not much greater
than writing a mathematical paper in, say, \LaTeX.
Modern systems for proof-development make the formalization of
reasoning relatively easy. Formalizing computations such that the
results can be used in formal proofs is not immediate.
In this talk it is shown how to obtain formal proofs of
statements like $\mathsf{Prime}(\num{61})$ in the context of Peano
arithmetic or $x^2-1=(x-1)(x+1)$ in the context of rings. It is hoped
that the method will help bridge the gap between systems of computer
algebra and systems of proof-development.
-----------------------------------------------------------------------------
Congruence and oracle types
Gilles Barthe
CWI, Amsterdam
----------------------------
The combination of type theory and algebraic rewriting has traditionally been
studied in the context of programming languages. In this talk, we examine the
use of such a combination in theorem proving. More specifically, we will
introduce two new notions, congruence types and oracle types, and show how
they can be used in theorem proving.
Congruence types
----------------
Congruence types have a two-level syntax which allows one to introduce
a term-rewriting system as a quotient of an inductive type. They can be used
to give a computationally and logically faithful representation of canonical
term-rewriting systems*.
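A toy instance of the kind of quotient involved (our own illustration, not
taken from the talk): the inductive type of natural numbers, with constructors
$0$ and $\mathsf{S}$, quotiented by the canonical (confluent and terminating)
rewrite rule $\mathsf{S}(\mathsf{S}(x)) \to x$, presents the integers modulo 2;
the normal forms $0$ and $\mathsf{S}(0)$ serve as canonical representatives.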
Oracle types
------------
Oracle types form a sub-syntax of congruence types. They can be used as a model
for the interaction between theorem provers and computer algebra systems or
to automate equational reasoning in theorem proving.
* The standard approach to combining type theory and term rewriting is not
faithful because of the absence of an induction principle for algebraic types.
----------------------------------------------------------------------------
Features of Automath
N.G. de Bruijn.
Eindhoven University of Technology
----------------------------------
Various proof verification systems, at least those
which combine typed lambda calculus with the idea of
proof objects, have much in common with old Automath. But
Automath has some features that were not taken over by
others and nevertheless have technical and conceptual
advantages. In particular this refers to the book
structure of PAL, the lambda-free fragment of Automath.
Instead of replacing that structure by lambda formalism,
Automath maintained it, thus providing a double mechanism
for functionality. One of the effects of keeping the PAL
structure is the natural way to internalize definitions, a
thing that has to be given up if the PAL mechanism is
replaced by typed lambda calculus. It can be reclaimed if
ordinary typed lambda calculus is replaced by the only
slightly different modification called delta-lambda.
----------------------------------------------------------------------------
A brief history of rewriting with extensionality
Roberto Di Cosmo
Ecole Normale Superieure, Paris
---------------------------------------------------
Over the past few years, much work has been done on typed lambda
calculi with extensional axioms turned into expansive rules
rather than into contractive ones. We will survey the rewriting
techniques that have been developed and used for these calculi,
which are a typical example of fruitful interaction between
lambda calculus and rewriting, and we will also survey the most
recent results in the field.
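For readers who have not seen the distinction (a standard example, not
specific to this survey): the extensional axiom eta can be oriented as a
contraction, $\lambda x.\,M\,x \to M$ (with $x$ not free in $M$), or as an
expansion, $M \to \lambda x.\,M\,x$, the latter being restricted (for instance
to $M$ of arrow type that is not already an abstraction and not in function
position) so that rewriting does not loop.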
------------------------------------------------------------------------
Indeterminate Structures
Michael Fourman
The University of Edinburgh
----------------------------
We use the notion of a generic object, from topos theory, to model
indeterminate types.
This allows us to give a semantics for SML functors. We describe the
construction of indeterminate structures, used to model functor parameters,
and discuss the features of a module system based on this semantics.
----------------------------------------------------------------------------
Neededness and Relative Normalization
John Glauert
University of East Anglia
--------------------------------------
(This work is performed in the context of Expression Reduction Systems, a
form of higher-order rewriting, and is joint work with Z. Khasidashvili.)
Practical applications of rewriting systems often require reduction to
head-normal forms or other classes of "normal" form. We generalize the
theory of normalization via neededness to normalization relative to
desirable sets of terms, S, having natural properties of stability which
guarantee normalization, whenever possible, by contraction of S-needed
redexes. We also discuss minimal and optimal needed reduction and show that
they need not coincide.
---------------------------------------------------------------------------
What about freeness?
Therese Hardin
INRIA, Paris
-----------------------
In calculi with binders, variable names are well-known to be a source
of problems and De Bruijn notation, which replaces names by numbers
recording bindings, is a recognized way to avoid these
difficulties. It is easy to replace bound variables by numbers. Free
variables can also be replaced by numbers, by implicitly closing the
term, but they may also be kept as names. We compare these two points
of view and explain how to code these free variables by metavariables
in a calculus of explicit substitutions. Then, we show how this coding
may be fruitfully used for unification and compilation.
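A small illustration of the two points of view (a Haskell sketch of our own;
the datatypes and names are invented, not taken from the talk): bound
variables are turned into numbers while free variables keep their names.

  import Data.List (elemIndex)

  -- Lambda terms with names everywhere.
  data Named = NVar String | NApp Named Named | NLam String Named

  -- Terms in which bound variables are de Bruijn indices (distance to the
  -- binder) and free variables are kept as names.
  data DB = Bound Int | Free String | App DB DB | Lam DB
    deriving Show

  -- Conversion, threading the list of enclosing binders (innermost first).
  toDB :: [String] -> Named -> DB
  toDB env (NVar x)   = maybe (Free x) Bound (elemIndex x env)
  toDB env (NApp f a) = App (toDB env f) (toDB env a)
  toDB env (NLam x b) = Lam (toDB (x : env) b)

For example, toDB [] (NLam "x" (NApp (NVar "x") (NVar "y"))) yields
Lam (App (Bound 0) (Free "y")): the bound x becomes the index 0, while the
free y survives as a name.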
--------------------------------------------------------------------------
Type Theory and Term Rewriting at Glasgow: Past and Future
Fairouz Kamareddine
University of Glasgow
---------------------
This talk describes some of the problems attacked by the type theory and term
rewriting group at Glasgow in the last five years and outlines some open
problems and future goals. The discussion will be directed towards implicitly
versus explicitly typed lambda calculi and the study of their theory and
implementation. I shall concentrate on extending those calculi with explicit
definitions, explicit substitutions, new notions of typing and reductions
and the logical systems based on these calculi. I will also be looking for
future generalisations of this work and its advantages. For example, the
various new notions of reduction can be generalised even further, and this
generalisation can provide a decidable characterisation which is the nearest
approximation to the undecidable notion of reductional equivalence between
terms. It is very interesting indeed that almost all new notions of
reduction found in the last twenty years by various researchers (such as
Moggi, Plotkin, Nederpelt, Klop, Regnier, etc.) stay within this notion of
reductional equivalence.
I shall outline the future I foresee for the type theory and
term rewriting group at Glasgow both within Glasgow and at the various
centres with whom we collaborate.
---------------------------------------------------------------------------
Transfinite rewriting
Richard Kennaway
University of East Anglia
--------------------------
We survey the basic concepts and properties of transfinite rewriting for
orthogonal term rewrite systems, lambda calculus, higher-order rewrite
systems, and abstract rewrite systems.
---------------------------------------------------------------------------
A Linearization of the Lambda-Calculus and Applications
Assaf Kfoury
Boston University
----------------------------------------------------------
If every $\lambda$-abstraction in a $\lambda$-term $M$ binds at most one
variable occurrence ($M$ is thus a ``linear'' term), then $M$ is
$\beta$-strongly normalizing. We extend the syntax of the standard
$\lambda$-calculus $\Lambda$ to adapt this idea to the general case
when $\lambda$-abstractions are allowed to bind more than one variable
occurrence.
In the extended calculus, denoted $\Lambda^\wedge$, a subterm $Q$ of a
term $M$ can be applied to several subterms $R_1,\ldots,R_k$ in parallel,
which we write as $(Q. R_1\wedge\cdots\wedge R_k)$. The appropriate
notion of $\beta$-reduction $\beta^\wedge$ for this extended calculus
is such that, if $Q$ is the $\lambda$-abstraction $(\lambda x.P)$ with
$\ell\geqslant 0$ bound occurrences of $x$, the reduction can be carried
out provided $k = \max(\ell,1)$. Every $M\in\Lambda^\wedge$ is thus
$\beta^\wedge$-SN. The {\em contraction} of $M\in\Lambda^\wedge$
is a standard term $|M|\in\Lambda$, which is defined provided that, for
every subterm of the form $(Q. R_1\wedge\cdots\wedge R_k) \subseteq M$,
each of $R_1,\ldots,R_k$ contracts to the same standard term
$|R_1| = \cdots = |R_k|$. We say that $M\in\Lambda^\wedge$ is {\em fully
linear} if the $\beta^\wedge$-normal form $N$ of $M$ is such that
$|N|$ is defined and is a standard term in $\beta$-normal form.
One key lemma states that for every standard term $M\in\Lambda$:
{\em $M$ is $\beta$-SN iff there is a fully linear $M'\in\Lambda^\wedge$
such that $M = |M'|$.} We draw several consequences from this result,
in particular a new simple proof for the (known) fact that a (standard)
term $M$ is $\beta$-SN iff $M$ can be assigned a so-called ``intersection''
type (``top'' type disallowed).
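A tiny worked example, as we read the definitions above (our own, not part of
the abstract): for the standard term $M = (\lambda x.\,x\,x)\,y$ the
abstraction binds $\ell = 2$ occurrences, so an extended counterpart is
$M' = ((\lambda x.\,x\,x).\; y \wedge y)$ with $|M'| = M$. One
$\beta^\wedge$-step, consuming one $R_i$ per bound occurrence, yields the
$\beta^\wedge$-normal form $N = y\,y$, and $|N| = y\,y$ is in $\beta$-normal
form; hence $M$ has a fully linear preimage, in accordance with the key lemma
and the fact that $M$ is $\beta$-SN.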
-----------------------------------------------------------------------------
Comparing stable models of deterministic computation
Zurab Khasidashvili
University of East Anglia
-------------------------------------------------------
(This work has been performed in cooperation with John Glauert.)
Event Structures are well established models of concurrent computation.
Here we study Event Structure semantics for Orthogonal Rewrite Systems,
such as the lambda-calculus and its sharing graph model, and for functional
programming languages based on such systems. Conflict-free (or
Deterministic) Prime Event Structures, DPESs, which are simply event sets
partially ordered by a causal dependency relation, are sufficient for this
purpose.
We study orthogonal systems in the form of Stable Deterministic Residual
Structures (SDRSs), which are Abstract Reduction Systems with an axiomatized
residual relation. The basic theory of orthogonal systems, such as
normalization, can already be developed in the framework of SDRSs. In order
to interpret SDRSs as DPESs, an equivalence relation on transitions in
SDRSs is required that relates transitions performing (part of) the
`same' event. We introduce such structures, Deterministic Family Structures
(DFSs), by axiomatizing a family relation on transitions in SDRSs, by analogy
with the family relation in the lambda calculus and other orthogonal
rewrite systems. DFSs can easily be interpreted as DPESs (by taking
redex-families as events and taking the contribution relation on them as
the causal dependency relation on events).
However, DPESs are linear in nature (no event can erase or duplicate
another -- any event can occur at most once in the course of a
computation), and therefore they cannot give an adequate account of erasure
of redexes/transitions in SDRSs. To remedy this situation, we extend DPESs
in two ways: by axiomatizing an erasure relation on events, and by
axiomatizing permutation equivalence on event configurations. We then
define translations between the two extended event models, non-duplicating
SDRSs, and non-duplicating DFSs, and show that the translations commute,
implying isomorphism of the above set-theoretic and transition models of
deterministic computation.
--------------------------------------------------------------------------
Decreasing diagrams
Jan Willem Klop
CWI and Vrije Universiteit, Amsterdam
--------------------------------------
We discuss confluence criteria for abstract rewriting,
notably the method of decreasing diagrams due to
De Bruijn and van Oostrom. This method generalizes
various lemmas in abstract rewriting, such as Newman's Lemma,
Huet's strong confluence lemma and several others.
We also pay attention to infinite reduction diagrams.
Joint work with Marc Bezem and Vincent van Oostrom.
----------------------------------------------------------------------------
EXPLICIT SUBSTITUTIONS AND EXPLICIT ALPHA CONVERSION
Pierre Lescanne
Inria, Nancy
---------------------------------------------------------
Calculi of explicit substitution internalise the main process of
lambda-calculus, namely substitution. Thus substitution is better
understood and its complexity is better mastered. Calculi of explicit
substitution can be divided into two classes: those which use de Bruijn's
indices and those which use names. It is well known that de Bruijn's
indices make terms less readable, and therefore names are often preferred.
But when one uses names, one is faced with the problem of alpha conversion,
which is somewhat hidden by de Bruijn's indices and which one may wish to
make explicit as well. We propose a calculus which uses a convention for
attributing names based on their binding depth, introduced as ``levels''
by de Bruijn in 1972. Thanks to levels, alpha conversion can be made
explicit in our
calculus. We discuss some of its properties and applications.
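To recall the difference (a standard example, independent of the proposed
calculus): a de Bruijn index measures the distance from a variable occurrence
to its binder, whereas a de Bruijn level records the depth of that binder
below the root, so that, counting from 1,

  $\lambda x.\,\lambda y.\,x\,y$ is $\lambda\lambda.\,2\,1$ with indices and
  $\lambda\lambda.\,1\,2$ with levels.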
----------------------------------------------------------------------------
Dependency calculus for the lambda calculus.
Jean-Jacques Levy
Inria, Paris
--------------------------------------------
Vesta is a modeling system with a functional language for
describing software packages. Incremental calculation of dependencies
may be viewed as a simple lambda calculus problem, with an extra
origin calculation. Joint work with Abadi and Lampson.
------------------------------------------------------------------------------
SUBTYPES, GENERICITY AND NORMALIZATION
Giuseppe Longo
Ecole Normale Superieure, Paris
------------------------------------------
A recent logical theory of subtypes is introduced. Its motivation
goes back to the introduction, in a functional frame, of a typical
Object Oriented feature, inheritance, reinterpreted in a specific
and functional way. The presentation in terms of sequent calculi
allows one to formulate and prove a coherence theorem and a
"transitivity elimination" result. The latter may be handled as a
cut-elimination (normalization) result, in the sense of logic. The
implicit challenges will be discussed briefly. The analysis of
cut-elimination and of "injectivity" of subtyping relates to an
apparently unrelated result on parametricity of higher order
lambda-calculi, which in turn gives some insight into the proof of
normalization.
---------------------------------------------------------------------------
Rewriting in Normalisation Proofs
Paul-Andre Mellies
LFCS, University of Edinburgh
------------------------------------
Normalisation proof techniques can be divided into two areas:
--( semantical techniques which extend a proof by Tait for the simply typed
lambda-calculus and use saturated term-models,
==( syntactical techniques which use more combinatorial recipes.
I will present here my personal improvements on both sides:
--) a generic normalisation proof for Pure Type Systems generalising
work by Thorsten Altenkirch --- this part is joint work with Benjamin Werner.
==) a combinatorial proof of strong normalisation for non-confluent
label-decreasing rewriting systems --- introducing the ``zoom-in'' strategy.
Finally, I will show that the two techniques can be mixed
to obtain Strong Normalisation proofs in the case of
Type Systems enriched with Rewriting.
-------------------------------------------------------------------------------
Normalization by developments and subtyping
Adolfo Piperno
University of Rome
---------------------------------------------
Normalization proofs for typed lambda-calculi have received
considerable interest in the last few years.
In particular, investigations have been made to clarify
the relationship between weak and strong normalization,
to refine the notion of development and to give
syntactical tools for studying properties of reduction.
Some of these issues will be reviewed and discussed.
A general method for proving normalization
in typed lambda calculi will be presented, based on the
use of eta-expansions, superdevelopments and subtyping.
----------------------------------------------------------------------------
Compositionality and Universal Type Theory
Gordon Plotkin
University of Edinburgh
---------------------------------------
The principle of compositionality or referential transparency is central to
denotational semantics. Taking this seriously requires a theory of
language. For example, viewing language as a free multi-sorted algebra
enables one to express compositionality homomorphically, as advocated long
ago by the ADJ group. It also provides a uniform notion of translation
between signatures that allows one to pass from the semantics of one
language to that of another.
However, such a theory of language is inadequate, for several reasons.
We consider two of these: the lack of a treatment of binding operators, and
the fact that sorts (= types) can themselves have a complex linguistic
structure. Making a (partial) identification of type theories with
programming languages, we advocate a programme to find a treatment of type
theories analogous to the universal algebraic treatment of algebraic
theories. With this, the language used to express (typically)
domain-theoretic semantics is itself brought into the picture, and one can
see compositional denotational semantics itself as translation, in a
uniform sense.
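A minimal sketch of the homomorphic reading of compositionality (a toy Haskell
example of our own, not taken from the talk): syntax is a free algebra over a
signature, a semantics is an algebra for the same signature, and meaning is
the unique homomorphism between them.

  -- A toy one-sorted signature of arithmetic phrases.
  data Expr = Lit Int | Add Expr Expr | Mul Expr Expr

  -- A semantic algebra: one operation per constructor of the signature.
  data Alg a = Alg { lit :: Int -> a, add :: a -> a -> a, mul :: a -> a -> a }

  -- The unique homomorphism from the free (term) algebra into a chosen
  -- algebra: the meaning of a phrase is computed from the meanings of its
  -- immediate subphrases, which is compositionality.
  sem :: Alg a -> Expr -> a
  sem alg (Lit n)   = lit alg n
  sem alg (Add x y) = add alg (sem alg x) (sem alg y)
  sem alg (Mul x y) = mul alg (sem alg x) (sem alg y)

  -- Two denotations of the same language, obtained by changing the algebra.
  evalAlg :: Alg Int
  evalAlg = Alg id (+) (*)

  showAlg :: Alg String
  showAlg = Alg show (\x y -> "(" ++ x ++ "+" ++ y ++ ")")
                     (\x y -> "(" ++ x ++ "*" ++ y ++ ")")

For instance, sem evalAlg and sem showAlg send Add (Lit 1) (Mul (Lit 2) (Lit 3))
to 7 and to "(1+(2*3))" respectively. Binding operators and structured sorts,
the two inadequacies discussed above, are exactly what this naive picture does
not yet account for.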
-----------------------------------------------------------------------------
Two styles of explicit substitutions: a survey
Alejandro Rios
University of Glasgow
----------------------------------------------
(This work is carried out in collaboration with Fairouz Kamareddine and is
supported by EPSRC grant GR/K25014: "Theoretical and implementational
advantages of a new lambda notation".)
Most formulations of the lambda-calculus use implicit rather than explicit
substitutions. The last decade however has seen an explosion of calculi
which deal with substitutions explicitly.
This talk starts by giving a brief introduction to two different styles of
explicit substitutions: the already classical lambda-sigma style and the
recently developed lambda-s style. We discuss substitution calculi
with de Bruijn indices and with variable names, try to establish a bridge
between the two, and compare substitution calculi in both styles from the
point of view of their efficiency in simulating beta-reduction.
Furthermore, for each style, we outline the remaining open problems and
weigh the disadvantages against the advantages they offer.
An important issue will also be the use of term rewriting techniques to
generalise proofs of preservation of strong normalisation and confluence
for calculi of explicit substitutions.
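As one concrete illustration of the first style (the Beta rule of the
lambda-sigma calculus, quoted from the literature rather than from this
abstract), a beta-redex does not perform the substitution at once but creates
an explicit closure,

  $(\lambda a)\,b \;\to\; a[b \cdot id]$,

whose propagation through $a$ is then carried out stepwise by the substitution
rules of the calculus; simulating one beta step therefore costs several
rewrite steps, which is where the efficiency comparison mentioned above
comes in.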
----------------------------------------------------------------------------
The operational content of intersection types
Simona Ronchi Della Rocca
University of Turin
---------------------------------------------
STILL TO COME....
----------------------------------------------------------------------------
Classical logic and nonlocal control: the case against confluence
Judith Underwood
University of Edinburgh
-----------------------------------------------------------------
Nonlocal control operators were introduced to functional languages to
allow programmers to guide the evaluation process more efficiently.
They can also be used to implement error handlers and concurrency
mechanisms such as coroutines. However, they represent a significant
addition to the principles of lambda-calculus which underlie many
ideas in functional programming. Furthermore, there appears to be a
deep connection between classical logic and typed lambda-calculus plus
nonlocal control, similar to the connection between constructive logic
and typed lambda-calculus.
An important consideration in studying languages
with nonlocal control operators is that they may be nonconfluent.
This property can be justified both by the connection with classical
logic and by applications to programming. Because of this,
many proof techniques for properties such as normalization
cannot be used directly. It is hoped that this talk will motivate
work on such proof methods.
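As a small, generic reminder of what a nonlocal control operator does (a
Haskell sketch of our own using the standard continuation monad; the function
findFirst is invented for illustration): capturing the current continuation
lets a computation escape its context early.

  import Control.Monad (when)
  import Control.Monad.Cont (runCont, callCC)

  -- Return the first element satisfying p, abandoning the traversal as soon
  -- as one is found: invoking the captured continuation k discards the rest
  -- of the computation inside callCC.
  findFirst :: (a -> Bool) -> [a] -> Maybe a
  findFirst p xs = runCont (callCC search) id
    where
      search k = do
        mapM_ (\x -> when (p x) (k (Just x))) xs
        return Nothing

In typed form such operators correspond, via the extension of the
Curry-Howard correspondence alluded to above, to classically valid principles
such as Peirce's law, $((A \to B) \to A) \to A$, which is essentially the type
of callCC.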
----------------------------------------------------------------------------
Generalized Reduction, Let-Floating, and CPS
Joe Wells
Boston University
--------------------------------------------
Generalized reduction recognizes implicit beta redexes in a lambda term
and reduces them directly, ignoring intervening abstractions and
applications. Let-floating lifts or lowers beta redexes to other
equivalent locations. A CPS transformation changes terms into a form
which is executable on a stack-based implementation and which is
indifferent to call-by-value vs. call-by-name evaluation.
We evaluate and survey different restrictions on let-floating. We discuss
how generalized reduction is subsumed by a particular subset of
let-floating. We show how a larger subset of let-floating is subsumed by
beta reduction on CPS transformed terms. We discuss the application of
these techniques in such areas as type inference, strong normalization,
and program equivalence.
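The abstract does not fix a particular CPS transformation, so here is a
minimal Haskell sketch of the classical Plotkin call-by-value transform on raw
lambda terms (our own illustration; the datatype and the generated variable
names are invented, and the sketch assumes source variables do not clash with
them).

  -- Raw lambda terms.
  data Term = Var String | Lam String Term | App Term Term
    deriving Show

  -- Call-by-value CPS transform:
  --   [[x]]     = \k. k x
  --   [[\x.M]]  = \k. k (\x. [[M]])
  --   [[M N]]   = \k. [[M]] (\f. [[N]] (\v. (f v) k))
  -- The Int argument is a supply of fresh names for k, f and v.
  cps :: Term -> Term
  cps t = fst (go t 0)
    where
      fresh :: String -> Int -> (String, Int)
      fresh base n = (base ++ show n, n + 1)

      go :: Term -> Int -> (Term, Int)
      go (Var x) n =
        let (k, n1) = fresh "k" n
        in (Lam k (App (Var k) (Var x)), n1)
      go (Lam x m) n =
        let (k, n1)  = fresh "k" n
            (m', n2) = go m n1
        in (Lam k (App (Var k) (Lam x m')), n2)
      go (App m o) n =
        let (k, n1)  = fresh "k" n
            (f, n2)  = fresh "f" n1
            (v, n3)  = fresh "v" n2
            (m', n4) = go m n3
            (o', n5) = go o n4
        in ( Lam k (App m' (Lam f (App o' (Lam v
               (App (App (Var f) (Var v)) (Var k))))))
           , n5 )

Note that in the output of this transform every function argument is a
variable or an abstraction, so call-by-value and call-by-name evaluation agree
on it, as the abstract notes.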
----------------------------------------------------------------------