A simple axiomatic basis for programming language constructs

Citation for published version (APA):

Dijkstra, E. W. (1973). A simple axiomatic basis for programming language constructs. (EWD; Vol. 372). International Summer School.

Document status and date: Published: 01/01/1973
Document version: Publisher's PDF (Version of Record)


INTERNATIONAL SUMMER SCHOOL ON STRUCTURED PROGRAMMING AND PROGRAMMED STRUCTURES

Munich, Germany, July 25 to August 4, 1973

LECTURE NOTES

Edsger W. Dijkstra

A SIMPLE AXIOMATIC BASIS FOR PROGRAMMING LANGUAGE CONSTRUCTS

Reprint of: Report EWD 372, Technological University Eindhoven, 1973

A Simple Axiomatic Basis for Programming Language Constructs

by Edsger W. Dijkstra

Abstract. The semantics of a program can be defined in terms of a predicate transformer associating with any post-condition (characterizing a set of final states) the corresponding weakest pre-condition (characterizing a set of initial states). The semantics of a programming language can be defined by regarding a program text as a prescription for constructing its corresponding predicate transformer.

Its conceptual simplicity, the modest amount of mathematics needed and its constructive nature seem to be its outstanding virtues. In comparison with alternative approaches it should be remarked, firstly, that all non-terminating computations are regarded as equivalent and, secondly, that a program construct like the goto-statement falls outside its scope; the latter characteristic, however, does not strike the author as a shortcoming: on the contrary, it confirms him in one of his prejudices!

EWD372 - 1

A Simple Axiomatic Basis for Programming Language Constructs

by Edsger W. Dijkstra

Program testing can be used very effectively to show the presence of bugs, but is hopelessly inadequate for showing their absence, and a convincing correctness proof seems the only way to reach the required confidence level. In order that such a convincing correctness proof may exist, two conditions must be satisfied by such a correctness proof:

1) it must be a proof, and that implies that we need a set of axioms to start with;

2) it must be convincing, and that implies that we must be able to write, to check, to understand and to appreciate the proof.

This essay deals with the first of these two topics.

We are considering finite computations only; therefore we can restrict ourselves to computational processes taking place in a finite state machine -although the possible number of states may be very, very large- and take the point of view that the net effect of the computation can be described by the transition from initial to final state. (Since the classical work of A.M. Turing, and again since the recent work of D. Scott, one often encounters the tacit assumption that size and speed of today's computers are so huge that the inclusion of infinite computations leads to the most appropriate model. I would be willing to accept that if -as in the case of "the line at infinity", sweetly leading to projective geometry- the suggested generalization would clean up the majority of the arguments. Thanks to Cantor, Dedekind et al., however, we know that the inclusion of the infinite computation is not a logically painless affair, on the contrary! In the light of that experience it seems more effective to restrict oneself to finite computations taking place in a finite, but sufficiently large universe, thereby avoiding a number of otherwise self-inflicted pains. Those mathematicians that are so biased as to refuse to consider an area of thought worthy of their attention, unless it pays full attention to their pet generalizations, should perhaps not try to follow the rest of my argument.) The computation is assumed to take place under control of an algorithm, and we want to make assertions about all possible computations that may be evoked under control of such a program.

And we want to base these assertions on the program text! (In particular for sub-programs our aims are usually more modest, being content with assertions about the class of computations that can take place under the additional constraint that the initial state satisfies some further condition, as we are able to show that such a condition will always be satisfied whenever the sub-program is invoked.)

This implies that we must have a formal definition of the semantics of the programming language in which the program has been expressed.

The earliest efforts directed towards such definition of semantics that I am aware of have been what I call "mechanistic definitions": they gave a definition (or "a description") of the steps that should be carried out in executing a program, they gave you "the rules of the game" necessary to carry out any given computation (as determined by program and initial state!) by hand or by machine. The basic shortcoming of this approach was that the semantics of an algorithm were expressed in terms of "the rules of the game", i.e. in terms of another algorithm. The game can only be played for a chosen initial state, and as a result it is as powerless as program testing! A mechanistic definition as such is not a sound basis for making assertions about the whole class of possible computations associated with a program.

It is this shortcoming that the axiomatic method seeks to remedy.

We consider predicates P, Q, R, ... on the set of states; for each possible state a given predicate will be either true or false and, if we so desire, we can regard the predicate as characterizing the subset of states for which it is true. There are two special predicates, named T and F: T is true for all possible states (characterizes the universe), F is false for all possible states (characterizes the empty set). We call two predicates P and Q equal ("P = Q") when the sets of states for which they are true are the same. (Note that P ≠ T -or non(P = T)- does not allow us to conclude P = F!)

We restrict ourselves to state spaces that are defined as the cartesian product of (the individual state spaces of) a number of named variables of known types. Predicates P, Q, R, ... are then formal expressions in terms of

1) the aforementioned variables (i.e. the "co-ordinates" of our state space)

2) free variables of the appropriate types.

The rules for evaluation of these formal expressions fall outside the scope of this essay: we assume them to be given "elsewhere", not tempted to redo, say, the work of a Boole or a Peano. (The ability to formulate the specifications to be met by the program presupposes that such work has already been done "elsewhere".)

We consider the semantics of a program S fully determined when we can derive, for any post-condition P to be satisfied by the final state, the weakest pre-condition that for this purpose should be satisfied by the initial state. We regard this weakest pre-condition as a function of the post-condition P and denote it by "fS(P)". Here we regard the fS as a "predicate transformer", as a rule for deriving the weakest pre-condition from the post-condition to which it corresponds.

The semantics of a program S are defined when its corresponding predicate transformer fS is given; the semantics of a programming language are defined when the rules are given which tell how to construct the predicate transformer fS corresponding to any program S written in that language.

As most programming languages are defined recursively, we can expect such construction rules for the predicate transformer of the total program to be expressed in terms of predicate transformers associated with components. But, as we shall see in a moment, we must observe some restrictions, for if we allow ourselves too much freedom in the construction of predicate transformers we may arrive at predicate transformers fS such that fS(P) can no longer be interpreted as the weakest pre-condition corresponding to the post-condition P for a possible deterministic machine.

Our construction rules for predicate transformers fS must be such that, whatever fS we construct, it must have the following four basic properties:

1) P = Q implies fS(P) = fS(Q)

2) fS(F) = F

3) fS(P and Q) = fS(P) and fS(Q)

4) fS(P or Q) = fS(P) or fS(Q)
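Over a small finite state space these four healthiness requirements can be checked by brute force. A minimal sketch, assuming predicates are modelled as sets of states; none of the names below come from the original text:

```python
# Dijkstra's four "healthiness" properties, checked mechanically over a
# tiny finite state space. Predicates are frozensets of states.
from itertools import combinations

STATES = frozenset(range(4))        # a small universe of states
T, F = STATES, frozenset()          # the two special predicates

def all_predicates():
    """Every subset of the state space, i.e. every predicate."""
    return [frozenset(c) for r in range(len(STATES) + 1)
            for c in combinations(sorted(STATES), r)]

def is_healthy(fS):
    """Check properties 2-4; property 1 is automatic for set-valued predicates."""
    if fS(F) != F:                                 # 2: Law of the Excluded Miracle
        return False
    preds = all_predicates()
    return all(fS(P & Q) == fS(P) & fS(Q) and      # 3: conjunction
               fS(P | Q) == fS(P) | fS(Q)          # 4: disjunction
               for P in preds for Q in preds)

identity = lambda P: P              # the "empty statement"
miracle = lambda P: T               # "establishes anything": not healthy
```

Here `is_healthy(identity)` holds, while `is_healthy(miracle)` fails on property 2.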

Property 1 assures that we are justified in regarding the predicates as characterizing our true subject matter, viz. sets of states: it would be awkward if fS(x > 0) differed from fS(0 < x).

Property 2 is the so-called "Law of the Excluded Miracle" and does not need any further justification.

The justification for properties 3 and 4 becomes fairly obvious when we consider, for instance, P = (0 < x < 2) and Q = (1 ≤ x ≤ 3) and require that each initial state satisfying fS(P) is mapped into a single state satisfying P, and similarly for Q. Conversely it can be shown that each healthy predicate transformer fS can be interpreted as describing the net effect of a deterministic machine, whose actions are fully determined by the initial state.

From our 1st and 4th properties we can derive a conclusion. Let P => Q; from this it follows that there exists a predicate R such that we can write

Q = P or R.

Our 1st and 4th properties then tell us that

fS(Q) = fS(P or R) = fS(P) or fS(R)

from which we deduce that

5) P => Q implies fS(P) => fS(Q).

A further useful property of healthy predicate transformers can be derived already at this stage. Properties 1 and 4 allow us to conclude for any P

fS(P) or fS(non P) = fS(P or non P) = fS(T).

Taking at both sides the conjunction with non fS(P) we reach

fS(non P) and non fS(P) = fS(T) and non fS(P).

Properties 1, 2 and 3 allow us to conclude for the same fS and same P

fS(P) and fS(non P) = fS(P and non P) = fS(F) = F.

Taking in the last two formulae at both sides the disjunction we find for healthy predicate transformers property

6) fS(non P) = fS(T) and non fS(P)

or, replacing P by non P and taking the negation at both sides, its alternative form

6') non fS(P) = fS(non P) or non fS(T).

The simplest predicate transformer enjoying the four basic properties is the identity transformation

fS(P) = P.

The corresponding statement is well known to programmers; they usually call it "the empty statement".

But it is very hard to build up very powerful programs from empty statements alone; we need something more powerful. We really want to transform a given predicate P into a possibly different predicate fS(P).

One of the most basic operations that can be performed upon formal expressions is substitution, i.e. replacing all occurrences of a variable by (the same) "something else". If in the predicate P all occurrences of the variable "x" are replaced by (E), then we denote the result of this transformation by P[E/x].

Now we can consider statements S such that

fS(P) = P[E/x]

where x is a "co-ordinate variable" of our state space and E an expression of the appropriate type. The above rule introduces a whole class of statements, each of them given by three things:

a) the identity of the variable x to be replaced

b) the fact that the substitution is the corresponding rule for predicate transformation

c) the expression E which is to replace every occurrence of x in P.

The usual way to write such a statement is

x := E

and such a statement is known under the name of an "assignment statement". We can formulate the

Axiom of Assignment. When the statement S is of the form

x := E

its semantics are given by the predicate transformer fS that is such that for all P

fS(P) = P[E/x].

Although from a logical point of view unnecessary -we can take this predicate transformer to give a definition of the semantics of what we call assignment statements- it is wise to confront this axiomatic definition with our intuitive understanding of the assignment statement -if we have one!- and it is comforting to discover that indeed it captures the assignment statement as we (may) know it, as the following examples show. They are written in the format {fS(P)} S {P}:

{a > 0}                  x := 1      {a > 0}
{1 < 2}                  x := 1      {x < 2}
{a > 0 and (x + 1) < 9}  x := x + 1  {a > 0 and x < 9}

The above rules enable us to establish the semantics of the empty program and of the program consisting of a single assignment statement.
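In an executable model the textual substitution P[E/x] has a semantic counterpart: evaluate P in the state obtained by rebinding x to the value of E. A sketch of that reading (predicates as boolean functions of a state dict; the names are illustrative, not from the text):

```python
# The Axiom of Assignment, executably: wp("x := E", P) holds in a state
# exactly when P holds after x is rebound to the value of E there --
# the semantic counterpart of the textual substitution P[E/x].

def wp_assign(x, E):
    """Predicate transformer of the assignment x := E."""
    def fS(P):
        return lambda s: P({**s, x: E(s)})
    return fS

# In the spirit of the examples above:
# {a > 0 and (x + 1) < 9}   x := x + 1   {a > 0 and x < 9}
fS = wp_assign("x", lambda s: s["x"] + 1)
pre = fS(lambda s: s["a"] > 0 and s["x"] < 9)
```

Here `pre({"a": 1, "x": 7})` holds (7 + 1 < 9), while `pre({"a": 1, "x": 8})` does not.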

In order to be able to compose more complicated predicate transformers, we observe that the functional composition of two healthy predicate transformers is again healthy. So this is a legitimate way of constructing a new one and we are led to the

Axiom of Concatenation. Given two statements S1 and S2 with healthy predicate transformers fS1 and fS2 respectively, the predicate transformer fS, given for all P by

fS(P) = fS1(fS2(P))

is healthy and taken as the semantic definition of the statement S that we denote by

S1; S2

Functional composition is associative and we are therefore justified in the use of the term "concatenation": it makes no difference if we parse "S1; S2; S3" either as "(S1; S2); S3" or as "S1; (S2; S3)".

Relating the axiomatic definition of the concatenation operator ";" to our intuitive understanding of a sequential computation, it just means that each execution of S1 (when completed) will immediately be followed by an execution of S2 and, conversely, that each execution of S2 has immediately been preceded by an execution of S1. The functional composition identifies the initial state of S2 with the final state of S1.
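In the same executable model the Axiom of Concatenation is plain function composition: working backwards from the post-condition, fS2 is applied first, then fS1. A sketch (all names illustrative, not from the text):

```python
# Axiom of Concatenation: the transformer of "S1; S2" is the functional
# composition fS(P) = fS1(fS2(P)). Predicates are boolean functions of
# a state dict, as in the assignment sketch.

def wp_assign(x, E):
    """Transformer of the assignment x := E."""
    return lambda P: (lambda s: P({**s, x: E(s)}))

def wp_seq(fS1, fS2):
    """Transformer of S1; S2: work backwards through S2, then S1."""
    return lambda P: fS1(fS2(P))

# {x = 0}   x := x + 1; x := 2 * x   {x = 2}
fS = wp_seq(wp_assign("x", lambda s: s["x"] + 1),
            wp_assign("x", lambda s: 2 * s["x"]))
pre = fS(lambda s: s["x"] == 2)
```

`pre` holds exactly where x = 0, since 2 * (0 + 1) = 2.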

We can go on looking for new ways of constructing predicate transformers, but all this, of course, subject to the restriction that the ensuing predicate transformer must be healthy. And a number of obvious suggestions must be rejected on that ground, such as

fS(P) = non fS1(P)

for that would violate the Law of the Excluded Miracle.

Also

fS(P) = fS1(P) and fS2(P)

must be rejected, as such an fS violates the basic property 4:

fS(P or Q) = fS1(P or Q) and fS2(P or Q)
           = {fS1(P) or fS1(Q)} and {fS2(P) or fS2(Q)}

while

fS(P) or fS(Q) = {fS1(P) and fS2(P)} or {fS1(Q) and fS2(Q)}

and they are in general different, as the first of the two leads to the additional terms in the disjunction

{fS1(P) and fS2(Q)} or {fS1(Q) and fS2(P)}.

Similarly, if we choose

fS(P) = fS1(P) or fS2(P)

property 3 is violated, because

fS(P and Q) = fS1(P and Q) or fS2(P and Q)
            = {fS1(P) and fS1(Q)} or {fS2(P) and fS2(Q)}

while

fS(P) and fS(Q) = {fS1(P) or fS2(P)} and {fS1(Q) or fS2(Q)}

and here the second one leads to the additional terms in the disjunction

{fS1(P) and fS2(Q)} or {fS1(Q) and fS2(P)}.

This leads to the suggestion that we look for fS1 and fS2 (in general fS_i) such that for any P and Q

i ≠ j implies fS_i(P) and fS_j(Q) = F.

Doing it for a pair leads to the

Axiom of Binary Selection. Given two statements S1 and S2 with healthy predicate transformers fS1 and fS2 respectively, and a predicate B, the predicate transformer fS, given for all P by

fS(P) = {B and fS1(P)} or {non B and fS2(P)}

is healthy and taken as the semantic definition of the statement S that we denote by

if B then S1 else S2 fi

(This is readily extended to a choice between three, four or any explicitly enumerated set of mutually exclusive alternatives, leading to the so-called case-construction.)
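The selection rule can be sketched in the same state-dict model; the absolute-value conditional below is an illustrative example of mine, not one from the text:

```python
# Axiom of Binary Selection: fS(P) = {B and fS1(P)} or {non B and fS2(P)}.
# Predicates are boolean functions of a state dict.

def wp_assign(x, E):
    return lambda P: (lambda s: P({**s, x: E(s)}))

def wp_if(B, fS1, fS2):
    """Transformer of `if B then S1 else S2 fi`."""
    return lambda P: (lambda s: (B(s) and fS1(P)(s)) or
                                (not B(s) and fS2(P)(s)))

# if x < 0 then x := -x else x := x fi  establishes  {x >= 0}  everywhere
fS = wp_if(lambda s: s["x"] < 0,
           wp_assign("x", lambda s: -s["x"]),
           wp_assign("x", lambda s: s["x"]))
pre = fS(lambda s: s["x"] >= 0)
```

The computed pre-condition holds in every state: negative x is negated, non-negative x is left alone.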

For an arbitrary given sequence fS_i we cannot hope that i ≠ j implies fS_i(P) and fS_j(Q) = F for any P and Q, but we may hope to achieve this if we can generate the fS_i by a recurrence relation. Before we embark upon such a project, however, we should derive a useful property of the predicate transformers we have been willing to construct thus far.

If two predicate transformers fS and fS' satisfy the property that for all P: fS(P) => fS'(P), then we call fS as strong as fS' and fS' as weak as fS. (The predicate transformer given for all P by fS(P) = F is as strong as any other; the predicate transformer given by fS(P) = T would be as weak as any other if it were admitted, but it is not healthy: it violates the Law of the Excluded Miracle.)

We can now formulate and derive our

Theorem of Monotonicity. Whenever in a predicate transformer fS, formed by concatenation and/or selection, one of the constituent predicate transformers is replaced by one as weak (strong) as the original one, the resulting predicate transformer fS' is as weak (strong) as fS.

Obviously we only need to show this for the elementary transformer constructions.

Concatenation, case 1:

Let S be: S1; S2
let S' be: S1'; S2
let S1' be as weak as S1;

then for any P, fS(P) = fS1(Q) and fS'(P) = fS1'(Q) where Q = fS2(P). As fS1(Q) => fS1'(Q) for any Q, it follows that fS(P) => fS'(P) for any P. QED.

Concatenation, case 2:

Let S be: S1; S2
let S' be: S1; S2'
let S2' be as weak as S2;

then for any P, fS(P) = fS1(Q) and fS'(P) = fS1(R) where Q = fS2(P) and R = fS2'(P). Because for any P, Q => R, it follows from the healthiness of fS1 that fS(P) => fS'(P) for any P. QED.

Binary selection, case 1:

Let S be: if B then S1 else S2 fi
let S' be: if B then S1' else S2 fi
let S1' be as weak as S1;

then for any P

fS(P) = {B and fS1(P)} or {non B and fS2(P)}
      => {B and fS1'(P)} or {non B and fS2(P)}
      = fS'(P). QED.

Binary selection, case 2, can be left to the industrious reader.

Let us now consider a predicate transformer G constructed, by means of concatenation and selection, out of a number of healthy predicate transformers, among which is fH. (This latter predicate transformer may be used "more than once": then G corresponds to a program text in which the corresponding statement H occurs more than once.) We wish to regard this predicate transformer as a function of fH and indicate that by writing G(fH), i.e. G derives, by concatenation and/or selection with other, in this connection fixed, predicate transformers, a new predicate transformer. We now consider the recurrence relation

fH_i = G(fH_(i-1))          (1)

which is a tractable thing in the sense that if fH_0 is as strong (weak) as fH_1, it follows via mathematical induction from the Theorem of Monotonicity that fH_i is as strong (weak) as fH_(i+1) for all i. We should like to start the recurrence relation with a constant transformer fH_0 that is either as strong or as weak as any other. We can do this for a predicate transformer as strong as any other by choosing fH_0 = fSTOP, given by

fSTOP(P) = F for any P.

So we consider the sequence of predicate transformers given by

fH_0 = fSTOP and for i > 0: fH_i = G(fH_(i-1))          (2)

with the property that

1) all fH_i are healthy (by induction)

2) for i ≤ j and any P: fH_i(P) => fH_j(P).

Because all fH_i are healthy and any P => T, we also know that for any P

fH_i(P) => fH_i(T).

We now recall that we were looking for fS_i such that for any P and Q and i ≠ j we would have fS_i(P) and fS_j(Q) = F. We can derive such predicate transformers from the fH_i. As each fH_i(P) implies for the same P the next one in the sequence, we could try for i > 0

fS_i(P) = fH_i(P) and non fH_(i-1)(P)          (3)

i.e. the fS_i(P) is the "incremental tolerance"; but -both on account of the conjunction and on account of the negation- it is not immediately obvious that such a construction is a healthy predicate transformer. Therefore we proceed a little bit more carefully, first deriving a few other theorems about two predicate transformers fS and fS', such that fS is as strong as fS',

i.e. fS(P) => fS'(P) for any P. Another way of writing this same implication is

fS'(P) = fS(P) or {fS'(P) and non fS(P)}.

Referring to property 6' of healthy predicate transformers we can replace "non fS(P)" and find

fS'(P) = fS(P) or {fS'(P) and {fS(non P) or non fS(T)}}.

Because fS(non P) => fS'(non P) => non fS'(P), this reduces to

fS'(P) = fS(P) or {fS'(P) and non fS(T)}          (4)

from which we derive (by taking the conjunction with fS(T))

fS'(P) and fS(T) = fS(P)          (5)

and (by taking the conjunction with non fS(P))

fS'(P) and non fS(P) = fS'(P) and non fS(T).          (6)

From (6) we conclude, because fH_(i-1)(P) => fH_i(P), that our tentative

fS_i(P) = fH_i(P) and non fH_(i-1)(P) = fH_i(P) and non fH_(i-1)(T)

and because "non fH_(i-1)(T)" is a predicate independent of P, the fS_i as defined by (3) are healthy.

Defining

K_0 = F and for i > 0: K_i = fS_i(T)          (7)

it is easy to show that

i ≠ j implies K_i and K_j = F.          (8)

This is proved by a reductio ad absurdum. Let i < j and suppose K_i and K_j ≠ F; then there exists a point v in state space such that

K_i(v) and K_j(v) = true.

However, K_i(v) implies fH_i(T)(v), which implies fH_(j-1)(T)(v) -because j-1 ≥ i- which implies K_j(v) = false, and this is the contradiction we were after. In other words: in each point in state space at most one K_i is true.

From (7) combined with fH_i(P) => fH_i(T) it follows that

fS_i(P) = K_i and fH_i(P)          (9)

which together with (8) leads to the conclusion that for any P and Q

i ≠ j implies fS_i(P) and fS_j(Q) = F          (10)

and this is exactly the relation we have been looking for.

In passing we note that, on account of (9), K_i = F implies fS_i(P) = F; on account of (3) this tells us that for any P: fH_i(P) => fH_(i-1)(P); we also know that fH_(i-1)(P) => fH_i(P) for any P, and we conclude fH_i(P) = fH_(i-1)(P). As this holds for any P, we conclude fH_i = fH_(i-1) and therefore

fH_(i+1) = G(fH_i) = G(fH_(i-1)) = fH_i.

In other words:

K_i = F implies fH_j = fH_(i-1) for j ≥ i and K_j = F for j > i.          (11)

(15)

fH ( P) = (~ i : 1

<

i: fS,(P))

l.

but that ene, although healthy, is nat interesting because on account of

(10)

i t is identically

F;

and secondly

fH (

p)

=

Cs.

i= 1

:S

i=

fs. (

p))

l." ( 12)

The latter one is nat identically F and we call i t a predicate tra~sformer

"composed by recursion11

• In form~la (12), fo.r each point v_ in state space, such that fH(P)(v)

=

true; the existentiel quantifier singles out a unique value of i.

Alternatively we may write

fH ( P) ==

(.s_

j : _ 1

:S

j : -fH . ( P))

J

( 1

3)

It is by now most urgent that we relate the above to our intuitive understanding of the recursive procedure: then all our formulae become quite obvious.

First a remark about the Theorem of Monotonicity: it just states that if we replace a component of a structure by a more powerful one, the modified structure will be at least as powerful as the original one. (Consider, for instance, an implementation of a programming language that leads to program abortion when integer overflow occurs, i.e. when an integer value outside the range [-M, +M] is generated. When we modify the machine by increasing M, all computations that were originally feasible remain so, but possibly we can do more.)

Now for the recursion. All we have been talking about is a recursive procedure (without local variables and without parameters) that could have been declared by a text of the form

proc H: ..... H ..... H ..... H ..... corp

i.e. a procedure H that may call itself from various places in its body. Mentally we are considering a sequence of procedures H_i, in which each H_i calls H_(i-1) wherever H calls itself. Our rules

fH_0 = fSTOP and for i > 0: fH_i = G(fH_(i-1))

are such that the predicate transformer fH_i corresponds to our intuitive understanding of the call of procedure H_i. In terms of the procedure H, fH_i describes what a call of the procedure H can do under the additional constraint that the dynamic recursion depth will not exceed i. In particular, fH_i(T) characterizes the initial states such that the procedure call will terminate with a dynamic recursion depth not exceeding i, while K_i characterizes those initial states such that a call of H will give rise to a maximum recursion depth of exactly i. This intuitive interpretation makes our earlier formulae quite obvious; fH(T) is the weakest pre-condition that the call will lead to a terminating computation.
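The recurrence can be iterated explicitly over a small finite state space. The sketch below models a recursive procedure of my own choosing, proc H: if x > 0 then x := x - 1; H else fi corp, with x ranging over 0..5 and predicates as frozensets of values of x; the names are illustrative assumptions:

```python
# Iterating fH_0 = fSTOP, fH_i = G(fH_(i-1)) over a tiny finite state
# space. fH_i(T) characterizes the initial states from which the call
# terminates with dynamic recursion depth at most i.

STATES = frozenset(range(6))
T, F = STATES, frozenset()

def fSTOP(P):
    return F                          # as strong as any transformer

def G(fH):
    """One unfolding of the body: if x > 0 then x := x - 1; H else fi."""
    def f(P):
        then_wp = frozenset(s for s in STATES if s - 1 in fH(P))
        return (frozenset(s for s in STATES if s > 0) & then_wp) | \
               (frozenset(s for s in STATES if s <= 0) & P)
    return f

fHs = [fSTOP]
for _ in range(6):
    fHs.append(G(fHs[-1]))            # fH_1 .. fH_6

depths = [sorted(fH(T)) for fH in fHs]
```

The sets fH_i(T) grow monotonically, from the empty set to the whole state space at i = 6: every initial state leads to a terminating call.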

The Theorem of Monotonicity was proved for predicate transformers formed by concatenation and/or selection. If in the body of H one of the predicate transformers fS is replaced by fS', as weak (strong) as fS, then G'(fH) will be as weak (strong) as G(fH), giving rise to an fH_i' as weak (strong) as fH_i; as a result the Theorem of Monotonicity holds also for predicate transformers constructed via recursion.

Our axiomatic definition of the semantics of a recursive procedure, finally:

fH_0 = fSTOP and, for i > 0: fH_i = G(fH_(i-1)) and

fH(P) = (E i: i > 0: fH_i(P))

is nice and compact; in actual practice it has one tremendous disadvantage: for all but the simplest bodies it is impossible to use it directly. fH_1(P) becomes a line, fH_2(P) becomes a page, etc., and this circumstance makes it often very unattractive to use it directly. We cannot blame our axiomatic definition of the recursive procedure for this unattractive state of affairs: recursion is such a powerful technique for the construction of new predicate transformers that we can hardly expect a recursive procedure "chosen at random" to turn out to be a mathematically manageable object. So we had better discover which recursive procedures can be managed intellectually, and how. This is nothing more nor less than asking for useful theorems about the semantics of recursive procedures.

Now we are going to prove the Fundamental Invariance Theorem for Recursive Procedures.

Consider a text, called H", of the form

H": ..... H' ..... H' ..... H'

to which corresponds a predicate transformer fH", such that for a specific pair of predicates Q and R, the assumption Q => fH'(R) is a sufficient assumption about fH' for proving Q => fH"(R). In that case the recursive procedure H given by

proc H: ..... H ..... H ..... H ..... corp

(where we get this text by removing the dashes and enclosing the resulting text between the brackets proc and corp) enjoys the property that

{Q and fH(T)} => fH(R).          (14)

(The tentative conclusion Q => fH(R) is wrong, as is shown by the example proc H: H corp.)

We show this by showing that then for all i

{Q and fH_i(T)} => fH_i(R)          (15)

and from (15), (14) follows trivially. Relation (15) holds for i = 0, and we shall show that if it holds for i = j-1, it will hold for i = j as well.

In the formulation of the Fundamental Invariance Theorem for Recursive Procedures we have mentioned "a pair of predicates Q and R"; we did so because, besides the co-ordinate variables of the state space, in which the computations evolve, and the constants, they may contain free variables as well, and they are paired by the fact that these are the same in a pair Q and R. For instance, both Q and R may end with "and (x = x_0)", where "x" is a co-ordinate variable and "x_0" a free variable, thus expressing that the value of x will remain unchanged, whatever its initial value. To denote a specific set (or sets) of free variable values, we shall use small letters, supplied as subscripts. We state our affairs, say, as

Q_e => fH"(R_e)

in order to indicate that Q and R are coupled by a set of free variables. (As subscripts I shall use "e" for external and "i" for internal.)

Let us first consider, for the sake of simplicity, the case that the text H" contains a single reference to H'. In the evaluation of fH"(R_e), let P1_e be the argument that, working backwards, is supplied to fH'; with

P2_e = fH'(P1_e)

we can then write

fH"(R_e) = E(P2_e).          (16)

We can regard E as a predicate transformer operating on its argument P2_e, but considered as predicate transformer it is not necessarily healthy: it may violate the Law of the Excluded Miracle. It enjoys, however, the other three properties:

P = Q implies E(P) = E(Q)
E(P and Q) = E(P) and E(Q)
E(P or Q) = E(P) or E(Q)

and therefore also the fifth:

P => Q implies E(P) => E(Q).

The statement that with regard to the predicate pair Q and R the assumption Q => fH'(R) is a sufficient assumption about fH' in order to prove Q => fH"(R) amounts more explicitly to the following statement: there exist for the free variables occurring in Q and R a set i of values (in general functionally dependent on the set e), such that

R_i => P1_e
Q_e => E(Q_i).          (17)

(For instance, consider the statement

H": n := n - 1; H'; n := n + 1

with Q and R both: n = n_0, where n_0 is a free variable. Our proof for

(n = n_e) => fH"(n = n_e)

can be based on the assumption

(n = n_i) => fH'(n = n_i)

with n_i = n_e - 1. Here R_e and Q_e are both: n = n_e, and R_i and Q_i are both: n = n_i.)

When we are now able to show that

fH_j(T) => E(fH_(j-1)(T))          (18)

then it follows from (17) that

{Q_e and fH_j(T)} => E(Q_i and fH_(j-1)(T))

and as a result {Q_i and fH_(j-1)(T)} => fH'(R_i) is then a sufficient assumption about fH' to conclude that {Q_e and fH_j(T)} => fH"(R_e). As fH_j depends on fH_(j-1) as H" on H', this would conclude the induction step and (14) would have been proved.

We have two holes to fill: we have to show (18), and we have to extend the line of reasoning to texts H" containing more than one reference to H'. Let us first concentrate on (18).

We have defined fH_j = G(fH_(j-1)); but because for any P we have

fH_(j-1)(P) => fH_(j-1)(T),

an identical definition would have been

fH_j = G(fH_(j-1)(T) and fH_(j-1))

i.e. each predicate formed by applying fH_(j-1) is replaced by its conjunction with fH_(j-1)(T). And therefore, instead of

P1 = fS(T)
P2 = fH_(j-1)(P1)
fH_j(T) = E(P2)

(i.e. P1 is the argument supplied to fH' in the evaluation of fH"(T)) we could have written equally well

P1 = fS(T)
P2 = fH_(j-1)(P1)
fH_j(T) = E(P2 and fH_(j-1)(T)).

But {P2 and fH_(j-1)(T)} => fH_(j-1)(T) and therefore, because the transformer E enjoys the fifth property, we are entitled to conclude

fH_j(T) => E(fH_(j-1)(T)),

i.e. relation (18).

(20)

EWD?72 - 17

The second hole, extending the line of reasoning to texts of H'' in which more than one reference to H' may occur, is easier. Working backwards in the evaluation of fH''(R_e) means that we first encounter the innermost evaluation(s) of fH', whose argument does not contain fH'. For those predicate transformers we apply our previous argument, showing that for them the weaker assumption {Q_i and fH_{j-1}(T)} => fH'(R_i) is sufficient. Then its value is replaced by Q_i (or Q_{i'}, if you prefer) and we start afresh. In this way the sufficiency of the weaker assumption about fH' can be established for all occurrences of fH' -only a finite number!- in turn.

*

*

*

For the recursive routines of the particularly simple form

    proc H: if B then S1; H else fi corp

we can ask ourselves what must be known about B and S1, when we take for R the special form Q and non B. Then

    fH''(Q and non B) = {B and fS1(fH'(Q and non B))} or {Q and non B}

In order to be able to conclude Q => fH''(Q and non B) on account of Q => fH'(Q and non B), the necessary and sufficient assumption about fS1 is

    {Q and B} => fS1(Q) .

Procedures of this simple form are such useful elements that it is generally felt justified to introduce a specific notation for it, in which the recursive procedure remains anonymous: it should contain as "parameters" the B and the S1 and we usually write

    while B do S1 od

With the statement S of the above form, we have now proved that

    {Q and B} => fS1(Q)  implies  {Q and fS(T)} => fS(Q and non B)        (21)

This is called "The Fundamental Invariance Theorem for Repetition".
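The theorem can be exercised on a concrete instance. In the sketch below (a hypothetical example, not from the paper) B is n > 0, S1 is n := n - 1, and the invariant Q is n >= 0. The hypothesis {Q and B} => fS1(Q) is checked over a finite range of states, and every execution of the loop started in a state satisfying Q, all of which terminate here (so fS(T) holds), is then seen to end in a state satisfying Q and non B.

```python
def B(n):  return n > 0       # the guard
def S1(n): return n - 1       # the repeatable statement: n := n - 1
def Q(n):  return n >= 0      # the invariant

# Hypothesis {Q and B} => fS1(Q), checked over a finite range of states:
assert all(Q(S1(n)) for n in range(-10, 11) if Q(n) and B(n))

def S(n):
    # while B do S1 od
    while B(n):
        n = S1(n)
    return n

# Every terminating execution started in a state satisfying Q ends in a
# state satisfying Q and non B, as the theorem promises:
for n0 in range(0, 11):
    n_final = S(n0)
    assert Q(n_final) and not B(n_final)
```

Note that the finite-range check of the hypothesis is only a sanity test; the theorem itself needs the implication for all states.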

Acknowledgements.

Acknowledgements are due to the members of IFIP Working Group 2.3 on Programming Methodology, with whom I had the privilege to discuss a preliminary version of this paper at the Munich meeting in April 1973. Among them, special thanks deserve M.Woodger, whose inspiring influence and assistance with respect to this work extended over the weeks both before and after that meeting, J.C.Reynolds, who has drawn my attention to an incompleteness in my first proof of the Invariance Theorem, and finally, of course, C.A.R.Hoare, because without his innovating work mine would not have been possible at all.

Special thanks are further due to C.S.Scholten for cleaning up several of my formal proofs and to my collaborators at the Technological University, Eindhoven, W.H.J.Feijen and M.Rem, for their encouragement and comments while the work was done.
