
Information measures and their applications to identification : a bibliography

Citation for published version (APA):

Ponomarenko, M. F. (1981). Information measures and their applications to identification : a bibliography. (EUT report. E, Fac. of Electrical Engineering; Vol. 81-E-123). Technische Hogeschool Eindhoven.

Document status and date: Published: 01/01/1981

Document Version:

Publisher’s PDF, also known as Version of Record (includes final page, issue and volume numbers)

Please check the document version of this publication:

• A submitted manuscript is the version of the article upon submission and before peer-review. There can be important differences between the submitted version and the official published version of record. People interested in the research are advised to contact the author for the final version of the publication, or visit the DOI to the publisher's website.

• The final author version and the galley proof are versions of the publication after peer review.

• The final published version features the final layout of the paper including the volume, issue and page numbers.

Link to publication

General rights

Copyright and moral rights for the publications made accessible in the public portal are retained by the authors and/or other copyright owners and it is a condition of accessing publications that users recognise and abide by the legal requirements associated with these rights.

• Users may download and print one copy of any publication from the public portal for the purpose of private study or research.

• You may not further distribute the material or use it for any profit-making activity or commercial gain.

• You may freely distribute the URL identifying the publication in the public portal.

If the publication is distributed under the terms of Article 25fa of the Dutch Copyright Act, indicated by the “Taverne” license above, please follow the link below for the End User Agreement:

www.tue.nl/taverne

Take down policy

If you believe that this document breaches copyright please contact us at openaccess@tue.nl, providing details, and we will investigate your claim.


Electrical Engineering

Information Measures and their Applications to Identification (a bibliography)

compiled by M.F. Ponomarenko

EUT Report 81-E-123 ISBN 90-6144-123-4 November 1981




PREFACE

The present bibliography covers publications (up to 1981) on two related topics: information-theoretic measures and the information approach to identification. For the user's convenience, the first section is divided into three parts, listing separately publications on probabilistic syntactic information measures, on non-probabilistic syntactic measures, and on semantic and pragmatic information measures. This work was done during the author's stay in the Measurement and Control Group of the Eindhoven University of Technology. Thanks are due to Professor P. Eykhoff for his interest and support, to Mrs. Henriette de Brouwer, Mr. Peter van de Ven and Mr. P.S.A. Groot for their library assistance, and to Mrs. Barbara Cornelissen and Miss Marjolein Verbeek for their patient and skilful typing.
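The probabilistic syntactic measures listed in the first section centre on two classical quantities, Shannon's entropy (Shannon, 1948) and Renyi's entropy of order a (Renyi, 1960). Their standard definitions, given here only for orientation and not reproduced from the report itself, are:

```latex
% Shannon entropy of a finite probability distribution p = (p_1, ..., p_n)
H(p) = -\sum_{i=1}^{n} p_i \log p_i
% Renyi entropy of order \alpha (\alpha > 0, \alpha \neq 1);
% H_\alpha(p) \to H(p) as \alpha \to 1
H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} p_i^{\alpha}
```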

Ir. I.V. Bra!a has contributed greatly to the bibliographic quality of the references.

Address of the author:

Dr. M.F. Ponomarenko,

Kiev (Order of Lenin) Polytechnic Institute, Brest-Litovsky prospekt 39,

KIEV, USSR


CONTENTS

1. INFORMATION MEASURES                                      Page

1.1. Probabilistic Syntactic Information Measures
1.2. Non-probabilistic Syntactic Information Measures           8
1.3. Semantic and Pragmatic Information Measures               10

2. APPLICATIONS OF INFORMATION MEASURES TO IDENTIFICATION      14

Ponomarenko, M.F.

INFORMATION MEASURES AND THEIR APPLICATIONS TO IDENTIFICATION (a bibliography).

Eindhoven University of Technology, Department of Electrical Engineering, Eindhoven, The Netherlands, 1981.

Eindhoven University of Technology Research Reports, EUT Report 81-E-123


I. INFORMATION MEASURES

1.1. Probabilistic Syntactic Information Measures

Aczel, J. (1964)

Zur gemeinsamen Charakterisierung der Entropien a-ter Ordnung und der Shannonschen Entropie bei nicht unbedingt vollstandigen Verteilungen. Zeitschrift fur Wahrscheinlichkeitstheorie und verwandte Gebiete, Vol.3, p.177-183.

Aczel, J. (1968)

Probability and information theory.

In: On different characterizations of entropies. Proc. 1st Int. Symp., Hamilton, 4-5 April 1968. Ed. by M. Behara et al.

Berlin: Springer, 1969.

Lecture Notes in Mathematics, Vol.89, p.1-11.

Aczel, J. (1977)

Some recent results on characterization of measures of information related to coding.

In: Abstracts 1977 IEEE Int. Symp. on Information Theory; Ithaca, N.Y. 10-14 October 1977. New York: IEEE, p.77.

Aczel, J. (1978)

Some recent results on characterization of measures of information related to coding.

IEEE Trans. Inf. Theory, Vol.IT-24, p.592-595.

Aczel, J. and Z. Daroczy (1963)

Charakterisierung der Entropien positiver Ordnung und der Shannonschen Entropie. Acta Math. Acad. Sci. Hungary, Vol.14, p.95-121.

Aczel, J. and Z. Daroczy (1975)

On measures of information and their characterizations. New York: Academic Press.

Mathematics in Science and Engineering, Vol.115.

Aczel, J. and J. Pfanzagl (1966)

Remarks on the measurement of subjective probability and information. Metrika, Vol.11, p.91-105.

Aggarwal, N.L. (1973)

Sur l'information de Fisher.

In: Theories de l'information. Actes des Rencontres de Marseille-Luminy, 5 au 7 juin 1973. Ed. by J. Kampe de Feriet and C.F. Picard.

Berlin: Springer, 1974.

Lecture Notes in Mathematics, Vol.398, p.111-117.

Aggarwal, N.L. and C.F. Picard (1978)

Functional equations and information measures with preference.

Kybernetika, Vol.14, p.174-181. Arimoto, S. (1975)

Information measures and capacity of order a for discrete memoryless channels. In: Topics in information theory. Proc. 2nd Colloquium on Information Theory; Keszthely, Hungary, 25-28 August 1975.

Ed. by I. Csiszar and P. Elias. Amsterdam: North-Holland, 1977.

Colloquia Mathematica Societatis Janos Bolyai, Vol.16, p.41-52.

Barret, T.W. (1976a)

Information measurement. I. On maximum entropy conditions applied to elementary signals.


Barret, T.W. (1976b)

Information measurement. II. On minimum conditions of energy order applied to elementary signals.

Acustica, Vol.36, p.282-286.

Bhattacharyya, A. (1946-47)

On some analogues of the amount of information and their use in statistical estimation.

Sankhya, Vol.8, p.1-14, 201-218.

Behara, M. and P. Nath (1973)

Additive and non-additive entropies of finite measurable partitions.

In: Probability and information theory II. Ed. by M. Behara et al. Berlin: Springer.

Lecture Notes in Mathematics, Vol.296, p.102-138. Boekee, D.E. (1975)

An extension of the Fisher information measure.

In: Topics in information theory. Proc. 2nd Colloquium on Information Theory; Keszthely, Hungary, 25-29 August 1975.

Ed. by I. Csiszar and P. Elias. Amsterdam: North-Holland, 1977.

Colloquia Mathematica Societatis Janos Bolyai, Vol.16, p.113-123. Boekee, D.E. (1976)

On the notion of entropy metrics.

In: Abstracts 1976 Int. Symp. on Information Theory; Ronneby, Sweden, 21-24 June 1976.

New York: IEEE, p.36. Boekee, D.E. (1977)

A generalization of the Fisher information measure.

Ph.D. Thesis. Delft University of Technology, Netherlands. Delft University Press.

Boekee, D.E. and J.C.A. van der Lubbe (1980) The R-norm information measure.

Inf. & Control, Vol.45, p.136-155.

Bozic, S.M. (1980)

A simple approach to decision and information theory.

Electron. Eng., Vol.52, No.643, October 1980, p.91, 93, 95, 99, 101.

Brillouin, L. (1962)

Science and information theory. 2nd ed. New York: Academic Press.

Csiszar, I. (1974)

Information measures: a critical survey.

In: Trans. 7th Prague Conf. on Information Theory, Statistical Decision Functions, Random Processes and the European Meeting of Statisticians; Prague, 18-23 August 1974. Vol.B.

Dordrecht: Reidel, 1978, p.73-86.

Daroczy, Z. (1970)

Generalized information functions. Inf. & Control, Vol.16, p.36-51.

Daroczy, Z. (1963)

Uber die gemeinsame Charakterisierung der zu den nicht vollstandigen Verteilungen gehorigen Entropien von Shannon und von Renyi.

Zeitschrift fur Wahrscheinlichkeitstheorie und verwandte Gebiete,


DeGroot, M.H. (1962)

Uncertainty, information and sequential experiments. Ann. Math. Stat., Vol.33, p.404-419.

El-Sayed, A.-B. (1977)

The independence inequality and its application to information theory. Inf. & Control, Vol.35, p.229-245.

Fintushal, S.M. (1975)

Representation of Fisher information in terms of distribution moments. Probl. Inf. Transm., Vol.11, p.253-255.

(Transl. of "Probl. Peredachi Inf.")

Forte, B. and C.T. Ng (1975)

Derivation of a class of entropies including those of degree a.

Inf. & Control, Vol.28, p.335-351.

Forte, B. and C. Sempi (1978)

Pruning and measures of uncertainty. RAIRO Inf. Theor., Vol.12, p.157-168. Fraser, D.A.S. (1965)

On information in statistics.

Ann. Math. Stat., Vol.36, p.890-896. Georgescu-Roegen, N. (1975)

The measure of information: a critique.

In: Modern trends in cybernetics and systems. Proc. 3rd Int. Congress of Cybernetics and Systems; Bucharest, 25-28 August 1975. Vol.3. Ed. by J. Rose and C. Bilciu.

Berlin: Springer, p.187-217.

Giustini, P. (1973)

The information and energy concepts (French).

7th Int. Congress on Cybernetics, Namur, 10-15 September 1973. Abstract only: Comput. & Control Abstr. 74-1616.

Gupta, H.C. and B.D. Sharma (1976) On non-additive measures of inaccuracy. Czech. Math. J., Vol.26 (101), p.584-595. Gyorfi, L. and T. Nemetz (1975)

f-dissimilarity: a general class of separation measures of several probability measures.

In: Topics in information theory. Proc. 2nd Colloquium on Information Theory; Keszthely, Hungary, 25-29 August 1975.

Ed. by I. Csiszar and P. Elias. Amsterdam: North-Holland, 1977.

Colloquia Mathematica Societatis Janos Bolyai, Vol.16, p.309-321.

Hartley, R.V. (1928)

Transmission of information.

Bell Syst. Tech. J., Vol.7, p.535-563. Havrda, J. and F. Charvat (1967)

Quantification method of classification process. Concept of structural a-entropy.

Kybernetika, Vol.3, p.30-35. Jaynes, E.T. (1977)

Information theory in physics.

In: Abstracts 1977 IEEE Int. Symp. on Information Theory; Ithaca, N.Y., 10-14 October 1977.


Kagan, A.M. (1963)

On the theory of Fisher's amount of information. Sov. Math.-Doklady, Vol.4, p.991-993.

(Transl. of "Doklady Akademii Nauk SSSR", Vol.151, 1963, No.1-6.)

Kampe de Feriet, J. (1970)

Measure of information by a set of observers: a functional equation. In: Functional equations and inequalities. Corso tenuto a la Mendola, Trento, dal 20 al 28 agosto 1970. III CicIo.

Coordinatore: B. Forte.

Roma: Edizioni Cremonese, 1971.

Centro Internazionale Matematico Estivo - C.I.M.E. - International Mathematical Summer Center, p.163-193.

Kannappan, Pl. (1972)

On Shannon's entropy, directed divergence and inaccuracy.

Zeitschrift fur Wahrscheinlichkeitstheorie und verwandte Gebiete, Vol.22, p.95-100.

Kannappan, Pl. (1972)

On directed divergence and inaccuracy.

Zeitschrift fur Wahrscheinlichkeitstheorie und verwandte Gebiete, Vol.25, p.49-55.

Kannappan, Pl. and C.T. Ng (1973)

Measurable solutions of functional equations related to information theory. Proc. Amer. Math. Soc., Vol.38, p.303-310.

Kannappan, Pl. and P.N. Rathie (1973)

On a characterization of directed divergence. Inf. & Control, Vol.22, p.163-171.

Kerridge, D.F. (1961) Inaccuracy and inference.

J. Royal Stat. Soc. Ser. B, Vol.23, p.184-194. Kullback, S. (1953)

A note on information theory.

J. Appl. Phys., Vol.24, p.106-107.

Kullback, S. (1954)

Certain inequalities in information theory and the Cramer-Rao inequality. Ann. Math. Stat., Vol.25, p.745-751.

Kullback, S. (1959)

Information theory and statistics. New York: Wiley.

Wiley Publications in Statistics. Kullback, S. and R.A. Leibler (1951) On information and sufficiency. Ann. Math. Stat., Vol.22, p.79-86. Liboff, R.L. (1974)

Gibbs vs. Shannon entropies.

J. Stat. Phys., Vol.11, p.343-357.

Mallows, C.L. (1959)

The information in an experiment.

J. Royal Stat. Soc. Ser. B, Vol.21, p.67-72.

Mathai, A.M. (1967)

Dispersion and information. Metron, Vol.26, p.314-325.


Mathai, A.M. and P.N. Rathie (1975)

Basic concepts in information theory and statistics: Axiomatic foundations and applications.

New Delhi: Wiley Eastern. Nath, P. (1968)

Entropy, inaccuracy and information. Metrika, Vol.13, p.136-148.

Otten, K.W. (1972)

Basis for a science of information.

In: Information Science, Search for Identity. Proc. NATO Advanced Study Institute; Champion, 12-20 August 1972.

Ed. by A. Debons.

New York: Marcel Dekker, 1974.

Books in Library and Information Science. p.91-106. Papaioannou, P.C. (1970)

On statistical information theory and related measures of information. Ph.D. Thesis. Iowa State University of Science and Technology, Ames. Available from: University Microfilms, Ann Arbor, Mich., USA.

Order No. 70-25815.

Patni, G.C. and K.C. Jain (1976)

On some information measures.

Inf. & Control, Vol.31, p.185-192.

Picard, C.-F. and T. van der Pyl (1977)

Information d'ordre a et de type β pour des produits d'experiences. C.R. Hebd. Seances Acad. Sci. Ser. A (Paris), Vol.284, p.417-420.

Rathie, P.N. (1971)

On some new measures of uncertainty, inaccuracy and information and their characterizations.

Kybernetika, Vol.7, p.394-403. Rathie, P.N. (1973)

Some characterization theorems for generalized measures of uncertainty and informations.

Metrika, Vol.20, p.122-130.

Rathie, P.N. and Pl. Kannappan (1972)

A directed-divergence function of type β.

Inf. & Control, Vol.20, p.38-45.

Renyi, A. (1960)

On measures of entropy and information.

In: Proc. 4th Berkeley Symp. on Mathematical Statistics and Probability. Vol.l: Contributions to the Theory of Statistics.

Berkeley, 20 June-30 July 1960. Ed. by J. Neyman.

Berkeley-Los Angeles: University of California Press, 1961. p.547-561. Renyi, A. (1966)

On the amount of missing information and the Neyman-Pearson lemma. In: Research papers in statistics. Festschrift for J. Neyman. Ed. by F.N. David.

London: Wiley, p.281-288.

Renyi, A. (1970)

Probability theory.

Amsterdam: North-Holland.


Sears, S.B., R.G. Parr and U. Dinur (1980)

Quantum-mechanical kinetic energy as a measure of the information in a

distribution.

Israel J. Chem., Vol.19, p.165-173.

Shannon, C.E. (1948)

A mathematical theory of communication.

Bell Syst. Tech. J., Vol.27, p.379-423; 623-656. Shannon, C.E. and W. Weaver (1949)

The mathematical theory of communication. Urbana: University of Illinois Press. Sharma, B.D. (1972)

On the amount of information of type-β and other measures. Metrika, Vol.19, p.1-10.

Sharma, B.D. and R. Autar (1973 a)

Relative information functions and their type (α,β) generalizations. Metrika, Vol.21, p.41-50.

Sharma, B.D. and R. Autar (1973 b)

On characterization of generalized inaccuracy measure in information theory.

J. Appl. Probab., Vol.10, p.464-468.

Sharma, B.D. and D.P. Mittal (1975)

New non-additive measures of entropy for discrete probability distributions.

J. Math. Sci., Vol.10, p.28-40.

Sharma, B.D. and D.P. Mittal (1977)

New non-additive measures of relative information. J. Combin. Inform. System Sci., Vol.2, p.122-132. Sharma, B.D. and I.J. Taneja (1974)

On axiomatic characterization of information-theoretic measures. J. Stat. Phys., Vol.10, p.337-346.

Sheng, C.L. and S.G.S. Shiva (1966) On measure of information.

In: Proc. National Electronics Conference, Chicago, 3-5 Oct. 1966. Vol.22. Chicago: National Electronics Conference, p.798-803.

Shimizu, R. (1974)

On Fisher's amount of information for location family.

In: A Modern Course on Statistical Distributions in Scientific Work. Vol.3: Characterizations and Applications.

Proc. NATO Advanced Study Institute, Calgary, 29 July- 10 August 1974. Ed. by G.P. Patil et al.

Dordrecht: Reidel, 1975.

NATO Advanced Study Institutes Series C: Mathematical and Physical Sciences, Vol.17, p.305-312.

Shiva, S.G.S., N.D. Ahmed and N.D. Georganas (1973)

Order preserving measures of information.

J. Appl. Probab., Vol.10, p.666-670.

Taneja, I.J. (1974)

A joint characterization of directed divergence, inaccuracy, and their generalizations.


Taneja, I.J. (1976)

On measure of information and inaccuracy. J. Stat. Phys., Vol.14, p.263-270.

Vaganov, A.M. and G.G. Kosenko (1972)

A general approach to radar information measures of Kotel'nikov, Shannon and Kul'bak.

Radio Eng. & Electron. Phys., Vol.17, p.1200-1202. (Transl. of "Radiotekh. & Elektron.")

Van der Lubbe, J.C.A. and D.E. Boekee (1977) R-norm information.

Proc. 10th European Meeting of Statisticians; Leuven, Belgium, 22-26 August 1977, p.173.

Van der Lubbe, J.C.A. and D.E. Boekee (1979)

On measures of certainty and information in sequential and nonsequential hypothesis testing.

In: Abstracts 1979 IEEE Int. Symp. on Information Theory; Grignano, Italy, 25-29 June 1979.


1.2. Non-probabilistic Syntactic Information Measures

Aczel, J. (1979)

Inset measures: a new, unified theory of information.

In: Abstracts 1979 IEEE Int. Symp. on Information Theory; Grignano, Italy, 25-29 June 1979.

New York: IEEE, p.26. Backer, E. (1975)

A non-statistical type of uncertainty in fuzzy events.

In: Topics in Information Theory. Proc. 2nd Colloquium on Information Theory; Keszthely, Hungary, 25-29 August 1975.

Ed. by I. Csiszar and P. Elias. Amsterdam: North-Holland, 1977.

Colloquia Mathematica Societatis Janos Bolyai, Vol.16, p.53-73. Bonchev, D., D. Kamensky and V. Kamenska (1976)

Symmetry and information content of chemical structures. Bull. Math. BioI., Vol.38, p.119-133.

Boxma, Y. and E. Backer (1979)

Probabilistic and non-probabilistic certainty measures in relation to information and distance measures.

In: Abstracts 1979 IEEE Int. Symp. on Information Theory; Grignano, Italy, 25-29 June 1979.

New York: IEEE, p.25. Forte, B. (1971)

Applications of functional equations and inequalities to information theory.

In: Functional equations and inequalities. Corso tenuto a La Mendola, Trento, dal 20 al 28 agosto 1970. III CicIo.

Coordinatore: B. Forte. Roma: Edizioni Cremonese.

Centro Internazionale Matematico Estivo - C.I.M.E. - International Mathema-tical Summer Center. p.113-140.

Forte, B. (1969)

Measures of information: the general axiomatic theory.

Revue Francaise d'Informatique et de Recherche Operationelle, Vol.3, No.R-2, p.63-89.

Ingarden, R.S. (1965)

Simplified axioms for information without probability. Prace Matematyczne, Vol.9, p.273-282.

Ingarden, R.S. and K. Urbanik (1962) Information without probability. Colloq. Math., Vol.9, p.131-150. Kolmogorov, A.N. (1965)

Three approaches to the quantitative definition of information. Probl. Inf. Transm., Vol.1, No.1, p.1-7.

(Transl. of "Probl. Peredachi Inf.")

Lagrand, C. and N.-T. Hung (1972)

Sur les mesures interi"Ei'lIT'es de l'information et les a-precapacites.


Losfeld, J. (1973)

Information generalisee et relation d'ordre.

In: Theories de l'information. Actes des Rencontres de Marseille-Luminy, 5 au 7 juin 1973. Ed. by J. Kampe de Feriet and C.F. Picard.

Berlin: Springer, 1974.

Lecture Notes in Mathematics, Vol.398, p.49-61. McGuire, C. B. (1972)

Comparisons of information structures.

In: Decision and Organization. A volume in honor of Jacob Marshak. Ed. by C.B. McGuire and R. Radner.

Amsterdam: North-Holland.

Studies in Mathematical and Managerial Economics, Vol.12, p.101-130. Mowshowitz, A. (1968 a)

Entropy and the complexity of graphs. I: An index of the relative complexity of a graph.

Bull. Math. Biophys., Vol.30, p.175-204. Mowshowitz, A. (1968b)

Entropy and the complexity of graphs. II: The information content of diagraphs and infinite graphs.

Bull. Math. Biophys., Vol.30, p.225-240.

Mowshowitz, A. (1968c)

Entropy and the complexity of graphs. III: Graphs with prescribed information content.

Bull. Math. Biophys., Vol.30, p.387-414.

Mowshowitz, A. (1968d)

Entropy and the complexity of graphs. IV: Entropy measures and graphical structure.

Bull. Math. Biophys., Vol.30, p.533-546.

Rashevsky, N. (1955)

Life, information theory, and topology. Bull. Math. Biophys., Vol.17, p.229-235.

Trucco, E. (1956a)

A note on the information content of graphs.

Bull. Math. Biophys., Vol.18, p.129-135.

Trucco, E. (1956b)

On the information content of graphs: Compound symbols; different states for each point.

Bull. Math. Biophys., Vol.18, p.237-253. Urbanik, K. (1972)

On the concept of information.

Bull. Acad. Pol. Sci. Ser. Sci. Math. Astron. & Phys., Vol.20, p.887-890. Van der Lubbe, J.C.A. (1981)

A generalized probabilistic theory of the measurement of certainty and information.

Ph.D. Thesis. Delft University of Technology, Netherlands.

Department of Electrical Engineering, Delft University of Technology. Technical Report IT-81-02.


1.3. Semantic and Pragmatic Information Measures

Bar-Hillel, Y. (1964)

Language and information: Selected essays on their theory and application. Reading, Mass.: Addison-Wesley.

Addison-Wesley Series in Logic.

Bar-Hillel, Y. and R. Carnap (1953) Semantic information.

Br. J. Philos. Sci., Vol.4, p.147-157.

Belis, M. and S. Guiasu (1968)

A quantitative-qualitative measure of information in cybernetic systems. IEEE Trans. Inf. Theory, Vol.IT-14, p.593-594.

Bongard, M.M. (1963)

On the concept of "useful information". (Russian) Problemy Kibernetiki, No.9, p.72-102.

Bouchon, B. (1976)

Useful information and questionnaires. Inf. & Control, Vo1.32, p.368-378. Brauch, H. (1969)

Moglichkeiten zur Quantifizierung von Informationen fur Entscheidungsprozesse.

Ph.D. Thesis. University of Mannheim. Carnap, R. (1950)

Logical foundations of probability. Chicago: University of Chicago Press.

Casfi, P., ~. Mili and Ch. Robach (1977)

An information measure on nets - application to the testability of digital systems.

In: Information and Systems. Proc. IFAC Workshop, Compiegne, France, 25-27 October 1977.

Ed. by B. Dubuisson.

Oxford: Pergamon, 1978, p.35-39.

Corley, M.R. and J.J. Allan, III, (1976)

Pragmatic information processing aspects of graphically accessed computer-aided design.

IEEE Trans. Syst., Man & Cybern., Vol.SMC-6, p.434-439. Evans, F.J. (1977)

The informational content of system structure - a survey and some open problems.

In : Information and Systems. Proc. IFAC Workshop, Compiegne, France, 25-27 October 1977.

Ed. by B. Dubuisson.

Oxford: Pergamon, 1978, p.51-64.

Fishburn, P.C. (1966)

On the prospects of a unified theory of value for engineering. IEEE Trans. Syst. Sci. & Cybern., Vol.SSC-2, p.27-35.

Gottinger, H.W. (1973)

Qualitative information and comparative informativeness. Kybernetik, Vol.13, p.81-94.


Gottinger, H.W. (1975)

Lecture notes on concepts and measures of information. In: Information Theory: New trends and open problems. Ed. by G. Longo.

Wien: Springer.

International Centre for Mechanical Sciences: Courses and Lectures, Vol.219, p.I-44.

Le Guyader, H., Cl. Vallet, Th. Moulin, L. Lafreniere and H. Apter (1977) Arithmetical relators and virtual information.

In: Information and Systems. Proc. IFAC Workshop, Compiegne, France, 25-27 October 1977.

Ed. by B. Dubuisson.

Oxford: Pergamon, 1978, p.119-129. Harrah, D. (1963)

Communication: A logical model. Cambridge, Mass.: M.I.T. Press. M.I.T. Research Monograph, Vol.15. Hintikka, J. (1970)

On semantic information.

In: Information and Inference.

Ed. by K.J.J. Hintikka and P. Suppes. Dordrecht, Reidel.

Synthese Library, p.3-27. Hintikka, J. (1967)

The varieties of information and scientific explanation. In: Logic, Methodology and Philosophy of Science III. Proc. 3rd Int. Congress; Amsterdam, 25 Aug.-2 Sept. 1967. Ed. by B. van Rootselaar and J.F. Staal.

Amsterdam: North-Holland, 1968.

Studies in Logic and the Foundations of Mathematics, p.311-331.

Howard, R.A. (1966)

Information value theory.

IEEE Trans. Syst. Sci. & Cybern., Vol.SSC-2, p.22-26.

Hurley, W.V. (1965)

A mathematical theory of the value of information. Int. J. Comput. Math., Vol.1, p.97-146.

Jumarie, G. (1978)

Some technical applications of relativistic information, Shannon information, fuzzy sets, linguistics, relativistic sets and communication (I). Cybernetica, Vol.21, p.93-123.

Jumarie, G. (1973)

Towards a new approach to self-organizing systems. Int. J. Syst. Sci., Vol.4, p.707-726.

Jumarie, G. (1974)

Structural entropy, informational potential, information balance and evolution in self-organizing systems.

Int. J. Syst. Sci., Vol.5, p.953-972. Jumarie, G. (1975 a)

Further advances on the general thermodynamics of open systems via information theory: effective entropy, negative information.


Jumarie, G. (1975 b)

A relativistic information theory model for general systems. Lorentz transformation of organizability and structural entropy. Int. J. Syst. Sci., Vol.6, p.865-886.

Jumarie, G. (1976 a)

New results in relativistic information theory: Application to deterministic, stochastic and biological systems.

Int. J. Syst. Sci., Vol.7, p.393-414. Jumarie, G. (1976 b)

A relativistic information approach to the structural dynamics of general systems. Morphogenesis.

Cybernetica, Vol.19, p.273-304. Jumarie, G. (1976 c)

A relativistic information theoretic approach to identification and system parameter estimation.

In: Proc. 4th IFAC Symp. on Identification and System Parameter Estimation; Tbilisi, 21-27 September 1976.

Ed. by N.S. Rajbman.

Amsterdam: North-Holland, 1978, p.1305-1314. Jumarie, G. (1977)

A survey of relativistic information and its applications.

In: Information and Systems. Proc. IFAC Workshop, Compiegne, France, 25-27 October 1977.

Ed. by B. Dubuisson.

Oxford: Pergamon, 1978, p.113-118.

Kampe de Feriet, J. (1973)

La theorie generalisee de l'information et la mesure subjective de l'information.

In: Theories de l'information. Actes des Rencontres de Marsei1le-Luminy, 5 au 7 juin 1973.

Ed. by J. Kampe de Feriet and C.F. Picard. Berlin: Springer, 1974.

Lecture Notes in Mathematics, Vol.398, p.1-35.

Kharkevits, A.A. (1960)

On the value of information.

Problems of Cybernetics, Vol.4, 1962, p.1193-1198.

(Transl. of "Problemy Kibernetiki", Vol.4, 1960, p.53-58.)

Kulikowski, J.L. (1971)

Notes on the value of information transmitted through a communication channel.

In: Proc. 2nd Int. Symp. on Information Theory; Tsahkadsor, Armenia, 2-8 September 1971.

Ed. by B.N. Petrov and F. Csaki.

Budapest: Akademiai Kiado, 1973, p.61-71.

MacKay, D.M. (1969)

Information, mechanism and meaning. Cambridge, Mass.: M.I.T. Press. MacKay, D.M. (1950)

Quantal aspects of scientific information. Philos. Mag., Vol.4l, p.289-311.


Nauta, D. (1970)

The meaning of information.

Ph.D. Thesis. University of Leyden, 1970. The Hague: Mouton.

Nauta, D. (1973 a)

Information-measurement and meaning. Linguistics, No.97, p.95-104.

Nauta, D. (1973 b)

Portret van de semiotiek.

Algemeen Nederlands Tijdschrift voor Wijsbegeerte, Vol.65, 1973, p.170-198.

Neidhardt, P. (1971)

Moglichkeiten der Einfuhrung des Wertbegriffs in die Informationstheorie. Int. Elektronische Rundschau, Vol.25, p.265-268.

Schreider, Yu.A. (1963)

On the quantitative characteristics of semantic information. (Russian) Nauchno-Tekhnicheskaya Informatsiya, No.IO, p.33-38.

Stanoulov, N. and V. Lyutskanov (1974)

Pragmatic information in control processes. C.R. Acad. Bulg. Sci., Vol.27, p.1025-1028.

Stratonovich, R.L. (1965)

On information cost.

Eng. Cybern., Vol.3, No.5, p.1-9. (Transl. of "Tekh. Kibern.")

Ursul, A.D. (1975)

The problem of the objectivity of information.

In: Entropy and information in science and philosophy. Ed. by L. Kubat and J. Zeman.

Amsterdam: Elsevier; Prague: Academia, p.187-200.

Van Peursen, C.A., C.P. Bertels and D. Nauta (1968)

Informatie, een interdisciplinaire studie.

Utrecht: Spectrum. Aula-boeken.

Voishvillo, E.K. (1966)

An attempt at semantic interpretation of the statistical concepts of information and entropy (Russian).

In: Kybernetiku - na sluzhbu kommunismu. Vol.3. Moscow: Energiya, p.275-293.

Wells, R. (1960)

A measure of subjective information.

In: Structure of Language and its Mathematical Aspects.

Proc. 12th Symp. on Applied Mathematics; New York City, 14-15 April 1960. Ed. by R. Jakobson.

Providence, R.I.: American Mathematical Society, 1961.

Proceedings of Symposia in Applied Mathematics, Vol.12, p.237-244. Zunde, P. (1971)

On signs, information, and information measures.


2. APPLICATIONS OF INFORMATION MEASURES TO IDENTIFICATION

Abe, K. and H. Takeda (1973)

Finite parameter estimation problems under incomplete information on stochastic matrices.

Electron. & Commun. Japan, Vol.56-A, No.8, p.11-17.

Akaike, H. (1972)

Use of an information theoretic quantity for statistical model identification.

In: Proc. 5th Hawaii Int. Conf. on System Sciences; Honolulu, 11-13 January 1972. Ed. by A. Lew.

North Hollywood, Calif.: Western Periodicals, p.249-250.

Akaike, H. (1973)

Information theory and an extension of the maximum likelihood principle. In: Proc. 2nd Int. Symp. on Information Theory; Tsahkadsor, Armenia, 2-8 September 1971.

Ed. by B.N. Petrov and F. Csaki. Budapest: Akademiai Kiad6, p.267-281. Akaike, H. (1976a)

Canonical correlation analysis of time series and the use of an information criterion.

In: System identification: Advances and case studies. Ed. by R.K. Mehra and D.G. Lainiotis.

New York: Academic Press, 1976.

Mathematics in Science and Engineering, Vol.126, p.27-96. Akaike, H. (1976 b)

On entropy maximization principle.

In: Applications of Statistics. Proc. Symp.; Dayton, 14-18 June 1976. Ed. by P.R. Krishnaiah.

Amsterdam: North-Holland, 1977. p.27-41. Akaike, H. (1981)

Modern development of statistical methods.

In: Trends and Progress in System Identification. Ed. by P. Eykhoff.

Oxford: Pergamon.

IFAC Series for Graduates, Research Workers & Practising Engineers, Vol.1, p.1 9-184.

~, M.A. and A.P. Sage (1975)

Sequential estimation and identification of reflection coefficients by minimax entropy inverse filtering.

Comput. & Electr. Eng., Vol.2, p.315-338.

Aleksandrovskaya, L.N. and A.S. Golubkov (1973)

Informational estimation of stationarity in measurement data processing problems (Russian).

Izv. VUZ Priborostr., No.1, p.18-21. Arimoto, S. (1971)

Information-theoretical considerations on estimation problems. Inf. & Control, Vol. 19, p.181-194.

Arimoto, S. and H. Kimura (1971)

Optimum input test signals for system identification - an information-theoretical approach.


Astrom, K.J. (1967)

On the achievable accuracy in identification problems.

In: Preprints 1st IFAC Symp. on Identification; Prague, 1967. Prague: Academia. Paper 1.8.

Baram, Y. and N.R. Sandell Jr. (1977)

An information theoretical approach to dynamical systems modeling and identification.

In: Proc. 1977 IEEE Conf. on Decision & Control, including the 16th Symp. on Adaptive Processes and a Special Symp. on Fuzzy Set Theory and Applications; New Orleans, 7-9 December 1977. New York: IEEE, p.1113-1118.

Barankin, E.W. (1949)

Locally best unbiased estimates. Ann. Math. Stat., Vol.20, p.477-501.

Barnes, J.L. (1968)

Information theoretic aspects of feedback control systems. Automatica, Vol.4, p.165-185.

Ben-Bassat, M. (1978)

f-entropies, probability of error, and feature selection. Inf. & Control, Vol.39, p.227-242.

Ben-Bassat, M. and J. Raviv (1978)

Renyi's entropy and the probability of error. IEEE Trans. Inf. Theory, Vol.IT-24, p.324-331.

Boekee, D.E. (1977)

Generalized Fisher information with application to estimation problems. In: Information and Systems. Proc. IFAC Workshop; Compiegne, France, 25-27 October 1977.

Ed. by B. Dubuisson.

Oxford: Pergamon, 1978. p.75-82.

Boekee, D.E. (1976)

Maximum information in continuous systems with constraints.

In: Proc. 8th Congress on Cybernetics; Namur, 6-11 September 1976. Namur: Association Internationale de Cybernetique, 1977. p.243-259.

Broekstra, G. (1978)

On the representation and identification of structure systems. Int. J. Syst. Sci., Vol.9, p.1271-1293.

Carnap, R. (1952)

The continuum of inductive methods. Chicago: University of Chicago Press. Probability and induction, Vol.2.

Chen, C.H. (1976)

On information and distance measures, error bounds and feature selection. Inf. Sciences, Vol.10, p.159-173.

Chernyak, V.S. (1971)

Use of Fisher's information matrix in the analysis of potential accuracy of maximum likelihood estimates in the presence of interfering parameters.

Radio Eng. & Electron. Phys., Vol.16, p.951-960. (Transl. of "Radiotekh. & Elektron.")

Chu, J.T. and J.C. Chueh (1966)

Inequalities between information measures and error probability. J. Franklin Inst., Vol.282, p.121-125.


Comyn, G. (1977)

Generalized information and data analysis.

In: Information and Systems. Proc. IFAC Workshop, Compiegne, France, 25-27 October 1977.

Ed. by B. Dubuisson.

Oxford: Pergamon, 1978. p.29-33.

Conant, R.C. (1980)

Structural modelling using a simple information measure. Int. J. Syst. Sci., Vol.11, p.721-730.

Davies, E.B. (1978)

Information and quantum measurement.

IEEE Trans. Inf. Theory, Vol.IT-24, p.596-599.

Dayantis, J. (1973)

Essai de quantification de l'information obtenue lors de mesures à un appareil de mesure.

J. Chim. Phys., Vol.70, p.865-872.

Devijver, P.A. (1977)

Entropies of degree β and lower bounds for the average error rate. Inf. & Control, Vol.34, p.222-226.

Devijver, P.A. (1973)

Information measure in identification and parameter estimation.

In: Identification and System Parameter Estimation. Proc. 3rd IFAC Symp.; The Hague/Delft, 12-15 June 1973.

Ed. by P. Eykhoff.

Amsterdam: North Holland, p.631-638.

Devijver, P.A. (1974 a)

Entropie quadratique et reconnaissance des formes.

In: Computer Oriented Learning Processes. Proc. NATO Advanced Study Institute; Chateau de Bonas, Gers, France, 26 Aug.- 5 Sept. 1974. Ed. by J.C. Simon.

Leyden: Noordhoff, 1976.

NATO Advanced Study Institutes Series. Series E: Applied Science, Vol.14, p.257-278.

Devijver, P.A. (1974 b)

On a new class of bounds on Bayes risk in multihypothesis pattern recognition.

IEEE Trans. Comput., Vol.C-23, p.70-80.

Dixon, L.C.W. (1976)

Cramer-Rao bounds and the choice of model and parameters in system identification.

In: Identification and System Parameter Estimation. Proc. 4th IFAC Symp.; Tbilisi, 21-27 September 1976.

Ed. by N.S. Rajbman.

Amsterdam: North Holland, 1978. p.451-460.

Donath, M., B.E. Boyle and W.C. Flower (1980)

An information theory approach to the identification of significant diagnostic measurements.

In: Proc. 1980 Joint Automatic Control Conf.; San Francisco, 13-15 August 1980. Vol.1.

New York: IEEE. Paper WA6-D (9 pages).

Dowson, D.C. and A. Wragg (1973)

Maximum-entropy distributions having prescribed first and second moments. IEEE Trans. Inf. Theory, Vol.IT-19, p.689-696.


Enta, Yu. (1978)

A measure for the discriminative effect of information.

In: Proc. 1978 Joint Automatic Control Conf.; Philadelphia, 15-20 October 1978. Vol.3.

Pittsburgh, Pa.: Instrument Society of America. p.69-80.

Eykhoff, P. (1980)

System identification: approach to a coherent picture through template functions.

Electron. Lett., Vol.16, p.502-504.

Eykhoff, P., A.J.W. van den Boom and A.A. van Rede (1981)

System identification methods: - unification and information - development using template functions.

In: Control Science and Technology for the Progress of Society. Preprints 8th IFAC World Congress; Kyoto, 24-28 August 1981. Vol.6, p.83-88.

Farag, R.F.H. (1978)

An information theoretic approach to image partitioning. IEEE Trans. Syst., Man & Cybern., Vol.SMC-8, p.829-833.

Fleyshman, B.S. and G.B. Linkovsky (1958)

Estimation of the maximum possible value of entropy of unknown distribution represented by several theoretical moments (Russian).

Trudy Nauchno-Techn. Obshch. Radiotechn. i Elektrosvyazi im A.S. Popova, No.2, p.87.

Forte, B. (1968)

The amount of information given by an experiment.

In: Proc. Colloquium on Information Theory; Debrecen, 19-24 September 1967. Ed. by A. Renyi. Vol.1.

Budapest: Janos Bolyai Mathematical Society. p.149-166.

Gaines, B.R. (1977)

System identification, approximation and complexity. Int. J. Gen. Syst., Vol.3, p.145-174.

Gart, J.J. (1959)

An extension of the Cramer-Rao inequality. Ann. Math. Stat., Vol.30, p.367-380.

Good, I.J. (1963)

Maximum entropy for hypothesis formulation, especially for multi-dimensional contingency tables.

Ann. Math. Stat., Vol.34, p.911-934.

Gray, R.M. (1976)

The maximum mutual information between two random processes.

In: Abstracts 1976 Int. Symp. on Information Theory; Ronneby, Sweden, 21-24 June 1976.

New York: IEEE. p.37.

Hamdan, M.A. and C.P. Tsokos (1971)

An information measure of association in contingency tables. Inf. & Control, Vol. 19, p.174-179.

Hilpinen, R. (1970)

On the information provided by observations.

In: Information and Inference.

Ed. by K.J.J. Hintikka and P. Suppes. Dordrecht: Reidel.


Hogarth, R.M. (1975)

Cognitive processes and the assessment of subjective probability distributions.

J. Amer. Stat. Assoc., Vol.70, p.271-294.

Ibragimov, I.A. and R.Z. Has'minsky (1971)

On the information in a sample about a parameter.

In: Proc. 2nd Int. Symp. on Information Theory; Tsahkadsor, Armenia, 2-8 September 1971.

Ed. by B.N. Petrov and F. Csaki.

Budapest: Akademiai Kiado, 1973. p.295-309.

Ihara, J. (1980)

Structural analysis of criteria for selecting model variables. IEEE Trans. Syst., Man & Cybern., Vol.SMC, p.460-466.

Ishii, N. and N. Suzumura (1975)

On the estimation of the order of Markov process from the random data of time series (Japanese).

Bull. Nagoya Inst. Technol. (Japan), 1975, No.27, p.445-453.

Ishii, N. and N. Suzumura (1977 a)

On estimation of the order of the autoregressive process. Electron. & Commun. Japan, Vol.60-A, No.6, p.1-8.

Ishii, N. and N. Suzumura (1977 b)

Estimation of the order of autoregressive process. Int. J. Syst. Sci., Vol.8, p.905-913.

Jansen, J. and J.F. Barrett (1979)

An extension of Kullback Leibler information measure to linear structural relations.

In: Identification and System Parameter Estimation. Proc. 5th IFAC Symp.; Darmstadt, 24-28 September 1979.

Ed. by R. Isermann.

Oxford: Pergamon, 1980. p.539-548.

Jaynes, E.T. (1957 a)

Information theory and statistical mechanics. Phys. Rev., Vol.106, 1957, p.620-630.

Jaynes, E.T. (1957 b)

Information theory and statistical mechanics II. Phys. Rev., Vol.108, 1957, p.171-190.

Jaynes, E.T. (1962)

Information theory and statistical mechanics.

In: Brandeis University Summer Institute Lectures in Theoretical Physics, 1962. Vol.3: Statistical Physics.

Ed. by K.W. Ford. New York: Benjamin, 1963, p.181-218.

Jaynes, E.T. (1968)

Prior probabilities.

IEEE Trans. Syst. Sci. & Cybern., Vol.SSC-4, p.227-241.

Kagan, A.M. (1978)

On the problem of the optimal detection of a constant signal in the presence of additive noise.

In: Proc. 2nd Int. Symp. on Information Theory; Tsahkadsor, Armenia, 2-8 September 1971.

Ed. by B.N. Petrov and F. Csaki.


Kagan, A.M., Y.V. Linnik and C.R. Rao (1973)

Characterization problems in mathematical statistics. New York: Wiley.

Wiley Series in Probability and Mathematical Statistics.

Kailath, T. (1967)

The divergence and Bhattacharyya distance measures in signal selection. IEEE Trans. Commun. Technol., Vol.COM-15, p.52-60.

Kalata, P. and R. Priemer (1978)

On system identification with and without certainty. J. Cybern., Vol.8, No.1, p.31-50.

Kalata, P.R. (1974)

An information-theoretic approach to estimation in discrete-time systems. Ph.D. Thesis. Illinois Institute of Technology, Chicago.

Available from: University Microfilms, Ann Arbor, Mich., USA. Order No.74-23983.

Kalata, P. and R. Priemer (1979)

Linear prediction, filtering and smoothing: an information-theoretic approach.

Inf. Sci., Vol.17, p.1-14.

Kalata, P. and R. Priemer (1974)

On minimal error entropy stochastic approximation. Int. J. Syst. Sci., Vol.5, p.895-906.

Kaveh, M. (1978)

A modified Akaike information criterion.

In: Proc. 1978 IEEE Int. Conf. on Decision & Control.

Including the 17th Symp. on Adaptive Processes. San Diego, 10-12 Jan. 1979. New York: IEEE, 1979. p.949-950.

Kazakos, D. (1978)

On the maximization of divergence.

IEEE Trans. Inf. Theory, Vol.IT-24, p.509.

Kopilovich, L.Ye. (1964)

Distribution laws of an envelope of random processes with maximum entropy. Radio Eng. & Electron. Phys., Vol.9, p.268-272.

(Transl. of "Radiotekh. & Elektron.")

Korsh, J.F. (1970)

On decisions and information concerning an unknown parameter. Inf. & Control, Vol.16, p.123-127.

Kullback, S. (1952)

An application of information theory to multivariate analysis. Ann. Math. Stat., Vol.23, p.88-102.

Lindley, D.V. (1956)

On a measure of the information provided by an experiment. Ann. Math. Stat., Vol.27, p.986-1005.

Linfoot, E.H. (1957)

An informational measure of correlation. Inf. & Control, Vol.1, p.85-89.

Lisman, J.H.C. and M.C.A. van Zuylen (1972)

Note on the generation of most probable frequency distributions. Statistica Neerlandica, Vol.26, p.19-23.


Lissack, T. and King-Sun Fu (1976)

Error estimation in pattern recognition via Lα-distance between posterior density functions.

IEEE Trans. Inf. Theory, Vol.IT-22, p.34-45.

Ljung, L. and J. Rissanen (1976)

On canonical forms, parameter identifiability and the concept of complexity. In: Identification and System Parameter Estimation. Proc. 4th IFAC Symp.; Tbilisi, 21-27 September 1976.

Ed. by N.S. Rajbman.

Amsterdam: North-Holland, 1978. p.1415-1426.

Melnick, E.L. and S. Kullback (1973)

An application of the minimum discrimination information estimate to

compute log-likelihood ratios. J. Appl. Prob., Vol.10, p.469-474.

Minamide, N. (1979)

An information theory consideration of a design of test input signals. Int. J. Syst. Sci., Vol.10, p.927-942.

Nagashima, J. and D.G. Andrews (1980)

An application of information divergence to nuclear reactor noise analysis. Nucl. Technol., Vol.50, No.2, p.124-135.

Nahi, N.E. (1969)

Estimation theory and applications. New York: Wiley.

Niple, E. and J.H. Shaw (1980)

Information measures in nonlinear experimental design. J. Phys. E, Vol.13, p.506-511.

Noonan, J.P., N.S. Tzannes and T. Costello (1976)

On the inverse problem of entropy maximizations. IEEE Trans. Inf. Theory, Vol.IT-22, p.120-123.

Ohmatsu, S., A. Kikuchi and T. Soeda (1977)

State estimations and mutual information for a continuous-time linear system.

Trans. Inst. Electron. & Commun. Eng. Japan Sect. E, Vol.E60, p.388.

Okita, T., M. Ohta and S. Yamaguchi (1977)

A method of identification for nonlinear systems by use of Kullback's information quantity.

Trans. Inst. Electron. & Commun. Eng. Japan Sect. E, Vol.E60, p.511-512.

Ozaki, T. and H. Oda (1977)

Non-linear time series model identification by Akaike's information criterion.

In: Information and Systems. Proc. IFAC Workshop; Compiegne, France, 25-27 October 1977.

Ed. by B. Dubuisson.

Oxford: Pergamon, 1978. p.83-91. Papantoni-Kazakos, P. (1976)

A generalized evaluation criterion in parameter estimation. IEEE Trans. Inf. Theory, Vol.IT-22, p.599-603.

Perez, A. (1967)

Risk estimates in terms of generalized f-entropies.

In: Proc. Colloquium on Information Theory; Debrecen, 19-24 September 1967. Ed. by A. Renyi. Vol.2.


Pinsker, M.S. (1972)

The information content of observations, and asymptotically sufficient statistics.

Probl. Inf. Transm., Vol.8, p.33-46. (Transl. of "Probl. Peredachi Inf.")

Provencher, S.W. and R.H. Vogel (1980)

Information loss with transform methods in system identification: a new set of transforms with high information content.

Math. Biosci., Vol.50, p.251-262.

Rajbman, N.S. and V.M. Chadeev (1975)

Identification of industrial processes: The application of computers in research and production control.

Amsterdam: North-Holland, 1980.

(Russian ed.: Energiya, Moscow, 1975.)

Rajbman, N.S., M.I. Shpunt, F.A. Ovsepian and I.S. Durgarian (1971)

Informational measure of certainty and its use in the identification of plants.

J. Optimiz. Theory & Appl., Vol.8, p.212-227.

Rao, C.R. (1945)

Information and the accuracy attainable in the estimation of statistical parameters.

Bull. Calcutta Math. Soc., Vol.37, p.81-91.

Renyi, A. (1968)

On some problems of statistics from the point of view of information theory.

In: Proc. Colloquium on Information Theory; Debrecen, Hungary, 19-24 September 1967.

Ed. by A. Renyi. Vol.2.

Budapest: Janos Bolyai Mathematical Society. p.343-357.

Renyi, A. (1965)

On some basic problems of statistics from the point of view of information theory.

In: Proc. 5th Berkeley Symp. on Mathematical Statistics and Probability; Berkeley, 21 June - 18 July 1965 and 27 Dec. 1965 - 7 Jan. 1966.

Vol.1: Statistics.

Ed. by L.M. Le Cam and J. Neyman.

Berkeley-Los Angeles: University of California Press, 1967. p.531-543.

Rife, D.C., M. Goldstein and R.R. Boorstyn (1975)

A unification of Cramer-Rao type bounds.

IEEE Trans. Inf. Theory, Vol.IT-21, p.330-332.

Rissanen, J. (1976 a)

Minimax entropy estimation of models for vector processes. In: System Identification: Advances and case studies. Ed. by R.K. Mehra and D.G. Lainiotis.

New York: Academic Press.

Mathematics in Science and Engineering, Vol.126, p.97-119. Rissanen, J. (1976b)

Parameter estimation by shortest description of data.

In: Proc. 16th Joint Automatic Control Conf.; West Lafayette, 27-30 July 1976.


Rosenberg, V.Ya. and N.A. Rubichev (1966)

An inverse problem in information theory. Probl. Inf. Transm., Vol.2, No.2, p.63-64.

(Transl. of "Probl. Peredachi Inf. ")

Schwarz, G. (1978)

Estimating the dimension of a model. Ann. Stat., Vol.6, p.461-464.

Shibata, R. (1976)

Selection of the order of an autoregressive model by Akaike's information criterion.

Biometrika, Vol.63, p.117-126.

Smith, S.A. (1974)

A derivation of entropy and the maximum entropy criterion in the context

of decision problems.

IEEE Trans. Syst., Man & Cybern., Vol.SMC-4, p.157-163.

Solodow, A.W. (1972)

Theorie der Informationsübertragung in automatischen Systemen. Berlin: Akademie-Verlag.

Elektronisches Rechnen und Regeln, Sonderband 4.

Stone, M. (1977)

An asymptotic equivalence of choice of model by crossvalidation and

Akaike's criterion.

J. Royal Stat. Soc. Ser.B, Vol.39, p.44-47.

Tomita, Y., S. Ohmatsu and T. Soeda (1976 a)

An application of the information theory to filtering problems. Inf. Sciences, Vol.11, 1976, p.13-27.

Tomita, Y., S. Ohmatsu and T. Soeda (1976b)

An application of the information theory to estimation problems. Inf. & Control, Vol.32, 1976, p.101-111.

Tong, H. (1975)

Determination of the order of a Markov chain by Akaike's information criterion.

J. Appl. Probab., Vol.12, p.488-497.

Toussaint, G.T. (1976)

Probability of error and equivocation of order α.

In: Abstracts 1976 Int. Symp. on Information Theory; Ronneby, Sweden, 21-24 June 1976.

New York: IEEE. p.37.

Tse, E. (1976)

Identification and estimation of dynamic linear models with unknown

structure.

In: Proc. 16th Joint Automatic Control Conf.; West Lafayette,

27-30 July 1976.

New York: American Society of Mechanical Engineers, p.581-585.

Tse, E. (1973)

Information matrix and local identifiability of parameters.

In: Proc. 14th Joint Automatic Control Conf.; Columbus, 20-22 June 1973. New York: IEEE. p.611-619.


Tsuji, S. and S. Tanaka (1974)

Estimation of unknown function based on information theory. Electr. Eng. Japan, Vol.94, No.3, p.117-124.

(Transl. of "Denki Gakkai Ronbunshi", Vol.94-C, No.5, p.97-104.)

Tzannes, N.S. and T. Avergis (1978)

A new approach to the estimation of continuous spectra.

In: Current Topics in Cybernetics and Systems. Proc. 4th Int. Congress of Cybernetics & Systems; Amsterdam, 21-25 August 1978.

Ed. by J. Rose.

Berlin: Springer, p.351-352.

Vaganov, A.M. and G.G. Kosenko (1971)

A method of estimation of lower error bounds and entropy of recognition of two objects.

Radio Eng. & Electron. Phys., Vol.16, p.1749-1751. (Transl. of "Radiotekh. & Elektron.")

Vajda, I. (1967)

On the convergence of information contained in a sequence of observations. In: Proc. Colloquium on Information Theory; Debrecen, Hungary,

19-24 September 1967. Ed. by A. Renyi. Vol.2.

Budapest: Janos Bolyai Mathematical Society, 1968. p.489-501.

Vajda, I. (1968)

A contribution to the informational analysis of pattern.

In: Methodologies of Pattern Recognition. Proc. Int. Conf.; Honolulu, 24-26 January 1968.

Ed. by S. Watanabe.

New York: Academic Press, 1969. p.509-519.

Vajda, I. (1971)

χα-divergence and generalized Fisher information.

In: Trans. 6th Prague Conf. on Information Theory, Statistical Decision Functions, Random Processes; Prague, 19-25 September 1971.

Prague: Academia, 1973. p.873-886.

Van den Bos, A. (1971)

Alternative interpretation of maximum entropy spectral analysis. IEEE Trans. Inf. Theory, Vol.IT-17, p.493-494.

Van der Lubbe, J.C.A. (1975)

On the maximization of information for continuous distributions which satisfy some constraints (Dutch)(Thesis).

Information Theory Group, Department of Electrical Engineering, Delft University of Technology, Netherlands.

Van Meurs, J. and J.H.C. Lisman (1961)

Meest aannemelijke verdelingen.

Statistica Neerlandica, Vol.15, p.147-151.

Van Trees, H.L. (1968)

Detection, estimation, and modulation theory. New York: Wiley.

Part 1: Detection, estimation, and linear modulation theory. 1968. Part 2: Nonlinear modulation theory. 1971.

Part 3: Radar-Sonar signal processing and Gaussian signals in noise. 1971.

Vaurio, J.K. (1972)

Information and estimation in nuclear measurements. Nucl. Instrum. & Methods, Vol.99, p.373-378.


Vincze, I. (1967)

On the information-theoretical foundation of mathematical statistics. In: Proc. Colloquium on Information Theory; Debrecen, Hungary,

19-24 September 1967. Ed. by A. Renyi. Vol.2.

Budapest: Janos Bolyai Mathematical Society, 1968. p.503-509.

Watanabe, S. (1969)

Knowing and guessing; A quantitative study of inference and information. New York: Wiley.

Watanabe, S. (1960)

Information theoretical analysis of multivariate correlation. IBM J. Res. & Dev., Vol.4, p.66-82.

Weidemann, H.L. (1969)

Entropy analysis of feedback control systems. In: Advances in Control Systems. Vol.7.

Ed. by C.T. Leondes.

New York: Academic Press. p.225-255.

Weidemann, H.L. and E.B. Stear (1970)

Entropy analysis of estimating systems.

IEEE Trans. Inf. Theory, Vol.IT-16, p.264-270. Whittemore, B.J. and M.C. Yovits (1974)

The quantification and analysis of information used in decision processes. Inf. Sciences, Vol.7, p.171-184.

Wong, A.K.C. and M.A. Vogel (1977)

Resolution-dependent information measures for image analysis. IEEE Trans. Syst., Man & Cybern., Vol.SMC-7, p.49-61.

Wragg, A. and D.C. Dowson (1970)

Fitting continuous probability density functions over (0,∞) using information theory ideas.

IEEE Trans. Inf. Theory, Vol.IT-16, p.226-230.

Zaborszky, J. (1966)

An information theory viewpoint for the general identification problem. IEEE Trans. Autom. Control, Vol.AC-11, p.130-131.


Reports:

105) Videc, H.F.

STRALINGSVERSCHIJNSELEN IN PLASMA'S EN BEWEGENDE MEDIA: Een geometrisch-optische en een golfzonebenadering.

TH-Report 80-E-105. 1980. ISBN 90-6144-105-6

106) Hajdasinski, A.K.

LINEAR MULTIVARIABLE SYSTEMS: Preliminary problems in mathematical

description, modelling and identification.

TH-Report 80-E-106. 1980. ISBN 90-6144-106-4

107) Heuvel, W.M.C. van den

CURRENT CHOPPING IN SF6.

TH-Report 80-E-107. 1980. ISBN 90-6144-107-2

108) Etten, W.C. van and T.M. Lammers

TRANSMISSION OF FM-MODULATED AUDIOSIGNALS IN THE 87.5 - 108 MHz BROADCAST BAND OVER A FIBER OPTIC SYSTEM.

TH-Report 80-E-108. 1980. ISBN 90-6144-108-0

109) Krause, J.C.

SHORT-CURRENT LIMITERS: Literature survey 1973-1979. TH-Report 80-E-109. 1980. ISBN 90-6144-109-9

110) Matacz, J.S.

UNTERSUCHUNGEN AN GYRATORFILTERSCHALTUNGEN. TH-Report 80-E-110. 1980. ISBN 90-6144-110-2

111) Otten, R.H.J.M.

STRUCTURED LAYOUT DESIGN.

TH-Report 80-E-111. 1981. ISBN 90-6144-111-0

112) Worm, S.C.J.

OPTIMIZATION OF SOME APERTURE ANTENNA PERFORMANCE INDICES WITH AND WITHOUT PATTERN CONSTRAINTS.

TH-Report 80-E-112. 1980. ISBN 90-6144-112-9

113) Theeuwen, J.F.M. en J.A.G. Jess

EEN INTERACTIEF FUNCTIONEEL ONTWERPSYSTEEM VOOR ELEKTRONISCHE SCHAKELINGEN.

TH-Report 80-E-113. 1980. ISBN 90-6144-113-7

114) Lammers, T.M. en J.L. Manders

EEN DIGITAAL AUDIO-DISTRIBUTIESYSTEEM VOOR 31 STEREOKANALEN VIA GLASVEZEL.

TH-Report 80-E-114. 1980. ISBN 90-6144-114-5

115) Vinck, A.J., A.C.M. Oerlemans and I.G.J.A. Martens

TWO APPLICATIONS OF A CLASS OF CONVOLUTIONAL CODES WITH REDUCED DECODER COMPLEXITY.



EUT Reports are a continuation of TH-Reports.

116) Versnel, W.

THE CIRCULAR HALL PLATE: Approximation of the geometrical correction factor for small contacts.

TH-Report 81-E-116. 1981. ISBN 90-6144-116-1

117) Fabian, K.

DESIGN AND IMPLEMENTATION OF A CENTRAL INSTRUCTION PROCESSOR WITH A MULTIMASTER BUS INTERFACE.

TH-Report 81-E-117. 1981. ISBN 90-6144-117-X

118) Wang Yen Ping

ENCODING MOVING PICTURE BY USING ADAPTIVE STRAIGHT LINE APPROXIMATION. EUT Report 81-E-118. 1981. ISBN 90-6144-118-8

119) Heijnen, C.J.H., H.A. Jansen, J.F.G.J. Olijslagers and W. Versnel FABRICATION OF PLANAR SEMICONDUCTOR DIODES, AN EDUCATIONAL LABORATORY EXPERIMENT.

EUT Report 81-E-119. 1981. ISBN 90-6144-119-6.

120) Piecha, J.

DESCRIPTION AND IMPLEMENTATION OF A SINGLE BOARD COMPUTER FOR INDUSTRIAL CONTROL.

EUT Report 81-E-120. 1981. ISBN 90-6144-120-X

121) Plasman, J.L.C. and C.M.M. Timmers

DIRECT MEASUREMENT OF BLOOD PRESSURE BY LIQUID-FILLED CATHETER MANOMETER SYSTEMS.

EUT Report 81-E-121. 1981. ISBN 90-6144-121-8

122) Ponomarenko, M.F.

INFORMATION THEORY AND IDENTIFICATION.

EUT Report 81-E-122. 1981. ISBN 90-6144-122-6

123) Ponomarenko, M.F.

INFORMATION MEASURES AND THEIR APPLICATIONS TO IDENTIFICATION (a bibliography).
