Lectures on Jacques Herbrand as a Logician
Authors
Claus-Peter Wirth, Jörg Siekmann, Christoph Benzmüller, Serge Autexier
Title
Lectures on Jacques Herbrand as a Logician
In
SEKI Report SR-2009-01
Bibtex Entry
Copyright Owner
SEKI
Abstract
We give some lectures on Jacques Herbrand's work on formal logic, and sketch his life and his influence on automated theorem proving. The intended audience ranges from students interested in logic, through historians, to logicians. Besides the well-known correction of Herbrand's False Lemma by Gödel and Dreben, we also present the hardly known unpublished correction of Heijenoort and its consequences for Herbrand's Modus Ponens Elimination. Besides Herbrand's Fundamental Theorem and its relation to the Löwenheim-Skolem Theorem, we carefully investigate Herbrand's notion of intuitionism in connection with his notion of falsehood in an infinite domain. We sketch Herbrand's two proofs of the consistency of arithmetic and his notion of a recursive function, and, last but not least, present the correct original text of his unification algorithm together with a new translation.
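As a hedged illustration of the problem solved by the unification algorithm mentioned above, here is a modern Robinson-style syntactic unification sketch in Python. This is not a transcription of Herbrand's original 1930 text (which the paper presents); the term representation and function names are illustrative assumptions.

```python
# Minimal first-order syntactic unification (modern Robinson-style sketch).
# Variables are strings starting with '?'; compound terms and constants
# are tuples (function_symbol, arg1, arg2, ...).

def walk(t, subst):
    """Follow variable bindings until a non-bound term is reached."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occur-check: does variable v occur in term t under subst?"""
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, subst) for a in t[1:])
    return False

def bind(v, t, subst):
    """Extend the substitution with v -> t, refusing cyclic bindings."""
    if occurs(v, t, subst):
        return None
    s = dict(subst)
    s[v] = t
    return s

def unify(t1, t2, subst=None):
    """Return a most general unifier of t1 and t2, or None on failure."""
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if isinstance(t1, str) and t1.startswith('?'):
        return bind(t1, t2, subst)
    if isinstance(t2, str) and t2.startswith('?'):
        return bind(t2, t1, subst)
    if isinstance(t1, tuple) and isinstance(t2, tuple) \
            and t1[0] == t2[0] and len(t1) == len(t2):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None  # clash of function symbols or arities
```

For example, unifying f(?x, g(?y)) with f(a, g(?x)) binds both ?x and ?y to the constant a, while unifying ?x with f(?x) fails on the occur-check.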
Keywords
Jacques Herbrand, History of Logic, Herbrand's Fundamental Theorem, Modus Ponens Elimination, Löwenheim-Skolem-Theorem, Falsehood in an infinite domain, Consistency of Arithmetic, Recursive Functions, Unification Algorithm
Review
by Irving H. Anellis, Paolo Bussotti, and Lawrence C. Paulson
Full paper

Shallow Confluence of Conditional Term Rewriting Systems
Authors
Claus-Peter Wirth
Title
Shallow Confluence of Conditional Term Rewriting Systems
In
J. Symbolic Computation, 2009, 44:60--98
Bibtex Entry
Permanent URL
http://dx.doi.org/10.1016/j.jsc.2008.05.005
Copyright Owner
Elsevier
Abstract
Recursion can be conveniently modeled with left-linear positive/negative-conditional term rewriting systems, provided that non-termination, non-trivial critical overlaps, non-right-stability, non-normality, and extra variables are admitted. For such systems we present novel sufficient criteria for shallow confluence and arrive at the first decidable confluence criterion admitting non-trivial critical overlaps. To this end, we restrict the introduction of extra variables on right-hand sides to binding equations and require the critical pairs to have suitably complementary literals in their conditions. Shallow confluence implies [level] confluence, has applications in functional logic programming, and guarantees the object-level consistency of the underlying data types in the inductive theorem prover QuodLibet.

Jacques Herbrand: Life, Logic, and Automated Deduction
Authors
Claus-Peter Wirth, Jörg Siekmann, Christoph Benzmüller, Serge Autexier
Title
Jacques Herbrand: Life, Logic, and Automated Deduction
In
Dov M. Gabbay & John Woods (eds.). Handbook of the History of Logic,
Vol. 5: Logic from Russell to Gödel, North-Holland, 2009, ISBN 0444516204
Copyright Owner
Elsevier

A New Semantics for Hilbert's epsilon as an Operator of Indefinite Committed Choice
Authors
Claus-Peter Wirth
Title
A New Semantics for Hilbert's epsilon as an Operator of Indefinite Committed Choice
In
Journal Version:
Cite as
Journal of Applied Logic 6 (2008), pp. 287-317
Permanent URL
http://dx.doi.org/10.1016/j.jal.2007.07.009
Bibtex Entry
bib.bib
Long Version:
SEKI-Report SR-2006-02
Bibtex Entry
Short version:
11th TABLEAUX 2002, LNAI 2381, pp. 298-314, Springer.
Bibtex Entry.
Submitted version in .ps.gz; note that the editor made some changes in the printed version which the author cannot condone. At least the editor changed the small epsilon in the title to a capital epsilon, but maybe he garbled more than that. Give authors the control! (Vannevar Bush's WWW ideas would enable this.)
Copyright Owner
Journal Version:
Elsevier
Long Version:
SEKI
Short version:
Springer
Up-to-dateness
Yes!
Abstract
Paul Bernays and David Hilbert carefully avoided overspecification of Hilbert's epsilon-operator and axiomatized only what was relevant for their proof-theoretic investigations. Semantically, this left the epsilon-operator underspecified. In the meantime, there have been several suggestions for a semantics of the epsilon as a choice operator. We propose a further one with the following features: We avoid overspecification (such as functionality), but admit indefinite choice, committed choice, and classical logics. Moreover, our semantics for the epsilon supports proof search optimally and is natural in the sense that it not only mirrors some cases of the referential interpretation of indefinite articles in natural languages, but may also contribute to the philosophy of language.
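For orientation, the axiomatization referred to above centers on Hilbert's critical formula, which in standard notation reads as follows (this is the classical formulation, not the new semantics proposed in the paper):

```latex
% Hilbert's critical axiom (epsilon formula):
% if any term t satisfies A, then the epsilon term does as well.
A(t) \;\rightarrow\; A\bigl(\varepsilon x.\, A(x)\bigr)
```

Semantically, this only constrains $\varepsilon x.\, A(x)$ to denote some witness whenever one exists; which witness is chosen is left open, and this underspecification is the starting point for the paper's indefinite committed choice.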
Review
Complete summary of anonymous review of a short version for TABLEAU 2002
Long Version Full paper

Thomas S. Kuhn: The Structure of Scientific Revolutions --- Zweisprachige Auszüge mit Deutschem Kommentar
Authors
Claus-Peter Wirth
Title
Thomas S. Kuhn: The Structure of Scientific Revolutions --- Zweisprachige Auszüge mit Deutschem Kommentar
In
SEKI-Working-Paper SWP-2007-01
Bibtex Entry
Copyright Owner
SEKI
Zusammenfassung
Obwohl Kuhns Essay ``Die Struktur wissenschaftlicher Revolutionen'' vielleicht den wichtigsten Text der Wissenschaftstheorie darstellt, kennen die heutigen Wissenschaftler meist nur ein paar Schlagworte daraus, und die methodologische Erziehung von Wissenschaftlern geht kaum über den Kritischen Rationalismus Poppers hinaus. Die Ursache für diesen Missstand dürfte in der Länge und der Unstrukturiertheit von Kuhns Essay liegen, sowie in dessen gelegentlicher Schwerverständlichkeit. Wir legen hier eine Sammlung von Deutschen Kommentaren und zweisprachigen Originalzitaten aus Kuhns Essay in Englisch und Deutsch vor. Diese Zusammenstellung hat das Ziel, eine Essenz des Essays abzuliefern und dabei dem Mangel an leichter, direkter und effizienter Zugänglichkeit des Themas abzuhelfen.
Abstract
Although Kuhn's essay ``The Structure of Scientific Revolutions'' may well be the most important text in the theory of science, today's scientists still tend to know only some catchwords from it, and the methodological education of scientists hardly goes beyond Popper's Critical Rationalism. The reason for this poor state of affairs may be the length and lack of structure of Kuhn's essay, as well as its occasional difficulty. We present an arranged set of German comments and bilingual quotations from Kuhn's original essay in English and German. This compilation aims at delivering the essence of the essay, and thereby at remedying the lack of easy, straightforward, and efficient access to the subject.
Review
by Paolo Bussotti and Marvin Schiller
Full paper

A Self-Contained and Easily Accessible Discussion of the Method of Descente Infinie and Fermat's Only Explicitly Known Proof by Descente Infinie
Authors
Claus-Peter Wirth
Title
A Self-Contained and Easily Accessible Discussion of the Method of Descente Infinie and Fermat's Only Explicitly Known Proof by Descente Infinie
In
SEKI-Working-Paper SWP-2006-02
Bibtex Entry
Copyright Owner
SEKI
Abstract
We present the only proof of Pierre Fermat by descente infinie that is known to exist today. We discuss descente infinie from the mathematical, logical, historical, linguistic, and refined logic-historical points of view. We provide the required preliminaries from number theory and present a self-contained proof in a modern form, which nevertheless is intended to follow Fermat's ideas as interpreted from the cited Latin original. We then annotate an English translation of Fermat's original proof with terms from the modern proof. Although the paper consists of reviews, compilations, and simple number theory with mainly pedagogical intentions, its gestalt is original and it fills a gap regarding the easy accessibility of the subject.
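The method itself is easily illustrated by a standard textbook instance (not Fermat's right-triangle proof discussed in the paper): the irrationality of $\sqrt{2}$ by descente infinie.

```latex
% Infinite descent: assume a counterexample, construct a strictly
% smaller one; since the positive integers admit no infinite strictly
% descending sequence, no counterexample can exist.
\text{Suppose } \sqrt{2} = p/q \text{ with } p, q \in \mathbb{N},\ q > 0.
\text{ Then } p^2 = 2q^2, \text{ so } p \text{ is even, say } p = 2p'.
\text{ Then } q^2 = 2p'^2, \text{ so } q \text{ is even, say } q = 2q'.
\text{ Thus } \sqrt{2} = p'/q' \text{ with } 0 < q' < q,
\text{ yielding the impossible infinite descent } q > q' > q'' > \cdots
```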
Review
by Paolo Bussotti
Full paper
Related Talk
Ceterum censeo Descente Infinie esse disputandam
Further reading

Progress in Computer-Assisted Inductive Theorem Proving by Human-Orientedness and Descente Infinie?
Authors
Claus-Peter Wirth
Title
Progress in Computer-Assisted Inductive Theorem Proving by Human-Orientedness and Descente Infinie?
In
SEKI-Working-Paper SWP-2006-01
Bibtex Entry
Copyright Owner
SEKI
Abstract
In this short position paper we briefly review the development history of automated inductive theorem proving and computer-assisted mathematical induction. We think that the current low expectations on progress in this field result from a faulty narrow-scope historical projection. Our main motivation is to explain---on an abstract but hopefully sufficiently descriptive level---why we believe that future progress in the field is to result from human-orientedness and descente infinie.
Review
Dieter Hutter, Peter Padawitz, and Tobias Schmidt-Samoa
Full paper

Flexible Heuristic Control for Combining Automation and User-Interaction in Inductive Theorem Proving
Authors
Tobias Schmidt-Samoa
Title
Flexible Heuristic Control for Combining Automation and User-Interaction in Inductive Theorem Proving
In
Ph.D. Thesis, Univ. Kaiserslautern, 2006.
Bibtex Entry
Copyright Owner
Tobias Schmidt-Samoa
Abstract
The validity of formulas w.r.t. a specification over first-order logic with a semantics based on all models is semi-decidable. Therefore, we may implement a proof procedure which finds a proof for every valid formula fully automatically. But this semantics often lacks intuition: Some pathological models such as the trivial model may produce unexpected results w.r.t. validity. Instead, we may consider just a class of special models, for instance, the class of all data models. Proofs are then performed using induction. But inductive validity is not semi-decidable -- even for first-order logic. This theoretical drawback manifests itself in practical limitations: There are theorems that cannot be proved by induction directly; only generalizations of them can be proved. For their definition, we may have to extend the specification. Therefore, we cannot expect to prove interesting theorems fully automatically. Instead, we have to support user-interaction in a suitable way.
In this thesis, we aim at developing techniques that enhance automatic proof control of (inductive) theorem provers and that enable user-interaction in a suitable way. We integrate our new proof techniques into the inductive theorem prover QuodLibet and validate them with various case studies. Essentially, we introduce the following three proof techniques:
- We integrate a decision procedure for linear arithmetic into QuodLibet in a close way by defining new inference rules that perform the elementary steps of the decision procedure. This allows us to implement well-known heuristics for automatic proof control. Furthermore, we are able to provide special purpose tactics that support the manual speculation of lemmas if a proof attempt gets stuck. The integration improves the ability of the theorem prover to prove theorems automatically as well as its efficiency. Our approach is competitive with other approaches regarding efficiency; it provides advantages regarding the speculation of lemmas.
- The automatic proof control searches for a proof by applying inference rules. The search space is not only infinite, but grows dramatically with the depth of the search. In contrast to this, checking and analyzing performed proofs is very efficient. As the search space also has a high redundancy, it is reasonable to reuse subproofs found during proof search. We define new notions for the contribution of proof steps to a proof. These notions enable the derivation of pruned proofs and the identification of superfluous subformulas in theorems. A proof may be reused in two ways: upward propagation prunes a proof by eliminating superfluous proof steps; sideward reuse closes an open proof obligation by replaying an already found proof.
- For interactive theorem provers, it is essential not only to prove automatically as many lemmas as possible but also to restrict proof search in such a way that the proof process stops within a reasonable amount of time. We introduce different markings in the goals to be proved and the lemmas to be applied to restrict proof search in a flexible way: With a forbidden marking, we can simulate well-known approaches to applying conditional lemmas. A mandatory marking provides a new heuristic which is inspired by the local contribution of proof steps. With obligatory and generous markings, we can fine-tune the degree of efficiency and extent of proof search manually.
With an elaborate case study, we show the benefits of the different techniques, in particular the synergetic effect of their combination.
Full paper

Flexible heuristics for simplification with conditional lemmas by marking formulas as forbidden, mandatory, obligatory, and generous
Authors
Tobias Schmidt-Samoa
Title
Flexible Heuristics for Simplification with Conditional Lemmas by Marking Formulas as Forbidden, Mandatory, Obligatory, and Generous
In
Journal of Applied Non-Classical Logics 2006, 16(1-2): pp. 209-239
Bibtex Entry
Copyright Owner
Editions Hermes-Lavoisier
Abstract
Due to its practical importance, context-dependent simplification of goals with conditional lemmas has been studied for three decades, mostly under the label of ``contextual rewriting''. We present a flexible framework for controlling the recursive relief of conditions by marking formulas in goals and lemmas. Within this framework, by marking goal formulas as forbidden, we can simulate and improve the well-known approaches of contextual rewriting and case rewriting. Furthermore, we develop novel heuristics which may mark goal formulas as mandatory and lemma formulas as obligatory or generous. Our case studies in the field of rewrite-based inductive theorem proving are encouraging.
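As a hedged illustration of relieving conditions from the context, here is a minimal propositional sketch in Python. It models only the core idea of contextual rewriting; the marking machinery of the paper and QuodLibet's actual representation are not modeled, and all names are illustrative assumptions.

```python
# Minimal propositional sketch of contextual rewriting: a conditional
# lemma  conditions => (lhs -> rhs)  may rewrite the literal lhs inside
# a goal clause if every condition is "relieved" by the context, i.e.
# its negation already occurs among the other literals of the goal.
# (Illustrative only; not QuodLibet's framework with markings.)

def negate(lit):
    """Negate a propositional literal written as 'p' or '~p'."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def conditions_relieved(conditions, context):
    # A clause {L1, ..., Ln} stands for the disjunction L1 v ... v Ln,
    # so while proving one literal we may assume the negations of the
    # remaining literals as context.
    return all(negate(c) in context for c in conditions)

def contextual_rewrite(goal_clause, focus, lemmas):
    """Rewrite the literal `focus` of goal_clause with the first
    applicable conditional lemma (conditions, lhs, rhs)."""
    for conditions, lhs, rhs in lemmas:
        if focus == lhs and conditions_relieved(conditions, goal_clause - {focus}):
            return (goal_clause - {focus}) | {rhs}
    return goal_clause  # no lemma applicable
```

For example, the lemma "under condition p, q rewrites to r" applies to the goal clause {~p, q} (whose context assumption is p) but not to the bare goal {q}.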
Full paper
http://jancl.revuesonline.com/resnum.jsp?editionId=758

A Generic Modular Data Structure for Proof Attempts Alternating on Ideas and Granularity
Authors
Serge Autexier, Christoph Benzmüller, Dominik Dietrich, Andreas Meier, Claus-Peter Wirth
Title
A Generic Modular Data Structure for Proof Attempts Alternating on Ideas and Granularity
In
Mathematical Knowledge Management: 4th International Conference, MKM 2005, Bremen, Germany, July 15-17, 2005, Revised Selected Papers, LNAI 3863, pp. 126-142, Springer, 2006, ISBN 354031430X.
Bibtex Entry.
Copyright Owner
Springer
Abstract
A practically useful mathematical assistant system requires the sophisticated combination of interaction and automation. Central in such a system is the proof data structure, which has to maintain the current proof state and which has to allow the flexible interplay of various components including the human user. We describe a parameterized proof data structure for the management of proofs, which includes our experience with the development of two proof assistants. It supports and bridges the gap between abstract level proof explanation and low-level proof verification. The proof data structure enables, in particular, the flexible handling of lemmas, the maintenance of different proof alternatives, and the representation of different granularities of proof attempts.
Review
No review yet
Full paper

Mandatory versus Forbidden Literals in Simplification with Conditional Lemmas
Authors
Tobias Schmidt-Samoa
Title
Mandatory versus Forbidden Literals in Simplification with Conditional Lemmas
In
5th Int. Workshop on First-Order Theorem Proving (FTP 2005)
Bibtex Entry
Copyright Owner
Author
Abstract
Due to its practical importance, context-dependent simplification with conditional lemmas has been studied for three decades, mostly under the label ``contextual rewriting''. We develop novel heuristics restricting the relief of conditions with mandatory literals in the goals and obligatory literals in the lemmas. Our case studies in the field of rewrite-based inductive theorem proving are encouraging.
Review
By anonymous CADE referee (literal):
-------------------- review 106 --------------------
OVERALL RATING: 2 (accept)
CONFIDENCE: 3 (high)
----------------------- REVIEW --------------------
Summary: The author develops an approach for context-dependent simplification (of proof goals) with conditional lemmas. Novel heuristics for restricting the way conditions are verified, with "mandatory" literals in the goals and "obligatory" literals in the lemmas, are presented. Illustrating example are given, and a comparison with related approaches for this kind of contextual rewriting is sketched, based on case studies and an implementation in the author's inductive theorem prover QuodLibet. The experimental results seem to indicate that the new heuristics are comparable with ordinary contextual rewriting and, in some cases, are also clearly better. What is interesting in the paper, besides the quantitative aspects of the above comparison, is the abstract conceptual level used to define the heuristics, by introducing different types of literals in goals and lemmas. The progress that can be achieved with this approach doesn't seem to be dramatic. However, in view of the utmost important of context-dependent simplification (as fundamental operation) in automated deduction, the paper nevertheless represents solid and interesting work that deserves acceptance and publication.
Full paper

lim+, delta+, and Non-Permutability of beta-Steps
Authors
Claus-Peter Wirth
Title
lim+, delta+, and Non-Permutability of beta-Steps
In
SEKI-Report SR-2005-01
Bibtex Entry
Copyright Owner
SEKI
Abstract
Using a human-oriented formal example proof of the (lim+) theorem, i.e. that the sum of limits is the limit of the sum, which is of value for reference on its own, we exhibit a non-permutability of beta-steps and delta+-steps (according to Smullyan's classification), which is not visible with non-liberalized delta-rules and not serious with further liberalized delta-rules, such as the delta++-rule. Besides a careful presentation of the search for a proof of (lim+) with several pedagogical intentions, the main subject is to explain why the order of beta-steps plays such a practically important role in some calculi.
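For reference, the (lim+) theorem and the informal $\varepsilon/2$ argument behind it read as follows (stated here for sequences as one common formulation; the report's subject is the formal free-variable proof):

```latex
% The (lim+) theorem: the sum of limits is the limit of the sum.
\text{If } \lim_{n\to\infty} a_n = a \text{ and } \lim_{n\to\infty} b_n = b,
\text{ then } \lim_{n\to\infty} (a_n + b_n) = a + b.
% Proof sketch: given eps > 0, choose N_1 with |a_n - a| < eps/2 for
% n >= N_1, and N_2 with |b_n - b| < eps/2 for n >= N_2.  Then:
\text{For } n \ge \max(N_1, N_2):\quad
\bigl|(a_n + b_n) - (a + b)\bigr|
  \;\le\; |a_n - a| + |b_n - b|
  \;<\; \tfrac{\varepsilon}{2} + \tfrac{\varepsilon}{2}
  \;=\; \varepsilon.
```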
Review
Chad E. Brown
Full paper

An Even Closer Integration of Linear Arithmetic into Inductive Theorem Proving
Authors
Tobias Schmidt-Samoa
Title
An Even Closer Integration of Linear Arithmetic into Inductive Theorem Proving
In
Proc. 12th Symposium on the Integration of Symbolic Computation and Mechanized Reasoning (Calculemus 2005), Electronic Notes in Theoretical Computer Sci. 2006, 151, pp. 3-20, Elsevier
Copyright Owner
Elsevier
Abstract
To broaden the scope of decision procedures for linear arithmetic, they have to be integrated into theorem provers. Successful approaches e.g. in NQTHM or ACL2 suggest a close integration scheme which augments the decision procedures with lemmas about user-defined operators. We propose an even closer integration providing feedback about the state of the decision procedure in terms of entailed formulas for three reasons: First, to provide detailed proof objects for proof checking and archiving. Second, to analyze and improve the interaction between the decision procedure and the theorem prover. Third, to investigate whether the communication of the state of a failed proof attempt to the human user with the comprehensible standard GUI mechanisms of the theorem prover can enhance the speculation of auxiliary lemmas.
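As a hedged sketch of the kind of feedback meant here, the following toy Fourier-Motzkin elimination over rational inequalities returns the inequalities it entails while eliminating a variable, rather than a bare yes/no verdict. This is illustrative only; it is not QuodLibet's actual integration, whose decision procedure and inference rules are described in the paper.

```python
# Toy Fourier-Motzkin elimination for conjunctions of rational linear
# inequalities, exposing the entailed formulas (the combinations that
# eliminate a variable) instead of only a decision.
# An inequality  c0 + c1*x1 + ... <= 0  is a dict {None: c0, 'x1': c1, ...}.
from fractions import Fraction

def eliminate(var, inequalities):
    """Return the inequalities entailed after eliminating `var`."""
    pos = [q for q in inequalities if q.get(var, 0) > 0]
    neg = [q for q in inequalities if q.get(var, 0) < 0]
    rest = [q for q in inequalities if q.get(var, 0) == 0]
    derived = []
    for p in pos:
        for n in neg:
            # Scale by positive factors so the coefficients of `var`
            # cancel, then add; this preserves the <= direction.
            combo = {}
            for k in set(p) | set(n):
                c = Fraction(p.get(k, 0)) * -n[var] + Fraction(n.get(k, 0)) * p[var]
                if c != 0:
                    combo[k] = c
            derived.append(combo)
    return rest + derived

def unsatisfiable(inequalities, variables):
    """Decide a conjunction of inequalities by eliminating all variables."""
    for v in variables:
        inequalities = eliminate(v, inequalities)
    # Only constant constraints c0 <= 0 remain; c0 > 0 is a contradiction.
    return any(q.get(None, 0) > 0 for q in inequalities)
```

For example, eliminating x from {x <= 1, x >= 2} derives the contradictory constant inequality 1 <= 0, whereas {x <= 3, x >= 2} remains satisfiable.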
Full paper

The New Standard Tactics of the Inductive Theorem Prover QuodLibet
Authors
Tobias Schmidt-Samoa
Title
The New Standard Tactics of the Inductive Theorem Prover QuodLibet
In
SEKI-Report SR-2004-01
Bibtex Entry
Copyright Owner
SEKI
Abstract
QuodLibet is a tactic-based inductive theorem prover for the verification of algebraic specifications of algorithms in the style of abstract data types with positive/negative-conditional equations. Its core system consists of a small inference machine kernel that merely acts as a proof checker. Automation is achieved with tactics written in QML (QuodLibet-Meta-Language), an adapted imperative programming language. In this paper, we describe QuodLibet's new standard tactics, a pool of general purpose tactics provided with the core system that support the user in proving inductive theorems. We aim at clarifying the underlying ideas as well as explaining the parameters with which the user can influence the behavior of the tactics during the proof process. One of the major achievements of this paper is the application of conditional lemmas controlled by obligatory and mandatory literals. This has drastically improved the degree of automation without increasing the runtime significantly, as will be illustrated by the case studies. Nevertheless, the degree of automation depends on the specification style used. Thus, we will also give some guidelines on how to write specifications and how to use the new tactics efficiently.
Review
Reviewed by Claus-Peter Wirth
Full paper

History and Future of Implicit and Inductionless Induction: Beware the old jade and the zombie!
Authors
Claus-Peter Wirth
Title
History and Future of Implicit and Inductionless Induction: Beware the old jade and the zombie!
In
Mechanizing Mathematical Reasoning: Essays in Honor of Jörg H. Siekmann on the Occasion of His 60th Birthday.
LNAI 2605, pp. 192-203, Springer.
Bibtex Entry
Copyright Owner
Springer
Up-to-dateness
Yes! (Oct. 21, 2002)
Abstract
I recollect some memories on the history of implicit induction as it is relevant for future research on computer-assisted theorem proving, especially memories that significantly differ from the presentation in a recent handbook article on ``inductionless induction''. Moreover, the important references excluded there are provided here. To clear the fog a little, there is a short introduction to inductive theorem proving and a discussion of connotations of implicit induction like ``descente infinie'', ``inductionless induction'', ``proof by consistency'', implicit induction orderings (term orderings), and refutational completeness.
Review
Two complete Comments for authors on a short preliminary version
Full paper
Original
Version with minor updates

A Systematic Study of Groebner Basis Methods
Authors
Birgit Reinert
Title
A Systematic Study of Gröbner Basis Methods
In
Habilitation Thesis, ISBN 3936890587,
Informatik-Schriftenreihe, Fachbereich Informatik, Vol. 20,
Univ. Kaiserslautern, 2004.
Bibtex Entry
Copyright Owner
Fachbereich Informatik, Univ. Kaiserslautern, 2004
Abstract
(This monograph does not have any abstract. Please check the table of contents for more information!)
Full paper

On Gröbner Bases in Monoid and Group Rings
Authors
Birgit Reinert
Title
On Gröbner Bases in Monoid and Group Rings
In
PhD Thesis, Fachbereich Informatik, Univ. Kaiserslautern, 1996.
Bibtex Entry
Copyright Owner
None
Abstract
Developed by Bruno Buchberger for commutative polynomial rings, Gröbner bases are frequently applied to solve algorithmic problems. For instance, the congruence problem for ideals can be solved with the help of Gröbner bases. By now, these ideas have been carried over to various, in part non-commutative and non-Noetherian, algebras. Most of these approaches require an admissible ordering on terms. In this Ph.D. thesis, the concept of Gröbner bases is generalized to finitely generated monoid and group rings. Reduction methods are applied to represent monoid and group elements and to describe the right-ideal congruence in the corresponding monoid and group rings, respectively. In general, monoids and especially groups do not admit admissible orderings anymore. Thus, the definition of an appropriate reduction relation is confronted with the following problems: On the one hand, it is difficult to guarantee termination. On the other hand, reduction steps are no longer compatible with multiplication, so that the reduction relation does not necessarily generate a right-ideal congruence. In this thesis, several possibilities of defining reduction relations are presented and evaluated w.r.t. the described problems. The concept of saturation (i.e. the extension of a set of polynomials in such a way that the right-ideal congruence it generates can be captured by reduction) is applied to characterize Gröbner bases relative to the different reductions by s-polynomials. Using these concepts and special structural properties, it was possible to define special reduction relations for special classes of monoids (e.g., finite, commutative, or free) and for different classes of groups (e.g., finite, free, plain, context-free, or nilpotent) and to develop terminating algorithms for computing Gröbner bases w.r.t. these reduction relations. (English translation by Claus-Peter Wirth)

Gröbnerbasen, entwickelt von Bruno Buchberger für kommutative Polynomringe, finden häufig Anwendung bei der Lösung algorithmischer Probleme. Beispielsweise lässt sich das Kongruenzproblem für Ideale mit Hilfe der Gröbnerbasen lösen. Bis heute wurden diese Ideen auf verschiedene zum Teil nichtkommutative und nichtnoethersche Algebren übertragen. Die meisten dieser Ansätze setzen eine zulässige Ordnung auf den Termen voraus. In dieser Dissertation wird das Konzept der Gröbnerbasen für endlich erzeugte Monoid- und Gruppenringe verallgemeinert. Dabei werden Reduktionsmethoden sowohl zur Darstellung der Monoid- beziehungsweise Gruppenelemente, als auch zur Beschreibung der Rechtsidealkongruenz in den entsprechenden Monoid- beziehungsweise Gruppenringen benutzt. Da im allgemeinen Monoide und insbesondere Gruppen keine zulässigen Ordnungen mehr erlauben, treten bei der Definition einer geeigneten Reduktionsrelation wesentliche Probleme auf: Zum einen ist es schwierig, die Terminierung einer Reduktionsrelation zu garantieren, zum anderen sind Reduktionsschritte nicht mehr mit Multiplikationen verträglich und daher beschreiben Reduktionen nicht mehr unbedingt eine Rechtsidealkongruenz. In dieser Arbeit werden verschiedene Möglichkeiten Reduktionsrelationen zu definieren aufgezeigt und im Hinblick auf die beschriebenen Probleme untersucht. Dabei wird das Konzept der Saturierung, d.h. eine Polynommenge so zu erweitern, dass man die von ihr erzeugte Rechtsidealkongruenz durch Reduktion erfassen kann, benutzt, um Charakterisierungen von Gröbnerbasen bezüglich der verschiedenen Reduktionen durch s-Polynome zu geben. Mithilfe dieser Konzepte ist es gelungen für spezielle Klassen von Monoiden, wie z.B. endliche, kommutative oder freie, und verschiedene Klassen von Gruppen, wie z.B. 
endliche, freie, plain, kontext-freie oder nilpotente, unter Ausnutzung struktureller Eigenschaften spezielle Reduktionsrelationen zu definieren und terminierende Algorithmen zur Berechnung von Gröbnerbasen bezüglich dieser Reduktionsrelationen zu entwickeln.
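As a hedged illustration of the classical starting point that the thesis generalizes, here is a minimal Buchberger algorithm for commutative polynomial rings over the rationals with a lexicographic term order. This is an illustrative sketch of Buchberger's commutative construction only, not of the thesis's reduction relations for monoid and group rings; the representation is an assumption made for the example.

```python
# Minimal Buchberger algorithm (commutative case, lex order, Fraction
# coefficients).  A polynomial in n variables is a dict mapping
# exponent tuples to coefficients, e.g. x^2 - y  ~  {(2,0): 1, (0,1): -1}.
from fractions import Fraction

def leading(p):
    m = max(p)            # lex-largest exponent tuple
    return m, p[m]

def sub(p, q):
    """Polynomial subtraction p - q, dropping zero coefficients."""
    r = dict(p)
    for m, c in q.items():
        r[m] = r.get(m, Fraction(0)) - c
        if r[m] == 0:
            del r[m]
    return r

def mul_term(p, mono, coeff):
    """Multiply p by the term coeff * x^mono."""
    return {tuple(a + b for a, b in zip(m, mono)): c * coeff
            for m, c in p.items()}

def reduce_poly(p, basis):
    """Top-reduce p modulo basis until its leading monomial is irreducible."""
    changed = True
    while p and changed:
        changed = False
        lm, lc = leading(p)
        for g in basis:
            gm, gc = leading(g)
            if all(a >= b for a, b in zip(lm, gm)):   # gm divides lm
                quot = tuple(a - b for a, b in zip(lm, gm))
                p = sub(p, mul_term(g, quot, lc / gc))
                changed = True
                break
    return p

def s_polynomial(f, g):
    """The s-polynomial cancelling the leading terms of f and g."""
    fm, fc = leading(f)
    gm, gc = leading(g)
    lcm = tuple(max(a, b) for a, b in zip(fm, gm))
    return sub(mul_term(f, tuple(l - a for l, a in zip(lcm, fm)), Fraction(1) / fc),
               mul_term(g, tuple(l - a for l, a in zip(lcm, gm)), Fraction(1) / gc))

def buchberger(basis):
    """Complete basis to a Groebner basis by adding reduced s-polynomials."""
    basis = [dict(g) for g in basis]
    pairs = [(i, j) for i in range(len(basis)) for j in range(i)]
    while pairs:
        i, j = pairs.pop()
        r = reduce_poly(s_polynomial(basis[i], basis[j]), basis)
        if r:
            pairs += [(len(basis), k) for k in range(len(basis))]
            basis.append(r)
    return basis
```

For example, completing the ideal basis {x^2 - y, x*y - x} adds y^2 - y, after which every s-polynomial reduces to zero.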

Full paper

Descente Infinie + Deduction
Authors
Claus-Peter Wirth
Title
Descente Infinie + Deduction
In
Logic Journal of the IGPL, Vol. 12(1), pp. 1-96, January 2004
Bibtex Entry
Copyright
Oxford University Press
Up-to-dateness
Optimal. Accepted version, submitted September 12, 2003. Only superficial corrigenda.
There is also an older version.
Abstract
Inductive theorem proving in the form of descente infinie was known to the ancient Greeks and has been the standard induction method of working mathematicians since it was reinvented in the middle of the 17th century. We present an integration of descente infinie into state-of-the-art free-variable sequent and tableau calculi. It is well-suited for an efficient interplay of human interaction and automation, and combines raising, explicit representation of dependence between variables, the liberalized delta-rule, preservation of solutions, and unrestricted applicability of lemmas and induction hypotheses. The semantical requirements are satisfied for a variety of two-valued logics, such as clausal logic, classical first-order logic, and higher-order modal logic.
Full paper
Official version with journal cover. Beware: Prints out page 1 on page 6 ("uneven to even" error)!
  • Format
    pdf
    Size
    0.9 Mbytes
Official version with journal cover, but nicely printing page 1 as uneven.
  • Format
    .ps.gz
    Size
    0.7 Mbytes
Versions without journal cover and proper volume number, but with corrigenda incorporated
  • File in dvi.gz
    all.dvi.gz
    Size
    0.2 Mbytes
  • File in ps.gz
    all.ps.gz
    Size
    0.4 Mbytes
  • File in searchable pdf
    pdf.pdf
    Size
    0.7 Mbytes
  • LaTeX source
    all.tex
    Size
    0.8 Mbytes
  • Style for LaTeX source
    lj-igpl.sty
    Size
    0.0 Mbytes
  • PS for LaTeX source
    penta.ps
    Size
    0.0 Mbytes
Application for a DFG research grant; submitted June 30, 2004. Rejected January 4, 2005; the reviews show no sign of any knowledge of deduction, but only of ignorance of induction. I would like to publish the whole evaluation data as an example of the DFG's inability to evaluate non-trivial proposals.
Carefully typeset original
  • Format
    pdf
    Size
    0.5 Mbytes
    Format
    ps.gz
    Size
    0.4 Mbytes
    Format
    dvi.gz
    Size
    0.1 Mbytes
The fonts of this version differ from the original, some originally identical fonts are different in this version, but the whole file can be searched!
  • File
    pdf
    Size
    0.3 Mbytes
Related Talk
Ceterum censeo Descente Infinie esse disputandam
Further Reading
CP's online store

How to Prove Inductive Theorems? QuodLibet!
Authors
Jürgen Avenhaus, Ulrich Kühler, Tobias Schmidt-Samoa, Claus-Peter Wirth
Title
How to Prove Inductive Theorems? QuodLibet!
In
19th CADE 2003, LNAI 2741, pp. 328-333, Springer. Bibtex Entry
Copyright Owner
Springer
Up-to-dateness
Yes! April 9, 2003
Abstract
QuodLibet is a tactic-based inductive theorem proving system that meets today's standard requirements for theorem provers such as a command interpreter, a sophisticated graphical user interface, and a carefully programmed inference machine kernel that guarantees soundness. In essence, it is the synergetic combination of the features presented in the following sections that makes QuodLibet a system quite useful in practice; and we hope that it is actually as you like it, which is the Latin ``quod libet'' translated into English. We start by presenting some of the design goals that have guided the development of QuodLibet. Note that the system is not intended to pursue push-button technology for inductive theorem proving, but to manage more complicated proofs by an effective interplay between interaction and automation.
Photos
Tobias Schmidt-Samoa and Claus-Peter Wirth
Full paper
Slides of Talk CADE, Miami, 2003
Slides of a Presentation in 2008

Proof Development with Omega
Authors
Jörg Siekmann, Christoph Benzmüller, Vladimir Brezhnev, Lassaad Cheikhrouhou, Armin Fiedler, Andreas Franke, Helmut Horacek, Michaël Kohlhase, Andreas Meier, Erica Melis, Markus Moschner, Immanuël Normann, Martin Pollet, Volker Sorge, Carsten Ullrich, Claus-Peter Wirth, Jürgen Zimmer
Title
Proof Development with Omega
In
19th CADE 2003, LNAI 2741, pp. 328-333, Springer.
Bibtex Entry
Copyright Owner
Springer
Full paper

A Tactic-Based Inductive Theorem Prover for Data Types with Partial Operations
Authors
Ulrich Kühler
Title
A Tactic-Based Inductive Theorem Prover for Data Types with Partial Operations
In
PhD thesis, ISBN 3-89838-238-9, Infix, Akademische Verlagsgesellschaft Aka GmbH, Berlin
Copyright Owner
Akademische Verlagsgesellschaft Aka GmbH, Berlin
Abstract
Data types such as the natural numbers, lists, strings, trees, graphs etc. are essential for the design and implementation of most software systems. A given data type $D$ can often be adequately formalized using an equational specification $spec$ with inductive semantics (i.e. the carriers of the models associated with $spec$ as the semantics of $spec$ are inductively defined). This entails that statements expressing valid properties of the operations of $D$ are represented by the inductive theorems of the specification $spec$. Therefore, inductive theorem proving constitutes the basis of a suitable formal method for reasoning about data types. The subject of this thesis is an inductive theorem prover (named QuodLibet) which we have developed to attain two essential goals. Firstly, our prover is applicable to data types with partial operations, although data types like these lead to specifications that need not be sufficiently complete or may induce non-terminating rewrite relations. Secondly, by offering a high degree of flexibility with regard to the supported forms of proof control, the tactic-based approach to the problem of proof control developed for QuodLibet yields a practically useful compromise between interactive proof control on the one hand and automated proof control on the other hand. In the first part of this thesis, we present a comprehensive and rewrite-based formal framework for inductive theorem proving which serves as the logical foundations of QuodLibet. The main components of this framework are (i) an algebraic specification language for the formalization of data types, (ii) a ``semantic'' induction order for guaranteeing applicability of induction hypotheses, (iii) a calculus for inductive proofs for formal reasoning about data types and (iv) a concept of a so-called proof state graph for the adequate representation of proof constructions. 
In particular, the specification language allows axioms to be conditional equations (or rewrite rules) with positive and negative conditions, and it has precisely defined admissibility conditions, which can be easily verified for most practically relevant specifications (as no proofs of termination are required). Moreover, by incorporating proof state graphs, our framework is capable of supporting the delayed or lazy computation of induction hypotheses, mutual induction for conjectures about mutually recursive operations as well as multiple complete or incomplete proof attempts for the same conjecture. The second part of this thesis deals with QuodLibet as a software system. We sketch the software architecture and describe the functionality of the system (which can be utilized through a command interpreter or a graphical user interface). The focus of the second part, however, is on the approach to the problem of proof control developed for QuodLibet. This approach is based on so-called (proof) tactics, i.e. proof control routines written in a special proof control language named QML. QuodLibet provides, as part of the approach, a set of tactics (in addition to the elementary inference rules), which range from tactics for trivial simplification steps to tactics representing comprehensive inductive proof procedures. Moreover, the system allows new tactics (written by the user in QML) to be integrated into the system to enhance its performance. Finally, we supply experimental evidence for our claim that the desired (partial) automation of the proof control of QuodLibet has been achieved to a considerable extent.
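The tactic-based proof control described above can be illustrated with a miniature sketch. This is our own toy model in Python, not QuodLibet's actual QML language: the goal encoding, rule names, and combinator are invented for illustration only.

```python
# Toy model (our own, not QML): tactics are proof-control routines that
# map a proof state (goal) to a list of remaining subgoals; an empty
# list means the goal is closed. Combinators compose elementary rules
# into larger proof procedures.

def then(t1, t2):
    """Apply tactic t1, then apply t2 to every resulting subgoal."""
    def tac(goal):
        return [g2 for g1 in t1(goal) for g2 in t2(g1)]
    return tac

# Elementary "inference rules" on toy goals: (hypotheses, target).
def split_conj(goal):
    hyps, target = goal
    if isinstance(target, tuple) and target[0] == "and":
        return [(hyps, target[1]), (hyps, target[2])]
    return [goal]

def by_assumption(goal):
    hyps, target = goal
    return [] if target in hyps else [goal]  # [] = goal closed

goal = (["p", "q"], ("and", "p", "q"))
print(then(split_conj, by_assumption)(goal))  # [] : both subgoals closed
```

The point is only the compositional shape: elementary rules are functions from a goal to subgoals, and combinators such as `then` build comprehensive proof procedures from trivial steps.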


Authors
Claus-Peter Wirth
Title
Full First-Order Sequent and Tableau Calculi With Preservation of Solutions and the Liberalized delta-Rule but Without Skolemization
Abstract
We present a combination of raising, explicit variable dependency representation, the liberalized delta-rule, and preservation of solutions for first-order deductive theorem proving. Our main motivation is to provide the foundation for our work on inductive theorem proving, where the preservation of solutions is indispensable.
Available versions

An Algebraic Dexter-Based Hypertext Reference Model
Title
An Algebraic Dexter-Based Hypertext Reference Model
Authors
Volker Mattick, Claus-Peter Wirth
Up-to-dateness
Addenda
Abstract
We present the first formal algebraic specification of a hypertext reference model. It is based on the well-known Dexter Hypertext Reference Model and includes modifications reflecting the development of hypertext since the advent of the WWW. Our hypertext model was developed as a product model with the aim of automatically supporting the design process, and it is extended to a model of hypertext systems in order to describe the state transitions in this process. While the specification should be easy to read for non-experts in algebraic specification, it guarantees a unique understanding and enables a close connection to logic-based development and verification.
In
Research Report 719/1999 (green/grey series), Fachbereich Informatik, Universität Dortmund
Full paper

Full First-Order Free Variable Sequents and Tableaus in Implicit Induction
Authors
Claus-Peter Wirth
Title
Full First-Order Free Variable Sequents and Tableaus in Implicit Induction
In
8th TABLEAUX 1999, LNAI 1617, pp. 293-307, Springer.
Bibtex Entry
Copyright Owner
Springer
Up-to-dateness
There is a significantly improved and extended journal version!
Abstract
We show how to integrate implicit inductive theorem proving into free variable sequent and tableau calculi and compare the appropriateness of tableau calculi for this integration with that of sequent calculi.
Complete Referee Report
"Comments for authors" word-for-word with some comments of the author
Full paper

Positive/Negative-Conditional Equations: A Constructor-Based Framework for Specification and Inductive Theorem Proving
Authors
Claus-Peter Wirth
Title
Positive/Negative-Conditional Equations: A Constructor-Based Framework for Specification and Inductive Theorem Proving
In
PhD thesis, ISBN 3-86064-551-X, Verlag Dr. Kovac, Hamburg, 1997,
Bibtex Entry
Copyright Owner
Verlag Dr. Kovac
Abstract
Format
.ps.gz
Size
.02 Mbytes

Conditional Equational Specifications of Data Types with Partial Operations for Inductive Theorem Proving
Authors
Ulrich Kühler, Claus-Peter Wirth
Title
Conditional Equational Specifications of Data Types with Partial Operations for Inductive Theorem Proving
Bibtex Entry
In
Long Version:
SEKI-Report SR-1996-11
Full paper: Format: ps.gz.
Short version:
8th RTA 1997, LNCS 1232, pp. 38--52, Springer.
Full paper: Format: ps.gz.
Copyright Owner
Long Version:
SEKI
Short version:
Springer
Up-to-dateness
Yes! (Dec 31, 2007)
Abstract
We propose a specification language for the formalization of data types with partial or non-terminating operations as part of a rewrite-based logical framework for inductive theorem proving. The language requires constructors for designating data items and admits positive/negative conditional equations as axioms in specifications. The (total algebra) semantics for such specifications is based on so-called data models. We present admissibility conditions that guarantee the unique existence of a distinguished data model with properties similar to those of the initial model of a usual equational specification. Since admissibility of a specification requires confluence of the induced rewrite relation, we provide an effectively testable confluence criterion which does not presuppose termination.
Keywords
Algebraic specification, partial operations, conditional rewriting, confluence, inductive theorem proving
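As a hedged illustration of the kind of axioms such a specification language admits (our own toy example, not one from the paper), consider a list-deletion operation given by positive/negative-conditional equations, executed here naively in Python with constructor terms encoded as tuples:

```python
# Illustrative sketch only: constructors are 0/s(n) for naturals and
# nil/cons(h, t) for lists; "delete" is specified by one positive- and
# one negative-conditional equation and evaluated directly.

nil = ("nil",)
def cons(h, t): return ("cons", h, t)

def delete(x, l):
    """delete(x, l): remove every occurrence of x from list l.
    Axioms in positive/negative-conditional style:
      delete(x, nil)        = nil
      delete(x, cons(y, l)) = delete(x, l)           if x  = y
      delete(x, cons(y, l)) = cons(y, delete(x, l))  if x != y  (negative condition)
    """
    if l == nil:
        return nil
    _, y, rest = l
    if x == y:                            # positive condition x = y
        return delete(x, rest)
    return ("cons", y, delete(x, rest))   # negative condition x != y

print(delete(1, cons(1, cons(2, cons(1, nil)))))  # ('cons', 2, ('nil',))
```

The constructor discipline matters here: the (in)equality conditions are decided between constructor ground terms only, which is what makes the negative condition unproblematic.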


Authors
Bernhard Gramlich, Claus-Peter Wirth
Title
Confluence of Terminating Conditional Term Rewriting Systems Revisited
In
7th RTA 1996, LNCS 1103, pp. 245-259, Springer.
Abstract
We present a new and powerful criterion for confluence of terminating join conditional term rewriting systems. This criterion is based on a certain joinability property for shared parallel critical peaks and requires the systems neither to be decreasing nor left-linear nor normal, but only terminating.
Full paper
Format
.ps.gz
Size
.11 Mbytes
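The underlying joinability idea can be sketched on a toy ground system (our own example, much simpler than the paper's shared parallel critical peaks): in a terminating system, both sides of a critical peak can be normalized, and the peak is joinable when the two sides share a normal form.

```python
# Hedged sketch: joinability of a peak in a terminating ground rewrite
# system, checked by computing the sets of normal forms of both sides.
# The system below is a toy example, not one from the paper.

rules = {"a": {"b", "c"}, "b": {"d"}, "c": {"d"}}  # terminating, ground

def normal_forms(t):
    """All normal forms reachable from t (finite, since rules terminate)."""
    succs = rules.get(t, set())
    if not succs:
        return {t}                      # t is already a normal form
    return set().union(*(normal_forms(s) for s in succs))

# The peak b <- a -> c is joinable: both sides reach the normal form d.
print(normal_forms("b") & normal_forms("c"))  # {'d'}
```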


Authors
Claus-Peter Wirth
Title
Syntactic Confluence Criteria for Positive/Negative-Conditional Term Rewriting Systems
In
SEKI-Report SR-95-09, Univ. Kaiserslautern, 1995
Bibtex Entry
Copyright Owner
SEKI
Abstract
We study the combination of the following already known ideas for showing confluence of unconditional or conditional term rewriting systems into practically more useful confluence criteria for conditional systems: our syntactical separation into constructor and non-constructor symbols; Huet's introduction and Toyama's generalization of parallel closedness for non-noetherian unconditional systems; the use of shallow confluence for proving confluence of noetherian and non-noetherian conditional systems; the idea that certain kinds of limited confluence can be assumed for checking the fulfilledness or infeasibility of the conditions of conditional critical pairs; and the idea that (when termination is given) only prime superpositions have to be considered and certain normalization restrictions can be applied to the substitutions fulfilling the conditions of conditional critical pairs. Besides combining and improving already known methods, we present the following new ideas and results: we strengthen the criterion for overlay joinable noetherian systems; by using the expressiveness of our syntactical separation into constructor and non-constructor symbols, we are able to present criteria for level confluence that are not actually criteria for shallow confluence; and we are able to weaken the severe requirement of normality (stiffened with left-linearity) in the criteria for shallow confluence of noetherian and non-noetherian conditional systems to the easily satisfied requirement of quasi-normality. Finally, the whole paper may also serve as a practically useful overview of the syntactical means for showing confluence of conditional term rewriting systems.
Full paper
Original version of 1995
Format
.ps.gz
Size
.6 Mbytes
This site
../sr9509/original.ps.gz
Original site
Uni KL site
Slightly updated version
Format
.ps.gz
Size
.7 Mbytes
This site
../sr9509/all.ps.gz
Slightly updated version with some strange typesetting, but searchable!
Format
.pdf
Size
1.3 Mbytes
This site
../sr9509/pdf.pdf

Abstract Notions and Inference Systems for Proofs by Mathematical Induction
Authors
Claus-Peter Wirth, Klaus Becker
Title
Abstract Notions and Inference Systems for Proofs by Mathematical Induction
In
4th CTRS, LNCS 968, 1994, pp. 353-373, Springer.
Bibtex Entry
Abstract
Soundness of inference systems for inductive proofs is sometimes shown ad hoc and a posteriori, lacking modularization and interface notions. As a consequence, these soundness proofs tend to be clumsy, difficult to understand and maintain, and error-prone, with errors that are difficult to localize. Furthermore, common properties of the inference rules are often hidden, and comparison with similar systems is difficult. To overcome these problems, we propose to develop soundness proofs systematically by presenting an abstract frame inference system a priori, then designing each concrete inference rule locally as a sub-rule of some frame inference rule, and showing its soundness by a small local proof establishing this sub-rule relationship. We present a frame inference system and two approaches to showing its soundness, discuss an alternative, and briefly classify the literature. In an appendix we give an example and briefly discuss failure recognition and refutational completeness.
Full paper


Authors
Claus-Peter Wirth, Rüdiger Lunde
Title
Writing Positive/Negative-Conditional Equations Conveniently
In
SEKI Working Paper SWP-94-04, Univ. Kaiserslautern, 1994
Bib Entry
Abstract
We present a convenient notation for positive/negative-conditional equations. The idea is to merge rules specifying the same function by using case-, if-, match-, and let-expressions. Based on the presented macro-rule-construct, positive/negative-conditional equational specifications can be written on a higher level. A rewrite system translates the macro-rule-constructs into positive/negative-conditional equations.
Full paper
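A hedged sketch of the translation idea (the concrete syntax and names here are our own assumptions, not the paper's notation): a rule written with an if-expression is flattened into one positive-conditional and one negative-conditional equation.

```python
# Toy translator: one macro rule with an if-expression becomes two
# positive/negative-conditional equations, written (lhs, rhs, condition).

def flatten_if(lhs, cond, then_rhs, else_rhs):
    """Turn   lhs = if cond then then_rhs else else_rhs
    into two conditional equations."""
    return [
        (lhs, then_rhs, cond),            # positive condition
        (lhs, else_rhs, f"not({cond})"),  # negative condition
    ]

for l, r, c in flatten_if("max(x,y)", "ge(x,y)", "x", "y"):
    print(f"{l} = {r}  <=  {c}")
# max(x,y) = x  <=  ge(x,y)
# max(x,y) = y  <=  not(ge(x,y))
```

Case-, match-, and let-expressions would flatten analogously, each case contributing one equation whose condition records the path taken through the macro rule.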


Authors
Rüdiger Lunde, Claus-Peter Wirth
Title
ASF+ - eine ASF-ähnliche Spezifikationssprache (an ASF-like specification language)
In
SEKI Working Paper SWP-94-05, Univ. Kaiserslautern, 1994
Bib Entry
Abstract
German
Ohne auf wesentliche Aspekte der in [Bergstra&al.89] vorgestellten algebraischen Spezifikationssprache ASF zu verzichten, haben wir ASF um die folgenden Konzepte erweitert: Während in ASF einmal exportierte Namen bis zur Spitze der Modulhierarchie sichtbar bleiben müssen, ermöglicht ASF+ ein differenziertes Verdecken von Signaturnamen. Das fehlerhafte Vermischen unterschiedlicher Strukturen, welches in ASF beim Import verschiedener Aktualisierungen desselben parametrisierten Moduls auftritt, wird in ASF+ durch eine adäquatere Form der Parameterbindung vermieden. Das neue Namensraum-Konzept von ASF+ erlaubt es dem Spezifizierer, einerseits die Herkunft verdeckter Namen direkt zu identifizieren und andererseits beim Import eines Moduls auszudrücken, ob dieses Modul nur benutzt oder in seinen wesentlichen Eigenschaften verändert werden soll. Im ersten Fall kann er auf eine einzige global zur Verfügung stehende Version zugreifen; im zweiten Fall muss er eine Kopie des Moduls importieren. Schließlich erlaubt ASF+ semantische Bedingungen an Parameter und die Angabe von Beweiszielen.
English
Maintaining the main aspects of the algebraic specification language ASF as presented in [Bergstra&al.89], we have extended ASF with the following concepts: While names once exported in ASF must stay visible up to the top of the module hierarchy, ASF+ permits a more sophisticated hiding of signature names. The erroneous merging of distinct structures that occurs in ASF when importing different actualizations of the same parameterized module is avoided in ASF+ by a more adequate form of parameter binding. The new ``Namensraum'' (name-space) concept of ASF+ permits the specifier, on the one hand, to directly identify the origin of hidden names and, on the other, to decide whether an imported module is only to be accessed or whether an important property of it is to be modified. In the first case he can access one single globally provided version; in the second he has to import a copy of the module. Finally, ASF+ permits semantic conditions on parameters and the specification of tasks for a theorem prover.
Full paper

On Notions of Inductive Validity for First-Order Equational Clauses
Authors
Claus-Peter Wirth, Bernhard Gramlich
Title
On Notions of Inductive Validity for First-Order Equational Clauses
In
12th CADE 1994, LNAI 814, pp. 162-176, Springer.
Bibtex Entry
Abstract
We define and discuss various conceivable notions of inductive validity for first-order equational clauses. This is done within the framework of constructor-based positive/negative-conditional equational specifications, which permits treating negation and incomplete function definitions in an adequate and natural fashion. Moreover, we show that under some reasonable assumptions all these notions are monotonic w.r.t. consistent extension, in contrast to the case of inductive validity for initial semantics (of unconditional or positive conditional equations). From a practical point of view in particular, this monotonicity property is crucial, since it allows for an incremental construction process for complex specifications, in which consistent extensions cannot destroy the validity of already proved inductive properties. Finally, we show how various notions of inductive validity from the literature fit into, or are related to, our classification.
Full paper


Authors
Claus-Peter Wirth, Bernhard Gramlich
Title
A Constructor-Based Approach for Positive/Negative-Conditional Equational Specifications
In
J. Symbolic Computation (1994) 17, pp. 51-90, Academic Press
Bib Entry
Abstract
We study algebraic specifications given by finite sets R of positive/negative-conditional equations (i.e. universally quantified first-order implications with a single equation in the succedent and a conjunction of positive and negative (i.e. negated) equations in the antecedent). The class of models of such a specification R does not in general contain a minimum model in the sense that it can be mapped to any other model by some homomorphism. We present a constructor-based approach for assigning appropriate semantics to such specifications. We introduce two restrictions: firstly, for a condition to be fulfilled, we require the evaluation values of the terms of the negative equations to be in the constructor sub-universe, which contains the evaluation values of all constructor ground terms; secondly, we restrict the constructor equations to have ``Horn''-form and to be ``constructor-preserving''. A reduction relation for R is defined, which allows us to generalize the fundamental results for positive-conditional rewrite systems. This reduction relation is monotonic w.r.t. consistent extension of the specification, which is of practical importance as it allows for an incremental construction process of complex specifications without destroying reduction steps that were possible before. Under the assumption of confluence, the factor algebra of the term algebra modulo the congruence of the reduction relation is a minimal model which is (beyond that) the minimum of all models that do not identify more objects of the constructor sub-universe than necessary. We define several kinds of compatibility of R with a reduction ordering for achieving decidability of reducibility, and present several criteria for the confluence of our reduction relation.
Full paper
Format
.ps.gz
Size
.16 Mbytes