Research


Overall objectives

The VeriDis project team includes members of the MOSEL group at LORIA, the computer science laboratory in Nancy, and members of the research group Automation of Logic at Max-Planck-Institut für Informatik in Saarbrücken. It is headed by Stephan Merz and Christoph Weidenbach. VeriDis was created in 2010 as a local research group of Inria Nancy – Grand Est and has been an Inria project team since July 2012.

The objectives of VeriDis are to contribute to advances in verification techniques, including automated and interactive theorem proving, and to make them available for the development and analysis of concurrent and distributed algorithms and systems, based on mathematically precise and practically applicable development methods. The techniques that we develop are intended to assist designers of algorithms and systems in carrying out formally proved developments, where proofs of relevant properties, as well as bugs, can be found with a high degree of automation.

Within this context, we work on techniques for automated theorem proving for expressive languages based on first-order logic, with support for theories (fragments of arithmetic, set theory etc.) that are relevant for specifying algorithms and systems. Ideally, systems and their properties would be specified in high-level, expressive languages, errors in specifications would be discovered automatically, and finally, full verification could also be performed completely automatically. Due to the fundamental undecidability of the problem, this cannot be achieved in general. Nevertheless, we have observed important advances in automated deduction in recent years, to which we have contributed. These advances suggest that a substantially higher degree of automation can be achieved over what is available in today’s tools supporting deductive verification. Our techniques are developed within SMT (satisfiability modulo theories) solving and superposition reasoning, the two main frameworks of contemporary automated reasoning that have complementary strengths and weaknesses, and we are interested in making them converge when appropriate. Techniques developed within the symbolic computation domain, such as algorithms for quantifier elimination for appropriate theories, are also relevant, and we are working on integrating them into our portfolio of techniques. In order to handle expressive input languages, we are working on techniques that encompass tractable fragments of higher-order logic, for example for specifying inductive or co-inductive data types, for automating proofs by induction, or for handling collections defined through a characteristic predicate.

Since full automatic verification remains elusive, another line of our research targets interactive proof platforms. We intend these platforms to benefit from our work on automated deduction by incorporating powerful automated backends and thus raise the degree of automation beyond what current proof assistants can offer. Since most conjectures stated by users are initially wrong (due to type errors, omitted hypotheses or overlooked border cases), it is also important that proof assistants be able to detect and explain such errors rather than letting users waste considerable time in futile proof attempts. Moreover, increased automation must not come at the expense of trustworthiness: skeptical proof assistants expect to be given an explanation of the proof found by the backend prover that they can certify.

Our methodological and foundational research is accompanied by the development of efficient software tools, several of which go beyond pure research prototypes: they have been used by others, have been integrated in proof platforms developed by other groups, and participate in international competitions. We also validate our work on proof techniques by applying them to the formal development of algorithms and systems. We mainly target high-level descriptions of concurrent and distributed algorithms and systems. This class of algorithms is by now ubiquitous, ranging from multi- and many-core algorithms to large networks and cloud computing, and their formal verification is notoriously difficult. Targeting high levels of abstraction allows the designs of such systems to be verified before an actual implementation has been developed, contributing to reducing the costs of formal verification. The potential of distributed systems for increased resilience to component failures makes them attractive in many contexts, but also makes formal verification even more important and challenging. Our work in this area aims at identifying classes of algorithms and systems for which we can provide guidelines and identify patterns of formal development that make verification less an art and more an engineering discipline. We mainly target components of operating systems, distributed and cloud services, and networks of computers or mobile devices.

Beyond formal verification, we pursue applications of some of the symbolic techniques that we are developing in other domains. We have observed encouraging success in using techniques of symbolic computation for the qualitative analysis of biological and chemical networks described by systems of ordinary differential equations that were previously only accessible to large-scale simulation. Such networks include biological reaction networks as they occur in models for diseases such as diabetes or cancer. They furthermore include epidemic models such as variants and generalizations of SEIR models, which are typically used for Influenza A or Covid-19. This work is being pursued within a large-scale interdisciplinary collaboration. We aim for this work, grounded in verification, to have an impact on the sciences beyond engineering, which will in turn feed back into our core formal methods community.

Last activity report: 2020

Results

New results

Automated and Interactive Theorem Proving

Contributions to SMT Techniques

A satisfiability problem is often expressed in a combination of theories, and a natural approach consists in solving the problem by combining the satisfiability procedures available for the component theories. This is the purpose of the combination method introduced by Nelson and Oppen. However, in its initial presentation, the Nelson-Oppen combination method requires the theories to be signature-disjoint and stably infinite. The design of a generic combination method for non-disjoint unions of theories is difficult, but it is worth exploring simple non-disjoint combinations that appear frequently in verification. An example is the case of shared sets, where sets are represented by unary predicates. Another example is the case of bridging functions between data structures and a target theory (e.g., a fragment of arithmetic).
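As a simple illustration of a bridging function between a data structure and arithmetic, the following sketch (written against the Z3 Python API purely for convenience; the combination procedures discussed here are not tied to any particular solver) connects a list datatype to linear integer arithmetic through a hypothetical length function `ln`:

```python
# A minimal sketch of a bridging function between a datatype theory and
# arithmetic, using the Z3 Python API purely for illustration.
from z3 import (Const, Datatype, ForAll, Function, Int, IntSort, IntVal,
                Solver)

List = Datatype('List')
List.declare('nil')
List.declare('cons', ('head', IntSort()), ('tail', List))
List = List.create()

# Hypothetical bridging function `ln` mapping lists into arithmetic.
ln = Function('ln', List, IntSort())

h = Int('h')
t = Const('t', List)
axioms = [
    ln(List.nil) == 0,
    ForAll([h, t], ln(List.cons(h, t)) == 1 + ln(t)),
]

s = Solver()
s.add(axioms)
xs = List.cons(IntVal(1), List.cons(IntVal(2), List.nil))
s.add(ln(xs) != 2)   # a mixed constraint over datatypes and integers
print(s.check())     # expected: unsat
```

The two component theories have disjoint signatures; only the axioms for the bridging function `ln` connect them, which is exactly the situation that the combination methods discussed above address.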

In 2015, we defined a sound and complete combination procedure à la Nelson-Oppen for the theory of absolutely free data structures (including lists and trees) connected to another theory via bridging functions. This combination procedure was subsequently refined for standard interpretations. The resulting theory enjoys a politeness property that enables combinations with arbitrary decidable theories of elements. We also investigated other theories amenable to similar combinations: this class includes the theory of equality, the theory of absolutely free data structures, and all the theories in between.

In 2018 and 2019, we improved the framework and unified both results. This work was published in the Journal of Automated Reasoning in 2020.

The above works pave the way for combinations involving the theory of algebraic datatypes as defined in the SMT-LIB standard. Together with colleagues in Iowa and at Stanford, we published this work at IJCAR 2020, where it received a best paper award.

SMT solvers generally rely on various instantiation techniques for handling quantifiers. We built a unifying framework encompassing quantified formulas with equality and uninterpreted functions, in which the major instantiation techniques in SMT solving can be cast. It is based on the problem of E-ground (dis)unification, a variation of the classic rigid E-unification problem. We introduced a sound and complete calculus to solve this problem in practice: Congruence Closure with Free Variables (CCFV). Experimental evaluations of implementations of CCFV demonstrate notable improvements in the state-of-the-art solver CVC4 and make the solver veriT competitive with state-of-the-art solvers for several benchmark libraries, in particular those originating in verification problems.
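To make the congruence-closure core concrete, here is a minimal ground congruence closure over a union-find structure, written in Python as an illustration only; CCFV extends this ground procedure with free variables, which the sketch does not attempt:

```python
# A minimal ground congruence closure over a union-find structure.
def subterms(t, acc):
    """Collect t and all of its subterms; tuples ('f', t1, ..., tn) are
    function applications, strings are constants."""
    acc.add(t)
    if isinstance(t, tuple):
        for arg in t[1:]:
            subterms(arg, acc)
    return acc

def entailed(eqs, goal):
    """Decide whether the ground equations `eqs` entail the equation `goal`."""
    terms = set()
    for l, r in list(eqs) + [goal]:
        subterms(l, terms)
        subterms(r, terms)
    parent = {t: t for t in terms}          # union-find over all subterms

    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]   # path compression
            t = parent[t]
        return t

    for l, r in eqs:
        parent[find(l)] = find(r)
    changed = True
    while changed:                          # propagate congruence to fixpoint
        changed = False
        for s in terms:
            for t in terms:
                if (isinstance(s, tuple) and isinstance(t, tuple)
                        and s[0] == t[0] and len(s) == len(t)
                        and find(s) != find(t)
                        and all(find(a) == find(b)
                                for a, b in zip(s[1:], t[1:]))):
                    parent[find(s)] = find(t)
                    changed = True
    return find(goal[0]) == find(goal[1])

f = lambda u: ('f', u)
# f(f(a)) = a and f(f(f(a))) = a together entail f(a) = a:
print(entailed([(f(f('a')), 'a'), (f(f(f('a'))), 'a')], (f('a'), 'a')))  # True
```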

In 2019 and 2020, we investigated machine learning techniques for predicting the usefulness of an instance in order to decrease the number of instances passed to the SMT solver. For this, we proposed a meaningful way to characterize the state of an SMT solver, to collect instantiation learning data, and to integrate a predictor in the core of a state-of-the-art SMT solver. This ultimately leads to more efficient SMT solving for quantified problems.

SMT solvers have throughout the years been able to cope with increasingly expressive formulas, from ground logics to full first-order logic (FOL). In contrast, the extension of SMT solvers to higher-order logic (HOL) was mostly unexplored. We proposed a pragmatic extension for SMT solvers to support HOL reasoning natively without compromising performance on FOL reasoning, thus leveraging the extensive research and implementation efforts dedicated to efficient SMT solving. We showed how to generalize data structures and the ground decision procedure to support partial applications and extensionality, as well as how to reconcile quantifier instantiation techniques with higher-order variables. We also discussed a separate approach for redesigning an SMT solver for higher-order logic from the ground up via new data structures and algorithms. We applied our pragmatic extension to the CVC4 SMT solver and discussed a redesign of the veriT SMT solver. Our evaluation showed that they are competitive with state-of-the-art HOL provers and often outperform the traditional encoding into FOL.
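The traditional encoding into FOL mentioned above can be illustrated by the applicative encoding, sketched below with the Z3 Python API for convenience; the sort `U`, the application symbol `app`, and the Skolem function `diff` are illustrative names, not part of the solvers discussed above:

```python
# A sketch of the applicative encoding of HOL into FOL.
from z3 import Const, DeclareSort, ForAll, Function, Implies, Solver

U = DeclareSort('U')             # one universe for individuals and functions
app = Function('app', U, U, U)   # explicit application: (f a) becomes app(f, a)

f, g, x = Const('f', U), Const('g', U), Const('x', U)

# Extensionality, Skolemized: diff(f, g) is a witness of disagreement.
diff = Function('diff', U, U, U)
ext = ForAll([f, g],
             Implies(app(f, diff(f, g)) == app(g, diff(f, g)), f == g))

s = Solver()
s.add(ext)
s.add(ForAll([x], app(f, x) == app(g, x)))  # f and g agree on every argument
s.add(f != g)
print(s.check())                            # expected: unsat
```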

This result was published at CADE 2019. Work in 2020 focused on extending the CCFV algorithm to higher-order logic: the first-order algorithm is not directly usable since it strongly relies on the fact that functions are fully applied and that no variable can appear in a function position. It was therefore necessary to find a radically different approach: we are working on an encoding of the higher-order CCFV problem into SAT.

We have previously developed a framework for processing formulas in automatic theorem provers, with generation of detailed proofs that can be checked by external tools, including skeptical proof assistants. The main components are a generic contextual recursion algorithm and an extensible set of inference rules. Clausification, skolemization, theory-specific simplifications, and expansion of `let' expressions are instances of this framework. With suitable data structures, proof generation adds only a linear-time overhead, and proofs can be checked in linear time. We implemented the approach in the SMT solver veriT. This allowed us to dramatically simplify the code base while increasing the number of problems for which detailed proofs can be produced. In 2019, the format of proof output was further improved, as was the reconstruction procedure in the proof assistant Isabelle/HOL. As a result, the tactic using SMT with proofs is regularly suggested by Sledgehammer as the fastest method to automatically solve proof goals. This was the subject of a workshop publication in 2019. In 2020, we made steady progress on this front; thanks to this progress, the veriT solver has been integrated into Isabelle with full support for the reconstruction of veriT proofs. This led to improvements in the Sledgehammer facility for automatically discharging Isabelle proof goals.

In a journal article, we consolidated our research of the past years on effective methods for the existential theory of Presburger arithmetic. We consider the feasibility of linear integer problems in the context of verification systems such as SMT solvers or theorem provers. Although satisfiability of linear integer problems is decidable, many state-of-the-art implementations neglect termination in favor of efficiency. We present the calculus CutSat++ that is sound, terminating, and complete, and that leaves enough space for model assumptions and simplification rules to be efficient in practice. CutSat++ combines model-driven reasoning and quantifier elimination to decide the feasibility of linear integer problems.
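The following toy queries (using the Z3 Python API purely for illustration; CutSat++ is an independent calculus, not shown here) show the kind of feasibility problem at stake, and why integer reasoning needs more than the rational relaxation:

```python
# Toy queries illustrating linear integer feasibility.
from z3 import Ints, Reals, Solver

x, y = Ints('x y')
s = Solver()
s.add(2*x - 2*y == 1)   # the left-hand side is always even: infeasible over Z
print(s.check())        # unsat

u, v = Reals('u v')
s = Solver()
s.add(2*u - 2*v == 1)   # the rational relaxation, however, is feasible
print(s.check())        # sat
```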

Certification of automated reasoning techniques

We are part of a group developing a framework for formal refutational completeness proofs of abstract provers that implement automated reasoning calculi, such as CDCL (Conflict Driven Clause Learning), ordered resolution, or superposition.

For CDCL, we have been able to derive the state-of-the-art algorithms from a simple, verified core, inheriting its properties via instantiation. We could then further refine this setting down to the generation of executable code that performs surprisingly well compared to hand-coded CDCL implementations.

The framework relies on modular extensions of lifted redundancy criteria that underlie the deletion of subsumed formulas. In presentations of proof calculi, this aspect is usually only discussed informally. Our framework allows us to extend redundancy criteria so that they cover subsumption, and also to model entire prover architectures in such a way that the static refutational completeness of a calculus immediately implies the dynamic refutational completeness of a prover implementing the calculus, for instance within an Otter or DISCOUNT loop. Our framework is mechanized in Isabelle/HOL. This research was presented at IJCAR 2020.

The Coq proof assistant is powerful enough to reproduce cyclic reasoning for first-order logic with inductive definitions (FOL_ID) in terms of cyclic proofs. We identify a class of Coq-certifiable cyclic proofs convertible to a set of Coq proofs relying on normalized well-founded explicit induction. These proofs start with a unique explicit induction step whose induction schema is derived from the definition of new admissible predicates. The admissibility property, as well as the rest of the proofs, can be deduced from the input proof. The conversion procedure does not backtrack, and no extra proof reconstruction techniques are needed. In practice, it has been used to certify FOL_ID proofs produced by Cyclist, including a proof for the 2-Hydra problem, and non-trivial cyclic SPIKE proofs of conjectures about conditional specifications.

Automated reasoning for specific logics

Abduction is the process of explaining new observations using background knowledge. It is central to knowledge discovery and knowledge processing and has been intensely studied in various domains such as artificial intelligence, philosophy and logic.

Signature-based abduction aims at building hypotheses over a specified set of names, the signature, that explain an observation relative to some background knowledge. This type of abduction is useful for tasks such as diagnosis, where the vocabulary used for observed symptoms differs from the vocabulary expected to explain those symptoms. In the description logic literature, abduction has received little attention, despite being recognised as important for ontology repair, query update and matchmaking.

S. Tourret, together with P. Koopmann, W. Del-Pinto and R. Schmidt, presented the first complete method solving signature-based abduction for observations expressed in the expressive description logic 𝒜ℒ𝒞. The method is guaranteed to compute a finite and complete set of hypotheses, and is evaluated on a set of realistic knowledge bases.

In joint work with P. Koopmann, we are currently investigating an alternative approach to abduction for description logics based on a translation to first-order logic and back. This work is motivated by the recent development of efficient tools for abductive reasoning in first-order logic.

In joint work with P. Koopmann, we define a notion of relevance of a clause for proving a particular entailment by the resolution calculus. We think that our notion of relevance is useful for explaining why an entailment holds. A clause is relevant if there is no proof of the entailment without it. It is semi-relevant if there is a proof of the entailment using it. It is irrelevant if it is not needed in any proof. By using well-known translations of description logics to first-order clause logic, we show that all three notions of relevance are decidable for a number of description logics, including ℰℒ and 𝒜ℒ𝒞. We provide effective tests for (semi-)relevance. The (semi-)relevance of a DL axiom is defined with respect to the (semi-)relevance of the respective clauses resulting from the translation.

This notion of semi-relevance is particularly interesting and can be detected using SOS-resolution derivations. We are currently working on a generalized proof of the SOS strategy for resolution to validate the theory behind the tests for semi-relevance in first-order logic.
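As a rough illustration of the relevance notion only (semi-relevance requires reasoning about the existence of individual proofs, e.g., via SOS resolution, and is not captured by this test), the following propositional sketch checks relevance by asking whether the entailment survives the removal of a clause; all names are hypothetical:

```python
# Relevance as a removal test: a clause is relevant iff the entailment
# is lost without it. Z3 Python API used purely for illustration.
from z3 import Bools, Implies, Not, Solver, sat

p, q, r = Bools('p q r')
clauses = {
    'c1': Implies(p, q),
    'c2': Implies(q, r),
    'c3': Implies(p, r),
    'c4': p,
}
goal = r   # the entailment to explain: clauses |= r

def relevant(name):
    s = Solver()
    s.add([c for n, c in clauses.items() if n != name])
    s.add(Not(goal))
    return s.check() == sat   # satisfiable => no proof without the clause

for name in clauses:
    print(name, relevant(name))
# c4 is relevant (every proof needs p); c1, c2, c3 are not relevant,
# since r remains derivable when any one of them is removed.
```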

Inductive Logic Programming (ILP) is a form of machine learning that induces hypotheses from examples and background knowledge. Many forms of ILP use second-order Horn clauses, also known as meta-rules, as templates for learning logic programs, and several of them rely on SLD resolution to produce new candidate solutions. Determining which meta-rules to use for a given learning task is a major open problem in ILP, and most approaches use clauses provided by the designers of the systems without any theoretical justification.

In joint work with A. Cropper, we formalized the derivation reduction problem for SLD resolution: the undecidable problem of finding a finite subset of a set of clauses from which the whole set can be derived using SLD resolution. We studied the reducibility of various fragments of second-order Horn logic that are relevant in ILP and extended our results to standard resolution. We also conducted an empirical study of the effects of using reduced sets of such meta-rules on the overall learning accuracy and time, which shows a substantial improvement over the state of the art, in addition to the theoretical guarantees offered.

We reconsider the encoding of proof obligations that arise in proofs about TLA+ specifications in multi-sorted first-order logic. In his PhD thesis, Antoine Defourné studies correctness criteria for assigning types to TLA+ expressions, based on embeddings between models for different logics. The objective is to delineate what type assignments are sound when translating from the untyped TLA+ language into the multi-sorted logics underlying typical automated reasoning engines.

He also implemented a new back-end reasoner based on Zipperposition for handling proof obligations that involve features of higher-order logic such as function or predicate variables that may appear in TLA+ proof sequents. The design of this back-end was presented at JFLA 2020, and a working prototype at the TLA+ Community Meeting.

During his undergraduate internship, Raphaël Le Bihan revisited the coalescing technique used in TLAPS for separating first-order and modal reasoning.

Formal Methods for Developing and Analyzing Algorithms and Systems

Contributions to Formal Methods of System Design

Refinement of a specification expressed at a high level of abstraction by a lower-level specification is a fundamental concept in formal system development. A key problem in proving refinement is to demonstrate that suitable values of internal variables of the high-level specification can be assigned to every possible execution of the low-level specification. The standard technique for doing so is to exhibit a refinement mapping where values for these variables are computed for each state, but it is also well known that this technique is not complete. In joint work with Leslie Lamport (Microsoft Research), we revisit the classic paper that introduced constructions for auxiliary variables in order to strengthen the refinement mapping technique. In particular, we introduce simpler rules for defining prophecy variables and demonstrate how they can be used for proving the correctness of an algorithm implementing a linearizable object. We also show that our constructions of auxiliary variables yield a complete proof method. An article based on this work has been submitted for publication to a journal.
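For intuition, the following toy sketch (a hypothetical example, not taken from the article) checks the basic refinement-mapping proof obligation by exhaustive state enumeration: a two-bit implementation refines a modulo-4 counter under the mapping x = 2*b1 + b0. Prophecy and other auxiliary variables become necessary precisely when no such direct mapping exists:

```python
# A toy check of a refinement mapping by state enumeration.
from itertools import product

def high_init(x):
    return x == 0

def high_next(x, x2):               # high-level spec: increment modulo 4
    return x2 == (x + 1) % 4

def low_init(s):
    return s == (0, 0)

def low_next(s, s2):                # low-level spec: binary increment
    b1, b0 = s
    return s2 == ((b1 + b0) % 2, (b0 + 1) % 2)

def mapping(s):                     # refinement mapping: low -> high state
    b1, b0 = s
    return 2 * b1 + b0

states = list(product((0, 1), repeat=2))
init_ok = all(high_init(mapping(s)) for s in states if low_init(s))
step_ok = all(high_next(mapping(s), mapping(s2))
              for s in states for s2 in states if low_next(s, s2))
print(init_ok and step_ok)          # True: the mapping witnesses refinement
```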

We present an approach for combining correct-by-construction development with transformations of formal models expressed in Event-B into executable programs written in DistAlgo, a domain-specific language embedded in Python. Our objective is to address the design of verified distributed programs. We define a subset LB (Local Event-B) of the Event-B modelling language restricted to events modelling typical actions of distributed programs, including internal or local computations as well as sending and receiving messages. We define transformations of the various elements of the LB language into DistAlgo programs. The general methodology consists in starting from a statement of the algorithmic problem and progressively producing an LB model through several refinement steps of the initial LB model. The derivation of the LB model has already been addressed in previous research. The transformation of LB models into DistAlgo programs is illustrated through a simple example. The refinement process and the soundness of the transformation allow one to produce correct-by-construction distributed programs.

PlusCal is a language for describing algorithms. It has the look and feel of pseudo-code, but it also has a formal semantics through a translation to TLA+ specifications. During her master's internship, Heba Al Kayed extended the PlusCal language and translator to make them more suitable for modeling distributed algorithms. As a first extension, parallel processes may have several code blocks that represent threads communicating through local variables. Second, communication channels are introduced as first-class entities, together with send, multicast, and receive operations. This work was presented at the TLA+ Community Meeting.

Interactive systems that allow users to interact with critical systems are qualified as critical interactive systems. Their design requires supporting the different activities and tasks through which users achieve their goals. Examples of such systems are cockpits, nuclear plant control panels, medical devices, etc. Such critical systems are very difficult to model due to the complexity of the offered interaction capabilities. In joint work with Ismaël Mendil, Neeraj Kumar Singh, Yamine Aït-Ameur, and Philippe Palanque (IRIT Toulouse), we present a formal framework, F3FLUID (Formal Framework For FLUID), for designing safety-critical interactive systems. It relies on FLUID as the core modelling language. FLUID enables modelling interactive systems using domain concepts and supports an incremental design of such systems. Formal verification, validation, and animation of the designed models are supported through transformations of FLUID models into different target formal verification techniques: Event-B for formal verification, the ProB model checker for animation, and Interactive Cooperative Objects (ICO) for user validation. The Event-B models are generated from FLUID, while the ICO and ProB models are produced from Event-B. We demonstrate our framework on the real-life case study of TCAS (Traffic alert and Collision Avoidance System).

Automated Reasoning Techniques for Verification

In joint work with Markus Kroetzsch and Christof Fetzer (Technical University of Dresden), we have introduced a logical fragment called SUPERLOG (Supervisor Logic) that is meant to provide a basis for formalizing abstract control algorithms found in ECUs (Electronic Control Units), together with fully automated verification and execution. Technically, the language is an extension of the first-order Bernays-Schoenfinkel fragment with arithmetic constraints. It extends the well-known SMT fragment by universally quantified variables. We have developed a sound and complete calculus for the SUPERLOG language. The calculus supports non-exhaustive propagation and can therefore serve as a role model for other calculi where exhaustive propagation cannot be afforded. Based on the decidability results obtained by Marco Voigt, we are working on fully automatic verification approaches for fragments of the SUPERLOG language. One line of research is to “hammer” verification conditions in SUPERLOG fragments to DATALOG for efficient solving. The other is to use abstractions guiding the search of the calculus, related to our abstraction refinement approach.
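The following sketch illustrates the general shape of such formulas, universally quantified variables over an uninterpreted sort combined with arithmetic constraints (Z3 Python API and all names chosen for illustration only; this is not the SUPERLOG toolchain):

```python
# A Bernays-Schoenfinkel-style rule extended with arithmetic constraints.
from z3 import (BoolSort, Const, DeclareSort, ForAll, Function, Implies,
                IntSort, Not, Solver)

Unit = DeclareSort('Unit')                  # domain of control units
temp = Function('temp', Unit, IntSort())    # arithmetic sensor value per unit
shutdown = Function('shutdown', Unit, BoolSort())

u = Const('u', Unit)
rule = ForAll([u], Implies(temp(u) > 90, shutdown(u)))  # supervisor rule

e = Const('e', Unit)
s = Solver()
s.add(rule, temp(e) == 120, Not(shutdown(e)))
print(s.check())   # expected: unsat, since the rule forces shutdown(e)
```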

Margaux Durœulx defended her PhD thesis, funded by the excellence program of the University of Lorraine and prepared in cooperation with Nicolae Brînzei (Centre de Recherche en Automatique de Nancy). The thesis studies the use of satisfiability techniques for assessing the reliability of complex systems, represented by static or dynamic fault trees that determine which combinations of component failures lead to system failures. Based on encodings of fault trees in propositional logic, a SAT solver can be used to compute minimal tie sets or sequences, which are instrumental for probabilistic reliability assessment.
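The following self-contained sketch (a hypothetical fault tree, with the Z3 Python API standing in as the SAT backend) shows the basic idea of enumerating minimal cut sets by iterating a SAT solver, shrinking each model to a minimal failure set and blocking its supersets:

```python
# Enumerating minimal cut sets of a toy fault tree with a SAT solver.
from z3 import (And, BoolVal, Bools, Not, Or, Solver, is_true, sat,
                simplify, substitute)

A, B, C = Bools('A B C')                 # component failure variables
comps = {'A': A, 'B': B, 'C': C}
top = Or(And(A, B), C)                   # top event: system fails if (A and B) or C

def fails(failed):
    """Evaluate the top event with exactly the components in `failed` failed."""
    subs = [(v, BoolVal(n in failed)) for n, v in comps.items()]
    return is_true(simplify(substitute(top, *subs)))

s = Solver()
s.add(top)
cutsets = []
while s.check() == sat:
    m = s.model()
    failed = {n for n, v in comps.items()
              if is_true(m.eval(v, model_completion=True))}
    for n in sorted(failed):             # shrink greedily to a minimal cut set
        if fails(failed - {n}):
            failed -= {n}
    cutsets.append(failed)
    s.add(Or([Not(comps[n]) for n in failed]))   # block all supersets
print(cutsets)                           # expected: [{'C'}, {'A', 'B'}] in some order
```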

Verification of Quantitative Systems or Properties

In 2020, we completed our work on a prototype tool for performing statistical model checking within the SimGrid framework. The goal was to give users the opportunity to take advantage of both verification and simulation possibilities in one single framework. To do so, we added to SimGrid the possibility to use stochastic profiles, introducing probabilities into the model of the network. The prototype tool can be interfaced with the SimGrid simulator to perform statistical model checking on the actual programs simulated using the SimGrid framework. The prototype was evaluated on examples such as the BitTorrent protocol, to which we added a probabilistic model of node failures. This work resulted in a publication at the SIMULTECH conference.
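At its core, statistical model checking is Monte Carlo estimation over probabilistic simulations. The following stand-alone sketch (a hypothetical failure model, unrelated to the SimGrid implementation) estimates the probability of a property together with a confidence interval:

```python
# Monte Carlo estimation of a probabilistic property (toy failure model).
import math
import random

def simulate(nodes=10, p_fail=0.02, rounds=50):
    """One run: each live node fails independently with probability
    p_fail per round (hypothetical model)."""
    alive = nodes
    for _ in range(rounds):
        alive -= sum(random.random() < p_fail for _ in range(alive))
    return alive > nodes // 2            # property: a majority survives

N = 10_000
hits = sum(simulate() for _ in range(N))
p = hits / N
half = 1.96 * math.sqrt(p * (1 - p) / N)   # approximate 95% confidence interval
print(f"P(property) ~ {p:.3f} +/- {half:.3f}")
```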

Hybrid systems are characterized by the interaction of continuous dynamics and discrete control. As hybrid systems become ubiquitous and ever more complex, analysis and synthesis techniques for designing safe hybrid systems are in high demand. This is challenging due to the nature of hybrid systems and their designs, and due to the question of how to formulate and reason about their safety problems. Previous work has demonstrated how to extend the discrete modeling language Event-B with support for continuous operators and how to integrate traditional refinement in hybrid system design. In the same spirit, we extend previous work by proposing a strategy that can coherently refine an abstract hybrid system design with safety constraints down to a concrete one with implementable discrete control that behaves safely. Our proposal is validated on the design of a smart heating system.
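As a toy illustration of the final refinement level, a discrete controller that must keep a continuous variable inside a safety envelope, consider the following simulation sketch (hypothetical constants, Euler integration; this is not the Event-B development itself):

```python
# A toy heating loop: continuous cooling/heating dynamics with a discrete
# hysteresis controller whose thresholds lie inside the safety envelope.
def simulate(t_end=100.0, dt=0.1):
    """Euler simulation of dT/dt = heat - 0.1*(T - 15) with switching
    thresholds 20 and 22 inside the safety envelope [19, 23]."""
    temp, heater_on, t, safe = 18.0, False, 0.0, True
    while t < t_end:
        if temp < 20.0:
            heater_on = True             # switch on below the lower threshold
        elif temp > 22.0:
            heater_on = False            # switch off above the upper threshold
        heat = 2.0 if heater_on else 0.0
        temp += dt * (heat - 0.1 * (temp - 15.0))
        t += dt
        if t > 5.0:                      # grace period to reach the envelope
            safe = safe and 19.0 <= temp <= 23.0
    return safe

print(simulate())                        # True with these constants
```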

Verification and Analysis of Dynamic Properties of Biological Systems

Our work from 2019 on toricity of steady-state ideals of biomodels has been accepted for journal publication and will appear during 2021. The approach there was to automatically recognize relevant geometric structure of steady-state varieties in K^n, where K stands for either the complex or the real numbers. For the complex numbers we used quite complicated algebraic techniques based on Gröbner basis theory. For the real numbers, in contrast, our approach was purely based on logic. Technically, we employed real quantifier elimination; SMT solving in QF_NRA is a possible alternative, which has not been studied systematically. This year we managed to treat also the case of the complex numbers on a purely logical basis. Based on arguments from algebraic model theory, this also gives insights into the interdependencies of the occurrences of relevant geometric structures over the complex numbers versus the real numbers.
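One real-closed-field criterion used in this line of work is closedness of the steady-state variety under coordinate-wise multiplication. The following sketch tests this for a toy ideal via SMT solving in QF_NRA (Z3 Python API for illustration; the actual computations used real quantifier elimination):

```python
# Testing multiplicative closedness of a toy variety V(x^2 - y) in QF_NRA.
from z3 import Reals, Solver

x1, y1, x2, y2 = Reals('x1 y1 x2 y2')
s = Solver()
s.add(x1**2 == y1, x2**2 == y2)      # two points on the variety
s.add((x1 * x2)**2 != y1 * y2)       # is their product off the variety?
print(s.check())  # expected: unsat, so the variety is closed under products
```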

Geometric toricity of a variety resembles the algebraic concept of binomiality of the corresponding polynomial ideal. Generalizing the well-studied binomiality concept of chemical reaction networks, we defined unconditional binomiality, investigated its properties, and gave a linear algebra approach for testing unconditional binomiality in the case of reversible reactions. A graph-theoretical version of the linear algebra approach has also been presented.
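For comparison, the classical binomiality notion admits a well-known Gröbner-basis test: an ideal is binomial if and only if a reduced Gröbner basis consists of polynomials with at most two terms (Eisenbud-Sturmfels). A small SymPy sketch follows (toy ideals, not taken from the cited work, which instead targets unconditional binomiality by linear algebra):

```python
# Classical Groebner-basis test for binomiality of a polynomial ideal.
from sympy import groebner, symbols

x, y, z = symbols('x y z')

def has_binomial_basis(polys, gens):
    """True iff a reduced Groebner basis consists of binomials/monomials."""
    G = groebner(polys, *gens, order='lex')
    return all(len(p.terms()) <= 2 for p in G.polys)

# The twisted cubic <x^2 - y, x^3 - z> is toric, hence binomial:
print(has_binomial_basis([x**2 - y, x**3 - z], (x, y, z)))    # True
# A non-binomial example:
print(has_binomial_basis([x**2 + y + z, x - y], (x, y, z)))   # False
```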

Joint work with Russell Bradford (Bath, UK), James Harold Davenport (Bath, UK), Matthew England (Coventry, UK), Hassan Errami (Bonn, Germany), Vladimir Gerdt (Dubna, Russia), Dima Grigoriev (Lille), Charles Hoyt (Bonn, Germany), Marek Košta (Bratislava, Slovak Republic), Ovidiu Radulescu (Montpellier), and Andreas Weber (Bonn, Germany)

We address, on the one hand, the simpler question whether or not there are unique steady states, without going into detail on the exact geometry. On the other hand, we do so in dependence on parametric reaction rates, so that the results are necessary and sufficient formal logical conditions in the Tarski algebra. Again, the underlying methods are of a logical nature, mostly real elimination methods such as virtual substitution, cylindrical algebraic decomposition, and real triangular sets.

Joint work with Niclas Kruff (Aachen, Germany), Christoph Lüders (Bonn, Germany), Ovidiu Radulescu (Montpellier), Sebastian Walcher (Aachen, Germany)

Our interdisciplinary work in computer science, mathematics, and systems biology is concerned with the reduction of a system of ordinary differential equations (in time) into several simpler subsystems, each corresponding to certain orders of magnitude of velocities, also called time scales, of the corresponding differential variables. To our knowledge this is the first mathematically rigorous approach for reaction networks that allows for multiple time scales. Previous work either did not give any formal guarantees on the obtained results, or was limited to only two different time scales. The computation is based on massive SMT solving over various theories, including QF_LRA for tropicalizations, QF_NRA for testing Hurwitz conditions on eigenvalues, and QF_LIA for finding sufficient differentiability conditions for hyperbolic attractivity of critical manifolds. Gröbner reduction techniques are used for final algebraic simplification.

As an example, consider a model related to the transmission dynamics of subtype H5N6 of the avian Influenza A virus in the Philippines in August 2017. That model is identified as BIOMD0000000716 in the BioModels database, a repository of mathematical models of biological processes. The model specifies four species: S_b (susceptible bird), I_b (infected bird), S_h (susceptible human), and I_h (infected human), whose concentrations over time we denote by the differential variables y1, …, y4, respectively. The input system S is given by

\[
\begin{aligned}
\dot{y}_1 &= -\tfrac{9137}{2635182}\, y_1 y_2 - \tfrac{1}{730}\, y_1 + \tfrac{412}{73},\\
\dot{y}_2 &= \tfrac{9137}{2635182}\, y_1 y_2 - \tfrac{4652377}{961841430}\, y_2,\\
\dot{y}_3 &= -\tfrac{1}{6159375000}\, y_2 y_3 - \tfrac{1}{25258}\, y_3 + \tfrac{40758549}{3650000},\\
\dot{y}_4 &= \tfrac{1}{6159375000}\, y_2 y_3 - \tfrac{112500173}{2841525000000}\, y_4.
\end{aligned}
\]

Our approach reduces this to three systems T1, T2, T3 along with corresponding attractive manifolds ℳ1, ℳ2, ℳ3:

\[
\begin{aligned}
T_1:\quad & \dot{y}_1 = 1 \cdot \Bigl(-\tfrac{9137}{2635182}\, y_1 y_2 + \tfrac{412}{73}\Bigr), \qquad \dot{y}_2 = \dot{y}_3 = \dot{y}_4 = 0,\\
\mathcal{M}_1:\quad & y_1 y_2 = \tfrac{1085694984}{667001};\\[1ex]
T_2:\quad & \dot{y}_2 = \tfrac{1}{125} \cdot \Bigl(-\tfrac{116309425}{192368286}\, y_2 + \tfrac{51500}{73}\Bigr), \qquad \dot{y}_3 = \dot{y}_4 = 0,\\
\mathcal{M}_2:\quad & y_1 = \tfrac{4652377}{3335005}, \quad y_2 = \tfrac{5428474920}{4652377};\\[1ex]
T_3:\quad & \dot{y}_3 = \tfrac{1}{15625} \cdot \Bigl(-\tfrac{15625}{25258}\, y_3 + \tfrac{203792745}{1168}\Bigr),\\
& \dot{y}_4 = \tfrac{1}{15625} \cdot \Bigl(\tfrac{15079097}{5094352815}\, y_3 - \tfrac{112500173}{181857600}\, y_4\Bigr),\\
\mathcal{M}_3:\quad & y_1 = \tfrac{4652377}{3335005}, \quad y_2 = \tfrac{5428474920}{4652377}, \quad y_3 = \tfrac{7051228977}{25000}, \quad y_4 = \tfrac{441466240042010928888}{327120760850763125}.
\end{aligned}
\]

Notice the explicit constant factors on the right-hand sides of the differential equations. We see that the system T2 is 125 times slower than T1, and T3 is another 125 times slower. The total computation time was about one second. Figure 1 visualizes the direction fields of T1, …, T3 on ℳ1, …, ℳ3, respectively.

Figure 1: Reduction of an epidemic model of avian Influenza A. (a) The surface is the critical manifold ℳ1 projected from ℝ^4 into real (y1, y2, y3)-space. The line located at (y1, y2) ≈ (1.4, 1166.8) is the critical submanifold ℳ2 ⊆ ℳ1. The dot located at (y1, y2, y3) ≈ (1.4, 1166.8, 282049.2) is the critical submanifold ℳ3 ⊆ ℳ2. Both ℳ1 and ℳ2 extend to ±∞ in both the y3 and the y4 direction, and ℳ3 is located near (1.4, 1166.8, 282049.2, 1349.6). (b) The direction field of T1 projected from ℝ^4 into real (y1, y2)-space. The curve is the critical manifold ℳ1. (c) The direction field of T2 on ℳ1 projected from ℝ^4 into real (y3, y2)-space. The line is the critical submanifold ℳ2 ⊆ ℳ1. (d) The direction field of T3 on ℳ2 projected from ℝ^4 into real (y3, y4)-space. The dot is the critical submanifold ℳ3 ⊆ ℳ2.

This multiple time scale reduction of the bird flu model emphasizes a cascade of successive relaxations of different model variables. First, the population of susceptible birds relaxes, meaning that these variables reach quasi-steady state values. This relaxation is illustrated in Fig. 1(b). Then the population of infected birds relaxes as shown in Fig. 1(c). Finally, the populations of susceptible and infected humans relax to a stable steady state as shown in Fig. 1(d), following a reduced dynamics described by T3.
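As a sanity check of the input system as reconstructed above (the signs were restored from the epidemiological structure of the model, so this is a hedged reconstruction), one can integrate S numerically and observe convergence towards the steady state on ℳ3:

```python
# Numerical integration of the reconstructed bird flu model (illustrative).
from scipy.integrate import solve_ivp

def rhs(t, y):
    y1, y2, y3, y4 = y
    return [
        -9137/2635182 * y1*y2 - y1/730 + 412/73,
        9137/2635182 * y1*y2 - 4652377/961841430 * y2,
        -y2*y3/6159375000 - y3/25258 + 40758549/3650000,
        y2*y3/6159375000 - 112500173/2841525000000 * y4,
    ]

# Hypothetical initial concentrations; LSODA handles the stiffness caused
# by the widely separated time scales.
sol = solve_ivp(rhs, (0.0, 1.0e6), [100.0, 1000.0, 1.0e5, 10.0],
                method='LSODA', rtol=1e-8, atol=1e-8)
print(sol.y[:, -1])  # should approach (~1.4, ~1166.8, ~282049.2, ~1349.6)
```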

Joint work with Werner Seiler and Matthias Seiß (Kassel, Germany)

Implicit differential equations, i.e., equations that are not solved for a derivative of highest order, appear in many applications. In particular, the so-called differential-algebraic equations may be considered a special case of implicit equations. Compared with equations in solved form, implicit equations are more complicated to analyze and exhibit a much wider range of phenomena. Already basic questions about the existence and uniqueness of solutions of an initial value problem become much more involved. One reason is the possible appearance of singularities.

We discuss the effective computation of geometric singularities of implicit ordinary differential equations over the real numbers using methods from logic. Via the Vessiot theory of differential equations, geometric singularities can be characterized as points where the behaviour of a certain linear system of equations changes. These points can be discovered using a specifically adapted parametric generalisation of Gaussian elimination combined with real quantifier elimination methods and other logic-based simplification techniques. We demonstrate the relevance and applicability of our approach with computational experiments using a prototypical implementation based on Reduce and Redlog.

A key novelty of our approach is to consider the decisive linear system determining the Vessiot spaces first, independently of the given differential system. This allows us to make maximal use of the linearity and to apply a wide range of heuristic optimizations. Compared with an earlier, more comprehensive approach, this also leads to increased flexibility, and we believe that the new approach will in general be more efficient in the sense that fewer cases are returned.
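The kind of case distinction that parametric elimination must track can be seen on a two-by-two toy system (a SymPy sketch for illustration; the actual implementation is based on Reduce and Redlog):

```python
# Parametric linear solving: the pivot structure changes with the parameter.
from sympy import Matrix, simplify, symbols

a = symbols('a')
M = Matrix([[a, 1], [1, a]])   # a*x + y = 1,  x + a*y = 1
b = Matrix([1, 1])

print(M.det())                 # a**2 - 1: elimination degenerates at a = 1, -1
print(simplify(M.LUsolve(b)))  # generic case: Matrix([[1/(a + 1)], [1/(a + 1)]])
# At a = 1 there are infinitely many solutions, at a = -1 none: exactly the
# case distinctions that a parametric Gaussian elimination must track.
```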
