TY - CONF
AB - Quantitative automata are nondeterministic finite automata with edge weights. They value a run by some function from the sequence of visited weights to the reals, and value a word by its minimal/maximal run. They generalize boolean automata, and have gained much attention in recent years. Unfortunately, important automaton classes, such as sum, discounted-sum, and limit-average automata, cannot be determinized. Yet, the quantitative setting provides the potential of approximate determinization. We define approximate determinization with respect to a distance function, and investigate this potential. We show that sum automata cannot be determinized approximately with respect to any distance function. However, restricting to nonnegative weights allows for approximate determinization with respect to some distance functions. Discounted-sum automata allow for approximate determinization, as the influence of a word's suffix is decaying. However, the naive approach of unfolding the automaton computations up to a sufficient level is shown to be doubly exponential in the discount factor. We provide an alternative construction that is singly exponential in the discount factor, in the precision, and in the number of states. We prove matching lower bounds, showing exponential dependency on each of these three parameters. Average and limit-average automata are shown to prohibit approximate determinization with respect to any distance function, and this is the case even for two weights, 0 and 1.
AU - Boker, Udi
AU - Henzinger, Thomas A
ID - 2891
T2 - Leibniz International Proceedings in Informatics
TI - Approximate determinization of quantitative automata
VL - 18
ER -
TY - CONF
AB - Systems are often specified using multiple requirements on their behavior. In practice, these requirements can be contradictory. The classical approach to specification, verification, and synthesis demands more detailed specifications that resolve any contradictions in the requirements. These detailed specifications are usually large, cumbersome, and hard to maintain or modify. In contrast, quantitative frameworks allow the formalization of the intuitive idea that what is desired is an implementation that comes "closest" to satisfying the mutually incompatible requirements, according to a measure of fit that can be defined by the requirements engineer. One flexible framework for quantifying how "well" an implementation satisfies a specification is offered by simulation distances that are parameterized by an error model. We introduce this framework, study its properties, and provide an algorithmic solution for the following quantitative synthesis question: given two (or more) behavioral requirements specified by possibly incompatible finite-state machines, and an error model, find the finite-state implementation that minimizes the maximal simulation distance to the given requirements. Furthermore, we generalize the framework to handle infinite alphabets (for example, real-valued domains). We also demonstrate how quantitative specifications based on simulation distances might lead to smaller and easier-to-modify specifications. Finally, we illustrate our approach using case studies on error-correcting codes and scheduler synthesis.
AU - Cerny, Pavol
AU - Gopi, Sivakanth
AU - Henzinger, Thomas A
AU - Radhakrishna, Arjun
AU - Totla, Nishant
ID - 2890
T2 - Proceedings of the tenth ACM international conference on Embedded software
TI - Synthesis from incompatible specifications
ER -
TY - CONF
AB - Formal verification aims to improve the quality of hardware and software by detecting errors before they do harm. At the basis of formal verification lies the logical notion of correctness, which purports to capture whether or not a circuit or program behaves as desired. We suggest that the boolean partition into correct and incorrect systems falls short of the practical need to assess the behavior of hardware and software in a more nuanced fashion against multiple criteria.
AU - Henzinger, Thomas A
ID - 2888
T2 - Conference proceedings MODELS 2012
TI - Quantitative reactive models
VL - 7590
ER -
TY - CONF
AB - The classical (boolean) notion of refinement for behavioral interfaces of system components is the alternating refinement preorder. In this paper, we define a quantitative measure for interfaces, called interface simulation distance. It makes the alternating refinement preorder quantitative by, intuitively, tolerating errors (while counting them) in the alternating simulation game. We show that the interface simulation distance satisfies the triangle inequality, that the distance between two interfaces does not increase under parallel composition with a third interface, and that the distance between two interfaces can be bounded from above and below by distances between abstractions of the two interfaces. We illustrate the framework, and the properties of the distances under composition of interfaces, with two case studies.
AU - Cerny, Pavol
AU - Chmelik, Martin
AU - Henzinger, Thomas A
AU - Radhakrishna, Arjun
ID - 2916
T2 - Electronic Proceedings in Theoretical Computer Science
TI - Interface Simulation Distances
VL - 96
ER -
TY - CONF
AB - The notion of delays arises naturally in many computational models, such as in the design of circuits, control systems, and dataflow languages. In this work, we introduce automata with delay blocks (ADBs), extending finite state automata with variable time delay blocks for deferring individual transition output symbols, in a discrete-time setting. We show that the ADB languages strictly subsume the regular languages, and are incomparable in expressive power to the context-free languages. We show that ADBs are closed under union, concatenation, and Kleene star, and under intersection with regular languages, but not closed under complementation and intersection with other ADB languages. We show that the emptiness and the membership problems are decidable in polynomial time for ADBs, whereas the universality problem is undecidable. Finally, we consider the linear-time model checking problem, i.e., whether the language of an ADB is contained in a regular language, and show that the model checking problem is PSPACE-complete.
AU - Chatterjee, Krishnendu
AU - Henzinger, Thomas A
AU - Prabhu, Vinayak
ID - 2936
T2 - Proceedings of the tenth ACM international conference on Embedded software
TI - Finite automata with time delay blocks
ER -
TY - CONF
AB - Interface theories provide a formal framework for component-based development of software and hardware which supports the incremental design of systems and the independent implementability of components. These capabilities are ensured through mathematical properties of the parallel composition operator and the refinement relation for components. More recently, a conjunction operation was added to interface theories in order to provide support for handling multiple viewpoints, requirements engineering, and component reuse. Unfortunately, the conjunction operator does not allow independent implementability in general. In this paper, we study conditions that need to be imposed on interface models in order to enforce independent implementability with respect to conjunction. We focus on multiple-viewpoint specifications and propose a new compatibility criterion between two interfaces, which we call orthogonality. We show that orthogonal interfaces can be refined separately, while preserving both orthogonality and composability with other interfaces. We illustrate the independent implementability of different viewpoints with a FIFO buffer example.
AU - Henzinger, Thomas A
AU - Nickovic, Dejan
ID - 2942
T2 - Conference proceedings Monterey Workshop 2012
TI - Independent implementability of viewpoints
VL - 7539
ER -
TY - CONF
AB - Continuous-time Markov chains (CTMCs), with their rich theory and efficient simulation algorithms, have been successfully used in modeling stochastic processes in diverse areas such as computer science, physics, and biology. However, systems that comprise non-instantaneous events cannot be accurately and efficiently modeled with CTMCs. In this paper we define delayed CTMCs, an extension of CTMCs that allows for the specification of a lower bound on the time interval between an event's initiation and its completion, and we propose an algorithm for the computation of their behavior. Our algorithm effectively decomposes the computation into two stages: a pure CTMC governs event initiations, while a deterministic process guarantees lower bounds on event completion times. Furthermore, from the nature of delayed CTMCs, we obtain a parallelized version of our algorithm. We use our formalism to model genetic regulatory circuits (biological systems where delayed events are common) and report on the results of our numerical algorithm as run on a cluster. We compare the performance and accuracy of our results with results obtained by using pure CTMCs.
AU - Guet, Calin C
AU - Gupta, Ashutosh
AU - Henzinger, Thomas A
AU - Mateescu, Maria
AU - Sezgin, Ali
ID - 3136
TI - Delayed continuous time Markov chains for genetic regulatory circuits
VL - 7358
ER -
TY - CONF
AB - Given a dense-time real-valued signal and a parameterized temporal logic formula with both magnitude and timing parameters, we compute the subset of the parameter space that renders the formula satisfied by the trace. We provide two preliminary implementations: one that follows the exact semantics and attempts to compute the validity domain by quantifier elimination in linear arithmetic, and one that conducts adaptive search in the parameter space.
AU - Asarin, Eugene
AU - Donzé, Alexandre
AU - Maler, Oded
AU - Nickovic, Dejan
ID - 3162
TI - Parametric identification of temporal properties
VL - 7186
ER -
TY - CONF
AB - We describe a framework for reasoning about programs with lists carrying integer numerical data. We use abstract domains to describe and manipulate complex constraints on configurations of these programs, mixing constraints on the shape of the heap, on the sizes of the lists, on the multisets of data stored in these lists, and on the data at their different positions. Moreover, we provide powerful techniques for automatic validation of Hoare triples and invariant checking, as well as for automatic synthesis of invariants and procedure summaries using modular inter-procedural analysis. The approach has been implemented in a tool called Celia and experimented successfully on a large benchmark of programs.
AU - Bouajjani, Ahmed
AU - Dragoi, Cezara
AU - Enea, Constantin
AU - Sighireanu, Mihaela
ID - 3253
TI - Abstract domains for automated reasoning about list manipulating programs with infinite data
VL - 7148
ER -
TY - JOUR
AB - The induction of a signaling pathway is characterized by transient complex formation and mutual posttranslational modification of proteins. To faithfully capture this combinatorial process in a mathematical model is an important challenge in systems biology. Exploiting the limited context on which most binding and modification events are conditioned, attempts have been made to reduce the combinatorial complexity by quotienting the reachable set of molecular species into species aggregates while preserving the deterministic semantics of the thermodynamic limit. Recently, we proposed a quotienting that also preserves the stochastic semantics and that is complete in the sense that the semantics of individual species can be recovered from the aggregate semantics. In this paper, we prove that this quotienting yields a sufficient condition for weak lumpability (that is to say, the quotient system is still Markovian for a given set of initial distributions) and that it gives rise to a backward Markov bisimulation between the original and aggregated transition system (which means that the conditional probability of being in a given state in the original system, knowing that we are in its equivalence class, is an invariant of the system). We illustrate the framework on a case study of the epidermal growth factor (EGF)/insulin receptor crosstalk.
AU - Feret, Jérôme
AU - Henzinger, Thomas A
AU - Koeppl, Heinz
AU - Petrov, Tatjana
ID - 3168
JF - Theoretical Computer Science
TI - Lumpability abstractions of rule based systems
VL - 431
ER -