Talk details
- Jason Baumgartner, Ambar Gadkari and Pradeep Kumar Nalla, IBM
Formal Verification at IBM: Applications and Algorithms
(Slides (PDF))
Abstract
Functional verification is a critical element in the
development of today's complex hardware designs.
Simulation is traditionally the predominant technique
for exposing design flaws. However, insufficient
coverage, i.e., the inability to explore all design
behaviors and expose all design flaws, limits the
practical utility of simulation. Formal verification
(FV) has therefore become an attractive alternative,
given its exhaustiveness and ability to uncover
corner-case bugs, and has flourished over the past
decade as an industrially essential technology that
complements the coverage achievable with simulation.
However, FV is itself limited in practical utility by
its computational complexity and scalability challenges.
This presentation describes the verification methodology
at IBM, covering the essence of a common execution model
which forms the basis for verification across various
hierarchies, technologies and disciplines. Next, we
overview the broad set of applications of formal
verification at IBM. We additionally detail IBM's
state-of-the-art FV tool "RuleBase SixthSense Edition",
used for both functional verification and sequential
equivalence checking. Technically, it is built upon a
unique transformation-based verification framework
encompassing numerous synergistic transformation and
abstraction algorithms, providing state-of-the-art
capacity and robustness. The tool is continually being
enhanced through worldwide internal and collaborative
research and development efforts. We conclude with a
summary of FV successes and open challenges.
About the speakers
Jason Baumgartner joined IBM in 1995 and quickly became
involved in applying formal verification tools to find
intricate design flaws. These experiences evolved into
the pioneering of numerous advanced verification
algorithms, which culminated in the SixthSense project
and his PhD from the University of Texas in 2002, and
which have been the subject of numerous patents and
publications. Today Jason is the Formal Verification
Technology Lead within IBM Systems and Technology Group,
and a central figure in the external formal verification
community, active in numerous conferences, university
collaborations, and the Semiconductor Research
Corporation.
Ambar Gadkari joined IBM India in 2011 as a Senior R&D
Manager. Prior to joining IBM, he worked with Texas
Instruments and General Motors R&D in the areas of
simulation and modeling and formal verification for
embedded systems, leading to several publications and
patents. He has a PhD from IIT Mumbai and an ME from the
University of Mumbai. Presently, Ambar manages a team in
Bangalore developing verification tools and methods,
with a strong focus on research collaborations and
product applications in IBM's hardware design flows.
Pradeep Kumar Nalla started his career with IBM India in
January 2012 as part of the EDA-India verification tools
team. He is involved in the development and support of
the formal verification tool SixthSense. Prior to
joining IBM, he was with Atrenta India, where he
developed and maintained a sequential equivalence
checking tool for power optimization. He received his
PhD from the University of Tuebingen, Germany, in 2008,
on the topic "Efficient distributed bounded property
checking".
- Supratik Chakraborty, IIT Bombay
Practical Quantifier Elimination for Linear Bit-vector
Inequalities
Abstract
Bit-precise quantifier elimination is an important
operation in the formal verification and analysis of
software and hardware systems that use fixed width words
in the underlying machine architecture. In this talk,
we present a layered algorithm for scalable, bit-precise
quantifier elimination from boolean combinations of
linear inequalities, equalities and disequalities over
fixed-width words. Our algorithm invokes cheaper,
relatively incomplete heuristics first, and falls back
to more expensive, complete techniques only when
needed. We present a substitution-based heuristic and a
counting-based heuristic that are extremely efficient
and can eliminate almost all quantifiers in our
extensive set of benchmarks. To handle cases that are
not amenable to these heuristics, we also present an
adaptation of the classical Fourier-Motzkin quantifier
elimination algorithm. While this is an expensive step,
our experiments indicate that this is invoked in a very
small fraction of cases. Our experiments clearly show
that a layered approach with carefully designed
heuristics lends both precision and scalability to
bit-precise quantifier elimination.
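The Fourier-Motzkin step can be illustrated over the rationals, where it is exact; the bit-vector setting addressed in the talk must additionally cope with fixed-width wrap-around, which this sketch ignores. A minimal version, using a hypothetical encoding of each inequality as a (coefficient-map, constant) pair meaning sum(c*v) <= constant:

```python
from fractions import Fraction

def fourier_motzkin(ineqs, var):
    """Eliminate `var` from a conjunction of linear inequalities.

    Each inequality is a pair (coeffs, const) meaning
    sum(coeffs[v] * v) <= const.  Returns an equivalent
    conjunction over the remaining variables.
    """
    lowers, uppers, rest = [], [], []
    for coeffs, const in ineqs:
        a = coeffs.get(var, 0)
        if a == 0:
            rest.append((coeffs, const))
            continue
        # Normalize so the coefficient of `var` is +1 (upper) or -1 (lower).
        norm = {v: Fraction(c, abs(a)) for v, c in coeffs.items() if v != var}
        bound = (norm, Fraction(const, abs(a)))
        (uppers if a > 0 else lowers).append(bound)
    # Pair every lower bound with every upper bound:
    # from  -var + l <= lb  and  var + u <= ub  derive  l + u <= lb + ub.
    for lo_c, lo_b in lowers:
        for up_c, up_b in uppers:
            coeffs = dict(up_c)
            for v, c in lo_c.items():
                coeffs[v] = coeffs.get(v, 0) + c
            rest.append((coeffs, lo_b + up_b))
    return rest

# x <= 5, x >= 2, x <= y, y <= 10: eliminating x leaves y >= 2 and y <= 10
# (plus the trivially true 0 <= 3 from pairing x >= 2 with x <= 5).
res = fourier_motzkin(
    [({'x': 1}, 5), ({'x': -1}, -2), ({'x': 1, 'y': -1}, 0), ({'y': 1}, 10)],
    'x')
assert ({'y': -1}, -2) in res and ({'y': 1}, 10) in res
```

The quadratic pairing of bounds is what makes this step expensive, which is why the layered algorithm reserves it for the small fraction of cases the cheap heuristics cannot handle.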
About the speaker
Supratik Chakraborty is currently a Professor of
Computer Science and Engineering at IIT Bombay. His
research interests include formal methods and their
applications to the analysis and verification of
hardware and software systems, with a special emphasis
on scalable and precise model checking techniques. He
is also affiliated with the Centre for Formal Design and
Verification of Software at IIT Bombay.
- Pallab Dasgupta, IIT Kharagpur
Formal verification in informal worlds
Abstract
Reactive systems interact with the outside world, and
therefore, formal methods for verifying a reactive
system must consider an appropriate abstraction of the
world in which it works. Quite often the model of the
world is based on informal specifications born of
incomplete knowledge, as is typically the case for
large component-based systems. It is also quite common
that the outside world is so large and complex that we
are forced to work with an informal summary of
it. This presentation explores the
consequences of performing formal verification with
informal models of the world and outlines some recent
trends. The use of classical AI techniques in
conjunction with formal methods helps in developing new
verification frameworks, which appear promising for a
wide variety of problems.
About the speaker
Pallab Dasgupta is a Professor of Computer Science and
Engineering at IIT Kharagpur. His research interests
include Formal Verification, Artificial Intelligence and
VLSI. He has over 150 research papers and two books in
these areas. He currently leads the
Verification and CAD group at the Dept of Computer
Science and Engineering, IIT Kharagpur which has
strong collaborations with several industries, including
Intel, Synopsys, General Motors, National Semiconductor
(now part of Texas Instruments), IBM, Freescale, SRC and
Google. He has been the administrative head of several
key laboratories at IIT Kharagpur, including the
Advanced VLSI Laboratory, the Synopsys CAD Laboratory,
and the Intel Embedded Innovation Laboratory.
- Manoj Dixit, MathWorks
Early time-budgeting in Distributed Embedded Control Systems
(Slides (PDF))
Abstract
The development of complex real-time distributed embedded
systems is challenging because the functional
requirements are coupled with timing requirements. While
it is well understood how to decompose system-level
untimed functional requirements into component- and
subsystem-level functional requirements, the same cannot
be said of timing requirements.
We propose an early-stage hierarchical time-budgeting
methodology to bridge this gap. The main emphasis of the
methodology is the systematic derivation of
component-level timing constraints from system-level
timing requirements. The component timing requirements
are modeled using parameters, and constraints are
synthesized over these parameters. The computed
constraints provide flexibility in selecting a parameter
valuation as the component's time-budget.
For specifying the requirements we use Parametric Linear
Temporal Logic (PLTL), which extends linear temporal
logic with parametric operators such as G≤y
and F≤x. One of the main steps in the
component time-budgeting involves checking the validity
of a PLTL formula. Due to the presence of parameters, the
validity checking problem reduces to a constraint
computation problem, such that any satisfying assignment
of the computed constraint makes the formula valid.
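As a made-up illustration of the idea (not an example from the talk), consider a system-level response requirement split across two components; checking validity of the implication from the component budgets to the system requirement reduces to an arithmetic constraint over the parameters:

```latex
% System-level requirement: every request is granted within x ticks.
\varphi_{\mathrm{sys}} \;=\; \mathbf{G}\big(\mathit{req} \rightarrow \mathbf{F}_{\le x}\,\mathit{grant}\big)

% Hypothetical component budgets: C1 forwards a request within x_1 ticks,
% C2 grants a forwarded request within x_2 ticks.
\varphi_{C_1} \;=\; \mathbf{G}\big(\mathit{req} \rightarrow \mathbf{F}_{\le x_1}\,\mathit{fwd}\big)
\qquad
\varphi_{C_2} \;=\; \mathbf{G}\big(\mathit{fwd} \rightarrow \mathbf{F}_{\le x_2}\,\mathit{grant}\big)

% Validity of (\varphi_{C_1} \wedge \varphi_{C_2}) \rightarrow \varphi_{\mathrm{sys}}
% holds for exactly those parameter valuations satisfying
x_1 + x_2 \;\le\; x
```

Any valuation satisfying the computed constraint, e.g. x_1 = x_2 = x/2, is then a legitimate time-budget for the components.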
About the speaker
Manoj is a Principal Software Engineer at MathWorks
India. Prior to that, he worked for about 14 years, as a
researcher at the India Science Lab, GM R&D, and as a
software engineer at Sasken Communication Technologies
Ltd. His work focuses on developing scalable tools and
methods for V&V of embedded systems. He obtained his
M.Tech and PhD from IIT Kharagpur and an M.Sc in
Mathematics from IIT Bombay.
- Deepak D'Souza, IISc
Verification of FreeRTOS
(Slides (PDF))
Abstract
We will describe our efforts towards a formally verified
version of FreeRTOS, a popular open-source real-time
operating system for embedded applications. The idea is
to first come up with a high-level specification of
Free-RTOS using the user guide and the code, and then to
show that the implementation is a refinement of the
abstract specification.
About the speaker
Deepak D'Souza received his PhD from Chennai Mathematical
Institute in 2000. Since 2003 he has been at the
Department of Computer Science and Automation of the
Indian Institute of Science, Bangalore, where he is
currently an Associate Professor. Among his areas of
interest are specification and analysis of real-time and
hybrid systems and program analysis for concurrency and
security.
- A Kirankumar, Intel
Symbolic Trajectory Evaluation: The Prime Validation
Vehicle for Intel's Next Generation Processor Graphics
FPU
(Slides (PDF))
Abstract
Formal Verification (FV) is widely acknowledged for
improving validation effectiveness. Usually, formal
verification has been used to supplement more traditional
coverage-oriented testing activities. Arithmetic
data-path FV, however, has matured over time to
completely replace traditional dynamic validation
methodologies, with the additional promise of 100%
data-space coverage. Symbolic Trajectory Evaluation
(STE) is the best-proven FV method for Intel® data-path
designs. Floating Point Units (FPUs) are generally
very data-path intensive. In the next generation Intel
Processor Graphics design, the FPU was completely
re-architected, and this necessitated a methodology that
could guarantee complete verification on a tight
schedule. STE was brought in to meet this formidable
target. This paper discusses the efficient application
of this methodology to achieve convincing results: more
than 201 bugs were caught in a very short verification
cycle using STE.
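STE rests on symbolic simulation over a ternary domain in which the value X abstracts both 0 and 1, so a single run can cover many concrete input valuations. A toy sketch of ternary gate evaluation (illustrative only; Intel's STE flow is far richer and fully symbolic):

```python
# Ternary values: 0, 1, and 'X' (unknown, abstracting both 0 and 1).
X = 'X'

def t_and(a, b):
    # If either input is definitely 0, the output is 0 even if the other is X.
    if a == 0 or b == 0:
        return 0
    if a == 1 and b == 1:
        return 1
    return X

def t_not(a):
    return X if a == X else 1 - a

# For a tiny circuit out = AND(a, b): driving a = 1 and leaving b unknown
# does not determine the output...
assert t_and(1, X) == X
# ...but driving a = 0 forces out = 0 with b still unknown, so one ternary
# run stands in for both the b = 0 and b = 1 simulations.
assert t_and(0, X) == 0
```

This abstraction is what lets STE verify wide data paths such as FPUs, where enumerating concrete input vectors is hopeless.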
About the speaker
M V A Kiran Kumar is currently a Staff Engineer at Intel
Technologies. He has been with Intel for the past 10
years. He leads the formal verification efforts at Intel
India, handling data-path, control-path, and formal
equivalence verification, and also runs an FV focus
group at Intel India. He graduated from IISc Bangalore.
At Intel, he has been with the server and graphics
processor design groups, functioning at various levels
as a designer and backend engineer, and has finally
settled in pre-silicon validation.
- N V Krishna, IIT Madras
Correctness Issues in Transforming Task Parallel Programs
(Slides (PDF))
Abstract
Task parallelism has increasingly become a trend, with
programming models such as OpenMP 3.0, Cilk, Java
Concurrency, X10, Chapel and Habanero-Java (HJ)
addressing the requirements of multicore programmers. While
task parallelism increases productivity by allowing the
programmer to express multiple levels of parallelism, it
can also lead to performance degradation due to increased
overheads. In this paper, we introduce a transformation
framework for optimizing task-parallel programs with a
focus on task creation and task termination
operations. These operations can appear explicitly in
constructs such as async, finish in X10 and HJ, task,
taskwait in OpenMP 3.0, and spawn, sync in Cilk, or
implicitly in composite code statements such as foreach
and ateach loops in X10, forall and foreach loops in HJ,
and parallel loop in OpenMP. Our framework includes a
definition of data dependence in task-parallel programs,
a happens-before analysis algorithm, and a range of
program transformations for optimizing task
parallelism. Further, we discuss the impact of exception
semantics on the specified transformations, and extend
them to handle task-parallel programs with precise
exception semantics.
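The happens-before analysis mentioned above can be pictured as reachability over a graph of program-order, spawn, and join edges; a toy sketch for an async/finish-style fragment (a simplified illustration, not the paper's algorithm):

```python
from collections import defaultdict

def happens_before(edges, a, b):
    """a happens before b iff b is reachable from a over
    program-order, spawn (async) and join (finish) edges."""
    graph = defaultdict(list)
    for u, v in edges:
        graph[u].append(v)
    stack, seen = [a], set()
    while stack:
        n = stack.pop()
        if n == b:
            return True
        if n in seen:
            continue
        seen.add(n)
        stack.extend(graph[n])
    return False

# s1; finish { async { t1 }; s2 }; s3
edges = [
    ('s1', 'async'),  # program order
    ('async', 't1'),  # spawn edge into the child task
    ('async', 's2'),  # parent continues past the async
    ('t1', 's3'),     # join edge: the finish waits for the child
    ('s2', 's3'),     # program order out of the finish block
]
assert happens_before(edges, 's1', 't1')
assert happens_before(edges, 't1', 's3')
# t1 and s2 are unordered: they may run in parallel, so a write in one
# and a conflicting access in the other is a cross-task data dependence,
# and a transformation must not assume any ordering between them.
assert not happens_before(edges, 't1', 's2')
assert not happens_before(edges, 's2', 't1')
```

A transformation such as eliminating a redundant task creation is safe only when it does not remove an ordering that some data dependence relies on, which is why the framework pairs the dependence definition with this kind of analysis.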
About the speaker
V. Krishna Nandivada is currently an Assistant Professor
in the Department of Computer Science and Engineering at
IIT Madras. Before joining IIT Madras, he spent nearly
5.5 years at IBM India Research Lab (Programming
Technologies and Software Engineering group). Prior to
joining IBM Research, he was associated with
Hewlett-Packard and Sun Labs at different times. He
holds a B.E. degree from Regional Engineering College
(now National Institute of Technology), Rourkela, an
M.E. degree from the Indian Institute of Science,
Bangalore, and a PhD from the University of California,
Los Angeles. His research interests are compilers,
program analysis, programming languages, fault
localization, and multicore systems.
- Daniel Kroening, Oxford
Reasoning about floating-point arithmetic with abstract conflict-driven clause learning
(Slides (PDF))
Abstract
I will present a novel technique that lifts technology
from propositional SAT solvers to reasoning in abstract
domains. IEEE floating-point arithmetic in the interval
domain will serve as an exemplar.
About the speaker
Daniel Kroening received the M.E. and doctoral degrees
in computer science from the University of Saarland,
Saarbruecken, Germany, in 1999 and 2001,
respectively. He joined the Model Checking group in the
Computer Science Department at Carnegie Mellon
University, Pittsburgh PA, USA, in 2001 as a Post-Doc.
He was an assistant professor at the Swiss Federal
Institute of Technology (ETH) in Zurich, Switzerland,
from 2004 to
2007. He is now a Professor at the Computer Science
Department at the University of Oxford. His research
interests include automated formal verification of
hardware and software systems, decision procedures,
embedded systems, and hardware/software co-design.
- Aditya Nori, MSR
Program Verification via Machine Learning
(Slides (PDF))
Abstract
In this talk, I will describe our experiences using
machine learning techniques for computing proofs for
assertion validation as well as program termination. Our
main technical insights are: (a) that interpolants in
program verification can be viewed as classifiers in
supervised machine learning, and (b) that loop bounds
can be computed using linear regression. This view has the
advantage that we are able to use off-the-shelf
classification as well as regression techniques for
proving safety as well as termination. Furthermore, we
can also use data generated from existing test suites for
programs in order to compute both safety and termination
proofs. We demonstrate the feasibility of our techniques
via experiments over several benchmarks from various
papers on program verification.
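The loop-bound insight can be sketched with ordinary least squares over data from test runs: fit the observed iteration count as a linear function of an input, then hand the fit to a verifier as a candidate bound. A toy version (illustrative; not the authors' implementation):

```python
def fit_linear_bound(samples):
    """samples: list of (n, iterations) pairs observed in test runs.
    Fit iterations ~ a*n + b by the closed-form formulas for
    simple (one-variable) linear regression."""
    m = len(samples)
    sx = sum(n for n, _ in samples)
    sy = sum(it for _, it in samples)
    sxx = sum(n * n for n, _ in samples)
    sxy = sum(n * it for n, it in samples)
    a = (m * sxy - sx * sy) / (m * sxx - sx * sx)
    b = (sy - a * sx) / m
    return a, b

# A loop like `for i in range(0, n, 2)` executes n/2 times for even n;
# from test data the fit recovers the candidate bound 0.5*n + 0.
data = [(2, 1), (4, 2), (8, 4), (10, 5)]
a, b = fit_linear_bound(data)
assert abs(a - 0.5) < 1e-9 and abs(b) < 1e-9
```

The regression only proposes a candidate; as in the talk's setting, the bound still has to be confirmed by a proof step (e.g. checking that it is inductive), since test data alone cannot guarantee soundness.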
About the speaker
Aditya Nori is a member of the programming languages and
machine learning groups at Microsoft Research India. He
is also an adjunct professor at IIT Hyderabad. His
research interests include algorithms for the analysis of
programs and machine learning with special focus on tools
for improving software reliability and programmer
productivity. He received his PhD in Computer Science
from the Indian Institute of Science, Bangalore.
- Madhusudan Parthasarathy, UIUC/MSR
Natural Proofs for Verification of Dynamic Heaps
(Slides (PDF))
Abstract
We propose natural proofs for reasoning with programs
that manipulate dynamic data-structures against complex
specifications. Natural proofs are a subclass of proofs
that are amenable to completely automated reasoning,
that provide sound but incomplete procedures, and that
capture common reasoning tactics in program
verification. We develop a dialect of separation logic
over heaps, called Dryad, with recursive definitions that
avoids explicit quantification, develop ways to reason
with heaplets using classical logic over the theory of
sets, and develop natural proofs for reasoning using
proof tactics involving disciplined unfoldings and
formula abstractions. Natural proofs are encoded into
decidable theories of first-order logic so as to be
discharged using SMT solvers.
We demonstrate the effectiveness of natural proofs in
deductive verification using a suite of more than 100
correct programs, ranging from standard data structures
and the Schorr-Waite garbage collection algorithm to a
large number of low-level C routines from the Glib
library, the OpenBSD library, and the Linux kernel, as
well as routines from a secure OS-browser project.
About the speaker
Madhusudan Parthasarathy is an Associate Professor at the
University of Illinois at Urbana-Champaign, and is
currently on sabbatical, visiting Microsoft Research,
Bangalore. He obtained his PhD at the Institute of
Mathematical Sciences, Chennai. He is interested in
software verification and in building reliable and secure
software systems using formal correctness techniques.
- K V Raghavan, IISc
Precise, on-demand null-dereference verification for
Java
Abstract
The problem addressed in this paper is sound, scalable,
demand-driven null-dereference verification for Java
programs via over-approximated weakest-precondition
analysis. The base version of this analysis was
described in a previous publication; here we focus
primarily on two major optimizations that we have
incorporated, which allow longer program paths to be
traversed more efficiently, thereby increasing the
precision of the approach. The first optimization
is to bypass certain expensive-to-analyze constructs,
such as virtual calls with too many possible targets, by
directly transferring dataflow facts from points after
the construct to points before along def-use edges of a
certain kind. The second optimization is to use manually
constructed summaries of Java container class methods,
rather than analyze the code of these methods directly.
We evaluate our approach using 10 real-world Java
programs, as well as several microbenchmarks. We
demonstrate that our optimizations result in a 45%
reduction in false positives over the base version on
the real programs, without significant impact on running
time.
About the speaker
Komondoor V. Raghavan is an Assistant Professor at the
Indian Institute of Science, Bangalore. His research
interests are in program analysis, and formal methods in
software engineering.
- Prahlad Sampath, MathWorks
Translation Validation for Stateflow Code Generation
(Slides (PDF))
Abstract
Code-generators play a critical role in the model-based
development of complex software systems. In many
industries, code is auto-generated from models and
directly flashed onto embedded controllers. This scenario
requires thorough verification of the equivalence between
models and generated code. The approach followed in
industry is to use testing-based techniques to perform
this verification. Translation validation has been
proposed as an alternative to testing which can improve
the quality of verification and at the same time automate
the verification process.
Our approach for translation validation of Stateflow code
generation builds upon the significant progress over the
last few years in software model-checking: we employ
off-the-shelf software model-checkers like CBMC to make
translation validation practical for a complex modeling
language like Stateflow. We have demonstrated this
approach on a number of Stateflow models from the
automotive industry.
About the speaker
Prahlad Sampath is currently a manager at MathWorks
India, where he leads a team developing V&V tools for
model-based development using the MATLAB
Simulink/Stateflow platform. Prior to joining MathWorks,
Prahlad was a researcher at GM India Science Labs,
Bangalore; and earlier at TRDDC, Pune. Prahlad obtained
his MSc and PhD degrees from Imperial College, London.
- Nishant Sinha, IBM
Big-step Bounded Model Checking for Software
(Slides (PDF))
Abstract
The talk will present a method to perform scalable
bounded model checking of Java programs towards the goal
of finding bugs in them. The method is based on an
alternating, goal-driven call graph exploration strategy
which exploits the hierarchical program structure to find
bugs in large code bases efficiently. The method is also
able to infer local proofs in some cases.
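As background for the talk, bounded model checking explores all executions up to a depth k and reports a counterexample trace if a bug is reachable within the bound. The sketch below does this explicitly over a toy transition system; the talk's method instead works goal-driven over Java call graphs with a solver back end, which this does not model:

```python
def bounded_check(init, step, is_bug, k):
    """Explore all executions of length <= k breadth-first; return a
    counterexample trace if a bug state is reachable within the bound."""
    frontier = [(s, [s]) for s in init]
    for _ in range(k + 1):
        nxt = []
        for state, trace in frontier:
            if is_bug(state):
                return trace
            for succ in step(state):
                nxt.append((succ, trace + [succ]))
        frontier = nxt
    return None  # no bug within the bound (says nothing beyond depth k)

# Toy system: a counter mod 8 that may step by 1 or by 3; bug state: 6.
trace = bounded_check(
    init=[0],
    step=lambda s: [(s + 1) % 8, (s + 3) % 8],
    is_bug=lambda s: s == 6,
    k=4,
)
assert trace == [0, 3, 6]  # shortest path: 0 -> 3 -> 6
```

Explicit enumeration blows up exponentially in k; real BMC tools avoid this by encoding the unrolled transition relation symbolically for a SAT/SMT solver, and the "big-step" idea in the talk further prunes the exploration using the program's call-graph hierarchy.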
About the speaker
Nishant Sinha is a Researcher in the Programming
Technologies and Software Engineering group at IBM
Research, Bangalore, India. His interests lie in analysis
of sequential and concurrent programs, automated
compositional analysis, rewriting and decision
procedures. He is also interested in automated design and
development of web applications and distributed systems.
- R Venkatesh, TRDDC
Challenges in applying formal verification to industry applications
(Slides (PDF))
Abstract
This talk aims to highlight the scalability limitations
of the current state of the art in formal
verification. We start by presenting our experience in
applying formal methods to both static analysis of, and
test-data generation for, automotive models and
code. This is followed by some insights into code
patterns that hinder the scalability of current tools,
and finally some thoughts on what can help achieve
better scalability. Our aim is to influence the
direction of research in formal verification.
About the speaker
Venkatesh has been with TRDDC since 1989. He has spent
this time mainly in the areas of language design and
verification. His current research interests include
developing highly scalable formal verification tools
and designing requirements-specification notations
focused mainly on testing and verification.