Network Reliability in the Software Era - Finding Bugs in OpenFlow-based Software Defined Networks

[Seminar cancelled]

Speaker:        Dr. Marco Canini
                Senior Research Scientist
                TU Berlin / T-Labs

Title:          "Network Reliability in the Software Era - Finding
                Bugs in OpenFlow-based Software Defined Networks"

Date:           Friday, 21 December 2012

Time:           4:00pm - 5:00pm

Venue:          Room 3501 (via lifts 25/26), HKUST

Abstract:

Nowadays users expect to experience highly dependable network connectivity
and services. However, several recent episodes demonstrate that software
errors and operator mistakes continue to cause undesired disturbances and
outages. SDN (Software Defined Networking) is a new kind of network
architecture that decouples the control plane from the data plane---a
vision currently embodied in OpenFlow. By logically centralizing the
control plane computation, SDN provides the opportunity to remove
complexity from and introduce new functionality in our networks. On the
other hand, as network programmability increases and software plays a
greater role in it, the risk that buggy software may disrupt an entire
network also grows.

In this talk, I will present efficient, systematic techniques for testing
the SDN software stack at both its highest and lowest layers. That is, our
testing techniques target, at the top layer, the OpenFlow controller
programs and, at the bottom layer, the OpenFlow agents---the software that
each switch runs to enable remote programmatic access to its forwarding
tables.

Our NICE (No bugs In Controller Execution) tool applies model checking to
explore the state space of an unmodified controller program composed with
an environment model of the switches and hosts. Scalability is the
main challenge, given the diversity of data packets, the large system
state, and the many possible event orderings. To address this, we propose
a novel way to augment model checking with symbolic execution of event
handlers (to identify representative packets that exercise code paths on
the controller), and effective strategies for generating event
interleavings likely to uncover bugs. Our prototype tests Python applications
on the popular NOX platform. In testing three real applications, we
uncover eleven bugs.
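
To give a flavor of how these two techniques fit together, here is a small,
purely illustrative Python sketch (not the actual NICE code; the controller,
environment model, and invariant below are hypothetical). A bounded search
enumerates interleavings of packet-arrival events, while a couple of
hand-picked representative packets stand in for the input classes that
symbolic execution of the handler would discover:

from itertools import product

# Hypothetical controller: a MAC-learning handler with two code paths.
def packet_in_handler(state, switch, packet):
    mac_table = state.setdefault("mac_table", {})
    mac_table[packet["src"]] = packet["in_port"]        # learn source port
    if packet["dst"] in mac_table:                      # path 1: known destination
        return ("install_rule", switch, packet["dst"], mac_table[packet["dst"]])
    return ("flood", switch)                            # path 2: unknown destination

# In NICE, symbolic execution of the handler identifies which packet fields
# matter (here, only the destination); we hard-code the two representative
# packets such an analysis would produce.
REPRESENTATIVE_PACKETS = [
    {"src": "A", "dst": "B", "in_port": 1},  # destination not yet learned -> flood
    {"src": "B", "dst": "A", "in_port": 2},  # destination already learned -> install
]

def enabled_events(state):
    # Environment model: any representative packet may arrive at any switch.
    return [("packet_in", sw, pkt)
            for sw, pkt in product(state["switches"], REPRESENTATIVE_PACKETS)]

def apply_event(state, event):
    new_state = {"switches": state["switches"],
                 "mac_table": dict(state.get("mac_table", {})),
                 "actions": list(state.get("actions", []))}
    _, sw, pkt = event
    new_state["actions"].append(packet_in_handler(new_state, sw, pkt))
    return new_state

def check_invariant(state):
    # Toy safety property: every installed rule must forward to a concrete
    # learned port (a stand-in for richer correctness properties such as
    # the absence of forwarding loops or black holes).
    return all(a[3] is not None for a in state["actions"] if a[0] == "install_rule")

def explore(state, depth=0, max_depth=3, seen=None):
    # Bounded depth-first search over event interleavings, checking the
    # invariant in every reached state.
    seen = set() if seen is None else seen
    key = (frozenset(state.get("mac_table", {}).items()),
           tuple(state.get("actions", [])))
    if key in seen or depth == max_depth:
        return
    seen.add(key)
    assert check_invariant(state), f"bug found: {state['actions']}"
    for event in enabled_events(state):
        explore(apply_event(state, event), depth + 1, max_depth, seen)

explore({"switches": ["s1"], "actions": []})
print("no invariant violation within the explored bound")

The point of the combination is that symbolic execution keeps the set of
packet events small, while the model checker still explores the different
orders in which those events can be handled.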

Our SOFT (Systematic OpenFlow Testing) tool automates testing the
interoperability of OpenFlow switches. Our key insight lies in automatically
identifying the test inputs that cause different OpenFlow agent
implementations to behave inconsistently. To this end, we first
symbolically execute each agent under test in isolation to derive which
set of inputs causes which behavior. We then crosscheck all distinct
behaviors across different agent implementations and evaluate whether a
common input subset causes inconsistent behaviors. Our evaluation shows
that our tool identified several inconsistencies between the publicly
available Reference OpenFlow switch and Open vSwitch implementations.
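
Again as a purely illustrative sketch (hypothetical and heavily simplified;
SOFT derives the relevant input classes by symbolic execution rather than by
enumerating concrete messages), crosschecking two agent models might look
like this in Python:

from itertools import product

def reference_switch(msg):
    # Hypothetical model of one agent: rejects flow-mods with out-of-range ports.
    if msg["type"] == "flow_mod" and msg["out_port"] > 48:
        return "error: bad port"
    return "ok"

def other_switch(msg):
    # Hypothetical model of a second agent: silently accepts any port number.
    return "ok"

def crosscheck(agents, input_space):
    # Return the inputs on which the agent implementations behave inconsistently.
    inconsistencies = []
    for msg in input_space:
        behaviors = {name: agent(msg) for name, agent in agents.items()}
        if len(set(behaviors.values())) > 1:
            inconsistencies.append((msg, behaviors))
    return inconsistencies

# A toy input space standing in for the input classes symbolic execution derives.
inputs = [{"type": t, "out_port": p}
          for t, p in product(["flow_mod", "packet_out"], [1, 48, 49])]

for msg, behaviors in crosscheck(
        {"reference": reference_switch, "other": other_switch}, inputs):
    print(msg, behaviors)

Any input on which the models disagree is exactly the kind of
interoperability candidate the tool reports for inspection.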

******************
Biography:

Marco is a senior research scientist at T-Labs, a joint institute of TU
Berlin and Telekom Innovation Laboratories. Marco obtained his Ph.D.
degree in Computer Science and Engineering from the University of Genoa
in 2009 after spending the last year as a visiting student at the
University of Cambridge, Computer Laboratory. He holds a laurea degree
with honors in Computer Science and Engineering from the University of
Genoa. He also held positions at Intel Research and Google, and he was a
postdoctoral researcher at EPFL from 2009 to 2012.