PL techniques at OSDI...

From: Sorin Lerner (lerns@cs.washington.edu)
Date: Sat Dec 14 2002 - 18:00:24 PST

    At OSDI (Operating Systems Design and Implementation) last week there was
    a session entitled "Robustness" with 3 apply-PL-techniques-to-systems
    papers. I've attached the abstracts below (and links to the papers I was
    able to find).

    If there is enough interest, we could look at some of these papers in 590l
    or 590k sometime in the future. The technical content of these papers from
    a PL perspective may be low, but I think it's interesting for us to see
    the latest uses of PL techniques at a systems conference. It may also give
    us some insight into how to improve the acceptance of our own current and
    future work by the systems community. We could also bundle several of
    these papers into one week, since they should be pretty easy to read.

    -------------------------------------------------------------------------

    Defensive Programming: Using an Annotation Toolkit to Build DoS-Resistant
    Software

     Xiaohu Qie, Ruoming Pang, and Larry Peterson, Princeton University

    Abstract

     This paper describes a toolkit to help improve the robustness of code
    against DoS attacks. We observe that when developing software, programmers
    primarily focus on functionality. Protecting code from attacks is often
    considered the responsibility of the OS, firewalls and intrusion detection
    systems. As a result, many DoS vulnerabilities are not discovered until
    the system is attacked and the damage is done. Instead of reacting to
    attacks after the fact, this paper argues that a better solution is to
    make software defensive by systematically injecting protection mechanisms
    into the code itself. Our toolkit provides an API that programmers use to
    annotate their code. At runtime, these annotations serve as both sensors
    and actuators: watching for resource abuse and taking the appropriate
    action should abuse be detected. This paper presents the design and
    implementation of the toolkit, as well as an evaluation of its effectiveness
    with three widely-deployed network services.

    <could not find this paper on-line>
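
    The abstract doesn't show the annotation API itself, so to make the
    sensor/actuator idea concrete, here is a rough C sketch of the pattern.
    The function names, quota, and shedding policy are invented for
    illustration; they are not the toolkit's actual interface.

    /* Illustrative sketch only: a hypothetical sensor/actuator annotation
     * pattern, not the Princeton toolkit's API. */
    #include <stdio.h>

    #define MAX_CLIENTS            1024
    #define MAX_PENDING_PER_CLIENT 64     /* invented resource quota */

    static int pending[MAX_CLIENTS];      /* in-flight requests per client */

    /* Sensor + actuator: count the request, and refuse it (shed load) if
     * the client has already exceeded its quota. */
    static int resource_enter(int client)
    {
        if (pending[client] >= MAX_PENDING_PER_CLIENT) {
            fprintf(stderr, "shedding request from client %d\n", client);
            return 0;                     /* actuator: deny service */
        }
        pending[client]++;                /* sensor: record usage */
        return 1;
    }

    /* Sensor: release the resource when the request completes. */
    static void resource_exit(int client)
    {
        pending[client]--;
    }

    /* Annotated handler: the protection lives in the code itself rather
     * than in the OS or a firewall. */
    int handle_request(int client)
    {
        if (!resource_enter(client))
            return -1;                    /* abuse detected; drop request */
        /* ... normal request processing ... */
        resource_exit(client);
        return 0;
    }

    int main(void)
    {
        /* Simulate 100 requests from client 7 that are all still in
         * flight: the 65th and later get shed. */
        for (int i = 0; i < 100; i++)
            resource_enter(7);
        return 0;
    }

    The point of the annotations is that the same hook both measures
    resource usage (the sensor) and triggers a recovery action when a
    threshold is crossed (the actuator).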

    -------------------------------------------------------------------------

    Using Model Checking to Debug Device Firmware

    Sanjeev Kumar and Kai Li, Princeton University

    Abstract

     Device firmware is a piece of concurrent software that achieves high
    performance at the cost of software complexity. It contains subtle race
    conditions that make it difficult to debug using traditional debugging
    techniques. The problem is further compounded by the lack of debugging
    support on the devices. This is a serious problem because the device
    firmware is trusted by the operating system.

    Model checkers are designed to systematically verify properties of
    concurrent systems. Therefore, model checking is a promising approach to
    debugging device firmware. However, model checking involves an exponential
    search. Consequently, the models have to be small to allow effective model
    checking.

    This paper describes the abstraction techniques used by the ESP compiler
    to extract abstract models from device firmware written in ESP. The
    abstract models are small because they discard some of the details in the
    firmware that are irrelevant to the particular property being verified. The
    programmer is required to specify the abstractions to be performed. The
    ESP compiler uses the abstraction specification to extract models
    conservatively. Therefore, every bug in the original program will also be
    present in the extracted model.

    This paper also presents our experience using the Spin model checker to
    develop and debug the VMMC firmware for the Myrinet network interfaces. An
    earlier version of the ESP compiler yielded models that were too large to
    check for system-wide properties like absence of deadlocks. The new
    version of the compiler generated abstract models that were used to
    identify several subtle bugs in the firmware. So far, we have not
    encountered any bugs that were not caught by Spin.

    http://www.cs.princeton.edu/~skumar/papers/osdi02.pdf
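
    To make the notion of "conservative" extraction concrete: a
    data-dependent branch in the concrete code can be replaced in the model
    by a nondeterministic choice, so the model may gain behaviors but never
    loses one, and therefore never hides a bug. The C fragment below is
    invented for illustration; it is not ESP code or the VMMC firmware.

    #include <stdio.h>
    #include <stdlib.h>

    /* Concrete code: the branch depends on packet contents. */
    int handle_packet_concrete(const unsigned char *pkt, int len)
    {
        if (len > 0 && pkt[0] == 0x01)    /* data-dependent decision     */
            return 1;                     /* "send acknowledgement" path */
        return 0;                         /* "drop packet" path          */
    }

    /* Abstract model: the packet contents are discarded as irrelevant to
     * the property being checked, so the branch becomes a nondeterministic
     * choice. Every behavior of the concrete code is still possible in the
     * model, which is what makes the extraction conservative. */
    int handle_packet_abstract(void)
    {
        return rand() % 2;                /* a model checker tries both */
    }

    int main(void)
    {
        unsigned char ack = 0x01;
        printf("concrete: %d  abstract: %d\n",
               handle_packet_concrete(&ack, 1), handle_packet_abstract());
        return 0;
    }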

    -------------------------------------------------------------------------

    CMC: A Pragmatic Approach to Model Checking Real Code

     Madanlal Musuvathi, David Y.W. Park, Andy Chou, Dawson R. Engler, and
    David L. Dill, Stanford University

    Abstract

     Many system errors do not emerge unless some intricate sequence of events
    occurs. In practice, this means that most systems have errors that only
    trigger after days or weeks of execution. Model checking [4] is an effective
    way to find such subtle errors. It takes a simplified description of the
    code and exhaustively tests it on all inputs, using techniques to explore
    vast state spaces efficiently. Unfortunately, while model checking systems
    code would be wonderful, it is almost never done in practice: building
    models is just too hard. It can take significantly more time to write a
    model than it did to write the code. Furthermore, by checking an
    abstraction of the code rather than the code itself, it is easy to miss
    errors.

    The paper's first contribution is a new model checker, CMC, which checks C
    and C++ implementations directly, eliminating the need for a separate
    abstract description of the system behavior. This has two major
    advantages: it reduces the effort to use model checking, and it reduces
    missed errors as well as time-wasting false error reports resulting from
    inconsistencies between the abstract description and the actual
    implementation. In addition, changes in the implementation can be checked
    immediately without updating a high-level description.

    The paper's second contribution is demonstrating that CMC works well on
    real code by applying it to three implementations of the Ad-hoc On-demand
    Distance Vector (AODV) networking protocol [7]. We found 34 distinct errors
    (roughly one bug per 328 lines of code), including a bug in the AODV
    specification itself. Given our experience building systems, it appears
    that the approach will work well in other contexts, and especially well
    for other networking protocols.

    http://www.stanford.edu/~engler/osdi2002.pdf
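
    To give a feel for what checking the implementation directly (rather
    than a hand-written model) involves, here is a tiny self-contained C
    sketch in that spirit; the events, handler, and invariant are made up,
    and this is of course not CMC itself. The checker enumerates every
    delivery order of a few events, runs the real handler code for each
    order, checkpoints and restores the state between explorations, and
    asserts a safety property along the way.

    #include <assert.h>
    #include <stdio.h>

    #define NEVENTS 3

    static int state;                     /* the implementation's state */

    static void handle_event(int ev)      /* real handler code under test */
    {
        state += ev;                      /* stand-in for protocol logic */
    }

    static void check_invariant(void)
    {
        assert(state <= 6);               /* made-up safety property */
    }

    /* Explore every permutation of event delivery order. */
    static void explore(const int events[], int used[], int depth)
    {
        check_invariant();
        if (depth == NEVENTS)
            return;
        for (int i = 0; i < NEVENTS; i++) {
            if (used[i])
                continue;
            int saved = state;            /* checkpoint the state       */
            used[i] = 1;
            handle_event(events[i]);
            explore(events, used, depth + 1);
            used[i] = 0;
            state = saved;                /* restore for the next order */
        }
    }

    int main(void)
    {
        int events[NEVENTS] = {1, 2, 3};
        int used[NEVENTS] = {0, 0, 0};
        explore(events, used, 0);
        printf("all delivery orders explored; invariant held\n");
        return 0;
    }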

    _______________________________________________
    Cecil mailing list
    Cecil@cs.washington.edu
    http://mailman.cs.washington.edu/mailman/listinfo/cecil


