Date: Thursday, December 1st, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Dr. Tobias Nipkow
Department of Informatics
Technical University of Munich
Germany
We introduce Jinja, a Java-like programming language with a formal semantics designed to exhibit core features of the Java language architecture. Jinja is a compromise between realism of the language and tractability and clarity of the formal semantics. The following aspects are formalised: a big-step and a small-step operational semantics for Jinja and a proof of their equivalence; a type system and a definite initialisation analysis; a type safety proof of the small-step semantics; a virtual machine (JVM), its operational semantics and its type system; a type safety proof for the JVM; a bytecode verifier, i.e., a data flow analyser for the JVM; a correctness proof of the bytecode verifier w.r.t. the type system; and a compiler and a proof that it preserves semantics and well-typedness.
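The distinction between the two semantics styles can be illustrated on a toy scale (a minimal arithmetic language, not Jinja itself; all names here are illustrative): a big-step semantics maps a term directly to its value, while a small-step semantics rewrites one redex at a time, and the two agree on every term, which is the toy analogue of the equivalence proof mentioned above.

```python
# Toy illustration of big-step vs. small-step operational semantics.
# Terms are ints (values) or tuples ('add', e1, e2).

def big_step(e):
    """Big-step: evaluate a term to its value in one recursive relation."""
    if isinstance(e, int):
        return e
    _, e1, e2 = e
    return big_step(e1) + big_step(e2)

def small_step(e):
    """Small-step: perform one reduction; return None if e is already a value."""
    if isinstance(e, int):
        return None
    _, e1, e2 = e
    if isinstance(e1, int) and isinstance(e2, int):
        return e1 + e2                       # reduce the innermost redex
    if isinstance(e1, int):
        return ('add', e1, small_step(e2))   # step inside the right operand
    return ('add', small_step(e1), e2)       # step inside the left operand

def run_small(e):
    """Iterate small steps to a value (the reflexive-transitive closure)."""
    while not isinstance(e, int):
        e = small_step(e)
    return e

term = ('add', ('add', 1, 2), 4)
# The equivalence proved in Jinja, at toy scale: both semantics agree.
assert big_step(term) == run_small(term) == 7
```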
The emphasis of this work is not on particular language features but on providing a unified model of the source language, the virtual machine and the compiler. The whole development has been carried out in the theorem prover Isabelle/HOL jointly with Gerwin Klein.
The talk will give a very high-level overview of these formalizations.
Full paper available at www.in.tum.de/~nipkow/pubs/Jinja/
Date: Thursday, December 1st, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Robert Lewis
Department of Mathematics
Fordham University
Discovering flexibility of two- and three-dimensional structures has been a mathematical problem since the time of Cauchy. Recently such questions have become prominent in computational chemistry. We will discuss how to detect flexibility by examining the resultant of a system of polynomial equations that defines the object.
Date: Tuesday, November 22, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Robert A. Ballance, Ph.D
Principal Member of Technical Staff, Org. 04328
Sandia National Laboratories
Large computing systems, like Sandia National Laboratories' Red Storm, are neither designed nor deployed in isolation. The computer, its infrastructure support, and the operations of both have to be designed to support the mission of the organization. System engineering is the process of developing and tuning complete systems. System management is the ongoing process of managing tradeoffs during operations. This talk describes Red Storm, along with its infrastructure and operational requirements. The talk also discusses the facilities that surround and support Red Storm: shelter, power, cooling, networking, interactions with other platforms, operations support, and user services.
Date: Thursday, November 17, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Rupak Majumdar
Department of Computer Science
University of California, Los Angeles (UCLA)
A modular program analysis considers components independently and provides a succinct summary for each component, which is used when checking the rest of the system. Consider a system consisting of a library and a client. A temporal summary, or interface, of the library specifies legal sequences of library calls. The interface is safe if no call sequence violates the library's internal invariants; the interface is permissive if it contains every such sequence. Modular program analysis requires full interfaces, which are both safe and permissive: the client does not cause errors in the library if and only if it makes only sequences of library calls that are allowed by the full interface of the library.
Previous interface-based methods have focused on safe interfaces, which may be too restrictive and thus reject good clients. We present an algorithm for automatically synthesizing software interfaces that are both safe and permissive. The algorithm generates interfaces as graphs whose vertices are labeled with predicates over the library's internal state, and whose edges are labeled with library calls. The interface state is refined incrementally until the full interface is constructed. In other words, the algorithm automatically synthesizes a typestate system for the library, against which any client can be checked for compatibility.
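As a small illustration of the interfaces described above (a hypothetical two-state "file" library, not the synthesis algorithm itself), an interface can be represented as a graph whose vertices are states and whose edges are labeled with library calls; a client is compatible if every call sequence it makes stays within the graph:

```python
# Illustrative typestate interface for a hypothetical "file" library.
# State -> {allowed call: successor state}
INTERFACE = {
    "closed": {"open": "opened"},
    "opened": {"read": "opened", "close": "closed"},
}

def is_legal(calls, start="closed"):
    """Check whether a sequence of library calls is accepted by the interface."""
    state = start
    for call in calls:
        if call not in INTERFACE[state]:
            return False  # call not allowed in the current interface state
        state = INTERFACE[state][call]
    return True

print(is_legal(["open", "read", "read", "close"]))  # a legal sequence
print(is_legal(["read"]))                           # rejected: read before open
```

A safe interface would reject every sequence that breaks the library's invariants; a permissive one would accept every sequence that does not. The algorithm in the talk refines such state graphs until both properties hold.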
Date: Thursday, November 10, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Patrick Bridges
Department of Computer Science, University of New Mexico
Many high-end computing (HEC) systems have relied on commodity operating system solutions like Linux to provide basic system software services. Unfortunately, there are well-known problems with commodity system software in specialized environments, particularly from underlying assumptions that aren't necessarily appropriate and the creeping featuritis induced by their general-purpose nature. Xen, a "small" virtualization layer for modern architectures, has been put forth as a potential boon in specialized environments like HEC. In my talk, I will describe why Xen is beginning to suffer from the same problems that have plagued Linux in specialized environments, what I believe this means to HEC OS researchers, and the limitations, uses, and possible improvements of Xen for dealing with specialized environments like HEC systems.
Date: Thursday, November 10, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Arthur "Barney" Maccabe
Department of Computer Science, University of New Mexico
This talk will survey some of the systems software projects that have been developed jointly between UNM and Sandia National Laboratories, with a focus on the principle of lightweight design that has evolved from these projects.
Date: Tuesday, November 1, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Andree Jacobson
Department of Computer Science, University of New Mexico
Many people think that being a minority makes many things easier: it's easier to get a scholarship because there's less competition, and there are many opportunities available to minorities that just aren't there if you're a "typical" computing sciences person. If this is truly so, how come there are still so few minority students in our field, and why are those who do enter a CS program statistically much less likely to graduate than the typical white male?
CS enrollment is dropping all over the country, and the decline is especially pronounced among minority populations. It is therefore important that those of us who belong to a minority know that there is a large support network out there. It is equally important that those of us who are not considered a minority make an effort not to discriminate against anyone, to minimize any unfair stressors.
The third biennial Richard Tapia conference on diversity in computing just took place in Albuquerque. It is a student-oriented conference, targeted toward people belonging to minorities in the field of computing sciences. This year there were 350 participants, over 100 of whom were students. I believe the conference did a very good job of addressing some of the pressing issues, informing those of us who were there what to look out for, and furthering our understanding of the issues.
In this talk, I will share with you some of the things I experienced during the conference. Please come, listen, and participate with your ideas and reflections.
Date: Thursday, October 27, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Stephanie Forrest
Department of Computer Science, University of New Mexico
Natural immune systems are sophisticated information processors. They learn to recognize relevant patterns, they remember patterns that have been seen previously, they use combinatorics to construct pattern detectors efficiently, and they use diversity to promote robustness. Further, the individual cells and molecules that comprise the immune system are distributed throughout our bodies, encoding and controlling the system in parallel with no central control mechanism. The talk will describe recent progress on several related projects that are investigating the effector side of the immune system—how it chooses its response, how it controls the magnitude of response, and how it knows when to terminate a response.
Date: Tuesday, October 25, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Shuang Luan
Department of Computer Science, University of New Mexico
During the course of radiotherapy, the patient's anatomy and position tend to vary from those used for treatment planning. This gives rise to insufficient dose coverage of tumors and overdose of the surrounding sensitive structures. We have proposed a new 4D radiotherapy technique called patient motion and gantry rotation synchronized IMAT. Our new method is fundamentally different from current gating and tracking 4D radiotherapy techniques and can take full advantage of the emerging 4D CT imaging technology. We have implemented a prototype 4D IMAT planning system based on this innovative 4D radiotherapy planning scheme, and carried out comparison studies against a currently used clinical radiotherapy technique using a dynamic phantom. Our study indicated that: (1) our proposed 4D IMAT technique is a feasible solution to 4D radiotherapy; and (2) compared with currently used step-and-shoot techniques, our 4D IMAT technique produces superior dose distributions that match the ideal radiotherapy plan.
Date: Tuesday, October 25, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Lance Williams
Department of Computer Science, University of New Mexico
In this talk, I will present an overview of my current research activities.
Date: Thursday, October 20, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Andrew Bernat
Executive Director
Computing Research Association
The Computing Research Association is the
Washington-based non-profit looking out for the health of the computing research
enterprise by focusing on the people and money necessary to conduct computing
research. This talk brings people up to date on what is going on with research
funding at the federal level and what CRA is doing to improve the situation.
Then I turn to what CRA is doing in general to support the people doing computing
research and how students and faculty can make use of these efforts.
Date: Tuesday, October 18, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Manuel Hermenegildo
Depts. of CS & EECE, UNM
Computer programs are some of the most complicated artifacts built by mankind—many
arguably much more complicated than the computer itself. Because of this complexity,
it takes large amounts of manpower to develop and maintain software, and, especially,
to avoid errors. The objective of our research is to contribute to improving
this situation by developing techniques and tools that help programmers write
large, complex programs in less time, while ensuring that the programs they
write are correct and execute efficiently. I will
present some details of how we attack this problem, by developing higher-level,
multiparadigm programming languages, as well as more powerful formal techniques
and practical tools for program debugging and verification. We also develop
resource-aware, optimizing compilers which can produce efficient sequential
code from programs written in very high-level languages as well as parallelize
such programs automatically for running on multiprocessors and/or in distributed
environments.
Date: Tuesday, October 11, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Aaron Clauset
Department of Computer Science
University of New Mexico
The traditional analysis of global terrorism holds that rare but catastrophic
events are qualitatively different from common but less severe events. In this
talk, using the set of terrorist attacks between 1968 and 2004, as compiled
by the National Memorial Institute for the Prevention of Terrorism (MIPT),
we will show that such events are actually part of a global statistical pattern.
That is, we will show that the statistics of terrorism, like other kinds of
both manmade and natural disasters, e.g., wars, forest fires, floods and earthquakes,
follow a power law, P(x) ~ x^(-alpha), where alpha is
the scaling exponent that describes the relationship between the frequency
and severity of events. Notably, in such heavy-tailed distributions, events
that are orders of magnitude larger than the average are actually relatively
common.
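As a quick illustration of the heavy-tail claim (a generic power-law sketch, not the MIPT analysis itself; the parameter values are arbitrary), one can sample from a continuous power law by inverse-transform sampling and recover the scaling exponent alpha with the standard maximum-likelihood estimator alpha_hat = 1 + n / sum(ln(x_i / x_min)):

```python
import math
import random

def sample_power_law(alpha, xmin, n, seed=0):
    """Draw n samples from a continuous power law P(x) ~ x^(-alpha), x >= xmin,
    via inverse-transform sampling of the Pareto CDF."""
    rng = random.Random(seed)
    return [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

def mle_alpha(xs, xmin):
    """Maximum-likelihood estimate of the scaling exponent (continuous case)."""
    n = len(xs)
    return 1 + n / sum(math.log(x / xmin) for x in xs)

xs = sample_power_law(alpha=2.5, xmin=1.0, n=100_000)
print(round(mle_alpha(xs, 1.0), 2))  # close to the true exponent 2.5
```

In such samples, events orders of magnitude above the mean appear with non-negligible frequency, which is exactly the property that distinguishes heavy-tailed statistics from, say, a Gaussian model of event severity.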
The focus of this talk will be on describing the several global trends in the
statistics of terrorism that we have discovered, as well as the computational
and statistical tools used to support them. We will briefly contrast our approach
with the traditional one for conflict analysis, and discuss its relative strengths and weaknesses.
Finally, we will close with a brief discussion of the policy implications and
outstanding questions raised by this work.
This is joint work with Maxwell Young, and has been covered in the popular
science press by The Economist, The Guardian (UK), Die Welt (Germany), Nature,
and the Institute of Physics.
The preprint is available at http://arxiv.org/abs/physics/0502014 .
Date: Thursday, October 6, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Darko Stefanovic
Professor of Computer Science and Regents' Lecturer
I will describe past and ongoing work in the Molecular Computing Group. In particular, I will describe our technology for amorphous bio-compatible computing, using deoxyribozyme-based logic gates.
Oligonucleotides can act as enzymes, or (deoxy)ribozymes, on other oligonucleotides, yielding oligonucleotide products. Moreover, these reactions can be controlled by inputs which are also oligonucleotides. We interpret these reactions as logic gates, and the concentrations of chemical species as signals. Since these reactions are homogeneous, i.e., they use oligonucleotides for both their inputs and their outputs, we can compose them to construct complex logic circuits.
I will describe the several kinds of logic gates we have developed, as well as their initial applications in simple feed-forward circuits, including arithmetic and game-playing automata. I will also demonstrate our open-system designs for a biomolecular realization of elementary components for a digital computer, including feedback circuits such as bistable memory elements (flip-flops) and oscillators. I will discuss the microfluidic setting for these devices.
Date: Thursday, Sep. 29, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Ed Angel
Professor of Computer Science, Electrical and Computer Engineering,
and Media Arts Director, Arts Technology Center and ARTS Lab
The Art, Research, Technology, and Science Laboratory (ARTS Lab) was created recently at UNM as part of the Governor's Media Industries Strategic Project (MISP). At the core of MISP is the notion that by combining New Mexico's strengths in technology with its unique artistic and cultural resources we can create a successful media industry in the state. UNM will play a major role in this effort through its research and educational programs. ARTS Lab will provide a venue for these interdisciplinary efforts.
In this talk, I will discuss the background of MISP and ARTS Lab, show some of our projects, and present opportunities for participation by students and faculty from CS. I will also describe the new "Garage" facility that will be shared by ARTS Lab and the Center for High Performance Computing. The Garage will house a black box studio, an imaging/graphics lab, a motion capture system, an experimental multi-projector dome, and a variety of computing platforms.
Date: Thursday, Sep. 22, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Michel Ferreira
Departamento de Ciencia de Computadores
Faculdade de Ciencias da Universidade do Porto
Porto, Portugal
Logic programming and relational databases have common foundations based on
First Order Logic. By coupling both paradigms, we can combine the efficiency
and safety of databases in dealing with large amounts of data with the higher
expressive power of logic and, thus, build more powerful systems. Although
much work has been done in the area of logic programming
and relational databases, there are very few references that address implementation
alternatives in coupling a logic system with a relational database. In this
talk, we will discuss the impact of using different approaches for coupling
a Prolog system (Yap) with a relational database system (MySQL). As we will
see, with current implementations the communication architecture is not tight enough
to support a completely transparent use of extensionally (or relationally)
defined predicates in the logic program. One such example is the cut operation,
when it is used to prune database predicates. We will address this problem
and describe the solution implemented in the Yap/MySQL coupled system. We will
end by describing a traffic characterization application that is being developed
and that takes advantage of the logic and database features.
Date: Thursday, September 15, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. Jared Saia
Department of Computer Science
University of New Mexico
In this talk, we will present results on designing algorithms and data structures which are distributed, scalable and robust to adversarial attack. The first part of the talk will describe a robust distributed hash table (DHT), based on the popular DHT, Chord. This new DHT is robust with high probability for any time period during which: 1) there are always at least z total peers in the network for some integer z; 2) there are never more than (1/4-ε)z bad peers in the network for a fixed ε>0; and 3) the number of peer insertion and deletion events is no more than zk for some tunable parameter k. We assume there is an adversary controlling the bad peers and that the IP-addresses of all the bad peers and the locations where they join the network are carefully selected by this adversary. In comparison to Chord, the resources required by our new DHT are only a polylogarithmic factor greater in communication, messaging, and linking costs.
The second part of the talk will describe a new scalable algorithm for the leader election problem. In the leader election problem, there are n processors, of which (1-b)n are good for some b>0. The problem is to design a distributed protocol to elect a good leader from the set of all processors. I will describe a new leader election protocol that is scalable in the sense that each good processor sends and processes a number of bits which is only polylogarithmic in n. For b < 1/3, our protocol elects a good leader with constant probability and ensures that a 1-o(1) fraction of the good processors know this leader. To the best of our knowledge, this is the first leader election algorithm which ensures that each good processor sends and processes a sublinear number of bits. This result can be used to provide scalable solutions to Byzantine agreement and other problems.
This is joint work with Amos Fiat (U. Tel Aviv), Valerie King (U. Victoria), Erik Vee (IBM Labs), Maxwell Young (U. New Mexico) and Vishal Sanwalani (U. New Mexico) and represents work previously published in PODC and ESA.
Date: Tuesday, September 13, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Prof. George F. Luger (jointly with Chayan Chakrabarti, Roshan Rammohan)
Department of Computer Science
University of New Mexico
We have created a diagnostic/prognostic software tool for the analysis of complex systems, such as monitoring the "running health" of helicopter rotor systems. Although our software is not yet deployed for real-time in-flight diagnosis, we have successfully analyzed data sets of actual helicopter rotor failures supplied to us by the US Navy. In this talk, we discuss both the critical techniques supporting the design of our stochastic diagnostic system and the issues related to its full deployment. We also present four examples of its use.
Our diagnostic system, called DBAYES, is composed of a logic-based, first-order, and Turing-complete set of software tools for stochastic modeling. We use this language for modeling time-series data supplied by sensors on mechanical systems. The inference scheme for these software tools is based on a variant of Pearl's loopy belief propagation algorithm (Pearl, 1988). Our language contains variables that can capture general classes of situations, events, and relationships. A Turing-complete language is able to reason about potentially infinite classes and situations, similar to the analysis of dynamic Bayesian networks. Since the inference algorithm is based on a variant of loopy belief propagation, the language includes expectation-maximization learning of parameters in the modeled domain. In this talk we briefly present the theoretical foundations for our first-order stochastic language and then demonstrate time-series modeling and learning in the context of fault diagnosis.
Date: Thursday, September 1, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Dr. Calogero Zarba
Department of Computer Science
University of New Mexico
Suppose your best friend told you that
union(x, y) = x, for all sets x, y .
Would you believe her?
In this talk we present a method for automatically deciding if your friend told you the truth.
This method can be implemented in your favorite programming language. Moreover, when the number of involved sets is "small", this method can be carried out on a blackboard using old-fashioned chalk.
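In the spirit of the blackboard remark, here is a minimal sketch of the small-model idea behind such decision procedures (an illustration, not necessarily the exact method of the talk): an identity between set terms built from union, intersection and complement holds for all sets if and only if it holds pointwise, i.e., for every possible membership pattern of a single element in the variables.

```python
from itertools import product

def valid_identity(lhs, rhs, nvars):
    """Decide a set identity by testing every membership pattern.
    lhs/rhs take a tuple of booleans b, where b[i] means
    'the element is a member of variable i', and return whether
    the element is a member of the set term."""
    return all(lhs(bits) == rhs(bits)
               for bits in product([False, True], repeat=nvars))

# union(x, y) = x : fails when the element is in y but not in x.
print(valid_identity(lambda b: b[0] or b[1], lambda b: b[0], 2))          # False
# union(x, y) = union(y, x) : commutativity holds for all sets.
print(valid_identity(lambda b: b[0] or b[1], lambda b: b[1] or b[0], 2))  # True
```

So your friend's claim is refuted by a single membership pattern, exactly the kind of check that fits on a blackboard.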
Date: August 25, 2005
Time: 11:00-12:15pm.
Place: Woodward 149
Haixia Jia
Department of Computer Science
University of New Mexico
The evaluation of incomplete satisfiability solvers depends critically on the availability of hard satisfiable instances. A plausible source of such instances consists of random k-SAT formulas whose clauses are chosen uniformly from among all clauses satisfying some randomly chosen truth assignment A. Unfortunately, instances generated in this manner tend to be relatively easy and can be solved efficiently by practical heuristics. Roughly speaking, for a number of different algorithms, A acts as a stronger and stronger attractor as the formula's density increases. Motivated by recent results on the geometry of the space of satisfying truth assignments of random k-SAT and NAE-k-SAT formulas, we introduce a simple twist on this basic model, which appears to dramatically increase its hardness. Namely, in addition to forbidding the clauses violated by the hidden assignment A, we also forbid the clauses violated by its complement, so that both A and its complement are satisfying. It appears that under this "symmetrization" the effects of the two attractors largely cancel out, making it much harder for algorithms to find any truth assignment. We give theoretical and experimental evidence supporting this assertion.
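The symmetrized generator is easy to sketch (a hedged reading of the model described above; the function name and the choice of an all-True hidden assignment, which loses no generality up to flipping literal signs, are my own):

```python
import random

def symmetric_hidden_sat(n, m, seed=0):
    """Generate a random 3-SAT formula over n variables with a hidden
    assignment A such that both A and its complement are satisfying:
    reject any clause whose literals are all true, or all false, under A.
    Literals are +/-(var+1); A is taken to be all-True, which loses no
    generality up to flipping literal signs."""
    rng = random.Random(seed)
    clauses = []
    while len(clauses) < m:
        vars_ = rng.sample(range(n), 3)          # three distinct variables
        clause = tuple((v + 1) * rng.choice([-1, 1]) for v in vars_)
        truth = [lit > 0 for lit in clause]      # literal values under A
        if any(truth) and not all(truth):        # satisfied by A and by ~A
            clauses.append(clause)
    return clauses

f = symmetric_hidden_sat(n=50, m=200)
# Every clause is mixed-sign, so A (all-True) and its complement both satisfy it:
assert all(any(l > 0 for l in c) and any(l < 0 for l in c) for c in f)
```

Rejecting the all-positive clauses as well as the all-negative ones is precisely the "symmetrization" of the abstract: neither A nor its complement is singled out as an attractor.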
Date: Wednesday, August 10, 2005
Time: 3:00 p.m.
Place: FEC 141
Dr. Marcus Magnor
Max Planck Institute
Saarbruecken, Germany
Expectations of computer graphics rendering quality are rising continuously: whether in flight simulators, surgical planning systems, or computer games, ever more realistic rendering results are to be achieved at real-time frame rates. In fact, on both the computational and the algorithmic side, visual realism is within reach of modern PC graphics boards.
With rapidly advancing graphics hardware capabilities, the modeling process is becoming the limiting factor on the way to realistic rendering. Higher visual realism can be attained only if more detailed and accurate scene descriptions are available. So far, however, modeling 3D geometry and object texture, surface reflectance characteristics and scene illumination, character animation and emotion is a labor-intensive, tedious process. The cost of increasing model accuracy with conventional approaches today threatens to stall further progress in realistic rendering applications.
In my talk, I present an alternative modeling approach. I describe how real-world scenes and events may be acquired from the "real thing". Given a handful of synchronized video recordings, complex, time-varying scenes and natural phenomena can be modeled from reality to be incorporated into time-critical 3D graphics applications. Photo-realistic rendering quality and truly authentic animations are the result. Besides offering a solution for realistic rendering applications in computer graphics, research into video-based modeling and rendering algorithms also leads to new tools for video editing and may even make possible novel forms of visual media.