Date: Thursday, May 5, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Vittorio Cristini
Victor & Ruby Hansen Surface Professor of Pathology, Chemical and Biomedical Engineering
University of New Mexico
Bio:
Vittorio Cristini, PhD,
Victor & Ruby Hansen Surface Professor of
Pathology, Chemical and Biomedical Engineering at the University of
New Mexico, is a leading expert and researcher in the fields of
mathematical and computational biology, applied and computational
mathematics, mathematical oncology, complex fluids and microfluidics,
and multidisciplinary (bio)materials science. He serves as editor for
Cancer Research and several biomedical journals, and has published one
book with Cambridge University Press, numerous book chapters and over
60 journal articles. Among a number of awards, Dr. Cristini received
the "Andreas Acrivos Dissertation Award in Fluid Dynamics" from the
American Physical Society in 2000. His 2005 paper in the Bulletin of
Mathematical Biology has been designated a "New Hot Paper in the
field of Mathematics" by the ISI Web of Knowledge; two articles have
been featured in the Cancer Research Highlights of the American
Association for Cancer Research. His research has been supported by
the Cullen Trust for Health Care, the National Science Foundation, the
National Cancer Institute, the Department of Defense, the State of
California, the State of Texas, and the State of New Mexico, among
others. Currently, Dr. Cristini serves as overall PI, Project PI, and
Core PI on several NSF, NIH, and DoD grants to develop multi-scale
models of tumor growth, most notably as part of two NCI Physical
Sciences in Oncology Centers (PS-OC) and of the Methodist Hospital NCI
ICBP grant where he develops models of stem cell growth in breast
cancer. In addition to his position at the University of New Mexico,
Dr. Cristini has been appointed as SULSA professor by the Scottish
Universities Life Sciences Alliance and as honorary professor of
mathematics by the University of Dundee, Scotland.
Date: Tuesday, May 3, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Walt Beyeler, Tom Moore, and Patrick Finley
Sandia National Labs
Bio:
Walt Beyeler is an electrical engineer who designs and codes complex adaptive systems
models of critical infrastructures, disease propagation, economics and finance.
He develops novel hybrid modeling methods to provide parsimonious representations of these diverse systems.
Walt has articulated and formalized methodologies to generate comprehensive conceptual models of a variety of complex systems over a wide range of scales.
Tom Moore is a theoretical biologist who applies concepts from evolution, selection,
and complexity science to large scale issues of public health and social interactions.
He designs succinct models of organizational change, network dynamics, and multi-level
selection which are useful for representing complex system origin and development.
Tom draws upon biological and social-science metaphors to characterize and understand a vast range of complex system issues and public policy options.
Pat Finley is a computer scientist who develops and applies novel mathematical
approaches to interpret complex system model results.
He designs and executes experiments on large computational clusters to
rigorously explore model parameter space and to map stable and unstable
regions of state space. Pat has designed unique algorithms merging
advanced graph-theoretic search concepts and Gaussian process
meta-models to extend decision theory for public policy through uncertainty quantification.
Date: Thursday, April 28, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Brandon Rohrer
Sandia National Labs
Autonomous robots are good at spot welding auto bodies and vacuuming. In most instances where the environment is predictable and the task is known beforehand, they do well. But when the environment is unfamiliar and the task must be learned, they tend to fail, sometimes spectacularly. A brain emulating cognition and control architecture (BECCA) was developed to try to solve this problem. In this talk, I will describe how BECCA works, the pilot tasks it has been applied to, and the implications of successfully creating a general learning agent.
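To give a flavor of the general-learning-agent problem BECCA targets (this sketch is not BECCA's architecture, whose details are the subject of the talk), here is a minimal sense-learn-act loop in Python: a tabular learner on a hypothetical one-dimensional "reach the goal" task, with the task, rewards, and parameters invented purely for illustration.

# A minimal sense-learn-act loop for a general learning agent.
# NOT BECCA itself: a toy tabular Q-learning agent on a hypothetical
# one-dimensional task, meant only to illustrate the kind of problem
# a general learning architecture addresses.
import random
from collections import defaultdict

def run_agent(episodes=200, length=6, alpha=0.5, gamma=0.9, eps=0.1):
    q = defaultdict(float)              # state-action value estimates
    actions = (-1, +1)                  # move left or right
    for _ in range(episodes):
        state = 0
        while state != length:          # goal is the right end of the line
            if random.random() < eps:
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda act: q[(state, act)])
            nxt = min(max(state + a, 0), length)
            reward = 1.0 if nxt == length else 0.0
            best_next = max(q[(nxt, b)] for b in actions)
            q[(state, a)] += alpha * (reward + gamma * best_next - q[(state, a)])
            state = nxt
    return q

q = run_agent()
print("learned to prefer moving right at the start:", q[(0, +1)] > q[(0, -1)])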
Bio:
Brandon Rohrer:
Machines that appear to think and move on their own have fascinated Brandon
since he was a child. He has pursued this interest through mechanical engineering
degrees at BYU (BS '97) and MIT (MS '99, PhD '02) and through his research in the
Intelligent Systems, Robotics, and Cybernetics Group at Sandia National Laboratories
in Albuquerque, NM. Current research topics include biomimetic machine learning and automated exploratory robots.
Date: Friday, April 22, 2011
Time: 11:00 am — 11:50 am
Place: Centennial Engineering Center Auditorium
Prof. Danny Z. Chen
Department of Computer Science and Engineering, University of Notre Dame
New image acquiring modalities and technologies continue to revolutionize the
fields of biological studies and medical care today. Quantitative biomedical
image analysis plays a critical role in solving many problems that are faced by
biologists, cardiologists, orthopedists, radiologists, and other physicians
on a daily basis. The increasing sizes of image data, especially in 3-D and
higher dimensions, present a significant challenge to conventional approaches
for automated biomedical image analysis, which has often been a difficult or
even unrealistic process due to its time-consuming and labor-intensive characteristics.
Image segmentation, the problem of identifying objects of interest in volumetric image data,
is a central problem in biomedical image analysis and computer-aided diagnosis.
Robust, efficient, and accurate automated segmentation methods are highly desirable
for numerous biomedical studies and applications. In this talk, we present effective
image segmentation techniques for detecting biomedical objects in 3-D and higher
dimensional images. The techniques are based on graph search frameworks or
computational geometry paradigms. In comparison with most known segmentation
methods for volumetric images, which suffer from an inability to attain optimal
segmentation or from lengthy computation times, our techniques produce, in an efficient
manner, segmentation of optimal quality with respect to general cost functions on
a wide range of biomedical objects with complex topological structures. Segmentation
results computed by our techniques on various biomedical objects (e.g., pulmonary
fissures, airway trees, aorta, coronary vessels, retina, knee cartilage, blood clots, etc.) and different imaging modalities are shown.
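As a rough illustration of the graph-search formulation the abstract refers to (and only that; these are not Prof. Chen's specific algorithms), the Python sketch below labels a made-up 1-D intensity profile as object/background by solving a minimum s-t cut, with unary costs from intensity and a pairwise smoothness penalty. All node names and cost values are invented.

# Toy graph-cut segmentation of a 1-D intensity profile.
from collections import deque, defaultdict

def max_flow(cap, s, t):
    """Edmonds-Karp max flow; returns the residual capacities."""
    res = defaultdict(lambda: defaultdict(int))
    for u in cap:
        for v, c in cap[u].items():
            res[u][v] += c
            res[v][u] += 0                       # ensure reverse edge exists
    while True:
        parent = {s: None}
        dq = deque([s])
        while dq and t not in parent:            # BFS for an augmenting path
            u = dq.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    dq.append(v)
        if t not in parent:
            return res
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(res[u][v] for u, v in path)
        for u, v in path:                        # push flow along the path
            res[u][v] -= bottleneck
            res[v][u] += bottleneck

def segment(pixels, smoothness=2):
    cap = defaultdict(dict)
    for i, val in enumerate(pixels):
        cap["SRC"][i] = val                 # cost of labeling pixel background
        cap[i]["SNK"] = 10 - val            # cost of labeling pixel object
        if i + 1 < len(pixels):
            cap[i][i + 1] = smoothness      # penalty for a label change
            cap[i + 1][i] = smoothness
    res = max_flow(cap, "SRC", "SNK")
    reach, dq = {"SRC"}, deque(["SRC"])     # source side of the min cut = object
    while dq:
        u = dq.popleft()
        for v, c in res[u].items():
            if c > 0 and v not in reach:
                reach.add(v)
                dq.append(v)
    return [1 if i in reach else 0 for i in range(len(pixels))]

print(segment([9, 8, 7, 2, 1, 1, 8, 9]))    # bright pixels come out as object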
Bio:
Prof. Danny Z. Chen
received a B.S. degree in Computer Science and in Mathematics from the University of San Francisco,
California, in 1985, and M.S. and Ph.D. degrees in Computer Science from Purdue University,
West Lafayette, Indiana, in 1988 and 1992, respectively. He has been on the faculty of the
Department of Computer Science and Engineering at the University of Notre Dame,
Indiana since 1992, and is currently a Professor. Dr. Chen's main research interests
are in the areas of algorithm design, analysis, and implementation, computational geometry,
computational medicine, parallel and distributed computing, data mining, and VLSI design.
Dr. Chen received the NSF CAREER Award in 1996, the Kaneb Teaching Award of
the Department of Computer Science and Engineering at the University of Notre Dame in 2004,
and the James A. Burns, C.S.C. Award for Graduate Education of University of Notre Dame in 2009.
Date: Tuesday, April 19, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Reid Priedhorsky
IBM T.J. Watson Research Center, Cambridge, Massachusetts
Of the many fascinating trends that the Internet is nurturing,
I will focus on two. One is open content, where users produce
most or all of a site's value (cf. Wikipedia, Stack Overflow,
and YouTube). The other is geographic content: Google Maps and
its peers make easy-to-use and high-quality maps available to
anyone with a web browser, and their associated APIs support
geographic "mashups" on a wide range of topics, from taxi
fare to earthquakes to "geogreetings". Furthermore, these
trends are merging. Internet-based open content tools and
communities are useful even when people are physically
present in the same city or neighborhood, and shared locality leads to shared local experiences and needs.
This talk is concerned with a new type of system which enhances the
utility of this emerging area: the geographic wiki or "geowiki".
I will explain how the logical conclusion of open content (wikis,
where anyone can edit anything) can be adapted to the geographic
domain and how this strange new model of mass collaboration
functions within it. The discussion will focus on Cyclopath,
a web-based mapping application serving the navigation needs
of bicyclists in the Minneapolis-St. Paul metro area, which we
created to explore the geowiki idea. The results of our
experiments show that this new collaboration model works,
and these results are of broad interest because they affect
any geographically-grounded community where important information is distributed among its members.
But the core innovation of geowikis is not about geography; rather,
it is about the utility of adapting the wiki model to structured data.
We need to build wikis with an arbitrary data model, and we need to
let the crowd change not just content but form as well.
I will outline a vision for such wikis and discuss how we are moving
forward with this vision at IBM, in a project code-named MoCoMapps.
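To make the "wiki model applied to structured data" idea more concrete, here is a minimal, hypothetical Python sketch of a revisioned map feature that anyone can edit or revert; the Feature type and its fields are illustrative only and are not Cyclopath's or MoCoMapps' actual data model.

# A structured record with a full, openly editable revision history.
import copy, time

class Revision:
    def __init__(self, user, data):
        self.user, self.data, self.stamp = user, copy.deepcopy(data), time.time()

class Feature:
    """A revisioned record, e.g. a bike-route segment with free-form fields."""
    def __init__(self, user, **fields):
        self.history = [Revision(user, fields)]

    @property
    def current(self):
        return self.history[-1].data

    def edit(self, user, **changes):
        data = copy.deepcopy(self.current)
        data.update(changes)
        self.history.append(Revision(user, data))

    def revert(self, user, index):
        """Restore an earlier revision by appending it as a new one."""
        self.history.append(Revision(user, self.history[index].data))

# Example: two users collaboratively maintain one route segment.
seg = Feature("alice", name="River Rd path", surface="paved", rating=3)
seg.edit("bob", rating=4, notes="repaved summer 2010")
seg.revert("alice", 0)                     # disagreements are rolled back openly
print(seg.current, "| revisions:", len(seg.history))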
Bio:
Reid Priedhorsky
is a research staff member at IBM T.J. Watson Research Center in
Cambridge, Massachusetts, USA and holds a Ph.D. in computer science
from the University of Minnesota. As a researcher focusing on
collaborative and social computing, the principle which motivates
him is sustainability - he works to empower communities to make
better decisions in pursuit of a more sustainable future.
He does this by building new tools for creating and communicating
knowledge, with a special focus on open content and mass collaboration techniques such as wikis.
In his spare time, he enjoys reading, bicycling, photography,
hiking (especially in the mountains and deserts of the American West),
tinkering and building things, and general hacking and programming.
Date: Thursday, April 14, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Frederick T. Koster, MD
Associate Scientist, Infectious Diseases Program, Lovelace Respiratory Research Institute
From one biologist's perspective, computational modeling of immunity may be the only practical solution to a quantitative understanding of the immune response as a Complex System. Vaccine development is very expensive and more efficient strategies must be found to identify safer and more efficacious vaccine candidates. Even more important may be the identification of fundamental principles behind the successful immune response through analysis of communication, networks and scaling. This seminar will illustrate the complexity of the immune response (without naming a single cell type or protein) and describe recent work to "deconstruct" the functional modules of the immune response.
Bio:
Dr. Koster's
research interests center around emerging viral and bacterial diseases.
With an interest and background in cellular immunology, he is focusing his
research on the lymphocytes participating in the immunopathogenesis of
Hantavirus Cardiopulmonary Syndrome (HCPS) and has investigated the cause
of the lethal complication of hantavirus lung infection, cardiogenic shock,
in the hamster model and has compared in vitro virus behavior among pathogenic
and non-pathogenic hantaviruses. His current funded projects include in vitro
dynamics of avian influenza viruses, viral dynamics of avian influenza viruses
in the ferret model, and inhalation infection models of plague, anthrax and
tularemia in nonhuman primates. These developed models are currently being
used to test the efficacy of vaccine candidates and of post-exposure
therapeutics in these Select Agent infections in nonhuman primates.
Date: Thursday, April 7, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Rick Chartrand
Los Alamos National Laboratory
In this talk, we'll examine some of the surprising capabilities of simple optimization problems to extract meaningful information from seemingly inadequate data. The starting point for this work is the applied mathematics field known as compressive sensing, which has shown impressive abilities to recover images and other signals from very few measurements. We'll look at some recent generalizations of this work, with applications to MRI reconstruction and the extraction of features from video.
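For readers unfamiliar with the compressive-sensing setting the talk builds on, the following Python/NumPy sketch recovers a sparse vector from far fewer random measurements than unknowns using iterative soft thresholding (ISTA), a basic l1-minimization solver. It illustrates the standard convex formulation, not the nonconvex generalizations of the talk, and the problem sizes are arbitrary.

# Sparse recovery from m < n random measurements y = A x via ISTA.
import numpy as np

def ista(A, y, lam=0.05, iters=500):
    """Minimize 0.5*||A x - y||^2 + lam*||x||_1 by proximal gradient steps."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)
        z = x - step * grad
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(0)
n, m, k = 200, 60, 5                 # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true
x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))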
Bio:
Rick Chartrand
received a Ph.D. in pure mathematics at UC Berkeley,
and now works in applied mathematics at Los Alamos National
Laboratory. His research interests include compressive sensing,
nonconvex optimization, feature extraction from high-dimensional data,
and image regularization.
Date: Thursday, March 31, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Matthew Lakin
Microsoft Research Cambridge
DNA strand displacement has been used to implement a broad range of information processing devices using nucleic acids: from logic gates, to chemical reaction networks, to architectures for universal computation. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. In this talk I will present a methodology for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. Device designs can be programmed using a common syntax and then analyzed at varying levels of detail, with or without interference. I will also discuss simulation and verification techniques for strand displacement systems which may produce unbounded numbers of species and reactions.
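One level of abstraction mentioned above treats a strand displacement system as a chemical reaction network. As a loose, hypothetical illustration of that level (not the methodology of the talk), the Python sketch below runs a Gillespie stochastic simulation of a single invented displacement reaction Input + Gate -> Output + Waste; the species names and rate constant are made up.

# Stochastic (Gillespie) simulation of one toy reaction.
import random

def gillespie(counts, rate=1e-3, t_end=500.0):
    t, trace = 0.0, []
    while t < t_end:
        propensity = rate * counts["Input"] * counts["Gate"]
        if propensity == 0:
            break
        t += random.expovariate(propensity)          # time to next reaction
        for species, delta in (("Input", -1), ("Gate", -1),
                               ("Output", 1), ("Waste", 1)):
            counts[species] += delta
        trace.append((t, counts["Output"]))
    return trace

state = {"Input": 100, "Gate": 100, "Output": 0, "Waste": 0}
gillespie(state)
print("final output count:", state["Output"])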
Bio:
Matthew Lakin
graduated from the University of Cambridge in 2005 with a B.A. in
Computer Science. He remained in Cambridge to do his Ph.D. in programming
language semantics, which he received in 2010. Since 2009 he has been a
member of the Biological Computation Group at Microsoft Research Cambridge,
working on programming languages, software tools and theoretical techniques
for modeling and reasoning about biological systems and artificial DNA
computing devices.
Date: Thursday, March 24, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Venkatesan Guruswami
Associate Professor, Computer Science Department
Carnegie Mellon University
The construction of error-correcting codes that achieve the best
possible trade-off between information rate and the amount of errors
that can be corrected has been a long sought-after goal. In this
talk, I will survey some of our work on list decoding, culminating in
the construction of codes with the optimal rate for any desired error-
correction radius. I will describe these codes (called folded Reed-
Solomon codes), and give a peek into the ideas underlying
their error-correction. These list decoding algorithms correct a factor of
two more errors compared to the conventional algorithms currently used in
every CD player and desktop PC, as well as many other applications.
List-decodable codes have also found several applications
beyond coding theory, in algorithms, complexity theory, and
cryptography. Time permitting, I will mention some of these, including a
construction of graphs with good expansion properties.
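To make the rate/error-correction trade-off concrete, here is a toy Python Reed-Solomon encoder over a small prime field. It shows only classical (unfolded) encoding and the unique-decoding radius of (n - k)/2 errors; the folded codes and list-decoding algorithms of the talk push toward roughly n - k. The field size and message below are arbitrary.

# Reed-Solomon encoding: evaluate the message polynomial over GF(P).
P = 101                                  # a small prime field

def rs_encode(message, n):
    """Evaluate the message polynomial at the points 0..n-1 of GF(P)."""
    def poly_eval(coeffs, x):
        acc = 0
        for c in reversed(coeffs):       # Horner's rule
            acc = (acc * x + c) % P
        return acc
    return [poly_eval(message, x) for x in range(n)]

msg = [17, 42, 7, 99]                    # k = 4 field symbols
code = rs_encode(msg, n=12)              # n = 12, so the rate is 1/3
print("codeword:", code)
print("classical unique-decoding radius:", (12 - 4) // 2, "errors")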
Bio:
Venkatesan Guruswami
received his Bachelor's degree from the Indian
Institute of Technology at Madras in 1997 and his Ph.D. from the
Massachusetts Institute of Technology in 2001. He is currently an
Associate Professor in the Computer Science Department at Carnegie
Mellon University. From 2002-09, he was a faculty member in the
Department of Computer Science and Engineering at the University of
Washington. Dr. Guruswami was a Miller Research Fellow at the
University of California, Berkeley during 2001-02, and was a member in
the School of Mathematics, Institute for Advanced Study during
2007-08.
Dr. Guruswami's research interests span a broad array of topics including
the theory of error-correcting codes, approximation algorithms and
non-approximability results for NP-hard optimization problems, explicit
combinatorial constructions and pseudorandomness, probabilistically
checkable proofs, computational complexity theory, and algebraic
algorithms.
Dr. Guruswami currently serves on the editorial boards of the SIAM Journal on Computing,
IEEE Transactions on Information Theory, and the ACM Transactions on Computation Theory.
He is the recipient of the Computational Complexity Conference best paper award (2007),
Packard Fellowship (2005), Sloan Fellowship (2005), NSF CAREER award (2004),
ACM's Doctoral Dissertation Award (2002), and the IEEE Information Theory Society
Paper Award (2000).
Date: Tuesday, March 8, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Mohammed Al-Saleh
UNM Department of Computer Science
PhD Graduate Student
Remote attackers use network reconnaissance techniques, such as port
scanning, to gain information about a victim machine and then use this
information to launch an attack. Current network reconnaissance
techniques, that are typically below the application layer, are
limited in the sense that they can only give basic information, such
as what services a victim is running. Furthermore, modern remote
exploits typically come from a server and attack a client that has
connected to it, rather than the attacker connecting directly to the
victim. In this work, we raise this question and answer it: Can the
attacker go beyond the traditional techniques of network
reconnaissance and gain high-level, detailed information?
We investigate remote timing channel attacks against ClamAV antivirus
and show that it is possible, with high accuracy, for the remote
attacker to check how up-to-date the victim's antivirus signature
database is. Because the strings the attacker uses to do this are
benign (i.e., they do not trigger the antivirus) and the attack can be
accomplished through many different APIs, the attacker has a large
amount of flexibility in hiding the attack.
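A schematic of the timing-channel measurement, in Python: time the same remotely triggered scan on two benign probe inputs and compare the distributions. The scan() function below is a hypothetical stand-in for whatever victim-side operation the attacker can trigger; it is not ClamAV's interface, and the timings and threshold are invented.

# Sketch of inferring signature-database state from scan-time differences.
import statistics, time

def scan(data):
    # Placeholder for the remotely triggered victim-side scan (hypothetical):
    # pretend inputs matching a newer signature take slightly longer.
    time.sleep(0.001 + 0.0005 * (b"NEW-SIG-FRAGMENT" in data))
    return "clean"

def timed(data, trials=30):
    samples = []
    for _ in range(trials):
        start = time.perf_counter()
        scan(data)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

old_probe = b"harmless text matching only long-present signatures"
new_probe = b"harmless text containing NEW-SIG-FRAGMENT style bytes"
delta = timed(new_probe) - timed(old_probe)
print("median timing difference: %.4f s -> database looks %s"
      % (delta, "up to date" if delta > 0.0002 else "stale"))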
Bio:
Mohammed Al-Saleh
is from Jordan. He received his Bachelor's degree from Jordan University of
Science and Technology (JUST) Computer Science Dept. in 2003. He then worked as
a Research Assistant for 2 years. He came to the US in August 2005 to continue
his studies and completed his Master's degree in computer science at New Mexico
State University (NMSU) in summer 2007. He started his PhD program at NMSU but
decided to transfer to the University of New Mexico (UNM). His advisor is Jed Crandall.
Date: Thursday, March 3, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Raul Rojas
Professor of Artificial Intelligence
Freie Universität Berlin
We have been developing autonomous cars at the Free University of Berlin since 2006. In this talk, I will describe the sensors and software used for autonomous navigation. Laser scanners provide distance measurements which allow us to compute a 3D view of the environment, while video cameras provide information about the road, traffic lights, and lane markings. The navigation software produces the desired path, which is adjusted whenever obstacles are detected. I will describe some of the short-term applications of this technology, for example for electric cars, and the ultimate goal of having driverless vehicles. I will show some videos of the vehicle in traffic and some experiments of the man-in-the-loop type. Finally, I will relate this research to city planning and will comment on the social impact of autonomous cars.
Bio:
Raul Rojas
is a professor of Computer Science and Mathematics at the Free University of Berlin
and a renowned specialist in artificial neural networks. He is now leading an autonomous car project called Spirit of Berlin.
He and his team were awarded the Wolfgang von Kempelen Prize for his work on Konrad
Zuse and the history of computers. His current research and teaching revolves around artificial intelligence and its applications.
The soccer playing robots he helped build won world championships in 2004 and 2005.
In 2009 the Mexican government created the Raul Rojas Gonzales Prize for scientific achievement by Mexican citizens. He holds degrees in mathematics and economics.
Date: Tuesday, March 1, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Deborah Frincke
Pacific Northwest National Laboratory
This talk will provide an overview of cybersecurity research at the Dept. of Energy's Pacific Northwest National Laboratory (PNNL). PNNL cybersecurity research is funded in two ways: through government contracts and through internally funded Laboratory Directed Research and Development (LDRD). This talk will provide a perspective on mission-oriented LDRD within a laboratory environment. Specifically, it will provide a high-level view of how PNNL's Information and Infrastructure Integrity initiative is organized and managed, both in terms of original goals and metrics for success. As illustration, the talk will provide highlights of three specific open research projects currently funded at PNNL: insider threat, cooperative defense, and cyber/physical systems.
Bio:
Deborah Frincke
joined the Pacific Northwest National Laboratory in 2004 as Chief Scientist for Cyber Security,
and currently leads PNNL's internal research investment in cyber security,
the Information and Infrastructure Integrity Initiative. Prior to joining PNNL,
Frincke was a (Full) Professor at the University of Idaho, and co-founder/co-director
of the University of Idaho Center for Secure and Dependable Systems, one of the first
such institutions to receive NSA's designation of a national Center of Excellence in
Information Assurance Education. She is an enthusiastic charter member of the
Department of Energy's cyber security grass roots community.
Frincke's research spans a broad cross section of computer security, both open and classified,
with a particular emphasis on infrastructure integrity and computer security education.
She co-founded TriGeo Network Systems, whose original products were based on her early
research at U of Idaho. TriGeo was recently positioned by Gartner in the "Leaders Quadrant"
for security information and event management. She has written over ninety published articles and technical reports.
Frincke is an active member of several editorial boards, including the Journal of Computer Security,
the Elsevier International Journal of Computer Networks, and the International Journal of
Information and Computer Security. She co-edits the Basic Training column of IEEE Security and Privacy.
She is a steering committee member for Recent Advances in Intrusion Detection (RAID),
Systematic Advances in Digital Forensic Engineering (SADFE), and VizSEC.
She is a member of numerous advisory boards, including the University of Washington's
Governing Board for the I-School's Center for Cyber Security and Information Assurance.
Frincke received her Ph.D. from the University of California, Davis in 1992.
Date: Thursday, February 24, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Andrew Lumsdaine
Professor of Computer Science
Computer Science Department, Indiana University
Graphs and graph algorithms have long been a fundamental abstraction in computer science, and
many types of data-driven applications - the emerging "fourth pillar" of science -
depend on graph computations. The resource requirements for graph computations
can be quite large; however, running graph algorithms on today's HPC systems
presents a number of challenges: the paradigms, software, and hardware that
have worked well for mainstream scientific applications are not well matched to large-scale graph problems.
In this talk we present the design and implementation of the Parallel Boost Graph Library,
a library of reusable high-performance software components for large-scale graph computation.
Like the sequential Boost Graph Library (BGL) upon which it is based, the Parallel BGL applies
the paradigm of generic programming to the domain of graph computations. To illustrate how the
Parallel BGL was built from the sequential BGL, we revisit the abstractions comprising the BGL
and lift away the implicit requirements of sequential execution and a single shared address space.
This process allows us to create generic algorithms having sequential expression and requiring
only the introduction of parallel data structures for parallel execution.
By characterizing these extensions as well as the extension process, we develop general principles
and patterns for using (and reusing) generic parallel software libraries. We demonstrate that the
resulting algorithm implementations are both efficient and scalable with performance results
for several algorithms implemented in the open-source Parallel Boost Graph Library.
We conclude by discussing ongoing and future work, most notably the new active-message system
being incorporated into PBGL to enable efficient execution on multi-core, hybrid, and exascale architectures.
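The Parallel BGL itself is a C++ library, but the generic-programming idea described above can be suggested in a few lines of Python: write the algorithm once against an abstract neighbors() interface and a visitor callback, so only the graph data structure has to change for a different (e.g., distributed) setting. This is a loose analogue for illustration, not the library's actual API.

# A "generic" BFS: the algorithm only assumes graph.neighbors(v) exists.
from collections import deque

def bfs(graph, source, visit):
    dist, frontier = {source: 0}, deque([source])
    while frontier:
        u = frontier.popleft()
        visit(u, dist[u])                 # visitor hook, as in BGL algorithms
        for v in graph.neighbors(u):
            if v not in dist:
                dist[v] = dist[u] + 1
                frontier.append(v)
    return dist

class AdjacencyList:                      # one possible model of the concept
    def __init__(self, edges):
        self.adj = {}
        for u, v in edges:
            self.adj.setdefault(u, []).append(v)
            self.adj.setdefault(v, []).append(u)
    def neighbors(self, u):
        return self.adj.get(u, [])

g = AdjacencyList([("a", "b"), ("b", "c"), ("a", "d"), ("d", "e")])
print(bfs(g, "a", visit=lambda v, d: None))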
Bio:
Andrew Lumsdaine
obtained his Ph.D. in Electrical Engineering and Computer Science at Massachusetts Institute of Technology in 1992.
His research interests include:
* Compilers
* Computational Photography
* Computational Science and Engineering
* Computer Networks
* Cyberinfrastructure and e-Science
* Generic Programming
* High Performance Computing
* Mathematical Software
* Numerical Analysis
* Parallel and Distributed Computing
* Programming Languages
* Software Engineering
* Software and Systems
* Visualization, Computer Vision, and Graphics.
Date: Thursday, February 17, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Yvo Desmedt
Chair of Information Communication Technology
University College London, UK
Lots of new software systems are being deployed. Moreover, several countries
are moving to e-Government. In the US, for example, Internet Voting for
military and overseas absentee voters will likely be
developed. The 2002 Help America Vote Act in essence mandates NIST to
propose such a system. A question is whether these e-Government systems are
secure. In this lecture we focus on Internet Voting.
The lecture starts by analyzing a state of the art voting system,
Helios 2.0. Helios 2.0 is a web-based open-audit voting system using state
of the art web technologies and advanced cryptographic techniques to provide
integrity of ballots and voter secrecy in an insecure Internet environment.
As we show, the cryptographic techniques used in Helios 2.0 can easily be
bypassed by exploiting well known web browser vulnerabilities.
We reflect back on the 30 years research on e-voting by the cryptographic
community and conclude that only recently researchers have started to take
into account the fact that the machine used to vote may be hacked. A first
solution to this problem was presented by Chaum. We analyze the weaknesses
of his approach. We then propose new solutions that avoid the shortcomings
of Chaum's solution.
One of the new solutions requires the use of unconditionally secure MIX
servers. To achieve this, we need secure multiparty computation over a
non-Abelian group. We explain how to achieve this in an "efficient" way when
the adversary is passive.
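The final paragraph mentions unconditionally secure MIX servers built from multiparty computation over a non-Abelian group; that machinery is well beyond a short sketch, but the basic role of a MIX cascade can be pictured in Python. Each server applies its own secret permutation to the batch of (already encrypted) ballots, so no single server can link inputs to outputs. Re-encryption and the secret sharing needed for unconditional security are deliberately omitted; treat this as a diagram in code, not a protocol.

# Toy MIX cascade: each server permutes the batch with a secret permutation.
import random

def mix_server(ballots, seed):
    order = list(range(len(ballots)))
    random.Random(seed).shuffle(order)        # server-local, secret shuffle
    return [ballots[i] for i in order]

def mix_cascade(ballots, server_seeds):
    for seed in server_seeds:
        ballots = mix_server(ballots, seed)
    return ballots

submitted = ["enc(vote-A)", "enc(vote-B)", "enc(vote-C)", "enc(vote-D)"]
output = mix_cascade(submitted, server_seeds=[11, 23, 42])
print(output)   # same multiset; the input/output correspondence is hidden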
Bio:
Yvo Desmedt
received his Ph.D., Summa cum Laude, in 1984 from the University
of Leuven, Belgium. At present, he is the Chair of Information
Communication Technology at University College London, UK. He is a Fellow of
the International Association for Cryptologic Research (IACR). His interests
include cryptography, network security and computer security. He was
program chair of several conferences and workshops, including PKC and
Crypto. He is editor-in-chief of IET Information Security and editor of 4
other journals. He has been an invited speaker at conferences and workshops on 5
continents. He has authored over 150 refereed papers. He is ranked the 4th most
productive researcher (out of 1922) at the three main research conferences
in Cryptology.
Date: Thursday, February 10, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Dorian Arnold
Assistant Professor
UNM Department of Computer Science
In this presentation, I will provide digests of some of the research going on in the Scalable Systems Lab (co-directed with Prof. Patrick Bridges). Major research areas include large-scale software infrastructures, fault tolerance and resilience, virtualization for HPC systems, and HPC tools.
Bio:
Dorian Arnold
is an assistant professor in Computer Science at the University of New Mexico
and a visiting scientist at the Lawrence Livermore National Laboratory.
He received an associate's degree from St. John's College in Belize,
a bachelor's degree from Regis University in Denver, Colorado,
a master's degree from The University of Tennessee, and a Ph.D. from the University of Wisconsin-Madison.
His research interests fall under the broad areas of high performance computing
and large scale distributed systems. In particular, he is interested in abstractions
and mechanisms that allow system non-experts to harness the power of high-performance
systems in scalable, efficient, reliable ways.
Date: Thursday, February 3, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Lydia Tapia
Assistant Professor
UNM Department of Computer Science
At first glance, robots and proteins have little in common. Robots
are commonly thought of as tools that perform tasks such as vacuuming
the floor, while proteins play essential roles in many biochemical
processes. However, the functionality of both robots and proteins is
highly dependent on their motions. In the case of robots, complex
spaces and many specialized planning methods can make finding feasible
motions an expert task. In the case of protein molecules, several
diseases such as Alzheimer's, Parkinson's, and Mad Cow Disease are
associated with protein misfolding and aggregation. Understanding of
molecular motion is still very limited because it is difficult to
observe experimentally. Therefore, intelligent computational tools
are essential to enable researchers to plan and understand motions.
In this talk, we draw from our unique perspective from robotics to
present a novel computational approach to approximate complex motions
of proteins and robots. Our technique builds a roadmap, or graph, to
capture the moveable object's behavior. This roadmap-based approach
has also proven successful in domains such as animation and RNA
folding. With this roadmap, we can find likely motions (e.g., roadmap
paths). For proteins, we demonstrate new learning-based map analysis
techniques that allow us to study critical folding events such as the
ordered formation of structural features and the time-based population
of roadmap conformers. We will show results that capture biological
findings for several proteins including Protein G and its structurally
similar mutants, NuG1 and NuG2, that demonstrate different folding
behaviors. For robots, we demonstrate new learning-based map
construction techniques that allow us to intelligently decide where
and when to apply specialized planning methods. We will show results
that demonstrate automated planning in complex spaces with little to
no overhead.
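As a concrete picture of what a roadmap is (and only that; the learning-based construction and analysis techniques are the talk's contribution), here is a bare-bones probabilistic roadmap (PRM) in Python for a 2-D point robot among circular obstacles, with all geometry invented for illustration.

# Minimal PRM: sample collision-free configurations, connect nearby visible pairs.
import math, random

OBSTACLES = [((0.5, 0.5), 0.2)]                     # (center, radius), made up

def free(p):
    return all(math.dist(p, c) > r for c, r in OBSTACLES)

def edge_free(p, q, steps=20):
    return all(free((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
               for t in (i / steps for i in range(steps + 1)))

def build_prm(n=200, radius=0.15):
    nodes = []
    while len(nodes) < n:                           # sample free configurations
        p = (random.random(), random.random())
        if free(p):
            nodes.append(p)
    edges = {i: [] for i in range(n)}
    for i in range(n):                              # connect nearby, visible pairs
        for j in range(i + 1, n):
            if math.dist(nodes[i], nodes[j]) < radius and edge_free(nodes[i], nodes[j]):
                edges[i].append(j)
                edges[j].append(i)
    return nodes, edges

nodes, edges = build_prm()
print("roadmap:", len(nodes), "nodes,",
      sum(len(v) for v in edges.values()) // 2, "edges")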
Bio:
Lydia Tapia
is an Assistant Professor at UNM's Computer Science
Department. Previously, she was a Computing Innovation Post
Doctoral Fellow in the Institute for Computational Engineering
and Sciences at the University of Texas at Austin. She received a
Ph.D. in 2009 from Texas A&M University. At A&M she participated
as a fellow in the Molecular Biophysics Training and GAANN
programs and was awarded a Sloan Scholarship and a P.E.O.
Scholars Award. Prior to graduate school, she worked as a member
of technical research staff at Sandia National Laboratories.
Date: Thursday, January 27, 2011
Time: 11:00 am — 11:50 am
Place: Mechanical Engineering 218
Tyler Moore
Harvard University
Center for Research on Computation and Society
During the past several years, online crime has organized and industrialized substantially. Profit-motivated criminals have identified many new ways to leverage the Internet's openness and scale to perpetrate high-volume, globally distributed frauds that have proven difficult to eradicate. In this talk I discuss three instances of such fraud that I have studied closely: phishing attacks impersonating banks, online-advertising fraud carried out by typosquatting, and fake online pharmacies promoted by search-engine manipulation. While these activities may appear rather different on the surface, in fact all share many similarities upon closer inspection. Each fraud exploits a lack of coordination between the Internet's defenders, compensates for low individual profits through automation, and avoids detection by spreading the harm across many victims. Furthermore, in all three cases, I present evidence that the vast bulk of the harm is carried out by a few fraudsters. Along the way, I will demonstrate several general techniques for gathering evidence of Internet frauds and analyzing their dynamics.
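Of the three frauds, typosquatting is the easiest to make concrete: register domains one keystroke away from a popular one and monetize the stray traffic. The Python sketch below enumerates simple one-character typo variants of a domain label; "example.com" is only a stand-in target, and real studies use far richer typo models.

# Enumerate one-character typo variants (omission, transposition, substitution).
import string

def typo_variants(domain):
    name, _, tld = domain.partition(".")
    variants = set()
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:] + "." + tld)           # omission
        if i + 1 < len(name):                                       # transposition
            swapped = name[:i] + name[i + 1] + name[i] + name[i + 2:]
            variants.add(swapped + "." + tld)
        for c in string.ascii_lowercase:                            # substitution
            variants.add(name[:i] + c + name[i + 1:] + "." + tld)
    variants.discard(domain)
    return sorted(variants)

candidates = typo_variants("example.com")
print(len(candidates), "candidate typo domains, e.g.", candidates[:5])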
Bio:
Tyler Moore
is a postdoctoral fellow at Harvard University's
Center for Research on Computation and Society. His research
interests include the economics of information security, the study of
electronic crime, and the development of policy for strengthening
security. Moore completed his PhD in Computer Science at the
University of Cambridge, supervised by Professor Ross Anderson. His
PhD thesis investigated cooperative attack and defense in the design
of decentralized wireless networks and through empirical analysis of
phishing attacks on the Internet. Moore has also written reports for
the European Union and US National Academy of Sciences detailing
policy recommendations for improving cyber security. As an
undergraduate, he studied at the University of Tulsa, identifying
several vulnerabilities in the public telephone network's underlying
signaling protocols. He is a 2004 Marshall Scholar.