Date: Tuesday, May 5th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Dongwan Shin
New Mexico Tech
Abstract:
The uses of personal information in online social networking are
manifold, including the representation of an individual's digital
persona (social role) and identification, and so are the opportunities
for abuse or misuse of that information. The issue of privacy is
critically important in this context. Privacy encompasses the right to
control information about individuals, including the right to limit
access to that information, and the loss of such control often leaves
us exposed to a bewildering range of intended and unintended
consequences, including criminal activities ranging from identity theft
to online and physical stalking, and from embarrassment to price
discrimination and blackmail. In this talk, I will present a novel
framework for enabling user-controlled sharing of sensitive personal
information for better privacy protection in current online social
networks. Specifically, the framework, called U-Control, is proposed to
facilitate digital persona and privacy management (DPPM) in a
user-centric way, so that it can satisfy diverse privacy requirements
and specifications across different social network environments.
Bio:
Dr. Shin is currently an assistant professor of Computer Science and
Engineering at New Mexico Tech. His research interests lie in the areas
of computer security and privacy, especially access control, digital
identity, pervasive computing security, and applied cryptography. His
research at Tech has been supported by the National Science Foundation,
Sandia National Laboratories, the Department of Defense, Intel, and New
Mexico Tech. He is currently the director of the Secure Computing Laboratory
at New Mexico Tech. Prior to joining Tech, he was actively involved in a
variety of research projects sponsored by NSA, DoE, ETRI, and Bank of
America. Dongwan received his PhD in Information Technology from the
University of North Carolina at Charlotte.
Date: Thursday, May 7th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Milan Stojanovic
Columbia University and Center for Molecular Cybernetics
Abstract:
The primary focus of the seminar will be on robotic behaviors in single
molecules. We start with legs implementing local residency rules, then
build higher-order behaviors by adding leg-to-leg communication and
introducing various sensors. Current robots walk directionally over
prescriptive landscapes or walk randomly in search of sticky points.
But could they leave their well-controlled environments and interact
with living tissues?
Date: Thursday, April 23rd, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Jonathan J. Mandeville
Verizon FNS (at Sandia National Laboratories)
Abstract:
Digital forensics is a branch of forensic science that seeks to
understand artifacts in computers, portable electronic devices, and
any other form of electronic media. The goal may be to investigate a
cyber intrusion, employee waste/fraud/abuse, or other criminal
activities. Each year millions of dollars are spent on digital
forensics equipment, training, and personnel by corporations and
governments around the world. There is also a growing industry in
private-sector data recovery and investigation that uses principles of
digital forensics. Closely related to digital forensics are intrusion
detection and cyber security. This presentation will cover the basics
of digital forensics, including chain of custody and
evidence-gathering methods, software and hardware tools, and courtroom
testimony, and will also discuss preparation for internships or
employment with law enforcement or other government agencies. If
there is time, some discussion of intrusion detection systems (IDS)
will be included.
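For context, one routine step behind the chain-of-custody and evidence-gathering topics above is verifying that a forensic image has not changed since acquisition, typically by recording a cryptographic hash. The short Python sketch below is a generic illustration of that step, not material from the talk; the image file name is hypothetical.

    import hashlib

    def hash_evidence(path, algorithm="sha256", chunk_size=1 << 20):
        """Return the hex digest of an evidence file, read in chunks so
        large disk images do not have to fit in memory."""
        h = hashlib.new(algorithm)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        # Hypothetical image name, for illustration only.
        digest = hash_evidence("suspect_drive.dd")
        print("sha256:", digest)
        # The same digest recomputed later (or by another examiner) should
        # match the value recorded when the image was acquired.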
Bio:
Jonathan J. Mandeville is a contractor at Sandia National Labs in the
Cyber Monitoring and Policies group. Prior to his current role he
helped develop and deploy a program at Sandia that secures laptop
computers for travel abroad. He is in the process of earning his
EnCE (EnCase Certified Examiner) certification and is a Certified
Ethical Hacker. He expects to
graduate with his B.S. in Computer Science from the University of New
Mexico in the Spring of 2010.
Date: Thursday, April 16th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Marko A. Rodriguez
Center for Nonlinear Studies
Los Alamos National Laboratory
Abstract:
The World Wide Web is the de facto medium for publicly exposing a corpus of interrelated documents.
In its current form, the World Wide Web is the Web of Documents.
The next generation of the World Wide Web will support the Web of Data.
The Web of Data utilizes the same Uniform Resource Identifier (URI) address space as the Web of Documents,
but instead of exposing a graph of documents, the Web of Data exposes a graph of data.
Given that the URI address space of the Web is distributed and infinite,
the Web of Data provides a single unified space in which the world's data can be publicly exposed and interrelated.
The Web of Data is supported by both graph databases (which structure the data) and distributed computing mechanisms
(which process the data). This presentation will discuss the Web of Data, graph databases, and models of computing in this emerging space.
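To make the "graph of data" idea concrete, the short Python sketch below models a few triples whose nodes and edge labels are URIs and walks the edges leaving one node. It is only a toy illustration of the Web of Data concept (the URIs are made up), not the speaker's tooling; real deployments use RDF stores and dereferenceable URIs.

    # A toy "graph of data": nodes and edge labels are URIs, and the graph
    # is just a set of (subject, predicate, object) triples.
    triples = {
        ("http://example.org/marko", "http://example.org/worksAt", "http://example.org/lanl"),
        ("http://example.org/marko", "http://example.org/authorOf", "http://example.org/paper42"),
        ("http://example.org/paper42", "http://example.org/cites", "http://example.org/paper7"),
    }

    def neighbors(graph, subject):
        """Return (predicate, object) pairs leaving a given subject URI."""
        return [(p, o) for (s, p, o) in graph if s == subject]

    for predicate, obj in neighbors(triples, "http://example.org/marko"):
        print(predicate, "->", obj)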
Bio:
Marko A. Rodriguez is a Director's Fellow at the Center for Nonlinear Studies at the Los Alamos National Laboratory. While Marko has degrees in both cognitive science (BS) and computer science (MS and PhD), he is very eclectic in his research interests. Marko focuses on multi-relational graph analysis techniques, models of computing on the Web, novel logics and reasoning mechanisms, as well as computational systems to support various ethical theories. Finally, Marko is also the co-founder and chief technology officer of the Santa Fe-based Knowledge Reef Systems Inc., where he focuses on novel algorithms to support the scholarly communication process. Please visit Marko at http://markorodriguez.com for more information.
Date: Thursday, April 2nd, 2009
Time: 11 am — 12:15 pm
Place: Centennial Engineering Center Auditorium
Dilma da Silva
IBM T. J. Watson Research Center
Abstract:
Cloud Computing has been receiving a lot of attention from the computing community.
It is perceived by some as the "IT fad of the moment" and by others as a revolutionary approach to deliver computing services.
In this talk we analyze cloud computing from the perspective of system software,
exploring how this new model impacts current practices in operating systems and distributed computing.
We identify a set of exciting research opportunities in resource management for cloud computing
and discuss how cloud computing itself affects the way we carry out research projects.
Bio:
Dilma da Silva is a researcher at the IBM T. J. Watson Research Center, in New
York. She manages the Advanced Operating Systems group. She received her Ph.D. in Computer Science from Georgia Tech in 1997.
Prior to joining IBM, she was an Assistant Professor at the University of Sao Paulo, Brazil.
Her research in operating systems addresses the need for scalable and customizable system software.
She has published more than 60 technical papers. Dilma is a member of the board of CRA-W (Computer Research Association's Committee
on the Status of Women in Computing Research) and a co-founder of the Latinas in Computing group.
For relaxation and inspiration, Dilma spends time reading novels, knitting, and coming up with plots for books she may write one day.
For more information, visit www.research.ibm.com/people/d/dilma
Date: Thursday, March 26th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Patrick Widener, Ph.D.
Department of Computer Science
University of New Mexico
Abstract:
Future I/O systems for increasingly data-intensive computing environments face a challenging set of requirements.
Data extraction must be efficient, fast, and flexible; on-demand data annotation — metadata creation and management — must
be possible without modifying application code; and data products must be available for concurrent use by multiple applications
(such as visualization and storage), requiring consistency management and scheduling. In this talk, I will present a collection
of techniques (DataTaps, IOgraphs, and Metabots) designed to address these challenges by decoupling data operations
in space and in time from core application codes. Our research results show that these techniques can extract data efficiently
and without perturbing compute nodes, that they can be used to perform application-specific transformations while maintaining
acceptable I/O bandwidth and avoiding back-pressure, and that they can exploit "in-band" and "out-of-band" decoupling
to improve overall I/O performance. Our approach enables the creation of scalable data services which can keep pace with the next generation of data-intensive applications.
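As a rough illustration of the general idea of decoupling data operations in time from a compute loop (not the DataTap/IOgraph/Metabot implementation itself), the Python sketch below has the "application" merely enqueue its output while a background thread drains the queue and performs the slower write; the bounded queue makes back-pressure visible. File and variable names are hypothetical.

    import queue
    import threading

    out_q = queue.Queue(maxsize=64)   # bounded, so back-pressure is visible

    def io_worker(path):
        # Drains the queue off the critical path and performs the write,
        # a stand-in for annotation/transformation plus storage.
        with open(path, "w") as f:
            while True:
                item = out_q.get()
                if item is None:          # sentinel: no more data
                    break
                f.write(f"{item}\n")
                out_q.task_done()

    worker = threading.Thread(target=io_worker, args=("output.log",))
    worker.start()

    for step in range(1000):              # stand-in for the compute loop
        out_q.put(step * step)            # blocks only if the queue is full

    out_q.put(None)
    worker.join()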
Bio:
Patrick M. Widener is a Research Assistant Professor in the Department of Computer Science at the University of New Mexico.
Dr. Widener's research interests include experimental systems, I/O and storage software for large-data environments,
middleware, and the generation and use of metadata. He focuses primarily on high-performance, enterprise, and pervasive application domains.
Dr. Widener received his Ph.D. in Computer Science from the Georgia Institute of Technology in 2005.
Prior to beginning his Ph.D. study, he was employed as a software developer by several companies which no longer exist.
He also holds a Master of Computer Science degree from the University of Virginia (1992), and a Bachelor of Science in Computer Science from James Madison University (1990).
Date: Thursday, March 12th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Edward J. Nava
Sandia National Laboratories
Abstract:
Considerable work is done today to analyze hardware, application software, operating systems,
and network communication components for security issues and to develop enhancements for each in an effort to achieve a more secure system.
A typical approach is to develop and analyze each of these independently and then develop security enhancements for each.
With this approach, improvements can be made but the overall system security may still be weak.
In order to effectively analyze a system and develop a secure solution, all components of the system must be included in the process.
The operating environment and the life cycle of the system must also be considered.
This presentation describes some of the shortcomings of current designs and outlines the beginning steps for a systems-level approach for designing a secure computing system.
Biography:
A graduate of New Mexico State University, the University of New Mexico, and Stanford University, Edward Nava has worked at Sandia National Laboratories since 1979.
His initial assignment was in the intrusion detection sensors division, where he conducted research on new sensors, sensor configurations,
and signal processing techniques. Later, he moved to the exploratory systems division, where he developed Kalman-filter-based,
real-time aided navigation systems for military terrestrial and space applications. In 1987, he was promoted
and led a group that designed multi-processor flight computers for real-time applications and other electronic weapon subsystems.
In 1996, he moved to the Systems Analysis and Research Center, where he leads vulnerability analysis activities for high-consequence systems for the US Government.
In 1985 he was commissioned as an Engineering Duty Officer in the US Navy Reserve. As part of his Navy duties, he focused on Information Security for ship-borne systems.
In 2001, he was recalled to active duty by the US Navy, first to teach at the US Naval Academy, and later to represent the Navy at Los Alamos National Laboratory.
He was released from active duty in 2006 and returned to Sandia.
Date: Thursday, February 24th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Scott Miller
LANL
Abstract:
Economic forces and user demand have driven computer systems to increasing complexity and to increasing deployment speed.
All complex systems have complicated failure modes, and we have become reliant on prophylactic anti-malware systems that are increasingly expensive and decreasingly effective.
How does the human immune system prevent and manage infection without an A/V subscription or updates?
This talk will overview some of the key systems present in human immunity -- self/non-self discrimination, data reduction, federated systems, security in depth -- from a Computer Science perspective.
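As a concrete, if toy, example of the self/non-self idea viewed from a Computer Science perspective, the Python sketch below follows the classic negative-selection recipe from artificial immune systems: random detectors that match normal ("self") data are discarded, so any later detector match flags non-self. This is purely illustrative and is not the speaker's system.

    import random

    random.seed(1)
    ALPHABET = "abcdef"
    DETECTOR_LEN = 3

    # "Self": strings observed during normal operation (made-up data).
    self_strings = ["abcabcabd", "bcdbcdbce", "cdecdecdf"]

    def occurs_in_self(pattern):
        return any(pattern in s for s in self_strings)

    # Negative selection: generate random detectors and discard any that
    # ever match self, keeping only detectors that can match non-self.
    detectors = set()
    while len(detectors) < 20:
        candidate = "".join(random.choice(ALPHABET) for _ in range(DETECTOR_LEN))
        if not occurs_in_self(candidate):
            detectors.add(candidate)

    def is_anomalous(sample):
        """Flag a sample as non-self if any surviving detector matches it."""
        return any(d in sample for d in detectors)

    print(is_anomalous("abcabcabd"))  # a self string: no detector matches, so False
    print(is_anomalous("ffffff"))     # unseen pattern: flagged if any detector matches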
Biography:
Scott Miller works in the Advanced Computing Solutions Program,
a Los Alamos National Laboratory organization chartered with the forward-thinking research and development of next generation security systems.
He holds a Master's degree in Computer Science from the New Mexico Institute of Mining and Technology; his thesis was "A Bioinformatics Approach to the Automated Analysis of Binary Executables."
Date: Thursday, February 19th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Connie U. Smith
L&S Computer Technology, Inc
Abstract:
Performance, both responsiveness and scalability, is an important quality of today's software.
Yet many software systems cannot be used as they are initially implemented due to performance problems.
These performance failures can translate into significant costs due to damaged customer relations, lost income, and time and budget overruns to correct the problem.
Our experience is that performance problems are most often due to fundamental architectural or design problems rather than inefficient coding.
Thus, performance problems are introduced early in the development process but are typically not discovered until late, when they are more difficult and costly to fix.
The talk first introduces the Software Performance Engineering (SPE) approach for predicting performance during the early stages of development, before the architecture is fully determined. Next, the software and system performance model technology and data requirements are explained. Then an overview of the foundations of the software performance model interoperability approach is presented, covering two performance model interchange formats, S-PMIF and PMIF, and proof-of-concept results of model interoperability in the SPE process. We briefly cover recent extensions to the model interoperability approach, and conclude with a discussion of future work.
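As a flavor of the kind of back-of-the-envelope prediction that early performance analysis enables (this is a plain M/M/1 queueing approximation with made-up numbers, not the SPE-ED tool or the PMIF/S-PMIF formats), the Python sketch below estimates how response time grows as a resource with a fixed per-request service demand approaches saturation.

    def response_time(service_demand_s, arrival_rate_per_s):
        """Mean response time of a single queueing resource (M/M/1)."""
        utilization = arrival_rate_per_s * service_demand_s
        if utilization >= 1.0:
            raise ValueError("resource saturated: demand exceeds capacity")
        return service_demand_s / (1.0 - utilization)

    # Hypothetical design-stage estimate: each request needs 0.05 s of CPU.
    for rate in (5, 10, 15, 19):
        print(f"{rate:2d} req/s -> {response_time(0.05, rate):.3f} s")

With these numbers the predicted response time roughly doubles between 10 and 15 requests per second and reaches a full second near saturation, the sort of early warning this approach aims to surface before the architecture is fixed.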
Bio:
Connie U. Smith, Ph.D. is a principal consultant of the Performance Engineering Services Division of L&S Computer Technology, Inc.
She received a BA in mathematics from the University of Colorado and MA and Ph.D. degrees in computer science from the University of Texas at Austin.
She is the author of Performance Engineering of Software Systems published in 1990 by Addison-Wesley, Performance Solutions: A Practical Guide to Creating Responsive,
Scalable Software published in 2002 in Addison-Wesley's Object Technology Series, and approximately 100 scientific papers.
She is the creator of the SPE-ED™ performance engineering tool and collaborated on developing several performance model interchange formats.
She has over 25 years of experience in the practice, research, and development of SPE performance prediction techniques, computer performance modeling and evaluation,
performance patterns and antipatterns, tool interoperability, and tool development.
Dr. Smith received the Computer Measurement Group's prestigious AA Michelson lifetime achievement award for technical excellence and professional contributions for her SPE work.
She frequently serves on conference and program committees: she founded and chaired the First International Workshop on Software and Performance (WOSP) in 1998, has served on conference and program committees for subsequent WOSP conferences, and currently leads the WOSP Advisory Committee. She served as an officer of ACM SIGMETRICS for 10 years, is a past ACM National Lecturer, and is an active member of the Computer Measurement Group and a participant in the Quantitative Evaluation of Systems (QEST) conferences. She was previously a faculty member of the computer science department at Duke University. Since then she has been with L&S Computer Technology, specializing in the development and support of the software performance engineering tool SPE-ED™, applying performance prediction techniques to software, teaching SPE seminars, and researching and writing on SPE. Dr. Smith can be reached by email at cusmith@spe-ed.com. A list of publications may be found at http://www.spe-ed.com.
Date: Thursday, February 12th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Andrea Polli
Director, Interdisciplinary Film and Digital Media (IFDM)
UNM
Abstract:
As has been seen in recent hurricane disasters, many lives can depend
on the interpretation of global information. A language, or series of languages, for communicating this mass of data must evolve,
and part of that evolution must include the work of artists.
The interpretation and presentation of data using sound is part of a growing movement in what is called data sonification. Like its more popular counterpart, data visualization, sonification transforms data in an attempt to communicate meaning.
Andrea Polli presents her sonification research interpreting actual and simulated data that describe local and global climates.
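For readers new to sonification, the Python sketch below is a minimal, purely illustrative example of the general idea, unrelated to the speaker's work: a short series of made-up data values is mapped linearly onto pitches and written as tones to a WAV file using only the standard library.

    import math
    import struct
    import wave

    RATE = 44100          # samples per second
    TONE_SECONDS = 0.25   # duration of the tone for each data value

    def value_to_freq(value, lo, hi, f_lo=220.0, f_hi=880.0):
        """Linearly map a data value in [lo, hi] to a frequency in [f_lo, f_hi]."""
        t = (value - lo) / (hi - lo)
        return f_lo + t * (f_hi - f_lo)

    data = [-0.2, 0.0, 0.1, 0.3, 0.6, 0.4, 0.8]   # hypothetical data values

    with wave.open("sonification.wav", "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)          # 16-bit samples
        w.setframerate(RATE)
        for value in data:
            freq = value_to_freq(value, min(data), max(data))
            for n in range(int(RATE * TONE_SECONDS)):
                sample = int(32000 * math.sin(2 * math.pi * freq * n / RATE))
                w.writeframes(struct.pack("<h", sample))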
Bio:
http://www.andreapolli.com/bio.htm
Date: Thursday, February 5th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Dorian Arnold
Assistant Professor, UNM Computer Science
Abstract:
As high performance computing (HPC) systems continue to increase in
size, scalable and reliable computational models become critical.
Tree-based overlay networks (TBONs) leverage the logarithmic scaling
properties of the tree organization to provide scalable data
multicast, data gather, and data aggregation services. In this
talk, I describe our use of the tree-based overlay network (TBON)
model to address tool and application scalability. In particular, I present
MRNet, our TBON prototype, and several example MRNet applications
including debugging and profiling tools developed and used at the
Lawrence Livermore and the Los Alamos National Laboratories. I will
also highlight some novel algorithms we developed for failure
recovery in TBON environments and describe the major research
directions I am currently exploring.
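To illustrate why tree-based aggregation scales, the Python sketch below reduces 1024 hypothetical leaf values through a 4-ary tree: each "node" combines at most four partial results, so no single node touches all the data and the number of levels grows only logarithmically. This is a sequential toy, not MRNet.

    def aggregate(values, fanout=4, combine=sum):
        """Reduce leaf values level by level, as a k-ary aggregation tree would."""
        level = list(values)
        levels = 0
        while len(level) > 1:
            level = [combine(level[i:i + fanout])
                     for i in range(0, len(level), fanout)]
            levels += 1
        return level[0], levels

    leaf_values = list(range(1, 1025))   # 1024 hypothetical back-end values
    total, depth = aggregate(leaf_values)
    print(total, depth)                  # 524800 and 5 levels (4^5 = 1024)

Doubling the number of leaves adds only half a level at this fanout, which is the logarithmic scaling property the abstract refers to.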
Bio:
Dorian is an assistant professor in the Department of Computer Science
at the University of New Mexico. His research focuses on the scalable
performance and reliability of extremely large scale systems with
tens of thousands, hundreds of thousands or even millions of cores.
Dorian earned his Ph.D. at the University of Wisconsin in 2008, where he
developed MRNet with Phil Roth and their advisor, Barton Miller. He
received his M.S. from the University of Tennessee in 1998 and his
B.S. from Regis University in 1996. Dorian also worked in the
Innovative Computing Laboratory, directed by Dr. Jack Dongarra, as
technical lead of the NetSolve project from 1999 to 2001; NetSolve won
an R&D 100 award in 2000. As a student scholar at the Lawrence
Livermore National Laboratory in 2006, Dorian (in collaboration with
LLNL researchers) developed the Stack Trace Analysis Tool for
effectively debugging large scale applications.
Date: Thursday, January 29th, 2009
Time: 11 am — 12:15 pm
Place: ME 218
Prof. Nina Amenta
UC Davis
Abstract:
Evolutionary histories for groups of living organisms are now routinely constructed based on genomic data. These phylogenies imply theories about what
the hypothetical ancestor species looked like, which can be visualized dramatically using modern computer graphics.
As an example, we present some visualizations of the skulls of the hypothetical ancestors of the Old World monkeys, based on the skulls of their living descendants. But these reconstructions are clearly not the whole story; the oldest fossils in the group are quite different from our reconstructions.
We consider integrating the information from fossils into the tree, in order to not only improve the visualization, but also to study the differences between possible placements of the fossil in the tree.
Bio:
Prof. Nina Amenta studies computational geometry and three-dimensional geometry processing, and applications of these computational techniques to the visualization of biological data.
She received her BA at Yale University in 1979 and worked for several years in the medical ultrasound industry. She holds a PhD from the University of California at Berkeley and
was a professor at the University of Texas at Austin from 1997 to 2002, prior to her current appointment at the University of California at Davis. She is a U.C. Davis Chancellor's Fellow
and an editor of ACM Transactions on Graphics.