
This is the OR calendar for upcoming seminars, brown bags, guest lectures, and events of interest.


Deterministic Methods Inspired by Stochastic Programming and Risk Management

Speaker: Matthew Norton, University of Florida, Department of Industrial and Systems Engineering
Operations Research Seminar Thursday 7/20 at 1500, Glasgow Hall 118

Abstract: When uncertainty arises in engineering applications, particularly those involving optimization, stochastic methods are often the first choice for characterizing uncertainty and risk and for building robust models. These methods, however, can be computationally demanding and unintuitive for practitioners more familiar with a deterministic regime. This has inspired the creation of deterministic alternatives that can be used in optimization or engineering applications in similar ways. We discuss recent work on two of these methods. First, we discuss a new characteristic, the Cardinality of the Upper Average, used to count the number of large components in a data set. We show that this new characteristic is not only informative, but also easy to integrate into optimization frameworks where large outcomes are undesirable and the frequency of their occurrence should be minimized. Second, we discuss new developments in robust optimization, a popular deterministic alternative to stochastic programming. Specifically, we argue for the benefits of taking an optimistic, best-case view of uncertainty as opposed to a purely pessimistic, worst-case view. Over the course of the presentation, we demonstrate the use of these new deterministic techniques for network optimization, capacity planning, and machine learning.
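The abstract does not define the Cardinality of the Upper Average. A minimal sketch under one natural reading — the largest number k of top components whose average still meets a threshold z — might look like the following; the function name and this formalization are assumptions for illustration, not Norton's definition:

```python
import numpy as np

def cardinality_of_upper_average(x, z):
    """Largest k such that the mean of the k largest entries of x is >= z.
    Returns 0 if even the maximum falls below z.
    (Hypothetical formalization for illustration only.)"""
    s = np.sort(np.asarray(x, dtype=float))[::-1]          # descending order
    top_k_means = np.cumsum(s) / np.arange(1, len(s) + 1)  # mean of k largest, k = 1..n
    ks = np.nonzero(top_k_means >= z)[0]
    return 0 if ks.size == 0 else int(ks[-1] + 1)

data = [1.0, 9.0, 4.0, 7.0, 2.0]
print(cardinality_of_upper_average(data, z=5.0))  # → 4
```

Under this reading the quantity is a simple count, which is what makes it easy to fold into optimization models that penalize the frequency of large outcomes.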

Modeling Uncertainty of Expert-Elicited Data for Use in Risk-Based Capital Budgeting

Speaker: LTC Michael Teter, Director, TRAC-Monterey
Operations Research Seminar 4/27 at 1500, Glasgow 118

Abstract: TRAC studies often rely on elicited Subject Matter Expert (SME) opinion to determine parameter values. These elicitations do not always yield perfect information. As part of TRAC research, we model the unknown true parameter values as random variables from to-be-determined distributions. Using SME-elicited data, we estimate distributions using both univariate and multivariate densities. For the first time, we derive and implement a three-dimensional density estimate using exponentially transformed second-order epi-splines to compare results with other estimation techniques. Sampling from these densities, we generate scenarios and implement a risk-based capital budgeting optimization model for demonstration purposes. Our research findings demonstrate that naively averaging SME data for use in an optimization model, rather than incorporating uncertainty, results in an overly optimistic portfolio. We demonstrate a 20% improvement when using our approach as opposed to the naive method.
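As a rough illustration of the abstract's central finding, the sketch below swaps the talk's epi-spline density estimates for a plain Gaussian kernel density estimate and uses made-up SME numbers. It shows why optimizing against the naive average of elicited values is overly optimistic when the payoff is concave (Jensen's inequality):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SME elicitations of a project's return (in $M).
sme = np.array([4.0, 5.5, 6.0, 7.5, 12.0])

# Stand-in for the talk's epi-spline density: a simple Gaussian kernel
# density estimate, sampled by jittering resampled data points.
bandwidth = 1.0
scenarios = rng.choice(sme, size=10_000) + bandwidth * rng.standard_normal(10_000)

# Concave payoff: returns above a $6M capacity cap are not realized.
def payoff(x):
    return np.minimum(x, 6.0)

naive = payoff(sme.mean())                  # plan built on the averaged input
scenario_based = payoff(scenarios).mean()   # expectation over uncertainty

print(f"naive: {naive:.2f}, scenario-based: {scenario_based:.2f}")
```

The naive plan values the project at the full cap, while averaging the payoff over sampled scenarios gives a strictly lower (more honest) figure — the same direction of bias the abstract reports for the naive portfolio.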

Lieutenant Colonel Michael D. Teter was appointed as the Director, United States Army Training and Doctrine Command Analysis Center - Monterey on September 15, 2016. As director, he leads the organization with a mission to conduct relevant and credible applied research to improve military operations analysis. Lieutenant Colonel Teter has a Master of Science Degree in Mineral and Energy Economics and a Doctor of Philosophy in Operations Research with Engineering from the Colorado School of Mines.

 

Tensors and their Eigenvectors

Speaker: Bernd Sturmfels (UC Berkeley and MPI Leipzig)
Operations Research Seminar Friday 1/27 at 1200 in Glasgow 286

Abstract: Eigenvectors of square matrices are central to linear algebra. Eigenvectors of tensors are a natural generalization. The spectral theory of tensors was pioneered by Lim and Qi over a decade ago, and it has found numerous applications. This lecture offers a first introduction, with emphasis on algebraic aspects.
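In the spectral theory of Lim and Qi that the lecture builds on, a Z-eigenpair of a symmetric order-3 tensor T satisfies T x x = λ x with ‖x‖ = 1, where (T x x)_i = Σ_jk T_ijk x_j x_k. A minimal higher-order power iteration, sketched here on a toy diagonal tensor, recovers such a pair:

```python
import numpy as np

def txx(T, x):
    """Contract an order-3 tensor with x twice: (T x x)_i = sum_{j,k} T[i,j,k] x[j] x[k]."""
    return np.einsum('ijk,j,k->i', T, x, x)

def tensor_power_iteration(T, x0, iters=200):
    """Plain higher-order power iteration for a Z-eigenpair of a symmetric
    order-3 tensor: seeks T x x = lam * x with ||x|| = 1. (It can fail to
    converge for some tensors; shifted variants such as SS-HOPM fix that.)"""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        y = txx(T, x)
        x = y / np.linalg.norm(y)
    return x, x @ txx(T, x)   # eigenvector and eigenvalue lam = x . (T x x)

# Diagonal symmetric tensor with T[0,0,0] = 2 and T[1,1,1] = 1:
# its Z-eigenpairs include the coordinate vectors e_i with eigenvalues 2 and 1.
T = np.zeros((2, 2, 2))
T[0, 0, 0], T[1, 1, 1] = 2.0, 1.0

x, lam = tensor_power_iteration(T, np.array([0.6, 0.8]))
print(np.round(x, 6), round(lam, 6))  # converges to e_1 with eigenvalue 2
```

The toy example only illustrates the definition; the algebraic questions the lecture emphasizes (how many eigenpairs, and over which field) are where the subject gets interesting.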

Bernd Sturmfels received doctoral degrees in Mathematics in 1987 from the University of Washington, Seattle, and the Technical University Darmstadt, Germany. After postdoctoral years in Minneapolis and Linz, Austria, he taught at Cornell University before joining UC Berkeley in 1995, where he is Professor of Mathematics, Statistics and Computer Science. Since 2017 he has been a director at the Max-Planck Institute for Mathematics in the Sciences, Leipzig. His honors include a Sloan Fellowship, a David and Lucile Packard Fellowship, a Clay Senior Scholarship, an Alexander von Humboldt Senior Research Prize, the SIAM von Neumann Lecturership, the Sarlo Distinguished Mentoring Award, and an Einstein Fellowship in Berlin. He served as Vice President of the American Mathematical Society, and he was awarded an honorary doctorate from Goethe University Frankfurt in 2015. A leading experimentalist among mathematicians, Sturmfels has authored ten books and 240 research articles in the areas of combinatorics, algebraic geometry, symbolic computation, and their applications. He has mentored 41 doctoral students and numerous postdocs. His current research addresses questions in algebra that are inspired by statistics, optimization, and biology.


Distinguished Lecture Series

Perspectives on Stochastic Modeling
Professor Peter Glynn, Stanford University

2 June 2017
Glasgow 109 at 1100
See the slides | Watch the video

Uncertainty is present in almost every decision-making environment. In many applications settings, the explicit quantitative modeling of uncertainty clearly improves decision-making. In this talk, I will discuss some perspectives on such stochastic models and their application. Specifically, I will talk about the interplay between modeling, data, and computation, and some of the lessons learned that are relevant to building models that can add value and insight.

Peter W. Glynn is the Thomas Ford Professor in the Department of Management Science and Engineering (MS&E) at Stanford University. He is a Fellow of INFORMS and of the Institute of Mathematical Statistics, and has been co-winner of Best Publication Awards from the INFORMS Simulation Society in 1993, 2008, and 2016 and the INFORMS Applied Probability Society in 2009. He was the co-winner of the John von Neumann Theory Prize from INFORMS in 2010, and in 2012 he was elected to the National Academy of Engineering. His research interests lie in stochastic simulation, queueing theory, and statistical inference for stochastic processes.

The lecture will be recorded and posted.

Bayesian Search for Missing Aircraft
Lawrence D. Stone

20 April 2017
Glasgow Hall 109 at 1500

See the slides | Watch the video

In recent years there have been a number of highly publicized searches for missing aircraft such as the ones for Air France flight AF 447 and Malaysia Airlines flight MH 370.

Bayesian search theory provides a well-developed method for planning searches for missing aircraft, ships lost at sea, or people missing on land. The theory has been applied successfully to searches for the missing US nuclear submarine Scorpion, the SS Central America (the "ship of gold"), and the wreck of AF 447. It is used routinely by the U.S. Coast Guard to find people and ships missing at sea.

This talk presents the basic elements of the theory. It describes how Bayesian search theory was used to locate the wreck of AF 447 after two years of unsuccessful search and discusses how it was applied to the search for MH 370. A crucial feature of Bayesian search theory is that it provides a principled method of combining all the available information about the location of a search object. This is particularly important in one-of-a-kind searches such as the one for AF 447, where there is little or no statistical data to rely upon.
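The core mechanism behind these searches is simple: after an unsuccessful search, Bayes' rule downweights the searched region by its miss probability and renormalizes, so an unproductive search still sharpens the location estimate. A minimal sketch with hypothetical cell probabilities:

```python
import numpy as np

# Prior probability that the wreck lies in each of four search cells
# (hypothetical numbers for illustration).
prior = np.array([0.4, 0.3, 0.2, 0.1])

# Probability of detecting the wreck if we search a cell and it is there.
p_detect = 0.8

def update_after_failed_search(prior, cell, p_detect):
    """Bayes update after searching `cell` without finding the object:
    the searched cell keeps only its miss probability (1 - p_detect),
    then the distribution is renormalized."""
    post = prior.copy()
    post[cell] *= (1.0 - p_detect)
    return post / post.sum()

# Search the most likely cell and fail: its probability drops, others rise.
posterior = update_after_failed_search(prior, cell=0, p_detect=p_detect)
print(np.round(posterior, 4))  # → [0.1176 0.4412 0.2941 0.1471]
```

Iterating this update over many searched regions, with realistic detection models, is what eventually concentrated the AF 447 posterior on the area where the wreck was found.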

Applied Risk Analytics: Making Advanced Analytics More Useful
Dr. Tony Cox

4 Sep 2016
See the slides | Watch the video

Traditional operations research emphasizes finding a feasible decision that maximizes an objective function.  In practice, how decisions affect the objective function, and even what decisions are feasible, are often initially unknown. Managing risks effectively usually requires using available data, however limited, to answer the following questions, and then improve the answers in light of experience:

  1. DESCRIPTIVE ANALYTICS: What’s happening? What has changed recently? What should we be worrying about?
  2. PREDICTIVE ANALYTICS: What will (probably) happen if we do nothing?
  3. CAUSAL ANALYTICS:  What will (probably) happen if we take different actions or implement different policies? How soon are the consequences likely to occur, and how sure can we be?
  4. PRESCRIPTIVE ANALYTICS: What should we do next? How should we allocate available resources to explore, evaluate, and implement different actions or policies in different locations?
  5. EVALUATION ANALYTICS: How well are our risk management policies and decisions working? Are they producing (only) their intended effects? For what conditions or sub-populations do they work or fail?
  6. LEARNING ANALYTICS: How might we do better, taking into account value of information and opportunities to learn from small trials before scaling up?
  7. COLLABORATIVE ANALYTICS:  How can we manage uncertain risks more effectively together?

This talk discusses recent advances in these areas and suggests how they might be integrated into a single decision support framework, which we call risk analytics, and applied to important policy questions such as whether, when, and how to revise risk management regulations or policies. Current technical methods of risk analytics, including change point analysis, quasi-experimental design and analysis, causal graph modeling, Bayesian networks and influence diagrams, Granger causality and transfer entropy methods for time series, causal analysis and modeling, and low-regret learning, provide a valuable toolkit for using data to assess and improve the performance of risk management decisions and policies by actively discovering what works, what does not, and how to improve over time.
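As a taste of the toolkit, one of its simplest members — single change point detection by least squares, a relative of the CUSUM-style methods mentioned above — can be sketched as follows; the data and function are illustrative only:

```python
import numpy as np

def single_changepoint(x):
    """Least-squares single change point: find the split that minimizes the
    total squared deviation of each segment from its own mean. Returns the
    index where the second segment starts."""
    x = np.asarray(x, dtype=float)
    best_k, best_cost = 1, np.inf
    for k in range(1, len(x)):
        left, right = x[:k], x[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# A series whose mean shifts from about 0 to about 3 at index 5 —
# e.g. an incident rate that jumped after a policy change.
series = [0.1, -0.2, 0.0, 0.2, -0.1, 3.1, 2.9, 3.0, 3.2, 2.8]
print(single_changepoint(series))  # → 5
```

Detecting *that* something changed is the descriptive-analytics step; the harder causal and prescriptive questions in the list above ask *why* it changed and what to do about it.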