Fall 2023

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Some talks will be virtual, and some talks will be in-person. For in-person talks, a concurrent Microsoft Teams meeting will be run to allow virtual attendance. Please address inquiries/suggestions to Dr. Rowe at daniel.rowe@marquette.edu


September 1 - 

 

September 8 - 

 

September 15 - 

 

September 22 - 

 

September 29 - 

 

October 6 - 

 

October 13 - 

 

October 20 - 

 

October 27 - 

 

November 3 - 

 

November 10 - 

 

November 17 - 

 

November 24 - 

 

December 1 - 

 



Previous Semesters

Spring 2023

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Some talks will be virtual, and some talks will be in-person. For in-person talks, a concurrent Microsoft Teams meeting will be run to allow virtual attendance. Please address inquiries/suggestions to Dr. Spiller at elaine.spiller@marquette.edu

 

March 3 - MSSC Faculty
"Elevator" Research Updates

Speakers:
Wim Ruitenburg
Anne Clough
Dan Rowe
Jay Pantone
Elaine Spiller
Cheng-Han Yu
Greg Ongie

 

March 10 - Kyle Petersen (DePaul University)
Napkin Problems

Suppose a number of mathematicians sit down at a circular banquet table that has napkins evenly spaced between each place setting. When a particular diner sits, they might encounter two napkins (in which case they choose their preferred napkin), they might encounter one napkin because a neighbor already took one (in which case they take the other napkin), or they might encounter zero napkins because both their napkins were already taken by neighbors. If people sit down in a random order and grab napkins from the left or right side of their place at random, what is the expected proportion of napkinless diners? What is the worst order in which people might sit?

In this talk I will tell you the answers to both these questions, as well as some related open questions. Along the way, I will tell you the human story of my engagement with these questions in two different projects, separated by almost 20 years. Characters in this story include eminences such as John Conway, Rob Pike, Pete Winkler, and Don Knuth, each of whom has made major contributions to mathematics and computer science. And napkins.
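For readers who want to experiment before the talk, below is a small Monte Carlo sketch of the random-seating, random-grab version of the problem. It is my own illustration, not material from the talk; the function name, seat count, and trial count are arbitrary choices.

```python
# Minimal Monte Carlo sketch of the napkin problem: diners arrive in random
# order and reach for a random side first. Purely illustrative.
import random

def napkinless_fraction(n_seats: int, trials: int = 20_000) -> float:
    """Estimate the expected fraction of diners left without a napkin."""
    total_napkinless = 0
    for _ in range(trials):
        napkin_taken = [False] * n_seats                       # napkin i sits left of seat i
        for seat in random.sample(range(n_seats), n_seats):    # random arrival order
            left, right = seat, (seat + 1) % n_seats
            preferred = random.choice((left, right))           # random preferred side
            other = right if preferred == left else left
            if not napkin_taken[preferred]:
                napkin_taken[preferred] = True
            elif not napkin_taken[other]:
                napkin_taken[other] = True
            else:
                total_napkinless += 1                          # both napkins already gone
    return total_napkinless / (trials * n_seats)

if __name__ == "__main__":
    # For random order and random grabs, the expected fraction is reported to
    # approach (2 - sqrt(e))^2, roughly 0.12; the estimate should be nearby.
    print(napkinless_fraction(60))
```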

 

March 24 - Laurie Cavey (Boise State University)
Student Reasoning Evidence as a Tool for Equitable University Math Instruction

What makes math instruction equitable? What might we consider doing differently to make our math instruction more equitable? To address these questions, I will share an example from a video-based curriculum project (VCAST) designed to engage secondary math teacher candidates in the analysis of student reasoning evidence. While the original purpose of the VCAST project was to support future middle and high school teachers’ ability to make math accessible to all of their students, the project resulted in significant outcomes for university instructors as well. Building upon the results of the VCAST project, we will consider the potential for engaging faculty in the analysis of student reasoning evidence as a mechanism for establishing equitable university mathematics instruction.

 

March 31 - Anthony Parolari (Marquette University)
Process-based and data-driven modeling in ecohydrology 

In this data-rich era, hydrologists and other environmental scientists are motivated to measure and model everything, everywhere. Yet, limited time, budgets, and technology constrain the number of variables and resolution that can be measured and modeled; and, furthermore, not all variables and spatiotemporal scales in a system provide useful information. Therefore, broad questions in environmental systems modeling include: What variables, times, and locations are most informative of the relevant processes? And what is the minimum sampling required to achieve robust measurement and modeling? In this talk, I will introduce the field of ecohydrology and discuss current modeling trends and challenges, including the major challenge of model complexity and model order reduction. As a first example, we will review models of soil moisture dynamics, a key variable that controls the sensitivity of plant and soil processes to hydroclimatic variability and is amenable to model order reduction strategies. Secondly, we show that, generally, environmental signals are “sparse” and this sparsity can be leveraged to reduce temporal sampling requirements and model complexity. Data-driven sparse methods are applied to predict pollutant concentrations and streamflow in ungauged or poorly gauged basins. Further development and application of these methods promises to improve ecohydrological systems sensing and modeling by reducing sample requirements and identifying a minimal set of variables essential to complete characterization of the dynamics. 
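As a rough illustration of the "sparse signals need fewer samples" idea (not the speaker's method or data), the sketch below approximately recovers a signal with only three active modes in a cosine basis from a small random subset of time points using an off-the-shelf LASSO solver; the basis, sparsity level, and sampling fraction are assumptions for the toy example.

```python
# Toy sparse-recovery sketch: a 3-mode signal reconstructed from ~12% of its samples.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n = 512                                    # full temporal grid
t = np.arange(n)
freqs = np.arange(1, 65)
basis = np.column_stack([np.cos(2 * np.pi * f * t / n) for f in freqs])

coef_true = np.zeros(len(freqs))
coef_true[[2, 11, 40]] = [1.0, 0.6, 0.3]   # "sparse" signal: only 3 active modes
signal = basis @ coef_true

keep = rng.choice(n, size=60, replace=False)            # sample a small subset of times
model = Lasso(alpha=0.01, max_iter=50_000).fit(basis[keep], signal[keep])

recon = basis @ model.coef_
print("relative reconstruction error:",
      np.linalg.norm(recon - signal) / np.linalg.norm(signal))
```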

 

April 14 - Jessica Conway (Penn State)
HIV viral dynamics following treatment interruption

Antiretroviral therapy (ART) effectively controls HIV infection, suppressing HIV viral loads to levels undetectable using commercial testing. Typically, suspension of therapy is followed within weeks by rebound of viral loads to high, pre-therapy levels. However, recent observations give nuance to that statement: in a small fraction of cases, rebound may be delayed by months, years, or possibly even permanently, a phenomenon termed post-treatment control (PTC). We begin with a discussion of mechanisms that may permit PTC, hypothesizing that early treatment induces PTC by restricting the latent reservoir size. Activation of cells latently infected with HIV is thought to drive viral rebound, and early treatment may render the reservoir sufficiently small for immune responses to control infection after treatment cessation. ODE model analysis reveals a range of immune-response strengths over which a patient may show bistability between viral rebound and PTC. In the case of viral rebound, data reveal significant heterogeneity in timing and ensuing dynamics. We will also discuss a proposed phenomenological model assuming simple heterogeneous dynamics in latent reservoir activation to make predictions on time to rebound following treatment interruption. We rely on time-inhomogeneous branching processes to derive a mechanistically motivated survival function for time to rebound. We validate our model with data from Li et al. (2016), specifically a collection of observations of times to viral rebound across 235 study participants following treatment suspension. We show that our model provides good agreement with survival curves generated from study participants.

 

April 21 - Bruce Wade (University of Louisiana at Lafayette)
EPEM: Efficient Parameter Estimation for Multiple Class Monotone Missing Data

The problem of monotone missing data has been broadly studied during the last two decades and has many applications in various fields. Commonly used imputation techniques require multiple iterations through the data before yielding convergence. Moreover, those approaches may introduce noise or biases into the subsequent modeling. We derive exact formulas and propose a novel algorithm to compute the maximum likelihood estimators (MLEs) for a multiple class, monotone missing dataset. Our EPEM algorithm does not require multiple iterations through the data as other imputation approaches do, thus promising less computing time.
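To make the term "monotone missing" concrete, here is a small check of the defining property: once a feature is missing for a case, all subsequent features are missing as well. This is my own illustration, not the EPEM code.

```python
# Check whether a dataset (with columns ordered by decreasing completeness)
# has a monotone missing pattern.
import numpy as np

def is_monotone_missing(X: np.ndarray) -> bool:
    """True if, in each row, NaNs only appear after the last observed entry."""
    miss = np.isnan(X)
    # cumulative "has gone missing" indicator must equal the indicator itself
    return bool(np.all(miss == np.maximum.accumulate(miss, axis=1)))

X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, np.nan],
              [6.0, np.nan, np.nan]])
print(is_monotone_missing(X))   # True: missingness only grows left to right
```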

 

April 28 - Rajarshi Guhaniyogi (Texas A&M)
Bayesian Single and Multi-object Regressions with Applications in Neuroimaging 

Of late, neuroscience and related applications routinely encounter regression scenarios involving objects (e.g., multi-dimensional arrays or tensors, networks). While the most common practice in such scenarios is to construct summary measures from these objects as predictors, it makes scientific and statistical sense to exploit the structure of the objects for more meaningful inference. We will discuss the strategy to perform Bayesian regression with a tensor or a network response, the construction of novel prior distributions on object-valued parameters, and posterior inference. Applications of the proposed methodology are presented in reference to brain activation and brain connectome studies. We will further discuss a new multi-object response regression framework for multi-modal imaging data integration in the study of primary progressive aphasia (PPA), a neurological disorder sharing a similar neuro-degenerative pathway to Alzheimer's.

Fall 2022

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Some talks will be virtual, and some talks will be in-person. For in-person talks, a concurrent Microsoft Teams meeting will be run to allow virtual attendance. Please address inquiries/suggestions to Dr. Spiller at elaine.spiller@marquette.edu

 

September 9 - Mason Porter (UCLA)
Bounded-Confidence Models of Opinion Dynamics on Networks

I will discuss the modeling of opinion dynamics on various types of networks. After introducing some general questions and ideas in the field, I will focus on bounded-confidence models (BCMs), in which nodes have continuous-valued opinions and update those opinions when they interact with nodes with sufficiently similar opinions. I will discuss various generalizations of BCMs and examine how they affect consensus, polarization, and fragmentation of opinions on BCMs.
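For intuition, the sketch below simulates one simple bounded-confidence update rule (a Deffuant-Weisbuch-style pairwise compromise on an Erdős-Rényi random graph). It is an illustrative toy, not Prof. Porter's code, and the graph model, confidence bound, and compromise rate are arbitrary choices.

```python
# Toy bounded-confidence opinion dynamics on a random graph.
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 0.08                          # nodes and edge probability (Erdos-Renyi)
adj = np.triu(rng.random((n, n)) < p, 1)
adj = adj | adj.T
edges = np.argwhere(adj)

opinions = rng.random(n)                  # continuous opinions in [0, 1]
confidence, mu = 0.2, 0.5                 # confidence bound and compromise rate

for _ in range(50_000):
    i, j = edges[rng.integers(len(edges))]
    if abs(opinions[i] - opinions[j]) < confidence:   # interact only if opinions are close
        mean = (opinions[i] + opinions[j]) / 2
        opinions[i] += mu * (mean - opinions[i])
        opinions[j] += mu * (mean - opinions[j])

# Count surviving opinion clusters (rounded) to see consensus vs. fragmentation.
print(np.unique(np.round(opinions, 2)).size, "distinct opinion values remain")
```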

 

September 30 - Summer Graduate Research Symposium

Megan Lantz
Deep Learning for Dual Energy CT Reconstruction

Dual energy CT imaging offers the potential to generate high resolution, high contrast, low noise diagnostic images that allow for improved material discrimination while minimizing radiation exposure.  Using dual energy CT transmission data to reconstruct material maps (images) of three distinct materials is an ill-posed, non-linear inverse problem.  In the context of distinguishing between adipose tissue, fibroglandular tissue, and calcifications in simulated breast CT transmissions, we worked towards effective ways to solve this inverse problem using both iterative and neural network approaches.


Soroush Mahmoudiandehkordi
Genome Wide Identity by Descent (GWID)

Genome-wide association studies (GWAS) have discovered several genes associated with diseases, but they have disadvantages with respect to inheritance patterns and rare variants. Identity by Descent (IBD) mapping is a powerful tool for unraveling the genetics underlying complex diseases. We developed a software package, Genome Wide Identity by Descent (GWID), an effective visualization tool that enables users to scan the genome and examine patterns of disease association. It highlights specific genomic regions of interest and displays artifacts in the data. GWID uses statistical tests to investigate the significance of IBD regions in case-control setups.


Ke Xu
A Shifted Field of View method for Multi-coil Separation of Parallel Encoded Complex-valued Slices in fMRI

Simultaneous multi-slice (SMS) techniques allow for a reduction of repetition time while maintaining high resolution for fMRI. One challenge with SMS has been the mitigation of inter-slice signal leakage. Here the Multi-coil Separation of Parallel Encoded Complex-valued Slices with Shifted Field of View (mSPECS-sFOV) model is presented to reconstruct images without leakage. By combining the orthogonal properties of Hadamard encoding with an in-plane image shift, voxels in different locations are aliased in different combinations, which makes the separation process easier. Bootstrap sampling and artificial aliasing of calibration images are also included in mSPECS-sFOV. Least-squares estimation is used for the un-aliasing process.


Emily Corcoran
Neural Networks for Classification of Breast Tissue Using Electrical Impedance Tomography Voltage Data

The current best imaging modality for breast cancer detection is mammography, though mammograms offer a relatively high rate of false positives due to their inability to distinguish between cancerous and benign lesions without subsequent biopsies. However, there is significant evidence that malignant tumors have much higher conductivities than their benign counterparts. Electrical impedance tomography (EIT) is an imaging modality that applies low-amplitude current through electrodes placed on the body, and the resulting voltages are used to recover the conductivities within the body. Hence there is great promise in using EIT to detect breast cancer without the high probability of false positives. The research conducted during this 2022 summer fellowship involved the generation of simulated breast phantoms and large EIT voltage datasets and explored the use of EIT voltage data to classify breast tumors as malignant or benign using neural networks. Initial testing indicates that this is a promising avenue to explore.


Yue Zhao
Regularized Multivariate Functional Principal Component Analysis

In Multivariate Functional Principal Component Analysis, there is an apparent demand for smoothing the functional principal components, and we call this smoothing technique regularized Multivariate Functional Principal Component Analysis (reMFPCA). In our paper, three computational approaches are established: (1) an iterative power algorithm, (2) a half-smoothing approach, and (3) a generalized eigen-decomposition method. Also, a closed-form tuning-parameter selection approach is proposed, which dramatically improves computational efficiency over the traditional cross-validation approach in Silverman's paper (1996). In addition, a reMFPCA R package is in progress; it will allow the user to perform our approaches in R.


Jesse Adikorley
Multivariate Functional Time Series Forecasting: Multivariate Functional Singular Spectrum Analysis Approaches Applied To Images and Curves (Remote Sensing Data)

The functional singular spectrum analysis (FSSA) method, applied to functional time series (FTS), was developed by Haghbin et al., 2019 as a functional extension of the singular spectrum analysis (SSA) method developed by Golyandina et al., 2001. SSA is a non-parametric, exploratory method used in time series analysis to identify and extract components that capture mean, seasonal, trend, and noise behaviors in time-dependent data where observations are scalars. Trinka et al. 2021 developed the multivariate FSSA (MFSSA) as the functional extension of multivariate SSA (MSSA) applied to multivariate FTS (MFTS). They also developed both FSSA recurrent forecasting and FSSA vector forecasting algorithms for the FSSA and MFSSA methods. 
In this work, we present an extension of FSSA recurrent forecasting and FSSA vector forecasting which is applied to MFTS data across different dimensional domains.  


Joey Lyon
Emulating a Soil Hydrology Column Model Using a Gaussian Process

We will review a computationally intensive hydrology model of water flowing into and out of a one-dimensional soil column and discuss constructing cheap surrogates with Gaussian process emulators. We will explore where and when the model response is active for various outputs throughout the soil column over a time series of rainfall forcing. We then will look at emulating the entire soil column at any given instant in the rainfall time series. We will discuss these results as well as the future directions for this work.
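As a stripped-down illustration of the emulation idea (not the project's actual model or code), the sketch below fits a Gaussian process with a squared-exponential kernel to a handful of runs of a stand-in "expensive simulator" and returns a cheap predictive mean and uncertainty; the toy simulator and hyperparameters are assumptions.

```python
# Minimal Gaussian-process emulator in plain NumPy (noise-free regression).
import numpy as np

def sq_exp_kernel(a, b, length=0.3, variance=1.0):
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

def expensive_simulator(x):              # placeholder for the soil-column model
    return np.sin(6 * x) + 0.3 * x

x_train = np.linspace(0, 1, 12)          # a handful of "expensive" model runs
y_train = expensive_simulator(x_train)
x_test = np.linspace(0, 1, 200)

K = sq_exp_kernel(x_train, x_train) + 1e-8 * np.eye(len(x_train))
K_star = sq_exp_kernel(x_test, x_train)
alpha = np.linalg.solve(K, y_train)

mean = K_star @ alpha                                        # emulator prediction
cov = sq_exp_kernel(x_test, x_test) - K_star @ np.linalg.solve(K, K_star.T)
std = np.sqrt(np.clip(np.diag(cov), 0, None))                # pointwise uncertainty

print("max |emulator - simulator| on test grid:",
      np.abs(mean - expensive_simulator(x_test)).max())
```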


Shirin Nezampour
Nonparametric Collective Spectral Density Estimation of Multiple Multivariate Time Series

There are many situations in which more than one aspect of a phenomenon is observed at each time point, giving rise to multivariate time series. In studying multivariate time series, spectral analysis plays an important role in investigating relationships between time series. Analysis of the power spectrum has helped us understand the dynamics of many serially correlated datasets in a way that does not require the development of complex parametric models. In this project, we have worked on extending the non-parametric collective spectral density estimation (NCSDE) method introduced by Maadooliat et al. (2018) to multivariate time series.

October 14 - Muge Karaman (University of Illinois Chicago)
Advanced Diffusion-Weighted MRI for Comprehensive Characterization of Tissue Microstructures

Biological tissues are complex due to the underlying cellular structures and their related functions. In vivo characterization of biological tissues, normal or cancerous, has been a focus of diffusion-weighted MRI (DWI) for the past three decades. Conventional DWI techniques, however, cannot comprehensively characterize biological tissue which contains an array of underlying properties such as cellularity, vascularity, and heterogeneity. This seminar will highlight advanced DWI techniques our group has been developing; describe the theory, implementation, and validation of these novel techniques; and showcase their clinical applications in cancer detection, diagnosis, and treatment assessment. We will also discuss comprehensive approaches to expand the benefits of advanced DWI to many organs. 

 

October 28 - Applied Statistics Master Practicum Presentation, 1pm - 3pm

Our APST master students will present their practicum work.

 

November 4 - Ahmad P. Tafti (University of Pittsburgh) - Virtual via Teams
Enforcing deep few-shot learning for knee semantic segmentation and measurement

Osteoarthritis (OA) is the most prevalent chronic joint disease worldwide, and the knee accounts for more than 80% of commonly affected joints. The early detection of knee OA has focused heavily on analyzing knee joint space and cartilage degeneration. Segmentation of the knee joint space has thus become the very first step in measuring the level of joint degeneration quantitatively and qualitatively. From the computational perspective, deep learning computer vision methods have already demonstrated very successful applications in a variety of medical image analysis tasks, including object detection, image registration, segmentation, and classification. However, there are several fundamental challenges that prevent deep learning methods from reaching their full potential in healthcare settings. One is that they often need a large volume of annotated training data to achieve better accuracy than traditional machine learning methods. In this talk, we present a deep few-shot learning strategy to tackle the problem of knee joint space segmentation and measurement in plain radiographs using only a few samples of manually segmented radiographs.

 

November 18 - Zeno Madarasz (Bowling Green State University)
A Strictly Weakly Hypercyclic Subspace

An interesting topic of study for a hypercyclic operator T on a Fréchet space X has been whether X has an infinite dimensional closed subspace containing entirely, except for the zero vector, hypercyclic vectors for T. These subspaces are called hypercyclic subspaces. The existence of a strictly weakly hypercyclic operator T, which is a weakly hypercyclic operator that is not norm hypercyclic on a Hilbert space H has been shown by Chan and Sanders. However, it is not known whether there exists a strictly weakly hypercyclic subspace of H. We first show that the left multiplication operator LT with the aforementioned strictly weakly hypercyclic operator T is a strictly WOT-hypercyclic operator on the operator algebra B(H). Then we obtain a sufficient condition for an operator T on a Hilbert space H to have a strictly weakly hypercyclic subspace. After that we construct an operator that satisfies these conditions and therefore prove the existence of a strictly weakly hypercyclic subspace.

 

December 2 - Pamela Harris (UW Milwaukee) - 4pm - 5pm
Multiplex juggling sequences and Kostant's partition function

Multiplex juggling sequences are generalizations of juggling sequences (describing throws of balls at discrete heights) that specify an initial and terminal configuration of balls and allow for multiple balls at any particular discrete height. Kostant’s partition function is a vector function that counts the number of ways one can express a vector as a nonnegative integer linear combination of a fixed set of vectors. What do these two families of combinatorial objects have in common? Attend this talk to find out!
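To make the second object concrete, the sketch below counts vector partitions by brute-force recursion: the number of ways to write a target vector as a nonnegative integer combination of a fixed list of vectors, evaluated here for the positive roots of the rank-two root system A2 written in the simple-root basis. This is my own illustration, not material from the talk.

```python
# Brute-force vector partition count in the spirit of Kostant's partition function.
from functools import lru_cache

def partition_count(target, vectors):
    vectors = tuple(map(tuple, vectors))

    @lru_cache(maxsize=None)
    def count(remaining, idx):
        if all(c == 0 for c in remaining):
            return 1
        if idx == len(vectors):
            return 0
        v = vectors[idx]
        total, current = 0, remaining
        while all(c >= 0 for c in current):
            total += count(current, idx + 1)        # use vector `idx` 0, 1, 2, ... times
            current = tuple(c - vi for c, vi in zip(current, v))
        return total

    return count(tuple(target), 0)

# Positive roots of A2 in the simple-root basis: alpha_1, alpha_2, alpha_1 + alpha_2.
roots = [(1, 0), (0, 1), (1, 1)]
print(partition_count((2, 2), roots))   # prints 3: the ways to write 2*alpha_1 + 2*alpha_2
```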


December 9 - Geoff Wodtke (University of Chicago)
Structural Mean Models with Application to Causal Inference in the Social Sciences

Social scientists are often interested in estimating the marginal effects of a time-varying treatment on an end-of-study outcome. With observational data, estimating these effects is complicated by the presence of time-varying confounders affected by prior treatments, which may lead to bias and inconsistency in conventional approaches to estimation (e.g., matching). In this situation, inverse-probability-of-treatment-weighted (IPTW) estimation of a marginal structural mean model (MSM) remains consistent if treatment assignment is sequentially ignorable and the conditional probability of treatment is correctly modeled, but this method is not without limitations. In particular, it is highly sensitive to model misspecification, relatively inefficient, and difficult to use with many-valued or continuous treatments. In this talk, I introduce an alternative method – regression-with-residuals (RWR) estimation of a structural nested mean model (SNMM) – that overcomes these limitations. RWR is consistent for the marginal effects of a time-varying treatment if treatment assignment is sequentially ignorable and a model for the conditional mean of the outcome, which nests models for the time-varying confounders, is correctly specified. The performance advantages of RWR-SNMM relative to IPTW-MSM are demonstrated with a series of simulation experiments and with an empirical example based on longitudinal data from the Panel Study of Income Dynamics. I conclude with a discussion of the method’s limitations and directions for future research.
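As background for the weighting idea, here is a minimal inverse-probability-of-treatment-weighting sketch for a single binary treatment at one time point, far simpler than the time-varying MSM/SNMM setting of the talk; the simulated data-generating process and effect size are assumptions.

```python
# Stabilized IPTW for one binary treatment with one confounder (toy example).
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
confounder = rng.normal(size=n)
p_treat = 1 / (1 + np.exp(-(0.8 * confounder)))           # treatment depends on the confounder
treatment = rng.binomial(1, p_treat)
outcome = 2.0 * treatment + 1.5 * confounder + rng.normal(size=n)   # true marginal effect = 2

# Stabilized weights: marginal treatment probability over the fitted propensity score.
propensity = LogisticRegression().fit(confounder.reshape(-1, 1), treatment)
ps = propensity.predict_proba(confounder.reshape(-1, 1))[:, 1]
p_marginal = treatment.mean()
weights = np.where(treatment == 1, p_marginal / ps, (1 - p_marginal) / (1 - ps))

# Weighted regression of outcome on treatment alone targets the marginal effect.
naive = LinearRegression().fit(treatment.reshape(-1, 1), outcome)
msm = LinearRegression().fit(treatment.reshape(-1, 1), outcome, sample_weight=weights)
print("naive effect:", naive.coef_[0])   # biased upward by the confounder
print("IPTW effect:", msm.coef_[0])      # should be close to 2
```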

Spring 2022

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Talks will initially be held only in a virtual format. Those affiliated with Marquette University can join virtually by joining the "MSSC Department Colloquium" Team on Microsoft Teams. Please address inquiries/suggestions to Dr. Ongie at gregory.ongie@marquette.edu

 

February 11 - Nazmus Sakib (University of Buffalo, SUNY)
Multi-disciplinary Collaborative Research: Exploring Endeavors in the Intersection of Medical Informatics, mHealth, and Computational Sustainability

The alarmingly rising incidence of sepsis and septic shock, and the associated mortality, morbidity, and annual treatment costs among ICU admissions, are an increasing concern. SepINav is a medical informatics endeavor that helps ICU practitioners and researchers monitor and intervene on existing sepsis patients more efficiently and interactively and conduct retrospective studies to seek rationales for different sepsis scenarios in the ICU. Moreover, Bayesian Online Changepoint Detection will help practitioners understand the structural changes in patients' vital-sign regimes that may be harbingers of septic shock. In addition, several features are added to this data-driven software tool to support efficient monitoring and intervention and to address confounding medical interventions in the ICU.

 

March 4 - Kaitlyn and Peter Muller (Villanova University)
Modeling Covid-19 Spread on a College Campus

In this talk, we present a model for the spread of the original covid-19 variant at a medium-sized college. Our model accounts for the effects of various mitigation measures, both individual and institutional, available in a college setting. We explore the effect of these measures on the spread of covid-19 over the course of a semester. We then fit our model to Villanova University's Fall 2020 dashboard data.

 

March 25 - Wim Ruitenberg (Marquette University)
One Hundred Years of Logic for Constructive Mathematics

We clarify what constructive mathematics is, without emotional coloring. There is no need to 'be' a constructivist. Well-known expounders of constructive mathematics include Brouwer, Markov, and Bishop. Common classical mathematics has a formal logic associated with it, known as classical logic. Boolean algebra is associated with this logic. Shortly before 1930, Heyting developed a logic for constructive mathematics. Almost from the beginning, critics wondered whether this so-called intuitionistic logic could be justified as the logic of constructive mathematics. Some, including Gödel, were not convinced that it was, or at least felt that it lacked a proper justification. We confirm that intuitionistic logic is not the logic of constructive mathematics. We present a new, correct version of constructive logic.

 

April 1 - Sue Minkoff (University of Texas at Dallas)
Modeling of Trace Gas Sensors

Trace gas sensors that are compact and portable are being deployed for use in a variety of applications, including disease diagnosis via breath analysis, monitoring of atmospheric pollutants and greenhouse gas emissions, control of industrial processes, and early warning of terrorist threats. One such sensor is based on optothermal detection and uses a modulated laser source and a quartz tuning fork resonator to detect trace gases. If the laser light is tuned to the right frequency to be absorbed by whatever gas one wishes to detect, then, if the gas is present, the absorbed heat energy will cause a thermal wave to propagate in the air until it reaches a quartz tuning fork. The tuning fork then vibrates due to heating of the tines. We develop the first mathematical model of ROTADE (resonant optothermoacoustic detection) sensors and solve it via the finite element method. I will discuss determining an optimally designed sensor that maximizes the signal as a function of the geometry of the quartz tuning fork (length and width of the tines, etc.).

 

April 8 - Yaser Samadi (Southern Illinois University)
Time Series Analysis for Interval-Valued Data

Many series of data record individual observations as intervals, such as stock market values with daily high-low values, or minimum and maximum monthly temperatures, recorded over time. Moreover, with the advent of supercomputers, datasets can be extremely large, and it is frequently the case that observations are aggregated into intervals (or histograms, or other forms of so-called symbolic data). Taking the average of the intervals results in a loss of information. Therefore, in comparison with classical data, they are more complex and can have internal structures that impose complications that are not evident in classical data. In particular, the time dependency makes it more difficult to deal with and incorporate their complex structures and internal variations. In this talk, we present our proposed autocovariance/autocorrelation functions for interval-valued autoregressive series models. Maximum likelihood estimators are derived by using the ideas of composite likelihood and the pairwise likelihood functions. Asymptotic properties of these estimators are derived. A simulation study shows that the new estimators perform considerably better than those obtained previously.

 

April 22 - Andrew Nencka (Medical College of Wisconsin)
Optimizing traditional and deep-learning based accelerated MR imaging

Magnetic resonance imaging (MRI) offers a rich variety of physiologically relevant contrasts although it is among the slowest medical imaging modalities. The comparatively long duration of MRI acquisitions limits their utility in some diseases and patient populations, while also limiting available resolution and other information content in MRI exams. Work over the last decade has led to the ability to simultaneously acquire images covering multiple cross sections of a patient, thereby shortening acquisition duration by an integer factor equal to the number of simultaneously acquired slices. Such acceleration unlocks new opportunities for enhancing information content in MRI exams. We will discuss the techniques of accelerated MRI and the recent optimization of traditional and deep learning-based image reconstruction algorithms while keeping an eye on the folklore of imaging technology.

 

April 29 - Ben Russo (Oak Ridge National Lab)
System identification techniques

A dynamical system is given as ẋ = f(x), where x : [0, T] → R^n is the system state and f : R^n → R^n is some function. Dynamical systems are prevalent in the sciences, such as engineering, biology, neuroscience, physics, and mathematics. However, in many cases even physically motivated dynamical systems can have unknown parameters (i.e. a gray box), such as mass and length of mechanical components, or the dynamics may be completely unknown (i.e. a black box). In such cases, system identification methods are leveraged to gain estimates on the dynamics of the system based on data generated by the system itself. In this talk, we’ll go over some current techniques in non-linear system identification which use some tools from functional analysis.
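A minimal flavor of data-driven system identification is given below: a library-regression sketch in the spirit of sparse-regression methods such as SINDy, not the speaker's kernel-based approach. It simulates data from a known one-dimensional system and recovers f by least squares over candidate terms; the toy system and library are my own choices.

```python
# Library-based identification of xdot = f(x) from simulated trajectory data.
import numpy as np

# Simulate a system with known dynamics xdot = x - x^2 (logistic growth).
dt, steps = 0.01, 2_000
x = np.empty(steps)
x[0] = 0.1
for k in range(steps - 1):
    x[k + 1] = x[k] + dt * (x[k] - x[k] ** 2)

xdot = (x[1:] - x[:-1]) / dt                                       # derivative estimates from "data"
library = np.column_stack([np.ones(steps - 1), x[:-1], x[:-1] ** 2, x[:-1] ** 3])

coef, *_ = np.linalg.lstsq(library, xdot, rcond=None)              # fit f as a combination of terms
print(dict(zip(["1", "x", "x^2", "x^3"], np.round(coef, 4).tolist())))
# Recovers approximately {1: 0, x: 1, x^2: -1, x^3: 0}; real data would need
# noise-robust derivatives and a sparsity-promoting regression.
```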

 

May 20 - Jordan Trinka (Pacific Northwest National Laboratory)
Rfssa: An R Package for Functional Singular Spectrum Analysis

Functional data analysis is a growing field of statistics that is finding increasing use in applied realms such as finance, medicine, and ecology. With the improvements in computational resources of recent years, functional time series (FTS) have become increasingly prevalent. In this talk, we present the functionalities of the Rfssa R package, available on CRAN, that allows the user to perform nonparametric signal extraction and forecasts of FTS. We illustrate the functionalities offered by the package by way of FTS examples of curves that describe call center data and remotely sensed images of vegetation. Through the talk and code demonstrations we illustrate the advantage of creating user-friendly, flexible, and accessible software that can be readily used by practitioners both in academia and industry.

Fall 2021

Colloquium talks will take place on Fridays, 1:00pm - 2:00pm. Talks will be held in a hybrid in-person / virtual format. The in-person portion will be held in the Katharine Reed Cudahy Building, Room 401 on the Marquette University campus. Please address inquiries/suggestions to Dr. Ongie at gregory.ongie@marquette.edu. Those affiliated with Marquette University can join virtually by joining the "MSSC Department Colloquium" Team on Microsoft Teams.

 

September 10 - Shane Hesprich (MU Research Computing Services)
High Performance Computing at Marquette

High Performance Computing (HPC) is a powerful tool used in almost every field from economics and engineering, to healthcare and business information. Raj, Marquette's freely available, centralized HPC resource, provides students and faculty the ability to utilize HPC for research and academic purposes. Understanding how to access and properly utilize this resource can be of great benefit to your career here at Marquette.

 

October 15 -  Chase Sakitis (Marquette University)
A Formal Bayesian Approach to SENSE Image Reconstruction

In fMRI, capturing cognitive temporal dynamics is dependent upon the rate at which volume brain images are acquired. The sampling time for an array of spatial frequencies to reconstruct an image is the limiting factor. Multi-coil SENSE image reconstruction is a parallel imaging technique that has greatly reduced image scan time. In SENSE image reconstruction, complex-valued coil sensitivities are estimated once from a priori calibration images and used to form a “known” design matrix to reconstruct every image. However, the SENSE technique is highly inaccurate when the sensitivity design matrix is not positive definite. Here, we propose a formal Bayesian approach where prior distributions for the unaliased images, coil sensitivities, and variances/covariances are assessed from the a priori calibration image information. Images, coil sensitivities, and variances/covariances are estimated a posteriori jointly via the Iterated Conditional Modes maximization algorithm and marginally via MCMC using the Gibbs sampling algorithm. Since the posterior marginal distributions are available, hypothesis testing is possible. This Bayesian SENSE (BSENSE) model to reconstruct images is applied to realistically simulated fMRI data. This BSENSE model accurately reconstructs a single slice image as well as a series of slice images without aliasing artifacts and was used to produce magnitude-only task activation.
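For orientation, the sketch below shows the classical least-squares SENSE unfolding step for a single pair of overlapped voxels with a "known" coil-sensitivity matrix, the step that the Bayesian model above treats probabilistically. The number of coils, the sensitivities, and the noise level are made-up values.

```python
# Least-squares SENSE unfolding for one aliased voxel pair (acceleration 2, 4 coils).
import numpy as np

rng = np.random.default_rng(3)

true_voxels = np.array([1.2 + 0.4j, 0.7 - 0.2j])                         # the two overlapped voxels
sensitivities = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))   # 4 coils x 2 voxels

aliased = sensitivities @ true_voxels                                    # what each coil measures
aliased += 0.01 * (rng.normal(size=4) + 1j * rng.normal(size=4))         # measurement noise

# Classical SENSE: least squares with the "known" sensitivity design matrix.
unfolded, *_ = np.linalg.lstsq(sensitivities, aliased, rcond=None)
print("true:     ", true_voxels)
print("unfolded: ", np.round(unfolded, 3))
```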

 

October 20 (Wednesday, 1:00pm) - Danny Smyl (University of Sheffield)
Some recent advances in inverse problems applied to NDE and SHM

The field of inverse problems, the mathematics of estimating and understanding causalities from effects (data), has taken massive strides in the past 20 years. Since the advent of high performance, probabilistic, and learned computation, inversion-based applications in nondestructive evaluation (NDE) and structural health monitoring (SHM) have become increasingly pervasive. In this seminar, we highlight some key contemporary advances in inverse problems applied to NDE and SHM. In this effort, we evidence recent developments in learned (direct) inversion, multi-state reconstruction, sensor optimization, highly dynamical spatial loading prediction, and finite element model error prediction/compensation.

 

October 29 - John Lipor (Portland State University)
Improving K-Subspaces via Coherence Pursuit

Subspace clustering is a powerful generalization of clustering for high-dimensional data analysis, where low-rank cluster structure is leveraged for accurate inference. K-Subspaces (KSS), an alternating algorithm that mirrors K-means, is a classical approach for clustering with this model. Like K-means, KSS is highly sensitive to initialization, yet KSS has two major handicaps beyond this issue. First, unlike K-means, the KSS objective is NP-hard to approximate within any finite factor for large enough subspace rank. Second, it is known that the subspace estimation step is faulty when an estimated cluster has points from multiple subspaces. In this paper we demonstrate both of these additional drawbacks, provide a proof for the former, and offer a solution to the latter through the use of a robust subspace recovery (RSR) method known as Coherence Pursuit (CoP). While many RSR methods have been developed in recent years, few can handle the case where the outliers are themselves low rank. We prove that CoP can handle low-rank outliers. This and its low computational complexity make it ideal to incorporate into the subspace estimation step of KSS. We demonstrate on synthetic data that CoP successfully rejects low-rank outliers and show that combining Coherence Pursuit with K-Subspaces yields state-of-the-art clustering performance on canonical benchmark datasets.
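To fix ideas, here is a compact K-Subspaces sketch, alternating between assigning points to their nearest subspace and re-fitting each subspace by an SVD, on toy data drawn from two random planes in R^10. It is a baseline illustration only and does not include the Coherence Pursuit step described in the talk.

```python
# Baseline K-Subspaces (KSS): alternate subspace fitting and point assignment.
import numpy as np

def k_subspaces(X, k, dim, iters=20, seed=0):
    """X: (n_points, ambient_dim). Returns labels and a list of orthonormal bases."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    labels = rng.integers(k, size=n)                 # random initialization
    for _ in range(iters):
        bases = []
        for j in range(k):
            pts = X[labels == j]
            if len(pts) < dim:                       # degenerate cluster: re-seed it
                pts = X[rng.choice(n, size=dim, replace=False)]
            _, _, vt = np.linalg.svd(pts, full_matrices=False)
            bases.append(vt[:dim].T)                 # top right-singular vectors as a basis
        # distance to each subspace = norm of the residual after projection onto it
        residuals = np.stack([np.linalg.norm(X - (X @ B) @ B.T, axis=1) for B in bases], axis=1)
        labels = residuals.argmin(axis=1)
    return labels, bases

# Toy data: 100 points from each of two random 2-D subspaces of R^10.
rng = np.random.default_rng(1)
B1, _ = np.linalg.qr(rng.normal(size=(10, 2)))
B2, _ = np.linalg.qr(rng.normal(size=(10, 2)))
X = np.vstack([rng.normal(size=(100, 2)) @ B1.T, rng.normal(size=(100, 2)) @ B2.T])
labels, _ = k_subspaces(X, k=2, dim=2)
print("cluster sizes:", np.bincount(labels))   # often near [100, 100]; KSS is init-sensitive
```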

 

November 12 - Jessi Cisewski-Kehe (UW Madison)
Topological Data Analysis That's Out of This World

Data exhibiting complicated spatial structures are common in many areas of science (e.g., cosmology, biology), but can be difficult to analyze. Persistent homology is an approach within the area of Topological Data Analysis (TDA) that offers a framework to represent, visualize, and interpret complex data by extracting topological features which may be used to infer properties of the underlying structures. For example, TDA is a beneficial technique for analyzing intricate and spatially complex web-like data such as fibrin or the large-scale structure (LSS) of the Universe. LSS is known as the Cosmic Web due to the spatial distribution of matter resembling a 3D web. The accepted cosmological model presumes cold dark matter but discriminating LSS under varying cosmological assumptions is of interest. In order to understand the physics of the Universe, theoretical and computational cosmologists develop large-scale simulations that allow for visualizing and analyzing the LSS under varying physical assumptions. Each object in the 3D data set can represent structures such as galaxies, clusters of galaxies, or dark matter haloes, and topological summaries ("persistence diagrams") can be obtained for these simulated data that summarize the different ordered holes in the data (e.g., connected components, loops, voids). The topological summaries are interesting and informative descriptors of the Universe on their own, but hypothesis tests using the topological summaries provide a way to make more rigorous comparisons of LSS under different theoretical models. We present several possible test statistics for two-sample hypothesis tests using the topological summaries, carry out a simulation study to investigate the performance of the proposed test statistics using cosmological simulation data for inference on distinguishing LSS assuming cold dark matter versus a different cosmological model which assumes warm dark matter.

 

November 19 - Robert Krafty (Emory University)
Interpretable PCA for Multilevel Multivariate Functional Data

Many studies collect functional data from multiple subjects that have both multilevel and multivariate structures. An example of such data comes from popular neuroscience experiments where participants' brain activity is recorded using modalities such as EEG  and summarized as power within multiple time-varying frequency bands within multiple electrodes, or brain regions. Summarizing the joint variation across multiple frequency bands for both whole-brain variability between subjects, as well as location-variation within subjects, can help to explain neural reactions to stimuli. This article introduces a novel approach to conducting interpretable principal components analysis on  multilevel multivariate functional data that decomposes total variation into  subject-level and replicate-within-subject-level (i.e. electrode-level) variation, and provides interpretable components that can be both sparse among variates (e.g. frequency bands) and have localized support over time within each frequency band. Smoothness is achieved through a roughness penalty, while sparsity and localization of components are achieved by solving an innovative rank-one based convex optimization problem with block Frobenius and matrix L1-norm based penalties. The method is used to analyze data from a study to better understand reactions to emotional information in individuals with histories of trauma and the symptom of dissociation, revealing new neurophysiological insights into how subject- and electrode-level brain activity are associated with these phenomena.

 

December 3 - Luke Mcguire (University of Arizona)
Post-wildfire debris flow hazards

Fire temporarily alters soil and vegetation properties, promoting increases in runoff and erosion that can dramatically increase the likelihood of destructive flash floods and debris flows. Debris flows, or fast-moving landslides that consist of a mixture of water, mud, and rock, initiate after fires when surface water runoff rapidly erodes sediment on steep slopes. Due to the complex interactions between runoff generation, soil erosion, and post-fire debris-flow initiation, the study of post-fire debris-flow hazards necessitates an approach that couples these processes within a common modeling framework. Models used to simulate these processes, however, often contain a number of poorly constrained parameters, particularly in post-fire settings where there is limited time to collect data and where parameters related to soil and vegetation properties will change over time as the landscape recovers. Here, we describe physics-based models designed to simulate runoff, erosion, and debris flow processes in burned areas as well as how these models can inform reduced-complexity models used to facilitate rapid hazard assessments. We highlight existing gaps in our ability to assess post-fire debris-flow hazards and motivate the need to expand our ability to use numerical modeling to support post-fire hazard assessment and mitigation efforts.

December 8 (Wednesday, 2:15pm) -  Md. Fitrat Hossain (Marquette University)
Personalized mHealth Monitoring System for Veterans

The word “crisis” refers to an event or events that may lead to a dangerous and unstable situation that adversely affects personal, social, or community life. It is used to represent negative changes in political, social, economic, or environmental affairs. A mental health crisis is an important social issue, representing behavior, feelings, and actions that can be harmful to the individual and the people around them. Because of the wars of the last decades, mental health crises have severely affected United States veterans. To take the steps necessary to avoid greater damage from a crisis, it is important to predict its different levels. However, because of the nature of crisis, it is difficult to understand and quantify these levels. This research focuses on defining and predicting different levels of mental health crisis for Milwaukee-based veterans suffering from PTSD in an mHealth setting. As part of this, long-term crisis (a severe stage) is defined and validated from the PCL-5 score using decision trees and statistical tests. To create a mobile-based alert system, acute crisis (an intermediately severe situation) has been defined using a cognitive walkthrough and decision trees based on ecological momentary assessment data.

 

December 10 –

Jesse Adikorley (Marquette University)
Multivariate Functional Time Series Forecasting: Multivariate Functional Singular Spectrum Analysis Approaches Applied to Images and Curves (Remote Sensing Data)


Sunil Mathew (Marquette University)
Model interpretability in terms of dropout in Neural Networks using Bayesian learning

Neural networks tend to have many layers, which enables them to learn complicated patterns via weights that describe the connections between the nodes of each layer. Often, a large number of nodes causes overfitting and poor generalization. A probabilistic framework enables explainability and better insight into what a node in a model learns. A Bayesian approach can be used to determine connection dropout in each layer, which enables better model interpretability and combats overfitting.
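One widely used Bayesian reading of dropout, which may differ from the approach in this talk, is Monte Carlo dropout: keep dropout active at prediction time and treat the spread of repeated stochastic forward passes as model uncertainty. A minimal PyTorch sketch, with an arbitrary toy architecture:

```python
# Monte Carlo dropout: stochastic forward passes as a rough uncertainty estimate.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(),
                      nn.Dropout(p=0.5),             # the dropout layer of interest
                      nn.Linear(64, 1))

x = torch.linspace(-1, 1, 5).unsqueeze(1)
model.train()                                        # keep dropout "on" during prediction
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(200)])   # 200 stochastic forward passes

print("predictive mean:", samples.mean(dim=0).squeeze())
print("predictive std: ", samples.std(dim=0).squeeze())
```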

Spring 2021

Colloquium talks for Spring 2021 will be held virtually via Microsoft Teams. Please address inquiries/suggestions to Dr. Ongie at gregory.ongie@marquette.edu

May 7th (12pm CT) - Prof. Kaitlyn Muller, Department of Mathematics and Statistics, Villanova University

"Clutter Mitigation Techniques in Synthetic-Aperture Radar Imaging"

Abstract: In this talk we will discuss two different methods of addressing the problem of clutter in synthetic-aperture radar (SAR) imaging. SAR images are often used for the purpose of target detection and classification. Therefore it is necessary to produce images that display the target/object of interest clearly and in such a way that they are distinguishable from other objects present in the scene.  These other objects are referred to as clutter and they often scatter just as strongly as targets and can obscure the presence of targets in images. We will begin with the basics of radar/SAR imaging and discuss a common model for volume scattering clutter (i.e. foliage). We will then present two techniques to mitigate the presence of clutter in the images. First we will discuss the more common filtering techniques which require knowledge of the clutter statistics. Second we will discuss correlation imaging, a second order imaging technique, that in certain cases can provide clutter mitigation without a priori knowledge.

April 30th (3:30pm CT) - Prof. Kimia Ghobadi, Department of Civil and Systems Engineering, Johns Hopkins University

"Hospital Resource Optimization for COVID-19"

Abstract: The COVID-19 pandemic has created a significant strain on the healthcare systems since its start. As hospitals cope with the unknown demand and surges in the cases, critical resources like ICU beds have become scarce. Additional beds and field hospitals are considered to meet the increased demand, but simply expanding the capacity is not viable for all hospitals. Better utilization of the currently available capacity can improve access to resources, lower the burden to hospitals and staff, and lead to better patient care. To this end, we developed mathematical models that match the demand with available resources in a regional system of hospitals. Our robust mixed-integer linear models minimize the resource shortage while considering operational constraints and desirable allocation properties such as transfer sparsity, consistency, and locality. Our models can consider primary resources (e.g., beds) in addition to complementary resources (e.g., nurses). We have tested and validated our models on the first wave of the COVID-19 pandemic and the subsequent surges and are currently in use at the Johns Hopkins Health System hospitals. We expanded our models to all hospitals in the US and developed an interactive public website (https://covid-hospital-operations.com/) to help decision-makers on various levels to plan and use their bed resources.
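For readers unfamiliar with this class of models, below is a toy bed-allocation linear program, an illustration of the general idea only and much simpler than the robust mixed-integer models described above; all numbers, costs, and variable names are made up.

```python
# Toy LP: place each hospital's patients locally or transfer them, minimizing
# unmet demand plus a small transfer penalty.
import numpy as np
from scipy.optimize import linprog

demand = np.array([120.0, 40.0, 30.0])      # incoming patients at each hospital
capacity = np.array([80.0, 70.0, 60.0])     # available beds at each hospital
n = len(demand)

# Variables: x[i*n + j] = patients from hospital i treated at hospital j,
# followed by s[i] = unmet demand at hospital i.
cost = np.concatenate([np.full(n * n, 0.1), np.ones(n)])   # transfers cheap, shortages costly
cost[np.arange(n) * (n + 1)] = 0.0                         # treating patients locally is free

A_eq = np.zeros((n, n * n + n))    # every patient is either placed somewhere or counted as unmet
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0
    A_eq[i, n * n + i] = 1.0

A_ub = np.zeros((n, n * n + n))    # placements at hospital j cannot exceed its bed capacity
for j in range(n):
    A_ub[j, j:n * n:n] = 1.0

res = linprog(cost, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print("total unmet demand:", round(res.x[n * n:].sum(), 2))   # enough total beds here, so 0
```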

April 9th (1pm CT) - Prof. Peter Hinow, Department of Mathematical Sciences, University of Wisconsin - Milwaukee.

"Automated Feature Extraction from Large Cardiac Electrophysiological Data Sets"

Abstract: A multi-electrode array-based application for the long-term recording of action potentials from electrogenic cells makes possible exciting cardiac electrophysiology studies in health and disease. With hundreds of simultaneous electrode recordings being acquired over a period of days, the main challenge becomes achieving reliable signal identification and quantification. We set out to develop an algorithm capable of automatically extracting regions of high-quality action potentials from terabyte size experimental results and to map the trains of action potentials into a low-dimensional feature space for analysis. Our automatic segmentation algorithm finds regions of acceptable action potentials in large data sets of electrophysiological readings. We use spectral methods and support vector machines to classify our readings and to extract relevant features. We show that action potentials from the same cell site can be recorded over days without detrimental effects to the cell membrane. The variability between measurements 24 h apart is comparable to the natural variability of the features at a single time point. Our work contributes towards a non-invasive approach for cardiomyocyte functional maturation, as well as developmental, pathological, and pharmacological studies.
This is joint work with Viviana Zlochiver, Stacie Kroboth (Advocate Aurora Research Institute), and John Jurkiewicz (graduate student at UWM).

March 26th (1pm CT) - Dr. Ben Freedman, Department of Mathematical and Statistical Sciences, Marquette University.

“On Weakly Nonlinear Boundary Value Problems on Infinite intervals.” 

Abstract: In this talk, we will analyze boundary value problems on infinite intervals subject to weakly nonlinear boundary conditions. For such problems, we provide criteria for the existence of solutions as well as a qualitative description of the behavior of solutions depending on a parameter. We investigate the relationship between solutions to these weakly nonlinear problems and the solutions to a set of corresponding linear problems.

March 19th (1pm CT) - Prof. Alex Konomi, Department of Mathematical Sciences, University of Cincinnati.

“Computer model emulation with high-dimensional functional output in large-scale observing system uncertainty experiments: An application to NASA’s Orbiting Carbon Observatory-2 (OCO-2) mission” 

Abstract: Observing system uncertainty experiments (OSUEs) have been recently proposed as a cost-effective way to perform probabilistic assessment of retrievals for NASA’s Orbiting Carbon Observatory-2 (OCO-2) mission. One important component in the OCO-2 retrieval algorithm is a full-physics forward model that describes the mathematical relationship between atmospheric variables such as carbon dioxide and radiances measured by the remote sensing instrument. This complex forward model is computationally expensive but large-scale OSUEs require evaluation of this model numerous times, which makes it infeasible for comprehensive experiments. To tackle this issue, we develop a statistical emulator to facilitate large-scale OSUEs in the OCO-2 mission with independent emulation. Within each distinct spectral band, the emulator represents radiances output at irregular wavelengths via a linear combination of basis functions and random coefficients. These random coefficients are then modeled with nearest-neighbor Gaussian processes with built-in input dimension reduction via active subspace and gradient-based kernel dimension reduction. The proposed emulator reduces dimensionality in both input space and output space, so that fast computation is achieved within a fully Bayesian inference framework.

March 12th (1pm CT) - Drs. Sarah Hamilton, Elaine Spiller, Mehdi Maadooliat, Jay Pantone, and Greg Ongie, Department of Mathematical and Statistical Sciences, Marquette University.

A number of faculty members will be giving short (5-10 min) intro/summaries of their research areas. This is a great opportunity for students at all levels, as well as faculty, to learn about current MSSC research and to start new collaborations.

 

Spring 2020

Colloquium talks will be held in the Katharine Reed Cudahy Building (Cudahy Hall), Room 401, on the Marquette University campus. Please address inquiries/suggestions to Dr. Hamilton at sarah.hamilton@marquette.edu

March 6th (2 pm CT) - Dr. Peter Muller, Department of Mathematics and Statistics, Villanova University.

Abstract. Electrical impedance tomography (EIT) is an imaging modality that measures currents and voltages on the surface of a body to image the electrical conductivity within the body. Image reconstruction in EIT is a severely ill-posed, nonlinear inverse problem. In this talk, I will present two direct reconstruction methods based on complex geometrical optics solutions: Calderón's method and Nachman's D-bar method. Both methods provide a point-wise reconstruction of the image. Calderón's method is a linearized approach while the D-bar method solves the fully non-linear inverse problem. I will present both methods and their ability to address clinical application concerns.

February 28th (2 pm CT) - Dr. Elaine Spiller, Department of Mathematical and Statistical Sciences, Marquette University.

Abstract. Geophysical natural hazards — storm surge, post-fire debris flows, volcanic flows and ash fall, etc. — impact thousands to millions of people annually. Yet the most devastating hazards, those resulting in loss of life and property, are often both geographically and temporally localized. Thus they are effectively rare events to those impacted. We will present methodology to produce probabilistic hazard maps that can rapidly be updated to account for various aleatoric scenarios and epistemic uncertainties. This hazard analysis utilizes statistical emulators to combine computationally expensive simulations of the underlying geophysical processes with probabilistic descriptions of uncertain scenarios and model parameters. The end goal is not a map, but a family of maps that represent how a hazard threat evolves under different assumptions or different potential future scenarios. Further, this approach allows us to rapidly update hazard maps as new data or precursor information arrives.

February 14th  (2 pm CT) - Drs. Daniel Rowe, Anne Clough, Sarah Hamilton, Naveen Bansal, Wenhui Sheng, Elaine Spiller, and Mehdi Maadooliat, Department of Mathematical and Statistical Sciences, Marquette University.

A number of faculty members will be giving short (5 min) intro/summaries of their research areas. This is a great opportunity for students at all levels, as well as faculty, to learn about current MSSC research and to start new collaborations.

February 3rd (1 pm CT) - Swati Patel, Department of Mathematics, Tulane University, On Dynamics for Maintaining Biological Diversity at Various Scales.

Abstract. One of the fundamental questions in ecology and evolutionary genetics is how biological diversity is maintained within and amongst populations. Classical nonlinear differential equations that capture population or genetic interactions have played an important role in developing biological theories on how diversity is maintained. As ongoing empirical investigations uncover the nuances of these interactions, they open the way for more sophisticated models and the need for expanding mathematical methods to analyze them. In this talk, I will develop two sets of multi-scale models, motivated by recent empirical evidence. The first couples differential equations that capture interactions amongst populations with variation within the population. At a finer scale, the second models specific protein-gene interactions that influence population-level traits. For both models, I will discuss new mathematical questions and analysis that provides insight into mechanisms that enable diversity at these various scales.

January 31st (1 pm CT) - Owen Lewis, Department of Mathematics, Florida State University, Electrodiffusion Mediated Maintenance of the Gastric Mucus Layer.

Abstract. Diffusion of charged particles, or electrodiffusion, plays an important role in many physiological systems including the human stomach. The gastric mucus layer is widely recognized to serve a protective function, shielding your stomach wall from the extreme acidity and digestive enzymes present in the stomach. However, there is still much debate regarding the control of electrodiffusive transport through the mucus layer. In this talk, I will discuss a mathematical description of electrodiffusion within a two-phase gel model of gastric mucus and the challenges associated with its analysis and numerical simulation. This model is used to investigate physiological hypotheses regarding gastric layer maintenance that are beyond current experimental techniques.

January 22nd (1 pm CT) Greg Ongie, Department of Statistics, University of Chicago, Rethinking regularization in modern machine learning and computational imaging.

Abstract. Optimization is central to both supervised machine learning and inverse problems in computational imaging. These problems are often ill-posed and some form of regularization is necessary to obtain a useful solution. However, new paradigms in machine learning and computational imaging necessitate rethinking the role of regularization, as I will illustrate with two examples. First, in the context of supervised learning with shallow neural networks, I will show how a commonly used form of regularization has a surprising reinterpretation as a convex regularizer in function space. This yields novel insights into the role of overparameterization and depth in learning with neural networks having ReLU activations. Second, I will discuss a novel network architecture for solving linear inverse problems in computational imaging called a Neumann network. Rather than using a pre-specified regularizer, Neumann networks effectively learn a regularizer from training data, outperforming classical techniques. Beyond these two examples, I will show how many open problems in the mathematical foundations of deep learning and computational imaging relate to understanding regularization in its many forms.

January 21st (1 pm CT) - Scott Hottovy, Department of Mathematics, United States Naval Academy, A simple stochastic model of tropical atmospheric waves.

As tropical storms go, you have probably heard of Hurricanes, Tropical Cyclones, El Niño, and La Niña. But you probably haven't heard of the Madden-Julian Oscillation (MJO). It is the major contributor to rainfall in tropical regions and influences the climate in Wisconsin regularly. Unlike Hurricanes and El Niño, the MJO is still not well understood. In an effort to understand the mechanisms of the MJO, I will describe a model building from a dynamically stationary "background" tropical rainfall model and coupling that to a tropical wave model. These models use Stochastic Differential Equations (SDE) and Stochastic Partial Differential Equations (SPDE) as the building blocks. In the "background" model, an SDE model is used which leads to characteristics of criticality and phase transitions. For the full model with waves, we use a continuous one-dimensional SPDE. Because of the simplicity of the models, we are able to solve many statistics exactly, or run fast numerical experiments.

 

Fall 2019

Colloquium dates and speakers for Fall 2019 - Unless specified, the talks will begin at 2:00pm CT in Room 401 at Cudahy Hall.

  • September 6th - Michael Albert, Department of Computer Science, University of Otago, New Zealand, Wilf-equivalence and Wilf-collapse.
  • September 27th - Guannan Wang, Department of Mathematics, College of William and Mary, Williamsburg, Simultaneous confidence corridors for mean functions in functional data analysis of imaging data
  • October 4th - Billy Herzberg, Department of MSSC, Marquette University, Improving EIT (Electrical Impedance Tomography) images using deep learning
  • October 11th - Marquette University Computational Sciences Summer Research Fellowship Talks: 
    • Nazmus Sakib, Understanding confounding medical interventions in Sepsis treatment: A step towards multi-parameter intelligent sepsis prediction in ICU.
    • Ziynet Nesibe Kesimoglu, Inferring competing endogenous RNA (ceRNA) interactions in cancer 
  • October 25th - Jordan Trinka, Department of MSSC, Marquette University, Milwaukee, Functional Singular Spectrum Analysis.
  • November 1st - Jacob R. Pichelmeyer, Mathematics Department, Kansas State University, Manhattan, KS.
  • November 8th - Andreas Hauptmann, Department of Mathematical Sciences, University of Oulu, Finland.
  • November 15th - Sunil Mathew, Joseph Coelho, Department of MSSC, Marquette University, Milwaukee, Computational Sciences Student Research Presentations.
  • November 22nd - Md Manzur Rahman, Paromita Nitu, Department of MSSC, Marquette University, Milwaukee, Computational Sciences Student Research Presentations.
  • December 6th - Rasha Atshan, Andrew Werra, Youming Wang, Wei Xu, Department of MSSC, Marquette University, Milwaukee, Applied Statistics Practica Summer Presentations.

Spring 2019

Fall 2018

Spring 2018

Fall 2017

Spring 2017