School of Mathematics and Statistics - Research Publications

Search Results

Now showing 1 - 5 of 5
  • Item
    Differential operators on modular forms mod p
    Ghitza, A (RIMS, 2019)
    We give a survey of recent work on the construction of differential operators on various types of modular forms (mod p). We also discuss a framework for determining the effect of such operators on the mod p Galois representations attached to Hecke eigenforms.
  • Item
    Analytic evaluation of Hecke eigenvalues for Siegel modular forms of degree two
    Ghitza, A ; Colman, O ; Ryan, NC (Mathematical Sciences Publishers, 2019)
    The standard approach to evaluating the Hecke eigenvalues of a Siegel modular eigenform F is to determine a large number of Fourier coefficients of F and then compute the Hecke action on those coefficients. We present a new method based on the numerical evaluation of F at explicit points in the upper half-space and of its image under the Hecke operators. The approach is more efficient than the standard method and has the potential for further optimization by identifying good candidates for the points of evaluation, or by finding ways of lowering the truncation bound. A limitation of the algorithm is that it returns floating point numbers for the eigenvalues; however, the working precision can be adjusted at will to yield as close an approximation as needed.
  • Item
    Data-Driven Approach to Multiple-Source Domain Adaptation
    Stojanov, P ; Gong, M ; Carbonell, J ; Zhang, K (PMLR, 2019)
    A key problem in domain adaptation is determining what to transfer across different domains. We propose a data-driven method to represent these changes across multiple source domains and perform unsupervised domain adaptation. We assume that the joint distributions follow a specific generating process and have a small number of identifiable changing parameters, and we identify these changing parameters by learning low-dimensional representations of the changing class-conditional distributions across multiple source domains. The learned low-dimensional representations enable us to reconstruct the target-domain joint distribution from unlabeled target-domain data and, further, to predict the labels in the target domain. We demonstrate the efficacy of this method by conducting experiments on synthetic and real datasets.
  • Item
    Geometry-Consistent Generative Adversarial Networks for One-Sided Unsupervised Domain Mapping
    Fu, H ; Gong, M ; Wang, C ; Batmanghelich, K ; Zhang, K ; Tao, D (IEEE, 2019)
    Unsupervised domain mapping aims to learn a function GXY that translates domain X to domain Y in the absence of paired examples. Finding the optimal GXY without paired data is an ill-posed problem, so appropriate constraints are required to obtain reasonable solutions. One of the most prominent constraints is cycle consistency, which enforces that an image translated by GXY can be translated back to the input image by an inverse mapping GYX. While cycle consistency requires the simultaneous training of GXY and GYX, recent studies have shown that one-sided domain mapping can be achieved by preserving pairwise distances between images. Although cycle consistency and distance preservation successfully constrain the solution space, they overlook the special property that simple geometric transformations do not change the semantic structure of images. Based on this special property, we develop a geometry-consistent generative adversarial network (GcGAN), which enables one-sided unsupervised domain mapping. GcGAN takes the original image and its counterpart transformed by a predefined geometric transformation as inputs and generates two images in the new domain coupled with the corresponding geometry-consistency constraint. The geometry-consistency constraint reduces the space of possible solutions while keeping the correct solutions in the search space. Quantitative and qualitative comparisons with the baseline (GAN alone) and state-of-the-art methods, including CycleGAN and DistanceGAN, demonstrate the effectiveness of our method.
  • Item
    Causal discovery and forecasting in nonstationary environments with state-space models
    Huang, B ; Zhang, K ; Gong, M ; Glymour, C ; Chaudhuri, K ; Salakhutdinov, R (ICML Press, 2019)
    In many scientific fields, such as economics and neuroscience, we are often faced with nonstationary time series and are concerned with both finding causal relations and forecasting the values of variables of interest; both tasks are particularly challenging in nonstationary environments. In this paper, we study causal discovery and forecasting for nonstationary time series. By exploiting a particular type of state-space model to represent the processes, we show that nonstationarity helps to identify the causal structure and that forecasting naturally benefits from the learned causal knowledge. Specifically, we allow changes in both causal strengths and noise variances in the nonlinear state-space models, which, interestingly, renders both the causal structure and the model parameters identifiable. Given the causal model, we treat forecasting as a problem of Bayesian inference in the causal model, which exploits the time-varying property of the data and adapts to new observations in a principled manner. Experimental results on synthetic and real-world data sets demonstrate the efficacy of the proposed methods.
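
Illustrative sketches

The classical theta operator, theta = q d/dq, is the simplest example of a differential operator on modular forms mod p of the kind surveyed in the first item above: on q-expansions it sends sum a_n q^n to sum n a_n q^n (and mod p it raises the weight by p + 1). The minimal Python sketch below shows only this action on a truncated q-expansion; the coefficients used are the first terms of the discriminant form Delta, and the choice of p = 5 is arbitrary.

# Theta operator theta = q d/dq on a truncated q-expansion, reduced mod p:
# the coefficient a_n is sent to n * a_n.

def theta(coeffs, p):
    """Apply theta = q d/dq to a q-expansion [a_0, a_1, a_2, ...], reducing mod p."""
    return [(n * a) % p for n, a in enumerate(coeffs)]

# First coefficients of Delta = q - 24q^2 + 252q^3 - 1472q^4 + 4830q^5 - 6048q^6 + ...
delta_coeffs = [0, 1, -24, 252, -1472, 4830, -6048]
p = 5
print([a % p for a in delta_coeffs])   # Delta mod 5
print(theta(delta_coeffs, p))          # theta(Delta) mod 5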
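
The second item above computes Hecke eigenvalues of Siegel eigenforms by numerically evaluating the form and its Hecke image at explicit points rather than by manipulating many Fourier coefficients. The sketch below illustrates the same idea in the much simpler degree-one setting and is not the paper's degree-two algorithm: for the weight-12 eigenform Delta, the eigenvalue of T_2 is recovered as (T_2 Delta)(tau)/Delta(tau) at a single point tau and should come out close to tau(2) = -24. The truncation bound and the choice of evaluation point are arbitrary.

# Degree-one analogue of the evaluation idea: approximate a Hecke eigenvalue
# by evaluating an eigenform and its Hecke image at one point of the upper
# half-plane.  Toy illustration only; the paper treats degree-two Siegel forms.
import cmath

def delta(tau, terms=60):
    """Discriminant form Delta(tau) = q * prod_{n>=1} (1 - q^n)^24, q = exp(2*pi*i*tau)."""
    q = cmath.exp(2j * cmath.pi * tau)
    val = q
    for n in range(1, terms + 1):
        val *= (1 - q**n) ** 24
    return val

def hecke_Tp(f, tau, p, k):
    """(T_p f)(tau) = p^(k-1) * f(p*tau) + (1/p) * sum_{b mod p} f((tau + b)/p)."""
    return p**(k - 1) * f(p * tau) + sum(f((tau + b) / p) for b in range(p)) / p

tau0 = 0.23 + 1.1j            # any point in the upper half-plane works
p, k = 2, 12
eigenvalue = hecke_Tp(delta, tau0, p, k) / delta(tau0)
print(eigenvalue)             # should be close to tau(2) = -24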
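
The third item above identifies a small number of changing parameters by learning low-dimensional representations of class-conditional distributions across source domains. The toy numpy sketch below only illustrates this intuition: per-domain class-conditional parameters are stacked into a matrix and a singular value decomposition exposes that the changes live in a low-dimensional subspace. The synthetic data, the use of class means as the stacked parameters, and the plain SVD are all illustrative assumptions, not the estimator proposed in the paper.

# Toy illustration: class-conditional distributions change across domains,
# but the change is driven by a single parameter, which a low-dimensional
# decomposition of per-domain distribution parameters can expose.
import numpy as np

rng = np.random.default_rng(0)
n_domains, n_per = 4, 200
shifts = rng.normal(0.0, 2.0, size=n_domains)   # one scalar drives all the change

param_vectors = []
for d in range(n_domains):
    # Two classes; only the class means move across domains, along a fixed direction.
    x0 = rng.normal(loc=[0.0 + shifts[d], 0.0], scale=1.0, size=(n_per, 2))
    x1 = rng.normal(loc=[3.0 + shifts[d], 1.0], scale=1.0, size=(n_per, 2))
    # Stack simple class-conditional parameters (the two class means) for this domain.
    param_vectors.append(np.concatenate([x0.mean(axis=0), x1.mean(axis=0)]))

P = np.array(param_vectors)                      # (n_domains, 4) parameter matrix
P_centered = P - P.mean(axis=0)
_, svals, _ = np.linalg.svd(P_centered, full_matrices=False)
print(svals)   # one dominant singular value: the changes live in ~1 dimension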
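
The fourth item above constrains one-sided domain mapping by requiring the translation to be consistent with a predefined geometric transformation of the input. The PyTorch sketch below shows one plausible form of such a geometry-consistency term, using a 90-degree rotation as the transformation; the tiny generator and the exact form of the loss are illustrative assumptions, and the full GcGAN additionally involves adversarial training with a discriminator.

# Sketch of a geometry-consistency term: translate an image and its rotated
# copy, and ask the two outputs to agree up to the same rotation.
import torch
import torch.nn as nn

generator = nn.Sequential(            # stand-in for G_XY
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
)

def rot90(img):                       # predefined geometric transformation f
    return torch.rot90(img, k=1, dims=[2, 3])

def rot90_inv(img):                   # its inverse f^{-1}
    return torch.rot90(img, k=-1, dims=[2, 3])

def geometry_consistency_loss(x):
    y_from_x = generator(x)                   # G(x)
    y_from_fx = generator(rot90(x))           # G(f(x))
    # Geometry consistency: G(f(x)) should equal f(G(x)), enforced both ways.
    return (torch.mean(torch.abs(rot90_inv(y_from_fx) - y_from_x))
            + torch.mean(torch.abs(y_from_fx - rot90(y_from_x))))

x = torch.rand(2, 3, 64, 64)
print(geometry_consistency_loss(x).item())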
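
The fifth item above assumes a state-space model whose causal strengths and noise variances change over time. The short numpy simulation below generates data from one toy process of this kind, with a time-varying causal coefficient from the first latent variable to the second and a drifting noise scale; the specific functional forms are illustrative assumptions rather than the model used in the paper.

# Toy nonstationary state-space process: latent causal strength and noise
# variance drift over time, and observations are noisy copies of the states.
import numpy as np

rng = np.random.default_rng(1)
T = 500
x = np.zeros((T, 2))                       # latent states: x1 -> x2
y = np.zeros((T, 2))                       # observed series

for t in range(1, T):
    b_t = 0.8 * np.sin(2 * np.pi * t / T)  # time-varying causal strength x1 -> x2
    s_t = 0.2 + 0.3 * t / T                # time-varying noise scale
    x[t, 0] = 0.5 * x[t - 1, 0] + rng.normal(0, s_t)
    x[t, 1] = 0.5 * x[t - 1, 1] + b_t * x[t - 1, 0] + rng.normal(0, s_t)
    y[t] = x[t] + rng.normal(0, 0.1, size=2)  # observation noise

print(y[:5])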