School of Mathematics and Statistics - Theses

Search Results

Now showing 1 - 10 of 235
  • Item
    Mathematical Modelling of Plasmodium Vivax Transmission
    Anwar, Md Nurul ( 2023-08)
    Malaria is caused by Plasmodium parasites transmitted to humans by the bite of an infected Anopheles mosquito. Plasmodium vivax is distinct from other malaria species in its ability to remain dormant in the liver (as hypnozoites) and activate later to cause further infections (referred to as relapses). For this reason, P. vivax is currently the most geographically widespread malaria-causing parasite resulting in significant associated global morbidity and mortality. As around 79–96% of infections are attributed to relapses from activating hypnozoites, targeting the hypnozoite reservoir (i.e., the collection of dormant parasites) to eliminate P. vivax is crucial. Mathematical models to describe the transmission dynamics of P. vivax have been developed, but most fail to capture realistic hypnozoite dynamics. Models that capture the complexity tend to involve many governing equations, making them difficult to extend to incorporate other important factors for P. vivax, such as treatment status, age, and pregnancy. In this thesis, we have developed a multiscale model (a system of integro-differential equations) that involves a minimal set of equations at the population scale, with an embedded within-host model that captures the dynamics of the hypnozoite reservoir and accounts for superinfection and mosquito seasonality. In this way, we can gain critical insights into the dynamics of P. vivax transmission with a minimum number of equations at the population scale, making this framework readily scalable to incorporate more complexity. We use our multiscale model to study the effect of radical cure (drugs that affect hypnozoites) treatment administered via a mass drug administration (MDA) program accounting for superinfection (infectious bites and/or the activation of hypnozoites can trigger multiple infections). We explicitly model the impact of the radical cure drug on each of the hypnozoites and infections. 
An optimisation model with different objective functions motivated by public health considerations is constructed to obtain the MDA interval that optimally disrupts P. vivax transmission. Our work shows that the effect of MDA interventions is temporary (under the deterministic framework) and depends on the pre-intervention disease prevalence (and choice of model parameters), drug efficacy, and the number of MDA rounds under consideration. We found that prevalence alone is insufficient to determine optimal intervals between rounds of MDA in regions where seasonal variation in the mosquito population is minimal. However, when seasonal variation is present, prevalence can be considered a reliable measure for determining the optimal timing of MDAs. To study the impact of MDA with radical cure on P. vivax elimination, we re-implemented our model as a continuous-time non-Markovian stochastic model, as disease fadeout is not possible in a deterministic model. We found that more rounds of MDA improve the chance of P. vivax elimination, but that up to two MDA rounds have only a minimal effect on the probability of elimination (this also depends on other model parameters). To achieve a higher probability of elimination, MDA with a very high-efficacy drug should be considered. Furthermore, a simplified approach to MDA timings can provide results similar to those of the optimal approach. As our model captures the effect of hypnozoite dynamics on transmission and the effect of treatment on each hypnozoite, it has the potential to become a critical tool in answering public health questions related to P. vivax transmission.
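The within-host reservoir dynamics described above can be illustrated with a minimal sketch. This is not the thesis's integro-differential model: the single-compartment form and all parameter values (b, nu, alpha, mu) are assumptions chosen for illustration only. The expected hypnozoite reservoir grows through inoculation and shrinks through activation and death, with activations appearing as relapses.

```python
# Illustrative single-compartment sketch (all parameter values are
# assumptions, not taken from the thesis): H(t) is the expected number
# of hypnozoites in the liver-stage reservoir.
def simulate_reservoir(b=0.1, nu=8.0, alpha=1/330, mu=1/400, T=3000.0, dt=0.1):
    """Euler-integrate dH/dt = b*nu - (alpha + mu)*H.
    b: infectious-bite rate, nu: mean hypnozoites per bite,
    alpha: activation (relapse) rate, mu: hypnozoite death rate.
    Returns the final reservoir size and cumulative expected relapses."""
    H, relapses = 0.0, 0.0
    for _ in range(int(T / dt)):
        relapses += alpha * H * dt               # expected relapses this step
        H += (b * nu - (alpha + mu) * H) * dt    # inoculation minus clearance
    return H, relapses

H_end, total_relapses = simulate_reservoir()
steady = 0.1 * 8.0 / (1/330 + 1/400)  # analytic equilibrium b*nu/(alpha+mu)
```

The long-run reservoir settles at b*nu/(alpha + mu), which is why targeting the reservoir (raising mu or clearing H directly, as radical cure does) reduces the relapse rate alpha*H.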
  • Item
    Mathematical and computational models of re-epithelialisation
    Zanca, Adriana ( 2023-06)
    Wound healing is a complex process occurring across multiple temporal and spatial scales. In this thesis we use individual-based computational and mathematical models to explore the epidermal re-epithelialisation stage of wound healing. We focus specifically on the overlapping spheres, Voronoi tessellation and vertex dynamics models in two dimensions and the computational choices available for each model that may impact healing dynamics. We find that the overlapping spheres and Voronoi tessellation models are highly sensitive to model choice, whereas the vertex dynamics model is insensitive to model choice. Therefore, we use the vertex dynamics model to investigate six re-epithelialisation mechanisms in small, acute wounds: cell compression, proliferation, cell edge contractility, the purse-string mechanism, cell crawling in response to local cues and cell crawling towards a point source. The scale of the mechanism can affect not only the time scale of re-epithelialisation, but also the shape of the wound over time. The most significant mechanisms for larger wounds are cell proliferation and crawling in response to local cues. We incorporate these mechanisms into a model of a larger wound to study the interplay between cell migration and proliferation during re-epithelialisation. Drawbacks of the individual-based approach include the computational costs and limited mathematical frameworks for model analysis. Hence, we convert our two-dimensional individual-based model of a larger wound into a one-dimensional model and derive a partial differential equation continuum approximation that can provide further insights into the re-epithelialisation process. From the one-dimensional model we can investigate the relationship between mechanical relaxation and tissue growth and derive an upper bound on tissue growth over time. 
We conclude that increasing the amount of cell proliferation or migration alone is insufficient to promote re-epithelialisation and that there is an interdependence between migration and proliferation. Overall, this thesis contributes to the understanding of the mechanisms regulating re-epithelialisation of the epidermis. A better understanding of the re-epithelialisation process allows for more informed studies of wound treatments, with the aim of reducing the global burden of wounds.
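The one-dimensional model mentioned above can be illustrated with a minimal sketch, assuming linear-spring mechanics and invented parameter values (this is not the thesis's implementation): cell boundaries are points joined by overdamped springs, and a compressed tissue relaxes toward its rest spacing.

```python
# A sketch of a 1D cell chain (assumed linear springs and parameter
# values; not the thesis's model): overdamped mechanical relaxation.
def relax(positions, rest_length=1.0, k=1.0, eta=1.0, dt=0.01, steps=5000):
    """Forward-Euler integration of eta * dx_i/dt = net spring force,
    where adjacent boundary points are joined by linear springs."""
    x = list(positions)
    n = len(x)
    for _ in range(steps):
        forces = [0.0] * n
        for i in range(n - 1):
            f = k * ((x[i + 1] - x[i]) - rest_length)  # tension in spring i
            forces[i] += f
            forces[i + 1] -= f
        x = [xi + dt * fi / eta for xi, fi in zip(x, forces)]
    return x

# A compressed tissue (spacing 0.5) relaxes toward the rest spacing 1.0.
final = relax([0.5 * i for i in range(6)])
spacings = [final[i + 1] - final[i] for i in range(5)]
```

Mechanical relaxation alone conserves the tissue's centre of mass, so without a growth term the domain cannot expand; this is the interplay between relaxation and growth that the continuum approximation makes explicit.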
  • Item
    Quaternionic modular forms mod p
    Fam, Yiannis Heijun ( 2023-08)
In a 1987 letter, Serre proves that the systems of Hecke eigenvalues arising from mod p modular forms are the same as those arising from certain functions on the adelic points of D^*, where D is the unique quaternion algebra over Q ramified at p and infinity. We give a detailed account of this proof, the key idea of which is to restrict the study of mod p modular forms to the supersingular locus of the modular curve using the Hasse invariant, and we then extend the result to other level structures. We then incorporate additional ramification into the quaternion algebra D, which corresponds to the study of modular forms on Shimura curves.
  • Item
    Irreducible Representations of the Subregular Finite W-Algebras
    Westfold, Owen Mckinley ( 2023-03)
    Let g be a complex semisimple Lie algebra, and let O be a nilpotent adjoint orbit in g. The Slodowy slice is an affine subspace of g that intersects O transversely at a single point. If we assume in addition that g is simply-laced, then a classic theorem of Brieskorn states that in the particular case where O is the subregular orbit, the nilpotent Slodowy slice has a Kleinian (ADE) singularity at the intersection point; moreover, this singularity is classified by the same Dynkin diagram as g. Associated to every nilpotent orbit is a filtered associative algebra that is a ‘quantisation’ of the Slodowy slice — such algebras are known as finite W-algebras. In this thesis, we study the finite dimensional irreducible representations of the finite W-algebras associated to subregular nilpotent orbits. We show that the characteristic cycles of these representations (which may be defined using an analogue of the Beilinson-Bernstein localisation due to Ginzburg) coincide with the irreducible components of the exceptional locus of the minimal resolution of the singularity, where each component is counted with multiplicity 1. This then implies, by work of Losev, that the corresponding representation is 1-dimensional. To carry out this computation, we use techniques developed by Borho, Brylinski and MacPherson to compute the left characteristic cycles of the subregular primitive quotients of the universal enveloping algebra of g. These primitive quotients are known to ‘restrict’, in a certain sense, to the finite dimensional primitive quotients of the subregular finite W-algebra. We prove that the weak G-equivariance of the corresponding sheaves implies that this restriction is compatible with the operation of taking left characteristic cycles. 
This work is inspired by the Landau-Ginzburg/Conformal Field Theory Correspondence — a conjecture in mathematical physics that predicts a surprising equivalence between the representation theory of vertex operator algebras and matrix factorisations of the defining equation of the singularity. Our results suggest that in the ADE case, the Lie theoretic setting of Brieskorn's theorem may provide a mathematical context in which this equivalence can be realised.
  • Item
    Staff shift scheduling of a blood donor centre
    Madduma Wellalage, Achini Erandi ( 2023-03)
Australian Red Cross Lifeblood collects blood from non-remunerated voluntary donors. It is therefore important to encourage donors to return frequently by offering a satisfying service. Donor experience is adversely affected by prolonged waiting times, which may be reduced by determining the non-stationary staffing demand over the day. In addition, an optimal staff shift schedule that fulfills labour standards may improve staff satisfaction and also reduce staff non-utilised time. Hence, our objective is to implement a two-phase method that determines the optimal shift schedule for a typical day based on the predicted staffing demand. In the first phase, we determine the minimum staffing demand that ensures the system's predicted mean waiting time does not exceed a specified threshold. We define this threshold as the `waiting time target', which is set to six minutes according to a key performance indicator of Australian Red Cross Lifeblood. We propose a Monte Carlo simulation-based simulated annealing algorithm that seeks the minimum number of employees needed to meet the non-stationary staffing demand over a typical weekday, while ensuring the system's predicted mean waiting time does not exceed the waiting time target. To enhance the efficiency of our simulated annealing algorithm, we develop a novel neighbourhood search algorithm based on staff occupancy levels. The second phase entails finding an optimal shift schedule in terms of start times, shift lengths, and start times of the meal and rest breaks of each shift to meet the minimum staffing requirements. For this, we construct an integer linear program that fulfills all labour standards while minimising staff non-contact time. Numerical results based on four Australian Red Cross Lifeblood donor centres demonstrate that the proposed two-phase method can be adapted to any donor centre to determine the non-stationary minimum staffing demand and the optimal shift schedules.
The optimal shift schedules generated for all four donor centres demonstrate that the proposed approach can result in significant reductions of the total working hours per day compared to the current roster. Since the determined staffing demands in each donor centre ensure the donor waiting time target is met, they have the potential to improve donor satisfaction as well as to streamline the donor flow.
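The first phase can be illustrated with a simplified sketch. This is an assumption-laden stand-in: an Erlang C mean-wait formula replaces the thesis's Monte Carlo simulation, and all arrival rates, penalty weights, and cooling parameters are invented for illustration. Simulated annealing searches over integer staffing levels per period, penalising any period whose predicted mean wait exceeds the target.

```python
import math, random

def erlang_c_wait(c, lam, mu):
    """Mean queueing delay in an M/M/c system via the Erlang C formula."""
    a = lam / mu  # offered load
    if c <= a:
        return float("inf")  # unstable: queue grows without bound
    s = sum(a**k / math.factorial(k) for k in range(c))
    last = a**c / math.factorial(c) * c / (c - a)
    p_wait = last / (s + last)          # probability an arrival must queue
    return p_wait / (c * mu - lam)      # mean wait in queue

def anneal_staffing(lams, mu, target, iters=20000, seed=0):
    """Simulated annealing over integer staffing levels per period:
    minimise total staff, softly penalising periods whose mean wait
    exceeds `target`. A stand-in for a simulation-based objective."""
    rng = random.Random(seed)
    staff = [math.ceil(l / mu) + 3 for l in lams]  # generous feasible start
    def cost(s):
        pen = sum(max(0.0, erlang_c_wait(c, l, mu) - target)
                  for c, l in zip(s, lams))
        return sum(s) + 1e4 * pen
    best, best_cost = list(staff), cost(staff)
    temp = 5.0
    for _ in range(iters):
        cand = list(best if rng.random() < 0.2 else staff)
        i = rng.randrange(len(cand))
        cand[i] = max(1, cand[i] + rng.choice((-1, 1)))
        if cand[i] * mu <= lams[i]:     # reject unstable staffing outright
            continue
        d = cost(cand) - cost(staff)
        if d < 0 or rng.random() < math.exp(-d / temp):
            staff = cand
            if cost(staff) < best_cost:
                best, best_cost = list(staff), cost(staff)
        temp *= 0.9997
    return best

lams = [4.0, 8.0, 6.0]                  # assumed arrival rates per period
best = anneal_staffing(lams, mu=2.0, target=0.1)  # 0.1 h = six minutes
```

The second phase would then pack shifts (with break placements) over the resulting per-period minima via an integer linear program, which this sketch does not attempt.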
  • Item
    Homotopy coherent cyclic operads
    Elliott, Patrick Cuidean ( 2023-03)
We initiate the study of higher cyclic operads with involutive colour sets as cyclic dendroidal sets: presheaves on an appropriate category of trees. This theory lays the groundwork for an analogue of the Moerdijk-Cisinski program for cyclic operads. We introduce a homotopy coherent nerve from simplicial cyclic operads to cyclic dendroidal sets, and show it enjoys good homotopical properties. Finally, we introduce a new model for higher cyclic operads which is amenable to the construction of concrete examples.
  • Item
    A Bayesian hierarchical modelling framework for estimating parameters of stochastic epidemic models
    Alahakoon Mudiyanselage, Punya Tharangani Alahakoon ( 2023-02)
We consider models for multiple outbreaks of an infectious disease occurring in isolation from one another, such that the status of the epidemic in one (sub)population does not influence the evolution of the epidemic in another (sub)population. However, given the shared biological, epidemiological, behavioural and demographic characteristics of the (sub)populations, it is natural to conduct parameter estimation within a hierarchical statistical framework. This framework allows the outbreaks to be studied simultaneously at multiple levels. Standard approaches to Bayesian hierarchical analysis when the outbreaks are modelled as stochastic processes can be computationally prohibitive due to inefficiencies when sampling from the joint posterior distribution. To address this issue, we propose a two-step algorithm that takes advantage of parallel computing methods to estimate parameters within a hierarchical framework in a Bayesian context. This algorithm makes use of existing approximate Bayesian computation (ABC) methods. After introducing the required mathematical (Chapter 2), epidemiological (Chapter 3), and statistical (Chapter 4) background, we introduce the theory and usage of the novel algorithm (Chapter 5). We explain how our algorithm differs from other methods in the literature and highlight its suitability for the study of infectious disease and other biological data. We apply this algorithm to different biological questions through three simulation-based studies and applications to two epidemiological datasets. In particular, we use the simulation-based studies to examine the methodological aspects of the algorithm.
In these studies, we evaluate the probability of epidemic fade-out—extinction of the disease after the first major outbreak when it is possible to observe a second wave (Chapter 6), estimate the waning immunity rates of stochastic models with multiple outbreak data (Chapter 7), and use clustering to improve parameter estimation for stochastic epidemic models (Chapter 8). The two studies applied to epidemiological data explore the applicability of the algorithm. The first study considers outbreaks of influenza that took place on board Australian ships during the pandemic of 1918–19 (Chapter 9) while the second study looks at early COVID-19 outbreaks that took place in rural counties in the United States (Chapter 10).
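The flavour of the underlying ABC machinery can be sketched for a single toy outbreak (the prior, tolerance, and summary statistic here are illustrative assumptions, and this is not the thesis's two-step hierarchical algorithm): a stochastic SIR outbreak is simulated repeatedly, and transmission-rate draws are kept whenever the simulated final size lands close to the observed one.

```python
import random

def sir_final_size(beta, gamma, n=100, i0=2, rng=random):
    """Gillespie-style simulation of a stochastic SIR outbreak; returns
    the final size (number infected, excluding the i0 initial cases)."""
    s, i = n - i0, i0
    while i > 0:
        inf_rate = beta * s * i / n   # rate of the next infection event
        rec_rate = gamma * i          # rate of the next recovery event
        if rng.random() < inf_rate / (inf_rate + rec_rate):
            s, i = s - 1, i + 1
        else:
            i -= 1
    return n - i0 - s

def abc_rejection(observed, gamma=1.0, n_sims=2000, tol=5, seed=1):
    """ABC rejection: draw beta ~ U(0, 4), keep draws whose simulated
    final size is within `tol` of the observed final size."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        beta = rng.uniform(0.0, 4.0)
        if abs(sir_final_size(beta, gamma, rng=rng) - observed) <= tol:
            accepted.append(beta)
    return accepted

post = abc_rejection(observed=70)  # approximate posterior sample for beta
```

In a hierarchical setting this per-outbreak step is what can be parallelised, with a second step pooling the outbreak-level posteriors to estimate population-level hyperparameters.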
  • Item
    Practical applications of hypergraphs to modelling dynamical systems
    Diaz, Leo Paul Michael ( 2023-01)
    A current aim of systems biology is to build larger, more comprehensive models. While they are expected to be more informative than simpler, mathematically more convenient models, building such models is highly challenging. Indeed, without first principles to inform and constrain the models we can build, biological models are often constructed by trial and error and rely on a mixture of domain expertise and ad hoc choices. This process demands extensive manual curation, which clearly does not scale to building larger or more detailed models. This emphasises the need for new methods to efficiently construct and reproducibly analyse large models. This thesis is concerned with exploring how the mathematical abstraction provided by hypergraphs has the potential to address this need. This relies on the ability of hypergraphs to generally represent dynamical systems; concisely, vertices represent objects and are endowed with dynamical variables, and hyperedges represent n-ary relations between vertices. Hypergraphs generalise graphs, which already offer an intuitive way to represent a wide range of networked systems, as illustrated by the success of graph-based modelling formalisms such as chemical reaction networks and Petri nets. These formalisms are however limited by the inability of graphs to represent relations between more than two objects. This is a major limitation when higher-order interactions are commonplace in biological systems and crucial to understanding their dynamics. Hypergraphs are thus a more natural and accurate abstraction of such systems. Here, I precisely show how using hypergraphs to represent models of dynamical systems yields a highly abstract yet practical modelling framework with concrete uses. This relies on known relations between chemical reaction networks and Petri nets, here formally proven by abstracting each formalism and lifting it onto hypergraphs. 
Practically, these relations allow concepts defined on one formalism to carry over to the other via the shared language of hypergraphs. Using hypergraphs as a modelling framework presents two main advantages. First, it provides a rigorous way to efficiently build large models by composing independent submodels; this comes from hypergraphs being convenient objects to manipulate, both mathematically and algorithmically. Second, this framework provides a way to reproducibly analyse models. Specifically, the same methodology can be applied to different models identically, independent of the details of the model considered, as long as it is formulated in the general language hypergraphs provide. This work precisely demonstrates how the mathematical abstraction provided by hypergraphs is practically relevant to building and analysing the models of interest in systems biology. It provides a formal way of connecting distinct modelling formalisms via the abstract language of hypergraphs, therefore letting them act as an interface to bridge and integrate those formalisms. Furthermore, hypergraphs expose powerful mathematical tools that can be harnessed to more efficiently build and analyse models computationally. This is of direct relevance to help move away from ad hoc solutions and towards automating the modelling process, making hypergraphs an attractive framework to adopt given the current demand for larger, more accurate models.
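The hypergraph view of a reaction network can be made concrete with a small sketch (the function names, data layout, and Euler scheme are illustrative assumptions, not the thesis's framework): each reaction is a hyperedge relating reactant and product vertices, and mass-action dynamics follow directly from that structure.

```python
# A reaction network as a hypergraph (a sketch, not the thesis's framework):
# each hyperedge links reactant vertices to product vertices with a rate.
def mass_action_step(conc, hyperedges, dt):
    """One Euler step of mass-action kinetics on a reaction hypergraph.
    conc: {species: concentration}; hyperedges: (reactants, products, rate)."""
    deriv = {s: 0.0 for s in conc}
    for reactants, products, rate in hyperedges:
        flux = rate
        for s, stoich in reactants.items():
            flux *= conc[s] ** stoich            # mass-action propensity
        for s, stoich in reactants.items():
            deriv[s] -= stoich * flux
        for s, stoich in products.items():
            deriv[s] += stoich * flux
    return {s: c + dt * deriv[s] for s, c in conc.items()}

# A + B -> C is a single hyperedge relating three vertices at once --
# exactly the ternary relation an ordinary graph edge cannot express.
edges = [({"A": 1, "B": 1}, {"C": 1}, 1.0)]
conc = {"A": 1.0, "B": 1.0, "C": 0.0}
for _ in range(10000):
    conc = mass_action_step(conc, edges, 0.001)
```

Because the stepping routine only reads the hyperedge structure, the same code serves any network expressed in this form, which is the reproducibility benefit the abstraction is meant to deliver.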
  • Item
    A unified model of gene family evolution
    Li, Qiuyi ( 2023-01)
The evolution of gene families is complex. It involves gene-level evolutionary events such as gene duplications (D), horizontal gene transfers (T), and gene losses (L), together with population-level processes such as incomplete lineage sorting (ILS). These processes are usually modelled independently. However, ILS can interact with DTL to affect gene copy number polymorphism, resulting in discrepancies in gene copy number which we refer to as copy number hemiplasy (CNH). To properly account for CNH, we propose a new model of gene family evolution, the multilocus multispecies coalescent (MLMSC). The MLMSC model captures all possible scenarios that can arise through ILS, DTL, and the interaction between these processes, in addition to the linkage between loci. Compared to existing models in the literature, the MLMSC offers a more flexible and arguably more realistic framework for phylogenetic simulation and inference. We then study the effect of CNH by comparing the MLMSC model with the DLCoal model, which does not model CNH. The gene trees simulated under MLMSC and DLCoal differ in various summary statistics. Most importantly, the number of genes (leaves) in the gene tree is greatly reduced under CNH, which suggests that the traditional method of estimating duplication rates (by matching the average number of genes with the expectation under the duplication-loss birth-death process) becomes inaccurate. The simulated gene trees are also used to compare the accuracy of the species tree inference methods ASTRAL and ASTRAL-Pro. Based on simulations calibrated on real data, the accuracy of ASTRAL and ASTRAL-Pro may have been overestimated if CNH is not taken into account. Lastly, based on the MLMSC model, we propose a parsimony-based heuristic reconciliation method named IxDL. IxDL takes into account D, L, ILS, and the interaction between these processes (CNH) to explain the incongruence between a gene tree and the species tree.
By inferring duplications, IxDL decomposes a multi-labelled gene tree into multiple single-labelled haplotype trees. A haplotype reconciliation is inferred between each haplotype tree and the unilocus tree, and the full reconciliation is then obtained by combining all haplotype reconciliations. Due to its heuristic nature, IxDL is not guaranteed to infer the most parsimonious reconciliation. However, its main virtue is time efficiency, and simulation results show that its accuracy remains high in practice, especially for duplication inference.
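The duplication-loss birth-death process mentioned above — whose expected gene count the traditional rate-estimation approach matches — can be sketched as follows (parameter values are illustrative assumptions): each gene copy duplicates at rate `dup` and is lost at rate `loss`, and the expected copy number grows as exp((dup - loss) * t).

```python
import math, random

def gene_count(t_end, dup=0.3, loss=0.1, n0=1, rng=random):
    """Gillespie simulation of a linear duplication-loss birth-death process:
    each gene copy independently duplicates at rate `dup` and is lost at
    rate `loss`; returns the copy number at time t_end."""
    t, n = 0.0, n0
    while n > 0:
        t += rng.expovariate(n * (dup + loss))   # time to next event
        if t >= t_end:
            break
        n += 1 if rng.random() < dup / (dup + loss) else -1
    return n

rng = random.Random(42)
mean = sum(gene_count(2.0, rng=rng) for _ in range(20000)) / 20000
expected = math.exp((0.3 - 0.1) * 2.0)  # E[N(t)] = n0 * exp((dup-loss) * t)
```

If CNH removes leaves from the simulated gene trees, matching the observed mean count against this expectation biases the inferred duplication rate downward, which is the inaccuracy the abstract points to.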
  • Item
    Instance Space Analysis for the Maximum Flow Problem
    Alipour, Hossein ( 2023)
The Maximum Flow Problem (MFP) is a fundamental problem in network flow theory, for which many algorithms, supported by strong theoretical worst-case analyses, have been proposed. Due to its theoretical and practical importance, designing effective algorithms for the MFP remains a focus of research efforts. Although worst-case performance analysis is the main tool for examining these algorithms, their practical efficiency depends on the network structure, making it unclear which algorithm is best for a particular instance or class of MFP. Instance Space Analysis (ISA) is a methodology that provides insights into such per-instance analysis. In this thesis, the instance space of the MFP is constructed and analysed for the first time. To this end, novel features are extracted from the networks, capturing the performance of MFP algorithms. Additionally, this thesis expands the ISA methodology by addressing the issue of how benchmark instances should be selected to reduce bias in the analysis. Using the enhanced ISA (EISA) methodology with the MFP as the case study, this thesis demonstrates that the most important features can be detected, and machine learning methods can identify their impact on algorithm performance, whilst reducing the bias caused by over-representation within the selected sample of test instances. The enhanced methodology enables new insights into the performance of state-of-the-art general-purpose MFP algorithms, as well as recommendations for the construction of comprehensive and unbiased benchmark test suites for MFP algorithm testing. To gain further insight into the strengths and weaknesses of the algorithms, CPU time and counts of fundamental operations are used as algorithm performance measures. Using EISA, it is revealed that the arc/path-finding strategies employed by the algorithms explain critical differences in the algorithms' behaviours.
We leverage these insights to propose two new initialisation strategies, which are an essential part of the arc/path finding strategy. To employ these new strategies on our previously studied algorithms, we propose modifications that result in 15 new algorithmic variants. We examine the impact of these proposed initialisation strategies on the algorithm performance and discuss the conditions under which each initialisation strategy is expected to improve the performance. The implementations of the new algorithmic variants are also presented. Finally, the limitations of the study and opportunities for interesting future work are discussed.
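The role of the arc/path-finding strategy can be seen in a minimal augmenting-path implementation (a textbook Edmonds-Karp sketch, not one of the thesis's studied algorithms or proposed variants): the BFS path-selection rule below is precisely the component that initialisation and path-finding variants modify.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly augment along a BFS shortest path in the
    residual network. The path-selection rule is the `strategy' that
    algorithmic variants change."""
    n = len(capacity)
    residual = [row[:] for row in capacity]
    flow = 0
    while True:
        # Path-finding step: BFS for a shortest augmenting path.
        parent = [-1] * n
        parent[source] = source
        q = deque([source])
        while q and parent[sink] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and residual[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[sink] == -1:
            return flow                 # no augmenting path remains
        # Find the bottleneck capacity along the path, then push flow.
        bottleneck, v = float("inf"), sink
        while v != source:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        v = sink
        while v != source:
            residual[parent[v]][v] -= bottleneck
            residual[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck

cap = [[0, 10, 10, 0],
       [0, 0, 2, 8],
       [0, 0, 0, 9],
       [0, 0, 0, 0]]
f = max_flow(cap, 0, 3)  # augments 8 via 0-1-3, then 9 via 0-2-3
```

Swapping the BFS for a different search order or warm-starting the residual network changes which augmenting paths are found first, which is exactly the kind of behavioural difference instance-space features can expose.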