PhD projects

Applied Mathematics

  • Parallel and Stochastic Methods in Lattice-Based Cryptography

    This project will build on recent work that aims to test the security of the base problems in lattice-based cryptography. These are computational lattice problems (e.g., learning with errors) that are conjectured to be hard to solve. Such problems are central to cryptography, whose importance is shown by its prevalence throughout today’s world as a backbone of the global internet infrastructure. Current methods for solving these problems tend to be effective but particularly slow (e.g., Arora-Ge, which rewrites a problem instance and then solves it by Gaussian elimination, with a subsequent revision producing solutions using Groebner bases; and Blum-Kalai-Wasserman).

    The project will develop parallel and/or stochastic (and likely hybridised) alternative symbolic computation methods to attack these kinds of problems, and will utilise HPC and GPU architectures to aim for fast solutions. The student would need good mathematical and computer programming skills; knowledge of HPC or GPU programming in MPI, PyCUDA or Theano would be an advantage.
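    As a concrete illustration of the base problem, a toy learning-with-errors (LWE) instance can be generated in a few lines of Python with NumPy. The parameters below are purely illustrative, far too small to be cryptographically secure:

```python
import numpy as np

def lwe_sample(n=8, m=16, q=97, sigma=1.0, seed=0):
    """Generate a toy LWE instance (A, b) with b = A s + e (mod q).
    n: secret dimension, m: number of samples, q: modulus,
    sigma: width of the (rounded) Gaussian error."""
    rng = np.random.default_rng(seed)
    s = rng.integers(0, q, size=n)                          # secret vector
    A = rng.integers(0, q, size=(m, n))                     # public matrix
    e = np.rint(rng.normal(0, sigma, size=m)).astype(int)   # small error
    b = (A @ s + e) % q
    return A, b, s, e

A, b, s, e = lwe_sample()
# Search-LWE: recover s given only (A, b); the small error e is what
# makes plain Gaussian elimination fail and motivates Arora-Ge, BKW, etc.
assert ((A @ s + e) % 97 == b).all()
```

    Without the error term the system would be exactly solvable by linear algebra; the noise is precisely what the Arora-Ge and Blum-Kalai-Wasserman attacks must work around.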

    Supervisor: Dr Matthew Craven
    Second Supervisor: Dr Daniel Robertz

Pure Mathematics

  • Integrability conditions and computer algebra

    Various dynamical systems are described by differential equations. In control-theoretic problems (e.g., in robotics), it is essential to determine and utilise the degrees of freedom of a system. The aim of this project is to develop new algorithmic methods for determining integrability conditions for systems of partial differential equations (PDEs). Certain effective techniques in differential algebra are readily available, e.g., to determine all power series solutions of a system of PDEs. However, for concepts such as Bäcklund transformations or Lax pairs, which play a significant role in the theory of integrable systems, no systematic, effective way of finding these relationships between systems of PDEs is known. This project will build on differential geometry, jet calculus, Lie symmetries and differential algebra to approach these concepts.

    The new methods should be implemented in computer algebra software, preferably as an extension of existing Maple packages, such as Janet and DifferentialThomas. The available resources for High Performance Computing should be used, developing parallelised methods whenever possible.
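    The simplest instance of an integrability condition arises already for a first-order system in one unknown function. For

```latex
\[
u_x = f(x, y, u), \qquad u_y = g(x, y, u),
\]
\]
equating the cross-derivatives $u_{xy} = u_{yx}$ (computed by the chain rule along each equation) yields the compatibility condition
\[
f_y + g\, f_u \;=\; g_x + f\, g_u .
\]
```

    When this condition holds identically, power series solutions exist through every point; algorithms such as those in the Janet and DifferentialThomas packages systematise exactly this kind of cross-derivative analysis for general systems.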

    Supervisor: Dr Daniel Robertz
    Second Supervisor: Dr Colin Christopher



  • Data-driven testing for sample selection bias

    Non-random sample selection is commonplace in empirical studies: it arises when an outcome variable of interest is available only for a restricted, non-random subsample of the data. This often occurs in sociological, medical and economic studies, where individuals systematically select themselves into (or out of) the sample based on a combination of observed and unobserved characteristics. Estimates based on models that ignore such non-random selection may be biased and inconsistent. The aim of this project is to develop new testing procedures for the presence of sample selection bias.

    In its classical form, the sample selection model consists of two equations, which model the probability of inclusion in the sample and the outcome variable through a set of available predictors, together with a joint bivariate distribution linking the two equations. The project will build on the recently introduced framework of generalised sample selection models, which incorporates regression splines in order to deal with non-linear covariate-response relationships and tackles non-normal bivariate distributions between the model equations through the use of copulae. The absence of sample selection bias in such models is equivalent to a product copula. The testing procedures considered in the project will also include a model selection step, through the choice of a copula, thus yielding flexible, data-driven methods of testing. The newly proposed testing procedures will be compared with existing methods in a simulation study in which their empirical power and empirical significance level will be investigated.
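    For orientation, the classical two-equation form can be written, for individual $i$, as

```latex
\begin{align*}
s_i^{*} &= \mathbf{w}_i^{\top}\boldsymbol{\gamma} + u_i,
  & s_i &= \mathbb{1}(s_i^{*} > 0) && \text{(selection equation)}\\
y_i^{*} &= \mathbf{x}_i^{\top}\boldsymbol{\beta} + \varepsilon_i,
  & y_i &= y_i^{*} \ \text{observed only if}\ s_i = 1 && \text{(outcome equation)}
\end{align*}
```

    where $(u_i, \varepsilon_i)$ follows a joint bivariate distribution. Sample selection bias is absent exactly when this joint distribution factorises, i.e., when the copula linking the two equations is the product (independence) copula.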

    Supervisor: Dr Malgorzata Wojtys

  • Bayesian approach to complex evidence synthesis with medical applications

    This project is concerned with the development and application of statistical methods for combining evidence from clinical trials, in order to produce more reliable estimates and support informed, evidence-based decisions for health.

    Most health technology assessments require making use of all available evidence. A meta-analysis is often conducted to obtain an overall effect estimate by combining results from clinical trials that answer similar research questions. There is growing awareness that parameters in the random-effects meta-analysis model are often imprecisely estimated. Our recent work on constructing predictive distributions from the Cochrane Database demonstrates that using informative priors leads to more precise inference.

    This project is envisaged to encompass both methodology and medical applications. We propose to extend current Bayesian methods from univariate to multivariate meta-analysis, allowing simultaneous comparison of all treatment options. We will develop methods to include informative prior information in multivariate meta-analysis, resulting in improved precision of estimation and clearer clinical decisions. Key considerations will be making the best use of all available evidence and informative priors, and producing a treatment ranking, with its uncertainty, for each outcome. The proposed methods will be applied to real-life clinical examples.
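    In the univariate case, the random-effects model that the project would extend can be written, for study $i$ with effect estimate $y_i$ and within-study variance $s_i^2$, as

```latex
\[
y_i \sim N(\theta_i,\, s_i^2), \qquad \theta_i \sim N(\mu,\, \tau^2),
\]
```

    with priors placed on the overall effect $\mu$ and the between-study variance $\tau^2$. An informative prior on $\tau^2$, e.g., one derived from a large database of past meta-analyses, is what sharpens inference when only a few studies are available; the multivariate extension replaces the normal distributions above with multivariate analogues across outcomes.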

    Supervisor: Dr Yinghui Wei


  • High-Dimensional Big Data Modelling with Applications to Child Health and Forensic Science

    This project will build on recent work that developed models giving clinicians a better understanding of changes in children’s eyes as they get older. The models for visual acuity measurements made on each eye were based on two-dimensional probability density functions called copulas, and on smooth curves called splines that relate the shape of the copula density to the covariate age. The challenges addressed by this project are the extension of the modelling methodology from two to many dimensions and from one to many covariates. The main tool for addressing this problem will be a very flexible way of generating high-dimensional probability density functions based on a tree-based representation called a vine. These models will help us to understand how the dependencies between many variables change with other information. In particular, multiple testing of sick children can lead to higher-than-necessary referral rates, which our methodology will reduce by taking proper account of the dependencies between tests in the light of other diagnostic information. The project will also work with data from forensic science, where the aim will be to understand how measurements of long human bones depend on measurements taken from the skull.
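    For example, in three dimensions a vine (pair-copula construction) factorises the joint density into univariate marginals and bivariate copula densities; for one possible tree structure,

```latex
\[
f(x_1, x_2, x_3) \;=\; f_1(x_1)\, f_2(x_2)\, f_3(x_3)\;
c_{12}\!\bigl(F_1(x_1), F_2(x_2)\bigr)\;
c_{23}\!\bigl(F_2(x_2), F_3(x_3)\bigr)\;
c_{13|2}\!\bigl(F(x_1 \mid x_2), F(x_3 \mid x_2)\bigr).
\]
```

    Only bivariate building blocks are ever needed, which is what makes the approach scale to many dimensions and allows each pair copula to depend on covariates such as age.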

    Supervisor: Dr Luciana Dalla Valle
    Second Supervisor: Dr Julian Stander

  • Dynamic Social Media Information Extraction

    This project will build on recent work concerning the extraction of information from social media such as Facebook and Twitter. It will develop methodology to provide a dynamic understanding of sentiments expressed on social media, including sentiments represented by emoticons, and to relate these to events such as news stories and stock market fluctuations. It will use techniques from time series and even spatio-temporal modelling to differentiate between long-term underlying sentiments and ephemeral ones. Results will be disseminated by means of an R package and also by a Shiny app that will, for example, provide a user who inputs a topic and a time frame with a detailed understanding of the changing nature of sentiments about that topic. The student would attend a course on scientific communication, such as the Communication Skills Course offered by the Royal Society, thereby making broad steps towards becoming an effective scientific communicator.
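    The long-term-versus-ephemeral distinction can be illustrated with a crude stand-in for the proposed time-series models: a centred moving average separates a slowly varying trend from short-lived spikes. This sketch is in Python for concreteness (the project itself envisages an R package); all names are illustrative:

```python
import numpy as np

def decompose_sentiment(scores, window=7):
    """Split a daily sentiment series into a slowly varying trend
    (centred moving average) and an ephemeral residual.
    A deliberately simple stand-in for proper time-series modelling."""
    scores = np.asarray(scores, dtype=float)
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(scores, pad, mode="edge")   # extend ends to avoid shrinkage
    trend = np.convolve(padded, kernel, mode="valid")
    residual = scores - trend
    return trend, residual

# Synthetic example: a slow upward drift plus a one-day spike
# (e.g., a news story) on day 15.
t = np.arange(30)
series = 0.02 * t
series[15] += 1.0
trend, resid = decompose_sentiment(series)
# The spike shows up almost entirely in the residual, not the trend.
```

    Real modelling would replace the moving average with, for example, a state-space or spatio-temporal model, but the decomposition idea is the same.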

    Supervisor: Dr Julian Stander
    Second Supervisor: Dr Luciana Dalla Valle

Theoretical Physics

  • Lattice QCD calculations of the hadronic corrections to the anomalous magnetic moment of the muon

    There has been a long-standing tension between the experimentally measured anomalous magnetic moment of the muon and the theoretical prediction from the Standard Model of particle physics. This deviation between experiment and theory may mean that new particles need to be added to the Standard Model, and thus that a deeper theory is required. However, there is some uncertainty in the hadronic corrections to the theoretical calculation, and unless these calculations are done reliably we will never know whether new particles are required. One way to compute the required hadronic corrections is to use a technique called lattice QCD on large supercomputers, such as those owned by DiRAC. The project will involve computing isospin-breaking contributions to the hadronic corrections to the anomalous magnetic moment of the muon using large-scale computing.
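    The quantity in question is

```latex
\[
a_\mu \;=\; \frac{g_\mu - 2}{2},
\]
and the leading hadronic correction can be obtained on the lattice from the electromagnetic vacuum polarisation; schematically,
\[
a_\mu^{\mathrm{HVP}} \;=\; \left(\frac{\alpha}{\pi}\right)^{2}
\int_0^{\infty} \! dQ^2 \, f(Q^2)\, \hat{\Pi}(Q^2),
\]
```

    where $f(Q^2)$ is a known kernel and $\hat{\Pi}(Q^2)$ is the subtracted vacuum polarisation, computed from two-point functions of the vector current in lattice QCD simulations.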

    (Image: the Muon g-2 storage ring in its detector hall, amidst electronics racks, the muon beamline and other equipment. This impressive experiment operates at minus 450 degrees Fahrenheit and studies the precession, or wobble, of muons as they travel through the magnetic field.)
  • Signatures of strongly coupled extensions of the Standard Model


    The so-called lattice approach is a very successful first-principles method for solving gauge theories. Calculations rely on large-scale simulations and are typically run on the largest supercomputers. Lattice simulations are a unique tool for exploring non-perturbative phenomena in theories that are not well understood. In Nature, non-perturbative phenomena give rise to the mass of the ordinary proton, which comes mostly from the binding energy of its constituents, the quarks. Tremendous efforts are being made to design extensions of the Standard Model of particle physics using similar mechanisms that could, for instance, explain the mass and properties of the Higgs boson.

    In this project we will use lattice simulations to explore new non-perturbative dynamics and provide quantitative results relevant to experiments searching for new physics, such as those performed at the world’s largest accelerator, the Large Hadron Collider (LHC). The project will also provide a deeper understanding of non-perturbative phenomena in particle physics and push back the knowledge frontier. It will rely heavily on code development, numerical simulations, data analysis and the theoretical development of new methods.

    Supervisor: Dr Vincent Drach

  • Hybrid Mesons in lattice QCD

    QCD is the basic theory behind nuclear physics. Quarks and gluons combine to form bound states. Currently the only confirmed types of bound state found in experiments are mesons (a quark and an antiquark bound together) and baryons (a collective state of three quarks), but QCD allows other possibilities. One particularly exciting type of bound state is the hybrid meson, in which excited glue joins the quark and antiquark to form a totally new class of particle. There is a vigorous experimental programme around the world searching for hybrid mesons, at the Jefferson Lab in the USA, at PANDA in Germany, and in experiments at CERN. Accurate theoretical predictions from lattice QCD are essential to the search for these novel particles. The PhD project will involve calculating the masses of exotic hybrid mesons with heavy quarks, using large-scale numerical calculations on supercomputers. The student will learn the techniques of numerical lattice QCD calculations and high-performance computing.

    Supervisor: Dr Craig McNeile