This project will build on recent work that aims to test the security of the base problems in lattice-based cryptography. These are computational lattice problems (e.g., learning with errors) that are conjectured to be hard to solve. Such problems are central to cryptography, whose importance is evident from its prevalence throughout today's world as a backbone of the global internet infrastructure. Current methods for solving these problems tend to be effective but slow: for example, the Arora–Ge attack, which rewrites a problem instance as a larger linear system and solves it by Gaussian elimination (with later refinements producing solutions via Gröbner bases), and the Blum–Kalai–Wasserman algorithm.
The project will develop alternative symbolic computation methods, likely hybridised, parallel and/or stochastic, to attack these kinds of problems, and will utilise HPC and GPU architectures to aim for a fast solution. The student will need good mathematical and computer programming skills; knowledge of HPC or GPU programming in MPI, PyCUDA or Theano would be an advantage.
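As context, a learning-with-errors instance can be generated in a few lines. The dimensions, modulus and error range below are illustrative toy values, not parameters drawn from the project:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 16, 64, 97                  # toy dimension, sample count, modulus (not secure)
s = rng.integers(0, q, size=n)        # secret vector
A = rng.integers(0, q, size=(m, n))   # public random matrix
e = rng.integers(-2, 3, size=m)       # small error terms
b = (A @ s + e) % q                   # LWE samples are the pairs (A, b)
```

Without the error term e, Gaussian elimination over Z_q would recover s exactly; it is precisely the small noise that makes the problem conjecturally hard and forces attacks such as Arora–Ge or BKW.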
Various dynamical systems are described by differential equations. In control theoretic problems (e.g., in robotics), it is essential to determine and utilise the degrees of freedom of a system. The aim of this project is to develop new algorithmic methods for determining integrability conditions for systems of partial differential equations (PDEs). Certain effective techniques in differential algebra are readily available, e.g., to determine all power series solutions of a system of PDEs. However, for concepts such as Bäcklund transformations or Lax pairs, which play a significant role in the theory of integrable systems, no systematic effective way of finding these relationships between systems of PDEs is known. This project will build on differential geometry, jet calculus, Lie symmetries and differential algebra to approach these concepts.
The new methods should be implemented in computer algebra software, preferably as an extension of existing Maple packages, such as Janet and DifferentialThomas. The available resources for High Performance Computing should be used, developing parallelised methods whenever possible.
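Although the project targets Maple packages such as Janet and DifferentialThomas, the flavour of an integrability condition can be sketched in Python with SymPy (an illustrative assumption, not the project's toolchain). For the simple overdetermined system u_x = f(x, y), u_y = g(x, y), a solution u exists if and only if the cross-derivative condition f_y = g_x holds:

```python
import sympy as sp

x, y = sp.symbols('x y')

def compatible(f, g):
    """Integrability condition for u_x = f(x, y), u_y = g(x, y):
    the system is solvable iff the mixed partials agree, f_y == g_x."""
    return sp.simplify(sp.diff(f, y) - sp.diff(g, x)) == 0
```

For example, compatible(y, x) holds (with solution u = x*y), while compatible(y, -x) fails, so that system has no solution.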
In twistor theory, the points of Minkowski space-time are picked out by the families of null geodesics through them. In relativity, each such family has the conformal structure of a sphere, interpreted in twistor theory as a Riemann sphere, or in other words a complex projective line. These complex projective lines lie inside a three-dimensional complex manifold called twistor space.
The Penrose transform yields an isomorphism between analytic zero-rest-mass free fields on Minkowski space-time and elements of the first sheaf cohomology on twistor space. This transform arises from the double fibration shown in the figure.
The same double fibration appears in a quite different context: to an arrangement of hyperplanes (a polytope) one can associate both a matroid and an orbit in a Grassmannian. Then the double fibration can be used to calculate the Tutte polynomial of the matroid, and hence a volume of the polytope.
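For small examples, the Tutte polynomial can be computed directly by the deletion–contraction recurrence; a minimal exponential-time Python sketch (graphs given as edge lists, SymPy used for the polynomial arithmetic, both illustrative choices rather than the project's tools):

```python
import sympy as sp

x, y = sp.symbols('x y')

def tutte(edges):
    """Tutte polynomial of a multigraph via deletion-contraction.
    Loops contribute y, bridges contribute x, other edges recurse on
    deletion plus contraction. Small graphs only (exponential time)."""
    if not edges:
        return sp.Integer(1)
    (u, v), rest = edges[0], edges[1:]
    if u == v:                             # loop
        return y * tutte(rest)
    # check whether v is still reachable from u in G - e (bridge test)
    adj = {}
    for a, b in rest:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, stack = {u}, [u]
    while stack:
        w = stack.pop()
        for nb in adj.get(w, ()):
            if nb not in seen:
                seen.add(nb)
                stack.append(nb)
    contracted = [(u if a == v else a, u if b == v else b) for a, b in rest]
    if v not in seen:                      # bridge: must contract
        return x * tutte(contracted)
    return tutte(rest) + tutte(contracted)
```

For the triangle this recovers the familiar T(K3) = x^2 + x + y.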
But volumes of polytopes also appear in recent work in twistor field theory, as certain scattering amplitudes. This project will examine some simple examples of these constructions, in order to investigate how they are related.
Non-random sample selection is commonplace in empirical studies: it arises when the outcome variable of interest is available only for a restricted, non-random subsample of the data. This often occurs in sociological, medical and economic studies, where individuals systematically select themselves into (or out of) the sample based on a combination of observed and unobserved characteristics. Estimates based on models that ignore such non-random selection may be biased and inconsistent. The aim of this project is to develop new testing procedures for the presence of sample selection bias.
In its classical form, the sample selection model consists of two equations, which model the probability of inclusion in the sample and the outcome variable through a set of available predictors, together with a joint bivariate distribution linking the two equations. The project will build on the recently introduced framework of generalised sample selection models, which incorporates regression splines to deal with non-linear covariate-response relationships and tackles non-normal bivariate distributions between the model equations through the use of copulae. The absence of sample selection in such models is equivalent to a product copula. The testing procedures considered in the project will also include a model selection step, through the choice of a copula, thus yielding flexible data-driven methods of testing. In the project, the newly proposed testing procedures will be compared with existing methods in a simulation study where their empirical power and empirical significance level will be investigated.
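A toy simulation shows the bias that motivates such tests (the coefficients and bivariate-normal error structure below are assumed for illustration): when the errors of the selection and outcome equations are correlated and selection depends on the covariate, ordinary least squares on the selected subsample underestimates the true slope of 2.

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho = 200_000, 0.7
# correlated errors of the selection and outcome equations
e = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
z = rng.normal(size=n)                          # predictor in the selection equation
x = rng.normal(size=n)                          # predictor in the outcome equation
selected = (0.5 * z + 0.8 * x + e[:, 0]) > 0    # selection equation
y = 1.0 + 2.0 * x + e[:, 1]                     # outcome equation, true slope 2
# naive OLS on the observed (selected) subsample only
X = np.column_stack([np.ones(selected.sum()), x[selected]])
intercept_hat, slope_hat = np.linalg.lstsq(X, y[selected], rcond=None)[0]
```

Under independent errors (a product copula) the naive fit would be consistent; the correlation is what the testing procedures aim to detect.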
Supervisor: Dr Malgorzata Wojtys
This project is concerned with the development and application of statistical methods for combining evidence from clinical trials, to produce more reliable estimates and support informed, evidence-based decisions in health care.
Most health technology assessments require making use of all available evidence. A meta-analysis is often conducted to obtain an overall effect estimate by combining results from clinical trials that address similar research questions. There is growing awareness that parameters in the random-effects meta-analysis model are often imprecisely estimated. Our recent work on constructing predictive distributions from the Cochrane Database demonstrates that using informative priors leads to more precise inference.
This project is envisaged to encompass both methodologies and their medical applications. We propose to extend current Bayesian methods from univariate meta-analysis to multivariate meta-analysis, to allow for simultaneous comparison of all treatment options. We will develop methods to include informative prior information into multivariate meta-analysis. This will result in improved precision in estimation, leading to clearer clinical decisions. Key considerations will be to make best use of all available evidence, informative priors, and produce treatment ranking and its uncertainty for each outcome. The proposed methods will be applied to real-life clinical examples.
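As background, the classical (non-Bayesian) random-effects pooling that the proposed Bayesian methods would refine can be sketched with the standard DerSimonian–Laird estimator; this is shown purely for orientation and is not the project's proposed methodology:

```python
import numpy as np

def random_effects_meta(y, v):
    """DerSimonian-Laird random-effects meta-analysis.
    y: per-study effect estimates; v: their within-study variances.
    Returns pooled estimate, its standard error, and the
    between-study variance tau^2."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v
    k = len(y)
    mu_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - mu_fixed) ** 2)            # Cochran's Q statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)             # between-study variance
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    mu = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return mu, se, tau2
```

When the studies agree exactly, tau^2 collapses to zero and the estimate reduces to the fixed-effect pooled mean; heterogeneous studies yield tau^2 > 0 and a wider interval.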
Supervisor: Dr Yinghui Wei
This project will build on recent work that developed models providing clinicians with a better understanding of changes in children's eyes as they get older. The models for visual acuity measurements made on each eye were based on two-dimensional probability density functions called copulas and smooth curves called splines that relate the shape of the copula density to the covariate age. The challenges that will be addressed by this project are the extension of the modelling methodology from two to many dimensions and from one to many covariates. The main tool for addressing this problem will be a very flexible way of generating high-dimensional probability density functions based on a tree-based representation called a vine. These models will help us to understand how the dependencies between many variables change with other information. In particular, multiple testing of sick children can lead to higher than necessary referral rates, which our methodology will reduce by taking proper account of the dependencies between tests, in the light of other diagnostic information. This project will also work with data from forensic science, where the aim will be to understand how measurements of long human bone structures depend on measurements taken from the skull.
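The basic building block of such models, a copula coupling uniform margins, can be illustrated in a few lines. The Gaussian copula and parameter values below are illustrative assumptions, not the models used in the project: the sketch samples a bivariate Gaussian copula and checks Spearman's rank correlation against its known closed form 6/π · arcsin(ρ/2).

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
rho, n = 0.6, 50_000
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
# push each margin through the standard normal CDF: the result has
# uniform margins, and the dependence that remains is the Gaussian copula
phi = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0))))
u = phi(z)

def spearman(a, b):
    """Spearman rank correlation via ranks and Pearson correlation."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

rho_s_hat = spearman(u[:, 0], u[:, 1])
rho_s_theory = 6.0 / np.pi * np.arcsin(rho / 2.0)
```

Vine models extend this idea to many dimensions by gluing bivariate copulas together along a tree structure.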
This project will build on recent work concerning the extraction of information from social media such as Facebook and Twitter. It will develop methodology to provide a dynamic understanding of sentiments expressed on social media, including sentiments represented by emoticons, and to relate these to events such as news stories and stock market fluctuations. It will use techniques from time series and even spatio-temporal modelling to differentiate between long-term underlying sentiments and ephemeral ones. Results will be disseminated by means of an R package and also by a Shiny app that will, for example, provide a user who inputs a topic and a time frame with a detailed understanding of the changing nature of sentiments about that topic. The student would attend a course on scientific communication, such as the Communication Skills Course offered by the Royal Society. In this way, the student would take substantial steps towards becoming an effective scientific communicator.
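The long-term/ephemeral split can be sketched with a simple moving-average decomposition on synthetic data (the project's actual time-series models would be more sophisticated; the drift, burst and noise levels below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(365)
trend = 0.01 * t                                     # slow drift in underlying sentiment
burst = np.where((t > 180) & (t < 190), 2.0, 0.0)    # ephemeral news-driven spike
series = trend + burst + rng.normal(0.0, 0.3, size=t.size)

window = 61                                          # two-month smoothing window
kernel = np.ones(window) / window
smooth = np.convolve(series, kernel, mode='same')    # long-term component
residual = series - smooth                           # ephemeral component
```

The residual is large only around the burst, while the smoothed series tracks the underlying drift, which is the qualitative distinction the project's models would formalise.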
The so-called lattice approach is a very successful first-principles method that makes it possible to solve Gauge Theories. Calculations resort to large-scale simulations and are typically run on the largest supercomputers. Lattice simulations are a unique tool for exploring non-perturbative phenomena in theories that are not well understood. In Nature, non-perturbative phenomena give rise to the mass of the ordinary proton, which comes mostly from the binding energy of its constituents: the quarks. Tremendous efforts are being made to design extensions of the Standard Model of particle physics using similar mechanisms that could, for instance, explain the mass and properties of the Higgs boson.
In this project we will use lattice simulations to explore new non-perturbative dynamics and provide quantitative results that are relevant for experiments searching for new physics, such as those performed at the world's largest accelerator, the Large Hadron Collider (LHC). The project will also provide a deeper understanding of non-perturbative phenomena in particle physics and push back the knowledge frontier. The project will rely heavily on code development, numerical simulations, data analysis and theoretical development of new methods.
Supervisor: Dr Vincent Drach
Monte-Carlo methods are widely used in theoretical physics, statistical mechanics and condensed matter. Most applications have relied on importance sampling, which allows multi-dimensional integrals of localised functions to be evaluated stochastically with a controllable error. In lattice gauge theories, most quantities of interest can be expressed in the path integral formalism as ensemble averages over a positive-definite (and sharply peaked) measure, which provides an ideal scenario for applying importance sampling methods. However, there are noticeable cases in which Monte-Carlo importance sampling methods are either very inefficient or produce inherently wrong results, for well understood reasons. Alternatives to importance sampling techniques do exist, but they are generally less efficient in standard cases, and hence their use is limited to ad hoc situations in which more standard methods are inapplicable. A very promising method, the LLR algorithm, was recently introduced. It provides an efficient way to access the density of states of a generic system. Once the density of states is known, the partition function can be reconstructed by performing one-dimensional numerical integrals. In this project we will extend the LLR method to the evaluation of observables that have a poor overlap with the sampled ensemble, with the aim of taking full advantage of the exponential error suppression properties of the LLR method.
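The reconstruction step, a one-dimensional sum or integral over the density of states, can be illustrated on a toy system where the density of states is known exactly: N independent spins in a unit field, with degeneracies given by binomial coefficients and exact partition function Z(β) = (2 cosh β)^N. This is only a check of the reconstruction step, not the LLR algorithm itself, which estimates the density of states stochastically:

```python
import numpy as np
from math import lgamma, log, cosh

N, beta = 100, 0.7
k = np.arange(N + 1)                   # number of up spins
E = N - 2.0 * k                        # energy levels of the toy system
# exact log-density of states: log of the binomial coefficient C(N, k)
log_rho = np.array([lgamma(N + 1) - lgamma(i + 1) - lgamma(N - i + 1) for i in k])
# reconstruct log Z(beta) = log sum_E rho(E) exp(-beta E), stabilised
a = log_rho - beta * E
log_Z = a.max() + np.log(np.sum(np.exp(a - a.max())))
log_Z_exact = N * log(2.0 * cosh(beta))
```

Working throughout with logarithms (the log-sum-exp trick above) is essential, since the density of states spans hundreds of orders of magnitude even for this small N.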
Supervisor: Dr Antonio Rago
QCD is the basic theory behind nuclear physics. Quarks and gluons combine to form bound states. Currently the only confirmed types of bound state found in experiments are mesons (a quark and an antiquark bound together) and baryons (a collective state of three quarks), but QCD allows other possibilities. One particularly exciting type of bound state is the hybrid meson, in which excited glue joins the quark and antiquark to form a totally new class of particle. There is a vigorous experimental programme around the world searching for hybrid mesons, at the Jefferson Lab in the USA, PANDA in Germany, and experiments at CERN. Accurate theoretical predictions from lattice QCD are essential to the search for these novel particles. The PhD project will involve the calculation of the masses of exotic hybrid mesons with heavy quarks, using large-scale numerical calculations running on supercomputers. The student will learn the techniques of numerical lattice QCD calculations and high performance computing.
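In such calculations, masses are extracted from the exponential decay of Euclidean correlation functions. A toy sketch of the standard effective-mass estimate m_eff(t) = log[C(t)/C(t+1)], using a synthetic single-exponential correlator with invented parameters rather than real lattice data:

```python
import numpy as np

rng = np.random.default_rng(3)
m_true, A, T = 0.8, 2.5, 32            # assumed mass, amplitude, time extent
t = np.arange(T)
# synthetic correlator C(t) = A exp(-m t) with small relative noise
corr = A * np.exp(-m_true * t) * (1.0 + rng.normal(0.0, 1e-4, size=T))
# effective mass: for a single exponential this plateaus at m_true
m_eff = np.log(corr[:-1] / corr[1:])
m_est = m_eff[5:15].mean()             # average over the plateau region
```

Real lattice correlators contain excited-state contamination at small t and growing statistical noise at large t, so identifying the plateau window is a key part of the analysis.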
Supervisor: Dr Craig McNeile