Follow links below for more details on individual projects:

  • Aerodynamics: A Medieval “First in Flight” (Profs. Laura Miller, Glaire Anderson, Julie Kimbell, Jan Chambers). The focus of this project is to explore a medieval milestone in aviation history. What has been called the first successful human flight took place in ninth-century Cordoba, the capital of Islamic Spain. The experiment was the brainchild of ‘Abbas Ibn Firnas, a famous polymath in the Cordoban court and a major figure in the celebrated medieval Islamic scientific revolution that began in the ninth century. This project includes two subprojects:
    • Project #1 – Wind Tunnel Analysis: Early Aviation. The student will test and analyze models of early aviation devices in the UNC wind tunnel. Based on the analysis, the student will design and construct scale models of an early flight device, realized with a 3D printer and tested in the UNC wind tunnel, practicing design and prototyping skills along the way. The models will be based on mockups produced by an interdisciplinary research team, but the student may also propose alternative designs. The student will also write and develop Matlab code to record and analyze force data taken in the wind tunnel. [Will use: Matlab.]
    • Project #2 – Simulating Early Aviation. The student will mathematically model the mechanics of early human flight using computational fluid dynamics. The student should have interest and skills in engineering, scientific computing and coding, and fluid dynamics. To begin, the student will use mesh-generating programs such as gmsh to create a simplified model of the early aviator. FLUENT and IBAMR will then be used to simulate the flow around the model and to determine the lift forces generated. [Will use: python, FLUENT, IBAMR, VisIt.]
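
    The core of the wind-tunnel analysis is converting raw force readings into non-dimensional coefficients. The project's actual analysis will be written in Matlab; as a minimal illustration of the same calculation, here is a sketch in Python with entirely made-up numbers (density, speed, area, and force samples are placeholders, not real data):

    ```python
    import numpy as np

    # Hypothetical illustration: turn raw force-balance samples into a lift
    # coefficient. All values below are placeholders, not wind-tunnel data.
    rho = 1.225          # air density at sea level, kg/m^3
    v = 12.0             # freestream speed, m/s
    S = 0.05             # reference (wing) area of the scale model, m^2

    # Simulated lift readings in newtons (in practice, read from the balance)
    lift_samples = np.array([0.93, 0.95, 0.97, 0.94, 0.96])

    q = 0.5 * rho * v**2                  # dynamic pressure, Pa
    C_L = lift_samples.mean() / (q * S)   # lift coefficient C_L = L / (q S)
    print(f"mean lift = {lift_samples.mean():.3f} N, C_L = {C_L:.3f}")
    ```

    The same formula, averaged over many samples per tunnel run, gives one point on a lift-coefficient-versus-angle-of-attack curve for each model tested.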

  • Quantum Theory: Entanglement, Phase Transitions, and Propagators in Many-Body Systems (Prof. Joaquin Drut/Co-mentor Jay Porter). The motion of quantum particles through media is governed by probabilities encoded in a large matrix usually called the “propagator”. Knowledge of this matrix, in particular its eigenvalue spectrum, can give us information about the entanglement properties of the system (relevant for quantum cryptography, computation, and teleportation), as well as about quantum and thermal phase transitions. In this project, we aim to explore a novel approach to extracting such physical information from the propagator without constructing its explicit matrix form, which is crucial for understanding systems in strongly coupled regimes (e.g. nuclei, ultracold atoms, and certain condensed-matter systems). We will begin with low-dimensional systems, with full-fledged finite-temperature 3D calculations as the ultimate objective. [Will use: Fortran, python, Linux.]
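
    One generic way to get spectral information without forming the explicit matrix is to hand an iterative eigensolver only the matrix's action on a vector. This is not the group's actual method (the project will use Fortran), just a small Python sketch of the matrix-free idea using SciPy's Lanczos solver; the diagonal "propagator" is an invented stand-in so the result is checkable:

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, eigsh

    # Illustrative sketch: extract extreme eigenvalues of a large symmetric
    # "propagator" knowing only its action on a vector, never the full matrix.
    n = 500
    rng = np.random.default_rng(0)
    diag = rng.uniform(0.0, 1.0, size=n)   # invented stand-in spectrum

    def matvec(v):
        # In a real application this would apply the propagator via FFTs,
        # sparse hops, etc.; a diagonal action keeps the demo verifiable.
        return diag * v

    M = LinearOperator((n, n), matvec=matvec, dtype=float)
    largest = eigsh(M, k=3, which='LA', return_eigenvectors=False)
    print(np.sort(largest))   # approximates the three largest diag entries
    ```

    The solver only ever calls `matvec`, so the memory cost stays linear in the system size even when the full matrix would be far too large to store.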

  • Biophysics: Finite Element Model of Nucleus Mechanics in Living Cells (Prof. Rich Superfine/Co-mentor Kellie Beicker). The nucleus of a living cell directs the production of the cell's proteins and determines all of its properties, including its identity and whether it is a normal cell or cancerous. Structurally, the nucleus contains the compacted DNA that comprises the chromosomes of the cell's genome, associated proteins, and a shell composed of proteins called lamins. The mechanical properties of the nucleus are important because cells respond to external forces, and these forces are transmitted to the nucleus, causing deformations in its structure that affect transcription, the “reading out” of DNA. Cancer cells are known to be softer than healthy cells, but the origin of the nucleus's mechanical properties and their alteration in disease is not known, so the connection between these mechanical changes and disease cannot be interpreted. We are performing high-resolution force measurements on cells and the cell nucleus with a unique system that allows high-resolution imaging to capture compression and deformation in real time. [Will use: Matlab, COMSOL.]
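
    A standard first-pass analysis of such force-versus-indentation measurements is a contact-model fit; the Hertz model, F = (4/3) E_eff sqrt(R) δ^(3/2), is one common choice. The project's real analysis will use Matlab and COMSOL; the Python sketch below only illustrates the fitting step, on synthetic data with invented probe radius and modulus:

    ```python
    import numpy as np

    # Hypothetical sketch: recover an effective stiffness from synthetic
    # force-vs-indentation data using the Hertz contact model
    #   F = (4/3) * E_eff * sqrt(R) * delta**1.5
    # (R and E_true are invented illustrative values, not measurements.)
    R = 5e-6                          # probe radius, m
    E_true = 1000.0                   # "true" effective modulus, Pa
    delta = np.linspace(0, 2e-6, 50)  # indentation depth, m
    F = (4.0 / 3.0) * E_true * np.sqrt(R) * delta**1.5

    # Linear least squares in the transformed variable x = delta**1.5:
    x = delta**1.5
    k = np.dot(x, F) / np.dot(x, x)   # slope = (4/3) E_eff sqrt(R)
    E_fit = k / ((4.0 / 3.0) * np.sqrt(R))
    print(f"fitted modulus: {E_fit:.1f} Pa")
    ```

    Fitting the same model to force curves from healthy and cancerous cells is one way to quantify the "softer" difference the blurb describes.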

  • Astroinformatics: Optimizing the Use of Telescope Time by the RESOLVE Survey (Profs. Jay Aikat, Sheila Kannappan). The RESOLVE survey is a census of galaxies’ mass in stars, gas, and dark matter as well as their rates of star formation and merging. This highly ambitious multi-year project is targeting all ~1550 galaxies, down to a limiting mass in the dwarf galaxy regime, within a huge volume of the nearby universe. The most challenging aspect of the data collection is obtaining spectroscopy for each galaxy, one by one, to measure its internal gas chemistry, stellar properties, and orbital motions (essential to measuring mass). Determining which galaxy to observe at any given time is complicated by many factors: the phase of the moon, the weather, which galaxies have been observed before, and how galaxies are distributed on the sky, among others. In this project, the student will create a decision-making tool for observers to pick the best galaxy to observe at a given moment. The project will involve computationally optimizing the results subject to one or multiple metrics of success, such as maximizing data quality or minimizing the time to survey completion. The student will also create computer simulations to test the success of the tool, then test the tool under real observing conditions with the SOAR Telescope, working in UNC’s remote observing center on campus. The optimization, simulation testing, and user interface for the tool are all flexible in scope and can be adapted to a student ranging from beginner to advanced level in computing skill. This project will be co-advised by astronomy and computer science faculty. [Will use: python, Linux.]
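
    The simplest version of such a decision-making tool is a scoring function over the candidate list, picked greedily each time a galaxy finishes. The field names, weights, and penalty form below are all invented for illustration; the survey's actual metrics would replace them:

    ```python
    import math

    # Toy illustration (not the survey's actual metric): score each candidate
    # galaxy for the current moment and greedily observe the best one.
    def score(galaxy, moon_phase):
        # Skip galaxies that already have data; prefer low airmass;
        # penalize targets close to a bright moon.
        if galaxy["observed"]:
            return -math.inf
        s = 1.0 / galaxy["airmass"]
        s -= moon_phase * 0.5 / max(galaxy["moon_sep_deg"], 1.0)
        return s

    candidates = [
        {"name": "A", "airmass": 1.10, "moon_sep_deg": 90.0, "observed": False},
        {"name": "B", "airmass": 1.05, "moon_sep_deg": 5.0,  "observed": False},
        {"name": "C", "airmass": 1.00, "moon_sep_deg": 120.0, "observed": True},
    ]
    best = max(candidates, key=lambda g: score(g, moon_phase=0.9))
    print("observe next:", best["name"])
    ```

    Note that with a bright moon the slightly higher-airmass galaxy far from the moon wins; the project's optimization work is essentially about choosing and tuning this scoring function against metrics like data quality or time to survey completion.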

  • Astroinformatics: Analysis of Transient Events Observed with the Evryscope: the first all-sky gigapixel-scale telescope (Prof. Nick Law/Co-mentor Octavi Fors Aldrich). The Evryscope is the first telescope that images the whole visible sky in a single gigapixel-scale exposure. It consists of 27 telescopes mounted in a dome that mimics the sky’s hemisphere. With a 10,200 square degree instantaneous field of view and two-minute imaging cadence, a wide range of science cases can be undertaken. These include surveys for transiting exoplanets around nearby bright stars, M-dwarfs, and bright white dwarfs; monitoring of outbursting young stars and stellar activity of all types; and searches for optical counterparts of gamma-ray bursts and nearby supernovae.

    The Evryscope’s sustained data-acquisition rate will be ~1 Gb/minute year-round, with the data stored and reduced on-site at Cerro Tololo Inter-American Observatory (Chile). This presents a unique opportunity to tackle challenges in astronomical data analysis. The huge volume of data and the diversity of the attempted science cases demand that the Evryscope have a versatile automatic reduction pipeline. A first release of such a pipeline, Evrypipe, has already been developed to deliver astronomers end-user science products (small subimages and light curves), but it works only on a limited list of targets of interest. Evrypipe also performs detailed light-curve analysis for the transiting-exoplanet targets. However, the current version of Evrypipe does not address the detection of other kinds of variable objects, such as stellar rotation and pulsation, magnetic activity, and intrinsic long-term modulation signatures, which will be a must for the Evryscope’s plans to support the upcoming NASA TESS exoplanet mission. The student’s project will consist of implementing time-frequency analysis algorithms and using them to detect new time-variable objects in the full Evryscope dataset. The research plan is designed so that the student accomplishes the project milestones on a step-by-step basis: after the implementation of one algorithm is finished, the next one will be approached, and so on. [Will use: python, Linux.]
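
    A representative example of such a time-frequency algorithm is the Lomb-Scargle periodogram, which finds periodicities in unevenly sampled light curves (exactly the sampling pattern weather and daylight impose). The sketch below is not Evrypipe code; it runs SciPy's implementation on a made-up light curve with an invented 2.5-day period:

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    # Synthetic, unevenly sampled light curve with an invented 2.5-day period.
    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0, 30, 400))   # irregular observation times (days)
    true_period = 2.5                      # days (invented for the demo)
    y = 0.1 * np.sin(2 * np.pi * t / true_period)
    y += 0.01 * rng.normal(size=t.size)    # measurement noise
    y -= y.mean()                          # lombscargle expects zero-mean data

    periods = np.linspace(0.5, 10.0, 2000)
    omega = 2 * np.pi / periods            # lombscargle takes angular freqs
    power = lombscargle(t, y, omega)
    best_period = periods[np.argmax(power)]
    print(f"recovered period = {best_period:.2f} d")
    ```

    Running a detector like this over every light curve in the dataset, and flagging stars whose periodogram peak is significant, is the step-by-step structure the project milestones describe.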

  • Astrophysics: Tracing Radioactive Dust Grains from Supernovae to the Solar Nebula (Prof. Fabian Heitsch/Co-mentor Matthew Goodson). A nearby supernova may have injected radioactive dust grains into the early Solar System. This would explain anomalous elemental abundances in primitive meteorites; however, the probability (and survivability) of such a scenario is uncertain. A key question is how well the radioactive material mixed with the primordial Solar nebula. To estimate the mixing, we simulate a supernova shock-wave impacting a dense gas cloud. We use the hydrodynamics code Athena to model the gas, but we also need to track the dust grains. In this project, we will extend the tracer particles included in Athena to follow the injection and dispersal of radioactive dust grains. The particles interact with the gas via drag forces, which are dependent on the grain properties. The grains are formed in the supernova explosion, and they are carried by the shock-wave through interstellar space. The dust grains can also be destroyed, releasing the radioactive elements into the gas. By tracking the history, decay, and destruction of each particle, we can determine the radioactive abundances and distribution. If our results resemble conditions in the early Solar nebula, then the supernova scenario may be viable. [Will use: C, IDL, Unix.]
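
    The drag coupling at the heart of the tracer-particle extension can be written as dv/dt = -(v - u)/t_stop, where u is the local gas velocity and t_stop is the grain's stopping time. The actual implementation will live inside Athena's C code; as a hedged one-particle sketch in Python (with invented values and a gas velocity held fixed for simplicity), a semi-implicit update keeps the scheme stable even for short stopping times:

    ```python
    # Minimal sketch of linear drag on a single grain; all values invented.
    t_stop = 2.0     # stopping time, code units
    u_gas = 10.0     # local gas velocity (held fixed here for simplicity)
    v = 0.0          # initial grain velocity
    dt = 0.01        # time step

    # Semi-implicit update: v_new = (v + dt*u/t_stop) / (1 + dt/t_stop),
    # which stays stable even when dt greatly exceeds t_stop.
    for _ in range(5000):
        v = (v + dt * u_gas / t_stop) / (1.0 + dt / t_stop)

    print(f"grain velocity after 50 time units: {v:.4f}")  # relaxes to u_gas
    ```

    In the real simulation u varies in space and time, and each grain's t_stop depends on its size and the local gas density, which is how the grain properties enter the dynamics.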

  • Astrophysics: Simulating the Dynamics of Galaxy Stripping across Environments (Prof. Sheila Kannappan/Co-mentor Elaine Snyder). Our Local Group of galaxies contains an unusually round and tiny galaxy, M32, which is a prototype of a class of compact stellar systems (CSSs) that span at least three orders of magnitude in stellar mass. One idea for how such systems might form postulates that CSSs are the residual nuclei of once normal (albeit small) galaxies, whose outer parts have been stripped away by close interactions with much larger galaxies. However, the amount of stripping that should occur is uncertain for environments like the Local Group with just a few galaxies, because to date, theoretical simulations of stripping have focused on dense clusters of galaxies. Furthermore, the internal dynamics of CSSs formed by stripping of small galaxies have not been measured directly, but only inferred. To remedy this situation, the student will first reproduce an existing simulation of stripping in a cluster environment using the publicly available code GADGET, adding to published work by directly measuring the internal dynamics of the forming CSS at each stage. This work will pave the way for a new simulation examining stripping in a non-cluster environment, enabling assessment of how the density of galaxies affects the relative efficacy of stripping. The results will also be analyzed in comparison with real CSS data being compiled for the RESOLVE survey. (Note: In addition to the two listed UNC mentors, RESOLVE colleague M. Sinha at Vanderbilt, an expert on galaxy stripping, will collaborate on this project.) [Will use: python, GADGET2, some C depending on level, Linux.]
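
    "Directly measuring the internal dynamics" of a simulated CSS typically reduces to selecting particles inside an aperture and computing a velocity dispersion. The sketch below is illustrative only, on synthetic particle data rather than a GADGET snapshot, with invented units and an invented true dispersion:

    ```python
    import numpy as np

    # Toy "internal dynamics" measurement: line-of-sight velocity dispersion
    # of particles inside a projected aperture. Data here are synthetic.
    rng = np.random.default_rng(2)
    n = 10000
    pos = rng.normal(scale=1.0, size=(n, 3))      # positions (kpc, invented)
    sigma_true = 30.0                             # km/s (invented)
    vel_z = rng.normal(scale=sigma_true, size=n)  # line-of-sight velocities

    r_proj = np.linalg.norm(pos[:, :2], axis=1)   # projected radius on sky
    inside = r_proj < 1.0                         # 1 kpc aperture
    sigma_los = np.std(vel_z[inside])
    print(f"sigma_los = {sigma_los:.1f} km/s")
    ```

    Repeating this measurement at each output time of the stripping simulation traces how the forming CSS's dynamics evolve, which is the quantity to compare against real CSS observations.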

  • Cosmology: Microhalo Formation in the Early Universe (Prof. Adrienne Erickcek/Co-mentor Cosmin Ilie). Astrophysical structures form in regions of the Universe in which the matter density is slightly higher than average. Gravity collects the matter in these over-dense regions to form everything from small clumps of dark matter to the largest galaxy clusters. Structure formation is hierarchical: the smallest objects form first, and then these objects merge to form larger bound systems. We can simulate this process using computer programs that calculate how a billion particles respond to the gravitational forces between them. In this project, the student will use these n-body simulations to study the formation of the very first structures in the Universe: the dark matter microhalos. These microhalos grow from the smallest-scale density fluctuations, which makes them especially sensitive to the dynamics of the early Universe. By using different initial conditions for the n-body simulations, the student will investigate how the dynamics of the early Universe affects the abundance and internal structure of microhalos. The student will run the simulations with several different initial conditions, use halo-finding routines to find the microhalos, and then write python routines to analyze the results. [Will use: python, GADGET2, AHF, Splash, Linux.]
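
    The analysis routines mentioned at the end are typically simple post-processing scripts over the halo-finder output, e.g. binning the microhalo catalog into a mass function. The sketch below uses a synthetic mass list with invented units rather than real AHF output:

    ```python
    import numpy as np

    # Toy post-processing step: bin a (synthetic) halo-mass catalog into a
    # mass function. Masses and units are invented, not the AHF format.
    rng = np.random.default_rng(3)
    masses = 10 ** rng.uniform(4, 8, size=5000)  # halo masses, arbitrary units

    bins = np.logspace(4, 8, 9)                  # 8 logarithmic mass bins
    counts, edges = np.histogram(masses, bins=bins)
    for lo, hi, n in zip(edges[:-1], edges[1:], counts):
        print(f"{lo:9.2e} - {hi:9.2e}: {n:5d} halos")
    ```

    Comparing histograms like this one across runs with different initial conditions is how the simulations reveal the early Universe's imprint on microhalo abundance.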