Projects

Nanomation: Automated Nanofabrication and Microscopy

Keywords: Nanotechnology, fabrication, microscopy

Technology is turning to novel materials, such as nanowires, nanotubes and graphene, to improve transistor performance. The growth processes for these materials are inherently random, making conventional top-down fabrication challenging. Working with the University of Cambridge's Low-Dimensional Materials group, I have helped develop a suite of techniques to assist in the automatic fabrication of nanomaterial-based devices. This has included the development of a fiducial marker system for accurate localisation of any features of interest, as well as computer vision algorithms and automated fabrication routines. This work is being spun out as a company, Nanomation, where I hold the position of Chief Technology Officer.

Holographic Excitation of Modes in Photonic Crystal Fibre

Keywords: Waveguide, photonic crystal fibre, holography, wavefront shaping, mode, photochemical microreactor

Most common forms of optical waveguide use total internal reflection to guide light along their length; photonic crystal fibres are a notable exception. This novel type of waveguide instead obtains its guidance properties from its structure. For example, a two-dimensional photonic bandgap can be used to trap light in the transverse direction, confining light to the core, which can even be hollow, as it propagates along the length of the fibre. Photonic crystal fibres have seen significant interest for guiding extremely intense light, and for guiding light over an extremely large range of wavelengths, including wavelengths where this was not previously possible. They are also being investigated for other applications, such as optical gyroscopes and gas sensing.

Light propagation in a multimode fibre or a photonic crystal fibre can be decomposed into orthogonal modes, each of which propagates at its own phase velocity. Controlled excitation of individual modes or superpositions of modes has seen significant interest as it provides a way of increasing the data-carrying capacity of existing fibres. Additionally, it has been shown that the controlled excitation of modes can enable the optical trapping of microparticles along the fibre core, allowing conveyor belts of microparticles to be created. Distributed sensors can hence be formed from conveyor belts of environmentally sensitive particles positioned along the fibre core.
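
The mode picture lends itself to a simple numerical description. The sketch below is my own illustration in Python/NumPy, with made-up mode profiles and propagation constants rather than solver output: it decomposes a transverse field into orthonormal modes via overlap integrals, advances each mode by its own propagation constant, and reconstructs the output field.

```python
import numpy as np

def decompose(field, modes):
    """Project a sampled transverse field onto mode profiles that are
    orthonormal over the grid, returning the complex modal coefficients."""
    return np.array([np.vdot(m, field) for m in modes])

def propagate(coeffs, betas, z):
    """Each mode accumulates phase exp(i * beta * z) over a distance z."""
    return coeffs * np.exp(1j * np.asarray(betas) * z)

def reconstruct(coeffs, modes):
    """Rebuild the transverse field from the modal coefficients."""
    return np.tensordot(coeffs, modes, axes=1)

# Two hypothetical orthonormal mode profiles on a coarse grid
N = 64
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
m0 = np.exp(-(X**2 + Y**2) / 0.1)          # fundamental-like profile
m1 = X * np.exp(-(X**2 + Y**2) / 0.1)      # higher-order-like profile
m0 = m0 / np.linalg.norm(m0)
m1 = m1 / np.linalg.norm(m1)
modes = np.array([m0, m1], dtype=complex)

field_in = (m0 + 0.5 * m1).astype(complex)
coeffs = decompose(field_in, modes)
coeffs_z = propagate(coeffs, betas=[5.80e6, 5.79e6], z=0.1)   # illustrative betas (rad/m)
field_out = reconstruct(coeffs_z, modes)
```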

As part of my PhD under Dr. Tijmen Euser and Prof. Tim Wilkinson, I developed techniques for the efficient excitation of high-purity modes in photonic crystal fibres. This involved developing electromagnetic solvers to determine the modes guided by a given waveguide geometry, as well as hologram generation algorithms tailored to exciting these particular modes. The techniques were experimentally demonstrated in several different waveguide geometries.

My Cavendish research group uses photonic crystal fibres as microreactors. The photonic crystal fibres are filled with reagents, and light coupled into the waveguide is used to pump and probe photochemical reactions. Only nanolitres are required to fill these microreactors, but the long interaction lengths offered by this experimental geometry yield unprecedented sensitivities and limits of detection. I am using the techniques described above to pump and probe reactions using discrete waveguide modes. This gives additional spatial information regarding the reactions, allowing them to be mapped in the transverse direction.

Image taken from my paper

For further information, see:

  • Ralf Mouthaan, "Holographic Control of Light Propagation in Optical Waveguides", PhD Thesis, University of Cambridge, 2021.

  • Ralf Mouthaan, Peter Christopher, Jonathan Pinnell, Michael Frosz, George S. D. Gordon, Tim Wilkinson & Tijmen G. Euser, "Efficient Excitation of High-Purity Modes in Arbitrary Waveguide Geometries", Journal of Lightwave Technology, 2021.

  • Ralf Mouthaan, Peter J. Christopher, Michael Frosz, George Gordon, Timothy Wilkinson & Tijmen Euser, "Holographic excitation of discrete waveguide modes in photonic crystal fibre", Frontiers in Optics, Washington DC, USA, 2021 (virtual).

Wavefront Shaping in Multimode Fibres and Scattering Media

Keywords: Wavefront shaping, holography, scattering media, transmission matrix, endoscopy, interferometry

A wavefront travelling through a scattering medium such as biological tissue or white paint is repeatedly scattered and distorted, forming a speckle pattern. The imposed distortion is, however, deterministic and linear, and can therefore be completely described by a so-called transmission matrix. Characterising the transmission matrix allows the distortion to be corrected for, and the original image to be recovered from the distorted speckle pattern. Hence, by exploiting the transmission matrix paradigm, images can be recovered from deep within biological tissue, through fog, or from around corners.
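
In its simplest form the idea can be written in a few lines. The sketch below is my own illustration, with a random complex matrix standing in for a measured transmission matrix: it shows the forward model and the recovery of the input field by inverting the characterised matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256                                    # number of input/output pixels (illustrative)

# A random complex matrix stands in for a measured transmission matrix
T = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(2 * N)

x = rng.random(N) * np.exp(1j * rng.uniform(0, 2 * np.pi, N))   # unknown input field
y = T @ x                                  # distorted speckle field at the output

# With T characterised, the deterministic, linear distortion can be undone
x_recovered = np.linalg.pinv(T) @ y
print(np.allclose(x_recovered, x))         # True (up to numerical precision)
```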

Light incident on the input facet of a multimode fibre propagates along the fibre in discrete modes. These modes couple with each other, build up phase delays with respect to each other, and create a distorted field at the output facet. This distortion can also be described by a transmission matrix, and characterisation of the transmission matrix in this case allows the multimode fibre to be turned into an endoscope with the diameter of a human hair. Furthermore, given that coherent light is used, phase, amplitude and polarisation information are all available, providing additional information that can be used during diagnosis. For example, investigations by George Gordon show that this additional information can be used to diagnose Barrett's oesophagus and early-onset oesophageal cancer.

Characterising a transmission matrix requires a series of interferometric measurements. I have made several efforts to establish a robust measurement approach, such as proposing a novel technique that allows phase drift between measurements to be eliminated. I have also considered the effects of measurement noise on a transmission matrix measurement, and have shown that characterising the noise levels in a measurement system allows the transmission matrix to be correctly regularised.
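
As a rough illustration of the regularisation idea (my own sketch; the particular choice of regularisation parameter is an assumption, not the scheme from my paper), a Tikhonov-style inverse can damp singular values that sit below the characterised noise floor:

```python
import numpy as np

def regularised_inverse(T, noise_var, signal_var=1.0):
    """Tikhonov-style inverse of a measured transmission matrix.

    The regularisation strength is set from the characterised noise level, so
    that singular values buried in the noise are damped rather than amplified.
    The choice alpha = noise_var / signal_var is an illustrative assumption.
    """
    U, s, Vh = np.linalg.svd(T)
    alpha = noise_var / signal_var
    s_inv = s / (s**2 + alpha)                 # damped inverse singular values
    return Vh.conj().T @ np.diag(s_inv) @ U.conj().T

# Usage: recover an input field from a noisy output measurement
rng = np.random.default_rng(1)
N = 128
T = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(2 * N)
x = np.exp(1j * rng.uniform(0, 2 * np.pi, N))
noise = 0.05 * (rng.normal(size=N) + 1j * rng.normal(size=N))
y = T @ x + noise
x_hat = regularised_inverse(T, noise_var=np.var(noise)) @ y
```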

A number of hurdles remain before hair-thin endoscopes become a possibility at the patient bedside. Access to the distal end of the multimode fibre is required to characterise the transmission matrix, and the measured transmission matrix then remains valid only as long as the fibre conformation does not change. It is not possible to keep the fibre completely rigid when introducing it into a patient, though. As such, I have contributed to a project with George Gordon where we have proposed a metasurface reflector, mounted on the distal end of the fibre, that allows the transmission matrix to be characterised without access to the distal end. The mathematical foundation for this approach has been established, and work is ongoing to build a prototype demonstrator.

Knowledge of the transmission matrix allows an image to be recovered from the distal side of a fibre. Conversely, it also allows an image to be projected to the distal side of a fibre. The wavefront is pre-distorted such that, upon further distortion by the fibre, the desired amplitude pattern is created at the distal side. Such projection techniques will enable optical trapping and 3D printing at the tip of a fibre, or possibly even a new generation of small-footprint holographic projectors. I have applied various phase retrieval algorithms to the problem of projection through a fibre, showing that the weighted Yang-Gu algorithm gives much better results than the Gerchberg-Saxton algorithm that is typically used.
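
For reference, a minimal Gerchberg-Saxton-style loop for projection through a characterised fibre looks like the sketch below (my own simplification: the backward step uses the conjugate transpose of the transmission matrix, and the weighted Yang-Gu variant is not shown).

```python
import numpy as np

def gs_through_fibre(T, target_amp, iters=100, seed=0):
    """Gerchberg-Saxton-style phase retrieval for projection through a fibre.

    The loop alternates between the proximal (SLM) plane, where only the phase
    can be controlled, and the distal plane, where the target amplitude is
    imposed. Propagation is modelled by the measured transmission matrix T;
    the backward step uses its conjugate transpose as an illustrative
    simplification.
    """
    rng = np.random.default_rng(seed)
    phi = rng.uniform(0, 2 * np.pi, T.shape[1])               # initial SLM phase guess
    for _ in range(iters):
        distal = T @ np.exp(1j * phi)                          # forward propagation
        distal = target_amp * np.exp(1j * np.angle(distal))    # impose target amplitude
        proximal = T.conj().T @ distal                         # backward propagation
        phi = np.angle(proximal)                               # phase-only constraint
    return phi
```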

Image taken from my paper (under review)

For further information, see:

  • Ralf Mouthaan, "Holographic Control of Light Propagation in Optical Waveguides", PhD Thesis, University of Cambridge, 2021.

Hardware for Computer-Generated Holography

Keywords: Holography, open source, hardware, algorithms

Computer-generated holograms have revolutionised holography, enabling the projection of holographic videos, the generation of optical traps, the creation of orbital angular momentum beams, as well as super-resolution microscopy techniques. Computer-generated holograms are displayed on a spatial light modulator, with common implementations including liquid crystal devices as well as digital micromirror devices. Commercial devices are often prohibitively expensive, and so I have been supporting Andrew Kadis with the development of an open-source spatial light modulator driver. The long-term goal for this project is to develop an open-source spatial light modulator that can be incorporated into other open-source projects such as the OpenFlexure microscope.

Hardware-accelerated algorithms are now being leveraged to allow high-resolution holograms to be generated. A number of the algorithms we have implemented run on graphics processing units (GPUs), but an alternative is to instead use field programmable gate arrays (FPGAs). I have contributed to a review paper by Youchao Wang on this topic, and with Daoming Dong I have worked to implement a number of algorithms on FPGAs, including time-multiplexing algorithms and algorithms that exploit foveated rendering.

Image taken from paper co-authored with Andrew Kadis

For further information, see:

  • Andrew Kadis, Youchao Wang, Daoming Dong, Peter Christopher, Ralf Mouthaan & Timothy Wilkinson, "HoloBlade: An open-hardware spatial light modulator driver platform for holographic displays", Applied Optics, Vol. 60, No. 4, 2021.

  • Youchao Wang, Daoming Dong, Peter J. Christopher, Andrew Kadis, Ralf Mouthaan, Fan Yang & Timothy D. Wilkinson, "Hardware implementations of computer-generated holography: a review", Optical Engineering, Vol. 59, No. 10, 2020.

  • Andrew Kadis, Daoming Dong, Youchao Wang, Peter Christopher, Ralf Mouthaan & Timothy Wilkinson, "Holoblade: An open platform for holography", Digital Holography and Three-Dimensional Imaging (DH3D), Washington DC, USA, 2020 (virtual).

Algorithms for Computer-Generated Holography

Keywords: Holography, algorithms

Current AR/VR headsets present the viewer with a stereoscopic display. These devices struggle to recreate the full dynamic range of intensities that the eye can perceive, and vergence-accommodation conflicts can make the viewer feel nauseous after extended use. Next-generation holographic displays stand to resolve these issues, but improved algorithms are required to quickly generate high-quality holograms while taking into account the constraints of the projection system. Holographic projection is also widely used in other contexts, such as optical trapping, additive manufacturing and lithography. Working with Peter J. Christopher and others at the University of Cambridge, I have proposed improvements to a number of hologram generation algorithms.

Time multiplexing algorithms display a rapid succession of quickly generated hologram sub-frames that are averaged by the eye. We have proposed a simple approach by which the time-averaged projection is formed with less speckle and a lower variance, making it more pleasing to the eye.
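
The benefit of averaging sub-frames can be demonstrated numerically. The sketch below is my own illustration of the general principle, not our specific algorithm: it generates independent single-pass sub-frames of the same target and shows the speckle contrast falling as frames are averaged.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 64, 16                              # hologram size and number of sub-frames
target = np.ones((N, N))                   # flat target intensity (illustrative)

def subframe_intensity(target, rng):
    """One single-pass sub-frame: random diffuser phase, phase-only hologram,
    and the resulting (speckled) replay intensity."""
    diffuser = np.exp(1j * rng.uniform(0, 2 * np.pi, target.shape))
    hologram = np.angle(np.fft.ifft2(target * diffuser))
    replay = np.fft.fft2(np.exp(1j * hologram))
    return np.abs(replay) ** 2

frames = np.array([subframe_intensity(target, rng) for _ in range(M)])
single = frames[0]
averaged = frames.mean(axis=0)

# Speckle contrast (std/mean) falls roughly as 1/sqrt(M) when sub-frames are averaged
print(single.std() / single.mean(), averaged.std() / averaged.mean())
```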

Phase retrieval algorithms, originally developed in the context of X-ray crystallography, have been widely applied to the hologram generation problem. We have shown that Parseval's theorem allows the error metric to be expressed in the diffraction field plane, the replay field plane, or indeed any other plane. This has then been exploited to give an improved-quality hologram.
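
In outline, with a unitary discrete Fourier transform $\mathcal{F}$ and my own notation: if $H$ is the diffraction-plane (hologram) field, $R = \mathcal{F}\{H\}$ the replay field and $R_T$ the target, then Parseval's theorem gives

$$\sum_{u,v} \big| R_{u,v} - R_{T,u,v} \big|^2 \;=\; \sum_{x,y} \big| H_{x,y} - \mathcal{F}^{-1}\{R_T\}_{x,y} \big|^2,$$

so the error can equally be evaluated against $\mathcal{F}^{-1}\{R_T\}$ in the diffraction plane, and a unitary fractional Fourier transform extends the same argument to any intermediate plane.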

Iterative algorithms such as direct search and simulated annealing proceed by choosing a pixel at random and changing it to a random value. If the change improves the hologram quality it is accepted; otherwise it is rejected. We have shown that a stochastic approach is not necessary: it is possible to determine which pixel to alter next, and what value to set it to, such that a significantly improved hologram quality is obtained.
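
For context, the baseline stochastic scheme is only a few lines long. The sketch below (my own illustration, for a binary-phase hologram) is the kind of loop that the predictive approach replaces with a deterministic choice of pixel and value.

```python
import numpy as np

def direct_search(target_amp, iters=20000, seed=0):
    """Baseline direct (random) search for a binary-phase hologram: flip a
    random pixel between 0 and pi and keep the change only if the replay-field
    error improves. The predictive approach instead chooses the pixel and its
    new value deterministically."""
    rng = np.random.default_rng(seed)
    N = target_amp.shape[0]
    phi = rng.choice([0.0, np.pi], size=(N, N))

    def error(p):
        replay = np.fft.fft2(np.exp(1j * p)) / N
        return np.mean((np.abs(replay) - target_amp) ** 2)

    best = error(phi)
    for _ in range(iters):
        i, j = rng.integers(0, N, size=2)
        trial = phi.copy()
        trial[i, j] = np.pi - trial[i, j]          # flip the chosen pixel
        e = error(trial)
        if e < best:                               # accept only improvements
            phi, best = trial, e
    return phi
```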

Spatial light modulators used to display computer-generated holograms cannot arbitrarily modulate an incident beam. As such, a computer-generated hologram must at some point be quantised to meet the constraints of the spatial light modulator. We have proposed a novel quantisation strategy that yields higher-quality holographic projections.
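
The baseline that this improves upon is simple nearest-level quantisation of the continuous phase pattern, as in the short sketch below (my own illustration, not the sympathetic quantisation strategy itself).

```python
import numpy as np

def quantise_phase(phi, levels=256):
    """Nearest-level quantisation of a continuous phase hologram onto the
    discrete levels an SLM can actually display."""
    step = 2 * np.pi / levels
    return np.round(np.mod(phi, 2 * np.pi) / step) * step
```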

Gradient descent algorithms are becoming an increasingly popular approach to hologram generation. Typically, the mean-squared error is used as a loss function to optimise the hologram, but it is possible to use alternative metrics. With Fan Yang I have performed an in-depth analysis of the performance of different loss functions to determine which yields the most visually pleasing holograms.
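
A gradient-descent hologram generator is straightforward to express with automatic differentiation. The sketch below is my own illustration in PyTorch, with an arbitrary learning rate and a simple L1 alternative standing in for the metrics we actually compared; it shows how the loss function slots into the optimisation loop.

```python
import math
import torch

def optimise_hologram(target_amp, loss="mse", iters=500, lr=0.1):
    """Gradient-descent hologram generation via automatic differentiation.

    The phase pattern is optimised so that the amplitude of its Fourier-plane
    replay matches the target; swapping the loss function changes the visual
    character of the result. The learning rate and the L1 alternative are
    illustrative choices."""
    target = torch.as_tensor(target_amp, dtype=torch.float32)
    phi = (2 * math.pi * torch.rand_like(target)).requires_grad_(True)
    opt = torch.optim.Adam([phi], lr=lr)
    for _ in range(iters):
        field = torch.polar(torch.ones_like(phi), phi)          # exp(i * phi)
        amp = (torch.fft.fft2(field) / target.shape[0]).abs()   # replay amplitude
        if loss == "mse":
            l = torch.mean((amp - target) ** 2)
        else:                                                   # e.g. an L1 alternative
            l = torch.mean((amp - target).abs())
        opt.zero_grad()
        l.backward()
        opt.step()
    return phi.detach()

# Usage: a bright square on a dark background
target = torch.zeros(64, 64)
target[24:40, 24:40] = 1.0
hologram = optimise_hologram(target, loss="mse")
```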

Image from paper co-authored with Fan Yang.

For further information, see:

  • Peter J. Christopher, Ralf Mouthaan, Benjamin Wetherfield, Elliot J. Medcalf & Timothy D. Wilkinson, "Computer-Generated Holography in the Intermediate Domain", Journal of the Optical Society of America A, Vol. 39, No. 3, 2022.

  • Peter J. Christopher, Ralf Mouthaan, Miguel El Guendy & Timothy Wilkinson, "Linear-time algorithm for phase-sensitive holography", Optical Engineering, Vol. 59, No. 8, 2020.

  • Peter J. Christopher, Ralf Mouthaan, A. Mohamed Soliman & Timothy Wilkinson, "Sympathetic quantisation - a new approach to hologram quantisation", Optics Communications, Vol. 473, 2020.

  • Peter J. Christopher, Ralf Mouthaan, George S. D. Gordon & Timothy Wilkinson, "Holographic predictive search: extending the scope", Optics Communications, Vol. 467, 2020.

  • Peter J. Christopher, Ralf Mouthaan, Vamsee Bheemireddy & Timothy D. Wilkinson, "Improving performance of single-pass real-time holographic projection", Optics Communications, Vol. 457, 2020.

A Cell-Free Arsenic Sensor

Keywords: Sensing, arsenic, biosensor, cell-free biology, synthetic biology, electrochemistry

Arsenic contamination of drinking water has become an enduring problem for large parts of the world. Attempts in the 1960s to eradicate water-borne diseases such as cholera in parts of Bangladesh and Nepal led to tube-wells being sunk into aquifers naturally contaminated with arsenic. The health effects of long-term exposure in local populations include diabetes, liver disorders, kidney damage, peripheral neuropathy and skin lesions, and exposure can eventually be fatal. This has led the World Health Organisation (WHO) to describe the situation as the “largest mass poisoning of a population in history”. Testing is currently done using chemical kits that require a degree of expertise and contain small amounts of mercury, potentially raising issues with respect to disposal. An affordable, easy-to-use and disposable sensor would hence allow local populations to test wells regularly.

This project aimed to develop a low-cost, easy-to-use, durable and disposable proof-of-concept quantitative arsenic biosensor based on a cell-free biological interface. Cell-free systems move the machinery required for transcription, translation and metabolism outside the cell environment by lysing cells and using the lysate. Cell-free systems allow for higher repeatability, more flexibility, lower complexity and fewer incompatibility issues than their in-cell counterparts. Crucially, cell-free systems struggle to self-replicate and are not viable outside a laboratory, hence providing a safe and versatile framework for developing practical biosensors.

A genetic circuit was proposed to produce a redox enzyme in response to exposure to arsenic. The redox enzyme concentration is then determined using established electrochemical techniques. As part of this project, we developed the genetic circuit, the electrochemical assay, a low-cost potentiostat and a data sharing platform. At the point of use, it is anticipated that an electrode with all the necessary components freeze-dried onto it is rehydrated using the water under test, and is then inserted into a hand-held meter which reports the reading.
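
The final readout step amounts to inverting a calibration curve. The sketch below is purely illustrative (hypothetical calibration points and a simple linear fit, not data or code from the project); it converts a measured current into a concentration and flags readings above the WHO guideline of 10 ppb.

```python
import numpy as np

# Hypothetical calibration points: peak current (microamps) measured for
# standards of known arsenic concentration (parts per billion). The real
# assay, electrode and response curve are those developed in the project.
conc_ppb = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
current_uA = np.array([0.11, 0.34, 0.58, 1.31, 2.55, 5.02])

slope, intercept = np.polyfit(conc_ppb, current_uA, 1)   # simple linear fit

def concentration_from_current(i_uA):
    """Invert the calibration curve, as a hand-held meter might, and flag
    readings above the WHO guideline value of 10 ppb."""
    c = (i_uA - intercept) / slope
    return c, c > 10.0

print(concentration_from_current(1.0))
```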

This project was a team challenge done as part of the Sensors MRes at the University of Cambridge. With Genevieve Hughes, I led a team of 12 people, and managed all aspects of the project delivery.

Image taken from my MRes thesis.

For further information, see:

  • Ralf Mouthaan, "A Cell-Free Arsenic Sensor", part of MRes degree, University of Cambridge, 2017.

Atomic Magnetometry

Keywords: Atomic magnetometry, sensing, nuclear magnetic resonance, magnetic field

Recent advances in atomic magnetometry have greatly improved its sensitivity, with sensitivities of 1 fT/Hz^(1/2) demonstrated and sensitivities of 1 aT/Hz^(1/2) now theoretically possible. These sensitivities improve upon the benchmark set by superconducting quantum interference devices (SQUIDs) without the need for cryogenic cooling. This project, undertaken with Lynn Gladden and Michael Tayler, aimed to establish a rubidium-87 alkali vapour magnetometer at the University of Cambridge for nuclear magnetic resonance (NMR) applications.

Conventional NMR signals are dominated by Zeeman splitting and chemical shift contributions, and require extremely high fields. Ultra-low-field NMR experiments, such as those enabled by atomic magnetometers, allow local spin-coupling interactions to be probed, opening up a new paradigm in NMR spectroscopy. Furthermore, these measurements do not require a homogeneous field imposed by a large magnet, meaning the footprint of the measurement apparatus is much smaller, and measurements can be performed on inhomogeneous samples.

In this project, I contributed to the commissioning of a tabletop-sized ultra-low-field spectrometer. The sensitivity and bandwidth of the magnetometer were characterised as a function of the pump laser power, and a sensitivity-optimised pump laser power was determined. It was shown that the magnetometer recovery time can be reduced by increasing the pump laser power, and a novel optical pumping approach was hence proposed that would allow the recovery time to be shortened without compromising the magnetometer sensitivity.

Images adapted from Michael Tayler's paper

For further information, see:

  • Ralf Mouthaan, "Sensitivity and Response Time of an Ultra Low Field NMR Instrument based on an Atomic Magnetometer", part of MRes degree, University of Cambridge, 2017.

Ultrafast Electrical Pulse Measurement Service

Keywords: Metrology, calibration, oscilloscope, pulse, waveform

Working with Christopher Eio, I ran the ultrafast electrical pulse measurement service at the UK's national measurement institute, the National Physical Laboratory. We provided traceable measurements of the risetime, falltime, amplitude, overshoot and undershoot of pulse generators. Typically, pulse generators with risetimes below 25 ps would be calibrated, and these could then be used by customers for the traceable calibration of oscilloscopes.
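
The core quantity is the 10-90% transition time of a sampled edge. A traceable calibration involves much more (deconvolution of the sampler response and a full uncertainty budget), but the basic definition is captured by a sketch like the one below (my own illustration on a synthetic edge).

```python
import numpy as np

def rise_time_10_90(t, v):
    """Estimate the 10-90% rise time of a sampled edge.

    The low and high reference levels are taken from the settled baseline and
    top of the waveform (here simply the first and last few samples, which
    assumes a clean, monotonically rising single edge)."""
    v_low = np.mean(v[:10])
    v_high = np.mean(v[-10:])
    v10 = v_low + 0.1 * (v_high - v_low)
    v90 = v_low + 0.9 * (v_high - v_low)
    t10 = np.interp(v10, v, t)
    t90 = np.interp(v90, v, t)
    return t90 - t10

# Synthetic edge with a ~22 ps 10-90% rise time
t = np.linspace(0, 200e-12, 2001)
v = 0.5 * (1 + np.tanh((t - 100e-12) / 10e-12))
print(rise_time_10_90(t, v))
```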

Implant Safety in MRI Scanners

Keywords: MRI, magnetic resonance imaging, human exposure, RF & microwave, implants

Magnetic resonance imaging (MRI) is a powerful imaging modality, and one which is perceived as inherently safe as it only exposes the body to non-ionising radiation. The exception to this is for patients with implants. While most modern implants are non-magnetic and will hence not be attracted to the large magnets used, there are concerns regarding heating around these implants. MRI scanners use radiofrequency (RF) fields to build up an image of the human body. These RF fields can be concentrated by conducting implants, potentially causing burns and tissue necrosis. Concerns regarding these effects can lead to patients being excluded from MRI scans that could otherwise have aided the diagnosis of other conditions. Ideally, the risks associated with scanning a patient with an implant would be well-known, such that an informed decision can be made regarding whether to scan the patient.

This project was undertaken at the UK's National Physical Laboratory (NPL) in collaboration with Guy's and St Thomas' NHS Trust and JRI Orthopaedics, and aimed to set up the UK's first facility for testing the RF safety of implants in an MRI environment, in accordance with ASTM standard F2182. Tissue-equivalent phantoms were formulated to mimic the human body at 64 MHz. The implant under test was positioned inside the phantom with fibre-optic temperature probes attached, and was placed in a bespoke birdcage coil which recreates the RF environment of an MRI scanner.
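
The probe recordings are related to local heating through the standard calorimetric relation (stated here in outline; the ASTM procedure prescribes the details of how the slope is extracted):

$$\mathrm{SAR} \;\approx\; c \left. \frac{\mathrm{d}T}{\mathrm{d}t} \right|_{t \to 0},$$

where $c$ is the specific heat capacity of the tissue-equivalent gel and $\mathrm{d}T/\mathrm{d}t$ is the initial rate of temperature rise at the probe location when the RF field is switched on, before heat diffusion becomes significant.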

Images taken from IWBEEMF Malta conference paper.

For further information, see:

  • Ralf Mouthaan & Benjamin Loader, "RF-induced heating near passive implants during magnetic resonance imaging scans", Institute of Physics & Engineering in Medicine Magnetic Resonance Safety Update (IPEM MR Safety Update), Edinburgh, Scotland, 2013. (Best proffered paper).

  • Ralf Mouthaan, Benjamin Loader, Anita Czenkusz & Edward Draper, "RF-induced heating near an orthopaedic implant during a magnetic resonance imaging scan", 7th International Workshop on the Biological Effects of Electromagnetic Fields (IWBEEMF), Valetta, Malta, 2012.

Safety of Body-Mounted and Implanted Antennas

Keywords: RF & microwave, specific absorption rate, instrumentation, metrology

Specific absorption rate (SAR) is a measure of non-ionising radiation absorbed by the human body. At NPL, I helped establish the calibration facilities to calibrate SAR probes between 30 MHz and 6 GHz. Additionally, I developed the capability of formulating dielectric phantoms over this frequency range. The next logical step was to establish a facility for taking SAR measurements.

A bespoke SAR measurement facility was established capable of taking SAR measurements in line with the IEEE 1528, IEC 62209-2 and FCC OET65-C standards. As this facility had been developed in-house, and was underpinned by the SAR probe calibration capability and dielectric measurement capability also developed at NPL, it provided the flexibility required to take more complicated measurements, for example at non-standard frequencies or in non-standard configurations. This meant that backpack antennas used by the armed services and antennas embedded in the fabric of a t-shirt could be tested. Uniquely, this facility also allows implanted antennas, such as those used in capsule endoscopes, to be evaluated, as well as hyperthermia treatments for tumours.

Image taken from the NPL website.

Dielectric Measurements & Tissue-Equivalent Phantoms

Keywords: RF & microwave, dielectric measurements, metrology, specific absorption rate, tissue-equivalent phantoms

A specific absorption rate (SAR) measurement allows the amount of non-ionising radiation absorbed by the body to be quantified. It is, of course, not possible to take these measurements within the human body, and hence a so-called phantom, which stands in for the body, is required. These phantoms consist of a shell in the shape of the human body, filled with a liquid whose dielectric properties adhere to target values. Alternatively, computational models can be used instead of measurements on phantoms, but these still require the dielectric properties of the human body to be known with great accuracy.

High-quality dielectric data is required to underpin a variety of measurements. For example, the dielectric properties of different biological tissues need to be measured to inform the target dielectric properties of the phantom, and indeed the geometry of the phantom shell. Likewise, the dielectric properties of the tissue-equivalent liquid used to fill the phantom shells need to be regularly characterised. My NPL colleague Andrew Gregory developed the dielectric metrology capability to perform these measurements, developing coaxial reflection standards as well as waveguide-based transmission standards.

Using this unique measurement capability, I worked to formulate new phantoms for use in different parts of the spectrum. I developed phantoms for SAR measurements at 5.2 GHz and 5.8 GHz, corresponding to common WiFi frequencies, as well as broadband SAR phantoms. With Benjamin Loader, I defined target dielectric properties for SAR phantoms in the 6 to 10 GHz range, and we were the first to develop SAR phantoms for this part of the spectrum. I also developed phantoms for the 30 MHz to 300 MHz range, supporting work investigating the safety of body-mounted antennas and MRI machines. Separately, I developed phantoms to mimic vegetables such as potatoes in microwave imaging systems for deployment in the agricultural sector.
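
The target dielectric properties that such phantoms must reproduce are commonly described by Debye or Cole-Cole dispersion models. The sketch below evaluates a single-pole Cole-Cole model; the parameter values are placeholders for illustration, not measured tissue data or a formulation recipe.

```python
import numpy as np

def cole_cole(freq_hz, eps_inf, d_eps, tau, alpha, sigma_s):
    """Complex relative permittivity of a single-pole Cole-Cole dispersion,
    the kind of model used to express target dielectric properties."""
    eps0 = 8.854e-12
    omega = 2 * np.pi * np.asarray(freq_hz)
    return (eps_inf
            + d_eps / (1 + (1j * omega * tau) ** (1 - alpha))
            + sigma_s / (1j * omega * eps0))

# Evaluate at a common WiFi frequency (illustrative parameter values)
eps = cole_cole(5.8e9, eps_inf=4.0, d_eps=38.0, tau=7.2e-12, alpha=0.1, sigma_s=0.7)
print(eps.real, -eps.imag)       # relative permittivity and loss contribution
```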

Image taken from Andrew Gregory's paper

For further information, see:

  • B. G. Loader, A. P. Gregory & R. Mouthaan, "Formulation and properties of liquid phantoms, 1 MHz to 10 GHz", NPL Report, TQE9, 2018.

  • Ralf Mouthaan & Benjamin Loader, "Wideband tissue-equivalent liquid for multiband specific absorption rate measurements (850 MHz-2.5 GHz)", Electronics Letters, Vol. 52, No. 3, 2016.

  • Imran Mohamed, Richard Dudley, Andrew Gregory, Ralf Mouthaan, Zhengrong Tian, Paul Andrews & Andrew Mellonie, "Non-destructive testing for black heart cavities in potatoes with microwave radiation", 46th European Microwave Conference (EuMC), London, UK, 2016.

National Metrology Standards for RF & Microwave Exposure

Keywords: RF & microwave, human exposure, metrology, instrumentation, calibration, waveguide, national standards.

Mobile telephones, laptops, wireless sensors, smart watches, RFID tags and more communicate with the outside world using non-ionising RF & microwave radiation. Non-ionising radiation is not capable of directly damaging biological cells or DNA, and as such does not pose the same dangers to the human body as ionising radiation. Non-ionising radiation can, however, induce electrical currents and cause heating, and these effects must be kept within safe limits.

The International Commission on Non-Ionising Radiation Protection (ICNIRP) guidelines define safety limits, but consensus must still be reached on best practice for characterising radiation levels. In the UK, these responsibilities are met by the national metrology institute, the National Physical Laboratory (NPL), which contributes to standards such as IEEE 1528 and IEC 62209-2. These standards require that specific measurements be taken to demonstrate the safety of a device. For example, power flux density (PFD) can be measured in free space, or specific absorption rate (SAR) can be measured in a dielectric phantom. NPL maintains the UK's metrology standards in this area, to which all measurements must be traceable. NPL hence provides a unique service to UK industry, allowing the safety of transmitting devices to be demonstrated, and also to UK researchers, who can reliably quantify RF & microwave exposure in their experiments.

Working with Benjamin Loader and Daniel Bownds, I maintained the UK's RF & microwave exposure standards. Typically, metrology standards in this area consist of a transverse electromagnetic (TEM) cell or waveguide system capable of generating a known field into which a measurement probe can be inserted for calibration. A significant effort is made to characterise all the uncertainties associated with the calibration. In some cases, new standards need to be developed as new technologies are introduced or as the use of existing technologies evolves. For example, after the 2005 London bombings, a new TETRA communications system was introduced, requiring the development of waveguide SAR standards capable of generating known fields in a lossy medium at 450 MHz. To future-proof Europe's metrology capabilities, I contributed to a European Metrology Research Programme (EMRP) project to develop broadband standards covering the spectrum from 30 MHz to 10 GHz. I also worked on characterising the effects of non-sinusoidal signals on the measurement process, for example by performing a case study on electrical welding waveforms. NPL's capability is unique in the world, and a number of consultancy contracts were undertaken to provide measurement systems to the armed services as well as to other metrology institutes.
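
These calibrations rest on the local definition of SAR in a lossy medium (a standard relation, stated here for context):

$$\mathrm{SAR} = \frac{\sigma \, |E_{\mathrm{rms}}|^2}{\rho},$$

where $\sigma$ is the conductivity of the tissue-equivalent liquid, $\rho$ its density and $E_{\mathrm{rms}}$ the local r.m.s. electric field strength. A waveguide or TEM-cell standard generates a calculable field distribution, so that a probe's reading can be tied to a known SAR with a quantified uncertainty.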

Images taken from SPEAG's website and the EMC Europe Wroclaw conference paper.

For further information, see:

  • Ralf Mouthaan, Daniel Bownds & Benjamin Loader, "SAR probe calibrations in the 450 MHz to 790 MHz band", Bulgarian Journal of Public Health, Supplement, Vol. 7, No. 2, 2015.

  • Ralf Mouthaan, Daniel Bownds & Benjamin Loader, "SAR probe calibrations in the 450 MHz to 790 MHz band", 8th International Workshop on the Biological Effects of Electromagnetic Fields (IWBEEMF), Varna, Bulgaria, 2014.

  • Ralf Mouthaan, Benjamin Loader, Daniel Bownds & Andrew Gregory, "Matched waveguide SAR probe calibration systems for 750 MHz to 1.7 GHz", 10th International Congress of the European Bioelectromagnetics Association (EBEA), Rome, Italy, 2011.

  • Ralf Mouthaan, Benjamin Loader, Daniel Bownds & Andrew Gregory, "Broadband matching windows for waveguide SAR calibration systems to span 1.7 GHz to 5.85 GHz", EMC Europe, Wroclaw, Poland, 2010.

Magnetic Cooling

Keywords: Magnetic cooling, magnetocaloric effect, air conditioning, instrumentation, modelling

Modern refrigerators and air conditioning units use a gas-compression cycle, but increasing demand is emphasising the need for more efficient and environmentally friendly heating and cooling systems. The magnetocaloric effect provides an enticing alternative to traditional technologies. The magnetocaloric effect is exhibited by materials such as gadolinium, whose temperature changes when the applied magnetic field is changed. This can be understood by considering that the magnetic domains within the gadolinium sample align with the field, reducing the associated entropy. If this is done adiabatically, a corresponding temperature increase is observed. Conversely, when the magnetic field is removed, a temperature decrease is observed.
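
In standard thermodynamic notation (a textbook relation rather than a result of this project), the adiabatic temperature change on sweeping the applied field from $H_1$ to $H_2$ is

$$\Delta T_{\mathrm{ad}} = -\mu_0 \int_{H_1}^{H_2} \frac{T}{C_H(T,H)} \left( \frac{\partial M}{\partial T} \right)_H \mathrm{d}H,$$

where $M$ is the magnetisation and $C_H$ the heat capacity at constant field. Since $(\partial M/\partial T)_H$ is negative near the Curie point of gadolinium, adiabatic magnetisation raises the temperature and adiabatic demagnetisation lowers it, which is precisely what the heat cycle exploits.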

This MSci project, undertaken with Stephen Biggs at the University of Nottingham, aimed to investigate the utility of the magnetocaloric effect for air conditioning applications. By incorporating the magnetisation and demagnetisation of gadolinium into a heat cycle, cooling of a cold reservoir and heating of a hot reservoir can be achieved. Experimental and modelling results demonstrated the feasibility of this approach, but an economically viable alternative to gadolinium would need to be found.

Images taken from my MSci thesis

For further information, see:

  • Ralf Mouthaan & Stephen Biggs, "Magnetic Cooling for Domestic Air-Conditioning: Application of the Magnetocaloric Effect to Air-Conditioning Systems", MSci Thesis, University of Nottingham, 2008.