VAMPIRE: Billion-atom simulations of magnetic materials
eCSE07-009
Key Personnel
PI: Dr. Richard Evans, University of York
Technical: Rory Pond, University of York
Relevant Documents
eCSE Technical Report: Optimisation of VAMPIRE for billion-atom simulations of magnetic materials on ARCHER
Project summary
Magnetic materials are essential to a wide range of technologies, from data storage to cancer treatment to permanent magnets used in wind generators. New developments in magnetic materials promise huge increases in performance of devices but progress is limited by our understanding of magnetic properties at the atomic scale. Atomistic spin dynamics simulations[1] provide a natural way to study magnetic processes on the nanoscale, treating each atom as possessing a localised spin magnetic moment. The localised nature of the spins allows the simulation of a range of complex physical phenomena such as phase transitions, laser heating, and antiferromagnet dynamics in complex systems such as nanoparticles, surfaces and interfaces. However, such approaches are computationally expensive, requiring parallel computers to perform simulations of more than a few thousand atoms.
The aims of this eCSE project were to optimise the VAMPIRE code on the ARCHER system and to improve the data input/output routines so that configuration data can be extracted from the simulation, revealing the time evolution of the atomic spins. The improved code enables a new class of magnetic materials simulations containing between 10,000,000 and 1,000,000,000 atoms. Such large simulations give unprecedented insight into the behaviour of complex magnetic materials in realistic situations. An example calculation of a recently discovered kind of magnetic switching caused by ultrafast laser heat pulses is shown in the figure. Here the material is an alloy of Fe and Gd whose composition varies on the five-nanometre length scale. The simulation shows a snapshot of the magnetic configuration during reversal, revealing the appearance of magnetic domains and their propagation at speeds of over 100 km per second.
These simulations were not possible before the project and have been enabled by a new method to write simulation data to disk in parallel. The new method uses the parallel file system on ARCHER to output large files to disk in parallel, and has enabled transfers at rates of over 30 GB/s, allowing the simulations to proceed as fast as possible. In future these large-scale simulations could lead to new magnetic devices and technologies that could increase the storage capacity of hard disk drives by an order of magnitude, or lead to the development of new permanent magnet materials that are twice as energy efficient as those in use today.
Achievement of objectives
The objectives of the project were:
- Port and optimise the VAMPIRE code on ARCHER
- Optimise parallel data input and output for scalable simulations on ARCHER
We have successfully ported and optimised the open source VAMPIRE software package on the UK ARCHER supercomputing service, achieving speedups of between 4% and 17% using optimised compiler settings. During the project we implemented major changes to the data input and output routines in the code, removing the previous bottleneck of generating snapshots of the atomic-scale magnetic configuration from a simulation. The new output code utilises the full capabilities of the parallel file system on ARCHER, achieving effective output bandwidths in excess of 30 GB/s for a wide range of data sizes and processor counts. We have demonstrated the new capabilities of the code by simulating ultrafast magnetic domain wall dynamics in a system of over 918,000,000 Fe and Gd atoms.
Summary of the software
The VAMPIRE code is an open source software package for performing parallel atomistic simulations of magnetic materials. The code is written in a mixture of functional and object-oriented C++ with a modular structure that allows new features to be added easily. The code is freely available from GitHub. As part of the version 5 release of the VAMPIRE code (due summer 2017) it is intended that the code will become a standard module on the ARCHER system.