Physics Colloquium: Materials for the future

Technological progress today rests largely on the development of new materials, and the demand for material innovation keeps growing alongside the demand for newer and faster devices. So how are these materials created and tested for possible industrial uses? At present, the answer is essentially trial and error: repeated cycles of hypothesizing, fabrication, and testing that grow ever costlier in both time and money. In this week’s installment of the Physics Department’s colloquium series, Prof. Abram van der Geest of SUNY-Binghamton explained a recently developed process for ‘predicting’ new materials and their properties through computational methods and analysis.

At first, van der Geest admitted, it seemed a stretch that the properties of materials not yet created could be accurately predicted by a computer without a physical sample to work with. Because material development relies heavily on many trials and iterations, computing clusters are a natural fit: they excel at repetitive, iterative calculations. As for what these computers actually calculate, van der Geest explained that the science behind material structures and surfaces rests on density functional theory (DFT), applied here to crystalline materials (the focus of this work), which are treated as infinitely periodic. Each crystal structure is described by a Bravais lattice together with a basis of atoms; the lattice is defined by the vectors of a unit cell with six degrees of freedom (an important feature in the later calculations).
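To make those ‘six degrees of freedom’ concrete: a crystal’s repeating cell can be specified by three edge lengths and three angles, from which the lattice vectors follow. The short Python sketch below is not from the talk; it simply shows one standard convention for this construction using NumPy.

```python
import numpy as np

def lattice_vectors(a, b, c, alpha, beta, gamma):
    """Build the three Bravais lattice vectors of a unit cell from its six
    degrees of freedom: edge lengths a, b, c and angles alpha, beta, gamma
    (in degrees)."""
    al, be, ga = np.radians([alpha, beta, gamma])
    v1 = np.array([a, 0.0, 0.0])
    v2 = np.array([b * np.cos(ga), b * np.sin(ga), 0.0])
    cx = c * np.cos(be)
    cy = c * (np.cos(al) - np.cos(be) * np.cos(ga)) / np.sin(ga)
    cz = np.sqrt(c**2 - cx**2 - cy**2)
    v3 = np.array([cx, cy, cz])
    return np.vstack([v1, v2, v3])

# Example: a cubic cell with 3.0-angstrom edges (all angles 90 degrees)
print(lattice_vectors(3.0, 3.0, 3.0, 90.0, 90.0, 90.0))
```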

The end goal of this computational DFT method was to determine the stability (or, in some cases, the metastable points, explained below) of a hypothesized material. These computations differ slightly from the single-minimum, lowest-energy picture familiar from introductory quantum mechanics, because a material’s energy landscape does not generally have a single low-energy point. The energy minima of a structure shift with external conditions such as temperature and pressure, leading to several points of ‘metastability’, that is, stability under a specific set of conditions.
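One standard way to express this dependence on conditions (not spelled out in the talk, but the usual thermodynamic framing) is to compare candidate phases by a free energy of the form G = E + PV − TS at the temperature and pressure of interest: the phase with the lowest G is stable there, while other local minima are metastable. A minimal illustrative sketch, with entirely invented numbers and schematic units:

```python
# Illustrative only: compare two hypothetical phases of the same composition
# by a simplified free energy G = E + P*V - T*S and report which is stable
# at a given temperature and pressure. All values below are made up.
phases = {
    "phase_A": {"E": -10.00, "V": 12.0, "S": 0.0020},  # energy, volume, entropy per formula unit
    "phase_B": {"E": -9.95,  "V": 10.5, "S": 0.0035},
}

def gibbs(phase, T, P):
    return phase["E"] + P * phase["V"] - T * phase["S"]

for T, P in [(300, 0.0), (1500, 0.0), (300, 0.05)]:
    stable = min(phases, key=lambda name: gibbs(phases[name], T, P))
    print(f"T={T}, P={P}: lowest free energy is {stable}")
```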

The first step in this methodology, as with most computational methods, was to define the needed parameters: the material’s atomic composition, the number of formula units, the lattice parameters, and the atomic positions within the cell. Once these are set, van der Geest explained, the prediction calculation can be run, and its computational cost (in CPU hours) depends on how far one pushes the prediction with zero experimental input (that is, an entirely predicted result). A basic characterization of a proposed material requires on the order of 100 CPU hours, and a more precise solution roughly 1,000 CPU hours; at both levels, most computing clusters can handle the work with little difficulty. A thorough, highly detailed, ‘true’ prediction of the material and all of its properties, with absolutely no experimental input, requires on the order of 100,000 CPU hours. At that cost, van der Geest put the limit of feasibility at materials with no more than about 32 atoms in the cell; beyond this, the advantage of predicting a material over simply producing and testing candidates repeatedly essentially disappears at the current level of available computing power.
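As a rough illustration of what ‘defining the parameters’ might look like in practice, the search input could be organized along these lines. The field names and example values here are hypothetical; the actual input format used in the talk was not shown.

```python
from dataclasses import dataclass

@dataclass
class PredictionInput:
    """Hypothetical container for the four inputs described in the talk."""
    composition: dict          # atomic composition, e.g. {"Fe": 1, "B": 4}
    formula_units: int         # number of formula units in the cell
    lattice_parameters: tuple  # (a, b, c, alpha, beta, gamma)
    atomic_positions: list     # fractional coordinates of each atom

# Example setup for an iron-boride search (values invented for illustration)
feb4_input = PredictionInput(
    composition={"Fe": 1, "B": 4},
    formula_units=2,
    lattice_parameters=(4.5, 5.3, 3.0, 90.0, 90.0, 90.0),
    atomic_positions=[],  # to be filled in or searched over
)
```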

Another pillar of this approach is the well-documented fact that crystal structures are widely reused throughout nature, and that over time scientists have discovered and validated a large catalog of them. To determine the structure of a hypothesized material, van der Geest proposed, known structures are tested on the composition one after another until one ‘fits’, that is, gives the lowest energy. Once again, computing clusters are well suited to quickly running all of these repetitive structure–composition combinations. The one drawback van der Geest pointed out is that such predictions are limited to known structures: the process has no means of discovering genuinely new crystalline structures and merely applies the best-fitting known one to the hypothesized material, which up to this point has been successful.
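In outline, that search loop might look like the following sketch. The prototype list and the energy call are placeholders of my own; in a real workflow each candidate would be handed to a DFT code for the energy evaluation.

```python
def predict_structure(composition, known_prototypes, energy_of):
    """Try each known crystal-structure prototype for a composition and
    return the one with the lowest computed energy (the 'best fit').

    known_prototypes: iterable of candidate structure descriptions
    energy_of: callable that performs the energy calculation -- a
               placeholder here, since the real call depends on the DFT
               code being used.
    """
    best_structure, best_energy = None, float("inf")
    for prototype in known_prototypes:
        energy = energy_of(composition, prototype)
        if energy < best_energy:
            best_structure, best_energy = prototype, energy
    return best_structure, best_energy
```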

An example of how this new and growing idea has already affected the world of technology is the development of metal borides. To date, as many as 41 metal elements are known to form compounds with boron, yielding materials with a wide variety of properties. Some of these metal borides have been predicted to serve as superconductors, superhard materials, and refractory materials, which remain stable under extremely challenging conditions (high pressure, high temperature, and so on). Several of these remain only predictions to this day, but some have already been shown to be true, including the superconductors iron tetraboride (FeB4), calcium hexaboride (CaB6), and manganese tetraboride (MnB4). Even for those yet to be physically verified, the predictions hold astounding possibilities if they can be borne out experimentally, as in the case of rhenium diboride, which is predicted to be the world’s hardest material (even harder than diamond).

While DFT-based prediction is already intriguing to those studying materials science, van der Geest hopes to improve the method further. He plans eventually to apply the theory to a wider range of surfaces and interfaces, and to modify the functionals involved in the computations to accommodate a wider range of material properties.
