{"id":7880,"date":"2025-01-14T20:40:00","date_gmt":"2025-01-14T20:40:00","guid":{"rendered":"https:\/\/www.aiproblog.com\/index.php\/2025\/01\/14\/new-computational-chemistry-techniques-accelerate-the-prediction-of-molecules-and-materials\/"},"modified":"2025-01-14T20:40:00","modified_gmt":"2025-01-14T20:40:00","slug":"new-computational-chemistry-techniques-accelerate-the-prediction-of-molecules-and-materials","status":"publish","type":"post","link":"https:\/\/www.aiproblog.com\/index.php\/2025\/01\/14\/new-computational-chemistry-techniques-accelerate-the-prediction-of-molecules-and-materials\/","title":{"rendered":"New computational chemistry techniques accelerate the prediction of molecules and materials"},"content":{"rendered":"<p>Author: Steve Nadis | Department of Nuclear Science and Engineering<\/p>\n<div>\n<p>Back in the old days \u2014 the really old days \u2014 the task of designing materials was laborious. Investigators, over the course of 1,000-plus years, tried to make gold by combining things like lead, mercury, and sulfur, mixed in what they hoped would be just the right proportions. Even famous scientists like Tycho Brahe, Robert Boyle, and Isaac Newton tried their hands at the fruitless endeavor we call alchemy.<\/p>\n<p>Materials science has, of course, come a long way. For the past 150 years, researchers have had the benefit of the periodic table of elements to draw upon, which tells them that different elements have different properties, and one can\u2019t magically transform into another. Moreover, in the past decade or so, machine learning tools have considerably boosted our capacity to determine the structure and physical properties of various molecules and substances. New research by a group led by Ju Li \u2014 the Tokyo Electric Power Company Professor of Nuclear Engineering at MIT and professor of materials science and engineering\u00a0\u2014 offers the promise of a major leap in capabilities that can facilitate materials design. 
The results of their investigation are reported in the\u00a0<a href=\"https:\/\/arxiv.org\/abs\/2405.12229\">December 2024 issue of <em>Nature Computational Science<\/em><\/a><em>.<\/em><\/p>\n<p>At present, most of the machine-learning models that are used to characterize molecular systems are based on density functional theory (DFT), which offers a quantum mechanical approach to determining the total energy of a molecule or crystal by looking at the electron density distribution \u2014 which is, basically, the average number of electrons located in a unit volume around each given point in space near the molecule. (Walter Kohn, who co-invented this theory 60 years ago, received the Nobel Prize in Chemistry for it in 1998.) While the method has been very successful, it has some drawbacks, according to Li: \u201cFirst, the accuracy is not uniformly great. And, second, it only tells you one thing: the lowest total energy of the molecular system.\u201d<\/p>\n<p><strong>\u201cCouples therapy\u201d to the rescue<\/strong><\/p>\n<p>His team is now relying on a different computational chemistry technique, also derived from quantum mechanics, known as coupled-cluster theory, or CCSD(T). \u201cThis is the gold standard of quantum chemistry,\u201d Li comments. The results of CCSD(T) calculations are much more accurate than what you get from DFT calculations, and they can be as trustworthy as those currently obtainable from experiments. The problem is that carrying out these calculations on a computer is very slow, he says, \u201cand the scaling is bad: If you double the number of electrons in the system, the computations become 100 times more expensive.\u201d For that reason, CCSD(T) calculations have normally been limited to molecules with a small number of atoms \u2014 on the order of about 10. Anything much beyond that would simply take too long.<\/p>\n<p>That\u2019s where machine learning comes in. 
CCSD(T) calculations are first performed on conventional computers, and the results are then used to train a neural network with a novel architecture specially devised by Li and his colleagues. After training, the neural network can perform these same calculations much faster by taking advantage of approximation techniques. What\u2019s more, their neural network model can extract much more information about a molecule than just its energy. \u201cIn previous work, people have used multiple different models to assess different properties,\u201d says Hao Tang, an MIT PhD student in materials science and engineering. \u201cHere we use just one model to evaluate all of these properties, which is why we call it a \u2018multi-task\u2019 approach.\u201d<\/p>\n<p>The \u201cMulti-task Electronic Hamiltonian network,\u201d or MEHnet, sheds light on a number of electronic properties, such as the dipole and quadrupole moments, electronic polarizability, and the optical excitation gap \u2014 the amount of energy needed to take an electron from the ground state to the lowest excited state. \u201cThe excitation gap affects the optical properties of materials,\u201d Tang explains, \u201cbecause it determines the frequency of light that can be absorbed by a molecule.\u201d Another advantage of their CCSD-trained model is that it can reveal properties of not only ground states, but also excited states. The model can also predict the infrared absorption spectrum of a molecule related to its vibrational properties, where the vibrations of atoms within a molecule are coupled to each other, leading to various collective behaviors.<\/p>\n<p>The strength of their approach owes a lot to the network architecture. 
Drawing on the work of MIT Assistant Professor <a href=\"https:\/\/mitibmwatsonailab.mit.edu\/people\/tess-smidt\/\">Tess Smidt<\/a>, the team is utilizing a so-called E(3)-equivariant graph neural network, says Tang, \u201cin which the nodes represent atoms and the edges that connect the nodes represent the bonds between atoms. We also use customized algorithms that incorporate physics principles \u2014 related to how people calculate molecular properties in quantum mechanics \u2014 directly into our model.\u201d<\/p>\n<p><strong>Testing, 1, 2, 3<\/strong><\/p>\n<p>When tested on its analysis of known hydrocarbon molecules, the model of Li et al. outperformed DFT counterparts and closely matched experimental results taken from the published literature.<\/p>\n<p>Qiang Zhu \u2014 a materials discovery specialist at the University of North Carolina at Charlotte (who was not part of this study) \u2014 is impressed by what\u2019s been accomplished so far. \u201cTheir method enables effective training with a small dataset, while achieving superior accuracy and computational efficiency compared to existing models,\u201d he says. \u201cThis is exciting work that illustrates the powerful synergy between computational chemistry and deep learning, offering fresh ideas for developing more accurate and scalable electronic structure methods.\u201d<\/p>\n<p>The MIT-based group applied their model first to small, nonmetallic elements \u2014 hydrogen, carbon, nitrogen, oxygen, and fluorine, from which organic compounds can be made \u2014 and has since moved on to examining heavier elements: silicon, phosphorus, sulfur, chlorine, and even platinum. After being trained on small molecules, the model can be generalized to bigger and bigger molecules. \u201cPreviously, most calculations were limited to analyzing hundreds of atoms with DFT and just tens of atoms with CCSD(T) calculations,\u201d Li says. 
\u201cNow we\u2019re talking about handling thousands of atoms and, eventually, perhaps tens of thousands.\u201d<\/p>\n<p>For now, the researchers are still evaluating known molecules, but the model can be used to characterize molecules that haven\u2019t been seen before, as well as to predict the properties of hypothetical materials that consist of different kinds of molecules. \u201cThe idea is to use our theoretical tools to pick out promising candidates, which satisfy a particular set of criteria, before suggesting them to an experimentalist to check out,\u201d Tang says.<\/p>\n<p><strong>It\u2019s all about the apps<\/strong><\/p>\n<p>Looking ahead, Zhu is optimistic about the possible applications. \u201cThis approach holds the potential for high-throughput molecular screening,\u201d he says. \u201cThat\u2019s a task where achieving chemical accuracy can be essential for identifying novel molecules and materials with desirable properties.\u201d<\/p>\n<p>Once they demonstrate the ability to analyze large molecules with perhaps tens of thousands of atoms, Li says, \u201cwe should be able to invent new polymers or materials\u201d that might be used in drug design or in semiconductor devices. The examination of heavier transition metal elements could lead to the advent of new materials for batteries \u2014 presently an area of acute need.<\/p>\n<p>The future, as Li sees it, is wide open. \u201cIt\u2019s no longer about just one area,\u201d he says. \u201cOur ambition, ultimately, is to cover the whole periodic table with CCSD(T)-level accuracy, but at lower computational cost than DFT. This should enable us to solve a wide range of problems in chemistry, biology, and materials science. It\u2019s hard to know, at present, just how wide that range might be.\u201d<\/p>\n<p>This work was supported by the Honda Research Institute. Hao Tang acknowledges support from the MathWorks Engineering Fellowship. 
The calculations in this work were performed, in part, on the Matlantis high-speed universal atomistic simulator, the Texas Advanced Computing Center, the MIT SuperCloud, and the National Energy Research Scientific Computing Center.<\/p>\n<\/div>\n<p><a href=\"https:\/\/news.mit.edu\/2025\/new-computational-chemistry-techniques-accelerate-prediction-molecules-materials-0114\">Go to Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: Steve Nadis | Department of Nuclear Science and Engineering Back in the old days \u2014 the really old days \u2014 the task of designing [&hellip;] <span class=\"read-more-link\"><a class=\"read-more\" href=\"https:\/\/www.aiproblog.com\/index.php\/2025\/01\/14\/new-computational-chemistry-techniques-accelerate-the-prediction-of-molecules-and-materials\/\">Read More<\/a><\/span><\/p>\n","protected":false},"author":1,"featured_media":458,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[24],"tags":[],"_links":{"self":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/7880"}],"collection":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/comments?post=7880"}],"version-history":[{"count":0,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/7880\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media\/458"}],"wp:attachment":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media?parent=7880"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/categories?post=7880"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/tags?post=7880"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}