{"id":6134,"date":"2022-11-28T20:25:00","date_gmt":"2022-11-28T20:25:00","guid":{"rendered":"https:\/\/www.aiproblog.com\/index.php\/2022\/11\/28\/the-task-of-magnetic-classification-suddenly-looks-easier\/"},"modified":"2022-11-28T20:25:00","modified_gmt":"2022-11-28T20:25:00","slug":"the-task-of-magnetic-classification-suddenly-looks-easier","status":"publish","type":"post","link":"https:\/\/www.aiproblog.com\/index.php\/2022\/11\/28\/the-task-of-magnetic-classification-suddenly-looks-easier\/","title":{"rendered":"The task of magnetic classification suddenly looks easier"},"content":{"rendered":"<p>Author: Steve Nadis | Department of Nuclear Science and Engineering<\/p>\n<div>\n<p>Knowing the magnetic structure of crystalline materials is critical to many applications, including data storage, high-resolution imaging, spintronics, superconductivity, and quantum computing. Information of this sort, however, is difficult to come by. Although magnetic structures can be obtained from neutron diffraction and scattering studies, the number of machines that can support these analyses \u2014 and the time available at these facilities \u2014 is severely limited.<\/p>\n<p>As a result, the experimentally determined magnetic structures of only about 1,500 materials have been tabulated to date. Researchers have also predicted magnetic structures by numerical means, but lengthy calculations are required, even on large, state-of-the-art supercomputers. 
These calculations, moreover, become increasingly expensive, with power demands growing exponentially, as the size of the crystal structures under consideration goes up.<\/p>\n<p>Now, researchers at MIT, Harvard University, and Clemson University \u2014 led by\u00a0<a href=\"https:\/\/web.mit.edu\/nse\/people\/faculty\/mli.html\">Mingda Li<\/a>, MIT assistant professor of nuclear science and engineering, and <a href=\"https:\/\/www.eecs.mit.edu\/people\/tess-smidt\/\">Tess Smidt<\/a>, MIT assistant professor of electrical engineering and computer science \u2014 have found a way to streamline this process by employing the tools of machine learning. \u201cThis might be a quicker and cheaper approach,\u201d Smidt says.<\/p>\n<p>The team\u2019s results were recently <a href=\"https:\/\/www.cell.com\/iscience\/fulltext\/S2589-0042(22)01464-X\" target=\"_blank\" rel=\"noopener\">published in the journal\u00a0<em>iScience<\/em><\/a>. One unusual feature of this paper, apart from its novel findings, is that its first authors are three MIT undergraduates \u2014 Helena Merker, Harry Heiberger, and Linh Nguyen \u2014 plus one PhD student, Tongtong Liu.<\/p>\n<p>Merker, Heiberger, and Nguyen joined the project as first-years in fall 2020, and they were given a sizable challenge: to design a neural network that can predict the magnetic structure of crystalline materials. They did not start from scratch, however, making use of \u201cequivariant Euclidean neural networks\u201d that were co-invented by Smidt in 2018. 
The advantage of this kind of network, Smidt explains, \u201cis that we won\u2019t get a different prediction for the magnetic order if a crystal is rotated or translated, which we know should not affect the magnetic properties.\u201d That feature is especially helpful for examining 3D materials.<\/p>\n<p><strong>The elements of structure<\/strong><\/p>\n<p>The MIT group drew upon a database of nearly 150,000 substances compiled by the <a href=\"https:\/\/materialsproject.org\/\">Materials Project<\/a> at the Lawrence Berkeley National Laboratory, which provided information concerning the arrangement of atoms in the crystal lattice. The team used this input to assess two key properties of a given material: magnetic order and magnetic propagation.<\/p>\n<p>Figuring out the magnetic order involves classifying materials into three categories: ferromagnetic, antiferromagnetic, and nonmagnetic. The atoms in a ferromagnetic material act like little magnets with their own north and south poles. Each atom has a magnetic moment, which points from its south to north pole. In a ferromagnetic material, Liu explains, \u201call the atoms are lined up in the same direction \u2014 the direction of the combined magnetic field produced by all of them.\u201d In an antiferromagnetic material, the magnetic moments of the atoms point in a direction opposite to that of their neighbors \u2014 canceling each other out in an orderly pattern that yields zero magnetization overall. In a nonmagnetic material, all the atoms could be nonmagnetic, having no magnetic moments whatsoever. Or the material could contain magnetic atoms, but their magnetic moments would point in random directions so that the net result, again, is zero magnetism.<\/p>\n<p>The concept of magnetic propagation relates to the periodicity of a material\u2019s magnetic structure. 
If you think of a crystal as a 3D arrangement of bricks, a unit cell is the smallest possible building block \u2014 the smallest number, and configuration, of atoms that can make up an individual \u201cbrick.\u201d If the magnetic moments of every unit cell are aligned, the MIT researchers accorded the material a propagation value of zero. However, if the magnetic moment changes direction, and hence \u201cpropagates,\u201d in moving from one cell to the next, the material is given a non-zero propagation value.<\/p>\n<p><strong>A network solution<\/strong><\/p>\n<p>So much for the goals. How can machine learning tools help achieve them? The students\u2019 first step was to take a portion of the Materials Project database to train the neural network to find correlations between a material\u2019s crystalline structure and its magnetic structure. The students also learned \u2014 through educated guesses and trial-and-error \u2014 that they achieved the best results when they included not just information about the atoms\u2019 lattice positions, but also the atomic weight, atomic radius, electronegativity (which reflects an atom\u2019s tendency to attract an electron), and dipole polarizability (which indicates how far the electron is from the atom\u2019s nucleus). During the training process, a large number of so-called \u201cweights\u201d are repeatedly fine-tuned.<\/p>\n<p>\u201cA weight is like the coefficient m in the equation y = mx + b,\u201d Heiberger explains. \u201cOf course, the actual equation, or algorithm, we use is a lot messier, with not just one coefficient but perhaps a hundred; x, in this case, is the input data, and you choose m so that y is predicted most accurately. And sometimes you have to change the equation itself to get a better fit.\u201d<\/p>\n<p>Next comes the testing phase. 
\u201cThe weights are kept as-is,\u201d Heiberger says, \u201cand you compare the predictions you get to previously established values [also found in the Materials Project database].\u201d<\/p>\n<p>As reported in\u00a0<em>iScience<\/em>, the model had an average accuracy of about 78 percent and 74 percent, respectively, for predicting magnetic order and propagation. The accuracy for predicting the order of nonmagnetic materials was 91 percent, even if the material contained magnetic atoms.<\/p>\n<p><strong>Charting the road ahead<\/strong><\/p>\n<p>The MIT investigators believe this approach could be applied to large molecules whose atomic structures are hard to discern and even to alloys, which lack crystalline structures. \u201cThe strategy there is to take as big a unit cell \u2014 as big a sample \u2014 as possible and try to approximate it as a somewhat disordered crystal,\u201d Smidt says.<\/p>\n<p>The current work, the authors wrote, represents one step toward \u201csolving the grand challenge of full magnetic structure determination.\u201d The \u201cfull structure\u201d in this case means determining \u201cthe specific magnetic moments of every atom, rather than the overall pattern of the magnetic order,\u201d Smidt explains.<\/p>\n<p>\u201cWe have the math in place to take this on,\u201d Smidt adds, \u201cthough there are some tricky details to be worked out. It\u2019s a project for the future, but one that appears to be within reach.\u201d<\/p>\n<p>The undergraduates won\u2019t participate in that effort, having already completed their work in this venture. Nevertheless, they all appreciated the research experience. \u201cIt was great to pursue a project outside the classroom that gave us the chance to create something exciting that didn\u2019t exist before,\u201d Merker says.<\/p>\n<p>\u201cThis research, entirely led by undergraduates, started in 2020 when they were first-years. 
With Institute support from the ELO [Experiential Learning Opportunities] program and later guidance from PhD student Tongtong Liu, we were able to bring them together even while physically remote from each other. This work demonstrates how we can expand the first-year learning experience to include a real research product,\u201d Li adds. \u201cBeing able to support this kind of collaboration and learning experience is what every educator strives for. It is wonderful to see their hard work and commitment result in a contribution to the field.\u201d<\/p>\n<p>\u201cThis really was a life-changing experience,\u201d Nguyen agrees. \u201cI thought it would be fun to combine computer science with the material world. That turned out to be a pretty good choice.\u201d<\/p>\n<\/div>\n<p><a href=\"https:\/\/news.mit.edu\/2022\/task-magnetic-classification-suddenly-looks-easier-1128\">Go to Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: Steve Nadis | Department of Nuclear Science and Engineering Knowing the magnetic structure of crystalline materials is critical to many applications, including data storage, [&hellip;] <span class=\"read-more-link\"><a class=\"read-more\" href=\"https:\/\/www.aiproblog.com\/index.php\/2022\/11\/28\/the-task-of-magnetic-classification-suddenly-looks-easier\/\">Read 
More<\/a><\/span><\/p>\n","protected":false},"author":1,"featured_media":468,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[24],"tags":[],"_links":{"self":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/6134"}],"collection":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/comments?post=6134"}],"version-history":[{"count":0,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/6134\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media\/467"}],"wp:attachment":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media?parent=6134"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/categories?post=6134"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/tags?post=6134"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}