{"id":2166,"date":"2019-05-20T17:00:00","date_gmt":"2019-05-20T17:00:00","guid":{"rendered":"https:\/\/www.aiproblog.com\/index.php\/2019\/05\/20\/behind-magenta-the-tech-that-rocked-i-o\/"},"modified":"2019-05-20T17:00:00","modified_gmt":"2019-05-20T17:00:00","slug":"behind-magenta-the-tech-that-rocked-i-o","status":"publish","type":"post","link":"https:\/\/www.aiproblog.com\/index.php\/2019\/05\/20\/behind-magenta-the-tech-that-rocked-i-o\/","title":{"rendered":"Behind Magenta, the tech that rocked I\/O"},"content":{"rendered":"<div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>On the second day of I\/O 2019, two bands took the stage\u2014with a little help from machine learning. Both YACHT and The Flaming Lips worked with Google engineers who say that machine learning could change the way artists create music.<\/p>\n<p>\u201cAny time there has been a new technological development, it has made its way into music and art,\u201d says Adam Roberts, a software engineer on the <a href=\"http:\/\/g.co\/magenta\">Magenta<\/a> team. \u201cThe history of the piano, essentially, went from acoustic to electric to the synthesizer, and now there are ways to play it directly from your computer. That just happens naturally. If it\u2019s a new technology, people figure out how to use it in music.\u201d<\/p>\n<p>Magenta, which started nearly three years ago, is an open-source research project powered by TensorFlow that explores the role of machine learning as a tool in the creative process. Machine learning is a process of teaching computers to recognize patterns, with the goal of letting them learn by example rather than constantly receiving input from a programmer. 
So with music, for example, you can input two types of melodies, then use machine learning to combine them in a novel way.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-image_full_width\">\n<figure class=\"article-image--full article-module \"><img decoding=\"async\" alt=\"IO_19_Wed_Evening_11769.jpg\" src=\"https:\/\/storage.googleapis.com\/gweb-uniblog-publish-prod\/images\/IO_19_Wed_Evening_11769.max-1000x1000.jpg\"><figcaption class=\"article-image__caption h-c-page\">\n<div class=\"rich-text\">\n<p>Jesse Engel, Claire Evans, Wayne Coyne and Adam Roberts speak at I\/O.<\/p>\n<\/div>\n<\/figcaption><\/figure>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>But the Magenta team isn\u2019t just teaching computers to make music\u2014instead, they\u2019re working hand-in-hand with musicians to help take their art in new directions. YACHT was one of Magenta\u2019s earliest collaborators; the trio came to Google to learn more about how to use artificial intelligence and machine learning on their upcoming album.<\/p>\n<p>The band first took all 82 songs from their back catalog and isolated each part, from bass lines to vocal melodies to drum rhythms; they then broke those isolated parts into four-bar loops. Then, they fed those loops into the <a href=\"http:\/\/g.co\/magenta\/musicvae\">machine learning model<\/a>, which generated new melodies based on their old work. They followed a similar process with lyrics, using their old songs plus other material they considered inspiring. 
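<\/p>\n<p>The melody-combining idea described above can be sketched with a toy example. To be clear, this is not Magenta\u2019s actual model (MusicVAE encodes whole melodies into a learned latent space and decodes blends of those codes); the function and melodies below are made up purely for illustration.<\/p>

```python
# Toy illustration of blending two melodies, loosely in the spirit of
# latent-space interpolation. The real MusicVAE model interpolates in
# a learned latent space; this stand-in simply interpolates the two
# melodies pitch-by-pitch.

def blend_melodies(melody_a, melody_b, weight=0.5):
    # Melodies are equal-length lists of MIDI pitch numbers.
    if len(melody_a) != len(melody_b):
        raise ValueError('melodies must be the same length')
    return [round(a * (1 - weight) + b * weight)
            for a, b in zip(melody_a, melody_b)]

# Blend a C-major fragment with an A-minor fragment.
print(blend_melodies([60, 60, 67, 67], [57, 60, 64, 69]))
```

<p>A weight of 0 returns the first melody and 1 returns the second; values in between trade off the two contours.<\/p>\n<p>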
The final task was to pick lyrics and melodies that made sense, and pair them together to make a song.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-video\">\n<div class=\"h-c-page h-c-page--mobile-full-bleed\">\n<div class=\"h-c-grid\">\n<div class=\"h-c-grid__col h-c-grid__col-l--12 \">\n<div class=\"article-module article-video \">\n<figure><a class=\"h-c-video h-c-video--marquee\" data-glue-modal-disabled-on-mobile=\"true\" data-glue-modal-trigger=\"uni-modal-pM9u9xcM_cs-\" href=\"https:\/\/youtube.com\/watch?v=pM9u9xcM_cs\"><img decoding=\"async\" alt=\"Music and Machine Learning (Google I\/O'19)\" src=\"https:\/\/img.youtube.com\/vi\/pM9u9xcM_cs\/maxresdefault.jpg\"><svg class=\"h-c-video__play h-c-icon h-c-icon--color-white\" role=\"img\"><use xlink:href=\"#mi-youtube-icon\"><\/use><\/svg><\/a><figcaption class=\"article-video__caption h-c-page\">\n<h4 class=\"h-c-headline h-c-headline--four h-u-font-weight-medium h-u-mt-std\">Music and Machine Learning Session from Google I\/O&#8217;19<\/h4>\n<\/figcaption><\/figure>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"h-c-modal--video\" data-glue-modal=\"uni-modal-pM9u9xcM_cs-\" data-glue-modal-close-label=\"Close Dialog\"><a class=\"glue-yt-video\" data-glue-yt-video-autoplay=\"true\" data-glue-yt-video-height=\"99%\" data-glue-yt-video-vid=\"pM9u9xcM_cs\" data-glue-yt-video-width=\"100%\" href=\"https:\/\/youtube.com\/watch?v=pM9u9xcM_cs\" ng-cloak=\"\"><\/a><\/div>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>\u201cThey used these tools to push themselves out of their comfort zone,\u201d says Jesse Engel, a research scientist on the Magenta team. \u201cThey imposed some rules on themselves that they had to use the outputs of the model to some extent, and it helped them make new types of music.\u201d<\/p>\n<p>Claire Evans, the singer of YACHT, explained the process during a presentation at I\/O. 
\u201cUsing machine learning to make a song with structure, with a beginning, middle and end, is a little bit still out of our reach,\u201d she explained. \u201cBut that\u2019s a good thing. The melody was the model\u2019s job, but the arrangement and performance was entirely our job.\u201d<\/p>\n<p>The Flaming Lips\u2019 use of Magenta is a lot more recent; the band started working with the Magenta team to prepare for their performance at I\/O. The Magenta team showcased all their projects to the band, who were drawn to one in particular: <a href=\"https:\/\/magenta.tensorflow.org\/pianogenie\">Piano Genie<\/a>, which was dreamed up by a graduate student, Chris Donahue, who was a summer intern at Google. They decided to use Piano Genie as the basis for a new song to be debuted on the I\/O stage.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-video\">\n<div class=\"h-c-page h-c-page--mobile-full-bleed\">\n<div class=\"h-c-grid\">\n<div class=\"h-c-grid__col h-c-grid__col-l--12 \">\n<div class=\"article-module article-video \">\n<figure><a class=\"h-c-video h-c-video--marquee\" data-glue-modal-disabled-on-mobile=\"true\" data-glue-modal-trigger=\"uni-modal-HGWkQP9lVPw-\" href=\"https:\/\/youtube.com\/watch?v=HGWkQP9lVPw\"><img decoding=\"async\" alt=\"Google AI collaboration with The Flaming Lips bears fruit at I\/O 2019\" src=\"https:\/\/img.youtube.com\/vi\/HGWkQP9lVPw\/maxresdefault.jpg\"><svg class=\"h-c-video__play h-c-icon h-c-icon--color-white\" role=\"img\"><use xlink:href=\"#mi-youtube-icon\"><\/use><\/svg><\/a><\/figure>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"h-c-modal--video\" data-glue-modal=\"uni-modal-HGWkQP9lVPw-\" data-glue-modal-close-label=\"Close Dialog\"><a class=\"glue-yt-video\" data-glue-yt-video-autoplay=\"true\" data-glue-yt-video-height=\"99%\" data-glue-yt-video-vid=\"HGWkQP9lVPw\" data-glue-yt-video-width=\"100%\" href=\"https:\/\/youtube.com\/watch?v=HGWkQP9lVPw\" ng-cloak=\"\"><\/a><\/div>\n<\/div>\n<div 
class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>Piano Genie distills 88 notes on a piano to eight buttons, which you can push to your heart\u2019s content to make piano music. In what Jesse calls \u201can initial moment of inspiration,\u201d someone put a piece of wire inside a piece of fruit, and turned fruit into the buttons for Piano Genie. \u201cFruit can be used as a capacitive sensor, like the screen on your phone, so you can detect whether or not someone is touching the fruit,\u201d Jesse explains. \u201cThey were playing these fruits just by touching these different fruits, and they got excited by how that changed the interaction.\u201d<\/p>\n<p>Wayne Coyne, the singer of The Flaming Lips, noted during an I\/O panel that a quick turnaround time, plus close collaboration with Google, gave them the inspiration to think outside the box. \u201cFor me, the idea that we\u2019re not playing it on a keyboard, we\u2019re not playing it on a guitar, we\u2019re playing it on fruit, takes it into this other realm,\u201d he said.<\/p>\n<p>During their performance that night, Steven Drozd from The Flaming Lips, who usually plays a variety of instruments, played a \u201cmagical bowl of fruit\u201d for the first time. He tapped each fruit in the bowl, which then played different musical tones, \u201csinging\u201d the fruit\u2019s own name. 
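<\/p>\n<p>The eight-buttons-to-88-keys idea behind the fruit bowl can be sketched crudely. The real Piano Genie uses a learned neural model to decide which piano note each button press should become; the fixed zone mapping below is a made-up stand-in for illustration only.<\/p>

```python
# Toy sketch of Piano Genie's interface: 8 buttons driving an 88-key
# piano. The real system picks notes with a learned model; this
# stand-in splits the keyboard into eight contiguous zones and maps
# each button to a note inside its zone.

PIANO_KEYS = 88   # MIDI notes 21 (A0) through 108 (C8)
BUTTONS = 8
ZONE = PIANO_KEYS // BUTTONS   # 11 keys per button

def button_to_note(button, step=0):
    # button: 0..7 selects a zone; step picks a key within the zone.
    if not 0 <= button < BUTTONS:
        raise ValueError('button must be in 0-7')
    return 21 + button * ZONE + (step % ZONE)

# Sweeping the buttons left to right climbs the keyboard.
print([button_to_note(b) for b in range(BUTTONS)])
```

<p>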
With help from Magenta, the band broke into a brand-new song, \u201cStrawberry Orange.\u201d<\/p>\n<\/div>\n<\/div>\n<div class=\"block-image_full_width\">\n<figure class=\"article-image--full article-module \"><img decoding=\"async\" alt=\"IO_19_Wed_Concert_13496 (1).jpg\" src=\"https:\/\/storage.googleapis.com\/gweb-uniblog-publish-prod\/images\/IO_19_Wed_Concert_13496_1.max-1000x1000.jpg\"><figcaption class=\"article-image__caption h-c-page\">\n<div class=\"rich-text\">\n<p>The Flaming Lips\u2019 Steven Drozd plays a bowl of fruit.<\/p>\n<\/div>\n<\/figcaption><\/figure>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>The Flaming Lips also got help from the audience: At one point, they tossed giant, blow-up \u201cfruits\u201d into the crowd, and each fruit was also set up as a sensor, so any audience member who got their hands on one played music, too. The end result was a cacophonous, joyous moment when a crowd truly contributed to the band\u2019s sound.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-image_full_width\">\n<figure class=\"article-image--full article-module \"><img decoding=\"async\" alt=\"IO_19_Wed_Concert_13502 (1).jpg\" src=\"https:\/\/storage.googleapis.com\/gweb-uniblog-publish-prod\/images\/IO_19_Wed_Concert_13502_1.max-1000x1000.jpg\"><figcaption class=\"article-image__caption h-c-page\">\n<div class=\"rich-text\">\n<p>Audience members \u201cplay\u201d an inflatable banana.<\/p>\n<\/div>\n<\/figcaption><\/figure>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>You can learn more about the &#8220;Fruit Genie&#8221; and how to build your own at <a href=\"http:\/\/g.co\/magenta\/fruitgenie\">g.co\/magenta\/fruitgenie<\/a>.<\/p>\n<p>Though the Magenta team collaborated on a much deeper level with YACHT, they also found the partnership with The Flaming Lips to be an exciting look toward the future. 
\u201cThe Flaming Lips is a proof of principle of how far we\u2019ve come with the technologies,\u201d Jesse says. \u201cThrough working with them we understood how to make our technologies more accessible to a broader base of musicians. We were able to show them all these things and they could just dive in and play with it.\u201d<\/p>\n<\/div>\n<\/div>\n<\/div>\n<p><a href=\"https:\/\/www.blog.google\/technology\/ai\/behind-magenta-tech-rocked-io\/\">Go to Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: On the second day of I\/O 2019, two bands took the stage\u2014with a little help from machine learning. Both YACHT and The Flaming Lips [&hellip;] <span class=\"read-more-link\"><a class=\"read-more\" href=\"https:\/\/www.aiproblog.com\/index.php\/2019\/05\/20\/behind-magenta-the-tech-that-rocked-i-o\/\">Read More<\/a><\/span><\/p>\n","protected":false},"author":1,"featured_media":2167,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[24],"tags":[],"_links":{"self":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/2166"}],"collection":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/comments?post=2166"}],"version-history":[{"count":0,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/2166\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.aiproblog.c
om\/index.php\/wp-json\/wp\/v2\/media\/2167"}],"wp:attachment":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media?parent=2166"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/categories?post=2166"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/tags?post=2166"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}