{"id":5511,"date":"2022-03-24T16:00:00","date_gmt":"2022-03-24T16:00:00","guid":{"rendered":"https:\/\/www.aiproblog.com\/index.php\/2022\/03\/24\/the-check-up-our-latest-health-ai-developments\/"},"modified":"2022-03-24T16:00:00","modified_gmt":"2022-03-24T16:00:00","slug":"the-check-up-our-latest-health-ai-developments","status":"publish","type":"post","link":"https:\/\/www.aiproblog.com\/index.php\/2022\/03\/24\/the-check-up-our-latest-health-ai-developments\/","title":{"rendered":"The Check Up: our latest health AI developments"},"content":{"rendered":"<p>Author: <\/p>\n<div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>Over the years, teams across Google have focused on how technology \u2014 specifically artificial intelligence and hardware innovations \u2014 can improve access to high-quality, equitable healthcare across the globe.<\/p>\n<p>Accessing the right healthcare can be challenging depending on where people live and whether local caregivers have specialized equipment or training for tasks like disease screening. To help, Google Health has expanded its research and applications to focus on improving the care clinicians provide and allow care to happen outside hospitals and doctor\u2019s offices.<\/p>\n<p>Today, at our Google Health event <a href=\"https:\/\/www.youtube.com\/watch?v=2XQZQR477fg\">The Check Up<\/a>, we\u2019re sharing new areas of AI-related research and development and how we\u2019re providing clinicians with easy-to-use tools to help them better care for patients. 
Here\u2019s a look at some of those updates.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<h3>Smartphone cameras\u2019 potential to protect cardiovascular health and preserve eyesight<\/h3>\n<p>One of our earliest Health AI projects, <a href=\"https:\/\/health.google\/caregivers\/arda\/\">ARDA<\/a>, aims to support screening for diabetic retinopathy \u2014 a complication of diabetes that, if undiagnosed and untreated, can cause blindness.<\/p>\n<p>Today, we screen 350 patients daily, resulting in close to 100,000 patients screened to date. We <a href=\"https:\/\/www.thelancet.com\/journals\/landig\/article\/PIIS2589-7500(22)00017-6\/fulltext\">recently completed a prospective study<\/a> with the Thailand national screening program that further shows ARDA is accurate and can be deployed safely across multiple regions to support more accessible eye screenings.<\/p>\n<p>In addition to diabetic eye disease, we\u2019ve also previously shown how <a href=\"https:\/\/ai.googleblog.com\/2018\/02\/assessing-cardiovascular-risk-factors.html\">photos of eyes\u2019 interiors (or fundus) can reveal cardiovascular risk factors<\/a>, such as high blood sugar and cholesterol levels, with assistance from deep learning. <a href=\"http:\/\/ai.googleblog.com\/2022\/03\/detecting-signs-of-disease-from.html\">Our recent research<\/a> tackles detecting diabetes-related diseases from photos of the exterior of the eye, using existing tabletop cameras in clinics. Given the <a href=\"https:\/\/arxiv.org\/abs\/2011.11732\">promising early results<\/a>, we\u2019re looking forward to clinical research with partners, including EyePACS and Chang Gung Memorial Hospital (CGMH), to investigate whether external eye photos from smartphone cameras can help detect diabetes and non-diabetes-related diseases as well. 
While this is in the early stages of research and development, our engineers and scientists envision a future where people, with the help of their doctors, can better understand and make decisions about health conditions from their own homes.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<h3>Recording and translating heart sounds with smartphones<\/h3>\n<p>We\u2019ve previously shared how mobile sensors combined with machine learning can democratize health metrics and <a href=\"https:\/\/blog.google\/technology\/health\/take-pulse-health-and-wellness-your-phone\/\">give people insights into daily health and wellness<\/a>. Our feature that allows you to measure your heart rate and respiratory rate with your phone\u2019s camera is now available on over 100 models of Android devices, as well as iOS devices. Our <a href=\"https:\/\/www.medrxiv.org\/content\/10.1101\/2021.03.08.21252408v1\">manuscript<\/a> describing the prospective validation study has been accepted for publication.<\/p>\n<p>Today, we\u2019re sharing a new area of research that explores how a smartphone\u2019s built-in microphones could record heart sounds when placed over the chest. Listening to someone\u2019s heart and lungs with a stethoscope, known as auscultation, is a critical part of a physical exam. It can help clinicians detect heart valve disorders, such as aortic stenosis, which is important to detect early. Screening for aortic stenosis typically requires specialized equipment, like a stethoscope or an ultrasound, and an in-person assessment.<\/p>\n<p>Our latest research investigates whether a smartphone can detect heartbeats and murmurs. 
We\u2019re currently in the early stages of clinical testing, but we hope that our work can empower people to use the smartphone as an additional tool for accessible health evaluation.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<h3>Partnering with Northwestern Medicine to apply AI to improve maternal health<\/h3>\n<p>Ultrasound is a noninvasive diagnostic imaging method that uses high-frequency sound waves to create real-time pictures or videos of internal organs or other tissues, such as blood vessels and fetuses.<\/p>\n<p>Research shows that ultrasound is safe for use in prenatal care and effective in identifying issues early in pregnancy. However, <a href=\"https:\/\/www.who.int\/reproductivehealth\/publications\/maternal-mortality-2000-2017\/en\/\">more than half<\/a> of all birthing parents in low-to-middle-income countries don\u2019t receive ultrasounds, in part due to a shortage of expertise in reading them. We believe that Google\u2019s expertise in machine learning can help close this gap, allowing for healthier pregnancies and better outcomes for parents and babies.<\/p>\n<p>We are working on foundational, open-access <a href=\"https:\/\/arxiv.org\/abs\/2203.10139\">research<\/a> <a href=\"https:\/\/arxiv.org\/abs\/2203.11903\">studies<\/a> that validate the use of AI to help providers conduct ultrasounds and perform assessments. We\u2019re excited to partner with Northwestern Medicine to further develop and test these models so they\u2019re more generalizable across different levels of experience and technologies. 
With more automated and accurate evaluations of maternal and fetal health risks, we hope to lower barriers and help people get timely care in the right settings.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>To learn more about the health efforts we shared at <a href=\"https:\/\/health.google\/\">The Check Up with Google Health<\/a>, check out <a href=\"https:\/\/blog.google\/technology\/health\/check-up-consumer-developments-2022\/\">this blog post<\/a> from our Chief Health Officer Dr. Karen DeSalvo. And stay tuned for more health-related research milestones from us.<\/p>\n<\/div>\n<\/div>\n<\/div>\n<p><a href=\"https:\/\/blog.google\/technology\/health\/check-up-ai-developments-2022\/\">Go to Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: Over the years, teams across Google have focused on how technology \u2014 specifically artificial intelligence and hardware innovations \u2014 can improve access to high-quality, [&hellip;] <span class=\"read-more-link\"><a class=\"read-more\" href=\"https:\/\/www.aiproblog.com\/index.php\/2022\/03\/24\/the-check-up-our-latest-health-ai-developments\/\">Read 
More<\/a><\/span><\/p>\n","protected":false},"author":1,"featured_media":474,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[24],"tags":[],"_links":{"self":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/5511"}],"collection":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/comments?post=5511"}],"version-history":[{"count":0,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/5511\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media\/470"}],"wp:attachment":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media?parent=5511"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/categories?post=5511"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/tags?post=5511"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}