{"id":4504,"date":"2021-03-22T15:00:00","date_gmt":"2021-03-22T15:00:00","guid":{"rendered":"https:\/\/www.aiproblog.com\/index.php\/2021\/03\/22\/what-drives-nithya-sambasivans-fight-for-fairness\/"},"modified":"2021-03-22T15:00:00","modified_gmt":"2021-03-22T15:00:00","slug":"what-drives-nithya-sambasivans-fight-for-fairness","status":"publish","type":"post","link":"https:\/\/www.aiproblog.com\/index.php\/2021\/03\/22\/what-drives-nithya-sambasivans-fight-for-fairness\/","title":{"rendered":"What drives Nithya Sambasivan\u2019s fight for fairness"},"content":{"rendered":"<p>Author: <\/p>\n<div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p>When Nithya Sambasivan was finishing her undergraduate degree in engineering, she felt slightly unsatisfied. \u201cI wanted to know, \u2018how will the technology I build impact people?\u2019\u201d she says. Luckily, she would soon discover the field of Human-Computer Interaction (HCI) and pursue her graduate degrees.\u00a0<\/p>\n<p>She completed her master\u2019s and PhD in HCI, focusing on technology design for low-income communities in India. \u201cI worked with sex workers, slum communities, microentrepreneurs, fruit and vegetable sellers on the streetside&#8230;\u201d she says. \u201cI wanted to understand what their values, aspirations and struggles are, and how we can build with them in mind.\u201d\u00a0<\/p>\n<p>Today, Nithya is the founder of the HCI group at the Google Research India lab and an HCI researcher at <a href=\"https:\/\/pair.withgoogle.com\/\">PAIR<\/a>, a multidisciplinary team at Google that explores the human side of AI by doing fundamental research, building tools, creating design frameworks, and working with diverse communities. 
She recently sat down to answer some of our questions about her journey to researching responsible AI and fairness, and her work championing historically underrepresented technology users.<\/p>\n<p><b>How would you explain your job to someone who isn&#8217;t in tech?<\/b><\/p>\n<p>I\u2019m a human-computer interaction (HCI) researcher, which means I study people to better understand how to build technology that works for them. There\u2019s been a lot of focus in the research community on building AI systems and the possibility of positively impacting the lives of billions of people. I focus on human-centered, responsible AI, specifically looking for ways it can empower communities in the Global South, where over 80% of the world\u2019s population lives. Today, my research outlines a road map for fairness research in India, calling for re-contextualizing datasets and models while empowering communities and enabling an entire fairness ecosystem.<\/p>\n<p><b>What originally inspired your interest in technology?\u00a0<\/b><\/p>\n<p>I grew up in a middle-class family, the younger of two daughters from the South of India. My parents have very progressive views about gender roles and independence, especially in a conservative society \u2014 this definitely influenced what and how I research: things like gender, caste and poverty. In school, I started off studying engineering, which is a conventional path in India. 
Then, I went on to focus on HCI and designing with my own and other under-represented communities around the world.<\/p>\n<\/div>\n<\/div>\n<div class=\"block-image_full_width\">\n<div class=\"article-module h-c-page\">\n<div class=\"h-c-grid\">\n<figure class=\"article-image--large h-c-grid__col h-c-grid__col--6 h-c-grid__col--offset-3 \"><img decoding=\"async\" alt=\"Nithya smiling at a small child while working in the field.\" src=\"https:\/\/storage.googleapis.com\/gweb-uniblog-publish-prod\/images\/2017-04-059.max-1000x1000.jpg\"><\/figure>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"block-paragraph\">\n<div class=\"rich-text\">\n<p><b>How do Google\u2019s\u00a0<a href=\"https:\/\/ai.google\/principles\/\">AI Principles<\/a>\u00a0inform your research? And how do you approach your research in general?<\/b><\/p>\n<p>Context matters. A general theory of algorithmic fairness cannot be based on \u201cWestern\u201d populations alone. My general approach is to research an important long-term, foundational problem. For example, our research on\u00a0<a href=\"https:\/\/arxiv.org\/pdf\/2101.09995.pdf\">algorithmic fairness<\/a>\u00a0reframes the conversation on ethical AI away from focusing mainly on Western, meaning largely European or North American, perspectives. Another\u00a0<a href=\"https:\/\/storage.googleapis.com\/pub-tools-public-publication-data\/pdf\/0d556e45afc54afeb2eb6b51a9bc1827b9961ff4.pdf\">project<\/a>\u00a0revealed that AI developers have historically focused more on the model \u2014 or algorithm \u2014 than on the data. Both deeply affect eventual AI performance, so focusing on only one aspect creates downstream problems. For example, data sets may entirely miss sub-populations, so models trained on them may have much higher error rates or be unusable when deployed. 
Or they could make outcomes worse for certain groups, by misidentifying them as suspects for crimes or erroneously denying them bank loans they should receive.\u00a0<\/p>\n<p>These insights not only enable AI systems to be better designed for under-represented communities; they also generate new considerations in the field of computing for humane and inclusive data collection, gender and social status representation, and the privacy and safety needs of the most vulnerable. They are then incorporated into Google products that millions of people use, such as\u00a0<a href=\"https:\/\/blog.google\/around-the-globe\/google-asia\/making-privacy-personal-files-google\/\">Safe Folder<\/a>\u00a0on Files Go,\u00a0<a href=\"https:\/\/timesofindia.indiatimes.com\/gadgets-news\/google-brings-incognito-mode-to-google-go-app-heres-what-it-means-for-you\/articleshow\/71743648.cms\">Google Go<\/a>\u2019s incognito mode,\u00a0<a href=\"https:\/\/techcrunch.com\/2018\/05\/30\/google-launches-neighbourly\/\">Neighbourly<\/a>\u2019s privacy protections,\u00a0<a href=\"https:\/\/india.googleblog.com\/2019\/06\/get-alerts-with-google-maps-if-your.html\">Stay Safer<\/a>\u00a0by Google Maps and\u00a0<a href=\"https:\/\/india.googleblog.com\/2018\/10\/towards-gender-equity-in-stem-through.html\">Women in STEM<\/a>\u00a0videos.\u00a0<\/p>\n<p><b>What are some of the questions you\u2019re seeking to answer with your work?<\/b><\/p>\n<p>How do we challenge inherent \u201cWest\u201d-centric assumptions in algorithmic fairness and tech norms, and make AI work better for people around the world?<\/p>\n<p>For example, there\u2019s an assumption that algorithmic biases can be fixed by adding more data from different groups. But in India, we&#8217;ve found that data can&#8217;t always represent individuals or events, for many different reasons like economics and access to devices. The data could come mostly from middle-class Indian men, since they\u2019re more likely to have internet access. 
This means algorithms will work well for them. Yet over half the population \u2014 primarily women, rural and tribal communities \u2014 lacks access to the internet and is left out. Caste, religion and other factors can also contribute to new biases in AI models.\u00a0<\/p>\n<p><b>How should aspiring AI thinkers and future technologists prepare for a career in this field?\u00a0<\/b><\/p>\n<p>It\u2019s really important that Brown and Black people enter this field. We not only bring technical skills but also lived experiences and values that are so critical to the field of computing. Our communities are the most vulnerable to AI interventions, so it\u2019s important we shape and build these systems. To members of this community: Never play small or let someone make you feel small. Involve yourself in the political, social and ecological aspects of the invisible, not in tech innovation alone. We can\u2019t afford not to.<\/p>\n<\/div>\n<\/div>\n<\/div>\n<p><a href=\"https:\/\/blog.google\/technology\/research\/what-drives-nithya-sambasivans-fight-for-fairness\/\">Go to Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: When Nithya Sambasivan was finishing her undergraduate degree in engineering, she felt slightly unsatisfied. 
\u201cI wanted to know, \u2018how will the technology I build [&hellip;] <span class=\"read-more-link\"><a class=\"read-more\" href=\"https:\/\/www.aiproblog.com\/index.php\/2021\/03\/22\/what-drives-nithya-sambasivans-fight-for-fairness\/\">Read More<\/a><\/span><\/p>\n","protected":false},"author":1,"featured_media":4505,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[24],"tags":[],"_links":{"self":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/4504"}],"collection":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/comments?post=4504"}],"version-history":[{"count":0,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/4504\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media\/4505"}],"wp:attachment":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media?parent=4504"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/categories?post=4504"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/tags?post=4504"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}