{"id":6206,"date":"2023-01-05T18:35:00","date_gmt":"2023-01-05T18:35:00","guid":{"rendered":"https:\/\/www.aiproblog.com\/index.php\/2023\/01\/05\/simulating-discrimination-in-virtual-reality\/"},"modified":"2023-01-05T18:35:00","modified_gmt":"2023-01-05T18:35:00","slug":"simulating-discrimination-in-virtual-reality","status":"publish","type":"post","link":"https:\/\/www.aiproblog.com\/index.php\/2023\/01\/05\/simulating-discrimination-in-virtual-reality\/","title":{"rendered":"Simulating discrimination in virtual reality"},"content":{"rendered":"<p>Author: Alex Shipps | MIT CSAIL<\/p>\n<div>\n<p>Have you ever been advised to \u201cwalk a mile in someone else\u2019s shoes?\u201d Considering another person\u2019s perspective can be a challenging endeavor \u2014 but recognizing our errors and biases is key to building understanding across communities. By challenging our preconceptions, we confront prejudice, such as racism and xenophobia, and potentially develop a more inclusive perspective about others.<\/p>\n<p>To assist with perspective-taking, MIT researchers have developed \u201cOn the Plane,\u201d a virtual reality role-playing game (VR RPG) that simulates discrimination. In this case, the game portrays xenophobia directed against a Malaysian American woman, but the approach can be generalized. Situated on an airplane, players can take on the role of characters from different backgrounds, engaging in dialogue with others while making in-game choices in response to a series of prompts. In turn, players\u2019 decisions control the outcome of a tense conversation between the characters about cultural differences.<\/p>\n<p>As a VR RPG, \u201cOn the Plane\u201d encourages players to take on, in the first person, roles that may be outside of their personal experiences, allowing them to confront in-group\/out-group bias by incorporating new perspectives into their understanding of different cultures. 
Players engage with three characters: Sarah, a first-generation Muslim American of Malaysian ancestry who wears a hijab; Marianne, a white woman from the Midwest with little exposure to other cultures and customs; and a flight attendant. Sarah represents the out-group, Marianne is a member of the in-group, and the flight staffer is a bystander witnessing an exchange between the two passengers.<\/p>\n<p>\u201cThis project is part of our efforts to harness the power of virtual reality and artificial intelligence to address social ills, such as discrimination and xenophobia,\u201d says Caglar Yildirim, an MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) research scientist who is a co-author and co-game designer on the project. \u201cThrough the exchange between the two passengers, players experience how one passenger&#8217;s xenophobia manifests itself and how it affects the other passenger. The simulation engages players in critical reflection and seeks to foster empathy for the passenger who was \u2018othered\u2019 due to her outfit being not so \u2018prototypical\u2019 of what an American should look like.\u201d<\/p>\n<p>Yildirim worked alongside the project\u2019s principal investigator, D. Fox Harrell, MIT professor of digital media and AI at CSAIL, the Program in Comparative Media Studies\/Writing (CMS), and the Institute for Data, Systems, and Society (IDSS) and founding director of the MIT Center for Advanced Virtuality. \u201cIt is not possible for a simulation to give someone the life experiences of another person, but while you cannot \u2018walk in someone else\u2019s shoes\u2019 in that sense, a system like this can help people recognize and understand the social patterns at work when it comes to issues like bias,\u201d says Harrell, who is also co-author and designer on this project. 
\u201cAn engaging, immersive, interactive narrative can also impact people emotionally, opening the door for users\u2019 perspectives to be transformed and broadened.\u201d\u00a0<\/p>\n<p>This simulation also utilizes an interactive narrative engine that creates several options for responses to in-game interactions based on a model of how people are categorized socially. The tool grants players a chance to alter their standing in the simulation through their reply choices to each prompt, affecting their affinity toward the other two characters. For example, if you play as the flight attendant, you can react to Marianne&#8217;s xenophobic expressions and attitudes toward Sarah, changing your affinities. The engine will then provide you with a different set of narrative events based on your changes in standing with others.<\/p>\n<p>To animate each avatar, \u201cOn the Plane\u201d incorporates artificial intelligence knowledge representation techniques controlled by probabilistic finite state machines, a tool commonly used in machine learning systems for pattern recognition. With the help of these machines, characters\u2019 body language and gestures are customizable: if you play as Marianne, the game will customize her mannerisms toward Sarah based on user inputs, impacting how comfortable she appears in front of a member of a perceived out group. 
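To make the gesture mechanism above concrete, here is a minimal, hypothetical sketch of a probabilistic finite state machine driving avatar body language: states stand for postures, and an affinity score toward the other character reweights the transition probabilities before sampling. All state names, weights, and the affinity adjustment are invented for illustration and are not taken from the \u201cOn the Plane\u201d codebase.

```python
import random

# Hypothetical body-language states for an avatar; not from the actual game.
STATES = ["relaxed", "guarded", "avoidant"]

# Base transition weights: BASE[s] lists weights for moving from state s
# to each state in STATES order.
BASE = {
    "relaxed":  [0.7, 0.2, 0.1],
    "guarded":  [0.3, 0.5, 0.2],
    "avoidant": [0.1, 0.4, 0.5],
}

def step(state: str, affinity: float, rng: random.Random) -> str:
    """Sample the next body-language state.

    affinity in [-1, 1]: higher affinity toward the other character shifts
    probability mass toward 'relaxed'; lower affinity shifts it toward
    'avoidant'.
    """
    weights = list(BASE[state])
    weights[0] *= 1.0 + max(affinity, 0.0)   # friendlier -> more 'relaxed'
    weights[2] *= 1.0 + max(-affinity, 0.0)  # hostile -> more 'avoidant'
    return rng.choices(STATES, weights=weights, k=1)[0]

# Simulate five gesture updates for a character with low affinity.
rng = random.Random(0)
state = "guarded"
trace = []
for _ in range(5):
    state = step(state, affinity=-0.8, rng=rng)
    trace.append(state)
print(trace)
```

Because the machine is probabilistic rather than scripted, the same player choices can yield varied but statistically consistent mannerisms, which is one plausible reason to prefer a PFSM over a deterministic animation state machine in this kind of simulation.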
Players can do the same from Sarah\u2019s or the flight attendant\u2019s point of view.<\/p>\n<p>In a 2018 <a href=\"https:\/\/daphnis.wbnusystem.net\/~wbplus\/websites\/AD2902892\/files\/analysis_-_sengun_and_harrell.pdf\"><u>paper<\/u><\/a> based on work done in a collaboration between MIT CSAIL and the Qatar Computing Research Institute, Harrell and co-author Sercan \u015eeng\u00fcn advocated for virtual system designers to be more inclusive of Middle Eastern identities and customs. They claimed that if designers allowed users to customize virtual avatars to be more representative of their background, it might empower players to engage in a more supportive experience. Four years later, \u201cOn the Plane\u201d accomplishes a similar goal, incorporating a Muslim\u2019s perspective into an immersive environment.<\/p>\n<p>\u201cMany virtual identity systems, such as avatars, accounts, profiles, and player characters, are not designed to serve the needs of people across diverse cultures. We have used statistical and AI methods in conjunction with qualitative approaches to learn where the gaps are,\u201d they note. \u201cOur project helps engender perspective transformation so that people will treat each other with respect and enhanced understanding across diverse cultural avatar representations.\u201d<\/p>\n<p>Harrell and Yildirim\u2019s work is part of the MIT IDSS\u2019s Initiative on Combatting Systemic Racism (ICSR). 
Harrell is on the initiative\u2019s steering committee and is the leader of the newly forming <a href=\"https:\/\/idss.mit.edu\/research\/collaborations\/icsr\/icsr-project-teams\/\"><u>Antiracism, Games, and Immersive Media<\/u><\/a> vertical, which studies behavior, cognition, social phenomena, and computational systems related to race and racism in video games and immersive experiences.<\/p>\n<p>The researchers\u2019 latest project is part of the ICSR\u2019s broader goal to launch and coordinate cross-disciplinary research that addresses racially discriminatory processes across American institutions. Using big data, members of the research initiative develop and employ computing tools that drive racial equity. Yildirim and Harrell accomplish this goal by depicting a frequent, problematic scenario that illustrates how bias creeps into our everyday lives.<\/p>\n<p>\u201cIn a post-9\/11 world, Muslims often experience ethnic profiling in American airports. \u2018On the Plane\u2019 builds off of that type of in-group favoritism, a well-established finding in psychology,\u201d says MIT Professor Fotini Christia, director of the Sociotechnical Systems Research Center (SSRC) and associate director of IDSS. \u201cThis game also takes a novel approach to analyzing hardwired bias by utilizing VR instead of field experiments to simulate prejudice. Excitingly, this research demonstrates that VR can be used as a tool to help us better measure bias, combating systemic racism and other forms of discrimination.\u201d<\/p>\n<p>\u201cOn the Plane\u201d was developed on the Unity game engine using the XR Interaction Toolkit and Harrell\u2019s Chimeria platform for authoring interactive narratives that involve social categorization. The game will be deployed for research studies later this year on both desktop computers and the standalone, wireless Meta Quest headsets. 
A <a href=\"https:\/\/caglar.mit.edu\/sites\/default\/files\/documents\/IEEE_AIVR_22_Proceedings___On_the_Plane.pdf\" target=\"_blank\" rel=\"noopener\">paper on the work<\/a> was presented in December at the <a href=\"https:\/\/aivr.science.uu.nl\/aivr2022_program.html\" target=\"_blank\" rel=\"noopener\">2022 IEEE International Conference on Artificial Intelligence and Virtual Reality<\/a>.<\/p>\n<\/div>\n<p><a href=\"https:\/\/news.mit.edu\/2023\/simulating-discrimination-virtual-reality-0105\">Go to Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: Alex Shipps | MIT CSAIL Have you ever been advised to \u201cwalk a mile in someone else\u2019s shoes?\u201d Considering another person\u2019s perspective can be [&hellip;] <span class=\"read-more-link\"><a class=\"read-more\" href=\"https:\/\/www.aiproblog.com\/index.php\/2023\/01\/05\/simulating-discrimination-in-virtual-reality\/\">Read More<\/a><\/span><\/p>\n","protected":false},"author":1,"featured_media":460,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_bbp_topic_count":0,"_bbp_reply_count":0,"_bbp_total_topic_count":0,"_bbp_total_reply_count":0,"_bbp_voice_count":0,"_bbp_anonymous_reply_count":0,"_bbp_topic_count_hidden":0,"_bbp_reply_count_hidden":0,"_bbp_forum_subforum_count":0,"footnotes":""},"categories":[24],"tags":[],"_links":{"self":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/6206"}],"collection":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/comments?post=6206"}],"version-history":[{"count":0,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/posts\/6206\/revisions
"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media\/466"}],"wp:attachment":[{"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/media?parent=6206"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/categories?post=6206"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.aiproblog.com\/index.php\/wp-json\/wp\/v2\/tags?post=6206"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}