Growing the field of engineering ethics education and research as a community
Date: 24-26 March…
Student learning has been deeply impacted by the COVID-19 pandemic worldwide. Although the pandemic seems to be subsiding, its long-term effect on education and learning has persisted. Combined with advances in AI-based applications for learning, students continue to live in “uncertain times” and grapple with concerns about wellbeing, the value of their education, and the ethics of using new technologies. While there is certainly uncertainty in students’ learning experiences, I have found that talking to engineering students directly and asking them to reflect on their experiences has helped me better understand and make sense of my own. It has also proved to be fertile ground for my doctoral studies and research.
I started my doctoral studies right as the pandemic took hold in the United States. As a student working with Dr. Aditya Johri in the TrailsLab at George Mason University, I was presented with the opportunity to investigate student sensemaking of the world through research into the use of role-plays for teaching technology ethics. Over the next two years, I moderated and studied how historical (i.e., the Boeing 737 MAX disaster) and fictional (i.e., facial recognition (FR) on a college campus) cases helped students take on different perspectives and address the micro, meso, and macro ethical concerns at play. I found that semi-structured discussions allowed students to transfer their learning from the abstract to something personal that affected them. In the case of facial recognition, students would often highlight navigating the tension between their role’s perception of FR and their own acceptance of it. Through my work with my collaborators, I explored and assessed role-plays across courses and institutions and tested innovative ideas, such as using concept maps in a pre-post intervention assessment.
While continuing to research role-plays, I also conversed with students about their experiences transitioning to online learning. As a student myself, I sat through poorly migrated online exams during the abrupt switch to remote instruction. Through these experiences and through reflecting with other students, it was clear that online assessment systems, especially those relying on AI facial recognition, were being rushed into practice. This led to a study on students’ perspectives on technology adoption. Students are rarely privy to licensing agreements, nor do they have much say in the adoption of new technology, such as the online proctoring software that became a default approach for moving courses online. Not surprisingly, the research unearthed a host of negative sentiments about the software: some students highlighted being misidentified, while others emphasized that a quiet, undisturbed space is a luxury not everyone can access. Students expressed unease and uncertainty about what is visible, audible, or implied in the video recordings. These experiences have led me to think ahead and delve deeper into understanding the role of AI in shaping engineering education. As a start, I believe it is essential to better understand students’ conceptions of AI itself and what leads to them. An initial study exploring this question among first- and second-year engineering students found that students often struggled to define AI, falling back on tropes and images from science fiction, such as a physical machine or robot. Many students cited The Terminator as the source of their AI definition. Continuing this work, I am excited to explore the role of popular narratives in framing students’ understanding of AI.
Among the uncertainties of the future of education, engineering, and the intersections of all the identities in between, I have found these dialogues to significantly shape my worldview as a researcher. I have also learned much from my peers in the interdisciplinary Information Science and Technology program, who work with data through language processing, cybersecurity, socio-technical systems, and human-computer interaction. As an early-career researcher, I believe data is an extension of the people it represents. This sentiment resonates across our work, and that relationship warrants protection and careful design.
Ultimately, I expect our beliefs about engineering education will be challenged as new technology fiercely tests our values and norms. I remain convinced that we must make a concerted effort to talk to students rather than simply treat them as amorphous data points. By doing so, I hope we may collaboratively shape engineering curricula and teaching practices in a language that resonates with students and honors the intricate relationships between engineering, technology, and people. I also hope this leads to conversations about engineers’ social and ethical obligations becoming a part of every course session rather than a space-filling discussion in week 13. However, the challenge of the modern classroom is that talk is not cheap; the interactions I describe become fleeting as course size, reliance on technology, and the distance between instructor and student increase. Technology can ultimately support our work, but we must facilitate the discussion and ask, not tell, to ensure we are on track. I am certain we can do so.
Acknowledgments:
I would like to thank the U.S. National Science Foundation Awards #204863, #1937950, and #1939105, and USDA/NIFA Award #2021-67021-35329 for partly supporting this work. Any opinions, findings, conclusions, or recommendations expressed are mine and do not necessarily reflect the views of the funding agencies. I would also like to thank my advisor, Dr. Aditya Johri, and my other collaborators on the project: Dr. Huzefa Rangwala, Dr. Alex Monea, and Dr. Cory Brozina.