I have been doing some reading lately on the increasing role of empathy in computing and the world of software development. Termed empathic computing, this emerging field is about embedding elements of empathy (sensitivity to emotions, perspective-based design, etc.) so that software can interact with the user beyond the purely technical. Aspects of AI are often incorporated to enhance the user experience. Given the mass of information already published on this topic, as an educator and a tech person I wanted to find hooks in it that lead back to education. What can we do in our classrooms and schools to engage students with this topic?
To better organize my thoughts, I have divided the information I found into these rough categories. While I am sure all of this is just scratching the surface, I am documenting here the pieces I thought were noteworthy.
- Why is empathy an important component of computing?
- How is empathy based computing taking shape in the tech around us?
- What can we as teachers do in our classrooms to help students engage with empathic computing?
Why is empathy an important component of future computing?
In some ways, existing trends like virtual role-playing games, 360-degree videos, and even Google's Street View are already examples of immersive computing. The user gets a much richer sense of a location and space than from, say, a static image or even a flat video. But while these are certainly immersive, how much of them can be classified as empathic? Is there an imaginative extension to them?
Voice-driven technology (like Amazon's Alexa-powered Echo, Google Home, etc.) is on the rise. We are literally talking to our devices, telling them what we want them to do. This very human mode of interaction in itself invites several aspects of empathy. In her 2017 article in MIT Technology Review, Rana el Kaliouby (CEO and founder of Affectiva, an emotion measurement technology company) makes the case for why. She puts it in the following words.
“What if, instead, these technologies—smart speakers, autonomous vehicles, television sets, connected refrigerators, mobile phones—were aware of your emotions? What if they sensed nonverbal behavior in real time? Your car might notice that you look tired and offer to take the wheel. Your fridge might work with you on a healthier diet. Your wearable fitness tracker and TV might team up to get you off the couch. Your bathroom mirror could sense that you’re stressed and adjust the lighting while turning on the right mood-enhancing music. Mood-aware technologies would make personalized recommendations and encourage people to do things differently, better, or faster.”
While this sounds quite promising (borderline fiction, quite frankly), recent events involving smart tech have raised valid concerns about its accuracy and reliability. That said, I do agree with the points she makes about its potential uses in automotive, education, health care, and communication.
Another WIRED article, “Why computers need empathy for the human condition”, discusses the work of Daniel McDuff and Robert Morris.
“Emotions matter. They make us feel alive, they filter what we remember about experiences, they influence what we choose to do and where we choose to go.”
They have built a device called “Pavlov Poke” which is designed to pull users away from distracting websites by literally giving them a small electric jolt via the keyboard!
A research paper from Carnegie Mellon University, titled "Empathic Computing", highlights the potential use of empathy-based systems in a home environment. Using sensors, the systems discussed in the paper range from health care for the elderly to monitoring systems that display hot spots on the sensor web. Such technology could support all kinds of monitoring systems involving humans (toddlers and infants included) who may need careful, extra assistance. The paper also goes into the physics and math of such a network, explaining the base calculations and hardware circuits needed for it to function well. Its ability to track and detect falls and other abnormalities is a big part of the sell for why we need empathic computing at all.
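To make the monitoring use case concrete, here is a toy sketch of one common heuristic for fall detection from accelerometer data: look for a brief free-fall phase (acceleration magnitude near zero) followed by an impact spike. The thresholds, function names, and sample data below are my own illustrative assumptions, not taken from the paper.

```python
import math

# Illustrative thresholds (in units of g); real systems tune these carefully.
FREE_FALL_G = 0.4   # magnitude well below 1 g suggests free fall
IMPACT_G = 2.5      # magnitude well above 1 g suggests an impact

def magnitude(sample):
    """Euclidean magnitude of an (x, y, z) accelerometer sample, in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_fall(samples):
    """Return True if a free-fall phase is followed by an impact spike."""
    saw_free_fall = False
    for sample in samples:
        m = magnitude(sample)
        if m < FREE_FALL_G:
            saw_free_fall = True
        elif saw_free_fall and m > IMPACT_G:
            return True
    return False

# A person standing still reads roughly 1 g throughout; a fall shows a
# near-zero stretch followed by a sharp spike on impact.
standing = [(0.0, 0.0, 1.0)] * 5
fall = [(0.0, 0.0, 1.0), (0.1, 0.1, 0.2), (0.0, 0.1, 0.1), (1.8, 1.5, 2.0)]
```

A production system would add time windows, filtering, and follow-up checks (e.g., is the person motionless afterwards?), but even this simple sketch shows why the paper's math matters: the thresholds and sampling rates determine whether the network catches real falls without false alarms.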
Paul Balagot, Chief Experience Officer at Precision Effect (a company that works extensively with behavior-based systems), wonders in a 2017 article how affective computing can fill the empathy gap between machines and humans. Understandably, the primary focus at the moment for empathic computing to really make a difference seems to be health care.
How is empathy based computing taking shape in the tech around us?
I mentioned a couple of generic applications above, but let us dig deeper here. In his 2017 post, "The Coming Age of Empathic Computing", Mark Billinghurst, a Professor of Human Computer Interaction at the University of South Australia, discusses various emerging technologies aimed at incorporating aspects of empathy. He includes some interesting videos, like the one below on Microsoft's Holoportation project.
His comprehensive piece touches on various other products in the pipeline, like "Project Syria", a VR experience in which the audience is taken inside the markets of Syria and exposed to the experience of a terrorist attack. Elements from actual news footage are stitched together with VR elements to help the user gain some perspective. Work like this, in my opinion, goes beyond the rhetoric.
PhD student Danielle Olson has embarked on another interesting project at MIT. Working from the Computer Science and Artificial Intelligence Laboratory (CSAIL), she contributed to a project called "The Enemy", an immersive experience designed to evoke empathy for two soldiers on opposing sides of a conflict. Though the project had its roots elsewhere, she wrote algorithms to "analyze user's body language in different scenarios". Watch the teaser for the project below.
Finally, an article I recently came across in the Harvard Business Review documents what happened when Rise Science (a company specializing in helping professional athletes improve their performance) joined hands with IDEO (a design and innovation firm). Together, they analyzed user behavior with the data they had, then redesigned the UI to improve interactivity by paying attention to users' "functional, social and emotional behavior". This made all the difference in the world.
These are just a few examples of how empathic computing is taking shape. As the technology progresses and compatible hardware becomes available, many more potential uses will surface.
What can we as teachers do in our classrooms to help students engage with empathic computing?
Engaging with the Tech
Augmented and virtual reality devices and apps have been around for the past few years, and various ed tech companies are constantly coming out with ways to make learning science, math, language, history, and social studies more immersive. Getting kids to work with these devices and apps is a good start. As hardware becomes more affordable, it is worth the time and effort for schools to start purposefully investing (even as a pilot) in some of this technology so students get a chance to interact with it. The big advantage of doing this is that relevant and critical conversations can take place, from the alignment of such technology with pedagogy all the way to its visible impact on student learning and assessment. This is a good place to be for any school, since this is where serious evaluation of both the curriculum and the tech can happen.
Engaging with the grassroots expertise
Finding a platform for experienced professionals (preferably educators, not consultants or tech companies) to come and discuss how they are helping their students and schools work with empathic computing is another good strategy. Oftentimes they have valuable resources, tips, and cautionary tales to share that can guide schools in making the right choices. Networking on social media platforms like Twitter, where educators have a big presence, is one way to connect with peers who have been using VR devices in their classrooms.
Keeping an eye on the latest trends
While this is often cause for much anxiety (including a phrase I recently learned: "initiative fatigue"), if we expect our students to develop a mindset for empathic computing going forward, then keeping an eye on the trends matters. Tomorrow's developers of empathic computing are today's students in our classrooms. So it only makes sense for us as educators to bridge that gap by engaging with the right resources from both within and outside our communities.
The concept of infusing empathy into computing is still a growing field. Education often struggles with a lack of funding, so engaging with such potentially expensive infrastructure can feel daunting. But if the vision is clear and the intent is strong enough, there are several options out there that can help us, as teaching and learning communities, start a dialog with this aspect of tomorrow. What I have presented here is perhaps less than 1% of what is currently happening in this industry, so there are plenty of opportunities for all of us to learn, engage, and apply.
Recommended further reading
- The Empathic Engineer – A student blog from Australian National University.
- Google Empathy Lab founder: AI will upend storytelling and human-machine interaction
- What will become of empathy in a world of smart machines?
- Alexa and the age of casual rudeness