Blog

Call for Code® 2019

Creating and deploying open source technologies to tackle some of the world's biggest challenges

When the tsunami hit Japan in 2011, people with disabilities died at twice the rate of the larger population. Some of them couldn’t hear the early warning system. Some of them couldn’t evacuate to higher ground.

That kind of heightened risk exists any time a disaster occurs, with fatality rates among people with disabilities running as high as four times those of the rest of the population. According to a 2013 report from the UN, “people living with disabilities across the world say they are rarely consulted about their needs and only 20 percent could evacuate immediately without difficulty in the event of a sudden disaster event, the remainder could only do so with a degree of difficulty and 6 percent would not be able to do so at all.”

The increased risk doesn’t cease when a natural disaster stops. In the aftermath of disasters, individuals with disabilities are significantly more affected by the loss of infrastructure. They are also more likely to have difficulty accessing the support information coming from government and aid organizations. Broadcasts may not be subtitled. Printed material may not be available electronically.

The impact of a disaster is long lasting, too. Seven years after the Tohoku earthquake and tsunami, 75,000 of the people evacuated had yet to return home, with a third of these still living in temporary housing.

Last year, IBM and David Clark Cause launched the Call for Code Global Challenge to help mitigate the effects of disasters. This multi-year initiative asks developers to create technology-based solutions that help communities prepare for and recover from natural disasters. In 2018, more than 100,000 developers across 156 nations answered the call.

This year’s Call for Code has an emphasis on individual and community well-being, with a focus on vulnerable populations. The United Nations and the World Health Organization have recognized the heightened challenges that people with disabilities face in emergency situations. They also have some key advice for how to ensure no one gets left behind. With today marking the eighth Global Accessibility Awareness Day, it is a good reminder of how disproportionately individuals with disabilities are affected by natural disasters, and of how developers participating in Call for Code can find ways to reduce these staggering numbers.

Integrate disability perspectives

The UN sets out some clear guidelines for how to reduce the risk to citizens with disabilities. That advice focuses on inclusion.

In the context of disasters, people with disabilities need to be involved in all phases of disaster risk reduction. This begins with planning. The UN-sponsored response to the Japanese earthquake identified three core indicators of whether disability perspectives were integrated into planning:

  • Were people with disabilities involved in the risk-reduction plans?
  • Does training for service personnel incorporate disability considerations?
  • What proportion of emergency shelters and sites are accessible?

Those familiar with user-centered design will understand the importance of putting the user at the core of a solution. If you’re trying to solve challenges faced by individuals with disabilities, you need to involve those individuals so you understand what the challenges are, and whether a proposed solution actually helps.

But when you’re part of a technology team working within the constraints of a hackathon, the tight timeline and pressure to produce a workable prototype can crowd out keeping the user at the heart of the discussion. A good example of this occurred at the 2018 Call for Code ViaTech Hackathon in Paris.

A real world example

The Joiyce team came up with an innovative idea for gesture recognition, using IBM Watson™ APIs to facilitate American Sign Language (ASL) interpretation. It was an impressive technical accomplishment: in one day, they created a prototype from scratch that could translate a small set of ASL gestures into spoken text.
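The team hasn't published implementation details, but the core of such a prototype, mapping observed hand keypoints to a known gesture before voicing it, can be sketched as nearest-neighbor matching against labeled templates. Everything below (the keypoint values, the gesture labels, the similarity threshold) is invented for illustration:

```python
import math

# Hypothetical hand-labeled templates: each gesture is a flattened
# vector of (x, y) hand-keypoint coordinates. A real system would get
# keypoints from a pose-estimation model or a vision API.
TEMPLATES = {
    "HELLO": [0.1, 0.9, 0.2, 0.8, 0.3, 0.7],
    "THANK_YOU": [0.9, 0.1, 0.8, 0.2, 0.7, 0.3],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def classify_gesture(keypoints, threshold=0.95):
    """Return the best-matching gesture label, or None when no template
    is similar enough -- so ambiguous signs aren't voiced incorrectly."""
    best_label, best_score = None, threshold
    for label, template in TEMPLATES.items():
        score = cosine_similarity(keypoints, template)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# An observed keypoint vector close to the HELLO template:
print(classify_gesture([0.12, 0.88, 0.21, 0.79, 0.29, 0.71]))  # HELLO
```

Note how much this sketch leaves out, which is exactly the gap the accessibility reviewers identified: it looks only at hand positions, with no notion of facial expression or movement over time.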

The only problem? Hand gestures are only part of the information that makes up communication in ASL. “I do see potential with this…but one of the key components of ASL is facial expressions. They should be utilizing native ASL users to test out their concept,” says Brent Shiver, a Deaf developer with IBM Accessibility Design.

It’s an assessment that Joiyce team leader Daryl Autar acknowledges. “We were quite isolated from our personal network at the hackathon venue in Paris. Due to the 24-hour deadline we had to learn basics in American Sign Language and the points of improvement of our approach by ourselves, all overnight,” says Autar.

What he’s learned from the experience is that good intentions have to be worked into the prep work, especially in the environment of hackathons, where teams are still dreaming up ideas on the car ride to the event.

“Winning this challenge laid the foundation for our AI-for-good start-up Wavy Assistant, consisting of former team Joiyce members. Now, one of Wavy Assistant’s key factors is that we involve experience experts/patients/customers in the challenges we are trying to solve. I strongly believe this contributed to our win later that year at the TechCrunch Disrupt San Francisco Hackathon,” says Autar.

So how does a team find the balance between the technical innovation emphasized at a hackathon and the UN call to be inclusive? In a constrained cycle, how do you ensure the rigor of interacting with sponsor users in order to define and properly understand needs? How do you create a solution that actually has benefits?

Consider approaching local advocacy groups and asking them about existing disaster response plans. That can surface a number of possible topics. Such a step will also likely reveal some individuals who would be interested in participating. The UN also offers some suggestions on places where universal design principles can be concentrated, such as:

  • General infrastructure development
  • Risk assessment
  • Preparedness planning
  • Drills
  • Early warning systems
  • Search and rescue systems
  • Emergency shelters
  • Temporary housing

The UN specifically has a call to “use innovative technologies, such as those for crowdsourcing, to enhance the chances of survival in the face of disaster.” With the variety of Watson APIs available today, the opportunities to unite the needs of vulnerable populations with technical innovation have never been greater. For instance, IBM’s Accessibility Research team has been focusing on how AI and machine learning can “remove obstacles and create better experiences for the estimated 1 billion people in the world with some kind of disability.”

After learning more about disabilities and talking with advocacy groups, it’s time to consider the technical aspects. Some ideas to get you started: try the Facial Emotion Classifier API for individuals who may be unable to convey their emotions through language because of a disability compounded by PTSD. Or adapt the code pattern on creating a web app that estimates age from detected human faces, to help rescue aid workers quickly determine the ages of individuals who cannot communicate. You might also want to check out how to build a real-life robot assistant that could help support individuals.

Further reading on the issue:

Michael Gower is a Senior Consultant in Accessibility with IBM Design, responsible for IBM's Accessibility Checklist. He is an active member of the W3C Accessibility Guidelines Working Group, which published WCAG 2.1. Michael has 20 years of experience in the field, including an extensive background in providing needs assessments, support, and training for persons with disabilities.