In April, I was invited to speak on a panel for the Digital Technologies and Development event at Columbia SIPA. Below is an edited version of my remarks.
Since this panel is on “making digital technologies work for people and businesses,” I want to briefly discuss why we need to keep the human factor in mind when we think about making digital technologies work for everyone. To do this, I’ll share three examples focused on human-centered design in technology and civic innovation.
The first example is about education technology. When I was working in Hyderabad, India, I saw too many schools with unused computers and smartboards. Despite their best intentions, donors and businesses were trying to enter the low-income school market with technologies that were wholly inappropriate for the context. Keep in mind, these are schools with no internet, power outages for most of the day, low technology literacy, untrained teachers, and gender gaps. But the companies, big and small, that I saw trying to improve education through technology didn't consider these factors before entering the market, and they failed.
That's why we wrote a 60-page report describing the context and daily life at low-income schools in India, along with stakeholder profiles of students, teachers, and parents and their perceptions of technology. With this report, ed-tech start-ups could design their products with and for their users, maximizing the success and sustainability of their solutions. This work mattered because businesses and governments need to understand the users and contexts they're designing for.
This leads to my second example: a number of teams in government are embracing the human-centered design approach to improve government platforms, from the Federal government's 18F, the Lab@OPM, and the Department of Veterans Affairs, to Bloomberg Philanthropies' i-teams deployed to cities, to innovation teams in mayoral offices. They are unique, and currently the exception in government rather than the rule, because they're making a concerted effort to think about the user experience from start to finish in civic engagement:
- The i-team in New Orleans used design to improve the zoning application process.
- The Lab@OPM has spent over a year redesigning USAJobs.gov to improve the way people apply for Federal jobs.
- Boston uses text messaging as part of its participatory budgeting process, meeting voters where they're most comfortable (on their phones), and has also developed an app, Citizens Connect, that lets residents report and track public works requests.
By keeping the citizen experience in mind, government can improve citizen confidence, which is currently at an all-time low. That's because our country hasn't invested in or updated the ways we engage with democracy; we're interacting with 20th-century institutions in the 21st century. By understanding how citizens interact with technology, and designing solutions with them in mind, government can turn a lot of frustrating and complicated civic engagements (like the DMV, jury duty, and voting) into a more joyful experience for all.
None of this is possible if we don't integrate this human focus into technology careers in the public and private sectors, which is my third point. This is where the SIPA degree becomes integral. Alec Ross, a former SIPA Fellow and innovation advisor at the State Department, notes that humans aren't as easy to upgrade as software, but that to compete in this new economy fueled by technology and globalization, we have a lot of upgrading to do. That includes fostering interdisciplinary leaders. There's a greater emphasis on machine learning and algorithms in our work, but we also risk bias and unintended consequences when our technology ventures lack human oversight.
The most obvious recent example is Microsoft's artificial intelligence Twitter account, Tay, which quickly became racist and misogynist, and even a Trump and Hitler supporter. To maintain a human-centered focus, engineers can't work in isolation; they must collaborate with psychologists, sociologists, and political economists to understand the broader societal implications of their work. This also means we need more diversity in technology, among both women and minorities. Microsoft didn't build content filters into the chatbot, but I imagine that if its team had been more diverse, it could have avoided this embarrassing situation entirely.
Kentaro Toyama, a professor at the University of Michigan, says that "technology is an amplifier of human intent." It needs to be created by, with, and for humans who understand the socio-economic context they are developing solutions for. When public-private partnerships take on this work, they have to ensure that they incorporate an interdisciplinary approach that ultimately considers human needs, not just the partnership's end goals.