Household surveys, once a mostly low-tech activity, are becoming increasingly digital—with benefits for the quality and timeliness of IFPRI’s research results.

The digital divide can work in unexpected ways. When developing-country farmers want instant information on, for example, prices or weather, they increasingly turn to 21st-century technologies such as smartphones. But when food policy researchers want to find out what crops farmers grow, what children eat, or how much rural households spend, they have typically turned to much older technologies—pen and paper. Now researchers at IFPRI and elsewhere are joining a revolution in data collection: computer-assisted personal interviewing (CAPI).
Much of IFPRI’s research depends on surveys that uncover facts about the lives of poor households. To conduct these surveys, interviewers have typically gone door to door, asked questions, and written the answers on paper questionnaires. That was just the first step in a lengthy process that included collecting the questionnaires, hiring people to enter the answers into computers, “cleaning” the data by removing or correcting inaccurate information, and analyzing the data with statistical software. Months later, researchers had their results.
That era is coming to an end. With CAPI, a survey designer writes and programs a questionnaire and loads it onto portable devices such as tablets or netbooks, and the interviewers type respondents’ answers directly into the devices. CAPI eliminates a major step in the data collection process: staff are no longer needed to enter responses from each paper questionnaire into a computer—the information is already there. That saves time and money. Plus, survey managers can see the data immediately and make needed changes to questionnaires while the survey is ongoing.
The result, says Esteban Quiñones, an IFPRI senior research analyst, is “better-quality data collected—and available—much faster.”
Better Input, Better Output
So why didn’t data collectors adopt computer technology long ago? It is only recently, says Quiñones, that computers have become cheap, rugged, and mobile enough for this use and that suitable software has become available. And, he says, “many researchers are risk averse and aren’t willing to try something like this until the kinks have been worked out and the benefits have been demonstrated.”
And the benefits are many. Electronic questionnaires can capture more complex and detailed information, allowing for multiple versions of the questionnaire and customized questions that evolve as the interview proceeds. Survey designers can include photos and videos in the questionnaire to capture richer information. Supervisors can track the location of interviewers.
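The “customized questions that evolve as the interview proceeds” work through what survey programmers call skip logic, paired with on-the-spot validation of each answer. A minimal sketch of the idea in Python—the question names, wording, and validation rules here are illustrative assumptions, not drawn from any particular CAPI package:

```python
# Illustrative sketch of CAPI-style skip logic and answer validation.
# Question identifiers and plausibility rules are hypothetical examples.

QUESTIONS = {
    "has_children": "Do any children under five live in this household? (yes/no)",
    "child_age_months": "How old is the youngest child, in months?",
    "crops_grown": "Which crops did the household grow this season?",
}

def next_question(answers):
    """Return the next question id given answers so far (skip logic).

    The child-age question is asked only if the household reported
    children under five; otherwise the interview skips straight ahead.
    """
    if "has_children" not in answers:
        return "has_children"
    if answers["has_children"] == "yes" and "child_age_months" not in answers:
        return "child_age_months"
    if "crops_grown" not in answers:
        return "crops_grown"
    return None  # interview complete

def validate(question_id, value):
    """Reject implausible entries at the moment of data capture."""
    if question_id == "has_children":
        return value in ("yes", "no")
    if question_id == "child_age_months":
        return 0 <= int(value) <= 59
    return bool(value)
```

Because the device enforces these rules during the interview itself, many of the errors that “cleaning” paper data used to catch months later never enter the dataset at all.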
A number of IFPRI researchers—including Quiñones and fellow researchers Jef Leroy, Deanna Olney, and Susan Richter, led by Poverty, Health, and Nutrition Division Director Marie Ruel—are working on a CAPI survey measuring child malnutrition in Guatemala. To find out how well nutrition interventions are promoting child growth and development, the project involves interviewing pregnant women and then following up when their children are 1, 4, 6, 9, 12, 18, and 24 months old.
Richter, who is leading the field management of the survey, says these frequent follow-ups would have been nearly impossible without CAPI. “Infants and young children change quickly, and follow-up surveys that allow us to track their development at frequent intervals will help us better understand which nutrition interventions are making a difference to their health and growth,” she explains. “Conducting the frequent follow-up surveys on paper would have made including information from previous surveys difficult.” Now, says Richter, one week after a survey is done she can have information analyzed and ready to be included in the follow-up survey. This increases the quality of the data collected and cuts down on interviewer errors.
Senior Research Assistant Mike Murphy agrees. He is surveying the impact of a US Feed the Future initiative on rural incomes, agricultural productivity, and nutrition in Honduras. “With a paper survey, the interviewer would have to be on top of a lot of details,” he notes. “With CAPI, we can completely automate it.”
Of course, problems do occur with CAPI. Murphy had to deliver extra battery packs to the field when the interviewers’ tablet batteries ran out. Richter’s software program crashed because it didn’t recognize accent marks. She lost data, and her team had to implement the first few weeks of the survey on paper before figuring out how to correct the problem.
But problems can—and do—occur with the old-fashioned paper-and-pen interviews as well. Stories abound about lost or accidentally destroyed paper surveys—whether dropped in a fire or eaten by goats—as well as sloppy data entry, unreliable interviewers, and any number of other human errors.
CAPI does take time, effort, and money at the outset. Survey managers must choose and tailor the software program to run the survey, buy computer equipment for the survey staff, and train the interviewers. Sometimes a fix that is minor on paper is time-consuming and expensive with CAPI. And if a hard drive fails, there is no paper version to turn to.
Still, the technology, equipment, and user know-how are improving quickly, making CAPI increasingly easy to use. And ultimately, timelier data that more accurately reflect realities on the ground could pave the way to better policies.
For more information on this topic:
- Comparative Assessment of Software Programs for the Development of Computer-Assisted Personal Interview (CAPI) Applications, IRIS Center of the University of Maryland and World Bank, 2011—a study comparing CAPI software programs.
- A two-part discussion of CAPI on the World Bank’s Development Impact blog.