“What do you do?”
For years this common, seemingly simple question has given me pause. As the head of Abt’s Digital Delivery team, I spend a lot of time overseeing the technology solutions we build to support our research, monitoring and evaluation portfolio. Sometimes I say I am a “technology manager,” other times I am a “policy researcher.” In fact, I am both: I am a research technologist—as is much of my team.
Currently, specific expertise around implementing technology to support policy research and evaluation is not understood to be a field or an acknowledged career path or specialty. There is not a defined community of practice for research technologists, as there is for, say, health IT or even IT for international development (“ICT4D”). But, there should be.
These days, technology is an integral part of all aspects of research, monitoring and evaluation work. This is true at Abt Global, but also at our peer companies, universities, implementing organizations, foundations, and government agencies. And, there are a plethora of topics of particular interest to those implementing technology for policy research and evaluation.
Research technologists understand how to use technology to facilitate and enforce different types of research designs; process multiple data sources that may not be used in combination in other fields; navigate divergent approaches to data collection in the context of an ongoing program; and preserve research integrity, reproducibility, equity, and ethics. They also must be attuned to bias effects of digital solutions on research results, for example, recognizing in research design that SMS-based surveys get higher response rates from younger people and tend to exclude older demographics.
In addition to the areas of specific concern to technology in research, a research technologist should be well versed in a range of adjacent fields, including general survey research, health IT, human services management, data privacy and security, digital citizenship, grants management, and technical assistance, along with any technology trends that may be applied to policy work.
Ideally, research technologists should also have a keen understanding of the particular policy and program domains they are working in and of the context in which data are collected. They can then serve as translators between the “pure” researchers and the technologists. This includes not only conveying requirements to technical teams, but also helping researchers understand how leading-edge technologies—such as low-code platforms and data virtualization—can streamline study protocols, obtain better data, and analyze results. There are always new advances; research technologists stay on top of these and assess their potential for the field. At the same time, they appreciate the multiple constraints study teams are under and the various types of users who need to interact with these systems—and will tailor solutions accordingly, often hiding complex processing behind simple interfaces.
If we research technologists organized ourselves as a distinct field, there are several ways we could better collaborate and learn from each other. These include the following six general areas:
1. Facilitating and enforcing research designs through technology, including built-in rolling random assignment and service-tracking modules that prevent cross-contamination.
2. Solutions for all phases of research: outreach, eligibility determination, consent, baseline and follow-up surveys, service tracking, intervention modes and “dosages,” withdrawals, monitoring, and data analysis and dissemination.
3. Digital support for ethics, equity, scientific integrity, security, and compliance, including informed consent and other IRB requirements; built-in roles that distinguish between service and research users; managing withdrawals and post-withdrawal data-processing; enforcing OMB requirements for data collection form changes; FISMA, HIPAA, and FERPA compliance; accessibility/Section 508 Compliance; preventing algorithmic and analytic bias; and digital equity.
4. Distinct technology approaches and appropriate software tools based on the research need. This includes knowing when to build or buy a new system or leverage existing ones and when to use a centralized system or a distributed approach; strengths and weaknesses of different software packages in this space, and defining the key modules and features needed to support a full study process.
5. Implementing interventions and demonstrations technologically. This includes cases where the program being evaluated has a digital component (“digital interventions”), such as studies of the comparative effectiveness of online learning, mobile applications, AI-based detection, or televisits versus traditional approaches. In other cases, digital work is needed to implement a program, such as interfacing with a government system to adjust benefit amounts.
6. Data management and governance, including version control for data and tools, facilitating reproducibility of results; data dictionaries and modern data catalogs; coding of qualitative case notes; data quality review libraries; and integrating and de-conflicting survey data, administrative data, physical specimens, and case management data.
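To make the first area above concrete, here is a minimal sketch of what a rolling random assignment module with contamination prevention might look like. It is illustrative only, not a production randomization engine: the class name, the household-level clustering rule, and the block size are all assumptions I have chosen for the example. The core ideas it demonstrates are real, though: participants are assigned as they enroll (“rolling”), members of the same cluster (here, a household) always receive the same arm so they cannot cross-contaminate each other, a fixed seed keeps the sequence reproducible for research integrity, and blocked randomization keeps arm counts balanced over time.

```python
import random


class RollingAssigner:
    """Illustrative rolling random assignment at the cluster level.

    Members of the same cluster (e.g., a household) always share an
    arm, preventing within-cluster cross-contamination. A fixed seed
    makes the assignment sequence reproducible for audit purposes.
    """

    def __init__(self, arms=("treatment", "control"), seed=2024):
        self.arms = list(arms)
        self.rng = random.Random(seed)  # fixed seed -> reproducible sequence
        self.cluster_arm = {}           # cluster_id -> assigned arm
        self.block = []                 # current permuted block, drawn down as used

    def assign(self, participant_id, cluster_id):
        # Reuse the cluster's arm if any member has already enrolled.
        if cluster_id in self.cluster_arm:
            return self.cluster_arm[cluster_id]
        # Blocked randomization: when the block is empty, refill it with
        # a balanced, shuffled set of arms so counts stay even over time.
        if not self.block:
            self.block = self.arms * 2
            self.rng.shuffle(self.block)
        arm = self.block.pop()
        self.cluster_arm[cluster_id] = arm
        return arm


assigner = RollingAssigner()
a1 = assigner.assign("p001", "hh-17")
a2 = assigner.assign("p002", "hh-17")  # same household -> same arm, by design
```

In a real study system this logic would sit behind an intake interface, log every assignment immutably, and support stratification; the sketch only shows the contamination-prevention and reproducibility mechanics.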
While we may come from different, sometimes competing organizations, all of us in this field are united in the desire to build the best technology to, ultimately, support high quality research that guides policy and program leadership toward improved outcomes. We can advance that goal by aligning ourselves as a community, sharing ideas and solutions, and working together to develop guidance that can support professional development.
To this end, I have established a new “Research Technologists” group on LinkedIn. Please join me there.
Learn more about how Abt applies technology to our Research, Monitoring, and Evaluation work.