Do international norms and evaluations influence country performance? New evidence on the Aid Transparency Index suggests they do. A combination of original panel data and interviews yields some fascinating insights into institutional processes in government.
Community & Resources
A couple of new (and arguably redundant) efforts to open data in the US this week:
- The US State Department launched the “F Interagency Network Databank (FIND)” for accessing international development data by country.
- A former Microsoft executive has spent a ton of cash creating USAFacts, which provides an integrated look at revenue and spending across federal, state, and local governments. Coverage and skepticism.
There’s also now a SAGE Handbook of Resistance and a crowdsourced lit review on free speech theory and technology in the US context; data from the 2016 Right to Education Index is now live; there’s one week left to comment on @SunFoundation’s Tactical Data Engagement Guide; and the eminent Stephen Coleman has a new book coming out to revitalize cyber utopianism. Continue reading “research links w 16-17”
Papers & Findings
Using the internet leads to civic engagement. Sometimes. Kind of. This according to structural equation analysis of US college survey data (n=2,000), which finds “both positive and negative effects” of internet use on engagement patterns (students who share political opinions online tend to have fewer political conversations offline). The analysis also identifies “feedback loops” between online and offline engagement activities (the effects of offline activity on online activity being slightly stronger than the other way around), and finds that online info-gathering has particular predictive power for engagement, both online and off.
Crowdsourcers, microtaskers and distributed team wranglers, choose your platforms wisely. Online groups do not display “collective intelligence.” This is based on a replication of the 2010 study that demonstrated collective intelligence among groups, but conducted with online groups. The likely explanation is that collective intelligence relies on group members’ social sensitivity rather than their individual intelligence, and that social sensitivity in turn relies on social cues absent from the type of online group work tested in this study.
Open data enthusiasts and civic techies are good at exploiting norms around #open for mobilization, but “open data intermediaries lack a shared culture and political understandings necessary for broader and more impactful action,” with “exceedingly fragmented” perspectives on what data can and should do. This from work with focus groups in a mid-sized US city. Continue reading “research links w 6-17”
Papers & Findings
A field experiment among US county governments last April showed that they are more likely to fulfill public records requests if they know that their peers already have, which has profound implications for peer conformity and norm diffusion in responsive government. A recent commentary in Public Administration Review builds on these insights to suggest concrete ways in which open data advocates can capitalize on this dynamic (publicize proactive fulfillment, bolster requests by citing prior fulfillment, request proactive fulfillment through feedback channels, request data on fulfillment when all else fails).
Meanwhile, Austrian researchers surveyed users of a citizen reporting platform for municipal public services (n=2,200; the city is not named, which is problematic for external validity, and they call their study an “experiment”). They identify personal and pro-social motivations as the most important drivers of participation, but find no support for the technology acceptance model or for demographic characteristics as drivers of participation (though they do note that “the gender divide is disappearing” (2768), so that’s good to know).
Continue reading “research links w1-2017 (!)”
Now, there’s a lot we could debate here about data collection processes, or tools, or when and how data clerks should be employed – but that’s not the point. Instead, we suggest that a growing body of qualitative evidence indicates that the costs of collecting and reporting the data that inform high-level performance indicators (for various agencies) can be quite high – perhaps higher than the M&E community typically realizes. These opportunity costs were echoed across countries and sectors; discussions with agricultural staff in Ghana, for example, suggest that many extension workers spend up to a quarter (or more) of their time collecting and reporting data.
That’s from a recent ICTworks blogpost. It’s focused on a specific initiative (an HIV clinic in TZ), but the spirit will ring true to anyone who’s done donor reporting from the field (or at all, really). The idea that reporting gets in the way of work is familiar, but it’s often even worse in a tech context, where there’s a presumption of data abundance, and finding metrics that strengthen work and anticipate roadblocks can be hard enough. Continue reading “When Indicators Get in the Way, Go Report Minimal?”