The (other) problem with scholarship on digital politics

One of the great dangers of the digital moment we are currently living through is that the discipline as a whole will succumb to a particularly virulent form of availability bias. It is easy to gather Twitter data. It is harder to navigate the Facebook terms of service, and harder still to cobble together a comprehensive email dataset. As a result, both academic journals and academic conferences feature mountains of Twitter papers, molehills written about Facebook, and an awkward silence regarding email. We study the kinds of social media that we can access, regardless of their relative importance in political life. […]

It is not enough to study digital trace data as an alternative to surveys and content analysis. We must also attend to the messy, flawed, incomplete organizational logics that incorporate this data into strategic deliberation.

That’s David Karpf, closing out his book, Analytical Activism (pp. 174-5), on how campaigning orgs are using internal metrics to make strategic decisions. There’s a lot to it, and strategic data use isn’t the only blind spot. Perhaps more important is the kinds of organizations that tend to get studied, and how that skews the field more generally, making research outputs less useful to the kinds of orgs that actually need them.

I’m writing a review of the book, which takes issue with how the book defines its scope and how its contribution might be made more widely relevant to activist orgs. Link and summary forthcoming.

research links w 16-17

Findings

Do international norms and evaluations influence country performance? New evidence on the Aid Transparency Index suggests they do. A combination of original panel data and interviews yields some pretty fascinating insights into institutional processes in government.

Community & Resources

A couple of new (and arguably redundant) efforts to open data in the US this week:

  • The US State Department launched the “F Interagency Network Databank (FIND)” for accessing international development data by country.
  • A former Microsoft executive has spent a ton of cash creating USAFacts, which provides an integrated look at revenue and spending across federal, state, and local governments. Coverage and skepticism.

There’s also now a SAGE Handbook of Resistance, @morganweiland has crowdsourced a lit review on free speech theory and technology in the US context, data from the 2016 Right to Education Index is now live, there’s one week left to comment on @SunFoundation’s Tactical Data Engagement Guide, and the eminent Stephen Coleman has a new book coming out to revitalize cyber utopianism.

research links w1-2017 (!)

Papers and Findings

A field experiment among county governments in the US last April showed that local governments are more likely to fulfill public records requests if they know that their peers already have, suggesting profound implications for peer conformity and norm diffusion in responsive government. A recent commentary in Public Administration Review builds on these insights to suggest concrete ways in which open data advocates can capitalize on this dynamic (publicize proactive fulfillment, bolster requests by citing prior fulfillment, request proactive fulfillment through feedback channels, and request data on fulfillment when all else fails).

Meanwhile, Austrian researchers surveyed users of a citizen reporting platform for municipal public services (n=2,200; the city is not named, which is problematic for external validity, and they call their study an “experiment”). They identify personal and pro-social motivations as the most important drivers of participation, but find no support for the technology acceptance model or for demographic characteristics as drivers (though they do note that “the gender divide is disappearing” (2768), so that’s good to know).


When Indicators Get in the Way, Go Report Minimal?

Now, there’s a lot we could debate here about data collection processes, or tools, or when and how data clerks should be employed – but that’s not the point. Instead, we suggest that a growing amount of the qualitative evidence indicates that costs of collecting and reporting on the data that inform high-level performance indicators (for various agencies) can be quite high – perhaps higher than the M&E community typically realizes. These opportunity costs were echoed across countries and sectors; discussions with agricultural staff in Ghana, for example, suggest that many extension workers spend up to a quarter (or more) of their time collecting and reporting data.

That’s from a recent ICTworks blogpost. It’s focused on a specific initiative (an HIV clinic in TZ), but the spirit will ring true to anyone who’s done donor reporting from the field (or at all, really). The idea that reporting gets in the way of work is familiar, but it’s often even worse in a tech context, where there’s a presumption of data abundance, and finding metrics that strengthen work and anticipate roadblocks can be hard enough.