Papers and Findings
Autocracy Online: Freedom on the Net 2016 was released, showing continued declines in internet freedom around the world, with an increase in app censorship. Meanwhile, a paper in Telecommunications Policy argues that autocracies have “caught up” with democracies in terms of internet penetration since 2013, and an article in press argues that moving from electoral to liberal democracy is a process, using data from international comparative indices to argue that internet penetration facilitates censorship and surveillance more than it does liberal democracy (the methods look dubious). As a case in point, a Russian case study shows how online voting can be used to openwash, while disempowering political opposition.
Interaction online: A literature review of research on online participation platforms (OPPs) proposes a “requirement framework” for evaluating OPPs, composed of six criteria (usability, security, information, transparency, integration, and mobilisation). Meanwhile, a US survey from 2014 (Qualtrics panel, online, opt-in) suggests that posting international news not only correlates with higher degrees of political participation, but facilitates it. And a survey of US government agencies’ use of Facebook for engagement shows that, well, different agencies use it differently. Yup.
Measuring impact: This conference paper applies the Social Return on Investment (SROI) model, which distinguishes between input, output, outcome, and impact, to propose a model for measuring the economic benefits of open data. Meanwhile, this conference paper claims to measure open data impacts across a variety of sectors by building on “existing e-government development indices” and “partial least squares structural equation modelling.” I don’t really know what that means; I can only access the abstract.
In other news: “25 Years of Transparency Research: Evidence and Future Directions.” That’s the title of a sweeping literature review published in PAR. The authors surveyed 185 studies (empirical and theoretical) published between 1990 and 2015 to identify a typology of transparency and transparency outcomes. They also propose a research agenda.
A CGD working paper applies regression discontinuity to a database of World Bank-supported procurement contracts (n=70k, in 132 countries) and concludes that the results suggest “an economically meaningful impact of a reasonably limited increase in advertising and transparency on procurement outcomes.” I.e.: publicizing procurement is good.
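For readers unfamiliar with the technique: a regression discontinuity design compares outcomes just above and below a threshold that triggers the treatment. Here's a minimal sketch on simulated data (the threshold, outcome, and effect size are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated running variable: contract value, with a hypothetical
# advertising threshold at 100 -- contracts at or above it get advertised.
value = rng.uniform(50, 150, 5000)
advertised = value >= 100
# Simulated outcome (e.g. number of bidders): advertising adds a jump of +1.5.
bidders = 2 + 0.01 * value + 1.5 * advertised + rng.normal(0, 1, 5000)

# Local linear fits on each side of the cutoff, within a bandwidth.
cutoff, bw = 100, 20
left = (value >= cutoff - bw) & (value < cutoff)
right = (value >= cutoff) & (value <= cutoff + bw)
fit_left = np.polyfit(value[left] - cutoff, bidders[left], 1)
fit_right = np.polyfit(value[right] - cutoff, bidders[right], 1)

# The RD estimate is the gap between the two fitted lines at the cutoff.
rd_effect = np.polyval(fit_right, 0) - np.polyval(fit_left, 0)
print(round(rd_effect, 2))  # should land near the simulated jump of 1.5
```

The intuition: contracts just below and just above the threshold are otherwise comparable, so the jump at the cutoff is attributable to the advertising requirement.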
There’s also a fascinating article in Public Administration, comparing resistance to open data performance measurement in the education sector in Brazil and the UK. The authors conclude: it’s complicated. But the paper’s worth the read, if only for the detail on the cases, and some fascinating research design choices.
Community and Commentary
The Open Data Institute collects and presents a dozen anecdotal examples of open data’s impact, ranging from presumed accountability via visualized budget data in Nigeria, to the $930m sale of an “open data company,” to the discovery of new species through crowd science.
GovLab is recruiting peer reviewers for a long-form report on open data impact in developing countries (looks like a further iteration of the report and book I reviewed here), to be followed by an open review process. Speaking of which, Matthew Salganik has released the Open Review Toolkit, a set of scripts for conducting open book reviews, based on his recent open review process. Nice.
In other news, Carleton University cast #openwashing as a measurement problem on the OGP blog, @ declares the end of the evangelical phase of the open data movement, Bangladeshi researchers penned a nice blog post on the challenges of measuring SDG 16 from a national perspective and for national application, and last week saw the latest installment in a series of earnest academic proposals to build that one final democracy app to rule them all. All hail the #democracymachine (in preprint, fwiw).
In the Methodological Weeds
@ is co-author on a paper proposing a measure of social media “power.” The measure takes an ecosystemic approach, explicitly accounting for counter-movements and mass media, and is composed of metrics for unity, numbers, and commitment (WUNC adapted to a digital media context). The authors apply the model to Black Lives Matter on Twitter, and find commitment to be a strong predictor of elite response. It’s an excellent demonstration of meaningfully applied #digitalmethods.
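To give a flavor of what such metrics might look like (these are toy operationalizations of my own, not the authors’ definitions): numbers as distinct participants, unity as concentration around a shared hashtag, commitment as the share of repeat posters.

```python
from collections import Counter

# Toy tweet records: (user, hashtag). Purely illustrative data.
tweets = [
    ("a", "#blm"), ("a", "#blm"), ("b", "#blm"),
    ("c", "#justice"), ("a", "#blm"), ("d", "#blm"),
]

users = [u for u, _ in tweets]
tags = Counter(t for _, t in tweets)
per_user = Counter(users)

numbers = len(set(users))                         # distinct participants
unity = tags.most_common(1)[0][1] / len(tweets)   # share using the top hashtag
commitment = sum(1 for c in per_user.values() if c > 1) / numbers  # repeat posters

print(numbers, round(unity, 2), round(commitment, 2))  # 4 0.83 0.25
```

The paper’s actual measures are presumably more sophisticated (and account for counter-movements and mass media), but the sketch shows why commitment captures something that raw participant counts miss.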
Andrew Gelman’s manifesto on exploratory study design has clear relevance to civic tech and accountability studies (not to mention program and evaluation design for adaptive interventions and complex systems). He argues for clear advice from stats experts on study design, emphasizing valid and reliable measurements, open-endedness, inclusion of qualitative data, and continuous measurement. Subsequent comments show what an important growth area this is, and suggest Gelman will be returning to the subject.
- CFP for dg.o 2017 (International Conference on Digital Government Research, NYC, June 7-9)
- CFP: Nordic Political Science Congress (Denmark, August 8-11, deadline Jan 15)
- CFP: special issue: Evaluación y monitorización de la Comunicación para el Desarrollo y el Cambio Social (deadline Jan 31)
- EUROLAB is offering stipends for researchers to visit and use all their data for 1-122 months (deadline Dec 1).
Miscellanea & Absurdum
- “Justice without lawyers or courthouses”? “Fast, affordable, transparent justice”? There’s an app for that: #crowdjury.
- A college hackathon has fixed Facebook’s fake news problem.
- Should scientists be allowed to continue to play in the sandbox after they’ve pooped in it? (headline)
- Which American city offers the best protection from zombies? (visualization)
- Algorithmic health detection is back (headline: Microsoft Shows Searches Can Boost Early Detection of Lung Cancer)