Methodical Snark: critical reflections on how we measure and assess civic tech

Roundup: behavior change, early adopters, matchmaking NGOs with researchers, and measuring women’s access to the web.

Findings

Edutainment changes behavior through learning, not social pressure. This from RCTs on an edutainment TV drama promoting safe sex among youth in Lagos. Lots of interesting tidbits here, including methods for measuring a statistically significant “spillover effect […] driven by friends of the opposite sex.” (Also, not tech focused, but this October conference on norms and behavior change looks like a great stocktaking of polisci thinking on the subject.)

Early adopters of SeeClickFix exhibit a high degree of individual political efficacy (they believe in their ability to address local political issues), according to an experiment in @erhardt’s recently published dissertation (more below in the weeds).

Downwards accountability matters in authoritarian contexts. At least according to a study of 2,103 Chinese counties, which shows public responsiveness to be most closely associated with public threats of collective action and “tattling to upper levels of government.” On the other hand, a recent field experiment suggests that exposing Chinese students to politically sensitive information online significantly alters the way they use the web, but that demand for such information is now so low that it might be more cost-effective for Chinese authorities to censor less and achieve the same results.

Case studies

The Michigan Ross Center for Social Impact has a report documenting how Ashoka fellows are working to rebuild trust between communities and the media.

South African activists’ use of nanomedia (e.g., flyers, stickers, marches) as a tactic for securing access to mass media is driven by scepticism regarding mass media, and should be understood in the context of a South African culture of “networks of orality” (jokes, gossip, rumour).

The Journal of Urban Health tracks the dissemination of participatory budgeting in US cities (from 1 city in 2009 to 50 in 2017), and argues for its potential to decrease health disparities.

Community

@CivicTech_BCN has a new #civictech manifesto and is building a community in Barcelona, while Participedia is holding a webinar series on teaching participatory democracy, with a special focus on using their data, I presume. Looks exciting. Starts June 6.

@NiemanLab has a nice rant on what’s wrong with contemporary debates about journalism and media: they’re not based on the research, which is actually quite good, and which consistently contradicts many of the truisms that get bandied about. Sound familiar?

350.org is looking for research partners, as part of the excellent matchmaking work done by research4impact. They’re specifically looking for researchers to help them understand and improve their movement-building efforts (full description here).

Resources and Data

The Open Data for Resilience Index is a “free online tool to identify, assess and compare – for any location – the availability of key datasets for disaster risk management.”

@GSMA and partners have a toolkit for researching women’s internet access and use, including a detailed breakdown of sample survey questions structured for different research methods and easily adapted to local contexts. Great stuff.

In the Methodological Weeds

This article reviews best practice in open judicial data and argues (unconvincingly, I think) that “Global Open Data Index methodology is the most suitable” for assessments (and comparative assessments) of open judicial data efforts.

The @webfoundation has a useful breakdown of different methods for measuring the digital divide from a gender perspective, and why that matters for policy and analysis. Also check out the toolkit under resources.

Lastly, @erhardt just published his dissertation on civic technology design and civic empowerment. It proposes six empowerment-based design principles for monitorial citizenship (1. Be Inclusive at Every Stage; 2. Give Users Agency; 3. Provide Opportunities for Reflection and Discourse; 4. Foster and Respect Communities; 5. Tell Stories with Data; 6. Anticipate Breakdown and Evaluate Rigorously) and “a prototype toolkit for evaluating the impact of civic technology on political efficacy.” The “toolkit” appears to be a set of indicators for individual, external and platform efficacy, and their implementation in a survey. This is solid work, but it also looks like it would be tremendously challenging to apply to other platforms.
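For a concrete sense of what “indicators implemented in a survey” typically cashes out to, here’s a minimal sketch of turning Likert-style survey items into per-dimension efficacy scores. To be clear: the item names, scale range, and groupings below are my own placeholders for illustration, not the dissertation’s actual instrument.

```python
# A minimal sketch (not the dissertation's actual toolkit) of scoring a set of
# efficacy indicators from Likert-scale survey responses. Item names, scale
# range, and dimension groupings are illustrative assumptions.

from statistics import mean

# Hypothetical mapping of survey items to efficacy dimensions.
DIMENSIONS = {
    "individual_efficacy": ["ind_q1", "ind_q2", "ind_q3"],
    "external_efficacy":   ["ext_q1", "ext_q2", "ext_q3"],
    "platform_efficacy":   ["plat_q1", "plat_q2"],
}

SCALE_MAX = 5               # assume 1-5 Likert items
REVERSE_CODED = {"ext_q2"}  # negatively worded items, if any


def score_response(response: dict) -> dict:
    """Average a respondent's items into one score per efficacy dimension."""
    scores = {}
    for dimension, items in DIMENSIONS.items():
        values = []
        for item in items:
            value = response[item]
            if item in REVERSE_CODED:
                value = SCALE_MAX + 1 - value  # flip negatively worded items
            values.append(value)
        scores[dimension] = mean(values)
    return scores


# Example: one respondent's (made-up) answers.
respondent = {"ind_q1": 4, "ind_q2": 5, "ind_q3": 3,
              "ext_q1": 2, "ext_q2": 4, "ext_q3": 3,
              "plat_q1": 5, "plat_q2": 4}
print(score_response(respondent))
# -> {'individual_efficacy': 4.0, 'external_efficacy': 2.33..., 'platform_efficacy': 4.5}
```

Even in this toy form, you can see why porting such a toolkit to another platform is hard: the item wording, the platform-efficacy questions, and the reverse-coding decisions all have to be rebuilt and revalidated for each new context.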

Miscellanea and Absurdum

Randomistas is a new book that seems to be a late entry to the hype-building around RCTs, pitching purple narratives about the RCT revolution in a “Gladwell-esque” style and without much critical perspective. Sigh.

In more colorful news, municipal government websites in Scandinavia are getting emotional in an effort to connect with citizens, Trump’s tweets appear to be increasing his perceived competence among college students (though this is from a convenience sample of 476 students at a private Northern California university, so, maybe…), and Russian researchers struggle to define an open data specialist in this article. There’s a reason why that’s hard.

Lastly, consider this:

“In recent years, much research has been devoted to the synthesis of the Turing machine; nevertheless, few have enabled the deployment of systems. Given the current status of ubiquitous theory, physicists daringly desire the evaluation of IPv6, which embodies the appropriate principles of cyber informatics. It might seem counterintuitive but is supported by previous work in the field. In this work, we show not only that e-commerce can be made cacheable, semantic, and client-server, but that the same is true for vacuum tubes [1].”

That’s the abstract of a recent peer-reviewed article by Kim Kardashian and Satoshi Nakamoto. Because nothing is broken in academia. And because SCIgen.
