Methodical Snark critical reflections on how we measure and assess civic tech

Roundup: degrees of responsiveness, evidence on smart participation design, how digital mobilization works, civic engagement with the dead


Findings

What works:
A field experiment in Pakistan shows positive effects on citizen perceptions of government when politicians reach out via IVR, and even more so when they follow up in direct response to citizens' IVR responses. Analysis of over 4 million public procurement documents (Europe, 2006-2015) suggests that transparency between internal bidding partners is more effective at reducing corruption risks than transparency to the general public, but that the latter increases the effectiveness of the former.

Modes of engagement
Parliaments are scaling down meaningful digital interaction with citizens, despite an uptick in the use of online tools since 2010, according to content analysis of 10 countries' parliamentary websites. The article is as interesting as it is confusing, and wins the award for least useful Excel bar graphs. Analysis of German deliberative democracy initiatives finds stronger efficacy in municipalities where participatory mechanisms are institutionalized, and a comparative analysis of Mexican deliberative fora finds that they are more successful when facilitators are substantive experts, and that participants find them more useful for dialogue with government than for policy influence.

Mobilization
A new paper describes an experiment on Brazilian participatory budgeting (n > 43,000) demonstrating significant mobilization effects from digital "get out the vote" (GOTV) messages. Effects were not associated with policy preferences, and were not significantly mediated by content framing in GOTV messages. Lottery incentives "perform[ed] significantly worse than the informational treatment."

Meanwhile, a survey experiment using German panel data suggests that demand for direct democracy increases with individuals' perception that everybody else agrees with them on policy :), and a case analysis of Toronto Community Housing finds that participatory budgeting is confusing. Phone surveys (n=22) find that disabled activists struggle to use and engage with the most popular tools and platforms for digital activism. The researchers make recommendations for activist organizers and, predictably (?), plan to build an app.

Downers
Is it just me, or are the global comparative indices competing for biggest downer? Freedom on the Net 2017 is out. This year's report finds widespread disinformation campaigns, government restrictions on communication tech, and online information manipulation messing with elections in at least 18 countries. That's before you dig into their findings on individual countries and more traditional censorship. Also, a field experiment in Peru suggests that in low-accountability contexts, accountability workshops can have counterproductive consequences: increasing civic disengagement and civil unrest, and decreasing politicians' efforts towards responsive governance. Whoops.

Design considerations for successful participation
Coding and bootstrap regression analysis of cases from the Participedia database of participatory initiatives (a probability sample of 167 of 304 www.participedia.net cases, coded on fixed-field variables for facilitation, active/passive participation, and use of voting) found "distinct links" between design inputs related to "democratic respect, democratic consideration, and deliberative breadth" on the one hand, and "outcomes such as decision quality, changes in participants' civic engagement, participants' decision ratings, and policy impact, on the other."
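(For readers who don't live in stats-land: bootstrap regression here just means resampling the coded cases with replacement and re-fitting the model many times, to get uncertainty intervals around each design input's association with an outcome. A minimal sketch below, with hypothetical variable names and made-up data, not the study's actual coding scheme or results.)

```python
# Purely illustrative sketch of bootstrap regression over coded cases.
# All variables and data are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)

n_cases = 167
# Hypothetical coded design inputs (0/1): facilitation, active participation, voting
X = rng.integers(0, 2, size=(n_cases, 3)).astype(float)
# Hypothetical outcome, e.g. a rating of decision quality
y = 2.5 + 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.8, n_cases)

X1 = np.column_stack([np.ones(n_cases), X])  # add an intercept column

def ols(design, outcome):
    # Ordinary least squares fit via least-squares solver
    return np.linalg.lstsq(design, outcome, rcond=None)[0]

# Resample cases with replacement, re-fit, and collect coefficients
boot = np.array([
    ols(X1[idx], y[idx])
    for idx in (rng.integers(0, n_cases, n_cases) for _ in range(2000))
])

# 95% percentile intervals for each design input's association with the outcome
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
for name, l, h in zip(["intercept", "facilitation", "active_participation", "voting"], lo, hi):
    print(f"{name}: 95% CI [{l:.2f}, {h:.2f}]")
```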

Case Studies

Resources

  • NYU’s @TheGovLab has revamped their CrowdLaw research, focused on the impact of tech on participatory law-making.  The website has analysis, case studies, a Twitter list and bibliography, plus recommendations and model language for policy.
  • A new book on Technopolitics in Latin America has several theoretical chapters, plus case studies on protest and education advocacy in Brazil, social conflict in Colombia, and activist media networks in Mexico.
  • Upturn has released a policy scorecard and research findings on police body cameras in 75 American cities.
  • @DataActive has started a curated list of “References for data activist research.” So far it’s a few web links to activism campaigns and a few articles by the director of the program. Curious to see if it develops.

Community

There’s another book arguing that we need to advance measurement and metrics of progress beyond GDP, plus a report on participatory grant-making, and @CitizenLabCo spells out the difference between civic tech and govtech.

Meanwhile, @annbrowndc reports back from the Global Evidence Summit on Rapid Reviews for policy and programs, noting that despite the rhetoric, they still tend to be slow, difficult, and resource-intensive.

For the titles
