Papers / Findings
- Citizen engagement in rulemaking — evidence on regulatory practices in 185 countries (from the World Bank). TL;DR: opportunities for engagement are greatest in developed countries with strong regulatory systems, as is the use of ex post impact assessments. The paper includes an incredibly brief literature review, and the study itself is based on e-questionnaires (Word docs, expert perception only, no data on actual participation) sent to 1,500 individuals in 190 countries. The researchers also conducted follow-up interviews for clarification, but there is no information on how many questionnaire responses were received. Most strikingly, the report advances a composite scoring mechanism for engagement in rulemaking, for application across all country contexts. It’s clunky, with 4 scoring options for most metrics, each of which begs a million questions about comparability and the applicability of the scores to individual political contexts. I’d love to read some reflections on the challenges of actually applying this. Methods and questionnaire available here.
- User Research on UK parliamentary data, from the ODI. Contains 4 detailed recommendations plus user journeys, but very sparse info on the methods or the users interviewed. Also, @ODIHQ, stop using Scribd, we’ve been through this.
Continue reading “research links w42”
I just attended the Digital Methods Summer School, hosted by the University of Amsterdam initiative of the same name. It’s something I’ve wanted to do for years, but I first had the opportunity as a PhD candidate. It was worth the wait, and here’s a quick summary of what I learned about the methods, the tools, and the course.
“Digital methods” could mean a lot of different things, but there’s a lot at stake in the rhetoric. Digital humanities, data journalism, webometrics, virtual methods, data science, oh my. Cramming the internet into social science research makes for a complicated landscape, and there’s ontological and political work to be done in how academic schools and approaches distinguish themselves from one another.
Digital methods stakes out its turf with a 2-part move:
Continue reading “What I Learned about Digital Methods”
Last week I joined the Impacts of Civic Technology Conference 2016, a sort of annual mixer for researchers and the civic tech community, organized by mySociety to “promote and share rigorous and meaningful research into online technologies and digital democracy around the world.”
The event was good (write-ups here, here, here, and here), but notable for being so firmly grounded in the idea of research without talking about it all that much. I left inspired but frustrated, wishing there were a forum for addressing some of the thornier issues surrounding this still-fuzzy idea of research and evidence on civic technology. Throughout the event, the idea of “research” influencing programming got mentioned a lot, but never examined. Here’s a quick run through some of those issues, and some thoughts about why they aren’t yet getting the attention they deserve.
Continue reading “Building on TICTec: more thinking about research pls”