What a week…
Papers & Findings
Political tech: A survey of Swedish NGOs (n=907) suggests that civil society organizations need substantial human resources to use social media effectively in campaigns, which raises the bar for entry and strengthens an elite cohort of civil society organizations. Tech was shown to directly help voters, however: new research strengthens the claim that voter information apps increase electoral participation, based on electoral data sets from 12 countries and a randomized field experiment during the 2013 Italian parliamentary elections. An online field experiment with San Francisco residents (n=140,000) also suggests that people who vote are more likely to engage in other forms of political participation, or at least more likely to open NGO surveys.
Thinking about cities, a study of 65 mid-to-large US cities suggests that data analytics practices are widespread, and that leadership attention, capacity and external partners are the primary factors determining whether cities engage with big data. A researcher from International Data Corporation (whoa) compares three prominent models for evaluating the implementation of smart cities, and suggests how city managers should merge them. Continue reading “research links w 7-17”
Papers / Findings
- Citizen engagement in rulemaking — evidence on regulatory practices in 185 countries (from the World Bank). TL;DR: opportunities for engagement are greatest in developed countries with strong regulatory systems, as is the use of ex ante impact assessments. The paper includes an incredibly brief literature review, and the study itself is based on e-questionnaires (Word docs, expert perception only, no data on actual participation), which were sent to 1,500 individuals in 190 countries. The researchers also conducted follow-up interviews for clarification, but there is no information on how many questionnaire responses were received. Most strikingly, the report advances a composite scoring mechanism for engagement in rulemaking, for application across all country contexts. It’s clunky, with 4 scoring options for most metrics, each of which begs a million questions about comparability and the applicability of the scores to individual political contexts. I’d love to read some reflections on the challenges of actually applying this. Methods and questionnaire available here.
- User research on UK parliamentary data from the ODI. Contains 4 detailed recommendations plus user journeys, but very sparse info on the methods or the users interviewed. Also, @ODIHQ, stop using Scribd, we’ve been through this.
Continue reading “research links w42”
Now, there’s a lot we could debate here about data collection processes, or tools, or when and how data clerks should be employed – but that’s not the point. Instead, we suggest that a growing amount of the qualitative evidence indicates that costs of collecting and reporting on the data that inform high-level performance indicators (for various agencies) can be quite high – perhaps higher than the M&E community typically realizes. These opportunity costs were echoed across countries and sectors; discussions with agricultural staff in Ghana, for example, suggest that many extension workers spend up to a quarter (or more) of their time collecting and reporting data.
That’s from a recent ICTworks blogpost. It’s focused on a specific initiative (an HIV clinic in Tanzania), but the spirit will ring true to anyone who’s done donor reporting from the field (or at all, really). The idea that reporting gets in the way of work is familiar, but it’s often even worse in a tech context, where there’s a presumption of data abundance, and finding metrics to strengthen work and anticipate roadblocks can be hard enough. Continue reading “When Indicators Get in the Way, Go Report Minimal?”