Methodical Snark critical reflections on how we measure and assess civic tech

I read this for you: Experimental Evidence from 2,150 Brazilian Municipalities on how Research Affects Policy


What is it:

  • A 59-page working paper from NBER (including 6 pages of references and 29 pages of tables and annexes).
  • Presents the results of two field experiments, plus background and lots of methods.
  • Includes a highly readable 4 page intro that summarizes everything.
  • Produced by four solid academics doing rigorous work, with no institutional biases that I can see.

Should you care:

Maybe.

  • It’s a cool and smart research design, if you get excited about that sort of thing.
  • In terms of findings, it’s always nice to see evidence that policy-makers are rational, but it’s not clear how much this can be generalized. There are distinctive aspects to the sample, the intervention, and the context. See below.

 

Methods:

Demand/Beliefs Experiment

Research questions:

  • Do policy makers demand information (i.e., are they willing to pay for it)?
  • Does information affect their policy beliefs?

Sample: 900 officials (from 657 municipalities) who were attending conferences.

Experimental design: During breaks at relevant conferences, participants were recruited (with the promise of evidence on the impact of Early Childhood Development programs) and brought into a side room for 30 minutes with a tablet, where they:

1- were asked about their beliefs in a policy area

2- were offered RCT evidence on the same issue

3- were asked if they were willing to pay for it (using lottery tickets with which they might otherwise win a trip to the US, super complicated)

4- received the evidence and were asked about their policy beliefs again

Misc notes: Participants were randomly informed about contextual factors to control for politically motivated participation bias.

Policy-Adoption Experiment

Research question: Does quality evidence about a policy's effectiveness influence whether a similar policy is adopted?

Sample: Mayors attending a conference (881 municipalities in the treatment group, 937 in the control group).

Experimental design: A random selection of conference attendees was invited to attend "an optional research information session":

1- a 45-minute presentation of research findings from studies on the quantitative impact of reminder letters sent to taxpayers to induce them to comply with taxes

2- desk research and an in-depth phone survey with "key bureaucrats" in treatment and control municipalities 15–24 months later

3- control and treatment groups were asked about their policy beliefs regarding the tax-letter intervention

Misc notes: 37.9% of the treatment group and less than 1% of the control group chose to attend the session.

Justifications for using Brazilian municipalities:

  • "…their political leaders hold a role analogous to that of many countries' head of state…"
  • There are a lot of them and the data was accessible.

 

Findings:

Demand/Beliefs Experiment

  • Participants were generally willing to pay for evidence, and that willingness increased for studies with larger samples and for participants from a municipality where a similar intervention had recently been conducted.
  • Participants updated their beliefs after receiving evidence, and more so for studies with larger sample sizes, but not for studies conducted in developing countries.
  • Participants did not display confirmation bias.

Policy-Adoption Experiment

  • Mayors who attended the research session were more likely to implement the policy ("increased the probability … by a remarkable 10 percentage points, or 33 percent relative to the 32 percent of municipalities in the control group").
  • Mayors who attended the research session held more realistic policy beliefs (estimates in the treatment group deviated 20 percent less from the evidence than those in the control group).
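The percentage-point vs. relative-percent figures in that quote can be sanity-checked with quick arithmetic. Using the rounded numbers as quoted (the paper's 33% presumably comes from unrounded estimates):

```python
# Sanity check of the quoted effect size, using the rounded figures from the quote.
control_rate = 0.32   # 32% of control-group municipalities implemented the policy
effect_pp = 0.10      # treatment effect: 10 percentage points

# Relative increase = absolute effect / baseline adoption rate
relative_increase = effect_pp / control_rate
print(f"{relative_increase:.1%}")  # ~31.2% with rounded inputs (the paper reports 33%)
```

The small gap between ~31% and the reported 33% is consistent with rounding in the quoted figures.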

 

Caveats and generalization

The most immediate conclusion that I draw from this research is that Brazilian mayors are amazing superhumans. They are willing to pay for evidence and display no confirmation bias. I wish I was a Brazilian mayor.

Some other distinct aspects:

  • The tax reminder letter is a uniquely cost-free policy intervention that directly conveys a high-value benefit (tax revenue). Who wouldn't jump at that? How different would it be if the policy were hard to implement, if power was involved, or if the policy issue was contentious?
  • Brazilian mayors may be similar to heads of state insofar as they have significant executive authority and are elected, but municipal governance is a distinct type of governance, and differences in scale and mandate at other levels of government might well influence these dynamics.
  • Participation was voluntary and primed by the fact that respondents were already attending a conference in order to acquire information. Would it be different if that information was not solicited, or if it ran contrary to policy-makers’ most immediate interests?
  • As the authors note, these experiments deal in easily accessible information and cost free policy options, and do not address the host of other obstacles to policy implementation that may exist.

In sum, this is great research, and it sets the stage for a lot more, but, as with any rigorous experimental design, it answers a pretty narrow research question.



