Policy Evaluations

It is my position as an academic researcher that public policy should be based on systematic knowledge about what works and what does not. It is up to citizens and their elected representatives to determine the goals of public policy. Academics can help policy makers make sensible decisions on how to reach those goals. Evaluation research can show to what extent alternative policies achieve predetermined goals, and it should also examine potential unintended side effects. This goes for the introduction of new policies as well as for revisions of existing ones.

Policy evaluation is not easy. When the goals to be achieved are unspecified or vague, it is hard to say which instrument works best. Unintended side effects are also hard to predict. Finally, the research methodology for policy evaluation can pose problems. The best research design for any evaluation is a randomized controlled trial. The effects of policy measures should be tested as if they were new health treatments: roll a die, assign some people by chance to a treatment group, and compare the development of their situation with that of people in a control group. Do those receiving the supposedly effective treatment get better sooner, and more so, than those receiving a placebo or no treatment at all?
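
As an illustration of this logic, here is a minimal sketch in Python. The group sizes and outcome scores are invented for the example and do not come from any actual trial.

```python
# Minimal sketch of the randomized-trial logic described above.
# All numbers are hypothetical and for illustration only.
import random
from statistics import mean

random.seed(42)

participants = list(range(200))      # hypothetical study participants
random.shuffle(participants)         # "roll a die": random assignment
treatment_group = participants[:100]
control_group = participants[100:]

# In a real evaluation these would be outcomes measured after the intervention.
outcomes_treated = [random.gauss(0.6, 1.0) for _ in treatment_group]
outcomes_control = [random.gauss(0.5, 1.0) for _ in control_group]

# Because assignment was random, the difference in mean outcomes estimates
# the average effect of the treatment.
effect = mean(outcomes_treated) - mean(outcomes_control)
print(f"Estimated treatment effect: {effect:.2f}")
```

In practice one would of course also test whether the difference is statistically significant and check that randomization balanced the two groups.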

In many cases this design is not possible, for ethical or pragmatic reasons. The job of a researcher is then to find the next best design: a natural experiment or a quasi-experiment in which disturbing influences are controlled as much as possible. One of the problems is self-selection into the treatment group: if only those who expect to benefit from the treatment receive it, the treated and untreated are not comparable, and those who did not get the treatment might not have profited from it in the same way. Whatever the exact design: look for change. Compare the treatment group after treatment with itself before treatment. If possible, compare the change in the treatment group with the change among people who did not receive the treatment. Is the change in the treatment group more in the desired direction than in the group that did not receive treatment?
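
The "compare the change" logic amounts to a difference-in-differences comparison. A minimal sketch, again with purely hypothetical before and after scores:

```python
# Minimal difference-in-differences sketch of the "look for change" logic.
# The scores are illustrative assumptions, not results from any study.
from statistics import mean

# Hypothetical outcome scores before and after the policy, for people who
# received the treatment and for people who did not.
treated_before = [4.0, 5.0, 4.5, 5.5]
treated_after = [6.0, 6.5, 6.0, 7.0]
untreated_before = [4.2, 4.8, 5.0, 5.4]
untreated_after = [4.8, 5.2, 5.6, 5.8]

change_treated = mean(treated_after) - mean(treated_before)
change_untreated = mean(untreated_after) - mean(untreated_before)

# Is the change in the treatment group larger (in the desired direction)
# than the change among those who did not receive the treatment?
did_estimate = change_treated - change_untreated
print(f"Change in treatment group:  {change_treated:.2f}")
print(f"Change in comparison group: {change_untreated:.2f}")
print(f"Difference-in-differences:  {did_estimate:.2f}")
```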

From this tough-on-methods point of view I look at policy debates. What do we know about what works? To inform these debates I have conducted evaluations of a new national service learning program in the Netherlands, of fundraising methods, and of the charitable deduction. The evaluation of service learning programs is described on this separate page. Here I go into the other evaluations I have been working on.


Charitable deduction

For a long time, an evaluation of the new Law on Giving in the Netherlands (introduced January 1, 2012) was on my wish list. You can read more about this law on this separate page. A partial evaluation of the law was part of the Giving in the Netherlands Study 2015, published on the National Day of Philanthropy, April 23, 2015. More details on the research are available here (in Dutch). A summary in English of the study “Cultural nonprofit organizations in the Netherlands: Changes in giving behavior, fundraising and income between 2011 and 2014” is available here.


Organ donation

An evaluation of the Organ Donation Law in the Netherlands is still on my wish list. Here’s a nice post from a national newspaper commenting on the results of previous evaluations. The evaluations have concluded that a change in the law towards an opt-out system would not yield an increase in the number of donors. From the tough-on-methods point of view I sincerely doubt the validity of this conclusion. In a previous blog post you can find more details.

A fascinating but often disappointing issue is what happens with evaluations. Here the political machine starts working again. When the official evaluation of the charitable deduction by the Ministry of Finance concluded that the effect could not be demonstrated, the political conclusion was that it did not work – as if it had been demonstrated that the effect was negative (click here to read more). When my own research identified many factors that make service learning programs more effective, the political conclusion was not to instruct schools on how to design the programs. Schools should be free in the design of the programs, a committee advised, because previous decades of education policy had already exhausted the system. The committee was chaired by the current Minister of Finance, Jeroen Dijsselbloem, recently named the new president of the Eurogroup of European finance ministers. That committee also advised not to change education policy unless there is evidence that a new policy would work better, advice that resonates with the academic mind and generates sensible policy. In December 2012, Dijsselbloem still endorsed its conclusions, but the webpage expressing this position was taken offline in Spring 2013. What happened to that advice when the new Cabinet decided to eliminate the mandatory service learning programs? It was stuffed in a drawer.


Fundraising methods

The study on fundraising methods was an evaluation commissioned by the Association of Fundraising Organizations (VFI) and the association for door-to-door fundraising (Stichting Collecteplan). The task was to analyze the effects of the “Don’t call me” register. As a result of a new telecommunications law, consumers can register their phone numbers to block unsolicited sales calls. Nonprofit organizations and charities raising money and recruiting volunteers are also obliged to avoid calling registered consumers.

Since its introduction on October 1, 2009, the number of consumers who have opted out through the register has grown to more than 6 million by September 2011. In a previous study, volunteer recruiters warned that the register would reduce the number of volunteers. This research is an attempt to investigate the actual consequences of the “Don’t call me” register for the costs and efficiency of fundraising and volunteer recruitment. The fieldwork took place from June to September 2011.

The design of the study was far from ideal: all fundraising organizations were affected by the new law, so there was no untreated comparison group. The only option was to study the continuity of trends in the years before and after the introduction of the law; discontinuities could indicate effects of the law. However, it also proved very difficult to obtain data on fundraising costs and benefits before and after the introduction of the law. A particular problem was that charities could not quantify the ‘lifetime amounts’ donated by donors who were recruited through different methods.
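
To make the trend comparison concrete, here is a minimal sketch of such a discontinuity check. The years and revenue figures are placeholders for illustration, not data from the study.

```python
# Minimal sketch of a trend-discontinuity check: fit the pre-law trend and
# compare it with observed post-law values. All figures are hypothetical.
from statistics import mean

pre_years = [2005, 2006, 2007, 2008, 2009]
pre_revenue = [10.0, 10.4, 10.9, 11.3, 11.8]   # hypothetical, in millions
post_years = [2010, 2011]
post_revenue = [11.9, 12.0]                    # hypothetical, in millions

# Ordinary least squares slope and intercept for the pre-law trend.
x_bar, y_bar = mean(pre_years), mean(pre_revenue)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(pre_years, pre_revenue))
         / sum((x - x_bar) ** 2 for x in pre_years))
intercept = y_bar - slope * x_bar

# If the law had no effect, post-law revenue should continue the old trend;
# a sizeable shortfall would suggest a discontinuity around the introduction.
for year, observed in zip(post_years, post_revenue):
    expected = intercept + slope * year
    print(f"{year}: expected {expected:.2f}, observed {observed:.2f}, "
          f"gap {observed - expected:+.2f}")
```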

The final report on donor recruitment is here.

The final report on volunteer fundraiser recruitment is here.
