Job creation incentives didn’t incentivize job creation

I have blogged about some of my recent research projects examining the use of financial incentives to attract firms. My co-authors and I have argued that politicians use these incentives to claim credit for investment, but that the incentives themselves have very little impact on job creation.

With the support of the Ewing Marion Kauffman Foundation, I have been examining incentive policies in the Kansas City region. As part of this project I surveyed all of the recipients of the Promoting Employment Across Kansas (PEAK) program, the flagship Kansas economic development program. I have also used observational data to compare incentive recipients to non-recipients.

The survey asked recipients how incentives affected their employment plans. Using data from a Freedom of Information request, I identified 105 PEAK incentive applicants. Managers at all 105 firms received a recruitment email with a link to an online Qualtrics survey asking a battery of questions about their company’s involvement with the PEAK program. I received a total of 25 responses, a response rate of 23%.
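The arithmetic behind that response rate is trivial; here is a minimal Python sketch using the figures above (the variable names are my own, purely for illustration):

```python
# Response-rate arithmetic for the PEAK survey (figures from the post).
applicants = 105  # PEAK applicants identified via the Freedom of Information request
completed = 25    # completed Qualtrics responses

rate = completed / applicants
print(f"Response rate: {rate:.1%}")  # prints 23.8%, rounded to 23% in the text
```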

What are the findings? Incentives are popular with companies, but they are ineffective in creating employment.

You’re still reading?

The vast majority of respondents reported a positive experience with the PEAK program and would recommend it to other firms. A striking 71% of respondents indicated that the program was either “very efficient” or “somewhat efficient,” and 92% would “definitely recommend” or “probably recommend” the program to other firms.

What was the purpose of these incentives? A majority of firms (58%) indicated they were applying for PEAK incentives for expansion, while 42% indicated relocation and 25% retention.[1]

Among the firms that received competing offers from other locations (63% of firms), 67% of respondents indicated that the PEAK incentives were more generous than the competing offers. Only 13% indicated that the PEAK program was less generous.

The program is efficient and generous.  What is not to like?

But are the incentives effective in helping create Kansas jobs? The key questions concern the counterfactual: what would the firms have done without the PEAK incentive? I asked managers two direct questions. First, I asked whether the firm would have left the state of Kansas without a PEAK incentive, with three possible answers (yes, no, unsure):

1) Without the PEAK incentive, would your company have left the state of Kansas?

Next, I asked about the firm’s expected employment without a PEAK incentive:

2) Without the PEAK incentive, would your company have hired fewer employees or the same number of employees?

In response to question 1, only 22% of respondents indicated that they would have left the state of Kansas without the PEAK incentive program, while 48% indicated they would not have left the state; 30% were unsure. This question on relocation is a difficult one, since some of the firms were applying for PEAK incentives for expansion.

The second question provides a more comparable measure of the impact of PEAK incentives across firms. 67% of firms claimed that they would have hired the same number of workers without the PEAK incentives, while 29% indicated that they would have hired fewer workers.[2]

Thus for most firms, employment decisions were independent of the incentives.
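To make the tabulation concrete, here is a minimal sketch of how the two counterfactual questions could be summarized with pandas. The data frame, column names, and responses below are hypothetical placeholders; the actual analysis used the Qualtrics export of the survey. A tabulation along these lines yields the kinds of percentages reported above.

```python
# A minimal sketch of tabulating the counterfactual questions.
# All responses and column names here are hypothetical placeholders.
import pandas as pd

responses = pd.DataFrame({
    "would_have_left_kansas": ["No", "Unsure", "No", "Yes", "No", "Unsure"],
    "hiring_without_peak": ["Same", "Fewer", "Same", "Same", "Fewer", "Same"],
})

for question in responses.columns:
    # Share of respondents giving each answer, as a percentage.
    shares = responses[question].value_counts(normalize=True).mul(100).round(1)
    print(f"\n{question} (% of respondents):")
    print(shares.to_string())
```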

The final question asked about a specific policy. The Missouri State Legislature introduced a bill that proposed limiting incentive competition in the Kansas City area. I asked respondents if they had heard of the bill (50% had) and if they supported the legislation. The largest share of respondents (42%) “neither supported nor opposed” the legislation, with a smattering of responses on the support and oppose sides and 21% indicating “don’t know.”[3] Thus there is only limited support, and limited opposition, for ending a very specific type of incentive competition in the region.

In summary, firms tend to like this program. Taxpayers probably should not.

[1] Note that these categories aren’t mutually exclusive. Respondents had the option to check more than one.

[2] 4% of respondents were unsure.

[3] The distribution is as follows: Strongly support (0%), Support (8%), Neither support nor oppose (42%), Oppose (17%), Strongly oppose (13%), and Don’t know (21%).

Auditing Economic Development

As part of a book project I’m reviewing information on the performance of investment incentive programs in the United States.  These programs are designed to help attract new investment, encourage expansion, and retain existing firms.  I’ve been collecting audits of these programs by state legislatures.

The general consensus from these audits is that the programs lack oversight and that it is difficult to verify their performance. That is putting it nicely. Most of these audits range from critical to scathing. Here are some news stories on the most recent audits.

  • A very critical audit of the Wisconsin Economic Development Corporation (both the organization and its incentive programs).
  • Critics of the Texas Enterprise Fund include legislators who signed the enabling legislation. Other Texas programs received similarly critical reports, including incentives granted by school districts.
  • Claims of lack of transparency and favoritism in the Utah incentive programs.
  • I’ve written about the Promoting Employment Across Kansas (PEAK) program in the past. The third part of its legislative audit should be out in December. The first two parts were very critical of the program and lamented the lack of systematic data collection on the program.
  • Pew has a nice overview of past audits in Minnesota, Louisiana, and Massachusetts.

More to come.

Experts mixed on effectiveness of U.S. airstrikes against ISIS

In a previous blog post I proposed a Public Policy Survey of Political Scientists: using the potential reviewer pool of an academic journal to identify a small group of experts to ask about a pressing public policy issue.

Michael Colaresi, co-editor of the journal International Interactions, has agreed to use the journal’s potential reviewer pool to identify experts on a given topic. As I noted in a previous post, there are costs and benefits to focusing on a journal’s reviewer pool as opposed to a broader membership. This post presents our first trial run.

We asked a total of 50 academics, all experts in this area, three questions about U.S. policy toward ISIS and received 30 responses. Here we present the raw results without commentary. Michael put together some nice graphics on the three questions and the correlations across them.

All three questions were statements and respondents were presented with five options.

Question 1: The current airstrikes on ISIS will roll back their advances in the region (N=30).

[Figure: distribution of responses to Question 1]

Question 2: I support the President’s decision to strike ISIS.

[Figure: distribution of responses to Question 2]

Question 3: If the airstrikes against ISIS continue beyond two years at their current intensity, the majority of the public will not support continued airstrikes against ISIS.

[Figure: distribution of responses to Question 3]

Comments from the Respondents

For each question we gave respondents the option to provide additional background information or justifications for their answers. I am not going to reproduce individual comments here, but there were a number of common themes.

First, a number of respondents indicated that the airstrikes alone aren’t sufficient to roll back ISIS, although they can slow or stop its advances; additional military force, including ground troops, would be necessary to roll back ISIS.

Second, while the majority of respondents agreed with the President’s airstrikes against ISIS, a few respondents indicated that this policy will likely fall short of the ambitious goals of stopping ISIS.

Third, not only were respondents mixed on the third question; their comments indicated an even larger divide on public support for the bombing. The most common view was that public support is largely contingent on the U.S. avoiding casualties.

Interpretation and Post-Game

The goal of this survey is to present expert opinions on a pressing policy issue. I have my own interpretation of the implications of this survey, and have some thoughts on the value (and limitations) of this type of expert survey. But this is for the next blog post.

We will certainly run another one of these surveys using the International Interactions potential reviewer pool next month. Please send me an email if you have thoughts about questions.

Can I add an Appendix to a Blog Post?

In case you are wondering about the correlations between the three questions:

[Figure: correlation between responses to Questions 1 and 2]

[Figure: correlation between responses to Questions 1 and 3]

[Figure: correlation between responses to Questions 2 and 3]
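For those curious how such cross-question correlations can be computed, here is a minimal sketch with hypothetical coded responses. Spearman rank correlation is one natural choice for ordinal Likert items; the figures above may use a different measure.

```python
# A minimal sketch of the cross-question correlations in the appendix.
# Hypothetical coded responses: 1 = Strongly agree ... 5 = Strongly disagree.
import pandas as pd

answers = pd.DataFrame({
    "q1_rollback": [2, 4, 3, 2, 5, 4, 3, 2],
    "q2_support":  [1, 2, 2, 1, 3, 2, 2, 1],
    "q3_public":   [2, 2, 4, 3, 2, 4, 3, 3],
})

# Spearman rank correlation treats the Likert codes as ordinal,
# without assuming equal spacing between categories.
print(answers.corr(method="spearman").round(2))
```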

Some random links

A few interesting studies have popped up over the past few days.

Here is a clever paper on how public scrutiny affects how companies use tax havens. Using a high-profile campaign in the UK, the authors find evidence consistent with targeted firms scaling back tax avoidance strategies.

A pair of interesting NBER working papers appeared this week.

This paper examines the percentage of top earners in the US who are women. Summary: things are looking better, but they are still pretty bad.

This other paper examines how lending by the Small Business Administration (SBA) is associated with lower county income growth. That’s not a typo. SBA lending is associated with lower levels of income growth.

Nolan McCarty’s post on what can be learned from the Goldman Sachs tapes is excellent.

Inside baseball on the pros and cons of research registration. I’m a fan of registration, but Josh Tucker makes some excellent points.

 

Public Policy Survey Proposal Update

Yesterday I proposed piloting a public policy survey of political science researchers.

This blog post had about 400 hits, which is sadly a very good day for my new blog. Either this is a good idea or a dumb idea. Don’t answer that.

I’ve received some excellent feedback on the idea. A few things that popped up:

First, the William & Mary Teaching, Research, and International Policy (TRIP) snap poll actually does something very similar to what I proposed. This survey panel includes almost 3,000 teachers and researchers in international relations. While this survey is pretty broad, they ask people to identify their main fields of interest.

Second, quite a few people mentioned the Good Judgment Project.  This project aims to use crowdsourcing to predict world events.  Check out their leadership team.  This is no joke.

I like both of these projects quite a bit. So why should I provide yet another public policy survey?

My goal is to harness the expertise of a group of researchers active on a particular topic. To be honest, I am not sure how much this will differ from the TRIP survey or the Good Judgment Project results. Yes, Tetlock, I have read your work.

But why not try?

The worst case scenario is that I look silly. This happens all of the time. The best case scenario is that this is a great idea and someone, or some group, better trained than me actually does it right.

Maybe there is no upside.  Similar to all of my NSF grant proposals.

Here is the plan:

  • I have constructed a draft Qualtrics survey here: http://tinyurl.com/lv4rzba
  • International Interactions did a quick check of their reviewer pool on the topic and we can easily find 50 researchers to survey.
  • I refine the survey and we send it out.
  • We collect the responses and post them.
  • Nobody reads this post or the results.

Feel free to take the survey for kicks.  To be clear, this survey isn’t fielded yet.  Your answers are just for my amusement.

Public Policy Survey of Political Scientists: A Proposal

I have been quietly pitching the idea of establishing a survey of political scientists modeled after the University of Chicago Booth School’s IGM Panel. In the IGM survey, an established panel of top economists is asked one or two survey questions on topics like the impact of the minimum wage on unemployment, the economic returns to infrastructure spending, and the predicted economic impact of Scottish independence. See here for their latest survey.

My idea is slightly different. Rather than surveying the same group of scholars, why not ask different experts based on the topic? This allows us to harness the power of specialized knowledge and to include a much wider set of individuals to weigh in on topics.

The problem with this sort of proposal is that I am sure it will come under fire over how I select questions and how I establish a panel.

Rather than debate this, why not just give it a try?

So here is my proposal. First, I am going to select questions, actually statements, with some feedback via this website or email.

Here is my first cut. How much do you agree or disagree with the following statements?

Statement 1: Effectiveness of airstrikes on ISIS

The current airstrikes on ISIS will roll back their advances in the region.

  1. Strongly agree
  2. Agree
  3. Neither agree nor disagree
  4. Disagree
  5. Strongly disagree
  6. No opinion

Statement 2: Support for Strikes

I support the President’s decision to strike ISIS.

  1. Strongly agree
  2. Agree
  3. Neither agree nor disagree
  4. Disagree
  5. Strongly disagree
  6. No opinion

Statement 3: Public Support for Continued Strikes

If the airstrikes against ISIS continue beyond two years, the public will not support continued action against ISIS.

  1. Strongly agree
  2. Agree
  3. Neither agree nor disagree
  4. Disagree
  5. Strongly disagree
  6. No opinion

Second, the editors at the journal International Interactions have agreed to help me identify scholars on each topic, similar to how they would find reviewers for research articles. Together, we will then ask potential panelists whether they would be willing to be part of this project and provide answers.

There are obviously other models, but I’d like to at least give this one a try. Send me an email if you have suggestions.

Update: I’ve already received some feedback on question wording. Minor changes incorporated.

Updates by Nate

Blogging has been light for a variety of reasons.  A few updates.

First, my International Studies Quarterly article on how voters respond to incentives just came out.  See here (gated).

Second, I am presenting work at the Kansas Policy Institute conference on Wichita’s proposed 1% sales tax hike.  What do I know about this?  I know there is a proposal to create a jobs program that includes incentives.  Here is the program.

I think there will be some fireworks at this conference.  More to come on this very, very soon!

My Op-Ed in the Kansas City Star on Investment Incentives

A few months ago I wrote an op-ed for the Kansas City Star on my research on the Kansas City “border war.” I did some analysis of the main Kansas economic incentive program and found that firms that were given subsidies didn’t create more jobs than a control group of firms that didn’t receive subsidies.

My op-ed took months to publish, got stuck in a local section of the paper, and they changed the title. My original title was “Should You Pay $25 Million for 45 Fewer Jobs?”

They could have just said “no thank you”.

Anyway.  Here it is.

 

Previous Posts on the Academic Job Market

I am relaunching my blog and thought I would repost some previous blogs by topic. I took part in two blog exchanges on the academic job market, in 2012 and 2013, that are hopefully still relevant.

First, I blogged on job market candidates during the 2012-2013 political science job market. As the Director of Graduate Studies at Washington University, I had a group of undergrad RAs collect data on candidates from some of the top programs. Posts are here, here, here, here, and here.

I had the students collect this data because it was an easy way to get a snapshot of the job market candidates. Don’t interpret this data collection as a belief that it simulates how academic search committees actually make decisions, though I did interview a few search chairs as part of the process. I am considering a follow-up survey of search chairs. We’ll see.

Second, there was a little exchange on academic job talks that included a lot of interesting posts by political scientists. This exchange is less candidate-centered and more about how much weight committees and departments should give to job talks. See here, here, and here.

Going back through these posts, I wonder why I defended the job talk. I think many departments weight it too heavily, especially for senior hires.