co Party Fundraising Success Continues Through Mid-Year By webfeeds.brookings.edu Published On :: Mon, 02 Aug 2004 00:00:00 -0400 With only a few months remaining before the 2004 elections, national party committees continue to demonstrate financial strength and noteworthy success in adapting to the more stringent fundraising rules imposed by the Bipartisan Campaign Reform Act (BCRA). A number of factors, including the deep partisan divide in the electorate, the expectations of a close presidential race, and the growing competition in key Senate and House races, have combined with recent party investments in new technology and the emergence of the Internet as a major fundraising tool to produce what one party chairman has described as a "perfect storm" for party fundraising.[1] Consequently, both national parties have exceeded the mid-year fundraising totals achieved in 2000, and both approach the general election with substantial amounts of money in the bank. After eighteen months of experience under the new rules, the national parties are still outpacing their fundraising efforts of four years ago. As of June 30, the national parties have raised $611.1 million in federally regulated hard money alone, as compared to $535.6 million in hard and soft money combined at a similar point in the 2000 election cycle. The Republicans lead the way, taking in more than $381 million as compared to about $309 million in hard and soft money by the end of June in 2000. The Democrats have also raised more, bringing in $230 million as compared to about $227 million in hard and soft money four years ago. 
Furthermore, with six months remaining in the election cycle, both national parties have already raised more hard money than they did in the 2000 election cycle.[2] In fact, by the end of June, every one of the Democratic and Republican national party committees had already exceeded its hard money total for the entire 2000 campaign.[3] This surge in hard money fundraising has allowed the national party committees to replace a substantial portion of the revenues they previously received through unlimited soft money contributions. Through June, these committees have already taken in enough additional hard money to compensate for the $254 million of soft money that they had garnered by this point in 2000, which represented a little more than half of their $495 million in total soft money receipts in the 2000 election cycle. View the accompanying data tables (PDF - 11.4 KB) [1] Terrence McAuliffe, Democratic National Committee Chairman, quoted in Paul Fahri, "Small Donors Grow Into Big Political Force," Washington Post, May 3, 2004, p. A11. [2] In 2000, the Republican national party committees raised $361.6 million in hard money, while the Democratic national committees raised $212.9 million. These figures are based on unadjusted data and do not take into account any transfers of funds that may have taken place among the national party committees. [3] The election cycle totals for 2000 can be found in Federal Election Commission, "FEC Reports Increase in Party Fundraising for 2000," press release, May 15, 2001. Available at http://www.fec.gov/press/press2001/051501partyfund/051501partyfund.html (viewed July 28, 2004). Downloads: Data Tables. Authors: Anthony Corrado. Full Article
co Artificial Intelligence Won’t Save Us From Coronavirus By webfeeds.brookings.edu Published On :: Sun, 26 Apr 2020 20:58:02 +0000 Full Article
co Navigating the US-China 5G competition By webfeeds.brookings.edu Published On :: Mon, 27 Apr 2020 13:45:44 +0000 Executive summary: The United States and China are in a race to deploy fifth-generation, or 5G, wireless networks, and the country that dominates will lead in standard-setting, patents, and the global supply chain. While some analysts suggest that the Chinese government appears to be on a sprint to achieve nationwide 5G, U.S. government leaders and… Full Article
co Webinar: Telehealth before and after COVID-19 By webfeeds.brookings.edu Published On :: Mon, 27 Apr 2020 14:35:44 +0000 The coronavirus outbreak has generated an immediate need for telehealth services to prevent further infections in the delivery of health care. Before the global pandemic, federal and state regulations around reimbursement and licensure requirements limited the use of telehealth. Private insurance programs and Medicaid have historically excluded telehealth from their coverage, and state parity laws… Full Article
co COVID-19 has taught us the internet is critical and needs public interest oversight By webfeeds.brookings.edu Published On :: Wed, 29 Apr 2020 17:50:42 +0000 The COVID-19 pandemic has graphically illustrated the importance of digital networks and service platforms. Imagine the shelter-in-place reality we would have experienced at the beginning of the 21st century, only two decades ago: a slow internet and (because of that) nothing like Zoom or Netflix. Digital networks that deliver the internet to our homes, and… Full Article
co COVID-19 trends from Germany show different impacts by gender and age By webfeeds.brookings.edu Published On :: Fri, 01 May 2020 15:41:03 +0000 The world is in the midst of a global pandemic and all countries have been impacted significantly. In Europe, the most successful policy response to the pandemic has been Germany's, as measured by the decline in new COVID-19 cases in recent weeks and the consistent increase in recovered cases. This is also reflected in the… Full Article
co Removing regulatory barriers to telehealth before and after COVID-19 By webfeeds.brookings.edu Published On :: Wed, 06 May 2020 16:00:55 +0000 Introduction A combination of escalating costs, an aging population, and rising chronic health-care conditions that account for 75% of the nation’s health-care costs paints a bleak picture of the current state of American health care.[1] In 2018, national health expenditures grew to $3.6 trillion and accounted for 17.7% of GDP.[2] Under current laws, national health… Full Article
co How to increase financial support during COVID-19 by investing in worker training By webfeeds.brookings.edu Published On :: Wed, 06 May 2020 17:46:07 +0000 It took just two weeks to exhaust one of the largest bailout packages in American history. Even the most generous financial support has limits in a recession. However, I am optimistic that a pandemic-fueled recession and mass underemployment could be an important opportunity to upskill the American workforce through loans for vocational training. Financially supporting… Full Article
co Why France? Understanding terrorism’s many (and complicated) causes By webfeeds.brookings.edu Published On :: Mon, 30 Nov -0001 00:00:00 +0000 The terrible attack in Nice on July 14—Bastille Day—saddened us all. For a country that has done so much historically to promote democracy and human rights at home and abroad, France is paying a terrible and unfair price, even more than most countries. This attack will again raise the question: Why France? Full Article Uncategorized
co France needs its own National Counterterrorism Center By webfeeds.brookings.edu Published On :: Mon, 30 Nov -0001 00:00:00 +0000 The horrific attack in Nice last week underscores the acute terrorist threat France is facing, writes Bruce Riedel. The French parliamentary recommendation to create a French version of the National Counterterrorism Center is a smart idea that Paris should implement. Full Article Uncategorized
co Turkey after the coup attempt By webfeeds.brookings.edu Published On :: Mon, 30 Nov -0001 00:00:00 +0000 On July 20, the Foreign Policy program at Brookings will host a panel discussion to consider the domestic and international consequences of the coup attempt in Turkey. Full Article
co What to do when containing the Syrian crisis has failed By webfeeds.brookings.edu Published On :: Mon, 01 Aug 2016 09:30:47 +0000 Attacks across the Western world—including most recently in Nice, but also of course in Brussels, Paris, San Bernardino, and elsewhere—highlight the growing threat from extremism, with Syria as its home base. It’s time to recognize, therefore, that containment of the Syria crisis (which I think is essentially President Obama’s policy and which many in the […] Full Article
co Congo’s political crisis: What is the way forward? By webfeeds.brookings.edu Published On :: Thu, 04 Aug 2016 16:09:16 +0000 On August 15, the Africa Security Initiative, part of the Brookings Center for 21st Century Security and Intelligence, will host an event focused on Congo and the broader region. Full Article
co The Marketplace of Democracy: Electoral Competition and American Politics By webfeeds.brookings.edu Published On :: Fri, 01 Sep 2006 00:00:00 -0400 Brookings Institution Press and Cato Institute 2006 312pp. Since 1998, U.S. House incumbents have won a staggering 98 percent of their reelection races. Electoral competition is also low and in decline in most state and primary elections. The Marketplace of Democracy combines the resources of two eminent research organizations, the Brookings Institution and the Cato Institute, to address the startling lack of competition in our democratic system. The contributors consider the historical development, legal background, and political aspects of a system that is supposed to be responsive and accountable, yet for many is becoming stagnant, self-perpetuating, and tone-deaf. How did we get to this point, and what, if anything, should be done about it? In The Marketplace of Democracy, top-tier political scholars also investigate the perceived lack of competition in arenas only previously speculated on, such as state legislative contests and congressional primaries. Michael McDonald, John Samples, and their colleagues analyze previous reform efforts such as direct primaries and term limits, and the effects they have had on electoral competition. They also examine current reform efforts in redistricting and campaign finance regulation, as well as the impact of third parties. In sum, what does all this tell us about what might be done to increase electoral competition? Elections are the vehicles through which Americans choose who governs them, and the power of the ballot enables ordinary citizens to keep public officials accountable. This volume considers different policy options for increasing the competition needed to keep American politics vibrant, responsive, and democratic. 
Brookings Forum: "The Marketplace of Democracy: A Groundbreaking Survey Explores Voter Attitudes About Electoral Competition and American Politics," October 27, 2006. Podcast: "The Marketplace of Democracy: Electoral Competition and American Politics," a Capitol Hill briefing featuring Michael McDonald and John Samples, September 22, 2006. Contributors: Stephen Ansolabehere (Massachusetts Institute of Technology), William D. Berry (Florida State University), Bruce Cain (University of California-Berkeley), Thomas M. Carsey (Florida State University), James G. Gimpel (University of Maryland), Tim Groseclose (University of California-Los Angeles), John Hanley (University of California-Berkeley), John Mark Hansen (University of Chicago), Paul S. Herrnson (University of Maryland), Shigeo Hirano (Columbia University), Gary C. Jacobson (University of California-San Diego), Thad Kousser (University of California-San Diego), Frances E. Lee (University of Maryland), John C. Matsusaka (University of Southern California), Kenneth R. Mayer (University of Wisconsin-Madison), Michael P. McDonald (Brookings Institution and George Mason University), Jeffrey Milyo (University of Missouri-Columbia), Richard G. Niemi (University of Rochester), Nathaniel Persily (University of Pennsylvania Law School), Lynda W. Powell (University of Rochester), David Primo (University of Rochester), John Samples (Cato Institute), James M. Snyder Jr. (Massachusetts Institute of Technology), Timothy Werner (University of Wisconsin-Madison), and Amanda Williams (University of Wisconsin-Madison). ABOUT THE EDITORS John Samples John Samples directs the Center for Representative Government at the Cato Institute and teaches political science at Johns Hopkins University. Michael P. McDonald Downloads Sample Chapter Ordering Information: 978-0-8157-5579-1, $24.95; 978-0-8157-5580-7, $54.95. Full Article
co The Marketplace of Democracy: A Groundbreaking Survey Explores Voter Attitudes About Electoral Competition and American Politics By webfeeds.brookings.edu Published On :: Fri, 27 Oct 2006 10:00:00 -0400 Event Information October 27, 2006, 10:00 AM - 12:00 PM EDT, Falk Auditorium, The Brookings Institution, 1775 Massachusetts Ave., NW, Washington, DC. Register for the Event. Despite the attention on the mid-term races, few elections are competitive. Electoral competition, already low at the national level, is in decline in state and primary elections as well. Reformers, who point to gerrymandering and a host of other targets for change, argue that improving competition will produce voters who are more interested in elections, better-informed on issues, and more likely to turn out to the polls. On October 27, the Brookings Institution—in conjunction with the Cato Institute and The Pew Research Center—presented a discussion and a groundbreaking survey exploring the attitudes and opinions of voters in competitive and noncompetitive congressional districts. The survey, part of Pew's regular polling on voter attitudes, was conducted through the weekend of October 21. A series of questions explored the public's perceptions, knowledge, and opinions about electoral competitiveness. The discussion also explored a publication that addresses the startling lack of competition in our democratic system. The Marketplace of Democracy: Electoral Competition and American Politics (Brookings, 2006) considers the historical development, legal background, and political aspects of a system that is supposed to be responsive and accountable, yet for many is becoming stagnant, self-perpetuating, and tone-deaf. Michael McDonald, editor and Brookings visiting fellow, moderated a discussion among co-editor John Samples, director of the Center for Representative Government at the Cato Institute, and Andrew Kohut and Scott Keeter from The Pew Research Center, who also discussed the survey. 
Transcript (.pdf) Event Materials Full Article
co The Competitive Problem of Voter Turnout By webfeeds.brookings.edu Published On :: Tue, 31 Oct 2006 00:00:00 -0500 On November 7, millions of Americans will exercise their civic duty to vote. At stake will be control of the House and Senate, not to mention the success of individual candidates running for office. President Bush's "stay the course" agenda will either be enabled over the next two years by a Republican Congress or knocked off kilter by a Democratic one. With so much at stake, it is not surprising that the Pew Research Center found that 51 percent of registered voters have given a lot of thought to this November's election. This is higher than in any other recent midterm election, including the 44 percent in 1994, the year Republicans took control of the House. If so, turnout should better the 1994 rate of 41 percent of eligible voters. There is good reason to suspect that despite the high interest, turnout will not exceed 1994. The problem is that a national poll is, well, a national poll, and does not measure attitudes of voters within states and districts. People vote when there is a reason to do so. Republican and Democratic agendas are in stark contrast on important issues, but voters also need to believe that their vote will matter in deciding who will represent them. It is here that the American electoral system is broken for many voters. Voters have little choice in most elections. In 1994, Congressional Quarterly rated 98 House elections as competitive. Today, they list 51. To put it another way, we are already fairly confident of the winner in nearly 90 percent of House races. Although there is no similar tracking for state legislative offices, we know that the number of elections won by less than 60 percent of the vote has fallen since 1994. The real damage to the national turnout rate is in the large states of California and New York, which together account for 17 percent of the country's eligible voters. 
Neither state has a competitive Senate or governor's election, and few competitive House or state legislative races. Compare that to 1994, when Californians participated in competitive Senate and governor's races and the state's turnout was 5 percentage points above the national rate. The same year, New York's competitive governor's race helped boost turnout a point above the national rate. Lacking stimulation from two of the largest states, turnout boosts will have to come from elsewhere. Texas has an interesting four-way governor's race that might draw infrequent voters to the polls. Ohio's competitive Senate race and some House races might also draw voters. However, in other large states like Florida, Illinois, Michigan and Pennsylvania, turnout will suffer from largely uncompetitive statewide races. The national turnout rate will likely be less than 1994's and fall shy of 40 percent. This is not to say that turnout will be poor everywhere. Energized voters in Connecticut get to vote in an interesting Senate race, and three of five Connecticut House seats are up for grabs. The problem is that turnout will be localized in these few areas of competition. The fault does not lie with the voters; people's lives are busy, and a rational person will abstain when their vote does not matter to the election outcome. The political parties also are sensitive to competition and focus their limited resources where elections are competitive. Television advertising and other mobilizing efforts by campaigns will only be found in competitive races. The old adage of "build it and they will come" is relevant. All but hardcore sports fans tune out a blowout. Building competitive elections -- and giving voters real choices -- will do much to increase voter turnout in American politics. 
There are a number of reforms on the table: redistricting to create competitive districts, campaign financing to give candidates equal resources, and even altering the electoral system to fundamentally change how a vote elects representatives. If voters want choice and a government more responsive to their needs, they should consider how these seemingly arcane election procedures have real consequences for motivating them to perform the most fundamental democratic act: voting. Authors Michael P. McDonald Publication: washingtonpost.com Full Article
co Collapsible Candidates from Iowa to New Hampshire By webfeeds.brookings.edu Published On :: Wed, 09 Jan 2008 12:00:00 -0500 After his first place finish in Iowa, which was supposed to propel him to a New Hampshire victory, “change” is probably a word Barack Obama does not like as much anymore. But his support did not really change much between these two elections. He won 38 percent of Iowa’s delegates and 36 percent of New Hampshire’s vote. It was Hillary Clinton and John McCain who were the big change candidates. What happens when a presidential candidate who does well in one primary or caucus state does not do so well in the next? The dynamic of the presidential election can swiftly and stunningly change, as it did in New Hampshire on Tuesday. How Barack Obama wishes John Edwards had shown up in New Hampshire. Edwards was awarded 30 percent of Iowa’s delegates, barely denying Clinton a second place finish. He finished a distant third in New Hampshire, receiving only 17 percent of the vote. There are strong indications that a shift among his supporters helped propel Hillary Clinton to her New Hampshire victory. According to the exit polls, Edwards did 8 percentage points worse in New Hampshire among women, while Clinton did 16 percentage points better. Obama’s support was virtually identical, dropping a statistically insignificant 1 percentage point. Obama’s support among young people remained strong, even increasing slightly among 18-24 and 30-39 year olds. Clinton’s support remained strong and slightly increased among those 65 and older. Edwards won Iowa’s middle-aged voters, age 40-64, but it was Clinton who decisively won this coveted age demographic in New Hampshire. And where these people were 38 percent of Iowa caucus attendees, they were 54 percent of New Hampshire voters. (To understand why their turnout increased, see my analysis of Iowa’s turnout.) 
Moving forward, the generational war is still a strong dynamic in the Democratic race, as evident in the candidates’ speech styles following the election results. In Iowa, Clinton was flanked by the ghosts of the Clinton administration. In New Hampshire, she shared the stage with a sea of young voters. In Iowa, Obama spoke of change, a message that resonates with younger people who are not part of the establishment. In New Hampshire his slogan was a message that echoes the can-do spirit of the greatest generation, “Yes, we can!” In the days between Iowa and New Hampshire, Edwards spoke about how he wanted the election to become a two-way race. One should be careful with what one wishes for. Edwards and Clinton are vying for the same support base, that when united can defeat Obama, at least in New Hampshire. In the short-term, Obama most needs Edwards to do better so that support can continue to be divided. Among Republicans, John McCain recreated his magic of eight years ago and bounced back strong from a poor Iowa showing to win New Hampshire. The Iowa and New Hampshire electorates are so different it is difficult to compare them. In Iowa, Evangelical Christians were 60 percent of the electorate, while in New Hampshire, they were only 23 percent. Mike Huckabee’s move from first in Iowa to third in New Hampshire can be clearly attributed to the shrinking of his base. His collapse paved the way for a new winner to emerge. It is thus tempting to attribute McCain’s victory solely to the different electorates, but he still had to defeat Mitt Romney to win New Hampshire. According to the exit polls, the battle between McCain and Romney is a referendum on the Bush administration. Surprisingly, McCain, who has tried to rebuild bridges with the Bush establishment since his defeat in the 2000 presidential election, is still seen as the outsider and agent of change by voters participating in the Republican nomination process. 
In both Iowa and New Hampshire, McCain drew his support from those who said they are angry or dissatisfied with the Bush administration. Romney drew his support from those who said they are enthusiastic or satisfied. Not surprisingly, McCain is also drawing more support from self-described Independents and Romney from Republicans. The candidates seem to understand this dynamic, too, as they gave their speeches following the election results. In a contrived bit of acting, Romney showed up on stage without a podium and shoved a prepared speech back into his pocket (if he had needed a podium, his advance team would have provided it). He appeared relaxed, delivering his speech in a personable style reminiscent of Huckabee, who is competing with Romney for those who support Bush. But he also seemed to be reaching out to Independents with a message of change. In stark contrast, McCain delivered a carefully written, almost sedate speech designed to reassure Republicans of his conservative credentials. This three-way dynamic among Huckabee, McCain, and Romney should prove fascinating as the Republican nomination process moves forward. Where Evangelicals are strong, Huckabee should do well. Where they are not, the rules governing whether Independents can participate will dictate how McCain and Romney do. And we have yet to see regional candidates like Fred Thompson have their day in the sun. And then there is Rudy Giuliani, who is lying in wait in the larger states where his name recognition should give him a significant boost over the other candidates. All of this points to an extended campaign among Republicans. Michael P. McDonald is an Associate Professor at George Mason University and a Non-Resident Senior Fellow at the Brookings Institution. He studies voter turnout and is a consultant to the national exit poll organization. Authors Michael P. McDonald Full Article
co Midterm Elections 2010: Driving Forces, Likely Outcomes, Possible Consequences By webfeeds.brookings.edu Published On :: Mon, 04 Oct 2010 09:30:00 -0400 Event Information October 4, 2010, 9:30 AM - 11:30 AM EDT, Falk Auditorium, The Brookings Institution, 1775 Massachusetts Ave., NW, Washington, DC. As the recent primary in Delaware attests, this year's midterm elections continue to offer unexpected twists and raise large questions. Will the Republicans take over the House and possibly the Senate? Or has the Republican wave ebbed? What role will President Obama play in rallying seemingly dispirited Democrats -- and what effect will reaction to the sluggish economy have in rallying Republicans? Is the Tea Party more an asset or a liability to the G.O.P.'s hopes? What effect will the inevitably narrowed partisan majorities have in the last two years of Obama's first term? And how will contests for governorships and state legislatures around the nation affect redistricting and the shape of politics to come? On October 4, a panel of Brookings Governance Studies scholars, moderated by Senior Fellow E.J. Dionne, Jr., attempted to answer these questions. Senior Fellow Thomas Mann provided an overview. Senior Fellow Sarah Binder discussed congressional dynamics under shrunken majorities or divided government. Senior Fellow William Galston offered his views on the administration’s policy prospects during the 112th Congress. Nonresident Senior Fellow Michael McDonald addressed electoral reapportionment and redistricting around the country. Video Partisan Gridlock post-Elections? GOP Influence over Redistricting, Reapportionment Working Within Divided Government Good Conditions for GOP in 2010 Midterms Audio Midterm Elections 2010: Driving Forces, Likely Outcomes, Possible Consequences Transcript Uncorrected Transcript (.pdf) Event Materials 20101004_midterm_elections Full Article
co @ Brookings Podcast: The Politics and Process of Congressional Redistricting By webfeeds.brookings.edu Published On :: Fri, 28 Jan 2011 11:22:00 -0500 Now that the 2010 Census is concluded, states will begin the process of reapportionment—re-drawing voting district lines to account for population shifts. Nonresident Senior Fellow Michael McDonald says redistricting has been fraught with controversy and corruption since the nation’s early days, when the first “gerrymandered” district was drawn. Two states—Arizona and California—have instituted redistricting commissions intended to insulate the process from political shenanigans, but politicians everywhere will continue to work the system to gain electoral advantage and the best chance of re-election for themselves and their parties. Subscribe to audio and video podcasts of Brookings events and policy research » Video States Attempt to Reform Redistricting Audio @ Brookings Podcast: The Politics and Process of Congressional Redistricting Full Article
co A Status Report on Congressional Redistricting By webfeeds.brookings.edu Published On :: Mon, 18 Jul 2011 10:00:00 -0400 Event Information July 18, 2011, 10:00 AM - 11:30 AM EDT, Falk Auditorium, The Brookings Institution, 1775 Massachusetts Ave., NW, Washington, DC. Register for the Event. Full video archive of this event is also available via C-SPAN here. The drawing of legislative district boundaries is arguably among the most self-interested and least transparent systems in American democracy. Every ten years redistricting authorities, usually state legislatures, redraw congressional and legislative lines in accordance with Census reapportionment and population shifts within states. Most state redistricting authorities are in the midst of their redistricting process, while others have already finished redrawing their state and congressional boundaries. A number of initiatives—from public mapping competitions to independent shadow commissions—have been launched to open up the process to the public during this round of redrawing district lines. On July 18, Brookings hosted a panel of experts to review the results coming in from the states and discuss how the rest of the process is likely to unfold. Panelists focused on evidence of partisan or bipartisan gerrymandering, the outcome of transparency and public mapping initiatives, and minority redistricting. After the panel discussion, participants took audience questions. Video Full Event Video Archive Audio A Status Report on Congressional Redistricting Transcript Uncorrected Transcript (.pdf) Event Materials 20110718_congressional_redistricting Full Article
co Social Security Smörgåsbord? Lessons from Sweden’s Individual Pension Accounts By webfeeds.brookings.edu Published On :: President Bush has proposed adding optional personal accounts as one of the central elements of a major Social Security reform proposal. Although many details remain to be worked out, the proposal would allow individuals who choose to do so to divert part of the money they currently pay in Social Security taxes into individual investment… Full Article
co Target Compliance: The Final Frontier of Policy Implementation By webfeeds.brookings.edu Published On :: Abstract Surprisingly little theoretical attention has been devoted to the final step of the public policy implementation chain: understanding why the targets of public policies do or do not “comply” — that is, behave in ways that are consistent with the objectives of the policy. This paper focuses on why program “targets” frequently fail to… Full Article
co The Collapse of Canada? By webfeeds.brookings.edu Published On :: America's northern neighbor faces a severe constitutional crisis. Unprecedented levels of public support for sovereignty in the predominantly French-speaking province of Quebec could lead to the breakup of Canada. This crisis was precipitated by two Canadian provinces' failure in 1990 to ratify the Meech Lake Accord, a package of revisions to Canada's constitution that addressed… Full Article
co The Study of the Distributional Outcomes of Innovation: A Book Review By webfeeds.brookings.edu Published On :: Mon, 05 Jan 2015 07:30:00 -0500 Editor's Note: This post is an extended version of a previous post. Cozzens, Susan and Dhanaraj Thakur (Eds). 2014. Innovation and Inequality: Emerging technologies in an unequal world. Northampton, Massachusetts: Edward Elgar. Historically, the debate on innovation has focused on the determinants of the pace of innovation—on the premise that innovation is the driver of long-term economic growth. Analysts and policymakers have taken less interest in how innovation-based growth affects income distribution. Even less attention has been paid to the question of how innovation affects other forms of inequality, such as economic opportunity, social mobility, access to education, healthcare, and legal representation, or inequalities in exposure to insalubrious environments, whether physical (through exposure to polluted air, water, food or harmful work conditions) or social (neighborhoods ridden with violence and crime). The relation between innovation, equal political representation, and the right of people to have a say in the collective decisions that affect their lives can also be added to the list of neglected topics. But neglect has not been universal. A small but growing group of analysts have been working for at least three decades to produce a more careful picture of the relationship between innovation and the economy. A distinguished vanguard of this group has recently published a collection of case studies that illuminates our understanding of innovation and inequality—which is the title of the book. The book is edited by Susan Cozzens and Dhanaraj Thakur. Cozzens is a professor in the School of Public Policy and Vice Provost of Academic Affairs at Georgia Tech. She was studying innovation and inequality long before inequality was a hot topic, and she led the group that collaborated on this book. 
Thakur is a faculty member of the College of Public Service and Urban Affairs at Tennessee State University (while writing the book he taught at the University of the West Indies in Jamaica). He is an original and sensible voice in the study of the social dimensions of communication technologies. We'd like to highlight here three aspects of the book: the research design, the empirical focus, and the conceptual framework developed from the case studies in the book. Edited volumes are all too often a collection of disparate papers, but not in this case. This book is patently the product of a research design that probes the evolution of a set of technologies across a wide variety of national settings and, at the same time, examines the different reactions to new technologies within specific countries. The second part of the book devotes five chapters to the study of five emerging technologies—recombinant insulin, genetically modified corn, mobile phones, open-source software, and tissue culture—observing the contrasts and similarities of their evolution in different national environments. In turn, part three considers the experience of eight countries, four of high income—Canada, Germany, Malta, and the U.S.—and four of medium or low income—Argentina, Costa Rica, Jamaica, and Mozambique. The stories in part three tell how these countries assimilated these diverse technologies into their economies and policy environments. The second aspect to highlight is the deliberate choice of elements for empirical focus. First, the object of inquiry is not all of technology but a discrete set of emerging technologies, a choice that gains a specificity that would be negated by handling the unwieldy concept of "technology" broadly construed. At the same time, this choice reveals the policy orientation of the book, because these new entrants have just started to shape the socio-technical spaces they inhabit, while the spaces of older technologies have likely ossified. 
Second, the study offers ample variance in the jurisdictions under study, i.e., countries of all income levels, a decision that makes theory construction more difficult but the test of general premises more robust.[i] We can add that the book avoids sweeping generalizations. Third, the authors focus on technological projects and their champions, a choice that increases the rigor of the empirical analysis. This choice naturally narrows the space of generality, but the lessons are more precise and the conjectures are presented with corresponding modesty. The combination of a solid design and a clear empirical focus allows the reader to obtain a sense of general insight from the cases taken together that could not be derived from any individual case standing alone. Economic and technology historians have tackled the effects of technological advancement, from the steam engine to the Internet, but those lessons are not easily applicable to the present because emerging technologies intimate a different kind of reconfiguration of economic and social structures. It is still too early to know the long-term effects of new technologies like genetically modified crops or mobile-phone cash transfers, but this book does a good job of providing useful concepts that begin to form an analytical framework. In addition, the mix of country case studies subverts the disciplinary separation between the economics of innovation (devoted mostly to high-income countries) and development studies (interested in middle- and low-income economies). As a consequence of these selections, the reader can draw lessons that are likely to apply to technologies and countries other than the ones discussed in this book. The third aspect we would like to underscore in this review is the conceptual framework. Cozzens, Thakur, and their colleagues have done a service to anyone interested in pursuing the empirical and theoretical analysis of innovation and inequality. 
For these authors, income distribution is only one part of the puzzle. They observe that inequalities are also part of social, ethnic, and gender cleavages in society. Frances Stewart, of Oxford University, introduced the notion of horizontal inequalities, or inequalities at the social-group level (for instance, across ethnic groups or genders). She developed the concept in contrast to vertical inequalities, or inequalities operating at the individual level (such as household income or wealth). The authors of this book borrow Stewart's concept, pay attention to horizontal inequalities in the technologies they examine, and observe that new technologies enter marketplaces that are already configured by historical forms of exclusion. A dramatic example is the lack of access to recombinant insulin in the U.S., because it is expensive and minorities are less likely to have health insurance (see Table 3.1 on p. 80).[ii] Another example is how innovation opens opportunities for entrepreneurs but closes them for women in cultures that systematically exclude women from entrepreneurial activities. Another key concept is that of complementary assets. A poignant example is the failure of recombinant insulin to reach poor patients in Mozambique, who are sent home with old medicine even though insulin is subsidized by the government. The reason doctors deny the poor the new treatment is that these patients lack the literacy and household resources (e.g., a refrigerator, a clock) necessary to preserve the shots, inject themselves periodically, and read blood sugar levels. Technologies aimed at fighting poverty require complementary assets to be already in place; in their absence, they fail to mitigate suffering, let alone ameliorate inequality. Another illustration of the importance of complementary assets is given by the case of open-source software. 
This technology has a nominal price of zero; however, only individuals who have computers and the time, disposition, and resources to learn how to use open-source operating systems can benefit. Likewise, companies without the internal resources to adapt open-source software will not adopt it and will remain economically tied to proprietary software. These observations lead to two critical concepts elaborated in the book: distributional boundaries and inequalities across technological transitions. Distributional boundaries refer to the reach of the benefits of new technologies, boundaries that could be geographic (as in urban/suburban or center/periphery) or drawn across social cleavages or income levels. Standard models of technological diffusion assume the entire population will gradually adopt a new technology, but in reality, the authors observe, several factors intervene to limit the scope of diffusion to certain groups. The most insidious factors are monopolies that exercise sufficient control over markets to charge high prices. In these markets, the price becomes an exclusionary barrier to diffusion. This is quite evident in the case of mobile phones (see Table 5.1, p. 128), where monopolies (or oligopolies) have the market power to create and maintain a distributional boundary between post-pay, high-quality service for middle- and high-income clients and pre-pay, low-quality service for poor customers. This boundary renders pre-pay plans doubly regressive, because the per-minute rates are higher than post-pay and phone expenses represent a far larger share of poor people's income. Another example of exclusion occurs with GMOs: in some countries, subsistence farmers cannot afford the prices of engineered seeds, a disadvantage that compounds their cost and health problems as they have to use more and stronger pesticides. A technological transition, as used here, is an inflection point in the adoption of a technology that reshapes its distributional boundaries. 
When smart phones were introduced, a new market for second-hand or hand-me-down phones was created in Maputo; people who could not access the top technology got stuck with a sub-par system. By looking at tissue culture, the authors find that "whether it provides benefits to small farmers as well as large ones depends crucially on public interventions in the lower-income countries in our study" (p. 190). In fact, farmers in Costa Rica enjoy much better protections compared to those in Jamaica and Mozambique, because the governmental program created to support banana tissue culture was designed and implemented as an extension program aimed at disseminating know-how among small farmers, not exclusively among large multinational-owned farms. When the same technology was introduced in this different policy environment, the distributional boundaries were made much more extensive in Costa Rica. This is a book devoted to presenting the complexity of the innovation-inequality link. The authors are generous in their descriptions, punctilious in the analysis of their case studies, and cautious and measured in their conclusions. Readers who seek an overarching theory of inequality, a simple story, or a test of causality are bound to be disappointed. But those readers may find the highest reward from carefully reading all the case studies presented in this book, not only because of the edifying richness of the detail herein but also because they will be invited to rethink the proper way to understand and address the problem of inequality.[iii] [i] These are clearly spelled out: "we assumed that technologies, societies, and inequalities co-evolved; that technological projects are always inherently distributional; and that the distributional aspects of individual projects and portfolios of projects are open to choice." (p. 6) [ii] This problem has been somewhat mitigated since the Affordable Care Act entered into effect. [iii] Kevin Risser contributed to this posting. 
Authors Walter D. Valdivia Image Source: © Akhtar Soomro / Reuters Full Article
co NASA considers public values in its Asteroid Initiative By webfeeds.brookings.edu Published On :: Tue, 19 May 2015 07:30:00 -0400 NASA's Asteroid Initiative encompasses efforts for the human exploration of asteroids as well as the Asteroid Grand Challenge, which seeks to enhance asteroid detection capabilities and mitigate the threat asteroids pose to Earth. The human space flight portion of the initiative primarily includes the Asteroid Redirect Mission (ARM), a proposal to put an asteroid in orbit around the moon and send astronauts to it. The program originally contemplated two alternatives for closer study: capturing a small asteroid about 10 meters in diameter versus simply recovering a boulder from a much larger asteroid. Late in March, NASA offered an update on its plans. It has decided to retrieve a boulder from an asteroid near Earth's orbit—candidates are the asteroids 2008 EV5, Bennu, and Itokawa—and will place the boulder in lunar orbit for further study. This mission will help NASA develop a host of technical capabilities. For instance, Solar Electric Propulsion uses solar-generated electricity to ionize and accelerate propellant atoms—in the absence of gravity, even a modicum of force can alter the trajectory of a body in outer space. Another related capability under development is the gravity tractor, which is based on the notion that even the modest mass of a spacecraft can exert sufficient gravitational force on an asteroid to ever so slightly change its orbit. The ARM spacecraft's mass could be further increased by capturing a boulder from an asteroid that is steering clear of the Earth, enabling a test of how humans might counter asteroid threats in the future. Thus, NASA will have a second test of how to deflect near-Earth objects on a hazardous trajectory. The first test, implemented as part of the Deep Impact mission, was a kinetic impactor; that is, crashing a spacecraft into an approaching object to change its trajectory. 
The Asteroid Initiative is a partner of the agency's Near Earth Object Observation (NEOO) program. The goal of that program is to discover and monitor space objects traveling on trajectories that could pose the risk of hitting Earth with catastrophic effects, and to develop mitigation strategies. The capabilities developed by ARM could also support other NASA programs, such as the manned exploration of Mars. NEOO has recently enjoyed an uptick in public support. It was funded at about $4 million in the 1990s, and in 2010 it was allocated a paltry $6 million. But then a redirection of priorities—linked to the transition from the Bush to the Obama administration—increased funding for NEOO to about $20 million in 2012 and $40 million in 2014, and NASA is seeking $50 million for 2015. It is clear that NASA officials made a compelling case for the importance of NEOO; in fact, what they are asking for seems a quite modest amount if indeed asteroids pose an existential risk to life on Earth. At the same time, the instrumental importance of the program and the public funds devoted to it raise the question of whether taxpayers should have a say in the decisions NASA is making about how to proceed with the program. NASA has done something remarkable to help answer this question. Last November, NASA partnered with the ECAST (Expert and Citizen Assessment of Science and Technology) network to host a citizen forum assessing the Asteroid Initiative. ECAST is a consortium of science policy and advocacy organizations that specializes in citizen deliberations on science policy. The forum consisted of a dialogue with 100 citizens in Phoenix and Boston who learned more about the Asteroid Initiative and then commented on various aspects of the project. The participants, who were selected to approximate the demographics of the U.S. population, were asked to assess mitigation strategies to protect against asteroids. 
They were introduced to four strategies: civil defense, gravity tractor, kinetic impactor, and nuclear blast deflection. As part of the deliberations, they were asked to consider the two aforementioned approaches to performing ARM. A consensus emerged in favor of the boulder retrieval option, primarily because citizens thought that option offered better prospects for developing planetary defense technologies. This preference held despite the excitement of capturing a full asteroid, which could potentially have additional economic impacts. The participants showed interest in promoting the development of mitigation capabilities at least as much as they wanted to protect traditional NASA goals such as the advancement of science and space flight technology. This is not surprising, given that concerns about doomsday should reasonably take precedence over traditional research and exploration concerns. NASA could have decided to set ARM along the path of boulder retrieval exclusively on technical merits, but having conducted a citizen forum, the agency is now able to claim that this decision is also socially robust, which is to say, responsive to public values. In this manner, NASA has shown a promising method by which federal research agencies can increase their public accountability. In the same spirit of responsible research and innovation, a recent Brookings paper I authored with David Guston—a co-founder of ECAST—proposes a number of other innovative ways in which the innovation enterprise can be made more responsive to public values and social expectations. Kudos to NASA for being at the forefront of innovation in space exploration and public accountability. Authors Walter D. Valdivia Image Source: © Handout . / Reuters Full Article 
co Patent infringement suits have a reputational cost for universities By webfeeds.brookings.edu Published On :: Tue, 10 Nov 2015 07:30:00 -0500 
Universities cash handsome awards on infringement cases 
Last month, a jury found Apple Inc. guilty of infringing a patent of the University of Wisconsin-Madison (UW) and ordered the tech giant to pay $234 million. The university scored a big financial victory, but this hardly meant any gain for its good name. The plaintiffs argued successfully in court that Apple infringed their 1998 patent on a predictor circuit that greatly improved the efficiency of the microchips used in the popular iPhone 5s, 6, and 6 Plus. Apple first responded by challenging the validity of the patent, but the US Patent and Trademark Office ruled in favor of the university. Apple plans to appeal, but the appellate court is not likely to reverse the lower court's decision. This is not the first time this university has asserted its patent rights (UW sued Intel in 2008 over this exact same patent and reportedly settled for $110 million). Nor is this the first time universities in general have taken infringers to court. Prominent cases in recent memory include Boston University, which sued several companies for infringement of a patent on blue light-emitting diodes and settled out of court with most of them, and Carnegie Mellon, which was awarded $237 million by the federal appellate court in its infringement suit against Marvell, a semiconductor company, for its use of an enhanced detector of data in hard drives called Kavcic detectors. 
Means not always aligned with aims in patent law 
When university inventions emerge from federal research grants, universities can also sue infringers, but in those cases they would be testing the accepted interpretations of current patent law. 
The Bayh-Dole Act of 1980 extended patent law and gave small businesses and universities the right to take title to patents arising from federal grants—later the act was amended to extend that right to all federal grantees regardless of size. The ostensible aim of the act is "to promote the utilization of inventions arising from federally supported research or development." Under the law, a condition for universities to keep their exclusive rights to those patents is that they or their licensees take "effective steps to achieve practical application" of those patents. Bayh-Dole was not designed to create a new source of revenue for universities. If companies are effectively using university technologies, Bayh-Dole's purpose is served without need of the patents. To understand this point, consider a counterfactual: What if the text of Bayh-Dole had been originally composed to grant a conditional right to patents for federal research grantees? The condition could be stated like this: "This policy seeks to promote the commercialization of federally funded research, and to this end it will use the patent system. Grantees may take title to patents if and only if other mechanisms for disseminating and developing those inventions into useful applications prove unsuccessful." Under this imagined text, universities could still take title to patents on their inventions if they or the U.S. Patent and Trademark Office were not aware that the technologies were being used in manufactured products. But no court would find their infringement claim meritorious if the accused companies could demonstrate that, absent willful infringement, they had in fact used the technologies covered by university patents in their commercial products. In this case, other mechanisms for disseminating and developing the technologies would have proven successful indeed. 
The reality that Bayh-Dole did not mandate such a contingent assignment of rights creates a contradiction between its aims and the means chosen to advance those aims for the subset of patents that were already in use by industry. I should clarify that the predictor circuit, the blue-light diode, and the Kavcic detectors are not in that subset of patents. But even if they were, there is no indication that the University of Wisconsin-Madison would have exercised its patent rights with any less vigor just because the original research was funded with public money. Today, universities are fully expected to aggressively assert their patent rights regardless of the source of funding for the original research. 
You can have an answer for every question and still lose the debate 
It is this litigious attitude that puts off many observers. While the law may very well allow universities to be litigious, universities could still refuse to exercise their rights under circumstances in which those rights are not easily reconciled with the public mission of the university. University administrators, tech transfer personnel, and particularly the legal teams winning infringement cases have legitimate reasons to wonder why universities are publicly scorned. After all, they are acting within the law and simply protecting their patent rights; they are doing what any rational person would do. They may be genuinely surprised when critics accuse universities of becoming allies of patent trolls, or of aiding and abetting their actions. Such accusations are unwarranted. Trolls are truants; universities are venerable institutions. Patent trolls exploit the ambiguities of patent law and the burdens of due process to their own benefit and to the detriment of truly productive businesses and persons. In stark contrast, universities are long-established partners of democracy, respected across ideological divides for their abundant contributions to society. 
The critics may not be fully considering the intricacies of patent law. Or they may forget that universities are in need of additional revenue—higher education has not seen public financial support increase in recent years, with federal grants roughly stagnant and state funding falling drastically in some states. Critics may also ignore that revenues collected from the licensing of patents, favorable court rulings, and out-of-court settlements are to a large extent (usually two thirds of the total) plugged back into the research enterprise. University attorneys may have an answer for every point that critics raise, but the overall concern of the critics should not be dismissed outright. Given that many if not most university patents can be traced back to research funded by tax dollars, there is a legitimate reason for observers to expect universities to manage their patents with a degree of restraint. There is also a legitimate reason for public disappointment when universities do not seem to endeavor to balance the tensions between their rights and duties. 
Substantive steps to improve the universities' public image 
Universities can become more responsive to public expectations about their character not only by promoting their good work, but also by taking substantive steps to correct misperceptions. First, when universities discover a case of proven infringement, they should take companies to court only as a measure of last resort. If a particular company refuses to negotiate in good faith and an infringement case ends up in court, the universities should be prepared to demonstrate to the court of public opinion that they have tried, with sufficient insistence and time, to negotiate a license and even made concessions in pricing the license. 
In the case of the predictor circuit patent, it seems that the University of Wisconsin-Madison tried to license the technology and Apple refused, but the university would be in a much better position if it could demonstrate that the licensing deals offered to Apple would have turned out to be far less expensive for the tech company. Second, universities would be well advised not to join any efforts to lobby Congress for stronger patent protection. At least two reasons substantiate this suggestion. First, as a matter of principle, the dogmatic belief that without patents there is no innovation is wrong. Second, as a matter of material interest, universities as a group do not have a financial interest in patenting. It's worth elaborating on these points a bit more. Neither historians nor social science researchers have settled the question of the net effect of patents on innovation. While there is evidence of social benefits from patent-based innovation, there is also evidence of social costs associated with patent monopolies, and even more evidence of momentous innovations that required no patents. What's more, the net social benefit varies across industries and over time. Research shows economic areas in which patents do spur innovation and economic sectors where they actually hinder it. This research explains, for instance, why some computer and Internet giants lobby Congress in the opposite direction from the biotech and big pharma industries. Rigorous industrial surveys of the 1980s and 1990s found that companies in most economic sectors did not use patents as their primary tool to protect their R&D investments. Yet patenting has increased rapidly over the past four decades. This increase includes industries that once were uninterested in patents. Economic analyses have shown that this new patenting is a business strategy against patent litigation: companies are building patent portfolios as a defensive strategy, not because they are innovating more. 
Universities' public position on patent policy should acknowledge that the debate on the impact of patents on innovation is not settled and that this impact cannot be observed in the aggregate but must be considered in the context of each specific economic sector, industry, or even market. From this vantage point, universities could then turn up or down the intensity with which they negotiate licenses and pursue compensation for infringement. Universities would better assert their commitment to their public mission if they computed, on a case-by-case basis, the balance between social benefits and costs for each of their controversial patents. As to the material interest in patents, it is understandable that some patent attorneys or the biotech lobby publicly espouse the dogma of patents: that there is no innovation without patents. After all, their livelihood depends on it. However, research universities as a group do not have any significant financial interest in stronger patent protection. As I have shown in a previous Brookings paper, the vast majority of research universities earn very little from their patent portfolios, and about 87% of tech transfer offices operate in the red. Universities as a group receive so little income from licensing and asserting their patents relative to the generous federal support they receive (below 3%) that if the federal government were to declare that grant reviewers should give preference to universities that do not patent, all research universities would stop the practice at once. It is true that a few universities (like the University of Wisconsin-Madison) raise significant revenue from their patent portfolios, and they will continue to do so regardless of public protestations. But the majority of universities do not have a material interest in patenting. 
Time to get it right on anti-troll legislation 
Last year, the House of Representatives passed legislation closing loopholes and introducing disincentives for patent trolls. 
Just as mirror legislation was about to be considered in the Senate, Sen. Patrick Leahy withdrew it from the Judiciary Committee. It was reported that Sen. Harry Reid forced Mr. Leahy's hand to kill the bill in committee. In the public sphere, the shrewd lobbying efforts to derail the bill were perceived as serving pro-troll interests. The lobbying came from pharmaceutical companies, biotech companies, patent attorneys, and, to the surprise of everyone, universities. Little wonder that critics overreacted and suggested universities were in partnership with trolls: even if the critics were wrong, these accusations stung. University associations took that position out of a sincere belief in the dogma of patents and out of fear that the proposed anti-troll legislation would limit their ability to sue patent infringers. However, their convictions stand on shaky ground and their material interests are not those of the vast majority of universities. A reversal of that position is not only possible but would be timely. When anti-troll legislation is again introduced in Congress, universities should distance themselves from efforts to protect the policy status quo that so benefits patent trolls. It is not altogether improbable that Congress would see fit to exempt universities from some of the requirements that the law would impose. University associations could show Congress the merit of such exemptions in consideration of the universities' constant and significant contributions to states, regions, and the nation. However, no such concessions could ever be expected if the universities continue to place themselves in the company of those who profit from patent management. No asset is more valuable to universities than their prestige. It is the ample recognition of their value in society that guarantees tax dollars will continue to flow into universities. While acting legally to protect their patent rights, universities are nevertheless toying with their own legitimacy. 
Let those universities that stand to gain from litigation act in their self-interest, but do not let them speak for all universities. When university associations advocate for stronger patent protection, they do the majority of universities a disservice. These associations should better represent the interests of all their members by advocating a more neutral position about patent reform, by publicly praising universities’ restraint on patent litigation, and by promoting a culture and readiness in technology transfer offices to appraise each patent not by its market value but by its social value. At the same time, the majority of universities that obtain neither private nor social benefits from patenting should press their political representatives to adopt a more balanced approach to policy advocacy, lest they squander the reputation of the entire university system. Authors Walter D. Valdivia Image Source: © Stephen Lam / Reuters Full Article
co Patent infringement suits have a reputational cost for universities By webfeeds.brookings.edu Published On :: Fri, 04 Dec 2015 07:30:00 -0500 This post originally appeared on the Center for Technology Innovation’s TechTank blog. Universities cash handsome awards on infringement cases This October, a jury found Apple Inc. guilty of infringing a patent of the University of Wisconsin-Madison (UW) and ordered the tech giant to pay $234 million. The university scored a big financial victory, but this hardly meant any gain for the good name of the university. The plaintiffs argued successfully in court that Apple infringed their 1998 patent on a predictor circuit that greatly improved the efficiency of microchips used in the popular iPhone 5s, 6, and 6 Plus. Apple first responded by challenging the validity of the patent, but the US Patent and Trademark Office ruled in favor of the university. Apple plans to appeal, but the appellate court is not likely to reverse the lower court’s decision. This is not the first time this university has asserted its patents rights (UW sued Intel in 2008 for this exact same patent and reportedly settled for $110 million). Nor is this the first time universities in general have taken infringers to court. Prominent cases in recent memory include Boston University, which sued several companies for infringement of a patent for blue light-emitting diodes and settled out of court with most of them, and Carnegie Mellon, who was awarded $237 million by the federal appellate court on its infringement suit against Marvell, a semiconductor company, for its use of an enhanced detector of data in hard drives called Kavcic detectors. Means not always aligned with aims in patent law When university patented inventions emerge from federal research grants, infringement suits test the accepted interpretations of current patent law. 
The Bayh-Dole Act of 1980 extended patent law and gave small-business and universities the right to take title to patents from federal research grants—later it was amended to extend the right to all federal grantees regardless of size. The ostensible aim of this act is to “to promote the utilization of inventions arising from federally supported research or development.” Under the law, a condition for universities (or any other government research performers) to keep their exclusive rights on those patents is that they or their licensees take “effective steps to achieve practical application” of those patents. Bayh-Dole was not designed to create a new source of revenue for universities. If companies are effectively using university technologies, Bayh-Dole’s purpose is served without need of patents. To understand this point, consider a counterfactual: What if the text of Bayh-Dole had been originally composed to grant a conditional right to patents for federal research grantees? The condition could be stated like this: “This policy seeks to promote the commercialization of federally funded research and to this end it will use the patent system. Grantees may take title to patents if and only if other mechanisms for disseminating and developing those inventions into useful applications prove unsuccessful.” Under this imagined text, the universities could still take title to patents on their inventions if they or the U.S. Patent and Trademark Office were not aware that the technologies were being used in manufactures. But no court would find their infringement claim meritorious if the accused companies could demonstrate that, absent of willful infringement, they had in fact used the technologies covered by university patents in their commercial products. In this case, other mechanisms for disseminating and developing the technologies would have proven successful indeed. 
The reality that Bayh-Dole did not mandate such a contingent assignment of rights creates a contradiction between its aims and the means chosen to advance those aims for the subset of patents that were already in use by industry. I should remark that UW’s predictor circuit resulted from grants from NSF and DARPA, and there is no indication that the university exercised its patent rights with any less vigor just because the original research was publicly funded. In fact, universities are fully expected to assert their patent rights aggressively regardless of the source of funding for the original research. You can have an answer for every question and still lose the debate It is this litigious attitude that puts off many observers. While the law may very well allow universities to be litigious, universities could still refuse to exercise their rights under circumstances in which those rights are not easily reconciled with the public mission of the university. University administrators, tech transfer personnel, and particularly the legal teams winning infringement cases have legitimate reasons to wonder why universities are publicly scorned. After all, they are acting within the law and simply protecting their patent rights; they are doing what any rational person would do. They may be genuinely surprised when critics accuse universities of becoming allies of patent trolls, or of aiding and abetting their actions. Such accusations are unwarranted. Trolls are truants; universities are venerable institutions. Patent trolls exploit the ambiguities of patent law and the burdens of due process to their own benefit and to the detriment of truly productive businesses and persons. In stark contrast, universities are long-established partners of democracy, respected beyond ideological divides for their abundant contributions to society. The critics may not be fully considering the intricacies of patent law. 
Or they may forget that universities are in need of additional revenue—higher education has not seen public financial support increase in recent years, with federal grants roughly stagnant and state funding falling drastically in some states. Critics may also ignore that revenues collected from patent licensing, favorable court rulings, and out-of-court settlements are to a large extent (usually two thirds of the total) plowed back into the research enterprise. University attorneys may have an answer for every point that critics raise, but the overall concern of critics should not be dismissed outright. Given that many if not most university patents can be traced back to research funded by tax dollars, there is a legitimate reason for observers to expect universities to manage their patents with a degree of restraint. There is also a legitimate reason for public disappointment when universities do not seem to endeavor to balance the tensions between their rights and duties. Substantive steps to improve the universities’ public image Universities can become more responsive to public expectations about their character not only by promoting their good work, but also by taking substantive steps to correct misperceptions. First, when universities discover infringement, they should take companies to court only as a measure of last resort. If a particular company refuses to negotiate in good faith and an infringement case ends up in court, the universities should be prepared to demonstrate to the court of public opinion that they have tried, with sufficient insistence and time, to negotiate a license and even made concessions in pricing the license. 
In the case of the predictor circuit patent, it seems that the University of Wisconsin-Madison tried to license the technology and Apple refused, but the university would be in a much better position if it could demonstrate that the licensing deals offered to Apple would have turned out to be far less expensive for the tech company. Second, universities would be well advised not to join any efforts to lobby Congress for stronger patent protection. At least two reasons substantiate this suggestion. First, as a matter of principle, the dogmatic belief that without patents there is no innovation is wrong. Second, as a matter of material interest, universities as a group do not have a financial interest in patenting. It’s worth elaborating these points a bit more. Neither historians nor social science researchers have settled the question of the net effects of patents on innovation. While there is evidence of social benefits from patent-based innovation, there is also evidence of social costs associated with patent monopolies, and even more evidence of momentous innovations that required no patents. What’s more, the net social benefit varies across industries and over time. Research shows economic areas in which patents do spur innovation and economic sectors where they actually hinder it. This research explains, for instance, why some computer and Internet giants lobby Congress in the opposite direction to the biotech and big pharma industries. Rigorous industrial surveys of the 1980s and 1990s found that companies in most economic sectors did not use patents as their primary tool to protect their R&D investments. Yet patenting has increased rapidly over the past four decades. This increase includes industries that were once uninterested in patents. Economic analyses have shown that this new patenting is a business strategy against patent litigation. Companies are building patent portfolios as a defensive strategy, not because they are innovating more. 
The university’s public position on patent policy should acknowledge that the debate on the impact of patents on innovation is not settled and that this impact cannot be observed in the aggregate, but must be considered in the context of each specific economic sector, industry, or even market. From this vantage point, universities could then turn up or down the intensity with which they negotiate licenses and pursue compensation for infringement. Universities would better assert their commitment to their public mission if they computed, on a case-by-case basis, the balance between social benefits and costs for each of their controversial patents. As to the material interest in patents, it is understandable that some patent attorneys or the biotech lobby publicly espouse the dogma of patents—that there is no innovation without patents. After all, their livelihood depends on it. However, research universities as a group do not have any significant financial interest in stronger patent protection. As I have shown in a previous Brookings paper, the vast majority of research universities earn very little from their patent portfolios, and about 87% of tech transfer offices operate in the red. Universities as a group receive so little income from licensing and asserting their patents relative to the generous federal support they enjoy (below 3%) that if the federal government were to declare that grant reviewers should give preference to universities that do not patent, all research universities would stop the practice at once. It is true that a few universities (like the University of Wisconsin-Madison) raise significant revenue from their patent portfolios, and they will continue to do so regardless of public protestations. But the majority of universities do not have a material interest in patenting. Time to get it right on anti-troll legislation Last year, the House of Representatives passed legislation closing loopholes and introducing disincentives for patent trolls. 
Just as mirror legislation was about to be considered in the Senate, Sen. Patrick Leahy withdrew it from the Judiciary Committee. It was reported that Sen. Harry Reid forced the hand of Mr. Leahy to kill the bill in committee. In the public sphere, the shrewd lobbying efforts to derail the bill were perceived as serving pro-troll interests. The lobbying came from pharmaceutical companies, biotech companies, patent attorneys, and, to the surprise of everyone, universities. Little wonder that critics overreacted and suggested universities were in partnership with trolls: even if they were wrong, these accusations stung. University associations took that position out of a sincere belief in the dogma of patents and out of fear that the proposed anti-troll legislation would limit the universities’ ability to sue patent infringers. However, their convictions stand on shaky ground, and only a few universities sue for infringement. In taking that policy position, university associations are representing neither the interests nor the beliefs of the vast majority of universities. A reversal of that position is not only possible but would be timely. When anti-troll legislation is again introduced in Congress, universities should distance themselves from efforts to protect the policy status quo that so benefits patent trolls. It is not altogether improbable that Congress would see fit to exempt universities from some of the requirements that the law would impose. University associations could show Congress the merit of such exemptions in consideration of the universities’ constant and significant contributions to states, regions, and the nation. However, no such concessions could ever be expected if the universities continue to place themselves in the company of those who profit from patent management. No asset is more valuable for universities than their prestige. It is the ample recognition of their value to society that guarantees tax dollars will continue to flow into universities. 
While acting legally to protect their patent rights, universities are nevertheless toying with their own legitimacy. Let those universities that stand to gain from litigation act in their self-interest, but do not let them speak for all universities. When university associations advocate for stronger patent protection, they do the majority of universities a disservice. These associations should better represent the interests of all their members by advocating a more neutral position about patent reform, by publicly praising universities’ restraint on patent litigation, and by promoting a culture and readiness in technology transfer offices to appraise each patent not by its market value but by its social value. At the same time, the majority of universities that obtain neither private nor social benefits from patenting should press their political representatives to adopt a more balanced approach to policy advocacy, lest they squander the reputation of the entire university system. Editor's Note: The post was corrected to state that UW’s predictor circuit did originate from federally funded research. Authors Walter D. Valdivia Image Source: © Stephen Lam / Reuters Full Article
co Stuck in a patent policy rut: Considerations for trade agreements By webfeeds.brookings.edu Published On :: Thu, 17 Dec 2015 07:30:00 -0500 International development debates of the last four decades have ascribed ever greater importance to intellectual property rights (IPRs). There has also been a significant effort on the part of the U.S. to encourage its trade partners to introduce and enforce patent law modeled after American intellectual property law. Aside from the debate over the impact of patents on innovation, international harmonization has some important consequences for the obduracy of the terms of trade agreements. The position of the State Department on patents when negotiating trade agreements has consistently been one of defending stronger patent protection. However, the high-tech sector is under reorganization, and the most innovative industries today have strong disagreements about the value of patents for innovation. This situation raises the question of why the national posture on patent law so consistently favors industries such as pharmaceuticals or biotech to the detriment of software developers and Internet-based companies. The State Department defends this posture, arguing that the U.S. has a comparative advantage in sectors dependent on patent protection. Therefore, to promote exports, our national trade policy should create incentives for partners to come into line with national patent law. This posture will become problematic when America’s competitive advantage shifts to sectors that find patents to be a hindrance to innovation, because too much effort will have already been invested in twisting the arm of our trade partners. It will be hard to undo those chapters in trade agreements, particularly after our trade partners have taken pains to pass laws aligned with American law. 
Related to the previous concern, the policy inertia effect and inflexibility apply to domestic policy as much as to trade agreements. When other nations adopt policy regimes following the American model, advocates of stronger patent protection will use international adoption as an argument for keeping the domestic policy status quo. The pressure we place on our trade partners to strengthen patent protection (via trade agreements and other mechanisms like the Special 301 Report) will be forgotten. Advocates will present those trade partners as having adopted the enlightened laws of the U.S., and ask why American lawmakers would wish to change law that inspires international emulation. Innovation scholar Timothy Simcoe has correctly suggested that harmonization creates inflexibility in domestic policy. Indeed, in the not-too-distant future the rapid transformation of the economy, new big market players, and emerging business models may give policymakers the feeling that we are stuck in a patent policy rut whose usefulness has expired. In addition, there are indirect economic effects of projecting national patent law onto trade agreements. If we assume that a club of economies (such as the OECD) generates most of the innovation worldwide while the remaining countries simply adopt new technologies, the innovation club would have control over the global supply of high value-added goods and services and be able to preserve a terms-of-trade advantage. In this scenario, stronger patent protection may be in the interest of the innovation club to the extent that its competitive advantage remains in industries dependent on patent protection. But should the world economic order change and the innovation club become specialized in digital services while the rest of the world takes on larger segments of manufacturing, the advantage may shift outside the innovation club. This is not a far-fetched scenario. 
Emerging economies have increased their service economies in addition to their manufacturing capacity; overall they are better integrated into global supply chains. What is more, these emerging economies have growing consumption markets that will become increasingly relevant globally as they continue to grow faster than rich economies. Nor is the innovation club likely to retain a monopoly on global innovation for long. Within the emerging economies, another club is investing heavily in developing innovative capacity. In particular, China, India, Brazil, Mexico, and South Africa (and possibly Russia) have strengthened their innovation systems by expanding public investments in R&D and introducing institutional reforms to foster entrepreneurship. The innovation of this second club may, in a world of harmonized patent law, increase its competitive advantage by securing monopolistic control of key high-tech markets. As industries less reliant on patents flourish and the digital economy transforms US markets, an inflexible patent policy regime may actually be detrimental to American terms of trade. I should stress that these kinds of political and economic effects of America’s posture on IPRs in trade policy are not merely speculative. Just as manufacturing displaced the once dominant agricultural sector, and services in turn took over as the largest sector of the economy, we can fully expect that the digital economy—with its preference for limited use of patents—will become not only more economically relevant, but also more politically influential. The tensions observed in international trade, and especially the aforementioned considerations, merit revisiting the rationale for America’s posture on intellectual property policy in trade negotiations. Elsie Bjarnason contributed to this post. Authors Walter D. Valdivia Image Source: © Romeo Ranoco / Reuters Full Article
co The fair compensation problem of geoengineering By webfeeds.brookings.edu Published On :: Tue, 23 Feb 2016 09:00:00 -0500 The promise of geoengineering is placing average global temperature under human control, and it is thus considered a powerful instrument for the international community to deal with global warming. While great energy has been devoted to learning more about the natural systems that it would affect, questions of a political nature have received far less consideration. Taking as a given that regional effects will be asymmetric, the nations of the world will only give their consent to deploying this technology if they can be given assurances of a fair compensation mechanism, something like an insurance policy. The question of compensation reveals that the politics of geoengineering are far more difficult than the technical aspects. What is Geoengineering? In June 1991, Mount Pinatubo exploded, throwing a massive amount of volcanic sulfate aerosols into the high skies. The resulting cloud dispersed over weeks throughout the planet and cooled its average temperature by 0.5° Celsius over the next two years. If this kind of natural phenomenon could be replicated and controlled, the possibility of engineering the Earth’s climate is then within reach. Spraying aerosols in the stratosphere is one method of solar radiation management (SRM), a class of climate engineering that focuses on increasing the albedo, i.e. reflectivity, of the planet’s atmosphere. Other SRM methods include brightening clouds by increasing their content of sea salt. A second class of geoengineering efforts focuses on carbon removal from the atmosphere and includes carbon sequestration (burying it deep underground) and increasing land or marine vegetation. 
Of all these methods, SRM is appealing for its effectiveness and low costs; a recent study put the cost at about $5 to $8 billion per year.1 Not only is SRM relatively inexpensive, but we already have the technological pieces that, assembled properly, would inject the skies with particles that reflect sunlight back into space. For instance, a fleet of modified Boeing 747s could deliver the necessary payload. Advocates of geoengineering are not so much concerned about developing the technology to effect SRM as about its likely consequences, not only in terms of slowing global warming but in its effects on regional weather. And therein lies the difficult question for geoengineering: the effects of SRM are likely to be unequally distributed across nations. Here is one example of these asymmetries: Julia Pongratz and colleagues at the Department of Global Ecology of the Carnegie Institution for Science estimated a net increase in yields of wheat, corn, and rice from SRM-modified weather. However, the study also found a redistributive effect, with equatorial countries experiencing lower yields.2 We can then expect that equatorial countries will demand fair compensation before signing on to the deployment of SRM, which leads to two problems: how to calculate compensation, and how to agree on a compensation mechanism. The calculus of compensation What should be the basis for fair compensation? One view of fairness could be that, every year, all economic gains derived from SRM are pooled together and distributed among the regions or countries that experience economic losses. If the system pools gains from SRM and distributes them in proportion to losses, questions about the balance will only be asked in years in which gains and losses are about the same. But if losses are far greater than the gains, then this would be a form of insurance that cannot underwrite some of the incidents it intends to cover. 
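The pooling arrangement just described can be made concrete with a small sketch. The country labels and dollar figures below are entirely hypothetical illustrations, not estimates from any study; the point is only to show the failure mode in which pooled gains fall short of total losses.

```python
# Sketch of the pooled-compensation scheme: gains attributed to SRM are
# pooled each year and paid out to losers in proportion to each loser's
# share of total losses. All figures are hypothetical (billions of dollars).

def settle(outcomes):
    """outcomes: country -> net economic effect of SRM (+gain, -loss)."""
    pool = sum(v for v in outcomes.values() if v > 0)
    total_loss = -sum(v for v in outcomes.values() if v < 0)
    payouts = {}
    for country, v in outcomes.items():
        if v < 0 and total_loss > 0:
            # Each loser receives a share of the pool proportional to its loss.
            payouts[country] = pool * (-v / total_loss)
    return payouts

# A year in which losses ($12B) exceed pooled gains ($8B): the "insurance"
# covers only two thirds of each loss, the failure mode noted in the text.
outcomes = {"A": 6.0, "B": 2.0, "C": -8.0, "D": -4.0}
payouts = settle(outcomes)
```

Under these made-up numbers, country C (two thirds of the losses) and country D (one third) split the $8 billion pool in proportion to their losses, each recovering only two thirds of what it lost, which is exactly why losers would refuse to "buy" such a policy.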
People will not buy such an insurance policy, which is to say, some countries will not authorize SRM deployment. Conversely, if the pool has a large balance left after paying out compensation, then the winners of SRM will demand lower compensation taxes. Further complicating the problem is the question of how to separate gains or losses that can be attributed to SRM from regional weather fluctuations. Separating the SRM effect could easily become an intractable problem because regional weather patterns are themselves affected by SRM. For instance, in any year that El Niño is particularly strong, the uncertainty about the net effect of SRM will increase sharply because SRM could affect the severity of the oceanic oscillation itself. Science can reduce uncertainty but only to a certain degree, because the better we understand nature, the more we understand the contingency of natural systems. We can expect better explanations of natural phenomena from science, but it would be unfair to ask science to reduce greater understanding to a hard figure that we can plug into our compensation equation. Still greater complexity arises when separating SRM effects from policy effects at the local and regional level. Some countries will surely organize better than others to manage this change, and preparation will be a factor in determining the magnitude of gains or losses. Inherent to the problem of estimating gains and losses from SRM is the inescapable subjective element of assessing preparation. The politics of compensation Advocates of geoengineering tell us that their advocacy is not about deploying SRM; rather, it is about better understanding the scientific facts before we even consider deployment. It’s tempting to believe that the accumulating science on SRM effects would be helpful. But when we consider the factors described above, it is quite possible that more science will only crystallize the uncertainty about exact amounts of compensation. 
The calculus of gain or loss—the difference between reality and a counterfactual of what regions and countries would otherwise experience—requires certainty, but science yields only irreducible uncertainty about nature. The epistemic problems with estimating compensation are only compounded by the political contestation of those numbers. Even within the scientific community, different climate models will yield different results, and since economic compensation would be derived from those models’ output, we can expect serious contestation of the objectivity of the science of SRM impact estimation. Who should formulate the equation? Who should feed the numbers into it? A sure way to alienate scientists from the peoples of the world is to ask them to assert their cognitive authority over this calculus. What’s more, other parts of the compensation equation, those related to regional efforts to deal with SRM effects, are inherently subjective. We should not forget the politics of asserting compensation commensurate with preparation effort; countries that experience low losses may also want compensation for their efforts preparing for and coping with natural disasters. Not only would a compensation equation be a sham, it would be unmanageable. Its legitimacy would always be in question. The calculus of compensation may seem a way to circumvent the impasses of politics and define fairness mathematically. Ironically, it is shot through with subjectivity; it is truly a political exercise. Can we do without compensation? 
Technological innovations are similar to legislative acts, observed Langdon Winner.3 Technical choices made at the earliest stage of design quickly “become strongly fixed in material equipment, economic investment, and social habit, [and] the original flexibility vanishes for all practical purposes once the initial commitments are made.” For that reason, he insisted, “the same careful attention one would give to the rules, roles, and relationships of politics must also be given to such things as the building of highways, the creation of television networks, and the tailoring of seemingly insignificant features on new machines.” If technological change can be thought of as legislative change, we must consider how so momentous a technology as SRM can be deployed in a manner consonant with our democratic values. Engineering the planet’s weather is nothing short of passing an amendment to Planet Earth’s Constitution. One pesky clause in that constitutional amendment is a fair compensation scheme. It seems so small a clause in comparison to the extent of the intervention, the governance of deployment and consequences, and the international commitments to be made as a condition for deployment (such as emissions mitigation and adaptation to climate change). But in the short consideration afforded here, we get a glimpse of the intractable political problem of setting up a compensation scheme. And yet, unless such a clause is approved by a majority of nations, a fair compensation scheme has little hope of being consonant with democratic aspirations. 1McClellan, Justin, David W. Keith, and Jay Apt. 2012. Cost analysis of stratospheric albedo modification delivery systems. Environmental Research Letters 7(3): 1-8. 2Pongratz, Julia, D. B. Lobell, L. Cao, and K. Caldeira. 2012. Nature Climate Change 2, 101–105. 3Winner, Langdon. 1980. Do artifacts have politics? Daedalus 109(1): 121-136. Authors Walter D. Valdivia Image Source: © Antara Photo Agency / Reuters Full Article
co Bernie Sanders’s failed coalition By webfeeds.brookings.edu Published On :: Tue, 10 Mar 2020 11:00:33 +0000 Throughout Bernie Sanders’s presidential campaigns in 2016 and 2020, he promised to transform the Democratic Party and American politics. He promised a “revolution” that would resonate with a powerful group of Americans who have not normally participated in politics: young voters, liberal voters, and new voters. He believed that once his call went out and… Full Article
co In administering the COVID-19 stimulus, the president’s role model should be Joe Biden By webfeeds.brookings.edu Published On :: Tue, 07 Apr 2020 20:24:12 +0000 As America plunges into recession, Congress and President Donald Trump have approved a series of aid packages to assist businesses, the unemployed, and others impacted by COVID-19. The first three aid packages will likely be supplemented by at least a fourth package, as the nation’s leaders better understand the depth and reach of the economic… Full Article
co The French connection: Explaining Sunni militancy around the world By webfeeds.brookings.edu Published On :: Fri, 25 Mar 2016 14:55:00 -0400 Editors’ Note: The mass-casualty terrorist attacks in Paris and now in Brussels underscore an unsettling truth: Jihadis pose a greater threat to France and Belgium than to the rest of Europe. Research by Will McCants and Chris Meserole reveals that French political culture may play a role. This post originally appeared in Foreign Affairs. The mass-casualty terrorist attacks in Paris and now in Brussels underscore an unsettling truth: Jihadists pose a greater threat to France and Belgium than to the rest of Europe. The body counts are larger and the disrupted plots are more numerous. The trend might be explained by the nature of the Islamic State (ISIS) networks in Europe or as failures of policing in France and Belgium. Both explanations have merit. However, our research reveals that another factor may be at play: French political culture. Last fall, we began a project to test empirically the many proposed explanations for Sunni militancy around the globe. The goal was to take common measures of the violence—namely, the number of Sunni foreign fighters from any given country as well as the number of Sunni terror attacks carried out within it—and then crunch the numbers to see which explanations best predicted a country’s rate of Sunni radicalization and violence. (The raw foreign fighter data came from The International Centre for the Study of Radicalisation and Political Violence; the original attack data came from the University of Maryland’s START project.) What we found surprised us, particularly when it came to foreign fighter radicalization. It turns out that the best predictor of foreign fighter radicalization was not a country’s wealth. Nor was it how well-educated its citizens were, how healthy they were, or even how much Internet access they enjoyed. 
Instead, the top predictor was whether a country was Francophone; that is, whether it currently lists (or previously listed) French as a national language. As strange as it may seem, four of the five countries with the highest rates of radicalization in the world are Francophone, including the top two in Europe (France and Belgium). Knowledgeable readers will immediately object that the raw numbers tell a different story. The English-speaking United Kingdom, for example, has produced far more foreign fighters than French-speaking Belgium. And fighters from Saudi Arabia number in the several thousands. But the raw numbers are misleading. If you view the foreign fighters as a percentage of the overall Muslim population, you see a different picture. Per Muslim resident, Belgium produces far more foreign fighters than either the United Kingdom or Saudi Arabia. So what could the language of love possibly have to do with Islamist violence? We suspect that it is really a proxy for something else: French political culture. The French approach to secularism is more aggressive than, say, the British approach. France and Belgium, for example, are the only two countries in Europe to ban the full veil in their public schools. They’re also the only two countries in Western Europe not to gain the highest rating for democracy in the well-known Polity score data, which does not include explanations for the markdowns. Adding support to this story are the top interactions we found between different variables. When you look at which combination of variables is most predictive, it turns out that the “Francophone effect” is actually strongest in the countries that are most developed: French-speaking countries with the highest literacy, best infrastructure, and best health system. 
This is not a story about French colonial plunder. If anything it’s a story about what happens when French economic and political development has most deeply taken root. An important subplot within this story concerns the distribution of wealth. In particular, the rates of youth unemployment and urbanization appear to matter a great deal too. Globally, we found that when between 10 and 30 percent of a country’s youth are unemployed, there is a strong relationship between a rise in youth unemployment and a rise in Sunni militancy. Rates outside that range don’t have an effect. Likewise, when urbanization is between 60 and 80 percent, there is a strong relationship. These findings seem to matter most in Francophone countries. Among the over 1,000 interactions our model looked at, those between Francophone and youth unemployment and between Francophone and urbanization both ranked among the 15 most predictive. There’s broad anecdotal support for this idea: consider the rampant radicalization in Molenbeek, in the Paris banlieues, in Ben Gardane. Each of these contexts has produced a massively disproportionate share of foreign fighters, and each is an urban pocket with high youth unemployment. As with the Francophone finding overall, we’re left with guesswork as to why exactly the relationships between French politics, urbanization, youth unemployment, and Sunni militancy exist. We suspect that when there are large numbers of unemployed youth, some of them are bound to get up to mischief. When they live in large cities, they have more opportunities to connect with people espousing radical causes. And when those cities are in Francophone countries that adopt the strident French approach to secularism, Sunni radicalism is more appealing. For now, the relationship needs to be studied and tested by comparing several cases within and between countries. 
We also found other interesting relationships—such as between Sunni violence and prior civil conflict—but they are neither as strong nor as compelling. Regardless, the latest attacks in Belgium are reason enough to share the initial findings. They may be way off, but at least they are based on the best available data. If the data is wrong or our interpretations skewed, we hope the effort will lead to more rigorous explanations of what is driving jihadist terrorism in Europe. Our initial findings should in no way imply that Francophone countries are responsible for the recent horrible attacks—no country deserves to have its civilians killed, regardless of the perpetrator’s motives. But the magnitude of the violence and the fear it engenders demand that we investigate those motives beyond just the standard boilerplate explanations. Authors William McCants, Christopher Meserole Publication: Foreign Affairs Full Article
co Realist or neocon? Mixed messages in Trump advisor’s foreign policy vision By webfeeds.brookings.edu Published On :: Tue, 19 Jul 2016 08:00:00 -0400 Last night, retired lieutenant general Michael Flynn addressed the Republican convention as a headline speaker on the subject of national security. One of Donald Trump’s closest advisors—so much so that he was considered for vice president—Flynn repeated many of the themes found in his new book, The Field of Fight: How We Can Win the Global War Against Radical Islam and Its Allies, which he coauthored with Michael Ledeen. (The book is published by St. Martin’s, which also published mine.) Written in Flynn’s voice, the book advances two related arguments: First, the U.S. government does not know enough about its enemies because it does not collect enough intelligence, and it refuses to take ideological motivations seriously. Second, our enemies are collaborating in an “international alliance of evil countries and movements that is working to destroy” the United States despite their ideological differences. Readers will immediately notice a tension between the two ideas. “On the surface,” Flynn admits, “it seems incoherent.” He asks: “How can a Communist regime like North Korea embrace a radical Islamist regime like Iran? What about Russia’s Vladimir Putin? He is certainly no jihadi; indeed, Russia has a good deal to fear from radical Islamist groups.” Flynn spends much of the book resolving the contradiction and proving that America’s enemies—North Korea, China, Russia, Iran, Syria, Cuba, Bolivia, Venezuela, Nicaragua, al-Qaida, Hezbollah, and ISIS—are in fact working in concert. No one who has read classified intelligence or studied international relations will balk at the idea that unlikely friendships are formed against a common enemy.
As Flynn observes, the revolutionary Shiite government in Tehran cooperates with nationalist Russia and communist North Korea; it has also turned a blind eye (at the very least) to al-Qaida’s Sunni operatives in Iran and used them as bargaining chips when negotiating with Osama bin Laden and the United States. Flynn argues that this is more than “an alliance of convenience.” Rather, the United States’ enemies share “a contempt for democracy and an agreement—by all the members of the enemy alliance—that dictatorship is a superior way to run a country, an empire, or a caliphate.” Their shared goals of maximizing dictatorship and minimizing U.S. interference override their substantial ideological differences. Consequently, the U.S. government must work to destroy the alliance by “removing the sickening chokehold of tyranny, dictatorships, and Radical Islamist regimes.” Its failure to do so over the past decades gravely imperils the United States, he contends. Some of Flynn’s evidence for the alliance diverts into the conspiratorial—I’ve seen nothing credible to back up his assertion that the Iranians were behind the 1979 takeover of the Grand Mosque in Mecca by Sunni apocalypticists. And there’s an important difference between the territorially bounded ambitions of Iran, Russia, and North Korea, on the one hand, and ISIS’s desire to conquer the world on the other; the former makes alliances of convenience easier than the latter. Still, Flynn would basically be a neocon if he stuck with his core argument: tyrannies of all stripes are arrayed against the United States, so the United States should destroy them. But some tyrannies are less worthy of destruction than others.
In fact, Flynn argues there’s a category of despot that should be excluded from his principle, the “friendly tyrants” like President Abdel-Fatah el-Sissi in Egypt and former president Zine Ben Ali in Tunisia. Saddam Hussein should not have been toppled, Flynn argues, and even Russia could become an “ideal partner for fighting Radical Islam” if only it would come to its senses about the threat of “Radical Islam.” Taken alone, these arguments would make Flynn a realist, not a neocon. The book thus offers two very different views of how to exercise American power abroad: spread democracy or stand with friendly strongmen. Neither is a sure path to security. Spreading democracy through the wrong means can bring to power regimes that are even more hostile and authoritarian; standing with strongmen risks the same. Absent some principle higher than just democracy or security for their own sakes, the reader is unable to decide between Flynn’s contradictory perspectives and judge when their benefits are worth the risks. It’s strange to find a book about strategy so at odds with itself. Perhaps the dissonance is due to the co-authors’ divergent views (Ledeen is a neocon and Flynn is comfortable dining with Putin). Or perhaps it mirrors the confusion in the Republican establishment over the direction of conservative foreign policy. Whatever the case, the muddled argument offered in The Field of Fight demonstrates how hard it is to overcome ideological differences to ally against a common foe, regardless of whether that alliance is one of convenience or conviction. Authors William McCants Full Article
co Global economic and environmental outcomes of the Paris Agreement By webfeeds.brookings.edu Published On :: The Paris Agreement, adopted by the Parties to the United Nations Framework Convention on Climate Change (UNFCCC) in 2015, has now been signed by 197 countries. It entered into force in 2016. The agreement established a process for moving the world toward stabilizing greenhouse gas (GHG) concentrations at a level that would avoid dangerous climate… Full Article
co Policy insights from comparing carbon pricing modeling scenarios By webfeeds.brookings.edu Published On :: Carbon pricing is an important policy tool for reducing greenhouse gas pollution. The Stanford Energy Modeling Forum exercise 32 convened eleven modeling teams to project emissions, energy, and economic outcomes of an illustrative range of economy-wide carbon price policies. The study compared a coordinated reference scenario involving no new policies with policy scenarios that impose… Full Article
co The risk of fiscal collapse in coal-reliant communities By webfeeds.brookings.edu Published On :: EXECUTIVE SUMMARY If the United States undertakes actions to address the risks of climate change, the use of coal in the power sector will decline rapidly. This presents major risks to the 53,000 US workers employed by the industry and their communities. Twenty-six US counties are classified as “coal-mining dependent,” meaning the coal industry is… Full Article
co Columbia Energy Exchange: Coal communities face risk of fiscal collapse By webfeeds.brookings.edu Published On :: Mon, 15 Jul 2019 15:31:47 +0000 Full Article
co The risk of fiscal collapse in coal-reliant communities By webfeeds.brookings.edu Published On :: Wed, 17 Jul 2019 20:46:52 +0000 Full Article
co Why local governments should prepare for the fiscal effects of a dwindling coal industry By webfeeds.brookings.edu Published On :: Thu, 05 Sep 2019 15:36:41 +0000 Full Article
co Adele Morris on BPEA and looking outside macroeconomics By webfeeds.brookings.edu Published On :: Thu, 12 Mar 2020 13:00:49 +0000 Adele Morris is a senior fellow in Economic Studies and policy director for Climate and Energy Economics at Brookings. She recently served as a discussant for a paper as part of the Spring 2019 BPEA conference.Her research informs critical decisions related to climate change, energy, and tax policy. She is a leading global expert on the design… Full Article
co Modeling community efforts to reduce childhood obesity By webfeeds.brookings.edu Published On :: Mon, 26 Aug 2019 13:00:42 +0000 Why childhood obesity matters According to the latest data, childhood obesity affects nearly 1 in 5 children in the United States, a number which has more than tripled since the early 1970s. Children who have obesity are at a higher risk of many immediate health risks such as high blood pressure and high cholesterol, type… Full Article
co Simulating the effects of tobacco retail restriction policies By webfeeds.brookings.edu Published On :: Tue, 03 Sep 2019 13:00:00 +0000 Tobacco use remains the single largest preventable cause of death and disease in the United States, killing more than 480,000 Americans each year and incurring over $300 billion per year in costs for direct medical care and lost productivity. In addition, of all cigarettes sold in the U.S. in 2016, 35% were menthol cigarettes, which… Full Article
co Development of a computational modeling laboratory for examining tobacco control policies: Tobacco Town By webfeeds.brookings.edu Published On :: Mon, 30 Dec 2019 16:03:48 +0000 Full Article
co Webinar: Reopening the coronavirus-closed economy — Principles and tradeoffs By webfeeds.brookings.edu Published On :: Tue, 28 Apr 2020 13:55:02 +0000 In an extraordinary response to an extraordinary public health challenge, the U.S. government has forced much of the economy to shut down. We now face the challenge of deciding when and how to reopen it. This is both vital and complicated. Wait too long—maintain the lockdown until we have a vaccine, for instance—and we’ll have another Great Depression. Move too soon, and we… Full Article
co The dark side of consensus in Tunisia: Lessons from 2015-2019 By webfeeds.brookings.edu Published On :: Fri, 31 Jan 2020 16:55:04 +0000 Executive Summary Since the 2011 revolution, Tunisia has been considered a model for its pursuit of consensus between secular and Islamist forces. While other Arab Spring countries descended into civil war or military dictatorship, Tunisia instead chose dialogue and cooperation, forming a secular-Islamist coalition government in 2011 and approving a constitution by near unanimity in… Full Article
co Holding our own: Is the future of Islam in the West communal? By webfeeds.brookings.edu Published On :: Wed, 25 Mar 2020 20:43:39 +0000 Full Article
co The coronavirus killed the revolution By webfeeds.brookings.edu Published On :: Wed, 25 Mar 2020 21:10:56 +0000 Full Article
co Desert Storm after 25 years: Confronting the exposures of modern warfare By webfeeds.brookings.edu Published On :: Thu, 16 Jun 2016 15:00:00 -0400 Event Information June 16, 20163:00 PM - 5:00 PM EDTSEIU Building1800 Massachusetts Ave. NWWashington, DC Register for the EventBy most metrics, the 1991 Gulf War, also known as Operation Desert Storm, was a huge and rapid success for the United States and its allies. The mission of defeating Iraq's army, which invaded Kuwait the year prior, was done swiftly and decisively. However, the war's impact on soldiers who fought in it was lasting. Over 650,000 American men and women served in the conflict, and many came home with symptoms including insomnia, respiratory disorders, memory issues and others attributed to a variety of exposures – “Gulf War Illness." On June 16, the Center for 21st Century Security and Intelligence at Brookings and Georgetown University Medical Center co-hosted a discussion on Desert Storm, its veterans, and how they are faring today. Representative Mike Coffman (R-Col.), the only member of Congress to serve in both Gulf wars, delivered an opening address before joining Michael O’Hanlon, senior fellow at Brookings, for a moderated discussion. Joel Kupersmith, former head of the Office of Research and Development of the Department of Veterans Affairs, convened a follow-on panel with Carolyn Clancy, deputy under secretary for health for organizational excellence at the Department of Veterans Affairs; Adrian Atizado, deputy national legislative director at Disabled American Veterans; and James Baraniuk, professor of medicine at Georgetown University Medical Center. Audio Desert Storm after 25 years: Confronting the exposures of modern warfare Transcript Uncorrected Transcript (.pdf) Event Materials 20160616_desert_storm_transcript Full Article
co Why a Trump presidency could spell big trouble for Taiwan By webfeeds.brookings.edu Published On :: Wed, 06 Jul 2016 09:05:00 -0400 Presumptive Republican presidential nominee Donald Trump’s idea to withdraw American forces from Asia—letting allies like Japan and South Korea fend for themselves, including possibly by acquiring nuclear weapons—is fundamentally unsound, as I’ve written in a Wall Street Journal op-ed. Among the many dangers of preemptively pulling American forces out of Japan and South Korea, including an increased risk of war between Japan and China and a serious blow to the Nuclear Non-Proliferation Treaty, such a move would heighten the threat of war between China and Taiwan. The possibility that the United States would dismantle its Asia security framework could unsettle Taiwan enough that it would pursue a nuclear deterrent against China, as it has considered doing in the past—despite China indicating that such an act itself could be a pathway to war. And without bases in Japan, the United States could not as easily deter China from potential military attacks on Taiwan. Trump’s proposed Asia policy could take the United States and its partners down a very dangerous road. It’s an experiment best not to run. Authors Michael E. O'Hanlon Full Article