
What drove oil prices through the floor this week?

The coronavirus pandemic has sent crude oil prices plummeting, so much so that the price for West Texas Intermediate oil dropped below zero dollars earlier this week. In this special edition of the podcast, Samantha Gross joins David Dollar to explain the factors influencing recent changes in demand for oil and the long-term effects the…

       





Global Governance Breakthrough: The G20 Summit and the Future Agenda

Executive Summary

At the invitation of President George W. Bush, the G20 leaders met on November 15, 2008, in Washington, DC, in response to the worldwide financial and economic crisis. With this summit meeting, the reality of global governance shifted surprisingly quickly. Previously, major global economic, social and environmental issues were debated in the small, increasingly unrepresentative and often ineffectual circle of G8 leaders. Now, there is a larger, much more legitimate summit group that speaks for over two-thirds of the world’s population and controls 90% of the world’s economy.

The successful first G20 Summit provides a platform on which President-elect Obama can build in forging an inclusive and cooperative approach for resolving the current financial and economic crisis. Rather than get embroiled in a debate about which country is in and which country is out of the summit, the new U.S. administration should take the lead in accepting the new summit framework for now and focus on the substantive issues. Aside from tackling the current crisis, future G20 summits should also drive the reform of the international financial institutions and address other major global concerns—climate change, poverty and health, and energy, among others. With its diverse and representative membership of key countries and with a well-managed process of summit preparation and follow-up, the new G20 governance structure would allow for more inclusive deliberation and a more effective response to today’s complex global challenges and opportunities.

Policy Brief #168

A Successful G20 Summit—A Giant Step Forward

Once the summit was announced, there was speculation that it would be at best a distraction and at worst a costly failure, with a lame-duck U.S. president hobbled by a crisis-wracked economy and a president-elect waiting impotently on the sidelines, with European leaders bickering over seemingly arcane matters, and with the leaders of the emerging economies sitting on the fence, unwilling or unprepared to take responsibility for fixing problems not of their making.

As it turned out, the first G20 Summit was by most standards a success. It served as a platform for heads of state to address the current financial turmoil and the threats of the emerging economic crisis facing not only the U.S. and Europe, but increasingly also the rest of the world. The communiqué unmistakably attributes blame for the crisis where it belongs—to the advanced countries. It lays out a set of principles and priorities for crisis management and an action plan for the next four months and beyond, and it promises to address the longer-term agenda of reform of the global financial system. Very importantly, it also commits the leaders to meet again in April 2009 under the G20 umbrella. This ensures that the November G20 Summit was not a one-off event, but signified the beginning of a new way of managing the world economy. The U.S. Treasury, which apparently drove the decision to hold a G20 rather than a G8 summit and which led the brief preparation process, deserves credit for this outcome.

A Long Debate over Global Governance Reform Short-circuited

With this successful summit a number of unresolved issues in global governance were pushed aside virtually overnight:

  • The embarrassing efforts of past G8 summits to reach out to the leaders of emerging market economies with ad hoc invitations to join as part-time guests or through the well-meaning expedient of the “Heiligendamm Process”—under which a G8+5 process was to be institutionalized—were overtaken by the fact of the G20 summit.
  • A seemingly endless debate among experts about the optimal size and composition of an expanded summit—G13, G14, G16, G20, etc.—was pragmatically resolved by accepting the format of the already existing G20 of finance ministers and central bank governors, which has functioned well since 1999. With this, the Pandora’s box of country selection remained mercifully closed. This is a major accomplishment, which is vitally important to preserve at this time.
  • The idea of a “League of Democracies” as an alternative to the G8 and G20 summits, which had been debated in the U.S. election, was pushed aside by the hard reality of a financial crisis that made it clear that all the key economic players had to sit at the table, irrespective of political regime.
  • Finally, the debate about whether the leaders of the industrial world would ever be willing to sit down with their peers from the emerging market economies as equals was short-circuited by the picture of the U.S. president at lunch during the G20 Summit, flanked by the presidents of two of the major emerging economies, Brazil and China. This photograph perhaps best defines the new reality of global governance in the 21st century.

Is the G20 Summit Here to Stay?

The communiqué of the November 15, 2008 Summit locked in the next G20 summit and hence ordained a sequel that appears to have enshrined the G20 as the new format to address the current global financial and economic crisis over the coming months and perhaps years. Much, of course, depends on the views of the new U.S. administration, but the November 2008 Summit has paved the way for President Obama and his team to move swiftly beyond the traditional G8 and to continue the G20 format.

In principle, there is nothing wrong with exploring options for further change. However, at this juncture, we strongly believe that it is best for the new U.S. administration to focus its attention on making the G20 summit format work, both in terms of its ability to address the immediate crisis and in terms of subsequently dealing with other pressing problems, such as global warming and global poverty. There may be a need to fine-tune size and composition, but more fundamental changes, in our view, can and should wait, since arguments about composition and size—who is in and who is out—could quickly overwhelm a serious discussion of pressing substantive issues. Instead, the next G20 Summit in the United Kingdom on April 2, 2009 should stay with the standard G20 membership and get on with the important business of solving the world’s huge financial and economic problems.

One change, however, would be desirable: At the Washington Summit in November 2008, two representatives for each country were seated at the table, usually the country’s leader and finance minister. There may have been good reasons for this practice under the current circumstances, since leaders may have felt more comfortable having the experts at their side during intense discussions of how to respond to the financial and economic crisis. In general, however, a table of 40 chairs is undoubtedly less conducive to an open and informal discussion than a table half that size. From our experience, a table of 20 can support a solid debate as long as the format is one of open give and take, rather than a delivery of scripted speeches. This is not the case for a table with 40 participants. The G8 format of leaders only at the table, with prior preparation by ministers who do not then participate in the leaders-level summits, should definitely be preserved. To do otherwise would dilute the opportunity for informal discussion among leaders, which is the vital core of summit dynamics.

What Will Happen to the G8 Summit and to the G7 and G20 Meetings of Finance Ministers?

As the world’s financial storm gathered speed and intensity in recent months, the inadequacy of the traditional forums of industrial countries—the G8 group of leaders and the G7 group of finance ministers—became obvious. Does this mean that the G8 and G7 are a matter of the past? Most likely not. We would expect these forums to continue to meet for some time to come, playing a role as a caucus for industrial countries. In any event, the G20 finance ministers will take on an enhanced role, since theirs will be the forum at which minister-level experts lay the groundwork on key issues of global financial and economic management to ensure that they are effectively addressed at the summit level by their leaders. The G20 Summit of November 15 was prepared by a meeting of G20 finance ministers in this fashion.

It may well be that the dynamics of interaction within the G20 will cause coalitions to form, shifting over time as issues and interests change. This could at times and on some issues involve a coalition of traditional G7 members. However, with increasing frequency, we would expect some industrial countries to team up temporarily with emerging market country members, for example on agricultural trade policies, where a coalition of Argentina, Australia, Brazil and Canada might align itself to challenge the agricultural protection policies of Europe, Japan and the United States. Or in the area of energy, a coalition among producer states, such as Indonesia, Mexico, Russia and Saudi Arabia, might debate the merits of a stable energy supply and demand regime with an alliance among energy users, such as China, Europe, Japan, South Africa and the United States. It is this potential for multiple, overlapping and shifting alliances, which creates opportunities for building trust, forcing trade-offs and forging cross-issue compromises, that makes the G20 summit such an exciting opportunity.

What Should Be the Agenda of Future G20 Summits?

The communiqué of the November 2008 G20 Summit identified three main agenda items for the April 2009 follow-up summit: (1) A list of key issues for the containment of the current global financial and economic crisis; (2) a set of issues for the prevention of future global financial crises, including the reform of the international financial institutions, especially the IMF and World Bank; and (3) a push toward the successful conclusion of the Doha Round of WTO trade negotiations.

The first item is obviously a critical one if the G20 is to demonstrate its ability to help address the current crisis in a meaningful way. The second item is also important and timely. The experience with reform of the global financial institutions in the last few years has demonstrated that serious governance changes in these institutions will have to be driven by a summit-level group that is as inclusive as the G20. We would hope that Prime Minister Gordon Brown, as chair—with his exceptional economic expertise and experience in the international institutions, especially the IMF—will be able to forge a consensus at the April 2 summit in regard to reform of the international financial institutions. The third agenda item is also important, since the Doha Round is at a critical stage and its successful conclusion would send a powerful signal that the world community recognizes the importance of open trade relations in a time of crisis, when the natural tendency may be to revert to a protectionist stance.

However, we believe three additional topics should be added to the agenda for the April 2009 G20 Summit:

  • First, there should be an explicit commitment to make the G20 forum a long-term feature of global governance, even as the group may wish to note that its size and composition is not written in stone, but subject to change as circumstances change.
  • Second, the communiqué of the November summit stated that the G20 countries are “committed to addressing other critical challenges such as energy security and climate change, food security, the rule of law, and the fight against terrorism, poverty and disease”. This needs to be acted upon. These issues cannot be left off the table, even as the global financial and economic crisis rages. If anything, the crisis reinforces some of the key challenges which arise in these other areas and offers opportunities for a timely response. The U.K.-hosted summit should launch a G20 initiative to develop framework ideas for the post-Kyoto climate change agreement at Copenhagen.
  • Third, assuming the April 2009 summit commits itself—as it should—to a continuation of the G20 summit format into the future, it must begin to address the question of how the summit process should be managed. We explore some of the possible options next.

How Should the G20 Summit Process Be Managed?

So far the G7, G8 and G20 forums have been supported by a loose organizational infrastructure. For each group the country holding the rotating year-long presidency of the forum takes over the secretariat function while a team of senior officials (the so-called “sherpas”) from each country meets during the course of the year to prepare the agenda and the communiqué for leaders and ministers. This organization has the advantage of avoiding a costly and rigid bureaucracy. It also fosters a growing level of trust and mutual understanding among the sherpas.

The problem with this approach has been twofold: First, it led to discontinuities in focus and organization and in the monitoring of implementation. For the G20 of finance ministers, this problem was addressed in part by the introduction of a “troika” system, under which the immediate past and future G20 presidencies work systematically with the current G20 presidency to shape the agenda and manage the preparation process. Second, particularly for the countries in the G20 with lesser administrative capacity, the responsibility for running the secretariat for a year during their country’s presidency imposed a heavy burden.

For the G20 summit, these problems will be amplified, not least because these summits will require first-rate preparation for very visible and high-level events. In addition, as the agenda of the G20 summit broadens over time, the burden of preparing a consistent multi-year agenda based on strong technical work will be such that it cannot be effectively handled when passed on year to year from one secretariat in one country to another secretariat in another country, especially when multiple ministries have to be engaged in each country. It is for this reason that the time may have come to explore setting up a very small permanent secretariat in support of the G20 summit.

The secretariat should provide only technical and logistical support for the political leadership of the troika of presidencies and for the sherpa process; it should not run the summit. That is the job of the host member governments. They must continue to run the summits, lead the preparations and drive the follow-up. The troika process will help strengthen the capacity of national governments to shoulder these burdens. Summits are creatures of national governments, which retain primacy, and this must remain so, even as the new summits become larger, more complex and more important.

Implications for the Obama Administration

The November 2008 G20 Summit opened a welcome and long-overdue opportunity for a dramatic and lasting change in global governance. It will be critical that the leaders of the G20 countries make the most of this opportunity at the next G20 Summit on April 2. The presence of U.S. President Obama will be a powerful signal that the United States is ready to push and where necessary lead the movement for global change. President-elect Obama’s vision of inclusion and openness and his approach to governing, which favors innovative and far-reaching pragmatic responses to key national and global challenges, make him a great candidate for this role.

We would hope that President Obama would make clear early on that:

  • He supports the G20 summit as the appropriate apex institution of global governance for now;
  • He may wish to discuss how to fine-tune the summit’s composition for enhanced credibility and effectiveness but without fundamentally questioning the G20 framework;
  • He supports cooperative solutions to the current financial crisis along with a serious restructuring of the global financial institutions;
  • He will look to the G20 summit as the right forum to address other pressing global issues, such as climate change, energy, poverty and health; and
  • He is ready to explore an innovative approach to effectively manage the G20 summit process.

These steps would help ensure that the great promise of the November 2008 G20 Summit is translated into a deep and essential change in global governance. This change will allow the world to move from a governance system that continues to be dominated by the transatlantic powers of the 20th century to one which reflects the fundamentally different global economic and political realities of the 21st century. It would usher in a framework of deliberation, consultation and decision making that would make it possible to address the great global challenges and opportunities that we face today in a more effective and legitimate manner.


Spurring Innovation Through Education: Four Ideas

Policy Brief #174

A nation’s education system is a pillar of its economic strength and international competitiveness. The National Bureau of Economic Research analyzed data from 146 countries, collected between 1950 and 2010, and found that each year of additional average schooling attained by a population translates into at least a two percent increase in economic output. A 2007 World Bank policy research working paper reported similar results. Based on these findings, if the United States increased the average years of schooling completed by its adult population from the current 12 years to 13 years—that is, added one year of postsecondary education—our gross domestic product would rise by more than $280 billion.
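
As a rough check on that figure (a back-of-the-envelope calculation that assumes U.S. GDP of roughly $14 trillion, approximately its level when this brief was written), the arithmetic implied by the two percent estimate is simply:

$$0.02 \times \$14\ \text{trillion} \approx \$280\ \text{billion}$$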

The story also can be told by focusing on the returns to education for individuals. The difference in income between Americans who complete high school and those who drop out after 10th grade exceeds 50 percent. Large income differentials extend throughout the continuum of education attainment, with a particularly large gap between holders of advanced degrees and those with four-year college degrees.

Although education clearly pays, the education attainment of the nation’s youth has largely stagnated, falling substantially behind that of countries with which we compete. In 1960, the United States led the world in the number of students who graduated from high school. Today young adults in many countries, including Estonia and Korea, exceed their U.S. counterparts in education attainment.

RECOMMENDATIONS
America’s economic productivity and competitiveness are grounded in education. Our public schools and our higher education institutions alike are falling behind those of other nations. Four policy proposals offer substantial promise for improving American education, are achievable and have low costs:

  • Choose K–12 curriculum based on evidence of effectiveness.
  • Evaluate teachers in ways that meaningfully differentiate levels of performance.
  • Accredit online education providers so they can compete with traditional schools across district and state lines.
  • Provide the public with information that will allow comparison of the labor market outcomes and price of individual postsecondary degree and certificate programs.

The problem of low education attainment is particularly salient among students from low-income and minority backgrounds. The graduation rate for minorities has been declining for 40 years, and majority/minority graduation rate differentials have not converged. Hispanic and black students earn four-year or higher degrees at less than half the rate of white students.

The economic future of the nation and the prospects of many of our citizens depend on returning the United States to the forefront of education attainment. Simply put, many more of our students need to finish high school and graduate from college.

At the same time, graduation standards for high school and college must be raised. Forty percent of college students take at least one remedial course to make up for deficiencies in their high school preparation, and a test of adult literacy recently given to a random sample of graduating seniors from four-year U.S. institutions found less than 40 percent to be proficient on prose and quantitative tasks.

Barriers to Innovation and Reform

Our present education system is structured in a way that discourages the innovation necessary for the United States to regain education leadership. K-12 education is delivered largely through a highly regulated public monopoly. Outputs such as high school graduation rates and student performance on standardized assessments are carefully measured and publicly available, but mechanisms that would allow these outputs to drive innovation and reform are missing or blocked. For example, many large urban districts and some states are now able to measure the effectiveness of individual teachers by assessing the annual academic growth of students in their classes. Huge differences in teacher effectiveness are evident, but collective bargaining agreements or state laws prevent most school district administrators from using that information in tenure or salary decisions.

Further complicating K-12 reform is the fact that authority for education policy is broadly dispersed. Unlike countries with strong national ministries that can institute top-down reforms within the public sector, education policy and practice in the United States are set through a chaotic network of laws, relationships and funding streams connecting 16,000 independent school districts to school boards, mayors, and state and federal officials. The lack of central authority allows the worst characteristics of public monopolies to prevail—inefficiency, stasis and catering to interests of employees—without top-down systems’ offsetting advantage of being capable of quick and coordinated action.

The challenges to reforming higher education are different. The 6,000-plus U.S. postsecondary institutions have greater flexibility to innovate than do the public school districts—and a motive to do so, because many compete among themselves for students, faculty and resources. However, while output is carefully measured and publicly reported for public K-12 schools and districts, we have only the grossest measures of output for postsecondary institutions.

Even for something as straightforward as graduation rates, the best data we have at the institutional level are the proportion of full-time, first-time degree-seeking students who graduate within 150 percent of the normal time to degree completion. Data on critical outputs, including labor market returns and student learning, are missing entirely. In the absence of information on issues that really matter, postsecondary institutions compete and innovate on dimensions that are peripheral to their productivity, such as the winning records of their sports teams, the attractiveness of their grounds and buildings, and their ratio of acceptances to applications. Far more information is available to consumers in the market for a used car than for a college education. This information vacuum undermines productive innovation.

Examining Two Popular Reforms

Many education reformers across the political spectrum agree on two structural and governance reforms: expanding the public charter school sector at the expense of traditional public schools and setting national standards for what students should know. Ironically, the evidence supporting each of these reforms is weak at best.

Charter schools are publicly funded schools outside the traditional public school system that operate with considerable autonomy in staffing, curriculum and practices. The Obama administration has pushed to expand charter schools by excluding states that do not permit charters, or that cap their number, from competition for $4.35 billion in Race to the Top funding. Both President Obama and Education Secretary Arne Duncan have proposed shuttering poorly performing traditional public schools and replacing them with charters.

What does research say about charter schools’ effects on academic outcomes? Large studies that control for student background generally find very small differences in student achievement between the two types of public schools.

For example, on the 2005 National Assessment of Educational Progress (the “Nation’s Report Card”), white, black and Hispanic fourth graders in charter schools performed equivalently to fourth-graders with similar racial and ethnic backgrounds in traditional public schools. Positive findings do emerge from recent studies of oversubscribed New York and Boston area charter schools, which use lotteries to determine admission. But these results are obtained from children whose parents push to get them into the most popular charter schools in two urban areas with dynamic and innovative charter entrepreneurs.

What about common standards? Based on the belief that high content standards for what students should know and be able to do are essential elements of reform and that national standards are superior to individual state standards, the Common Core State Standards Initiative has signed up 48 states and 3 territories to develop a common core of state standards in English-language arts and mathematics for grades K-12. The administration has praised this joint effort by the National Governors Association and Council of Chief State School Officers, made participation in it a prerequisite for Race to the Top funding, and set aside $350 million in American Recovery and Reinvestment Act funding to develop ways to assess schools’ performance in meeting common core standards.

Does research support this approach? The Brown Center on Education Policy at Brookings examined the relationship between student achievement outcomes in mathematics at the state level and ratings of the quality of state content standards in math. There was no association. Some states with strong standards produce high-achieving students, such as Massachusetts, while other states with strong standards languish near the bottom in terms of achievement, such as California. Some states with weak standards boast high levels of achievement, such as New Jersey, while others with weak standards experience low levels of achievement, such as Tennessee.

Four Ideas

For every complex problem there is one solution which is simple, neat, and wrong. — H. L. Mencken

I will avoid Mencken’s opprobrium by proposing four solutions rather than one. Although education has far too many moving parts to be dramatically reformed by any short list of simple actions, we can start with changes that are straightforward, ripe for action and most promising, based on research and past experience.

Link K-12 Curricula to Comparative Effectiveness

Little attention has been paid to the choice of curriculum as a driver of student achievement. Yet the evidence for large curriculum effects is persuasive. Consider a recent study of first-grade math curricula, reported by the National Center for Education Evaluation and Regional Assistance in February 2009. The researchers randomly assigned schools to one of four widely used curricula. Two curricula were clear winners, generating three months’ more learning over a nine-month school year than the other two. This is a big effect on achievement, and it is essentially free because the more effective curricula cost no more than the others.

The federal government should fund many more comparative effectiveness trials of curricula, and schools using federal funds to support the education of disadvantaged students should be required to use evidence of effectiveness in the choice of curriculum materials. The Obama administration supports comparative effectiveness research in health care. It is no less important in education.

Evaluate Teachers Meaningfully

Good education outcomes for students depend on good teachers. If we have no valid and reliable system in place to identify who is good, we cannot hope to create substantial improvements in the quality of the teacher workforce.

A substantial body of high-quality research demonstrates that teachers vary substantially in effectiveness, with dramatic consequences for student learning. To increase academic achievement overall and address racial, ethnic and socioeconomic achievement gaps, we must enhance the quality of the teacher workforce and provide children from poor and minority backgrounds with equitable access to the best teachers.

Despite strong empirical evidence for differences in teacher performance—as well as intuitive appeal, demonstrated when we remember our own best and worst teachers—the vast majority of public school teachers in America face no meaningful evaluation of on-the-job performance. A recent survey of thousands of teachers and administrators, spanning 12 districts in four states, revealed that none of the districts’ formal evaluation processes differentiated meaningfully among levels of teaching effectiveness, according to a 2009 report published by The New Teacher Project. In districts using binary ratings, more than 99 percent of teachers were rated satisfactory. In districts using a broader range of ratings, 94 percent of teachers received one of the top two ratings, and less than one percent were rated unsatisfactory. In most school districts, virtually all probationary teachers receive tenure—98 percent in Los Angeles, for example—and very small numbers of tenured teachers are ever dismissed for poor performance.

Conditions of employment should be restructured to recruit and select more promising teachers, provide opportunities for them to realize their potential, keep the very best teachers in the profession, and motivate them to serve in locations where students have the highest needs. The precondition for these changes is a valid system of evaluating teachers.

The federal government should require school districts to evaluate teachers meaningfully, as a condition of federal aid. Washington also should provide extra support to districts that pay substantially higher salaries to teachers demonstrating persistently high effectiveness and serving in high-needs schools. But, because many technical issues in the evaluation of on-the-job performance of teachers are unresolved, the federal government should refrain, at least for now, from mandating specific evaluation components or designs. The essential element is meaningful differentiation—that is, a substantial spread of performance outcomes.

Accredit Online Education Providers

Traditional forms of schooling are labor-intensive and offer few economies of scale. To the extent that financial resources are critical to education outcomes, the only way to improve the U.S. education system in its current configuration is to spend more. Yet we currently spend more per student on education than any other country in the world, and the appetite for ever-increasing levels of expenditure has been dampened by changing demographics and ballooning government deficits. The monies that can be reasonably anticipated in the next decade or two will hardly be enough to forestall erosion in the quality of the system, as currently designed. The game changer for education productivity will have to be technology, which can both cut labor costs and introduce competitive pressures.

Already, at the college level, online education (also termed “virtual education” or “distance learning”) is proving competitive with the classroom experience. Nearly 3.5 million students in 2006—about 20 percent of all students in postsecondary schools and twice the number five years previously—were taking at least one course online, according to a 2007 report published by the Sloan Consortium.

In K-12, online education is developing much more slowly. But, the case for online K–12 education is strong—and linked to cost control. A survey reported on page one of Education Week (March 18, 2009) found the average per-pupil cost of 20 virtual schools in 14 states to be about half the national average for a traditional public school.

Local and state control of access to virtual schooling impedes the growth of high-quality online education and the competitive pressure it contributes to traditional schooling. Development costs are very high for virtual courseware that takes full advantage of the newest technologies and advances in cognitive science and instruction—much higher than the costs for traditional textbooks and instructional materials. These development costs can only be rationalized if the potential market for the resulting product is large. But, states and local school districts now are able to determine whether an online program is acceptable. The bureaucracy that may be most disrupted by the introduction of virtual education acts as gatekeeper.

To overcome this challenge, K-12 virtual public education would benefit from the model of accreditation used in higher education. Colleges and universities are accredited by regional or national bodies recognized by the federal government. Such accrediting bodies as the New England Association of Schools and Colleges and the Accrediting Council for Independent Colleges and Schools are membership organizations that determine their own standards within broad federal guidelines. Once an institution is accredited, students residing anywhere can take its courses, often with the benefit of federal and state student aid.

Federal legislation to apply this accreditation model to online K-12 education could transform public education, especially if the legislation also required school districts to cover the reasonable costs of online courses for students in persistently low-performing schools. This approach would exploit—and enhance—U.S. advantages in information technology. We are unlikely to regain the international lead in education by investing more in business as usual; but we could leapfrog over other countries by building new, technology-intensive education systems.

Link Postsecondary Programs to Labor Market Outcomes

On a per-student basis, the United States spends two and one-half times the developed countries’ average on postsecondary education. Although our elite research universities remain remarkable engines of innovation and are the envy of the world, our postsecondary education system in general is faltering. The United States used to lead the world in higher education attainment, but, according to 2009 OECD data, is now ranked 12th among developed countries. We have become a high-cost provider of mediocre outcomes.

Critical to addressing this problem is better information on the performance of our postsecondary institutions. As the U.S. Secretary of Education’s Commission on the Future of Higher Education concluded in 2006:

Our complex, decentralized postsecondary education system has no comprehensive strategy, particularly for undergraduate programs, to provide either adequate internal accountability systems or effective public information. Too many decisions about higher education—from those made by policymakers to those made by students and families—rely heavily on reputation and rankings derived to a large extent from inputs such as financial resources rather than outcomes. Better data about real performance and lifelong working and learning ability is absolutely essential if we are to meet national needs and improve institutional performance.

Ideally, this information would be available in comparable forms for all institutions through a national system of data collection. However, achieving consensus on the desirability of a national database of student records has proved politically contentious. One of the issues is privacy of information. More powerful is the opposition of some postsecondary institutions that apparently seek to avoid accountability for their performance.

The way forward is for Congress to authorize, and fund at the state level, data systems that follow individual students through their postsecondary careers into the labor market. The standards for such state systems could be recommended at the federal level or by national organizations, to maximize comparability and eventual interoperability.

The public face of such a system at the state level would be a website allowing prospective students and parents to compare degree and certificate programs within and across institutions on diverse outcomes, with corresponding information on price. At a minimum, the outcomes would include graduation rates, employment rates and average annual earnings five years after graduation. Outcomes would be reported at the individual program level, such as the B.S. program in chemical engineering at the University of Houston. Price could be reported in three ways: advertised tuition, average tuition for new students for the previous two years, and average tuition for new students for the previous two years net of institutional and state grants for students eligible for federally subsidized student loans. These different forms of price information are necessary because institutions frequently discount their advertised price, particularly for low-income students. Students and families need information about discounts in order to shop on the basis of price.

Many states, such as Washington, already have data that would allow the creation of such college search sites, at least for their public institutions. The primary impediment to progress is the federal Family Educational Rights and Privacy Act (FERPA), which makes it very difficult for postsecondary institutions to share data on individual students with state agencies, such as the tax division or unemployment insurance office, in order to match students with information on post-graduation employment and wages. Congress should amend FERPA to allow such data exchanges among state agencies while maintaining restrictions on release of personally identifiable information. To address privacy concerns, Congress also should impose substantial penalties for the public release of personally identifiable information; FERPA currently is toothless.

Creating a higher education marketplace that is vibrant with transparent and valid information on performance and price would be a powerful driver of reform and innovation. Easily addressed concerns about the privacy of student records and political opposition from institutions that do not want their performance exposed to the public have stood in the way of this critical reform for too long. America’s economic future depends on returning the United States to the forefront of education attainment. Simply put, many more of our students need to finish high school and graduate from college. Investments in improved data, along with structural reforms and innovation, can help restore our leadership in educational attainment and increase economic growth.


Opportunity through Education: Two Proposals


Policy Brief #181

The new normal for local, state and federal governments is fiscal austerity. Although President Obama supported education during his State of the Union address and in his budget proposal to Congress, cash-strapped localities and states—which foot most of the bill for educating America’s children—may have to balance their budgets with cuts to schools and teachers. The recession exposed a long-developing structural imbalance between public expenditure and the revenue raised to pay for public services. On education especially, reality has set in with a vengeance.

Cutting public expenditure is not necessarily a bad thing. There are, however, some activities that have become so fundamentally governmental and so critically important to the nation’s future that they require special care during a period of severe budget trimming. Education is one such example.

The Brown Center on Education Policy at Brookings has recently developed proposals to ensure that federal investments in education have impact. These proposals present the dual advantage of low costs of implementation at the federal level coupled with the promise of considerable leverage at the state and local level. Two of those proposals are presented in this brief: increasing digital and virtual education and expanding consumer information on higher education.



RECOMMENDATIONS
One important path to individual opportunity is higher levels of educational attainment. The U.S. economy is marked by an increasing economic divide between those who are educated and those who are not. In a time of fiscal austerity, every federal dollar invested in education must have a return.

Congress should:
  • Increase digital and virtual education. In reauthorizing the No Child Left Behind Act, provide that parents of economically disadvantaged students who are eligible for federal Title I funding should be able to direct that the funding associated with their child be spent to cover the costs of enrolling their child in virtual courses or in a virtual school.
     
  • Expand consumer information in higher education. Amend the Higher Education Act (HEA) to require that states that receive federal funds for statewide longitudinal data systems provide information on completion rates, employment levels, and annual earned income for each degree or certificate program and for each degree-granting institution that operates in the state. This information could be disseminated on the Internet.


White House releases breakthrough strategy on antibiotic resistance


After years of warnings from the public health community about the growing threat of antibiotic resistance, yesterday the White House announced a national strategy to combat the problem within the U.S. and abroad. The administration’s commitment represents an important step forward, as antibiotic-resistant infections are responsible for 23,000 deaths annually and cost over $50 billion in excess health spending and lost productivity. The administration’s National Strategy on Combating Antibiotic-Resistant Bacteria includes incentives for developing new drugs, more rigorous stewardship of existing drugs, and better surveillance of antibiotic use and of the pathogens that are resistant to them. President Obama also issued an Executive Order that establishes an interagency Task Force and a non-governmental Presidential Advisory Council that will focus on broad-based strategies for slowing the emergence and spread of resistant infections.

While antibiotics are crucial for treating bacterial infections, their misuse over time has contributed to an alarming rate of antibiotic resistance, including the development of multidrug-resistant bacteria, or “super bugs.” Misuse occurs in all corners of public and private life, from the doctor’s office, where antibiotics are prescribed to treat viral infections, to industrial agriculture, where they are used in abundance to promote growth in livestock. New data from the World Health Organization (WHO) and the U.S. Centers for Disease Control and Prevention (CDC) confirm that rising overuse of antibiotics has already become a major public health threat worldwide.

The administration’s announcement included a report from the President’s Council of Advisors on Science and Technology (PCAST) titled “Combatting Antibiotic Resistance,” which includes recommendations developed by a range of experts to help control antibiotic resistance. The announcement also outlined a $20 million prize to reward the development of a new rapid, point-of-care diagnostic test. Such tests help health care providers choose the right antibiotics for their patients and streamline drug development by making it easier to identify and treat patients in clinical trials.

The Need for Financial Incentives and Better Reimbursement

A highlight of the PCAST report is its recommendations on economic incentives to bring drug manufacturers back into the antibiotics market. Innovative changes to pharmaceutical regulation and research and development (R&D) will be welcomed by many in the health care community, but financial incentives and better reimbursement are necessary to alleviate the market failure for antibacterial drugs. A major challenge, particularly within a fee-for-service or volume-based reimbursement system, is providing economic incentives that promote investment in drug development without encouraging overuse.

A number of public and private stakeholders, including the Engelberg Center for Health Care Reform and Chatham House’s Centre on Global Health Security Working Group on Antimicrobial Resistance, are exploring alternative reimbursement mechanisms that “de-link” revenue from the volume of antibiotics sold. Such a mechanism, combined with further measures to stimulate innovation, could create a stable incentive structure to support R&D. Further, legislative proposals under consideration by Congress to reinvigorate the antibiotic pipeline, including the Antibiotic Development to Advance Patient Treatment (ADAPT) Act of 2013, could complement the White House’s efforts and help turn the tide on antibiotic resistance. Spurring the development of new antibiotics is critical because resistance will continue to develop even if health care providers and health systems can find ways to prevent the misuse of these drugs.


Breakthrough therapy designation: A primer


Breakthrough therapy designation (BTD) is the newest of four expedited programs developed by the U.S. Food and Drug Administration (FDA) to accelerate the development and review of novel therapies that target serious conditions. The public response to the program has been largely positive, and dozens of drugs have successfully received the designation. However, the FDA denies many more requests than it grants. In fact, as of March 2015, fewer than one in three of the BTD requests submitted had been granted. By contrast, roughly 75 percent of the requests for fast track designation (another of the agency’s expedited programs) were granted between 1998 and 2007. This discrepancy suggests ongoing uncertainty over what exactly constitutes a “breakthrough” according to the FDA’s criteria.

On April 24, the Center for Health Policy at Brookings will host an event, Breakthrough Therapy Designation: Exploring the Qualifying Criteria, that will discuss qualifying criteria for the BTD program using real and hypothetical case studies to explore how FDA weighs the evidence submitted. Below is a primer that describes the definition, value, and impact of BTD.

What is BTD?

BTD was established in 2012 under the Food and Drug Administration Safety and Innovation Act, and is intended to expedite the development and review of drugs that show signs of extraordinary benefit at early stages of the clinical development process. However, BTD is not an automatic approval. The drug still has to undergo clinical testing and review by the FDA. Rather, BTD is designed to facilitate and shorten the clinical development process, which can otherwise take many years to complete.

What criteria does FDA use to evaluate potential breakthroughs?

In order to qualify for the designation, a therapy must be intended to treat a serious or life-threatening illness, and there must be preliminary clinical evidence that it represents a substantial improvement over existing therapies on at least one clinically significant outcome (such as death or permanent impairment).

In considering a request for BTD, FDA relies on three primary considerations:

1) the quantity and quality of the clinical evidence being submitted;

2) the available therapies that the drug is being compared to; and

3) the magnitude of treatment effect shown on the outcome being studied.


In practice, however, it can be difficult to define a single threshold that a therapy must meet. The decision depends on the specific context for that drug. In some cases, for example, the targeted disease has few or no treatments available, while in others there may be several effective alternative treatments to which the new therapy can be compared. The request may also be made at different stages of the clinical development process, which means that the amount and type of data available to FDA can vary. In some cases, early evidence of benefit may disappear when the drug is tested in larger populations, which is why FDA reserves the right to rescind the designation if subsequent data show that the therapy no longer meets the criteria.

How many therapies have received the designation?

As of March 2015, FDA had received a total of 293 requests for BTD. Of these, 82 received the designation, and 23 have since been approved for marketing. Ten of these approvals were new indications for already approved drugs, rather than novel therapies that had never before received FDA approval.

What are the benefits of BTD?

For drug manufacturers, the chief benefit is the intensity and frequency of their interactions with FDA. Once the designation is granted, the FDA takes an “all hands on deck” approach to providing the manufacturer with ongoing guidance and feedback throughout the clinical development process. Products that receive BTD are also able to submit portions of their marketing application on a rolling basis (rather than all at once at the end of clinical trials), and BTD can be used in combination with other expedited programs to further reduce the product’s time to market.

For patients, the potential benefits are straightforward: earlier access to therapies that may significantly improve or extend their lives.

How does BTD relate to the other three expedited programs?

The other three expedited review and development programs—fast track designation, priority review, and accelerated approval—are also geared toward facilitating the development and approval of drugs for serious conditions. These other programs have been in place for over 15 years and have played a significant role in accelerating patient access to new therapeutics (Table 1). In 2014 alone, 66 percent of the 41 drugs approved by FDA's Center for Drug Evaluation and Research used at least one of these four pathways, and 46 percent received at least two of the designations in combination.

Table 1: Overview of FDA’s Expedited Review Programs
Adapted from FDA's Guidance for Industry: Expedited Programs for Serious Conditions - Drugs and Biologics


Breakthrough therapy designation: Exploring the qualifying criteria


Event Information

April 24, 2015
8:45 AM - 4:45 PM EDT

Ballroom
The Park Hyatt Hotel
24th and M Streets, NW
Washington, DC


Established by the Food and Drug Administration Safety and Innovation Act of 2012, breakthrough therapy designation (BTD) is one of several programs developed by the U.S. Food and Drug Administration (FDA) to speed up the development and review of drugs and biologics that address unmet medical needs. In order to qualify for this designation, the treatment must address a serious or life-threatening illness. In addition, the manufacturer (i.e., sponsor) must provide early clinical evidence that the treatment is a substantial improvement over currently available therapies. The FDA is working to further clarify how it applies the qualifying criteria to breakthrough designation applications.

On April 24, under a cooperative agreement with FDA, the Center for Health Policy convened a public meeting to discuss the qualifying criteria for this special designation. Using examples from oncology, neurology, psychiatry, and hematology, the workshop highlighted considerations for the BTD application process, the evaluation process, and factors for acceptance or rejection. The discussion also focused on key strategies for ensuring that the qualifying criteria are understood across a broad range of stakeholder groups.



Faster, more efficient innovation through better evidence on real-world safety and effectiveness


Many proposals to accelerate and improve medical product innovation and regulation focus on reforming the product development and regulatory review processes that occur before drugs and devices get to market. While important, such proposals alone do not fully recognize the broader opportunities that exist to learn more about the safety and effectiveness of drugs and devices after approval. As drugs and devices begin to be used in larger and more diverse populations and in more personalized clinical combinations, evidence from real-world use during routine patient care is increasingly important for accelerating innovation and improving regulation.

First, further evidence development from medical product use in large populations can allow providers to better target and treat individuals, precisely matching the right drug or device to the right patients. As genomic sequencing and other diagnostic technologies continue to improve, postmarket evidence development is critical to assessing the full range of genomic subtypes, comorbidities, patient characteristics and preferences, and other factors that may significantly affect the safety and effectiveness of drugs and devices. In premarket randomized controlled trials, this information is often not available, or population sizes are inadequate to characterize such subgroup differences.

Second, improved processes for generating postmarket data on medical products are necessary for fully realizing the intended effect of premarket reforms that expedite regulatory approval. The absence of a reliable postmarket system to follow up on potential safety or effectiveness issues means that potential signals or concerns must instead be addressed through additional premarket studies or through one-off postmarket evaluations that are more costly, slower, and likely to be less definitive than would be possible through a better-established infrastructure. As a result, the absence of better systems for generating postmarket evidence creates a barrier to more extensive use of premarket reforms to promote innovation.

These issues can be addressed through initiatives that combine targeted premarket reforms with postmarket steps to enhance innovation and improve evidence on safety and effectiveness throughout the life cycle of a drug or device. The ability to routinely capture clinically relevant electronic health data within our health care ecosystem is improving, increasingly allowing electronic health records, payer claims data, patient-reported data, and other relevant data to be leveraged for further research and innovation in care. Recent legislative proposals released by the House of Representatives’ 21st Century Cures effort acknowledge and seek to build on this progress in order to improve medical product research, development, and use. The initial Cures discussion draft included provisions for better, more systematic reporting of and access to clinical trials data; for increased access to Medicare claims data for research; and for FDA to promulgate guidance on the sources, analysis, and potential use of so-called Real World Evidence. These are potentially useful proposals that could contribute valuable data and methods to advancing the development of better treatments.

What remains a gap in the Cures proposals, however, is a more systematic approach to improving the availability of postmarket evidence. Such a systematic approach is possible now. Biomedical researchers and health care plans and providers are doing more to collect and analyze clinical and outcomes data. Multiple independent efforts – including the U.S. Food and Drug Administration’s Sentinel Initiative for active postmarket drug safety surveillance, the Patient-Centered Outcomes Research Institute’s PCORnet for clinical effectiveness studies, the Medical Device Epidemiology Network (MDEpiNet) for developing better methods and medical device registries for medical device surveillance, and a number of dedicated, product-specific outcomes registries – have demonstrated the potential for large-scale, systematic postmarket data collection. Building on these efforts could provide unprecedented evidence on how medical products perform in the real world and on the course of the underlying diseases that they are designed to treat, while still protecting patient privacy and confidentiality.

These and other postmarket data systems now hold the potential to support public-private collaboration on improved, population-based evidence about medical products at a much wider scale. Action in the Cures initiative to unlock this potential will enable the legislation to achieve its intended effect of promoting quicker, more efficient development of effective, personalized treatments and cures.

What follows is a set of both short- and long-term proposals that would bolster the current systems for postmarket evidence development, create new mechanisms for generating postmarket data, and enable individual initiatives on evidence development to work together as part of a broad push toward a truly learning health care system.





through

Event recap: Lessons learned from two years of breakthrough therapy designation


The breakthrough therapy designation (BTD) program was initiated by the U.S. Food and Drug Administration (FDA) in 2012 to expedite the development of treatments for serious or life-threatening illnesses that demonstrate “substantial improvement” over existing therapies. The program has since become a widely supported mechanism for accelerating patient access to new drugs. As of March 2015, FDA has received a total of 293 requests for BTD but has granted just 82 (28%), which indicates an ongoing lack of clarity over what exactly meets the criteria for the designation.

On April 24, the Center for Health Policy at Brookings convened a public meeting to explore the designation’s qualifying criteria and how FDA applies those criteria across therapeutic areas. Panelists used real-world and hypothetical case studies to frame the discussion, and highlighted major considerations for the application process, the FDA’s evaluation of the evidence, and the key factors for acceptance or rejection. The discussion also identified strategies to ensure that qualifying criteria are well understood. Here are the five big takeaways:

1.  The BTD program is viewed positively by drug companies, researchers, advocates, and others 

Across the board, participants expressed enthusiasm for the BTD program. Industry representatives noted that their experience had been extremely positive, and that the increased cooperation with and guidance from FDA were very helpful in streamlining their development programs. Receiving the designation can also raise a drug company’s profile, which can facilitate additional investment as well as clinical trial patient recruitment; this is particularly important for smaller companies with limited resources.

Patient and disease advocates were likewise supportive, and expressed hope that the early lessons learned from successful breakthrough therapy approvals (which have been concentrated mostly in oncology and antivirals) could be translated to disease areas that have seen less success. However, while BTD is an important tool for expediting the development of new drugs, it is just one piece of the broader scientific and regulatory policy landscape. Accelerating the pace of discovery and development of truly innovative new drugs will depend on a range of other factors, such as developing and validating new biomarkers that can be used to measure treatment effects at an earlier stage, as well as establishing networks that can streamline the clinical trial process. It will also be important to develop effective new approaches to collecting, analyzing, and communicating information about these treatments once they are on the market, as this information can be used by FDA, providers, and patients to further improve prescription drug policy and medical decision-making.

2.  BTD requests far outnumber those that actually meet the qualifying criteria

Since the program began, fewer than 30 percent of requests have received the designation. A substantial majority were denied at least in part because of a lack of data, problems with the quality of the data, or some combination of the two. For example, some sponsors requested the designation before they had any clinical data, or submitted requests based on clinical data that were incomplete or drawn from flawed study designs. Many requests also failed to meet the agency’s bar for “substantial improvement” over existing therapies.

One reason for the high denial rate may be the lack of a clear regulatory or statutory bar that sponsors could use as a definitive guide to what is needed to qualify for the designation. BTD denials are also confidential, which means that sponsors effectively have nothing to lose by submitting a request. Going forward, manufacturers may need to exercise more discretion in deciding whether to request the designation, as the process can be resource- and time-intensive for both sides.

3.  There is no single threshold for determining what defines a breakthrough therapy

About 53 percent of the 109 total BTD denials were due at least in part to the fact that the drug did not represent a substantial improvement over existing therapies. During the day’s discussion, FDA and sponsors both noted that this is likely because the criteria for BTD are inherently subjective. In practice, this means there is no clear threshold for determining when a new therapy represents a “substantial improvement” over existing therapies. Designation decisions are complex and highly dependent on the context, including the disease or condition being targeted, the availability of other treatments, the patient population, the outcomes being studied, and the overall reliability of the data submitted. Given the multiple factors at play, it can be difficult in some cases to determine when a new product is potentially “transformational” as opposed to “better,” especially for conditions that are poorly understood or have few or no existing treatments. In making its determinations, FDA considers the totality of the evidence submitted, rather than focusing on specific evidentiary requirements.

4.  Early communication with FDA is strongly recommended for BTD applicants

Roughly 72 percent of the BTD denials related at least in part to trial design or analysis problems, which led several people to suggest that sponsors engage with FDA prior to submitting their request. Though there are several formal mechanisms for interacting with the agency, informal consultations with the relevant review division could help sponsors to get a better and much earlier sense of what kind of data FDA might need. This early communication could both strengthen viable BTD requests and reduce the number of frivolous requests.

5.  FDA may need more resources for implementing the BTD program

Drugs that receive breakthrough designation are subject to much more intensive FDA guidance and review. However, when the program was established in 2012, Congress did not allocate funding to cover its costs. There have been ongoing concerns that the program is exacting a significant toll on FDA’s already limited resources, and potentially affecting the timeline for other drug application reviews. These concerns were reiterated during the day’s discussion, and some suggested that Congress consider attaching a user fee to the BTD program when the Prescription Drug User Fee Act comes up for reauthorization in 2017.





through

What macroprudential policies are countries using to help their economies through the COVID-19 crisis?

Countries around the world are reeling from the health threat and economic and financial fallout from COVID-19. Legislatures are responding with massive relief programs. Central banks have lowered interest rates and opened lender-of-last-resort spigots to support the flow of credit and maintain financial market functioning. Authorities are also deploying macroprudential policies, many of them developed…

       




through

How the EU and Turkey can promote self-reliance for Syrian refugees through agricultural trade

Executive Summary The Syrian crisis is approaching its ninth year. The conflict has taken the lives of over 500,000 people and forced over 7 million more to flee the country. Of those displaced abroad, more than 3.6 million have sought refuge in Turkey, which now hosts more refugees than any other country in the world.…

       




through

Optimal solar subsidy policy design and incentive pass-through evaluation: using US California as an example


Renewable energy is an important tool for tackling climate change, as the latest IPCC report has pointed out. However, because of multiple market failures, such as the negative externalities of fossil fuels and the knowledge spillovers of new technology, government subsidies are still needed to develop renewable energy sources such as solar photovoltaic (PV) cells. In the United States, there have been various forms of subsidies for PV, from the federal and state levels to the city and utility levels. California, the pioneer of solar PV development, has put forward the largest state-level subsidy program for PV, the California Solar Initiative (CSI). The CSI planned to spend around $2.2 billion in 2007–2016 to install roughly 2 GW of PV capacity, with an average subsidy level as high as $1.1/W. Evaluating the cost-effectiveness and the incentive pass-through of this program are the two major research questions we pursue.

Our cost-effectiveness analysis is based on a constrained optimization model that we developed, where the objective is to install as much PV capacity as possible under a fixed budget constraint. Both the analytical and computational results suggest that, because of a strong peer effect and the learning-by-doing effect, shifting subsidies from later periods to earlier periods can increase the final installed PV capacity by 8.1% (or 32 MW). However, if the decision-maker has other policy objectives or constraints in mind, such as maintaining policy certainty, then the optimally calculated subsidy policy would look much like the CSI.
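To make the structure of this exercise concrete, the sketch below sets up a stylized, budget-constrained subsidy schedule problem with a peer effect and learning-by-doing. The functional forms, parameter values, and the scipy-based solver are illustrative assumptions, not the model actually estimated in the paper.

```python
# A minimal sketch (not the paper's actual model) of the budget-constrained
# subsidy allocation idea: choose per-period subsidies s_t to maximize total
# installed PV capacity when demand rises with the subsidy, the installed base
# (peer effect), and cumulative learning. All parameters are illustrative.
import numpy as np
from scipy.optimize import minimize

T = 10                 # planning periods
BUDGET = 2.2e9         # total subsidy budget ($), as in the CSI
base_demand = 150.0    # MW installed per period with no subsidy (assumed)
alpha = 60.0           # extra MW of demand per $/W of subsidy (assumed)
peer = 0.05            # peer effect: extra demand per MW of installed base (assumed)
learn = 0.02           # learning-by-doing boost per cumulative MW (assumed)

def installed_path(subsidy_per_watt):
    """Simulate capacity added each period given a subsidy schedule ($/W)."""
    cumulative = 0.0
    added = []
    for s in subsidy_per_watt:
        q = base_demand + alpha * s + (peer + learn) * cumulative
        added.append(q)
        cumulative += q
    return np.array(added)

def neg_total_capacity(s):
    return -installed_path(s).sum()

def budget_gap(s):
    # Spending = subsidy ($/W) * capacity (MW) * 1e6 W/MW; must stay within budget.
    spend = (s * installed_path(s) * 1e6).sum()
    return BUDGET - spend

res = minimize(
    neg_total_capacity,
    x0=np.full(T, 1.1),                      # start at the CSI average of $1.1/W
    bounds=[(0.0, 2.5)] * T,
    constraints=[{"type": "ineq", "fun": budget_gap}],
)
print("subsidy schedule ($/W):", np.round(res.x, 2))
print("total capacity (MW):", round(-res.fun, 1))
```

With peer and learning effects of this kind, the solver tends to front-load the subsidy, which mirrors the paper's finding that shifting subsidies to earlier periods raises final installed capacity.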

On the incentive pass-through question, we took a structural approach and also applied a regression discontinuity (RD) design. While in general the pass-through rate depends on the curvature of the demand and supply curves and the degree of market competition, both of our estimates indicate that pass-through under the CSI program is nearly complete. In other words, almost all of the incentive was captured by customers, and PV installers retained little of it. Based on the RD design, we also observe that PV installers tend to treat the CSI incentive as exogenous to their pricing decisions.
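For readers less familiar with the pass-through estimation idea, the following sketch shows one way an RD comparison around a CSI step change could be set up. The file name, column names, and 60-day bandwidth are hypothetical; the paper's actual structural and RD estimations are more involved.

```python
# An illustrative regression discontinuity sketch (not the paper's estimation):
# around a CSI step-down date, compare pre-subsidy PV contract prices just
# before and just after the incentive drop. If installers treat the incentive
# as exogenous, the price change divided by the incentive change approximates
# the pass-through rate. The dataset and column names below are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("csi_projects.csv")   # hypothetical file, one row per PV system
# 'running' = days relative to the step-change date, 'price_w' = price in $/W.
df["post"] = (df["running"] >= 0).astype(int)

# Local linear RD: allow separate trends on each side of the cutoff.
rd = smf.ols("price_w ~ post + running + post:running",
             data=df[df["running"].abs() <= 60]).fit()

incentive_drop = 0.45                   # example step-down size in $/W (assumed)
pass_through = -rd.params["post"] / incentive_drop
print(f"Estimated pass-through rate: {pass_through:.2f}")
```

A pass-through estimate close to 1 would match the finding that customers capture almost all of the incentive.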

The relatively good performance of the CSI on both cost-effectiveness and incentive pass-through is closely tied to its policy design and program management. Internationally, the biggest challenge in designing any PV subsidy program is that the budget is exhausted too quickly, with customers rushing to claim the subsidy. Such rushing behavior is a clear indication that incentive levels are higher than needed. Because policy is rigid while PV technology changes rapidly, the subsidy schedule can lag behind the decline in PV costs; as a result, rational customers will rush for any unnecessarily generous subsidy.

Because future PV costs are highly uncertain and hard to predict, the CSI introduced a new design that ties changes in the incentive level to the fulfillment of installed-capacity goals. Specifically, the CSI defined nine steps toward its policy goal; each step pairs a PV capacity goal with an incentive level. Once a step’s capacity goal is met, the incentive drops to the next lower level. Furthermore, to maintain policy certainty, the CSI required that each step-wise change in the incentive be no larger than $0.45/W and no smaller than $0.05/W, along with three other constraints.
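The step-down mechanism described above can be illustrated with a short sketch. The capacity goals and incentive levels below are placeholder values, not the real CSI schedule; only the nine-step structure and the $0.05–$0.45/W bound on each decrease come from the text.

```python
# A small sketch of a CSI-style step-down mechanism: the incentive stays at its
# current level until that step's capacity goal is met, then drops to the next
# step. Goals and levels here are placeholders, not the actual CSI schedule.
MIN_STEP, MAX_STEP = 0.05, 0.45   # allowed range for each incentive decrease ($/W)

# (capacity goal in MW for the step, incentive level in $/W) -- illustrative values
steps = [(50, 2.50), (70, 2.20), (100, 1.90), (130, 1.55), (160, 1.25),
         (190, 1.00), (215, 0.80), (250, 0.65), (300, 0.55)]

# Sanity-check that consecutive decreases respect the regulated bounds.
for (_, s1), (_, s2) in zip(steps, steps[1:]):
    assert MIN_STEP <= s1 - s2 <= MAX_STEP, f"step {s1}->{s2} violates the bound"

def current_incentive(installed_mw: float) -> float:
    """Return the incentive that applies given cumulative installed capacity."""
    cumulative_goal = 0.0
    for goal, incentive in steps:
        cumulative_goal += goal
        if installed_mw < cumulative_goal:
            return incentive
    return 0.0   # program ends once all step goals are fulfilled

print(current_incentive(40))    # still in step 1
print(current_incentive(400))   # several steps in, at a lower incentive
```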

A good subsidy policy not only requires flexible design that can respond to a fast-changing environment, but also demands an efficient, and ideally digitalized, program management system. For the CSI, the authority contracted a third party to maintain a comprehensive database for the program. The database documents in detail every PV system that customers requested. Key data fields include 22 important dates in the PV installation process; customers’ zip code, city, utility, and county; and various characteristics of the PV system such as price, system size, incentive, PV module, and installer. All of this information is publicly available, which helps close the information gap facing customers and fosters competition among PV installers. To receive the incentive, a customer’s PV system must pass local government inspection and be interconnected to the grid. On the supply side, the CSI also certified a list of PV installers from which every customer can choose.

Although the CSI ended in 2014 because of the rapid PV cost reductions that began in 2009, its experience has been transferred to other parts of the United States and to Europe. Other new technologies and products (e.g., electric cars and batteries) could well adopt the CSI’s policy design, too. In summary, a good and successful policy may need to be simple, clear, credible, foreseeable, flexible, end-able, and incentive-compatible. Compared with the CSI, China’s PV subsidy policy still has a long way to go.

Authors

  • Changgui Dong
      
 
 




through

Scaling Up Through Aid: The Real Challenge

Summary

At the Gleneagles Summit in 2005, leaders of the G8 group of nations committed to increase aid to poor nations by $50 billion per year. During the same year, in a meeting in Paris, donors promised to coordinate their interventions for more effective delivery. These commitments are now often referred to as the promise of donors to “scale up aid.” Increasing aid flows and improving coordination are indeed important goals and, in fact, goals that donors seem to have trouble meeting. The international donor community met this fall in Accra and will meet in Doha in November 2008 to review progress with this aspect of scaling up aid, and it is hoped that they will recommit to meet the ambitious targets set three years ago.

Scaling up aid is only one of the challenges that donors face. A more important challenge is to “scale up through aid,” meaning that aid flows should not merely support short-lived, one-time and partial development interventions—pilot projects, short-term technical assistance, programs that only address part of the problem, but leave major bottlenecks unaddressed—but should support projects, programs and policies that scale up successful interventions in a country, region or globally to reach the entire target population. Scaling up means that programs are long-term and sustained and that external support is aligned with country needs and deals comprehensively with the development challenges—often by working in partnership with other donors and pooling resources. This is the scaling up challenge that donors should address head-on, but so far have not.

This policy brief reports on the findings of an in-depth review of the literature and practice of scaling up development interventions and focuses on the role that aid donors can play in supporting scaling up for effective development. It stresses that successful scaling up with external assistance means that donor agencies need to: work with a vision and leadership; help create the political constituencies for large-scale implementation; create linkages among project, program and policy interventions; strengthen the institutional capacity of the implementing entities; provide for effective incentives and accountabilities of their own staff and management; work together with each other; monitor and evaluate the progress of programs with special attention to the scaling up dimension; and finally make sure they focus on effective preparation and flexible implementation of the scaling up process. While this is a long-term agenda, donors can take a few practical steps right away that will provide a basis for a more ambitious effort over time.

 
 




through

Life after coronavirus: Strengthening labor markets through active policy

Prior to the COVID-19 crisis, the growing consensus was that the central challenge to achieving inclusive economic prosperity was the creation of good jobs that bring more workers closer to a true “middle-class” lifestyle (Rodrik, 2019). This simple goal will be hard to meet. The lingering effects of the coronavirus crisis will add to the…

       




through

What must corporate directors do? Maximizing shareholder value versus creating value through team production


In our latest 21st Century Capitalism initiative paper, "What must corporate directors do? Maximizing shareholder value versus creating value through team production," author Margaret M. Blair explores how the share value maximization norm (or the “short-termism” malady) came to dominate, why it is wrong, and why the “team production” approach provides a better basis for governing corporations over the long term.

Blair reviews the legal and economic theories behind the share-value maximization norm, and then lays out a theory of corporate law building on the economics of team production. Blair demonstrates how the team production theory recognizes that creating wealth for society as a whole requires recognizing the importance of all of the participants in a corporate enterprise, and making sure that all share in the expanding pie so that they continue to collaborate to create wealth.

Arguing that the corporate form itself helps solve the team production problem, Blair details five features which distinguish corporations from other organizational forms:

  1. Legal personality
  2. Limited liability
  3. Transferable shares
  4. Management under a Board of Directors
  5. Indefinite existence

Blair concludes that these five characteristics are all problematic from a principal-agent point of view where shareholders are principals. However, the team production theory makes sense out of these arrangements. This theory provides a rationale for the role of corporate directors consistent with the role that boards of directors historically understood themselves to play: balancing competing interests so the whole organization stays productive.

Authors

  • Margaret M. Blair
     
 
 




through

What drove oil prices through the floor this week?

The coronavirus pandemic has sent crude oil prices plummeting, so much so that the price for West Texas Intermediate oil dropped below zero dollars earlier this week. In this special edition of the podcast, Samantha Gross joins David Dollar to explain the factors influencing recent changes in demand for oil and the long-term effects the…

       




through

African Youth Pay Tribute to Nelson Mandela through Civic Action for Development


As the world pays its tributes to the critically ailing former South African President Nelson Mandela, youth across Africa are stepping up their own tributes to Madiba in the form of civic service on Mandela Day. The United Nations and the African Union have called on citizens across Africa and the world to volunteer 67 minutes—representing the 67 years of Mandela’s public service—to community projects on his birthday, July 18.

The Africa Peace Service Corps (APSC) has launched volunteering projects in Nairobi, Kenya; Cape Town and rural Limpopo, South Africa; Lusaka, Zambia; Abuja, Nigeria; villages in Uganda and other countries.  Four hundred youths and 35 partners assembled last July at the United Nations conference in Nairobi to launch the Pan-African service project, spurring civic action in health, climate change, youth entrepreneurship and positive peace. 

A 2012 Brookings report, “Volunteering and Civic Service in Three African Regions,” released at the Nairobi conference and co-authored by three African scholars, notes the benefits of volunteering (“Ubuntu”) in South, West and East Africa in addressing youth livelihoods, health and peace-building. The report further documents policy recommendations and strategies linking youth service and entrepreneurship to address the daunting task of youth unemployment across the region. Dr. Manu Chandaria (Comcraft CEO and Global Peace Foundation Africa chairman) and Les Baillie (chairman of the foundation of Kenyan mobile phone giant Safaricom, which created Africa’s M-Pesa mobile banking success) have assembled corporate leaders to back APSC youth social enterprises in tree planting and waste management to generate green jobs and reach Kenya’s goal of ten percent tree coverage.

Nelson Mandela’s life of struggle and triumph, in particular the insights from his years unjustly incarcerated on Robben Island, provides a rich textbook for these young social entrepreneurs. During my recent Harris Wofford Global Service Fellowship with the University of Cape Town Development Policy Research Unit (DPRU) and Cross Cultural Solutions, while teaching an entrepreneurship class in the townships, I saw the teeming spirit of youth enterprise firsthand in the poorest communities. Out of those deliberations, a South African national assets demonstration was launched this year to tap the power of service and entrepreneurship in generating savings among township youth, in partnership with the Nelson Mandela Children’s Fund, the Ford Foundation, the University of Johannesburg Center for Social Development, the Washington University Center for Social Development, and Brookings’ Africa Growth Initiative partner DPRU, among others.

Along with addressing Mandela’s dream of ending poverty, a recent Brookings report, “Impacts of Malaria Interventions and their Potential Additional Humanitarian Benefits in Sub-Saharan Africa,” outlines the potentially significant peace-building effects of service in sub-Saharan Africa, highlighting the joint efforts of Nigeria’s Muslim Sultan and Catholic Cardinal in tackling malaria, along with those of the African Leaders Malaria Alliance with PEPFAR support. The contributions of volunteering to both peace and development outcomes are further underscored in the draft of a United Nations post-2015 “sustainable development goals” report.

Amid the inevitable political debates over the Mandela legacy, his generous spirit and message of reconciliation rise high above Cape Town’s Table Mountain and across the Pan-African youth landscape. The challenge of applying his vision and spiritual values to addressing poverty through emerging demonstrations of youth service, assets and entrepreneurship will test the commitment of Africa’s next generation of young freedom pioneers, guided by this humble giant’s profound legacy now spanning the globe.

Image Source: © Dylan Martinez / Reuters
      
 
 




through

Subsidizing Higher Education through Tax and Spending Programs

ABSTRACT  During the past 10 years, tax benefits have played an increasingly important role in federal higher education policy. Before 1998, most federal support for higher education involved direct expenditure programs— largely grants and loans—primarily intended to provide more equal educational opportunities for low- and moderate-income students. In 1997 (effective largely for expenses in 1998 and…

       




through

Social Entrepreneurship in the Middle East: Advancing Youth Innovation and Development through Better Policies

On April 28, the Middle East Youth Initiative and Silatech discussed a new report titled “Social Entrepreneurship in the Middle East: Toward Sustainable Development for the Next Generation.” The report is the first in-depth study of its kind addressing the state of social entrepreneurship and social investment in the Middle East and its potential for the…

       




through

Clean Energy Finance Through the Bond Market: A New Option for Progress


State and local bond finance represents a powerful but underutilized tool for future clean energy investment.

For 100 years, the nation’s state and local infrastructure finance agencies have issued trillions of dollars’ worth of public finance bonds to fund the construction of the nation’s roads, bridges, hospitals, and other infrastructure—and literally built America. Now, as clean energy subsidies from Washington dwindle, these agencies are increasingly willing to finance clean energy projects, if only the clean energy community will embrace them.

So far, these authorities are only experimenting. However, the bond finance community has accumulated significant experience in getting to scale and knows how to raise large amounts for important purposes by selling bonds to Wall Street. The challenge is therefore to create new models for clean energy bond finance in states and regions, and so to establish a new clean energy asset class that can easily be traded in capital markets. To that end, this brief argues that state and local bonding authorities and other partners should do the following:

  • Establish mutually useful partnerships between development finance experts and clean energy officials at the state and local government levels
  • Expand and scale up bond-financed clean energy projects using credit enhancement and other emerging tools to mitigate risk and through demonstration projects
  • Improve the availability of data and develop standardized documentation so that the risks and rewards of clean energy investments can be better understood
  • Create a pipeline of rated and private placement deals, in effect a new clean energy asset class, to meet the demand by institutional investors for fixed-income clean energy securities

Image Source: © Steve Marcus / Reuters
      
 
 




through

Party Fundraising Success Continues Through Mid-Year

With only a few months remaining before the 2004 elections, national party committees continue to demonstrate financial strength and noteworthy success in adapting to the more stringent fundraising rules imposed by the Bipartisan Campaign Reform Act (BCRA). A number of factors, including the deep partisan divide in the electorate, the expectations of a close presidential race, and the growing competition in key Senate and House races, have combined with recent party investments in new technology and the emergence of the Internet as a major fundraising tool to produce what one party chairman has described as a "perfect storm" for party fundraising.[1] Consequently, both national parties have exceeded the mid-year fundraising totals achieved in 2000, and both approach the general election with substantial amounts of money in the bank.

After eighteen months of experience under the new rules, the national parties are still outpacing their fundraising efforts of four years ago. As of June 30, the national parties have raised $611.1 million in federally regulated hard money alone, as compared to $535.6 million in hard and soft money combined at a similar point in the 2000 election cycle. The Republicans lead the way, taking in more than $381 million as compared to about $309 million in hard and soft money by the end of June in 2000. The Democrats have also raised more, bringing in $230 million as compared to about $227 million in hard and soft money four years ago. Furthermore, with six months remaining in the election cycle, both national parties have already raised more hard money than they did in the 2000 election cycle.[2] In fact, by the end of June, every one of the Democratic and Republican national party committees had already exceeded its hard money total for the entire 2000 campaign.[3]

This surge in hard money fundraising has allowed the national party committees to replace a substantial portion of the revenues they previously received through unlimited soft money contributions. Through June, these committees have already taken in enough additional hard money to compensate for the $254 million of soft money that they had garnered by this point in 2000, which represented a little more than half of their $495 million in total soft money receipts in the 2000 election cycle.
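The claim that additional hard money has covered the lost soft money can be verified with a quick back-of-the-envelope calculation, sketched below using only the figures quoted above.

```python
# A quick arithmetic check of the claim above, using only the quoted figures.
hard_2004_mid_year = 611.1           # $ millions, hard money raised by June 30, 2004
hard_and_soft_2000_mid_year = 535.6  # $ millions, hard + soft money by June 30, 2000
soft_2000_mid_year = 254.0           # $ millions, soft money portion by June 30, 2000

hard_2000_mid_year = hard_and_soft_2000_mid_year - soft_2000_mid_year  # 281.6
additional_hard = hard_2004_mid_year - hard_2000_mid_year              # 329.5
print(f"Additional hard money vs. mid-2000: ${additional_hard:.1f} million")
print("Covers the lost soft money:", additional_hard >= soft_2000_mid_year)
```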

View the accompanying data tables (PDF - 11.4 KB)


[1] Terrence McAuliffe, Democratic National Committee Chairman, quoted in Paul Farhi, "Small Donors Grow Into Big Political Force," Washington Post, May 3, 2004, p. A11.
[2] In 2000, the Republican national party committees raised $361.6 million in hard money, while the Democratic national committees raised $212.9 million. These figures are based on unadjusted data and do not take into account any transfers of funds that may have taken place among the national party committees.
[3] The election cycle totals for 2000 can be found in Federal Election Commission, "FEC Reports Increase in Party Fundraising for 2000," press release, May 15, 2001. Available at http://www.fec.gov/press/press2001/051501partyfund/051501partyfund.html (viewed July 28, 2004).

 
 




through

Campaign Reform in the Networked Age: Fostering Participation through Small Donors and Volunteers

Event Information

January 14, 2010
10:30 AM - 12:00 PM EST

Falk Auditorium
The Brookings Institution
1775 Massachusetts Ave., NW
Washington, DC


The 2008 elections showcased the power of the Internet to generate voter enthusiasm, mobilize volunteers and increase small-donor contributions. After decades of argument in the political world over campaign finance policy, the digital revolution has altered the calculus of participation.

On January 14, a joint project of the Campaign Finance Institute, the American Enterprise Institute and the Brookings Institution unveiled a new report that seeks to change the ongoing national dialogue about money in politics. At the event, the report's four authors detailed their findings and recommendations. Relying on lessons from the record-shattering 2008 elections and the rise of Internet campaigning, they presented a new vision of how campaign finance and communications policy can help further democracy through broader participation.

 
 




through

Photo: Take a walk through a mossy spruce forest

Our dreamy photo of the day comes from Bohuslän, Sweden.




through

Breakthrough could finally help doctors pinpoint a patient's cancer cause

Scientists find that tumors hold information like a 'black box' pointing to the specific cause of disease.




through

Power from poo: Breakthrough could lead to sustainable electricity from sewage

Oh, the wonderful things that poo can do.




through

A biotech breakthrough hopes to save bananas from extinction

While banana farmers watch their plantations get ravaged by a fungal disease, scientists think they may have found a solution.




through

1000 US veterans to receive solar job training through Troops To Solar initiative

Thanks to GRID Alternatives and Wells Fargo, more than 1000 US military veterans and active servicemen will be getting solar industry job training and job placement.




through

Oregon's Lost Lake is disappearing through a strange hole

Bye bye, lake. Where it's going, nobody knows for sure.




through

Inky the octopus escapes from aquarium through a drainpipe to the sea

In a tale of intrigue and derring-do, the crafty cephalopod slipped out of his enclosure and found his way to freedom.




through

Waterproof solar cell could go through the wash and still work

The solar cell can be stretched, bent and compressed without substantially affecting performance.




through

Hong Kong's housing crisis seen through 40 sq.ft. "cubicle" apartments (Photos)

The growing disparity between wealthy and poor is reflected in this shocking photo report on the tiny island city's critical lack of affordable housing.




through

Smart desk moves you between sitting and standing throughout the day

The high tech answer to keeping you moving while working.




through

Biking Through Amish Country for Climate Ride

There is a certain irony in the fact that some of the best biking in the U.S. is in an area where people have rejected the modern world -- including bikes (for those of the Old Order).




through

A plywood core runs through ISA's latest house in Philadelphia

They are as gutsy and gritty as ever.




through

Would you do an 18-day Mystery Trip through the Middle East?

Intrepid Travel's latest offering is an uncharted journey that starts in Tehran and ends in Istanbul.




through

The Eames' Appreciation of the World Through Common Objects

An exhibit about the pioneering mid-century designers illustrates the innovative ideas that inspired their work.




through

Interactive exhibit tells a sustainability story through the lens of contemporary art

Art Works For Change is using a unique online exhibit to inspire change through storytelling, including 'featured tours' of the galleries by leading eco-organizations.




through

See-through solar cells could close gap to meet electricity demand

This could turn 5-7 billion square meters of glass in the USA alone into solar power plants, plus power your cell phone and other gadgets




through

Montreal's impressive food recovery program will expand throughout Quebec

A partnership between Moisson Montreal and the largest grocery chains in the province will continue to salvage hundreds of tons of food that would otherwise be discarded.




through

Clever technical breakthrough could make LEDs as inexpensive as incandescents

It was only a few years ago that I was testing LED bulbs that cost $100 each...




through

Cory Doctorow has a vision of "resilience and joyful thriving through and after a just climate transition"

Unless, of course, TINA gets in the way.




through

Improved Cooking Technique Slashes Coal Use. Is Funded Through Carbon Offsets (Video)

Carbon offsets have traditionally gone to technological improvements or reforestation. One group is teaching a different way of cooking. And it's using offset funds for the training.




through

Coworking-inspired school gets kids to learn through interactive play

Created by BIG and WeWork, this open-plan school in NYC offers another model for education.




through

Beautiful new see-through frog puts whole heart on display

The new-to-science Amazonian glassfrog has skin so transparent that its tiny heart can be seen beating in its chest.




through

Mesmerizing short film follows photographer through the Arctic, wolves and polar bears ensue (video)

Take a breathtaking 9-minute journey with wildlife photographer Vincent Munier through the beautifully bleak frozen North, you won’t be sorry.




through

Maersk to send first container ship through the Northeast Passage

They call it a "one-off" but it is the shape of things to come.




through

Flexible cooling strip breakthrough for heat removal

This could keep people from overheating in their wearable electronic outfits or simply in the heat of the ever-warming days




through

How to get your houseplants through the winter

From watering needs to ideal temperatures, here's what to know to help your indoor plants survive the cooler months.




through

Photo: Beautiful batwing slug flies through the sea

Our photo of the day comes from the watery wilds of Wilsons Promontory in southern Australia.




through

Renovated terrace house has see-through stair of the week

This home has been redesigned to feel much more spacious, and connected to its surroundings.