break

Breaking Down the Huawei v. Pentagon Dispute

If nothing else, the long-running Huawei situation shows the importance of considering the supply chain when it comes to cybersecurity. Huawei is the Chinese telecommunications equipment maker that has been effectively banned by the federal government. Bruce Schneier joins host Tom Temin on Federal Drive.




break

Breaking the Ice: How France and the UK Could Reshape a Credible European Defense and Renew the Transatlantic Partnership

History is replete with irony, but rarely more poignantly than in the summer of 2016 when, on 23 June, the UK voted to leave the European Union and the next day, 24 June, the EU published its Global Strategy document asserting its ambition of “strategic autonomy.” Whither Franco-British defense cooperation in such chaotic circumstances? This paper attempts to provide the outline of an answer to that question.




break

Africa in the news: Nagy visits Africa, locust outbreak threatens East Africa, and Burundi update

Security and youth top agenda during US Assistant Secretary of State Nagy’s visit to Africa: On January 15, U.S. Assistant Secretary of State for African Affairs Tibor Nagy headed to Africa for a six-nation tour that included stops in the Central African Republic, Ethiopia, Kenya, South Sudan, Sudan, and Somalia. Security was at the top of the agenda…

       




break

Vettel takes record-breaking pole in Brazil

Sebastian Vettel took his 15th pole position of the season at the Brazilian Grand Prix, breaking Nigel Mansell's 1992 record for the most in a single season and lining up on the front row alongside Red Bull team-mate Mark Webber.




break

Protecting our most economically vulnerable neighbors during the COVID-19 outbreak

While we are all adjusting to new precautions as we start to understand how serious the COVID-19 coronavirus is, we also need to be concerned about how to minimize the toll that such precautions will have on our most economically vulnerable citizens. A country with the levels of racial and income inequality that we have…

       




break

Webinar: The effects of the coronavirus outbreak on marginalized communities

As the coronavirus outbreak rapidly spreads, existing social and economic inequalities in society have been exposed and exacerbated. State and local governments across the country, on the advice of public health officials, have shuttered businesses of all types and implemented other social distancing recommendations. Such measures assume a certain basic level of affluence, which many…

       




break

How is the coronavirus outbreak affecting China’s relations with India?

China’s handling of the coronavirus pandemic has reinforced the skeptical perception of the country that prevails in many quarters in India. The Indian state’s rhetoric has been quite measured, reflecting its need to procure medical supplies from China and its desire to keep the relationship stable. Nonetheless, Beijing’s approach has fueled Delhi’s existing strategic and economic concerns. These…

       




break

20 years after Clinton’s pathbreaking trip to India, Trump contemplates one of his own

President Trump is planning a trip to India — probably next month, depending on his impeachment trial in the Senate. That will be almost exactly 20 years after President Clinton’s pathbreaking trip to India, Bangladesh, and Pakistan in March 2000. There are some interesting lessons to be learned from looking back. Presidential travel to…

       




break

Breaking bad in the Middle East and North Africa: Drugs, militants, and human rights

The Middle East and North Africa are grappling with an intensifying drug problem—increased use, the spread of drug-related communicable diseases, and widening intersections between drug production and violent conflict. The repressive policies long applied in the region have not prevented these worsening trends.

      
 
 




break

Democracy in Hong Kong: Might 'none-of-these-candidates' break the deadlock?


Midway through Hong Kong’s second public consultation on the method of electing the next chief executive (CE), both the pro-democracy “pan-democrat” legislators and the Hong Kong and Chinese central governments are still holding their cards close. Following the current public consultation, members of Hong Kong’s Legislative Council (LegCo) will cast an historic vote on political reform. Hong Kong’s mini-constitution, the Basic Law, states that “the ultimate aim is the selection of the CE by universal suffrage upon nomination by a broadly representative nominating committee in accordance with democratic procedures” (Basic Law Art. 45). Pan-democrat LegCo members currently plan to vote against the eventual resolution on political reform, given their dissatisfaction with the reform process to date. Observers predict that passage of a resolution will happen only if the Hong Kong and Central governments can swing a few pan-democrats over to their side in the final hour.

The problem is a prickly one: Is it possible to design an electoral system that is sufficiently open and democratic in the eyes of the Hong Kong people and, at the same time, that guarantees to the Central Government that the elected leader of this special administrative region accepts the supremacy of the Chinese Communist Party? Even as politicians on each side reiterate the near “impossibility” of changing their positions (see e.g., RTHK Backchat discussion with Justice Secretary Rimsky Yuen at 4:25), thought-leaders from Hong Kong’s universities are inventing creative proposals with the potential to break the deadlock.

The Ground Rules

A 2004 decision of the Standing Committee of the National People’s Congress (NPCSC), China’s national legislature, interpreted the Basic Law to require a “Five-Step Process” in order to amend the selection method for the CE. Hong Kong is now between Steps 2 and 3.

  • Step 1: The current CE must submit a report to the NPCSC on the need to amend the electoral system. That submission took place on July 15, 2014 after a five-month initial public consultation process. The CE’s report faced heavy criticism in Hong Kong for not accurately reflecting public opinion.
  • Step 2: The NPCSC must issue a decision affirming the need for the amendment. The NPCSC announced that decision on August 31, 2014. It endorsed a system by which citizens may directly vote for the CE but imposed restrictive conditions on the nomination procedure of eligible candidates. The decision triggered 79 days of protest and civil disobedience – what activists and the media have referred to as the “Umbrella Movement.”
  • Step 3: The Hong Kong government must introduce the political reform bill in LegCo, and two-thirds of legislators must endorse it. The vote in LegCo is scheduled to take place during the first half of 2015, although a precise date has not been set. The purpose of the second-round public consultation is to forge consensus behind political reform within the parameters set out in the August 31 NPCSC decision.
  • Steps 4 and 5: In the event that LegCo endorses the bill, the CE must provide his consent and report the amendment to the NPCSC for its final approval.

If the bill does not receive two-thirds endorsement of LegCo (or if it does, but the NPCSC does not approve) then political reform would fail. Hong Kong would be left with the status quo, and Hong Kong people would lose the opportunity to vote for their chief executive for at least the next seven years.

Limited Room for Negotiation

The terms set out by the August 31 NPCSC decision limit the range of possible political reform options. For that reason, one of the core demands of the Umbrella Movement was to scrap the decision and re-start the Five-Step Process; that didn’t happen, however. In January 2015, the Hong Kong government issued a public consultation document framing the discussion in the lead up to the vote in LegCo. The consultation document hews closely to the NPCSC decision:

  • The Nominating Committee (NC) will resemble the previous committee that elected the CE with the same number of members (1,200) belonging to the same limited number of subsectors (38). The Wall Street Journal recently described that committee as “a hodgepodge of special interests.” During the consultation, citizens may discuss adding new subsectors to make the committee more inclusive and representative (such as adding new subsectors to represent the interests of women or young voters), but restructuring will necessarily mean disrupting and eliminating the positions of existing subsectors or committee members. Therefore, the consultation document suggests these changes are unlikely to be achieved (Consultation Document, Chapter 3, Sec. 3.08 p. 10).
  • The NC will nominate two to three candidates, and each candidate will require endorsement from at least half of the NC membership. (Given the difficulty of restructuring the subsectors or their electoral bases, these terms would effectively exclude any pan-democrats from nomination.) In order to make this more palatable, the consultation document proposes that citizens discuss a two-stage nomination process. In the first stage, a quorum of 100-150 committee members would “recommend” individuals for nomination. The committee would then elect the nominees from this recommended group (Consultation Document, Chapter 4, Sec. 4.09 p. 14). In theory, the meetings when recommendation and nomination votes take place could be staggered in order to allow campaigning and public debate. The idea is that NC members would take public opinion into consideration before casting their second vote.
  • On the voting arrangements, citizens may discuss a “first-past-the-post” arrangement with either a single-round, two-round, or instant runoff vote systems (Consultation Document, Chapter 5, Sec. 5.06 p. 17-19).

Both sides in this negotiation have fired shots across the bow. At the launch of the second public consultation on January 7, Chief Secretary Carrie Lam remarked, “there is no room for any concessions or promises to be made in order to win over support from the pan-democratic members.” For their part, the pan-democrats vowed to boycott the public consultation and veto a resolution that conforms to these terms. They argue that the proposed method of electing the chief executive does not improve upon the status quo.

Most pan-democrat legislators are directly elected from geographical constituencies, and public opinion could provide legitimate grounds for shifting their position. According to polling by the Hong Kong University Public Opinion Programme last month, a plurality of respondents view the Hong Kong government’s proposal as neither a step forward nor a step backward for democracy. If the government were to commit to making the electoral system more democratic in the next CE election in 2022, a clear majority of respondents would then support the government’s plan.

Inventing Options and Finding Common Ground

The two-stage nomination mechanism in the government’s proposal is an acknowledgement that the NC ought to be responsive to public opinion. But without additional tinkering, this procedure does not materially change the incentives of NC members. What if the public had the power to reject the slate of candidates nominated by the committee?

Since the first public consultation, a few academics, including Simon Young at Hong Kong University (HKU), have considered at least two ways this could happen. An “active” approach would allow Hong Kong voters to cast blank votes and require a minimum percentage of affirmative votes for the winning candidate. A “passive” approach would require a minimum voter turnout rate for a valid election. NC members might then have to take public opinion into account.

Early last month, Albert Chen, also a professor at HKU and a legal advisor of the NPCSC, began to advocate publicly for a proposal that employs a ballot with a none-of-these-candidates option (see RTHK Jan. 13 edition of The Pulse). Under his proposal, if a majority of people vote for “none-of-these-candidates,” the slate of candidates put forward by the NC will be voided. When the public votes down the candidates, the NC could revert back to an election committee and choose a provisional CE. Alternatively, the Chief Secretary could assume CE duties during a six-month interim period prior to a new election (drawing upon Basic Law Art. 53). Chen argues that his proposal would give the Hong Kong people—not pan-democrat politicians—decision-making power to accept the new NC and its slate of candidates or to revert back to the status quo.

More recently, Johannes Chan, HKU professor and human rights advocate, floated a competing proposal that would provide voters with the option for negative voting. A 20 percent “no” vote for an otherwise leading candidate would trigger a re-vote. Between the first and second elections, the candidates would have additional time to campaign. If after the second election, still 20 percent of voters oppose the leading candidate, the candidate would be disqualified, and the NC would nominate new candidates. Given Hong Kong’s governance problems and increasing public polarization, the 20 percent veto ensures that no CE will be saddled with a substantial block of Hong Kong society affirmatively opposed to him or her from day one.

Albert Chen’s proposal received a tepid if supportive response in pro-Beijing quarters. Jasper Tsang, the Speaker of LegCo and member of the largest pro-establishment political party, and Rita Fan, a member of the NPCSC, affirmed their view that the none-of-these-candidates mechanism does not violate the Basic Law. While the government’s consultation document does not expressly mention the none-of-these-candidates concept, Hong Kong’s Justice Secretary indicated that the proposal should be considered. Starry Lee, another leader of the biggest pro-establishment party in LegCo, countered that technical difficulties and limited time for discussion would pose obstacles to the none-of-these-candidates ballot proposal.

Pan-democrats so far have tended to rebuff government overtures to engage on the topic. A few legislators, such as the Civic Party’s Ronny Tong, have been willing to engage (with Albert Chen on the Jan. 13 edition of The Pulse) but have reservations about what happens after a voided election, and feel that the threshold for public veto is too high. Law Chi-kwong, a founding member of Hong Kong’s Democratic Party and also a member of the HKU faculty, suggested that the winning candidate ought to receive an absolute majority of votes with blank votes counted. (E.g., when one candidate receives 45 percent, another receives 35 percent, and none-of-these-candidates receives 20 percent, that would lead to a void election.) However, other scholars associated with the Democratic Party have distanced themselves from the blank vote debate and Law’s statements.
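
To make the competing thresholds easier to compare, here is a minimal sketch in Python of how the three mechanisms described above might be evaluated. It is illustrative only: the function names, the hypothetical vote tallies, and the exact reading of each rule are assumptions for this sketch, not specifications taken from the proposals themselves.

    # Illustrative only: hypothetical tallies and one possible reading of each proposal.

    def chen_blank_vote_veto(candidate_votes, blank_votes):
        # Albert Chen's reading: a majority of "none-of-these-candidates" votes
        # voids the whole slate nominated by the committee.
        total = sum(candidate_votes.values()) + blank_votes
        if blank_votes > total / 2:
            return "slate voided - provisional CE, then a new nomination process"
        return max(candidate_votes, key=candidate_votes.get)

    def chan_negative_vote_rule(candidate_votes, no_votes, second_round=False):
        # Johannes Chan's reading: a 20 percent "no" vote against the otherwise
        # leading candidate forces a re-vote; a second failure disqualifies the leader.
        total = sum(candidate_votes.values()) + no_votes
        leader = max(candidate_votes, key=candidate_votes.get)
        if no_votes >= 0.20 * total:
            if second_round:
                return leader + " disqualified - NC nominates new candidates"
            return "re-vote required"
        return leader

    def law_absolute_majority(candidate_votes, blank_votes):
        # Law Chi-kwong's suggestion: the winner needs an absolute majority
        # with blank votes counted in the denominator.
        total = sum(candidate_votes.values()) + blank_votes
        leader = max(candidate_votes, key=candidate_votes.get)
        return leader if candidate_votes[leader] > total / 2 else "void election"

    # The worked example from the text: 45 / 35 / 20 percent blank gives a void election.
    print(law_absolute_majority({"Candidate A": 45, "Candidate B": 35}, blank_votes=20))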

The Merits of Blank Voting

The debate over blank and negative voting in Hong Kong unfolds in a global context where none-of-these-candidates has become an increasingly common political choice. Several democracies have institutionalized the practice. Proponents cite instrumental rationales, such as improved accountability and transparency. However, these benefits are not necessarily guaranteed. More broadly, people recognize the inherent value of the “no” vote as a form of political expression.

In the U.S. state of Nevada, for example, a none-of-these-candidates option has appeared on the ballot for all statewide and national elections since 1975. During the 2012 presidential cycle, the Secretary of State of Nevada argued that removing a none-of-these-candidates option would harm Nevada voters by taking away a “legitimate and meaningful ballot choice.” There is precedent for none-of-these-candidates winning a plurality of votes in a congressional primary; in that case, Republican Walden Earnhart finished behind the none-of-these-candidates option but still “won” the primary and got the nomination. More typically, the ballot option plays a “spoiler role.” In the 1998 Senate race, for example, 8,125 votes for none-of-these-candidates dwarfed the 395-vote margin between Harry Reid and John Ensign. This allowed Reid, the incumbent, to be re-elected.

It is hard to find examples where none-of-these-candidates has won a majority of the popular vote. Hong Kong’s pan-democrats may be right to question whether this possibility would meaningfully affect the calculus of the NC. Colombia is one of the few jurisdictions where blank votes can have institutional consequences. The right of citizens to cast a blank vote was established by the Colombian Constitution in 1991, and later codified in political reform statutes in 2003 and 2009. Similar to Albert Chen’s proposal in Hong Kong, if the number of blank votes equals a majority of the total number of votes cast, the election must be repeated. The original candidates cannot participate in the second election.

The Colombian experience suggests that the blank vote is more consequential in races with fewer candidates. Colombian voters have never nullified a slate of candidates at the national-level, where the field is crowded. In the city of Bello, however, the blank vote won the mayoral election in 2011. In that case, the electoral authority disqualified the one opposition candidate. This led to a one-man race and united all opposition forces around the blank vote in order to reject the establishment Conservative Party candidate. In the second round election, the replacement Conservative Party candidate (Carlos Alirio Muñoz López) won 59 percent of the vote. In the end, his party benefited with a resounding popular mandate. By this logic, the blank vote could matter in the two- to three-candidate race contemplated for Hong Kong.

Empirical evidence also suggests that local conditions in Hong Kong could support a relatively high turnout for none-of-these-candidates. Based on data from Spain and Italy, Chiara Superti at Harvard finds that blank voting is a sophisticated political choice, more likely to take place in municipalities with highly educated and politically engaged electorates. Hong Kong would qualify.

Beyond candidate selection, voting is a highly expressive act. A citizen’s vote is an expression of identity as well as a channel for protest. Echoing this view, the Supreme Court of India recently held that the country’s constitutional guarantees of freedom of speech and expression confer on Indian citizens a right to reject all candidates and to exercise that choice by voting for none-of-these-candidates in secrecy. For a people who define themselves by “core values,” including freedom of expression, this reasoning resonates with Hongkongers. More fundamentally, the ballot serves a powerful safety-valve function. At the time universal suffrage was introduced in England and France, the vote was presented as a way to channel political turmoil into more moderate political expression—and this, too, resonates in Hong Kong today.

Views expressed in the article are the author's personal views.

Authors

  • David Caragliano




break

Why the internet didn’t break

Working, studying, and playing at home during the COVID-19 pandemic has meant that residential internet usage has soared. According to one set of industry analytics, between January 29 (shortly after COVID-19 appeared in the U.S.) and March 26 there was a 105% spike in people active online at home between 9:00 a.m. and 6:00 p.m.…

       




break

A Fair Compromise to Break the Climate Impasse


Key Messages and Policy Pointers

• Given the stalemate in U.N. climate negotiations, the best arena to strike a workable deal is among the members of the Major Economies Forum on Energy and Climate (MEF).

• The 13 MEF members—including the EU-27 (but not double-counting the four EU countries that are also individual members of the MEF)—account for 81.3 percent of all global emissions.

• This proposal devises a fair compromise to break the impasse: a science-based approach for fairly sharing the carbon budget in order to have a 75 percent chance of avoiding dangerous climate change.

• To increase the likelihood of a future climate agreement, carbon accounting must shift from production-based inventories to consumption-based ones.

• The shares of a carbon budget to stay below 2 °C through 2050 are calculated by cumulative emissions since 1990, i.e. according to a short-horizon polluter pays principle, and national capability (income), and allocated to MEF members through emission rights. This proposed fair compromise addresses key concerns of major emitters.

• According to this accounting, no countries have negative carbon budgets, there is substantial time for greening major developing economies, and some developed countries need to institute very rapid reductions in emissions.

• To provide a ‘green ladder’ to developing countries and to ensure a fair global deal, it will be crucial to agree how to extend sufficient and predictable financial support and the rapid transfer of technology.

The most urgent and complicated ethical issue in addressing climate change is how human society will share the work of reducing greenhouse gas (GHG) emissions. Looking ahead to 2015, when a new international treaty on climate change should be agreed upon, we fear we are headed towards a train wreck.

Key developed countries have made it clear they will not accept any regime excluding emerging economies such as China and Brazil, and the U.S. and other ‘umbrella’ countries are calling for only voluntary, bottom-up commitments. Yet the major developing countries have made equity the sine qua non for any kind of agreement: they will not take on mandatory emission reduction targets with perceived implications for their economic growth and social development, unless the wealthier countries commit to deep emissions cuts and act first.

These entrenched positions between the different blocs have led to the current impasse, but as the Nobel laureate economist and philosopher Amartya Sen pointed out, the perfect agreement that never happens is more unjust than an imperfect one that is obtainable.

What is a fair and feasible way to break the impasse, given that all efforts are faltering? The most difficult task is determining a country’s fair share of the required emissions reductions in a way that is politically feasible. After 20 years of negotiations and gridlock, it is clear that many conflicting principles of equity are brought to the table, so a solution will have to be based on some kind of ‘negotiated justice,’ or a ‘fair compromise,’ which will not be one preferred by just one group of countries.

A few basic requirements must be met. A feasible, fair and effective climate agreement must involve the largest emitters from both the developed and developing countries. Such an agreement must find a way to engage the latter without penalizing them or the former countries too much. In order to secure progress, above all it must be acceptable to the two world superpowers and top carbon emitters, China and the U.S.; with this leadership, in fact, other emitters will likely follow. This agreement could be forged in a ‘plurilateral’ setting where a limited number of countries come together first, and then be brought into the formal U.N. negotiations as the basis for a future deal, perhaps by 2015.

How can future negotiations on emissions reductions overcome such political inertia? We suggest that taking three manageable steps to a fair compromise will unlock progress.

First, negotiate a core agreement among the 13 members of the MEF (including the EU-27), which together account for 81.3 percent of all global emissions. This makes the negotiations feasible: deals can be struck that would be impossible in the vast U.N. forum.

Second, use consumption-based emissions accounting, which is much fairer than the current production/territorial-based accounting that all past agreements and negotiations have been based upon. These are relatively new numbers developed by the Norwegian research center CICERO, and have been vetted by the top scientific journals and increasingly utilized by policymakers.

Third, forge a fair compromise to allocate emissions rights. We propose a compromise based on a short-horizon ‘polluter pays principle’ and an indicator of national capability (income).
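
As a rough, purely illustrative sketch of how such an allocation could be computed, the Python snippet below blends each member’s share of cumulative consumption-based emissions since 1990 (the short-horizon polluter pays principle) with its share of income (capability). The equal weighting, the stylized input numbers, and the linear functional form are assumptions made for this example; they are not the paper’s actual parameters or data.

    # Illustrative sketch: hypothetical numbers and an assumed 50/50 weighting.

    def reduction_burden_shares(members, w_responsibility=0.5, w_capability=0.5):
        # Each member's share of the required emissions reductions is a weighted
        # blend of its share of cumulative emissions since 1990 (responsibility)
        # and its share of total income (capability).
        total_emissions = sum(m["cumulative_emissions"] for m in members.values())
        total_income = sum(m["income"] for m in members.values())
        shares = {}
        for name, m in members.items():
            responsibility = m["cumulative_emissions"] / total_emissions
            capability = m["income"] / total_income
            shares[name] = w_responsibility * responsibility + w_capability * capability
        return shares

    # Stylized, made-up inputs for three hypothetical MEF members.
    members = {
        "Member A": {"cumulative_emissions": 300, "income": 18},  # wealthy, high historical emitter
        "Member B": {"cumulative_emissions": 150, "income": 12},
        "Member C": {"cumulative_emissions": 50, "income": 3},    # developing economy
    }
    for name, share in reduction_burden_shares(members).items():
        print(f"{name}: {share:.1%} of the reduction burden")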

This third step in particular is a genuine compromise for both developed and developing countries, but it is required to break the current gridlock. Each MEF member gives and takes something from this simple, workable framework and all gain a liveable planet in the future.

Throughout the paper we first explain why counting carbon emissions by consumption is far better and the implications of doing so, and we then introduce the MEF and why it is a promising arena for forging a bold compromise like the one so badly needed before 2015. We then calculate what the numbers actually mean for that group of countries and develop a proposal for a fair compromise that embodies a feasible but fair operationalization of the central equity principles of the U.N. climate treaty, i.e. action by countries according to their responsibility and capability. We conclude with a discussion of how a start in the MEF could lead to a new framework being brought into those broader negotiations.

Download and read the full paper »





break

George W. Bush Was Tough on Russia? Give Me a Break.


As the Obama administration copes with Russia’s annexation of Crimea and continuing pressure on Ukraine, its actions invariably invite comparison to the Bush administration’s response to the 2008 Georgian-Russian war. But as the Obama White House readies potentially more potent economic sanctions against Russia, former Bush administration officials are bandying a revisionist history of the Georgia conflict that suggests a far more robust American response than there actually was.

Neither White House had good options for influencing Russian President Vladimir Putin. And this time, the fast-moving developments on the ground in Ukraine confront the United States with tough choices. Because the West will not go to war over Crimea, U.S. and European officials must rely on political, diplomatic and financial measures to punish Moscow, while seeking to launch negotiations involving Russia in order to de-escalate and ultimately stabilize the Ukraine situation. They are not having an easy time of it.

Neither did the Bush administration during the 2008 Georgia-Russia war. In a brief, five-day conflict, the Russian army routed its outnumbered and outgunned Georgian opponent and advanced to within a short drive of the Georgian capital, Tbilisi. Bush officials ruled out military options and found that, given the deterioration in U.S.-Russian relations over the previous five years, they had few good levers to influence the Kremlin. The sanctions Washington applied at the time had little resonance in Moscow.

In recent days, however, former Bush administration officials have described a forceful and effective U.S. response in Georgia. On “Fox News Sunday” on March 16, former senior White House adviser Karl Rove told Chris Wallace, “What the United States did was it sent warships to, to the Black Sea, it took the combat troops that Georgia had in Afghanistan, and airlifted them back, sending a very strong message to Putin that ‘you’re going to be facing combat-trained, combat-experienced Georgian forces.’ And not only that, but the United States government is willing to give logistical support to get them there, and this stopped them.”

Rove was echoing what former Secretary of State Condoleezza Rice wrote in a March 7 op-ed in The Washington Post: “After Russia invaded Georgia in 2008, the United States sent ships into the Black Sea, airlifted Georgian military forces from Iraq back to their home bases and sent humanitarian aid. Russia was denied its ultimate goal of overthrowing the democratically elected government.” Really? These statements do not match well with the history of the conflict.

War broke out the night of Aug. 7, when Georgian President Mikhail Saakashvili ordered his troops into the breakaway region of South Ossetia, after Russian forces shelled Georgian villages just outside South Ossetia. The Russians — by appearances, spoiling for a fight — responded swiftly with massive force. They turned the Georgian army back and overran much of Georgia.

As has been widely reported, when the conflict began, one of Georgia's five army brigades was serving as part of the coalition force in Iraq (not Afghanistan, as Rove claimed). On Aug. 10, U.S. C-17s began returning the brigade to Tbilisi, and it promptly went into combat.

The brigade was well-trained and experienced — but in counterinsurgency operations for Iraq, not combined arms operations. Facing a larger and far better-armed opponent, the brigade added little to the failing Georgian effort to halt the Russian advance. On Aug. 12, Moscow announced a cease-fire. French President Nicolas Sarkozy traveled to the Russian and Georgian capitals to formalize an end to the hostilities.

Did the U.S. airlift of the Georgian troops to Tbilisi change the tide of battle or Moscow’s political calculations? No. The Russian army handily drove them back.

What about the deployment of U.S. Navy ships to the Black Sea? The guided missile destroyer USS McFaul did enter the Black Sea to deliver humanitarian supplies to Georgia, passing through the Bosporus on Aug. 22 — 10 days after the cease-fire.

No evidence suggests these actions had much, if any, impact on Putin’s decision making. The Russians halted their offensive short of Tbilisi, figuring that occupying the capital was unnecessary. They thought — as did many in Georgia and the West — that the political shock of the rout would suffice to bring down Saakashvili’s government (though, in the end, it did not).

U.S. C-17s did fly humanitarian supplies to Tbilisi, but President Bush ruled out military action. His administration imposed modest penalties on Russia, ratcheting down bilateral relations, freezing a U.S.-Russia civil nuclear cooperation agreement and ending support for Moscow’s bid to join the World Trade Organization. U.S. officials found that they had little leverage to affect Moscow’s behavior.

The Obama administration has applied similar measures as it seeks to sway Putin again, but it has added a new penalty: visa and financial sanctions targeted at individual Russians, including some close to Putin. On March 20, the president also announced a new executive order to enable U.S. sanctions against key sectors of the Russian economy, including finance, energy and defense — the kinds of tough penalties that the United States has not previously applied against Moscow.

Despite the bluster of former Bush administration officials today, Washington in fact has a stronger hand in the current crisis in Ukraine in one other regard. In 2008, many European states held Saakashvili partially responsible for triggering the war with the Georgian advance into South Ossetia. Ukraine, by contrast, has acted with great restraint. This time, nearly all of Europe agrees that Russia’s actions are out of bounds. Sure enough, European states also appear more ready to sanction Russia than in 2008. Along with the various sanctions the U.S. alone has announced, European Union officials last week also announced visa and financial sanctions on individual Russians.

These moves might not end up shaking Putin from his course, but applying the new executive order could inflict real pain on the Russian economy — something Washington did not accomplish in 2008. Those who faced the challenge of punishing Russia over Georgia should understand the complexities of dealing with Putin and, at a minimum, cut the current administration a little slack.

Read the original article at POLITICO Magazine»





break

Global Governance Breakthrough: The G20 Summit and the Future Agenda

Executive Summary

At the invitation of President George W. Bush, the G20 leaders met on November 15, 2008, in Washington, DC, in response to the worldwide financial and economic crisis. With this summit meeting, the reality of global governance shifted surprisingly quickly. Previously, major global economic, social and environmental issues were debated in the small, increasingly unrepresentative and oftentimes ineffectual circle of G8 leaders. Now, there is a larger, much more legitimate summit group which speaks for over two-thirds of the world’s population and controls 90% of the world’s economy.

The successful first G20 Summit provides a platform on which President-elect Obama can build in forging an inclusive and cooperative approach for resolving the current financial and economic crisis. Rather than get embroiled in a debate about which country is in and which country is out of the summit, the new U.S. administration should take a lead in accepting the new summit framework for now and focus on the substantive issues. Aside from tackling the current crisis, future G20 summits should also drive the reform of the international financial institutions and address other major global concerns—climate change, poverty and health, and energy among others. With its diverse and representative membership of key countries and with a well-managed process of summit preparation and follow-up the new G20 governance structure would allow for a more inclusive deliberation and more effective response to today’s complex global challenges and opportunities.

Policy Brief #168

A Successful G20 Summit—A Giant Step Forward

When it was first announced, there was speculation that the G20 Summit would be at best a distraction and at worst a costly failure, with a lame duck U.S. president hobbled by a crisis-wracked economy and a president-elect impotently waiting at the sidelines, with European leaders bickering over seemingly arcane matters, and with the leaders of the emerging economies sitting on the fence, unwilling or unprepared to take responsibility for fixing problems not of their making.

As it turned out, the first G20 Summit was by most standards a success. It served as a platform for heads of state to address the current financial turmoil and the threats of the emerging economic crisis facing not only the U.S. and Europeans, but increasingly also the rest of the world. The communiqué unmistakably attributes blame for the crisis where it belongs—to the advanced countries. It lays out a set of principles and priorities for crisis management and an action plan for the next four months and beyond, and it promises to address the longer-term agenda of reform of the global financial system. Very importantly, it also commits the leaders to meet again in April 2009 under the G20 umbrella. This assures that the November G20 Summit was not a one-off event, but signified the beginning of a new way of managing the world economy. The U.S. Treasury, which apparently drove the decision to hold the G20 rather than a G8 summit and which led the brief preparation process, deserves credit for this outcome.

A Long Debate over Global Governance Reform Short-circuited

With this successful summit a number of unresolved issues in global governance were pushed aside virtually overnight:

  • The embarrassing efforts of past G8 summits to reach out to the leaders of emerging market economies with ad hoc invitations to join as part-time guests or through the well-meaning expedient of the “Heiligendamm Process”—under which a G8+5 process was to be institutionalized—were overtaken by the fact of the G20 summit.
  • A seemingly endless debate among experts about what is the optimal size and composition for an expanded summit—G13, G14, G16, G20, etc.—was pragmatically resolved by accepting the format of the already existing G20 of finance ministers and central bank presidents, which has functioned well since 1999. With this, the Pandora’s Box of country selection remained mercifully closed. This is a major accomplishment, which is vitally important to preserve at this time.
  • The idea of a “League of Democracies” as an alternative to the G8 and G20 summits, which had been debated in the U.S. election, was pushed aside by the hard reality of a financial crisis that made it clear that all the key economic players had to sit at the table, irrespective of political regime.
  • Finally, the debate about whether the leaders of the industrial world would ever be willing to sit down with their peers from the emerging market economies as equals was short circuited by the picture of the U.S. president at lunch during the G20 Summit, flanked by the presidents of two of the major emerging economies, Brazil and China. This photograph perhaps best defines the new reality of global governance in the 21st Century.

Is the G20 Summit Here to Stay?

The communiqué of the November 15, 2008 Summit locked in the next G20 summit and hence ordained a sequel that appears to have enshrined the G20 as the new format to address the current global financial and economic crisis over the coming months and perhaps years. Much, of course, depends on the views of the new U.S. administration, but the November 2008 Summit has paved the way for President Obama and his team to move swiftly beyond the traditional G8 and to continue the G20 format.

In principle there is nothing wrong with exploring options for further change. However at this juncture, we strongly believe that it is best for the new U.S. administration to focus its attention on making the G20 summit format work, in terms of its ability to address the immediate crisis, and in terms of subsequently dealing with other pressing problems, such as global warming and global poverty. There may be a need to fine-tune size and composition, but more fundamental changes, in our view, can and should wait for later since arguments about composition and size—who is in and who is out—could quickly overwhelm a serious discussion of pressing substantive issues. Instead, the next G20 Summit in the United Kingdom on April 2, 2009 should stay with the standard G20 membership and get on with the important business of solving the world’s huge financial and economic problems.

One change, however, would be desirable: At the Washington Summit in November 2008, two representatives for each country were seated at the table, usually the country’s leader and finance minister. There may have been good reasons for this practice under the current circumstances, since leaders may have felt more comfortable with having the experts at their side during intense discussions of how to respond to the financial and economic crisis. In general, however, a table of 40 chairs undoubtedly is less conducive to an open and informal discussion than a table half that size. From our experience, a table of 20 can support a solid debate as long as the format is one of open give and take, rather than a delivery of scripted speeches. This is not the case for a table with 40 participants. The G8 format of leaders only at the table, with prior preparation by ministers who do not then participate in the leaders-level summits, should definitely be preserved. To do otherwise would dilute the opportunity for informal discussion among leaders, which is the vital core of summit dynamics.

What Will Happen to the G8 Summit and to the G7 and G20 Meetings of Finance Ministers?

As the world’s financial storm gathered speed and intensity in recent months, the inadequacy of the traditional forums of industrial countries—the G8 group of leaders and the G7 group of finance ministers—became obvious. Does this mean that the G8 and G7 are a matter of the past? Most likely not. We would expect these forums to continue to meet for some time to come, playing a role as caucus for industrial countries. In any event, the G20 finance ministers will take on an enhanced role, since it will be the forum at which minister-level experts will lay the ground on key issues of global financial and economic management to ensure that they are effectively addressed at summit level by their leaders. The G20 Summit of November 15 was prepared by a meeting of G20 finance ministers in this fashion.

It may well be that the dynamics of interactions within the G20 will cause coalitions to be formed, shifting over time as issues and interests change. This could at times and on some issues involve a coalition of traditional G7 members. However, with increasing frequency, we would expect that some industrial countries would temporarily team up with emerging market country members, for example on agricultural trade policies, where a coalition of Argentina, Australia, Brazil and Canada might align itself to challenge the agricultural protection policies of Europe, Japan and the United States. Or in the area of energy, a coalition among producer states, such as Indonesia, Mexico, Russia and Saudi Arabia, might debate the merits of a stable energy supply and demand regime with an alliance among energy users, such as China, Europe, Japan, South Africa and the United States. It is this potential for multiple, overlapping and shifting alliances that creates the opportunities for building trust, forcing trade-offs and forging cross-issue compromises, and that makes the G20 summit such an exciting opportunity.

What Should Be the Agenda of Future G20 Summits?

The communiqué of the November 2008 G20 Summit identified three main agenda items for the April 2009 follow-up summit: (1) A list of key issues for the containment of the current global financial and economic crisis; (2) a set of issues for the prevention of future global financial crises, including the reform of the international financial institutions, especially the IMF and World Bank; and (3) a push toward the successful conclusion of the Doha Round of WTO trade negotiations.

The first item is obviously a critical one if the G20 is to demonstrate its ability to help address the current crisis in a meaningful way. The second item is also important and timely. The experience with reform of the global financial institutions in the last few years has demonstrated that serious governance changes in these institutions will have to be driven by a summit-level group that is as inclusive as the G20. We would hope that Prime Minister Gordon Brown, as chair—with his exceptional economic expertise and experience in the international institutions, especially the IMF—will be able to forge a consensus at the April 2 summit in regard to reform of the international financial institutions. The third agenda item is also important, since the Doha Round is at a critical stage and its successful conclusion would send a powerful signal that the world community recognizes the importance of open trade relations in a time of crisis, when the natural tendency may be to revert to a protectionist stance.

However, we believe three additional topics should be added to the agenda for the April 2009 G20 Summit:

  • First, there should be an explicit commitment to make the G20 forum a long-term feature of global governance, even as the group may wish to note that its size and composition is not written in stone, but subject to change as circumstances change.
  • Second, the communiqué of the November summit stated that the G20 countries are “committed to addressing other critical challenges such as energy security and climate change, food security, the rule of law, and the fight against terrorism, poverty and disease”. This needs to be acted upon. These issues cannot be left off the table, even as the global financial and economic crisis rages. If anything, the crisis reinforces some of the key challenges which arise in these other areas and offers opportunities for a timely response. The U.K.-hosted summit should launch a G20 initiative to develop framework ideas for the post-Kyoto climate change agreement at Copenhagen.
  • Third, assuming the April 2009 summit commits itself—as it should—to a continuation of the G20 summit format into the future, it must begin to address the question of how the summit process should be managed. We explore some of the possible options next.

How Should the G20 Summit Process Be Managed?

So far the G7, G8 and G20 forums have been supported by a loose organizational infrastructure. For each group the country holding the rotating year-long presidency of the forum takes over the secretariat function while a team of senior officials (the so-called “sherpas”) from each country meets during the course of the year to prepare the agenda and the communiqué for leaders and ministers. This organization has the advantage of avoiding a costly and rigid bureaucracy. It also fosters a growing level of trust and mutual understanding among the sherpas.

The problem with this approach has been two-fold: First, it led to discontinuities in focus and organization and in the monitoring of implementation. For the G20 of finance ministers, this problem was addressed in part by the introduction of a “troika” system, under which the immediate past and future G20 presidencies would work systematically with the current G20 presidency to shape the agenda and manage the preparation process. Second, particularly for the countries in the G20 with lesser administrative capacity, the responsibility for running the secretariat for a year during their country’s presidency imposed a heavy burden.

For the G20 summit, these problems will be amplified, not least because these summits will require first-rate preparation for very visible and high-level events. In addition, as the agenda of the G20 summit broadens over time, the burden of preparing a consistent multi-year agenda based on strong technical work will be such that it cannot be effectively handled when passed on year to year from one secretariat in one country to another secretariat in another country, especially when multiple ministries have to be engaged in each country. It is for this reason that the time may have come to explore setting up a very small permanent secretariat in support of the G20 summit.

The secretariat should only provide technical and logistical support for the political leadership of the troika of presidencies and for the sherpa process, but should not run the summit. That is the job of the host member governments. They must continue to run the summits, lead the preparations and drive the follow-up. The troika process will help strengthen the capacity of national governments to shoulder these burdens. Summits are the creatures of national government authorities where they have primacy, and this must remain so, even as the new summits become larger, more complex and more important.

Implications for the Obama Administration

The November 2008 G20 Summit opened a welcome and long-overdue opportunity for a dramatic and lasting change in global governance. It will be critical that the leaders of the G20 countries make the most of this opportunity at the next G20 Summit on April 2. The presence of U.S. President Obama will be a powerful signal that the United States is ready to push and where necessary lead the movement for global change. President-elect Obama’s vision of inclusion and openness and his approach to governing, which favors innovative and far-reaching pragmatic responses to key national and global challenges, make him a great candidate for this role.

We would hope that President Obama would make clear early on that:

  • He supports the G20 summit as the appropriate apex institution of global governance for now;
  • He may wish to discuss how to fine-tune the summit’s composition for enhanced credibility and effectiveness but without fundamentally questioning the G20 framework;
  • He supports cooperative solutions to the current financial crisis along with a serious restructuring of the global financial institutions;
  • He will look to the G20 summit as the right forum to address other pressing global issues, such as climate change, energy, poverty and health; and
  • He is ready to explore an innovative approach to effectively manage the G20 summit process.

These steps would help ensure that the great promise of the November 2008 G20 Summit is translated into a deep and essential change in global governance. This change will allow the world to move from a governance system that continues to be dominated by the transatlantic powers of the 20th century to one which reflects the fundamentally different global economic and political realities of the 21st century. It would usher in a framework of deliberation, consultation and decision making that would make it possible to address the great global challenges and opportunities that we face today in a more effective and legitimate manner.





break

White House releases breakthrough strategy on antibiotic resistance


After years of warnings from the public health community, yesterday the White House announced a national strategy to combat the growing problem of antibiotic resistance within the U.S. and abroad. The administration’s commitment represents an important step forward, as antibiotic-resistant infections are responsible for 23,000 deaths annually and cost over $50 billion in excess health spending and lost productivity. The administration’s National Strategy on Combating Antibiotic-Resistant Bacteria includes incentives for developing new drugs, more rigorous stewardship of existing drugs, and better surveillance of antibiotic use and of the pathogens that are resistant to antibiotics. President Obama also issued an Executive Order that establishes an interagency Task Force and a non-governmental Presidential Advisory Council that will focus on broad-based strategies for slowing the emergence and spread of resistant infections.

While antibiotics are crucial for treating bacterial infections, their misuse over time has contributed to an alarming rate of antibiotic resistance, including the development of multidrug-resistant bacteria or “superbugs.” Misuse occurs in all corners of public and private life, from the doctor’s office, where antibiotics are prescribed to treat viral infections, to industrial agriculture, where they are used in abundance to promote growth in livestock. New data from the World Health Organization (WHO) and the U.S. Centers for Disease Control and Prevention (CDC) confirm that rising overuse of antibiotics has already become a major public health threat worldwide.

The administration’s announcement included a report from the President’s Council of Advisors on Science and Technology (PCAST) titled “Combatting Antibiotic Resistance,” which includes recommendations developed by a range of experts to help control antibiotic resistance. The announcement also outlined a $20 million prize to reward the development of a new rapid, point-of-care diagnostic test. Such tests help health care providers choose the right antibiotics for their patients and streamline drug development by making it easier to identify and treat patients in clinical trials.

The Need for Financial Incentives and Better Reimbursement

A highlight of the PCAST report is its recommendations on economic incentives to bring drug manufacturers back into the antibiotics market. Innovative changes to pharmaceutical regulation and research and development (R&D) will be welcomed by many in the health care community, but financial incentives and better reimbursement are necessary to alleviate the market failure for antibacterial drugs. A major challenge, particularly within a fee-for-service or volume-based reimbursement system, is providing economic incentives that promote investment in drug development without encouraging overuse.

A number of public and private stakeholders, including the Engelberg Center for Health Care Reform and Chatham House’s Centre on Global Health Security Working Group on Antimicrobial Resistance, are exploring alternative reimbursement mechanisms that “de-link” revenue from the volume of antibiotics sold. Such a mechanism, combined with further measures to stimulate innovation, could create a stable incentive structure to support R&D. Further, legislative proposals under consideration by Congress to reinvigorate the antibiotic pipeline, including the Antibiotic Development to Advance Patient Treatment (ADAPT) Act of 2013, could complement the White House’s efforts and help turn the tide on antibiotic resistance. Spurring the development of new antibiotics is critical because resistance will continue to develop even if health care providers and health systems can find ways to prevent the misuse of these drugs.





break

Breakthrough therapy designation: A primer


Breakthrough therapy designation (BTD) is the newest of four expedited programs developed by the U.S. Food and Drug Administration (FDA) to accelerate the development and review of novel therapies that target serious conditions. The public response to the program has been largely positive, and dozens of drugs have successfully received the designation. However, the FDA denies many more requests than it grants. In fact, as of March 2015, fewer than one in three of the BTD requests submitted had been granted. By contrast, roughly 75 percent of the requests for fast track designation (another of the Agency’s expedited programs) were granted between 1998 and 2007. This discrepancy suggests ongoing uncertainty over what exactly constitutes a “breakthrough” according to the FDA’s criteria.

On April 24, the Center for Health Policy at Brookings will host an event, Breakthrough Therapy Designation: Exploring the Qualifying Criteria, that will discuss qualifying criteria for the BTD program using real and hypothetical case studies to explore how FDA weighs the evidence submitted. Below is a primer that describes the definition, value, and impact of BTD.

What is BTD?

BTD was established in 2012 under the Food and Drug Administration Safety and Innovation Act, and is intended to expedite the development and review of drugs that show signs of extraordinary benefit at early stages of the clinical development process. However, BTD is not an automatic approval; the drug still has to undergo clinical testing and review by the FDA. Rather, the designation is designed to facilitate and shorten the clinical development process, which can otherwise take many years to complete.

What criteria does FDA use to evaluate potential breakthroughs?

In order to qualify for the designation, a therapy must be intended to treat a serious or life-threatening illness, and there must be preliminary clinical evidence that it represents a substantial improvement over existing therapies on at least one clinically significant outcome (such as death or permanent impairment).

In considering a request for BTD, FDA relies on three primary considerations:

1) the quantity and quality of the clinical evidence being submitted;

2) the available therapies that the drug is being compared to; and

3) the magnitude of treatment effect shown on the outcome being studied.


In practice, however, it can be difficult to define a single threshold that a therapy must meet. The decision depends on the specific context for that drug. In some cases, for example, the targeted disease has few or no treatments available, while in others there may be several effective alternative treatments to which the new therapy can be compared. The request may also be made at different stages of the clinical development process, which means that the amount and type of data available to FDA can vary. In some cases, early evidence of benefit may disappear when the drug is tested in larger populations, which is why FDA reserves the right to rescind the designation if subsequent data show that the therapy no longer meets the criteria.

How many therapies have received the designation?

As of March 2015, FDA had received a total of 293 requests for BTD. Of these, 82 received the designation, and 23 have since been approved for marketing. Ten of these approvals were new indications for already approved drugs, rather than novel therapies that had never before received FDA approval.

What are the benefits of BTD?

For drug manufacturers, the principal benefit is the intensity and frequency of their interactions with FDA. Once the designation is granted, the FDA takes an “all hands on deck” approach to providing the manufacturer with ongoing guidance and feedback throughout the clinical development process. Products that receive BTD are also able to submit portions of their marketing application on a rolling basis (rather than all at once at the end of clinical trials), and the designation can be used in combination with other expedited programs to further reduce the product’s time to market.

For patients, the potential benefits are straightforward: earlier access to therapies that may significantly improve or extend their lives.

How does BTD relate to the other three expedited programs?

The other three expedited review and development programs—fast track designation, priority review, and accelerated approval—are also geared toward facilitating the development and approval of drugs for serious conditions. These other programs have been in place for over 15 years and have played a significant role in accelerating patient access to new therapeutics (Table 1). In 2014 alone, 66 percent of the 41 drugs approved by FDA's Center for Drug Evaluation and Research used at least one of these four pathways, and 46 percent received at least two of the designations in combination.

Table 1: Overview of FDA’s Expedited Review Programs
Adapted from FDA's Guidance for Industry: Expedited Programs for Serious Conditions - Drugs and Biologics





break

Breakthrough therapy designation: Exploring the qualifying criteria


Event Information

April 24, 2015
8:45 AM - 4:45 PM EDT

Ballroom
The Park Hyatt Hotel
24th and M Streets, NW
Washington, DC


Established by the Food and Drug Administration Safety and Innovation Act of 2012, breakthrough therapy designation (BTD) is one of several programs developed by the U.S. Food and Drug Administration (FDA) to speed up the development and review of drugs and biologics that address unmet medical needs. In order to qualify for this designation, the treatment must address a serious or life-threatening illness. In addition, the manufacturer (i.e., sponsor) must provide early clinical evidence that the treatment is a substantial improvement over currently available therapies. The FDA is working to further clarify how it applies the qualifying criteria to breakthrough designation applications.

On April 24, under a cooperative agreement with FDA, the Center for Health Policy convened a public meeting to discuss the qualifying criteria for this special designation. Using examples from oncology, neurology, psychiatry, and hematology, the workshop highlighted considerations for the BTD application process, the evaluation process, and factors for acceptance or rejection. The discussion also focused on key strategies for ensuring that the qualifying criteria are understood across a broad range of stakeholder groups.






break

Event recap: Lessons learned from two years of breakthrough therapy designation


The breakthrough therapy designation (BTD) program was initiated by the U.S. Food and Drug Administration (FDA) in 2012 to expedite the development of treatments for serious or life-threatening illnesses that demonstrate “substantial improvement” over existing therapies. The program has since become a widely supported mechanism for accelerating patient access to new drugs. As of March 2015, FDA had received a total of 293 requests for BTD but had granted just 82 (28 percent), which indicates an ongoing lack of clarity over what exactly meets the criteria for the designation.

On April 24, the Center for Health Policy at Brookings convened a public meeting to explore the designation’s qualifying criteria and how FDA applies those criteria across therapeutic areas. Panelists used real-world and hypothetical case studies to frame the discussion, and highlighted major considerations for the application process, the FDA’s evaluation of the evidence, and the key factors for acceptance or rejection. The discussion also identified strategies to ensure that qualifying criteria are well understood. Here are the five big takeaways:

1.  The BTD program is viewed positively by drug companies, researchers, advocates, and others 

Across the board, participants expressed enthusiasm for the BTD program. Industry representatives noted that their experience had been extremely positive, and that the increased cooperation with and guidance from FDA were very helpful in streamlining their development programs. Receiving the designation can also raise a drug company’s profile, which can facilitate additional investment as well as clinical trial patient recruitment; this is particularly important for smaller companies with limited resources.

Patient and disease advocates were likewise supportive, and expressed hope that the early lessons learned from successful breakthrough therapy approvals (which have been mostly concentrated in the oncology and antiviral fields) could be translated to disease areas that have seen less success. However, while BTD is an important tool for expediting the development of new drugs, it is just one piece of a broader scientific and regulatory policy landscape. Accelerating the pace of discovery and development of truly innovative new drugs will depend on a range of other factors, such as developing and validating new biomarkers that can be used to measure treatment effects at an earlier stage, as well as establishing networks that can streamline the clinical trial process. It will also be important to develop effective new approaches to collecting, analyzing, and communicating information about these treatments once they are on the market, as this information can potentially be used by FDA, providers, and patients to further improve prescription drug policy and medical decision-making.

2.  BTD requests far outnumber those that actually meet the qualifying criteria

Since the program began, less than 30 percent of requests have received the designation. A substantial majority were denied at least in part because of a lack of data, problems with the quality of the data, or some combination of the two. For example, some sponsors requested the designation before they had any clinical data, or submitted the request using clinical data that was incomplete or based on flawed study designs. Many requests also failed to meet the Agency’s bar for “substantial improvement” over existing therapies.

One reason for the high denial rate may be a lack of a clear regulatory or statutory bar that could be used as a definitive guide for sponsors to know what is needed to qualify for the designation. BTD denials are also confidential, which means that sponsors effectively have nothing to lose by submitting a request. Going forward, manufacturers may need to exercise more discretion in deciding to request the designation, as the process can be resource- and time-intensive for both sides.

3.  There is no single threshold for determining what defines a breakthrough therapy

About 53 percent of the 109 total BTD denials were due at least in part to the fact that the drug did not represent a substantial improvement over existing therapies. During the day’s discussion, FDA and sponsors both noted that this is likely because the criteria for BTD are inherently subjective. In practice, this means there is no clear threshold for determining when a new therapy represents a “substantial improvement” over existing therapies. Designation decisions are complex and highly dependent on the context, including the disease or condition being targeted, the availability of other treatments, the patient population, the outcomes being studied, and the overall reliability of the data submitted. Given the multiple factors at play, it can be difficult in some cases to determine when a new product is potentially “transformational” as opposed to “better,” especially for conditions that are poorly understood or have few or no existing treatments. In making its determinations, FDA considers the totality of the evidence submitted, rather than focusing on specific evidentiary requirements.

4.  Early communication with FDA is strongly recommended for BTD applicants

Roughly 72 percent of the BTD denials related at least in part to trial design or analysis problems, which led several participants to suggest that sponsors engage with FDA before submitting their request. Though there are several formal mechanisms for interacting with the agency, informal consultations with the relevant review division could help sponsors get a better and much earlier sense of what kind of data FDA might need. This early communication could both strengthen viable BTD requests and reduce the number of frivolous ones.

5.  FDA may need more resources for implementing the BTD program

Drugs that receive breakthrough designation are subject to much more intensive FDA guidance and review. However, when the program was established in 2012, Congress did not allocate funding to cover its costs. There have been ongoing concerns that the program is exacting a significant toll on FDA’s already limited resources, and potentially affecting the timeline for other drug application reviews. These concerns were reiterated during the day’s discussion, and some suggested that Congress consider attaching a user fee to the BTD program when the Prescription Drug User Fee Act comes up for reauthorization in 2017.





break

20 years after Clinton’s pathbreaking trip to India, Trump contemplates one of his own

President Trump is planning on a trip to India — probably next month, depending on his impeachment trial in the Senate. That will be almost exactly 20 years after President Clinton’s pathbreaking trip to India, Bangladesh, and Pakistan in March 2000. There are some interesting lessons to be learned from looking back. Presidential travel to…

       




break

How is the coronavirus outbreak affecting China’s relations with India?

China’s handling of the coronavirus pandemic has reinforced the skeptical perception of the country that prevails in many quarters in India. The Indian state’s rhetoric has been quite measured, reflecting its need to procure medical supplies from China and its desire to keep the relationship stable. Nonetheless, Beijing’s approach has fueled Delhi’s existing strategic and economic concerns. These…

       




break

Webinar: The effects of the coronavirus outbreak on marginalized communities

As the coronavirus outbreak rapidly spreads, existing social and economic inequalities in society have been exposed and exacerbated. State and local governments across the country, on the advice of public health officials, have shuttered businesses of all types and implemented other social distancing recommendations. Such measures assume a certain basic level of affluence, which many…

       




break

Ask the Expert: Former CMS Head Breaks Down ACO Lessons to Date

A new approach to delivering -- and paying for -- health care made its debut three years ago and has been picking up steam ever since. Accountable care organizations (ACOs) are growing rapidly nationwide, offering the promise of coordinated patient care at a lower cost.

Yet, making the transition away from operating as a single, discrete practice unit according to a fee-for-service payment model can, admittedly, be difficult. Created as part of the Patient Protection and Affordable Care Act, ACOs are drawing close scrutiny from many different stakeholders.

Mark McClellan, M.D., Ph.D., recently discussed with AAFP News some early returns on ACOs, including the fact that many physician-led groups are moving to the new payment model. A former administrator of CMS, McClellan now serves as director of the Health Care Innovation and Value Initiative at the Brookings Institution in Washington.

Q: Are ACOs just a repackaged version of HMOs from the 1990s?

A: No, they are different. First, ACOs directly involve clinicians in accountability for a population of patients rather than simply relying on the health plan. Second, in contrast with the cost-control approach of many managed care plans in the 1990s, there are now more effective tools for clinical management and for handling some form of capitation-based payment.

Q: How does a physician practice make the transition to an ACO?

A: It's a shift from the fee-for-service model whereby the practice starts to take on the overall financial risk for their patients. This means their approach to care has to change to reduce costs, but it also means they have new resources to make those changes financially sustainable.

Access to physicians or nurses in the practice should increase, ideally to 24/7 availability, to help avoid costly complications and avoidable admissions. A patient registry of individuals with chronic diseases or risk factors can help identify where and how to intervene. These are the types of things that you don't get paid for under a fee-for-service payment system, but in an ACO model, you do.

Q: How would you characterize the growth in ACOs to date and into the future?

A: I think accountable care will continue to grow, including payments that are tied more directly to results and that give clinicians more flexibility in how they deliver care. Many ACOs are integrated organizations like Health Care Partners, Monarch HealthCare and the University of Michigan.

But recently, there has been more growth in smaller ACOs led by physician groups, often primary care (physicians). These ACOs may consist of 20 to 30 doctors and are not affiliated with a hospital. They are still physician-owned, but they may be jointly financed by other co-investing organizations, like health plans or practice management programs, that also share in the savings.

Q: Can smaller physician groups be successful within the ACO model?

A: There are some promising ACOs made up of small practices. Some of these practices formed an ACO in a way that builds upon the traditional IPA (independent practice association) model. One of the advantages of the newer, physician-led ACOs is that they have clearer financial benefits to the physicians when they are able to reduce costs.

In contrast to traditional fee-for-service payment, in a physician ACO, when the group takes steps to reduce outpatient visits or hospital visits, they capture the savings. For hospital-affiliated ACOs, some of those savings are offset by reduced payments to the hospital.

There is new, hard work that needs to be done in terms of tracking patients. It's not just about insurance claims. These smaller ACOs are collaborating on population health management tools and information technology tools. You do need technology infrastructure to support specific changes in care that improve outcomes for your patients.

Q: Can ACOs with no hospital affiliation succeed?

A: Yes. Some of these ACOs are achieving impressive early results, and a lot of physician-led groups are more comfortable taking on population risks. Our research indicates that physician-led ACOs do not have to have a huge impact on care to succeed. For example, a physician-led ACO that reduces hospital visits by 1 percent to 2 percent can double the net revenues for its physicians. It's a very promising opportunity. A lot of physician groups are interested, and we're learning more about what it takes to succeed.
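To see why a small reduction in spending can loom so large for a physician group, a back-of-the-envelope calculation helps. The sketch below is a minimal, hypothetical illustration; every figure in it (panel size, average spending, shared-savings rate, physician net income) is an assumption chosen for round numbers, not data from the interview or from the research McClellan cites.

```python
# Hypothetical illustration of shared-savings arithmetic for a physician-led ACO.
# All numbers below are assumptions for illustration only.

attributed_patients = 10_000       # beneficiaries attributed to the ACO (assumed)
avg_annual_spend = 10_000          # total annual spending per beneficiary, in dollars (assumed)
benchmark = attributed_patients * avg_annual_spend       # $100M total spending benchmark

physicians = 20                    # physicians in the group (assumed)
net_income_per_physician = 50_000  # net income attributable to this patient panel (assumed)
baseline_net_income = physicians * net_income_per_physician   # $1M in aggregate

spending_reduction = 0.02          # 2% cut in total spending from fewer hospital visits (assumed)
shared_savings_rate = 0.50         # share of savings returned to the ACO (assumed contract terms)

savings = benchmark * spending_reduction                 # $2M in total savings
shared_savings_payment = savings * shared_savings_rate   # $1M paid to the group

multiple = (baseline_net_income + shared_savings_payment) / baseline_net_income
print(f"Baseline net income:     ${baseline_net_income:,.0f}")
print(f"Shared-savings payment:  ${shared_savings_payment:,.0f}")
print(f"Net income multiple:     {multiple:.1f}x")   # 2.0x under these assumptions
```

The point of the arithmetic is simply that total spending dwarfs physician net income, so even a 1 to 2 percent reduction in spending, once shared back with the group, can rival the physicians' baseline margins.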

Q: What's an average timeline for an ACO to be declared successful?

A: For those that do succeed, it's likely to be a marathon and not a sprint. Some ACOs are already reporting gains in terms of improved quality of care, care coordination and cost reduction through steps like better management of high-risk patients and modifying referral and admission patterns. Other steps may take longer. For diabetes management, it could take about 12 to 24 months for improvements in care to translate into significant cost savings. With congestive heart failure, it can happen sooner.

As clinicians in ACOs get more experienced and comfortable with coordinating care and managing a patient's overall care experience, it's likely that they will want to implement additional payment reforms to move away from fee-for-service, which, in turn, means more resources for innovative approaches to care.

Q: Overall, how is the first wave of ACOs doing in enhancing quality and reducing costs?

A: In general, the ACOs are doing pretty well in terms of quality of care and improving on important quality measures. Financially, about half of the 114 ACOs participating in the Medicare Shared Savings Program reported that they reduced Medicare spending in their first year of operation.

About 29 percent of physician-led ACOs and 20 percent of hospital ACOs demonstrated large enough savings to qualify for the shared-savings payments. Some private-sector ACOs, like the Alternative Quality Contract developed by Massachusetts Blue Cross, show growing effects on costs over time. It's likely to be the case that some ACOs won't succeed and others will.

Q: How do the shared-savings models used by Medicare today compare with ACOs in terms of moving away from fee-for-service?

A: Many private-sector ACO plans and some Medicaid programs are offering bigger shifts away from fee-for-service. As ACOs gain more experience, I think these payment reforms will be more attractive. In addition, some private-sector health plans are including financial and other incentives to attract patients. They might offer discounted premiums or copay discounts for patients who stay engaged with their ACO. In other words, the patients can share in the savings, too. As care continues to get more individualized, patient engagement in the ACO initiatives will be increasingly important.

Publication: AAFP News
      




break

Break up the big banks? Not quite, here’s a better option.


Neel Kashkari, the newly appointed President of the Federal Reserve Bank of Minneapolis, is super-smart, with extensive experience in the financial industry at Goldman Sachs and then running the government’s TARP program. But his call to break up the big banks misses the mark.

Sure, big banks, medium-sized banks and small banks all contributed to the devastating financial crisis, but so did the rating agencies and the state-regulated institutions (mostly small) that originated many of the bad mortgages.  It was vital that regulation be strengthened to avoid a repetition of what happened – and it has been.  There should never again be a situation where policymakers are faced with either bailing out failing institutions or letting them fail and seeing financial panic spread.

That’s why the Dodd-Frank Act gave the authorities a new tool to avoid that dilemma, called “Orderly Liquidation Authority,” which gives them the ability to fail a firm while sustaining the key parts whose failure might cause financial instability.  Kashkari thinks that the authorities will not want to exercise this option in a crisis because they will be fearful of the consequences of imposing heavy losses on the original owners of the largest banks.  It’s a legitimate concern, but he underestimates the progress that has been made in making the orderly liquidation authority workable in practice.  He also underestimates the determination of regulators not to bail out financial institutions from now on.

To make orderly liquidation operational, the Federal Deposit Insurance Corporation (FDIC) devised something called the “single point of entry” approach, or SPOE, which provides a way of dealing with large failing banks.  The bank holding company is separated from the operating subsidiaries and takes with it all of the losses, which are then imposed on the shareholders and unsecured bondholders of the original holding company, not on the creditors of the critical operating subsidiaries and not on taxpayers.  The operating subsidiaries of the failing institution are placed into a new bank entity and are kept open and operating, so that customers can still go into their bank branch or ATM and get their money, the bank can still make loans to support household and business spending, and the investment bank can continue to help businesses and households raise funds in securities markets.  The largest banks also have foreign subsidiaries, and these too would stay open to serve customers in Brazil or Mexico.

This innovative approach to failing banks is not magic, although it is hard for most people to understand.  However, the reason that Kashkari and other knowledgeable officials have not embraced SPOE is that they believe the authorities will be hesitant to use it and will try to find ways around it.  When a new crisis hits, the argument goes, government regulators will always bail out the big banks.

First, let’s get the facts straight about the recent crisis.  The government did step in to protect the customers of banks of all sizes as well as money market funds.  In the process, it also protected most bondholders and others who had lent money to the troubled institutions, including the creditors of Bear Stearns, a broker-dealer, and AIG, an insurance company.  This was done for good reason: a collapse in the banking and financial system more broadly would have been even worse if markets had stopped lending to these institutions.  Shareholders of banks and other systemically important institutions lost a lot of money in the crisis, as they should have.  The CEOs lost their jobs, as they should have (although not their bonuses).  Most bondholders were protected because it was an unfortunate necessity.

As a result of Dodd-Frank rules, the situation is different now from what it was in 2007.  Banks are required to hold much more capital, meaning that there is more shareholder equity in the banks.  In addition, banks must hold long-term unsecured debt, bonds that essentially become a form of equity in the event of a bank failure.  It is being made clear to markets that this form of lending to banks will be subject to losses in the event the bank fails—unlike in 2008.  Under the new rules, both the owners of the shares of big banks and the holders of their unsecured bonds have a lot to lose if the bank fails, providing market discipline and a buffer that makes it very unlikely indeed that taxpayers would be on the hook for losses.

The tricky part is to understand the situation facing the operating subsidiaries of the bank holding company — the parts that are placed into a new bank entity and remain open for business.  The subsidiaries may in fact be the part of the bank that caused it to fail in the first place, perhaps by making bad loans or speculating on bad risks.  Some of these subsidiaries may need to be broken off and allowed to fail along with the holding company—provided that can be done without risking spillover to the economy.  Other parts may be sold separately or wound down in an orderly way.  In fact the systemically important banks are required to submit “living wills” to the FDIC and the Federal Reserve that will enable the critical pieces of a failing bank to be separated from the rest.

It is possible that markets will be reluctant to lend money to the new entity, but the key point is that this new entity will be solvent, because the losses, wherever they originated, have been taken away and the new entity has been recapitalized by the creditors of the holding company who have been “bailed in.”  Even if it proves necessary for the government to lend money to the newly formed bank entity, this can be done with reasonable assurance that the loans will be repaid with interest.  Importantly, it can be done through the orderly liquidation authority and would not require Congress to pass another TARP, the very unpopular fund that was used to inject capital into failing institutions.
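For readers who want the mechanics in numbers, here is a minimal, stylized balance-sheet sketch of the single point of entry logic. All figures are invented for illustration; they are not data about any actual bank, and the sketch compresses many details of the FDIC's actual procedures: shareholders absorb losses first, the holding company's long-term unsecured bondholders are bailed in next, and whatever loss-absorbing capacity is left over becomes the equity of the recapitalized new entity, while the operating subsidiaries' depositors and other creditors are untouched.

```python
# Stylized single-point-of-entry (SPOE) arithmetic. All figures are hypothetical,
# in billions of dollars, chosen only to illustrate the loss-absorption order.

assets = 1000.0
equity = 100.0                     # holding-company shareholders (first in line for losses)
long_term_unsecured_debt = 80.0    # holding-company bonds that can be "bailed in"
operating_liabilities = assets - equity - long_term_unsecured_debt  # deposits etc. at the subs

losses = 130.0                     # losses large enough to make the holding company fail (assumed)

# Losses fall first on shareholders, then on the holding company's unsecured bondholders.
shareholder_recovery = max(equity - losses, 0.0)            # wiped out in this example
losses_to_bondholders = max(losses - equity, 0.0)           # 30 absorbed by bondholders

# The operating subsidiaries move into a new entity; the bondholders' remaining
# claims are converted into its equity, so it opens for business solvent.
new_entity_equity = equity + long_term_unsecured_debt - losses   # 50 in this example

print(f"Shareholder recovery:                {shareholder_recovery:.0f}")
print(f"Losses borne by bondholders:         {losses_to_bondholders:.0f}")
print(f"New entity opens with equity of:     {new_entity_equity:.0f}")
print(f"Operating-sub creditors (untouched): {operating_liabilities:.0f}")
```

Under these assumed numbers the old shareholders are wiped out, the bailed-in bondholders absorb the remaining losses and end up owning the recapitalized new entity, and the creditors of the operating subsidiaries are kept whole, which is the outcome the SPOE approach is designed to produce.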

There are proposals to enhance the SPOE approach by creating a new chapter of the bankruptcy code, so that a judge would control the failure process for a big bank, and this could ensure there is no government bailout.  I support these efforts to use bankruptcy proceedings where possible, although I am doubtful that the courts could handle a severe crisis with multiple failures of global financial institutions.  But regardless of whether failing financial institutions are resolved through judicial proceedings or through the intervention of the FDIC (as specified under Title II of Dodd-Frank), the new regulations guarantee that shareholders and unsecured bondholders bear the losses, so that the parts of the firm that are essential for keeping financial services going in the economy are kept alive.  That should assure the authorities that bankruptcy or resolution can be undertaken while keeping the economy relatively safe.

The Federal Reserve regulates the largest banks, and it is making sure that the bigger the bank, the greater the loss-absorbing buffer it must hold—and it will be making sure that systemically important nonbanks also have extra capital and can be resolved in an orderly manner.  Once that process is complete, it can be left to the market to decide whether or not it pays to be a big bank.  Regulators do not have to break up the banks or figure out how that would be done without disrupting the financial system.


Editor's note: This piece originally appeared in Bloomberg Government

Publication: Bloomberg Government
      
 
 




break

The Marketplace of Democracy: A Groundbreaking Survey Explores Voter Attitudes About Electoral Competition and American Politics

Event Information

October 27, 2006
10:00 AM - 12:00 PM EDT

Falk Auditorium
The Brookings Institution
1775 Massachusetts Ave., NW
Washington, DC


Despite the attention on the mid-term races, few elections are competitive. Electoral competition, already low at the national level, is in decline in state and primary elections as well. Reformers, who point to gerrymandering and a host of other targets for change, argue that improving competition will produce voters who are more interested in elections, better informed on the issues, and more likely to turn out to vote.

On October 27, the Brookings Institution—in conjunction with the Cato Institute and The Pew Research Center—presented a discussion and a groundbreaking survey exploring the attitudes and opinions of voters in competitive and noncompetitive congressional districts. The survey, part of Pew's regular polling on voter attitudes, was conducted through the weekend of October 21. A series of questions explored the public's perceptions, knowledge, and opinions about electoral competitiveness.

The discussion also explored a publication that addresses the startling lack of competition in our democratic system. The Marketplace of Democracy: Electoral Competition and American Politics (Brookings, 2006) considers the historical development, legal background, and political aspects of a system that is supposed to be responsive and accountable, yet for many is becoming stagnant, self-perpetuating, and tone-deaf. Michael McDonald, co-editor and Brookings visiting fellow, moderated a discussion among co-editor John Samples, director of the Center for Representative Government at the Cato Institute, and Andrew Kohut and Scott Keeter from The Pew Research Center, who also discussed the survey.





break

COVID-19 outbreak highlights critical gaps in school emergency preparedness

The COVID-19 epidemic sweeping the globe has affected millions of students, and the resulting school closures have more often than not caught them, their teachers, and their families by surprise. For some, it means missing class altogether, while others are trialing online learning—often facing difficulties with online connections, as well as motivational and psychosocial well-being challenges. These problems…

       




break

USA: Bernie Sanders and the lessons of the “Dirty Break” – Why socialists shouldn’t run as Democrats

The economic crisis and pandemic have made it patently clear that US capitalism is not at all exceptional. Like everything else in the universe, American capital’s political system is subject to sharp and sudden changes. After Bernie Sanders handily won the first few contests of the 2020 race for the Democratic nomination, he was seen as an unstoppable threat—prompting every other candidate to immediately fold up their campaigns and close ranks against him. After months of panicking over Bernie’s momentum, the ruling class finally managed to reverse the course of the electoral race—and they did it with unprecedented speed. Now, after an electrifying rollercoaster ride, Bernie Sanders’s campaign for the American presidency is over, and a balance sheet is needed.




break

Ollie the jailbreaking bobcat on the lam from National Zoo

The 25-pound lady bobcat was last seen on Monday morning.




break

UK wind energy breaks output records. Again.

This is very good news. So much so that it might soon stop being news.




break

Morocco: let us break the rod of repression with organisation and struggle

Those who follow the situation in Morocco can see that the repressive dictatorial regime has become more and more frenzied, and the police state has tightened its repressive grip on everyone and everything. They are arresting those who protest, who sing, who criticise, who write, and who show solidarity with those arrested.




break

195 nations agree to groundbreaking Paris climate deal

Today, the United Nations climate talks reached an agreement and committed the world to fighting devastating levels of climate change.