apt

Immediate adaptation analysis implicates BCL6 as an EGFR-TKI combination therapy target in NSCLC [Research]

Drug resistance is a major obstacle to curative cancer therapies, and increased understanding of the molecular events contributing to resistance would enable better prediction of therapy response, as well as contribute to new targets for combination therapy. Here we have analyzed the early molecular response to epidermal growth factor receptor (EGFR) inhibition using RNA sequencing data covering 13 486 genes and mass spectrometry data covering 10 138 proteins. This analysis revealed a massive response to EGFR inhibition within the first 24 hours, including significant regulation of hundreds of genes known to control downstream signaling, such as transcription factors, kinases, phosphatases and ubiquitin E3-ligases. Importantly, this response included upregulation of key genes in multiple oncogenic signaling pathways that promote proliferation and survival, such as ERBB3, FGFR2, JAK3 and BCL6, indicating an early adaptive response to EGFR inhibition. Using a library of more than 500 approved and experimental compounds in a combination therapy screen, we showed that several kinase inhibitors with targets including JAK3 and FGFR2 increased the response to EGFR inhibitors. Further, we investigated the functional impact of BCL6 upregulation in response to EGFR inhibition using siRNA-based silencing of BCL6. Proteomics profiling revealed that BCL6 inhibited transcription of multiple target genes including p53, resulting in reduced apoptosis, which implicates BCL6 upregulation as a new EGFR inhibitor treatment escape mechanism. Finally, we demonstrate that combined treatment targeting both EGFR and BCL6 acts synergistically in killing lung cancer cells. In conclusion, our data indicate that multiple adaptive mechanisms may act in concert to blunt the cellular impact of EGFR inhibition, and we suggest BCL6 as a potential target for EGFR inhibitor-based combination therapy.
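
The abstract reports synergy between EGFR and BCL6 inhibition but does not state which synergy metric was used. As a point of reference only, the minimal Python sketch below computes an excess-over-Bliss score, one common way of scoring synergy in combination screens; the viability/inhibition numbers are hypothetical, not the study's data:

# Minimal sketch of an excess-over-Bliss synergy score for a drug combination.
# Inhibition values are hypothetical; the abstract does not specify the metric used.

def bliss_excess(effect_a, effect_b, effect_combo):
    """Effects are fractional inhibition values in [0, 1].
    A positive excess indicates synergy under the Bliss independence model."""
    expected = effect_a + effect_b - effect_a * effect_b
    return effect_combo - expected

# Hypothetical single-agent and combination inhibition of lung cancer cells
egfr_inhibitor = 0.40   # 40% inhibition with the EGFR inhibitor alone
bcl6_inhibitor = 0.25   # 25% inhibition with the BCL6 inhibitor alone
combination    = 0.75   # 75% inhibition with both drugs together

print(f"Excess over Bliss: {bliss_excess(egfr_inhibitor, bcl6_inhibitor, combination):.2f}")
# Here the expected independent effect is 0.55, so an observed 0.75 gives an excess of 0.20,
# which would be read as synergy.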




apt

Characterization of signaling pathways associated with pancreatic β-cell adaptive flexibility in compensation of obesity-linked diabetes in db/db mice [Research]

The onset of obesity-linked type 2 diabetes (T2D) is marked by an eventual failure in pancreatic β-cell function and mass that is no longer able to compensate for the inherent insulin resistance and increased metabolic load intrinsic to obesity. However, in a commonly used model of T2D, the db/db mouse, β-cells have an inbuilt adaptive flexibility enabling them to effectively adjust insulin production rates relative to the metabolic demand. Pancreatic β-cells from these animals have markedly reduced intracellular insulin stores, yet high rates of (pro)insulin secretion, together with a substantial increase in proinsulin biosynthesis highlighted by expanded rough endoplasmic reticulum and Golgi apparatus. However, when the metabolic overload and/or hyperglycemia is normalized, β-cells from db/db mice quickly restore their insulin stores and normalize secretory function. This demonstrates the β-cell’s adaptive flexibility and indicates that therapeutic approaches applied to encourage β-cell rest are capable of restoring endogenous β-cell function. Yet the mechanisms that regulate β-cell adaptive flexibility are essentially unknown. To gain deeper mechanistic insight into the molecular events underlying β-cell adaptive flexibility in db/db β-cells, we applied a combined proteomic and post-translational modification-specific proteomic (PTMomics) approach to islets from db/db mice and wild-type controls (WT) with or without prior exposure to normal glucose levels. We identified differential modifications of proteins involved in redox homeostasis, protein refolding, K48-linked deubiquitination, mRNA/protein export, focal adhesion, ERK1/2 signaling, and renin-angiotensin-aldosterone signaling, as well as sialyltransferase activity, associated with β-cell adaptive flexibility. These proteins are all related to proinsulin biosynthesis and processing, maturation of insulin secretory granules, and vesicular trafficking—core pathways involved in the adaptation of insulin production to meet metabolic demand. Collectively, this study outlines a novel and comprehensive global PTMome signaling map that highlights important molecular mechanisms related to the adaptive flexibility of β-cell function, providing improved insight into the disease pathogenesis of T2D.




apt

Characterization of Prenylated C-terminal Peptides Using a Thiopropyl-based Capture Technique and LC-MS/MS [Research]

Post-translational modifications play a critical and diverse role in regulating cellular activities. Despite their fundamentally important role in cellular function, there has been no report to date of an effective generalized approach to the targeting, extraction and characterization of the critical C-terminal regions of natively prenylated proteins. Various chemical modification and metabolic labeling strategies in cell culture have been reported. However, their applicability is limited to cell culture systems and does not allow for analysis of tissue samples. The chemical characteristics (hydrophobicity, low abundance, highly basic charge) of many of the C-terminal regions of prenylated proteins have impaired the use of standard proteomic workflows. In this context, we sought a direct approach to the problem in order to examine these proteins in tissue without the use of labeling. Here we demonstrate that prenylated proteins can be captured on chromatographic resins functionalized with mixed disulfide functions. Protease treatment of resin-bound proteins using chymotryptic digestion revealed peptides from many known prenylated proteins. Exposure of the protease-treated resin to reducing agents and hydro-organic mixtures released C-terminal peptides with intact prenyl groups along with other enzymatic modifications expected in this protein family. Database and search parameters were selected to allow for C-terminal modifications unique to these molecules, such as CAAX box processing and C-terminal methylation. In summary, we present a direct approach to enrich and obtain molecular-level information about protein prenylation from tissue and cell extracts using high-performance LC-MS without the need for metabolic labeling or derivatization.




apt

Motifs of Three HLA-DQ Amino Acid Residues (α44, β57, β135) Capture Full Association with the Risk of Type 1 Diabetes in DQ2 and DQ8 Children

HLA-DQA1 and -DQB1 are strongly associated with type 1 diabetes (T1D), and DQ8.1 and DQ2.5 are major risk haplotypes. Next-generation targeted sequencing of HLA-DQA1 and -DQB1 in Swedish newly diagnosed 1-18 year-old patients (n=962) and controls (n=636) was used to construct abbreviated DQ haplotypes, converted into amino acid (AA) residues, and assessed for their associations with T1D. A hierarchically organized haplotype (HOH) association analysis allowed 45 unique DQ haplotypes to be categorized into seven clusters. The DQ8/9 cluster included the two DQ8.1 risk haplotypes and the DQ9 resistant haplotype, and the DQ2 cluster included the DQ2.5 risk and DQ2.2 resistant haplotypes. Within the DQ8/9 cluster, HOH found residues α44Q (OR 3.29, p=2.38×10^-85) and β57A (OR 3.44, p=3.80×10^-84) to be associated with T1D, representing all ten residues (α22, α23, α44, α49, α51, α53, α54, α73, α184, β57) owing to complete linkage disequilibrium (LD) of α44 with eight such residues. Within the DQ2 cluster, and again owing to LD, HOH analysis found α44C and β135D to share the risk for T1D (OR 2.10, p=1.96×10^-20). The motif "QAD" of α44, β57, and β135 captured the T1D risk association of DQ8.1 (OR 3.44, p=3.80×10^-84), and the corresponding motif "CAD" captured the risk association of DQ2.5 (OR 2.10, p=1.96×10^-20). The two risk motifs were also associated with GADA and IA-2A, but in opposite directions: "CAD" was positively associated with GADA (OR 1.56; p=6.35×10^-8) but negatively with IA-2A (OR 0.59, p=6.55×10^-11), whereas "QAD" was negatively associated with GADA (OR 0.88; p=3.70×10^-3) but positively with IA-2A (OR 1.64; p=2.40×10^-14), despite the motifs differing only at α44. The residues are found in and around anchor pockets 1 and 9, as potential TCR contacts, in the areas for CD4 binding and putative homodimer formation. The identification of three HLA-DQ AA residues (α44, β57, β135) conferring T1D risk should sharpen functional and translational studies.
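
For readers unfamiliar with how such associations are quantified, the minimal Python sketch below computes an odds ratio and Fisher exact p-value from a 2x2 carrier/case count table; the counts are invented for illustration and are not the Swedish cohort data or the HOH analysis itself:

# Minimal sketch: odds ratio and p-value for a motif vs. T1D case-control table.
# Counts are hypothetical, not the study's data.
from scipy.stats import fisher_exact

# rows: motif carriers / non-carriers; columns: T1D cases / controls
table = [[480, 150],   # carriers
         [482, 486]]   # non-carriers

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2e}")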




apt

Nutrient-Induced Metabolic Stress, Adaptation, Detoxification, and Toxicity in the Pancreatic β-Cell

Paraphrasing the Swiss physician and father of toxicology Paracelsus (1493–1541) on chemical agents used as therapeutics, "the dose makes the poison," it is now realized that this aptly applies to the calorigenic nutrients. The case here is the pancreatic islet β-cell presented with excessive levels of nutrients such as glucose, lipids, and amino acids. The short-term effects these nutrients exert on the β-cell are enhanced insulin biosynthesis and secretion and changes in glucose sensitivity. However, chronic fuel surfeit triggers additional compensatory and adaptive mechanisms by β-cells to cope with the increased insulin demand or to protect themselves. When these mechanisms fail, toxicity due to the nutrient surplus ensues, leading to β-cell dysfunction, dedifferentiation, and apoptosis. The terms glucotoxicity, lipotoxicity, and glucolipotoxicity have been widely used, but there is some confusion as to what they mean precisely and which is most appropriate for a given situation. Here we address the gluco-, lipo-, and glucolipo-toxicities in β-cells by assessing the evidence both for and against each of them. We also discuss potential mechanisms and defend the view that many of the identified "toxic" effects of nutrient excess, which may also include amino acids, are in fact beneficial adaptive processes. In addition, candidate fuel-excess detoxification pathways are evaluated. Finally, we propose that a more general term should be used for the in vivo situation of overweight-associated type 2 diabetes, reflecting both the adaptive and toxic processes in response to mixed calorigenic nutrient excess: "nutrient-induced metabolic stress" or, in brief, "nutri-stress."




apt

A kinesin adapter directly mediates dendritic mRNA localization during neural development in mice [Neurobiology]

Motor protein-based active transport is essential for mRNA localization and local translation in animal cells, yet how mRNA granules interact with motor proteins remains poorly understood. Using an unbiased yeast two-hybrid screen for interactions between murine RNA-binding proteins (RBPs) and motor proteins, here we identified protein interaction with APP tail-1 (PAT1) as a potential direct adapter between zipcode-binding protein 1 (ZBP1, a β-actin RBP) and the kinesin-I motor complex. The amino acid sequence of mouse PAT1 is similar to that of the kinesin light chain (KLC), and we found that PAT1 binds to KLC directly. Studying PAT1 in mouse primary hippocampal neuronal cultures from both sexes and using structured illumination microscopic imaging of these neurons, we observed that brain-derived neurotrophic factor (BDNF) enhances co-localization of dendritic ZBP1 and PAT1 within granules that also contain kinesin-I. PAT1 is essential for BDNF-stimulated neuronal growth cone development and dendritic protrusion formation, and we noted that ZBP1 and PAT1 co-localize with β-actin mRNA in actively transported granules in living neurons. Acute disruption of the PAT1–ZBP1 interaction in neurons with PAT1 siRNA or a dominant-negative ZBP1 construct diminished localization of β-actin mRNA but not of Ca2+/calmodulin-dependent protein kinase IIα (CaMKIIα) mRNA in dendrites. The aberrant β-actin mRNA localization resulted in abnormal dendritic protrusions and growth cone dynamics. These results suggest a critical role for PAT1 in BDNF-induced β-actin mRNA transport during postnatal development and reveal a new molecular mechanism for mRNA localization in vertebrates.




apt

Adapt or Die: The Need for Orders to Evolve

12 June 2019

Adam Ward

Former Deputy Director, Chatham House
Historically, efforts to build rules-based international orders have emerged out of conflict, only for each system to falter when a new crisis emerges. At issue today, with the post-1945 multilateral system under strain, is how to modernize the making and application of rules to break that cycle.


School children hold a placard reading "CHANGE" during the Youth Climate Strike May 24, 2019 outside United Nations headquarters in New York City. Photo by Johannes EISELE/AFP/Getty Images.

The most vexing, complicated and elusive question in international relations is how to achieve an order, based on rules, that enjoys legitimacy, rewards investments in cooperation, reconciles clashing interests and deters conflict. It is not a problem over which a magic wand can be waved. But in our own time, immense and patient efforts have been made towards that general goal, however imperfect the result.

The concept of the ‘rules-based international order’ refers today in its most general sense to arrangements put into place to allow for cooperative efforts in addressing geopolitical, economic and other global challenges, and to arbitrate disputes. It is embodied in a variety of multilateral institutions, starting with the United Nations and running through various functional architectures such as the Bretton Woods system, the corpus of international law and other regimes and treaties, down to various regional instances where sovereignty is pooled or where powers have been delegated consensually by states on a particular issue.

Some aspects of the rules-based order are heavily informed by distinct values, such as those contained in the Universal Declaration of Human Rights. But, more often than not, they simply prescribe a set of basic principles for how the business of international political and economic relations is to be transacted. The parameters of legitimate and illegitimate behaviour are specified. Compliance is incentivized, and some scope to sanction transgressors is provided for.

For some, the rules-based international order is a politically highly charged concept. Indeed, the absence of a common standardized definition of it is perhaps a by-product of the controversy which the mere notion of a rules-based order often attracts – among those who had no or little part in its shaping; those who regard multilateralism as an infringement of sovereignty and a straitjacket on national ambitions; and those who sense in it a presumption of universal values and shared interests that jars with their own particular historical experience and political preferences. And in a world in which each country occupies its own place on the spectrum of attraction to, tolerance of and resistance to multilateralism, it is inevitable that the present system should be a patchy and incomplete one.

If that patchiness seems increasingly apparent today, then this reflects the proliferation of problems on a truly global scale that multilateral initiatives have as yet failed to keep up with. This is partly because of the sheer pace of change and the deep complexity of problems, and partly because any significant programme of coordinated action requires a focus and consensus that today is in shrinking supply.

More than that, some of the sharpest challenges – climate change; the lack or weakness of rules in the sea, space and cyber domains; the dilemmas thrown up by technological change – are problematic precisely because they are areas in and through which geopolitical competitions are being contested. The policy challenges may be new, but the pattern of behaviour currently surrounding them presents some dangerous echoes from the past.

Throughout history, most attempts to form international orders have been conceived in a coercive way. From classical antiquity to the 20th century, the dominant form of order has been that imposed or attempted by successive territorial empires, or by predominant powers who made the rules by fiat and were deferred to by their neighbours and satellites.

Significant attempts at more collaborative conceptions of order, aimed at coexistence and minimizing risk through rules and accepted conventions, have been far rarer. And the key point about them is that they have been attempted only after competition has spilled over into an uncontrolled, exhausting and ruinous conflict that has called for mechanisms and understandings to prevent a recurrence of disaster. That, in any case, has been the European experience, and subsequently the result of the engulfing crises that radiated out globally from Europe in the 20th century.

Early efforts at order-building focused on mutual recognition and the management of what were felt to be inevitable rivalries. The Westphalian Peace of 1648 emerged from a 30-year period of religious war in Europe. It emphasized the sanctity of sovereignty and non-interference in the internal affairs of other states as a precondition for order, but relied on a jostling balance-of-power approach to the preservation of a basic stability.

A tolerance of conflicts to correct imbalances was implicit in the scheme. But its acute sensitivity to shifts in alignments of power contributed to the later conflicts – from the wars of the Spanish Succession and Austrian Succession to the Seven Years’ War – that ravaged Europe in the 18th century and occurred in an increasingly global theatre of military operations, tracing the development of European imperial projects.

Despite these shortcomings, the balance-of-power model was produced again as a remedy to uncontrolled conflict, at the Congress of Vienna in 1814–15, following more than 20 years of French Revolutionary and Napoleonic wars. A Concert of Europe, accommodating a rehabilitated France, was instituted to regulate the system and periodically decide major geopolitical issues. But it fell into disuse. And although Europe did not suffer a general war for the rest of the 19th century, the salient geopolitical facts were ones not of power balances but of the sharp relative decline of France and the vertiginous rise of Prussia, which defeated Austria and France on the path to German unification.

These dynamics produced convoluted and ever-widening balancing manoeuvres that by the eve of the First World War in 1914 had congealed and hardened into the opposing Triple Alliance and Triple Entente systems, which trapped their respective members into tangled commitments to fight at the trigger of a crisis.

The peacemaking efforts, in Paris in 1919, that followed the war entailed conscious efforts to overturn the balance-of-power model. The tone was set by US President Woodrow Wilson’s Fourteen Points, with their emphasis on transparency and openness, while the concepts of egalitarianism among states, the drive towards disarmament and the practice of collective security were central to the revolutionary creation of a League of Nations in 1920.

But the peacemaking also included a punitive dimension – the designation of German culpability, the demand for economic reparations and territorial adjustments – imposed by victor on vanquished. To its critics, the international order being evolved, and the rules drafted to underpin it, had the attributes of an involuntary settlement more than those of a construct built by equals.

Lacking a comprehensive membership – crucially, the US had demurred, while other major powers progressively withdrew or were thrown out – and the military means to impose itself, a divided and often circumspect League faltered in meeting a succession of international crises. It then collided fatally with the revanchism of Germany, Italy and Japan that produced the Second World War.

The ambitiousness and eventual institutional intricacy of the UN system founded in 1945 marked a response to the scale of the ordeal through which the world had passed, and sought to correct the deficits of the League. The UN’s membership and the activity of its main organs and specialized agencies all grew prodigiously in succeeding decades, as did its efforts to advance the spirit and culture of multilateralism.

But by giving special privileges to the victors, principally through veto rights held among a small group of permanent Security Council members, the UN reflected and perpetuated a certain historical circumstance: there was no formal institutional adaptation in its highest structures to account for a progressive redistribution of international power, the rehabilitation of defeated countries, the rise of the decolonized world or the desire of emerging powers to assume international responsibilities commensurate with their heft. Rather than a mechanism for international governance, it remained an intergovernmental body through which states pursued their specific or collective priorities.

Indeed, the dominant questions around order in the first five decades of the UN’s existence were those posed by the Cold War conducted by the US and the Soviet Union and their respective allies and satellites, while the UN in effect was a prominent arena in which this global antagonism was carried out.

The world order was bipolar in concentrating power in two camps, with a swath of neutrals, non-aligned and swing players in between; and bi-systemic in the complete contrast in the ideological affinities and economic models that were promoted. Nuclear weapons raised the stakes associated with direct conflict to an existential level, and so pushed armed contests to peripheral theatres or on to skirmishing proxies.

The collapse of communism in the early 1990s ushered in a new dispensation. Those who divined the arrival of a ‘unipolar moment’ for the US were perhaps more accurate in their choice of epithet than they knew. At least on the surface, the US became by far the preponderant power. The decline and 1991 dissolution of the Soviet Union, in consequence of its economic decrepitude and strategic overstretch, not only removed the US’s peer competitor, but also opened up avenues for promoting economic liberalization and democratic government.

This shift was manifest in particular in changing dynamics in Europe. The US had sponsored the reunification of Germany and was a patron of its subsequent embedding in an integrating, democratic and liberal region. Over time, this drew the former Warsaw Pact members into EU and NATO structures (albeit at a pace and with a completeness to which Russia’s strategic calculations could not be accommodated).

And yet, despite these advances, in retrospect the chief development of the 20 years after the Cold War was a different one: globalization had at a gathering pace prompted a redistribution of political power, while its interlocking economic structures created a dense web of interests and dependencies that moved in all directions. It was likely in these circumstances that the appearance of any major emergency would produce insistent voices demanding what they saw as a more inclusive, legitimate and effective form of international order.

Crises duly arrived, first in the shape of the 2003 US-led invasion of Iraq, which strained alliances and stirred controversial debates about the justice and permissibility of military interventions and the need for constraints on US power; and then in the form of the financial meltdown of 2008, seen by many as a principally Western debacle calling for new global economic governance structures as instanced in the improvised G20. Neither set of debates was conclusively resolved, but each persisted against the backdrop of quickening systemic change.

The dilemmas about the shape and maintenance of a rules-based order with multilateralism at its core have since only deepened. The world is pulling in different directions. The ‘America First’ posture of the Trump administration has upturned the central feature of the system. It entails a distaste for multilateral agreements, a disavowal of traditional notions of US leadership, and an insistence on the unimpeded exercise of American power in pursuit of defined national interests.

China asserts the centrality of multilateralism, and practises it selectively, but on the whole favours bilateral diplomatic transactions in which it holds asymmetric advantages; it has used this approach in the construction of its Belt and Road Initiative, as well as on other fronts.

Europe has created in its continent a rules-based order par excellence in the shape of the EU, but its energy has been sapped and its introversion fed by a succession of crises, of which the amputation of the Brexit-bound UK is simply one. The EU has yet to chart its future course or define a global strategy to uphold and advance the multilateralism which has been at its core.

Russia unabashedly is subverting the rules-based order as part of a programme of aggrieved self-aggrandizement. Japan champions the principle of a rules-based system, but the country has been disoriented by its abrupt detachment on this issue from its traditional US partner; while Japan has sought to engage like-minded countries in the West, they have not forged a concerted practical plan of action together.

Among other regional powers, Brazil has a populist government that echoes many of the Trump administration’s instincts, and India, whatever its preferences, has yet to acquire a foreign policy or presence on the global stage equal to its demographic weight and economic potential.

Prominent points of risk in this fragmenting picture are the multilateral trade system, efforts to address climate change, and collective measures to deal with entrenched conflicts.

One obvious consequence of the attrition of the rules-based system through the indifference or ambitions of the great powers is that it will leave smaller states much more exposed and hostage to the vagaries of geopolitical competition. A key question therefore is whether such states will choose and be able to defend a system which gives them a measure of protection.

Over recent decades, a variety of regional groupings – ASEAN, the African Union, the Gulf Cooperation Council, the Organization of American States – have evolved as species of rules-based mechanisms and in order to gather their collective weight. They make a ready constituency for those who would build a coalition for multilateralism. But it is also clear that the support of smaller regional players for such an approach depends on a revision of the rule-making system towards greater inclusivity and a broader say as to the issues it should address.

It is in the context of these trends and structural shifts that Chatham House Expert Perspectives 2019 offers ideas for how to modernize and adapt elements of the rules-based international order. As the title of this opening essay indicates, the imperative to ‘adapt’ reflects the gravity of contemporary challenges, and the inability of many existing structures to underpin ever-more-essential cooperation. Chatham House experts do not offer a master plan, but they attack the problem from a variety of indicative angles.

Suggestions are offered as to where gaps in international rules – regarding economic governance, the global health architecture and in respect of under-regulated domains such as space, for example – need to be filled to address immediate problems and advertise the relevance of multilateralism.

Other ideas demonstrate how logjams affecting some aspects of the system can be worked around; how key powers with scope to shape the system should be engaged; how a broader variety of actors beyond national governments need to be drawn into the effort; how rule-breakers might be tackled; and how imposing order on some chaotic situations requires the fundamental premises of existing policies to be rethought.

Chatham House, which celebrates its centenary in 2020, is a child of efforts after the Great War to reconceive the conduct of international relations and fulfil a mission that is today defined as the creation of a ‘sustainably secure, prosperous and just world’. The historical record shows that international orders not built on these attributes will fail.

This essay was produced for the 2019 edition of Chatham House Expert Perspectives – our annual survey of risks and opportunities in global affairs – in which our researchers identify areas where the current sets of rules, institutions and mechanisms for peaceful international cooperation are falling short, and present ideas for reform and modernization.




apt

Carbon Capture and Storage: Panacea or Procrastination?

Research Event

14 September 2009 - 12:00am to 11:00pm

Chatham House, London

Event participants

Dr Jon Gibbins, Senior Lecturer in the Department of Mechanical Engineering, Imperial College London
Jim Footner, Senior Climate Change Campaigner, Greenpeace

Carbon capture and storage (CCS) has risen up the political agenda both nationally and internationally as a part of the effort to reduce CO2 emissions in power generation yet the applications, potential and impacts of this technology remain contested.

Is CCS - employed to produce low-carbon electricity and hydrogen - the panacea we urgently need to limit cumulative CO2 emissions to a level at which we stand a chance of avoiding dangerous climate change (and possibly also a renaissance in global nuclear fission)? Or does it shift the emphasis away from switching to a more sustainable renewable energy infrastructure that could avoid the use of fossil fuels and nuclear altogether?

In this meeting, two leading voices in the debate give their opinions, separating the known from the unknown and kick-starting an informed discussion about the pros, cons and politics of CCS.

Please note that attendance is by invitation only and there is a maximum of 25 places. 

This meeting is part of the Chatham House Fossil Fuels Expert Roundtable.

Event attributes

All-day event




apt

Lamin C Counteracts Glucose Intolerance in Aging, Obesity, and Diabetes Through β-Cell Adaptation

Aging-dependent changes in tissue function are associated with the development of metabolic diseases. However, the molecular connections linking aging, obesity, and diabetes remain unclear. Lamin A, lamin C, and progerin, products of the Lmna gene, have antagonistic functions on energy metabolism and life span. Lamin C, although it promotes obesity, increases life span, suggesting that this isoform is crucial for maintaining healthy conditions under metabolic stresses. Because β-cell loss during obesity or aging leads to diabetes, we investigated the contribution of lamin C to β-cell function in physiopathological conditions. We demonstrate that aged lamin C only–expressing mice (LmnaLCS/LCS) become obese but remain glucose tolerant due to adaptive mechanisms including increased β-cell mass and insulin secretion. Triggering diabetes in young mice revealed that LmnaLCS/LCS animals normalize their fasting glycemia by both increasing insulin secretion and regenerating β-cells. Genome-wide analyses combined with functional analyses revealed an increase in mitochondrial biogenesis and global translational rate in LmnaLCS/LCS islets, two major processes involved in insulin secretion. Altogether, our results demonstrate for the first time that the sole expression of lamin C protects from glucose intolerance through a β-cell–adaptive transcriptional program during metabolic stresses, highlighting Lmna gene processing as a new therapeutic target for diabetes treatment.




apt

A maladaptive pathway to drug approval

The European Medicines Agency (EMA) has embraced a new model of drug testing and marketing called “adaptive pathways”, allowing new drugs for “unmet medical needs” to be launched on the market faster, on the basis of fewer data. While industry claims this is necessary, an analysis on thebmj.com looks at the assumptions underlying the new pathway,...




apt

How Qatar’s Food System Has Adapted to the Blockade

14 November 2019

Laura Wellesley

Research Fellow, Energy, Environment and Resources Programme
Two-and-a-half years on from the imposition of a trade blockade against Qatar by the Arab Quartet, Qatar’s food system has undergone a remarkable transformation – but it is one that brings new risks to Qatar’s future food and resource security.


Cows are fed at a dairy factory at Baladna farm in al-Khor, Qatar. Photo: Karim Jaafar/AFP via Getty Images.

Earlier this month, Sheikh Tamim – the emir of Qatar – hailed the country’s success in overcoming the impacts of the embargo levied by the so-called Arab Quartet – Bahrain, Egypt, Saudi Arabia and the United Arab Emirates (UAE). Qatar will post a budget surplus for the first time in three years, and the country’s long-term plan for economic diversification has taken great strides, according to the emir. Key among the achievements cited was the advancement of Qatar’s domestic food industry.

When the blockade was introduced in June 2017, it threw the vulnerability of Qatar’s domestic food supply to outside interruption into sharp relief. Qatar is poorly suited to growing food: the desert country ranks as the most water-stressed in the world and is among the hottest and most arid. Trade is therefore critical to feeding the nation; over 90 per cent of its food supply is imported.

Most of Qatar’s cereal imports – including 80 per cent of its wheat supply – arrive by sea from exporters including India, Russia and Australia. Sitting on the eastern edge of the Persian Gulf, Qatar’s only maritime gateway to the world is the Strait of Hormuz. This narrow body of water can, as events this summer have shown, be disrupted by geopolitical events. But for 40 per cent of overall food imports, overland trade from Saudi Arabia was Qatar’s primary supply channel before June 2017 – particularly so for dairy products and fresh fruit and vegetables coming from the EU, Turkey and Jordan.

The abrupt closure of Saudi Arabia’s borders prompted significant private investment in Qatar’s own food industry; domestic production has reportedly increased four-fold since the blockade was introduced. Prior to the blockade, Qatar imported 85 per cent of its vegetables; it now hopes to produce 60 per cent within the next three years. Perhaps even more remarkably, the country is now self-sufficient in dairy, having previously relied on imports for 72 per cent of its supply.

This progress has come at a cost. Qatar’s booming domestic industry is highly resource-intensive. To fill the gap in the dairy sector, Baladna – the country’s principal dairy producer – imported around 18,000 Holstein dairy cows from the EU and US. The company is thriving; in June of this year, it made its first dairy exports.

But the desert is not a natural environment for these cows; they must be kept indoors, at temperatures around 15°C cooler than the outside air, and misted with water to prevent overheating. The cooling systems are a huge drain on local resources. Each dairy cow requires an average of 185 gallons of water a day, almost twice the volume used by the average Qatari household. The majority of this water comes from oil- or gas-powered desalination plants; the cooling systems themselves run on gas-fired electricity.

Qatar has traditionally invested in production overseas – particularly in Sudan and Tanzania – to secure its fodder supply, but the government has plans to become self-sufficient in fodder crops such as lucerne (alfalfa) and Rhodes grass. This will require irrigation on a vast scale. Qatar’s farmland is mostly located in the north of the country where it benefits from aquifers; fodder production already accounts for half of the groundwater extracted for use in agriculture.

Despite commitments made under the National Food Security Programme to improving the water efficiency of Qatar’s food production, the rate of draw-down of these aquifers exceeds their recharge rates. Overexploitation has resulted in saline intrusion, threatening their long-term viability. With 92 per cent of all extracted groundwater given to farmers free of charge, there is little incentive for economizing on its use.

Increasing production will also likely mean increasing fertilizer use; rates of fertilizer use in Qatar are among the highest in the world, second only to those in Singapore.

Both government and industry are taking small steps to ‘green’ the country’s food production. Certain local authorities plan to ban the use of groundwater for fodder production by 2025, requiring producers to use treated sewage water instead and reserving the use of groundwater for crop production.

A number of companies are also adopting so-called ‘circular’ practices to achieve more efficient resource use; Agrico, a major vegetable producer, has expanded its organic hydroponics operations, a move the company reports has led to a 90 per cent reduction in water use. But, with a target to produce up to 50 per cent of Qatar’s fresh food supply domestically within just a few years, scattered examples of resource-saving strategies will not be enough to mitigate the rise in water demand.

As Qatar looks to continue growing its food industry in the wake of the blockade, it is from Saudi Arabia – ironic though it may be – that Qatar stands to learn important lessons.

Saudi Arabia’s scaling up of domestic wheat production – initially to achieve self-sufficiency and then to support a prosperous export industry – was ultimately a failed effort. The unsustainable extraction of groundwater – fuelled by generous subsidies for wheat producers and the nominal cost of diesel for pumping – brought the country’s water table to the brink of collapse, and the government was forced to make a dramatic U-turn, reducing then removing the subsidies and shrinking its wheat sector.

The UAE also provides an instructive example of how domestic food production may be supported – this time a positive one. This summer, the Department of Environment in Abu Dhabi announced its Recycled Water Policy, laying out a policy framework to promote and facilitate the reuse of water across all major sectors, including agriculture.

Back in 2014, the Ministry of Climate Change and Environment set hydroponics as a key priority, launching a 100 million Emirati dirham fund to incentivize and support farmers establishing hydroponic farms. And the International Center for Biosaline Agriculture, based in Dubai and supported by the UAE government, undertakes pioneering research into sustainable food production in saline environments.

On the face of it, Qatar has indeed bounced back from the blockade. As and when cross-border trade is re-established with Saudi Arabia, Qatar will boast a more diverse – and more resilient – network of trade relationships than it did prior to June 2017.

In addition to investment in domestic food production, the blockade also provoked a rapid recalibration of Qatar’s trade relationships. Allies in the region – most notably Turkey and Iran – were quick to come to Qatar’s assistance, delivering fresh produce by air. Since then, Qatar has scaled up its trading relationship with both countries.

It has also leveraged its position as the world’s largest exporter of liquefied natural gas to establish new maritime trade lines with major food exporters, including India. Should tensions spike again in the future, it will be in a stronger position to weather the storm.

But, in the absence of a commitment to support the widespread adoption of circular agricultural technologies and practices, Qatar’s commitment to increasing its self-sufficiency and expanding its domestic production could ultimately undermine its long-term food security.

Rising average temperatures and increasingly frequent extreme weather events – like the heatwave in 2010 when temperatures soared to over 50°C – will exacerbate already high resource stress in the country. Unsustainable exploitation of finite land, water and energy reserves will limit the country’s long-term capacity to produce food and weaken its ability to withstand future disruptions to regional and international supply channels.

As Qatar continues in its efforts to secure a reliable food supply, it would do well to heed the experience of its neighbours, be they friend or foe.




apt

Net Zero and Beyond: What Role for Bioenergy with Carbon Capture and Storage?

Invitation Only Research Event

23 January 2020 - 8:30am to 10:00am

Chatham House | 10 St James's Square | London | SW1Y 4LE

Event participants

Richard King, Senior Research Fellow, Energy, Environment and Resources Department, Chatham House
Chair: Duncan Brack, Associate Fellow, Energy, Environment and Resources Department, Chatham House

As debate turns to the feasibility of reducing greenhouse gas emissions to net zero, policymakers are beginning to pay more attention to options for removing carbon dioxide from the atmosphere. A wide range of potential carbon dioxide removal (CDR) options are currently being discussed and modelled, though the most prominent among them are bioenergy with carbon capture and storage (BECCS) and afforestation and reforestation.

There are many reasons to question the reliance on BECCS assumed in the models, including the carbon balances actually achievable, its substantial requirements for land, water and other inputs, and the availability of technically and economically viable carbon capture and storage technologies.

This meeting will examine the potentials and challenges of BECCS in the context of other CDR and emissions abatement options. It will discuss the requisite policy and regulatory frameworks to minimize sustainability and socio-political risks of CDR approaches while also avoiding overshooting climate goals.

Attendance at this event is by invitation only.

Event attributes

Chatham House Rule

Chloé Prendleloup




apt

Net Zero and Beyond: What Role for Bioenergy with Carbon Capture and Storage?

29 January 2020

Policymakers are in danger of sleepwalking into ineffective carbon dioxide removal solutions in the quest to tackle climate change. This paper warns against overreliance on bioenergy with carbon capture and storage (BECCS). 

Duncan Brack

Associate Fellow, Energy, Environment and Resources Programme

Richard King

Senior Research Fellow, Energy, Environment and Resources Programme

Reaching Net Zero: Does BECCS Work?

Policymakers can be influenced by ineffective carbon dioxide removal solutions in the quest to tackle climate change. This animation explores the risks of using bioenergy with carbon capture and storage (BECCS).

Summary

  • Current climate efforts are not progressing quickly enough to prevent the world from overshooting the global emissions targets set in the Paris Agreement; accordingly, attention is turning increasingly to options for removing carbon dioxide from the atmosphere – ‘carbon dioxide removal’ (CDR).
  •  Alongside afforestation and reforestation, the main option under discussion is bioenergy with carbon capture and storage (BECCS): processes through which the carbon emissions from burning biomass for energy are captured before release into the atmosphere and stored in underground reservoirs.
  • This pre-eminent status is not, however, based on a comprehensive analysis of the feasibility and impacts of BECCS. In reality, BECCS has many drawbacks.
  • Models generally assume that biomass for energy is inherently carbon-neutral (and thus that BECCS, by capturing and storing the emissions from combustion, is carbon-negative), but in reality this is not a valid assumption.
  • On top of this, the deployment of BECCS at the scales assumed in most models would consume land on a scale comparable to half that currently taken up by global cropland, entailing massive land-use change, potentially endangering food security and biodiversity. There is also significant doubt about the likely energy output of BECCS solutions.
  • BECCS may still have some role to play in strategies for CDR, depending mainly on the feedstock used; but it should be evaluated on the same basis as other CDR options, such as nature-based solutions or direct air carbon capture and storage (DACCS). Analysis should take full account of carbon balances over time, the requirements of each CDR option in terms of demand for land, water and other inputs, and the consequences of that demand.
  • There is an urgent need for policymakers to engage with these debates. The danger at the moment is that policymakers are ‘sleepwalking towards BECCS’ simply because most models incorporate it – or, almost as bad, it may be that they are simply ignoring the need for any meaningful action on CDR as a whole.




apt

The New Reality: Germany Adapts to Its Role as a Major Migrant Magnet

Although long one of the world's top migrant destinations, only in the recent past has Germany come to acknowledge and adjust to its role as a country of immigration. Its welcoming approach—a relatively new development—has been put to the test amid massive humanitarian inflows beginning in 2015. This country profile examines Germany's history on immigration and highlights current and emerging debates.




apt

Association of Urine Haptoglobin With Risk of All-Cause and Cause-Specific Mortality in Individuals With Type 2 Diabetes: A Transethnic Collaborative Work

OBJECTIVE

Haptoglobin is an acute-phase reactant with pleiotropic functions. We aimed to study whether urine haptoglobin may predict risk of mortality in people with type 2 diabetes.

RESEARCH DESIGN AND METHODS

We employed a transethnic approach with a cohort of Asian origin (Singapore) (N = 2,061) and a cohort of European origin (France) (N = 1,438) included in the study. We used survival analyses to study the association of urine haptoglobin with risk of all-cause and cause-specific mortality.

RESULTS

A total of 365 and 525 deaths were registered in the Singapore cohort (median follow-up 7.5 years [interquartile range 3.5–12.8]) and the French SURDIAGENE cohort (median follow-up 6.8 years [interquartile range 4.3–10.5]), respectively. Singapore participants with urine haptoglobin in quartiles 2 to 4 had a higher risk for all-cause mortality compared with quartile 1 (unadjusted hazard ratio [HR] 1.47 [95% CI 1.02–2.11], 2.28 [1.62–3.21], and 4.64 [3.39–6.35], respectively). The association remained significant in quartile 4 after multiple adjustments (1.68 [1.15–2.45]). Similarly, participants in the French cohort with haptoglobin in quartile 4 had significantly higher hazards for all-cause mortality compared with quartile 1 (unadjusted HR 2.67 [2.09–3.42] and adjusted HR 1.49 [1.14–1.96]). In both cohorts, participants in quartile 4 had a higher risk of mortality attributable to cardiovascular disease and infection but not malignant tumor.

CONCLUSIONS

Urine haptoglobin predicts risk of mortality independent of traditional risk factors, suggesting that it may potentially be a novel biomarker for risk of mortality in patients with type 2 diabetes.
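
A minimal Python sketch of the kind of quartile-based Cox proportional hazards analysis described above, using the lifelines library; the simulated data frame, column names, and quartile coding are illustrative assumptions, not the study's data or code:

# Minimal sketch of a quartile-based Cox proportional hazards analysis
# (hypothetical data; column names are assumptions, not the study's).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "urine_haptoglobin": rng.lognormal(mean=0.0, sigma=1.0, size=n),
    "age": rng.normal(60, 10, size=n),
    "follow_up_years": rng.exponential(7.0, size=n),
    "death": rng.integers(0, 2, size=n),
})

# Code haptoglobin quartiles as indicator variables (quartile 1 = reference)
df["hp_quartile"] = pd.qcut(df["urine_haptoglobin"], 4, labels=[1, 2, 3, 4])
dummies = pd.get_dummies(df["hp_quartile"], prefix="hpQ", drop_first=True).astype(float)
model_df = pd.concat([df[["follow_up_years", "death", "age"]], dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="follow_up_years", event_col="death")
cph.print_summary()  # hazard ratios for quartiles 2-4 vs. quartile 1, adjusted for age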




apt

ADA urges third-party payers to adapt coding, billing procedures to help patients recover

The American Dental Association sent a letter to third-party payers urging that administrators of dental benefit plans adjust and adapt reimbursement procedures important to dentists and patients — including coverage for temporary procedures and adjusting fee schedules to account for the cost of increased infection control procedures — in the midst of the “unprecedented and extraordinary circumstances dentists and their patients face” during the pandemic.




apt

Unlike 'Jurassic Park,' real raptors may not have hunted in packs

While the coordinated attacks of Velociraptor dinosaurs depicted in the 1993 blockbuster made for compelling movie viewing, a study published this week claims raptors most likely hunted solo, not in packs.




apt

Disabled flies sleep more as they learn to adapt

New research suggests flies that are unable to fly sleep more as they learn to adapt to their disability.




apt

Navy adapts maintenance procedures, strategies for containing COVID-19

The Navy has been particularly hard hit by the coronavirus pandemic and is working to adapt its strategies for maintenance as well as containing outbreaks on ships.




apt

Maryland educators, students aim to adapt to closures




apt

The Baptism of General Butt Naked.




apt

Developing tailored study plans for the new higher education environment : 'Letting go of control' : final report / Professor Joe Shapter, National Teaching Fellow, Flinders University ; Associate Professor Ingo Koeper, College of Science and Engi

"It is timely that the higher education sector examines paths forward to address and indeed engage in the new environment in which it will work in the future. This fellowship explored two approaches to engage students more deeply in their education. The first approach is generally termed 'interdisciplinary studies' where students define their own program of study; the second approach focuses on topic structure where students are given a wide range of choice and in effect can build a topic that suits their interests."--Page iv.




apt

Capturing nature : early scientific photography at the Australian Museum 1857-1893 / Vanessa Finney ; foreword by Kim McKay.

Krefft, Gerard, 1830-1881.




apt

Australianama : the south Asian odyssey in Australia / Samia Khatun ; [adapted by Stan Lamond].

East Indians -- Australia -- Languages.




apt

Whose story is this? : old conflicts, new chapters / Rebecca Solnit.

American essays -- 21st century.




apt

Die Frage über die Heilbarkeit der Lungenphthisen : historisch, pathologisch und therapeutisch untersucht / von Joh. Bapt. Ullersperger.

Wurzburg : Stahel, 1867.




apt

Die pathologische Anatomie und Physiologie des Joh. Bapt. Morgagni (1682-1771) : Ein monographischer Beitrag zur Geschichte der theoretischen Heilkunde / von F. Falk.

Berlin : Hirschwald, 1887.




apt

Diseases of the digestive organs in infancy and childhood : with chapters on the diet and general management of children, and massage in pediatrics / by Louis Starr.

London : Rebman, 1901.




apt

Dr Pereira's elements of materia medica and therapeutics : abridged and adapted for the use of medical and pharmaceutical practitioners and students, and comprising all the medicines of the British pharmacopoeia, with such others as are frequently ord

London : Longmans, Green, 1872.




apt

Economic entomology : Aptera / by Andrew Murray.

[London] : Chapman and Hall, [1877]




apt

Elements of the comparative anatomy of vertebrates / adapted from the German of Robert Wiedersheim by W. Newton Parker.

London : Macmillan, 1897.




apt

Elements of the comparative anatomy of vertebrates / adapted from the German of Robert Wiedersheim by W. Newton Parker ; with additions by the author and translator.

London : Macmillan, 1886.




apt

An epitome of the reports of the medical officers to the Chinese imperial maritime customs service, from 1871 to 1882 : with chapters on the history of medicine in China; materia medica; epidemics; famine; ethnology; and chronology in relation to medicine

London : Bailliere, Tindall and Cox, 1884.




apt

Did #RedForEd Just Capture Its First Midterm Victory?

In Tuesday night's Republican primary in West Virginia, Robert Karnes, a West Virginia Republican state senator who lashed out at teachers during their nine-day strike, lost to pro-labor candidate Bill Hamilton.




apt

An episode in 'Every man in his humour' by Ben Jonson: Charles Dickens in character as Captain Bobadill is awakened after a hard night's drinking. Lithograph by T.H. Maguire after C.R. Leslie.

[London?]




apt

Chukchi people and housing encountered by Captain Cook on his third voyage (1777-1780). Engraving after J. Webber, 1778.

[London?], [between 1700 and 1799?]




apt

Elon Musk Makes Donation to Flint, Mich., District for Laptops

Entrepreneur and business founder Elon Musk will donate $423,000 for laptops in the Michigan district, a few months after making a gift focused on improving water quality in the school system.




apt

Adaptive estimation in the supremum norm for semiparametric mixtures of regressions

Heiko Werner, Hajo Holzmann, Pierre Vandekerkhove.

Source: Electronic Journal of Statistics, Volume 14, Number 1, 1816--1871.

Abstract:
We investigate a flexible two-component semiparametric mixture of regressions model, in which one of the conditional component distributions of the response given the covariate is unknown but assumed symmetric about a location parameter, while the other is specified up to a scale parameter. The location and scale parameters together with the proportion are allowed to depend nonparametrically on covariates. After settling identifiability, we provide local M-estimators for these parameters, which converge in the sup-norm at the optimal rates over Hölder-smoothness classes. We also introduce an adaptive version of the estimators based on the Lepski method. Sup-norm bounds show that the local M-estimator properly estimates the functions globally, and are the first step in the construction of useful inferential tools such as confidence bands. In our analysis we develop general results about rates of convergence in the sup-norm as well as adaptive estimation of local M-estimators, which might be of some independent interest and can also be applied in various other settings. We investigate the finite-sample behaviour of our method in a simulation study, and illustrate it on a real data set from bioinformatics.




apt

Non-parametric adaptive estimation of order 1 Sobol indices in stochastic models, with an application to Epidemiology

Gwenaëlle Castellan, Anthony Cousien, Viet Chi Tran.

Source: Electronic Journal of Statistics, Volume 14, Number 1, 50--81.

Abstract:
Global sensitivity analysis is a set of methods aiming to quantify the contribution of an uncertain input parameter of the model (or combination of parameters) to the variability of the response. We consider here the estimation of the Sobol indices of order 1, which are commonly used indicators based on a decomposition of the output’s variance. In a deterministic framework, when the same inputs always give the same outputs, these indices are usually estimated by replicated simulations of the model. In a stochastic framework, when the response given a set of input parameters is not unique due to randomness in the model, metamodels are often used to approximate the mean and dispersion of the response by deterministic functions. We propose a new non-parametric estimator that does not require a metamodel to estimate the Sobol indices of order 1. The estimator is based on warped wavelets and is adaptive in the regularity of the model. The rate of convergence of the mean square error to zero, as the number of simulations of the model tends to infinity, is computed, and an elbow effect is shown depending on the regularity of the model. Applications in Epidemiology are carried out to illustrate the use of non-parametric estimators.
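
For readers unfamiliar with first-order Sobol indices, the sketch below shows the standard pick-freeze Monte Carlo estimator for a deterministic model, i.e. the baseline setting the abstract contrasts with; the test function and sample size are arbitrary choices, and the paper's warped-wavelet estimator for stochastic models is not reproduced here:

# Minimal sketch: pick-freeze Monte Carlo estimate of first-order Sobol indices
# for a deterministic model (not the paper's wavelet-based estimator).
import numpy as np

def model(x):
    # Arbitrary additive test function of three independent U(0,1) inputs
    return np.sin(2 * np.pi * x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2]

rng = np.random.default_rng(42)
n, d = 100_000, 3
A = rng.random((n, d))
B = rng.random((n, d))
yA = model(A)

for i in range(d):
    # "Freeze" input i from A, resample all other inputs from B
    ABi = B.copy()
    ABi[:, i] = A[:, i]
    yABi = model(ABi)
    # First-order index: Cov(Y, Y_i) / Var(Y)
    S_i = np.cov(yA, yABi)[0, 1] / np.var(yA, ddof=1)
    print(f"S_{i + 1} ~= {S_i:.3f}")
# For this additive test function the three first-order indices sum to roughly 1.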




apt

Adaptive two-treatment three-period crossover design for normal responses

Uttam Bandyopadhyay, Shirsendu Mukherjee, Atanu Biswas.

Source: Brazilian Journal of Probability and Statistics, Volume 34, Number 2, 291--303.

Abstract:
In adaptive crossover designs, the goal is to allocate more patients to a promising treatment sequence. The present work presents a very simple three-period crossover design for two competing treatments in which the allocation in period 3 is made on the basis of the data obtained from the first two periods. Assuming normality of the response variables, we use a reliability functional for the choice between the two treatments. We calculate the allocation proportions and their standard errors corresponding to the possible treatment combinations. We also derive some asymptotic results and provide solutions to related inferential problems. Moreover, the proposed procedure is compared with a possible competitor. Finally, we use a data set to illustrate the applicability of the proposed design.
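
For normal responses the reliability functional referred to above is typically P(X_A > X_B). The sketch below shows one hypothetical way an estimate of that quantity from the first two periods could drive the period-3 allocation probability; the allocation rule and the data are illustrative assumptions, not the authors' design:

# Minimal sketch: estimate the reliability functional P(X_A > X_B) for two
# normally distributed treatment responses and use it to skew period-3
# allocation towards the apparently better treatment.
# The allocation rule and data below are illustrative assumptions only.
import numpy as np
from scipy.stats import norm

# Hypothetical period 1-2 responses (higher = better) under treatments A and B
resp_A = np.array([5.1, 6.0, 5.7, 6.3, 5.9, 6.1])
resp_B = np.array([4.8, 5.2, 5.5, 5.0, 5.3, 5.1])

mu_A, mu_B = resp_A.mean(), resp_B.mean()
var_A, var_B = resp_A.var(ddof=1), resp_B.var(ddof=1)

# For independent normal responses, X_A - X_B ~ N(mu_A - mu_B, var_A + var_B)
pi_hat = norm.cdf((mu_A - mu_B) / np.sqrt(var_A + var_B))

print(f"Estimated P(X_A > X_B) = {pi_hat:.2f}")
print(f"Allocate a period-3 patient to treatment A with probability {pi_hat:.2f}")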




apt

Spatially adaptive Bayesian image reconstruction through locally-modulated Markov random field models

Salem M. Al-Gezeri, Robert G. Aykroyd.

Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 3, 498--519.

Abstract:
The use of Markov random field (MRF) models has proven to be a fruitful approach in a wide range of image processing applications. It allows local texture information to be incorporated in a systematic and unified way and allows statistical inference theory to be applied, giving rise to novel output summaries and enhanced image interpretation. A great advantage of such low-level approaches is that they lead to flexible models, which can be applied to a wide range of imaging problems without the need for significant modification. This paper proposes and explores the use of conditional MRF models for situations where multiple images are to be processed simultaneously, or where only a single image is to be reconstructed and a sequential approach is taken. Although the coupling of image intensity values is a special case of our approach, the main extension over previous proposals is to allow the direct coupling of other properties, such as smoothness or texture. This is achieved using a local modulating function which adjusts the influence of global smoothing without the need for a fully inhomogeneous prior model. Several modulating functions are considered, and a detailed simulation study of conditional reconstruction, motivated by remote sensing applications in archaeological geophysics, is presented. The results demonstrate that a substantial improvement in the quality of the image reconstruction, in terms of errors and residuals, can be achieved using this approach, especially at locations with rapid changes in the underlying intensity.
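
As a rough illustration of locally modulated smoothing in general, not of the authors' conditional MRF model, the sketch below denoises an image by iteratively minimizing a quadratic data term plus a pairwise smoothness term whose weight is reduced wherever the current estimate changes rapidly; the modulating function and all parameters are assumptions:

# Minimal sketch: image denoising with a pairwise smoothness penalty whose
# weight is modulated locally (weaker across strong gradients).  This is a
# simplified illustration, not the conditional MRF model of the paper.
import numpy as np

def denoise(y, beta=2.0, n_iter=50):
    x = y.copy()
    for _ in range(n_iter):
        # 4-neighbour shifts (periodic boundaries for simplicity)
        up    = np.roll(x, -1, axis=0); down  = np.roll(x, 1, axis=0)
        left  = np.roll(x, -1, axis=1); right = np.roll(x, 1, axis=1)
        grad = np.abs(up - down) + np.abs(left - right)
        w = beta / (1.0 + grad)            # local modulating function
        # Pixel-wise minimiser of (x - y)^2 + w * sum_neighbours (x - x_nb)^2
        x = (y + w * (up + down + left + right)) / (1.0 + 4.0 * w)
    return x

rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0   # piecewise-constant scene
noisy = clean + rng.normal(0, 0.3, clean.shape)
restored = denoise(noisy)
print(f"noisy RMSE: {np.sqrt(np.mean((noisy - clean) ** 2)):.3f}, "
      f"restored RMSE: {np.sqrt(np.mean((restored - clean) ** 2)):.3f}")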




apt

Flexible, boundary adapted, nonparametric methods for the estimation of univariate piecewise-smooth functions

Umberto Amato, Anestis Antoniadis, Italia De Feis.

Source: Statistics Surveys, Volume 14, 32--70.

Abstract:
We present and compare some nonparametric estimation methods (wavelet- and/or spline-based) designed to recover a one-dimensional piecewise-smooth regression function in both fixed (equidistant or non-equidistant) design and random design regression models. Wavelet methods are known to be very competitive in terms of denoising and compression, due to the simultaneous localization property of a function in time and frequency. However, boundary assumptions, such as periodicity or symmetry, generate bias and artificial wiggles which degrade overall accuracy. Simple methods have been proposed in the literature for reducing the bias at the boundaries. We introduce new ones based on adaptive combinations of two estimators. The underlying idea is to combine a highly accurate method for non-regular functions, e.g., wavelets, with one well behaved at boundaries, e.g., splines or local polynomials. We provide some asymptotic optimality results supporting our approach. All the methods can handle data with a random design. We also sketch some generalizations to the multidimensional setting. To study the performance of the proposed approaches, we have conducted an extensive set of simulations on synthetic data. A regression analysis of two real data applications using these procedures demonstrates their effectiveness.
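The combination idea can be illustrated very simply: evaluate two fitted curves on the design points and blend them with weights that favour the boundary-friendly estimator near the boundaries. The sketch below uses a fixed boundary margin for the weights, whereas the combinations in the paper are adaptive and data-driven; the margin and the generic interior/boundary fits are illustrative assumptions.

import numpy as np

def boundary_weight(x, margin=0.1):
    """Weight in [0, 1]; close to 1 within `margin` of either edge of [0, 1]."""
    d = np.minimum(x, 1.0 - x)
    return np.clip(1.0 - d / margin, 0.0, 1.0)

def combine_estimates(x, interior_fit, boundary_fit, margin=0.1):
    """Pointwise convex combination of two fitted curves evaluated at x:
    the interior estimator (e.g. wavelet-based) dominates away from the
    boundaries, the boundary-friendly one (e.g. spline or local polynomial)
    dominates near them."""
    w = boundary_weight(x, margin)
    return (1.0 - w) * interior_fit + w * boundary_fit

# Toy usage with two pre-computed fits of the same noisy curve.
x = np.linspace(0.0, 1.0, 201)
truth = np.sin(2 * np.pi * x)
interior_fit = truth + 0.05 * np.cos(40 * x)   # stands in for a wavelet fit
boundary_fit = truth + 0.02 * (x - 0.5)        # stands in for a spline fit
combined = combine_estimates(x, interior_fit, boundary_fit)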




apt

Adaptive clinical trial designs for phase I cancer studies

Oleksandr Sverdlov, Weng Kee Wong, Yevgen Ryeznik.

Source: Statistics Surveys, Volume 8, 2--44.

Abstract:
Adaptive clinical trials are becoming increasingly popular research designs for clinical investigation. Adaptive designs are particularly useful in phase I cancer studies, where clinical data are scant and the goals are to assess the drug's dose-toxicity profile and to determine the maximum tolerated dose while minimizing the number of study patients treated at suboptimal dose levels. In the current work we give an overview of adaptive design methods for phase I cancer trials. We find that the modern statistical literature is replete with novel adaptive designs that have clearly defined objectives and established statistical properties, and are shown to outperform conventional dose-finding methods such as the 3+3 design, both in statistical efficiency and in minimizing the number of patients treated at highly toxic or nonefficacious doses. We discuss statistical, logistical, and regulatory aspects of these designs and present some links to non-commercial statistical software for implementing these methods in practice.
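For readers unfamiliar with the conventional comparator mentioned above, here is a minimal sketch of the rule-based 3+3 dose-escalation scheme. The cohort size and escalation rules are the standard textbook ones, and the simulated toxicity probabilities are illustrative assumptions, not taken from the survey.

import random

def three_plus_three(tox_probs, seed=0):
    """Simulate one 3+3 trial; tox_probs[k] is P(dose-limiting toxicity) at dose k.

    Returns the index of the declared MTD, or -1 if even the lowest dose is too toxic.
    """
    random.seed(seed)
    dose = 0
    while dose < len(tox_probs):
        dlt = sum(random.random() < tox_probs[dose] for _ in range(3))
        if dlt == 0:
            dose += 1                      # 0/3 DLTs: escalate
            continue
        if dlt == 1:                       # 1/3 DLTs: expand the cohort by 3
            dlt += sum(random.random() < tox_probs[dose] for _ in range(3))
            if dlt == 1:
                dose += 1                  # 1/6 DLTs: escalate
                continue
        return dose - 1                    # >= 2 DLTs: MTD is the previous dose
    return len(tox_probs) - 1              # highest dose was tolerated

print(three_plus_three([0.05, 0.10, 0.25, 0.45, 0.60]))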




apt

Capturing and Explaining Trajectory Singularities using Composite Signal Neural Networks. (arXiv:2003.10810v2 [cs.LG] UPDATED)

Spatial trajectories are ubiquitous and complex signals. Their analysis is crucial in many research fields, from urban planning to neuroscience. Several approaches have been proposed to cluster trajectories. They rely either on hand-crafted features, which struggle to capture the spatio-temporal complexity of the signal, or on Artificial Neural Networks (ANNs), which can be more efficient but are less interpretable. In this paper we present a novel ANN architecture designed to capture the spatio-temporal patterns characteristic of a set of trajectories, while taking into account the demographics of the navigators. Hence, our model extracts markers linked to both behaviour and demographics. We propose a composite signal neural network (CompSNN) combining three simple ANN modules. Each of these modules uses a different signal representation of the trajectory while remaining interpretable. Our CompSNN performs significantly better than its modules taken in isolation and makes it possible to visualise which parts of the signal were most useful for discriminating between trajectories.




apt

Covariance Matrix Adaptation for the Rapid Illumination of Behavior Space. (arXiv:1912.02400v2 [cs.LG] UPDATED)

We focus on the challenge of finding a diverse collection of quality solutions on complex continuous domains. While quality diversity (QD) algorithms like Novelty Search with Local Competition (NSLC) and MAP-Elites are designed to generate a diverse range of solutions, these algorithms require a large number of evaluations for exploration of continuous spaces. Meanwhile, variants of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) are among the best-performing derivative-free optimizers in single-objective continuous domains. This paper proposes a new QD algorithm called Covariance Matrix Adaptation MAP-Elites (CMA-ME). Our new algorithm combines the self-adaptation techniques of CMA-ES with archiving and mapping techniques for maintaining diversity in QD. Results from experiments based on standard continuous optimization benchmarks show that CMA-ME finds better-quality solutions than MAP-Elites; similarly, results on the strategic game Hearthstone show that CMA-ME finds both a higher overall quality and broader diversity of strategies than both CMA-ES and MAP-Elites. Overall, CMA-ME more than doubles the performance of MAP-Elites using standard QD performance metrics. These results suggest that QD algorithms augmented by operators from state-of-the-art optimization algorithms can yield high-performing methods for simultaneously exploring and optimizing continuous search spaces, with significant applications to design, testing, and reinforcement learning among other domains.
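The archiving-and-mapping step that CMA-ME borrows from MAP-Elites can be summarised compactly. The sketch below shows only that elite-keeping mechanism on a discretised behaviour space; the CMA-ES emitters and ranking rules of CMA-ME itself are omitted, and the descriptor bounds, grid resolution, and toy objective are illustrative assumptions.

import numpy as np

class MapElitesArchive:
    """Grid archive: each behaviour-space cell keeps the best solution seen so far."""

    def __init__(self, bins=20, low=-1.0, high=1.0):
        self.bins, self.low, self.high = bins, low, high
        self.elites = {}  # cell index (tuple) -> (fitness, solution)

    def _cell(self, behaviour):
        idx = np.floor((np.asarray(behaviour) - self.low)
                       / (self.high - self.low) * self.bins).astype(int)
        return tuple(np.clip(idx, 0, self.bins - 1))

    def add(self, solution, behaviour, fitness):
        cell = self._cell(behaviour)
        if cell not in self.elites or fitness > self.elites[cell][0]:
            self.elites[cell] = (fitness, solution)
            return True   # improvement signal (CMA-ME uses this to rank samples)
        return False

# Example: random search filling a 2-D behaviour grid for a toy objective.
rng = np.random.default_rng(0)
archive = MapElitesArchive()
for _ in range(1000):
    x = rng.uniform(-1, 1, size=5)
    archive.add(x, behaviour=x[:2], fitness=-np.sum(x**2))
print(len(archive.elites), "cells filled")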




apt

Convergence rates for optimised adaptive importance samplers. (arXiv:1903.12044v4 [stat.CO] UPDATED)

Adaptive importance samplers are adaptive Monte Carlo algorithms to estimate expectations with respect to some target distribution which \textit{adapt} themselves to obtain better estimators over a sequence of iterations. Although it is straightforward to show that they have the same $\mathcal{O}(1/\sqrt{N})$ convergence rate as standard importance samplers, where $N$ is the number of Monte Carlo samples, the behaviour of adaptive importance samplers over the number of iterations has been left relatively unexplored. In this work, we investigate an adaptation strategy based on convex optimisation which leads to a class of adaptive importance samplers termed \textit{optimised adaptive importance samplers} (OAIS). These samplers rely on the iterative minimisation of the $\chi^2$-divergence between an exponential-family proposal and the target. The analysed algorithms are closely related to the class of adaptive importance samplers which minimise the variance of the weight function. We first prove non-asymptotic error bounds for the mean squared errors (MSEs) of these algorithms, which explicitly depend on the number of iterations and the number of samples together. The non-asymptotic bounds derived in this paper imply that when the target belongs to the exponential family, the $L_2$ errors of the optimised samplers converge to the optimal rate of $\mathcal{O}(1/\sqrt{N})$, and the rate of convergence in the number of iterations is explicitly provided. When the target does not belong to the exponential family, the rate of convergence is the same but the asymptotic $L_2$ error increases by a factor $\sqrt{\rho^\star} > 1$, where $\rho^\star - 1$ is the minimum $\chi^2$-divergence between the target and an exponential-family proposal.
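The object whose convergence is analysed above is the standard self-normalised importance sampling estimator. The sketch below shows a minimal adaptive loop with a Gaussian (exponential-family) proposal whose mean is updated from the weighted samples; it illustrates the $\mathcal{O}(1/\sqrt{N})$ estimator in general, not the specific $\chi^2$-divergence optimisation scheme analysed in the paper, and the toy target and adaptation rule are illustrative assumptions.

import numpy as np

def snis_estimate(target_logpdf, phi, mu, sigma, n, rng):
    """Self-normalised importance sampling estimate of E_target[phi(X)]
    using a N(mu, sigma^2) proposal (unnormalised target log-density is fine)."""
    x = rng.normal(mu, sigma, size=n)
    logw = target_logpdf(x) - (-0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma))
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return np.sum(w * phi(x)), x, w

# Toy target: N(3, 1); we estimate its mean while adapting the proposal mean.
target_logpdf = lambda x: -0.5 * (x - 3.0) ** 2
rng = np.random.default_rng(1)
mu = 0.0
for t in range(10):
    est, x, w = snis_estimate(target_logpdf, phi=lambda x: x,
                              mu=mu, sigma=2.0, n=2000, rng=rng)
    mu = np.sum(w * x)          # crude adaptation: move proposal mean to the weighted mean
print(est)                      # close to 3.0 after adaptation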




apt

Domain Adaptation in Highly Imbalanced and Overlapping Datasets. (arXiv:2005.03585v1 [cs.LG])

In many Machine Learning domains, datasets are characterized by highly imbalanced and overlapping classes. Particularly in the medical domain, a specific list of symptoms can be labeled as one of several different conditions. Some of these conditions may be more prevalent than others by several orders of magnitude. Here we present a novel unsupervised Domain Adaptation scheme for such datasets. The scheme, based on a specific type of Quantification, is designed to work under both label and conditional shifts. It is demonstrated on datasets generated from Electronic Health Records and provides high-quality results for both Quantification and Domain Adaptation in very challenging scenarios. Potential benefits of using this scheme in the current COVID-19 outbreak, for estimating prevalence and probability of infection, are discussed.
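Quantification here means estimating class prevalences rather than labelling individual cases. Below is a minimal sketch of one standard binary quantifier, adjusted classify-and-count, which corrects the raw positive-prediction rate using the classifier's source-domain tpr/fpr under a label-shift assumption; it illustrates the general idea, not the specific scheme proposed in the paper, and the example numbers are invented.

import numpy as np

def adjusted_classify_and_count(y_pred_target, tpr, fpr):
    """Estimate the positive-class prevalence in a target domain.

    y_pred_target: hard 0/1 predictions of a classifier on unlabelled target data.
    tpr, fpr: true/false positive rates of the same classifier measured on
    labelled source data (assumed invariant under label shift).
    """
    raw_rate = np.mean(y_pred_target)              # "classify and count"
    prevalence = (raw_rate - fpr) / (tpr - fpr)    # correct for classifier error
    return float(np.clip(prevalence, 0.0, 1.0))

# Example: the classifier flags 30% of target cases, with tpr=0.8, fpr=0.1 on source.
print(adjusted_classify_and_count(np.array([1] * 30 + [0] * 70), tpr=0.8, fpr=0.1))  # ~0.29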




apt

A Locally Adaptive Interpretable Regression. (arXiv:2005.03350v1 [stat.ML])

Machine learning models with both good predictability and high interpretability are crucial for decision support systems. Linear regression is one of the most interpretable prediction models. However, the linearity of simple linear regression limits its predictive ability. In this work, we introduce a locally adaptive interpretable regression (LoAIR). In LoAIR, a metamodel parameterized by neural networks predicts the percentiles of a Gaussian distribution for the regression coefficients, enabling rapid adaptation. Our experimental results on public benchmark datasets show that our model not only achieves comparable or better predictive performance than other state-of-the-art baselines but also discovers some interesting relationships between input and target variables, such as a parabolic relationship between CO2 emissions and Gross National Product (GNP). Therefore, LoAIR is a step towards bridging the gap between econometrics, statistics, and machine learning by improving the predictive ability of linear regression without sacrificing its interpretability.




apt

Subdomain Adaptation with Manifolds Discrepancy Alignment. (arXiv:2005.03229v1 [cs.LG])

Reducing domain divergence is a key step in transfer learning problems. Existing works focus on the minimization of global domain divergence. However, two domains may consist of several shared subdomains and differ from each other within each subdomain. In this paper, we take the local divergence of subdomains into account during transfer. Specifically, we propose to use low-dimensional manifolds to represent subdomains and to align the local data distribution discrepancy in each manifold across domains. A Manifold Maximum Mean Discrepancy (M3D) is developed to measure the local distribution discrepancy in each manifold. We then propose a general framework, called Transfer with Manifolds Discrepancy Alignment (TMDA), to couple the discovery of data manifolds with the minimization of M3D. We instantiate TMDA in the subspace learning case, considering both linear and nonlinear mappings. We also instantiate TMDA in the deep learning framework. Extensive experimental studies demonstrate that TMDA is a promising method for various transfer learning tasks.
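The building block behind M3D is the kernel maximum mean discrepancy between the source and target samples assigned to a given manifold. Below is a minimal sketch of the standard (biased) MMD^2 estimator with a Gaussian kernel; restricting it to per-manifold subsets of the data, as TMDA does, is not shown, and the bandwidth and toy data are illustrative assumptions.

import numpy as np

def gaussian_kernel(a, b, bandwidth=1.0):
    d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-d2 / (2 * bandwidth**2))

def mmd2(x, y, bandwidth=1.0):
    """Biased estimate of MMD^2 between samples x (source) and y (target)."""
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2 * kxy

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 3))
tgt = rng.normal(0.5, 1.0, size=(200, 3))   # shifted target distribution
print(mmd2(src, tgt))                        # > 0, shrinks as the domains align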




apt

Adaptive Invariance for Molecule Property Prediction. (arXiv:2005.03004v1 [q-bio.QM])

Effective property prediction methods can help accelerate the search for COVID-19 antivirals, either through accurate in-silico screens or by effectively guiding ongoing at-scale experimental efforts. However, existing prediction tools have limited ability to accommodate the scarce or fragmented training data currently available. In this paper, we introduce a novel approach to learn predictors that can generalize or extrapolate beyond the heterogeneous data. Our method builds on and extends recently proposed invariant risk minimization, adaptively forcing the predictor to avoid nuisance variation. We achieve this by continually exercising and manipulating latent representations of molecules to highlight undesirable variation to the predictor. To test the method we use a combination of three data sources: SARS-CoV-2 antiviral screening data, molecular fragments that bind to the SARS-CoV-2 main protease, and large screening data for SARS-CoV-1. Our predictor outperforms state-of-the-art transfer learning methods by a significant margin. We also report the top 20 predictions of our model on the Broad drug repurposing hub.
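For context, the invariant risk minimization objective that this paper builds on and extends is commonly implemented via the IRMv1 penalty: the squared gradient of each environment's risk with respect to a fixed scalar classifier "dummy". The sketch below shows that standard penalty only; the toy model, the two synthetic environments, and the penalty weight are illustrative assumptions, not the paper's adaptive scheme.

import torch
import torch.nn.functional as F

def irm_penalty(logits, y):
    """IRMv1 penalty: squared gradient of the environment risk with respect to a
    fixed scalar classifier scale, which is small only when the shared
    representation is simultaneously optimal in every environment."""
    scale = torch.tensor(1.0, requires_grad=True)
    loss = F.binary_cross_entropy_with_logits(logits * scale, y)
    grad, = torch.autograd.grad(loss, scale, create_graph=True)
    return grad.pow(2)

# Toy setup: one linear predictor shared across two synthetic "environments".
torch.manual_seed(0)
model = torch.nn.Linear(8, 1)
envs = [(torch.randn(64, 8), torch.randint(0, 2, (64, 1)).float()) for _ in range(2)]

risk, penalty = 0.0, 0.0
for x, y in envs:
    logits = model(x)
    risk = risk + F.binary_cross_entropy_with_logits(logits, y)
    penalty = penalty + irm_penalty(logits, y)
total_loss = risk + 10.0 * penalty   # the weight 10.0 is an arbitrary illustrative choice
total_loss.backward()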