ipl COP26 Diplomatic Briefing Series: Climate Change and National Security By feedproxy.google.com Published On :: Tue, 21 Apr 2020 13:25:01 +0000 Research Event 25 March 2020 - 9:00am to 10:30am Event participants Lieutenant General Richard Nugee, Departmental Lead for Climate Change and Sustainability in the UK Ministry of Defence; Rear Admiral Neil Morisetti, Vice Dean (Public Policy), Engineering Sciences at University College London and Associate Fellow at Chatham House; Dr. Patricia Lewis, Research Director for Conflict, Science and Technology, and Director of the International Security Programme at Chatham House; Professor Yacob Mulugetta, Professor of Energy and Development Policy at University College London; Chair: Glada Lahn, Senior Research Fellow, Chatham House Extreme weather, rising sea levels and a melting Arctic - the effects of climate change pose an increasingly large threat to national security worldwide. Although the issue has gained traction within the international community in recent years, including within the UN Security Council, it is urgent that governments act more decisively to mitigate and respond to the threat, not least given that climate change is happening faster and more powerfully than originally anticipated. The third event in the Chatham House COP26 Diplomatic Briefing Series - 'Climate Change and National Security' - will analyze how climate change acts as a threat multiplier, fuelling instability and endangering economic, social and political systems across the globe. The briefing will also provide recommendations on what governments and other stakeholders should do to develop effective responses. Department/project Energy, Environment and Resources Programme, COP26 Diplomatic Briefing Series Anna Aberg, Research Analyst, Energy, Environment and Resources Programme 020 7314 3629 Email Full Article
ipl COP26 Diplomatic Briefing Series: Money Matters: Climate Finance and the COP By feedproxy.google.com Published On :: Tue, 21 Apr 2020 13:25:01 +0000 Research Event 20 April 2020 - 9:00am to 10:30am Event participants Tenzin Wangmo, Lead Negotiator of the Least Developed Countries (LDC) Group; Mattias Frumerie, Director at the Swedish Ministry for Foreign Affairs; Rachel Ward, Programme Director and Head of Policy at the Institutional Investors Group on Climate Change; Iseoluwa Akintunde, Mo Ibrahim Academy Fellow at Chatham House; Chair: Kirsty Hamilton, Associate Fellow, Chatham House Finance plays a key role in enabling climate change mitigation and adaptation. It is also a contested issue in the UN climate negotiations. The fourth event in the Chatham House COP26 Diplomatic Briefing Series will explore the politics of climate finance in the context of the COP, and provide a comprehensive update on the main climate finance-related negotiation items and processes. The topic is particularly timely given that the UK Government has made climate finance one of its top thematic priorities for COP26 and that 2020 constitutes the deadline for developed countries to mobilise USD 100 billion per year to support climate action in developing countries. Department/project Energy, Environment and Resources Programme, COP26 Diplomatic Briefing Series Anna Aberg, Research Analyst, Energy, Environment and Resources Programme 020 7314 3629 Email Full Article
ipl Randomized Study to Evaluate the Impact of Telemedicine Care in Patients With Type 1 Diabetes With Multiple Doses of Insulin and Suboptimal HbA1c in Andalusia (Spain): PLATEDIAN Study By care.diabetesjournals.org Published On :: 2020-01-20T12:00:30-08:00 OBJECTIVE To assess the impact of a telemedicine visit using the platform Diabetic compared with a face-to-face visit on clinical outcomes, patients’ health-related quality of life (HRQoL), and physicians’ satisfaction in patients with type 1 diabetes. RESEARCH DESIGN AND METHODS PLATEDIAN (Telemedicine on Metabolic Control in Type 1 Diabetes Mellitus Andalusian Patients) (NCT03332472) was a multicenter, randomized, 6-month follow-up, open-label, parallel-group controlled study performed in patients with type 1 diabetes with suboptimal metabolic control (HbA1c <8% [<64 mmol/mol]), treated with multiple daily injections. A total of 388 patients were assessed for eligibility; 379 of them were randomized 1:1 to three face-to-face visits (control cohort [CC]) (n = 167) or the replacement of an intermediate face-to-face visit by a telemedicine visit using Diabetic (intervention cohort [IC]) (n = 163). The primary efficacy end point was the mean change of HbA1c levels from baseline to month 6. Other efficacy and safety end points were mean blood glucose, glucose variability, episodes of hypoglycemia and hyperglycemia, patient-reported outcomes, and physicians’ satisfaction. RESULTS At month 6, the mean change in HbA1c levels was –0.04 ± 0.5% (–0.5 ± 5.8 mmol/mol) in the CC and 0.01 ± 0.6% (0.1 ± 6.0 mmol/mol) in the IC (P = 0.4941). The number of patients who achieved HbA1c <7% (<53 mmol/mol) was 73 and 78 in the CC and IC, respectively. Significant differences were not found regarding safety end points at 6 months. 
Changes in HRQoL between the first visit and final visit did not differ between cohorts, and, regarding fear of hypoglycemia (FH-15 score ≥28), statistically significant differences observed at baseline remained unchanged at 6 months (P < 0.05). CONCLUSIONS The use of telemedicine in patients with type 1 diabetes with HbA1c <8% (<64 mmol/mol) provides similar efficacy and safety outcomes as face-to-face visits. Full Article
ipl U.N. triples coronavirus aid appeal to help most vulnerable countries By www.upi.com Published On :: Thu, 07 May 2020 05:52:20 -0400 The United Nations more than tripled its humanitarian aid appeal on Thursday from $2 billion to $6.7 billion to accommodate its updated global plan to help the poorest nations fight the coronavirus pandemic. Full Article
ipl Blood test may effectively spot multiple cancers By www.upi.com Published On :: Fri, 01 May 2020 17:37:16 -0400 A new blood-test approach for cancer screening was able to identify 26 previously undetected cases of the disease, according to the findings of a new study published Friday by the journal Science. Full Article
ipl Risk of Major Adverse Cardiovascular Events, Severe Hypoglycemia, and All-Cause Mortality for Widely Used Antihyperglycemic Dual and Triple Therapies for Type 2 Diabetes Management: A Cohort Study of All Danish Users By care.diabetesjournals.org Published On :: 2020-04-01T06:54:34-07:00 OBJECTIVE The vast number of antihyperglycemic medications and growing amount of evidence make clinical decision making difficult. The aim of this study was to investigate the safety of antihyperglycemic dual and triple therapies for type 2 diabetes management with respect to major adverse cardiovascular events, severe hypoglycemia, and all-cause mortality in a real-life clinical setting. RESEARCH DESIGN AND METHODS Cox regression models were constructed to analyze 20 years of data from the Danish National Patient Registry with respect to effect of the antihyperglycemic therapies on the three end points. RESULTS A total of 66,807 people with type 2 diabetes were treated with metformin (MET) including a combination of second- and third-line therapies. People on MET plus sulfonylurea (SU) had the highest risk of all end points, except for severe hypoglycemia, for which people on MET plus basal insulin (BASAL) had a higher risk. The lowest risk of major adverse cardiovascular events was seen for people on a regimen including a glucagon-like peptide 1 (GLP-1) receptor agonist. People treated with MET, GLP-1, and BASAL had a lower risk of all three end points than people treated with MET and BASAL, especially for severe hypoglycemia. The lowest risk of all three end points was, in general, seen for people treated with MET, sodium–glucose cotransporter 2 inhibitor, and GLP-1. CONCLUSIONS Findings from this study do not support SU as the second-line treatment choice for patients with type 2 diabetes. Moreover, the results indicate that adding a GLP-1 for people treated with MET and BASAL could be considered, especially if those people suffer from severe hypoglycemia. Full Article
ipl Discipline and Punish, by Michel Foucault By brooklynbooktalk.blogspot.com Published On :: Fri, 06 Sep 2013 21:38:00 +0000 Discipline and Punish (1975), is a genealogy of power based on particulars of penal history, and is considered Foucault’s “out-of-the-ordinary,” “intellectually charismatic,” and “soundly subversive” work, in which he also reveals his passionate empathy for the disenfranchised and the dispossessed, and a desire to trace the overt and covert networks of power, which underlie modern societies. Highly interdisciplinary and thought-provoking in its content, the book is at once a work of history, sociology, philosophy, penology, legal analysis and cultural criticism, thereby making it difficult to categorize in any given literature or tradition. Foucault, who is hailed as a “theorist of paradox” by highly acclaimed critics, was influenced by some of the greatest European philosophers such as Maurice Merleau-Ponty, Jean Beaufret—Martin Heidegger’s major interpreter in France—and Louis Althusser. He earned his License de philosophie in 1948 and Diplôme de psycho-pathologie in 1952, and taught in Sweden, Poland, and Germany before his appointment as the head of the philosophy department at the University of Clermont-Ferrand. The range of his creative (and massively subversive) thought knows no bounds, but throughout his many studies, on subjects as varied as madness, medicine, modern discourse, and sexuality, there is a definite tendency to reverse “taken-for-granted” understandings and to discover, not unlike Freud, the latent behind the manifest--especially when it comes to the nature of power and its pervasive effects in the human condition. 
Moreover, in his major works, Foucault has undertaken a sustained assault upon what he regards as the myths of "the Enlightenment," "Reason," "science," "freedom," "justice," and "democracy"--all these salient features of modern civilization--and has exposed their “hidden side.” Foucault has also argued that the hidden side usually stays hidden because the “production of discourse” in modern societies is controlled, selected, and organized according to certain behind-the-scenes procedures. He suggests that when an idea appears before us repeatedly through different modalities, we are unaware of the prodigious machinery behind it, which is diligently doing discourse selection and dissemination. To make sense of this incredibly crucial work for our times, please join us at Brooklyn Book Talk and share your views about matters of power and punishment, and their subtle manifestations, which ought to concern us all, if we are to leave this world a little better than the way we found it. Full Article
ipl As European policymakers take stock of seasonal worker programmes, MPI Europe brief outlines principles to improve these schemes for all parties By www.migrationpolicy.org Published On :: Tue, 18 Feb 2020 17:46:56 -0500 Findings will be discussed during 25 February MPI Europe – SVR webinar Full Article
ipl Ricotta fritters with triple citrus syrup By www.abc.net.au Published On :: Thu, 15 Oct 2015 12:36:00 +1000 2 cups of fresh ricotta 4 eggs 2 tbsp. sugar Zest of 1 orange, 1 lime and 1 lemon 1 cup of plain flour 60 g soft butter Triple Citrus Syrup Zest and juice of 2 lemons, 2 oranges and 2 limes 2 cups of sugar Full Article
ipl Interdisciplinary Team Care for Diabetic Patients by Primary Care Physicians, Advanced Practice Nurses, and Clinical Pharmacists By clinical.diabetesjournals.org Published On :: 2011-04-01 David WillensApr 1, 2011; 29:60-68Feature Articles Full Article
ipl Application of Adult-Learning Principles to Patient Instructions: A Usability Study for an Exenatide Once-Weekly Injection Device By clinical.diabetesjournals.org Published On :: 2010-09-01 Gayle LorenziSep 1, 2010; 28:157-162Bridges to Excellence Full Article
ipl Annual Interdisciplinary Symposium on Decision Neuroscience (ISDN), Philadelphia, June 5-6, 2020 By feedproxy.google.com Published On :: Thu, 09 Jan 2020 22:04:35 +0000 DEADLINE FOR ORAL PRESENTATIONS: FEB 15, 2020 On June 5-6 2020, Temple University will host the 10th Annual Interdisciplinary Symposium on Decision Neuroscience (ISDN) in Philadelphia, PA. This symposium is unique in that it brings together a range of constituencies involved in the use of neuroscience techniques to understand consumer decision making – world renowned […] The post Annual Interdisciplinary Symposium on Decision Neuroscience (ISDN), Philadelphia, June 5-6, 2020 appeared first on Decision Science News. Full Article
ipl Recent Disciplinary Decisions and Trends. By www.catalog.slsa.sa.gov.au Published On :: Full Article
ipl Strategies For Avoiding Disciplinary Complaints and What To Do - Dealing with Complaints. By www.catalog.slsa.sa.gov.au Published On :: Full Article
ipl Strategies For Avoiding Disciplinary Complaints and What To Do - Section 14AB (1)(C) – The Society’s Statutory Reporting Requirements. By www.catalog.slsa.sa.gov.au Published On :: Full Article
ipl Strategies For Avoiding Disciplinary Complaints and What To Do - How to Defend a Disciplinary Complaint. By www.catalog.slsa.sa.gov.au Published On :: Full Article
ipl Penny Wong : passion and principle : the biography / by Margaret Simons. By www.catalog.slsa.sa.gov.au Published On :: Wong, Penny. Full Article
ipl Die multiple Fettgewebsnecrose : Klinische und experimentelle Studien / von Arthur Katz und Ferdinand Winkler ; mit einem Vorwort von Leopold Oser. By feedproxy.google.com Published On :: Berlin : S. Karger, 1899. Full Article
ipl A digest of the principles and practice of medicine : with a short account of the history of medicine, and tables of Indian materia medica / by Rustomjee Naserwanjee Khory. By feedproxy.google.com Published On :: London : Smith, Elder, 1879. Full Article
ipl Diseases of the skin in twenty-four letters on the principles and practice of cutaneous medicine / by Henry Evans Cauty. By feedproxy.google.com Published On :: London : J. & A. Churchill, [1874] Full Article
ipl Dissertation sur l'hémiplégie faciale ... thèse ... / par J.-J.-H. Montault. By feedproxy.google.com Published On :: Paris : Didot le jeune, 1831. Full Article
ipl Du role de l'épiploïte aiguë ou chronique (adhérences épiploïques) au cours des appendicites / par le Docteur Leon Levrey. By feedproxy.google.com Published On :: Paris : Steinheil, 1899. Full Article
ipl A nakhan or diplomatic officer of the Burmese court, seated, wearing robes. Coloured etching by J.H. Newton, 1828. By feedproxy.google.com Published On :: London (Old Bond Street) : published by John Warren ; [London] (Ave Maria Lane) : G & W.B. Whittaker, [between 1800 and 1899] Full Article
ipl New approaches to treatment of chronic pain : a review of multidisciplinary pain clinics and pain centers / editor, Lorenz K.Y. Ng. By search.wellcomelibrary.org Published On :: Rockville, Maryland : National Institute on Drug Abuse, 1981. Full Article
ipl National drug/alcohol collaborative project : issues in multiple substance abuse / edited by Stephen E. Gardner. By search.wellcomelibrary.org Published On :: Rockville, Maryland : National Institute on Drug Abuse, 1980. Full Article
ipl Targeted Fused Ridge Estimation of Inverse Covariance Matrices from Multiple High-Dimensional Data Classes By Published On :: 2020 We consider the problem of jointly estimating multiple inverse covariance matrices from high-dimensional data consisting of distinct classes. An $\ell_2$-penalized maximum likelihood approach is employed. The suggested approach is flexible and generic, incorporating several other $\ell_2$-penalized estimators as special cases. In addition, the approach allows specification of target matrices through which prior knowledge may be incorporated and which can stabilize the estimation procedure in high-dimensional settings. The result is a targeted fused ridge estimator that is of use when the precision matrices of the constituent classes are believed to chiefly share the same structure while potentially differing in a number of locations of interest. It has many applications in (multi)factorial study designs. We focus on the graphical interpretation of precision matrices with the proposed estimator then serving as a basis for integrative or meta-analytic Gaussian graphical modeling. Situations are considered in which the classes are defined by data sets and subtypes of diseases. The performance of the proposed estimator in the graphical modeling setting is assessed through extensive simulation experiments. Its practical usability is illustrated by the differential network modeling of 12 large-scale gene expression data sets of diffuse large B-cell lymphoma subtypes. The estimator and its related procedures are incorporated into the R-package rags2ridges. Full Article
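The ridge idea underlying the abstract above can be illustrated with a toy computation. The sketch below inverts a regularized 2x2 sample covariance, $(S + \lambda I)^{-1}$, which is only a crude single-class stand-in for the paper's targeted fused ridge estimator (implemented in rags2ridges); the matrix and penalty value are made up for illustration.

```python
# Simplified ridge-type regularization of a precision matrix:
# invert (S + lambda * I) for a 2x2 sample covariance S. This is NOT
# the paper's targeted fused estimator, which jointly shrinks several
# classes toward target matrices; it only shows why the ridge penalty
# stabilizes inversion when S is near-singular.

def ridge_precision_2x2(S, lam):
    """Return the inverse of (S + lam * I) for a 2x2 matrix S."""
    a, b = S[0][0] + lam, S[0][1]
    c, d = S[1][0], S[1][1] + lam
    det = a * d - b * c
    if det == 0:
        raise ValueError("regularized matrix is singular")
    return [[d / det, -b / det], [-c / det, a / det]]

# A near-singular sample covariance: without regularization its
# inverse would be numerically unstable.
S = [[1.0, 0.99], [0.99, 1.0]]
omega = ridge_precision_2x2(S, lam=0.5)
print(omega)
```

With `lam=0` the determinant here is 0.0199 and the entries of the inverse blow up; the penalty pulls the eigenvalues away from zero, which is the stabilizing effect the abstract refers to.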
ipl Skill Rating for Multiplayer Games. Introducing Hypernode Graphs and their Spectral Theory By Published On :: 2020 We consider the skill rating problem for multiplayer games, that is, how to infer player skills from game outcomes in multiplayer games. We formulate the problem as a minimization problem $\arg\min_{s} s^T \Delta s$ where $\Delta$ is a positive semidefinite matrix and $s$ a real-valued function, of which some entries are the skill values to be inferred and other entries are constrained by the game outcomes. We leverage graph-based semi-supervised learning (SSL) algorithms for this problem. We apply our algorithms on several data sets of multiplayer games and obtain very promising results compared to Elo Duelling (see Elo, 1978) and TrueSkill (see Herbrich et al., 2006). As we leverage graph-based SSL algorithms and because games can be seen as relations between sets of players, we then generalize the approach. For this aim, we introduce a new finite model, called hypernode graph, defined to be a set of weighted binary relations between sets of nodes. We define Laplacians of hypernode graphs. Then, we show that the skill rating problem for multiplayer games can be formulated as $\arg\min_{s} s^T \Delta s$ where $\Delta$ is the Laplacian of a hypernode graph constructed from a set of games. From a fundamental perspective, we show that hypernode graph Laplacians are symmetric positive semidefinite matrices with constant functions in their null space. We show that problems on hypernode graphs can not be solved with graph constructions and graph kernels. We relate hypernode graphs to signed graphs showing that positive relations between groups can lead to negative relations between individuals. Full Article
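The minimization $\arg\min_{s} s^T \Delta s$ with some entries of $s$ fixed can be sketched in a few lines. The example below uses a plain 3-node path-graph Laplacian rather than a hypernode graph Laplacian, and solves for the single free skill value by gradient descent; all names and data are illustrative, not the paper's algorithm.

```python
# Toy instance of arg min_s s^T L s subject to fixed entries of s.
# L is the Laplacian of the path graph 0 - 1 - 2 (two observed games);
# fixing s[0]=0 and s[2]=1 forces the free value s[1] to become
# harmonic, i.e. the average of its neighbours' skills.

def solve_skills(L, fixed, n, steps=2000, lr=0.1):
    """Minimize s^T L s over the non-fixed entries by gradient descent."""
    s = [fixed.get(i, 0.0) for i in range(n)]
    for _ in range(steps):
        # gradient of s^T L s is 2 * L s
        grad = [2 * sum(L[i][j] * s[j] for j in range(n)) for i in range(n)]
        for i in range(n):
            if i not in fixed:
                s[i] -= lr * grad[i]
    return s

L = [[1, -1, 0], [-1, 2, -1], [0, -1, 1]]  # path-graph Laplacian
skills = solve_skills(L, fixed={0: 0.0, 2: 1.0}, n=3)
print(skills)  # s[1] converges to 0.5, midway between its neighbours
```

The same structure carries over to the paper's setting: only the Laplacian changes (from a graph Laplacian to a hypernode graph Laplacian built from team games).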
ipl Measuring symmetry and asymmetry of multiplicative distortion measurement errors data By projecteuclid.org Published On :: Mon, 04 May 2020 04:00 EDT Jun Zhang, Yujie Gai, Xia Cui, Gaorong Li. Source: Brazilian Journal of Probability and Statistics, Volume 34, Number 2, 370--393.Abstract: This paper studies the measure of symmetry or asymmetry of a continuous variable under the multiplicative distortion measurement errors setting. The unobservable variable is distorted in a multiplicative fashion by an observed confounding variable. First, two direct plug-in estimation procedures are proposed, and the empirical likelihood based confidence intervals are constructed to measure the symmetry or asymmetry of the unobserved variable. Next, we propose four test statistics for testing whether the unobserved variable is symmetric or not. The asymptotic properties of the proposed estimators and test statistics are examined. We conduct Monte Carlo simulation experiments to examine the performance of the proposed estimators and test statistics. These methods are applied to analyze a real dataset for an illustration. Full Article
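A generic plug-in measure of asymmetry for a continuous variable is the moment-based skewness coefficient $m_3 / m_2^{3/2}$, which is zero for symmetric samples. The sketch below computes this plain version only; the paper's estimators additionally correct for multiplicative distortion by an observed confounder, which is not attempted here, and the data are illustrative.

```python
# Plain sample skewness as a measure of symmetry/asymmetry:
# third central moment divided by the 3/2 power of the variance.
# No distortion correction is performed (the paper's contribution).

def sample_skewness(x):
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n
    return m3 / m2 ** 1.5

print(sample_skewness([1.0, 2.0, 3.0]))       # symmetric sample -> 0.0
print(sample_skewness([1.0, 1.0, 1.0, 9.0]))  # right-skewed -> positive
```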
ipl Wilcoxon-Mann-Whitney or t-test? On assumptions for hypothesis tests and multiple interpretations of decision rules By projecteuclid.org Published On :: Thu, 05 Aug 2010 15:41 EDT Michael P. Fay, Michael A. ProschanSource: Statist. Surv., Volume 4, 1--39.Abstract: In a mathematical approach to hypothesis tests, we start with a clearly defined set of hypotheses and choose the test with the best properties for those hypotheses. In practice, we often start with less precise hypotheses. For example, often a researcher wants to know which of two groups generally has the larger responses, and either a t-test or a Wilcoxon-Mann-Whitney (WMW) test could be acceptable. Although both t-tests and WMW tests are usually associated with quite different hypotheses, the decision rule and p-value from either test could be associated with many different sets of assumptions, which we call perspectives. It is useful to have many of the different perspectives to which a decision rule may be applied collected in one place, since each perspective allows a different interpretation of the associated p-value. Here we collect many such perspectives for the two-sample t-test, the WMW test and other related tests. We discuss validity and consistency under each perspective and discuss recommendations between the tests in light of these many different perspectives. Finally, we briefly discuss a decision rule for testing genetic neutrality where knowledge of the many perspectives is vital to the proper interpretation of the decision rule. Full Article
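The two decision rules the survey above compares can both be computed from scratch on the same data: the Mann-Whitney U statistic (the count of pairs with $x_i < y_j$, ties counted as 1/2) and Welch's t statistic. The samples below are illustrative, and no p-value machinery is included.

```python
# WMW U statistic and Welch's t statistic for two small samples.
import math

def mann_whitney_u(x, y):
    # number of (x, y) pairs with x < y; ties contribute 1/2
    return sum(1.0 if a < b else 0.5 if a == b else 0.0
               for a in x for b in y)

def welch_t(x, y):
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

x, y = [1.2, 2.3, 3.1, 2.8], [3.5, 4.1, 4.8, 3.9]
print(mann_whitney_u(x, y))  # 16.0: every y exceeds every x
print(welch_t(x, y))         # negative: the first sample is smaller
```

Note how the two statistics answer subtly different questions, which is exactly the survey's point: U depends only on pairwise orderings, while t depends on means and variances.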
ipl On a computationally-scalable sparse formulation of the multidimensional and non-stationary maximum entropy principle. (arXiv:2005.03253v1 [stat.CO]) By arxiv.org Published On :: Data-driven modelling and computational predictions based on maximum entropy principle (MaxEnt-principle) aim at finding as-simple-as-possible - but not simpler than necessary - models that allow to avoid the data overfitting problem. We derive a multivariate non-parametric and non-stationary formulation of the MaxEnt-principle and show that its solution can be approximated through a numerical maximisation of the sparse constrained optimization problem with regularization. Application of the resulting algorithm to popular financial benchmarks reveals memoryless models allowing for simple and qualitative descriptions of the major stock market indexes data. We compare the obtained MaxEnt-models to the heteroscedastic models from the computational econometrics (GARCH, GARCH-GJR, MS-GARCH, GARCH-PML4) in terms of the model fit, complexity and prediction quality. We compare the resulting model log-likelihoods, the values of the Bayesian Information Criterion, posterior model probabilities, the quality of the data autocorrelation function fits as well as the Value-at-Risk prediction quality. We show that all of the considered seven major financial benchmark time series (DJI, SPX, FTSE, STOXX, SMI, HSI and N225) are better described by conditionally memoryless MaxEnt-models with nonstationary regime-switching than by the common econometric models with finite memory. This analysis also reveals a sparse network of statistically-significant temporal relations for the positive and negative latent variance changes among different markets. The code is provided for open access. Full Article
ipl Active Learning with Multiple Kernels. (arXiv:2005.03188v1 [cs.LG]) By arxiv.org Published On :: Online multiple kernel learning (OMKL) has provided an attractive performance in nonlinear function learning tasks. Leveraging a random feature approximation, the major drawback of OMKL, known as the curse of dimensionality, has been recently alleviated. In this paper, we introduce a new research problem, termed (stream-based) active multiple kernel learning (AMKL), in which a learner is allowed to request labels for selected data from an oracle according to a selection criterion. This is necessary in many real-world applications as acquiring true labels is costly or time-consuming. We prove that AMKL achieves an optimal sublinear regret, implying that the proposed selection criterion indeed avoids unnecessary label requests. Furthermore, we propose AMKL with an adaptive kernel selection (AMKL-AKS) in which irrelevant kernels can be excluded from a kernel dictionary 'on the fly'. This approach can improve the efficiency of active learning as well as the accuracy of a function approximation. Via numerical tests with various real datasets, it is demonstrated that AMKL-AKS yields a similar or better performance than the best-known OMKL, with a smaller number of labeled data. Full Article
ipl Wyllie's treatment of epilepsy : principles and practice By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Callnumber: OnlineISBN: 149639769X Full Article
ipl Wine science : principles and applications By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Author: Jackson, Ron S., author.Callnumber: OnlineISBN: 9780128161180 Full Article
ipl Tissue engineering : principles, protocols, and practical exercises By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Callnumber: OnlineISBN: 9783030396985 Full Article
ipl Maxillofacial cone beam computed tomography : principles, techniques and clinical applications By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Callnumber: OnlineISBN: 9783319620619 (electronic bk.) Full Article
ipl Handbook for principles and practice of gynecologic oncology By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Callnumber: OnlineISBN: 9781975141066 (paperback) Full Article
ipl A unified treatment of multiple testing with prior knowledge using the p-filter By projecteuclid.org Published On :: Fri, 02 Aug 2019 22:04 EDT Aaditya K. Ramdas, Rina F. Barber, Martin J. Wainwright, Michael I. Jordan. Source: The Annals of Statistics, Volume 47, Number 5, 2790--2821.Abstract: There is a significant literature on methods for incorporating knowledge into multiple testing procedures so as to improve their power and precision. Some common forms of prior knowledge include (a) beliefs about which hypotheses are null, modeled by nonuniform prior weights; (b) differing importances of hypotheses, modeled by differing penalties for false discoveries; (c) multiple arbitrary partitions of the hypotheses into (possibly overlapping) groups and (d) knowledge of independence, positive or arbitrary dependence between hypotheses or groups, suggesting the use of more aggressive or conservative procedures. We present a unified algorithmic framework called p-filter for global null testing and false discovery rate (FDR) control that allows the scientist to incorporate all four types of prior knowledge (a)–(d) simultaneously, recovering a variety of known algorithms as special cases. Full Article
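The simplest member of the family the p-filter framework generalizes is Benjamini-Hochberg with nonuniform prior weights (form (a) in the abstract above): each p-value is divided by its prior weight, with weights averaging one, and ordinary BH then runs on the reweighted values. The sketch below is that special case only, not the full p-filter algorithm, and the p-values and weights are illustrative.

```python
# Prior-weighted Benjamini-Hochberg at level alpha. Returns the
# indices of rejected hypotheses. With uniform weights this reduces
# to plain BH.

def weighted_bh(pvals, weights, alpha):
    n = len(pvals)
    # sort hypotheses by their weighted p-value p_i / w_i
    q = sorted((p / w, i) for i, (p, w) in enumerate(zip(pvals, weights)))
    k = 0
    for rank, (pq, _) in enumerate(q, start=1):
        if pq <= rank * alpha / n:  # step-up threshold
            k = rank
    return sorted(i for _, i in q[:k])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.4, 0.5, 0.6, 0.7]
weights = [1.0] * 10  # uniform weights: plain BH
print(weighted_bh(pvals, weights, alpha=0.05))  # rejects hypotheses 0 and 1
```

Upweighting hypotheses believed to be non-null (larger $w_i$) lowers their effective p-values, which is how prior beliefs of form (a) increase power in this framework.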
ipl Bayes and empirical-Bayes multiplicity adjustment in the variable-selection problem By projecteuclid.org Published On :: Thu, 05 Aug 2010 15:41 EDT James G. Scott, James O. BergerSource: Ann. Statist., Volume 38, Number 5, 2587--2619.Abstract: This paper studies the multiplicity-correction effect of standard Bayesian variable-selection priors in linear regression. Our first goal is to clarify when, and how, multiplicity correction happens automatically in Bayesian analysis, and to distinguish this correction from the Bayesian Ockham’s-razor effect. Our second goal is to contrast empirical-Bayes and fully Bayesian approaches to variable selection through examples, theoretical results and simulations. Considerable differences between the two approaches are found. In particular, we prove a theorem that characterizes a surprising asymptotic discrepancy between fully Bayes and empirical Bayes. This discrepancy arises from a different source than the failure to account for hyperparameter uncertainty in the empirical-Bayes estimate. Indeed, even at the extreme, when the empirical-Bayes estimate converges asymptotically to the true variable-inclusion probability, the potential for a serious difference remains. Full Article
ipl A comparison of principal component methods between multiple phenotype regression and multiple SNP regression in genetic association studies By projecteuclid.org Published On :: Wed, 15 Apr 2020 22:05 EDT Zhonghua Liu, Ian Barnett, Xihong Lin. Source: The Annals of Applied Statistics, Volume 14, Number 1, 433--451.Abstract: Principal component analysis (PCA) is a popular method for dimension reduction in unsupervised multivariate analysis. However, existing ad hoc uses of PCA in both multivariate regression (multiple outcomes) and multiple regression (multiple predictors) lack theoretical justification. The differences in the statistical properties of PCAs in these two regression settings are not well understood. In this paper we provide theoretical results on the power of PCA in genetic association testing in both multiple phenotype and SNP-set settings. The multiple phenotype setting refers to the case when one is interested in studying the association between a single SNP and multiple phenotypes as outcomes. The SNP-set setting refers to the case when one is interested in studying the association between multiple SNPs in a SNP set and a single phenotype as the outcome. We demonstrate analytically that the properties of the PC-based analysis in these two regression settings are substantially different. We show that the lower order PCs, that is, PCs with large eigenvalues, are generally preferred and lead to a higher power in the SNP-set setting, while the higher-order PCs, that is, PCs with small eigenvalues, are generally preferred in the multiple phenotype setting. We also investigate the power of three other popular statistical methods, the Wald test, the variance component test and the minimum $p$-value test, in both multiple phenotype and SNP-set settings. We use theoretical power, simulation studies, and two real data analyses to validate our findings. Full Article
ipl Propensity score weighting for causal inference with multiple treatments By projecteuclid.org Published On :: Wed, 27 Nov 2019 22:01 EST Fan Li, Fan Li. Source: The Annals of Applied Statistics, Volume 13, Number 4, 2389--2415.Abstract: Causal or unconfounded descriptive comparisons between multiple groups are common in observational studies. Motivated from a racial disparity study in health services research, we propose a unified propensity score weighting framework, the balancing weights, for estimating causal effects with multiple treatments. These weights incorporate the generalized propensity scores to balance the weighted covariate distribution of each treatment group, all weighted toward a common prespecified target population. The class of balancing weights include several existing approaches such as the inverse probability weights and trimming weights as special cases. Within this framework, we propose a set of target estimands based on linear contrasts. We further develop the generalized overlap weights, constructed as the product of the inverse probability weights and the harmonic mean of the generalized propensity scores. The generalized overlap weighting scheme corresponds to the target population with the most overlap in covariates across the multiple treatments. These weights are bounded and thus bypass the problem of extreme propensities. We show that the generalized overlap weights minimize the total asymptotic variance of the moment weighting estimators for the pairwise contrasts within the class of balancing weights. We consider two balance check criteria and propose a new sandwich variance estimator for estimating the causal effects with generalized overlap weights. We apply these methods to study the racial disparities in medical expenditure between several racial groups using the 2009 Medical Expenditure Panel Survey (MEPS) data. Simulations were carried out to compare with existing methods. Full Article
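The generalized overlap weight described in the abstract above has a direct closed form: the inverse probability weight for the received treatment times the harmonic mean of all generalized propensity scores at that covariate value. The sketch below computes that quantity from given propensity scores; the scores are hypothetical inputs, and fitting them (e.g. by multinomial regression) is not shown.

```python
# Generalized overlap weight for multiple treatments:
# w_j(x) = (1 / e_j(x)) * harmonic_mean(e_1(x), ..., e_K(x)).

def generalized_overlap_weight(propensities, received):
    """propensities: scores e_1..e_K summing to 1; received: index j."""
    K = len(propensities)
    harmonic_mean = K / sum(1.0 / e for e in propensities)
    return harmonic_mean / propensities[received]

# Binary sanity check: overlap weights should be proportional to
# 1 - e for the treated arm and e for the control arm.
w_treated = generalized_overlap_weight([0.2, 0.8], received=0)
w_control = generalized_overlap_weight([0.2, 0.8], received=1)
print(w_treated, w_control)  # ratio 4:1, matching (1-e)/e = 0.8/0.2
```

Because the harmonic mean vanishes whenever any $e_k(x)$ approaches zero, units with extreme propensities get small weights, which is the boundedness property the abstract highlights.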
ipl Estimating abundance from multiple sampling capture-recapture data via a multi-state multi-period stopover model By projecteuclid.org Published On :: Wed, 27 Nov 2019 22:01 EST Hannah Worthington, Rachel McCrea, Ruth King, Richard Griffiths. Source: The Annals of Applied Statistics, Volume 13, Number 4, 2043--2064.Abstract: Capture-recapture studies often involve collecting data on numerous capture occasions over a relatively short period of time. For many study species this process is repeated, for example, annually, resulting in capture information spanning multiple sampling periods. To account for the different temporal scales, the robust design class of models has traditionally been applied, providing a framework in which to analyse all of the available capture data in a single likelihood expression. However, these models typically require strong constraints, either the assumption of closure within a sampling period (the closed robust design) or conditioning on the number of individuals captured within a sampling period (the open robust design). For real datasets these assumptions may not be appropriate. We develop a general modelling structure that requires neither assumption by explicitly modelling the movement of individuals into the population both within and between the sampling periods, which in turn permits the estimation of abundance within a single consistent framework. The flexibility of the novel model structure is further demonstrated by including the computationally challenging case of multi-state data where there is individual time-varying discrete covariate information. We derive an efficient likelihood expression for the new multi-state multi-period stopover model using the hidden Markov model framework. We demonstrate the significant improvement in parameter estimation using our new modelling approach in terms of both the multi-period and multi-state components through both a simulation study and a real dataset relating to the protected species of great crested newts, Triturus cristatus. Full Article
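The hidden Markov machinery behind such a stopover likelihood can be sketched with a toy three-state forward pass; the states, parameter values, and capture history below are invented for illustration and are not the newt analysis:

```python
import numpy as np

# States: 0 = not yet arrived, 1 = present, 2 = departed.
# Only "present" individuals can be captured.
init = np.array([0.6, 0.4, 0.0])          # initial state distribution
trans = np.array([[0.5, 0.5, 0.0],        # arrival/departure dynamics
                  [0.0, 0.7, 0.3],
                  [0.0, 0.0, 1.0]])
p_cap = np.array([0.0, 0.8, 0.0])         # capture probability by state

history = [1, 0, 1]                       # capture history "101"

# Standard HMM forward recursion for the likelihood of the history.
alpha = init * np.where(history[0], p_cap, 1 - p_cap)
for obs in history[1:]:
    alpha = (alpha @ trans) * np.where(obs, p_cap, 1 - p_cap)
likelihood = alpha.sum()
print(likelihood)
```

Summing the forward variable over states gives the capture-history likelihood; the paper's model extends this recursion across states, covariates, and multiple sampling periods.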
ipl Radio-iBAG: Radiomics-based integrative Bayesian analysis of multiplatform genomic data By projecteuclid.org Published On :: Wed, 16 Oct 2019 22:03 EDT Youyi Zhang, Jeffrey S. Morris, Shivali Narang Aerry, Arvind U. K. Rao, Veerabhadran Baladandayuthapani. Source: The Annals of Applied Statistics, Volume 13, Number 3, 1957--1988.Abstract: Technological innovations have produced large multi-modal datasets that include imaging and multi-platform genomics data. Integrative analyses of such data have the potential to reveal important biological and clinical insights into complex diseases like cancer. In this paper, we present Bayesian approaches for integrative analysis of radiological imaging and multi-platform genomic data, wherein our goals are to simultaneously identify genomic and radiomic (that is, radiology-based imaging) markers, along with the latent associations between these two modalities, and to detect the overall prognostic relevance of the combined markers. For this task, we propose Radio-iBAG: Radiomics-based Integrative Bayesian Analysis of Multiplatform Genomic Data, a multi-scale Bayesian hierarchical model that involves several innovative strategies: it incorporates integrative analysis of multi-platform genomic data sets to capture fundamental biological relationships; explores the associations between radiomic markers accompanying genomic information with clinical outcomes; and detects genomic and radiomic markers associated with clinical prognosis. We also introduce the use of sparse Principal Component Analysis (sPCA) to extract a sparse set of approximately orthogonal meta-features each containing information from a set of related individual radiomic features, reducing dimensionality and combining like features. Our methods are motivated by and applied to The Cancer Genome Atlas glioblastoma multiforme data set, wherein we integrate magnetic resonance imaging-based biomarkers along with genomic, epigenomic and transcriptomic data. Our model identifies important magnetic resonance imaging features and the associated genomic platforms that are related to patient survival times. Full Article
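The sPCA step can be approximated with a soft-thresholded power iteration. This is a rough stand-in to convey the idea of sparse, block-concentrated loadings; the paper's actual sPCA formulation and the feature blocks below are assumptions:

```python
import numpy as np

def sparse_pc(X, l1=0.5, iters=200):
    """First sparse principal loading via soft-thresholded power iteration.

    l1 controls sparsity: covariance directions whose contribution falls
    below the threshold are zeroed out of the loading vector.
    """
    S = X.T @ X / len(X)
    v = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(iters):
        u = S @ v
        u = np.sign(u) * np.maximum(np.abs(u) - l1, 0.0)  # soft-threshold
        norm = np.linalg.norm(u)
        if norm == 0:
            break
        v = u / norm
    return v

rng = np.random.default_rng(2)
# One block of four related radiomic-style features plus four noise features.
signal = rng.normal(size=(100, 1))
X = np.hstack([signal + 0.1 * rng.normal(size=(100, 4)),
               rng.normal(size=(100, 4))])
v = sparse_pc(X - X.mean(axis=0))
print(v)   # loading concentrates on the related block, zeros elsewhere
```

The resulting meta-feature combines the four correlated features while excluding the unrelated ones, which is the dimension-reduction behaviour the abstract describes.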
ipl Bayesian methods for multiple mediators: Relating principal stratification and causal mediation in the analysis of power plant emission controls By projecteuclid.org Published On :: Wed, 16 Oct 2019 22:03 EDT Chanmin Kim, Michael J. Daniels, Joseph W. Hogan, Christine Choirat, Corwin M. Zigler. Source: The Annals of Applied Statistics, Volume 13, Number 3, 1927--1956.Abstract: Emission control technologies installed on power plants are a key feature of many air pollution regulations in the US. While such regulations are predicated on the presumed relationships between emissions, ambient air pollution and human health, many of these relationships have never been empirically verified. The goal of this paper is to develop new statistical methods to quantify these relationships. We frame this problem as one of mediation analysis to evaluate the extent to which the effect of a particular control technology on ambient pollution is mediated through causal effects on power plant emissions. Since power plants emit various compounds that contribute to ambient pollution, we develop new methods for multiple intermediate variables that are measured contemporaneously, may interact with one another, and may exhibit joint mediating effects. Specifically, we propose new methods leveraging two related frameworks for causal inference in the presence of mediating variables: principal stratification and causal mediation analysis. We define principal effects based on multiple mediators, and also introduce a new decomposition of the total effect of an intervention on ambient pollution into the natural direct effect and natural indirect effects for all combinations of mediators. Both approaches are anchored to the same observed-data models, which we specify with Bayesian nonparametric techniques. We provide assumptions for estimating principal causal effects, then augment these with an additional assumption required for causal mediation analysis. The two analyses, interpreted in tandem, provide the first empirical investigation of the presumed causal pathways that motivate important air quality regulatory policies. Full Article
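The bookkeeping of such a total-effect decomposition is easy to see in a linear toy model with two mediators. The coefficients below are hypothetical, not estimates from the power plant study, and the linearity is a simplification of the paper's Bayesian nonparametric specification:

```python
# Toy linear mediation arithmetic: control technology A affects two
# emissions mediators M1 and M2, which in turn affect ambient pollution.
a1, a2 = -2.0, -1.0    # hypothetical effect of A on each mediator
b1, b2 = 0.5, 0.3      # hypothetical effect of each mediator on outcome
direct = -0.2          # effect of A on outcome not through M1 or M2

nie_m1 = a1 * b1       # natural indirect effect through M1
nie_m2 = a2 * b2       # natural indirect effect through M2
total = direct + nie_m1 + nie_m2

print(nie_m1, nie_m2, total)   # -1.0 -0.3 -1.5
```

In the linear case the total effect splits exactly into the natural direct effect and the indirect effects through each mediator; the paper's decomposition generalizes this to interacting mediators and all mediator combinations.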
ipl Identifying multiple changes for a functional data sequence with application to freeway traffic segmentation By projecteuclid.org Published On :: Wed, 16 Oct 2019 22:03 EDT Jeng-Min Chiou, Yu-Ting Chen, Tailen Hsing. Source: The Annals of Applied Statistics, Volume 13, Number 3, 1430--1463.Abstract: Motivated by the study of road segmentation partitioned by shifts in traffic conditions along a freeway, we introduce a two-stage procedure, Dynamic Segmentation and Backward Elimination (DSBE), for identifying multiple changes in the mean functions for a sequence of functional data. The Dynamic Segmentation procedure searches for all possible changepoints using the derived global optimality criterion coupled with the local strategy of at-most-one-changepoint by dividing the entire sequence into individual subsequences that are recursively adjusted until convergence. Then, the Backward Elimination procedure verifies these changepoints by iteratively testing the unlikely changes to ensure their significance until no more changepoints can be removed. By combining the local strategy with the global optimal changepoint criterion, the DSBE algorithm is conceptually simple and easy to implement and performs better than the binary segmentation-based approach at detecting small multiple changes. The consistency property of the changepoint estimators and the convergence of the algorithm are proved. We apply DSBE to detect changes in traffic streams through real freeway traffic data. The practical performance of DSBE is also investigated through intensive simulation studies for various scenarios. Full Article
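The local at-most-one-changepoint step can be illustrated with a CUSUM-type scan over a sequence of discretized curves. The scoring criterion below is a simplified stand-in for the paper's global optimality criterion, and the traffic-like data are simulated:

```python
import numpy as np

def one_changepoint(F):
    """At-most-one-changepoint search for a sequence of curves F (n x grid).

    Returns the split index maximizing a CUSUM-type discrepancy between
    the segment mean functions.
    """
    n = len(F)
    best, best_k = -np.inf, None
    for k in range(1, n):
        gap = F[:k].mean(axis=0) - F[k:].mean(axis=0)
        score = (k * (n - k) / n) * np.sum(gap ** 2)
        if score > best:
            best, best_k = score, k
    return best_k

rng = np.random.default_rng(3)
grid = np.linspace(0, 1, 50)
# Twenty curves (think daily traffic-flow profiles) with a mean shift
# after the 12th curve.
F = rng.normal(scale=0.2, size=(20, 50)) + np.sin(2 * np.pi * grid)
F[12:] += 1.0

cp = one_changepoint(F)
print(cp)   # recovers the split at index 12
```

DSBE applies this kind of local search recursively over subsequences and then verifies the candidate changepoints by backward elimination.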
ipl Frequency domain theory for functional time series: Variance decomposition and an invariance principle By projecteuclid.org Published On :: Mon, 27 Apr 2020 04:02 EDT Piotr Kokoszka, Neda Mohammadi Jouzdani. Source: Bernoulli, Volume 26, Number 3, 2383--2399.Abstract: This paper is concerned with frequency domain theory for functional time series, which are temporally dependent sequences of functions in a Hilbert space. We consider a variance decomposition, which is more suitable for such a data structure than the variance decomposition based on the Karhunen–Loève expansion. The decomposition we study uses eigenvalues of spectral density operators, which are functional analogs of the spectral density of a stationary scalar time series. We propose estimators of the variance components and derive convergence rates for their mean square error as well as their asymptotic normality. The latter is derived from a frequency domain invariance principle for the estimators of the spectral density operators. This principle is established for a broad class of linear time series models. It is a main contribution of the paper. Full Article
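The scalar analogue of such a frequency-domain variance decomposition is Parseval's identity: the sample variance of a centered series equals its periodogram summed over frequencies. A few lines of NumPy verify this (the simulated series is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 512
x = rng.normal(size=n)
x = np.convolve(x, [1.0, 0.5], mode="same")   # a weakly dependent series
x -= x.mean()                                  # center the series

# Periodogram ordinates |X_k|^2 / n; by Parseval's identity their
# average recovers the sample variance exactly.
per = np.abs(np.fft.fft(x)) ** 2 / n
print(per.sum() / n, x.var())
```

The functional version replaces periodogram ordinates with eigenvalues of spectral density operators, frequency by frequency.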
ipl A unified principled framework for resampling based on pseudo-populations: Asymptotic theory By projecteuclid.org Published On :: Fri, 31 Jan 2020 04:06 EST Pier Luigi Conti, Daniela Marella, Fulvia Mecatti, Federico Andreis. Source: Bernoulli, Volume 26, Number 2, 1044--1069.Abstract: In this paper, a class of resampling techniques for finite populations under $\pi$ps sampling design is introduced. The basic idea on which they rest is a two-step procedure consisting in: (i) constructing a “pseudo-population” on the basis of sample data; (ii) drawing a sample from the predicted population according to an appropriate resampling design. From a logical point of view, this approach is essentially based on the plug-in principle by Efron, at the “sampling design level”. Theoretical justifications based on large sample theory are provided. New approaches to construct pseudo-populations based on various forms of calibration are proposed. Finally, a simulation study is performed. Full Article
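The two-step procedure can be sketched directly: replicate each sampled unit roughly $1/pi_i$ times to form a pseudo-population, then redraw resamples from it under a $\pi$ps-style design. The construction below is a naive, uncalibrated illustration, not the paper's calibrated variants:

```python
import numpy as np

rng = np.random.default_rng(5)

# A sample of four units with inclusion probabilities pi_i and values y_i.
pi = np.array([0.5, 0.25, 0.2, 0.1])
y = np.array([3.0, 7.0, 5.0, 9.0])

# Step (i): pseudo-population by replicating unit i round(1/pi_i) times.
reps = np.round(1.0 / pi).astype(int)     # 2, 4, 5, 10 copies
pseudo_y = np.repeat(y, reps)
pseudo_pi = np.repeat(pi, reps)

# Step (ii): redraw without-replacement resamples with probability
# proportional to pi and recompute the Horvitz-Thompson total, giving a
# Monte Carlo approximation to the estimator's sampling distribution.
def resample_total(m=4):
    idx = rng.choice(len(pseudo_y), size=m, replace=False,
                     p=pseudo_pi / pseudo_pi.sum())
    return np.sum(pseudo_y[idx] / pseudo_pi[idx])

totals = np.array([resample_total() for _ in range(1000)])
print(totals.mean(), totals.std())
```

This is the plug-in principle applied "at the sampling design level": the pseudo-population plays the role of the empirical distribution in the classical bootstrap.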
ipl Semiparametric Multivariate and Multiple Change-Point Modeling By projecteuclid.org Published On :: Tue, 11 Jun 2019 04:00 EDT Stefano Peluso, Siddhartha Chib, Antonietta Mira. Source: Bayesian Analysis, Volume 14, Number 3, 727--751.Abstract: We develop a general Bayesian semiparametric change-point model in which separate groups of structural parameters (for example, location and dispersion parameters) can each follow a separate multiple change-point process, driven by time-dependent transition matrices among the latent regimes. The distribution of the observations within regimes is unknown and given by a Dirichlet process mixture prior. The properties of the proposed model are studied theoretically through the analysis of inter-arrival times and of the number of change-points in a given time interval. The prior-posterior analysis by Markov chain Monte Carlo techniques is developed on a forward-backward algorithm for sampling the various regime indicators. Analysis with simulated data under various scenarios and an application to short-term interest rates are used to show the generality and usefulness of the proposed model. Full Article
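The within-regime observation model, a Dirichlet process mixture, can be sketched with a truncated stick-breaking draw from the prior. The truncation level, concentration parameter, and normal base measure below are arbitrary illustrative choices:

```python
import numpy as np

def dp_mixture_draw(alpha=1.0, n_atoms=50, rng=None):
    """Truncated stick-breaking draw from a Dirichlet process prior.

    Returns mixture weights and atom locations; the change-point
    machinery of the model sits on top of draws like this, one
    unknown distribution per latent regime.
    """
    if rng is None:
        rng = np.random.default_rng(6)
    betas = rng.beta(1.0, alpha, size=n_atoms)   # stick-breaking fractions
    weights = betas * np.concatenate([[1.0], np.cumprod(1 - betas)[:-1]])
    atoms = rng.normal(0.0, 2.0, size=n_atoms)   # N(0, 2^2) base measure
    return weights, atoms

w, mu = dp_mixture_draw()
print(w.sum())   # weights of the truncated draw sum to nearly one
```

Each draw is a discrete random distribution; placing one inside each regime is what frees the model from parametric assumptions about the within-regime observation distribution.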
ipl A Bayesian Nonparametric Multiple Testing Procedure for Comparing Several Treatments Against a Control By projecteuclid.org Published On :: Fri, 31 May 2019 22:05 EDT Luis Gutiérrez, Andrés F. Barrientos, Jorge González, Daniel Taylor-Rodríguez. Source: Bayesian Analysis, Volume 14, Number 2, 649--675.Abstract: We propose a Bayesian nonparametric strategy to test for differences between a control group and several treatment regimes. Most of the existing tests for this type of comparison are based on the differences between location parameters. In contrast, our approach identifies differences across the entire distribution, avoids strong modeling assumptions over the distributions for each treatment, and accounts for multiple testing through the prior distribution on the space of hypotheses. The proposal is compared to other commonly used hypothesis testing procedures under simulated scenarios. Two real applications are also analyzed with the proposed methodology. Full Article
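The contrast with location-based tests can be illustrated outside the Bayesian machinery: a whole-distribution statistic detects a pure spread difference that a mean comparison would miss. The sketch below is a frequentist permutation stand-in with simulated data, not the paper's nonparametric prior construction:

```python
import numpy as np

def ks_stat(a, b):
    """Maximum gap between two empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    Fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    Fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.abs(Fa - Fb).max()

rng = np.random.default_rng(7)
control = rng.normal(0, 1, 200)
treat = rng.normal(0, 2.5, 200)   # same mean as control, larger spread

obs = ks_stat(control, treat)

# Permutation null: shuffle group labels and recompute the statistic.
pooled = np.concatenate([control, treat])
perm = []
for _ in range(500):
    rng.shuffle(pooled)
    perm.append(ks_stat(pooled[:200], pooled[200:]))
pval = np.mean(np.array(perm) >= obs)
print(pval)   # small: the spread difference is detected
```

A test on means alone would find nothing here, which is the motivation for comparing entire distributions across the treatment regimes.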