
ADA technical report on age assessment by dental analysis available for review

The ADA Standards Committee on Dental Informatics has approved the technical report for circulation and comment.





May JADA discusses calcium hydroxide overfill risk during root canals

Overfill of medication or obturation materials in endodontic treatment can cause permanent neurologic injury, and there are steps clinicians can take to help prevent that, according to an article published in the May issue of The Journal of the American Dental Association.





Analysis suggests states need to plan for second wave of COVID-19

Leaders across the United States should plan for a worst-case scenario: a second wave of the COVID-19 pandemic with no vaccine availability or herd immunity, say experts at the University of Minnesota.





U.S. ICUs could still be overwhelmed by COVID-19 patients, analysis says

Communities across the U.S. still need to expand hospital capacity to manage new COVID-19 cases, even as some states loosen social distancing restrictions, a study published Wednesday by JAMA Network Open says.





Continuous Glucose Monitoring in Pregnancy: Importance of Analysing Temporal Profiles to Understand Clinical Outcomes

OBJECTIVE

To determine if temporal glucose profiles differed between 1) women who were randomized to real-time continuous glucose monitoring (RT-CGM) or self-monitored blood glucose (SMBG), 2) women who used insulin pumps or multiple daily insulin injections (MDIs), and 3) women whose infants were born large for gestational age (LGA) or not, by assessing CGM data obtained from the Continuous Glucose Monitoring in Women With Type 1 Diabetes in Pregnancy Trial (CONCEPTT).

RESEARCH DESIGN AND METHODS

Standard summary metrics and functional data analysis (FDA) were applied to CGM data from the CONCEPTT trial (RT-CGM, n = 100; SMBG, n = 100) taken at baseline and at 24 and 34 weeks of gestation. Multivariable regression analysis determined whether temporal differences in 24-h glucose profiles occurred between comparators in each of the three groups.
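
The idea behind the FDA comparison can be sketched numerically: treat each woman's 24-h profile as a curve sampled every 5 min and compare the groups pointwise in time, rather than through a single summary number. The data, group sizes, and the daytime offset below are entirely synthetic, chosen only to illustrate the computation.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(288) * 5 / 60.0                    # time of day in hours

def simulate_group(n, daytime_shift):
    """Simulate n smooth 24-h glucose curves (mmol/L); purely synthetic."""
    base = 6.0 + 1.0 * np.sin(2 * np.pi * (t - 8) / 24)   # diurnal pattern
    curves = base + daytime_shift * ((t > 8) & (t < 19))  # daytime offset
    return curves + rng.normal(0, 0.3, size=(n, t.size))

smbg = simulate_group(100, daytime_shift=0.6)    # higher daytime glucose
rtcgm = simulate_group(100, daytime_shift=0.0)

# Pointwise mean difference across the day: the "temporal profile" that a
# single summary metric (e.g. overall mean glucose) would hide
diff = smbg.mean(axis=0) - rtcgm.mean(axis=0)
hours_higher = 5 * np.sum(diff > 0.4) / 60       # h/day exceeding 0.4 mmol/L
print(f"SMBG exceeds RT-CGM by >0.4 mmol/L for {hours_higher:.1f} h/day")
```

A full FDA would smooth the curves with a basis expansion and attach pointwise confidence bands, but the group-by-time comparison is the essential step.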

RESULTS

FDA revealed that women using RT-CGM had significantly lower glucose (0.4–0.8 mmol/L [7–14 mg/dL]) for 7 h/day (0800 h–1200 h and 1600 h–1900 h) compared with those using SMBG. Women using pumps had significantly higher glucose (0.4–0.9 mmol/L [7–16 mg/dL]) for 12 h/day (0300 h to 0600 h, 1300 h to 1800 h, and 2030 h to 0030 h) at 24 weeks, with no difference at 34 weeks, compared with those using MDIs. Women who had an LGA infant had significantly higher glucose, by 0.4–0.7 mmol/L (7–13 mg/dL) for 4.5 h/day at baseline; by 0.4–0.9 mmol/L (7–16 mg/dL) for 16 h/day at 24 weeks; and by 0.4–0.7 mmol/L (7–13 mg/dL) for 14 h/day at 34 weeks.

CONCLUSIONS

FDA of temporal glucose profiles gives important information about differences in glucose control and its timing, which are undetectable by standard summary metrics. Women using RT-CGM were able to achieve better daytime glucose control, reducing fetal exposure to maternal glucose.





Use of Antihyperglycemic Medications in U.S. Adults: An Analysis of the National Health and Nutrition Examination Survey

OBJECTIVE

1) To examine trends in the use of diabetes medications and 2) to determine whether physicians individualize diabetes treatment as recommended by the American Diabetes Association (ADA).

RESEARCH DESIGN AND METHODS

We conducted a retrospective, cross-sectional analysis of 2003–2016 National Health and Nutrition Examination Survey (NHANES) data. We included people aged ≥18 years who had ever been told they had diabetes, had an HbA1c >6.4%, or had a fasting plasma glucose >125 mg/dL. Pregnant women and those aged <20 years receiving only insulin were excluded. We assessed trends in use of the ADA's seven preferred classes from 2003–2004 to 2015–2016. We also examined use by hypoglycemia risk (sulfonylureas, insulin, and meglitinides), weight effect (sulfonylureas, thiazolidinediones [TZDs], insulin, and meglitinides), cardiovascular benefit (canagliflozin, empagliflozin, and liraglutide), and cost (brand-name medications and insulin analogs).

RESULTS

The final sample included 6,323 patients. The proportion taking any medication increased from 58% in 2003–2004 to 67% in 2015–2016 (P < 0.001). Use of metformin and insulin analogs increased, while use of sulfonylureas, TZDs, and human insulin decreased. Following the 2012 ADA recommendation, the choice of drug did not vary significantly by older age, weight, or presence of cardiovascular disease. Patients with low HbA1c (<6%) and age ≥65 years were less likely to receive hypoglycemia-inducing medications, while older patients with comorbidities were more likely to receive them. Insurance, but not income, was associated with the use of higher-cost medications.

CONCLUSIONS

Following ADA recommendations, the use of metformin increased, but physicians generally did not individualize treatment according to patients’ characteristics. Substantial opportunities exist to improve pharmacologic management of diabetes.





Acrylamide Exposure and Oxidative DNA Damage, Lipid Peroxidation, and Fasting Plasma Glucose Alteration: Association and Mediation Analyses in Chinese Urban Adults

OBJECTIVE

Acrylamide exposure from daily-consumed food has raised global concern. We aimed to assess the exposure-response relationships of internal acrylamide exposure with oxidative DNA damage, lipid peroxidation, and fasting plasma glucose (FPG) alteration and investigate the mediating role of oxidative DNA damage and lipid peroxidation in the association of internal acrylamide exposure with FPG.

RESEARCH DESIGN AND METHODS

FPG and urinary biomarkers of oxidative DNA damage (8-hydroxy-deoxyguanosine [8-OHdG]), lipid peroxidation (8-iso-prostaglandin-F2α [8-iso-PGF2α]), and acrylamide exposure (N-acetyl-S-[2-carbamoylethyl]-l-cysteine [AAMA], N-acetyl-S-[2-carbamoyl-2-hydroxyethyl]-l-cysteine [GAMA]) were measured for 3,270 general adults from the Wuhan-Zhuhai cohort. The associations of urinary acrylamide metabolites with 8-OHdG, 8-iso-PGF2α, and FPG were assessed by linear mixed models. The mediating roles of 8-OHdG and 8-iso-PGF2α were evaluated by mediation analysis.
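
The product-of-coefficients logic behind such a mediation analysis can be sketched on simulated data; the coefficients below are invented, and the variable names merely echo the abstract (AAMA exposure, 8-iso-PGF2α mediator, FPG outcome).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3000

# Synthetic exposure -> mediator -> outcome chain (all effects hypothetical)
aama = rng.normal(0, 1, n)                            # log-transformed exposure
pgf2a = 0.4 * aama + rng.normal(0, 1, n)              # mediator depends on exposure
fpg = 0.1 * aama + 0.3 * pgf2a + rng.normal(0, 1, n)  # outcome: direct + indirect paths

# Path a: exposure -> mediator
a = np.polyfit(aama, pgf2a, 1)[0]
# Path b: mediator -> outcome, adjusting for exposure (direct effect alongside)
X = np.column_stack([np.ones(n), aama, pgf2a])
beta = np.linalg.lstsq(X, fpg, rcond=None)[0]
b = beta[2]

total = np.polyfit(aama, fpg, 1)[0]                   # total effect c
prop_mediated = a * b / total                         # indirect / total
print(f"proportion mediated: {prop_mediated:.0%}")
```

With the planted effects, the indirect path (0.4 × 0.3) accounts for roughly half of the total effect, which is the kind of percentage the abstract reports.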

RESULTS

We found significant linear positive dose-response relationships of urinary acrylamide metabolites with 8-OHdG, 8-iso-PGF2α, and FPG (except GAMA with FPG) and 8-iso-PGF2α with FPG. Each 1-unit increase in log-transformed level of AAMA, AAMA + GAMA (UAAM), or 8-iso-PGF2α was associated with a 0.17, 0.15, or 0.23 mmol/L increase in FPG, respectively (P and/or P trend < 0.05). Each 1% increase in AAMA, GAMA, or UAAM was associated with a 0.19%, 0.27%, or 0.22% increase in 8-OHdG, respectively, and a 0.40%, 0.48%, or 0.44% increase in 8-iso-PGF2α, respectively (P and P trend < 0.05). Increased 8-iso-PGF2α rather than 8-OHdG significantly mediated 64.29% and 76.92% of the AAMA- and UAAM-associated FPG increases, respectively.

CONCLUSIONS

Exposure of the general adult population to acrylamide was associated with FPG elevation, oxidative DNA damage, and lipid peroxidation, which in turn partly mediated acrylamide-associated FPG elevation.





Strict Preanalytical Oral Glucose Tolerance Test Blood Sample Handling Is Essential for Diagnosing Gestational Diabetes Mellitus

OBJECTIVE

Preanalytical processing of blood samples can affect plasma glucose measurement because ongoing glycolysis by blood cells prior to centrifugation lowers the glucose concentration. In June 2017, ACT Pathology changed the processing of oral glucose tolerance test (OGTT) blood samples for pregnant women from a delayed to an early centrifugation protocol. The effect of this change on the rate of gestational diabetes mellitus (GDM) diagnosis was determined.

RESEARCH DESIGN AND METHODS

All pregnant women in the Australian Capital Territory (ACT) are recommended for GDM testing with a 75-g OGTT using the World Health Organization diagnostic criteria. From January 2015 to May 2017, OGTT samples were collected into sodium fluoride (NaF) tubes and kept at room temperature until completion of the test (delayed centrifugation). From June 2017 to October 2018, OGTT samples in NaF tubes were centrifuged within 10 min (early centrifugation).
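
A minimal sketch of how a small preanalytical glucose shift flips borderline diagnoses under the WHO 75-g OGTT criteria for GDM (fasting ≥5.1, 1-h ≥10.0, 2-h ≥8.5 mmol/L); the patient values are hypothetical, and the per-sample shifts are the mean differences the study reports.

```python
# WHO 75-g OGTT thresholds for GDM, in mmol/L
WHO_THRESHOLDS = {"fasting": 5.1, "1h": 10.0, "2h": 8.5}

def is_gdm(ogtt):
    """GDM is diagnosed if any of the three samples meets its threshold."""
    return any(ogtt[k] >= v for k, v in WHO_THRESHOLDS.items())

# A borderline patient measured under delayed centrifugation...
delayed = {"fasting": 4.9, "1h": 9.8, "2h": 8.4}
# ...and the same draws with the early-centrifugation mean shifts applied
shift = {"fasting": 0.24, "1h": 0.34, "2h": 0.16}
early = {k: delayed[k] + shift[k] for k in delayed}

print(is_gdm(delayed), is_gdm(early))  # the borderline case flips to GDM
```

Because many women sit just below a threshold, shifts of 0.16–0.34 mmol/L are enough to move the population diagnosis rate substantially.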

RESULTS

A total of 7,509 women were tested with the delayed centrifugation protocol and 4,808 with the early centrifugation protocol. The mean glucose concentrations for the fasting, 1-h, and 2-h OGTT samples were, respectively, 0.24 mmol/L (5.4%), 0.34 mmol/L (4.9%), and 0.16 mmol/L (2.3%) higher using the early centrifugation protocol (P < 0.0001 for all), increasing the GDM diagnosis rate from 11.6% (n = 869/7,509) to 20.6% (n = 1,007/4,887).

CONCLUSIONS

The findings of this study highlight the critical importance of the preanalytical processing protocol for OGTT blood samples used in diagnosing GDM. Delaying centrifugation of blood collected into NaF tubes results in substantially lower diagnosis rates than centrifuging the blood early.





Cardiovascular Risk Reduction With Liraglutide: An Exploratory Mediation Analysis of the LEADER Trial

OBJECTIVE

The LEADER trial (ClinicalTrials.gov reg. no. NCT01179048) demonstrated a reduced risk of cardiovascular (CV) events for patients with type 2 diabetes who received the glucagon-like peptide 1 receptor agonist liraglutide versus placebo. The mechanisms behind this CV benefit remain unclear. We aimed to identify potential mediators for the CV benefit observed with liraglutide in the LEADER trial.

RESEARCH DESIGN AND METHODS

We performed exploratory analyses to identify potential mediators of the effect of liraglutide on major adverse CV events (MACE; composite of CV death, nonfatal myocardial infarction, or nonfatal stroke) from the following candidates: glycated hemoglobin (HbA1c), body weight, urinary albumin-to-creatinine ratio (UACR), confirmed hypoglycemia, sulfonylurea use, insulin use, systolic blood pressure, and LDL cholesterol. These candidates were selected as CV risk factors on which liraglutide had an effect in LEADER such that a reduction in CV risk might result. We used two methods based on a Cox proportional hazards model and the new Vansteelandt method designed to use all available information from the mediator and to control for confounding factors.
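
The "proportion of treatment effect explained" idea behind the Cox-based approach can be illustrated with made-up numbers: refit the treatment effect with the candidate mediator added as a covariate and compare the log hazard ratios. The HR values below are hypothetical, not LEADER results.

```python
import math

# Hypothetical hazard ratios for treatment vs placebo on MACE
hr_unadjusted = 0.87   # model without the candidate mediator
hr_adjusted = 0.92     # same model with the mediator (e.g. HbA1c) added

# Proportion of the log-hazard treatment effect absorbed by the mediator
prop_mediated = 1 - math.log(hr_adjusted) / math.log(hr_unadjusted)
print(f"proportion of effect mediated: {prop_mediated:.0%}")
```

The Vansteelandt method differs in that it models the mediator's full trajectory and adjusts for confounding, which is why the two methods can give different mediation percentages for the same candidate.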

RESULTS

Analyses using the Cox methods and Vansteelandt method indicated potential mediation by HbA1c (up to 41% and 83% mediation, respectively) and UACR (up to 29% and 33% mediation, respectively) on the effect of liraglutide on MACE. Mediation effects were small for other candidates.

CONCLUSIONS

These analyses identify HbA1c and, to a lesser extent, UACR as potential mediators of the CV effects of liraglutide. Whether either is a marker of an unmeasured factor or a true mediator remains a key question that invites further investigation.






A Needed Evidence Revolution: Using Cost-Benefit Analysis to Improve Refugee Integration Programming

European countries have ramped up their investments in helping refugees find work and integrate into society. Yet little hard evidence exists of which programs and policies work best. This report proposes a new framework for thinking smartly about integration programming, using cost-benefit analysis to look beyond short-term economic outcomes and also measure indirect benefits through a social-value concept.





Two- and three-color STORM analysis reveals higher-order assembly of leukotriene synthetic complexes on the nuclear envelope of murine neutrophils [Computational Biology]

Over the last several years it has become clear that higher order assemblies on membranes, exemplified by signalosomes, are a paradigm for the regulation of many membrane signaling processes. We recently combined two-color direct stochastic optical reconstruction microscopy (dSTORM) with the Clus-DoC algorithm, which combines cluster detection and colocalization analysis, to observe the organization of 5-lipoxygenase (5-LO) and 5-lipoxygenase–activating protein (FLAP) into higher order assemblies on the nuclear envelope of mast cells; these assemblies were linked to leukotriene (LT) C4 production. In this study we investigated whether higher order assemblies of 5-LO and FLAP included cytosolic phospholipase A2 (cPLA2) and were linked to LTB4 production in murine neutrophils. Using two- and three-color dSTORM supported by fluorescence lifetime imaging microscopy, we identified higher order assemblies containing a median of 40 (IQR: 23, 87) molecules of 5-LO and 53 (62, 156) molecules of FLAP monomer. A median of 98 (18, 154) molecules of cPLA2 were clustered with 5-LO, and 77 (33, 114) molecules of cPLA2 were associated with FLAP. These assemblies were tightly linked to LTB4 formation. The activation-dependent close associations of cPLA2, FLAP, and 5-LO in higher order assemblies on the nuclear envelope support a model in which arachidonic acid is generated by cPLA2 in apposition to FLAP, facilitating its transfer to 5-LO to initiate LT synthesis.





Sketch for a Self-Analysis, by Pierre Bourdieu

Pierre Bourdieu, a philosopher by education, and an anthropologist and sociologist by choice, is one of the most esteemed names in twentieth-century French thought. With his election in 1981 to the chair of sociology at the College de France, he joined the distinguished ranks of the most respected French social scientists, Raymond Aron and Claude Levi-Strauss. A prolific writer, Bourdieu published more than 30 books and 340 articles between 1958 and 1995. The Social Science Citation Index ranking from high to low in 1989 for leading French thinkers was the following: Foucault, Bourdieu, Levi-Strauss, Derrida, Althusser, Sartre, Poulantzas, Touraine, Lacan, Baudrillard, and Aron.

Although his subject was mainly Algerian and French society, Bourdieu's approach is useful for analyzing power in ways more illuminating than those offered by Foucault. While Foucault sees power as "ubiquitous" and beyond agency or structure, Bourdieu sees power as economically, culturally, socially and symbolically created, and constantly re-legitimized through an interplay of agency and structure. The main way this comes about is through what he calls "habitus": socialized norms or tendencies that unconsciously guide behavior, choices and thinking. In Bourdieu's stipulation, habitus is "the way society becomes deposited in persons in the form of lasting dispositions, or trained capacities and structured propensities to think, feel and act in determinant ways, which then guide them."
In his Sketch for a Self-Analysis, written shortly before his death in January 2002, Bourdieu offers a "self-socioanalysis" in only 113 pages, providing a compelling narrative of his life and career and insights from his lifelong preoccupation with sociology, including intimate insights into the ideas of Foucault, Sartre, Althusser and de Beauvoir, among others, as well as his reflections on his own formative years at boarding school and his moral outrage at the colonial war in Algeria. Please join us at Brooklyn Book Talk as we explore some of the most stimulating thoughts of one of the greatest sociologists of the twentieth century.





MPI Analysis of All State ESSA Accountability Plans Finds Fractured Picture of Education Policy for English Learners & Differing Approaches

WASHINGTON – Four years after the Every Student Succeeds Act (ESSA) was signed into law, all 50 states, the District of Columbia and Puerto Rico have developed accountability plans that include blueprints for serving English Learners (ELs), as well as measuring these students' progress and being accountable for their outcomes. This marks a significant development, as EL performance was previously not well integrated with the factors that determined whether a school was performing well or poorly.





Efficacy and Safety of Dapagliflozin in the Elderly: Analysis From the DECLARE-TIMI 58 Study

OBJECTIVE

Data regarding the effects of sodium–glucose cotransporter 2 inhibitors in the elderly (age ≥65 years) and very elderly (age ≥75 years) are limited.

RESEARCH DESIGN AND METHODS

The Dapagliflozin Effect on Cardiovascular Events (DECLARE)–TIMI 58 trial assessed cardiac and renal outcomes of dapagliflozin versus placebo in patients with type 2 diabetes. Efficacy and safety outcomes were studied within age subgroups for treatment effect and age-based treatment interaction.

RESULTS

Of the 17,160 patients, 9,253 were <65 years of age, 6,811 ≥65 to <75 years, and 1,096 ≥75 years. Dapagliflozin reduced the composite of cardiovascular death or hospitalization for heart failure consistently, with a hazard ratio (HR) of 0.88 (95% CI 0.72, 1.07), 0.77 (0.63, 0.94), and 0.94 (0.65, 1.36) in age-groups <65, ≥65 to <75, and ≥75 years, respectively (interaction P value 0.5277). Overall, dapagliflozin did not significantly decrease the rates of major adverse cardiovascular events, with HR 0.93 (95% CI 0.81, 1.08), 0.97 (0.83, 1.13), and 0.84 (0.61, 1.15) in age-groups <65, ≥65 to <75, and ≥75 years, respectively (interaction P value 0.7352). The relative risk reduction for the secondary prespecified cardiorenal composite outcome ranged from 18% to 28% in the different age-groups with no heterogeneity. Major hypoglycemia was less frequent with dapagliflozin versus placebo, with HR 0.97 (95% CI 0.58, 1.64), 0.50 (0.29, 0.84), and 0.68 (0.29, 1.57) in age-groups <65, ≥65 to <75, and ≥75 years, respectively (interaction P value 0.2107). Safety outcomes, including fractures, volume depletion, cancer, urinary tract infections, and amputations, were balanced with dapagliflozin versus placebo, and acute kidney injury was reduced, all regardless of age. Genital infections that were serious or led to discontinuation of the study drug and diabetic ketoacidosis were uncommon, yet more frequent with dapagliflozin versus placebo, without heterogeneity (interaction P values 0.1058 and 0.8433, respectively).

CONCLUSIONS

The overall efficacy and safety of dapagliflozin are consistent regardless of age.





PROactive: A Sad Tale of Inappropriate Analysis and Unjustified Interpretation

Jay S. Skyler
Apr 1, 2006; 24:63-65
Commentary





To make terms of compromise a rule of Court or not? That is the question: an analysis of the options available to settle estate matters / presented by Christina Flourentzou, Supreme Court of South Australia.





Fee for service in Indigenous land and sea management : impact assessment and analysis.

This evaluation report identifies experiences, motivations, supporting mechanisms, barriers and impacts of fee-for-service commercial activities undertaken by Indigenous Land and Water Management (ILWM) organisations. The report draws on a literature review, interviews with ILWM organisations, administrative data, in-depth case studies, and online survey data.





Ratings summary - labour market analysis of skilled occupations / Department of Employment, Skills, Small and Family Business.





Dark times : psychoanalytic perspectives on politics, history, and mourning / Jonathan Sklar.

Psychoanalysis.





Des rétrécissements du canal de l'urèthre / par Édouard de Smet.

Bruxelles : H. Manceux, 1880.





Die Entzündung und Verschwärung der Schleimhaut des Verdauungskanales, als selbständige Krankheit, Grundleiden vieler sogenannten Nervenfieber, Schleimfieber, Ruhren u.s.w., und als symptomatische Erscheinung vieler acuten und chronischen K

Berlin : T.C.F. Enslin, 1830.





Die Spectralanalyse / von John Landauer.

Braunschweig : Druck und Verlag, 1896.





Du somnambulisme en général : nature, analogies, signification nosologique et étiologie, avec huit observations de somnambulisme hystérique / par Ernest Chambard.

Paris : O. Doin, 1881.





Éléments d'analyse chimique médicale, appliquée aux recherches cliniques / par le Dr Sonnié-Moret.

Paris : Société d’éditions scientifiques, 1896.





Elements of water bacteriology : with special reference to sanitary water analysis / by Samuel Cate Prescott, and Charles-Edward Amory Winslow.

London : Chapman & Hall, 1908.





Essai analytique sur la non-identité des virus gonorrhoïque et syphilitique : ouvrage couronné le 3 juillet 1810, par la Société de Médecine de Besançon, sur la question suivante: Déterminer par des expériences

Toulon : chez l'auteur, 1812.





Essai de coprologie clinique : de l’exploration fonctionnelle de l’intestin par l’analyse des fèces / par Rene Gaultier.

Paris : J.-B. Baillière, 1905.





Essai de sémiologie urinaire; méthodes d'interprétation de l'analyse urologique. L'urine dans les divers états morbides / par Camille Vieillard ; preface par Albert Robin.

Paris : Société d’éditions scientifiques, 1901.





Essai sur le typhus, ou sur les fièvres dites malignes, putrides, bilieuses, muqueuses, jaune, la peste. Exposition analytique et expérimentale de la nature des fièvres en général ... / par J.F. Hernandez.

Paris : chez Mequignon-Marvis, 1816.





The analysis of cannabinoids in biological fluids / editor, Richard L. Hawks.

Rockville, Maryland : National Institute on Drug Abuse, 1982.





Contemporary research in pain and analgesia, 1983 / editors, Roger M. Brown, Theodore M. Pinkert, Jacqueline P. Ludford.

Rockville, Maryland : National Institute on Drug Abuse, 1983.





Adolescent drug abuse : analyses of treatment research / editors, Elizabeth R. Rahdert, John Grabowski.

Rockville, Maryland : National Institute on Drug Abuse, 1988.





Testing goodness of fit for point processes via topological data analysis

Christophe A. N. Biscio, Nicolas Chenavier, Christian Hirsch, Anne Marie Svane.

Source: Electronic Journal of Statistics, Volume 14, Number 1, 1024--1074.

Abstract:
We introduce tests for the goodness of fit of point patterns via methods from topological data analysis. More precisely, the persistent Betti numbers give rise to a bivariate functional summary statistic for observed point patterns that is asymptotically Gaussian in large observation windows. We analyze the power of tests derived from this statistic on simulated point patterns and compare its performance with global envelope tests. Finally, we apply the tests to a point pattern from an application context in neuroscience. As the main methodological contribution, we derive sufficient conditions for a functional central limit theorem on bounded persistent Betti numbers of point processes with exponential decay of correlations.
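
As a toy illustration of the persistent-Betti-number summary (degree 0 only), the sketch below tracks the number of connected components of a union of balls around a synthetic point pattern as the radius grows, using a small union-find. A real analysis would use a TDA library such as GUDHI and would include degree-1 (loop) features as well.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(0, 1, size=(30, 2))   # synthetic point pattern

def betti0(points, r):
    """Connected components of the union of radius-r balls (union-find)."""
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    for i, j in itertools.combinations(range(len(points)), 2):
        if np.linalg.norm(points[i] - points[j]) <= 2 * r:  # balls overlap
            parent[find(i)] = find(j)
    return len({find(i) for i in range(len(points))})

for r in (0.0, 0.05, 0.2):
    print(f"r={r:.2f}: beta_0 = {betti0(pts, r)}")
```

Sweeping r traces out the persistent Betti-0 curve; the goodness-of-fit test compares such curves, as bivariate functional summaries, between the observed pattern and the null model.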





A Model of Fake Data in Data-driven Analysis

Data-driven analysis has been increasingly used in various decision-making processes. As more sources, including reviews, news, and pictures, can now be used for data analysis, the authenticity of data sources is in doubt. While previous literature attempted to detect fake data piece by piece, in the current work we try to capture the fake data sender's strategic behavior in order to detect the fake data source. Specifically, we model the tension between a data receiver who makes data-driven decisions and a fake data sender who benefits from misleading the receiver. We propose a potentially infinite horizon, continuous time game-theoretic model with asymmetric information to capture the fact that the receiver does not initially know of the existence of fake data and learns about it during the course of the game. We use point processes to model the data traffic, where each piece of data can occur at any discrete moment in a continuous time flow. We fully solve the model and employ numerical examples to illustrate the players' strategies and payoffs. Specifically, our results show that maintaining some suspicion about the data sources and understanding that the sender can be strategic are very helpful to the data receiver. In addition, based on our model, we propose a methodology for detecting fake data that is complementary to previous studies on this topic, which suggested various approaches for analyzing the data piece by piece. We show that, in addition to analyzing each piece of data, understanding a source by looking at its whole history of pushing data can be helpful.
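
The following stylized sketch is not the paper's continuous-time game; it only illustrates the receiver-side learning ingredient. Reducing the point process to per-period Poisson counts, a receiver Bayes-updates its belief that a source is fake from the observed traffic intensity. All rates and counts are invented.

```python
import math

RATE_HONEST, RATE_FAKE = 2.0, 5.0   # hypothetical data-push intensities
belief = 0.1                        # prior probability the source is fake

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

for count in [4, 6, 5, 7]:          # observed pushes per period
    like_fake = poisson_pmf(count, RATE_FAKE)
    like_honest = poisson_pmf(count, RATE_HONEST)
    # Bayes update of the suspicion after each period of traffic
    belief = belief * like_fake / (belief * like_fake + (1 - belief) * like_honest)

print(f"posterior probability the source is fake: {belief:.2f}")
```

Even a mild prior suspicion sharpens quickly once the whole history of pushes, rather than any single piece of data, is taken into account.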





Generalized probabilistic principal component analysis of correlated data

Principal component analysis (PCA) is a well-established tool in machine learning and data processing. The principal axes in PCA were shown to be equivalent to the maximum marginal likelihood estimator of the factor loading matrix in a latent factor model for the observed data, assuming that the latent factors are independently distributed as standard normal distributions. However, the independence assumption may be unrealistic for many scenarios such as modeling multiple time series, spatial processes, and functional data, where the outcomes are correlated. In this paper, we introduce the generalized probabilistic principal component analysis (GPPCA) to study the latent factor model for multiple correlated outcomes, where each factor is modeled by a Gaussian process. Our method generalizes the previous probabilistic formulation of PCA (PPCA) by providing the closed-form maximum marginal likelihood estimator of the factor loadings and other parameters. Based on the explicit expression of the precision matrix in the marginal likelihood that we derived, the number of computational operations is linear in the number of output variables. Furthermore, we also provide the closed-form expression of the marginal likelihood when other covariates are included in the mean structure. We highlight the advantage of GPPCA in terms of practical relevance, estimation accuracy, and computational convenience. Numerical studies of simulated and real data confirm the excellent finite-sample performance of the proposed approach.
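
The classical PPCA estimator that GPPCA generalizes (Tipping and Bishop's closed-form maximum marginal likelihood solution) can be sketched on synthetic data: the loadings come from the top eigenpairs of the sample covariance, and the noise variance is the average of the discarded eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data from the latent factor model X = Z W^T + noise, with
# i.i.d. standard-normal factors Z (the classical PPCA setting)
n, d, q = 2000, 6, 2
W_true = rng.normal(size=(d, q))
Z = rng.normal(size=(n, q))
X = Z @ W_true.T + rng.normal(scale=0.1, size=(n, d))

S = np.cov(X, rowvar=False)                    # sample covariance
evals, evecs = np.linalg.eigh(S)               # ascending order
evals, evecs = evals[::-1], evecs[:, ::-1]     # make it descending

sigma2 = evals[q:].mean()                      # ML noise variance
W_ml = evecs[:, :q] @ np.diag(np.sqrt(evals[:q] - sigma2))  # ML loadings

# The loadings are identified only up to rotation, so compare column spaces
P_true = W_true @ np.linalg.pinv(W_true)       # projector onto span(W_true)
P_ml = W_ml @ np.linalg.pinv(W_ml)
print("noise variance estimate:", sigma2)
print("subspace error:", np.linalg.norm(P_true - P_ml))
```

GPPCA replaces the independent standard-normal factors with Gaussian-process factors while preserving this closed-form structure.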





On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent

Dual first-order methods are essential techniques for large-scale constrained convex optimization. However, when recovering the primal solutions, we need $T(\epsilon^{-2})$ iterations to achieve an $\epsilon$-optimal primal solution when we apply an algorithm to the non-strongly convex dual problem with $T(\epsilon^{-1})$ iterations to achieve an $\epsilon$-optimal dual solution, where $T(x)$ can be $x$ or $\sqrt{x}$. In this paper, we prove that the iteration complexities of the primal solutions and dual solutions have the same $O\left(\frac{1}{\sqrt{\epsilon}}\right)$ order of magnitude for the accelerated randomized dual coordinate ascent. When the dual function further satisfies the quadratic functional growth condition, by restarting the algorithm at any period, we establish the linear iteration complexity for both the primal solutions and dual solutions even if the condition number is unknown. When applied to the regularized empirical risk minimization problem, we prove the iteration complexity of $O\left(n\log n+\sqrt{\frac{n}{\epsilon}}\right)$ in both the primal space and the dual space, where $n$ is the number of samples. Our result removes the $\left(\log \frac{1}{\epsilon}\right)$ factor compared with the methods based on smoothing/regularization or Catalyst reduction. As far as we know, this is the first time that the optimal $O\left(\sqrt{\frac{n}{\epsilon}}\right)$ iteration complexity in the primal space is established for dual coordinate ascent based stochastic algorithms. We also establish the accelerated linear complexity for some problems with nonsmooth loss, e.g., the least absolute deviation and SVM.
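
As a concrete, much-simplified illustration of the primal-dual setting (plain, non-accelerated randomized dual coordinate ascent, not the accelerated method analyzed in the paper), the sketch below runs SDCA on ridge regression, where each dual coordinate update has a closed form and the primal iterate is maintained from the dual variables. All data and parameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

# Ridge regression: min_w (1/n) sum_i (x_i^T w - y_i)^2 / 2 + (lam/2)||w||^2
n, d, lam = 200, 5, 0.1
X = rng.normal(size=(n, d))
w_star = rng.normal(size=d)
y = X @ w_star + rng.normal(scale=0.1, size=n)

alpha = np.zeros(n)          # dual variables, one per sample
w = np.zeros(d)              # primal iterate, w = X^T alpha / (lam * n)
for _ in range(20 * n):      # ~20 passes over the data in expectation
    i = rng.integers(n)
    # Closed-form SDCA coordinate step for the squared loss
    delta = (y[i] - X[i] @ w - alpha[i]) / (1 + X[i] @ X[i] / (lam * n))
    alpha[i] += delta
    w += delta * X[i] / (lam * n)

# Compare with the exact ridge solution (X^T X + lam*n*I) w = X^T y
w_exact = np.linalg.solve(X.T @ X + lam * n * np.eye(d), X.T @ y)
print("distance to exact ridge solution:", np.linalg.norm(w - w_exact))
```

The paper's question is precisely how fast such a primal iterate, recovered from the dual variables, converges relative to the dual objective.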





Exact Guarantees on the Absence of Spurious Local Minima for Non-negative Rank-1 Robust Principal Component Analysis

This work is concerned with non-negative rank-1 robust principal component analysis (RPCA), where the goal is to recover the dominant non-negative principal components of a data matrix exactly, even when a number of measurements are grossly corrupted with sparse and arbitrarily large noise. Most of the known techniques for solving the RPCA rely on convex relaxation methods that lift the problem to a higher dimension, which significantly increases the number of variables. As an alternative, the well-known Burer-Monteiro approach can be used to cast the RPCA as a non-convex and non-smooth $\ell_1$ optimization problem with a significantly smaller number of variables. In this work, we show that the low-dimensional formulation of the symmetric and asymmetric positive rank-1 RPCA based on the Burer-Monteiro approach has a benign landscape: 1) it does not have any spurious local solutions, 2) it has a unique global solution, and 3) its unique global solution coincides with the true components. An implication of this result is that simple local search algorithms are guaranteed to achieve a zero global optimality gap when directly applied to the low-dimensional formulation. Furthermore, we provide strong deterministic and probabilistic guarantees for the exact recovery of the true principal components. In particular, it is shown that a constant fraction of the measurements can be grossly corrupted without creating any spurious local solution.
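
A simple local search of the kind the result covers can be sketched directly on the low-dimensional Burer-Monteiro objective $f(u) = \sum_{ij} |X_{ij} - u_i u_j|$, here via projected subgradient descent from a spectral initialization. The dimensions, corruption level, and step sizes are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Symmetric rank-1 ground truth plus sparse gross corruption
d = 20
u_true = rng.uniform(0.5, 1.5, size=d)
X = np.outer(u_true, u_true)
mask = rng.random((d, d)) < 0.03              # a few grossly corrupted entries
X[mask] += rng.choice([-3.0, 3.0], size=mask.sum())

def f(u):
    return np.abs(X - np.outer(u, u)).sum()   # the non-smooth l1 objective

# Spectral initialization from the top eigenpair of the corrupted matrix
evals, evecs = np.linalg.eigh((X + X.T) / 2)
u = np.sqrt(max(evals[-1], 0.0)) * np.abs(evecs[:, -1])

best_u, best_f = u.copy(), f(u)
for t in range(1, 2001):
    r_sign = np.sign(np.outer(u, u) - X)      # subgradient of the residual
    grad = (r_sign + r_sign.T) @ u
    u = np.maximum(u - 0.001 / np.sqrt(t) * grad, 0.0)  # stay non-negative
    if f(u) < best_f:
        best_u, best_f = u.copy(), f(u)

rel_err = np.linalg.norm(best_u - u_true) / np.linalg.norm(u_true)
print(f"best objective {best_f:.1f}, relative recovery error {rel_err:.2f}")
```

The benign-landscape theorem is what licenses this kind of direct local search: with no spurious local minima, it cannot get trapped away from the true components.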





Bayesian modeling and prior sensitivity analysis for zero–one augmented beta regression models with an application to psychometric data

Danilo Covaes Nogarotto, Caio Lucidius Naberezny Azevedo, Jorge Luis Bazán.

Source: Brazilian Journal of Probability and Statistics, Volume 34, Number 2, 304--322.

Abstract:
The interest in the analysis of the zero–one augmented beta regression (ZOABR) model has been increasing over the last few years. In this work, we develop a Bayesian inference for the ZOABR model, providing several contributions: we explore the use of Jeffreys-rule and independence Jeffreys priors for some of the parameters, perform a sensitivity study of prior choice, compare the Bayesian estimates with the maximum likelihood (ML) ones, and measure the accuracy of the estimates under several scenarios of interest. The results indicate, in general, that the Bayesian approach under the Jeffreys-rule prior was as accurate as the ML one. Also, differently from other approaches, we use the predictive distribution of the response to implement Bayesian residuals. To further illustrate the advantages of our approach, we conduct an analysis of a real psychometric data set, including a Bayesian residual analysis, where it is shown that misleading inference can be obtained when the data are transformed: that is, when the zeros and ones are transformed to suitable interior values and the usual beta regression model is considered instead of the ZOABR model. Finally, future developments are discussed.
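
The zero–one augmented beta mixture underlying the ZOABR model can be sketched as a three-part distribution: point masses at 0 and 1 plus a Beta component on the open interval, parameterized by its mean μ and precision φ. The parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

# P(Y=0) = p0, P(Y=1) = p1, and Y | 0<Y<1 ~ Beta(mu*phi, (1-mu)*phi)
p0, p1, mu, phi = 0.10, 0.05, 0.6, 20.0

def rzoab(n):
    u = rng.random(n)
    y = rng.beta(mu * phi, (1 - mu) * phi, size=n)  # continuous component
    y[u < p0] = 0.0                                 # zero component
    y[(u >= p0) & (u < p0 + p1)] = 1.0              # one component
    return y

y = rzoab(100_000)
# E[Y] = p1 + (1 - p0 - p1) * mu for the augmented mixture
print("empirical mean:", y.mean(), " theoretical:", p1 + (1 - p0 - p1) * mu)
```

Transforming the exact zeros and ones into interior values and fitting a plain beta regression discards exactly this mixture structure, which is the source of the misleading inference the abstract describes.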





A note on the “L-logistic regression models: Prior sensitivity analysis, robustness to outliers and applications”

Saralees Nadarajah, Yuancheng Si.

Source: Brazilian Journal of Probability and Statistics, Volume 34, Number 1, 183--187.

Abstract:
Da Paz, Balakrishnan and Bazán [Braz. J. Probab. Stat. 33 (2019), 455–479] introduced the L-logistic distribution, studied its properties including estimation issues, and illustrated a data application. This note derives closed-form expressions for the moments of the distribution. Some computational issues are discussed.





Time series of count data: A review, empirical comparisons and data analysis

Glaura C. Franco, Helio S. Migon, Marcos O. Prates.

Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 4, 756--781.

Abstract:
Observation-driven and parameter-driven models are commonly used in the literature to analyse time series of counts. In this paper, we study the characteristics of a variety of models and point out the main differences and similarities among these procedures concerning parameter estimation, model fitting, and forecasting. In contrast to much of the literature, all inference was performed under the Bayesian paradigm. The models are fitted with a latent AR($p$) process in the mean, which accounts for autocorrelation in the data. An extensive simulation study shows that the estimates for the covariate parameters are remarkably similar across the different models. However, estimates for autoregressive coefficients and forecasts of future values depend heavily on the underlying process which generates the data. A real data set of bankruptcy in the United States is also analysed.
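
A minimal parameter-driven example of the kind of model compared in the review is a Poisson count series with a latent AR(1) process in the log-mean; all parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

# log mu_t = beta0 + beta1 * x_t + z_t,   z_t = rho * z_{t-1} + eps_t
T, beta0, beta1, rho, sigma = 500, 0.5, 0.3, 0.7, 0.5
x = rng.normal(size=T)                        # an exogenous covariate

z = np.zeros(T)
for t in range(1, T):
    z[t] = rho * z[t - 1] + rng.normal(scale=sigma)

mu = np.exp(beta0 + beta1 * x + z)
y = rng.poisson(mu)

# The counts inherit autocorrelation from the latent process
acf1 = np.corrcoef(y[:-1], y[1:])[0, 1]
print("mean count:", y.mean(), " lag-1 autocorrelation:", round(acf1, 2))
```

An observation-driven alternative would instead let the mean depend on lagged observed counts, which is one of the structural differences the paper examines.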





L-Logistic regression models: Prior sensitivity analysis, robustness to outliers and applications

Rosineide F. da Paz, Narayanaswamy Balakrishnan, Jorge Luis Bazán.

Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 3, 455--479.

Abstract:
Tadikamalla and Johnson [Biometrika 69 (1982) 461–465] developed the $L_{B}$ distribution for variables with bounded support by considering a transformation of the standard logistic distribution. In this manuscript, a convenient parametrization of this distribution, referred to here as the L-logistic distribution, is proposed in order to develop regression models. It provides great flexibility and includes the uniform distribution as a particular case. Several properties of the distribution are studied, and a Bayesian approach is adopted for parameter estimation. Simulation studies covering prior sensitivity, parameter recovery, comparison of algorithms and robustness to outliers show that the results are insensitive to the choice of priors, that the MCMC algorithm adopted is efficient, and that the model is robust when compared with the beta distribution. Applications to estimating vulnerability to poverty and to explaining anxiety are presented, and they show that the L-logistic regression models provide a better fit than the corresponding beta regression models.
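A short sketch of sampling from such a bounded distribution, assuming a median/shape parametrization with CDF $F(y)=1/\bigl(1+(m(1-y)/(y(1-m)))^{b}\bigr)$ on $(0,1)$, where $m$ is the median and $b>0$ a shape parameter. This particular formula and the names `m`, `b` are our assumption for illustration, not a quotation of the paper; the point is only that the quantile function inverts in closed form, so inverse-transform sampling is immediate:

```python
import random

def llogistic_quantile(u, m, b):
    """Quantile function obtained by inverting
    F(y) = 1 / (1 + (m * (1 - y) / (y * (1 - m)))**b); m is the median."""
    r = ((1.0 - u) / u) ** (1.0 / b)
    return m / (m + (1.0 - m) * r)

def rllogistic(n, m, b, rng=random.Random(1)):
    """Inverse-transform sampling on (0, 1)."""
    return [llogistic_quantile(rng.random(), m, b) for _ in range(n)]

draws = rllogistic(4001, 0.3, 2.0)
sample_median = sorted(draws)[len(draws) // 2]
```

Setting $u=1/2$ gives $r=1$ and the quantile reduces to $m$, confirming that $m$ is the median under this parametrization, which is what makes it attractive for regression on a location parameter.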





Hierarchical modelling of power law processes for the analysis of repairable systems with different truncation times: An empirical Bayes approach

Rodrigo Citton P. dos Reis, Enrico A. Colosimo, Gustavo L. Gilardoni.

Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 2, 374--396.

Abstract:
In the analysis of data from multiple repairable systems, it is common to observe both different truncation times and heterogeneity among the systems, the latter caused, among other reasons, by different manufacturing lines and maintenance teams. In this paper, a hierarchical model is proposed for the statistical analysis of multiple repairable systems under different truncation times. A reparametrization of the power law process is proposed in order to obtain a quasi-conjugate Bayesian analysis. An empirical Bayes approach is used to estimate the model hyperparameters, and the uncertainty in the estimates of these quantities is corrected by a parametric bootstrap. The results are illustrated on a real data set of failure times of power transformers from an electric company in Brazil.
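For a single system, the power law process with intensity $\lambda(t)=(\beta/\theta)(t/\theta)^{\beta-1}$ can be simulated by inverting its mean function $\Lambda(t)=(t/\theta)^{\beta}$, and under time truncation the maximum likelihood estimates have a well-known closed form. The sketch below shows that single-system building block only, not the paper's hierarchical empirical Bayes estimator:

```python
import math
import random

def simulate_plp(theta, beta, t_end, rng=random.Random(3)):
    """Failure times of a power law process with intensity
    lambda(t) = (beta/theta) * (t/theta)**(beta-1), observed until t_end.
    A unit-rate Poisson process is mapped through Lambda^{-1}."""
    times, cum = [], 0.0
    while True:
        cum += rng.expovariate(1.0)          # cumulative unit-rate arrivals
        t = theta * cum ** (1.0 / beta)      # Lambda^{-1}(cum)
        if t > t_end:
            return times
        times.append(t)

def plp_mle(times, t_end):
    """Closed-form MLEs for a time-truncated power law process."""
    n = len(times)
    beta_hat = n / sum(math.log(t_end / t) for t in times)
    theta_hat = t_end / n ** (1.0 / beta_hat)
    return beta_hat, theta_hat

times = simulate_plp(theta=10.0, beta=1.8, t_end=200.0)
beta_hat, theta_hat = plp_mle(times, 200.0)
```

The hierarchical model in the paper pools such per-system estimates across systems with different `t_end` values, which is where the empirical Bayes and bootstrap corrections come in.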





The coreset variational Bayes (CVB) algorithm for mixture analysis

Qianying Liu, Clare A. McGrory, Peter W. J. Baxter.

Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 2, 267--279.

Abstract:
The pressing need for improved methods for analysing and coping with big data has opened up a new area of research for statisticians. Image analysis is an area where there is typically a very large number of data points to be processed per image, and often multiple images are captured over time. These issues make it challenging to design methodology that is reliable and yet still efficient enough to be of practical use. One promising emerging approach for this problem is to reduce the amount of data that actually has to be processed by extracting what we call coresets from the full dataset; analysis is then based on the coreset rather than the whole dataset. Coresets are representative subsamples of data that are carefully selected via an adaptive sampling approach. We propose a new approach called coreset variational Bayes (CVB) for mixture modelling; this is an algorithm which can perform a variational Bayes analysis of a dataset based on just an extracted coreset of the data. We apply our algorithm to weed image analysis.





Basic models and questions in statistical network analysis

Miklós Z. Rácz, Sébastien Bubeck.

Source: Statistics Surveys, Volume 11, 1--47.

Abstract:
Extracting information from large graphs has become an important statistical problem since network data is now common in various fields. In this minicourse we will investigate the most natural statistical questions for three canonical probabilistic models of networks: (i) community detection in the stochastic block model, (ii) finding the embedding of a random geometric graph, and (iii) finding the original vertex in a preferential attachment tree. Along the way we will cover many interesting topics in probability theory such as Pólya urns, large deviation theory, concentration of measure in high dimension, entropic central limit theorems, and more.
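As a minimal illustration of point (i), here is community detection in a two-block stochastic block model via the sign of the second adjacency eigenvector. This spectral heuristic is a standard entry point, not the sharp threshold-achieving methods the minicourse surveys; the parameters are chosen well inside the easy regime:

```python
import numpy as np

rng = np.random.default_rng(0)

def sbm(n, p_in, p_out):
    """Adjacency matrix of a two-community stochastic block model:
    edge probability p_in within a community, p_out across."""
    labels = np.array([0] * (n // 2) + [1] * (n - n // 2))
    probs = np.where(labels[:, None] == labels[None, :], p_in, p_out)
    upper = np.triu(rng.random((n, n)) < probs, 1)
    A = (upper | upper.T).astype(float)
    return A, labels

def spectral_partition(A):
    """The sign pattern of the eigenvector of the second-largest eigenvalue
    approximates the community split when the signal is strong."""
    vals, vecs = np.linalg.eigh(A)       # eigenvalues in ascending order
    v2 = vecs[:, -2]
    return (v2 > 0).astype(int)

A, labels = sbm(200, 0.5, 0.05)
guess = spectral_partition(A)
# the eigenvector sign is arbitrary, so score against both labelings
accuracy = max((guess == labels).mean(), (1 - guess == labels).mean())
```

In expectation the adjacency matrix has rank two: the top eigenvector is roughly constant, while the second aligns with the $\pm 1$ community vector, which is why its sign recovers the blocks.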





Some models and methods for the analysis of observational data

José A. Ferreira.

Source: Statistics Surveys, Volume 9, 106--208.

Abstract:
This article provides a concise and essentially self-contained exposition of some of the most important models and non-parametric methods for the analysis of observational data, and a substantial number of illustrations of their application. Although for the most part our presentation follows P. Rosenbaum’s book, “Observational Studies”, and naturally draws on related literature, it contains original elements and simplifies and generalizes some basic results. The illustrations, based on simulated data, show the methods at work in some detail, highlighting pitfalls and emphasizing certain subjective aspects of the statistical analyses.





Analyzing complex functional brain networks: Fusing statistics and network science to understand the brain

Sean L. Simpson, F. DuBois Bowman, Paul J. Laurienti.

Source: Statistics Surveys, Volume 7, 1--36.

Abstract:
Complex functional brain network analyses have exploded over the last decade, gaining traction due to their profound clinical implications. The application of network science (an interdisciplinary offshoot of graph theory) has facilitated these analyses and enabled examining the brain as an integrated system that produces complex behaviors. While the field of statistics has been integral in advancing activation analyses and some connectivity analyses in functional neuroimaging research, it has yet to play a commensurate role in complex network analyses. Fusing novel statistical methods with network-based functional neuroimage analysis will engender powerful analytical tools that will aid in our understanding of normal brain function as well as alterations due to various brain disorders. Here we survey widely used statistical and network science tools for analyzing fMRI network data and discuss the challenges faced in filling some of the remaining methodological gaps. When applied and interpreted correctly, the fusion of network scientific and statistical methods has a chance to revolutionize the understanding of brain function.





Arctic Amplification of Anthropogenic Forcing: A Vector Autoregressive Analysis. (arXiv:2005.02535v1 [econ.EM] CROSS LISTED)

Arctic sea ice extent (SIE) in September 2019 ranked second-to-lowest in history and is trending downward. The understanding of how internal variability amplifies the effects of external $\text{CO}_2$ forcing is still limited. We propose the VARCTIC, a Vector Autoregression (VAR) designed to capture and extrapolate Arctic feedback loops. VARs are dynamic simultaneous systems of equations, routinely estimated to predict and understand the interactions of multiple macroeconomic time series. Hence, the VARCTIC is a parsimonious compromise between full-blown climate models and purely statistical approaches that usually offer little explanation of the underlying mechanism. Our "business as usual" completely unconditional forecast has SIE hitting 0 in September by the 2060s. Impulse response functions reveal that anthropogenic $\text{CO}_2$ emission shocks have a permanent effect on SIE - a property shared by no other shock. Further, we find Albedo- and Thickness-based feedbacks to be the main amplification channels through which $\text{CO}_2$ anomalies impact SIE in the short/medium run. Conditional forecast analyses reveal that the future path of SIE crucially depends on the evolution of $\text{CO}_2$ emissions, with outcomes ranging from recovering SIE to it reaching 0 in the 2050s. Finally, Albedo and Thickness feedbacks are shown to play an important role in accelerating the speed at which predicted SIE is heading towards 0.
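The VAR mechanics behind the impulse responses can be sketched in a few lines: fit the system by equation-by-equation OLS, then propagate a one-off shock through powers of the coefficient matrix. This toy two-variable VAR(1) is only a schematic of the method; the actual VARCTIC is a richer multivariate system with its own identification choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_var1(Y):
    """OLS estimate of a VAR(1): Y_t = c + A @ Y_{t-1} + e_t."""
    X = np.column_stack([np.ones(len(Y) - 1), Y[:-1]])
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return B[0], B[1:].T                 # intercept c, coefficient matrix A

def impulse_responses(A, shock, horizons):
    """Response of the system h steps after a one-off shock: A**h @ shock."""
    out, x = [], np.asarray(shock, float)
    for _ in range(horizons):
        out.append(x.copy())
        x = A @ x
    return np.array(out)

# simulate a stable 2-variable VAR(1), then recover its dynamics
A_true = np.array([[0.7, 0.1],
                   [0.2, 0.5]])
Y = np.zeros((2000, 2))
for t in range(1, 2000):
    Y[t] = A_true @ Y[t - 1] + rng.normal(0.0, 1.0, 2)

c_hat, A_hat = fit_var1(Y)
irf = impulse_responses(A_hat, [1.0, 0.0], 20)
```

In a stable VAR all impulse responses decay to zero; the paper's finding that a $\text{CO}_2$ shock has a *permanent* effect on SIE is precisely a departure from this textbook decay pattern.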





Sampling random graph homomorphisms and applications to network data analysis. (arXiv:1910.09483v2 [math.PR] UPDATED)

A graph homomorphism is a map between two graphs that preserves adjacency relations. We consider the problem of sampling a random graph homomorphism from a graph $F$ into a large network $\mathcal{G}$. We propose two complementary MCMC algorithms for sampling random graph homomorphisms and establish bounds on their mixing times and on the concentration of their time averages. Based on our sampling algorithms, we propose a novel framework for network data analysis that circumvents some of the drawbacks of methods based on independent and neighborhood sampling. Various time averages of the MCMC trajectory give us various computable observables, including well-known ones such as homomorphism density and average clustering coefficient and their generalizations. Furthermore, we show that these network observables are stable with respect to a suitably renormalized cut distance between networks. We provide various examples and simulations demonstrating our framework through synthetic networks. We also apply our framework to network clustering and classification problems using the Facebook100 dataset and Word Adjacency Networks of a set of classic novels.
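One plausible chain of this kind (a Glauber-type update; the paper's two algorithms may differ in details) re-samples the image of one vertex of $F$ at a time, keeping the map a valid homomorphism throughout, and reads off observables as time averages. A toy run with $F$ a path on three vertices:

```python
import random

rng = random.Random(4)

def glauber_step(G, F_adj, x):
    """One update: pick a vertex i of F uniformly and re-sample its image
    uniformly among the common neighbours (in G) of the images of i's
    F-neighbours. Moves only to valid images, so x stays a homomorphism."""
    i = rng.randrange(len(x))
    cands = None
    for j in F_adj[i]:
        nb = set(G[x[j]])
        cands = nb if cands is None else cands & nb
    cands = sorted(cands) if cands else []
    if cands:
        x[i] = rng.choice(cands)
    return x

# G: two triangles joined through vertices 2 and 3; F: the path 0-1-2
G = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
F_adj = {0: [1], 1: [0, 2], 2: [1]}
x = [0, 1, 2]                          # a valid starting homomorphism

visits = [0] * len(G)
for _ in range(20000):
    x = glauber_step(G, F_adj, x)
    visits[x[1]] += 1                  # time average: where the middle vertex sits
occupancy = [v / 20000 for v in visits]
```

The occupancy vector is the simplest example of a computable time-average observable; homomorphism densities and clustering-type statistics arise from richer functions of the trajectory in the same way.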





Multi-scale analysis of lead-lag relationships in high-frequency financial markets. (arXiv:1708.03992v3 [stat.ME] UPDATED)

We propose a novel estimation procedure for scale-by-scale lead-lag relationships of financial assets observed at high frequency in a non-synchronous manner. The proposed estimation procedure does not require any interpolation of the original datasets and is applicable to those with the highest time resolution available. Consistency of the proposed estimators is shown under the continuous-time framework developed in our previous work, Hayashi and Koike (2018). An empirical application to a quote dataset of the NASDAQ-100 assets identifies two types of lead-lag relationships at different time scales.
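The single-scale idea underlying lead-lag estimation is easy to demonstrate: scan a grid of time shifts and pick the one maximizing the cross-correlation of returns. The toy below uses synchronous, equally spaced data, so it is a naive contrast rather than the paper's estimator for non-synchronous, multi-scale observations; all series and parameters are simulated by us:

```python
import random

rng = random.Random(5)

# latent driver: asset B reproduces asset A's increments 5 steps later
n, lag = 2000, 5
driver = [0.0]
for _ in range(n + lag):
    driver.append(driver[-1] + rng.gauss(0.0, 1.0))
ret_a = [driver[t + 1] - driver[t] for t in range(lag, n + lag)]
ret_b = [driver[t + 1] - driver[t] + rng.gauss(0.0, 0.3)  # noisy lagged copy
         for t in range(0, n)]

def xcorr(a, b, shift):
    """Sample correlation of a_t with b_{t+shift} over the overlapping window."""
    if shift >= 0:
        pa, pb = a[:len(a) - shift], b[shift:]
    else:
        pa, pb = a[-shift:], b[:len(b) + shift]
    ma, mb = sum(pa) / len(pa), sum(pb) / len(pb)
    cov = sum((x - ma) * (y - mb) for x, y in zip(pa, pb))
    va = sum((x - ma) ** 2 for x in pa)
    vb = sum((y - mb) ** 2 for y in pb)
    return cov / (va * vb) ** 0.5

best = max(range(-10, 11), key=lambda s: xcorr(ret_a, ret_b, s))
```

Here `best` recovers the planted lag of 5 steps (B lagging A). Non-synchronous observation times are exactly what breaks this naive pairing and motivate the interpolation-free construction of the paper.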