era

A hierarchical curve-based approach to the analysis of manifold data

Liberty Vittert, Adrian W. Bowman, Stanislav Katina.

Source: The Annals of Applied Statistics, Volume 13, Number 4, 2539--2563.

Abstract:
One of the data structures generated by medical imaging technology is high resolution point clouds representing anatomical surfaces. Stereophotogrammetry and laser scanning are two widely available sources of this kind of data. A standardised surface representation is required to provide a meaningful correspondence across different images as a basis for statistical analysis. Point locations with anatomical definitions, referred to as landmarks, have been the traditional approach. Landmarks can also be taken as the starting point for more general surface representations, often using templates which are warped on to an observed surface by matching landmark positions and subsequent local adjustment of the surface. The aim of the present paper is to provide a new approach which places anatomical curves at the heart of the surface representation and its analysis. Curves provide intermediate structures which capture the principal features of the manifold (surface) of interest through its ridges and valleys. As landmarks are often available these are used as anchoring points, but surface curvature information is the principal guide in estimating the curve locations. The surface patches between these curves are relatively flat and can be represented in a standardised manner by appropriate surface transects to give a complete surface model. This new approach does not require the use of a template, reference sample or any external information to guide the method and, when compared with a surface based approach, the estimation of curves is shown to have improved performance. In addition, examples involving applications to mussel shells and human faces show that the analysis of curve information can deliver more targeted and effective insight than the use of full surface information.




era

Fitting a deeply nested hierarchical model to a large book review dataset using a moment-based estimator

Ningshan Zhang, Kyle Schmaus, Patrick O. Perry.

Source: The Annals of Applied Statistics, Volume 13, Number 4, 2260--2288.

Abstract:
We consider a particular instance of a common problem in recommender systems, using a database of book reviews to inform user-targeted recommendations. In our dataset, books are categorized into genres and subgenres. To exploit this nested taxonomy, we use a hierarchical model that enables information pooling across similar items at many levels within the genre hierarchy. The main challenge in deploying this model is computational. The data sizes are large and fitting the model at scale using off-the-shelf maximum likelihood procedures is prohibitive. To get around this computational bottleneck, we extend a moment-based fitting procedure proposed for fitting single-level hierarchical models to the general case of arbitrarily deep hierarchies. This extension is an order of magnitude faster than standard maximum likelihood procedures. The fitting method can be deployed beyond recommender systems to general contexts with deeply nested hierarchical generalized linear mixed models.
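The moment-based idea can be illustrated on the simplest case the abstract generalises from: in a single-level random-intercept model, the variance components follow from sample moments alone, with no likelihood optimisation. A minimal sketch on simulated toy data (illustrative only, not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-level random-intercept model y_ij = b_i + e_ij, with assumed
# true standard deviations sigma_b = 1.5 and sigma_e = 1.0 for the demo.
n_groups, n_per = 200, 50
b = rng.normal(0.0, 1.5, size=n_groups)
y = b[:, None] + rng.normal(0.0, 1.0, size=(n_groups, n_per))

# Moment-based (ANOVA-type) estimates: the average within-group variance
# estimates sigma_e^2, and the variance of the group means estimates
# sigma_b^2 + sigma_e^2 / n_per.
s2_within = y.var(axis=1, ddof=1).mean()
sigma_b2_hat = y.mean(axis=1).var(ddof=1) - s2_within / n_per
```

Each estimate is a closed-form function of sample moments, which is what makes this style of fitting so much cheaper than iterative maximum likelihood at scale.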




era

Joint model of accelerated failure time and mechanistic nonlinear model for censored covariates, with application in HIV/AIDS

Hongbin Zhang, Lang Wu.

Source: The Annals of Applied Statistics, Volume 13, Number 4, 2140--2157.

Abstract:
For a time-to-event outcome with censored time-varying covariates, a joint Cox model with a linear mixed effects model is the standard modeling approach. In some applications, such as AIDS studies, mechanistic nonlinear models are available for some covariate processes, such as viral load during anti-HIV treatments, derived from the underlying data-generation mechanisms and disease progression. Such a mechanistic nonlinear covariate model may provide better-predicted values when the covariates are left censored or mismeasured. When the focus is on the impact of the time-varying covariate process on the survival outcome, an accelerated failure time (AFT) model provides an excellent alternative to the Cox proportional hazards model, since an AFT model is formulated so that the outcome can be influenced by the entire covariate process. In this article, we consider a nonlinear mixed effects model for the censored covariates in an AFT model, implemented using a Monte Carlo EM algorithm, under the framework of a joint model for simultaneous inference. We apply the joint model to an HIV/AIDS dataset to gain insights into the association between viral load and immunological restoration during antiretroviral therapy. A simulation study is conducted to compare model performance when the covariate model and the survival model are misspecified.




era

A hierarchical Bayesian model for single-cell clustering using RNA-sequencing data

Yiyi Liu, Joshua L. Warren, Hongyu Zhao.

Source: The Annals of Applied Statistics, Volume 13, Number 3, 1733--1752.

Abstract:
Understanding the heterogeneity of cells is an important biological question. The development of single-cell RNA-sequencing (scRNA-seq) technology provides high resolution data for such inquiry. A key challenge in scRNA-seq analysis is the high variability of measured RNA expression levels and frequent dropouts (missing values) due to limited input RNA compared to bulk RNA-seq measurement. Existing clustering methods do not perform well for these noisy and zero-inflated scRNA-seq data. In this manuscript we propose a Bayesian hierarchical model, called BasClu, to appropriately characterize important features of scRNA-seq data in order to more accurately cluster cells. We demonstrate the effectiveness of our method with extensive simulation studies and applications to three real scRNA-seq datasets.
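The dropout and zero-inflation features described above can be made concrete with a toy simulation (illustrative only, not the BasClu model): a dropout event zeroes out an otherwise well-measured expression count with some probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy zero-inflated scRNA-seq-like counts. All sizes and rates are
# assumptions for the demo, not values from the paper.
n_cells, n_genes = 300, 100
true_expr = rng.gamma(shape=2.0, scale=3.0, size=(n_cells, n_genes))
counts = rng.poisson(true_expr)          # sampling noise around true levels

dropout_prob = 0.4                       # assumed per-entry dropout rate
dropout = rng.random((n_cells, n_genes)) < dropout_prob
observed = np.where(dropout, 0, counts)  # dropouts appear as zeros

zero_fraction = np.mean(observed == 0)   # well above the Poisson zero rate
```

The point of the sketch is that observed zeros mix two sources (true low expression and dropout), which is why clustering methods that ignore zero-inflation perform poorly on such data.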




era

A Bayesian mark interaction model for analysis of tumor pathology images

Qiwei Li, Xinlei Wang, Faming Liang, Guanghua Xiao.

Source: The Annals of Applied Statistics, Volume 13, Number 3, 1708--1732.

Abstract:
With the advance of imaging technology, digital pathology imaging of tumor tissue slides is becoming a routine clinical procedure for cancer diagnosis. This process produces massive imaging data that capture histological details in high resolution. Recent developments in deep-learning methods have enabled us to identify and classify individual cells from digital pathology images at large scale. Reliable statistical approaches to model the spatial pattern of cells can provide new insight into tumor progression and shed light on the biological mechanisms of cancer. We consider the problem of modeling spatial correlations among three commonly seen cell types observed in tumor pathology images. A novel geostatistical marking model with interpretable underlying parameters is proposed in a Bayesian framework. We use auxiliary variable MCMC algorithms to sample from the posterior distribution with an intractable normalizing constant. We demonstrate how this model-based analysis can lead to sharper inferences than ordinary exploratory analyses, by means of application to three benchmark datasets and a case study on the pathology images of $188$ lung cancer patients. The case study shows that the spatial correlation between tumor and stromal cells predicts patient prognosis. This statistical methodology not only presents a new model for characterizing spatial correlations in a multitype spatial point pattern conditioning on the locations of the points, but also provides a new perspective for understanding the role of cell–cell interactions in cancer progression.




era

RCRnorm: An integrated system of random-coefficient hierarchical regression models for normalizing NanoString nCounter data

Gaoxiang Jia, Xinlei Wang, Qiwei Li, Wei Lu, Ximing Tang, Ignacio Wistuba, Yang Xie.

Source: The Annals of Applied Statistics, Volume 13, Number 3, 1617--1647.

Abstract:
Formalin-fixed paraffin-embedded (FFPE) samples have great potential for biomarker discovery, retrospective studies and diagnosis or prognosis of diseases. Their application, however, is hindered by the unsatisfactory performance of traditional gene expression profiling techniques on damaged RNAs. The NanoString nCounter platform is well suited for profiling FFPE samples and measures gene expression with high sensitivity, which may greatly facilitate realization of the scientific and clinical value of FFPE samples. However, methodological development for normalization, a critical step when analyzing this type of data, is far behind. Existing methods designed for the platform use information from different types of internal controls separately and rely on an overly simplified assumption, that expression of housekeeping genes is constant across samples, for global scaling. Thus, these methods are not optimized for the nCounter system, not to mention that they were not developed for FFPE samples. We construct an integrated system of random-coefficient hierarchical regression models to capture the main patterns and characteristics observed in NanoString data from FFPE samples, and develop a Bayesian approach to estimate parameters and normalize gene expression across samples. Our method, labeled RCRnorm, incorporates information from all aspects of the experimental design and simultaneously removes biases from various sources. It eliminates the unrealistic assumption on housekeeping genes and offers great interpretability. Furthermore, it is applicable to freshly frozen or similar samples, which can generally be viewed as a reduced case of FFPE samples. Simulations and applications show the superior performance of RCRnorm.




era

A refined Cramér-type moderate deviation for sums of local statistics

Xiao Fang, Li Luo, Qi-Man Shao.

Source: Bernoulli, Volume 26, Number 3, 2319--2352.

Abstract:
We prove a refined Cramér-type moderate deviation result by taking into account the skewness in the normal approximation for sums of local statistics of independent random variables. We apply the main result to $k$-runs, U-statistics and subgraph counts in the Erdős–Rényi random graph. To prove our main result, we develop exponential concentration inequalities and higher-order tail probability expansions via Stein’s method.




era

Kernel and wavelet density estimators on manifolds and more general metric spaces

Galatia Cleanthous, Athanasios G. Georgiadis, Gerard Kerkyacharian, Pencho Petrushev, Dominique Picard.

Source: Bernoulli, Volume 26, Number 3, 1832--1862.

Abstract:
We consider the problem of estimating the density of observations taking values in classical or nonclassical spaces such as manifolds and more general metric spaces. Our setting is quite general but also sufficiently rich in allowing the development of smooth functional calculus with well localized spectral kernels, Besov regularity spaces, and wavelet type systems. Kernel and both linear and nonlinear wavelet density estimators are introduced and studied. Convergence rates for these estimators are established and discussed.




era

On the probability distribution of the local times of diagonally operator-self-similar Gaussian fields with stationary increments

Kamran Kalbasi, Thomas Mountford.

Source: Bernoulli, Volume 26, Number 2, 1504--1534.

Abstract:
In this paper, we study the local times of vector-valued Gaussian fields that are ‘diagonally operator-self-similar’ and whose increments are stationary. Denoting the local time of such a Gaussian field around the spatial origin and over the temporal unit hypercube by $Z$, we show that there exists $\lambda\in(0,1)$ such that under some quite weak conditions, $\lim_{n\rightarrow+\infty}\frac{\sqrt[n]{\mathbb{E}(Z^{n})}}{n^{\lambda}}$ and $\lim_{x\rightarrow+\infty}\frac{-\log\mathbb{P}(Z>x)}{x^{\frac{1}{\lambda}}}$ both exist and are strictly positive (possibly $+\infty$). Moreover, we show that if the underlying Gaussian field is ‘strongly locally nondeterministic’, the above limits will be finite as well. These results are then applied to establish similar statements for the intersection local times of diagonally operator-self-similar Gaussian fields with stationary increments.




era

Interacting reinforced stochastic processes: Statistical inference based on the weighted empirical means

Giacomo Aletti, Irene Crimaldi, Andrea Ghiglietti.

Source: Bernoulli, Volume 26, Number 2, 1098--1138.

Abstract:
This work deals with a system of interacting reinforced stochastic processes, where each process $X^{j}=(X_{n,j})_{n}$ is located at a vertex $j$ of a finite weighted directed graph, and it can be interpreted as the sequence of “actions” adopted by an agent $j$ of the network. The interaction among the dynamics of these processes depends on the weighted adjacency matrix $W$ associated to the underlying graph: indeed, the probability that an agent $j$ chooses a certain action depends on its personal “inclination” $Z_{n,j}$ and on the inclinations $Z_{n,h}$, with $h\neq j$, of the other agents according to the entries of $W$. The best known example of a reinforced stochastic process is the Pólya urn. The present paper focuses on the weighted empirical means $N_{n,j}=\sum_{k=1}^{n}q_{n,k}X_{k,j}$, since, for example, the current experience is more important than the past one in reinforcement learning. Their almost sure synchronization and some central limit theorems in the sense of stable convergence are proven. The new approach with weighted means highlights the key points in proving some recent results for the personal inclinations $Z^{j}=(Z_{n,j})_{n}$ and for the empirical means $\overline{X}^{j}=(\sum_{k=1}^{n}X_{k,j}/n)_{n}$ given in recent papers (e.g. Aletti, Crimaldi and Ghiglietti, Ann. Appl. Probab. 27 (2017) 3787–3844; Crimaldi et al., Stochastic Process. Appl. 129 (2019) 70–101). In fact, with a more sophisticated decomposition of the considered processes, we can understand how the different convergence rates of the involved stochastic processes combine. From an application point of view, we provide confidence intervals for the common limit inclination of the agents and a test statistic to make inference on the matrix $W$, based on the weighted empirical means. In particular, we answer a research question posed in Aletti, Crimaldi and Ghiglietti (2019).
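The Pólya urn named above as the canonical reinforced stochastic process takes only a few lines to simulate; a minimal sketch (illustrative, not the paper's interacting system):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-colour Polya urn: each drawn ball is replaced together with one extra
# ball of the same colour, so past "actions" reinforce themselves.
def polya_urn(n_steps, init=(1.0, 1.0)):
    counts = np.array(init, dtype=float)
    draws = np.empty(n_steps)
    for t in range(n_steps):
        if rng.random() < counts[0] / counts.sum():
            counts[0] += 1.0      # colour 1 drawn and reinforced
            draws[t] = 1.0
        else:
            counts[1] += 1.0      # colour 2 drawn and reinforced
            draws[t] = 0.0
    return counts, draws

counts, draws = polya_urn(5000)
frac = counts[0] / counts.sum()   # converges a.s. to a random Beta limit
```

The random (rather than deterministic) limit of `frac` is what makes reinforced processes delicate: the long-run behaviour depends on the early draws, which is the phenomenon the interacting, weighted-means version of the paper studies at network scale.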




era

Degeneracy in sparse ERGMs with functions of degrees as sufficient statistics

Sumit Mukherjee.

Source: Bernoulli, Volume 26, Number 2, 1016--1043.

Abstract:
A sufficient criterion for “non-degeneracy” is given for Exponential Random Graph Models on sparse graphs with sufficient statistics which are functions of the degree sequence. This criterion explains why statistics such as alternating $k$-star are non-degenerate, whereas subgraph counts are degenerate. It is further shown that this criterion is “almost” tight. Existence of consistent estimates is then proved for non-degenerate Exponential Random Graph Models.




era

Convergence of the age structure of general schemes of population processes

Jie Yen Fan, Kais Hamza, Peter Jagers, Fima Klebaner.

Source: Bernoulli, Volume 26, Number 2, 893--926.

Abstract:
We consider a family of general branching processes with reproduction parameters depending on the age of the individual as well as the population age structure and a parameter $K$, which may represent the carrying capacity. These processes are Markovian in the age structure. In a previous paper (Proc. Steklov Inst. Math. 282 (2013) 90–105), the Law of Large Numbers as $K\to\infty$ was derived. Here we prove the central limit theorem, namely the weak convergence of the fluctuation processes in an appropriate Skorokhod space. We also show that the limit is driven by a stochastic partial differential equation.




era

A Feynman–Kac result via Markov BSDEs with generalised drivers

Elena Issoglio, Francesco Russo.

Source: Bernoulli, Volume 26, Number 1, 728--766.

Abstract:
In this paper, we investigate BSDEs where the driver contains a distributional term (in the sense of generalised functions) and derive general Feynman–Kac formulae related to these BSDEs. We introduce an integral operator to give sense to the equation and then we show the existence of a strong solution employing results on a related PDE. Due to the irregularity of the driver, the $Y$-component of a couple $(Y,Z)$ solving the BSDE is not necessarily a semimartingale but a weak Dirichlet process.




era

On frequentist coverage errors of Bayesian credible sets in moderately high dimensions

Keisuke Yano, Kengo Kato.

Source: Bernoulli, Volume 26, Number 1, 616--641.

Abstract:
In this paper, we study frequentist coverage errors of Bayesian credible sets for an approximately linear regression model with (moderately) high dimensional regressors, where the dimension of the regressors may increase with but is smaller than the sample size. Specifically, we consider quasi-Bayesian inference on the slope vector under the quasi-likelihood with Gaussian error distribution. Under this setup, we derive finite sample bounds on frequentist coverage errors of Bayesian credible rectangles. Derivation of those bounds builds on a novel Berry–Esseen type bound on quasi-posterior distributions and recent results on high-dimensional CLT on hyperrectangles. We use this general result to quantify coverage errors of Castillo–Nickl and $L^{\infty}$-credible bands for Gaussian white noise models, linear inverse problems, and (possibly non-Gaussian) nonparametric regression models. In particular, we show that Bayesian credible bands for those nonparametric models have coverage errors decaying polynomially fast in the sample size, implying advantages of Bayesian credible bands over confidence bands based on extreme value theory.




era

Operator-scaling Gaussian random fields via aggregation

Yi Shen, Yizao Wang.

Source: Bernoulli, Volume 26, Number 1, 500--530.

Abstract:
We propose an aggregated random-field model, and investigate the scaling limits of the aggregated partial-sum random fields. In this model, each copy in the aggregation is a $\pm 1$-valued random field built from two correlated one-dimensional random walks, the law of each determined by a random persistence parameter. A flexible joint distribution of the two parameters is introduced, and given the parameters the two correlated random walks are conditionally independent. For the aggregated random field, when the persistence parameters are independent, the scaling limit is a fractional Brownian sheet. When the persistence parameters are tail-dependent, characterized in the framework of multivariate regular variation, the scaling limit is more delicate, and in particular depends on the growth rates of the underlying rectangular region along two directions: at different rates different operator-scaling Gaussian random fields appear as the region area tends to infinity. In particular, at the so-called critical speed, a large family of Gaussian random fields with long-range dependence arise in the limit. We also identify four different regimes at non-critical speed where fractional Brownian sheets arise in the limit.




era

Weak convergence of quantile and expectile processes under general assumptions

Tobias Zwingmann, Hajo Holzmann.

Source: Bernoulli, Volume 26, Number 1, 323--351.

Abstract:
We show weak convergence of quantile and expectile processes to Gaussian limit processes in the space of bounded functions endowed with an appropriate semimetric which is based on the concepts of epi- and hypo-convergence as introduced in A. Bücher, J. Segers and S. Volgushev (2014), ‘When Uniform Weak Convergence Fails: Empirical Processes for Dependence Functions and Residuals via Epi- and Hypographs’, Annals of Statistics 42. We impose assumptions for which it is known that weak convergence with respect to the supremum norm generally fails to hold. For quantiles, we consider stationary observations, where the marginal distribution function is assumed to be strictly increasing and continuous except for finitely many points and to admit strictly positive – possibly infinite – left- and right-sided derivatives. For expectiles, we focus on independent and identically distributed (i.i.d.) observations. Only a finite second moment and continuity at the boundary points but no further smoothness properties of the distribution function are required. We also show consistency of the bootstrap for this mode of convergence in the i.i.d. case for quantiles and expectiles.
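For readers less familiar with expectiles: the $\tau$-expectile minimises the asymmetric squared loss $\sum_i |\tau - \mathbf{1}\{x_i \le e\}|(x_i - e)^2$, and the minimiser satisfies a weighted-mean fixed point. A minimal computational sketch (illustrative, not the paper's construction):

```python
import numpy as np

def expectile(x, tau, n_iter=200):
    # Iterate the weighted-mean fixed point of the asymmetric
    # least-squares problem until it stabilises.
    e = float(np.mean(x))                      # tau = 0.5 starts at the answer
    for _ in range(n_iter):
        w = np.where(x > e, tau, 1.0 - tau)    # asymmetric weights
        e = float(np.sum(w * x) / np.sum(w))
    return e

x = np.array([0.0, 1.0, 2.0, 3.0, 10.0])
mid = expectile(x, 0.5)    # the 0.5-expectile is exactly the sample mean
hi = expectile(x, 0.9)     # upper expectiles move toward the right tail
```

Unlike quantiles, expectiles depend on the full distribution through this squared loss, which is why only a finite second moment (and no smoothness) is needed in the abstract's setting.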




era

Gordon of Huntly : heraldic heritage : cadets to South Australia / Robin Gregory Gordon.

South Australia -- Genealogy.




era

The Kuerschner story : 1848 - 1999 / compiled by Gerald Kuerschner.

Kuerschner (Family)




era

Austin-Area District Looks for Digital/Blended Learning Program; Baltimore Seeks High School Literacy Program

The Round Rock Independent School District in Texas is looking for a digital curriculum and blended learning program. Baltimore is looking for a comprehensive high school literacy program.





era

Federal watchdog finds 'reasonable grounds to believe' vaccine doctor's ouster was retaliation, lawyers say

The Office of Special Counsel is recommending that ousted vaccine official Dr. Rick Bright be reinstated while it investigates his case, his lawyers announced Friday. Bright, while leading coronavirus vaccine development, was recently removed from his position as the director of the Department of Health and Human Services' Biomedical Advanced Research and Development Authority, and he alleges it was because he insisted congressional funding not go toward "drugs, vaccines, and other technologies that lack scientific merit" and limited the "broad use" of hydroxychloroquine after it was touted by President Trump. In a whistleblower complaint, he alleged "cronyism" at HHS. He has also alleged he was "pressured to ignore or dismiss expert scientific recommendations and instead to award lucrative contracts based on political connections." On Friday, Bright's lawyers said that the Office of Special Counsel has determined there are "reasonable grounds to believe" his firing was retaliation, The New York Times reports. The federal watchdog also recommended he be reinstated for 45 days to give the office "sufficient time to complete its investigation of Bright's allegations," CNN reports. The decision on whether to do so falls on Secretary of Health and Human Services Alex Azar, and Office of Special Counsel recommendations are "not binding," the Times notes.





era

High-Dimensional Posterior Consistency for Hierarchical Non-Local Priors in Regression

Xuan Cao, Kshitij Khare, Malay Ghosh.

Source: Bayesian Analysis, Volume 15, Number 1, 241--262.

Abstract:
The choice of tuning parameters in Bayesian variable selection is a critical problem in modern statistics. In particular, for Bayesian linear regression with non-local priors, the scale parameter in the non-local prior density is an important tuning parameter which reflects the dispersion of the non-local prior density around zero, and implicitly determines the size of the regression coefficients that will be shrunk to zero. Current approaches treat the scale parameter as given, and suggest choices based on prior coverage/asymptotic considerations. In this paper, we consider the fully Bayesian approach introduced by Wu (2016) with the pMOM non-local prior and an appropriate Inverse-Gamma prior on the tuning parameter to analyze the underlying theoretical properties. Under standard regularity assumptions, we establish strong model selection consistency in a high-dimensional setting, where $p$ is allowed to increase at a polynomial rate with $n$ or even at a sub-exponential rate with $n$. Through simulation studies, we demonstrate that our model selection procedure can outperform other Bayesian methods which treat the scale parameter as given, and commonly used penalized likelihood methods, in a range of simulation settings.




era

Scalable Bayesian Inference for the Inverse Temperature of a Hidden Potts Model

Matthew Moores, Geoff Nicholls, Anthony Pettitt, Kerrie Mengersen.

Source: Bayesian Analysis, Volume 15, Number 1, 1--27.

Abstract:
The inverse temperature parameter of the Potts model governs the strength of spatial cohesion and therefore has a major influence over the resulting model fit. A difficulty arises from the dependence of an intractable normalising constant on the value of this parameter and thus there is no closed-form solution for sampling from the posterior distribution directly. There is a variety of computational approaches for sampling from the posterior without evaluating the normalising constant, including the exchange algorithm and approximate Bayesian computation (ABC). A serious drawback of these algorithms is that they do not scale well for models with a large state space, such as images with a million or more pixels. We introduce a parametric surrogate model, which approximates the score function using an integral curve. Our surrogate model incorporates known properties of the likelihood, such as heteroskedasticity and critical temperature. We demonstrate this method using synthetic data as well as remotely-sensed imagery from the Landsat-8 satellite. We achieve up to a hundredfold improvement in the elapsed runtime, compared to the exchange algorithm or ABC. An open-source implementation of our algorithm is available in the R package bayesImageS.
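Of the two baselines mentioned, ABC rejection is the easiest to sketch. The toy below infers a Gaussian location parameter rather than the Potts inverse temperature, since simulating Potts states is exactly the expensive step the surrogate avoids; the prior, tolerance and summary statistic are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data with unknown mean theta = 2 (assumed for the demo).
y_obs = rng.normal(2.0, 1.0, size=200)
s_obs = y_obs.mean()                        # summary statistic

# ABC rejection: draw theta from the prior, simulate data, keep theta
# whenever the simulated summary lands within a tolerance of the observed
# one. No likelihood is ever evaluated.
accepted = []
for _ in range(20000):
    theta = rng.uniform(-5.0, 5.0)          # prior draw
    y_sim = rng.normal(theta, 1.0, size=200)
    if abs(y_sim.mean() - s_obs) < 0.05:    # tolerance on the summary
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))
```

The cost structure is visible even in the toy: every proposal requires a full synthetic dataset, which is why ABC becomes prohibitive when each simulation is a million-pixel Potts realisation.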




era

Hierarchical Normalized Completely Random Measures for Robust Graphical Modeling

Andrea Cremaschi, Raffaele Argiento, Katherine Shoemaker, Christine Peterson, Marina Vannucci.

Source: Bayesian Analysis, Volume 14, Number 4, 1271--1301.

Abstract:
Gaussian graphical models are useful tools for exploring network structures in multivariate normal data. In this paper we are interested in situations where data show departures from Gaussianity, therefore requiring alternative modeling distributions. The multivariate $t$-distribution, obtained by dividing each component of the data vector by a gamma random variable, is a straightforward generalization to accommodate deviations from normality such as heavy tails. Since different groups of variables may be contaminated to a different extent, Finegold and Drton (2014) introduced the Dirichlet $t$-distribution, where the divisors are clustered using a Dirichlet process. In this work, we consider a more general class of nonparametric distributions as the prior on the divisor terms, namely the class of normalized completely random measures (NormCRMs). To improve the effectiveness of the clustering, we propose modeling the dependence among the divisors through a nonparametric hierarchical structure, which allows for the sharing of parameters across the samples in the data set. This desirable feature enables us to cluster together different components of multivariate data in a parsimonious way. We demonstrate through simulations that this approach provides accurate graphical model inference, and apply it to a case study examining the dependence structure in radiomics data derived from The Cancer Imaging Atlas.




era

A Bayesian Nonparametric Multiple Testing Procedure for Comparing Several Treatments Against a Control

Luis Gutiérrez, Andrés F. Barrientos, Jorge González, Daniel Taylor-Rodríguez.

Source: Bayesian Analysis, Volume 14, Number 2, 649--675.

Abstract:
We propose a Bayesian nonparametric strategy to test for differences between a control group and several treatment regimes. Most of the existing tests for this type of comparison are based on the differences between location parameters. In contrast, our approach identifies differences across the entire distribution, avoids strong modeling assumptions over the distributions for each treatment, and accounts for multiple testing through the prior distribution on the space of hypotheses. The proposal is compared to other commonly used hypothesis testing procedures under simulated scenarios. Two real applications are also analyzed with the proposed methodology.




era

Fast Model-Fitting of Bayesian Variable Selection Regression Using the Iterative Complex Factorization Algorithm

Quan Zhou, Yongtao Guan.

Source: Bayesian Analysis, Volume 14, Number 2, 573--594.

Abstract:
Bayesian variable selection regression (BVSR) is able to jointly analyze genome-wide genetic datasets, but slow computation via Markov chain Monte Carlo (MCMC) has hampered its widespread usage. Here we present a novel iterative method to solve a special class of linear systems, which can increase the speed of BVSR model-fitting tenfold. The iterative method hinges on the complex factorization of the sum of two matrices, and the solution path resides in the complex domain (instead of the real domain). Compared to the Gauss-Seidel method, the complex factorization converges almost instantaneously and its error is several orders of magnitude smaller than that of the Gauss-Seidel method. More importantly, the error is always within the pre-specified precision, while that of the Gauss-Seidel method is not. For large problems with thousands of covariates, the complex factorization is 10–100 times faster than either the Gauss-Seidel method or the direct method via the Cholesky decomposition. In BVSR, one needs to repetitively solve large penalized regression systems whose design matrices change only slightly between adjacent MCMC steps. This slight change in the design matrix enables the adaptation of the iterative complex factorization method. The computational innovation will facilitate the widespread use of BVSR in reanalyzing genome-wide association datasets.
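The Gauss-Seidel baseline the abstract compares against is itself a short algorithm; a minimal sketch on a small diagonally dominant system (illustrative only, not the paper's complex-factorization method):

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=500):
    """Plain Gauss-Seidel iteration for Ax = b.

    Sweeps through the coordinates, updating each x[i] using the most
    recent values of the other coordinates; converges for diagonally
    dominant (and SPD) matrices.
    """
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, ord=np.inf) < tol:
            break
    return x

# A small ridge-type system (values assumed for the demo).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
x = gauss_seidel(A, b)
```

The per-sweep cost is one pass over the matrix, so when thousands of nearly identical systems must be solved along an MCMC path, the number of sweeps to convergence dominates the runtime, which is the quantity the paper's method attacks.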




era

Modeling Population Structure Under Hierarchical Dirichlet Processes

Lloyd T. Elliott, Maria De Iorio, Stefano Favaro, Kaustubh Adhikari, Yee Whye Teh.

Source: Bayesian Analysis, Volume 14, Number 2, 313--339.

Abstract:
We propose a Bayesian nonparametric model to infer population admixture, extending the hierarchical Dirichlet process to allow for correlation between loci due to linkage disequilibrium. Given multilocus genotype data from a sample of individuals, the proposed model allows inferring and classifying individuals as unadmixed or admixed, inferring the number of subpopulations ancestral to an admixed population and the population of origin of chromosomal regions. Our model does not assume any specific mutation process, and can be applied to most of the commonly used genetic markers. We present a Markov chain Monte Carlo (MCMC) algorithm to perform posterior inference from the model and we discuss some methods to summarize the MCMC output for the analysis of population admixture. Finally, we demonstrate the performance of the proposed model in a real application, using genetic data from the ectodysplasin-A receptor (EDAR) gene, which is considered to be ancestry-informative due to well-known variations in allele frequency as well as phenotypic effects across ancestry. The structure analysis of this dataset leads to the identification of a rare haplotype in Europeans. We also conduct a simulated experiment and show that our algorithm outperforms parametric methods.




era

Comment: Models as (Deliberate) Approximations

David Whitney, Ali Shojaie, Marco Carone.

Source: Statistical Science, Volume 34, Number 4, 591--598.




era

Generalized Multiple Importance Sampling

Víctor Elvira, Luca Martino, David Luengo, Mónica F. Bugallo.

Source: Statistical Science, Volume 34, Number 1, 129--155.

Abstract:
Importance sampling (IS) methods are broadly used to approximate posterior distributions or their moments. In the standard IS approach, samples are drawn from a single proposal distribution and weighted appropriately. However, since the performance of IS depends on the mismatch between the target and proposal distributions, several proposal densities are often employed for the generation of samples. Under this multiple importance sampling (MIS) scenario, an extensive literature has addressed the selection and adaptation of the proposal distributions, interpreting the sampling and weighting steps in different ways. In this paper, we establish a novel general framework with sampling and weighting procedures when more than one proposal is available. The new framework encompasses most relevant MIS schemes in the literature, and novel valid schemes appear naturally. All the MIS schemes are compared and ranked in terms of the variance of the associated estimators. Finally, we provide illustrative examples revealing that, even with a good choice of the proposal densities, a careful interpretation of the sampling and weighting procedures can make a significant difference in the performance of the method.
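One of the best-known MIS weighting choices the abstract alludes to is the deterministic-mixture (balance-heuristic) scheme, where each sample is weighted against the mixture of all proposals rather than only the proposal that generated it. A minimal sketch, with an illustrative standard-normal target and two Gaussian proposals of my own choosing (not an example from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def target_pdf(x):
    # Target: standard normal density (an unnormalized target would
    # also work with the self-normalized estimator below).
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

def norm_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

# Two proposals (mean, std) -- an illustrative, arbitrary choice.
proposals = [(-1.0, 1.5), (1.0, 1.5)]
N = 5_000  # samples drawn from each proposal

samples, weights = [], []
for mu, sig in proposals:
    x = rng.normal(mu, sig, N)
    # Balance-heuristic weight: the denominator is the equal-weight
    # mixture of ALL proposal densities, not just the generating one.
    mix = np.mean([norm_pdf(x, m, s) for m, s in proposals], axis=0)
    samples.append(x)
    weights.append(target_pdf(x) / mix)

x = np.concatenate(samples)
w = np.concatenate(weights)
# Self-normalized MIS estimate of E[X] under the target (true value 0).
est_mean = np.sum(w * x) / np.sum(w)
```

Replacing `mix` with the generating proposal's own density recovers the "standard" MIS weighting; the paper's framework compares such alternatives by the variance of the resulting estimators.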




era

Afferents and Homotypic Neighbors Regulate Horizontal Cell Morphology, Connectivity, and Retinal Coverage

Benjamin E. Reese
Mar 2, 2005; 25:2167-2175
Behavioral/Systems/Cognitive




era

Interactions of Top-Down and Bottom-Up Mechanisms in Human Visual Cortex

Stephanie McMains
Jan 12, 2011; 31:587-597
Behavioral/Systems/Cognitive




era

Neurodegeneration induced by beta-amyloid peptides in vitro: the role of peptide assembly state

CJ Pike
Apr 1, 1993; 13:1676-1687
Articles




era

Interaction between the C terminus of NMDA receptor subunits and multiple members of the PSD-95 family of membrane-associated guanylate kinases

M Niethammer
Apr 1, 1996; 16:2157-2163
Articles




era

Mice Deficient in Cellular Glutathione Peroxidase Show Increased Vulnerability to Malonate, 3-Nitropropionic Acid, and 1-Methyl-4-Phenyl-1,2,5,6-Tetrahydropyridine

Peter Klivenyi
Jan 1, 2000; 20:1-7
Cellular




era

Neurogenesis in the dentate gyrus of the adult rat: age-related decrease of neuronal progenitor proliferation

HG Kuhn
Mar 15, 1996; 16:2027-2033
Articles




era

Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex

EL Bienenstock
Jan 1, 1982; 2:32-48
Articles




era

Response of Neurons in the Lateral Intraparietal Area during a Combined Visual Discrimination Reaction Time Task

Jamie D. Roitman
Nov 1, 2002; 22:9475-9489
Behavioral




era

Building resilient growth requires international cooperation

Italian translation of the BIS Press Release on the presentation of the Annual Report (25 June 2017)




era

The Basel Committee finalises its review of the regulatory treatment of sovereign exposures without changing the existing rules and publishes a discussion paper

French translation of the press release about the Basel Committee publishing a discussion paper on "The regulatory treatment of sovereign exposures" (7 December 2017)




era

Central bank digital currencies could affect payments, monetary policy and financial stability

Spanish version of Press release about CPMI and the Markets Committee issuing a report on "Central bank digital currencies" (12 March 2018)




era

Israeli Jews at odds with liberal brethren in US




era

Programming the Be Operating System




era

Vulnerabilities in the international monetary and financial system

Speech by Mr Claudio Borio, Head of the Monetary and Economic Department of the BIS, at the OECD-G20 High Level Policy Seminar, Paris, 11 September 2019.




era

Contribution of NPY Y5 Receptors to the Reversible Structural Remodeling of Basolateral Amygdala Dendrites in Male Rats Associated with NPY-Mediated Stress Resilience

Endogenous neuropeptide Y (NPY) and corticotrophin-releasing factor (CRF) modulate the responses of the basolateral amygdala (BLA) to stress and are associated with the development of stress resilience and vulnerability, respectively. We characterized persistent effects of repeated NPY and CRF treatment on the structure and function of BLA principal neurons in a novel organotypic slice culture (OTC) model of male rat BLA, and examined the contributions of specific NPY receptor subtypes to these neural and behavioral effects. In BLA principal neurons within the OTCs, repeated NPY treatment caused persistent attenuation of excitatory input and induced dendritic hypotrophy via Y5 receptor activation; conversely, CRF increased excitatory input and induced hypertrophy of BLA principal neurons. Repeated treatment of OTCs with NPY followed by an identical treatment with CRF, or vice versa, inhibited or reversed all structural changes in OTCs. These structural responses to NPY or CRF required calcineurin or CaMKII, respectively. Finally, repeated intra-BLA injections of NPY or a Y5 receptor agonist increased social interaction, a validated behavioral measure of anxiety, and recapitulated the structural changes in BLA neurons seen in OTCs, while a Y5 receptor antagonist prevented NPY's effects on both behavior and structure. These results implicate the Y5 receptor in the long-term, anxiolytic-like effects of NPY in the BLA, consistent with an intrinsic role in stress buffering, and highlight a remarkable mechanism by which BLA neurons may adapt to different levels of stress. Moreover, BLA OTCs offer a robust model to study mechanisms associated with resilience and vulnerability to stress in the BLA.

SIGNIFICANCE STATEMENT Within the basolateral amygdala (BLA), neuropeptide Y (NPY) is associated with buffering the neural stress response induced by corticotropin releasing factor, and promoting stress resilience. We used a novel organotypic slice culture model of BLA, complemented with in vivo studies, to examine the cellular mechanisms associated with the actions of NPY. In organotypic slice cultures, repeated NPY treatment reduces the complexity of the dendritic extent of anxiogenic BLA principal neurons, making them less excitable. NPY, via activation of Y5 receptors, additionally inhibits and reverses the increases in dendritic extent and excitability induced by the stress hormone, corticotropin releasing factor. This NPY-mediated neuroplasticity indicates that resilience or vulnerability to stress may thus involve neuropeptide-mediated dendritic remodeling in BLA principal neurons.




era

Deletion of Voltage-Gated Calcium Channels in Astrocytes during Demyelination Reduces Brain Inflammation and Promotes Myelin Regeneration in Mice

To determine whether Cav1.2 voltage-gated Ca2+ channels contribute to astrocyte activation, we generated an inducible conditional knock-out mouse in which the Cav1.2 α subunit was deleted in GFAP-positive astrocytes. This astrocytic Cav1.2 knock-out mouse was tested in the cuprizone model of myelin injury and repair, which causes astrocyte and microglia activation in the absence of a lymphocytic response. Deletion of Cav1.2 channels in GFAP-positive astrocytes during cuprizone-induced demyelination leads to a significant reduction in the degree of astrocyte and microglia activation and proliferation in mice of either sex. Concomitantly, the production of proinflammatory factors such as TNFα, IL1β and TGFβ1 was significantly decreased in the corpus callosum and cortex of Cav1.2 knock-out mice throughout demyelination. Furthermore, this mild inflammatory environment promotes oligodendrocyte progenitor cell maturation and myelin regeneration across the remyelination phase of the cuprizone model. Similar results were found in animals treated with nimodipine, a Cav1.2 Ca2+ channel inhibitor with high affinity for the CNS. Mice of either sex injected with nimodipine during the demyelination stage of the cuprizone treatment displayed a reduced number of reactive astrocytes and showed faster and more efficient brain remyelination. Together, these results indicate that Cav1.2 Ca2+ channels play a crucial role in the induction and proliferation of reactive astrocytes during demyelination, and that attenuation of astrocytic voltage-gated Ca2+ influx may be an effective therapy to reduce brain inflammation and promote myelin recovery in demyelinating diseases.

SIGNIFICANCE STATEMENT Reducing voltage-gated Ca2+ influx in astrocytes during brain demyelination significantly attenuates brain inflammation and astrocyte reactivity. Furthermore, these changes promote myelin restoration and oligodendrocyte maturation throughout remyelination.




era

Carbon Monoxide, a Retrograde Messenger Generated in Postsynaptic Mushroom Body Neurons, Evokes Noncanonical Dopamine Release

Dopaminergic neurons innervate extensive areas of the brain and release dopamine (DA) onto a wide range of target neurons. However, DA release is also precisely regulated. In Drosophila melanogaster brain explant preparations, DA is released specifically onto α3/α'3 compartments of mushroom body (MB) neurons that have been coincidentally activated by cholinergic and glutamatergic inputs. The mechanism for this precise release has been unclear. Here we found that coincidentally activated MB neurons generate carbon monoxide (CO), which functions as a retrograde signal evoking local DA release from presynaptic terminals. CO production depends on activity of heme oxygenase in postsynaptic MB neurons, and CO-evoked DA release requires Ca2+ efflux through ryanodine receptors in DA terminals. CO is only produced in MB areas receiving coincident activation, and removal of CO using scavengers blocks DA release. We propose that DA neurons use two distinct modes of transmission to produce global and local DA signaling.

SIGNIFICANCE STATEMENT Dopamine (DA) is needed for various higher brain functions, including memory formation. However, DA neurons form extensive synaptic connections, while memory formation requires highly specific and localized DA release. Here we identify a mechanism through which DA release from presynaptic terminals is controlled by postsynaptic activity. Postsynaptic neurons activated by cholinergic and glutamatergic inputs generate carbon monoxide, which acts as a retrograde messenger inducing presynaptic DA release. Released DA is required for memory-associated plasticity. Our work identifies a novel mechanism that restricts DA release to the specific postsynaptic sites that require DA during memory formation.




era

Report: eradicate hunger and malnutrition

Eradicating hunger must be accompanied by strenuous efforts to end malnutrition and its devastating effects. That was a pivotal message at the launch of FAO’s key publication The State of Food and Agriculture, which this year focuses on Food systems for better nutrition. “FAO’s message is that we must strive for nothing less than the eradication of hunger and malnutrition,” said Director-General [...]




era

Empowerment is key to eradicating hunger

Global food security largely depends on smallholder family farms where in many regions of the world women play a crucial role as both producers and providers of food. Studies show that when women and other rural poor have better access to resources, the benefits are far-reaching. Families are healthier, more children attend school, agricultural productivity improves, incomes increase, and rural communities [...]




era

Inspiring the young generation to take action against climate change - in pictures

Climate change is what most of us perceive as the top global threat, and the dangers it poses affect present and future generations alike. How global warming is threatening the planet has been a theme in children’s books for all ages for some time. How everyone, especially today’s youth, can make a difference to the future of the world [...]




era

Release of 2019 Technical Cooperation Programme Report

The 2019 Report of the Technical Cooperation Programme (TCP) examines the role of the TCP to deliver FAO technical assistance for agriculture, food and nutrition in response to countries’ most [...]




era

"chimeras of experience"