son Report on Harrogate visit / by L. S. P. Davidson. By search.wellcomelibrary.org Published On :: England : Harrogate Corporation, Wells and Baths Department, 1945. Full Article
son Scientific report / Beatson Institute for Cancer Research. By search.wellcomelibrary.org Published On :: Glasgow : Beatson Institute for Cancer Research, 2008- Full Article
son Drug abuse treatment client characteristics and pretreatment behaviors : 1979-1981 TOPS admission cohorts / Robert L. Hubbard, Robert M. Bray, Elizabeth R. Cavanaugh, J. Valley Rachal, S. Gail Craddock, James J. Collins, Margaret Allison ; Research Triang By search.wellcomelibrary.org Published On :: Rockville, Maryland : National Institute on Drug Abuse, 1986. Full Article
son Suicide and depression among drug abusers / Margaret Allison, Robert L. Hubbard, Harold M. Ginzburg. By search.wellcomelibrary.org Published On :: Rockville, Maryland : National Institute on Drug Abuse, 1985. Full Article
son Treatment process in methadone, residential, and outpatient drug free programs / Margaret Allison, Robert L. Hubbard, J. Valley Rachal. By search.wellcomelibrary.org Published On :: Rockville, Maryland : National Institute on Drug Abuse, 1985. Full Article
son Management information systems in the drug field / edited by George M. Beschner, Neil H. Sampson, National Institute on Drug Abuse ; and Christopher D'Amanda, Coordinating Office for Drug and Alcohol Abuse, City of Philadelphia. By search.wellcomelibrary.org Published On :: Rockville, Maryland : National Institute on Drug Abuse, 1979. Full Article
son Inhalant use and treatment / by Terry Mason. By search.wellcomelibrary.org Published On :: Rockville, Maryland : National Institute on Drug Abuse, 1979. Full Article
son An evaluation of the California civil addict program / by William H. McGlothlin, M. Douglas Anglin, Bruce D. Wilson. By search.wellcomelibrary.org Published On :: Rockville, Maryland : National Institute on Drug Abuse, 1977. Full Article
son The incidence of driving under the influence of drugs, 1985 : an update of the state of knowledge / [Richard P. Compton and Theodore E. Anderson]. By search.wellcomelibrary.org Published On :: Springfield, Virginia : National Technical Information Service, 1985. Full Article
son Ferguson family papers, 1885-1993 By feedproxy.google.com Published On :: 2/10/2015 12:00:00 AM Full Article
son Selected Poems of Henry Lawson: Correspondence: Vol.1 By feedproxy.google.com Published On :: 29/10/2015 12:00:00 AM Full Article
son Sabrina Ionescu, Ruthy Hebard, Satou Sabally on staying connected, WNBA Draft, Oregon's historic season By sports.yahoo.com Published On :: Thu, 09 Apr 2020 16:27:12 GMT Pac-12 Networks' Ashley Adamson catches up with Oregon's "Big 3" of Sabrina Ionescu, Ruthy Hebard and Satou Sabally to hear how they're adjusting to the new world without sports while still preparing for the WNBA Draft on April 17. They also share how they're staying hungry for basketball during the hiatus. Full Article video Sports
son Mississippi State hires Nikki McCray-Penson as women's coach By sports.yahoo.com Published On :: Sat, 11 Apr 2020 19:32:26 GMT Mississippi State hired former Old Dominion women’s basketball coach Nikki McCray-Penson to replace Vic Schaefer as the Bulldogs’ head coach. Athletic director John Cohen called McCray-Penson “a proven winner who will lead one of the best programs in the nation” on the department’s website. McCray-Penson, a former Tennessee star and Women’s Basketball Hall of Famer, said it’s been a dream to coach in the Southeastern Conference and she’s “grateful and blessed for this incredible honor and opportunity.” Full Article article Sports
son Neyman-Pearson classification: parametrics and sample size requirement By Published On :: 2020 The Neyman-Pearson (NP) paradigm in binary classification seeks classifiers that achieve a minimal type II error while keeping the prioritized type I error controlled under some user-specified level $\alpha$. This paradigm serves naturally in applications such as severe disease diagnosis and spam detection, where people have clear priorities between the two error types. Recently, Tong, Feng, and Li (2018) proposed a nonparametric umbrella algorithm that adapts all scoring-type classification methods (e.g., logistic regression, support vector machines, random forest) to respect the given type I error (i.e., conditional probability of classifying a class $0$ observation as class $1$ under the 0-1 coding) upper bound $\alpha$ with high probability, without specific distributional assumptions on the features and the responses. Universal as the umbrella algorithm is, it demands an explicit minimum sample size requirement on class $0$, which is often the scarcer class, as in rare disease diagnosis applications. In this work, we employ the parametric linear discriminant analysis (LDA) model and propose a new parametric thresholding algorithm, which does not need the minimum sample size requirement on class $0$ observations and thus is suitable for small-sample applications such as rare disease diagnosis. Leveraging both the existing nonparametric and the newly proposed parametric thresholding rules, we propose four LDA-based NP classifiers, for both low- and high-dimensional settings. On the theoretical front, we prove NP oracle inequalities for one proposed classifier, where the rate for excess type II error benefits from the explicit parametric model assumption. 
Furthermore, as NP classifiers involve a sample splitting step of class $0$ observations, we construct a new adaptive sample splitting scheme that can be applied universally to NP classifiers, and this adaptive strategy reduces the type II error of these classifiers. The proposed NP classifiers are implemented in the R package nproc. Full Article
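The minimum class-$0$ sample size requirement discussed above can be made concrete with the order-statistic calculation behind umbrella-style thresholding: pick the smallest rank $k$ such that thresholding at the $k$-th largest held-out class-$0$ score violates the type I error level $\alpha$ with probability at most $\delta$. A minimal sketch, not the paper's algorithm; the function name and $(\alpha, \delta)$ interface are illustrative:

```python
from math import comb

def np_threshold_order(n, alpha, delta):
    """Smallest 1-indexed rank k such that thresholding at the k-th largest
    of n held-out class-0 scores keeps the type I error below alpha with
    probability at least 1 - delta. Returns None if n is too small."""
    for k in range(1, n + 1):
        # probability that the k-th order-statistic threshold violates alpha:
        # P(Binomial(n, 1 - alpha) >= k)
        viol = sum(comb(n, j) * (1 - alpha) ** j * alpha ** (n - j)
                   for j in range(k, n + 1))
        if viol <= delta:
            return k
    return None  # no rank gives the guarantee: class-0 sample too small
```

For $\alpha=\delta=0.05$ no rank works until $n \geq 59$ (since $(1-\alpha)^{n} \leq \delta$ forces $n \geq \log\delta/\log(1-\alpha) \approx 58.4$), which is exactly the kind of minimum class-$0$ sample size the parametric LDA thresholding avoids.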
son Q&A with Adam Ferguson By feedproxy.google.com Published On :: Tue, 05 May 2020 05:43:41 +0000 Each year the Library hosts the popular World Press Photo exhibition, bringing together award-winning photographs from t Full Article
son Random environment binomial thinning integer-valued autoregressive process with Poisson or geometric marginal By projecteuclid.org Published On :: Mon, 04 May 2020 04:00 EDT Zhengwei Liu, Qi Li, Fukang Zhu. Source: Brazilian Journal of Probability and Statistics, Volume 34, Number 2, 251--272.Abstract: To predict time series of counts with small values and remarkable fluctuations, an available model is the $r$ states random environment process based on the negative binomial thinning operator and the geometric marginal. However, we argue that the aforementioned model may suffer from two drawbacks. First, under the condition of no prior information, the overdispersed property of the geometric distribution may cause the predictions to fluctuate greatly. Second, because of the constraints on the model parameters, some estimated parameters are close to zero in real-data examples, which may not objectively reveal the correlation relationship. To address the first drawback, an $r$ states random environment process based on the binomial thinning operator and the Poisson marginal is introduced. To address the second drawback, we propose a generalized $r$ states random environment integer-valued autoregressive model based on the binomial thinning operator to model fluctuations of data. Yule–Walker and conditional maximum likelihood estimates are considered and their performances are assessed via simulation studies. Two real-data sets are analyzed to illustrate the better performances of the proposed models compared with some existing models. Full Article
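The binomial thinning operator at the heart of these models is simple to state in code: $\alpha \circ X$ keeps each of the $X$ counts independently with probability $\alpha$. Below is a minimal simulation sketch of a single-regime INAR(1) recursion $X_t = \alpha \circ X_{t-1} + \varepsilon_t$ with Poisson innovations; the paper's full model adds the $r$ random-environment states, and all function names here are illustrative:

```python
import random
from math import exp

def poisson_draw(lam, rng):
    # Knuth's multiplication method; adequate for small lam
    k, p, thresh = 0, 1.0, exp(-lam)
    while True:
        p *= rng.random()
        if p <= thresh:
            return k
        k += 1

def binomial_thinning(alpha, x, rng):
    """alpha ∘ x: each of the x counts survives independently w.p. alpha."""
    return sum(rng.random() < alpha for _ in range(x))

def simulate_inar1(alpha, lam, n, seed=0):
    """X_t = alpha ∘ X_{t-1} + eps_t with eps_t ~ Poisson(lam);
    the stationary marginal is Poisson with mean lam / (1 - alpha)."""
    rng = random.Random(seed)
    x = [poisson_draw(lam, rng)]
    for _ in range(n - 1):
        x.append(binomial_thinning(alpha, x[-1], rng) + poisson_draw(lam, rng))
    return x
```

With `alpha=0.5` and `lam=2.0` the simulated path hovers around the stationary mean $2/(1-0.5)=4$.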
son Application of weighted and unordered majorization orders in comparisons of parallel systems with exponentiated generalized gamma components By projecteuclid.org Published On :: Mon, 03 Feb 2020 04:00 EST Abedin Haidari, Amir T. Payandeh Najafabadi, Narayanaswamy Balakrishnan. Source: Brazilian Journal of Probability and Statistics, Volume 34, Number 1, 150--166.Abstract: Consider two parallel systems, say $A$ and $B$, with respective lifetimes $T_{1}$ and $T_{2}$, wherein the independent component lifetimes of each system follow an exponentiated generalized gamma distribution with possibly different exponential shape and scale parameters. We show here that $T_{2}$ is smaller than $T_{1}$ with respect to the usual stochastic order (reversed hazard rate order) if the vector of logarithms (the main vector) of the scale parameters of System $B$ is weakly weighted majorized by that of System $A$, and if the vector of exponential shape parameters of System $A$ is unordered majorized by that of System $B$. By means of some examples, we show that the above results cannot be extended to the hazard rate and likelihood ratio orders. However, when the scale parameters of each system divide into two homogeneous groups, we verify that the usual stochastic and reversed hazard rate orders can be extended, respectively, to the hazard rate and likelihood ratio orders. The established results complete and strengthen some of the known results in the literature. Full Article
son Bayesian approach for the zero-modified Poisson–Lindley regression model By projecteuclid.org Published On :: Mon, 26 Aug 2019 04:00 EDT Wesley Bertoli, Katiane S. Conceição, Marinho G. Andrade, Francisco Louzada. Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 4, 826--860.Abstract: The primary goal of this paper is to introduce the zero-modified Poisson–Lindley regression model as an alternative to model overdispersed count data exhibiting inflation or deflation of zeros in the presence of covariates. The zero-modification is incorporated by considering that a zero-truncated process produces positive observations and consequently, the proposed model can be fitted without any previous information about the zero-modification present in a given dataset. A fully Bayesian approach based on the g-prior method has been considered for inference. An intensive Monte Carlo simulation study has been conducted to evaluate the performance of the developed methodology and the maximum likelihood estimators. The proposed model was considered for the analysis of a real dataset on the number of bids received by $126$ U.S. firms between 1978 and 1985, and the impact of choosing different prior distributions for the regression coefficients has been studied. A sensitivity analysis to detect influential points has been performed based on the Kullback–Leibler divergence. A general comparison with some well-known regression models for discrete data has been presented. Full Article
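The zero-modification mechanism itself is easy to sketch. The pmf below uses a plain Poisson base rather than the paper's Poisson–Lindley base (a simplification for brevity); a positive $\pi$ inflates zeros, a moderately negative $\pi$ deflates them, and the total mass stays one either way:

```python
from math import exp, factorial

def zm_poisson_pmf(k, lam, pi):
    """Zero-modified Poisson pmf: mixes a point mass at zero (weight pi)
    with a Poisson(lam) base. Sketch only: the paper's base distribution
    is Poisson-Lindley, not plain Poisson."""
    pois = exp(-lam) * lam ** k / factorial(k)
    return pi + (1 - pi) * pois if k == 0 else (1 - pi) * pois
```

Note that the mixture weight cancels in the sum, so the pmf is proper for any admissible $\pi$, which is what lets the model capture inflation and deflation with one parameter.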
son Time series of count data: A review, empirical comparisons and data analysis By projecteuclid.org Published On :: Mon, 26 Aug 2019 04:00 EDT Glaura C. Franco, Helio S. Migon, Marcos O. Prates. Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 4, 756--781.Abstract: Observation and parameter driven models are commonly used in the literature to analyse time series of counts. In this paper, we study the characteristics of a variety of models and point out the main differences and similarities among these procedures, concerning parameter estimation, model fitting and forecasting. Alternatively to the literature, all inference was performed under the Bayesian paradigm. The models are fitted with a latent AR($p$) process in the mean, which accounts for autocorrelation in the data. An extensive simulation study shows that the estimates for the covariate parameters are remarkably similar across the different models. However, estimates for autoregressive coefficients and forecasts of future values depend heavily on the underlying process which generates the data. A real data set of bankruptcy in the United States is also analysed. Full Article
son A Jackson network under general regime By projecteuclid.org Published On :: Mon, 10 Jun 2019 04:04 EDT Yair Y. Shaki. Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 3, 532--548.Abstract: We consider a Jackson network in a general heavy traffic diffusion regime with the $\alpha$-parametrization. We also assume that each customer may abandon the system while waiting. We show that in this regime the queue-length process converges to a multi-dimensional regulated Ornstein–Uhlenbeck process. Full Article
son The equivalence of dynamic and static asset allocations under the uncertainty caused by Poisson processes By projecteuclid.org Published On :: Mon, 14 Jan 2019 04:01 EST Yong-Chao Zhang, Na Zhang. Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 1, 184--191.Abstract: We investigate the equivalence of dynamic and static asset allocations in the case where the price process of a risky asset is driven by a Poisson process. Under some mild conditions, we obtain a necessary and sufficient condition for the equivalence of dynamic and static asset allocations. In addition, we provide a simple sufficient condition for the equivalence. Full Article
son A comparison of spatial predictors when datasets could be very large By projecteuclid.org Published On :: Tue, 19 Jul 2016 14:13 EDT Jonathan R. Bradley, Noel Cressie, Tao Shi. Source: Statistics Surveys, Volume 10, 100--131.Abstract: In this article, we review and compare a number of methods of spatial prediction, where each method is viewed as an algorithm that processes spatial data. To demonstrate the breadth of available choices, we consider both traditional and more-recently-introduced spatial predictors. Specifically, in our exposition we review: traditional stationary kriging, smoothing splines, negative-exponential distance-weighting, fixed rank kriging, modified predictive processes, a stochastic partial differential equation approach, and lattice kriging. This comparison is meant to provide a service to practitioners wishing to decide between spatial predictors. Hence, we provide technical material for the unfamiliar, which includes the definition and motivation for each (deterministic and stochastic) spatial predictor. We use a benchmark dataset of $\mathrm{CO}_{2}$ data from NASA’s AIRS instrument to address computational efficiencies that include CPU time and memory usage. Furthermore, the predictive performance of each spatial predictor is assessed empirically using a hold-out subset of the AIRS data. Full Article
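Of the predictors surveyed, negative-exponential distance-weighting is the simplest to state: predict at a new location by a weighted average of the observations, with weights decaying exponentially in distance. A toy sketch under assumed names (the `theta` decay parameter and the function signature are illustrative, not the survey's notation):

```python
from math import exp, dist

def ned_weight_predict(s0, sites, values, theta=1.0):
    """Negative-exponential distance-weighted prediction at location s0:
    a weighted average of observed values with weights exp(-theta * d),
    where d is the Euclidean distance to each observation site."""
    w = [exp(-theta * dist(s0, s)) for s in sites]
    return sum(wi * v for wi, v in zip(w, values)) / sum(w)
```

A large `theta` makes the predictor nearly interpolate the closest observation, while a small `theta` averages broadly; this deterministic predictor needs no covariance model, which is part of why it scales to very large datasets.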
son Errata: A survey of Bayesian predictive methods for model assessment, selection and comparison By projecteuclid.org Published On :: Wed, 26 Feb 2014 09:10 EST Aki Vehtari, Janne Ojanen. Source: Statistics Surveys, Volume 8, , 1--1.Abstract: Errata for “A survey of Bayesian predictive methods for model assessment, selection and comparison” by A. Vehtari and J. Ojanen, Statistics Surveys , 6 (2012), 142–228. doi:10.1214/12-SS102. Full Article
son A survey of Bayesian predictive methods for model assessment, selection and comparison By projecteuclid.org Published On :: Thu, 27 Dec 2012 12:22 EST Aki Vehtari, Janne Ojanen. Source: Statist. Surv., Volume 6, 142--228.Abstract: To date, several methods exist in the statistical literature for model assessment, which purport themselves specifically as Bayesian predictive methods. The decision theoretic assumptions on which these methods are based are not always clearly stated in the original articles, however. The aim of this survey is to provide a unified review of Bayesian predictive model assessment and selection methods, and of methods closely related to them. We review the various assumptions that are made in this context and discuss the connections between different approaches, with an emphasis on how each method approximates the expected utility of using a Bayesian model for the purpose of predicting future data. Full Article
son Unsupervised Pre-trained Models from Healthy ADLs Improve Parkinson's Disease Classification of Gait Patterns. (arXiv:2005.02589v2 [cs.LG] UPDATED) By arxiv.org Published On :: Application and use of deep learning algorithms for different healthcare applications is gaining interest at a steady pace. However, use of such algorithms can prove to be challenging as they require large amounts of training data that capture different possible variations. This makes it difficult to use them in a clinical setting, since in most health applications researchers often have to work with limited data. Limited data can cause the deep learning model to over-fit. In this paper, we ask how we can use data from a different environment and a different use-case, with widely differing data distributions. We exemplify this use case by using single-sensor accelerometer data from healthy subjects performing activities of daily living (ADLs) - the source dataset - to extract features relevant to multi-sensor accelerometer gait data (the target dataset) for Parkinson's disease classification. We train the pre-trained model using the source dataset and use it as a feature extractor. We show that the features extracted for the target dataset can be used to train an effective classification model. Our pre-trained source model consists of a convolutional autoencoder, and the target classification model is a simple multi-layer perceptron model. We explore two different pre-trained source models, trained using different activity groups, and analyze the influence the choice of pre-trained model has over the task of Parkinson's disease classification. Full Article
son A Global Benchmark of Algorithms for Segmenting Late Gadolinium-Enhanced Cardiac Magnetic Resonance Imaging. (arXiv:2004.12314v3 [cs.CV] UPDATED) By arxiv.org Published On :: Segmentation of cardiac images, particularly late gadolinium-enhanced magnetic resonance imaging (LGE-MRI) widely used for visualizing diseased cardiac structures, is a crucial first step for clinical diagnosis and treatment. However, direct segmentation of LGE-MRIs is challenging due to their attenuated contrast. Since most clinical studies have relied on manual and labor-intensive approaches, automatic methods are of high interest, particularly optimized machine learning approaches. To address this, we organized the "2018 Left Atrium Segmentation Challenge" using 154 3D LGE-MRIs, currently the world's largest cardiac LGE-MRI dataset, and associated labels of the left atrium segmented by three medical experts, ultimately attracting the participation of 27 international teams. In this paper, extensive analysis of the submitted algorithms using technical and biological metrics was performed through subgroup analysis and hyper-parameter analysis, offering an overall picture of the major design choices of convolutional neural networks (CNNs) and practical considerations for achieving state-of-the-art left atrium segmentation. Results show the top method achieved a dice score of 93.2% and a mean surface-to-surface distance of 0.7 mm, significantly outperforming the prior state-of-the-art. In particular, our analysis demonstrated that double, sequentially used CNNs, in which a first CNN is used for automatic region-of-interest localization and a subsequent CNN is used for refined regional segmentation, achieved far superior results than traditional methods and pipelines containing single CNNs. 
This large-scale benchmarking study makes a significant step towards much-improved segmentation methods for cardiac LGE-MRIs, and will serve as an important benchmark for evaluating and comparing the future works in the field. Full Article
son Convergence and inference for mixed Poisson random sums. (arXiv:2005.03187v1 [math.PR]) By arxiv.org Published On :: In this paper we obtain the limit distribution for partial sums with a random number of terms following a class of mixed Poisson distributions. The resulting weak limit is a mixture between a normal distribution and an exponential family, which we call normal exponential family (NEF) laws. A new stability concept is introduced and a relationship between $\alpha$-stable distributions and NEF laws is established. We propose estimation of the parameters of the NEF models through the method of moments and also by the maximum likelihood method, which is performed via an Expectation-Maximization algorithm. Monte Carlo simulation studies are addressed to check the performance of the proposed estimators, and an empirical illustration on financial market data is presented. Full Article
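A quick simulation makes the random-sum setup concrete. With an exponential mixing distribution, $N\mid\Lambda \sim \mathrm{Poisson}(\Lambda)$ and $\Lambda$ exponential make $N$ geometric, and Wald's identity gives $E[S] = E[N]\,E[X]$ for $S = X_1 + \cdots + X_N$. A sketch under assumed names (the summand sampler interface is illustrative):

```python
import random
from math import exp

def random_sum_mixed_poisson(mean_mixing, x_sampler, rng):
    """One draw of S = X_1 + ... + X_N where N is mixed Poisson:
    N | L ~ Poisson(L) with L ~ Exponential(mean=mean_mixing),
    so N is geometric with mean mean_mixing."""
    L = rng.expovariate(1.0 / mean_mixing)
    # Poisson(L) via Knuth's multiplication method
    n, p, thresh = 0, 1.0, exp(-L)
    while True:
        p *= rng.random()
        if p <= thresh:
            break
        n += 1
    return sum(x_sampler(rng) for _ in range(n))
```

With uniform summands ($E[X]=0.5$) and `mean_mixing=3.0`, the sample mean of many draws should sit near $3 \times 0.5 = 1.5$.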
son A comparison of group testing architectures for COVID-19 testing. (arXiv:2005.03051v1 [stat.ME]) By arxiv.org Published On :: An important component of every country's COVID-19 response is fast and efficient testing -- to identify and isolate cases, as well as for early detection of local hotspots. For many countries, producing a sufficient number of tests has been a serious limiting factor in their efforts to control COVID-19 infections. Group testing is a well-established mathematical tool, which can provide a serious and rapid improvement to this situation. In this note, we compare several well-established group testing schemes in the context of qPCR testing for COVID-19. We include example calculations, where we indicate which testing architectures yield the greatest efficiency gains in various settings. We find that for identification of individuals with COVID-19, array testing is usually the best choice, while for estimation of COVID-19 prevalence rates in the total population, Gibbs-Gower testing usually provides the most accurate estimates given a fixed and relatively small number of tests. This note is intended as a helpful handbook for labs implementing group testing methods. Full Article
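As an example of the kind of efficiency calculation the note describes, the classical two-stage (Dorfman) scheme already shows large gains at low prevalence: pool $k$ samples, test each pool once, and retest every member of a positive pool individually. A self-contained sketch (function names are illustrative; the note's array and other schemes are not reproduced here):

```python
def dorfman_tests_per_person(p, k):
    """Expected number of tests per individual under two-stage Dorfman
    pooling with prevalence p and pool size k: one pooled test per group
    of k, plus k individual retests whenever the pool is positive."""
    return 1.0 / k + 1.0 - (1.0 - p) ** k

def best_pool_size(p, k_max=100):
    """Pool size minimizing expected tests per person (independence assumed)."""
    return min(range(2, k_max + 1),
               key=lambda k: dorfman_tests_per_person(p, k))
```

At 1% prevalence the optimal pool size is 11 and the expected cost drops below 0.2 tests per person, a fivefold gain; at 30% prevalence pooling barely helps, which is why the choice of architecture depends so strongly on the setting.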
son The neuroethology of birdsong By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Callnumber: OnlineISBN: 9783030346836 (electronic bk.) Full Article
son Personalized food intervention and therapy for autism spectrum disorder management By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Callnumber: OnlineISBN: 9783030304027 (electronic bk.) Full Article
son Neonatal lung ultrasonography By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Callnumber: OnlineISBN: 9789402415490 (electronic bk.) Full Article
son Handbook of geotechnical testing : basic theory, procedures and comparison of standards By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Author: Li, Yanrong (Writer on geology), author.Callnumber: OnlineISBN: 0429323743 electronic book Full Article
son Active ranking from pairwise comparisons and when parametric assumptions do not help By projecteuclid.org Published On :: Wed, 30 Oct 2019 22:03 EDT Reinhard Heckel, Nihar B. Shah, Kannan Ramchandran, Martin J. Wainwright. Source: The Annals of Statistics, Volume 47, Number 6, 3099--3126.Abstract: We consider sequential or active ranking of a set of $n$ items based on noisy pairwise comparisons. Items are ranked according to the probability that a given item beats a randomly chosen item, and ranking refers to partitioning the items into sets of prespecified sizes according to their scores. This notion of ranking includes as special cases the identification of the top-$k$ items and the total ordering of the items. We first analyze a sequential ranking algorithm that counts the number of comparisons won, and uses these counts to decide whether to stop, or to compare another pair of items, chosen based on confidence intervals specified by the data collected up to that point. We prove that this algorithm succeeds in recovering the ranking using a number of comparisons that is optimal up to logarithmic factors. This guarantee does not depend on whether the underlying pairwise probability matrix satisfies a particular structural property, unlike a significant body of past work on pairwise ranking based on parametric models such as the Thurstone or Bradley–Terry–Luce models. It has been a long-standing open question as to whether or not imposing these parametric assumptions allows for improved ranking algorithms. For stochastic comparison models, in which the pairwise probabilities are bounded away from zero, our second contribution is to resolve this issue by proving a lower bound for parametric models. This shows, perhaps surprisingly, that these popular parametric modeling choices offer at most logarithmic gains for stochastic comparisons. Full Article
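The counting-based idea can be sketched as a successive-elimination loop: compare each surviving item against a uniformly random opponent, track empirical win rates, and drop an item once its upper confidence bound falls below another item's lower bound. This is a simplified illustration with a generic Hoeffding-style radius, not the paper's exact algorithm or constants:

```python
import random
from math import log, sqrt

def top_item_by_counts(P, delta=0.05, max_rounds=200000, seed=1):
    """Identify the item with the highest score tau_i = avg_j P(i beats j)
    by win counting with confidence-interval elimination. P[i][j] is the
    probability that item i beats item j (P[j][i] = 1 - P[i][j])."""
    rng = random.Random(seed)
    n = len(P)
    alive = set(range(n))
    wins, plays = [0] * n, [0] * n
    for t in range(1, max_rounds + 1):
        for i in list(alive):
            j = rng.randrange(n)
            if j == i:
                continue  # compare against a *different* random item
            wins[i] += rng.random() < P[i][j]
            plays[i] += 1
        def bounds(i):
            m = max(plays[i], 1)
            # Hoeffding radius with a crude union bound over items and rounds
            r = sqrt(log(4 * n * t * t / delta) / (2 * m))
            return wins[i] / m - r, wins[i] / m + r
        lo = {i: bounds(i)[0] for i in alive}
        hi = {i: bounds(i)[1] for i in alive}
        best_lo = max(lo.values())
        alive = {i for i in alive if hi[i] >= best_lo}
        if len(alive) == 1:
            return alive.pop()
    return max(alive, key=lambda i: wins[i] / max(plays[i], 1))
```

On a small instance with a clearly dominant item the loop terminates after a few hundred comparisons; harder instances with near-ties drive the comparison count up, which is exactly the score-gap dependence the optimality result quantifies.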
son A comparison of principal component methods between multiple phenotype regression and multiple SNP regression in genetic association studies By projecteuclid.org Published On :: Wed, 15 Apr 2020 22:05 EDT Zhonghua Liu, Ian Barnett, Xihong Lin. Source: The Annals of Applied Statistics, Volume 14, Number 1, 433--451.Abstract: Principal component analysis (PCA) is a popular method for dimension reduction in unsupervised multivariate analysis. However, existing ad hoc uses of PCA in both multivariate regression (multiple outcomes) and multiple regression (multiple predictors) lack theoretical justification. The differences in the statistical properties of PCAs in these two regression settings are not well understood. In this paper we provide theoretical results on the power of PCA in genetic association testing in both multiple phenotype and SNP-set settings. The multiple phenotype setting refers to the case when one is interested in studying the association between a single SNP and multiple phenotypes as outcomes. The SNP-set setting refers to the case when one is interested in studying the association between multiple SNPs in a SNP set and a single phenotype as the outcome. We demonstrate analytically that the properties of the PC-based analysis in these two regression settings are substantially different. We show that the lower-order PCs, that is, PCs with large eigenvalues, are generally preferred and lead to a higher power in the SNP-set setting, while the higher-order PCs, that is, PCs with small eigenvalues, are generally preferred in the multiple phenotype setting. We also investigate the power of three other popular statistical methods, the Wald test, the variance component test and the minimum $p$-value test, in both multiple phenotype and SNP-set settings. We use theoretical power, simulation studies, and two real data analyses to validate our findings. Full Article
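The lower-order/higher-order distinction rests on the fact that the leading PC is the top eigenvector of the sample covariance matrix, which captures the dominant variance direction. A dependency-free power-iteration sketch (the function name is assumed, not from the paper):

```python
def top_principal_component(X, iters=200):
    """Leading principal component of the data X (list of rows) via power
    iteration on the sample covariance matrix of the centered data."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]
    # sample covariance matrix (divisor n - 1)
    C = [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

For data lying along the diagonal direction $(1,1)$ the leading PC converges to $(1/\sqrt{2}, 1/\sqrt{2})$ up to sign; higher-order PCs are the remaining eigenvectors, obtained in practice by deflation or a full eigendecomposition.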
son Fire seasonality identification with multimodality tests By projecteuclid.org Published On :: Wed, 27 Nov 2019 22:01 EST Jose Ameijeiras-Alonso, Akli Benali, Rosa M. Crujeiras, Alberto Rodríguez-Casal, José M. C. Pereira. Source: The Annals of Applied Statistics, Volume 13, Number 4, 2120--2139.Abstract: Understanding the role of vegetation fires in the Earth system is an important environmental problem. Although fire occurrence is influenced by natural factors, human activity related to land use and management has altered the temporal patterns of fire in several regions of the world. Hence, for better insight into fire regimes it is of special interest to analyze where human activity has altered fire seasonality. To do so, multimodality tests are a useful tool for determining the number of annual fire peaks. The periodicity of fires and their complex distributional features motivate the use of nonparametric circular statistics. The unsatisfactory performance of previous circular nonparametric proposals for testing multimodality justifies the introduction of a new approach, considering an adapted version of the excess mass statistic, jointly with a bootstrap calibration algorithm. A systematic application of the test on the Russia–Kazakhstan area is presented in order to determine how many fire peaks can be identified in this region. A False Discovery Rate correction, accounting for the spatial dependence of the data, is also required. Full Article
son Modeling seasonality and serial dependence of electricity price curves with warping functional autoregressive dynamics By projecteuclid.org Published On :: Wed, 16 Oct 2019 22:03 EDT Ying Chen, J. S. Marron, Jiejie Zhang. Source: The Annals of Applied Statistics, Volume 13, Number 3, 1590--1616.Abstract: Electricity prices are high dimensional, serially dependent and have seasonal variations. We propose a Warping Functional AutoRegressive (WFAR) model that simultaneously accounts for the cross time-dependence and seasonal variations of the large dimensional data. In particular, electricity price curves are obtained by smoothing over the $24$ discrete hourly prices on each day. In the functional domain, seasonal phase variations are separated from level amplitude changes in a warping process with the Fisher–Rao distance metric, and the aligned (season-adjusted) electricity price curves are modeled in the functional autoregression framework. In a real application, the WFAR model provides superior out-of-sample forecast accuracy in both a normal functioning market, Nord Pool, and an extreme situation, the California market. The forecast performance as well as the relative accuracy improvement are stable for different markets and different time periods. Full Article
son The maximal degree in a Poisson–Delaunay graph By projecteuclid.org Published On :: Fri, 31 Jan 2020 04:06 EST Gilles Bonnet, Nicolas Chenavier. Source: Bernoulli, Volume 26, Number 2, 948--979.Abstract: We investigate the maximal degree in a Poisson–Delaunay graph in $\mathbf{R}^{d}$, $d\geq 2$, over all nodes in the window $\mathbf{W}_{\rho}:=\rho^{1/d}[0,1]^{d}$ as $\rho$ goes to infinity. The exact order of this maximum is provided in any dimension. In the particular setting $d=2$, we show that this quantity is concentrated on two consecutive integers with high probability. A weaker version of this result is discussed when $d\geq 3$. Full Article
son A new method for obtaining sharp compound Poisson approximation error estimates for sums of locally dependent random variables By projecteuclid.org Published On :: Thu, 05 Aug 2010 15:41 EDT Michael V. Boutsikas, Eutichia Vaggelatou. Source: Bernoulli, Volume 16, Number 2, 301--330.Abstract: Let $X_{1},X_{2},\ldots,X_{n}$ be a sequence of independent or locally dependent random variables taking values in $\mathbb{Z}_{+}$. In this paper, we derive sharp bounds, via a new probabilistic method, for the total variation distance between the distribution of the sum $\sum_{i=1}^{n}X_{i}$ and an appropriate Poisson or compound Poisson distribution. These bounds include a factor which depends on the smoothness of the approximating Poisson or compound Poisson distribution. This “smoothness factor” is of order $O(\sigma^{-2})$, according to a heuristic argument, where $\sigma^{2}$ denotes the variance of the approximating distribution. In this way, we offer sharp error estimates for a large range of values of the parameters. Finally, specific examples concerning appearances of rare runs in sequences of Bernoulli trials are presented by way of illustration. Full Article
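For intuition about the quantity being bounded, the total variation distance between a sum of independent Bernoulli variables and its Poisson approximation can be computed exactly for small examples and checked against Le Cam's classical bound $d_{TV} \leq \sum_i p_i^2$. A sketch (the paper's sharper compound Poisson bounds for locally dependent summands are not reproduced here):

```python
from math import exp

def bernoulli_sum_pmf(ps):
    """Exact pmf of S = sum of independent Bernoulli(p_i), by convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)
            new[k + 1] += q * p
        pmf = new
    return pmf

def tv_to_poisson(ps):
    """Total variation distance between the law of S and Poisson(sum p_i).
    Le Cam's classical bound guarantees this is at most sum p_i^2."""
    pmf = bernoulli_sum_pmf(ps)
    lam = sum(ps)
    # Poisson pmf computed recursively to avoid huge factorials
    pois, q = [], exp(-lam)
    for k in range(len(pmf)):
        pois.append(q)
        q *= lam / (k + 1)
    # difference on {0,...,n}; the Poisson tail beyond n adds the rest
    tail = 1.0 - sum(pois)
    return 0.5 * (sum(abs(a - b) for a, b in zip(pmf, pois)) + tail)
```

For ten Bernoulli(0.1) summands, Le Cam gives $d_{TV} \leq 10 \times 0.01 = 0.1$, while the exact distance is noticeably smaller; sharper bounds of the kind the paper studies close part of that gap.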
son The Thomson family : fisherman in Buckhaven, retailers in Kapunda / compiled by Elizabeth Anne Howell. By www.catalog.slsa.sa.gov.au Published On :: Thomson (Family) Full Article
son With a bottle of whisky in my hand : the family of James Grant and Isabella Masson / by Carolyn Cowgill. By www.catalog.slsa.sa.gov.au Published On :: Grant (Family) Full Article
son Pearson K12 Spinoff Rebranded as ‘Savvas Learning Company’ By marketbrief.edweek.org Published On :: Wed, 06 May 2020 21:20:35 +0000 Savvas Learning Company will continue to provide its K-12 products and services, and is working to support districts with their remote learning needs during school closures. The post Pearson K12 Spinoff Rebranded as ‘Savvas Learning Company’ appeared first on Market Brief. Full Article Marketplace K-12 Business Strategy Data Mergers and Acquisitions Online / Virtual Learning
son Item 1: George Hugh Morrison diary, 15 January 1916- 1 January 1917 By feedproxy.google.com Published On :: 23/03/2015 4:10:54 PM Full Article
son Item 2: George Hugh Morrison diary, 1 January 1917-9 October 1917 By feedproxy.google.com Published On :: 23/03/2015 4:22:32 PM Full Article
son Federal watchdog finds 'reasonable grounds to believe' vaccine doctor's ouster was retaliation, lawyers say By news.yahoo.com Published On :: Fri, 08 May 2020 16:37:13 -0400 The Office of Special Counsel is recommending that ousted vaccine official Dr. Rick Bright be reinstated while it investigates his case, his lawyers announced Friday.Bright, who was leading coronavirus vaccine development, was recently removed from his position as the director of the Department of Health and Human Services' Biomedical Advanced Research and Development Authority, and he alleges it was because he insisted congressional funding not go toward "drugs, vaccines, and other technologies that lack scientific merit" and limited the "broad use" of hydroxychloroquine after it was touted by President Trump. In a whistleblower complaint, he alleged "cronyism" at HHS. He has also alleged he was "pressured to ignore or dismiss expert scientific recommendations and instead to award lucrative contracts based on political connections."On Friday, Bright's lawyers said that the Office of Special Counsel has determined there are "reasonable grounds to believe" his firing was retaliation, The New York Times reports. The federal watchdog also recommended he be reinstated for 45 days to give the office "sufficient time to complete its investigation of Bright's allegations," CNN reports. The decision on whether to do so falls on Secretary of Health and Human Services Alex Azar, and Office of Special Counsel recommendations are "not binding," the Times notes. Full Article
son A person was struck and killed by a Southwest plane as it landed on the runway at Austin international airport By news.yahoo.com Published On :: Fri, 08 May 2020 10:53:00 -0400 Austin-Bergstrom International Airport said it was "aware of an individual that was struck and killed on runway 17-R by a landing aircraft." Full Article
son Neighbor of father and son arrested in Ahmaud Arbery killing is also under investigation By news.yahoo.com Published On :: Fri, 08 May 2020 11:42:19 -0400 The ongoing investigation of the fatal shooting in Brunswick, Georgia, will also look at a neighbor of suspects Gregory and Travis McMichael who recorded video of the incident, authorities said. Full Article
son Comment on “Automated Versus Do-It-Yourself Methods for Causal Inference: Lessons Learned from a Data Analysis Competition” By projecteuclid.org Published On :: Fri, 12 Apr 2019 04:00 EDT Susan Gruber, Mark J. van der Laan. Source: Statistical Science, Volume 34, Number 1, 82--85.Abstract: Dorie and co-authors (DHSSC) are to be congratulated for initiating the ACIC Data Challenge. Their project engaged the community and accelerated research by providing a level playing field for comparing the performance of a priori specified algorithms. DHSSC identified themes concerning characteristics of the DGP, properties of the estimators, and inference. We discuss these themes in the context of targeted learning. Full Article
son Tapadh leibh airson nach do smoc sibh / design : Biman Mullick. By search.wellcomelibrary.org Published On :: London (33 Stillness Rd, SE23 1NG) : Cleanair, Campaign for a Smoke-free Environment, [198-?] Full Article
son Tapadh leibh airson nach do smoc sibh / design: Biman Mullick. By search.wellcomelibrary.org Published On :: London (33 Stillness Road London SE23 1NG) : Cleanair Campaign for a Smoke-free Environment, [198-?] Full Article
son Linear Systems Analysis of Functional Magnetic Resonance Imaging in Human V1 By www.jneurosci.org Published On :: 1996-07-01 Geoffrey M. Boynton. Jul 1, 1996; 16:4207-4221. Articles Full Article