kernel LXer: Upstream Linux 6.12 Makes It Easier To Build A Debug Kernel For Arch Linux By www.linuxquestions.org Published On :: Sat, 28 Sep 2024 14:50:30 GMT Published at LXer: The upstream Linux 6.11 kernel introduced the ability to easily produce a Pacman kernel package for Arch Linux with the new "make pacman-pkg" target. With Linux 6.12 new... Full Article Syndicated Linux News
kernel LXer: Early Linux 6.12 Kernel Benchmarks Showing Some Nice Gains On AMD Zen 5 By www.linuxquestions.org Published On :: Sun, 29 Sep 2024 06:50:18 GMT Published at LXer: With the Linux 6.12 merge window wrapping up this weekend and the bulk of the new feature merges now in the tree, I've begun running some Linux 6.12 benchmarks. Here is an... Full Article Syndicated Linux News
kernel LXer: CachyOS ISO Release for September 2024 Brings Linux Kernel 6.11 and Optimizations By www.linuxquestions.org Published On :: Mon, 30 Sep 2024 09:02:01 GMT Published at LXer: The Arch Linux-based and KDE Plasma-focused CachyOS distribution has a new ISO release for September 2024 adding various performance improvements and optimizations across the... Full Article Syndicated Linux News
kernel LXer: Linus Torvalds Announces First Linux Kernel 6.12 Release Candidate By www.linuxquestions.org Published On :: Mon, 30 Sep 2024 14:00:05 GMT Published at LXer: Linus Torvalds announced today the general availability for public testing of the first Release Candidate (RC) development milestone of the upcoming Linux 6.12 kernel series. ... Full Article Syndicated Linux News
kernel LXer: Linux Kernel 6.12 RC1 Released: PREEMPT_RT Mainlined and Sched_ext Merged By www.linuxquestions.org Published On :: Tue, 01 Oct 2024 10:44:11 GMT Published at LXer: Linus Torvalds announced the release of Linux Kernel 6.12 RC1. Kernel 6.12 RC1 brings important new features like PREEMPT_RT and sched_ext. Read More...... Full Article Syndicated Linux News
kernel LXer: Linux kernel 6.11 lands with vintage TV support By www.linuxquestions.org Published On :: Tue, 01 Oct 2024 15:01:34 GMT Published at LXer: Released remotely from Vienna, Linux kernel 6.11 is here, with improved monochrome TV support. Yes, in 2024. Emperor penguin Linus Torvalds was attending the Open Source Summit... Full Article Syndicated Linux News
kernel Android 15 QPR2 brings the newest Linux kernel to all Tensor-powered phones and tablets - Android Police By news.google.com Published On :: Wed, 13 Nov 2024 02:23:00 GMT Android 15 QPR2 brings the newest Linux kernel to all Tensor-powered phones and tablets (Android Police). Related coverage: Here's everything new in Android 15 QPR2 Beta 1 [Gallery] (9to5Google); Your Google Pixel Phone's Newest Android 15 Beta Update Arrived (Droid Life); Google is preparing to bring back a beloved customization feature from Android 11 (Android Authority); Android 15 QPR2 beta 1 release includes major upgrade for Tensor-powered Pixels (PhoneArena). Full Article
kernel Study of physico-mechanical properties of concretes based on palm kernel shells originating from the locality of Haut Nkam in Cameroon By academicjournals.org Published On :: Sun, 31 May 2020 00:00:00 +0100 This study is based on the use of palm kernel shells as aggregate in the manufacture of concrete. Several substitutions (0, 25, 50, 75 and 100%) were used in the volume fraction of the aggregates. In order to evaluate the effect of this substitution, the mechanical properties were determined at 7 and 28 days for compression and at 28 days for bending, followed by the physical properties of fresh and har... Full Article
kernel How a signed driver exposed users to kernel-level threats – Week in Security with Tony Anscombe By www.welivesecurity.com Published On :: Sun, 21 Jul 2024 07:24:11 +0000 A purported ad blocker marketed as a security solution leverages a Microsoft-signed driver that inadvertently exposes victims to dangerous threats Full Article
kernel Re: CVE-2024-36905: Linux kernel: Divide-by-zero on shutdown of TCP_SYN_RECV sockets By seclists.org Published On :: Tue, 12 Nov 2024 15:03:15 GMT Posted by Solar Designer on Nov 12: NIST doesn't appear to provide their own CVSS vectors/scores lately. However, they republish (with attribution) some third-party ones, this time from CISA-ADP. The CISA-ADP CVSS vector for this vulnerability specifies that it not only is network-reachable, but also that it has High impact not only on Availability, but also on Confidentiality and Integrity. This results in a CVSSv3.1 score of 9.8. Even merely correcting the vector not to... Full Article
kernel Re: CVE-2024-36905: Linux kernel: Divide-by-zero on shutdown of TCP_SYN_RECV sockets By seclists.org Published On :: Tue, 12 Nov 2024 16:42:28 GMT Posted by Clemens Lang on Nov 12: Hi, I think the source for the CISA-ADP data is at [1]. For this specific CVE, the relevant file would be [2]. Their readme has a section at the bottom, where they encourage feedback: I'm aware of at least one prior case where a similar case of (IMHO) overblown CVSS scores was discussed in an issue on this particular GitHub project [3]. Somebody seems to already have opened a ticket for this CVE, too: [4] [1]:... Full Article
kernel RE: CVE-2024-36905: Linux kernel: Divide-by-zero on shutdown of TCP_SYN_RECV sockets By seclists.org Published On :: Tue, 12 Nov 2024 17:06:25 GMT Posted by Joel GUITTET on Nov 12: Hello, First, thanks to Alexander for reposting because I was not able to do so! You're right, Clemens, I have myself asked the question on this GitHub project (https://github.com/cisagov/vulnrichment/issues/130), but there is still no information for the moment. Joel Full Article
kernel SE-Radio Episode 271: Idit Levine on Unikernels By traffic.libsyn.com Published On :: Tue, 11 Oct 2016 16:31:00 +0000 Jeff Meyerson talks to Idit Levine about unikernels and unik, a project for compiling unikernels. The Linux kernel contains features that may be unnecessary to many application developers--particularly if those developers are deploying to the cloud. Unikernels allow programmers to specify the minimum operating system features needed to deploy their applications. Topics include the Linux kernel, requirements for a cloud operating system, and how unikernels compare to Docker containers. Full Article
kernel De Branges-Rovnyak spaces and complete Nevanlinna-Pick kernels By www.ams.org Published On :: Tue, 05 Nov 2024 15:05 EST Cheng Chu. Proc. Amer. Math. Soc. 152 (), 5289-5297. Abstract, references and article information Full Article
kernel Cashew exporters concerned over surge in fraudulent imports of kernel By www.financialexpress.com Published On :: 2019-05-08T02:03:00+05:30 India produces 6-7 million tonne raw cashew per annum, and was till recently the leading supplier of kernels to the global markets. Full Article Commodities Markets
kernel Photos: Cedar Rapids Kernels offer curbside ballpark food to fans By feedproxy.google.com Published On :: Fri, 08 May 2020 15:25:40 PDT The team will be offering carry-out ballpark food to fans on Fridays with orders placed during business hours on Tuesdays and Wednesdays Full Article
kernel No baseball right now, but Cedar Rapids Kernels offering a bit of the ballpark taste By feedproxy.google.com Published On :: Fri, 08 May 2020 17:40:25 PDT CEDAR RAPIDS — You weren’t taken out to the ballgame or the crowd. You couldn’t get Cracker Jack, though you could get peanuts. Not to mention hot dogs and bacon cheeseburgers, a... Full Article Minor League Sports
kernel Compression, inversion, and approximate PCA of dense kernel matrices at near-linear computational complexity. (arXiv:1706.02205v4 [math.NA] UPDATED) By arxiv.org Published On :: Dense kernel matrices $\Theta \in \mathbb{R}^{N \times N}$ obtained from point evaluations of a covariance function $G$ at locations $\{ x_{i} \}_{1 \leq i \leq N} \subset \mathbb{R}^{d}$ arise in statistics, machine learning, and numerical analysis. For covariance functions that are Green's functions of elliptic boundary value problems and homogeneously-distributed sampling points, we show how to identify a subset $S \subset \{ 1, \dots, N \}^2$, with $\# S = O(N \log(N) \log^{d}(N/\epsilon))$, such that the zero fill-in incomplete Cholesky factorisation of the sparse matrix $\Theta_{ij} 1_{(i,j) \in S}$ is an $\epsilon$-approximation of $\Theta$. This factorisation can provably be obtained in complexity $O(N \log(N) \log^{d}(N/\epsilon))$ in space and $O(N \log^{2}(N) \log^{2d}(N/\epsilon))$ in time, improving upon the state of the art for general elliptic operators; we further present numerical evidence that $d$ can be taken to be the intrinsic dimension of the data set rather than that of the ambient space. The algorithm only needs to know the spatial configuration of the $x_{i}$ and does not require an analytic representation of $G$. Furthermore, this factorization straightforwardly provides an approximate sparse PCA with optimal rate of convergence in the operator norm. Hence, by using only subsampling and the incomplete Cholesky factorization, we obtain, at nearly linear complexity, the compression, inversion and approximate PCA of a large class of covariance matrices. By inverting the order of the Cholesky factorization we also obtain a solver for elliptic PDE with complexity $O(N \log^{d}(N/\epsilon))$ in space and $O(N \log^{2d}(N/\epsilon))$ in time, improving upon the state of the art for general elliptic operators. Full Article
kernel Dynamic rule management for kernel mode filter drivers By www.freepatentsonline.com Published On :: Tue, 26 May 2015 08:00:00 EDT A method for providing rules for a plurality of processes from a user mode to a kernel mode of a computer is disclosed. The method includes providing to the kernel mode a policy for at least a first process of the plurality of processes, the policy indicating at least when and/or how notifications are to be provided from the kernel mode to the user mode upon detection in the kernel mode of launching of the first process. The method further includes selecting, from the rules stored in the user mode, rules related to the launching of the first process, in response to receiving from the kernel mode a first notification in accordance with the policy, and providing the selected rules related to the launching of the first process from the user mode to at least one of the one or more filter drivers in the kernel mode. Full Article
kernel SYSTEMS AND METHODS FOR UNIT TESTING OF FUNCTIONS ON REMOTE KERNELS By www.freepatentsonline.com Published On :: Thu, 29 Jun 2017 08:00:00 EDT The disclosed computer-implemented method may include (1) providing a framework that includes (A) a user-space component that runs at a client site and (B) a kernel-space component that runs at a remote site, (2) identifying attributes of objects that reside at the remote site and whose addresses are unknown at the client site, (3) generating a script to test a function of a kernel running on the remote site based at least in part on the attributes, and (4) performing a remote unit testing of the function of the kernel by executing the script such that the user-space component (A) generates a message that identifies the attributes and (B) sends the message to the kernel-space component to facilitate (I) obtaining references to the objects by way of the attributes and (II) invoking the function by way of the references. Various other methods, systems, and computer-readable media are also disclosed. Full Article
kernel Using N_Port ID Virtualization (NPIV) with kernel-based virtual machine (KVM) guests on IBM Power servers By www.ibm.com Published On :: 08 Jan 2018 05:00:00 +0000 This article provides the basic steps to use N_Port ID Virtualization (NPIV) technology in a kernel-based virtual machine (KVM) guest. It also explains the significance of NPIV, which allows multiple guests to share a single physical host bus adapter (HBA) while accessing multiple storage devices. Full Article linux
kernel Detecting Linux kernel process masquerading with command line forensics By blog.apnic.net Published On :: Mon, 27 Apr 2020 00:40:50 +0000 Guest Post: Learn how to use Linux command line to investigate suspicious processes trying to masquerade as kernel threads. Full Article <a href="https://blog.apnic.net/category/tech-matters/">Tech matters</a>
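The heuristic behind such command line forensics is easy to sketch: genuine kernel threads are children of kthreadd (PID 2), expose an empty /proc/PID/cmdline, and have no userspace memory maps, so a process that merely styles its name like "[kworker/0:1]" fails at least one of those tests. The helper below is an illustrative reconstruction of that idea, not code from the article:

```python
def looks_like_kernel_thread(ppid: int, cmdline: bytes, has_maps: bool) -> bool:
    """Heuristic: genuine kernel threads are parented by kthreadd (PID 2),
    have an empty /proc/<pid>/cmdline, and no userspace memory mappings."""
    return ppid == 2 and cmdline == b"" and not has_maps

# A genuine kernel thread passes all three tests.
genuine = looks_like_kernel_thread(ppid=2, cmdline=b"", has_maps=False)
# A userland process masquerading as "[kworker/0:1]" still has a cmdline
# and memory maps, so the heuristic flags it as suspicious.
fake = looks_like_kernel_thread(ppid=1, cmdline=b"[kworker/0:1]\x00", has_maps=True)
print(genuine, fake)  # True False
```

In practice the three inputs come from /proc: the PPid field of /proc/PID/status, the raw bytes of /proc/PID/cmdline, and whether /proc/PID/maps is non-empty.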
kernel Spectral analysis and representation of solutions of integro-differential equations with fractional exponential kernels By www.ams.org Published On :: Fri, 10 Apr 2020 08:09 EDT V. V. Vlasov and N. A. Rautian Trans. Moscow Math. Soc. 80 (2020), 169-188. Abstract, references and article information Full Article
kernel Nonlinear $n$-term approximation of harmonic functions from shifts of the Newtonian kernel By www.ams.org Published On :: Wed, 08 Apr 2020 11:21 EDT Kamen G. Ivanov and Pencho Petrushev. Trans. Amer. Math. Soc. 373 (2020), 3117-3176. Abstract, references and article information Full Article
kernel On the predictive potential of kernel principal components By projecteuclid.org Published On :: Wed, 15 Apr 2020 04:02 EDT Ben Jones, Andreas Artemiou, Bing Li. Source: Electronic Journal of Statistics, Volume 14, Number 1, 1--23. Abstract: We give a probabilistic analysis of a phenomenon in statistics which, until recently, has not received a convincing explanation. This phenomenon is that the leading principal components tend to possess more predictive power for a response variable than lower-ranking ones despite the procedure being unsupervised. Our result, in its most general form, shows that the phenomenon goes far beyond the context of linear regression and classical principal components: if an arbitrary distribution for the predictor $X$ and an arbitrary conditional distribution for $Y \vert X$ are chosen, then any measurable function $g(Y)$, subject to a mild condition, tends to be more correlated with the higher-ranking kernel principal components than with the lower-ranking ones. The "arbitrariness" is formulated in terms of unitary invariance, and the tendency is explicitly quantified by exploring how unitary invariance relates to the Cauchy distribution. The most general results, for technical reasons, are shown for the case where the kernel space is finite dimensional. The occurrence of this tendency in real-world databases is also investigated to show that our results are consistent with observation. Full Article
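As background for the result above, kernel principal components are eigenvectors of the double-centered kernel matrix. A minimal pure-Python sketch (the rbf/center/top_component helpers are illustrative, not from the paper) extracts the leading component by power iteration:

```python
import math
import random

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel between two points.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def center(K):
    # Double-center the kernel matrix: kernel PCA works on centered features.
    n = len(K)
    row = [sum(r) / n for r in K]
    tot = sum(row) / n
    return [[K[i][j] - row[i] - row[j] + tot for j in range(n)] for i in range(n)]

def top_component(K, iters=200):
    # Power iteration converges to the leading eigenvector of the PSD matrix K.
    n = len(K)
    rng = random.Random(0)
    v = [rng.random() for _ in range(n)]
    for _ in range(iters):
        w = [sum(K[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

X = [(0.0, 0.0), (0.1, 0.0), (2.0, 2.0), (2.1, 2.0)]
Kc = center([[rbf(a, b) for b in X] for a in X])
v1 = top_component(Kc)  # leading kernel principal direction (dual coordinates)
```

Double-centering makes every row of Kc sum to zero, and the returned eigenvector is unit-norm.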
kernel On $\ell_p$-Support Vector Machines and Multidimensional Kernels By Published On :: 2020 In this paper, we extend the methodology developed for Support Vector Machines (SVM) using the $\ell_2$-norm ($\ell_2$-SVM) to the more general case of $\ell_p$-norms with $p>1$ ($\ell_p$-SVM). We derive second order cone formulations for the resulting dual and primal problems. The concept of kernel function, widely applied in $\ell_2$-SVM, is extended to the more general case of $\ell_p$-norms with $p>1$ by defining a new operator called multidimensional kernel. This object gives rise to reformulations of dual problems, in a transformed space of the original data, where the dependence on the original data always appears as homogeneous polynomials. We adapt known solution algorithms to efficiently solve the primal and dual resulting problems, and computational experiments on real-world datasets show rather good behavior in terms of the accuracy of $\ell_p$-SVM with $p>1$. Full Article
kernel A Convex Parametrization of a New Class of Universal Kernel Functions By Published On :: 2020 The accuracy and complexity of kernel learning algorithms is determined by the set of kernels over which it is able to optimize. An ideal set of kernels should: admit a linear parameterization (tractability); be dense in the set of all kernels (accuracy); and every member should be universal so that the hypothesis space is infinite-dimensional (scalability). Currently, there is no class of kernel that meets all three criteria - e.g. Gaussians are not tractable or accurate; polynomials are not scalable. We propose a new class that meet all three criteria - the Tessellated Kernel (TK) class. Specifically, the TK class: admits a linear parameterization using positive matrices; is dense in all kernels; and every element in the class is universal. This implies that the use of TK kernels for learning the kernel can obviate the need for selecting candidate kernels in algorithms such as SimpleMKL and parameters such as the bandwidth. Numerical testing on soft margin Support Vector Machine (SVM) problems show that algorithms using TK kernels outperform other kernel learning algorithms and neural networks. Furthermore, our results show that when the ratio of the number of training data to features is high, the improvement of TK over MKL increases significantly. Full Article
kernel GraKeL: A Graph Kernel Library in Python By Published On :: 2020 The problem of accurately measuring the similarity between graphs is at the core of many applications in a variety of disciplines. Graph kernels have recently emerged as a promising approach to this problem. There are now many kernels, each focusing on different structural aspects of graphs. Here, we present GraKeL, a library that unifies several graph kernels into a common framework. The library is written in Python and adheres to the scikit-learn interface. It is simple to use and can be naturally combined with scikit-learn's modules to build a complete machine learning pipeline for tasks such as graph classification and clustering. The code is BSD licensed and is available at: https://github.com/ysig/GraKeL. Full Article
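One of the simplest graph kernels of the kind such a library unifies is the vertex histogram kernel, which compares graphs through their node-label counts. A dependency-free sketch of the idea (illustrative, not GraKeL's actual implementation):

```python
from collections import Counter

def vertex_histogram_kernel(labels_g1, labels_g2):
    """K(G1, G2) = <phi(G1), phi(G2)>, where phi(G) counts node labels in G."""
    c1, c2 = Counter(labels_g1), Counter(labels_g2)
    return sum(c1[label] * c2[label] for label in c1)

# Two toy molecular graphs, represented only by their node labels.
g1 = ["C", "C", "O", "H"]
g2 = ["C", "O", "O"]
print(vertex_histogram_kernel(g1, g2))  # 2*1 (C) + 1*2 (O) + 1*0 (H) = 4
```

The resulting kernel matrix over a graph collection can then be fed to any scikit-learn estimator that accepts a precomputed kernel, which is the pipeline pattern the library is built around.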
kernel Conjugate Gradients for Kernel Machines By Published On :: 2020 Regularized least-squares (kernel-ridge / Gaussian process) regression is a fundamental algorithm of statistics and machine learning. Because generic algorithms for the exact solution have cubic complexity in the number of datapoints, large datasets require resorting to approximations. In this work, the computation of the least-squares prediction is itself treated as a probabilistic inference problem. We propose a structured Gaussian regression model on the kernel function that uses projections of the kernel matrix to obtain a low-rank approximation of the kernel and the matrix. A central result is an enhanced way to use the method of conjugate gradients for the specific setting of least-squares regression as encountered in machine learning. Full Article
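The linear system at the heart of this setting, $(K + \lambda I)\alpha = y$ for kernel ridge regression, is symmetric positive definite, which is exactly what conjugate gradients requires, and the solver only ever needs matrix-vector products with the kernel matrix. A minimal sketch on toy data (not the paper's structured model):

```python
def conjugate_gradient(matvec, b, iters=50, tol=1e-12):
    # Solve A x = b for symmetric positive definite A, given only x -> A x.
    n = len(b)
    x = [0.0] * n
    r = list(b)           # residual b - A x (x starts at zero)
    p = list(r)           # search direction
    rs = sum(v * v for v in r)
    for _ in range(iters):
        Ap = matvec(p)
        alpha = rs / sum(pi * ai for pi, ai in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        rs_new = sum(v * v for v in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Toy kernel ridge system (K + lam * I) alpha = y on three points.
K = [[1.0, 0.5, 0.2],
     [0.5, 1.0, 0.5],
     [0.2, 0.5, 1.0]]
lam, y = 0.1, [1.0, 2.0, 3.0]

def apply_regularized_kernel(v):
    return [sum(K[i][j] * v[j] for j in range(3)) + lam * v[i] for i in range(3)]

alpha = conjugate_gradient(apply_regularized_kernel, y)
```

In exact arithmetic CG converges in at most n iterations; for large kernel matrices it is typically truncated early, which is where approximation schemes like the paper's come in.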
kernel The weight function in the subtree kernel is decisive By Published On :: 2020 Tree data are ubiquitous because they model a large variety of situations, e.g., the architecture of plants, the secondary structure of RNA, or the hierarchy of XML files. Nevertheless, the analysis of these non-Euclidean data is difficult per se. In this paper, we focus on the subtree kernel, a convolution kernel for tree data introduced by Vishwanathan and Smola in the early 2000s. More precisely, we investigate the influence of the weight function from a theoretical perspective and in real data applications. We establish on a 2-class stochastic model that the performance of the subtree kernel is improved when the weight of leaves vanishes, which motivates the definition of a new weight function, learned from the data rather than fixed by the user as is usually done. To this end, we define a unified framework for computing the subtree kernel from ordered or unordered trees that is particularly suitable for tuning parameters. We show through eight real data classification problems the great efficiency of our approach, in particular for small data sets, which also underscores the importance of the weight function. Finally, a visualization tool of the significant features is derived. Full Article
kernel The theory and application of penalized methods or Reproducing Kernel Hilbert Spaces made easy By projecteuclid.org Published On :: Tue, 16 Oct 2012 09:36 EDT Nancy Heckman. Source: Statist. Surv., Volume 6, 113--141. Abstract: The popular cubic smoothing spline estimate of a regression function arises as the minimizer of the penalized sum of squares $\sum_{j}(Y_{j}-\mu(t_{j}))^{2}+\lambda \int_{a}^{b}[\mu''(t)]^{2}\,dt$, where the data are $t_{j},Y_{j}$, $j=1,\ldots,n$. The minimization is taken over an infinite-dimensional function space, the space of all functions with square integrable second derivatives. But the calculations can be carried out in a finite-dimensional space. The reduction from minimizing over an infinite dimensional space to minimizing over a finite dimensional space occurs for more general objective functions: the data may be related to the function $\mu$ in another way, the sum of squares may be replaced by a more suitable expression, or the penalty, $\int_{a}^{b}[\mu''(t)]^{2}\,dt$, might take a different form. This paper reviews the Reproducing Kernel Hilbert Space structure that provides a finite-dimensional solution for a general minimization problem. Particular attention is paid to the construction and study of the Reproducing Kernel Hilbert Space corresponding to a penalty based on a linear differential operator. In this case, one can often calculate the minimizer explicitly, using Green's functions. Full Article
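The penalty term has a transparent discrete analogue, the sum of squared second differences, which is zero for straight lines and grows with wiggliness. A small illustration (the function names are ours, not the paper's):

```python
def roughness(mu):
    """Discrete analogue of the penalty integral of mu''(t)^2:
    the sum of squared second differences of the fitted values."""
    return sum((mu[i - 1] - 2 * mu[i] + mu[i + 1]) ** 2
               for i in range(1, len(mu) - 1))

def penalized_sse(y, mu, lam):
    # The smoothing-spline objective: data fidelity plus lam times roughness.
    return sum((yi - mi) ** 2 for yi, mi in zip(y, mu)) + lam * roughness(mu)

t = [i / 10 for i in range(11)]
straight = list(t)                              # a line: (near-)zero roughness
wiggly = [0.1 * (-1) ** i for i in range(11)]   # oscillation: heavily penalized
print(roughness(straight), roughness(wiggly))
```

Increasing lam in penalized_sse trades fidelity to the data for smoothness, exactly the trade-off the cubic smoothing spline resolves in function space.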
kernel Primal and dual model representations in kernel-based learning By projecteuclid.org Published On :: Wed, 25 Aug 2010 10:28 EDT Johan A.K. Suykens, Carlos Alzate, Kristiaan Pelckmans. Source: Statist. Surv., Volume 4, 148--183. Abstract: This paper discusses the role of primal and (Lagrange) dual model representations in problems of supervised and unsupervised learning. The specification of the estimation problem is conceived at the primal level as a constrained optimization problem. The constraints relate to the model which is expressed in terms of the feature map. From the conditions for optimality one jointly finds the optimal model representation and the model estimate. At the dual level the model is expressed in terms of a positive definite kernel function, which is characteristic of a support vector machine methodology. It is discussed how least squares support vector machines play a central role as core models across problems of regression, classification, principal component analysis, spectral clustering, canonical correlation analysis, dimensionality reduction and data visualization. Full Article
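The primal/dual correspondence is easy to check numerically for ridge regression, the simplest least-squares case: the primal weights $(X^{\top}X+\lambda I)^{-1}X^{\top}y$ coincide with $X^{\top}\alpha$, where $\alpha=(XX^{\top}+\lambda I)^{-1}y$ is the dual solution. A toy single-sample sketch (illustrative only, not the paper's LS-SVM formulation):

```python
def primal_weights(x, y, lam):
    # Primal ridge for one sample x in R^2: solve (x x^T + lam I) w = y x
    # via the closed-form 2x2 inverse.
    a = x[0] * x[0] + lam
    b = x[0] * x[1]
    d = x[1] * x[1] + lam
    det = a * d - b * b
    rhs = [y * x[0], y * x[1]]
    return [(d * rhs[0] - b * rhs[1]) / det, (-b * rhs[0] + a * rhs[1]) / det]

def dual_weights(x, y, lam):
    # Dual ridge: alpha = (K + lam)^-1 y with scalar K = <x, x>; then w = alpha * x.
    alpha = y / (x[0] * x[0] + x[1] * x[1] + lam)
    return [alpha * x[0], alpha * x[1]]

x, y, lam = (1.0, 2.0), 3.0, 0.5
print(primal_weights(x, y, lam))
print(dual_weights(x, y, lam))   # identical, as the optimality conditions promise
```

The dual route only ever touches inner products of the data, which is what lets a kernel function replace the feature map.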
kernel Fast multivariate empirical cumulative distribution function with connection to kernel density estimation. (arXiv:2005.03246v1 [cs.DS]) By arxiv.org Published On :: This paper revisits the problem of computing empirical cumulative distribution functions (ECDF) efficiently on large, multivariate datasets. Computing an ECDF at one evaluation point requires $\mathcal{O}(N)$ operations on a dataset composed of $N$ data points. Therefore, a direct evaluation of ECDFs at $N$ evaluation points requires a quadratic $\mathcal{O}(N^2)$ operations, which is prohibitive for large-scale problems. Two fast and exact methods are proposed and compared. The first one is based on fast summation in lexicographical order, with an $\mathcal{O}(N \log N)$ complexity and requires the evaluation points to lie on a regular grid. The second one is based on the divide-and-conquer principle, with an $\mathcal{O}(N \log(N)^{(d-1) \vee 1})$ complexity and requires the evaluation points to coincide with the input points. The two fast algorithms are described and detailed in the general $d$-dimensional case, and numerical experiments validate their speed and accuracy. Secondly, the paper establishes a direct connection between cumulative distribution functions and kernel density estimation (KDE) for a large class of kernels. This connection paves the way for fast exact algorithms for multivariate kernel density estimation and kernel regression. Numerical tests with the Laplacian kernel validate the speed and accuracy of the proposed algorithms. A broad range of large-scale multivariate density estimation, cumulative distribution estimation, survival function estimation and regression problems can benefit from the proposed numerical methods. Full Article
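The univariate case already shows the gap between the naive quadratic scan and a fast method: sorting once and binary-searching each evaluation point brings the cost down to $\mathcal{O}((N+M)\log N)$. A sketch of that baseline (illustrative; the paper's lexicographical and divide-and-conquer algorithms handle the harder multivariate case):

```python
from bisect import bisect_right

def ecdf_naive(data, points):
    # O(N * M): for each evaluation point, scan the whole dataset.
    n = len(data)
    return [sum(1 for x in data if x <= t) / n for t in points]

def ecdf_fast(data, points):
    # O((N + M) log N): sort once, then binary-search each evaluation point.
    s = sorted(data)
    n = len(s)
    return [bisect_right(s, t) / n for t in points]

data = [3.0, 1.0, 2.0, 2.0, 5.0]
pts = [0.0, 2.0, 4.0, 6.0]
print(ecdf_fast(data, pts))  # [0.0, 0.6, 0.8, 1.0]
```

Both routines compute the same step function; only the access pattern differs, which is the whole point of the fast methods.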
kernel Active Learning with Multiple Kernels. (arXiv:2005.03188v1 [cs.LG]) By arxiv.org Published On :: Online multiple kernel learning (OMKL) has provided an attractive performance in nonlinear function learning tasks. Leveraging a random feature approximation, the major drawback of OMKL, known as the curse of dimensionality, has recently been alleviated. In this paper, we introduce a new research problem, termed (stream-based) active multiple kernel learning (AMKL), in which a learner is allowed to label selected data from an oracle according to a selection criterion. This is necessary in many real-world applications as acquiring true labels is costly or time-consuming. We prove that AMKL achieves an optimal sublinear regret, implying that the proposed selection criterion indeed avoids useless label requests. Furthermore, we propose AMKL with adaptive kernel selection (AMKL-AKS), in which irrelevant kernels can be excluded from a kernel dictionary 'on the fly'. This approach can improve the efficiency of active learning as well as the accuracy of a function approximation. Via numerical tests with various real datasets, it is demonstrated that AMKL-AKS yields a similar or better performance than the best-known OMKL, with a smaller number of labeled data. Full Article
kernel Scalable high-resolution forecasting of sparse spatiotemporal events with kernel methods: A winning solution to the NIJ “Real-Time Crime Forecasting Challenge” By projecteuclid.org Published On :: Wed, 27 Nov 2019 22:01 EST Seth Flaxman, Michael Chirico, Pau Pereira, Charles Loeffler. Source: The Annals of Applied Statistics, Volume 13, Number 4, 2564--2585. Abstract: We propose a generic spatiotemporal event forecasting method which we developed for the National Institute of Justice’s (NIJ) Real-Time Crime Forecasting Challenge (National Institute of Justice (2017)). Our method is a spatiotemporal forecasting model combining scalable randomized Reproducing Kernel Hilbert Space (RKHS) methods for approximating Gaussian processes with autoregressive smoothing kernels in a regularized supervised learning framework. While the smoothing kernels capture the two main approaches in current use in the field of crime forecasting, kernel density estimation (KDE) and self-exciting point process (SEPP) models, the RKHS component of the model can be understood as an approximation to the popular log-Gaussian Cox Process model. For inference, we discretize the spatiotemporal point pattern and learn a log-intensity function using the Poisson likelihood and highly efficient gradient-based optimization methods. Model hyperparameters including quality of RKHS approximation, spatial and temporal kernel lengthscales, number of autoregressive lags and bandwidths for smoothing kernels as well as cell shape, size and rotation, were learned using cross validation. Resulting predictions significantly exceeded baseline KDE estimates and SEPP models for sparse events. Full Article
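The scalable RKHS approximation such methods lean on can be sketched in a few lines: for the Gaussian kernel $k(x,y)=\exp(-\|x-y\|^2/2)$, random cosine features $\sqrt{2/D}\,\cos(w^{\top}x+b)$ with $w\sim N(0,I)$ and $b\sim U[0,2\pi]$ have inner products that approximate the kernel. An illustrative random Fourier features sketch, not the authors' code:

```python
import math
import random

def random_fourier_features(dim, n_features, seed=0):
    # Feature map z(x) with E[z(x) . z(y)] = exp(-||x - y||^2 / 2).
    rng = random.Random(seed)
    W = [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n_features)]
    offsets = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_features)]
    scale = math.sqrt(2.0 / n_features)

    def z(x):
        return [scale * math.cos(sum(wi * xi for wi, xi in zip(w, x)) + b)
                for w, b in zip(W, offsets)]
    return z

z = random_fourier_features(dim=2, n_features=5000)
x, y = (0.3, -0.2), (0.1, 0.4)
approx = sum(a * b for a, b in zip(z(x), z(y)))
exact = math.exp(-sum((a - b) ** 2 for a, b in zip(x, y)) / 2.0)
print(abs(approx - exact))  # shrinks like 1/sqrt(n_features)
```

Working with D explicit features instead of an N x N kernel matrix is what makes Gaussian process approximations like this one scale to fine spatiotemporal grids.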
kernel Kernel and wavelet density estimators on manifolds and more general metric spaces By projecteuclid.org Published On :: Mon, 27 Apr 2020 04:02 EDT Galatia Cleanthous, Athanasios G. Georgiadis, Gerard Kerkyacharian, Pencho Petrushev, Dominique Picard. Source: Bernoulli, Volume 26, Number 3, 1832--1862. Abstract: We consider the problem of estimating the density of observations taking values in classical or nonclassical spaces such as manifolds and more general metric spaces. Our setting is quite general but also sufficiently rich in allowing the development of smooth functional calculus with well localized spectral kernels, Besov regularity spaces, and wavelet type systems. Kernel and both linear and nonlinear wavelet density estimators are introduced and studied. Convergence rates for these estimators are established and discussed. Full Article
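In the familiar Euclidean special case, the kernel density estimator these results generalize is just an average of scaled kernel bumps centered at the observations. A one-dimensional Gaussian-kernel sketch (illustrative only):

```python
import math

def gaussian_kde(data, h):
    """Kernel density estimate: an average of Gaussian bumps of bandwidth h."""
    n = len(data)
    norm = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    def density(t):
        return norm * sum(math.exp(-0.5 * ((t - x) / h) ** 2) for x in data)
    return density

f = gaussian_kde([-1.0, 0.0, 1.0], h=0.5)
# A crude Riemann sum confirms the estimate integrates to ~1.
step = 0.01
area = sum(f(-5.0 + step * i) for i in range(1001)) * step
print(area)
```

On a manifold or metric space, the Gaussian bump is replaced by a well-localized spectral kernel, which is the technical heart of the paper.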
kernel Adaptive Bayesian Nonparametric Regression Using a Kernel Mixture of Polynomials with Application to Partial Linear Models By projecteuclid.org Published On :: Mon, 13 Jan 2020 04:00 EST Fangzheng Xie, Yanxun Xu. Source: Bayesian Analysis, Volume 15, Number 1, 159--186. Abstract: We propose a kernel mixture of polynomials prior for Bayesian nonparametric regression. The regression function is modeled by local averages of polynomials with kernel mixture weights. We obtain the minimax-optimal contraction rate of the full posterior distribution up to a logarithmic factor by estimating metric entropies of certain function classes. Under the assumption that the degree of the polynomials is larger than the unknown smoothness level of the true function, the posterior contraction behavior can adapt to this smoothness level provided an upper bound is known. We also provide a frequentist sieve maximum likelihood estimator with a near-optimal convergence rate. We further investigate the application of the kernel mixture of polynomials to partial linear models and obtain both the near-optimal rate of contraction for the nonparametric component and the Bernstein-von Mises limit (i.e., asymptotic normality) of the parametric component. The proposed method is illustrated with numerical examples and shows superior performance in terms of computational efficiency, accuracy, and uncertainty quantification compared to the local polynomial regression, DiceKriging, and the robust Gaussian stochastic process. Full Article
kernel A Kernel Regression Procedure in the 3D Shape Space with an Application to Online Sales of Children’s Wear By projecteuclid.org Published On :: Thu, 18 Jul 2019 22:01 EDT Gregorio Quintana-Ortí, Amelia Simó. Source: Statistical Science, Volume 34, Number 2, 236--252. Abstract: This paper is focused on kernel regression when the response variable is the shape of a 3D object represented by a configuration matrix of landmarks. Regression methods on this shape space are not trivial because the space has a complex finite-dimensional Riemannian manifold structure (non-Euclidean). Papers on the topic are scarce in the literature; most are restricted to the case of a single explanatory variable, and many are based on the approximated tangent space. In this paper, there are several methodological innovations. The first is the adaptation of the general method for kernel regression analysis of manifold-valued data to the three-dimensional case of Kendall’s shape space. The second is its generalization to the multivariate case and the addressing of the curse-of-dimensionality problem. Finally, we propose bootstrap confidence intervals for prediction. A simulation study is carried out to check the goodness of the procedure, and a comparison with a current approach is performed. Then, it is applied to a 3D database obtained from an anthropometric survey of the Spanish child population with a potential application to online sales of children’s wear. Full Article
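Ordinary Euclidean kernel regression, which the paper adapts to Kendall's shape space, is the Nadaraya-Watson estimator: a locally weighted average of the responses. A scalar sketch (illustrative, not the shape-space method):

```python
import math

def nadaraya_watson(xs, ys, h):
    """Kernel regression: predictions are locally weighted averages of the
    responses, with Gaussian weights of bandwidth h."""
    def m(t):
        w = [math.exp(-0.5 * ((t - x) / h) ** 2) for x in xs]
        return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    return m

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 1.0, 4.0, 9.0]  # noiseless samples of t^2
m = nadaraya_watson(xs, ys, h=0.3)
print(m(0.0), m(1.0), m(3.0))
```

On a manifold, the weighted Euclidean average is replaced by a weighted Fréchet mean, which is where the Riemannian structure of the shape space enters.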
kernel Linux Kernel v2.4 Released By packetstormsecurity.com Published On :: Fri, 05 Jan 2001 12:28:39 GMT Full Article linux kernel
kernel Linux Kernel 2.2/2.4 Local Root Ptrace Vulnerability By packetstormsecurity.com Published On :: Mon, 17 Mar 2003 14:20:12 GMT Full Article linux kernel
kernel Linux Kernel Backdoor Blocked By packetstormsecurity.com Published On :: Fri, 07 Nov 2003 12:07:08 GMT Full Article linux trojan kernel
kernel Security Flaws Force Linux Kernel Upgrade By packetstormsecurity.com Published On :: Mon, 05 Jan 2004 14:56:05 GMT Full Article linux flaw kernel
kernel Controlling The Kernel - Its All About DRM By packetstormsecurity.com Published On :: Fri, 20 Oct 2006 06:07:43 GMT Full Article kernel
kernel Vista Kernel Fix Worse Than Useless By packetstormsecurity.com Published On :: Tue, 24 Oct 2006 01:44:24 GMT Full Article microsoft kernel
kernel Authentium: Vista Kernel Cracked By packetstormsecurity.com Published On :: Tue, 24 Oct 2006 16:19:53 GMT Full Article privacy microsoft kernel
kernel Cracking The Kernel By packetstormsecurity.com Published On :: Tue, 22 May 2007 18:09:42 GMT Full Article australia kernel
kernel ATI Driver Flaw Exposes Vista Kernel By packetstormsecurity.com Published On :: Fri, 10 Aug 2007 18:01:36 GMT Full Article microsoft flaw kernel
kernel Ubuntu Issues Security Patch For Kernel Flaw By packetstormsecurity.com Published On :: Tue, 26 Aug 2008 03:25:22 GMT Full Article linux flaw kernel patch
kernel David Kernell Photo - Rep. Mike Kernell Son Sarah Palin Anonymous Hacker? By packetstormsecurity.com Published On :: Fri, 19 Sep 2008 08:38:14 GMT Full Article hacker kernel