machine learning How do machine learning and non-traditional data affect credit scoring? New evidence from a Chinese fintech firm [electronic journal]. By encore.st-andrews.ac.uk Published On :: Full Article
machine learning Building(s and) cities: Delineating urban areas with a machine learning algorithm [electronic journal]. By encore.st-andrews.ac.uk Published On :: Full Article
machine learning Morphology of Lithium Halides in Tetrahydrofuran from Molecular Dynamics with Machine Learning Potentials By pubs.rsc.org Published On :: Chem. Sci., 2024, Accepted Manuscript. DOI: 10.1039/D4SC04957H, Edge Article. Open Access. This article is licensed under a Creative Commons Attribution 3.0 Unported Licence. Marinella de Giovanetti, Sondre H. Hopen Eliasson, Sigbjørn L. Bore, Odile Eisenstein, Michele Cascella. The preferred structures of lithium halides (LiX, with X = Cl, Br, I) in organic solvents have been the subject of a wide scientific debate, and a large variety of... The content of this RSS Feed (c) The Royal Society of Chemistry Full Article
machine learning Towards real-time myocardial infarction diagnosis: a convergence of machine learning and ion-exchange membrane technologies leveraging miRNA signatures By pubs.rsc.org Published On :: Lab Chip, 2024, Advance Article. DOI: 10.1039/D4LC00640B, Paper. Open Access. This article is licensed under a Creative Commons Attribution 3.0 Unported Licence. Xiang Ren, Ruyu Zhou, George Ronan, S. Gulberk Ozcebe, Jiaying Ji, Satyajyoti Senapati, Keith L. March, Eileen Handberg, David Anderson, Carl J. Pepine, Hsueh-Chia Chang, Fang Liu, Pinar Zorlutuna. Rapid diagnosis of acute myocardial infarction (AMI) is crucial for optimal patient management. To cite this article before page numbers are assigned, use the DOI form of citation above. The content of this RSS Feed (c) The Royal Society of Chemistry Full Article
machine learning Self-assembly of amphiphilic homopolymers grafted onto spherical nanoparticles: complete embedded minimal surfaces and a machine learning algorithm for their recognition By pubs.rsc.org Published On :: Soft Matter, 2024, 20, 8385-8394. DOI: 10.1039/D4SM00616J, Paper. D. A. Mitkovskiy, A. A. Lazutin, A. L. Talis, V. V. Vasilevskaya. Amphiphilic macromolecules grafted onto spherical nanoparticles can self-assemble into morphological structures corresponding to the family of complete embedded minimal surfaces. They arise situationally, can coexist and transform into each other. The content of this RSS Feed (c) The Royal Society of Chemistry Full Article
machine learning Towards efficient and stable organic solar cells: fixing the morphology problem in block copolymer active layers with synergistic strategies supported by interpretable machine learning By pubs.rsc.org Published On :: Energy Environ. Sci., 2024, 17, 8954-8965. DOI: 10.1039/D4EE03168G, Paper. Yu Cui, Qunping Fan, Hao Feng, Tao Li, Dmitry Yu. Paraschuk, Wei Ma, Han Yan. Interpretable machine learning identifies the causal structure–property relationships and key control factors in block copolymer organic solar cells with excellent power conversion efficiency and thermal stability. The content of this RSS Feed (c) The Royal Society of Chemistry Full Article
machine learning Accelerated design of nickel-cobalt based catalysts for CO2 hydrogenation with human-in-the-loop active machine learning By pubs.rsc.org Published On :: Catal. Sci. Technol., 2024, 14, 6307-6320. DOI: 10.1039/D4CY00873A, Paper. Open Access. This article is licensed under a Creative Commons Attribution 3.0 Unported Licence. Yasemen Kuddusi, Maarten R. Dobbelaere, Kevin M. Van Geem, Andreas Züttel. The effect of catalyst synthesis and reaction conditions on catalytic activity was accurately predicted with an interpretable data-driven strategy. The method is demonstrated for CO2 methanation and is extendable to other catalytic processes. The content of this RSS Feed (c) The Royal Society of Chemistry Full Article
machine learning Effect of graphene electrode functionalization on machine learning-aided single nucleotide classification By pubs.rsc.org Published On :: Nanoscale, 2024, 16, 20202-20215. DOI: 10.1039/D4NR02274B, Paper. Mohd Rashid, Milan Kumar Jena, Sneha Mittal, Biswarup Pathak. In this study, we explored the role of functionalized entities (C, H, N, and OH) in graphene electrodes using a machine learning (ML) framework integrated with the quantum transport method to achieve precise single DNA nucleotide identification. The content of this RSS Feed (c) The Royal Society of Chemistry Full Article
machine learning Enhancing antioxidant properties of CeO2 nanoparticles with Nd3+ doping: structural, biological, and machine learning insights By pubs.rsc.org Published On :: Biomater. Sci., 2024, 12, 2108-2120. DOI: 10.1039/D3BM02107F, Paper. Oscar Ceballos-Sanchez, Diego E. Navarro-López, Jorge L. Mejía-Méndez, Gildardo Sanchez-Ante, Vicente Rodríguez-González, Angélica Lizeth Sánchez-López, Araceli Sanchez-Martinez, Sergio M. Duron-Torres, Karla Juarez-Moreno, Naveen Tiwari, Edgar R. López-Mena. The antioxidant capabilities of nanoparticles are contingent upon various factors, including their shape, size, and chemical composition. The content of this RSS Feed (c) The Royal Society of Chemistry Full Article
machine learning AI and machine learning to be top job roles in India: WEF report By www.thehindu.com Published On :: Thu, 11 May 2023 12:52:17 +0530 Full Article Shorts
machine learning Verge Genomics raises $32 million for machine learning-backed neuroscience drug discovery By cen.acs.org Published On :: 20 Jul 2018 16:15:23 +0000 Full Article
machine learning Bayesian machine learning improves single-wavelength anomalous diffraction phasing By scripts.iucr.org Published On :: 2019-10-07 Single-wavelength X-ray anomalous diffraction (SAD) is a frequently employed technique to solve the phase problem in X-ray crystallography. The precision and accuracy of recovered anomalous differences are crucial for determining the correct phases. Continuous rotation (CR) and inverse-beam geometry (IBG) anomalous data collection methods were performed on tetragonal lysozyme and monoclinic survivin crystals, and an analysis was carried out of how correlated the pairs of Friedel reflections are after scaling. A multivariate Bayesian model for estimating anomalous differences was tested, which takes into account the correlation between pairs of intensity observations and incorporates a priori knowledge about the positivity of intensity. The CR and IBG data collection methods resulted in positive correlation between I(+) and I(−) observations, indicating that the anomalous difference dominates between these observations, rather than different levels of radiation damage. An alternative pairing method based on near-simultaneously observed Bijvoet pairs displayed lower correlation and was unsuccessful at recovering useful anomalous differences when using the multivariate Bayesian model. In contrast, multivariate Bayesian treatment of Friedel pairs improved the initial phasing of the two tested crystal systems under both data collection methods. Full Article text
machine learning Machine learning technique sharpens prediction of material's mechanical properties By news.ntu.edu.sg Published On :: Sun, 15 Mar 2020 16:00:00 GMT ... Full Article All
machine learning Machine learning technique sharpens mechanical property prediction  By news.ntu.edu.sg Published On :: Tue, 17 Mar 2020 16:00:00 GMT Scientists at NTU Singapore, MIT and Brown University have developed new approaches that significantly improve the accuracy of an important material testing technique by harnessing the power of machine learning.... Full Article All
machine learning NuWave Solutions to Co-host Sentiment Analysis Workshop on Deep Learning, Machine Learning, and Lexicon Based By www.24-7pressrelease.com Published On :: Thu, 02 Jan 2020 07:00:00 GMT Would you like to know what your customers, users, contacts, or relatives really think? NuWave Solutions' Executive Vice President, Brian Frutchey, leads participants as they build their own sentiment analysis application with KNIME Analytics. Full Article
machine learning A Key Missing Part of the Machine Learning Stack By feedproxy.google.com Published On :: Mon, 20 Apr 2020 16:00:16 +0000 With many organizations having machine learning models running in production, some are discovering that inefficiencies exist in the first step of the process: feature definition and extraction. Robust feature management is now being recognized as a key missing part of the ML stack, and improving it by applying standard software development practices is gaining attention. Full Article 2020 Apr Opinions Feature Engineering Feature Extraction Machine Learning
machine learning Free High-Quality Machine Learning & Data Science Books & Courses: Quarantine Edition By feedproxy.google.com Published On :: Wed, 22 Apr 2020 12:00:13 +0000 If you find yourself quarantined and looking for free learning materials in the way of books and courses to sharpen your data science and machine learning skills, this collection of articles I have previously written curating such things is for you. Full Article 2020 Apr Tutorials Overviews Books Courses Data Science Free ebook Machine Learning MOOC
machine learning DBSCAN Clustering Algorithm in Machine Learning By feedproxy.google.com Published On :: Fri, 24 Apr 2020 12:00:35 +0000 An introduction to the DBSCAN algorithm and its Implementation in Python. Full Article 2020 Apr Tutorials Overviews Clustering DBSCAN Machine Learning
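The tutorial linked above walks through DBSCAN in Python; as a rough, self-contained sketch of the algorithm itself — the data and parameter values below are invented for illustration and are not taken from the article:

```python
import numpy as np

def dbscan(X, eps=0.5, min_pts=3):
    """Minimal DBSCAN: returns one label per point, -1 meaning noise."""
    n = len(X)
    # Pairwise Euclidean distances; O(n^2) memory, fine for toy data.
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        # Only an unvisited core point (enough neighbors) seeds a cluster.
        if visited[i] or len(neighbors[i]) < min_pts:
            continue
        visited[i] = True
        labels[i] = cluster
        stack = [i]
        while stack:  # expand through density-connected core points
            j = stack.pop()
            for k in neighbors[j]:
                if labels[k] == -1:
                    labels[k] = cluster  # claim border/unlabeled neighbors
                if not visited[k]:
                    visited[k] = True
                    if len(neighbors[k]) >= min_pts:
                        stack.append(k)  # neighbor is core: keep expanding
        cluster += 1
    return labels

# Two tight blobs plus one far-away outlier.
X = np.array([[0, 0], [0, 0.1], [0.1, 0], [0.1, 0.1],
              [5, 5], [5, 5.1], [5.1, 5], [5.1, 5.1],
              [10, 10]])
labels = dbscan(X, eps=0.5, min_pts=3)
```

With these settings the first four points form one cluster, the next four another, and the isolated point is flagged as noise.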
machine learning Top Stories, Apr 20-26: The Super Duper NLP Repo; Free High-Quality Machine Learning & Data Science Books & Courses By feedproxy.google.com Published On :: Mon, 27 Apr 2020 13:52:06 +0000 Also: Should Data Scientists Model COVID19 and other Biological Events; 5 Papers on CNNs Every Data Scientist Should Read; 24 Best (and Free) Books To Understand Machine Learning; Mathematics for Machine Learning: The Free eBook; Find Your Perfect Fit: A Quick Guide for Job Roles in the Data World Full Article 2020 Apr Top Stories Tweets Top stories
machine learning 10 Best Machine Learning Textbooks that All Data Scientists Should Read By feedproxy.google.com Published On :: Tue, 28 Apr 2020 12:00:46 +0000 Check out these 10 books that can help data scientists and aspiring data scientists learn machine learning today. Full Article 2020 Apr Tutorials Overviews Books Data Scientist Machine Learning
machine learning KDnuggets™ News 20:n17, Apr 29: The Super Duper NLP Repo; Free Machine Learning & Data Science Books & Courses for Quarantine By feedproxy.google.com Published On :: Wed, 29 Apr 2020 10:55:45 +0000 Also: Should Data Scientists Model COVID19 and other Biological Events; Learning during a crisis (Data Science 90-day learning challenge); Data Transformation: Standardization vs Normalization; DBSCAN Clustering Algorithm in Machine Learning; Find Your Perfect Fit: A Quick Guide for Job Roles in the Data World Full Article KDnuggets 2020 Issues Courses Covid-19 Data Science Free ebook Machine Learning Modeling NLP Normalization Standardization
machine learning Top KDnuggets tweets, Apr 22-28: 24 Best (and Free) Books To Understand Machine Learning By feedproxy.google.com Published On :: Wed, 29 Apr 2020 20:00:14 +0000 Also: A Concise Course in Statistical Inference: The Free eBook; ML Ops: Machine Learning as an Engineering Discipline; Learning during a crisis (#DataScience 90-day learning challenge) ; Free High-Quality Machine Learning & Data Science Books & Courses: Quarantine Edition Full Article 2020 Apr Top Stories Tweets Top tweets
machine learning Optimize Response Time of your Machine Learning API In Production By feedproxy.google.com Published On :: Fri, 01 May 2020 14:00:09 +0000 This article demonstrates how building a smarter API serving Deep Learning models minimizes the response time. Full Article 2020 May Tutorials Overviews API Machine Learning Optimization Production Python
machine learning Beginners Learning Path for Machine Learning By feedproxy.google.com Published On :: Tue, 05 May 2020 16:00:37 +0000 So, you are interested in machine learning? Here is your complete learning path to start your career in the field. Full Article 2020 May Tutorials Overviews Beginners Learning Path Machine Learning
machine learning Explaining “Blackbox” Machine Learning Models: Practical Application of SHAP By feedproxy.google.com Published On :: Wed, 06 May 2020 14:00:23 +0000 Train a "blackbox" GBM model on a real dataset and make it explainable with SHAP. Full Article 2020 May Tutorials Overviews Explainability Interpretability Python SHAP
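SHAP approximates Shapley values efficiently for tree ensembles; to make the quantity being approximated concrete, here is an exact brute-force computation over all feature coalitions (exponential cost, toy use only — the linear model below is an invented example, not the article's GBM):

```python
import itertools
import math
import numpy as np

def shapley_values(f, x, background):
    """Exact Shapley values of f at x; absent features take background values."""
    n = len(x)
    phi = np.zeros(n)

    def value(S):
        z = background.copy()
        z[list(S)] = x[list(S)]  # features in coalition S take their real values
        return f(z)

    for i in range(n):
        rest = [j for j in range(n) if j != i]
        for r in range(n):
            for S in itertools.combinations(rest, r):
                # Classic Shapley weight for a coalition of size r.
                w = math.factorial(r) * math.factorial(n - r - 1) / math.factorial(n)
                phi[i] += w * (value(S + (i,)) - value(S))
    return phi

# For a linear model the attributions reduce to w_i * (x_i - background_i).
w = np.array([2.0, -1.0, 0.5])
f = lambda z: float(z @ w)
x = np.array([1.0, 2.0, -3.0])
background = np.zeros(3)
phi = shapley_values(f, x, background)
```

The efficiency property holds by construction: the attributions sum to f(x) − f(background), which is what makes Shapley values attractive for explaining "blackbox" predictions.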
machine learning Top KDnuggets tweets, Apr 29 – May 5: 24 Best (and Free) Books To Understand Machine Learning By feedproxy.google.com Published On :: Wed, 06 May 2020 20:53:25 +0000 What are Some 'Advanced ' #AI and #MachineLearning Online Courses?; 24 Best (and Free) Books To Understand Machine Learning; Top 5 must-have #DataScience skills for 2020 Full Article 2020 May Top Stories Tweets Top tweets
machine learning Hyperparameter Optimization for Machine Learning Models By feedproxy.google.com Published On :: Thu, 07 May 2020 12:00:24 +0000 Check out this comprehensive guide to model optimization techniques. Full Article 2020 May Tutorials Overviews Hyperparameter Machine Learning Modeling Optimization Python
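As a minimal, dependency-free sketch of one technique from this family — random search over a ridge penalty scored on a held-out validation split (the data, search range, and closed-form ridge solver are all invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression task, split into train and validation sets.
X = rng.normal(size=(200, 5))
w_true = np.array([1.5, -2.0, 0.0, 0.7, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=200)
Xtr, ytr, Xva, yva = X[:150], y[:150], X[150:], y[150:]

def ridge_fit(X, y, lam):
    # Closed-form ridge regression: (X'X + lam*I)^{-1} X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def val_mse(lam):
    w = ridge_fit(Xtr, ytr, lam)
    return float(np.mean((Xva @ w - yva) ** 2))

# Random search: sample the penalty log-uniformly, keep the best on validation.
candidates = 10.0 ** rng.uniform(-4, 3, size=30)
best_lam = min(candidates, key=val_mse)
```

Grid search, Bayesian optimization, and successive halving all slot into the same skeleton; only the way `candidates` is produced changes.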
machine learning Will Machine Learning Engineers Exist in 10 Years? By feedproxy.google.com Published On :: Fri, 08 May 2020 16:00:45 +0000 As can be common in many technical fields, the landscape of specialized roles is evolving quickly. With more people learning at least a little machine learning, this could eventually become a common skill set for every software engineer. Full Article 2020 May Opinions Advice AutoML Career Machine Learning Engineer Trends
machine learning Top April Stories: Mathematics for Machine Learning: The Free eBook By feedproxy.google.com Published On :: Fri, 08 May 2020 20:36:00 +0000 Also: Introducing MIDAS: A New Baseline for Anomaly Detection in Graphs; The Super Duper NLP Repo: 100 Ready-to-Run Colab Notebooks; Five Cool Python Libraries for Data Science. Full Article 2020 May Top Stories Tweets Top stories
machine learning Should a small business invest in AI and machine learning software? By economictimes.indiatimes.com Published On :: 2019-05-25T10:37:16+05:30 Both AI and ML are touted to give businesses the edge they need, improve efficiencies, make sales and marketing better and even help in critical HR functions. Full Article
machine learning AI, machine learning can help achieve $5 trillion target: Piyush Goyal By economictimes.indiatimes.com Published On :: 2020-01-06T23:52:54+05:30 “Our government believes artificial intelligence, in different forms, can help us achieve the $5 trillion benchmark over the next five years, but also help us do it effectively and efficiently,” Goyal said while inaugurating the NSE Knowledge Hub here. The hub is an AI-powered learning ecosystem for the banking, financial services and insurance sector. Full Article
machine learning Laplace’s Demon: A Seminar Series about Bayesian Machine Learning at Scale By statmodeling.stat.columbia.edu Published On :: Thu, 07 May 2020 21:20:16 +0000 David Rohde points us to this new seminar series that has the following description: Machine learning is changing the world we live in at a break neck pace. From image recognition and generation, to the deployment of recommender systems, it seems to be breaking new ground constantly and influencing almost every aspect of our lives. […] Full Article Bayesian Statistics Statistical computing
machine learning Differential Machine Learning. (arXiv:2005.02347v2 [q-fin.CP] UPDATED) By arxiv.org Published On :: Differential machine learning (ML) extends supervised learning, with models trained on examples of not only inputs and labels, but also differentials of labels to inputs. Differential ML is applicable in all situations where high quality first order derivatives wrt training inputs are available. In the context of financial Derivatives risk management, pathwise differentials are efficiently computed with automatic adjoint differentiation (AAD). Differential ML, combined with AAD, provides extremely effective pricing and risk approximations. We can produce fast pricing analytics in models too complex for closed form solutions, extract the risk factors of complex transactions and trading books, and effectively compute risk management metrics like reports across a large number of scenarios, backtesting and simulation of hedge strategies, or capital regulations. The article focuses on differential deep learning (DL), arguably the strongest application. Standard DL trains neural networks (NN) on punctual examples, whereas differential DL teaches them the shape of the target function, resulting in vastly improved performance, illustrated with a number of numerical examples, both idealized and real world. In the online appendices, we apply differential learning to other ML models, like classic regression or principal component analysis (PCA), with equally remarkable results. This paper is meant to be read in conjunction with its companion GitHub repo https://github.com/differential-machine-learning, where we posted a TensorFlow implementation, tested on Google Colab, along with examples from the article and additional ones. We also posted appendices covering many practical implementation details not covered in the paper, mathematical proofs, application to ML models besides neural networks and extensions necessary for a reliable implementation in production. Full Article
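The training objective described in the abstract — fit values and input-derivatives jointly — can be sketched with a linear model, whose input gradient is simply its weight vector. The toy data and hyperparameters below are invented; this illustrates the combined loss, not the paper's AAD/TensorFlow pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "pricing" function y = 3*x1 - 2*x2, whose pathwise derivatives are known.
X = rng.normal(size=(20, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1]
dydx = np.tile([3.0, -2.0], (20, 1))   # differential labels, one row per sample

# Linear model f(x) = w @ x; its gradient w.r.t. the input is just w.
w = np.zeros(2)
lam, lr = 1.0, 0.05                    # differential-loss weight, step size
for _ in range(500):
    resid = X @ w - y
    grad_value = X.T @ resid / len(X)            # gradient of the value loss
    grad_diff = lam * (w - dydx.mean(axis=0))    # gradient of the derivative loss
    w -= lr * (grad_value + grad_diff)
```

Both loss terms share the same minimizer here, so gradient descent recovers w ≈ (3, −2); on noisy data the differential term acts as a strong, shape-aware regularizer, which is the paper's central point.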
machine learning Machine learning topological phases in real space. (arXiv:1901.01963v4 [cond-mat.mes-hall] UPDATED) By arxiv.org Published On :: We develop a supervised machine learning algorithm that is able to learn topological phases for finite condensed matter systems from bulk data in real lattice space. The algorithm employs diagonalization in real space together with any supervised learning algorithm to learn topological phases through an eigenvector ensembling procedure. We combine our algorithm with decision trees and random forests to successfully recover topological phase diagrams of Su-Schrieffer-Heeger (SSH) models from bulk lattice data in real space and show how the Shannon information entropy of ensembles of lattice eigenvectors can be used to retrieve a signal detailing how topological information is distributed in the bulk. The discovery of Shannon information entropy signals associated with topological phase transitions from the analysis of data from several thousand SSH systems illustrates how model explainability in machine learning can advance the research of exotic quantum materials with properties that may power future technological applications such as qubit engineering for quantum computing. Full Article
machine learning Estimating Blood Pressure from Photoplethysmogram Signal and Demographic Features using Machine Learning Techniques. (arXiv:2005.03357v1 [eess.SP]) By arxiv.org Published On :: Hypertension is a potentially dangerous health condition that can be detected directly from blood pressure (BP) and frequently leads to other health complications. Continuous monitoring of BP is therefore important; however, cuff-based BP measurements are discrete and uncomfortable for the user. To address this need, a cuff-less, continuous, and non-invasive BP measurement system is proposed using the photoplethysmogram (PPG) signal and demographic features with machine learning (ML) algorithms. PPG signals were acquired from 219 subjects and underwent pre-processing and feature extraction steps. Time, frequency, and time-frequency domain features were extracted from the PPG and its derivative signals. Feature selection techniques were used to reduce the computational complexity and the chance of over-fitting the ML algorithms. The features were then used to train and evaluate ML algorithms, and the best regression models were selected for systolic BP (SBP) and diastolic BP (DBP) estimation individually. Gaussian process regression (GPR) along with the ReliefF feature selection algorithm outperformed the other algorithms, estimating SBP and DBP with root-mean-square errors (RMSE) of 6.74 and 3.59, respectively. This ML model can be implemented in hardware systems to continuously monitor BP and help avoid critical health conditions due to sudden changes. Full Article
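The best-performing model in this abstract is Gaussian process regression, whose posterior mean has a compact closed form. The sketch below uses synthetic 1-D data; the RBF kernel and hyperparameters are illustrative and unrelated to the paper's PPG features:

```python
import numpy as np

def rbf(A, B, length=1.0):
    """Squared-exponential kernel matrix between row-sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / length ** 2)

def gpr_predict(Xtr, ytr, Xte, noise=1e-2, length=1.0):
    """GP regression posterior mean: K_*x (K_xx + noise*I)^{-1} y."""
    K = rbf(Xtr, Xtr, length) + noise * np.eye(len(Xtr))
    return rbf(Xte, Xtr, length) @ np.linalg.solve(K, ytr)

# Smooth 1-D target; the GP interpolates it from 20 samples.
Xtr = np.linspace(0.0, 1.0, 20)[:, None]
ytr = np.sin(2.0 * np.pi * Xtr[:, 0])
pred_train = gpr_predict(Xtr, ytr, Xtr, noise=1e-4, length=0.2)
pred_mid = gpr_predict(Xtr, ytr, np.array([[0.5]]), noise=1e-4, length=0.2)
```

A practical appeal of GPR for this application is that the same posterior also yields predictive variances, i.e. per-estimate uncertainty, omitted here for brevity.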
machine learning Eliminating NB-IoT Interference to LTE System: a Sparse Machine Learning Based Approach. (arXiv:2005.03092v1 [cs.IT]) By arxiv.org Published On :: Narrowband internet-of-things (NB-IoT) is a competitive 5G technology for massive machine-type communication scenarios, but meanwhile introduces narrowband interference (NBI) to existing broadband transmission such as the long term evolution (LTE) systems in enhanced mobile broadband (eMBB) scenarios. In order to facilitate the harmonic and fair coexistence in wireless heterogeneous networks, it is important to eliminate NB-IoT interference to LTE systems. In this paper, a novel sparse machine learning based framework and a sparse combinatorial optimization problem is formulated for accurate NBI recovery, which can be efficiently solved using the proposed iterative sparse learning algorithm called sparse cross-entropy minimization (SCEM). To further improve the recovery accuracy and convergence rate, regularization is introduced to the loss function in the enhanced algorithm called regularized SCEM. Moreover, exploiting the spatial correlation of NBI, the framework is extended to multiple-input multiple-output systems. Simulation results demonstrate that the proposed methods are effective in eliminating NB-IoT interference to LTE systems, and significantly outperform the state-of-the-art methods. Full Article
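The paper's sparse cross-entropy minimization algorithm is not reproduced here; the generic cross-entropy method it builds on can be sketched for a toy sparse-recovery problem — maintain Bernoulli inclusion probabilities over support indices and update them toward the best-scoring samples. All sizes, penalties, and schedules below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
N, k, m = 12, 2, 24
A = rng.normal(size=(m, N))                 # measurement matrix
x_true = np.zeros(N)
x_true[[2, 7]] = [1.5, -2.0]                # sparse NBI-like signal
y = A @ x_true                              # noiseless observations

def score(mask):
    """Least-squares residual on the selected support, plus a sparsity penalty."""
    S = np.flatnonzero(mask)
    if len(S) == 0:
        return float(np.linalg.norm(y))
    coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    return float(np.linalg.norm(y - A[:, S] @ coef)) + 0.05 * len(S)

# Cross-entropy method over support indicators.
p = np.full(N, 0.5)                         # Bernoulli inclusion probabilities
for _ in range(25):
    masks = rng.random((400, N)) < p
    scores = np.array([score(mk) for mk in masks])
    elite = masks[np.argsort(scores)[:40]]  # best 10% of samples
    p = 0.7 * elite.mean(axis=0) + 0.3 * p  # smoothed update toward the elite

support_hat = np.sort(np.argsort(p)[-k:])
coef, *_ = np.linalg.lstsq(A[:, support_hat], y, rcond=None)
```

The probabilities concentrate on the true support because only masks containing it achieve near-zero residual, and the sparsity penalty discourages supersets.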
machine learning AVAC: A Machine Learning based Adaptive RRAM Variability-Aware Controller for Edge Devices. (arXiv:2005.03077v1 [eess.SY]) By arxiv.org Published On :: Recently, the Edge Computing paradigm has gained significant popularity both in industry and academia. Researchers now increasingly target to improve performance and reduce energy consumption of such devices. Some recent efforts focus on using emerging RRAM technologies for improving energy efficiency, thanks to their no leakage property and high integration density. As the complexity and dynamism of applications supported by such devices escalate, it has become difficult to maintain ideal performance by static RRAM controllers. Machine Learning provides a promising solution for this, and hence, this work focuses on extending such controllers to allow dynamic parameter updates. In this work we propose an Adaptive RRAM Variability-Aware Controller, AVAC, which periodically updates Wait Buffer and batch sizes using on-the-fly learning models and gradient ascent. AVAC allows Edge devices to adapt to different applications and their stages, to improve computation performance and reduce energy consumption. Simulations demonstrate that the proposed model can provide up to 29% increase in performance and 19% decrease in energy, compared to static controllers, using traces of real-life healthcare applications on a Raspberry-Pi based Edge deployment. Full Article
machine learning Offloading your Informix data in Spark, Part 5: Machine Learning will help you extrapolate future orders By www.ibm.com Published On :: 15 Dec 2017 05:00:00 +0000 Part 5 of this tutorial series teaches you how to add machine learning to your data to help you extrapolate future orders. Full Article data opensource
machine learning Three Paper Thursday: Adversarial Machine Learning, Humans and everything in between By www.lightbluetouchpaper.org Published On :: Thu, 16 Apr 2020 14:41:35 +0000 Recent advancements in Machine Learning (ML) have taught us two main lessons: a large proportion of things that humans do can actually be automated, and that a substantial part of this automation can be done with minimal human supervision. One no longer needs to select features for models to use; in many cases people are … Continue reading Three Paper Thursday: Adversarial Machine Learning, Humans and everything in between → Full Article Three Paper Thursday
machine learning AI and Machine Learning for Coders By shop.oreilly.com Published On :: Wed, 06 May 2020 04:57:34 PDT If you’re looking to make a career move from programmer to AI specialist, this is the ideal place to start. Based on Laurence Moroney's extremely successful AI courses, this introductory book provides a hands-on, code-first approach to help you build confidence while you learn key topics.You’ll understand how to implement the most common scenarios in machine learning, such as computer vision, natural language processing (NLP), and sequence modeling for web, mobile, cloud, and embedded runtimes. Full Article
machine learning Kubeflow for Machine Learning By shop.oreilly.com Published On :: Wed, 06 May 2020 04:50:34 PDT If you’re training a machine learning model but aren’t sure how to put it into production, this book will get you there. Kubeflow provides a collection of cloud native tools for different stages of a model’s lifecycle, from data exploration, feature preparation, and model training to model serving. This guide helps data scientists build production-grade machine learning implementations with Kubeflow and shows data engineers how to make models scalable and reliable. Full Article
machine learning Article: Marketing in China: Can Machine Learning Solve the ROI Problem? By www.emarketer.com Published On :: Wed, 24 Jan 2018 04:01:00 GMT William Bao Bean, managing director of Chinaccelerator, explains how investments in artificial intelligence and machine learning are helping marketers improve user targeting and return on investment. Full Article
machine learning Erratum. Predicting 10-Year Risk of End-Organ Complications of Type 2 Diabetes With and Without Metabolic Surgery: A Machine Learning Approach. Diabetes Care 2020;43:852-859 By care.diabetesjournals.org Published On :: 2020-04-15T14:26:52-07:00 Full Article
machine learning Predicting the Risk of Inpatient Hypoglycemia With Machine Learning Using Electronic Health Records By care.diabetesjournals.org Published On :: 2020-04-29T13:46:01-07:00 OBJECTIVE We analyzed data from inpatients with diabetes admitted to a large university hospital to predict the risk of hypoglycemia through the use of machine learning algorithms. RESEARCH DESIGN AND METHODS Four years of data were extracted from a hospital electronic health record system. This included laboratory and point-of-care blood glucose (BG) values to identify biochemical and clinically significant hypoglycemic episodes (BG ≤3.9 and ≤2.9 mmol/L, respectively). We used patient demographics, administered medications, vital signs, laboratory results, and procedures performed during the hospital stays to inform the model. Two iterations of the data set included the doses of insulin administered and the past history of inpatient hypoglycemia. Eighteen different prediction models were compared using the area under the receiver operating characteristic curve (AUROC) through a 10-fold cross validation. RESULTS We analyzed data obtained from 17,658 inpatients with diabetes who underwent 32,758 admissions between July 2014 and August 2018. The predictive factors from the logistic regression model included people undergoing procedures, weight, type of diabetes, oxygen saturation level, use of medications (insulin, sulfonylurea, and metformin), and albumin levels. The machine learning model with the best performance was the XGBoost model (AUROC 0.96). This outperformed the logistic regression model, which had an AUROC of 0.75 for the estimation of the risk of clinically significant hypoglycemia. CONCLUSIONS Advanced machine learning models are superior to logistic regression models in predicting the risk of hypoglycemia in inpatients with diabetes. Trials of such models should be conducted in real time to evaluate their utility to reduce inpatient hypoglycemia. Full Article
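The model comparison in this abstract hinges on the AUROC, which can be computed directly from the rank-sum (Mann–Whitney U) identity. This is a self-contained toy sketch, not the study's code, and ties between scores are not handled:

```python
import numpy as np

def auroc(y_true, scores):
    """AUROC via the rank-sum identity (no tie handling)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)    # 1-based ranks of the scores
    pos = y_true == 1
    n_pos, n_neg = int(pos.sum()), int((~pos).sum())
    u = ranks[pos].sum() - n_pos * (n_pos + 1) / 2  # Mann-Whitney U statistic
    return u / (n_pos * n_neg)

a = auroc(np.array([0, 0, 1, 1]), np.array([0.1, 0.4, 0.35, 0.8]))
```

Here one of the two positives is out-ranked by a negative, giving an AUROC of 0.75; a classifier that ranks every positive above every negative scores 1.0.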
machine learning Predicting 10-Year Risk of End-Organ Complications of Type 2 Diabetes With and Without Metabolic Surgery: A Machine Learning Approach By care.diabetesjournals.org Published On :: 2020-03-20T11:50:34-07:00 OBJECTIVE To construct and internally validate prediction models to estimate the risk of long-term end-organ complications and mortality in patients with type 2 diabetes and obesity that can be used to inform treatment decisions for patients and practitioners who are considering metabolic surgery. RESEARCH DESIGN AND METHODS A total of 2,287 patients with type 2 diabetes who underwent metabolic surgery between 1998 and 2017 in the Cleveland Clinic Health System were propensity-matched 1:5 to 11,435 nonsurgical patients with BMI ≥30 kg/m2 and type 2 diabetes who received usual care with follow-up through December 2018. Multivariable time-to-event regression and random forest machine learning models were built and internally validated using fivefold cross-validation to predict the 10-year risk for four outcomes of interest. The prediction models were programmed to construct user-friendly web-based and smartphone applications of Individualized Diabetes Complications (IDC) Risk Scores for clinical use. RESULTS The prediction tools demonstrated the following discrimination ability based on the area under the receiver operating characteristic curve (1 = perfect discrimination and 0.5 = chance) at 10 years in the surgical and nonsurgical groups, respectively: all-cause mortality (0.79 and 0.81), coronary artery events (0.66 and 0.67), heart failure (0.73 and 0.75), and nephropathy (0.73 and 0.76). When a patient’s data are entered into the IDC application, it estimates the individualized 10-year morbidity and mortality risks with and without undergoing metabolic surgery. CONCLUSIONS The IDC Risk Scores can provide personalized evidence-based risk information for patients with type 2 diabetes and obesity about future cardiovascular outcomes and mortality with and without metabolic surgery based on their current status of obesity, diabetes, and related cardiometabolic conditions. Full Article
machine learning GADMM: Fast and Communication Efficient Framework for Distributed Machine Learning By Published On :: 2020 When the data is distributed across multiple servers, lowering the communication cost between the servers (or workers) while solving the distributed learning problem is an important problem and is the focus of this paper. In particular, we propose a fast, and communication-efficient decentralized framework to solve the distributed machine learning (DML) problem. The proposed algorithm, Group Alternating Direction Method of Multipliers (GADMM) is based on the Alternating Direction Method of Multipliers (ADMM) framework. The key novelty in GADMM is that it solves the problem in a decentralized topology where at most half of the workers are competing for the limited communication resources at any given time. Moreover, each worker exchanges the locally trained model only with two neighboring workers, thereby training a global model with a lower amount of communication overhead in each exchange. We prove that GADMM converges to the optimal solution for convex loss functions, and numerically show that it converges faster and more communication-efficient than the state-of-the-art communication-efficient algorithms such as the Lazily Aggregated Gradient (LAG) and dual averaging, in linear and logistic regression tasks on synthetic and real datasets. Furthermore, we propose Dynamic GADMM (D-GADMM), a variant of GADMM, and prove its convergence under the time-varying network topology of the workers. Full Article
machine learning Cyclic Boosting -- an explainable supervised machine learning algorithm. (arXiv:2002.03425v2 [cs.LG] UPDATED) By arxiv.org Published On :: Supervised machine learning algorithms have seen spectacular advances and surpassed human level performance in a wide range of specific applications. However, using complex ensemble or deep learning algorithms typically results in black box models, where the path leading to individual predictions cannot be followed in detail. In order to address this issue, we propose the novel "Cyclic Boosting" machine learning algorithm, which allows to efficiently perform accurate regression and classification tasks while at the same time allowing a detailed understanding of how each individual prediction was made. Full Article
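To give a flavor of the cited algorithm, here is a toy reconstruction from the abstract's description: per-bin multiplicative factors for each categorical feature, updated cyclically so each bin's predictions match its observed totals. The data and schedule are invented, and this is not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
a = rng.integers(0, 3, n)                    # categorical feature A (3 bins)
b = rng.integers(0, 4, n)                    # categorical feature B (4 bins)
f1_true = np.array([0.5, 1.0, 2.0])
f2_true = np.array([0.8, 1.0, 1.2, 1.5])
y = 10.0 * f1_true[a] * f2_true[b]           # purely multiplicative target

mu = y.mean()                                # global scale
f1, f2 = np.ones(3), np.ones(4)
for _ in range(20):                          # cycle over features and bins
    for bin_ in range(3):
        mask = a == bin_
        pred = mu * f1[a] * f2[b]
        f1[bin_] *= y[mask].sum() / pred[mask].sum()  # match bin A totals
    for bin_ in range(4):
        mask = b == bin_
        pred = mu * f1[a] * f2[b]
        f2[bin_] *= y[mask].sum() / pred[mask].sum()  # match bin B totals

pred = mu * f1[a] * f2[b]
```

The learned per-bin factors are directly readable as each feature value's multiplicative contribution, which is the explainability property the abstract emphasizes: every individual prediction decomposes into its factors.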
machine learning Machine learning in medicine : a complete overview By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Author: Cleophas, Ton J. M., authorCallnumber: OnlineISBN: 9783030339708 (electronic bk.) Full Article
machine learning Machine learning in aquaculture : hunger classification of Lates calcarifer By dal.novanet.ca Published On :: Fri, 1 May 2020 19:44:43 -0300 Author: Mohd Razman, Mohd Azraai, authorCallnumber: OnlineISBN: 9789811522376 (electronic bk.) Full Article
machine learning Exact lower bounds for the agnostic probably-approximately-correct (PAC) machine learning model By projecteuclid.org Published On :: Fri, 02 Aug 2019 22:04 EDT Aryeh Kontorovich, Iosif Pinelis. Source: The Annals of Statistics, Volume 47, Number 5, 2822--2854. Abstract: We provide an exact nonasymptotic lower bound on the minimax expected excess risk (EER) in the agnostic probably-approximately-correct (PAC) machine learning classification model and identify minimax learning algorithms as certain maximally symmetric and minimally randomized “voting” procedures. Based on this result, an exact asymptotic lower bound on the minimax EER is provided. This bound is of the simple form $c_\infty/\sqrt{\nu}$ as $\nu\to\infty$, where $c_\infty=0.16997\dots$ is a universal constant, $\nu=m/d$, $m$ is the size of the training sample and $d$ is the Vapnik–Chervonenkis dimension of the hypothesis class. It is shown that the differences between these asymptotic and nonasymptotic bounds, as well as the differences between these two bounds and the maximum EER of any learning algorithms that minimize the empirical risk, are asymptotically negligible, and all these differences are due to ties in the mentioned “voting” procedures. A few easy to compute nonasymptotic lower bounds on the minimax EER are also obtained, which are shown to be close to the exact asymptotic lower bound $c_\infty/\sqrt{\nu}$ even for rather small values of the ratio $\nu=m/d$. As an application of these results, we substantially improve existing lower bounds on the tail probability of the excess risk. Among the tools used are Bayes estimation and apparently new identities and inequalities for binomial distributions. Full Article