class

Lithium ion adduction enables UPLC-MS/MS-based analysis of multi-class 3-hydroxyl group-containing keto-steroids [Methods]

Steroids that contain a 3-hydroxyl group (3-OH steroids) are widely distributed in nature. During analysis with ESI-MS, they easily become dehydrated while in the protonated form, resulting in the production of several precursor ions and leading to low sensitivity of detection. To address this analytical challenge, here, we developed a method for the quantitation of 3-OH steroids by LC-MS/MS coupled with post-column addition of lithium (Li) ions to the mobile phase. The Li ion has a high affinity for the keto group of steroids, stabilizing their structures during ionization and permitting detection of analytes exclusively as the lithiated form. This not only improved the intensities of the precursor ions, but also promoted the formation of typical lithiated fragment ions. This improvement made the quantitation by multiple reaction monitoring more sensitive and reliable, as evidenced by 1.53–188 times enhanced detection sensitivity of 13 steroids that contained at least one keto and two hydroxyl groups or one keto and one 5-olefinic double bond, among 16 different 3-OH steroids. We deployed our newly developed method for profiling steroids in mouse brain tissue and identified six steroids in one tissue sample. Among these, 16-hydroxyestrone, tetrahydrocorticosterone, and 17α-hydroxypregnenolone were detected for the first time in the mouse brain. In summary, the method described here enables the detection of lithiated steroids by LC-MS/MS, including three 3-OH steroids not previously reported in the mouse brain. We anticipate that this new method may allow the determination of 3-OH steroids in different brain regions.




class

Exercise in old age - "we need kendo classes in Huddersfield"

There's a crisis in old age care - not just in the UK but around the world. As population demographics shift and the proportion of older people increases, there's a worry about who is going to look after them, and how much it is going to cost. However, a new analysis on bmj.com says this picture need not be so gloomy - the authors say that encouraging...




class

From dance class to social prescription - starting and evaluating an idea

If you read the Christmas BMJ in the last few weeks, you might have noticed a lot around art and health - the way in which engagement in the arts can help in prevention and treatment, but can also affect those more nebulous things which really matter to patients: loneliness, self-expression, being connected to the wider community. That obviously...




class

Classification and Diagnosis of Diabetes Mellitus and Other Categories of Glucose Intolerance

National Diabetes Data Group
Dec 1, 1979; 28:1039-1057
Articles




class

International Classification of Diseases, 10th Revision, Coding for Diabetes

Joy Dugan
Oct 1, 2017; 35:232-238
Practical Pointers




class

A Review of the Pathophysiology, Classification, and Treatment of Foot Ulcers in Diabetic Patients

Warren Clayton
Mar 1, 2009; 27:52-58
Features




class

Watch: Utah man reunited with class ring 38 years after it was lost in Germany

A Utah man who lost his high school class ring in Germany in 1982 was reunited with the ring thanks to a man who found it on a beach in the United States.




class

Study: Evidence does not support classifying fluoride as cognitive neurodevelopmental hazard

The National Academies of Sciences, Engineering, and Medicine announced March 5 that it does not find that the National Toxicology Program adequately supported its conclusion that fluoride is “presumed” to be a cognitive neurodevelopmental hazard to humans.




class

Classroom Observations

Classroom observations are conducted to document behaviors and to provide insight to teachers. Teachers are typically focused on the overall learning of the entire class, and it is not possible for them to catch every detail of the classroom while teaching. An outside observer, often a School Psychologist, can sit in the classroom and observe a student or the entire class. These insights can be used to provide better instruction, create behavioral or academic interventions, or document behaviors.

When do classroom observations occur?

- During a special education evaluation: The classroom observation is a required component of a special education evaluation. It provides data and insight to the eligibility committee.

- Before a Behavior Intervention Plan or Functional Behavioral Assessment: Classroom observations are important before implementing a Functional Behavioral Assessment or a Behavior Intervention Plan. They help to clarify current behaviors, identify possible triggers, and determine how frequently the behaviors occur.

- When a teacher is worried about a particular student: Often a teacher will have a concern about a student and ask the School Psychologist to conduct an observation. After the observation, the teacher and psychologist meet to discuss the findings and brainstorm strategies to assist instruction.




class

'Valley Girl' musical reinvents classic '80s music

Director Rachel Lee Goldenberg discusses how her "Valley Girl" musical remake puts a twist on the hit music of the '80s.




class

Effectively Serving Children in a Superdiverse Classroom: Implications for the Early Education System

As the number and share of Dual Language Learners (DLLs) continues to grow across the United States, diversity within this population is also increasing. This webinar marks the release of a report providing analysis of the diversity within the DLL population nationwide and at the state and local levels. Speakers discuss data on the three rapidly growing subgroups within the DLL population: Black and Asian American and Pacific Islander DLLs and young children of refugees, and the implications for the early education and care field and K-12 education systems. 




class

A RedForEd Wave: Teachers in North and South Carolina Leave Classrooms in Protest

A sea of red swept the capitals of North and South Carolina on Wednesday, as thousands of teachers turned out to demand higher pay and more school funding.




class

LeBron James to honor Class of 2020 with all-star event




class

Unimpressed by online classes, college students seek refunds




class

Stop Giving Inexperienced Teachers All the Lower-Level Math Classes, Reformers Argue

“Detracking” math teachers is tough because many educators resist upending their routines or challenging informal hierarchies, and PD initiatives to make it happen are limited.




class

Lamont canceling in-person classes for rest of school year




class

Public schools, classes at Univ. of SC hope for fall return




class

Tennessee School District Prohibits Crowdfunding for Class Supplies

A school district in Tennessee says it no longer wants teachers to use crowdfunding websites to get extra school supplies.




class

Small Arkansas School District Installs Safe Rooms in All Classrooms

A school district in a small Arkansas city has installed steel safe rooms in every classroom.




class

The menu: College athletes get cooking classes, grocery tips

Nevada offensive lineman Nate Brown is doing his best to eat right, like many football players and other college athletes scattered around the country without access to training facilities amid the coronavirus pandemic. The 6-foot-4, 300-pound rising senior has stumbled a few times in college sports' version of Weight Watchers, with no in-person classes or spring practices. ''Maybe I would get Taco Bell because I do like Taco Bell,'' Brown said.




class

Oregon football lands four-star, All-American OT Bram Walden for class of 2021

Another piece to the TakeFlight21 puzzle




class

Classic keys : keyboard sounds that launched rock music / Alan S. Lenhoff and David Robertson.

Keyboard instruments -- History -- 20th century.




class

No longer newsworthy : how the mainstream media abandoned the working class / Christopher R. Martin.

Working class -- Press coverage -- United States.




class

Ectopic pregnancy : its etiology, classification, embryology, diagnosis and treatment / by J. Clarence Webster.

Edinburgh : Young J. Pentland, 1895.




class

Iowa governor: K-12 schools won't resume classes this year




class

Earthquake Scuttles Classes in Alaska, As California Students Return to School

While thousands of students in wildfire-ravaged Northern California resumed classes last week, thousands of others in Alaska stayed home after a 7.0 magnitude earthquake struck Nov. 30.




class

Capitals' Greatest Hits: How to watch Troy Brouwer's game-winning goal in 2015 Winter Classic

Troy Brouwer scored the game-winning goal as the Capitals beat the Blackhawks at Nationals Park on Jan. 1, 2015. Relive the excitement with Monday night's replay.




class

Kobe, Duncan, Garnett headline Basketball Hall of Fame class

Kobe Bryant was already immortal. Bryant and fellow NBA greats Tim Duncan and Kevin Garnett headlined a nine-person group announced Saturday as this year’s class of enshrinees into the Naismith Memorial Basketball Hall of Fame. Two-time NBA champion coach Rudy Tomjanovich finally got his call, as did longtime Baylor women’s coach Kim Mulkey, 1,000-game winner Barbara Stevens of Bentley and three-time Final Four coach Eddie Sutton.




class

The Class of 2020: A look at basketball's new Hall of Famers

A look at the newest members of the Naismith Memorial Basketball Hall of Fame, announced on Saturday:




class

Neyman-Pearson classification: parametrics and sample size requirement

The Neyman-Pearson (NP) paradigm in binary classification seeks classifiers that achieve a minimal type II error while keeping the prioritized type I error below some user-specified level $\alpha$. This paradigm serves naturally in applications such as severe disease diagnosis and spam detection, where people have clear priorities between the two error types. Recently, Tong, Feng, and Li (2018) proposed a nonparametric umbrella algorithm that adapts all scoring-type classification methods (e.g., logistic regression, support vector machines, random forest) to respect the given type I error upper bound $\alpha$ (i.e., the conditional probability of classifying a class $0$ observation as class $1$ under the 0-1 coding) with high probability, without specific distributional assumptions on the features and the responses. Universal as the umbrella algorithm is, it demands an explicit minimum sample size requirement on class $0$, which is often the scarcer class, as in rare disease diagnosis applications. In this work, we employ the parametric linear discriminant analysis (LDA) model and propose a new parametric thresholding algorithm, which does not need a minimum sample size requirement on class $0$ observations and thus is suitable for small-sample applications such as rare disease diagnosis. Leveraging both the existing nonparametric and the newly proposed parametric thresholding rules, we propose four LDA-based NP classifiers, for both low- and high-dimensional settings. On the theoretical front, we prove NP oracle inequalities for one proposed classifier, where the rate for the excess type II error benefits from the explicit parametric model assumption. Furthermore, as NP classifiers involve a sample-splitting step on class $0$ observations, we construct a new adaptive sample-splitting scheme that can be applied universally to NP classifiers, and this adaptive strategy reduces the type II error of these classifiers. The proposed NP classifiers are implemented in the R package nproc.
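
As a rough illustration of the thresholding idea behind NP classification, here is a naive empirical-quantile sketch in Python - not the nproc package or the paper's high-probability order-statistic rule - that fits an LDA scorer and picks a cutoff from held-out class-0 scores:

```python
# Naive NP-style thresholding sketch: control the empirical type I error at
# alpha by thresholding class-1 scores at a quantile of held-out class-0 scores.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
alpha = 0.05

X0 = rng.normal(0.0, 1.0, size=(400, 5))          # class 0 (e.g., healthy)
X1 = rng.normal(1.0, 1.0, size=(400, 5))          # class 1 (e.g., diseased)

# split class 0: half for fitting the scorer, half for choosing the threshold
fit0, thr0 = X0[:200], X0[200:]
lda = LinearDiscriminantAnalysis().fit(
    np.vstack([fit0, X1]), np.array([0] * 200 + [1] * len(X1)))

# threshold at the empirical (1 - alpha) quantile of class-0 scores so that
# the empirical P(score > t | class 0) is at most alpha
s0 = lda.predict_proba(thr0)[:, 1]
t = np.quantile(s0, 1 - alpha)

def np_classify(X_new):
    return (lda.predict_proba(X_new)[:, 1] > t).astype(int)

X0_test = rng.normal(0.0, 1.0, size=(1000, 5))
X1_test = rng.normal(1.0, 1.0, size=(1000, 5))
print("empirical type I error:", np_classify(X0_test).mean())
print("empirical type II error:", 1 - np_classify(X1_test).mean())
```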




class

Perturbation Bounds for Procrustes, Classical Scaling, and Trilateration, with Applications to Manifold Learning

One of the common tasks in unsupervised learning is dimensionality reduction, where the goal is to find meaningful low-dimensional structures hidden in high-dimensional data. Sometimes referred to as manifold learning, this problem is closely related to the problem of localization, which aims at embedding a weighted graph into a low-dimensional Euclidean space. Several methods have been proposed for localization, and also for manifold learning. Nonetheless, the robustness properties of most of them are little understood. In this paper, we obtain perturbation bounds for classical scaling and trilateration, which are then applied to derive performance bounds for Isomap, Landmark Isomap, and Maximum Variance Unfolding. A new perturbation bound for Procrustes analysis plays a key role.
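
For readers unfamiliar with the first of the analyzed procedures, a minimal Python sketch of classical scaling (classical multidimensional scaling) from a pairwise-distance matrix looks like this:

```python
# Classical scaling: recover a low-dimensional embedding from pairwise
# Euclidean distances via double-centering and a truncated eigendecomposition.
import numpy as np

def classical_scaling(D, dim):
    """Embed points in `dim` dimensions from an n x n distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]              # keep the top eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# toy check: 2-D points are recovered up to rotation and translation
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
Y = classical_scaling(D, 2)
```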




class

Targeted Fused Ridge Estimation of Inverse Covariance Matrices from Multiple High-Dimensional Data Classes

We consider the problem of jointly estimating multiple inverse covariance matrices from high-dimensional data consisting of distinct classes. An $\ell_2$-penalized maximum likelihood approach is employed. The suggested approach is flexible and generic, incorporating several other $\ell_2$-penalized estimators as special cases. In addition, the approach allows specification of target matrices through which prior knowledge may be incorporated and which can stabilize the estimation procedure in high-dimensional settings. The result is a targeted fused ridge estimator that is of use when the precision matrices of the constituent classes are believed to chiefly share the same structure while potentially differing in a number of locations of interest. It has many applications in (multi)factorial study designs. We focus on the graphical interpretation of precision matrices with the proposed estimator then serving as a basis for integrative or meta-analytic Gaussian graphical modeling. Situations are considered in which the classes are defined by data sets and subtypes of diseases. The performance of the proposed estimator in the graphical modeling setting is assessed through extensive simulation experiments. Its practical usability is illustrated by the differential network modeling of 12 large-scale gene expression data sets of diffuse large B-cell lymphoma subtypes. The estimator and its related procedures are incorporated into the R-package rags2ridges.
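
The estimator itself lives in the R package rags2ridges; as a loose conceptual sketch only (not the targeted fused ridge estimator), the Python snippet below shrinks each class's sample covariance toward a pooled target before inversion, which is the kind of cross-class stabilization the approach formalizes:

```python
# Simplified per-class precision estimation: shrink each class covariance
# toward a pooled target, then invert. Illustrative only; not rags2ridges.
import numpy as np

def shrunk_precisions(class_data, lam=0.5):
    """class_data: list of (n_g x p) arrays; lam in [0, 1] controls shrinkage."""
    covs = [np.cov(Xg, rowvar=False) for Xg in class_data]
    ns = np.array([len(Xg) for Xg in class_data], dtype=float)
    pooled = sum(n * S for n, S in zip(ns, covs)) / ns.sum()   # shared target
    p = pooled.shape[0]
    return [np.linalg.inv((1 - lam) * S + lam * pooled + 1e-6 * np.eye(p))
            for S in covs]

rng = np.random.default_rng(1)
data = [rng.normal(size=(30, 10)), rng.normal(size=(25, 10))]  # two classes, p = 10
precisions = shrunk_precisions(data, lam=0.7)
```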




class

A New Class of Time Dependent Latent Factor Models with Applications

In many applications, observed data are influenced by some combination of latent causes. For example, suppose sensors are placed inside a building to record responses such as temperature, humidity, power consumption and noise levels. These random, observed responses are typically affected by many unobserved, latent factors (or features) within the building such as the number of individuals, the turning on and off of electrical devices, power surges, etc. These latent factors are usually present for a contiguous period of time before disappearing; further, multiple factors could be present at a time. This paper develops new probabilistic methodology and inference methods for random object generation influenced by latent features exhibiting temporal persistence. Every datum is associated with subsets of a potentially infinite number of hidden, persistent features that account for temporal dynamics in an observation. The ensuing class of dynamic models constructed by adapting the Indian Buffet Process — a probability measure on the space of random, unbounded binary matrices — finds use in a variety of applications arising in operations, signal processing, biomedicine, marketing, image analysis, etc. Illustrations using synthetic and real data are provided.
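
The temporal-persistence extension is the paper's contribution; the static building block it adapts, the Indian Buffet Process, can be simulated in a few lines of Python:

```python
# Standard Indian Buffet Process: customer n takes existing dish k with
# probability m_k / n, then samples Poisson(alpha / n) brand-new dishes.
import numpy as np

def sample_ibp(n_customers, alpha, rng=None):
    """Binary matrix Z: rows = customers (observations), columns = dishes (features)."""
    rng = rng or np.random.default_rng()
    Z = np.zeros((n_customers, 0), dtype=int)
    for n in range(1, n_customers + 1):
        m = Z[:n - 1].sum(axis=0)                     # popularity of each existing dish
        Z[n - 1, :] = rng.random(Z.shape[1]) < m / n  # take dish k with prob m_k / n
        k_new = rng.poisson(alpha / n)                # number of brand-new dishes
        new_cols = np.zeros((n_customers, k_new), dtype=int)
        new_cols[n - 1, :] = 1
        Z = np.hstack([Z, new_cols])
    return Z

Z = sample_ibp(20, alpha=3.0, rng=np.random.default_rng(0))
print(Z.shape)   # about alpha * (1 + 1/2 + ... + 1/20) ~ 11 columns in expectation
```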




class

Noise Accumulation in High Dimensional Classification and Total Signal Index

Great attention has been paid to Big Data in recent years. Such data hold promise for scientific discoveries but also pose challenges to analyses. One potential challenge is noise accumulation. In this paper, we explore noise accumulation in high-dimensional two-group classification. First, we revisit a previous assessment of noise accumulation with principal component analyses, which yields a different threshold for discriminative ability than originally identified. Then we extend our scope to its impact on classifiers developed with three common machine learning approaches---random forest, support vector machine, and boosted classification trees. We simulate four scenarios with differing amounts of signal strength to evaluate each method. After determining that noise accumulation may affect the performance of these classifiers, we assess factors that impact it. We conduct simulations with random forest classifiers by varying sample size, signal strength, signal strength proportional to the number of predictors, and signal magnitude. These simulations suggest that noise accumulation affects the discriminative ability of high-dimensional classifiers developed using common machine learning methods, and that this effect can be modified by sample size, signal strength, and signal magnitude. We developed the measure total signal index (TSI) to track the trends of total signal and noise accumulation.
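
A toy simulation in the same spirit (not the paper's exact design) shows how appending pure-noise features degrades a random forest's cross-validated accuracy:

```python
# Noise accumulation demo: fixed signal features, growing number of noise features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, n_signal = 200, 10

# two groups differing only in the mean of the first n_signal features
signal = np.vstack([rng.normal(0.0, 1.0, size=(n // 2, n_signal)),
                    rng.normal(0.7, 1.0, size=(n // 2, n_signal))])
y = np.repeat([0, 1], n // 2)

for n_noise in [0, 100, 1000, 2000]:
    X = np.hstack([signal, rng.normal(size=(n, n_noise))])
    acc = cross_val_score(RandomForestClassifier(n_estimators=100, random_state=0),
                          X, y, cv=5).mean()
    print(f"noise features = {n_noise:5d}  cv accuracy = {acc:.3f}")
```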




class

A Convex Parametrization of a New Class of Universal Kernel Functions

The accuracy and complexity of kernel learning algorithms are determined by the set of kernels over which they are able to optimize. An ideal set of kernels should: admit a linear parameterization (tractability); be dense in the set of all kernels (accuracy); and have every member be universal so that the hypothesis space is infinite-dimensional (scalability). Currently, there is no class of kernel that meets all three criteria - e.g. Gaussians are not tractable or accurate; polynomials are not scalable. We propose a new class that meets all three criteria - the Tessellated Kernel (TK) class. Specifically, the TK class: admits a linear parameterization using positive matrices; is dense in all kernels; and every element in the class is universal. This implies that the use of TK kernels for learning the kernel can obviate the need for selecting candidate kernels in algorithms such as SimpleMKL and parameters such as the bandwidth. Numerical testing on soft margin Support Vector Machine (SVM) problems shows that algorithms using TK kernels outperform other kernel learning algorithms and neural networks. Furthermore, our results show that when the ratio of the number of training data to features is high, the improvement of TK over MKL increases significantly.
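
The abstract does not spell out the TK parameterization, so the snippet below is only a generic sketch of how any custom kernel (here an RBF stand-in, not a TK) plugs into a soft-margin SVM via a precomputed Gram matrix:

```python
# Custom-kernel SVM via a precomputed Gram matrix; replace the rbf_kernel
# stand-in with any kernel evaluation of interest.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (X[:, 0] * X[:, 1] > 0).astype(int)          # nonlinear toy target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def my_kernel(A, B):
    return rbf_kernel(A, B, gamma=0.5)           # placeholder kernel function

clf = SVC(kernel="precomputed", C=1.0)
clf.fit(my_kernel(X_tr, X_tr), y_tr)             # Gram matrix on training data
print("test accuracy:", clf.score(my_kernel(X_te, X_tr), y_te))
```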




class

pyts: A Python Package for Time Series Classification

pyts is an open-source Python package for time series classification. This versatile toolbox provides implementations of many algorithms published in the literature, preprocessing functionalities, and data set loading utilities. pyts relies on the standard scientific Python packages numpy, scipy, scikit-learn, joblib, and numba, and is distributed under the BSD-3-Clause license. Documentation contains installation instructions, a detailed user guide, a full API description, and concrete self-contained examples.
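
A minimal usage sketch, assuming pyts >= 0.11 where BOSSVS is among the implemented classifiers (class names and defaults may differ across versions); pyts estimators follow the scikit-learn fit/predict API on arrays of shape (n_samples, n_timestamps):

```python
# Toy two-class time series problem classified with pyts' BOSSVS.
import numpy as np
from pyts.classification import BOSSVS

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 100)

# noisy sine waves vs. noisy square-ish waves
X0 = np.sin(t) + 0.3 * rng.normal(size=(50, t.size))
X1 = np.sign(np.sin(t)) + 0.3 * rng.normal(size=(50, t.size))
X = np.vstack([X0, X1])
y = np.repeat([0, 1], 50)

clf = BOSSVS(word_size=4, window_size=20)
clf.fit(X[::2], y[::2])                 # train on every other series
print("accuracy:", clf.score(X[1::2], y[1::2]))
```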




class

(1 + epsilon)-class Classification: an Anomaly Detection Method for Highly Imbalanced or Incomplete Data Sets

Anomaly detection is not an easy problem, since the distribution of anomalous samples is unknown a priori. We explore a novel method that offers a trade-off between one-class and two-class approaches and leads to better performance on anomaly detection problems with small or non-representative anomalous samples. The method is evaluated using several data sets and compared to a set of conventional one-class and two-class approaches.
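
The following generic scikit-learn baselines (not the proposed method) illustrate the one-class/two-class trade-off the paper interpolates between, on a toy set with scarce anomalies:

```python
# One-class vs. two-class anomaly detection on a toy problem with few anomalies.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(1000, 2))
anomalies = rng.normal(4, 1, size=(10, 2))          # scarce anomalous class
test_anom = rng.normal(4, 1, size=(500, 2))

# one-class: trained on normal data only
oc = OneClassSVM(nu=0.05).fit(normal)
oc_recall = (oc.predict(test_anom) == -1).mean()     # -1 = predicted outlier

# two-class: trained on normal data plus the few known anomalies
X = np.vstack([normal, anomalies])
y = np.array([0] * len(normal) + [1] * len(anomalies))
tc = LogisticRegression().fit(X, y)
tc_recall = (tc.predict(test_anom) == 1).mean()

print(f"one-class anomaly recall: {oc_recall:.2f}, two-class: {tc_recall:.2f}")
```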




class

Unsupervised Pre-trained Models from Healthy ADLs Improve Parkinson's Disease Classification of Gait Patterns. (arXiv:2005.02589v2 [cs.LG] UPDATED)

The application of deep learning algorithms to different healthcare problems is gaining interest at a steady pace. However, using such algorithms can prove challenging, as they require large amounts of training data that capture different possible variations. This makes it difficult to use them in a clinical setting, since in most health applications researchers often have to work with limited data. Too little data can cause a deep learning model to overfit. In this paper, we ask how we can use data from a different environment and a different use case, with a widely differing data distribution. We exemplify this use case by using single-sensor accelerometer data from healthy subjects performing activities of daily living - ADLs (source dataset) - to extract features relevant to multi-sensor accelerometer gait data (target dataset) for Parkinson's disease classification. We pre-train a model on the source dataset and use it as a feature extractor. We show that the features extracted for the target dataset can be used to train an effective classification model. Our pre-trained source model consists of a convolutional autoencoder, and the target classification model is a simple multi-layer perceptron. We explore two different pre-trained source models, trained using different activity groups, and analyze the influence the choice of pre-trained model has on the task of Parkinson's disease classification.
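
A compressed PyTorch sketch of this transfer recipe, on synthetic data and with a plain fully connected autoencoder standing in for the paper's convolutional one:

```python
# Pre-train an autoencoder on "source" windows, freeze the encoder, then train
# a small classifier head on encoded "target" windows. Data here is synthetic.
import torch
import torch.nn as nn

torch.manual_seed(0)
win = 64                                              # accelerometer window length

source = torch.randn(2000, win)                       # stand-in for ADL windows
target_X = torch.randn(200, win)                      # stand-in for gait windows
target_y = torch.randint(0, 2, (200,))                # PD vs. control labels

encoder = nn.Sequential(nn.Linear(win, 32), nn.ReLU(), nn.Linear(32, 8))
decoder = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, win))
ae_opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

for _ in range(200):                                  # pre-train the autoencoder
    ae_opt.zero_grad()
    nn.functional.mse_loss(decoder(encoder(source)), source).backward()
    ae_opt.step()

for p in encoder.parameters():                        # freeze: encoder is now a feature extractor
    p.requires_grad_(False)

clf = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))  # simple MLP head
clf_opt = torch.optim.Adam(clf.parameters(), lr=1e-3)

for _ in range(200):                                  # train the classifier on encoded target data
    clf_opt.zero_grad()
    nn.functional.cross_entropy(clf(encoder(target_X)), target_y).backward()
    clf_opt.step()
```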




class

Mnemonics Training: Multi-Class Incremental Learning without Forgetting. (arXiv:2002.10211v3 [cs.CV] UPDATED)

Multi-Class Incremental Learning (MCIL) aims to learn new concepts by incrementally updating a model trained on previous concepts. However, there is an inherent trade-off to effectively learning new concepts without catastrophic forgetting of previous ones. To alleviate this issue, it has been proposed to keep around a few examples of the previous concepts, but the effectiveness of this approach heavily depends on the representativeness of these examples. This paper proposes a novel and automatic framework we call mnemonics, where we parameterize exemplars and make them optimizable in an end-to-end manner. We train the framework through bilevel optimization, i.e., at the model level and the exemplar level. We conduct extensive experiments on three MCIL benchmarks, CIFAR-100, ImageNet-Subset and ImageNet, and show that using mnemonics exemplars can surpass the state-of-the-art by a large margin. Intriguingly, the mnemonics exemplars tend to lie on the boundaries between different classes.




class

On the impact of selected modern deep-learning techniques to the performance and celerity of classification models in an experimental high-energy physics use case. (arXiv:2002.01427v3 [physics.data-an] UPDATED)

Beginning from a basic neural-network architecture, we test the potential benefits offered by a range of advanced techniques for machine learning, in particular deep learning, in the context of a typical classification problem encountered in the domain of high-energy physics, using a well-studied dataset: the 2014 Higgs ML Kaggle dataset. The advantages are evaluated in terms of both performance metrics and the time required to train and apply the resulting models. Techniques examined include domain-specific data-augmentation, learning rate and momentum scheduling, (advanced) ensembling in both model-space and weight-space, and alternative architectures and connection methods.

Following the investigation, we arrive at a model that achieves equal performance to the winning solution of the original Kaggle challenge, whilst being significantly quicker to train and apply and suitable for use with both GPU and CPU hardware setups. These reductions in timing and hardware requirements potentially allow the use of more powerful algorithms in HEP analyses, where models must be retrained frequently, sometimes at short notice, by small groups of researchers with limited hardware resources. Additionally, a new wrapper library for PyTorch called LUMIN is presented, which incorporates all of the techniques studied.
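
As an example of one listed technique, learning-rate and momentum scheduling, here is a generic PyTorch 1cycle setup (a sketch, not the LUMIN wrapper):

```python
# 1cycle learning-rate/momentum scheduling with PyTorch's built-in OneCycleLR.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(30, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

epochs, steps_per_epoch = 10, 100
sched = torch.optim.lr_scheduler.OneCycleLR(
    opt, max_lr=0.1, epochs=epochs, steps_per_epoch=steps_per_epoch,
    cycle_momentum=True)                           # momentum is cycled inversely to lr

X = torch.randn(steps_per_epoch, 32, 30)           # toy batches (e.g., event-level features)
y = torch.randint(0, 2, (steps_per_epoch, 32))

for epoch in range(epochs):
    for xb, yb in zip(X, y):
        opt.zero_grad()
        nn.functional.cross_entropy(model(xb), yb).backward()
        opt.step()
        sched.step()                               # update lr/momentum every batch
```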




class

Margin-Based Generalization Lower Bounds for Boosted Classifiers. (arXiv:1909.12518v4 [cs.LG] UPDATED)

Boosting is one of the most successful ideas in machine learning. The most well-accepted explanations for the low generalization error of boosting algorithms such as AdaBoost stem from margin theory. The study of margins in the context of boosting algorithms was initiated by Schapire, Freund, Bartlett and Lee (1998) and has inspired numerous boosting algorithms and generalization bounds. To date, the strongest known generalization (upper bound) is the $k$th margin bound of Gao and Zhou (2013). Despite the numerous generalization upper bounds that have been proved over the last two decades, nothing is known about the tightness of these bounds. In this paper, we give the first margin-based lower bounds on the generalization error of boosted classifiers. Our lower bounds nearly match the $k$th margin bound and thus almost settle the generalization performance of boosted classifiers in terms of margins.
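
A quick sketch of the normalized margins these bounds concern, computed for scikit-learn's AdaBoost with the discrete SAMME algorithm (on the newest scikit-learn releases SAMME is already the default and the algorithm argument may emit a deprecation warning):

```python
# Training margins of a boosted classifier: y * f(x), where f is the
# weighted vote normalized by the total estimator weight.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
clf = AdaBoostClassifier(n_estimators=200, algorithm="SAMME",
                         random_state=0).fit(X, y)

# decision_function already divides the weighted vote by the total estimator
# weight, so for the binary case the normalized margin is simply y * f(x).
y_pm = 2 * y - 1                                   # labels recoded to {-1, +1}
margins = y_pm * clf.decision_function(X)          # values in [-1, 1]

print("minimum training margin:", margins.min())
print("fraction of examples with margin below 0.1:", (margins < 0.1).mean())
```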




class

Local Cascade Ensemble for Multivariate Data Classification. (arXiv:2005.03645v1 [cs.LG])

We present LCE, a Local Cascade Ensemble for traditional (tabular) multivariate data classification, and its extension LCEM for Multivariate Time Series (MTS) classification. LCE is a new hybrid ensemble method that combines an explicit boosting-bagging approach to handle the usual bias-variance tradeoff faced by machine learning models and an implicit divide-and-conquer approach to individualize classifier errors on different parts of the training data. Our evaluation firstly shows that the hybrid ensemble method LCE outperforms the state-of-the-art classifiers on the UCI datasets and that LCEM outperforms the state-of-the-art MTS classifiers on the UEA datasets. Furthermore, LCEM provides explainability by design and manifests robust performance when faced with challenges arising from continuous data collection (different MTS length, missing data and noise).
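
LCE itself additionally individualizes classifier errors through a divide-and-conquer split of the training data; the minimal scikit-learn sketch below only illustrates the boosting-inside-bagging idea it builds on:

```python
# Bagging (variance reduction) over boosted trees (bias reduction):
# a conceptual hybrid, not the LCE algorithm.
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=600, n_features=20, random_state=0)

hybrid = BaggingClassifier(GradientBoostingClassifier(n_estimators=50),
                           n_estimators=10, random_state=0)
print("cv accuracy:", cross_val_score(hybrid, X, y, cv=5).mean())
```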