validation

Development and validation of scale to measure minimalism - a study analysing psychometric assessment of minimalistic behaviour: Consumer perspective

This research aims to establish a valid and accurate measurement scale and to identify consumer-driven characteristics of minimalism. The study employed a hybrid approach to generate items for minimalism: expert interviews were conducted to identify the items in the first phase, followed by a consumer survey to obtain responses in the second phase. A five-point Likert scale was used to collect the data, which were then subjected to reliability and validity checks. Structural equation modelling was used to test the model. The findings demonstrated five dimensions through which consumers perceive minimalism: decluttering, mindful consumption, aesthetic choices, financial freedom, and sustainable lifestyle. The results also revealed a high correlation between simplicity and well-being. This study is the first to provide a reliable and valid instrument for minimalism, and the results have several theoretical and practical ramifications for society and policymakers. In particular, the instrument can support policymakers in gauging and encouraging minimalistic practices, which enhance environmental performance and lower carbon footprints.
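The abstract does not detail how the reliability check was run; purely as an illustrative sketch (not the authors' code), Cronbach's alpha for one hypothetical subscale of five-point Likert items could be computed as below, where the data and the subscale name are invented.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert ratings."""
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert responses (6 respondents x 4 "decluttering" items).
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```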




validation

A Data Model Validation Approach for Relational Database Design Courses




validation

Development and Validation of an Instrument for Assessing Users’ Views about the Usability of Digital Libraries




validation

Digital Learning Literacies – A Validation Study

This paper presents a validation study of seven Digital Learning Domains (DLDs) and sixty-five performance statements (PSs) as perceived by students with experience in learning via ICT. The preliminary findings suggest statistical robustness of the inventory. The seven DLDs identified are Social Responsibility, Team-based Learning, Information Research and Retrieval, Information Management, Information Validation, Processing and Presentation of Information, and Digital Integrity. The 65 PSs will enable a teacher to identify the learner's level of competency in each DLD, thus identifying students' strengths and weaknesses that must be addressed in order to facilitate learning in the current era. As can be concluded from the findings, most participants evaluate themselves as digitally literate with regard to basic information research and retrieval skills, validation, and information management. However, when it comes to PSs that require complex decision making or higher-order thinking strategies, a large number of participants appear to lack these skills. In addition, the social responsibility and digital integrity domains are perceived by participants as familiar, but are not well translated into proactive action to enforce appropriate digital behavior or to avoid illegally obtained music or movies.




validation

The Generalized Requirement Approach for Requirement Validation with Automatically Generated Program Code




validation

Influential Factors of Collaborative Networks in Manufacturing: Validation of a Conceptual Model

The purpose of the study is to identify influential factors in the use of collaborative networks within the context of manufacturing. The study aims to investigate factors that influence employees’ learning, and to bridge the gap between theory and praxis in collaborative networks in manufacturing. The study further extends the boundary of a collaborative network beyond enterprises to include suppliers, customers, and external stakeholders. It provides a holistic perspective of collaborative networks within the complexity of the manufacturing environment, based on empirical evidence from a questionnaire survey of 246 respondents from diverse manufacturing industries. Drawing upon the socio-technical systems (STS) theory, the study presents the theoretical context and interpretations through the lens of manufacturing. The results show significant influences of organizational support, promotive interactions, positive interdependence, internal-external learning, perceived effectiveness, and perceived usefulness on the use of collaborative networks among manufacturing employees. The study offers a basis of empirical validity for measuring collaborative networks in organizational learning and knowledge/information sharing in manufacturing.




validation

Validation of a Learning Object Review Instrument: Relationship between Ratings of Learning Objects and Actual Learning Outcomes




validation

Development and Validation of a Model to Investigate the Impact of Individual Factors on Instructors’ Intention to Use E-learning Systems




validation

Empirical Validation Procedure for the Knowledge Management Technology Stage Model




validation

Organizational Practices That Foster Knowledge Sharing: Validation across Distinct National Cultures




validation

Development and Validation of a Noise in Decision Inventory for Organizational Settings

Aim/Purpose: The aim of the present paper is to present a Noise Decision (ND) scale. First, it reports the development and validation of the instrument aimed at examining organizational factors that have an influence on decision-making and the level of noise. Second, it validates this rating scale by testing its discriminant and convergent validity with other measures used to assess decision-making qualities.

Background: According to the literature, the concept of noise is the unwanted variability present in judgments. The notion of noise concerns the systematic influence to which individuals are exposed in their environment. The literature in the field has found that noise reduction improves the perception of work performance.

Methodology: The first study involves the development of a scale (composed of 36 items) consisting of semi-structured interviews, item development, and principal component analysis. The second study involves validation and convergent validity of this scale. The first study included 43 employees from three medium-sized Italian multinationals. For the second study, a sample of 867 subjects was analysed.

Contribution: This paper introduces the first scale aimed at assessing noise within individuals and, in the organizational context, within employees and employers.

Findings: Results show that the estimated internal reliability for each of the ND subscales and also the correlations between the subscales were relatively low, suggesting that the ND correctly measures the analysed components. Furthermore, the validation of the psychometric qualities of the ND allowed for the assertion that the influence of noise is present in the decision-making process within work environments, validating the initial hypotheses.

Recommendation for Researchers: This paper aims to improve theory and research on decision-making, for example by providing a possible implementation of scales for evaluating decision-making skills. Furthermore, detecting and limiting noise with a systematic method could improve both the quality of decisions and the quality of thought processes.

Future Research: Given the measurement of the ND, the study can be a starting point for future research on this topic. Since there is no literature about this construct, more research will be needed before the topic becomes clearer. System noise has been tested by some researchers with a "noise audit," which means giving the same problem to different people and measuring the differences in their responses. Repeating this kind of audit in conjunction with the ND in a specific work environment could be helpful both to detect and to measure the influence of noise.
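The methodology mentions item development followed by principal component analysis; the sketch below is only a generic illustration of that step with invented data (not the authors' analysis of the ND items), showing how the leading components are typically inspected before grouping high-loading items into subscales.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical item-response matrix: 43 respondents x 36 candidate ND items (1-5 ratings).
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(43, 36)).astype(float)

# Standardize items, then inspect how much variance the leading components capture;
# in scale development, components with eigenvalues > 1 (or before the scree-plot elbow)
# are typically retained and their high-loading items grouped into subscales.
z = (responses - responses.mean(axis=0)) / responses.std(axis=0, ddof=1)
pca = PCA(n_components=10).fit(z)
print("Explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
print("Loadings of first five items on component 1:", np.round(pca.components_[0][:5], 2))
```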




validation

First meeting of WP 5 EU BON testing and validation of concepts, tools, and services held

The first meeting of Work Package 5 (EU BON testing and validation of concepts, tools, and services) was held from 2 to 4 April 2013 at the Doñana Biological Station in Spain. Twenty-three people from 10 different institutions (7 of them from the EU BON consortium) worked towards building a draft of Principles and Guidelines for establishing and operating EU BON test sites.

During the meeting the members agreed to start documenting each of the sites, using a common format to be decided in May at the Informatics Task Force meeting of WP2 (Data integration and interoperability) in Norway. A data inventory will also be built at each of the sites, with monitoring protocols added in a stepwise fashion.

In addition, the meeting gave participants a better view of the variety of ecosystems in Doñana, as well as of the monitoring protocols being conducted in this area.





validation

First meeting of WP5 EU BON testing and validation of concepts, tools, and services

In 2013, the Doñana Biological Reserve will host the first meeting of WP5 (EU BON testing and validation of concepts, tools, and services), focusing on the organization and planning of the forthcoming WP5 tasks. The meeting will take place in the Palacio de Doñana, Huelva, Spain, from 2 to 4 April 2013.
The aim of the meeting is to bring together the experts in charge of the implementation of the WP5 tasks with the people responsible for the data architecture (WP2), the tool developers (WP3 and WP4), and the partners responsible for policy and dialogue (WP6, 7 and 8). The meeting aims to define precisely the responsibilities of the partners involved and to discuss the organization of the work process.





validation

Establishing macroecological trait datasets: digitalization, extrapolation, and validation of diet preferences in terrestrial mammals worldwide




validation

TMA-AVS-01 Alarm Validation Standard Obtains ANSI Accreditation

Initiated in 2020, the standard provides a method of creating an alarm scoring or classification metric for unauthorized human activity detected by alarm systems.




validation

Peace, equanimity and acceptance in the cancer experience: validation of the German version (PEACE-G) and associations with mental health, health-related quality of life and psychological constructs

Systematic reviews and meta-analyses reveal the importance of an accepting attitude towards cancer for mental health and functional coping. The aim of this study was to examine the psychometric properties of t…





validation

Derivation and validation of an algorithm to predict transitions from community to residential long-term care among persons with dementia—A retrospective cohort study






validation

Online carbohydrate 3D structure validation with the Privateer web app

Owing to the difficulties associated with working with carbohydrates, validating glycan 3D structures prior to deposition into the Protein Data Bank has become a staple of the structure-solution pipeline. The Privateer software provides integrative methods for the validation, analysis, refinement and graphical representation of 3D atomic structures of glycans, both as ligands and as protein modifiers. While Privateer is free software, it requires users to install any of the structural biology software suites that support it or to build it from source code. Here, the Privateer web app is presented, which is always up to date and available to be used online (https://privateer.york.ac.uk) without installation. This self-updating tool, which runs locally on the user's machine, will allow structural biologists to simply and quickly analyse carbohydrate ligands and protein glycosylation from a web browser whilst retaining all confidential information on their devices.




validation

Small-angle scattering and dark-field imaging for validation of a new neutron far-field interferometer

A neutron far-field interferometer is under development at NIST with the aim of enabling a multi-scale measurement combining the best of small-angle neutron scattering (SANS) and neutron imaging and tomography. We use the close relationship between SANS, ultra-SANS, spin-echo SANS and dark-field imaging and measurements of monodisperse spheres as a validation metric, highlighting the strengths and weaknesses of each of these neutron techniques.




validation

Small-angle scattering and dark-field imaging for validation of a new neutron far-field interferometer

The continued advancement of complex materials often requires a deeper understanding of the structure–function relationship across many length scales, which quickly becomes an arduous task when multiple measurements are required to characterize hierarchical and inherently heterogeneous materials. Therefore, there are benefits in the simultaneous characterization of multiple length scales. At the National Institute of Standards and Technology, a new neutron far-field interferometer is under development that aims to enable a multi-scale measurement combining the best of small-angle neutron scattering (SANS) and neutron imaging and tomography. Spatially resolved structural information on the same length scales as SANS (0.001–1 µm) and ultra-small-angle neutron scattering (USANS, 0.1–10 µm) will be collected via dark-field imaging simultaneously with regular attenuation radiography (>10 µm). The dark field is analogous to the polarization loss measured in spin-echo SANS (SESANS) and is related to isotropic SANS through a Hankel transform. Therefore, we use this close relationship and analyze results from SANS, USANS, SESANS and dark-field imaging of monodisperse spheres as a validation metric for the interferometry measurements. The results also highlight the strengths and weaknesses of these neutron techniques for both steady-state and pulsed neutron sources. Finally, we present an example of the value added by the spatial resolution enabled by dark-field imaging in the study of more complex heterogeneous materials. This information would otherwise be lost in other small-angle scattering measurements averaged over the sample.
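For readers unfamiliar with the Hankel-transform link mentioned above, one schematic form of the relation, as it commonly appears in the SESANS and dark-field literature, is shown below; conventions and prefactors vary between references, so this is indicative rather than the paper's exact expressions.

```latex
% Schematic relation (prefactors and conventions vary between references):
% the projected real-space correlation function G(\xi) is a Hankel transform of the
% isotropic SANS cross-section, and the normalized dark-field signal (like the SESANS
% polarization) decays exponentially with the scattering probability \Sigma_s t.
\begin{align}
  G(\xi) &\propto \int_0^{\infty} \frac{\mathrm{d}\Sigma}{\mathrm{d}\Omega}(Q)\, J_0(Q\xi)\, Q \,\mathrm{d}Q,
  \qquad G \text{ normalized so that } G(0) = 1, \\
  \frac{\mathrm{DF}(\xi)}{\mathrm{DF}_0} &= \exp\!\bigl\{\Sigma_s t\,\bigl[G(\xi) - 1\bigr]\bigr\}.
\end{align}
```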




validation

Validation of electron-microscopy maps using solution small-angle X-ray scattering

The determination of the atomic resolution structure of biomacromolecules is essential for understanding details of their function. Traditionally, such a structure determination has been performed with crystallographic or nuclear magnetic resonance (NMR) methods, but during the last decade, cryogenic transmission electron microscopy (cryo-TEM) has become an equally important tool. As the blotting and flash-freezing of the samples can induce conformational changes, external validation tools are required to ensure that the vitrified samples are representative of the solution. Although many validation tools have already been developed, most of them rely on fully resolved atomic models, which prevents early screening of the cryo-TEM maps. Here, a novel and automated method for performing such a validation utilizing small-angle X-ray scattering measurements, publicly available through the new software package AUSAXS, is introduced and implemented. The method has been tested on both simulated and experimental data, where it was shown to work remarkably well as a validation tool. The method provides a dummy atomic model derived from the EM map which best represents the solution structure.




validation

Community recommendations on cryoEM data archiving and validation

In January 2020, a workshop was held at EMBL-EBI (Hinxton, UK) to discuss data requirements for the deposition and validation of cryoEM structures, with a focus on single-particle analysis. The meeting was attended by 47 experts in data processing, model building and refinement, validation, and archiving of such structures. This report describes the workshop's motivation and history, the topics discussed, and the resulting consensus recommendations. Some challenges for future methods-development efforts in this area are also highlighted, as is the implementation to date of some of the recommendations.




validation

HCLTech launches 5G testing, validation lab in Chennai for telecom OEMs

HCLTech said the lab is scalable to test millimeter-wave (mmWave) frequency 5G infrastructure to help OEMs and telecom service providers quickly and accurately measure critical parameters.




validation

Bakery kill-step validation drives consumer confidence

Though the baking industry has a very safe record for the production of shelf-stable food products, Salmonella continues to be a pathogen of concern.




validation

Marion paddle agitator offers fast, cost-effective product validation

Thanks to manufacturing improvements to its standard plated paddle agitator, Marion can speed up lead times on its popular 30-inch diameter and smaller-sized horizontal mixers.




validation

Clinical Validation in Dark Pigmented Individuals Finds CIRCUL™ Pulse Oximetry Ring Provides Reliable Oxygen Saturation Readings Addressing Potential Discrimination in Traditional Pulse Oximeters

CIRCUL™ Pulse Oximetry Wearable Technology Demonstrated Statistically Significant Correlation in Delivering Stable Oxygen Saturation Value in Dark-Pigmented Individuals




validation

BIOIONIX CIP/COP Validation Studies Confirm over 6-Log Pathogen Reduction

Activated Water Performs with Higher Kill Rates in Less Time than Chemicals




validation

Leave-one-out cross validation (LOO) for an astronomy problem

Harrison Siegel pointed us to this project with Maximiliano Isi and Will Farr on gravitational-wave analysis. They compare models using predictive evaluation, in particular leave-one-out cross-validation (LOO), as discussed here and here. Siegel writes: We discuss our implementation of the … Continue reading
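As a generic sketch of LOO itself (unrelated to the gravitational-wave implementation the post discusses, and with invented toy data), exact leave-one-out cross-validation with scikit-learn refits the model once per held-out observation:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

# Toy data: each of the 20 observations is held out once and predicted from the rest.
rng = np.random.default_rng(42)
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=20)

loo = LeaveOneOut()
squared_errors = []
for train_idx, test_idx in loo.split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    squared_errors.append((pred[0] - y[test_idx][0]) ** 2)

print(f"LOO mean squared prediction error: {np.mean(squared_errors):.4f}")
```

In the Bayesian setting the post points to, LOO is usually approximated from posterior draws (e.g., PSIS-LOO) rather than by literally refitting the model for every left-out point.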




validation

Validation of the labor reform in Congress: litmus test for the credibility of collective bargaining

Javier Thibault weighs in on the Spanish parliament’s recent labor reform agreement and its effects on the recovery and the labor market. 

Confilegal





validation

Future confidence: Inaugural LTA Signature Augmentation and Validation Plugtests™ focuses on Long-Term Archive signatures

Sophia Antipolis, 21 February 2024

ETSI’s first LTA Signature Augmentation and Validation Plugtests™ has seen international participants exchange over 35 000 digital signature validation reports.

Held from 23 October to 22 December 2023, the remote interoperability event was organized by the ETSI Centre for Testing and Interoperability (CTI), on behalf of ETSI’s Technical Committee for Electronic Signatures and Trust Infrastructures (TC ESI). This Plugtests™ event was facilitated with the support and co-funding of the European Commission (EC) and the European Free Trade Association (EFTA).

Conducted using a dedicated web portal, sessions over the month-long Plugtests™ attracted the involvement of 190 participants from 121 organizations across 38 countries.





validation

Validation test of a data centre cooling method using renewable energy in a cold region





validation

Free Sitemap Creation, Validation and Submission

Free Sitemap Creation

Considering all the available resources for creating sitemaps, this one is the best around as far as I am concerned. I personally have tinkered around with several sitemap programs and multiple sitemap generators online. Some charge for these services but there are plenty of free resources available online. The option is yours. […]




validation

Validation is a mirage

Spend enough time talking with entrepreneurs, product people, designers, and anyone charged with proving something, and you’ll bump into questions about validation. “How do you validate if it’s going to work?” “How do you know if people will buy it or not?” “How do you validate product market fit?” “How do you validate if a feature is worth… keep reading




validation

Development and validation of a high-throughput whole cell assay to investigate Staphylococcus aureus adhesion to host ligands [Microbiology]

Staphylococcus aureus adhesion to the host's skin and mucosae enables asymptomatic colonization and the establishment of infection. This process is facilitated by cell wall-anchored adhesins that bind to host ligands. Therapeutics targeting this process could provide significant clinical benefits; however, the development of anti-adhesives requires an in-depth knowledge of adhesion-associated factors and an assay amenable to high-throughput applications. Here, we describe the development of a sensitive and robust whole cell assay to enable the large-scale profiling of S. aureus adhesion to host ligands. To validate the assay, and to gain insight into cellular factors contributing to adhesion, we profiled a sequence-defined S. aureus transposon mutant library, identifying mutants with attenuated adhesion to human-derived fibronectin, keratin, and fibrinogen. Our screening approach was validated by the identification of known adhesion-related proteins, such as the housekeeping sortase responsible for covalently linking adhesins to the cell wall. In addition, we also identified genetic loci that could represent undescribed anti-adhesive targets. To compare and contrast the genetic requirements of adhesion to each host ligand, we generated a S. aureus Genetic Adhesion Network, which identified a core gene set involved in adhesion to all three host ligands, and unique genetic signatures. In summary, this assay will enable high-throughput chemical screens to identify anti-adhesives and our findings provide insight into the target space of such an approach.




validation

Prediction and validation of mouse meiosis-essential genes based on spermatogenesis proteome dynamics

Kailun Fang
Nov 30, 2020; 0:RA120.002081v1-mcp.RA120.002081
Research




validation

Generation and validation of a conditional knockout mouse model for the study of the Smith-Lemli-Opitz Syndrome

Babunageswararao Kanuri
Nov 17, 2020; 0:jlr.RA120001101v1-jlr.RA120001101
Research Articles




validation

Generation and validation of a conditional knockout mouse model for the study of the Smith-Lemli-Opitz Syndrome [Research Articles]

Smith-Lemli-Opitz Syndrome (SLOS) is a developmental disorder (OMIM #270400) caused by autosomal recessive mutations in the Dhcr7 gene, which encodes the enzyme 3β-hydroxysterol Δ7-reductase. SLOS patients present clinically with dysmorphology and neurological, behavioral and cognitive defects, with characteristically elevated levels of 7-dehydrocholesterol (7-DHC) in all bodily tissues and fluids. Previous mouse models of SLOS have been hampered by postnatal lethality when Dhcr7 is knocked out globally, while a hypomorphic mouse model showed improvement in the biochemical phenotype with ageing, and did not manifest most other characteristic features of SLOS. We report the generation of a conditional knockout of Dhcr7 (Dhcr7flx/flx), validated by generating a mouse with a liver-specific deletion (Dhcr7L-KO). Phenotypic characterization of liver-specific knockout mice revealed no significant changes in viability, fertility, growth curves, liver architecture, hepatic triglyceride secretion, or parameters of systemic glucose homeostasis. Furthermore, qPCR and RNA-Seq analyses of livers revealed no perturbations in pathways responsible for cholesterol synthesis, either in male or female Dhcr7L-KO mice, suggesting hepatic disruption of post-squalene cholesterol synthesis leads to minimal impact on sterol metabolism in the liver. This validated conditional Dhcr7 knockout model may now allow us to systematically explore the pathophysiology of SLOS, by allowing for temporal, cell and tissue-specific loss of DHCR7.




validation

Correction: Concentration Determination of >200 Proteins in Dried Blood Spots for Biomarker Discovery and Validation [Addition and Correction]




validation

Prediction and validation of mouse meiosis-essential genes based on spermatogenesis proteome dynamics [Research]

The molecular mechanism associated with mammalian meiosis has yet to be fully explored, and one of the main reasons for this lack of exploration is that some meiosis-essential genes are still unknown. The profiling of gene expression during spermatogenesis has been performed in previous studies, yet few studies have aimed to find new functional genes. Since there is a huge gap between the number of genes that are able to be quantified and the number of genes that can be characterized by phenotype screening in one assay, an efficient method to rank quantified genes according to phenotypic relevance is of great importance. We proposed to rank genes by the probability of their function in mammalian meiosis based on global protein abundance using machine learning. Here, nine types of germ cells focusing on continual substages of meiosis prophase I were isolated, and the corresponding proteomes were quantified by high-resolution mass spectrometry. By combining meiotic labels annotated from the MGI mouse knockout database and the spermatogenesis proteomics dataset, a supervised machine learning package, FuncProFinder, was developed to rank meiosis-essential candidates. Of the candidates whose functions were unannotated, four of ten genes with the top prediction scores, Zcwpw1, Tesmin, 1700102P08Rik and Kctd19, were validated as meiosis-essential genes by knockout mouse models. Therefore,  mammalian meiosis-essential genes could be efficiently predicted based on the protein abundance dataset, which provides a paradigm for other functional gene mining from a related abundance dataset.
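FuncProFinder itself is not reproduced here; the following sketch only illustrates the general idea of ranking unannotated genes by the predicted probability from a supervised classifier trained on stage-resolved protein abundances, with every name, label, and number invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical data: protein abundance across 9 germ-cell substages (features)
# for genes with known labels (1 = meiosis-essential, 0 = not).
rng = np.random.default_rng(1)
X_labeled = rng.normal(size=(200, 9))
y_labeled = rng.integers(0, 2, size=200)
X_unlabeled = rng.normal(size=(50, 9))          # candidate genes without annotation
candidate_names = [f"gene_{i}" for i in range(50)]

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_labeled, y_labeled)
scores = clf.predict_proba(X_unlabeled)[:, 1]   # predicted probability of being meiosis-essential

# Rank unannotated candidates by prediction score, highest first.
ranking = sorted(zip(candidate_names, scores), key=lambda t: t[1], reverse=True)
for name, score in ranking[:5]:
    print(f"{name}\t{score:.3f}")
```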




validation

Development and validation of outcome prediction models for aneurysmal subarachnoid haemorrhage: the SAHIT multinational cohort study




validation

Development of 18F-Fluoromisonidazole Hypoxia PET/CT Diagnostic Interpretation Criteria and Validation of Interreader Reliability, Reproducibility, and Performance

Tumor hypoxia, an integral biomarker to guide radiotherapy, can be imaged with 18F-fluoromisonidazole (18F-FMISO) hypoxia PET. One major obstacle to its broader application is the lack of standardized interpretation criteria. We sought to develop and validate practical interpretation criteria and a dedicated training protocol for nuclear medicine physicians to interpret 18F-FMISO hypoxia PET. Methods: We randomly selected 123 patients with human papillomavirus–positive oropharyngeal cancer enrolled in a phase II trial who underwent 123 18F-FDG PET/CT and 134 18F-FMISO PET/CT scans. Four independent nuclear medicine physicians with no 18F-FMISO experience read the scans. Interpretation by a fifth nuclear medicine physician with over 2 decades of 18F-FMISO experience was the reference standard. Performance was evaluated after initial instruction and subsequent dedicated training. Scans were considered positive for hypoxia by visual assessment if 18F-FMISO uptake was greater than floor-of-mouth uptake. Additionally, SUVmax was determined to evaluate whether quantitative assessment using tumor-to-background ratios could be helpful to define hypoxia positivity. Results: Visual assessment produced a mean sensitivity and specificity of 77.3% and 80.9%, with fair interreader agreement (κ = 0.34), after initial instruction. After dedicated training, mean sensitivity and specificity improved to 97.6% and 86.9%, with almost perfect agreement (κ = 0.86). Quantitative assessment with an estimated best SUVmax ratio threshold of more than 1.2 to define hypoxia positivity produced a mean sensitivity and specificity of 56.8% and 95.9%, respectively, with substantial interreader agreement (κ = 0.66), after initial instruction. After dedicated training, mean sensitivity improved to 89.6% whereas mean specificity remained high at 95.3%, with near-perfect interreader agreement (κ = 0.86). Conclusion: Nuclear medicine physicians without 18F-FMISO hypoxia PET reading experience demonstrate much improved interreader agreement with dedicated training using specific interpretation criteria.
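Purely as an illustration of the quantitative rule described in the abstract (a tumor-to-background SUVmax ratio above 1.2 read as hypoxia-positive), and not the study's software, a minimal sketch with hypothetical readings might look like this:

```python
def classify_hypoxia(tumor_suv_max: float, background_suv_max: float,
                     threshold: float = 1.2) -> bool:
    """Return True if the tumor-to-background SUVmax ratio exceeds the threshold."""
    tbr = tumor_suv_max / background_suv_max
    return tbr > threshold

# Hypothetical readings: tumor SUVmax vs. floor-of-mouth (background) SUVmax.
print(classify_hypoxia(2.1, 1.5))  # ratio 1.4  -> hypoxia-positive
print(classify_hypoxia(1.1, 1.2))  # ratio ~0.92 -> negative
```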




validation

G-Protein Signaling in Alzheimer's Disease: Spatial Expression Validation of Semi-supervised Deep Learning-Based Computational Framework

Systemic study of pathogenic pathways and interrelationships underlying genes associated with Alzheimer's disease (AD) facilitates the identification of new targets for effective treatments. Recently available large-scale multiomics datasets provide opportunities to use computational approaches for such studies. Here, we devised a novel disease gene identification (digID) computational framework that consists of a semi-supervised deep learning classifier to predict AD-associated genes and a protein–protein interaction (PPI) network-based analysis to prioritize the importance of these predicted genes in AD. digID predicted 1,529 AD-associated genes and revealed potentially new AD molecular mechanisms and therapeutic targets including GNAI1 and GNB1, two G-protein subunits that regulate cell signaling, and KNG1, an upstream modulator of CDC42 small G-protein signaling and mediator of inflammation and candidate coregulator of amyloid precursor protein (APP). Analysis of mRNA expression validated their dysregulation in AD brains but further revealed the significant spatial patterns in different brain regions as well as among different subregions of the frontal cortex and hippocampi. Super-resolution STochastic Optical Reconstruction Microscopy (STORM) further demonstrated their subcellular colocalization and molecular interactions with APP in a transgenic mouse model of both sexes with AD-like mutations. These studies support the predictions made by digID while highlighting the importance of concurrent biological validation of computationally identified gene clusters as potential new AD therapeutic targets.




validation

Design and Validation of Guide RNAs for CRISPR-Cas9 Genome Editing in Mosquitoes

CRISPR–Cas9 has revolutionized gene editing for traditional and nontraditional model organisms alike. This tool has opened the door to new mechanistic studies of basic mosquito biology as well as the development of novel vector control strategies based on CRISPR–Cas9, including gene drives that spread genetic elements in the population. Although the promise of the specificity, flexibility, and ease of deployment CRISPR is real, its implementation still requires empirical optimization for each new species of interest, as well as to each genomic target within a given species. Here, we provide an overview of designing and testing single-guide RNAs for the use of CRISPR-based gene editing tools.




validation

The Office of the Marijuana Commissioner launched the Social Equity Eligibility Validation Application and DIA Map

The Office of the Marijuana Commissioner (OMC) is pleased to announce the launch of the Social Equity Eligibility Validation Application and Disproportionately Impacted Areas (DIA) Map. OMC is responsible for implementing the Marijuana Control Act in Delaware, which includes a Social Equity License program. Individuals interested in applying for a social equity license must first […]




validation

Development and Validation of a Customized Amplex UltraRed Assay for Sensitive Hydrogen Peroxide Detection in Pharmaceutical Water

For clean-room technologies such as isolators and restricted access barrier systems (RABS), decontamination using hydrogen peroxide (H2O2) is increasingly attractive to fulfill regulatory requirements. Several approaches are currently used, ranging from manual wipe disinfection to vapor phase hydrogen peroxide (VPHP) or automated nebulization sanitization. Although the residual airborne H2O2 concentration can be easily monitored, detection of trace H2O2 residues in filled products is rather challenging. To simulate the filling process in a specific clean room, technical runs with water for injection (WfI) are popular. Thus, the ability to detect traces of H2O2 in water is an important prerequisite to ensure a safe and reliable use of H2O2 for isolator or clean room decontamination. The objective of this study was to provide a validated quantitative, fluorometric Amplex UltraRed assay, which satisfies the analytical target profile of quantifying H2O2 in WfI at low nanomolar to low micromolar concentrations (ppb range) with high accuracy and high precision. The Amplex UltraRed technology provides a solid basis for this purpose; however, no commercial assay kit that fulfills these requirements is available. Therefore, a customized Amplex UltraRed assay was developed, optimized, and validated. This approach resulted in an assay that is capable of quantifying H2O2 in WfI selectively, sensitively, accurately, precisely, and robustly. This assay is used in process development and qualification approaches using WfI in H2O2-decontaminated clean rooms and isolators.




validation

Longitudinal validation of King's Sarcoidosis Questionnaire in a prospective cohort with mild sarcoidosis

Background

Quality of life is impaired in patients with sarcoidosis. The King's Sarcoidosis Questionnaire (KSQ) is a brief questionnaire assessing health-related quality of life in patients with sarcoidosis, comprising subdomains of General Health Status (GHS), Lung, Medication, Skin and Eyes. The aim of this study was to enhance the validation of the KSQ, incorporating longitudinal validation and known-groups validity in a cohort with mild sarcoidosis.

Methods

The KSQ was linguistically validated according to guidelines. Patients with sarcoidosis completed KSQ and other questionnaires at baseline, after 2 weeks and at 12 months. Forced vital capacity (FVC) was measured. Concurrent validity, reliability and responsiveness were assessed.

Results

In patients (n=150), the KSQ had moderate to strong correlations with the Short Form-12 (Mental Component Summary), the King's Brief Interstitial Lung Disease questionnaire and the Fatigue Assessment Scale (r=0.30–0.70) and weak correlations with the Short Form-12 (Physical Component Summary) and FVC (r=0.01–0.29). The KSQ GHS and Lung domains were able to discriminate between groups of patients stratified according to fatigue, treatment and FVC. The KSQ had high internal consistency (Cronbach's α=0.73–0.90) and repeatability (intraclass correlation coefficients 0.72–0.81). Correlations with comparable questionnaires at baseline were moderate or strong for the GHS, Lung and GHS–Lung subdomains and weak or moderate for FVC. The KSQ was responsive to changes over time.

Conclusion

This study strengthened the validation of the KSQ by introducing known-groups validity and assessments of responsiveness over 12 months in patients with mild sarcoidosis.




validation

Validation of an Artificial Intelligence-Based Prediction Model Using 5 External PET/CT Datasets of Diffuse Large B-Cell Lymphoma

The aim of this study was to validate a previously developed deep learning model in 5 independent clinical trials. The predictive performance of this model was compared with the international prognostic index (IPI) and 2 models incorporating radiomic PET/CT features (clinical PET and PET models). Methods: In total, 1,132 diffuse large B-cell lymphoma patients were included: 296 for training and 836 for external validation. The primary outcome was 2-y time to progression. The deep learning model was trained on maximum-intensity projections from PET/CT scans. The clinical PET model included metabolic tumor volume, maximum distance from the bulkiest lesion to another lesion, SUVpeak, age, and performance status. The PET model included metabolic tumor volume, maximum distance from the bulkiest lesion to another lesion, and SUVpeak. Model performance was assessed using the area under the curve (AUC) and Kaplan–Meier curves. Results: The IPI yielded an AUC of 0.60 on all external data. The deep learning model yielded a significantly higher AUC of 0.66 (P < 0.01). For each individual clinical trial, the model was consistently better than IPI. Radiomic model AUCs remained higher for all clinical trials. The deep learning and clinical PET models showed equivalent performance (AUC, 0.69; P > 0.05). The PET model yielded the highest AUC of all models (AUC, 0.71; P < 0.05). Conclusion: The deep learning model predicted outcome in all trials with a higher performance than IPI and better survival curve separation. This model can predict treatment outcome in diffuse large B-cell lymphoma without tumor delineation but at the cost of a lower prognostic performance than with radiomics.




validation

Validation of a Simplified Tissue-to-Reference Ratio Measurement Using SUVR to Assess Synaptic Density Alterations in Alzheimer Disease with [11C]UCB-J PET

Simplified methods of acquisition and quantification would facilitate the use of synaptic density imaging in multicenter and longitudinal studies of Alzheimer disease (AD). We validated a simplified tissue-to-reference ratio method using SUV ratios (SUVRs) for estimating synaptic density with [11C]UCB-J PET. Methods: Participants included 31 older adults with AD and 16 with normal cognition. The distribution volume ratio (DVR) using simplified reference tissue model 2 was compared with SUVR at short scan windows using a whole-cerebellum reference region. Results: Synaptic density was reduced in AD participants using DVR or SUVR. SUVR using later scan windows (60–90 or 70–90 min) was minimally biased, with the strongest correlation with DVR. Effect sizes using SUVR at these late time windows were minimally reduced compared with effect sizes with DVR. Conclusion: A simplified tissue-to-reference method may be useful for multicenter and longitudinal studies seeking to measure synaptic density in AD.
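As a minimal illustration of the tissue-to-reference ratio described above (mean target SUV over a late window divided by mean whole-cerebellum SUV), with hypothetical time-activity values rather than study data:

```python
import numpy as np

def suvr(target_tac: np.ndarray, reference_tac: np.ndarray,
         frame_mask: np.ndarray) -> float:
    """Tissue-to-reference SUV ratio over the selected late frames (e.g., 60-90 min)."""
    return target_tac[frame_mask].mean() / reference_tac[frame_mask].mean()

# Hypothetical time-activity curves (SUV per frame) and a 60-90 min frame selection.
frame_start_min = np.array([0, 10, 20, 30, 40, 50, 60, 70, 80])
target = np.array([1.2, 2.0, 2.4, 2.5, 2.4, 2.3, 2.2, 2.1, 2.0])
cerebellum = np.array([1.0, 1.6, 1.8, 1.8, 1.7, 1.6, 1.5, 1.5, 1.4])
late = (frame_start_min >= 60) & (frame_start_min < 90)
print(f"SUVR(60-90 min): {suvr(target, cerebellum, late):.2f}")
```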




validation

Cyclic Aspiration in Mechanical Thrombectomy: Influencing Factors and Experimental Validation [RESEARCH]

BACKGROUND AND PURPOSE:

Mechanical thrombectomy is a fundamental intervention for acute ischemic stroke treatment. While conventional techniques are effective, cyclic aspiration (CyA) shows potential for better recanalization rates. We aim to investigate factors affecting CyA and compare them with static aspiration (StA).

MATERIALS AND METHODS:

The StA setup consisted of an aspiration pump connected to a pressure transducer. CyA was tested with 5 successive iterations: a single solenoid valve with air plus saline (i1) or saline alone (i2) as the aspiration medium; 2 solenoid valves with air plus saline (i3) as the aspiration medium; complete air removal and saline feeding (i4); and pressurized saline feeding (i5). To assess the efficacy of clot ingestion, the pressure transducer was replaced with a distal aspiration catheter. Moderately stiff clot analogs (15 mm) were used to investigate ingestion, quantified as relative clot weight loss. Additionally, the aspiration flow rate was assessed for each setup.

RESULTS:

With CyA i1, the amplitude of the achieved negative pressure waves declined with increasing frequencies but progressively increased with each subsequent iteration, achieving a maximum amplitude of 81 kPa for i5 at 1 Hz. Relative clot weight loss was significantly higher with i5 at 5 Hz than with StA (100% versus 37.8%; P = .05). Aspiration flow rate was lower with CyA than with StA (i5 at 5 Hz: 199.8 mL/min versus StA: 311 mL/min; P < .01).

CONCLUSIONS:

CyA with the appropriate setup may represent an encouraging innovation in mechanical thrombectomy, offering a promising pathway for improving efficacy in clot ingestion and recanalization. The observed benefits warrant confirmation in a clinical setting.