data

Notice of Special Interest (NOSI): Analysis of Existing Linked Datasets to Understand the Relationship between Housing Program Participation and Risk for Chronic Diseases and Other Conditions (R01-Clinical Trial Not Allowed) [First Available Due Date: Oct 07]

The post Notice of Special Interest (NOSI): Analysis of Existing Linked Datasets to Understand the Relationship between Housing Program Participation and Risk for Chronic Diseases and Other Conditions (R01-Clinical Trial Not Allowed) [First Available Due Date: Oct 07] was curated by information for practice.




data

Population-Level Administrative Data: A Resource to Advance Psychological Science

Current Directions in Psychological Science, Ahead of Print. Population-level administrative data—data on individuals’ interactions with administrative systems, such as health-care, social-welfare, criminal-justice, and education systems—are a fruitful resource for research into behavior, development, and well-being. However, administrative data are underutilized in psychological science. Here, we review advantages of population-level administrative data for psychological research and […]

The post Population-Level Administrative Data: A Resource to Advance Psychological Science was curated by information for practice.




data

The risk of breaking electronic devices rises 24% over Easter, new data reveals

New data has revealed that there’s a 24% rise in Brits dealing with broken laptops, tablets and phones over the Easter break each year, making it the riskiest holiday for devices.




data

Datalogic launches new Memor 30/35 Family PDA

Datalogic, a provider of automatic data capture solutions, has launched its Memor 30 and 35 flagship mobile computers. The Memor 30/35 is designed to expedite operations with superior scanning performance in all environments.




data

Out of the data jungle: More interoperability for resilient supply chains

By Philipp Pfister, Chief Customer Experience Officer at Transporeon (a Trimble company)

There’s no denying that the transport sector is a prominent cog across most global industries. As the saying goes, “There is no production without procurement logistics, and no trade without delivery traffic.”




data

43% of data breaches target small businesses in 5 industries

Some industries are more vulnerable to cyber threats and therefore face a higher risk of experiencing data breaches or other types of cyber attacks.




data

Harnessing the power of connected data

Special Technology Report on Mobile Computing /Automatic Identification and Data Capture (AIDC).

RetailTechnologyReview.com spoke with leading spokespeople within the vendor and analyst community about current trends and developments within the automatic identification & data capture (AIDC)/mobile computing solutions space, including those related to modern supply chain challenges and omnichannel.




data

Data Fabric in Retail: The Go-To Solution to Boost Customer Experience and Personalization

By Jack Pollard, freelance writer.

Retail is one of the most competitive landscapes out there today. Consumption is at an all-time high, but the cost-of-living crisis means that brands need to fight harder than ever to convince consumers that their products are what they need.




data

Young-Earth Creationist Helium Diffusion "Dates": Fallacies Based on Bad Assumptions and Questionable Data

Updated July 25, 2006: Young-Earth creationists consider the helium diffusion studies of D. Russell Humphreys and others to be one of their greatest achievements in arguing for a 6,000 year old Earth. A geologist shows that these studies are extensively flawed, including serious miscalculations in their data, sampling of the wrong rock type, failure to eliminate possible contamination, use of equations based on invalid assumptions and reliance on questionable data. Appendices C and D have been added in response to Dr. Humphreys' most recent statements in his January 2006 "Trueorigins" essay.




data

British retail B2B companies take 42 days to collect and enrich data needed for new product launches, hampering UK productivity, according to Akeneo’s research

British Business-to-Business (B2B) companies take on average 32 days to collect, collate and enrich all the necessary data for a new product launch, significantly impacting UK productivity and go-to-market times, according to research conducted by Akeneo, the Product Experience (PX) Company and leading provider of Product Information Management (PIM) solutions.




data

Precision Retail launches rewards-based survey plugin to capture consent-based consumer data

The Golden Quarter of retail is approaching: the perfect time for retailers to gather mounds of consumer data. But how to do so compliantly? Two Toronto-based marketing specialists have an answer: Precision Retail, a new venture offering what they believe to be the world's first and only rewards-based post-purchase survey plugin to capture zero-party data (0PD).




data

Datalogic Memor 30/35 achieves rugged Certification under Android Enterprise Recommended (AER) Program

Datalogic, the global provider of automatic data capture and industrial automation solutions, has announced that the Memor 30/35 has been officially certified as a rugged device under the prestigious Android Enterprise Recommended (AER) program, meeting and surpassing key standards set by Android.




data

Overcome These 5 Customer Data Challenges in Retail with Cloud Solutions

By Franklin Carpenter, freelance writer.

The importance of customer data in retail continues to grow, pushing businesses to seek efficient management strategies. Cloud computing has proven to be a powerful resource for tackling customer data challenges, enabling retailers to streamline their processes.




data

TrusTrace spotlights ‘Data-Driven Decarbonisation’ at COP29: Navigating Fashion’s Path to Net Zero

TrusTrace, a global SaaS company with a platform for product traceability and supply chain compliance in fashion and retail, will host a key session at COP29 entitled ‘Data-Driven Decarbonisation: Navigating Fashion’s Path to Net Zero’ on November 16th from 13.00-13.40 at the Swedish Pavilion, C17, COP29 Blue Zone.




data

Data: 1 Out of 10 Couples Who Got Married in 2023 are Multicultural Couples

[Economy]:
One out of ten couples who tied the knot last year were from different cultures, as the number of multicultural marriages returned to pre-pandemic levels. According to Statistics Korea on Thursday, there were 20,431 multicultural couples in 2023, up 17.2 percent from a year ...




data

Automated selection of nanoparticle models for small-angle X-ray scattering data analysis using machine learning

Small-angle X-ray scattering (SAXS) is widely used to analyze the shape and size of nanoparticles in solution. A multitude of models, describing the SAXS intensity resulting from nanoparticles of various shapes, have been developed by the scientific community and are used for data analysis. Choosing the optimal model is a crucial step in data analysis, which can be difficult and time-consuming, especially for non-expert users. An algorithm is proposed, based on machine learning, representation learning and SAXS-specific preprocessing methods, which instantly selects the nanoparticle model best suited to describe SAXS data. The different algorithms compared are trained and evaluated on a simulated database. This database includes 75 000 scattering spectra from nine nanoparticle models, and realistically simulates two distinct device configurations. It will be made freely available to serve as a basis of comparison for future work. Deploying a universal solution for automatic nanoparticle model selection is a challenge made more difficult by the diversity of SAXS instruments and their flexible settings. The poor transferability of classification rules learned on one device configuration to another is highlighted. It is shown that training on several device configurations enables the algorithm to be generalized, without degrading performance compared with configuration-specific training. Finally, the classification algorithm is evaluated on a real data set obtained by performing SAXS experiments on nanoparticles for each of the instrumental configurations, which have been characterized by transmission electron microscopy. This data set, although very limited, allows estimation of the transferability of the classification rules learned on simulated data to real data.
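
For readers who want to experiment with the underlying idea, here is a minimal sketch of classifying simulated 1D scattering curves by nanoparticle model. It uses toy sphere and Gaussian-chain form factors and a scikit-learn random forest, not the authors' representation-learning pipeline or their simulated database.

```python
# Minimal sketch (not the paper's algorithm): classify which analytical
# nanoparticle model best describes a 1D SAXS curve, training on simulated
# spectra.  The two toy form factors and all parameter ranges are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

q = np.logspace(-2, 0, 200)          # scattering-vector grid (1/Angstrom), assumed

def sphere(q, R):
    x = q * R
    return (3 * (np.sin(x) - x * np.cos(x)) / x**3) ** 2

def gaussian_chain(q, Rg):           # Debye function
    x = (q * Rg) ** 2
    return 2 * (np.exp(-x) + x - 1) / x**2

rng = np.random.default_rng(0)
curves, labels = [], []
for _ in range(2000):
    if rng.random() < 0.5:
        intensity, label = sphere(q, rng.uniform(20, 80)), 0
    else:
        intensity, label = gaussian_chain(q, rng.uniform(20, 80)), 1
    noisy = intensity * rng.normal(1.0, 0.05, size=q.size)     # simple multiplicative noise
    curves.append(np.log10(np.clip(noisy, 1e-12, None)))       # log-intensity preprocessing
    labels.append(label)

X_tr, X_te, y_tr, y_te = train_test_split(np.array(curves), np.array(labels), random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```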




data

ClusterFinder: a fast tool to find cluster structures from pair distribution function data

A novel automated high-throughput screening approach, ClusterFinder, is reported for finding candidate structures for atomic pair distribution function (PDF) structural refinements. Finding starting models for PDF refinements is notoriously difficult when the PDF originates from nanoclusters or small nanoparticles. The reported ClusterFinder algorithm can screen 10^4 to 10^5 candidate structures from structural databases such as the Inorganic Crystal Structure Database (ICSD) in minutes, using the crystal structures as templates in which it looks for atomic clusters that result in a PDF similar to the target measured PDF. The algorithm returns a rank-ordered list of clusters for further assessment by the user. The algorithm has performed well for simulated and measured PDFs of metal–oxido clusters such as Keggin clusters. This is therefore a powerful approach to finding structural cluster candidates in a modelling campaign for PDFs of nanoparticles and nanoclusters.
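
A toy illustration of the screening idea (not the ClusterFinder implementation) is to broaden each candidate cluster's interatomic distances into a crude PDF and rank candidates by similarity to the target curve:

```python
# Toy PDF-based screening sketch: rank candidate atomic clusters by how well a
# Gaussian-broadened pair distribution function matches a target PDF.
# The cluster geometries and all numerical settings are made up for illustration.
import numpy as np

r = np.linspace(0.5, 10.0, 500)                  # r grid in Angstrom (assumed)

def simple_pdf(xyz, sigma=0.1):
    """Sum of Gaussians centred at interatomic distances (unnormalized)."""
    d = np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)
    d = d[np.triu_indices(len(xyz), k=1)]        # unique pair distances
    g = np.exp(-((r[:, None] - d[None, :]) ** 2) / (2 * sigma**2)).sum(axis=1)
    return g / g.max()

def score(candidate_pdf, target_pdf):
    """Pearson correlation as a crude similarity measure."""
    return np.corrcoef(candidate_pdf, target_pdf)[0, 1]

a = 2.8                                          # nearest-neighbour distance (assumed)
tet = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) * a / (2 * np.sqrt(2))
octa = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                 [0, -1, 0], [0, 0, 1], [0, 0, -1]]) * a / np.sqrt(2)
chain = np.array([[i * a, 0.0, 0.0] for i in range(4)])
candidates = {"tetrahedron": tet, "octahedron": octa, "chain": chain}

target = simple_pdf(octa)   # pretend the measured PDF came from an octahedral cluster
ranking = sorted(candidates, key=lambda k: score(simple_pdf(candidates[k]), target), reverse=True)
print("best-matching candidates:", ranking)
```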




data

Sheet-on-sheet fixed target data collection devices for serial crystallography at synchrotron and XFEL sources

Fixed targets (`chips') offer efficient, high-throughput microcrystal delivery for serial crystallography at synchrotrons and X-ray free-electron lasers (XFELs). Within this family, sheet-on-sheet (SOS) chips offer noteworthy advantages in cost, adaptability, universality and ease of crystal loading. We describe our latest generation of SOS devices, which are now in active use at both synchrotrons and XFELs.




data

Exploiting Friedel pairs to interpret scanning 3DXRD data from complex geological materials

A new processing technique for synchrotron scanning 3D X-ray diffraction data is introduced, utilizing symmetric Bragg reflections hkl and -h-k-l, known as Friedel pairs. This technique is designed to tackle the difficulties associated with large, highly deformed, polyphase materials, especially geological samples.




data

Towards expansion of the MATTS data bank with heavier elements: the influence of the wavefunction basis set on the multipole model derived from the wavefunction

This study examines the quality of charge density obtained by fitting the multipole model to wavefunctions computed in different basis sets. The analysis reveals that changing the basis set quality from double- to triple-zeta can notably improve the charge-density-related properties of a multipole model.




data

Optimal operation guidelines for direct recovery of high-purity precursor from spent lithium-ion batteries: hybrid operation model of population balance equation and data-driven classifier

This study proposes an operation optimization framework for impurity-free recycling of spent lithium-ion batteries. Using a hybrid population balance equation integrated with a data-driven condition classifier, the study first identifies the optimal batch and semi-batch operating conditions that significantly reduce operation time while maintaining 100% product purity; detailed guidelines are given for industrial applications.
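
The selection logic can be pictured roughly as follows: a data-driven classifier screens candidate operating conditions for purity feasibility, and a process model then picks the fastest feasible condition. The stand-in functions and numbers below are hypothetical and are not the authors' hybrid model.

```python
# Minimal sketch of the selection logic only: a stand-in "classifier" flags
# operating conditions expected to give impurity-free product, a stand-in
# process model estimates operation time, and the fastest feasible condition
# is selected.  Both stand-in functions and all numbers are hypothetical.
from itertools import product

def predicts_pure(mode, feed_rate, ph):
    """Placeholder for the data-driven purity classifier."""
    return ph >= 10.5 and (mode == "semi-batch" or feed_rate <= 0.5)

def estimated_time_h(mode, feed_rate, ph):
    """Placeholder for the population-balance-based time estimate."""
    base = 8.0 if mode == "batch" else 6.0
    return base / feed_rate + 0.2 * (12.0 - ph)

conditions = product(["batch", "semi-batch"], [0.25, 0.5, 1.0], [10.0, 10.5, 11.0])
feasible = [c for c in conditions if predicts_pure(*c)]
best = min(feasible, key=lambda c: estimated_time_h(*c))
print("selected (mode, feed rate, pH):", best,
      "estimated time [h]:", round(estimated_time_h(*best), 1))
```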




data

TOMOMAN: a software package for large-scale cryo-electron tomography data preprocessing, community data sharing and collaborative computing

Here we describe TOMOMAN (TOMOgram MANager), an extensible open-source software package for handling cryo-electron tomography data preprocessing. TOMOMAN streamlines interoperability between a wide range of external packages and provides tools for project sharing and archival.




data

Exploiting Friedel pairs to interpret scanning 3DXRD data from complex geological materials

The present study introduces a processing strategy for synchrotron scanning 3D X-ray diffraction (s3DXRD) data, aimed at addressing the challenges posed by large, highly deformed, polyphase materials such as crystalline rocks. Leveraging symmetric Bragg reflections known as Friedel pairs, our method enables diffraction events to be precisely located within the sample volume. This method allows for fitting the phase, crystal structure and unit-cell parameters at the intra-grain scale on a voxel grid. The processing workflow incorporates several new modules, designed to (i) efficiently match Friedel pairs in large s3DXRD datasets containing up to 10^8 diffraction peaks; (ii) assign phases to each pixel or voxel, resolving potential ambiguities arising from overlap in scattering angles between different crystallographic phases; and (iii) fit the crystal orientation and unit cell locally on a point-by-point basis. We demonstrate the effectiveness of our technique on fractured granite samples, highlighting the ability of the method to characterize complex geological materials and show their internal structure and mineral composition. Additionally, we include the characterization of a metal gasket made of a commercial aluminium alloy, which surrounded the granite sample during experiments. The results show the effectiveness of the technique in recovering information about the internal texture and residual strain of materials that have undergone high levels of plastic deformation.
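
A simplified sketch of the pairing step is shown below. It assumes that the Friedel mate of a peak at (2θ, η, ω) appears at the same 2θ with η negated and ω shifted by 180°, which is one common convention; the real workflow additionally handles detector geometry, wedge offsets and peak tables orders of magnitude larger.

```python
# Simplified Friedel-pair matching sketch (not the paper's code).  It assumes
# the mate of a peak at (two-theta, eta, omega) sits at the same two-theta with
# eta -> -eta and omega -> omega + 180 deg; real s3DXRD conventions may differ.
import numpy as np

def match_friedel_pairs(peaks, tol_tth=0.05, tol_eta=0.5, tol_omega=0.5):
    """peaks: (N, 3) array of (two_theta, eta, omega) in degrees.  Returns index pairs."""
    tth, eta, omega = peaks.T
    # Predicted position of each peak's Friedel mate under the assumed convention.
    mate = np.column_stack([tth, -eta, (omega + 180.0) % 360.0])
    pairs = []
    used = np.zeros(len(peaks), dtype=bool)
    for i, m in enumerate(mate):
        if used[i]:
            continue
        d_tth = np.abs(tth - m[0])
        d_eta = np.abs((eta - m[1] + 180.0) % 360.0 - 180.0)      # circular difference
        d_omega = np.abs((omega - m[2] + 180.0) % 360.0 - 180.0)
        ok = (d_tth < tol_tth) & (d_eta < tol_eta) & (d_omega < tol_omega) & ~used
        ok[i] = False
        j = np.flatnonzero(ok)
        if j.size:
            pairs.append((i, int(j[0])))
            used[i] = used[j[0]] = True
    return pairs

# Tiny synthetic peak table: two Friedel pairs plus one unpaired peak.
peaks = np.array([
    [10.2,  35.0,  12.0],
    [10.2, -35.0, 192.0],   # mate of the first peak
    [14.7,  80.0,  45.0],
    [14.7, -80.0, 225.0],   # mate of the third peak
    [12.1,  10.0,  30.0],   # unpaired
])
print(match_friedel_pairs(peaks))   # expected: [(0, 1), (2, 3)]
```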




data

Sheet-on-sheet fixed target data collection devices for serial crystallography at synchrotron and XFEL sources

Serial crystallography (SX) efficiently distributes over many crystals the radiation dose absorbed during diffraction data acquisition, enabling structure determination of samples at ambient temperature. SX relies on the rapid and reliable replacement of X-ray-exposed crystals with fresh crystals at a rate commensurate with the data acquisition rate. `Solid supports', also known as `fixed targets' or `chips', offer one approach. These are microscopically thin solid panes into or onto which crystals are deposited to be individually interrogated by an X-ray beam. Solid supports are generally patterned using photolithography methods to produce a regular array of features that trap single crystals. A simpler and less expensive alternative is to merely sandwich the microcrystals between two unpatterned X-ray-transparent polymer sheets. Known as sheet-on-sheet (SOS) chips, these offer significantly more versatility. SOS chips place no constraint on the size or size distribution of the microcrystals or their growth conditions. Crystals ranging from true nanocrystals up to microcrystals can be investigated, as can crystals grown in media ranging from low viscosity (aqueous solution) up to high viscosity (such as lipidic cubic phase). Here, we describe our two SOS devices. The first is a compact and lightweight version designed specifically for synchrotron use. It incorporates a standard SPINE-type magnetic base for mounting on a conventional macromolecular crystallography goniometer. The second and larger chip is intended for both X-ray free-electron laser and synchrotron use and is fully compatible with the fast-scanning XY-raster stages developed for data collection with patterned chips.




data

Finback: a web-based data collection system at SSRF biological macromolecular crystallography beamlines

An integrated computer software system for macromolecular crystallography (MX) data collection at the BL02U1 and BL10U2 beamlines of the Shanghai Synchrotron Radiation Facility is described. The system, Finback, implements a set of features designed for the automated MX beamlines, and provides a user-friendly web-based graphical user interface (GUI) for interactive data collection. The Finback client GUI can run on modern browsers and has been developed using several modern web technologies including WebSocket, WebGL, WebWorker and WebAssembly. Finback supports multiple concurrent sessions, so on-site and remote users can access the beamline simultaneously. Finback also cooperates with the deployed experimental data and information management system; the relevant experimental parameters and results are automatically deposited to a database.
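
As an illustration of the kind of lightweight client such a WebSocket-based service enables, the sketch below subscribes to a status stream and prints updates; the endpoint and message fields are hypothetical, not Finback's actual protocol.

```python
# Illustrative WebSocket client for a beamline status stream.  The endpoint and
# JSON message fields are hypothetical and do not describe Finback's real API.
import asyncio
import json
import websockets   # third-party package: pip install websockets

HYPOTHETICAL_URI = "wss://beamline.example.org/finback/status"

async def watch_collection_status(uri, max_messages=10):
    async with websockets.connect(uri) as ws:
        for _ in range(max_messages):
            message = json.loads(await ws.recv())
            # Field names below are assumed for illustration only.
            print(message.get("sample_id"), message.get("frames_collected"))

if __name__ == "__main__":
    asyncio.run(watch_collection_status(HYPOTHETICAL_URI))
```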




data

DOMAS: a data management software framework for advanced light sources

In recent years, China's advanced light sources have entered a period of rapid construction and development. As modern X-ray detectors and data acquisition technologies advance, these facilities are expected to generate massive volumes of data annually, presenting significant challenges in data management and utilization. These challenges encompass data storage, metadata handling, data transfer and user data access. In response, the Data Organization Management Access Software (DOMAS) has been designed as a framework to address these issues. DOMAS encapsulates four fundamental modules of data management software, including metadata catalogue, metadata acquisition, data transfer and data service. For light source facilities, building a data management system only requires parameter configuration and minimal code development within DOMAS. This paper firstly discusses the development of advanced light sources in China and the associated demands and challenges in data management, prompting a reconsideration of data management software framework design. It then outlines the architecture of the framework, detailing its components and functions. Lastly, it highlights the application progress and effectiveness of DOMAS when deployed for the High Energy Photon Source (HEPS) and Beijing Synchrotron Radiation Facility (BSRF).
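
To illustrate the configuration-driven idea, the snippet below shows what a facility-side configuration covering the four modules might look like; the keys and values are hypothetical and do not reflect DOMAS's actual schema.

```python
# Purely illustrative configuration for a four-module data management stack
# (metadata catalogue, metadata acquisition, data transfer, data service).
# Keys and values are hypothetical and are not DOMAS's real configuration schema.
facility_config = {
    "metadata_catalogue": {
        "backend": "postgresql",
        "dsn": "postgresql://catalog.example.org/facility",
    },
    "metadata_acquisition": {
        "sources": ["beamline_control", "detector_headers"],
        "poll_interval_s": 5,
    },
    "data_transfer": {
        "staging_area": "/staging/beamline01",
        "archive": "tape://archive.example.org",
    },
    "data_service": {
        "api": "restful",
        "auth": "facility_sso",
    },
}
# A facility would then hand a structure like this to the framework's bootstrap
# entry point (hypothetical here), adding only minimal site-specific code.
```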




data

A distributed data processing scheme based on Hadoop for synchrotron radiation experiments

With the development of synchrotron radiation sources and high-frame-rate detectors, the amount of experimental data collected at synchrotron radiation beamlines has increased exponentially. As a result, data processing for synchrotron radiation experiments has entered the era of big data. It is becoming increasingly important for beamlines to have the capability to process large-scale data in parallel to keep up with the rapid growth of data. Currently, there is no set of data processing solutions based on the big data technology framework for beamlines. Apache Hadoop is a widely used distributed system architecture for solving the problem of massive data storage and computation. This paper presents a distributed data processing scheme for beamline experimental data based on Hadoop. The Hadoop Distributed File System is utilized as the distributed file storage system, and Hadoop YARN serves as the resource scheduler for the distributed computing cluster. A distributed data processing pipeline that can carry out massively parallel computation is designed and developed using Hadoop Spark. The entire data processing platform adopts a distributed microservice architecture, which makes the system easy to expand, reduces module coupling and improves reliability.
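
A minimal PySpark sketch of the kind of pipeline described (read raw frames from HDFS, reduce each frame in parallel, collect summaries) is given below; the HDFS path and the toy per-frame reduction are placeholders, not the paper's implementation.

```python
# Minimal PySpark sketch of a parallel per-frame reduction over files in HDFS.
# The HDFS path and the toy "reduction" (byte count) are placeholders for a real
# beamline processing step; this is not the system described in the paper.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("beamline-reduction-sketch").getOrCreate()
sc = spark.sparkContext

def reduce_frame(name_and_bytes):
    name, raw = name_and_bytes
    # Placeholder reduction: in practice this would decode the detector frame
    # and run azimuthal integration, peak finding, etc.
    return name, len(raw)

frames = sc.binaryFiles("hdfs:///experiments/run_001/frames/*")   # hypothetical path
summaries = frames.map(reduce_frame).collect()
for name, nbytes in summaries[:5]:
    print(name, nbytes)

spark.stop()
```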




data

StreamSAXS: a Python-based workflow platform for processing streaming SAXS/WAXS data

StreamSAXS is a Python-based small- and wide-angle X-ray scattering (SAXS/WAXS) data analysis workflow platform with graphical user interface (GUI). It aims to provide an interactive and user-friendly tool for analysis of both batch data files and real-time data streams. Users can easily create customizable workflows through the GUI to meet their specific needs. One characteristic of StreamSAXS is its plug-in framework, which enables developers to extend the built-in workflow tasks. Another feature is the support for both already acquired and real-time data sources, allowing StreamSAXS to function as an offline analysis platform or be integrated into large-scale acquisition systems for end-to-end data management. This paper presents the core design of StreamSAXS and provides user cases demonstrating its utilization for SAXS/WAXS data analysis in offline and online scenarios.
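
The snippet below sketches the general shape of a pluggable workflow task that could be applied identically to stored frames or to frames arriving from a stream; the class and method names are hypothetical and do not reproduce StreamSAXS's real plug-in API.

```python
# Illustrative pluggable workflow task; class and method names are hypothetical
# and do not reproduce StreamSAXS's actual plug-in interface.
import numpy as np

class RadialAverageTask:
    """Toy task: azimuthally average a 2D frame into a 1D profile."""

    def __init__(self, center, n_bins=200):
        self.center = center
        self.n_bins = n_bins

    def process(self, frame):
        y, x = np.indices(frame.shape)
        r = np.hypot(x - self.center[0], y - self.center[1])
        bins = np.linspace(0, r.max(), self.n_bins + 1)
        idx = np.digitize(r.ravel(), bins)
        sums = np.bincount(idx, weights=frame.ravel(), minlength=self.n_bins + 2)
        counts = np.bincount(idx, minlength=self.n_bins + 2)
        return sums[1:self.n_bins + 1] / np.maximum(counts[1:self.n_bins + 1], 1)

# The same task object could be fed frames read from files in a batch loop or
# frames arriving from a live acquisition stream.
task = RadialAverageTask(center=(128, 128))
profile = task.process(np.random.poisson(5.0, size=(256, 256)).astype(float))
print(profile.shape)   # (200,)
```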




data

MuscleX: data analysis software for fiber diffraction patterns from muscle

MuscleX is an integrated, open-source computer software suite for data reduction of X-ray fiber diffraction patterns from striated muscle and other fibrous systems. It is written in Python and runs on Linux, Microsoft Windows or macOS. Most modules can be run either from a graphical user interface or in a `headless mode' from the command line, suitable for incorporation into beamline control systems. Here, we provide an overview of the general structure of the MuscleX software package and describe the specific features of the individual modules as well as examples of applications.
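
For incorporation into beamline control systems, the headless modules would typically be driven from a wrapper script along these lines; the command shown is a placeholder, not MuscleX's documented invocation.

```python
# Generic pattern for driving a headless analysis module from a control script.
# The command line is a placeholder, not MuscleX's actual CLI; consult the
# MuscleX documentation for the real module names and options.
import subprocess
from pathlib import Path

def run_headless(cmd, workdir):
    """Run one headless reduction job and report failures."""
    result = subprocess.run(cmd, cwd=workdir, capture_output=True, text=True)
    if result.returncode != 0:
        print("module failed:", result.stderr[-500:])
    return result.returncode

# Hypothetical invocation; replace with the documented headless command.
run_headless(["musclex-headless-module", "--input", "pattern_0001.tif"],
             Path("/data/run42"))
```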




data

RefXAS: an open access database of X-ray absorption spectra

Under DAPHNE4NFDI, the X-ray absorption spectroscopy (XAS) reference database, RefXAS, has been set up. For this purpose, we developed a method to enable users to submit a raw dataset, with its associated metadata, via a dedicated website for inclusion in the database. Implementation of the database includes an upload of metadata to the scientific catalogue and an upload of files via object storage, with automated query capabilities through a web server and visualization of the data and files. Based on the mode of measurements, quality criteria have been formulated for the automated check of any uploaded data. In the present work, the significant metadata fields for reusability, as well as reproducibility of results (FAIR data principles), are discussed. Quality criteria for the data uploaded to the database have been formulated and assessed. Moreover, the usability and interoperability of available XAS data/file formats have been explored. The first version of the RefXAS database prototype is presented, which features a human verification procedure, currently being tested with a new user interface designed specifically for curators; a user-friendly landing page; a full list of datasets; advanced search capabilities; a streamlined upload process; and, finally, server-side automatic authentication, with metadata stored via MongoDB and PostgreSQL and data files handled via the relevant APIs.
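
As a rough illustration of what a programmatic submission to such a service could look like, the sketch below posts a raw data file plus JSON metadata over HTTP; the endpoint URL and metadata fields are hypothetical, not RefXAS's actual API.

```python
# Illustrative submission of a raw XAS dataset plus metadata over HTTP.  The
# endpoint URL, file name and metadata fields are hypothetical, not RefXAS's API.
import json
import requests

HYPOTHETICAL_ENDPOINT = "https://refxas.example.org/api/upload"

metadata = {
    "element": "Fe",
    "edge": "K",
    "sample": "Fe2O3 reference",
    "mode": "transmission",          # the measurement mode drives the quality checks
    "beamline": "example-beamline",
}

with open("fe2o3_k_edge.dat", "rb") as fh:
    response = requests.post(
        HYPOTHETICAL_ENDPOINT,
        files={"raw_data": fh},
        data={"metadata": json.dumps(metadata)},
        timeout=30,
    )
print(response.status_code)
```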




data

A distributed software system for integrating data-intensive imaging methods in a hard X-ray nanoprobe beamline at the SSRF

The development of hard X-ray nanoprobe techniques has given rise to a number of experimental methods, such as nano-XAS, nano-XRD, nano-XRF, ptychography and tomography. Each method has its own unique data processing algorithms. With the increase in data acquisition rate, the large amount of generated data now poses a major challenge to these algorithms. In this work, an intuitive, user-friendly software system is introduced to integrate and manage these algorithms; by taking advantage of the loosely coupled, component-based design approach of the system, the data processing speed of the imaging algorithm is enhanced through optimization of the parallelism efficiency. This study provides meaningful solutions to tackle complexity challenges faced in synchrotron data processing.
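
Because each scan point can be reduced independently, this kind of processing parallelizes naturally; the generic Python sketch below illustrates that pattern with multiprocessing (the ROI reduction and array sizes are made up, and this is not the SSRF system's code).

```python
# Generic illustration of parallel per-point processing for a scanning nanoprobe
# map (not the SSRF system itself): each scan point's spectrum is reduced
# independently, so the work distributes trivially across worker processes.
import numpy as np
from multiprocessing import Pool

def reduce_point(spectrum):
    """Toy reduction: integrate a region of interest of an XRF spectrum."""
    roi = slice(100, 120)                    # hypothetical channel range
    return float(spectrum[roi].sum())

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    scan = rng.poisson(2.0, size=(64 * 64, 2048))     # 64x64 map, 2048-channel spectra
    with Pool() as pool:
        roi_map = np.array(pool.map(reduce_point, scan)).reshape(64, 64)
    print(roi_map.shape)
```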




data

Redetermination of germacrone type II based on single-crystal X-ray data

The extraction and purification procedures, crystallization and crystal structure refinement (single-crystal X-ray data) of germacrone type II, C15H22O, are presented. The structural results are compared with a previous powder X-ray synchrotron study [Kaduk et al. (2022). Powder Diffr. 37, 98–104], revealing significant improvements in terms of accuracy and precision. Hirshfeld atom refinement (HAR), as well as Hirshfeld surface analysis, give insight into the intermolecular interactions of germacrone type II.




data

Data collection is your last experiment




data

TAAM refinement on high-resolution experimental and simulated 3D ED/MicroED data for organic molecules

3D electron diffraction (3D ED), or microcrystal electron diffraction (MicroED), has become an alternative technique for determining the high-resolution crystal structures of compounds from sub-micron-sized crystals. Here, we considered L-alanine, α-glycine and urea, which are known to form good-quality crystals, and collected high-resolution 3D ED data on our in-house TEM instrument. In this study, we present a comparison of independent atom model (IAM) and transferable aspherical atom model (TAAM) kinematical refinement against experimental and simulated data. TAAM refinement on both experimental and simulated data clearly improves the model fitting statistics (R factors and residual electrostatic potential) compared to IAM refinement. This shows that TAAM better represents the experimental electrostatic potential of organic crystals than IAM. Furthermore, we compared the geometrical parameters and atomic displacement parameters (ADPs) resulting from the experimental refinements with the simulated refinements, with the periodic density functional theory (DFT) calculations and with published X-ray and neutron crystal structures. The TAAM refinements on the 3D ED data did not improve the accuracy of the bond lengths between the non-H atoms. The experimental 3D ED data provided more accurate H-atom positions than the IAM refinements on the X-ray diffraction data. The IAM refinements against 3D ED data had a tendency to lead to slightly longer X—H bond lengths than TAAM, but the difference was statistically insignificant. Atomic displacement parameters were too large by tens of percent for L-alanine and α-glycine. Most probably, other unmodelled effects were causing this behaviour, such as radiation damage or dynamical scattering.




data

Deep residual networks for crystallography trained on synthetic data

The use of artificial intelligence to process diffraction images is challenged by the need to assemble large and precisely designed training data sets. To address this, a codebase called Resonet was developed for synthesizing diffraction data and training residual neural networks on these data. Here, two per-pattern capabilities of Resonet are demonstrated: (i) interpretation of crystal resolution and (ii) identification of overlapping lattices. Resonet was tested across a compilation of diffraction images from synchrotron experiments and X-ray free-electron laser experiments. Crucially, these models readily execute on graphics processing units and can thus significantly outperform conventional algorithms. While Resonet is currently utilized to provide real-time feedback for macromolecular crystallography users at the Stanford Synchrotron Radiation Lightsource, its simple Python-based interface makes it easy to embed in other processing frameworks. This work highlights the utility of physics-based simulation for training deep neural networks and lays the groundwork for the development of additional models to enhance diffraction collection and analysis.
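
The snippet below shows the basic residual-block construction such models rely on, with a single regression head for a per-pattern quantity such as resolution; it is a generic PyTorch illustration, not Resonet's actual architecture or training setup.

```python
# Generic residual-block illustration (not Resonet's actual architecture): a
# small convolutional network with skip connections and a single regression
# head for a per-pattern quantity such as resolution.
import torch
from torch import nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.act = nn.ReLU()

    def forward(self, x):
        out = self.act(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.act(out + x)          # skip connection

class TinyDiffractionRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(1, 16, 7, stride=2, padding=3), nn.ReLU())
        self.blocks = nn.Sequential(ResidualBlock(16), ResidualBlock(16))
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1))

    def forward(self, x):
        return self.head(self.blocks(self.stem(x)))

model = TinyDiffractionRegressor()
fake_patterns = torch.randn(4, 1, 256, 256)       # stand-in for simulated images
print(model(fake_patterns).shape)                 # torch.Size([4, 1])
```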




data

A web-based dashboard for RELION metadata visualization

Cryo-electron microscopy (cryo-EM) has witnessed radical progress in the past decade, driven by developments in hardware and software. While current software packages include processing pipelines that simplify the image-processing workflow, they do not prioritize the in-depth analysis of crucial metadata, limiting troubleshooting for challenging data sets. The widely used RELION software package lacks a graphical native representation of the underlying metadata. Here, two web-based tools are introduced: relion_live.py, which offers real-time feedback on data collection, aiding swift decision-making during data acquisition, and relion_analyse.py, a graphical interface to represent RELION projects by plotting essential metadata including interactive data filtration and analysis. A useful script for estimating ice thickness and data quality during movie pre-processing is also presented. These tools empower researchers to analyse data efficiently and allow informed decisions during data collection and processing.
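
For readers who script their own checks, a minimal example of producing one such metadata plot outside the GUI is given below; it assumes the third-party starfile package is installed, the file path is a placeholder, and rlnDefocusU is the standard RELION defocus column.

```python
# Minimal example of plotting RELION metadata outside the GUI, assuming the
# third-party `starfile` package (pip install starfile).  The file path is a
# placeholder; rlnDefocusU is a standard RELION CTF column.
import starfile
import matplotlib.pyplot as plt

star = starfile.read("CtfFind/job003/micrographs_ctf.star")   # placeholder path
# RELION 3.1+ files contain several data blocks; pick the micrographs table.
table = star["micrographs"] if isinstance(star, dict) else star

plt.hist(table["rlnDefocusU"] / 1e4, bins=50)
plt.xlabel("Defocus U (um)")
plt.ylabel("Micrographs")
plt.tight_layout()
plt.savefig("defocus_histogram.png", dpi=150)
```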