process

How food and beverage processors can build a glass and brittle plastic program

Many different foreign materials could end up in foods, and many actually have. The most insidious of these, and the one most feared, is glass.




process

The food processing industry needs automation to succeed

A long list of concerns continues to keep businesses in the food processing industry on their toes. Inflation. Supply chain issues. Hiring challenges. The pandemic. However, a connective thread runs through these four: automation.




process

Now is the Time to Secure Food Processing Facilities

The threats facing food processing facilities are already here, and there is no longer any time to delay implementing minimum protections for these sites.




process

Küberit Showcases New Profile Selection Process, Sample Acquisition and Specification Support

Küberit USA will debut its new integrated process for profile selection, sample acquisition, and specification support resources at this week’s TISE.




process

How Prestage Foods of Iowa built a state-of-the-art pork processing plant

Producing more than 1.4 billion pounds of pork and turkey annually, the Prestage Farms family of companies employs more than 2,700 associates and is affiliated with more than 470 farm families across the U.S.




process

Taking Old-World Processes Into the New World

The JBS Principe facility is FOOD ENGINEERING’s 2024 Plant of the Year for its daring modern take on traditional Italian-style dried meat production.




process

Frozen Dessert Maker Maintains Authenticity with In-house Processing

Kaurina’s Kulfi could take shortcuts with its ‘Indian ice cream’ for more product and profit, but instead chooses a scaled-up version of the process first used in the family’s kitchen in order to stay true to the frozen treat’s heritage.




process

AI-Based Inspection System for Potato Processors

The system includes cameras, AI software, a conveyor and automatic ejection mechanisms with dual drops (one for foreign material, one for culls) to ensure only ideal potatoes reach later process stages.




process

Tech tools to help processors make sense of today’s supply chain

While we hope the COVID-19 pandemic is fading into the background, we still need to be concerned with labor shortages, transportation interruptions, political issues, weather extremes, and other peripheral circumstances that can still break critical links in the supply chain.




process

Processors face higher commodity prices, but climate-smart technologies may help tame cost increases

The USDA says 2022 will be a great year for agriculture, and climate-smart farming practices can help.




process

Your window to the process: Clear view or obstructed?

Chances are you’ve been using the same HMI, MES or SCADA program in your operation for some time, and you’ve been through several version upgrades too. Providers of these products work hard to make their software backward compatible with equipment and operating systems, as well as to create new features users want.




process

Marion Process Solutions Debuts Chopper for Fast and Efficient Mixing

Marion’s new chopper is designed to deliver increased throughput, easier maintenance and enhanced safety for processors across multiple industries.




process

Eriez Expands Salient Pole Rare Earth Drum Line to Food Processing Applications

Building on the product’s success in the mineral processing and recycling industries, Salient Pole Rare Earth drums are now optimized to meet the stringent requirements of food processing, ensuring the purity and safety of food products.




process

Drainage Solutions for Food Processing Facilities Emphasize Food Safety

The Slot Dog, Slot Hog, Cleaning Brushes and Tamper-Proof Magnetic Strainer are each purpose-built to enhance food safety through effective drainage and sanitation.




process

Hamilton Process Analytics Unveils White Paper

The white paper addresses key challenges in cultivated meat production.




process

How Low-Temperature Absorption Chillers Can Optimize Food and Beverage Processing

As food and beverage manufacturers seek solutions to reduce operational costs and decarbonize, they may be surprised to find the answer in a solution that’s nearly a century old: absorption cooling.




process

Engineers Will Explore Green Future for Food Processing at WSU-Hosted Conference

Launched in 1991, the biannual conference brings together food engineers and technologists from across industries, academic institutions and government to discuss emerging challenges and potential solutions for delivery of safe, nutritious and sustainable foods.




process

Is Your Facility Safe In Processing Dry Products?

Unlike wet ingredients, dry powder ingredients and products usually require specifically designed process areas to prevent dangerous situations that could result in worker injuries or death.




process

Packaging Strategies for Meat Processors to Scale Up for Holiday Demand

Meat processors can turn to automation and vary film gauge, among other strategies, to meet demand for holiday products.




process

Revolutionizing Food & Beverage Processing with Time-Sensitive Networking

By embracing TSN, food and beverage companies not only improve their OEE but also set the stage for a future where production lines are not just automated but intelligently interconnected and extremely flexible.




process

PACK EXPO 2024 Offers Solutions for Packaging, Processing and Automation

Check out some of the latest packaging and processing solutions exhibitors plan to debut or showcase at PACK EXPO, set for Nov. 3-6 in Chicago.




process

SunOpta Invests $26 Million To Expand California Plant-Based Beverage Processing Facility

The Modesto expansion is the second largest capital project in the company’s history.




process

Saputo Completes the Sale of Two Milk Processing Facilities

The completion of this transaction is part of the company’s overall network optimization strategy, one of the pillars of its Global Strategic Plan.




process

Louis Dreyfus Company Breaks Ground on Ohio Soybean Processing Facility

The new facility will boost the company’s presence in growing edible oil and animal feed markets, and will create opportunities in renewable energy feedstock markets.




process

New Jarrett Foods Custom Poultry Processing Facility Opens

The company offers a wide variety of services, including whole bird cut-up, custom breast and thigh deboning, breast and tender hand portioning, marination and re-packing.




process

Attuning to processes of affective sociomaterialisation: exploring subjectivity and identity in outdoor early childhood provision in Scotland, UK.

Children's Geographies; 10/01/2024; ISSN: 1473-3285 (Academic Search Premier, AN 180134748)





process

Addressing integration in the organization of palliative care in Belgium: a multilevel ecosystems approach using the analytic hierarchy process (AHP) method

Palliative care is becoming an essential component of healthcare, but there is insufficient research on how integration across different levels of care (micro, meso, and macro) is realized in practice. Without…




  • Open Access Journal Articles

process

Graduate Image Processing R&D Engineer, Manchester, UK

About the Role
As an Imaging R&D Graduate, you will be joining the ISP team within Arm, which develops and designs image processing technology that is used in a range of applications including automobiles, security cameras, and drones. The algorithm development team is tasked with solving a variety of image processing problems, from denoising and demosaicing to auto-exposure and motion compensation. Our algorithms must satisfy the competing demands of high image quality and efficient, low-power hardware implementation.

This is an opportunity to contribute towards the next generation of imaging systems, for both human viewing and autonomous driving applications.

Why should you apply?

  • You want to work in leading digital imaging technology.
  • You have a keen interest in imaging or image processing, which you would like to develop into a career.
  • You want to see tangible results from your work.
  • You want to have the opportunity to learn from the best engineers and start a career in a leading imaging and vision technology group.

What will I be accountable for?

  • Working with image quality experts to determine requirements for processing.
  • Developing new image processing algorithms, often from early concept phase and typically in a mathematical modelling environment.
  • Implementing novel algorithms, starting from a floating-point model.
  • Testing and benchmarking of the results, working closely with our image quality experts.
  • Collaborating with the wider engineering team to arrive at an architecture and fixed-point model of your algorithm, optimized for hardware or software implementation.




process

One Size Does Not Fit All: Unraveling Item Response Process Heterogeneity Using the Mixture Dominance-Unfolding Model (MixDUM)

Organizational Research Methods, Ahead of Print. When modeling responses to items measuring non-cognitive constructs that require introspection (e.g., personality, attitude), most studies have assumed that respondents follow the same item response process—either a dominance or an unfolding one. Nevertheless, the results are not unequivocal, as some preliminary evidence suggests that some people use an unfolding […]




  • Journal Article Abstracts

process

How the Senate confirmation process works and how Trump wants to change it

NPR's Michel Martin talks with Edward Whelan of the Ethics and Public Policy Center about President-elect Trump's influence on Senate Republicans' selection of a new majority leader.




process

Cleveron’s newest solution enables DIY and home furnishing retailers to automate their click-and-collect processes

Cleveron, a click-and-collect automation solutions innovator, is proud to launch a modular outdoor parcel locker, Cleveron 355. The newest solution is specially engineered for DIY and home furnishing retailers, enabling the automated handover of extra-large items.




process

Volumatic partners with MHouse to help streamline cash processing

Cash handling solutions provider Volumatic has joined forces with EPoS solution provider MHouse Business Solutions to offer more efficient, accurate and streamlined cash processing to convenience retailers in Scotland and beyond.




process

CashComplete to Demonstrate Unified Cash Management Process at EuroShop 2020

SUZOHAPP – a global market leader in software and hardware payment management solutions – will be exhibiting at EuroShop, the world’s largest retail trade fair, held February 16-20 in Dusseldorf, Germany. On display at booth #6A75 will be the market-leading CashComplete™ line of payment management solutions.




process

Doddle launches self-service kiosks to help simplify returns process

Doddle, the global e-commerce technology provider, has introduced a range of self-service kiosks to its solution portfolio, enabling retailers and carriers to address the challenge of online returns. 




process

TOMOMAN: a software package for large-scale cryo-electron tomography data preprocessing, community data sharing and collaborative computing

Here we describe TOMOMAN (TOMOgram MANager), an extensible open-source software package for handling cryo-electron tomography data preprocessing. TOMOMAN streamlines interoperability between a wide range of external packages and provides tools for project sharing and archival.




process

ProSPyX: software for post-processing images of X-ray ptychography with spectral capabilities

X-ray ptychography is a coherent diffraction imaging technique based on acquiring multiple diffraction patterns obtained through the illumination of the sample at different partially overlapping probe positions. The diffraction patterns collected are used to retrieve the complex transmittivity function of the sample and the probe using a phase retrieval algorithm. Absorption or phase contrast images of the sample as well as the real and imaginary parts of the probe function can be obtained. Furthermore, X-ray ptychography can also provide spectral information of the sample from absorption or phase shift images by capturing multiple ptychographic projections at varying energies around the resonant energy of the element of interest. However, post-processing of the images is required to extract the spectra. To facilitate this, ProSPyX, a Python package that offers the analysis tools and a graphical user interface required to process spectral ptychography datasets, is presented. Using the PyQt5 Python open-source module for development and design, the software facilitates extraction of absorption and phase spectral information from spectral ptychographic datasets. It also saves the spectra in file formats compatible with other X-ray absorption spectroscopy data analysis software tools, streamlining integration into existing spectroscopic data analysis pipelines. To illustrate its capabilities, ProSPyX was applied to process the spectral ptychography dataset recently acquired on a nickel wire at the SWING beamline of the SOLEIL synchrotron.
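The spectrum-extraction step described above can be pictured with a short, purely illustrative NumPy sketch that does not use ProSPyX's actual API: given a stack of reconstructed absorption images acquired at energies around an edge, a region of interest is averaged in each image to yield a spectrum. The energy range, stack shape and ROI below are invented for the example.

    import numpy as np

    # Placeholder stack of reconstructed absorption images, one per energy point
    # (in practice these would come from the ptychographic reconstructions).
    energies = np.linspace(8325.0, 8375.0, 51)      # eV, around the Ni K edge
    stack = np.random.rand(51, 512, 512)            # (energy, y, x), dummy data

    # Average a region of interest in every image to obtain the absorption spectrum.
    roi = (slice(200, 260), slice(300, 360))        # pixels covering the feature of interest
    spectrum = stack[:, roi[0], roi[1]].mean(axis=(1, 2))

    # Save in a simple two-column format for downstream XAS analysis tools.
    np.savetxt("roi_spectrum.dat", np.column_stack([energies, spectrum]))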




process

A distributed data processing scheme based on Hadoop for synchrotron radiation experiments

With the development of synchrotron radiation sources and high-frame-rate detectors, the amount of experimental data collected at synchrotron radiation beamlines has increased exponentially. As a result, data processing for synchrotron radiation experiments has entered the era of big data. It is becoming increasingly important for beamlines to be able to process large-scale data in parallel to keep up with this rapid growth. Currently, there is no data processing solution for beamlines based on a big data technology framework. Apache Hadoop is a widely used distributed system architecture for solving the problem of massive data storage and computation. This paper presents a distributed data processing scheme for beamline experimental data based on Hadoop. The Hadoop Distributed File System is utilized as the distributed file storage system, and Hadoop YARN serves as the resource scheduler for the distributed computing cluster. A distributed data processing pipeline that can carry out massively parallel computation is designed and developed using Hadoop Spark. The entire data processing platform adopts a distributed microservice architecture, which makes the system easy to expand, reduces module coupling and improves reliability.
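As a rough illustration of the massively parallel, per-file processing such a pipeline performs, the following PySpark sketch distributes a toy reduction over a list of detector files. The file names, HDF5 dataset path and process_frame() function are assumptions made for the example, not the scheme described in the paper.

    import numpy as np
    import h5py
    from pyspark.sql import SparkSession

    def process_frame(path):
        """Toy per-file task: read one detector frame and return its total intensity."""
        with h5py.File(path, "r") as f:
            frame = f["entry/data"][()]            # hypothetical dataset layout
        return path, float(np.sum(frame))

    spark = SparkSession.builder.appName("beamline-frames").getOrCreate()

    # In a real deployment the paths would point into HDFS (hdfs://...); local names here.
    paths = ["frame_%04d.h5" % i for i in range(1000)]
    results = (spark.sparkContext
               .parallelize(paths, numSlices=64)    # split the work across the cluster
               .map(process_frame)
               .collect())
    spark.stop()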




process

StreamSAXS: a Python-based workflow platform for processing streaming SAXS/WAXS data

StreamSAXS is a Python-based small- and wide-angle X-ray scattering (SAXS/WAXS) data analysis workflow platform with graphical user interface (GUI). It aims to provide an interactive and user-friendly tool for analysis of both batch data files and real-time data streams. Users can easily create customizable workflows through the GUI to meet their specific needs. One characteristic of StreamSAXS is its plug-in framework, which enables developers to extend the built-in workflow tasks. Another feature is the support for both already acquired and real-time data sources, allowing StreamSAXS to function as an offline analysis platform or be integrated into large-scale acquisition systems for end-to-end data management. This paper presents the core design of StreamSAXS and provides user cases demonstrating its utilization for SAXS/WAXS data analysis in offline and online scenarios.
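The plug-in and workflow idea can be sketched generically as below; this is a conceptual illustration only and does not reproduce StreamSAXS's real task API. A workflow is a chain of task objects, each transforming a data dictionary, and it runs identically on a finished batch of frames or on a live stream.

    import numpy as np

    class Task:
        """Base class a developer would extend to add a custom workflow step."""
        def process(self, data):
            raise NotImplementedError

    class RadialAverage(Task):
        """Hypothetical custom task: azimuthally average a 2D pattern into a 1D profile."""
        def process(self, data):
            img = data["image"]
            y, x = np.indices(img.shape)
            r = np.hypot(x - img.shape[1] / 2, y - img.shape[0] / 2).astype(int)
            counts = np.bincount(r.ravel())
            data["profile"] = np.bincount(r.ravel(), weights=img.ravel()) / np.maximum(counts, 1)
            return data

    def run_workflow(tasks, frames):
        for frame in frames:              # `frames` can be files on disk or a live stream
            data = {"image": frame}
            for task in tasks:
                data = task.process(data)
            yield data

    # Example: run the one-step workflow on a fake stream of random frames.
    fake_stream = (np.random.rand(256, 256) for _ in range(5))
    for result in run_workflow([RadialAverage()], fake_stream):
        print(result["profile"][:5])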




process

Hyperspectral full-field quick-EXAFS imaging at the ROCK beamline for monitoring micrometre-sized heterogeneity of functional materials under process conditions

Full-field transmission X-ray microscopy has been recently implemented at the hard X-ray ROCK–SOLEIL quick-EXAFS beamline, adding micrometre spatial resolution to the second time resolution characterizing the beamline. Benefiting from a beam size versatility due to the beamline focusing optics, full-field hyperspectral XANES imaging has been successfully used at the Fe K-edge for monitoring the pressure-induced spin transition of a 150 µm × 150 µm Fe(o-phen)2(NCS)2 single crystal and the charge of millimetre-sized LiFePO4 battery electrodes. Hyperspectral imaging over 2000 eV has been reported for the simultaneous monitoring of Fe and Cu speciation changes during activation of a FeCu bimetallic catalyst along a millimetre-sized catalyst bed. Strategies of data acquisition and post-data analysis using Jupyter notebooks and multivariate data analysis are presented, and the gain obtained using full-field hyperspectral quick-EXAFS imaging for studies of functional materials under process conditions in comparison with macroscopic information obtained by non-spatially resolved quick-EXAFS techniques is discussed.




process

A service-based approach to cryoEM facility processing pipelines at eBIC

Electron cryo-microscopy image-processing workflows are typically composed of elements that may, broadly speaking, be categorized as high-throughput workloads which transition to high-performance workloads as preprocessed data are aggregated. The high-throughput elements are of particular importance in the context of live processing, where an optimal response is highly coupled to the temporal profile of the data collection. In other words, each movie should be processed as quickly as possible at the earliest opportunity. The high level of disconnected parallelization in the high-throughput problem directly allows a completely scalable solution across a distributed computer system, with the only technical obstacle being an efficient and reliable implementation. The cloud computing frameworks primarily developed for the deployment of high-availability web applications provide an environment with a number of appealing features for such high-throughput processing tasks. Here, an implementation of an early-stage processing pipeline for electron cryotomography experiments using a service-based architecture deployed on a Kubernetes cluster is discussed in order to demonstrate the benefits of this approach and how it may be extended to scenarios of considerably increased complexity.
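The "process each movie as soon as it arrives" pattern can be reduced to a minimal queue-and-workers stand-in, shown below with Python's multiprocessing module. This is only a conceptual analogue of the service-based design: in the deployment described, the workers would be independent services scaled by Kubernetes and the queue a proper message broker, and preprocess_movie() is a placeholder.

    from multiprocessing import Process, Queue

    def preprocess_movie(path):
        # placeholder for per-movie work such as motion correction and CTF estimation
        return "processed " + path

    def worker(todo, done):
        while True:
            path = todo.get()
            if path is None:                 # sentinel: no more work for this worker
                break
            done.put(preprocess_movie(path))

    if __name__ == "__main__":
        todo, done = Queue(), Queue()
        workers = [Process(target=worker, args=(todo, done)) for _ in range(4)]
        for w in workers:
            w.start()
        for i in range(10):                  # in production, paths stream in during collection
            todo.put("movie_%04d.tiff" % i)
        for _ in workers:                    # one sentinel per worker
            todo.put(None)
        for w in workers:
            w.join()
        while not done.empty():
            print(done.get())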




process

Advanced exploitation of unmerged reflection data during processing and refinement with autoPROC and BUSTER

The validation of structural models obtained by macromolecular X-ray crystallography against experimental diffraction data, whether before deposition into the PDB or after, is typically carried out exclusively against the merged data that are eventually archived along with the atomic coordinates. It is shown here that the availability of unmerged reflection data enables valuable additional analyses to be performed that yield improvements in the final models, and tools are presented to implement them, together with examples of the results to which they give access. The first example is the automatic identification and removal of image ranges affected by loss of crystal centering or by excessive decay of the diffraction pattern as a result of radiation damage. The second example is the `reflection-auditing' process, whereby individual merged data items showing especially poor agreement with model predictions during refinement are investigated thanks to the specific metadata (such as image number and detector position) that are available for the corresponding unmerged data, potentially revealing previously undiagnosed instrumental, experimental or processing problems. The third example is the calculation of so-called F(early) − F(late) maps from carefully selected subsets of unmerged amplitude data, which can not only highlight the location and extent of radiation damage but can also provide guidance towards suitable fine-grained parametrizations to model the localized effects of such damage.
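To make the F(early) − F(late) idea concrete, the sketch below shows the kind of subset selection it rests on: unmerged observations are split by image number into early and late groups and merged separately before differencing. This is a conceptual illustration with a hypothetical column layout, not autoPROC or BUSTER internals.

    import pandas as pd

    # Hypothetical table of unmerged observations: one row per measurement,
    # with Miller indices, the image number and the measured intensity.
    obs = pd.read_csv("unmerged_observations.csv")   # columns: h, k, l, image, intensity

    early = obs[obs["image"] <= obs["image"].quantile(0.25)]
    late = obs[obs["image"] >= obs["image"].quantile(0.75)]

    def merge(subset):
        """Average the repeated observations of each reflection."""
        return subset.groupby(["h", "k", "l"])["intensity"].mean()

    # Reflections measured in both subsets; their difference carries the damage signal.
    diff = (merge(early) - merge(late)).dropna()
    print(diff.describe())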




process

The success rate of processed predicted models in molecular replacement: implications for experimental phasing in the AlphaFold era

The availability of highly accurate protein structure predictions from AlphaFold2 (AF2) and similar tools has hugely expanded the applicability of molecular replacement (MR) for crystal structure solution. Many structures can be solved routinely using raw models, structures processed to remove unreliable parts or models split into distinct structural units. There is therefore an open question around how many and which cases still require experimental phasing methods such as single-wavelength anomalous diffraction (SAD). Here, this question is addressed using a large set of PDB depositions that were solved by SAD. A large majority (87%) could be solved using unedited or minimally edited AF2 predictions. A further 18 (4%) yield straightforwardly to MR after splitting of the AF2 prediction using Slice'N'Dice, although different splitting methods succeeded on slightly different sets of cases. It is also found that further unique targets can be solved by alternative modelling approaches such as ESMFold (four cases), alternative MR approaches such as ARCIMBOLDO and AMPLE (two cases each), and multimeric model building with AlphaFold-Multimer or UniFold (three cases). Ultimately, only 12 cases, or 3% of the SAD-phased set, did not yield to any form of MR tested here, offering valuable hints as to the number and the characteristics of cases where experimental phasing remains essential for macromolecular structure solution.




process

EMhub: a web platform for data management and on-the-fly processing in scientific facilities

Most scientific facilities produce large amounts of heterogeneous data at a rapid pace. Managing users, instruments, reports and invoices presents additional challenges. To address these challenges, EMhub, a web platform designed to support the daily operations and record-keeping of a scientific facility, has been introduced. EMhub enables the easy management of user information, instruments, bookings and projects. The application was initially developed to meet the needs of a cryoEM facility, but its functionality and adaptability have proven to be broad enough to be extended to other data-generating centers. The expansion of EMhub is enabled by the modular nature of its core functionalities. The application allows external processes to be connected via a REST API, automating tasks such as folder creation, user and password generation, and the execution of real-time data-processing pipelines. EMhub has been used for several years at the Swedish National CryoEM Facility and has been installed in the CryoEM center at the Structural Biology Department at St. Jude Children's Research Hospital. A fully automated single-particle pipeline has been implemented for on-the-fly data processing and analysis. At St. Jude, the X-Ray Crystallography Center and the Single-Molecule Imaging Center have already expanded the platform to support their operational and data-management workflows.
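One way to picture "connecting external processes via a REST API" is the short requests sketch below; the base URL, endpoints and payloads are invented for illustration and are not EMhub's documented API.

    import requests

    BASE = "https://emhub.example.org/api"                 # placeholder server
    session = requests.Session()
    session.headers["Authorization"] = "Bearer <token>"    # placeholder credential

    # Look up today's bookings for an instrument (hypothetical endpoint).
    bookings = session.get(BASE + "/bookings", params={"instrument": "Krios-1"}).json()

    # For each booking, ask the server to create the project folder and start the
    # on-the-fly processing pipeline (both endpoints are hypothetical).
    for b in bookings:
        session.post(BASE + "/projects/%s/folders" % b["project_id"])
        session.post(BASE + "/processing/start", json={"session_id": b["id"]})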




process

Analysis of COF-300 synthesis: probing degradation processes and 3D electron diffraction structure

Although COF-300 is often used as an example to study the synthesis and structure of (3D) covalent organic frameworks (COFs), knowledge of the underlying synthetic processes is still fragmented. Here, an optimized synthetic procedure based on a combination of linker protection and modulation was applied. Using this approach, the influence of time and temperature on the synthesis of COF-300 was studied. Synthesis times that were too short produced materials with limited crystallinity and porosity, lacking the typical pore flexibility associated with COF-300. On the other hand, synthesis times that were too long could be characterized by loss of crystallinity and pore order by degradation of the tetrakis(4-aminophenyl)methane (TAM) linker used. The presence of the degradation product was confirmed by visual inspection, Raman spectroscopy and X-ray photoelectron spectroscopy (XPS). As TAM is by far the most popular linker for the synthesis of 3D COFs, this degradation process might be one of the reasons why the development of 3D COFs is still lagging compared with 2D COFs. However, COF crystals obtained via an optimized procedure could be structurally probed using 3D electron diffraction (3DED). The 3DED analysis resulted in a full structure determination of COF-300 at atomic resolution with satisfying data parameters. Comparison of our 3DED-derived structural model with previously reported single-crystal X-ray diffraction data for this material, as well as parameters derived from the Cambridge Structural Database, demonstrates the high accuracy of the 3DED method for structure determination. This validation might accelerate the exploitation of 3DED as a structure determination technique for COFs and other porous materials.




process

Phase quantification using deep neural network processing of XRD patterns

Mineral identification and quantification are key to the understanding and, hence, the capacity to predict material properties. The method of choice for mineral quantification is powder X-ray diffraction (XRD), generally using a Rietveld refinement approach. However, a successful Rietveld refinement requires preliminary identification of the phases that make up the sample. This is generally carried out manually, and this task becomes extremely long or virtually impossible in the case of very large datasets such as those from synchrotron X-ray diffraction computed tomography. To circumvent this issue, this article proposes a novel neural network (NN) method for automating phase identification and quantification. An XRD pattern calculation code was used to generate large datasets of synthetic data that are used to train the NN. This approach offers significant advantages, including the ability to construct databases with a substantial number of XRD patterns and the introduction of extensive variability into these patterns. To enhance the performance of the NN, a specifically designed loss function for proportion inference was employed during the training process, offering improved efficiency and stability compared with traditional functions. The NN, trained exclusively with synthetic data, proved its ability to identify and quantify mineral phases on synthetic and real XRD patterns. Trained NN errors were equal to 0.5% for phase quantification on the synthetic test set, and 6% on the experimental data, in a system containing four phases of contrasting crystal structures (calcite, gibbsite, dolomite and hematite). The proposed method is freely available on GitHub and allows for major advances since it can be applied to any dataset, regardless of the mineral phases present.
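The following PyTorch sketch shows the general shape of such an approach: a small 1D convolutional network maps a diffraction pattern to phase proportions that sum to one, trained here on random stand-in data with a simple proportion loss. The architecture, loss and data are illustrative assumptions, not the network or loss function proposed in the article.

    import torch
    import torch.nn as nn

    N_POINTS, N_PHASES = 2048, 4          # e.g. calcite, gibbsite, dolomite, hematite

    model = nn.Sequential(
        nn.Conv1d(1, 16, kernel_size=11, padding=5), nn.ReLU(), nn.MaxPool1d(4),
        nn.Conv1d(16, 32, kernel_size=11, padding=5), nn.ReLU(), nn.MaxPool1d(4),
        nn.Flatten(),
        nn.Linear(32 * (N_POINTS // 16), N_PHASES),
        nn.Softmax(dim=1),                # outputs sum to 1, i.e. phase proportions
    )

    def proportion_loss(pred, target):
        # simple stand-in for a proportion-oriented loss: mean absolute error on fractions
        return (pred - target).abs().mean()

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Stand-in "synthetic training set": random patterns and random proportions, purely
    # to exercise the loop; a real set would come from an XRD pattern simulator.
    patterns = torch.rand(256, 1, N_POINTS)
    fractions = torch.rand(256, N_PHASES)
    fractions = fractions / fractions.sum(dim=1, keepdim=True)

    for epoch in range(5):
        optimizer.zero_grad()
        loss = proportion_loss(model(patterns), fractions)
        loss.backward()
        optimizer.step()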




process

Millisecond X-ray reflectometry and neural network analysis: unveiling fast processes in spin coating

X-ray reflectometry (XRR) is a powerful tool for probing the structural characteristics of nanoscale films and layered structures, which is an important field of nanotechnology and is often used in semiconductor and optics manufacturing. This study introduces a novel approach for conducting quantitative high-resolution millisecond monochromatic XRR measurements. This is an order of magnitude faster than in previously published work. Quick XRR (qXRR) enables real time and in situ monitoring of nanoscale processes such as thin film formation during spin coating. A record qXRR acquisition time of 1.4 ms is demonstrated for a static gold thin film on a silicon sample. As a second example of this novel approach, dynamic in situ measurements are performed during PMMA spin coating onto silicon wafers and fast fitting of XRR curves using machine learning is demonstrated. This investigation primarily focuses on the evolution of film structure and surface morphology, resolving for the first time with qXRR the initial film thinning via mass transport and also shedding light on later thinning via solvent evaporation. This innovative millisecond qXRR technique is of significance for in situ studies of thin film deposition. It addresses the challenge of following intrinsically fast processes, such as thin film growth of high deposition rate or spin coating. Beyond thin film growth processes, millisecond XRR has implications for resolving fast structural changes such as photostriction or diffusion processes.




process

The tin content of lead inclusions in ancient tin-bronze artifacts: a time-dependent process?

In antiquity, Pb was a common element added in the production of large bronze artifacts, especially large statues, to impart fluidity to the casting process. As Pb does not form a solid solution with pure Cu or with the Sn–Cu alloy phases, it is normally observed in the metal matrix as globular droplets embedded within or in interstitial positions among the crystals of Sn-bronze (normally the α phase) as the last crystallizing phase during the cooling process of the Cu–Sn–Pb ternary melt. The disequilibrium Sn content of the Pb droplets has recently been suggested as a viable parameter to detect modern materials [Shilstein, Berner, Feldman, Shalev & Rosenberg (2019). STAR Sci. Tech. Archaeol. Res. 5, 29–35]. The application assumes a time-dependent process, with a timescale of hundreds of years, estimated on the basis of the diffusion coefficient of Sn in Pb over a length of a few micrometres [Oberschmidt, Kim & Gupta (1982). J. Appl. Phys. 53, 5672–5677]. Therefore, Pb inclusions in recent Sn-bronze artifacts are actually a metastable solid solution of Pb–Sn containing ∼3% atomic Sn. In contrast, in ancient artifacts, unmixing processes and diffusion of Sn from the micro- and nano-inclusions of Pb to the matrix occur, resulting in the Pb inclusions containing a substantially lower or negligible amount of Sn. Determining the Sn content in the Pb inclusions relies on accurate measurement of the lattice parameter of the phase in the Pb–Sn solid solution, since for low Sn values it closely follows Vegard's law. Here, several new measurements on modern and ancient samples are presented and discussed in order to verify the applicability of the method to the detection of modern artwork pretending to be ancient.
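A back-of-the-envelope version of the Vegard's-law estimate reads as follows; the reference lattice parameter for the ~3 at.% Sn solid solution is a placeholder to be replaced by a calibrated value, and only the pure-Pb number is a standard literature value.

    # Linear (Vegard-type) interpolation between pure Pb and a reference Pb-Sn alloy.
    A_PB = 4.951     # lattice parameter of pure Pb, in angstroms (approximate literature value)
    A_REF = 4.942    # hypothetical lattice parameter of the ~3 at.% Sn metastable solid solution
    X_REF = 0.03     # atomic fraction of Sn in that reference alloy

    def atomic_fraction_sn(a_measured):
        """Estimate the Sn atomic fraction of a Pb inclusion from its lattice parameter."""
        return X_REF * (A_PB - a_measured) / (A_PB - A_REF)

    print(atomic_fraction_sn(4.947))     # ~0.013, i.e. ~1.3 at.% Sn for this made-up measurement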




process

Tracking copper nanofiller evolution in polysiloxane during processing into SiOC ceramic

Polymer-derived ceramics (PDCs) remain at the forefront of research for a variety of applications including ultra-high-temperature ceramics, energy storage and functional coatings. Despite their wide use, questions remain about the complex structural transition from polymer to ceramic and how local structure influences the final microstructure and resulting properties. This is further complicated when nanofillers are introduced to tailor structural and functional properties, as nanoparticle surfaces can interact with the matrix and influence the resulting structure. The inclusion of crystalline nanofiller produces a mixed crystalline–amorphous composite, which poses characterization challenges. With this study, we aim to address these challenges with a local-scale structural study that probes changes in a polysiloxane matrix with incorporated copper nanofiller. Composites were processed at three unique temperatures to capture mixing, pyrolysis and initial crystallization stages for the pre-ceramic polymer. We observed the evolution of the nanofiller with electron microscopy and applied synchrotron X-ray diffraction with differential pair distribution function (d-PDF) analysis to monitor changes in the matrix's local structure and interactions with the nanofiller. The application of the d-PDF to PDC materials is novel and informs future studies to understand interfacial interactions between nanofiller and matrix throughout PDC processing.
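The differential-PDF step can be summarized in a few lines of NumPy, assuming G(r) curves have already been extracted elsewhere: the weighted contribution of a reference (here, the filler-free matrix) is subtracted from the composite PDF, leaving the filler and interface signal. File names and the weighting factor below are placeholders, not data or parameters from the study.

    import numpy as np

    comp = np.loadtxt("composite.gr")          # hypothetical two-column (r, G) files
    matrix = np.loadtxt("neat_matrix.gr")      # measured on the filler-free polysiloxane
    r, g_composite = comp[:, 0], comp[:, 1]
    g_matrix = matrix[:, 1]                    # assumed to share the same r grid

    phi = 0.85                                 # matrix weight; refined or estimated from composition
    g_diff = g_composite - phi * g_matrix      # residual: nanofiller and interface contributions

    np.savetxt("d_pdf.gr", np.column_stack([r, g_diff]))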




process

Automated pipeline processing X-ray diffraction data from dynamic compression experiments on the Extreme Conditions Beamline of PETRA III

Presented and discussed here is the implementation of a software solution that provides prompt X-ray diffraction data analysis during fast dynamic compression experiments conducted within the dynamic diamond anvil cell technique. It includes efficient data collection, streaming of data and metadata to a high-performance cluster (HPC), fast azimuthal data integration on the cluster, and tools for controlling the data processing steps and visualizing the data using the DIOPTAS software package. This data processing pipeline is invaluable for a great number of studies. The potential of the pipeline is illustrated with two examples of data collected on ammonia–water mixtures and multiphase mineral assemblies under high pressure. The pipeline is designed to be generic in nature and could be readily adapted to provide rapid feedback for many other X-ray diffraction techniques, e.g. large-volume press studies, in situ stress/strain studies, phase transformation studies, chemical reactions studied with high-resolution diffraction etc.