processing

BENEO Invests $54 Million in Pulse-Processing Plant

The new production site further strengthens the company’s plant-based protein portfolio and enables BENEO to meet the growing demand for plant-based food and feed ingredients.




processing

Cold-Pressed Processing Transforms Dried Green Bananas into Pasta

Solely says it used its signature, patented cold-pressing process to transform dried green bananas into a traditional fusilli shape and texture. The dark-colored Organic Green Banana Fusilli Pasta cooks to an al-dente texture in just four minutes and is ready to be topped with a sauce or other ingredients.




processing

White River Soy Processing Purchases Benson Hill Ingredients

White River Soy Processing, LLC (White River), a developer and operator of oilseed processing plants in the US, announced that it purchased Benson Hill Ingredients, LLC, which operates an established food grade soybean processing facility in Creston, Iowa, from Benson Hill, Inc.




processing

Food processors can get big benefits from automating batch processing

Automation of batch processing isn’t new, but as more companies move beyond the basics of automation, they are finding that new technology brings big rewards—as well as big challenges.




processing

Prioritizing Food Safety—Metal Detection in Milchwerke Schwaben’s Dairy Processing

Milchwerke Schwaben’s presence in the dairy industry dates back to 1922, when dairy farmers in Ulm, Germany, joined together to form a cooperative that would make it possible to produce dairy products with greater efficiency. This merger resulted in a company whose products meet the needs of consumers throughout Germany and abroad.




processing

Dry Processing Products | August 2022

This month’s dry processing product focus covers bulk bag handling equipment.




processing

Choosing the Right Pump for Poultry Processing Applications

Selecting the right pump to convey product in a food-safe manner during processing steps or to remove waste from processing operations to a waste facility is important for poultry processors.




processing

Laying a Model of Sustainability for Food Processing

It only takes a few seconds after stepping into Vital Farms’ Egg Central Station (ECS) to realize that the company truly loves its egg-related puns. If there’s writing nearby, be it on a wall, memo or bulletin board, there’s a pun not far behind. They’re part of an ethos that starts from the top and flows down the company’s ranks to put a smile on someone’s face, even if just for a second. That idea of doing good extends beyond the facility walls to its other locations, the farmers who supply the eggs, the consumers who buy the product and, finally, to the planet itself. That visible commitment to the environment is why the addition at ECS was chosen as FOOD ENGINEERING’s 2023 Sustainable Plant of the Year.




processing

Automation and food processing today: It's about labor

Processors are turning to automation to fill jobs where people are unavailable.





processing

Crespel & Deiters Commissions Silo Building at German Wheat Processing Facility

Wheat processor Crespel & Deiters commissioned the construction of an €18.5 million ($20.7 million) silo building at its main site in Ibbenbüren, Germany, at the beginning of the year – the largest single investment in infrastructure in the company’s history.




processing

How Silicas Can Improve Dry Processing Production

Jordan Talmadge, PPG’s global commercial director, silicas, recently spoke about the role silicas can play in preventing caking and optimizing the flow characteristics of food products, as well as improving product performance and manufacturing productivity.




processing

The food processing industry needs automation to succeed

A long list of concerns continues to keep businesses in the food processing industry on their toes. Inflation. Supply chain issues. Hiring challenges. The pandemic. However, a connective thread runs through these four: automation.




processing

Now is the Time to Secure Food Processing Facilities

The threats facing food processing facilities are already here, and there is no longer any time to delay implementing minimum protections for these sites.




processing

How Prestage Foods of Iowa built a state-of-the-art pork processing plant

Producing more than 1.4 billion pounds of pork and turkey annually, the Prestage Farms family of companies employs more than 2,700 associates and is affiliated with more than 470 farm families across the U.S.




processing

Frozen Dessert Maker Maintains Authenticity with In-house Processing

Kaurina’s Kulfi could take shortcuts with its ‘Indian ice cream’ for more product and profit, but instead chooses a scaled-up process first used in the family’s kitchen in order to stay true to the frozen treat’s heritage.




processing

Eriez Expands Salient Pole Rare Earth Drum Line to Food Processing Applications

Building on the product’s success in the mineral processing and recycling industries, Salient Pole Rare Earth drums are now optimized to meet the stringent requirements of food processing, ensuring the purity and safety of food products.




processing

Drainage Solutions for Food Processing Facilities Emphasize Food Safety

The Slot Dog, Slot Hog, Cleaning Brushes, and the Tamper-Proof Magnetic Strainer—each purpose-built to enhance food safety through effective drainage and sanitation systems




processing

How Low-Temperature Absorption Chillers Can Optimize Food and Beverage Processing

As food and beverage manufacturers seek solutions to reduce operational costs and decarbonize, they may be surprised to find the answer in a solution that’s nearly a century old: absorption cooling.




processing

Engineers will Explore Green Future For Food Processing at WSU-Hosted Conference

Launched in 1991, the biennial conference brings together food engineers and technologists from across industries, academic institutions and government to discuss emerging challenges and potential solutions for delivery of safe, nutritious and sustainable foods.




processing

Is Your Facility Safe In Processing Dry Products?

Unlike wet ingredients, the handling of dry powder ingredients and products usually requires specifically designed process areas to prevent dangerous situations that could result in worker injuries or death.




processing

Revolutionizing Food & Beverage Processing with Time-Sensitive Networking

By embracing TSN, food and beverage companies not only improve their overall equipment effectiveness (OEE) but also set the stage for a future where production lines are not just automated but intelligently interconnected and extremely flexible.




processing

PACK EXPO 2024 Offers Solutions for Packaging, Processing and Automation

Check out some of the latest packaging and processing solutions exhibitors plan to debut or showcase at PACK EXPO, set for Nov. 3-6 in Chicago.




processing

SunOpta Invests $26 Million To Expand California Plant-Based Beverage Processing Facility

The Modesto expansion is the second largest capital project in the company’s history.




processing

Saputo Completes the Sale of Two Milk Processing Facilities

The completion of this transaction is part of the company’s overall network optimization strategy, one of the pillars of its Global Strategic Plan.




processing

Louis Dreyfus Company Breaks Ground on Ohio Soybean Processing Facility

The new facility will boost the company’s presence in growing edible oil and animal feed markets, and will create opportunities in renewable energy feedstock markets.




processing

New Jarrett Foods Custom Poultry Processing Facility Opens

The company offers a wide variety of services, including whole bird cut-up, custom breast and thigh deboning, breast and tender hand portioning, marination and re-packing.




processing

Graduate Image Processing R&D Engineer, Graduates, Manchester, UK, Research

About the Role
As an Imaging R&D Graduate, you will be joining the ISP team within Arm, which develops and designs image processing technology that is used in a range of applications including automobiles, security cameras, and drones. The algorithm development team is tasked with solving a variety of image processing problems, from denoise to demosaic, auto-exposure to motion compensation. Our algorithms must satisfy the competing demands of high image quality, and efficient, low-power hardware implementation.

This is an opportunity to contribute towards the next generation of imaging systems, for both human viewing and autonomous driving applications.

Why should you apply?

  • You want to work in leading digital imaging technology.
  • You have a keen interest in imaging or image processing, which you would like to develop into a career.
  • You want to see tangible results from your work.
  • You want to have the opportunity to learn from the best engineers and start a career in a leading imaging and vision technology group.

What will I be accountable for?

  • Working with image quality experts to determine requirements for processing.
  • Developing new image processing algorithms, often from early concept phase and typically in a mathematical modelling environment.
  • Implementing novel algorithms, starting from a floating-point model.
  • Testing and benchmarking of the results, working closely with our image quality experts.
  • Collaborating with the wider engineering team to arrive at an architecture and fixed-point model of your algorithm, optimized for hardware or software implementation.
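The last responsibility above, taking a floating-point reference model to a fixed-point model suited to hardware, can be sketched in miniature. This is an illustrative toy, not Arm's ISP flow: a 3-tap smoothing filter implemented twice, once in floating point and once with Q15 integer arithmetic of the kind that maps onto low-power hardware.

```python
# Illustrative only: a 3-tap denoise filter as a floating-point reference
# model, then as a Q15 fixed-point version (15 fractional bits).

FLOAT_TAPS = [0.25, 0.5, 0.25]                        # reference coefficients
Q = 15                                                # Q15 format
FIX_TAPS = [round(t * (1 << Q)) for t in FLOAT_TAPS]  # [8192, 16384, 8192]

def filter_float(pixels):
    """Floating-point reference model (edges handled by clamping)."""
    n = len(pixels)
    out = []
    for i in range(n):
        acc = 0.0
        for k, t in enumerate(FLOAT_TAPS):
            j = min(max(i + k - 1, 0), n - 1)   # clamp at the borders
            acc += t * pixels[j]
        out.append(acc)
    return out

def filter_fixed(pixels):
    """Q15 integer model: multiply-accumulate, then round and shift."""
    n = len(pixels)
    out = []
    for i in range(n):
        acc = 0
        for k, t in enumerate(FIX_TAPS):
            j = min(max(i + k - 1, 0), n - 1)
            acc += t * pixels[j]                # integer pixels, integer taps
        out.append((acc + (1 << (Q - 1))) >> Q) # round-to-nearest, drop Q bits
    return out

pixels = [10, 12, 250, 11, 13]                  # a noisy scanline
ref = filter_float(pixels)
fix = filter_fixed(pixels)
# The fixed-point output tracks the float reference to within one LSB.
```

Benchmarking the two models against each other, as the role describes, amounts to bounding exactly this kind of quantization error.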




processing

Volumatic partners with MHouse to help streamline cash processing

Cash handling solutions provider Volumatic has joined forces with EPoS solution provider MHouse Business Solutions to offer more efficient, accurate and streamlined cash processing to convenience retailers in Scotland and beyond.




processing

TOMOMAN: a software package for large-scale cryo-electron tomography data preprocessing, community data sharing and collaborative computing

Here we describe TOMOMAN (TOMOgram MANager), an extensible open-source software package for handling cryo-electron tomography data preprocessing. TOMOMAN streamlines interoperability between a wide range of external packages and provides tools for project sharing and archival.




processing

ProSPyX: software for post-processing images of X-ray ptychography with spectral capabilities

X-ray ptychography is a coherent diffraction imaging technique based on acquiring multiple diffraction patterns obtained through the illumination of the sample at different partially overlapping probe positions. The diffraction patterns collected are used to retrieve the complex transmittivity function of the sample and the probe using a phase retrieval algorithm. Absorption or phase contrast images of the sample as well as the real and imaginary parts of the probe function can be obtained. Furthermore, X-ray ptychography can also provide spectral information of the sample from absorption or phase shift images by capturing multiple ptychographic projections at varying energies around the resonant energy of the element of interest. However, post-processing of the images is required to extract the spectra. To facilitate this, ProSPyX, a Python package that offers the analysis tools and a graphical user interface required to process spectral ptychography datasets, is presented. Using the PyQt5 Python open-source module for development and design, the software facilitates extraction of absorption and phase spectral information from spectral ptychographic datasets. It also saves the spectra in file formats compatible with other X-ray absorption spectroscopy data analysis software tools, streamlining integration into existing spectroscopic data analysis pipelines. To illustrate its capabilities, ProSPyX was applied to process the spectral ptychography dataset recently acquired on a nickel wire at the SWING beamline of the SOLEIL synchrotron.
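The post-processing step the abstract describes, extracting spectra from a stack of reconstructed images, can be illustrated conceptually. This is not the ProSPyX API; it is a numpy sketch of the underlying relation: for a complex transmittivity T, the absorption signal is -log|T| and the phase signal is arg(T), each averaged over a region of interest and traced across energy.

```python
import numpy as np

# Conceptual sketch (not ProSPyX code): a stack of complex transmittivity
# images at 16 energies, faked here with an absorption step at a
# hypothetical "edge" energy so the extracted spectrum is recognizable.
energies = np.linspace(8325, 8355, 16)            # eV, illustrative scan range

absorption = 0.2 + 0.6 * (energies > 8340)[:, None, None]   # step at the edge
phase = -0.1 * absorption                                    # toy phase shift
images = np.exp(-absorption + 1j * phase) * np.ones((16, 32, 32))

roi = (slice(10, 20), slice(10, 20))              # pixels covering the sample

# Absorption spectrum: -log|T| averaged over the ROI, per energy.
mu = -np.log(np.abs(images[(slice(None),) + roi])).mean(axis=(1, 2))
# Phase spectrum: arg(T) averaged over the ROI, per energy.
phi = np.angle(images[(slice(None),) + roi]).mean(axis=(1, 2))
# mu traces the absorption step across the edge; phi the phase shift.
```

A real dataset adds probe retrieval, alignment and normalization before this step, which is exactly the bookkeeping the package automates.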




processing

A distributed data processing scheme based on Hadoop for synchrotron radiation experiments

With the development of synchrotron radiation sources and high-frame-rate detectors, the amount of experimental data collected at synchrotron radiation beamlines has increased exponentially. As a result, data processing for synchrotron radiation experiments has entered the era of big data. It is becoming increasingly important for beamlines to have the capability to process large-scale data in parallel to keep up with the rapid growth of data. Currently, there is no set of data processing solutions based on the big data technology framework for beamlines. Apache Hadoop is a widely used distributed system architecture for solving the problem of massive data storage and computation. This paper presents a set of distributed data processing schemes for beamlines with experimental data using Hadoop. The Hadoop Distributed File System is utilized as the distributed file storage system, and Hadoop YARN serves as the resource scheduler for the distributed computing cluster. A distributed data processing pipeline that can carry out massively parallel computation is designed and developed using Hadoop Spark. The entire data processing platform adopts a distributed microservice architecture, which makes the system easy to expand, reduces module coupling and improves reliability.
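The property the pipeline exploits is that each detector frame can be reduced independently, so the work distributes trivially across a cluster. Spark expresses this as a map over a distributed collection of frames; the sketch below uses a plain thread pool purely as a stand-in for that map-over-frames pattern, with a hypothetical per-frame reduction.

```python
from concurrent.futures import ThreadPoolExecutor

def reduce_frame(frame):
    """Per-frame reduction (hypothetical): integrate total counts."""
    return sum(frame)

# Stand-in detector frames; in practice these stream in from the beamline.
frames = [[i, i + 1, i + 2] for i in range(100)]

# Embarrassingly parallel map: no frame depends on any other.
with ThreadPoolExecutor(max_workers=8) as pool:
    totals = list(pool.map(reduce_frame, frames))

# In Spark the same step would be written roughly as
#   sc.parallelize(frames).map(reduce_frame).collect()
# with HDFS providing the storage and YARN the scheduling, per the paper.
```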




processing

StreamSAXS: a Python-based workflow platform for processing streaming SAXS/WAXS data

StreamSAXS is a Python-based small- and wide-angle X-ray scattering (SAXS/WAXS) data analysis workflow platform with graphical user interface (GUI). It aims to provide an interactive and user-friendly tool for analysis of both batch data files and real-time data streams. Users can easily create customizable workflows through the GUI to meet their specific needs. One characteristic of StreamSAXS is its plug-in framework, which enables developers to extend the built-in workflow tasks. Another feature is the support for both already acquired and real-time data sources, allowing StreamSAXS to function as an offline analysis platform or be integrated into large-scale acquisition systems for end-to-end data management. This paper presents the core design of StreamSAXS and provides user cases demonstrating its utilization for SAXS/WAXS data analysis in offline and online scenarios.
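The plug-in framework the abstract highlights follows a common pattern: tasks register under a name, and a workflow is an ordered list of configured tasks applied to each frame, whether read from files or a live stream. The sketch below illustrates that generic pattern only; all names are hypothetical, not the actual StreamSAXS API.

```python
# Generic plug-in registry pattern (illustrative, not StreamSAXS code).
TASK_REGISTRY = {}

def register_task(name):
    """Class decorator: make a task available to workflows by name."""
    def wrap(cls):
        TASK_REGISTRY[name] = cls
        return cls
    return wrap

@register_task("subtract_background")
class SubtractBackground:
    def __init__(self, level):
        self.level = level
    def __call__(self, frame):
        return [max(v - self.level, 0) for v in frame]

@register_task("total_intensity")
class TotalIntensity:
    def __call__(self, frame):
        return sum(frame)

def run_workflow(steps, frame):
    """steps: list of (task name, kwargs) pairs, applied in order."""
    for name, kwargs in steps:
        frame = TASK_REGISTRY[name](**kwargs)(frame)
    return frame

result = run_workflow(
    [("subtract_background", {"level": 5}), ("total_intensity", {})],
    [10, 4, 7],
)
```

Because tasks are looked up by name, a GUI can assemble workflows from configuration alone, and developers extend the platform by registering new classes rather than editing the core.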




processing

A service-based approach to cryoEM facility processing pipelines at eBIC

Electron cryo-microscopy image-processing workflows are typically composed of elements that may, broadly speaking, be categorized as high-throughput workloads which transition to high-performance workloads as preprocessed data are aggregated. The high-throughput elements are of particular importance in the context of live processing, where an optimal response is highly coupled to the temporal profile of the data collection. In other words, each movie should be processed as quickly as possible at the earliest opportunity. The high level of disconnected parallelization in the high-throughput problem directly allows a completely scalable solution across a distributed computer system, with the only technical obstacle being an efficient and reliable implementation. The cloud computing frameworks primarily developed for the deployment of high-availability web applications provide an environment with a number of appealing features for such high-throughput processing tasks. Here, an implementation of an early-stage processing pipeline for electron cryotomography experiments using a service-based architecture deployed on a Kubernetes cluster is discussed in order to demonstrate the benefits of this approach and how it may be extended to scenarios of considerably increased complexity.




processing

Advanced exploitation of unmerged reflection data during processing and refinement with autoPROC and BUSTER

The validation of structural models obtained by macromolecular X-ray crystallography against experimental diffraction data, whether before deposition into the PDB or after, is typically carried out exclusively against the merged data that are eventually archived along with the atomic coordinates. It is shown here that the availability of unmerged reflection data enables valuable additional analyses to be performed that yield improvements in the final models, and tools are presented to implement them, together with examples of the results to which they give access. The first example is the automatic identification and removal of image ranges affected by loss of crystal centering or by excessive decay of the diffraction pattern as a result of radiation damage. The second example is the `reflection-auditing' process, whereby individual merged data items showing especially poor agreement with model predictions during refinement are investigated thanks to the specific metadata (such as image number and detector position) that are available for the corresponding unmerged data, potentially revealing previously undiagnosed instrumental, experimental or processing problems. The third example is the calculation of so-called F(early) − F(late) maps from carefully selected subsets of unmerged amplitude data, which can not only highlight the location and extent of radiation damage but can also provide guidance towards suitable fine-grained parametrizations to model the localized effects of such damage.
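The third example relies on the fact that unmerged observations retain their image numbers, so a reflection can be merged separately from early-dose and late-dose images and the amplitudes differenced. A toy sketch of that selection-and-merge step, with invented data and simple averaging in place of proper scaling and weighting:

```python
from collections import defaultdict

# Unmerged observations: ((h, k, l), image number, amplitude). Invented data.
observations = [
    ((1, 0, 0), 5, 100.0), ((1, 0, 0), 8, 98.0),    # early images
    ((1, 0, 0), 90, 80.0), ((1, 0, 0), 95, 78.0),   # late images
    ((0, 2, 0), 3, 50.0), ((0, 2, 0), 92, 49.0),
]

def merge(obs):
    """Merge by simple averaging per unique hkl (real merging weights and
    scales the observations; this is only the skeleton of the idea)."""
    groups = defaultdict(list)
    for hkl, _, amp in obs:
        groups[hkl].append(amp)
    return {hkl: sum(a) / len(a) for hkl, a in groups.items()}

EARLY_CUTOFF, LATE_CUTOFF = 10, 85        # image-range selection
f_early = merge(o for o in observations if o[1] <= EARLY_CUTOFF)
f_late = merge(o for o in observations if o[1] >= LATE_CUTOFF)

delta_f = {hkl: f_early[hkl] - f_late[hkl]
           for hkl in f_early if hkl in f_late}
# (1,0,0) drops by 20, flagging a reflection strongly affected by
# radiation damage; (0,2,0) barely changes.
```

A map computed from such ΔF coefficients localizes the damage in real space, which is what guides the fine-grained parametrizations the abstract mentions.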




processing

EMhub: a web platform for data management and on-the-fly processing in scientific facilities

Most scientific facilities produce large amounts of heterogeneous data at a rapid pace. Managing users, instruments, reports and invoices presents additional challenges. To address these challenges, EMhub, a web platform designed to support the daily operations and record-keeping of a scientific facility, has been introduced. EMhub enables the easy management of user information, instruments, bookings and projects. The application was initially developed to meet the needs of a cryoEM facility, but its functionality and adaptability have proven to be broad enough to be extended to other data-generating centers. The expansion of EMhub is enabled by the modular nature of its core functionalities. The application allows external processes to be connected via a REST API, automating tasks such as folder creation, user and password generation, and the execution of real-time data-processing pipelines. EMhub has been used for several years at the Swedish National CryoEM Facility and has been installed in the CryoEM center at the Structural Biology Department at St. Jude Children's Research Hospital. A fully automated single-particle pipeline has been implemented for on-the-fly data processing and analysis. At St. Jude, the X-Ray Crystallography Center and the Single-Molecule Imaging Center have already expanded the platform to support their operational and data-management workflows.




processing

Phase quantification using deep neural network processing of XRD patterns

Mineral identification and quantification are key to the understanding and, hence, the capacity to predict material properties. The method of choice for mineral quantification is powder X-ray diffraction (XRD), generally using a Rietveld refinement approach. However, a successful Rietveld refinement requires preliminary identification of the phases that make up the sample. This is generally carried out manually, and this task becomes extremely long or virtually impossible in the case of very large datasets such as those from synchrotron X-ray diffraction computed tomography. To circumvent this issue, this article proposes a novel neural network (NN) method for automating phase identification and quantification. An XRD pattern calculation code was used to generate large datasets of synthetic data that are used to train the NN. This approach offers significant advantages, including the ability to construct databases with a substantial number of XRD patterns and the introduction of extensive variability into these patterns. To enhance the performance of the NN, a specifically designed loss function for proportion inference was employed during the training process, offering improved efficiency and stability compared with traditional functions. The NN, trained exclusively with synthetic data, proved its ability to identify and quantify mineral phases on synthetic and real XRD patterns. Trained NN errors were equal to 0.5% for phase quantification on the synthetic test set, and 6% on the experimental data, in a system containing four phases of contrasting crystal structures (calcite, gibbsite, dolomite and hematite). The proposed method is freely available on GitHub and allows for major advances since it can be applied to any dataset, regardless of the mineral phases present.
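The synthetic-data idea at the heart of the approach can be sketched without the network itself: a mixed pattern is a fraction-weighted sum of pure-phase patterns plus noise, and the training set is generated by sampling random fractions. In the sketch below, a linear least-squares inversion stands in for the trained neural network so the example stays self-contained; peak positions are illustrative, not refined values.

```python
import numpy as np

rng = np.random.default_rng(1)
two_theta = np.linspace(5, 80, 500)     # diffraction angle grid (degrees)

def phase_pattern(peak_positions):
    """One pure-phase XRD pattern: unit-height Gaussian peaks."""
    return sum(np.exp(-0.5 * ((two_theta - p) / 0.3) ** 2)
               for p in peak_positions)

# Four contrasting phases, echoing the paper's calcite/gibbsite/dolomite/
# hematite system (peak positions here are only illustrative).
pure = np.stack([
    phase_pattern([23.1, 29.4, 39.4]),
    phase_pattern([18.3, 20.3, 37.6]),
    phase_pattern([24.1, 31.0, 41.1]),
    phase_pattern([24.2, 33.2, 35.6]),
])

# One synthetic training sample: random fractions, weighted sum, noise.
true_w = rng.dirichlet(np.ones(4))
pattern = true_w @ pure + rng.normal(0, 0.005, two_theta.size)

# Stand-in for the trained NN: invert the linear mixing model.
w_hat, *_ = np.linalg.lstsq(pure.T, pattern, rcond=None)
w_hat = np.clip(w_hat, 0, None)
w_hat /= w_hat.sum()                    # proportions must sum to 1
# w_hat recovers true_w to within the noise level.
```

The paper's contribution is precisely where this toy breaks down: a network trained on large, deliberately varied synthetic sets (peak shifts, preferred orientation, background) generalizes to real patterns where a fixed linear model would not.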




processing

Tracking copper nanofiller evolution in polysiloxane during processing into SiOC ceramic

Polymer-derived ceramics (PDCs) remain at the forefront of research for a variety of applications including ultra-high-temperature ceramics, energy storage and functional coatings. Despite their wide use, questions remain about the complex structural transition from polymer to ceramic and how local structure influences the final microstructure and resulting properties. This is further complicated when nanofillers are introduced to tailor structural and functional properties, as nanoparticle surfaces can interact with the matrix and influence the resulting structure. The inclusion of crystalline nanofiller produces a mixed crystalline–amorphous composite, which poses characterization challenges. With this study, we aim to address these challenges with a local-scale structural study that probes changes in a polysiloxane matrix with incorporated copper nanofiller. Composites were processed at three unique temperatures to capture mixing, pyrolysis and initial crystallization stages for the pre-ceramic polymer. We observed the evolution of the nanofiller with electron microscopy and applied synchrotron X-ray diffraction with differential pair distribution function (d-PDF) analysis to monitor changes in the matrix's local structure and interactions with the nanofiller. The application of the d-PDF to PDC materials is novel and informs future studies to understand interfacial interactions between nanofiller and matrix throughout PDC processing.




processing

Automated pipeline processing X-ray diffraction data from dynamic compression experiments on the Extreme Conditions Beamline of PETRA III

Presented and discussed here is the implementation of a software solution that provides prompt X-ray diffraction data analysis during fast dynamic compression experiments conducted within the dynamic diamond anvil cell technique. It includes efficient data collection, streaming of data and metadata to a high-performance cluster (HPC), fast azimuthal data integration on the cluster, and tools for controlling the data processing steps and visualizing the data using the DIOPTAS software package. This data processing pipeline is invaluable for a great number of studies. The potential of the pipeline is illustrated with two examples of data collected on ammonia–water mixtures and multiphase mineral assemblies under high pressure. The pipeline is designed to be generic in nature and could be readily adapted to provide rapid feedback for many other X-ray diffraction techniques, e.g. large-volume press studies, in situ stress/strain studies, phase transformation studies, chemical reactions studied with high-resolution diffraction etc.
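The compute-heavy step the pipeline offloads to the cluster is azimuthal integration: collapsing each 2D diffraction image into a 1D intensity-versus-radius profile. A minimal numpy version of that reduction is sketched below; DIOPTAS and the production pipeline of course use full detector geometry and calibration (pyFAI-style), which this toy omits.

```python
import numpy as np

def azimuthal_integrate(image, center, n_bins=100):
    """Mean intensity in concentric radial bins around the beam center."""
    ny, nx = image.shape
    y, x = np.mgrid[0:ny, 0:nx]
    r = np.hypot(x - center[0], y - center[1]).ravel()
    bins = np.linspace(0, r.max(), n_bins + 1)
    idx = np.clip(np.digitize(r, bins) - 1, 0, n_bins - 1)
    counts = np.bincount(idx, minlength=n_bins)
    sums = np.bincount(idx, weights=image.ravel(), minlength=n_bins)
    radii = 0.5 * (bins[:-1] + bins[1:])          # bin-center radii
    profile = sums / np.maximum(counts, 1)        # mean per bin
    return radii, profile

# Synthetic test image: a single Debye-Scherrer ring at radius 30 px.
ny = nx = 128
yy, xx = np.mgrid[0:ny, 0:nx]
rr = np.hypot(xx - 64, yy - 64)
image = np.exp(-0.5 * ((rr - 30) / 1.5) ** 2)

radii, profile = azimuthal_integrate(image, center=(64, 64))
# The 1D profile peaks near r = 30, recovering the ring position.
```

Because every image integrates independently, this is exactly the kind of step that streams cleanly to an HPC cluster for prompt feedback during a fast compression run.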




processing

Perfecting artisan bread processing

Ingredients and processing play a 50/50 role in the quality production of artisan bread. A simple ingredient list and simple processing are helping boost this style of bread’s popularity amid the natural and clean label trend.




processing

Mariani opts for Key Technology optical sorter for fruit processing

The newly installed equipment is rated to sort up to 20,000 lbs of product per hour.




processing

PACK EXPO Las Vegas spotlights packaging, processing advances for snack and bakery

When shopping for packaging materials and technology, a snack or bakery pro can’t just drive down to the nearest mall or log onto Amazon to have something delivered to their front door. 




processing

PACK EXPO 2024 delivers access to packaging and processing advances

PACK EXPO 2024 offers snack and bakery pros the chance to dive into the latest packaging and processing solutions.




processing

New bakery mixers offer automation, sanitation, and continuous processing

Mixers serve essential functions for snack and bakery companies creating their own dough. We ask a lot of this category of heavy-duty equipment, including reliability and efficiency, to maintain established product quality-control standards.




processing

Gericke USA Turbo Compact Mixing Module meets ATEX requirements for explosion-proof processing

The Turbo Compact Mixing (TCM) module from process equipment manufacturer Gericke USA, Somerset, NJ, meets ATEX requirements for operation in hazardous environments. 




processing

High-tech almond processing at Blue Diamond Growers

Going for LEED Silver certification, Blue Diamond Growers’ new, state-of-the-art almond production plant in Turlock, CA, features nearly 200,000 sq. ft. of sophisticated almond technology, setting new standards of excellence for food safety, processing and building innovation. 




processing

Processing Technology: Turning up the Heat

Reliability, flexibility, consistent product quality, easy sanitation and high-volume output are just some of the things bakers and snack producers want from a new oven or fryer. 




processing

PROCESS EXPO 2017 review: ongoing processing innovations

PROCESS EXPO 2017 took place September 19–22 at McCormick Place in Chicago. The biennial event, operated by the Food Processing Suppliers Association, attracts food industry professionals from around the world. 




processing

BuyFin payment processing and consumer financing now available to help business owners unlock growth

BuyFin Payment Processing, which is offered in conjunction with third-party partners, helps business owners get paid faster from customers with competitive processing rates and a best-in-class technology platform, the company states.




processing

Maintaining hygiene regulations in food processing

This relentless commitment to cleanliness is critical in navigating the complex regulatory compliance landscape, ensuring your facility stands out for all the right reasons. 




processing

High-pressure processing ensures product safety, extended shelf life for beverages

With a multitude of benefits, high-pressure processing (HPP) technology is used by beverage-makers for products like cold-pressed juices and cold brew coffees, experts share.