model

SAD phasing of XFEL data depends critically on the error model

A nonlinear least-squares method for refining a parametric expression describing the estimated errors of reflection intensities in serial crystallographic (SX) data is presented. This approach, which is similar to that used in the rotation method of crystallographic data collection at synchrotrons, propagates error estimates from photon-counting statistics to the merged data. Here, it is demonstrated that the application of this approach to SX data provides better SAD phasing ability, enabling the autobuilding of a protein structure that had previously failed to be built. Estimating the error in the merged reflection intensities requires the understanding and propagation of all of the sources of error arising from the measurements. One type of error, which is well understood, is the counting error introduced when the detector counts X-ray photons. Thus, if other types of random errors (such as readout noise) as well as uncertainties in systematic corrections (such as from X-ray attenuation) are completely understood, they can be propagated along with the counting error, as appropriate. In practice, most software packages propagate as much error as they know how to model and then include error-adjustment terms that scale the error estimates until they explain the variance among the measurements. If this is performed carefully, then during SAD phasing likelihood-based approaches can make optimal use of these error estimates, increasing the chance of a successful structure solution. In serial crystallography, SAD phasing has remained challenging, with the few examples of de novo protein structure solution each requiring many thousands of diffraction patterns. 
Here, the effects of different methods of treating the error estimates are assessed, and it is shown that a parametric approach with terms proportional to the known experimental uncertainty, the reflection intensity and the squared reflection intensity improves the error estimates and can allow SAD phasing even from a weak zinc anomalous signal.
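A minimal sketch of such a parametric error model, assuming the common three-term form (the parameter names sdfac, sdb and sdadd follow the convention used by several scaling programs and are illustrative; the exact functional form refined in the paper may differ):

```python
def adjusted_variance(var_counting, intensity, sdfac, sdb, sdadd):
    # Inflate the counting-statistics variance with terms proportional
    # to I and I**2, then scale the whole estimate by sdfac**2.
    return sdfac ** 2 * (var_counting + sdb * intensity + (sdadd * intensity) ** 2)


# With all adjustment terms switched off, the counting error is unchanged.
print(adjusted_variance(4.0, 100.0, 1.0, 0.0, 0.0))  # 4.0
```

In practice the three parameters would be refined by nonlinear least squares until the adjusted variances explain the observed spread among repeated measurements of symmetry-equivalent reflections.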




model

Deriving and refining atomic models in crystallography and cryo-EM: the latest Phenix tools to facilitate structure analysis




model

Extending the scope of coiled-coil crystal structure solution by AMPLE through improved ab initio modelling

The phase problem remains a major barrier to overcome in protein structure solution by X-ray crystallography. In recent years, new molecular-replacement approaches using ab initio models and ideal secondary-structure components have greatly contributed to the solution of novel structures in the absence of clear homologues in the PDB or experimental phasing information. This has been particularly successful for highly α-helical structures, and especially coiled-coils, in which the relatively rigid α-helices provide very useful molecular-replacement fragments. This has been seen within the program AMPLE, which uses clustered and truncated ensembles of numerous ab initio models in structure solution, and which has already proved successful for α-helical and coiled-coil structures. Here, an expansion in the scope of coiled-coil structure solution by AMPLE is reported, which has been achieved through general improvements in the pipeline, the removal of tNCS correction in molecular replacement and two improved methods for ab initio modelling. Of the latter improvements, enforcing the modelling of elongated helices overcame the bias towards globular folds and provided a rapid method (equivalent to the time requirements of the existing modelling procedures in AMPLE) for enhanced solution. Further, the modelling of two-, three- and four-helical oligomeric coiled-coils, and the use of full/partial oligomers in molecular replacement, provided additional success in difficult and lower resolution cases. Together, these approaches have enabled the solution of a number of parallel/antiparallel dimeric, trimeric and tetrameric coiled-coils at resolutions as low as 3.3 Å, and have thus overcome previous limitations in AMPLE and provided a new functionality in coiled-coil structure solution at lower resolutions. These new approaches have been incorporated into a new release of AMPLE in which automated elongated monomer and oligomer modelling may be activated by selecting `coiled-coil' mode.




model

Towards the spatial resolution of metalloprotein charge states by detailed modeling of XFEL crystallographic diffraction

Oxidation states of individual metal atoms within a metalloprotein can be assigned by examining X-ray absorption edges, which shift to higher energy for progressively more positive valence numbers. Indeed, X-ray crystallography is well suited for such a measurement, owing to its ability to spatially resolve the scattering contributions of individual metal atoms that have distinct electronic environments contributing to protein function. However, as the magnitude of the shift is quite small, about +2 eV per valence state for iron, it has only been possible to measure the effect when performed with monochromated X-ray sources at synchrotron facilities with energy resolutions in the range 2–3 × 10⁻⁴ (ΔE/E). This paper tests whether X-ray free-electron laser (XFEL) pulses, which have a broader bandpass (ΔE/E = 3 × 10⁻³) when used without a monochromator, might also be useful for such studies. The program nanoBragg is used to simulate serial femtosecond crystallography (SFX) diffraction images with sufficient granularity to model the XFEL spectrum, the crystal mosaicity and the wavelength-dependent anomalous scattering factors contributed by two differently charged iron centers in the 110-amino-acid protein, ferredoxin. Bayesian methods are then used to deduce, from the simulated data, the most likely X-ray absorption curves for each metal atom in the protein, which agree well with the curves chosen for the simulation. The data analysis relies critically on the ability to measure the incident spectrum for each pulse, and also on the nanoBragg simulator to predict the size, shape and intensity profile of Bragg spots based on an underlying physical model that includes the absorption curves, which are then modified to produce the best agreement with the simulated data. This inference methodology potentially enables the use of SFX diffraction for the study of metalloenzyme mechanisms and, in general, offers a more detailed approach to Bragg spot data reduction.




model

The use of local structural similarity of distant homologues for crystallographic model building from a molecular-replacement solution

The performance of automated protein model building usually decreases with resolution, mainly owing to the lower information content of the experimental data. This calls for a more elaborate use of the available structural information about macromolecules. Here, a new method is presented that uses structural homologues to improve the quality of protein models automatically constructed using ARP/wARP. The method uses local structural similarity between deposited models and the model being built, and results in longer main-chain fragments that in turn can be more reliably docked to the protein sequence. The application of the homology-based model extension method to the example of a CFA synthase at 2.7 Å resolution resulted in a more complete model with almost all of the residues correctly built and docked to the sequence. The method was also evaluated on 1493 molecular-replacement solutions at a resolution of 4.0 Å and better that were submitted to the ARP/wARP web service for model building. A significant improvement in the completeness and sequence coverage of the built models has been observed.




model

Estimating local protein model quality: prospects for molecular replacement

Model quality assessment programs estimate the quality of protein models and can be used to estimate local error in protein models. ProQ3D is the most recent and most accurate version of our software. Here, it is demonstrated that it is possible to use local error estimates to substantially increase the quality of the models for molecular replacement (MR). Adjusting the B factors using ProQ3D improved the log-likelihood gain (LLG) score by over 50% on average, resulting in significantly more successful models in MR compared with not using error estimates. On a data set of 431 homology models to address difficult MR targets, models with error estimates from ProQ3D received an LLG of >50 for almost half of the models (209/431; 48.5%), compared with 175/431 (40.6%) for the previous version, ProQ2, and only 74/431 (17.2%) for models with no error estimates, clearly demonstrating the added value of using error estimates to enable MR for more targets. ProQ3D is available from http://proq3.bioinfo.se/ both as a server and as a standalone download.
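As a sketch of how a local error estimate can be converted into a B factor for MR, assuming the error is treated as an isotropic displacement (a common convention; this is an illustration, not necessarily the exact mapping used by ProQ3D):

```python
import math

def error_to_b(rmsd_angstrom):
    # Treat the estimated per-residue coordinate error as an isotropic
    # displacement: B = 8*pi**2 * <u_x**2>, with rmsd**2 = 3*<u_x**2>
    # in three dimensions, giving B = (8*pi**2 / 3) * rmsd**2.
    return (8.0 * math.pi ** 2 / 3.0) * rmsd_angstrom ** 2


# A 1 Angstrom local error corresponds to a B factor of about 26 A^2.
print(round(error_to_b(1.0), 2))  # 26.32
```

Down-weighting poorly predicted regions in this way lets the MR likelihood target rely most on the parts of the model that are expected to be accurate.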




model

Structure analysis of supported disordered molybdenum oxides using pair distribution function analysis and automated cluster modelling

Molybdenum oxides and sulfides on various low-cost high-surface-area supports are excellent catalysts for several industrially relevant reactions. The surface layer structure of these materials is, however, difficult to characterize due to small and disordered MoOx domains. Here, it is shown how X-ray total scattering can be applied to gain insights into the structure through differential pair distribution function (d-PDF) analysis, where the scattering signal from the support material is subtracted to obtain structural information on the supported structure. MoOx catalysts supported on alumina nanoparticles and on zeolites are investigated, and it is shown that the structure of the hydrated molybdenum oxide layer is closely related to that of disordered and polydisperse polyoxometalates. By analysing the PDFs with a large number of automatically generated cluster structures, which are constructed in an iterative manner from known polyoxometalate clusters, information is derived on the structural motifs in supported MoOx.




model

Disorder in La1−xBa1+xGaO4−x/2 ionic conductor: resolving the pair distribution function through insight from first-principles modeling

Ba excess in LaBaGaO4 triggers ionic conductivity together with structural disorder. A direct correlation is found between the density functional theory model energy and the pair distribution function fit residual.




model

Calculation of total scattering from a crystalline structural model based on experimental optics parameters

A calculation procedure for X-ray total scattering and the pair distribution function from a crystalline structural model is presented. It allows one to easily and precisely deal with diffraction-angle-dependent parameters such as the atomic form factor and the resolution of the optics.




model

F-BAR domain protein Syndapin regulates actomyosin dynamics during apical cap remodeling in syncytial Drosophila embryos [SHORT REPORT]

Aparna Sherlekar, Gayatri Mundhe, Prachi Richa, Bipasha Dey, Swati Sharma, and Richa Rikhy

Branched actin networks driven by Arp2/3 collaborate with actomyosin filaments in processes such as cell migration. The syncytial Drosophila blastoderm embryo also shows expansion of apical caps by Arp2/3-driven actin polymerization in interphase and buckling at contact edges by MyosinII to form furrows in metaphase. Here we study the role of Syndapin (Synd), an F-BAR domain-containing protein, in apical cap remodeling prior to furrow extension. synd depletion resulted in larger apical caps. STED super-resolution and TIRF microscopy showed long apical actin protrusions in caps in interphase and short protrusions in metaphase in control embryos. synd depletion led to sustained long protrusions even in metaphase. Loss of Arp2/3 function in synd mutants partly reverted the defects in apical cap expansion and protrusion remodeling. MyosinII levels were decreased in synd mutants, and MyosinII mutant embryos have previously been reported to have expanded caps. We propose that Syndapin function limits branching activity during cap expansion and affects MyosinII distribution in order to shift actin remodeling from apical cap expansion to favor lateral furrow extension.




model

Structure-mining: screening structure models by automated fitting to the atomic pair distribution function over large numbers of models

A new approach is presented to obtain candidate structures from atomic pair distribution function (PDF) data in a highly automated way. It fetches, from web-based structural databases, all the structures meeting the experimenter's search criteria and performs structure refinements on them without human intervention. It supports both X-ray and neutron PDFs. Tests on various material systems show the effectiveness and robustness of the algorithm in finding the correct atomic crystal structure. It works on crystalline and nanocrystalline materials including complex oxide nanoparticles and nanowires, low-symmetry and locally distorted structures, and complicated doped and magnetic materials. This approach could greatly reduce the traditional structure searching work and enable the possibility of high-throughput real-time auto-analysis PDF experiments in the future.
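The screening loop at the heart of such an approach can be sketched as follows (the database fetch and the refinement engine are represented by placeholder callables; `screen_structures` and the toy `rw` function are illustrative names, not the tool's actual API):

```python
def screen_structures(pdf_obs, candidates, refine):
    # candidates: mapping of structure name -> structure model.
    # refine: callable that fits a candidate to the observed PDF and
    # returns a goodness of fit Rw (lower is better). Both stand in
    # for the database query and refinement described above.
    return sorted(candidates, key=lambda name: refine(pdf_obs, candidates[name]))


# Toy stand-in: the "refinement" just measures distance to the observation.
rw = lambda obs, model: abs(obs - model)
print(screen_structures(5.0, {"A": 9.0, "B": 5.5, "C": 2.0}, rw))  # ['B', 'C', 'A']
```

Ranking every fetched candidate by its refined fit residual is what removes the human from the loop: the experimenter only inspects the top of the sorted list.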





model

Using Fossils in Panama to Model Future Climate Change

When Smithsonian Tropical Research Institute paleobotanist Carlos Jaramillo learned that Panama was expanding its canal in 2006 and blasting 100 million tons of rock to […]

The post Using Fossils in Panama to Model Future Climate Change appeared first on Smithsonian Insider.




model

Obtaining the best results: aspects of data collection, model finalization and interpretation of results in small-molecule crystal-structure determination

This article aims to encourage practitioners, young and seasoned, by enhancing their structure-determination toolboxes with a selection of tips and tricks on recognizing and handling aspects of data collection, structure modelling and refinement, and the interpretation of results.




model

Model-independent extraction of the shapes and Fourier transforms from patterns of partially overlapped peaks with extended tails

This work presents a technique for extracting the detailed shape of peaks with extended, overlapping tails in an X-ray powder diffraction pattern. The application discussed here concerns crystallite size broadening, though the technique can be applied to spectra of any origin and without regard to how the profiles are to be subsequently analyzed. Historically, the extraction of profile shapes has been difficult due to the complexity of determining the background under the peak, resulting in an offset of the low-frequency components of the Fourier transform of the peak known as the `hook' problem. The use of a carefully considered statistical weighting function in a non-linear least-squares fit, followed by summing the residuals from such a fit with the fit itself, allows one to extract the full shape of an isolated peak, without contributions from either the background or adjacent peaks. The extracted shape, consisting of the fit function recombined with the residuals, is not dependent on any specific shape model. The application of this to analysis of microstructure is performed independently of global parametric models, which would reduce the number of refined parameters; therefore the technique requires high-quality data to produce results of interest. The effectiveness of the technique is demonstrated by extraction of Fourier transforms of peaks from two sets of size-broadened materials with two differing pieces of equipment.
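The fit-plus-residuals recombination step can be sketched as follows (a toy linear model with a fixed peak shape plus a linear background stands in for the paper's non-linear weighted fit; all names are illustrative):

```python
import numpy as np

def extract_peak_shape(x, y, peak_shape):
    # Fit y ~ a*peak_shape(x) + b + c*x by linear least squares
    # (peak amplitude plus a linear background).
    A = np.column_stack([peak_shape(x), np.ones_like(x), x])
    (a, b, c), *_ = np.linalg.lstsq(A, y, rcond=None)
    fitted_peak = a * peak_shape(x)
    residuals = y - (fitted_peak + b + c * x)
    # Recombining fit and residuals frees the extracted shape from the
    # fit model: any misfit of the assumed shape survives in the output.
    return fitted_peak + residuals


x = np.linspace(-5, 5, 201)
true_peak = np.exp(-x ** 2)          # the shape we want to recover
y = 2.0 * true_peak + 0.3 + 0.1 * x  # peak sitting on a sloping background
extracted = extract_peak_shape(x, y, lambda t: np.exp(-t ** 2))
print(np.allclose(extracted, 2.0 * true_peak))  # True
```

Because the background model, not the peak model, is the only thing subtracted, the extracted profile keeps its full tails and can be Fourier transformed without the `hook' offset.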




model

Modeling of energy-dispersive X-ray diffraction for high-symmetry crystal orientation

The methods for X-ray crystal orientation are rapidly evolving towards versatility, fewer goniometry measurements, automation, and high accuracy and precision. One method that attracts a lot of attention is energy-dispersive X-ray diffraction (EDXRD), which is based on detecting reflections from crystallographic planes in a crystal at fixed angles of a parallel polychromatic incident X-ray beam. In principle, the position of an EDXRD peak in a diffraction pattern depends both on the d-spacing of a crystallographic plane and on the orientation of that plane relative to a fixed direction in space. This makes it possible to measure the orientation of single crystals. The article presents a model for the EDXRD method whose main feature is a nonmoving crystal, in contrast to traditional goniometry, where angle measurements of the diffracting planes are required. The article defines the equation of orientation for the method and shows its derivation in detail. It is shown that exact solutions of the equations can be obtained using the generalized reduced gradient method, a mathematical subroutine implemented in Excel. The significance and scientific impact of the work are discussed along with validated test results.
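The fixed-angle, polychromatic geometry underlying EDXRD is captured by the energy-dispersive form of Bragg's law, E = hc/(2d sinθ); a small sketch of that relation (illustrative only, not the article's orientation equation):

```python
import math

HC_KEV_ANGSTROM = 12.398  # h*c, in keV * Angstrom

def reflection_energy_kev(d_spacing_angstrom, two_theta_deg):
    # Energy-dispersive Bragg condition: the detector sits at a fixed
    # scattering angle, and each d-spacing selects its own photon
    # energy out of the white beam: E = hc / (2 d sin(theta)).
    theta = math.radians(two_theta_deg / 2.0)
    return HC_KEV_ANGSTROM / (2.0 * d_spacing_angstrom * math.sin(theta))


# A d = 2 Angstrom plane viewed at a fixed 2-theta of 30 degrees
# diffracts photons of about 12 keV.
print(round(reflection_energy_kev(2.0, 30.0), 2))  # 11.98
```

Because the peak energy shifts with both d-spacing and plane orientation at a fixed detector angle, the measured spectrum itself encodes the crystal orientation.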




model

Aspherical scattering factors for SHELXL – model, implementation and application

A new aspherical scattering factor formalism has been implemented in the crystallographic least-squares refinement program SHELXL. The formalism relies on Gaussian functions and can optionally complement the independent atom model to take into account the deformation of electron-density distribution due to chemical bonding and lone pairs. Asphericity contributions were derived from the electron density obtained from quantum-chemical density functional theory computations of suitable model compounds that contain particular chemical environments, as defined by the invariom formalism. Thanks to a new algorithm, invariom assignment for refinement in SHELXL is automated. A suitable parameterization for each chemical environment within the new model was achieved by metaheuristics. Figures of merit, precision and accuracy of crystallographic least-squares refinements improve significantly upon using the new model.




model

Ultrafast calculation of diffuse scattering from atomistic models

Diffuse scattering is a rich source of information about disorder in crystalline materials, which can be modelled using atomistic techniques such as Monte Carlo and molecular dynamics simulations. Modern X-ray and neutron scattering instruments can rapidly measure large volumes of diffuse-scattering data. Unfortunately, current algorithms for atomistic diffuse-scattering calculations are too slow to model large data sets completely, because the fast Fourier transform (FFT) algorithm has long been considered unsuitable for such calculations [Butler & Welberry (1992). J. Appl. Cryst. 25, 391–399]. Here, a new approach is presented for ultrafast calculation of atomistic diffuse-scattering patterns. It is shown that the FFT can actually be used to perform such calculations rapidly, and that a fast method based on sampling theory can be used to reduce high-frequency noise in the calculations. These algorithms are benchmarked using realistic examples of compositional, magnetic and displacive disorder. They accelerate the calculations by a factor of at least 10², making refinement of atomistic models to large diffuse-scattering volumes practical.
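The core quantity can be illustrated with NumPy's FFT on a gridded scattering density (a toy sketch under the usual definition of diffuse scattering as the ensemble variance of the structure factor; the paper's algorithms add the sampling-theory noise reduction and operate on far larger models):

```python
import numpy as np

def diffuse_intensity(configs):
    # Diffuse scattering from an ensemble of disorder configurations:
    # I_diffuse(q) = <|F(q)|**2> - |<F(q)>|**2, with F computed by FFT.
    Fs = np.array([np.fft.fftn(c) for c in configs])
    total = np.mean(np.abs(Fs) ** 2, axis=0)   # <|F|^2>, total scattering
    bragg = np.abs(np.mean(Fs, axis=0)) ** 2   # |<F>|^2, Bragg component
    return total - bragg


# A perfectly ordered ensemble (identical configurations) produces
# no diffuse scattering: the total equals the Bragg component.
grid = np.zeros((8, 8))
grid[2, 3] = 1.0
print(np.allclose(diffuse_intensity([grid, grid]), 0.0))  # True
```

Subtracting the Bragg component |⟨F⟩|² from the ensemble-averaged total isolates exactly the disorder signal that the atomistic simulations are refined against.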




model

Simulink - Update diagram fails for referenced model when anonymous structure type matches multiple bus types

In a Model block, if the instance-specific value of a model argument has an anonymous structure type, an update diagram reports an error when there are multiple bus types that match that anonymous structure type. This bug exists in the following release(s):
R2020a

This bug has a workaround





model

Simulink - Incorrect Code Generation: In a model containing blocks from the SoC Blockset and asynchronous sample time, the sorted order might be incorrect

Simulink might produce an incorrect sorted order for a model that meets all of the following conditions:

  • The model contains blocks from the SoC Blockset
  • The Signal logging option is selected in the model configuration set
  • Signals using asynchronous sample time are configured for logging
As a result, Simulink might produce incorrect results in Normal, Accelerator, and Rapid Accelerator simulation modes as well as in generated code.
This bug exists in the following release(s):
R2020a





model

Obtaining the best results: aspects of data collection, model finalization and interpretation of results in small-molecule crystal-structure determination

In small-molecule single-crystal structure determination, we now have at our disposal an inspiring range of fantastic diffractometers with better, brighter sources, and faster, more sensitive detectors. Faster and more powerful computers provide integrated tools and software with impressive graphical user interfaces. Yet these tools can lead to the temptation not to check the work thoroughly, and one can too easily overlook tell-tale signs that something might be amiss in a structure determination; validation with checkCIF is not always revealing. This article aims to encourage practitioners, young and seasoned, by enhancing their structure-determination toolboxes with a selection of tips and tricks on recognizing and handling aspects that one should constantly be aware of. Topics include a pitfall when setting up data collections, the usefulness of reciprocal lattice layer images, processing twinned data, tips for disorder modelling and the use of restraints, ensuring hydrogen atoms are added to a model correctly, validation beyond checkCIF, and the derivation and interpretation of the final results.




model

Structure-mining: screening structure models by automated fitting to the atomic pair distribution function over large numbers of models

Structure-mining finds and returns the best-fit structures from structural databases given a measured pair distribution function data set. Using databases and heuristics for automation it has the potential to save experimenters a large amount of time as they explore candidate structures from the literature.




model

Direct recovery of interfacial topography from coherent X-ray reflectivity: model calculations for a one-dimensional interface

The inversion of X-ray reflectivity to reveal the topography of a one-dimensional interface is evaluated through model calculations.




model

QuadPay, Stripe work together for instalment model solution

Payment instalment platform QuadPay has announced it is...




model

Virtual Clinical Trials - A New Model for Patient Engagement

For some patients, the ability to participate in a clinical trial from the comfort of one’s home is becoming a reality.




model

COVID-19: The Latest With Physician, Models Predict Significant Increase In U.S. Cases

A cleaning crew disinfects a New York City subway train on May 4, 2020 in New York City. Credit: Stephanie Keith/Getty Images

AirTalk®

As of Monday afternoon, L.A. County has at least 1,260 deaths and 26,238 confirmed cases of coronavirus. The United States has more than a million cases of the virus with more than 67,000 deaths. Meanwhile, new models put together by FEMA project that we could see up to 200,000 new cases a day by the end of the month, according to the New York Times.

The L.A. Times reports that scientists have discovered a new strain of the deadly coronavirus that is even more contagious. The study finds that the new strain first appeared in February in Europe and has been the dominant strain across the world since mid-March. Plus, some COVID-19 patients are experiencing issues with blood clotting even after respiratory issues have died down. Today on AirTalk, we get the latest with an infectious disease specialist who will take your questions. Call 866-893-5722 to join the conversation. 

With files from LAist. Read the full story here.

Guest:

Dean Blumberg, M.D., professor of medicine and chief of Pediatric Infectious Diseases at UC Davis Children’s Hospital

This content is from Southern California Public Radio. View the original story at SCPR.org.




model

Image modeling for biomedical organs

Image modeling for biomedical organs




model

Minecraft's business model is 'leave users alone' — will it be Microsoft's?

Will Davidson and his Minecraft creation, modeled off the Santa Cruz Mission. Credit: Steve Henn

Minecraft is a deceptively simple video game. You're dropped into a virtual world, and you get to build things. It's like a digital Lego set, but with infinite pieces.

Its simplicity makes it a big hit with kids, like 10-year-old Will Davidson. Last year, Will built a Spanish mission for a school report. He modeled his off the Santa Cruz Mission. "I made a chapel over here," Davidson says. "I also have a bell tower."

After he turned in his report, he added a few things. Like skeleton archers. "And zombies ... and exploding things, and spiders, that try to kill you," he said.

Minecraft is popular with kids because they're free to create almost anything, says Ramin Shokrizade, a game designer.

Also, kids aren't manipulated into clicking buttons to buy add-ons within the game. In other games, designers give players a special power for free at first, then take it away and offer it back at a price.

Zynga, the creator of Farmville, calls this "fun pain," according to Shokrizade. "That's the idea that, if you make the consumer uncomfortable enough, and then tell them that for money we'll make you less uncomfortable, then [they] will give us money," he says.

Kids, Shokrizade says, are especially susceptible to this — and Minecraft has a loyal following, in part, because it doesn't do it.

Susan Linn, from the Campaign for a Commercial-Free Childhood, agrees. She says a big reason she likes Minecraft is because after you purchase the game upfront, that's it.

"Parents don't have to worry that their kids are going to be targeted for more marketing," Linn says. "How forward-thinking!"

But Linn is worried. Microsoft bought Mojang, the company that created Minecraft, on Monday for $2.5 billion, and she says that any time a large company spends billions to acquire a smaller company, executives are bound to start looking for new ways to get even more money out of it.

Copyright 2014 NPR. To see more, visit http://www.npr.org/.





model

Could Building Information Modelling support sustainable building practices?

Building Information Modelling (BIM) can enhance the design of a building, reduce costs and save energy. However, little research has been carried out on its impact on sustainable practices. A US survey illustrates that many practitioners do not see sustainability as a primary application, suggesting that more effort is needed to encourage the integration of ‘green’ design and construction into BIM.




model

A vision and roadmap for integrated environmental modelling

Integrated environmental modelling (IEM) is an organised approach to streamlining the movement of scientific information from its research sources to its application in problem solving, according to a study that envisions a global-scale IEM community. The researchers present a roadmap for the future of IEM, describing issues that could be addressed to develop its potential even further, such as how best to integrate diverse stakeholder perspectives and appropriate guidelines for ‘problem statements’.




model

Managing flood risk: more realistic models need to take account of spatial differences

Effective flood-risk management requires accurate risk-analysis models. Conventional analysis approaches, however, are based on the evaluation of spatially homogeneous scenarios, which do not account for variation in flooding across a river reach/region. Since flood events are often spatially heterogeneous (i.e. unevenly distributed), this paves the way for error. Now, scientists have developed a novel framework for risk analysis that accounts for this heterogeneity, and successfully demonstrated the accuracy of the approach by applying it in a proof-of-concept exercise in Vorarlberg, Austria. By facilitating improved prediction and quantification of flood events, this model is likely to inform future flood-risk management and related decision-making.




model

New model ransomware: need help




model

Covid-19 Heroes: Digitization is creating new revenue models for Apollo Hospitals

A remote consultation app at Apollo is allowing safety for both the patients and the frontline doctors while using AI to improve diagnostics.




model

A more comprehensive ecological risk assessment combines existing models

New research has examined three different categories of Ecological Risk Assessment (ERA), each with different goals. The researchers find that overlaps between the three assessments could be combined to create a more comprehensive form of ERA, usable by regulators and environmental decision makers.




model

Macro-economic models need to widen their perspective

The recent recession has prompted the adoption of 'return to growth' policies but the tools used to assess growth often have a narrow economic focus. A new report has assessed current macro-economic models and suggests they need to incorporate the impact that environmental factors can have on the economy, and vice versa, and recommends they should consider limits on resource and material consumption.




model

Model offers insight into long-term costs and payoff of brownfield redevelopment

It can take six to seven years before the financial benefits of brownfield regeneration projects are realised, according to a new study which focused on redevelopment in Michigan, USA. The study examines liability issues, regulatory concerns, clean-up standards and funding mechanisms, and introduces a new model that informs debate on brownfield redevelopment policies and funding mechanisms.




model

Simple steps to increase the uptake of sustainable service-based business models

‘Product-service systems’ are innovative business models designed to satisfy societal needs in an environmentally sustainable manner. This study explores how government policies could increase the uptake of these systems, outlining five key recommendations to achieve this, including schemes to raise awareness and involve local authorities.




model

New model developed to optimise management of irrigation

Under water restrictions, farmers will achieve the optimal balance of income and efficient water use if they combine the planting of crops that require little water with the planting of more profitable crops that need more water, according to research.




model

Combining behavioural change and game-like incentive models encourages consumers to save water

Domestic water saving is important — not only to address water scarcity and drought, but also to save energy and tackle climate change. Water-management strategies are needed to prevent these shortages, and include incentives to change consumers’ behaviour concerning water use. This study examines the design of a behaviour-change system and a linked incentive model to stimulate a sustainable change in water-consumption behaviour.




model

Biodiversity model includes indirect impact of harvesting wild species

Researchers have developed a new model to estimate the impact of harvesting wild species and land use change on biodiversity. Unlike previous models, it considers the indirect effect of harvesting or pest control on landscape structure through reducing the variety of species.




model

Little Bustard: case study for modelling conservation costs

A new model, named OUTOPIE, could help design more effective agri-environmental schemes. The model links the farm, field and landscape levels to allow a more accurate assessment of the costs of enrolling specific fields in conservation schemes. Using the model, the researchers were able to assess the cost-effectiveness of different policies for the conservation of the Little Bustard bird (Tetrax tetrax) in France.

Baltic nutrient abatement measures identified by hybrid ecological-economic model

Policies to manage marine ecological quality can be improved by combining economic and ecological concerns, finds a new study. Using this integrated perspective, researchers developed a model which identified the most cost-effective options for reducing nutrient pollution in the Baltic Sea within a 40-year time-span. The total cost of meeting the commonly agreed targets is estimated to be €1,487 million annually.
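
The selection logic at the heart of such cost-effectiveness analyses can be sketched as a ranking problem: order abatement measures by cost per unit of nutrient removed and adopt them until the reduction target is met. The measures, costs and reductions below are illustrative assumptions, not figures from the study (which also models ecological dynamics this sketch omits).

```python
# Greedy cost-effectiveness ranking of nutrient-abatement measures.
# All measures and figures are illustrative, not from the study.

def cheapest_measures(measures, target_reduction):
    """Select measures in order of cost per kilotonne of nutrient
    removed until the reduction target is reached."""
    chosen, total_cost, removed = [], 0.0, 0.0
    for name, cost, reduction in sorted(measures, key=lambda m: m[1] / m[2]):
        if removed >= target_reduction:
            break
        chosen.append(name)
        total_cost += cost
        removed += reduction
    return chosen, total_cost, removed

measures = [
    ("upgrade wastewater treatment", 400.0, 20.0),  # M EUR/yr, kt removed
    ("wetland restoration",          150.0, 10.0),
    ("fertiliser reduction",         100.0,  4.0),
]
```

For a hypothetical 25 kt reduction target, the ranking picks wetland restoration first (15 M EUR per kt) and then the wastewater upgrade, meeting the target at the lowest combined cost.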

Better predictions of climate change impact on wildlife thanks to genetically informed modelling

The effects of climate change on the distribution of species can be predicted more accurately by considering the genetic differences between different groups of the same species, a new study suggests. The researchers found that a computer model which incorporated genetic information on different groups of a US tree species was up to 12 times more accurate in predicting tree locations than a non-genetically informed model.

How to model trade-offs between agricultural yield and biodiversity

New research has examined three different categories of Ecological Risk Assessment (ERA), each with different goals, and finds that their overlaps could be combined to create a more comprehensive form of ERA, usable by regulators and environmental decision-makers. There is an inherent trade-off between increasing agricultural production and protecting biodiversity. This study models the effects of biodiversity-conservation agri-environment schemes (AESs) and ecosystem-service-provider schemes, and shows that determining the aim of an agri-environment scheme is key to improving its efficiency. Such optimisation could allow AESs to be rolled out more generally, providing the backbone for both high yields and enhanced farmland biodiversity, say the researchers.

New models to assess developmental toxicity for REACH

In line with EU legislative requirements, new research has developed models to assess the toxicity of chemicals in terms of their effects on human development. One of the models has been made freely accessible online, so that it is easy to use for industry and regulators.

Nitrogen pollution models reviewed

Computer models can be powerful tools when developing policies to address nitrogen pollution from agriculture. In a new study, researchers have made recommendations regarding the best design and use of these models to aid the effective implementation of European legislation on nitrogen.

Modelling emissions of perfluorinated chemicals to the Danube River Basin

The emissions of two perfluoroalkyl acids (PFAAs) into the Danube River Basin have been estimated in a test of four different hypotheses regarding the factors affecting those emissions. The results were used to simulate water concentrations for comparison with measured data. The researchers found that incorporating wastewater treatment information and wealth distribution alongside population data can improve the accuracy of emissions estimates.
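
A minimal sketch of the bottom-up accounting such hypotheses imply: scale a per-capita release by population, then discount the share that passes through wastewater treatment. The factor values in the example are hypothetical, not the study's estimates.

```python
# Toy per-catchment PFAA emission estimate combining population data
# with wastewater treatment information. Illustrative values only.

def pfaa_emission_kg(population, per_capita_g_per_year,
                     fraction_treated, removal_efficiency):
    """Annual emission to the river (kg/yr) for one catchment."""
    raw = population * per_capita_g_per_year / 1000.0  # kg/yr released
    # The treated share is reduced by the plant's removal efficiency;
    # the untreated share reaches the river directly.
    treated = raw * fraction_treated * (1.0 - removal_efficiency)
    untreated = raw * (1.0 - fraction_treated)
    return treated + untreated
```

For a hypothetical catchment of one million people releasing 2 g per person per year, with half the wastewater treated at 90% removal efficiency, the estimate is 1,100 kg/yr — dominated by the untreated share, which is why treatment coverage matters so much to the estimate.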

Advances in freshwater risk assessment: experiences with Biotic Ligand Models

To assess the risk posed by metals in the aquatic environment, Biotic Ligand Models (BLMs) were developed, and are now considered suitable for use in regulatory risk assessments. This study reviews the advantages of BLMs and BLM-based software tools, providing examples from across the EU, and offers recommendations for their widespread implementation.

New computer modelling tool to identify persistent chemicals

Chemicals that persist in the environment can harm humans and wildlife. This study describes a computer modelling-based approach to predict which chemical compounds are likely to be persistent. The models correctly predicted persistence for 11 of the 12 chemicals tested and could provide a cost-effective alternative to laboratory testing.
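
As a rough illustration of threshold-style persistence screening — real models of this kind are typically statistical (QSAR-style) models trained on degradation data — the descriptors and cut-offs below are hypothetical, not the study's:

```python
# Toy persistence screen based on an estimated biodegradation half-life
# and the octanol-water partition coefficient (log Kow).
# Cut-off values are illustrative assumptions, not regulatory criteria.

def likely_persistent(biodeg_half_life_days, log_kow):
    """Flag a compound as likely persistent if its estimated
    half-life exceeds a cut-off, or if a moderate half-life is
    combined with strong partitioning behaviour."""
    return biodeg_half_life_days > 120.0 or (
        biodeg_half_life_days > 60.0 and log_kow > 4.5)
```

In practice such screens are calibrated against measured degradation data, which is what allows model predictions to substitute for some laboratory tests.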

New model for estimating ship emissions to guide policy

EU-supported research has established a new model to calculate air pollution emissions from ships. Its calculations could be used to build a database listing emissions by ship type and size, as well as by country.
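
Bottom-up ship-emission models generally compute emissions as engine power × load factor × operating hours × emission factor; a minimal sketch, with an assumed ship and an assumed NOx emission factor (neither taken from the study):

```python
# Toy power-based ship emission calculation. The ship parameters and
# the emission factor in the example are illustrative assumptions.

def ship_emissions_tonnes(engine_kw, load_factor, hours, ef_g_per_kwh):
    """Emissions of one pollutant for one ship over a period (tonnes)."""
    energy_kwh = engine_kw * load_factor * hours  # engine work delivered
    return energy_kwh * ef_g_per_kwh / 1e6        # grams -> tonnes
```

For a hypothetical 10 MW vessel at 80% load over 5,000 operating hours with a NOx emission factor of 12 g/kWh, the result is 480 tonnes of NOx; summing such per-ship estimates by type, size and flag is what would populate the database the summary describes.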