modelling

Multiscale modelling of CO2 hydrogenation of TiO2-supported Ni8 clusters: on the influence of anatase and rutile polymorphs

Catal. Sci. Technol., 2024, 14, 6393-6410
DOI: 10.1039/D4CY00586D, Paper
Open Access
This article is licensed under a Creative Commons Attribution 3.0 Unported Licence.
Lulu Chen, Ying-Ying Ye, Rozemarijn D. E. Krösschell, Emiel J. M. Hensen, Ivo A. W. Filot
The selection of TiO2 phase, whether anatase or rutile, for supporting small Ni clusters significantly influences the activity and selectivity in CO2 hydrogenation to methane.




modelling

Fatigue and fracture of adhesively-bonded composite joints: behaviour, simulation and modelling / edited by A.P. Vassilopoulos

Online Resource




modelling

Addressing Competitiveness and Carbon Leakage Impacts Arising from Multiple Carbon Markets: A Modelling Assessment - Environment Working Paper No. 58

Competitiveness and carbon leakage have been among the main concerns in the discussion and implementation of climate policies. This paper examines the macroeconomic and sectoral competitiveness and carbon leakage impacts associated with a range of stylised mitigation policy scenarios.




modelling

Integrated Assessment of Climate Change Impacts: Conceptual Frameworks, Modelling Approaches and Research Needs - Environment Working Paper No. 66

This paper presents a framework to include feedbacks from climate impacts on the economy in integrated assessment models. The proposed framework uses a production function approach, which links climate impacts to key variables and parameters used in the specification of economic activity. The paper pays particular attention to the challenges of distinguishing between damages and the costs of adapting to climate change.
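To make the production-function idea concrete, here is a minimal sketch in which a quadratic climate-damage factor scales total factor productivity in a Cobb-Douglas economy. The functional form is standard in integrated assessment modelling, but every coefficient below is hypothetical rather than taken from the paper.

```python
# Illustrative sketch of a production-function approach to climate damages:
# a quadratic damage factor scales output in a Cobb-Douglas economy.
# The damage coefficient and exponents are hypothetical placeholders.

def output(capital, labour, tfp=1.0, alpha=0.3, warming=0.0, damage_coef=0.002):
    """Cobb-Douglas output with a quadratic climate-damage factor."""
    damage = damage_coef * warming ** 2          # fraction of output lost
    return tfp * (1 - damage) * capital ** alpha * labour ** (1 - alpha)

baseline = output(capital=100.0, labour=50.0)
warmed = output(capital=100.0, labour=50.0, warming=3.0)   # 3 degrees of warming
print(f"output loss at 3 degrees: {100 * (1 - warmed / baseline):.1f}%")
```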




modelling

The Macroeconomics of the Circular Economy Transition: A Critical Review of Modelling Approaches - Environment Working Paper

This paper reviews the existing literature on modelling the macroeconomic consequences of the transition to a circular economy. It provides insights into the current state of the art on modelling policies to improve resource efficiency and the transition to a circular economy by examining 24 modelling-based assessments of a circular economy transition.




modelling

Modelling of distribution impacts of energy subsidy reforms: An illustration with Indonesia - Environment Working Paper

This report develops an analytical framework that assesses the macroeconomic, environmental and distributional consequences of energy subsidy reforms. The framework is applied to the case of Indonesia to study the consequences in this country of a gradual phase out of all energy consumption subsidies between 2012 and 2020.




modelling

Jhatpat shoots are the new route to modelling moolah

She is fair, light-eyed and the perfect clothes horse. But on days when regular modelling assignments dry up, she doesn't mind changing 50 outfits and being photographed in a small studio against a plain white backdrop. "Front, side and back, and you are done," says Anushree, 24, a Mumbai model.




modelling

Sequence assignment for low-resolution modelling of protein crystal structures

The performance of automated model building in crystal structure determination usually decreases with the resolution of the experimental data, and may result in fragmented models and incorrect side-chain assignment. Presented here are new methods for machine-learning-based docking of main-chain fragments to the sequence and for their sequence-independent connection using a dedicated library of protein fragments. The combined use of these new methods noticeably increases sequence coverage and reduces fragmentation of the protein models automatically built with ARP/wARP.




modelling

Extending the scope of coiled-coil crystal structure solution by AMPLE through improved ab initio modelling

The phase problem remains a major barrier to overcome in protein structure solution by X-ray crystallography. In recent years, new molecular-replacement approaches using ab initio models and ideal secondary-structure components have greatly contributed to the solution of novel structures in the absence of clear homologues in the PDB or experimental phasing information. This has been particularly successful for highly α-helical structures, and especially coiled-coils, in which the relatively rigid α-helices provide very useful molecular-replacement fragments. This has been seen within the program AMPLE, which uses clustered and truncated ensembles of numerous ab initio models in structure solution, and which has already proved successful for α-helical and coiled-coil structures. Here, an expansion in the scope of coiled-coil structure solution by AMPLE is reported, achieved through general improvements in the pipeline, the removal of tNCS correction in molecular replacement and two improved methods for ab initio modelling. Of the latter improvements, enforcing the modelling of elongated helices overcame the bias towards globular folds and provided a rapid method (equivalent in time requirements to the existing modelling procedures in AMPLE) for enhanced solution. Further, the modelling of two-, three- and four-helical oligomeric coiled-coils, and the use of full/partial oligomers in molecular replacement, provided additional success in difficult and lower-resolution cases. Together, these approaches have enabled the solution of a number of parallel/antiparallel dimeric, trimeric and tetrameric coiled-coils at resolutions as low as 3.3 Å, overcoming previous limitations in AMPLE and providing new functionality in coiled-coil structure solution at lower resolutions. These new approaches have been incorporated into a new release of AMPLE in which automated elongated monomer and oligomer modelling may be activated by selecting 'coiled-coil' mode.




modelling

Structure analysis of supported disordered molybdenum oxides using pair distribution function analysis and automated cluster modelling

Molybdenum oxides and sulfides on various low-cost high-surface-area supports are excellent catalysts for several industrially relevant reactions. The surface layer structure of these materials is, however, difficult to characterize due to small and disordered MoOx domains. Here, it is shown how X-ray total scattering can be applied to gain insights into the structure through differential pair distribution function (d-PDF) analysis, where the scattering signal from the support material is subtracted to obtain structural information on the supported structure. MoOx catalysts supported on alumina nanoparticles and on zeolites are investigated, and it is shown that the structure of the hydrated molybdenum oxide layer is closely related to that of disordered and polydisperse polyoxometalates. By analysing the PDFs with a large number of automatically generated cluster structures, which are constructed in an iterative manner from known polyoxometalate clusters, information is derived on the structural motifs in supported MoOx.
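The subtraction at the heart of the d-PDF step can be stated in a few lines. The sketch below uses synthetic Gaussian "peaks" in place of measured G(r) curves, and the scale factor is a placeholder for what would be refined against real data.

```python
import numpy as np

# Minimal sketch of differential PDF (d-PDF) analysis: subtract the scaled
# PDF of the bare support from the PDF of the supported catalyst so that
# only the surface-layer (MoOx) signal remains. All curves are synthetic.

r = np.linspace(0.5, 20.0, 400)                    # distance grid (angstrom)
pdf_support = np.exp(-(r - 1.9) ** 2 / 0.02)       # fake support-only G(r)
pdf_composite = 0.8 * pdf_support + 0.2 * np.exp(-(r - 1.7) ** 2 / 0.02)

scale = 0.8                                        # support fraction (refined in practice)
d_pdf = pdf_composite - scale * pdf_support        # signal fit with cluster models
print("residual Mo-O peak position:", r[np.argmax(d_pdf)])
```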




modelling

Could Building Information Modelling support sustainable building practices?

Building Information Modelling (BIM) can enhance the design of a building, reduce costs and save energy. However, little research has been carried out on its impact on sustainable practices. A US survey illustrates that many practitioners do not see sustainability as a primary application, suggesting that more effort is needed to encourage the integration of ‘green’ design and construction into BIM.




modelling

A vision and roadmap for integrated environmental modelling

Integrated environmental modelling (IEM) is an organised approach to streamlining the movement of scientific information from its research sources to its application in problem solving, according to a study that envisions a global-scale IEM community. The researchers present a roadmap for the future of IEM, describing issues that could be addressed to develop its potential even further, such as how best to integrate diverse stakeholder perspectives and appropriate guidelines for ‘problem statements’.




modelling

Little Bustard: case study for modelling conservation costs

A new model, named OUTOPIE, could help design more effective agri-environmental schemes. The model links the farm, field and landscape levels to allow a more accurate assessment of the costs of enrolling specific fields in conservation schemes. Using the model, the researchers were able to assess the cost-effectiveness of different policies for the conservation of the Little Bustard (Tetrax tetrax) in France.




modelling

Better predictions of climate change impact on wildlife thanks to genetically informed modelling

The effects of climate change on the distribution of species can be predicted more accurately by considering the genetic differences between different groups of the same species, a new study suggests. The researchers found that a computer model which incorporated genetic information on different groups of a US tree species was up to 12 times more accurate in predicting tree locations than a non-genetically informed model.




modelling

Modelling emissions of perfluorinated chemicals to the Danube River Basin

The emissions of two perfluoroalkyl acids (PFAAs) into the Danube River Basin have been estimated in a test of four different hypotheses regarding the factors affecting those emissions. The results were used to simulate water concentrations for comparison with measured data. The researchers found that incorporating wastewater treatment information and wealth distribution alongside population data can improve the accuracy of emissions estimates.
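A toy version of the kind of estimate described, combining population, relative wealth and wastewater-treatment coverage, is sketched below. Every figure in it is hypothetical and serves only to show how the three data sources enter the calculation.

```python
# Toy illustration of an emission estimate built from population, wealth
# and wastewater-treatment data. Every figure here is hypothetical.

regions = [
    # (population, wealth index vs basin mean, fraction served by WWTP)
    (2_000_000, 1.3, 0.9),
    (5_000_000, 0.8, 0.4),
]
PER_CAPITA_UG_PER_DAY = 1.5   # hypothetical base release per person
WWTP_REMOVAL = 0.5            # hypothetical removal efficiency in treatment

total_ug = sum(pop * PER_CAPITA_UG_PER_DAY * wealth * (1 - served * WWTP_REMOVAL)
               for pop, wealth, served in regions)
print(f"estimated basin emission: {total_ug / 1e6:.1f} g/day")
```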




modelling

New computer modelling tool to identify persistent chemicals

Chemicals that persist in the environment can harm humans and wildlife. This study describes a computer modelling-based approach to predict which chemical compounds are likely to be persistent. The models correctly predicted persistence for 11 of the 12 chemicals tested and could provide a cost-effective alternative to laboratory testing.




modelling

Modelling marine exhaust emissions in the Baltic Sea

A new tool used to investigate exhaust emissions of marine vessels has been developed and applied to shipping in the Baltic Sea.




modelling

Air pollution modelling could help predict algal blooms

Models that predict how nitrogen from the air is deposited in the sea could be useful in predicting algal blooms. Based on the knowledge that excess nitrogen increases algal growth rates, researchers simulated nitrogen deposition in the North Sea and suggested that, using predicted weather data, it might be possible to adapt this approach to predict algal blooms.




modelling

Understanding uncertainty in air-quality modelling with new framework

Researchers have run an uncertainty and sensitivity analysis with an environmental model, specifically an Integrated Assessment Model (IAM) for air quality, demonstrating how individual model components contribute uncertainty to the output of an integrated assessment. The study develops a framework for implementing IAMs, using the Lombardy region of Italy as a case study. Policy responses should therefore consider uncertainty and sensitivity when developing measures to improve air quality.
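A minimal Monte Carlo version of such an exercise fits in a few lines: sample the uncertain inputs of a toy air-quality model, propagate them, and rank inputs by their correlation with the output. The model below is wholly invented; real IAM studies use more sophisticated variance-based methods, but the logic is the same.

```python
import numpy as np

# Monte Carlo uncertainty/sensitivity sketch on an invented air-quality model:
# propagate input uncertainty, then rank inputs by output correlation.

rng = np.random.default_rng(0)
n = 10_000
emissions = rng.normal(100, 15, n)      # uncertain emission inventory
dispersion = rng.normal(0.5, 0.1, n)    # uncertain dispersion factor
background = rng.normal(8, 1, n)        # uncertain background concentration

concentration = emissions * dispersion + background  # toy model output

print(f"output mean {concentration.mean():.1f}, "
      f"95% interval {np.percentile(concentration, [2.5, 97.5])}")
for name, x in [("emissions", emissions), ("dispersion", dispersion),
                ("background", background)]:
    r = np.corrcoef(x, concentration)[0, 1]
    print(f"sensitivity to {name}: r = {r:.2f}")
```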




modelling

CSSplay - 3D Perspective Modelling

Another look at 3D modelling suitable for IE10 and IE11 without using 'transform-style: preserve-3d;'




modelling

CSSplay - 360º 3D Modelling for IE10 and IE11

3D modelling with 360º animation suitable for IE10 and IE11 without using 'transform-style: preserve-3d;'




modelling

CSSplay - 360º 3D Modelling NOT for IE10 and IE11

3D modelling with 360º animation using 'transform-style: preserve-3d;' so NOT for IE10 or IE11




modelling

Queensland tsunami modelling shows how coastal communities will be impacted

Low-lying areas swamped, millions of people with just hours to evacuate and destruction on a mass scale: that is the worst-case scenario predicted by scientists who have mapped how Queensland's coastline would be affected if a one-in-10,000-year tsunami hit.




modelling

Shenhua mining under fire after 'damning' report highlights flawed environmental modelling

A Chinese mining giant is being accused of underestimating the impact a proposed open-cut mine will have on groundwater on the Liverpool Plains in New South Wales.




modelling

Wet winter forecast should be good news for farmers, but they remain cautious about modelling

There's growing consensus among weather forecasting models that Australia could be in for a wet winter. But what do farmers think?




modelling

Predictions and Policymaking: Complex Modelling Beyond COVID-19

1 April 2020

Yasmin Afina

Research Assistant, International Security Programme

Calum Inverarity

Research Analyst and Coordinator, International Security Programme
The COVID-19 pandemic has highlighted the potential of complex systems modelling for policymaking but it is crucial to also understand its limitations.

A member of the media wearing a protective face mask works in Downing Street where Britain's Prime Minister Boris Johnson is self-isolating in central London, 27 March 2020. Photo by TOLGA AKMEN/AFP via Getty Images.

Complex systems models have played a significant role in informing and shaping the public health measures adopted by governments in the context of the COVID-19 pandemic. For instance, modelling carried out by a team at Imperial College London is widely reported to have driven the approach in the UK from a strategy of mitigation to one of suppression.

Complex systems modelling will increasingly feed into policymaking by predicting a range of potential correlations, results and outcomes based on a set of parameters, assumptions, data and pre-defined interactions. It is already instrumental in developing risk-mitigation and resilience measures to address and prepare for existential crises such as pandemics, nuclear war and climate change.

The human factor

In the end, model-driven approaches must stand up to the test of real-life data. Modelling for policymaking must take into account a number of caveats and limitations. Models are developed to help answer specific questions, and their predictions will depend on the hypotheses and definitions set by the modellers, which are subject to their individual and collective biases and assumptions. For instance, the models developed by Imperial College came with the caveat that a policy of social distancing for people over 70 was assumed to have a 75 per cent compliance rate. This assumption is based on the modellers' own perceptions of demographics and society, and may not reflect all the societal factors that could affect this compliance rate in real life, such as gender, age, ethnicity, genetic diversity, economic stability, and access to food, supplies and healthcare. This is why modelling benefits from a cognitively diverse team that brings a wide range of knowledge and understanding to the early creation of a model.
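To see how much a single behavioural assumption can matter, here is a deliberately crude SIR epidemic sketch, emphatically not the Imperial College model, in which only the assumed compliance rate with distancing varies. Every parameter value is illustrative.

```python
# A crude SIR sketch (not the Imperial College model) showing how one
# behavioural assumption, compliance with distancing, moves the epidemic
# peak. All parameter values are illustrative only.

def sir_peak(compliance, r0=2.4, infectious_days=7.0, days=365):
    # Assume distancing cuts contact rates by 60% among those who comply.
    beta = (r0 / infectious_days) * (1 - 0.6 * compliance)
    gamma = 1.0 / infectious_days
    s, i, peak = 0.999, 0.001, 0.001
    for _ in range(days):                 # daily Euler steps
        new_infections = beta * s * i
        s, i = s - new_infections, i + new_infections - gamma * i
        peak = max(peak, i)
    return peak

for c in (0.25, 0.50, 0.75):
    print(f"compliance {c:.0%}: peak infected {sir_peak(c):.1%} of population")
```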

The potential of artificial intelligence

Machine learning, or artificial intelligence (AI), has the potential to advance the capacity and accuracy of modelling techniques by identifying new patterns and interactions, and by overcoming some of the limitations resulting from human assumptions and bias. Yet increasing reliance on these techniques raises the issue of explainability. Policymakers need to be fully aware of, and understand, the model, assumptions and input data behind any predictions, and must be able to communicate this aspect of modelling in order to uphold democratic accountability and transparency in public decision-making.

In addition, models using machine learning techniques require extensive amounts of data, which must also be of high quality and as free from bias as possible to ensure accuracy and address the issues at stake. Although technology may be used in the process (e.g. automated extraction and processing of information with big data), data is ultimately created, collected, aggregated and analysed by and for human users. Datasets will reflect the individual and collective biases and assumptions of those creating, collecting, processing and analysing the data. Algorithmic bias is inevitable, and it is essential that policy- and decision-makers are fully aware of how reliable the systems are, as well as of their potential social implications.

The age of distrust

Increasing use of emerging technologies for data- and evidence-based policymaking is taking place, paradoxically, in an era of growing mistrust of expertise and experts, as Michael Gove infamously summed up. Policymakers and subject-matter experts have faced increased public scrutiny of their findings and of the resultant policies those findings have been used to justify.

This distrust and scepticism within public discourse has only been fuelled by the ever-increasing availability of diffuse sources of information, not all of which are verifiable and robust. This has caused tension between experts, policymakers and the public, leading to conflict and uncertainty over which data and predictions can be trusted, and to what degree. The dynamic is exacerbated by the fact that certain individuals may purposefully misappropriate, or simply misinterpret, data to support their arguments or policies. Politicians are presently considered the least trusted professionals by the UK public, highlighting the importance of better and more effective communication between the scientific community, policymakers and the populations affected by policy decisions.

Acknowledging limitations

While measures can and should be built in to improve the transparency and robustness of scientific models in order to counteract these common criticisms, it is important to acknowledge that there are limits to the steps that can be taken. This is particularly the case when dealing with predictions of future events, which inherently involve degrees of uncertainty that cannot be fully accounted for by human or machine. If not carefully considered and communicated, the increased use of complex modelling in policymaking therefore risks undermining and obfuscating the policymaking process, contributing to significant mistakes, increased uncertainty, loss of trust in the models and in the political process, and further disaffection of citizens.

The potential contribution of complexity modelling to the work of policymakers is undeniable. However, it is imperative to appreciate the inner workings and limitations of these models, such as the biases that underpin their functioning and the uncertainties that they will never be fully capable of accounting for, in spite of their immense power. They must be tested against the data, again and again, as new information becomes available; otherwise there is a risk of scientific models becoming embroiled in partisan politicisation and potentially weaponised for political purposes. It is therefore important not to consider these models as oracles, but instead as one of many contributions to the process of policymaking.




modelling

How modelling articulates the science of climate change

To imagine Earth without greenhouse gases in its atmosphere is to turn the familiar blue marble into a barren lump of rock and ice on which the average surface temperature hovers around -18°C. Such a planet would not receive less of the sunlight which is the ultimate source of all Earth's warmth. But when the energy it absorbed from the sunlight was re-emitted as infrared radiation, as the laws of physics require, it would head unimpeded back out into space.
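That -18°C figure can be recovered from a one-line planetary energy balance, absorbed sunlight set equal to emitted infrared for a body with no greenhouse effect. The sketch below uses standard textbook values for the solar constant and planetary albedo.

```python
# Planetary energy balance with no greenhouse effect: absorbed sunlight
# equals emitted infrared. Standard textbook constants are used.

SOLAR_CONSTANT = 1361.0   # W/m^2 at Earth's orbit
ALBEDO = 0.30             # fraction of sunlight reflected back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4        # averaged over the sphere
t_effective_k = (absorbed / SIGMA) ** 0.25          # emitting temperature, K
print(f"effective temperature: {t_effective_k - 273.15:.1f} C")  # about -18 C
```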




modelling

Estimated population wide benefits and risks in China of lowering sodium through potassium enriched salt substitution: modelling study




modelling

Modelling carp biomass : estimates for the year 2023 / Charles R. Todd, John D. Koehn, Tim R. Brown, Ben Fanson, Shane Brooks and Ivor Stuart.




modelling

A reproducible framework for 3D acoustic forward modelling of hard rock geological models with Madagascar / Andrew Squelch, Mahyar Madadi, Milovan Urosevic.

"A special challenge of hard rock exploration is to identify targets of interest within complex geological settings. Interpretation of the geology can be made from direct geological observations and knowledge of the area, and from 2D or 3D seismic surveys. These interpretations can be developed into 3D geological models that provide the basis for predictions as to likely targets for drilling and/or mining. To verify these predictions we need to simulate 3D seismic wave propagation in the proposed geological models and compare the simulation results to seismic survey data. To achieve this we convert geological surfaces created in an interpretation software package into discretised block models representing the different lithostratigraphic units, and segment these into discrete volumes to which appropriate density and seismic velocity values are assigned. This approach allows us to scale models appropriately for desired wave propagation parameters and to go from local to global geological models and vice versa. Then we use these digital models with forward modelling codes to undertake numerous 3D acoustic wave simulations. Simulations are performed with single shot and with exploding reflector (located on extracted geological surface) configurations" -- Summary.




modelling

Bayesian modelling of the abilities in dichotomous IRT models via regression with missing values in the covariates

Flávio B. Gonçalves, Bárbara C. C. Dias.

Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 4, 782--800.

Abstract:
Educational assessment usually considers a contextual questionnaire to extract relevant information from the applicants. This may include items related to socio-economic profile as well as items to extract other characteristics potentially related to the applicants' performance in the test. A careful analysis of the questionnaires jointly with the test results may evidence important relations between profiles and test performance. The most coherent way to perform this task in a statistical context is to use the information from the questionnaire to help explain the variability of the abilities in a joint model-based approach. Nevertheless, the responses to the questionnaire typically present missing values which, in some cases, may be missing not at random. This paper proposes a statistical methodology to model the abilities in dichotomous IRT models using the information from the contextual questionnaires via linear regression. The proposed methodology models the missing data jointly with all the observed data, which allows for the estimation of the former. The missing data modelling is flexible enough to allow the specification of missing not at random structures. Furthermore, even if those structures are not assumed a priori, they can be estimated from the posterior results when assuming missing (completely) at random structures a priori. Statistical inference is performed under the Bayesian paradigm via an efficient MCMC algorithm. Simulated and real examples are presented to investigate the efficiency and applicability of the proposed methodology.
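The data-generating side of such a model is easy to sketch. The snippet below simulates 2PL responses with abilities driven by a regression on one contextual covariate, then deletes covariate values not at random. It illustrates the setup only, with made-up coefficients; the paper's contribution, the joint Bayesian MCMC inference, is not reproduced here.

```python
import numpy as np

# Generative sketch of the model structure: abilities follow a linear
# regression on a contextual covariate, responses follow a 2PL IRT model,
# and the covariate is missing not at random. Coefficients are invented.

rng = np.random.default_rng(1)
n_people, n_items = 500, 20

x = rng.normal(0, 1, n_people)                  # contextual covariate
theta = 0.8 * x + rng.normal(0, 0.6, n_people)  # abilities via regression

a = rng.uniform(0.8, 2.0, n_items)              # item discriminations
b = rng.normal(0, 1, n_items)                   # item difficulties
logits = a * (theta[:, None] - b)               # 2PL linear predictor
y = rng.random((n_people, n_items)) < 1 / (1 + np.exp(-logits))

# MNAR missingness: low-covariate applicants skip the question more often.
p_miss = 1 / (1 + np.exp(2 + 1.5 * x))
x_obs = np.where(rng.random(n_people) < p_miss, np.nan, x)

print(f"missing covariates: {np.isnan(x_obs).mean():.0%}")
print(f"corr(x, total score): {np.corrcoef(x, y.sum(1))[0, 1]:.2f}")
```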




modelling

Hierarchical modelling of power law processes for the analysis of repairable systems with different truncation times: An empirical Bayes approach

Rodrigo Citton P. dos Reis, Enrico A. Colosimo, Gustavo L. Gilardoni.

Source: Brazilian Journal of Probability and Statistics, Volume 33, Number 2, 374--396.

Abstract:
In the data analysis of multiple repairable systems, it is usual to observe both different truncation times and heterogeneity among the systems. Among other reasons, the latter is caused by different manufacturing lines and maintenance teams. In this paper, a hierarchical model is proposed for the statistical analysis of multiple repairable systems under different truncation times. A reparameterization of the power law process is proposed in order to obtain a quasi-conjugate Bayesian analysis. An empirical Bayes approach is used to estimate the model hyperparameters, and the uncertainty in the estimates of these quantities is corrected using a parametric bootstrap approach. The results are illustrated with a real data set of failure times of power transformers from an electric company in Brazil.
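For orientation, the single-system building block that such hierarchical models pool has closed-form maximum-likelihood estimates. The snippet below computes them for one hypothetical system observed up to a truncation time T; the hierarchical empirical-Bayes layer itself is not reproduced.

```python
import math

# Closed-form MLEs for a single time-truncated power law (Crow-AMSAA)
# process. Failure times below are hypothetical.

failure_times = [52.0, 180.0, 390.0, 710.0, 1010.0]   # hours
T = 1200.0                                             # truncation time

n = len(failure_times)
beta_hat = n / sum(math.log(T / t) for t in failure_times)    # shape
lam_hat = n / T ** beta_hat                                   # scale
rocof_T = lam_hat * beta_hat * T ** (beta_hat - 1)            # intensity at T

print(f"beta = {beta_hat:.2f}  (beta > 1 would indicate deterioration)")
print(f"failure intensity at T: {rocof_T:.4f} per hour")
```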




modelling

Generating Thermal Image Data Samples using 3D Facial Modelling Techniques and Deep Learning Methodologies. (arXiv:2005.01923v2 [cs.CV] UPDATED)

Methods for generating synthetic data have become increasingly important for building the large datasets required by Convolutional Neural Network (CNN) based deep learning techniques across a wide range of computer vision applications. In this work, we extend existing methodologies to show how 2D thermal facial data can be mapped to provide 3D facial models. For the proposed research work we used the Tufts dataset, generating 3D varying face poses from a single frontal face pose. The system works by refining the existing image quality through fusion-based image preprocessing operations. The refined outputs have better contrast, lower noise levels and greater exposure of the dark regions, making the facial landmarks and temperature patterns on the human face more discernible and visible than in the original raw data. Different image quality metrics are used to compare the refined images with the originals. In the next phase of the proposed study, the refined images are used to create 3D facial geometry structures using CNNs. The generated outputs are then imported into Blender to extract the 3D thermal facial outputs of both males and females. The same technique is also applied to our thermal face data acquired with a prototype thermal camera (developed under the Heliaus EU project) in an indoor lab environment, which is then used to generate synthetic 3D face data with varying yaw angles; lastly, a facial depth map is generated.
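One of the simplest preprocessing operations in that family is global histogram equalisation, which stretches a low-contrast thermal frame so that landmarks stand out. The sketch below applies it to a synthetic frame; the paper combines several such fusion operations, which are not shown here.

```python
import numpy as np

# Global histogram equalisation of a (synthetic) low-contrast thermal frame:
# map each intensity through its empirical CDF so the output fills [0, 1].

rng = np.random.default_rng(5)
thermal = rng.normal(30.0, 1.5, (120, 160)).clip(25, 40)   # fake frame, deg C

flat = thermal.ravel()
order = flat.argsort()
cdf = np.empty_like(flat)
cdf[order] = np.linspace(0, 1, flat.size)    # rank-based empirical CDF
equalised = cdf.reshape(thermal.shape)

print(f"before: spread {thermal.std():.2f} C, "
      f"after: spread {equalised.std():.2f} (normalised units)")
```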




modelling

Landscape modelling and decision support

9783030374211 (electronic bk.)




modelling

A hierarchical dependent Dirichlet process prior for modelling bird migration patterns in the UK

Alex Diana, Eleni Matechou, Jim Griffin, Alison Johnston.

Source: The Annals of Applied Statistics, Volume 14, Number 1, 473--493.

Abstract:
Environmental changes in recent years have been linked to phenological shifts which in turn are linked to the survival of species. The work in this paper is motivated by capture-recapture data on blackcaps collected by the British Trust for Ornithology as part of the Constant Effort Sites monitoring scheme. Blackcaps overwinter abroad and migrate to the UK annually for breeding purposes. We propose a novel Bayesian nonparametric approach for expressing the bivariate density of individual arrival and departure times at different sites across a number of years as a mixture model. The new model combines the ideas of the hierarchical and the dependent Dirichlet process, allowing the estimation of site-specific weights and year-specific mixture locations, which are modelled as functions of environmental covariates using a multivariate extension of the Gaussian process. The proposed modelling framework is extremely general and can be used in any context where multivariate density estimation is performed jointly across different groups and in the presence of a continuous covariate.
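The stick-breaking construction at the core of (hierarchical, dependent) Dirichlet process mixtures like this one takes only a few lines. The sketch below draws one truncated DP sample with invented parameters and treats atom locations as arrival days; the paper's covariate-dependent extension via Gaussian processes is not shown.

```python
import numpy as np

# Stick-breaking sketch of a (truncated) Dirichlet process mixture:
# weights come from breaking a unit stick, locations from a base measure.
# All parameter values are invented for illustration.

rng = np.random.default_rng(7)
alpha, n_atoms = 2.0, 50                     # concentration, truncation level

v = rng.beta(1, alpha, n_atoms)              # stick-breaking proportions
remaining = np.concatenate([[1.0], np.cumprod(1 - v[:-1])])
weights = v * remaining                      # mixture weights, sum to ~1
locations = rng.normal(120, 15, n_atoms)     # e.g. arrival day of year

sample = rng.choice(locations, size=1000, p=weights / weights.sum())
print(f"effective clusters: {(weights > 0.01).sum()}, "
      f"mean arrival day: {sample.mean():.1f}")
```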




modelling

Network modelling of topological domains using Hi-C data

Y. X. Rachel Wang, Purnamrita Sarkar, Oana Ursu, Anshul Kundaje, Peter J. Bickel.

Source: The Annals of Applied Statistics, Volume 13, Number 3, 1511--1536.

Abstract:
Chromosome conformation capture experiments such as Hi-C are used to map the three-dimensional spatial organization of genomes. One specific feature of the 3D organization is known as topologically associating domains (TADs), which are densely interacting, contiguous chromatin regions playing important roles in regulating gene expression. A few algorithms have been proposed to detect TADs. In particular, the structure of Hi-C data naturally inspires application of community detection methods. However, one of the drawbacks of community detection is that most methods take exchangeability of the nodes in the network for granted; whereas the nodes in this case, that is, the positions on the chromosomes, are not exchangeable. We propose a network model for detecting TADs using Hi-C data that takes into account this nonexchangeability. In addition, our model explicitly makes use of cell-type specific CTCF binding sites as biological covariates and can be used to identify conserved TADs across multiple cell types. The model leads to a likelihood objective that can be efficiently optimized via relaxation. We also prove that when suitably initialized, this model finds the underlying TAD structure with high probability. Using simulated data, we show the advantages of our method and the caveats of popular community detection methods, such as spectral clustering, in this application. Applying our method to real Hi-C data, we demonstrate the domains identified have desirable epigenetic features and compare them across different cell types.
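To see why Hi-C contact maps invite community detection, and what plain community detection ignores, the toy below plants two blocks of enriched contacts in a synthetic matrix and recovers them from the sign of the Fiedler vector of the graph Laplacian. Unlike the paper's model, this bare spectral step makes no use of the linear ordering of loci or of CTCF covariates.

```python
import numpy as np

# Toy spectral partition of a synthetic Hi-C-like contact matrix: two
# diagonal blocks of enriched contacts ("TADs") are recovered from the
# sign of the Fiedler vector. Real TAD callers must also respect the
# linear ordering of loci, which this step ignores.

rng = np.random.default_rng(3)
n = 40
contacts = rng.poisson(1.0, (n, n)).astype(float)
contacts[:20, :20] += rng.poisson(5.0, (20, 20))   # TAD 1
contacts[20:, 20:] += rng.poisson(5.0, (20, 20))   # TAD 2
contacts = (contacts + contacts.T) / 2             # symmetrise

degree = np.diag(contacts.sum(axis=1))
laplacian = degree - contacts
_, eigvecs = np.linalg.eigh(laplacian)
labels = (eigvecs[:, 1] > 0).astype(int)           # Fiedler vector sign
print("inferred block labels:", labels)
```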




modelling

A Tale of Two Parasites: Statistical Modelling to Support Disease Control Programmes in Africa

Peter J. Diggle, Emanuele Giorgi, Julienne Atsame, Sylvie Ntsame Ella, Kisito Ogoussan, Katherine Gass.

Source: Statistical Science, Volume 35, Number 1, 42--50.

Abstract:
Vector-borne diseases have long presented major challenges to the health of rural communities in the wet tropical regions of the world, but especially in sub-Saharan Africa. In this paper, we describe the contribution that statistical modelling has made to the global elimination programme for one vector-borne disease, onchocerciasis. We explain why information on the spatial distribution of a second vector-borne disease, Loa loa, is needed before communities at high risk of onchocerciasis can be treated safely with mass distribution of ivermectin, an antifilarial medication. We show how a model-based geostatistical analysis of Loa loa prevalence survey data can be used to map the predictive probability that each location in the region of interest meets a WHO policy guideline for safe mass distribution of ivermectin and describe two applications: one is to data from Cameroon that assesses prevalence using traditional blood-smear microscopy; the other is to Africa-wide data that uses a low-cost questionnaire-based method. We describe how a recent technological development in image-based microscopy has resulted in a change of emphasis from prevalence alone to the bivariate spatial distribution of prevalence and the intensity of infection among infected individuals. We discuss how statistical modelling of the kind described here can contribute to health policy guidelines and decision-making in two ways. One is to ensure that, in a resource-limited setting, prevalence surveys are designed, and the resulting data analysed, as efficiently as possible. The other is to provide an honest quantification of the uncertainty attached to any binary decision by reporting predictive probabilities that a policy-defined condition for action is or is not met.
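The decision step, turning a fitted model into a predictive probability that a location meets a policy threshold, can be illustrated without the geostatistics. The sketch below fakes posterior prevalence draws with a Beta distribution in place of a fitted model, and the 20% threshold and 0.95 decision rule are illustrative, not quoted WHO figures.

```python
import numpy as np

# Predictive-probability decision sketch: given posterior samples of
# prevalence at one location (faked here with a Beta distribution rather
# than a fitted geostatistical model), report the probability that
# prevalence lies below a hypothetical policy threshold.

rng = np.random.default_rng(11)
posterior_prevalence = rng.beta(8, 42, 10_000)   # stand-in posterior draws

threshold = 0.20                                 # illustrative threshold
p_safe = (posterior_prevalence < threshold).mean()
print(f"P(prevalence < {threshold:.0%}) = {p_safe:.2f}")
print("action: mass ivermectin treatment" if p_safe > 0.95
      else "action: further survey / precautionary protocol")
```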




modelling

Understanding US export dynamics: does modelling the extensive margin of exports help?

Bank of England Working Papers by Aydan Dogan and Ida Hjortsoe




modelling

Modelling – As easy as 1 2 3

It’s always interesting to know about some cool tricks that you can use when it comes to modelling within SOLIDWORKS. Well, within this blog we just happen to have one of those tricks for you. They always say a magician never reveals his secrets.

Author information

TMS CADCentre is a SOLIDWORKS Reseller based in Scotland providing CAD Design Software, analysis software & product data management software. Founded in 1981, TMS CADCentre is the only UK SOLIDWORKS Reseller based and funded within Scotland and have been providing SOLIDWORKS software, training and support since 1996 when the product was first launched in the UK.





modelling

Review of: Modelling Transitions: Virtues, Vices, Visions of the Future

Review of: Modelling Transitions: Virtues, Vices, Visions of the Future by Moallemi, Enayat A. and de Haan, Fjalar J. (Eds.), reviewed by Cesar Garcia-Diaz




modelling

Stability analyses of large waste dumps via 3D numerical modelling considering cracks and earthquake loading: a case study of Zhujiabaobao waste dump

This paper uses a 3D model for stability assessment of the Zhujiabaobao waste dump, which contains ground cracks. The study data were gathered via reconnaissance, geomorphological analysis and laboratory experiments. A 3D extended finite element method model that can represent cracks was then used to calculate the factor of safety (FOS) of the waste dump via the strength reduction technique. The simulation gives the dump an FOS of 1.22, and both the position and penetration depth of cracks in the waste dump have a crucial impact on the stability of the slope. Because the study area is seismically active, the dynamic response of the waste dump under seismic waves of different magnitudes (peak accelerations of 0.05g, 0.15g, 0.25g and 0.45g) was simulated and analysed with an explicit dynamic model. The simulation shows that high steps in the slope are particularly responsive to earthquakes. The approach used here for analysing stability under static and dynamic loads is useful for hazard prevention and mitigation.
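For scale, a factor of safety of the same order falls out of the classic infinite-slope limit-equilibrium formula. The back-of-envelope sketch below is far simpler than the paper's 3D strength-reduction analysis, ignores pore pressure, and uses entirely hypothetical material parameters.

```python
import math

# Back-of-envelope factor of safety from the classic infinite-slope
# limit-equilibrium formula (dry case). All parameters are hypothetical;
# this is not the paper's 3D XFEM strength-reduction analysis.

cohesion = 25.0e3        # effective cohesion, Pa
phi = math.radians(30)   # effective friction angle
gamma = 19.0e3           # unit weight of waste, N/m^3
depth = 10.0             # depth of slip surface, m
slope = math.radians(35) # slope angle

resisting = cohesion + gamma * depth * math.cos(slope) ** 2 * math.tan(phi)
driving = gamma * depth * math.sin(slope) * math.cos(slope)
print(f"factor of safety: {resisting / driving:.2f}")   # > 1 suggests stability
```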




modelling

Opposition accuses Government of scaring Victorians with 'worst-case scenario' modelling

The modelling predicts more than a quarter of a million jobs could be lost in Victoria due to the coronavirus pandemic, in what Premier Daniel Andrews says is perhaps the "biggest economic and employment challenge" in the state's history.




modelling

This physicist-turned-economist is modelling the pandemic’s financial fallout




modelling

Basement membrane remodelling regulates mouse embryogenesis




modelling

Tissue-resident ductal macrophages survey the mammary epithelium and facilitate tissue remodelling




modelling

Factoring Pandemic Risks into Financial Modelling

Today’s economic crisis leaves us with an unsettling and perplexing regret. Why weren’t financial portfolios already adjusted for risks that stem from health events such as pandemics? After all, financial portfolios are adjusted for liquidity risks, market risks, credit risks, and even operational and political risks.



