computing

Argonne Researchers Highlight Breakthroughs in Supercomputing and AI at SC24

Argonne National Laboratory researchers to showcase leading-edge work in high performance computing, AI and more at SC24 international conference.




computing

Understand. Anticipate. Improve. How Cognitive Computing Is Revolutionizing Knowledge Management

For decades, organizations have tried to unlock the collective knowledge contained within their people and systems. The challenge grows harder every year, as massive amounts of additional information are created and shared. We've reached a point at which individuals are unable to consume, understand, or even find half the information available to them.




computing

DeepComputing: Early Access Program for RISC-V Mainboard for Framework Laptop 13





computing

Lead High Performance Computing Architect

Roles & Responsibilities: The Scientific Computing and Data group at the Icahn School of Medicine at Mount Sinai partners with scientists to accelerate scientific discovery. To achieve these aims, we support a cutting-edge high-performance computing and data ecosystem along with MD/PhD-level support for researchers. The group is composed of a high-performance computing team, a research clinical data warehouse team, and a research data services team. The Lead HPC Architect, High Performance Computational and Data Ecosystem, is responsible for architecting, designing, and leading the technical operations of Scientific Computing's computational and data science ecosystem. This ecosystem includes high-performance computing (HPC) systems, clinical research databases, and a software development infrastructure for local and national projects. To meet Sinai's scientific and clinical goals, the Lead brings a strategic, tactical, and customer-focused vision to evolve Sinai's computational and data-rich environment so that it is continually more resilient, scalable, and productive for basic and translational biomedical research. Developing and executing that vision requires a deep technical understanding of best practices for computational, data, and software development systems, along with a strong focus on customer service for researchers. The Lead is an expert troubleshooter, a productive team member, and an effective partner for researchers and technologists throughout the organization and beyond. This position reports to the Director for Computational & Data Ecosystem in Scientific Computing.

Responsibilities:
1. Lead technical operations, including the architecture, design, expansion, monitoring, support, and maintenance of Scientific Computing's computational and data science ecosystem, consistent with best practices. Key components include a 50,000+ core, 30+ petabyte (usable) high-performance computing cluster, a clinical data warehouse, and a software development environment.
2. Lead the troubleshooting, isolation, and resolution of all technical issues.
3. Lead the design, development, implementation, and management of all system administration tasks, including hardware and software configuration, configuration management, system monitoring (including the development and maintenance of regression tests), usage reporting, system performance (file systems, scheduler, interconnect, high availability, etc.), security, networking, and metrics.
4. Ensure that the design and operation of the HPC ecosystem is productive for research.
5. Collaborate effectively with research and hospital system IT, compliance, HIPAA, security, and other departments to ensure compliance with all regulations and Sinai policies.
6. Partner with peers regionally, nationally, and internationally to discover, propose, and deploy a world-class research infrastructure for Mount Sinai.
7. Prepare and manage budgets for hardware, software, and maintenance. Participate in chargeback/fee-recovery analysis and suggest ways to make operations sustainable.
8. Lead the integration of HPC resources with laboratory equipment such as genomic sequencers.
9. Research, deploy, and optimize resource management and scheduling software and policies, with active monitoring.
10. Design, tune, manage, and upgrade parallel file systems, storage, and data-oriented resources. Research, deploy, and manage security infrastructure, including the development of policies and procedures.
11. Lead and assist the team in resolving user support requests from researchers.
12. Assist in developing and writing system designs for research proposals.
13. Lead the development of a framework for effective system documentation.
14. Work effectively and productively with other team members within the group and across Mount Sinai.
15. Provide after-hours support in case of a critical system issue.




computing

Multiverse Computing Launches Singularity ML Classification Function in IBM’s Qiskit Functions Catalog

DONOSTIA-SAN SEBASTIÁN, Spain, Nov. 13, 2024 — Multiverse Computing, a leading quantum AI software company, today announced the launch of Singularity Machine Learning – Classification within IBM’s recently launched Qiskit […]





computing

Argonne Researchers to Highlight Breakthroughs in Supercomputing and AI at SC24

Nov. 13, 2024 — Researchers from the U.S. Department of Energy’s (DOE) Argonne National Laboratory will highlight their work in using powerful supercomputers to tackle challenges in science and technology at SC24, […]





computing

IQM Quantum Computers Unveils Development Roadmap Focused on Fault-tolerant Quantum Computing by 2030

ESPOO, Finland, Nov. 13, 2024 — IQM Quantum Computers (IQM), a global leader in superconducting quantum computing, today announced its development roadmap with technical milestones targeting fault-tolerant quantum computing by […]





computing

M4 Mac mini review: The first redesign in years hides incredible computing power

Apple's long-overdue overhaul of the Mac mini shrinks an already great package even more, yet it still punches far above its weight class.


M4 Mac mini

I've owned a Mac mini since the very first G4 model. In an era of bulky towers, just a few years after the G4 Cube had impressed but failed to deliver, the little box brought Apple power to a small desktop package, and I was enthused.

And I've had one in service 24/7 ever since. Even now, with a Mac Studio on my desk, there's one upstairs in use by a family member, one in the other room silently humming away as my network-attached storage and test platform, and a few more on my shelf that I've hoarded over the years, just waiting for an application.






computing

How 'Clean' Does a Quantum Computing Test Facility Need to Be? PNNL Scientists Show the Way

How to keep stray radiation from "shorting" superconducting qubits; a pair of studies shows where ionizing radiation is lurking and how to banish it.




computing

Cloud Computing Helps Lift Small Business Valuations

It takes more than a solid business plan and gumption to succeed in business nowadays. Growing your company in a competitive business landscape, and attracting interested investors, requires a solid footing in technology.





computing

Neuromorphic Computing Market Expected to Reach $1,325.2 million by 2030

(EMAILWIRE.COM, October 28, 2024) The neuromorphic computing market is expected to reach USD 1,325.2 million by 2030, growing at a compound annual growth rate (CAGR) of 89.7% from USD 28.5 million in 2024. The globalization of neuromorphic computing would further gain momentum based on...
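As a quick sanity check, the quoted growth rate follows directly from the endpoint figures using the standard CAGR formula; a minimal Python sketch (the formula itself, not the publisher's model):

# Check the quoted CAGR from the endpoints in the release:
# USD 28.5M (2024) growing to USD 1,325.2M (2030).
start, end, years = 28.5, 1325.2, 2030 - 2024

cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # ~89.6%, in line with the quoted 89.7%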




computing

Cloud High Performance Computing Market Set for Rapid Growth as Demand Surges, According to Maximize Market Research

(EMAILWIRE.COM, November 01, 2024) The global Cloud High Performance Computing (HPC) market is poised for significant expansion, driven by growing demands in industries such as healthcare, finance, and automotive. Cloud HPC enables businesses to perform complex computations and simulations faster...




computing

Edge Computing Market Expected to Surge to $110.6 Billion by 2029

(EMAILWIRE.COM, November 07, 2024) According to MarketsandMarkets' latest research, the Edge Computing Market is projected to grow from USD 60.0 billion in 2024 to USD 110.6 billion by 2029, achieving a compound annual growth rate (CAGR) of 13.0%. As the Internet of Things (IoT) continues its rapid...




computing

Clickbank - Computing & Internet - Internet, Networking

Access ebooks and software on Clickbank covering internet, networking, screensavers, domains, site design, programming, and other computing topics.





computing

Least Slack Time Rate first: an Efficient Scheduling Algorithm for Pervasive Computing Environment

Real-time systems, such as those in pervasive computing, must finish executing a task within a predetermined time while ensuring that the execution results are logically correct. Such systems require intelligent scheduling methods that can promptly and appropriately distribute the given tasks to one or more processors. In this paper, we propose Least Slack Time Rate first (LSTR), a new and simple scheduling algorithm for multi-processor environments, and demonstrate its efficient performance through various tests.
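The abstract does not spell out how the slack rate is computed, but the least-slack idea is easy to sketch. Below is a minimal Python illustration, assuming the rate is slack divided by time-to-deadline; that definition and the task set are illustrative assumptions, not the paper's:

def slack_rate(task, now):
    # Assumed definition: slack divided by time-to-deadline, where
    # slack = (deadline - now) - remaining execution time.
    time_left = task["deadline"] - now
    slack = time_left - task["remaining"]
    return slack / time_left if time_left > 0 else float("-inf")

def dispatch(tasks, processors, now=0.0):
    # Tasks with the least slack rate are the most urgent and are
    # assigned to free processors first.
    ready = sorted(tasks, key=lambda t: slack_rate(t, now))
    return {cpu: task["name"] for cpu, task in zip(processors, ready)}

tasks = [
    {"name": "sense",   "deadline": 10.0, "remaining": 6.0},
    {"name": "log",     "deadline": 20.0, "remaining": 4.0},
    {"name": "actuate", "deadline":  8.0, "remaining": 5.0},
]
print(dispatch(tasks, ["cpu0", "cpu1"]))  # {'cpu0': 'actuate', 'cpu1': 'sense'}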




computing

Cloud Computing




computing

Feature-aware task offloading and scheduling mechanism in vehicle edge computing environment

With the rapid development and application of driverless technology, the number and locations of vehicles and the channel and bandwidth of the wireless network vary over time, which increases the offloading delay and energy consumption of existing algorithms. To address this problem, the vehicle terminal task offloading decision problem is modelled as a Markov decision process, and a task offloading algorithm based on DDQN is proposed. To guide agents in quickly selecting optimal strategies, this paper proposes an offloading mechanism based on task features. To handle cases where the processing delay of some edge server tasks exceeds their delay upper bound, a task scheduling mechanism based on buffer delay is proposed. Simulation results show that, compared with existing algorithms, the proposed method offers greater performance advantages in reducing delay and energy consumption.
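For reference, the core of any DDQN-based approach is the Double DQN target: the online network selects the greedy next action and the target network evaluates it, which curbs Q-value overestimation. A minimal numpy sketch follows; the state and action encodings for vehicular offloading are the paper's own and are not reproduced here:

import numpy as np

def ddqn_targets(q_online, q_target, rewards, next_states, gamma=0.99):
    # Online net picks the greedy next action; target net scores it.
    # Decoupling selection from evaluation reduces overestimation.
    best = q_online(next_states).argmax(axis=1)            # [batch]
    evaluated = q_target(next_states)[np.arange(best.size), best]
    return rewards + gamma * evaluated

# Toy check with random linear "networks" (4-dim state, 3 actions).
rng = np.random.default_rng(0)
W_online, W_target = rng.normal(size=(4, 3)), rng.normal(size=(4, 3))
states = rng.normal(size=(5, 4))
print(ddqn_targets(lambda s: s @ W_online, lambda s: s @ W_target,
                   rewards=np.ones(5), next_states=states))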




computing

International Journal of Wireless and Mobile Computing




computing

Design of intelligent financial sharing platform driven by consensus mechanism under mobile edge computing and accounting transformation

An online intelligent financial sharing platform can collect, store, process, analyse and share financial data by integrating AI and big-data processing technologies. However, as data volume grows exponentially, the cost of storing and processing financial data increases, and the structure for asset accounting and financial profit data-sharing analysis in financial sharing platforms is inadequate. To address the issue of secure data sharing in an intelligent financial digital sharing platform, this paper proposes a data-sharing framework based on blockchain and edge computing. Building upon this framework, a non-separable task distribution algorithm based on data sharing is developed, which employs multiple nodes for cooperative data storage, reducing the storage pressure on the central server and solving the problem of non-separable task distribution. Multiple sets of comparative experiments confirm that the proposed scheme is feasible, improving algorithm performance while reducing energy consumption and latency.




computing

Dual network control system for bottom hole throttling pressure control based on RBF with big data computing

In the context of smart city development, managed pressure drilling (MPD) faces many uncertainties: the process is complex and requires accurate wellbore pressure control, yet it runs the risk of introducing un-modelled dynamics into the system. To address this problem, this paper employs neural network control techniques to construct a dual-network system for throttle pressure control, with a design encompassing both a controller and an identifier. In the controller structure, a radial basis function (RBF) network and proportional features are connected in parallel, and the RBF network learning algorithm is used to train the identifier. Simulation results show that the actual wellbore pressure quickly tracks the reference pressure when the setpoint changes. The neural-network-based controller thus achieves effective control, enabling the system to track the input target quickly and converge stably.
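The parallel controller structure described above, an RBF network term alongside a proportional term, can be sketched in a few lines. The gains, centre placement, and input dimensionality below are illustrative assumptions, not the paper's values:

import numpy as np

def rbf_layer(x, centers, width):
    # Gaussian radial basis activations of input x at each centre.
    return np.exp(-np.sum((x - centers) ** 2, axis=1) / (2 * width ** 2))

def control_signal(errors, centers, weights, width, kp):
    # Parallel structure: RBF-network term plus a proportional term
    # on the most recent pressure error.
    return kp * errors[-1] + rbf_layer(errors, centers, width) @ weights

rng = np.random.default_rng(1)
centers = rng.uniform(-1.0, 1.0, size=(6, 3))  # 6 hidden nodes, 3-dim input
weights = rng.normal(scale=0.1, size=6)
errors = np.array([0.4, 0.3, 0.2])             # recent setpoint errors
print(control_signal(errors, centers, weights, width=0.5, kp=1.2))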




computing

Computer aided translation technology based on edge computing intelligent algorithm

This paper investigates computer-aided translation technology based on an edge computing intelligent algorithm. Using a K-means algorithm at the network edge, it analyses the traditional approach of average resource allocation and the approach of virtual machine allocation. Solving the problem online gives a more detailed view of the data at the edge and avoids direct connections between network users and the platform, which would otherwise affect the internal operating efficiency of the system. The K-means algorithm divides the network user group into several distinct types, and various information resources are counted according to their characteristics. Computer-aided translation technology can significantly improve translation quality and efficiency while reducing translation cost.
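The user-grouping step the abstract describes is plain K-means clustering. A minimal numpy sketch follows, with hypothetical user features, since the paper does not specify which features are used:

import numpy as np

def kmeans(X, k, iters=50, seed=0):
    # Standard K-means: assign rows to the nearest centroid, then
    # recompute centroids, repeating for a fixed number of rounds.
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        centroids = np.array([X[labels == j].mean(axis=0)
                              if np.any(labels == j) else centroids[j]
                              for j in range(k)])
    return labels

# Hypothetical user features: [requests per hour, mean payload in KB].
users = np.array([[5., 20.], [6., 22.], [40., 300.], [42., 310.], [41., 280.]])
print(kmeans(users, k=2))  # e.g. light-usage vs. heavy-usage user groups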




computing

Research on low voltage current transformer power measurement technology in the context of cloud computing

As the Internet of Things (IoT) has developed rapidly in recent years, applying cloud computing in many fields has become possible. In this paper, we take low-voltage current transformers in power systems as the research object and propose a TCN-BI-GRU power measurement method that incorporates the signal characteristics of the transformer input and output. First, basic signal enhancement and extraction of the input and output are completed using EMD and correlation coefficients. Second, multi-dimensional feature extraction is performed with the established TCN network to improve data performance. Finally, power prediction is completed using BI-GRU, and the results show that the RMSE of this framework is 5.69, significantly lower than that of other methods. In laboratory tests, a device subjected to strong disturbance shows a large change in its correlation-coefficient features, leading to a large prediction deviation, which provides a new idea for future intelligent prediction.
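The first step, screening EMD components by their correlation with the raw signal, can be sketched as follows. The intrinsic mode functions (IMFs) would come from an EMD library, and the 0.3 cutoff is an assumption, since the abstract gives no value:

import numpy as np

def enhance(signal, imfs, threshold=0.3):
    # Keep only the IMFs whose Pearson correlation with the raw
    # signal clears the (assumed) threshold, then sum them into
    # an enhanced signal.
    kept = [imf for imf in imfs
            if abs(np.corrcoef(signal, imf)[0, 1]) > threshold]
    return np.sum(kept, axis=0) if kept else signal

# Toy demonstration with a noisy sine and two fake "IMFs".
t = np.linspace(0.0, 1.0, 500)
tone = np.sin(2 * np.pi * 5 * t)
noise = 0.2 * np.random.default_rng(2).normal(size=t.size)
enhanced = enhance(tone + noise, imfs=[tone, noise])  # keeps the tone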




computing

Survival Mode: The Stresses and Strains of Computing Curricula Review




computing

Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories




computing

Designing a Network and Systems Computing Curriculum: The Stakeholders and the Issues




computing

Incorporating Knowledge of Legal and Ethical Aspects into Computing Curricula of South African Universities




computing

Cloud Computing: Short Term Impacts of 1:1 Computing in the Sixth Grade




computing

Generating a Template for an Educational Software Development Methodology for Novice Computing Undergraduates: An Integrative Review

Aim/Purpose: The teaching of appropriate problem-solving techniques to novice learners in undergraduate software development education is often poorly defined when compared to the delivery of programming techniques. Given the global need for qualified designers of information technology, the purpose of this research is to produce a foundational template for an educational software development methodology grounded in the established literature. This template can be used by third-level educators and researchers to develop robust educational methodologies to cultivate structured problem solving and software development habits in their students while systematically teaching the intricacies of software creation.

Background: While software development methodologies are a standard approach to structured and traceable problem solving in commercial software development, educational methodologies for inexperienced learners remain a neglected area of research due to their assumption of prior programming knowledge. This research aims to address this deficit by conducting an integrative review to produce a template for such a methodology.

Methodology: An integrative review was conducted on the key components of teaching software development education, problem solving, threshold concepts, and computational thinking. Systematic reviews were conducted on computational thinking and software development education using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) process. Narrative reviews were conducted on problem solving and threshold concepts.

Contribution: This research provides a comprehensive analysis of problem solving, software development education, computational thinking, and threshold concepts in computing in the context of undergraduate software development education. It also synthesizes review findings from these four areas and combines them to form a research-based foundational template methodology for use by educators and researchers interested in software development undergraduate education.

Findings: This review identifies seven skills and four concepts required by novice learners. The skills include the ability to perform abstraction, data representation, decomposition, evaluation, mental modeling, pattern recognition, and writing algorithms. The concepts include state and sequential flow, non-sequential flow control, modularity, and object interaction. The teaching of these skills and concepts is combined into a spiral learning framework, joined by four development stages to guide software problem solving: understanding the problem; breaking it into tasks; designing, coding, testing, and integrating; and final evaluation and reflection. This produces the principal finding, which is a research-based foundational template for educational software development methodologies.

Recommendations for Practitioners: Focusing introductory undergraduate computing courses on a programming syllabus without giving adequate support to problem solving may hinder students in their attainment of development skills. Providing a structured methodology equips students with essential problem-solving skills and ensures they develop good development practices from the start, which is crucial to undergraduate success in their studies and beyond.

Recommendation for Researchers: The creation of educational software development methodologies with tool support is an under-researched area in undergraduate education. The template produced by this research can serve as a foundational conceptual model for researchers to create concrete tools to better support computing undergraduates.

Impact on Society: Improving the educational value and experience of software development undergraduates is crucial for society once they graduate. They drive innovation and economic growth by creating new technologies, improving efficiency in various industries, and solving complex problems.

Future Research: Future research should concentrate on using the template produced by this research to create a concrete educational methodology adapted to suit a specific programming paradigm, with an associated learning tool that can be used with first-year computing undergraduates.




computing

Exploring the Research Ethics Domain for Postgraduate Students in Computing




computing

Evaluating Critical Reflection for Postgraduate Students in Computing




computing

The Energy Inefficiency of Office Computing and Potential Emerging Technology Solutions




computing

Would Cloud Computing Revolutionize Teaching Business Intelligence Courses?




computing

Cloud Computing as an Enabler of Agile Global Software Development

Agile global software development (AGSD) is an increasingly prevalent software development strategy, as organizations hope to access a larger pool of skilled labor at a potentially reduced cost while delivering value incrementally and iteratively. However, the distributed nature of AGSD creates geographic, temporal, and socio-cultural distances that challenge collaboration between project stakeholders. The Cloud Computing (CC) service models of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) align with the aspirations of AGSD: they provide services that are globally accessible, efficient, and stable, with lower, predictable operating costs that scale to meet computational demand. This study focused on the 12 agile principles upon which all agile methodologies are based, thereby increasing the potential for the findings to be generalized. Domestication Theory was used to help understand how cloud technologies were appropriated in support of AGSD. The research strategy took the form of case study research. The findings suggest that some of the challenges in applying the agile principles in AGSD may be overcome by using CC.




computing

Software as a Service (SaaS) Cloud Computing: An Empirical Investigation on University Students’ Perception

Aim/Purpose: This study proposes and empirically validates a model investigating the factors that influence acceptance and use of Software as a Service (SaaS) cloud computing services from the individual's perspective, using an integrative model of the Theory of Planned Behavior (TPB) and the Technology Acceptance Model (TAM), modified to suit the objective of the study.

Background: Even though SaaS has gained acceptance in its educational and technical aspects, it is still expanding constantly with emerging cloud technologies. Moreover, the individual as an end-user of this technology has not been given ample attention with respect to SaaS acceptance and use (AUSaaS). Additionally, the higher education sector needs to be probed regarding perceptions of AUSaaS, not only from a managerial stance but also from the individual's. Hence, further investigation of all aspects, including the human factor, deserves deeper inspection.

Methodology: A quantitative approach with a probability multi-stage sampling procedure was conducted using a survey instrument distributed among students from three public Malaysian universities. The valid collected responses came from 289 Bachelor's degree students. The survey included a demographic part as well as items measuring the hypothesized construct relationships.

Contribution: The empirical results disclosed the appropriateness of the integrated model in explaining individuals' attitude (R2 = 57%), behavioral intention (R2 = 64%), and AUSaaS in university settings (R2 = 50%). The study also offers valuable findings and examines new relationships that constitute a theoretical contribution with proven empirical results: the effect of subjective norms on attitude and AUSaaS adds empirical evidence to the hypothesized model. Knowing the significance of the social effect is important in using it to promote university products and SaaS applications developed inside the university through social media networks. The direct effect of perceived usefulness on AUSaaS is another theoretical contribution that SaaS providers and higher education institutions should consider when promoting the usefulness of the products and services they develop or offer to students and end-users. Additionally, the research contributes to the literature as one of the leading studies on acceptance of SaaS services and applications, as most studies focus on the general and broad concept of cloud computing. Furthermore, by integrating two theories (TPB and TAM), the study employed different factors in studying perceptions of the acceptance of SaaS services and applications: social factors (subjective norms), personal capabilities and capacities (perceived behavioral control), technological factors (perceived usefulness and perceived ease of use), and attitudinal factors. These factors are the strength of both theories, and using them helps unveil the salient factors affecting acceptance of SaaS services and applications.

Findings: A statistically significant positive influence of the main TPB constructs on AUSaaS was revealed. Furthermore, subjective norms (SN) and perceived usefulness (PU) demonstrated predictive ability for AUSaaS, and SN showed a statistically significant effect on attitude (ATT). Specifically, the main contributors to intention are PU, perceived ease of use, ATT, and perceived behavioral control. The proposed framework is also validated empirically and statistically.

Recommendation for Researchers: The proposed model is highly recommended to be tested in different settings and cultures. Recruiting respondents with different roles, occupations, and cultures would likely draw more insights from the results obtained in the current research and improve its generalizability.

Future Research: Participants from private universities or other educational institutions are suggested for future work, as the sample here focused only on public-sector universities. The model included a limited number of variables, so it can be extended in future work with constructs such as trialability, compatibility, security, risk, privacy, and self-efficacy. Comparing different ethnic groups, ages, genders, or fields of study in future research would be invaluable for enhancing the findings or revealing new insights. Replication of the study in different settings is encouraged.




computing

International Journal of Social and Humanistic Computing




computing

Does 1:1 Computing in a Junior High-School Change the Pedagogical Perspectives of Teachers and their Educational Discourse?

Transforming a school from traditional teaching and learning to a one-to-one (1:1) classroom, in which the teacher and students have personal digital devices, inevitably requires changes in the way the teacher addresses her role. This study examined the implications of integrating 1:1 computing for teachers' pedagogical perceptions and the classroom's educational discourse, investigating changes in pedagogical perceptions over three years of teaching within this model. The research analyzed data from 14 teachers in a junior high school in the north of Israel, collected over the course of three years through interviews and lesson observations. The findings show that 1:1 computing allows teachers to improve their teaching skills; however, it fails to change their fundamental attitudes toward teaching and learning processes. It was further found that the use of a laptop by each student does not significantly improve the classroom's learning discourse: the computer is perceived as an individual or group learning technology rather than as a tool for conducting learning discourse. The analysis also shows that 1:1 computing contributes greatly to collaboration among teachers in preparing technology-enhanced lessons. The findings are discussed in terms of Bruner's (Olson & Bruner, 1996) "folk psychology" and "folk pedagogy" of teachers and "the new learning ecology" framework in the 1:1 classroom (Lee, Spires, Wiebe, Hollebrands, & Young, 2015). One of the main recommendations of this research is to reflect on the findings with the teaching staff and the school community, emphasizing 1:1 technology as a tool for significant pedagogical change. The use of personal technology per se is not enough for pedagogical changes to take place; the change must begin with teachers' perceptions and attitudes.




computing

Can Designing Self-Representations through Creative Computing Promote an Incremental View of Intelligence and Enhance Creativity among At-Risk Youth?

Creative computing is one of the rapidly growing educational trends around the world. Previous studies have shown that creative computing can empower disadvantaged children and youth. At-risk youth tend to hold a negative view of self and perceive their abilities as inferior compared to "normative" pupils. The Implicit Theories of Intelligence approach (ITI; Dweck, 1999, 2008) suggests a way of changing beliefs regarding one's abilities. This paper reports findings from an experiment that explores the impact of a short intervention among at-risk youth and "normative" high-school students on (1) changing ITI from being perceived as fixed (entity view of intelligence) to more flexible (incremental view of intelligence) and (2) the quality of digital self-representations programmed through a creative computing app. The participants were 117 Israeli youth aged 14-17, half of whom were at-risk youth. The participants were randomly assigned to the experimental and control conditions. The experimental group watched a video of a lecture regarding brain plasticity that emphasized flexibility and the potential of human intelligence to be cultivated. The control group watched a neutral lecture about brain-functioning and creativity. Following the intervention, all of the participants watched screencasts of basic training for the Scratch programming app, designed artifacts that digitally represented themselves five years later and reported their ITI. The results showed more incremental ITI in the experimental group compared to the control group and among normative students compared to at-risk youth. In contrast to the research hypothesis, the Scratch projects of the at-risk youth, especially in the experimental condition, were rated by neutral judges as being more creative, more aesthetically designed, and more clearly conveying their message. The results suggest that creative computing combined with the ITI intervention is a way of developing creativity, especially among at-risk youth. Increasing the number of youths who hold incremental views of intelligence and developing computational thinking may contribute to their empowerment and well-being, improve learning and promote creativity.




computing

An Introduction to Computer Forensics: Gathering Evidence in a Computing Environment




computing

International Journal of Ad Hoc and Ubiquitous Computing




computing

Ethical and legal aspects of computing: a professional perspective from software engineering

With this book, O’Regan efficiently addresses a wide range of ethical and legal issues in computing. It is well crafted, organized, and reader friendly, featuring many recent, relevant examples such as tweets, fake news, and disinformation.




computing

Artificial intelligence to automate the systematic review of scientific literature from Computing

The study shows that artificial intelligence (AI) has become highly important in contemporary computing because of its capacity to efficiently tackle intricate jobs that were typically carried out by people. The authors provide scientific literature that analyzes and […]




computing

Spind: a package for computing spatially corrected accuracy measures





computing

TOMOMAN: a software package for large-scale cryo-electron tomography data preprocessing, community data sharing and collaborative computing

Here we describe TOMOMAN (TOMOgram MANager), an extensible open-source software package for handling cryo-electron tomography data preprocessing. TOMOMAN streamlines interoperability between a wide range of external packages and provides tools for project sharing and archival.




computing

Accelerating imaging research at large-scale scientific facilities through scientific computing

Computed tomography experiments carried out at synchrotron radiation facilities worldwide pose a tremendous challenge in terms of the breadth and complexity of the experimental datasets produced. Furthermore, near real-time three-dimensional reconstruction capabilities are becoming a crucial requirement for performing high-quality, result-informed synchrotron imaging experiments, where a large amount of data is collected and processed within a short time window. To address these challenges, we have developed and deployed a synchrotron computed tomography framework that automatically processes experimental data from the synchrotron imaging beamlines online, leveraging high-performance computing cluster capabilities to accelerate real-time feedback to users on their experimental results. We have further integrated it within a modern unified national authentication and data management framework, which we have developed and deployed, spanning the entire data lifecycle of a large-scale scientific facility. In this study, the overall architecture, functional modules and workflow design of our synchrotron computed tomography framework are presented in detail. Moreover, the successful integration of the imaging beamlines at the Shanghai Synchrotron Radiation Facility into our scientific computing framework is also detailed; this ultimately accelerated and fully automated their entire data processing pipelines. Compared with the original three-dimensional tomography reconstruction approaches, our framework accelerates experimental data processing while maintaining a high level of integration with all the beamline processing software and systems.
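As an illustration of the per-slice reconstruction step such a framework automates, here is a minimal sketch using scikit-image's filtered back-projection. It is a generic stand-in under stated assumptions, since the article does not name the facility's actual reconstruction codes:

import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import iradon, radon, resize

def reconstruct_slice(sinogram, angles_deg):
    # Filtered back-projection of one detector-row sinogram; in an
    # online pipeline this runs per slice as projections arrive.
    return iradon(sinogram, theta=angles_deg, filter_name="ramp")

# Toy sinogram: 180 projections of a synthetic phantom slice.
phantom = resize(shepp_logan_phantom(), (128, 128))
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
recon = reconstruct_slice(radon(phantom, theta=angles), angles)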




computing

The pypadf package: computing the pair angle distribution function from fluctuation scattering data

The pair angle distribution function (PADF) is a three- and four-atom correlation function that characterizes the local angular structure of disordered materials, particles or nanocrystalline materials. The PADF can be measured using X-ray or electron fluctuation diffraction data, which can be collected by scanning or flowing a structurally disordered sample through a focused beam. It is a natural generalization of established pair distribution methods, which do not provide angular information. The software package pypadf provides tools to calculate the PADF from fluctuation diffraction data. The package includes tools for calculating the intensity correlation function, which is a necessary step in the PADF calculation and also the basis for other fluctuation scattering analysis techniques.
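The intensity correlation function mentioned above is, in its simplest single-pattern form, an angular autocorrelation over each q-ring. A minimal numpy sketch of that general fluctuation-scattering step follows; it is an illustration of the concept, not pypadf's API:

import numpy as np

def intensity_correlation(pattern):
    # Angular autocorrelation per q-ring via FFT (Wiener-Khinchin),
    # for a single diffraction pattern on a polar (q, theta) grid.
    rings = pattern - pattern.mean(axis=1, keepdims=True)
    F = np.fft.fft(rings, axis=1)
    return np.fft.ifft(F * np.conj(F), axis=1).real / pattern.shape[1]

# Toy pattern: 50 q-rings, 360 angular samples each.
rng = np.random.default_rng(3)
pattern = rng.poisson(5.0, size=(50, 360)).astype(float)
corr = intensity_correlation(pattern)  # corr[q, delta_theta]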




computing

Designing Learning Experiences with Attention to Students’ Backgrounds Can Attract Underrepresented Groups to Computing

Learning experiences in computing that are designed with attention to K-12 students' interests, identities, and backgrounds may attract underrepresented groups to computing more effectively than learning experiences that mimic current professional computing practices and culture, says a new report from the National Academies of Sciences, Engineering, and Medicine.




computing

National Nuclear Security Administration Cannot Continue With ‘Business as Usual’ in the Shifting Supercomputing Landscape, Says New Report

The National Nuclear Security Administration needs to fundamentally rethink the strategy for its next generation of high-performance computing and cannot continue with ‘business as usual’ through shifting technical and geopolitical landscapes. Advanced computing capabilities help the NNSA ensure that the U.S. maintains a safe, secure, and reliable nuclear stockpile.




computing

The Untapped Potential of Computing and Cognition in Tackling Climate Change

A new NAE Perspective by Adiba M. Proma, Robert M. Wachter, and Ehsan Hoque discusses how helping people change their behaviors may be where technology can have its greatest impact on climate change.