technology

CONTACT Open World: Technology leaders showcase best practices for digital transformation

Numerous new developments in CONTACT’s Elements platform and innovative digitalisation strategies will take centre stage at this year’s Open World.




technology

Made Smarter powers SME manufacturers to invest £25m in technology

Made Smarter, the movement accelerating the digital transformation of SME manufacturers, recently reached a major milestone: backing North West companies to invest £25m in new technologies.




technology

Retail payroll teams struggling with seasonal hiring, but too few are leveraging technology to alleviate the burden

With the holiday season fast approaching, retail payroll teams around the world are bracing for the strain of seasonal hiring.




technology

Column: Bermuda’s Stone Age Technology

[Opinion column written by Dr Edward Harris ] The Stone Age generally ended some five thousand years ago with the invention of forging tools in iron. Prior to that, implements were made of stone and probably of timber, although the latter is less obvious in the archaeological record, as it tends to rot. Sadly perhaps, […]




technology

Technology changing lives: how technology can support the goals of the Care Act

Social Care Institute for Excellence (SCIE) Report 73 from SCIE roundtable discussion held on 26 March 2015. This report considers the potential of technology to transform how health and social care services are delivered.




technology

Too Much Magic: Wishful Thinking, Technology, and the Fate of the Nation

Paperback, Hardcover, Kindle, Audiobook – July 9, 2013




technology

Mapping brain function, safer autonomous vehicles are focus of Schmidt Transformative Technology fund

Two projects — one that maps the function of the brain’s neuronal network in unprecedented detail and another that combines robotics and light-based computer circuits to create safe self-driving vehicles — have been awarded funding through Princeton’s Eric and Wendy Schmidt Transformative Technology Fund.




technology

How Modern Technology Has Changed the Way We Listen to Music

From the moment the phonograph was invented, things began to move forward, so that today, for very little money, we can have music and even find out, with one click, the name of the song we are currently listening to. Modern technology has changed the way we listen to music. Music has never been more ...




technology

Technology, Humanity, and the Existential Test

I’m still digging through some of the pieces I posted at the now defunct NewCo Shift, and found this piece, adapted from a talk I gave at the Thrival Humans X Tech conference in Pittsburgh back in September 2018. I was alarmed by trends that I saw intensifying – a push by the tech …




technology

Next steps for EU-US cooperation on trade and technology

8 December 2022, 3:00PM to 4:00PM, Online. Published 21 November 2022.

How can the EU and US increase cooperation on AI, semiconductors and funding information communication technology services?

On trade and technology policy, the EU and the US are making meaningful progress towards cooperation while at the same time navigating tensions. As senior officials meet on 5 December for the third meeting of the Trade and Technology Council (TTC), both sides have vowed to move towards concrete results. But can the US and EU increase cooperation on artificial intelligence, semiconductors, and funding information communication technology services? 

This event draws on insights from a forthcoming Briefing Paper by Marianne Schneider-Petsinger that explores next steps for US-EU cooperation on trade and technology, which is part of a project supported by the Hanns Seidel Foundation. 




technology

Why technology does not make easy wars

Interview, LJefferson, 28 November 2022

Stephanie Carvin explains why technology does not overcome the challenges of war.

The invasion of Ukraine has demonstrated that many of the assumptions held about the role of technology in contemporary warfare are flawed. The lesson that technology cannot overcome the challenges of warfare is one that the West has also yet to learn, despite a series of failed interventions since the end of the Cold War.

In a wide-ranging conversation, Isabel Muttreja sat down with Stephanie Carvin to talk about her contribution to the September 2022 issue of International Affairs on ‘how not to war’. They discuss the US’ over-reliance on technology and why ‘easy wars’ become ‘forever wars.’

You argue in your article that the US overly relies on technology in war. When did this start?

I don’t necessarily think the US is exceptional. I think all states have tried in some ways to use technologies. One of the key arguments in the article is that the US is an Enlightenment country, and part of the Enlightenment is a belief in rationality and science, and that you can improve things through the application of science.

I think there is this particular scientific approach, an embracing of technology, in the American and indeed the larger Western tradition, which treats technology as a way to save lives. There is a strange humanitarian impulse that often underlies this use of technology.

We are seeing a quest to try and get perfect information. The idea is that if you have perfect information, you are going to be able to dominate the battlefield, and that’s proven itself to be false. I’m not even sure you can ever get perfect information.

But it underlines this modern approach, that if you can have all the information that’s out there, crunch it into some kind of algorithm, that you can then target discriminately, proportionately, reduce the level of casualties, and reduce the level of unnecessary damage. And that’s a kind of liberal tradition. You are trying to have your cake and eat it too.

You talk about the US being an ultimately liberal state, but they have been involved in a lot of wars over the last 10–20 years. Is that a contradiction?

I hope it is. But I think it goes back to the Enlightenment nature of the United States, which is that the US sees itself as a shining city on a hill that has to protect itself at all costs. Liberals abhor tyranny, and they abhor unnecessary deaths. But the idea is that we see ourselves as embodying these values, so if you threaten us, we have to protect ourselves.

There’s a tendency to not really recognize the kind of insurgencies that we’ve seen in Iraq and Afghanistan, or even Vietnam, as war. We don’t really see that as a kind of armed conflict, even though, arguably, it has been the dominant mode of conflict for some time. They even used to call it ‘military operations other than war’. We tend to still think of war as great power competition or as the Second World War.

My first book was on prisoners of war in the American tradition. What often determined the treatment of people as prisoners of war was if the United States recognized their form of warfare. There’s a racial element here too that I don’t want to dismiss.

So, for example, the US war in the Philippines at the start of the 20th century: They went in, won a very quick victory over the Spanish and effectively took over the Philippines. And then they had a long insurgency for two years with the native Filipinos who didn’t want US domination. While they gave the Spanish all the prisoner of war rights, they didn’t give them to the Filipinos.

This is because they recognized the form of conflict that the Spanish engaged in, but the Indigenous way of warfare was not recognized. The West has struggled to culturally understand the way other people fight. And that’s when the laws of war conventions have broken down between, say, the United States, the West, and other states.

You talk in your article about the US entering ‘easy wars’ and ending up with ‘forever wars’ – what does this mean?

There’s an allure to this high-tech version of warfare, that it can solve a lot of problems, but it’s an illusion. It is ultimately a bit of a false promise.

The idea that machines are going to replace humans is fundamentally untrue. We are seeing this to a certain extent right now, even in the Russia/Ukraine war. This is very much a battle of machines and soldiers. One of the themes of this issue of International Affairs is hubris: the idea that things that appear to be quick wins often turn out to be long-term losses. And that’s exactly what this article is talking about.

‘Forever wars’ is not my favourite term, but it’s this concept that what was promised to be an easy war, a high technology-driven conflict, where you can go in, use some surgically precise weapons, take care of the problem, eliminate your opponent and then extract yourself from a situation, has actually turned into a quagmire.

The limits of technology become apparent within a few months, as does the messy business of state-building and the fact that insurgencies and political movements don’t just disintegrate at the show of some high-tech, sophisticated weaponry. These wars tend to go on for a long time, and you eventually have to extricate yourself, but there’s no clean way to do this. We saw this of course with Afghanistan, and to a large extent Iraq.

We get distracted by the shiny object. We see this promise, this vision of a kind of warfare that for some may have great appeal. There are new super weapons, whether cyber information warfare or artificial intelligence. Everyone wants to be ahead of the curve, right?

Are these lessons on technology and ‘easy wars’ applicable to other countries?

I think what we’ve learned about the Russian military is that there’s a lot more at the heart of it. Part of the problem Russia is experiencing is that its capabilities were not what it thought they were. It’s clear that Vladimir Putin was enamoured with a lot of the ideas, like that the Russian military was increasingly high-tech and that they had these hypersonic missiles.

They also had very powerful cyber weapons amongst other things. Putin, too, seems to have been caught up in this idea that he could have had a 72-hour special military operation, which would have taken Kyiv. Clearly, that hasn’t happened. Once again, we see the underestimation of the human factor.




technology

A Multidimensional Chromatography Technology for In-depth Phosphoproteome Analysis

Claudio P. Albuquerque. Research, Jul 1, 2008; 7:1389-1396.




technology

In conversation with James Manyika, Senior Vice President of Research, Technology and Society at Google

12 December 2024, 11:15AM to 12:45PM, Chatham House and Online

A conversation on AI’s global, societal and economic impacts.

2024 has been a landmark year for Artificial Intelligence (AI) development, deployment and use, with significant progress in AI-driven science, governance and cooperation. Looking ahead, AI continues to demonstrate economic promise and potential to expand on scientific breakthroughs in areas such as climate and health. This wave of innovation is occurring against a backdrop of geopolitical uncertainty and not all countries are fully able to participate. Heading into 2025, there are urgent questions about how best to maximise shared opportunities when it comes to AI and to advance global cooperation.

James Manyika, Senior Vice President of Research, Technology & Society at Google, will unpack what 2025 will bring for AI in science, economics, global governance and international cooperation. 

Key questions include:

  • What will be AI’s global societal and economic impact in 2025 and beyond? 
  • What are the ways AI could help increase economic growth and economy-wide productivity? What factors must be in place for this to happen?
  • How best can we maximise shared opportunities and advance global cooperation when it comes to AI? Where can public-private partnerships unlock scientific breakthroughs for societal progress, combatting shared global challenges such as climate change and global health issues?  
  • What are the principles of safe, responsible AI, and how should companies remain responsive to their evolution and integrate them into technology design and implementation? 
  • What is the current – and ideal – role of technology companies in emerging mechanisms for global cooperation and national governance on AI?

This event is being held in partnership with Google.

You will receive notice by 13:00 on Wednesday 11 December if you have been successful in securing an in-person place.

The institute occupies a position of respect and trust, and is committed to fostering inclusive dialogue at all events. Event attendees are expected to uphold this by adhering to our code of conduct.




technology

Dark Commerce: Technology’s Contribution to the Illegal Economy




technology

Is Technology Destroying Democracy?




technology

Gender Inequality: Making Technology the Solution, Not the Problem




technology

Is Technology Re-Engineering Humanity?




technology

Undercurrents: Bonus Episode - How Technology is Changing International Affairs




technology

Refugees and Technology: Panel Discussion




technology

Technology Diplomacy in the Digital Age




technology

Chatham House Commission on Democracy and Technology in Europe

News Release, 25 July 2019

Our project on Democracy and Technology in Europe is now entering its final phase, and we want your help in shaping the final report.




technology

Can global technology governance anticipate the future?

Expert comment, NCapeling, 27 April 2021

Trying to govern disruption is perilous as complex technology is increasingly embedded in societies and omnipresent in economic, social, and political activity.

Technology governance is beset by the challenges of how regulation can keep pace with rapid digital transformation, how governments can regulate in a context of deep knowledge asymmetry, and how policymakers can address the transnational nature of technology.

Keeping pace with, much less understanding, the implications of digital platforms and artificial intelligence for societies is increasingly challenging as technology becomes more sophisticated and yet more ubiquitous.

To overcome these obstacles, there is an urgent need to move towards a more anticipatory and inclusive model of technology governance. There are some signs of this in recent proposals by the European Union (EU) and the UK on the regulation of online harms.

Regulation failing to keep up

The speed of the digital revolution, further accelerated by the pandemic, has largely outstripped policymakers’ ability to provide appropriate frameworks to regulate and direct technology transformations.

Governments around the world face a ‘pacing problem’, a phenomenon described by Gary Marchant in 2011 as ‘the growing gap between the pace of science and technology and the lagging responsiveness of legal and ethical oversight that society relies on to govern emerging technologies’.

This ever-growing rift, Marchant argues, has been exacerbated by the increasing public appetite for and adoption of new technologies, as well as by political inertia. As a result, legislation on emerging technologies risks being ineffective or out of date by the time it is implemented.

Effective regulation requires a thorough understanding of both the underlying technology design, processes and business model, and how current or new policy tools can be used to promote principles of good governance.

Artificial intelligence, for example, is penetrating all sectors of society and spanning multiple regulatory regimes without any regard for jurisdictional boundaries. As technology is increasingly developed and applied by the private sector rather than the state, officials often lack the technical expertise to adequately comprehend and act on emerging issues. This increases the risk of superficial regulation which fails to address the underlying structural causes of societal harms.

This significant knowledge gap between those who aim to regulate and those who design, develop and market technology is prevalent in most technology-related domains, including powerful online platforms and providers such as Facebook, Twitter, Google and YouTube.

For example, the algorithms that social media companies use in their business models to promote online content – harmful or otherwise – remain largely inaccessible to governments and researchers, so, to a crucial extent, the regulator is operating in the dark.

The transnational nature of technology also poses additional problems for effective governance. Digital technologies intensify the gathering, harvesting, and transfer of data across borders, challenging administrative boundaries both domestically and internationally.

While there have been some efforts at the international level to coordinate approaches to the regulation of – for example – artificial intelligence (AI) and online content governance, more work is needed to promote global regulatory alignment, including on cross-border data flows and antitrust.

Reactive national legislative approaches are often based on targeted interventions in specific policy areas, and so risk failing to address the scale, complexity, and transnational nature of socio-technological challenges. Greater attention needs to be placed on how regulatory functions and policy tools should evolve to effectively govern technology, requiring a shift from a reactionary and rigid framework to a more anticipatory and adaptive model of governance.

Holistic and systemic versus mechanistic and linear

Some recent proposals for technology governance may offer potential solutions. The EU publication of a series of interlinked regulatory proposals – the Digital Services Act, Digital Markets Act and European Democracy Action Plan – integrates several novel and anticipatory features.

The EU package recognizes that the solutions to online harms such as disinformation, hate speech, and extremism lie in a holistic approach which draws on a range of disciplines, such as international human rights law, competition law, e-commerce, and behavioural science.

It combines light-touch regulation, such as codes of conduct, with hard-law requirements such as transparency obligations. Codes of conduct give digital platforms flexibility in how requirements are achieved, and can be updated and tweaked relatively easily, enabling regulation to keep pace as technology evolves.

As with the EU Digital Services Act, the UK’s recent proposals for an online safety bill are innovative in adopting a ‘systems-based’ approach which broadly focuses on the procedures and policies of technology companies rather than the substance of online content.

This means the proposals can be adapted to different types of content, and differentiated according to the size and reach of the technology company concerned. This ‘co-regulatory’ model recognizes the evolving nature of digital ecosystems and the ongoing responsibilities of the companies concerned. The forthcoming UK draft legislation will also be complemented by a ‘Safety by Design’ framework, which is forward-looking in focusing on responsible product design.

By tackling the complexity and unpredictability of technology governance through holistic and systemic approaches rather than mechanistic and linear ones, the UK and EU proposals represent an important pivot from reactive to anticipatory digital governance.

Both sets of proposals were also the result of extensive multistakeholder engagement, including between policy officials and technology actors. This engagement broke down silos within the technical and policy/legal communities and helped bridge the knowledge gap between dominant technology companies and policymakers, facilitating a more agile, inclusive, and pragmatic regulatory approach.

Coherence rather than fragmentation

Anticipatory governance also recognizes the need for new coalitions to promote regulatory coherence rather than fragmentation at the international level. The EU has been pushing for greater transatlantic engagement on regulation of the digital space, and the UK, as holder of the G7 presidency in 2021, aims to work with democratic allies to forge a coherent response to online harms.

Meanwhile the OECD’s AI Policy Observatory enables member states to share best practice on the regulation of AI, and an increasing number of states such as France, Norway, and the UK are using ‘regulatory sandboxes’ to test and build AI or personal data systems that meet privacy standards.

Not all states currently have the organizational capacity and institutional depth to design and deliver regulatory schemes of this nature, or to run the resource-intensive consultation processes which often accompany them.

So, as an increasing number of states ponder how to ‘futureproof’ their regulation of tomorrow’s technology – whether 6G, quantum computing or biotechnology – there is a need for capacity building in governments both on the theory of anticipatory governance and on how it can be applied in practice to global technology regulation.




technology

Detection of multiple autoantibodies in patients with ankylosing spondylitis using nucleic acid programmable protein arrays [11. Microarrays/Combinatorics/Display Technology]

Ankylosing spondylitis (AS) is a common inflammatory rheumatic disease which primarily affects the axial skeleton and is associated with sacroiliitis, uveitis and enthesitis. Unlike other autoimmune rheumatic diseases, such as rheumatoid arthritis or systemic lupus erythematosus, autoantibodies have not yet been reported to be a feature of AS. We therefore wished to determine whether plasma from patients with AS contained autoantibodies and, if so, to characterize and quantify this response in comparison to patients with rheumatoid arthritis (RA) and healthy controls. Two high-density nucleic acid programmable protein arrays expressing a total of 3498 proteins were screened with plasma from 25 patients with AS, 17 with RA and 25 healthy controls. Autoantigens identified were subjected to Ingenuity Pathway Analysis in order to determine patterns of signalling cascades or tissue origin. Overall, 44% of patients with AS demonstrated a broad autoantibody response, compared with 33% of patients with RA and only 8% of healthy controls. Individuals with AS demonstrated autoantibody responses to shared autoantigens, and 60% of autoantigens identified in the AS cohort were restricted to that group. The AS patients’ autoantibody responses were targeted towards connective, skeletal and muscular tissue, unlike those of RA patients or healthy controls. Thus, patients with AS show evidence of systemic humoral autoimmunity and multispecific autoantibody production. Nucleic acid programmable protein arrays constitute a powerful tool to study autoimmune diseases.




technology

The ProteoRed MIAPE web toolkit: A user-friendly framework to connect and share proteomics standards [Technology]

The development of the HUPO Proteomics Standards Initiative’s (PSI) standard data formats and MIAPE (Minimum Information About a Proteomics Experiment) guidelines should improve proteomics data sharing within the scientific community. Proteomics journals have encouraged the use of these standards and guidelines to improve the quality of experimental reporting and ease the evaluation and publication of manuscripts. However, there is an evident lack of bioinformatics tools specifically designed to create and edit standard file formats and reports, or to embed them within proteomics workflows. In this article, we describe a new web-based software suite (the ProteoRed MIAPE web toolkit) that performs several complementary roles related to proteomics data standards. Firstly, it can verify that reports fulfill the minimum information requirements of the corresponding MIAPE modules, highlighting inconsistencies or missing information. Secondly, the toolkit can convert several XML-based data standards directly into human-readable MIAPE reports stored within the ProteoRed MIAPE repository. Finally, it can also perform the reverse operation, allowing users to export MIAPE reports to XML files for computational processing, data sharing or public database submission. The toolkit is thus the first application capable of automatically linking the PSI’s MIAPE modules with the corresponding XML data exchange standards, enabling bidirectional conversions. The toolkit is freely available at http://www.proteored.org/MIAPE/.
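
The XML-to-report direction the toolkit automates can be pictured with a toy example. The element names below are hypothetical placeholders rather than the toolkit's API or a real PSI schema (actual formats such as mzIdentML are far richer):

```python
import xml.etree.ElementTree as ET

# A minimal, invented stand-in for a standard-format XML document.
SAMPLE_XML = """<experiment>
  <title>Plasma screen</title>
  <instrument>LTQ Orbitrap</instrument>
  <sample species="Homo sapiens"/>
</experiment>"""

def xml_to_report(xml_text):
    """Convert an XML experiment description into a human-readable,
    MIAPE-style report: one labelled line per captured field."""
    root = ET.fromstring(xml_text)
    lines = [f"Title: {root.findtext('title')}",
             f"Instrument: {root.findtext('instrument')}",
             f"Species: {root.find('sample').get('species')}"]
    return "\n".join(lines)

print(xml_to_report(SAMPLE_XML))
```

The reverse operation would walk a report's fields back into the corresponding XML elements, which is the bidirectional linking the abstract describes.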




technology

Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements [Technology]

As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that, with the increase in throughput, estimating protein quantities from the natively measured peptides has become a computational task. A limitation of existing computationally driven protein quantification methods is that most ignore protein variation, such as alternative splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy to current state-of-the-art methods at proteoform identification, with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.
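
The intuition behind proteoform-aware quantification can be sketched in a few lines. This is a deliberately simplified illustration, filtering peptides by correlation with the median profile, not the published Bayesian BP-Quant model; the peptide log2-ratio data are invented:

```python
import statistics

def protein_abundance(peptide_profiles, min_corr=0.5):
    """Estimate per-sample protein abundance (log2 scale) from peptide
    profiles, keeping only peptides that track the dominant pattern."""
    n = len(peptide_profiles[0])
    # Dominant pattern: the per-sample median across all peptides.
    median_profile = [statistics.median(p[i] for p in peptide_profiles)
                      for i in range(n)]

    def corr(a, b):
        # Pearson correlation, written out to keep the sketch dependency-free.
        ma, mb = statistics.mean(a), statistics.mean(b)
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = sum((x - ma) ** 2 for x in a) ** 0.5
        db = sum((y - mb) ** 2 for y in b) ** 0.5
        return num / (da * db) if da and db else 0.0

    # Discard peptides whose profile disagrees with the dominant pattern,
    # e.g. peptides unique to a differently regulated proteoform.
    kept = [p for p in peptide_profiles if corr(p, median_profile) >= min_corr]
    return [statistics.median(p[i] for p in kept) for i in range(n)]

# Three peptides across three samples; the third runs opposite to the
# other two and is excluded from the protein-level estimate.
print(protein_abundance([[1.0, 2.0, 3.0], [1.1, 2.1, 2.9], [3.0, 2.0, 1.0]]))
```

BP-Quant itself treats the departure from the dominant pattern probabilistically rather than with a hard correlation cutoff, but the effect on the protein-level estimate is the same in spirit.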




technology

Quantitative profiling of protein tyrosine kinases in human cancer cell lines by multiplexed parallel reaction monitoring assays [Technology]

Protein tyrosine kinases (PTKs) play key roles in cellular signal transduction, cell cycle regulation, cell division, and cell differentiation. Dysregulation of PTK-activated pathways, often by receptor overexpression, gene amplification, or genetic mutation, is a causal factor underlying numerous cancers. In this study, we have developed a parallel reaction monitoring (PRM)-based assay for quantitative profiling of 83 PTKs. The assay detects 308 proteotypic peptides from 54 receptor tyrosine kinases and 29 nonreceptor tyrosine kinases in a single run. Quantitative comparisons were based on the labeled reference peptide method. We implemented the assay in four cell models: 1) a comparison of proliferating versus epidermal growth factor (EGF)-stimulated A431 cells, 2) a comparison of SW480Null (mutant APC) and SW480APC (APC restored) colon tumor cell lines, 3) a comparison of 10 colorectal cancer cell lines with different genomic abnormalities, and 4) a comparison of lung cancer cell lines with either susceptibility (11-18) or acquired resistance (11-18R) to the epidermal growth factor receptor tyrosine kinase inhibitor erlotinib. We observed distinct PTK expression changes that were induced by stimuli, genomic features, or drug resistance, which were consistent with previous reports. However, most of the measured expression differences were novel observations. For example, acquired resistance to erlotinib in the 11-18 cell model was associated not only with previously reported upregulation of MET, but also with upregulation of FLK2 and downregulation of LYN and PTK7. Immunoblot analyses and shotgun proteomics data were highly consistent with PRM data. Multiplexed PRM assays provide a targeted, systems-level profiling approach to evaluate cancer-related proteotypes and adaptations. Data are available through ProteomeXchange, accession PXD002706.
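
The labeled reference peptide method mentioned in the abstract reduces to simple arithmetic: each endogenous peptide's peak area is divided by the area of a spiked-in, stable-isotope-labeled reference, and samples are compared through those normalized ratios. A minimal sketch with hypothetical peak areas (the numbers are invented for illustration):

```python
def relative_quant(endogenous_area, reference_area):
    """Labeled reference peptide method: normalize the endogenous peptide's
    chromatographic peak area to a co-eluting stable-isotope-labeled
    reference peptide spiked in at a fixed amount."""
    return endogenous_area / reference_area

def fold_change(sample_ratio, control_ratio):
    """Compare samples via their normalized ratios; run-to-run variation
    in instrument response cancels out."""
    return sample_ratio / control_ratio

# Hypothetical peak areas (arbitrary units) for one proteotypic peptide.
control = relative_quant(2.0e6, 1.0e6)  # normalized ratio 2.0
treated = relative_quant(6.0e6, 1.0e6)  # normalized ratio 6.0
print(fold_change(treated, control))    # prints 3.0
```

Because both samples are normalized to the same fixed reference, the fold change reflects the endogenous peptide alone, which is why the method supports comparisons across separate instrument runs.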




technology

Advance Science, Technology and Sophistication with SX-Aurora TSUBASA or Vector Processor or Vector Engine (VE)

Noritaka Hoshi, Senior Manager, AI Platform Division, talks about the impetus for, and challenges within, the development of SX-Aurora TSUBASA, a massive SIMD engine created to handle enormous computing and […]




technology

Leveraging Exascale Era Technology for Advanced Computer-Aided Engineering

How can manufacturers apply lessons learned from the dawn of the “Exascale Era” to Computer-Aided Engineering to achieve results like never before? Join Addison Snell, CEO, Intersect360 Research, Bill Mannel, VP, High Performance […]




technology

Gender in Science, Technology, Engineering, and Mathematics: Issues, Causes, Solutions

Tessa E.S. Charlesworth. Viewpoints, Sep 11, 2019; 39:7228-7243.




technology

Ask Smithsonian: How Does Night Vision Technology Work?

Who’s afraid of the dark? Our Ask Smithsonian host Eric Schulze is here to explain the illuminating science behind night vision.




technology

Canada launches AI watchdog to oversee the technology’s safe development and use

Amid rapid global advances and deployment of artificial intelligence technologies, the federal government has invested millions to combine the expertise of three existing institutes into one body that can keep an eye on potential dangers ahead.




technology

SolidWorks Software Helps Westfield Sportscars Combine New Technology And A Vintage Package

Engineers Use CAD To Adapt Standard Parts, Assemblies To Fit In Custom Body Design from 1950s




technology

SolidWorks Corporation offers new technology software grants for teachers

SolidWorks software donation supports fusion of science, technology, engineering, and math (STEM) in education




technology

SolidWorks Corporation unveils SolidWorks 2007, powered by revolutionary 'SWIFT' technology

Innovation introduces expert intelligence into 3D CAD software




technology

Syncroness uses COSMOSMotion analysis technology to design high-quality, complex mechanisms in less time at lower cost

Cutting days and weeks from design cycle gives Syncroness more time to focus on designing lightweight, high performance products




technology

SOLIDWORKS' SWIFT wins IndustryWeek Technology of the Year award

Breakthrough technology puts expert design techniques in every user's hands




technology

Washington State schools make technology education much more than 'shop class'

Schools combine SOLIDWORKS 3D CAD software with 3D scanning and printing to enrich student learning




technology

domo gives emergency responders eyes on the spot with video technology designed in SOLIDWORKS software

U.K. company cut weeks out of design time, eliminated prototyping, and increased innovation with SOLIDWORKS and CircuitWorks




technology

SyCSA increases production 200 percent by embracing 3D design technology

Company can design a silo in half an hour and deliver it earlier while helping improve the quality of Mexico's food, plastics, and construction practices




technology

SOLIDWORKS Labs more than doubles its emerging technology offerings

Nearly 100,000 visitors experimented with forward-looking tools and services in first year




technology

Pittsburg State Engineering Technology Students Learn Faster, Easier with 1,000 New Licenses of SolidWorks software

Off-Campus, Around-the-Clock Access and Integrated Simulation are Big Advantages for Students




technology

SolidWorks Users Provided Intelligent Technology That Helped Save Chilean Miners

Equipment Designed and Modified With SolidWorks Software Located Miners and Helped Drill Escape Shaft Weeks Sooner Than Predicted




technology

Using new technology to share the gospel

When it comes to reaching the least-reached, OM workers are using new technology to make ministry more effective—one byte at a time.




technology

Educational Technology: What's Behind the Hype?

While laptops and videos can make the classroom fun and interactive, how much does technology really improve achievement?




technology

How Teachers Can and Should Use Technology in the Classroom

Integrating technology requires a significant investment of time and money, but the resources are well-spent if the focus is improving instruction, writes educational consultant Matthew Lynch.




technology

Project aims to build strong manufacturing workforce with immersive technology

The Richard King Mellon Foundation recently awarded $392,000 to Penn State to build a strong science- and technology-focused workforce in the state’s Mon Valley region through collaboration and virtual, augmented and mixed reality trainings and tools.




technology

Teaching and Learning with Technology announces 2024-2026 Faculty Fellows cohort

Penn State’s Teaching and Learning with Technology (TLT), part of Penn State University Libraries, has welcomed five University instructors across three campuses into its Faculty Fellows cohort for 2024-26 and started collaborating on a new collection of projects. Faculty chosen for the Faculty Fellows program team up with TLT on innovative technology projects, with past endeavors spanning a broad spectrum from learning spaces to virtual reality/immersive experiences to data-empowered learning.




technology

Common Assessments a Test for Schools' Technology

As the two big groups of states craft common-assessment systems, experts warn that the smallest details could undermine their work.




technology

Penn State DuBois Wildlife Technology Program achieves reaccreditation

Recently, the Wildlife Technology Program at Penn State DuBois earned reaccreditation with the North American Wildlife Technology Association for five years.




technology

Transformational technology

OM workers in Central Asia use technology to develop new discipleship and worship tools for local believers.