techno

Park Aerospace Technologies, Melville, NY, United States

Park manufactures laminates, prepregs, and RF/microwave PTFE substrate materials for the global PCB (printed circuit board) electron... Linda Legneese, Director of Human Resources and Risk Management, Human Resources, Melville, NY, United States





techno

Communications Applied Technology, Reston, VA, United States

Radio interoperability gateway systems, wired intercoms, and wireless intercoms for military and public-safety first responders co... Seth Leyman, President, Management, Reston, VA, United States





techno

Lamp Technology, Bohemia, NY, United States

Lamp Technology Inc... Edith Reuter, Chairman, Management, Bohemia, NY, United States





techno

BD-Micro Technologies, Siletz, OR, United States

BD-Micro Technologies: The BD-5 People... Siletz, OR, United States






techno

Starship Technologies: bringing autonomous food delivery to even more customers in Tallinn

 Starship Technologies and Bolt have announced the launch of a new food delivery service.




techno

America’s commitment to technological innovation is at a crossroads

One of America’s enduring strengths has been its long embrace of technological innovation. From the widespread adoption of groundbreaking technologies such as the automobile and airplane, to the invention of common household appliances such as the dishwasher and microwave, America has never lost sight of technology’s critical role in driving economic development and societal progress.




techno

Inland Northwest tribes are using technology to track young salmon in hopes of returning runs to the Columbia and Spokane rivers

It starts raining just as two trucks hauling juvenile salmon arrive near the end of a gravel road at Chief Joseph Dam in the Central Washington town of Bridgeport on Friday, May 6…




techno

Tycho's Scott Hansen looks for balance between nature and technology on Infinite Health

For Tycho's Scott Hansen, it all comes back to water…




techno

Coventry conference demonstrates the business benefits of digital technology

Businesses in the region are being encouraged to find out more about the importance of the growing digital economy at a free event taking place at Coventry University in January.




techno

Webiatorstechnology 1.0 by Webiatorstechnology

Established in 2017, Webiators is dedicated to meeting the digital requirements of individuals with ...




techno

Rethinking energy storage technology as our need for battery power grows

How can we meet the increased demand for the materials needed to build batteries, while keeping the environmental and human costs of resource extraction low?




techno

CONTACT Open World: Technology leaders showcase best practices for digital transformation

Numerous new developments in CONTACT’s Elements platform and innovative digitalisation strategies will take centre stage at this year’s Open World.




techno

Only 1 in 5 businesses are currently adopting AI technologies

A recent analysis of data from the ONS Business Insights report found that the number of UK businesses currently adopting AI technologies has increased by 5% since September of last year.




techno

Made Smarter powers SME manufacturers to invest £25m in technology

Made Smarter, the movement accelerating the digital transformation of SME manufacturers, recently reached a major milestone - backing North West companies to invest £25m in new technologies.




techno

Retail payroll teams struggling with seasonal hiring, but too few are leveraging technology to alleviate the burden

With the holiday season fast approaching, retail payroll teams around the world are bracing for the strain of seasonal hiring.




techno

Column: Bermuda’s Stone Age Technology

[Opinion column written by Dr Edward Harris ] The Stone Age generally ended some five thousand years ago with the invention of forging tools in iron. Prior to that, implements were made of stone and probably of timber, although the latter is less obvious in the archaeological record, as it tends to rot. Sadly perhaps, […]




techno

Should You Buy This Millionaire-Maker Stock Instead of Palantir Technologies?

There may not be a hotter stock on Wall Street right now than Palantir Technologies. The company's artificial intelligence (AI) software is revving its growth engine, and there is a wide-open market opportunity in both government and the private sector. The stock is up approximately 800% since the…




techno

Technology changing lives: how technology can support the goals of the Care Act

Social Care Institute for Excellence (SCIE) Report 73, from an SCIE roundtable discussion held on 26 March 2015. This report considers the potential of technology to transform how health and social care services are delivered.




techno

Too Much Magic: Wishful Thinking, Technology, and the Fate of the Nation

Paperback, Hardcover, Kindle, Audiobook – July 9, 2013




techno

Princeton archaeologists are using cutting-edge digital technologies to help reveal the ancient past

In the field, digital technology saves immense amounts of time and limits fruitless digging. In the classroom, VR recreations help bring the past to life.




techno

Mapping brain function, safer autonomous vehicles are focus of Schmidt Transformative Technology fund

Two projects — one that maps the function of the brain’s neuronal network in unprecedented detail and another that combines robotics and light-based computer circuits to create safe self-driving vehicles — have been awarded funding through Princeton’s Eric and Wendy Schmidt Transformative Technology Fund.




techno

How Modern Technology has Changed the Way we Listen to Music?

From the moment the phonograph was invented, things began to improve, so that today, for very little money, we can have music and even find out the name of the song we are currently listening to with one click. Modern technology has changed the way we listen to music. Music has never been more ...





techno

Technology, Humanity, and the Existential Test

I’m still digging through some of the pieces I posted at the now defunct NewCo Shift, and found this piece, adapted from a talk I gave at the Thrival Humans X Tech conference in Pittsburgh back in September of 2018. I was alarmed by trends that I saw intensifying – a push by the tech …




techno

Next steps for EU-US cooperation on trade and technology

8 December 2022 — 3:00PM to 4:00PM, Online

How can the EU and US increase cooperation on AI, semiconductors and funding information communication technology services?

On trade and technology policy, the EU and the US are making meaningful progress towards cooperation while at the same time navigating tensions. As senior officials meet on 5 December for the third meeting of the Trade and Technology Council (TTC), both sides have vowed to move towards concrete results. But can the US and EU increase cooperation on artificial intelligence, semiconductors, and funding information communication technology services? 

This event draws on insights from a forthcoming Briefing Paper by Marianne Schneider-Petsinger that explores next steps for US-EU cooperation on trade and technology, which is part of a project supported by the Hanns Seidel Foundation. 




techno

Why technology does not make easy wars

Interview, 28 November 2022

Stephanie Carvin explains why technology does not overcome the challenges of war.

The invasion of Ukraine has demonstrated that many of the assumptions held about the role of technology in contemporary warfare are flawed. The lesson that technology cannot overcome the challenges of warfare is one that the West has also yet to learn, despite a series of failed interventions since the end of the Cold War.

In a wide-ranging conversation, Isabel Muttreja sat down with Stephanie Carvin to talk about her contribution to the September 2022 issue of International Affairs on ‘how not to war’. They discuss the US’ over-reliance on technology and why ‘easy wars’ become ‘forever wars.’

You argue in your article that the US overly relies on technology in war. When did this start?

I don’t necessarily think the US is exceptional. I think all states have tried in some ways to use technologies. One of the key arguments in the article is that the US is an enlightenment country, and part of the enlightenment is a belief in rationality and science and that you can better things through the application of science.


I think there is this particular scientific approach, or embracing of technology, in the American and in fact the larger Western tradition: technology as a way to save lives. There is a strange humanitarian impulse that often underlies this use of technology.

We are seeing a quest to try and get perfect information. The idea is that if you have perfect information, you are going to be able to dominate the battlefield, and that’s proven itself to be false. I’m not even sure you can ever get perfect information.

But it underlines this modern approach, that if you can have all the information that’s out there, crunch it into some kind of algorithm, that you can then target discriminately, proportionately, reduce the level of casualties, and reduce the level of unnecessary damage. And that’s a kind of liberal tradition. You are trying to have your cake and eat it too.

You talk about the US being an ultimately liberal state, but they have been involved in a lot of wars over the last 10–20 years. Is that a contradiction?

I hope it is. But I think it goes back to the enlightenment nature of the United States, which is that the US sees itself as a shining city on a hill that has to protect itself at all costs. Liberals abhor tyranny, and they abhor unnecessary deaths. But I think that the idea is that if you threaten us, we see ourselves as embodying these values, therefore, we have to protect ourselves.

There’s a tendency to not really recognize the kind of insurgencies that we’ve seen in Iraq and Afghanistan, or even Vietnam, as war. We don’t really see that as a kind of armed conflict, even though, arguably, that has been the dominant mode of conflict for some time. They even used to call it ‘military operations other than warfare’. We tend to still think of war as great power competition or as the Second World War.


My first book was on prisoners of war in the American tradition. What often determined the treatment of people as prisoners of war was if the United States recognized their form of warfare. There’s a racial element here too that I don’t want to dismiss.

So, for example, the US war in the Philippines at the start of the 20th century: They went in, won a very quick victory over the Spanish and effectively took over the Philippines. And then they had a long insurgency for two years with the native Filipinos who didn’t want US domination. While they gave the Spanish all the prisoner of war rights, they didn’t give them to the Filipinos.

This is because they recognized the form of conflict that the Spanish engaged in, but the Indigenous way of warfare was not recognized. The West has struggled to culturally understand the way other people fight. And that’s when the laws of war conventions have broken down between, say, the United States, the West, and other states.

You talk in your article about the US entering ‘easy wars’ and ending up with ‘forever wars’ – what does this mean?

There’s an allure to this high-tech version of warfare, that it can solve a lot of problems, but it’s an illusion. It is ultimately a bit of a false promise.

The idea that machines are going to replace humans is fundamentally untrue. We are seeing this to a certain extent right now, even in the Russia/Ukraine war. This is very much a battle of machines and soldiers. One of the themes of this issue of International Affairs is hubris. The idea that things that appear to be quick wins often tend to be long-term losses. And that’s exactly what this article is talking about.

‘Forever wars’ is not my favourite term, but it’s this concept that what was promised to be an easy war, a high technology-driven conflict, where you can go in, use some surgically precise weapons, take care of the problem, eliminate your opponent and then extract yourself from a situation, has actually turned into a quagmire.


The limits of technology become apparent within a few months as well as the fact of the messy business of state-building, or the fact that insurgencies and political movements don’t just disintegrate at the show of some high-tech, sophisticated weaponry. It just tends to mean that these wars do go on for a long time, and you have to eventually extricate yourself, but there’s no clean way to do this. We saw this of course with Afghanistan, and to a large extent Iraq.

We get distracted by the shiny object. We see this promise, we see this vision of a kind of warfare that for some may have great appeal.  There are new super weapons, whether it be cyber information warfare or artificial intelligence. Everyone wants to be ahead of the curve, right?

Are these lessons on technology and ‘easy wars’ applicable to other countries?

I think what we’ve learned about the Russian military is that there’s a lot more at the heart of it. Part of the problem Russia is experiencing is that its capabilities were not what it thought they were. It’s clear that Vladimir Putin was enamoured with a lot of the ideas, like that the Russian military was increasingly high-tech and that they had these hypersonic missiles.

They also had very powerful cyber weapons amongst other things. Putin, too, seems to have been caught up in this idea that he could have had a 72-hour special military operation, which would have taken Kyiv. Clearly, that hasn’t happened. Once again, we see the underestimation of the human factor.




techno

A Multidimensional Chromatography Technology for In-depth Phosphoproteome Analysis

Claudio P. Albuquerque
Jul 1, 2008; 7:1389-1396
Research




techno

In conversation with James Manyika, Senior Vice President of Research, Technology and Society at Google

12 December 2024 — 11:15AM to 12:45PM, Chatham House and Online

A conversation on AI’s global, societal and economic impacts.

2024 has been a landmark year for Artificial Intelligence (AI) development, deployment and use, with significant progress in AI-driven science, governance and cooperation. Looking ahead, AI continues to demonstrate economic promise and potential to expand on scientific breakthroughs in areas such as climate and health. This wave of innovation is occurring against a backdrop of geopolitical uncertainty and not all countries are fully able to participate. Heading into 2025, there are urgent questions about how best to maximise shared opportunities when it comes to AI and to advance global cooperation.

James Manyika, Senior Vice President of Research, Technology & Society at Google, will unpack what 2025 will bring for AI in science, economics, global governance and international cooperation. 

Key questions include:

  • What will be AI’s global societal and economic impact in 2025 and beyond? 
  • What are the ways AI could help increase economic growth and economy-wide productivity? What factors must be in place for this to happen?
  • How best can we maximise shared opportunities and advance global cooperation when it comes to AI? Where can public-private partnerships unlock scientific breakthroughs for societal progress, combatting shared global challenges such as climate change and global health issues?  
  • What are the principles of safe, responsible AI, and how should companies remain responsive to their evolution and integrate them into technology design and implementation? 
  • What is the current – and ideal – role of technology companies in emerging mechanisms for global cooperation and national governance on AI?

This event is being held in partnership with Google.

You will receive notice by 13:00 on Wednesday 11 December if you have been successful in securing an in-person place.

The institute occupies a position of respect and trust, and is committed to fostering inclusive dialogue at all events. Event attendees are expected to uphold this by adhering to our code of conduct.




techno

Global Trade Landscape Series 2018: Technological Transitions and the Future of Global Trade




techno

Dark Commerce: Technology’s Contribution to the Illegal Economy




techno

Is Technology Destroying Democracy?




techno

Gender Inequality: Making Technology the Solution, Not the Problem




techno

Is Technology Re-Engineering Humanity?




techno

Undercurrents: Bonus Episode - How Technology is Changing International Affairs




techno

Refugees and Technology: Panel Discussion




techno

Technology Diplomacy in the Digital Age




techno

Chatham House Commission on Democracy and Technology in Europe

News Release, 25 July 2019

Our project on Democracy and Technology in Europe is now entering its final phase, and we want your help in shaping the final report.




techno

Can global technology governance anticipate the future?

Expert comment, 27 April 2021

Trying to govern disruption is perilous as complex technology is increasingly embedded in societies and omnipresent in economic, social, and political activity.

Technology governance is beset by the challenges of how regulation can keep pace with rapid digital transformation, how governments can regulate in a context of deep knowledge asymmetry, and how policymakers can address the transnational nature of technology.

Keeping pace with, much less understanding, the implications of digital platforms and artificial intelligence for societies is increasingly challenging as technology becomes more sophisticated and yet more ubiquitous.

To overcome these obstacles, there is an urgent need to move towards a more anticipatory and inclusive model of technology governance. There are some signs of this in recent proposals by the European Union (EU) and the UK on the regulation of online harms.

Regulation failing to keep up

The speed of the digital revolution, further accelerated by the pandemic, has largely outstripped policymakers’ ability to provide appropriate frameworks to regulate and direct technology transformations.

Governments around the world face a ‘pacing problem’, a phenomenon described by Gary Marchant in 2011 as ‘the growing gap between the pace of science and technology and the lagging responsiveness of legal and ethical oversight that society relies on to govern emerging technologies’.


This ever-growing rift, Marchant argues, has been exacerbated by the increasing public appetite for and adoption of new technologies, as well as political inertia. As a result, legislation on emerging technologies risks being ineffective or out-of-date by the time it is implemented.

Effective regulation requires a thorough understanding of both the underlying technology design, processes and business model, and how current or new policy tools can be used to promote principles of good governance.

Artificial intelligence, for example, is penetrating all sectors of society and spanning multiple regulatory regimes without any regard for jurisdictional boundaries. As technology is increasingly developed and applied by the private sector rather than the state, officials often lack the technical expertise to adequately comprehend and act on emerging issues. This increases the risk of superficial regulation which fails to address the underlying structural causes of societal harms.

There is a significant knowledge gap between those who aim to regulate and those who design, develop and market technology, and it is prevalent in most technology-related domains, including powerful online platforms and providers such as Facebook, Twitter, Google and YouTube.

For example, the algorithms social media companies use in their business models to promote online content – harmful or otherwise – remain largely inaccessible to governments and researchers, so, to a crucial extent, the regulator is operating in the dark.

The transnational nature of technology also poses additional problems for effective governance. Digital technologies intensify the gathering, harvesting, and transfer of data across borders, challenging administrative boundaries both domestically and internationally.

While there have been some efforts at the international level to coordinate approaches to the regulation of – for example – artificial intelligence (AI) and online content governance, more work is needed to promote global regulatory alignment, including on cross-border data flows and antitrust.

Reactive national legislative approaches are often based on targeted interventions in specific policy areas, and so risk failing to address the scale, complexity, and transnational nature of socio-technological challenges. Greater attention needs to be placed on how regulatory functions and policy tools should evolve to effectively govern technology, requiring a shift from a reactionary and rigid framework to a more anticipatory and adaptive model of governance.

Holistic and systemic versus mechanistic and linear

Some recent proposals for technology governance may offer potential solutions. The EU publication of a series of interlinked regulatory proposals – the Digital Services Act, Digital Markets Act and European Democracy Action Plan – integrates several novel and anticipatory features.

The EU package recognizes that the solutions to online harms such as disinformation, hate speech, and extremism lie in a holistic approach which draws on a range of disciplines, such as international human rights law, competition law, e-commerce, and behavioural science.


It consists of a combination of light touch regulation – such as codes of conduct – and hard law requirements such as transparency obligations. Codes of conduct provide flexibility as to how requirements are achieved by digital platforms, and can be updated and tweaked relatively easily, enabling regulation to keep pace as technology evolves.

As with the EU Digital Services Act, the UK’s recent proposals for an online safety bill are innovative in adopting a ‘systems-based’ approach which broadly focuses on the procedures and policies of technology companies rather than the substance of online content.

This means the proposals can be adapted to different types of content, and differentiated according to the size and reach of the technology company concerned. This ‘co-regulatory’ model recognizes the evolving nature of digital ecosystems and the ongoing responsibilities of the companies concerned. The forthcoming UK draft legislation will also be complemented by a ‘Safety by Design’ framework, which is forward-looking in focusing on responsible product design.

By tackling the complexity and unpredictability of technology governance through holistic and systemic approaches rather than mechanistic and linear ones, the UK and EU proposals represent an important pivot from reactive to anticipatory digital governance.

Both sets of proposals were also the result of extensive multistakeholder engagement, including between policy officials and technology actors. This engagement broke down silos within the technical and policy/legal communities and helped bridge the knowledge gap between dominant technology companies and policymakers, facilitating a more agile, inclusive, and pragmatic regulatory approach.

Coherence rather than fragmentation

Anticipatory governance also recognizes the need for new coalitions to promote regulatory coherence rather than fragmentation at the international level. The EU has been pushing for greater transatlantic engagement on regulation of the digital space, and the UK – as holder of the G7 presidency in 2021 – aims to work with democratic allies to forge a coherent response to online harms.

Meanwhile the OECD’s AI Policy Observatory enables member states to share best practice on the regulation of AI, and an increasing number of states such as France, Norway, and the UK are using ‘regulatory sandboxes’ to test and build AI or personal data systems that meet privacy standards.

Not all states currently have the organizational capacity and institutional depth to design and deliver regulatory schemes of this nature, as well as the resource-intensive consultation processes which often accompany them.

So, as an increasing number of states ponder how to ‘futureproof’ their regulation of tomorrow’s technology – whether 6G, quantum computing or biotechnology – there is a need for capacity building in governments both on the theory of anticipatory governance and on how it can be applied in practice to global technology regulation.




techno

Flexible Distribution Systems: New Services, Actors and Technologies

4 September 2018 — 9:00AM to 10:30AM, Chatham House, London

The pace of the energy transition is accelerating. Solar and wind are dramatically falling in cost and displacing fossil fuel generators. Simultaneously, the rapid uptake of electric vehicles and battery storage systems is beginning to send shockwaves through the electricity sector.

As the proportion of distributed energy resources (DERs) connected to the distribution network grows, a significant opportunity is beginning to present itself. What if the concerns about renewable integration and its associated costs could be solved by the smart integration of these DERs?

By properly valuing the services DERs can provide, actively managing the distribution system and creating new marketplaces, might a truly renewable electricity system capable of supporting the electrification of heat and transport be possible?

During this roundtable, Andrew Scobie, CEO of Faraday Grid, will provide an overview of the challenges and opportunities faced within the distribution network and explain why the current system is no longer fit for purpose.

This is the inaugural event in the Energy Transitions Roundtable (ETR) series.




techno

Decarbonizing Heat: A New Frontier for Technologies and Business Models

27 February 2019 — 8:15AM to 9:45AM, Chatham House | 10 St James's Square | London | SW1Y 4LE

Building space and water heating accounts for over 35 percent of global energy consumption – nearly double that of transport. However, there has been limited progress in decarbonizing the sector to date. International cooperation is required to ensure harmonized policies drive low-carbon heating technologies down the cost curve, to the point that low-carbon heating is cost-competitive and affordable. The initial presentations and discussion focus on:

  • Demand reduction technologies and policies that speed up transformation of the sector.
  • The different challenges for energy efficiency of retrofitting as opposed to new build.
  • The impact of electrification on GHG emissions and the power sector.
  • The comparative role of national and city level initiatives.

The meeting concludes by looking at the challenges and risks in accelerating the transformation of heating and the lessons that can be learned from other sectors.




techno

LEGO Classic Space: the robots' final rebellion on the capital planet ousts the federation rule and replaces it with a techno republic and dictatorship (the final episode (for a while)) (AFOL toy hobby photography with droids, police and minifigures city MOC)

dannyhennesy posted a photo:

On the Capital planet the rebellious droids had followed mainly the Bat-Bot, but as time progressed his circuits had gone all mushy at 780 years or so without maintenance…

Several splinter groups all with their local bot leaders emerged such as the Che-bot, the traffic-light-robot and the Butt-bot, but none of these collected enough sentient circuits to call themselves a popular (or Animata) mass movement!

That was until a cyborg came along, one known as Jones, a long-time prisoner and terrorist; his easy solutions to every problem rang well in the masses' auditory circuits!!!

His slogans and simple rhetoric were simple enough for the simple traffic-light to comprehend and cheer!

His language was full of hate towards the organics and especially the humans who were the most common races among the ruling class of the federation!!!

Despite being a “Fleshie” himself, his message rallied the angry enslaved bot community; within only weeks all rebellious robots except for a few fringe loonies had forgotten the old leaders…

One morning Jones gave the signal…

All over the capital planet hordes and swarms of any form of mechanical sentient beings attacked first the police stations, then the Company boards running the planet and the federation as well as their starfleet…

Many died, especially the low-level police and army! Many mechanicals died too, but their ranks were soon filled by Mutant fleshie allies from the lower levels, who hated the Federation's feudal society and upper classes as much as their technological allies did…

The Federation state apparatus and ruling class, and most of their fleet and army, fled when they knew the game was up. They activated the emergency escape plan, and whole city blocks with important factories, administrative units, valuable assets and so on separated from the capital on hidden rocket engines and set their course for Mars…

On Mars the federation regrouped and formed their new society…

On the Capital planet, the robots proclaimed the first Techno-republic of the advanced inorganic civilization, the low level fleshies left behind, became slaves and their mutant allies got to rule their own minute chiefdoms as protectorates under the Techno-republic…

Jones was now the undisputed ruler of the capital planet, but the victory was a Pyrrhic one, since all the important buildings, everything of value, was now on Mars!

But as Jones put it:

Our proud race, the Techno-species, didn't need the Fleshies' administration, their infrastructure, their spaceships…

We shall start from scratch, with a new administration, a new order; every droid shall work at 4x the speed they did under human oppression, since now we are free, and the fleshies shall work twice as hard as the Techno-Race, until we have bred enough new fleshies that they can do all the work!

Our future is bright and shiny, like glistening metal!

The snapshot seen here is from the first police station attacked, in sector 45-34v-ss-g, the first one to fall according to official techno-history!

———————————————/
Designer's note:

I am sad to say that this is the last episode in this years-spanning space series… at least for a while. I will still post LEGO hobby stuff here, but without a storyline; perhaps small designs and builds, and occasionally a story when I feel like it!!!

I would like to thank all who have been on this journey with our heroes, but it has taken far too much time and effort, and since the state of the world is as it is, I am spiraling down into another depression. I must stop it before I reach the abyss, so I have removed some stress from my equation… I ended it on a cliffhanger so I can easily restart it when my mental health improves… I hope that won't be forever???

I would love it if someone used my characters or ideas; please send me a link if you do, I would love to read it or look at it!!!

But there will be more LEGO, just in a different format, without long stories. I need to focus more on my art, and to be honest, that is the only time the mental pain eases: when I create!!!


Peace and Noise!

MushroomBrain a FOL




techno

Fast Quantitative Analysis of timsTOF PASEF Data with MSFragger and IonQuant [Technological Innovation and Resources]

Ion mobility brings an additional dimension of separation to LC–MS, improving identification of peptides and proteins in complex mixtures. A recently introduced timsTOF mass spectrometer (Bruker) couples trapped ion mobility separation to TOF mass analysis. With the parallel accumulation serial fragmentation (PASEF) method, the timsTOF platform achieves promising results, yet analysis of the data generated on this platform represents a major bottleneck. Currently, MaxQuant and PEAKS are the tools most commonly used to analyze these data. However, because of the high complexity of timsTOF PASEF data, both require substantial time to perform even standard tryptic searches. Advanced searches (e.g. with many variable modifications, semi- or non-enzymatic searches, or open searches for post-translational modification discovery) are practically impossible. We have extended our fast peptide identification tool MSFragger to support timsTOF PASEF data, and developed a label-free quantification tool, IonQuant, for fast and accurate 4-D feature extraction and quantification. Using a HeLa data set published by Meier et al. (2018), we demonstrate that MSFragger identifies significantly (~30%) more unique peptides than MaxQuant (1.6.10.43), and performs comparably or better than PEAKS X+ (~10% more peptides). IonQuant outperforms both in terms of number of quantified proteins while maintaining good quantification precision and accuracy. Runtime tests show that MSFragger and IonQuant can fully process a typical two-hour PASEF run in under 70 min on a typical desktop (6 CPU cores, 32 GB RAM), significantly faster than other tools. Finally, through semi-enzymatic searching, we significantly increase the number of identified peptides. Within these semi-tryptic identifications, we report evidence of gas-phase fragmentation before MS/MS analysis.
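Fast database search tools of the kind described above typically rely on a precomputed fragment-ion index: theoretical fragment masses are binned once, so each observed peak can be looked up in constant time. The toy sketch below illustrates that idea only; it is not MSFragger's actual algorithm, and the residue-mass table is abbreviated for the example.

```python
# Toy fragment-ion index: bin theoretical b/y-ion masses so that matching
# observed peaks to candidate peptides is a hash lookup per peak.
# Illustrative sketch only, not MSFragger's implementation.
from collections import defaultdict

# Monoisotopic residue masses (Da) for a few amino acids.
AA = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
      "V": 99.06841, "L": 113.08406, "K": 128.09496, "R": 156.10111}
WATER, PROTON = 18.01056, 1.00728

def fragment_masses(peptide):
    """Singly charged b- and y-ion m/z values for a peptide."""
    masses = []
    for i in range(1, len(peptide)):
        b = sum(AA[a] for a in peptide[:i]) + PROTON
        y = sum(AA[a] for a in peptide[i:]) + WATER + PROTON
        masses.extend([b, y])
    return masses

def build_index(peptides, bin_width=0.02):
    """Bin every theoretical fragment mass; one bin maps to candidate peptides."""
    index = defaultdict(set)
    for pep in peptides:
        for m in fragment_masses(pep):
            index[round(m / bin_width)].add(pep)
    return index

def score(index, peaks, bin_width=0.02):
    """Count matched fragments per candidate peptide for a list of peaks."""
    counts = defaultdict(int)
    for mz in peaks:
        for pep in index[round(mz / bin_width)]:
            counts[pep] += 1
    return dict(counts)

index = build_index(["PAKVL", "GASPK", "VLKAR"])
hits = score(index, fragment_masses("PAKVL"))
```

A length-5 peptide has four cleavage sites, hence eight b/y fragments, all of which match their own peptide in this toy example.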




techno

Open Database Searching Enables the Identification and Comparison of Bacterial Glycoproteomes without Defining Glycan Compositions Prior to Searching [Technological Innovation and Resources]

Mass spectrometry has become an indispensable tool for the characterization of glycosylation across biological systems. Our ability to generate rich fragmentation of glycopeptides has dramatically improved over the last decade, yet our informatic approaches still lag behind. Although glycoproteomic informatics approaches using glycan databases have attracted considerable attention, database-independent approaches have not. This has significantly limited high throughput studies of unusual or atypical glycosylation events such as those observed in bacteria. As such, computational approaches to examine bacterial glycosylation and identify chemically diverse glycans are desperately needed. Here we describe the use of wide-tolerance (up to 2000 Da) open searching as a means to rapidly examine bacterial glycoproteomes. We benchmarked this approach using N-linked glycopeptides of Campylobacter fetus subsp. fetus as well as O-linked glycopeptides of Acinetobacter baumannii and Burkholderia cenocepacia, revealing that glycopeptides modified with a range of glycans can be readily identified without defining the glycan masses before database searching. Using this approach, we demonstrate how wide tolerance searching can be used to compare glycan use across bacterial species by examining the glycoproteomes of eight Burkholderia species (B. pseudomallei, B. multivorans, B. dolosa, B. humptydooensis, B. ubonensis, B. anthina, B. diffusa, B. pseudomultivorans). Finally, we demonstrate how open searching enables the identification of low frequency glycoforms based on shared modified peptide sequences. Combined, these results show that open searching is a robust computational approach for the determination of glycan diversity within bacterial proteomes.
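The core of the open-search idea above can be shown in a few lines: allow a very wide precursor tolerance and read the precursor-minus-peptide mass difference as a candidate modification (e.g. glycan) mass. The masses below are hypothetical example values chosen to approximate common monosaccharide deltas, not data from the study.

```python
# Minimal sketch of wide-tolerance open searching: collect precursor-vs-peptide
# mass offsets and let recurring offsets suggest glycan masses.
# Illustrative only; numbers are made-up example spectra.
from collections import Counter

def open_search_deltas(precursor_masses, peptide_mass, tol=2000.0):
    """Collect precursor-minus-peptide mass offsets within +/- tol Da."""
    deltas = []
    for pm in precursor_masses:
        d = pm - peptide_mass
        if -tol <= d <= tol:
            deltas.append(round(d, 2))
    return Counter(deltas)

peptide = 1000.50  # hypothetical peptide monoisotopic mass
# Hypothetical precursors: unmodified peptide plus two glycoforms.
observed = [1000.50, 1203.58, 1203.58, 1365.63]
deltas = open_search_deltas(observed, peptide)
# +203.08 Da is close to HexNAc; +365.13 Da is close to HexNAc+Hex.
```

Recurring offsets across many spectra are what make the inferred glycan masses credible without a predefined glycan database.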




techno

Molecular Dynamics Simulation-assisted Ionic Liquid Screening for Deep Coverage Proteome Analysis [Technological Innovation and Resources]

In-depth coverage of proteomic analysis could enhance our understanding of the mechanisms of protein function. Unfortunately, many highly hydrophobic proteins and low-abundance proteins, which play critical roles in signaling networks, are easily lost during sample preparation, mainly because very few extractants can simultaneously satisfy the requirements of strong solubilizing ability for membrane proteins and good enzyme compatibility. Thus, it is urgent to screen out ideal extractants from huge compound libraries in a fast and effective way. Herein, by investigating the underlying mechanism of extractants in membrane protein solubilization and trypsin compatibility, a molecular dynamics simulation system was established as a complement to the experimental procedure to narrow down the scope of candidates for proteomics analysis. The simulation data show that the van der Waals interaction between the cation group of the ionic liquid and the membrane protein is the dominant factor in determining protein solubilization. In combination with the experimental data, 1-dodecyl-3-methylimidazolium chloride (C12Im-Cl) is shortlisted as a suitable candidate from comprehensive aspects. Inspired by the advantages of C12Im-Cl, an ionic liquid-based filter-aided sample preparation (i-FASP) method was developed. Using this strategy, over 3,300 proteins were confidently identified from 10³ HeLa cells (~100 ng of protein) in a single run, an improvement of 53% over the conventional FASP method. The i-FASP method was then successfully applied to the label-free relative quantitation of human liver cancer and para-carcinoma tissues, with clearly improved accuracy, reproducibility and coverage compared with the commonly used urea-based FASP method. The above results demonstrate that the i-FASP method can serve as a versatile tool for in-depth coverage proteomic analysis of biological samples.
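The van der Waals term the simulations weigh up is conventionally modeled in MD force fields by the 12-6 Lennard-Jones potential. The sketch below evaluates that standard functional form with arbitrary illustrative parameters, not the actual force-field parameters used in the study.

```python
# 12-6 Lennard-Jones potential, the usual van der Waals term in MD force
# fields: U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6).
# epsilon/sigma values here are arbitrary illustrative numbers.

def lennard_jones(r, epsilon=0.5, sigma=3.5):
    """Pairwise vdW energy at distance r (units are whatever eps/sigma use)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# The potential is zero at r = sigma and has its minimum -epsilon
# at r = 2**(1/6) * sigma.
u_zero = lennard_jones(3.5)
u_min = lennard_jones(2 ** (1 / 6) * 3.5)
```

Summing this term over atom pairs between the ionic-liquid cation and the membrane protein is what yields the interaction energies that simulations like these rank extractant candidates by.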




techno

MSstatsTMT: Statistical Detection of Differentially Abundant Proteins in Experiments with Isobaric Labeling and Multiple Mixtures [Technological Innovation and Resources]

Tandem mass tag (TMT) is a multiplexing technology widely used in proteomic research. It enables relative quantification of proteins from multiple biological samples in a single MS run with high efficiency and high throughput. However, experiments often require more biological replicates or conditions than can be accommodated by a single run, and involve multiple TMT mixtures and multiple runs. Such larger-scale experiments combine sources of biological and technical variation in patterns that are complex, unique to TMT-based workflows, and challenging for the downstream statistical analysis. These patterns cannot be adequately characterized by statistical methods designed for other technologies, such as label-free proteomics or transcriptomics. This manuscript proposes a general statistical approach for relative protein quantification in MS-based experiments with TMT labeling. It is applicable to experiments with multiple conditions, multiple biological replicate runs and multiple technical replicate runs, and unbalanced designs. It is based on a flexible family of linear mixed-effects models that handle complex patterns of technical artifacts and missing values. The approach is implemented in MSstatsTMT, a freely available open-source R/Bioconductor package compatible with data processing tools such as Proteome Discoverer, MaxQuant, OpenMS, and SpectroMine. Evaluation on a controlled mixture, simulated datasets, and three biological investigations with diverse designs demonstrated that MSstatsTMT balanced the sensitivity and the specificity of detecting differentially abundant proteins in large-scale experiments with multiple biological mixtures.
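The key intuition behind modeling runs and mixtures explicitly is that each TMT run carries its own technical shift that must be separated from the condition effect. The Python toy below is a deliberately crude stand-in for the linear mixed-effects models MSstatsTMT fits in R: it centers each run and then contrasts condition means for a single protein. All run, channel and condition names are hypothetical.

```python
# Crude illustration of why run effects matter in multi-run TMT designs:
# center each run's log2 intensities, then contrast conditions.
# This is NOT MSstatsTMT's model, just the underlying intuition.
from statistics import mean, median

def normalize_runs(data):
    """Center each run at its median log2 intensity (removes run-level shifts)."""
    out = {}
    for run, channels in data.items():
        m = median(channels.values())
        out[run] = {ch: v - m for ch, v in channels.items()}
    return out

def condition_log2fc(data, design, cond_a, cond_b):
    """Pool normalized values per condition across runs, return the contrast."""
    norm = normalize_runs(data)
    vals = {cond_a: [], cond_b: []}
    for run, channels in norm.items():
        for ch, v in channels.items():
            cond = design[run].get(ch)
            if cond in vals:
                vals[cond].append(v)
    return mean(vals[cond_a]) - mean(vals[cond_b])

# Hypothetical log2 intensities for one protein: two runs, run2 shifted by +1.
data = {"run1": {"126": 10.0, "127": 9.0},
        "run2": {"126": 11.0, "127": 10.0}}
design = {"run1": {"126": "disease", "127": "control"},
          "run2": {"126": "disease", "127": "control"}}
fc = condition_log2fc(data, design, "disease", "control")
```

A mixed-effects model generalizes this: instead of hard median-centering, run and mixture enter as model terms, which also yields proper standard errors and handles missing values and unbalanced designs.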




techno

OpenPepXL: An Open-Source Tool for Sensitive Identification of Cross-Linked Peptides in XL-MS [Technological Innovation and Resources]

Cross-linking MS (XL-MS) has been recognized as an effective source of information about protein structures and interactions. In contrast to regular peptide identification, XL-MS has to deal with a quadratic search space, where peptides from every protein could potentially be cross-linked to any other protein. To cope with this search space, most tools apply different heuristics for search space reduction. We introduce a new open-source XL-MS database search algorithm, OpenPepXL, which offers increased sensitivity compared with other tools. OpenPepXL searches the full search space of an XL-MS experiment without using heuristics to reduce it. Because of efficient data structures and built-in parallelization, OpenPepXL achieves excellent runtimes and can also be deployed on large compute clusters and cloud services while maintaining a slim memory footprint. We compared OpenPepXL to several other commonly used tools for identification of noncleavable labeled and label-free cross-linkers on a diverse set of XL-MS experiments. In our first comparison, we used a data set from a fraction of a cell lysate with a protein database of 128 targets and 128 decoys. At 5% FDR, OpenPepXL finds from 7% to over 50% more unique residue pairs (URPs) than other tools. On data sets with available high-resolution structures for cross-link validation, OpenPepXL reports from 7% to over 40% more structurally validated URPs than other tools. Additionally, we used a synthetic peptide data set that allows objective validation of cross-links without relying on structural information and found that OpenPepXL reports at least 12% more validated URPs than other tools. It has been built as part of the OpenMS suite of tools and supports Windows, macOS, and Linux operating systems. OpenPepXL also supports the MzIdentML 1.2 format for XL-MS identification results. It is freely available under a three-clause BSD license at https://openms.org/openpepxl.
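The quadratic search space mentioned above comes from precursor-mass matching: a cross-linked precursor must equal the sum of two peptide masses plus the linker mass. The brute-force sketch below enumerates that pair space for a toy example (the 138.068 Da linker delta approximates a DSS/BS3-type noncleavable linker); real tools like OpenPepXL use efficient data structures rather than this double loop, and the peptide names and masses here are hypothetical.

```python
# Toy version of the cross-link candidate problem: find peptide pairs whose
# combined mass plus the linker mass matches the precursor.
# Brute force over the quadratic pair space, for illustration only.

def crosslink_candidates(peptides, precursor_mass, linker_mass=138.068, tol=0.01):
    """peptides: list of (name, mass). Returns matching unordered pairs."""
    pairs = []
    for i, (name_a, mass_a) in enumerate(peptides):
        for name_b, mass_b in peptides[i:]:
            if abs(mass_a + mass_b + linker_mass - precursor_mass) <= tol:
                pairs.append((name_a, name_b))
    return pairs

peps = [("pepA", 500.25), ("pepB", 600.30), ("pepC", 700.35)]
hits = crosslink_candidates(peps, precursor_mass=1238.618)
```

With n peptides this loop is O(n²), which is exactly why searching the full space without reduction heuristics, as OpenPepXL does, demands careful engineering.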




techno

ProAlanase is an Effective Alternative to Trypsin for Proteomics Applications and Disulfide Bond Mapping [Technological Innovation and Resources]

Trypsin is the protease of choice in bottom-up proteomics. However, its application can be limited by the amino acid composition of target proteins and the pH of the digestion solution. In this study we characterize ProAlanase, a protease from the fungus Aspergillus niger that cleaves primarily on the C-terminal side of proline and alanine residues. ProAlanase achieves high proteolytic activity and specificity when digestion is carried out at acidic pH (1.5) for relatively short (2 h) time periods. To elucidate the potential of ProAlanase in proteomics applications, we conducted a series of investigations comprising comparative multi-enzymatic profiling of a human cell line proteome, histone PTM analysis, ancient bone protein identification, phosphosite mapping, de novo sequencing of a proline-rich protein, and disulfide bond mapping in a mAb. The results demonstrate that ProAlanase is highly suitable for proteomics analysis of the arginine- and lysine-rich histones, enabling high sequence coverage of multiple histone family members. It also facilitates efficient digestion of bone collagen thanks to cleavage at the C terminus of hydroxyproline, which is highly prevalent in collagen. This allows the identification of complementary proteins in ProAlanase- and trypsin-digested ancient bone samples, as well as increased sequence coverage of noncollagenous proteins. Moreover, digestion with ProAlanase improves protein sequence coverage and phosphosite localization for the proline-rich protein Notch3 intracellular domain (N3ICD). Furthermore, we achieve nearly complete coverage of the N3ICD protein by de novo sequencing using the combination of ProAlanase and tryptic peptides. Finally, we demonstrate that ProAlanase is efficient in disulfide bond mapping, showing high coverage of disulfide-containing regions in a nonreduced mAb.
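The cleavage rule described above (cutting C-terminal to proline and alanine) is easy to express as an in silico digestion, the kind of step a search engine performs when generating theoretical peptides. The sketch below is a simplified illustration: it ignores missed cleavages and modified residues such as hydroxyproline, and the example sequence is made up.

```python
# Simplified in silico digestion with C-terminal specificity for P and A,
# mirroring the described ProAlanase cleavage rule. No missed cleavages,
# no modified residues; illustrative only.

def digest(sequence, cleave_after=("P", "A"), min_len=1):
    """Split a protein sequence after every residue in cleave_after."""
    peptides, start = [], 0
    for i, aa in enumerate(sequence):
        if aa in cleave_after:
            peptides.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):
        peptides.append(sequence[start:])  # C-terminal remainder
    return [p for p in peptides if len(p) >= min_len]

peptides = digest("MKPGAVLAR")  # hypothetical sequence
```

Swapping `cleave_after` for `("K", "R")` (with a no-cleavage-before-proline rule added) would give the familiar tryptic behavior, which is how multi-enzyme comparisons like the one in this study are set up computationally.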




techno

Identification of Microorganisms by Liquid Chromatography-Mass Spectrometry (LC-MS1) and in Silico Peptide Mass Libraries [Technological Innovation and Resources]

Over the past decade, modern methods of mass spectrometry (MS) have emerged that allow reliable, fast and cost-effective identification of pathogenic microorganisms. Although MALDI-TOF MS has already revolutionized the way microorganisms are identified, recent years have also witnessed substantial progress in the development of liquid chromatography (LC)-MS based proteomics for microbiological applications. For example, LC-tandem MS (LC-MS2) has been proposed for microbial characterization by means of multiple discriminative peptides that enable identification at the species, or sometimes at the strain, level. However, such investigations can be laborious and time-consuming, especially if the experimental LC-MS2 data are tested against sequence databases covering a broad panel of different microbiological taxa. In this proof of concept study, we present an alternative bottom-up proteomics method for microbial identification. The proposed approach involves efficient extraction of proteins from cultivated microbial cells, digestion by trypsin and LC–MS measurements. Peptide masses are then extracted from MS1 data and systematically tested against an in silico library of all possible peptide mass data compiled in-house. The library has been computed from the UniProt Knowledgebase covering Swiss-Prot and TrEMBL databases and comprises more than 12,000 strain-specific in silico profiles, each containing tens of thousands of peptide mass entries. Identification analysis involves computation of score values derived from correlation coefficients between experimental and strain-specific in silico peptide mass profiles and compilation of score ranking lists. The taxonomic positions of the microbial samples are then determined using the best-matching database entries. The suggested method is computationally efficient, requiring less than 2 min per sample, and has been successfully validated with a test set of 39 LC-MS1 peak lists obtained from 19 different microbial pathogens. The proposed method is rapid, simple and automatable, and we foresee wide application potential for future microbiological applications.
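The matching step of such a workflow can be sketched compactly. Note the simplification: the study scores strains via correlation coefficients between experimental and in silico mass profiles, while the toy below ranks strains by the fraction of observed MS1 peptide masses found in each profile; strain names and masses are hypothetical, and real profiles hold tens of thousands of entries.

```python
# Toy MS1-library matching: rank strain profiles by the fraction of observed
# peptide masses they explain. Simplified stand-in for the correlation-based
# scoring described in the study; all data are hypothetical.
import bisect

def match_score(observed, profile, tol=0.01):
    """Fraction of observed masses found in a sorted strain profile (+/- tol)."""
    prof = sorted(profile)
    hits = 0
    for m in observed:
        i = bisect.bisect_left(prof, m - tol)
        if i < len(prof) and prof[i] <= m + tol:
            hits += 1
    return hits / len(observed)

def identify(observed, library):
    """Rank strains by score, best match first. library: strain -> mass list."""
    scores = {s: match_score(observed, p) for s, p in library.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

library = {"strain_A": [800.40, 900.50, 1000.60, 1100.70],
           "strain_B": [810.40, 905.50, 1000.60]}
ranking = identify([800.40, 1000.60, 1100.70], library)
```

Sorting each profile once and using binary search keeps the per-sample cost low, which is the kind of property that makes a sub-2-minute runtime per sample plausible even against 12,000+ profiles.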




techno

ReactomeGSA - Efficient Multi-Omics Comparative Pathway Analysis [Technological Innovation and Resources]

Pathway analyses are key methods for analyzing 'omics experiments. Nevertheless, integrating data from different 'omics technologies and different species still requires considerable bioinformatics knowledge.

Here we present the novel ReactomeGSA resource for comparative pathway analyses of multi-omics datasets. ReactomeGSA can be used through Reactome's existing web interface and the novel ReactomeGSA R Bioconductor package with explicit support for scRNA-seq data. Data from different species are automatically mapped to a common pathway space. Public data from ExpressionAtlas and Single Cell ExpressionAtlas can be directly integrated into the analysis. ReactomeGSA greatly reduces the technical barrier for multi-omics, cross-species, comparative pathway analyses.
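To make the "pathway analysis" step concrete, the sketch below shows a generic over-representation test (hypergeometric upper tail), one of the simplest gene set analysis approaches. This is not ReactomeGSA's implementation, which offers its own set of gene set analysis methods; it is just a minimal worked example of the statistical question such tools answer.

```python
# Generic pathway over-representation test: given `selected` genes of interest
# drawn from a `universe`, how surprising is it to hit `hits` members of a
# pathway of size `pathway_size`? (Hypergeometric upper tail; illustrative.)
from math import comb

def overrep_pvalue(hits, pathway_size, selected, universe):
    """P(X >= hits) under the hypergeometric null of random gene selection."""
    total = comb(universe, selected)
    p = 0.0
    for k in range(hits, min(pathway_size, selected) + 1):
        p += comb(pathway_size, k) * comb(universe - pathway_size, selected - k) / total
    return p

# Toy numbers: all 5 selected genes fall in a 5-gene pathway from a
# 10-gene universe.
p = overrep_pvalue(hits=5, pathway_size=5, selected=5, universe=10)
```

Mapping identifiers from each species and 'omics layer into one shared pathway space, as ReactomeGSA does, is what lets a single statistic like this be compared across datasets.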

We used ReactomeGSA to characterize the role of B cells in anti-tumor immunity. We compared B cell-rich and B cell-poor human cancer samples from five Cancer Genome Atlas (TCGA) transcriptomics studies and two Clinical Proteomic Tumor Analysis Consortium (CPTAC) proteomics studies. B cell-rich lung adenocarcinoma samples lacked the NF-kappaB activation otherwise present. This may be linked to the presence of a specific subset of tumor-associated IgG+ plasma cells that lack NF-kappaB activation in scRNA-seq data from human melanoma. This showcases how ReactomeGSA can derive novel biomedical insights by integrating large multi-omics datasets.




techno

Detection of multiple autoantibodies in patients with ankylosing spondylitis using nucleic acid programmable protein arrays [11. Microarrays/Combinatorics/Display Technology]

Ankylosing spondylitis (AS) is a common, inflammatory rheumatic disease, which primarily affects the axial skeleton and is associated with sacroiliitis, uveitis and enthesitis. Unlike other autoimmune rheumatic diseases, such as rheumatoid arthritis or systemic lupus erythematosus, autoantibodies have not yet been reported to be a feature of AS. We therefore wished to determine whether plasma from patients with AS contained autoantibodies and, if so, to characterize and quantify this response in comparison to patients with rheumatoid arthritis (RA) and healthy controls. Two high-density nucleic acid programmable protein arrays expressing a total of 3498 proteins were screened with plasma from 25 patients with AS, 17 with RA and 25 healthy controls. Autoantigens identified were subjected to Ingenuity Pathway Analysis in order to determine patterns of signalling cascades or tissue origin. 44% of patients with AS demonstrated a broad autoantibody response, as compared to 33% of patients with RA and only 8% of healthy controls. Individuals with AS demonstrated autoantibody responses to shared autoantigens, and 60% of autoantigens identified in the AS cohort were restricted to that group. The AS patients' autoantibody responses were targeted towards connective, skeletal and muscular tissue, unlike those of RA patients or healthy controls. Thus, patients with AS show evidence of systemic humoral autoimmunity and multispecific autoantibody production. Nucleic acid programmable protein arrays constitute a powerful tool to study autoimmune diseases.