gen

Fire chiefs hail £30m investment in ‘whole new level’ 999 emergency system

The technology will deliver enhanced day-to-day and major incident response capability




gen

RPG Cast – Episode 563: “Isn’t Compile Heart Jail Against the Geneva Convention?”

We can't believe we have to say this, but this week Chris tries to understand water vapor. Anna Marie takes a week off to go date some demons. Josh frequents a sex shop to pick up some weapons. Pascal starts vaping in his Xbox. And Robert and Kelley are testing the jiggle physics of sacks in World of Warcraft.

The post RPG Cast – Episode 563: “Isn’t Compile Heart Jail Against the Geneva Convention?” appeared first on RPGamer.



  • News
  • Podcasts
  • RPG Cast
  • Disgaea 1 Complete
  • Ghost of a Tale
  • The Legend of Heroes: Trails of Cold Steel IV
  • World of Warcraft: Battle for Azeroth
  • Yakuza: Like a Dragon


gen

One Direction Star Liam Payne Dead After Fall in Argentina

Marc Piasecki/GC Images via Getty Images

Liam Payne, a former member of the boy band One Direction, was found dead in a hotel courtyard in Buenos Aires, Argentina on Wednesday evening, according to CNN and La Nacion, both of which cited local police.

The singer died after an apparent fall from the balcony of his third-floor hotel room. Argentinian authorities told Good Morning America that Payne had been staying at the hotel CasaSur in the upscale neighborhood of Palermo.

He was 31.

Read more at The Daily Beast.




gen

Respawn have killed Apex Legends' Steam Deck support in the name of anti-cheat

The Steam Deck is something of a talisman for gaming on Linux, its popularity and penguin-powered SteamOS having almost singlehandedly dragged Linux past macOS as the second-most-used operating system among Steam users. Sadly, this also means the Valve handheld is the primary casualty when developers decide to stop bothering with Linux support, as Respawn Entertainment have decided to do for Apex Legends.

Read more




gen

Elder Scrolls: Legends has been removed from sale and will become unplayable in January 2025

The Elder Scrolls: Legends, the free-to-play card game set in Bethesda's fantasy world, has been removed from sale on Steam. Its servers will shut down for good on January 30th, 2025, after which it will no longer be playable. The closure comes five years after the game was last updated.

Read more




gen

Thysiastery is an anime Legend Of Grimrock, and you can attack the dinosaur merchants if you’re a complete monster

One of my lesser quality tests for an RPG is whether the shopkeepers complain at you for not buying anything. Grumpy shopkeepers, good RPG. This most specific of litmus tests has served me well, although I must admit that I’d happily upgrade it to ‘shopkeepers you can attack’, would that not disqualify 99% of games. But not turn-based dungeon crawler Thysiastery, it turns out. This “dungeon crawler RPG featuring traditional roguelike and turn-based gameplay” apparently trusts you enough to let you recklessly batter its friendly wandering lizard merchants. You’d be a monster for it, of course, but it’s nice to have options.

Read more




gen

Metal Slug Tactics, the surprising genre-twist of the classic run-and-gunner, is out now

I would never have predicted there'd be an isometric tactics game based on run-and-gun series Metal Slug, yet here Metal Slug Tactics is, and I am here for it. We've been following its development for a while but it's out now on Steam, and seemingly as strong as its demo suggested.

Read more




gen

Apex Legends is revisiting the past, but it should be prouder of its present

Nostalgia, when you think about it, is bollocks. There has never been a better time than right this second – averaged out, and despite repeated attempts to the contrary, humanity has never been healthier, freer, or more enlightened by knowledge. It’s true of games too. For every by-committee platter of passionless map markers, there are thousands of more personal, more creative, more interesting works, all adding to the decades' worth of great stuff we can still play today.

What isn’t bollocks is the emotional pull that nostalgia, for all its lack of cold, hard reason, still manages to wield inside our warm, squishy brains. Hence, the centrepiece of Apex Legends’ Season 23 update is a mode that recreates the battle royale FPS as it was back in 2019, defaulting back to the original map and weapon arsenal while cutting the 26-strong legend roster to the earliest ten. It’s a Fortnite-style rolling back of the clock, and a passably enjoyable one, but also a reminder that the good old days weren’t always that good.

Read more




gen

Sega sell off studio behind Endless Legend and Humankind as part of "restructuring" - but it goes to the original owners

Amplitude Studios, developers of many a game with "Endless" in the name, have split with publisher Sega to become independent again, with ownership of the studio reverting to its original founders and "other members of the team". The developers say everyone is parting "on good terms" and that the last eight years of getting published under Sega has been "amazing". But there are other businessy reasons, of course. Namely, Sega have been trying to trim down their European studios for the past year, and Amplitude is just the latest bunch of devs affected by that.

Read more




gen

Generative AI creates playable version of Doom game with no code

A neural network can recreate the classic computer game Doom despite using none of its code or graphics, hinting that generative AI could be used to create games from scratch in future




gen

How to avoid being fooled by AI-generated misinformation

Advances in generative AI mean fake images, videos, audio and bots are now everywhere. But studies have revealed the best ways to tell if something is real




gen

How to spot deepfakes and AI-generated images

It can be difficult to spot AI-generated videos known as deepfakes, but there are ways to tell if you know what to look for




gen

The AI expert who says artificial general intelligence is nonsense

Artificial intelligence has more in common with ants than humans, says Neil Lawrence. Only by taking a more nuanced view of intelligence can we see how machines will truly transform society




gen

Google tool makes AI-generated writing easily detectable

Google DeepMind has been using its AI watermarking method on Gemini chatbot responses for months – and now it’s making the tool available to any AI developer




gen

Are we really ready for genuine communication with animals through AI?

Thanks to artificial intelligence, understanding animals may be closer than we think. But we may not like what they are going to tell us, says RSPCA chief executive Chris Sherwood




gen

Agent payouts to shift stock

Agents are being offered double the normal commission to help shift apartments throughout capital cities.




gen

We Have Urgent Questions About the Unholy Provenance of Netflix’s Hot Frosty

Who built this ripped, anatomically graphic snowman? Is there a world of snowmen offscreen waiting to be turned into sex objects for widows?




gen

Artificial intelligence is being used in university classes. How it's being used matters, say profs

As artificial intelligence becomes more common in university classrooms, some professors are weighing the benefits — and downsides — of students using it for research projects.




gen

Can the free market ensure artificial intelligence won't wipe out human workers?

People keep predicting that each wave of new technology will mean humans can put their feet up. It hasn't happened yet. Some economists and anthropologists who study the subject say even with the arrival of artificial intelligence, humans will remain integral to making the world go round.




gen

N.L. institution says due diligence on OceanGate wasn't necessary prior to Titan implosion

The Marine Institute and OceanGate signed a partnership in early 2023, but it remains unclear if the Memorial University campus knew the ill-fated Titan submersible was unregulated, unclassed and uncertified.




gen

Trump picks former national intelligence head John Ratcliffe for CIA

President-elect Donald Trump announced Tuesday that John Ratcliffe, a former Texas congressman and director of national intelligence, will serve as the director of the Central Intelligence Agency in his administration.






gen

Exclusive: Liz Truss urged to act with Britain facing biggest loss of sports facilities in a generation





gen

British Athletics want 'open' category for transgender women to compete with men





gen

Canada’s anti-money laundering agency hit by a cyber attack

Canada’s national anti-money laundering agency has been hit by a cyber attack. The Financial Transactions and Reports Analysis Centre of Canada (FINTRAC) said Tuesday that over the last 24 hours it has been managing a cyber incident. “The incident does not involve the centre’s intelligence or classified systems,” it said in a statement. “As a […]

The post Canada’s anti-money laundering agency hit by a cyber attack first appeared on ITBusiness.ca.




gen

Rise of the superbaby? US startup offers genetic IQ screening for wealthy elite: report

U.S.-based startup company Heliospect Genomics reportedly is offering wealthy couples embryo screening for IQ and other traits at $50,000 for 100 embryos.




gen

Corsair's next-gen AiO coolers feature circular LCD panels

Meanwhile, the firm has announced LGA1700 upgrade kits for its existing AiO series.




gen

Aorus reveals a next-gen gaming PC concept - Project Cielo

The design is portable and modular, and boasts 5G connectivity.




gen

Next-generation technology is a critical mid-step in dementia care

New technologies will radically change the experience of living with and caring for someone with Alzheimer's, says Professor Fiona Carragher, chief policy and research officer at Alzheimer's Society, UK




gen

Food recalls in the U.S. spike due to Listeria, Salmonella and allergens

An in-depth analysis in the United States, covering 2002 to 2023, reveals that biological contamination and allergens are the leading causes of food recalls. The study, recently published in the Journal of Food Protection, examined more than 35,000 food and beverage recalls overseen by the U.S. Food and Drug Administration... Continue Reading




gen

RFK Jr. and the Make America Healthy Again agenda could impact food safety

RFK Jr., a lawyer-politician, could replace lawyer-politician Xavier Becerra as Secretary of Health and Human  Services. Or RFK Jr could be the next Secretary of Agriculture, replacing  Tom Vilsack, a lawyer. Deputy FDA Commissioners are sometimes lawyers. Dr. Robert Califf, a cardiologist, is the outgoing FDA Commissioner. The fact that... Continue Reading




gen

Agencies tight-lipped on kickbacks

Australia’s leading media agencies have ducked questions about cash kickbacks.




gen

Andrew Ng: Unbiggen AI



Andrew Ng has serious street cred in artificial intelligence. He pioneered the use of graphics processing units (GPUs) to train deep learning models in the late 2000s with his students at Stanford University, cofounded Google Brain in 2011, and then served for three years as chief scientist for Baidu, where he helped build the Chinese tech giant’s AI group. So when he says he has identified the next big shift in artificial intelligence, people listen. And that’s what he told IEEE Spectrum in an exclusive Q&A.


Ng’s current efforts are focused on his company Landing AI, which built a platform called LandingLens to help manufacturers improve visual inspection with computer vision. He has also become something of an evangelist for what he calls the data-centric AI movement, which he says can yield “small data” solutions to big issues in AI, including model efficiency, accuracy, and bias.


The great advances in deep learning over the past decade or so have been powered by ever-bigger models crunching ever-bigger amounts of data. Some people argue that that’s an unsustainable trajectory. Do you agree that it can’t go on that way?

Andrew Ng: This is a big question. We’ve seen foundation models in NLP [natural language processing]. I’m excited about NLP models getting even bigger, and also about the potential of building foundation models in computer vision. I think there’s lots of signal to still be exploited in video: We have not been able to build foundation models yet for video because of compute bandwidth and the cost of processing video, as opposed to tokenized text. So I think that this engine of scaling up deep learning algorithms, which has been running for something like 15 years now, still has steam in it. Having said that, it only applies to certain problems, and there’s a set of other problems that need small data solutions.

When you say you want a foundation model for computer vision, what do you mean by that?

Ng: This is a term coined by Percy Liang and some of my friends at Stanford to refer to very large models, trained on very large data sets, that can be tuned for specific applications. For example, GPT-3 is an example of a foundation model [for NLP]. Foundation models offer a lot of promise as a new paradigm in developing machine learning applications, but also challenges in terms of making sure that they’re reasonably fair and free from bias, especially if many of us will be building on top of them.

What needs to happen for someone to build a foundation model for video?

Ng: I think there is a scalability problem. The compute power needed to process the large volume of images for video is significant, and I think that’s why foundation models have arisen first in NLP. Many researchers are working on this, and I think we’re seeing early signs of such models being developed in computer vision. But I’m confident that if a semiconductor maker gave us 10 times more processor power, we could easily find 10 times more video to build such models for vision.

Having said that, a lot of what’s happened over the past decade is that deep learning has happened in consumer-facing companies that have large user bases, sometimes billions of users, and therefore very large data sets. While that paradigm of machine learning has driven a lot of economic value in consumer software, I find that that recipe of scale doesn’t work for other industries.


It’s funny to hear you say that, because your early work was at a consumer-facing company with millions of users.

Ng: Over a decade ago, when I proposed starting the Google Brain project to use Google’s compute infrastructure to build very large neural networks, it was a controversial step. One very senior person pulled me aside and warned me that starting Google Brain would be bad for my career. I think he felt that the action couldn’t just be in scaling up, and that I should instead focus on architecture innovation.

“In many industries where giant data sets simply don’t exist, I think the focus has to shift from big data to good data. Having 50 thoughtfully engineered examples can be sufficient to explain to the neural network what you want it to learn.”
—Andrew Ng, CEO & Founder, Landing AI

I remember when my students and I published the first NeurIPS workshop paper advocating using CUDA, a platform for processing on GPUs, for deep learning—a different senior person in AI sat me down and said, “CUDA is really complicated to program. As a programming paradigm, this seems like too much work.” I did manage to convince him; the other person I did not convince.

I expect they’re both convinced now.

Ng: I think so, yes.

Over the past year as I’ve been speaking to people about the data-centric AI movement, I’ve been getting flashbacks to when I was speaking to people about deep learning and scalability 10 or 15 years ago. In the past year, I’ve been getting the same mix of “there’s nothing new here” and “this seems like the wrong direction.”


How do you define data-centric AI, and why do you consider it a movement?

Ng: Data-centric AI is the discipline of systematically engineering the data needed to successfully build an AI system. For an AI system, you have to implement some algorithm, say a neural network, in code and then train it on your data set. The dominant paradigm over the last decade was to download the data set while you focus on improving the code. Thanks to that paradigm, over the last decade deep learning networks have improved significantly, to the point where for a lot of applications the code—the neural network architecture—is basically a solved problem. So for many practical applications, it’s now more productive to hold the neural network architecture fixed, and instead find ways to improve the data.
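
To make that paradigm concrete, here is a minimal, self-contained sketch of a data-centric iteration loop. It uses a toy least-squares classifier and synthetic labels purely for illustration, not anything from Landing AI's platform: the model architecture never changes, and each round only the most suspect labels get reviewed and corrected.

```python
import numpy as np

# Minimal sketch of a data-centric iteration: the "architecture" (a fixed
# linear classifier fit by least squares) never changes; each round we only
# clean the noisiest-looking labels and retrain.
rng = np.random.default_rng(0)

# Toy data: 200 points, 2 features, 10% of labels flipped at random.
X = rng.normal(size=(200, 2))
true_y = (X[:, 0] + X[:, 1] > 0).astype(float)
y = true_y.copy()
flip = rng.choice(200, size=20, replace=False)
y[flip] = 1 - y[flip]

def train(X, y):
    # Fixed model: linear fit by least squares on [X, 1].
    A = np.c_[X, np.ones(len(X))]
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def predict(w, X):
    return (np.c_[X, np.ones(len(X))] @ w > 0.5).astype(float)

for round_ in range(3):
    w = train(X, y)
    acc = (predict(w, X) == true_y).mean()
    print(f"round {round_}: accuracy vs. ground truth = {acc:.2f}")

    # "Engineer the data": flag the examples the model disagrees with most
    # strongly and send them for re-labeling (simulated here by restoring
    # the true label, standing in for a human review pass).
    margin = np.abs(np.c_[X, np.ones(len(X))] @ w - y)
    suspects = np.argsort(margin)[-10:]
    y[suspects] = true_y[suspects]
```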

When I started speaking about this, there were many practitioners who, completely appropriately, raised their hands and said, “Yes, we’ve been doing this for 20 years.” This is the time to take the things that some individuals have been doing intuitively and make it a systematic engineering discipline.

The data-centric AI movement is much bigger than one company or group of researchers. My collaborators and I organized a data-centric AI workshop at NeurIPS, and I was really delighted at the number of authors and presenters that showed up.

You often talk about companies or institutions that have only a small amount of data to work with. How can data-centric AI help them?

Ng: You hear a lot about vision systems built with millions of images—I once built a face recognition system using 350 million images. Architectures built for hundreds of millions of images don’t work with only 50 images. But it turns out, if you have 50 really good examples, you can build something valuable, like a defect-inspection system. In many industries where giant data sets simply don’t exist, I think the focus has to shift from big data to good data. Having 50 thoughtfully engineered examples can be sufficient to explain to the neural network what you want it to learn.

When you talk about training a model with just 50 images, does that really mean you’re taking an existing model that was trained on a very large data set and fine-tuning it? Or do you mean a brand new model that’s designed to learn only from that small data set?

Ng: Let me describe what Landing AI does. When doing visual inspection for manufacturers, we often use our own flavor of RetinaNet. It is a pretrained model. Having said that, the pretraining is a small piece of the puzzle. What’s a bigger piece of the puzzle is providing tools that enable the manufacturer to pick the right set of images [to use for fine-tuning] and label them in a consistent way. There’s a very practical problem we’ve seen spanning vision, NLP, and speech, where even human annotators don’t agree on the appropriate label. For big data applications, the common response has been: If the data is noisy, let’s just get a lot of data and the algorithm will average over it. But if you can develop tools that flag where the data’s inconsistent and give you a very targeted way to improve the consistency of the data, that turns out to be a more efficient way to get a high-performing system.

“Collecting more data often helps, but if you try to collect more data for everything, that can be a very expensive activity.”
—Andrew Ng

For example, if you have 10,000 images where 30 images are of one class, and those 30 images are labeled inconsistently, one of the things we do is build tools to draw your attention to the subset of data that’s inconsistent. So you can very quickly relabel those images to be more consistent, and this leads to improvement in performance.
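
A simple way to picture that kind of consistency tooling: collect every annotator's label for each image, score the disagreement, and review the worst offenders first. The sketch below uses invented file names, annotators, and labels purely for illustration.

```python
from collections import Counter, defaultdict

# Toy annotation records: (image, annotator, label). In practice these would
# come from your labeling tool's export.
annotations = [
    ("img_001.png", "alice", "scratch"),
    ("img_001.png", "bob",   "scratch"),
    ("img_002.png", "alice", "dent"),
    ("img_002.png", "bob",   "scratch"),
    ("img_002.png", "carol", "dent"),
    ("img_003.png", "alice", "ok"),
]

labels_by_image = defaultdict(list)
for image, annotator, label in annotations:
    labels_by_image[image].append(label)

def disagreement(labels):
    # Fraction of labels that differ from the majority vote (0 = consistent).
    most_common_count = Counter(labels).most_common(1)[0][1]
    return 1 - most_common_count / len(labels)

# Rank images by disagreement so the most inconsistent ones are relabeled first.
for image, labels in sorted(labels_by_image.items(),
                            key=lambda kv: disagreement(kv[1]),
                            reverse=True):
    print(f"{image}: labels={labels}, disagreement={disagreement(labels):.2f}")
```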

Could this focus on high-quality data help with bias in data sets? If you’re able to curate the data more before training?

Ng: Very much so. Many researchers have pointed out that biased data is one factor among many leading to biased systems. There have been many thoughtful efforts to engineer the data. At the NeurIPS workshop, Olga Russakovsky gave a really nice talk on this. At the main NeurIPS conference, I also really enjoyed Mary Gray’s presentation, which touched on how data-centric AI is one piece of the solution, but not the entire solution. New tools like Datasheets for Datasets also seem like an important piece of the puzzle.

One of the powerful tools that data-centric AI gives us is the ability to engineer a subset of the data. Imagine training a machine-learning system and finding that its performance is okay for most of the data set, but its performance is biased for just a subset of the data. If you try to change the whole neural network architecture to improve the performance on just that subset, it’s quite difficult. But if you can engineer a subset of the data you can address the problem in a much more targeted way.

When you talk about engineering the data, what do you mean exactly?

Ng: In AI, data cleaning is important, but the way the data has been cleaned has often been in very manual ways. In computer vision, someone may visualize images through a Jupyter notebook and maybe spot the problem, and maybe fix it. But I’m excited about tools that allow you to have a very large data set, tools that draw your attention quickly and efficiently to the subset of data where, say, the labels are noisy. Or to quickly bring your attention to the one class among 100 classes where it would benefit you to collect more data. Collecting more data often helps, but if you try to collect more data for everything, that can be a very expensive activity.

For example, I once figured out that a speech-recognition system was performing poorly when there was car noise in the background. Knowing that allowed me to collect more data with car noise in the background, rather than trying to collect more data for everything, which would have been expensive and slow.
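
The car-noise anecdote is essentially slice-based error analysis. A minimal sketch, assuming each evaluation record carries a metadata tag such as the background-noise condition, might look like this (the records themselves are made up):

```python
from collections import defaultdict

# Bucket evaluation examples by a metadata tag and compare error rates per
# bucket; the slice with the worst error rate is where collecting (or
# generating) more data is likely to pay off first.
eval_records = [
    {"noise": "quiet",     "correct": True},
    {"noise": "quiet",     "correct": True},
    {"noise": "car_noise", "correct": False},
    {"noise": "car_noise", "correct": False},
    {"noise": "car_noise", "correct": True},
    {"noise": "babble",    "correct": True},
    {"noise": "babble",    "correct": False},
]

totals = defaultdict(int)
errors = defaultdict(int)
for rec in eval_records:
    totals[rec["noise"]] += 1
    errors[rec["noise"]] += not rec["correct"]

for noise, n in sorted(totals.items(),
                       key=lambda kv: errors[kv[0]] / kv[1],
                       reverse=True):
    print(f"{noise:10s} error rate = {errors[noise] / n:.2f} ({n} examples)")
```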


What about using synthetic data, is that often a good solution?

Ng: I think synthetic data is an important tool in the tool chest of data-centric AI. At the NeurIPS workshop, Anima Anandkumar gave a great talk that touched on synthetic data. I think there are important uses of synthetic data that go beyond just being a preprocessing step for increasing the data set for a learning algorithm. I’d love to see more tools to let developers use synthetic data generation as part of the closed loop of iterative machine learning development.

Do you mean that synthetic data would allow you to try the model on more data sets?

Ng: Not really. Here’s an example. Let’s say you’re trying to detect defects in a smartphone casing. There are many different types of defects on smartphones. It could be a scratch, a dent, pit marks, discoloration of the material, other types of blemishes. If you train the model and then find through error analysis that it’s doing well overall but it’s performing poorly on pit marks, then synthetic data generation allows you to address the problem in a more targeted way. You could generate more data just for the pit-mark category.

“In the consumer software Internet, we could train a handful of machine-learning models to serve a billion users. In manufacturing, you might have 10,000 manufacturers building 10,000 custom AI models.”
—Andrew Ng

Synthetic data generation is a very powerful tool, but there are many simpler tools that I will often try first. Such as data augmentation, improving labeling consistency, or just asking a factory to collect more data.
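
As an illustration of what "targeted" augmentation means in practice, the sketch below expands only the class that error analysis flagged as weak and leaves the rest of the dataset alone. The class name, stand-in image arrays, and transforms are hypothetical, not Landing AI's pipeline.

```python
import numpy as np

# Targeted augmentation: only the weak class gets extra copies.
rng = np.random.default_rng(0)
dataset = [
    {"image": rng.random((32, 32)), "label": "scratch"},
    {"image": rng.random((32, 32)), "label": "pit_mark"},
    {"image": rng.random((32, 32)), "label": "dent"},
]

WEAK_CLASS = "pit_mark"   # class flagged by error analysis (illustrative)
COPIES_PER_IMAGE = 4

def augment(img, rng):
    # Cheap label-preserving transforms: flips, 90-degree rotations, noise.
    if rng.random() < 0.5:
        img = np.fliplr(img)
    img = np.rot90(img, k=int(rng.integers(0, 4)))
    return np.clip(img + rng.normal(scale=0.02, size=img.shape), 0, 1)

augmented = []
for example in dataset:
    augmented.append(example)
    if example["label"] == WEAK_CLASS:
        for _ in range(COPIES_PER_IMAGE):
            augmented.append({"image": augment(example["image"], rng),
                              "label": WEAK_CLASS})

print(f"{len(dataset)} originals -> {len(augmented)} after targeted augmentation")
```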


To make these issues more concrete, can you walk me through an example? When a company approaches Landing AI and says it has a problem with visual inspection, how do you onboard them and work toward deployment?

Ng: When a customer approaches us we usually have a conversation about their inspection problem and look at a few images to verify that the problem is feasible with computer vision. Assuming it is, we ask them to upload the data to the LandingLens platform. We often advise them on the methodology of data-centric AI and help them label the data.

One of the foci of Landing AI is to empower manufacturing companies to do the machine learning work themselves. A lot of our work is making sure the software is fast and easy to use. Through the iterative process of machine learning development, we advise customers on things like how to train models on the platform, when and how to improve the labeling of data so the performance of the model improves. Our training and software supports them all the way through deploying the trained model to an edge device in the factory.

How do you deal with changing needs? If products change or lighting conditions change in the factory, can the model keep up?

Ng: It varies by manufacturer. There is data drift in many contexts. But there are some manufacturers that have been running the same manufacturing line for 20 years now with few changes, so they don’t expect changes in the next five years. Those stable environments make things easier. For other manufacturers, we provide tools to flag when there’s a significant data-drift issue. I find it really important to empower manufacturing customers to correct data, retrain, and update the model. Because if something changes and it’s 3 a.m. in the United States, I want them to be able to adapt their learning algorithm right away to maintain operations.

In the consumer software Internet, we could train a handful of machine-learning models to serve a billion users. In manufacturing, you might have 10,000 manufacturers building 10,000 custom AI models. The challenge is, how do you do that without Landing AI having to hire 10,000 machine learning specialists?

So you’re saying that to make it scale, you have to empower customers to do a lot of the training and other work.

Ng: Yes, exactly! This is an industry-wide problem in AI, not just in manufacturing. Look at health care. Every hospital has its own slightly different format for electronic health records. How can every hospital train its own custom AI model? Expecting every hospital’s IT personnel to invent new neural-network architectures is unrealistic. The only way out of this dilemma is to build tools that empower the customers to build their own models by giving them tools to engineer the data and express their domain knowledge. That’s what Landing AI is executing in computer vision, and the field of AI needs other teams to execute this in other domains.

Is there anything else you think it’s important for people to understand about the work you’re doing or the data-centric AI movement?

Ng: In the last decade, the biggest shift in AI was a shift to deep learning. I think it’s quite possible that in this decade the biggest shift will be to data-centric AI. With the maturity of today’s neural network architectures, I think for a lot of the practical applications the bottleneck will be whether we can efficiently get the data we need to develop systems that work well. The data-centric AI movement has tremendous energy and momentum across the whole community. I hope more researchers and developers will jump in and work on it.


This article appears in the April 2022 print issue as “Andrew Ng, AI Minimalist.”




gen

New Carrier Fluid Makes Hydrogen Way Easier to Transport



Imagine pulling up to a refueling station and filling your vehicle’s tank with liquid hydrogen, as safe and convenient to handle as gasoline or diesel, without the need for high-pressure tanks or cryogenic storage. This vision of a sustainable future could become a reality if a Calgary, Canada–based company, Ayrton Energy, can scale up its innovative method of hydrogen storage and distribution. Ayrton’s technology could make hydrogen a viable, one-to-one replacement for fossil fuels in existing infrastructure like pipelines, fuel tankers, rail cars, and trucks.

The company’s approach is to use liquid organic hydrogen carriers (LOHCs) to make it easier to transport and store hydrogen. The method chemically bonds hydrogen to carrier molecules, which absorb hydrogen molecules and make them more stable—kind of like hydrogenating cooking oil to produce margarine.

A researcher pours a sample of Ayrton’s LOHC fluid into a vial. Photo: Ayrton Energy

The approach would allow liquid hydrogen to be transported and stored in ambient conditions, rather than in the high-pressure, cryogenic tanks (to hold it at temperatures below -252 °C) currently required for keeping hydrogen in liquid form. It would also be a big improvement on gaseous hydrogen, which is highly volatile and difficult to keep contained.

Founded in 2021, Ayrton is one of several companies across the globe developing LOHCs, including Japan’s Chiyoda and Mitsubishi, Germany’s Covalion, and China’s Hynertech. But toxicity, energy density, and input energy issues have limited LOHCs as contenders for making liquid hydrogen feasible. Ayrton says its formulation eliminates these trade-offs.

Safe, Efficient Hydrogen Fuel for Vehicles

Conventional LOHC technologies used by most of the aforementioned companies rely on substances such as toluene, which forms methylcyclohexane when hydrogenated. These carriers pose safety risks due to their flammability and volatility. Hydrogenious LOHC Technologies in Erlangen, Germany, and other hydrogen fuel companies have shifted toward dibenzyltoluene, a more stable carrier that holds more hydrogen per unit volume than methylcyclohexane, though it requires higher temperatures (and thus more energy) to bind and release the hydrogen. Dibenzyltoluene hydrogenation occurs at between 3 and 10 megapascals (30 and 100 bar) and 200–300 °C, compared with 10 MPa (100 bar) and just under 200 °C for methylcyclohexane.

Ayrton’s proprietary oil-based hydrogen carrier not only captures and releases hydrogen with less input energy than is required for other LOHCs, but also stores more hydrogen than methylcyclohexane can—55 kilograms per cubic meter compared with methylcyclohexane’s 50 kg/m³. Dibenzyltoluene holds more hydrogen per unit volume (up to 65 kg/m³), but Ayrton’s approach to infusing the carrier with hydrogen atoms promises to cost less. Hydrogenation or dehydrogenation with Ayrton’s carrier fluid occurs at 0.1 megapascal (1 bar) and about 100 °C, says founder and CEO Natasha Kostenuk. And as with the other LOHCs, after hydrogenation it can be transported and stored at ambient temperatures and pressures.
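
To put those volumetric figures in perspective, a quick back-of-the-envelope calculation shows how much hydrogen, and how much usable energy, a given tank volume would hold with each option. The 30-cubic-meter tank, hydrogen's lower heating value of roughly 33.3 kilowatt-hours per kilogram, and the ~42 kg/m³ figure for 700-bar compressed gas are assumptions added for illustration, not numbers from Ayrton or from this article.

```python
# Back-of-the-envelope comparison of the volumetric figures quoted above.
# Assumptions (not from the article): a 30 m^3 tank, hydrogen's lower heating
# value of ~33.3 kWh/kg, and ~42 kg/m^3 for 700-bar compressed hydrogen gas.
TANK_M3 = 30
LHV_KWH_PER_KG = 33.3

carriers_kg_per_m3 = {
    "Ayrton LOHC": 55,
    "methylcyclohexane": 50,
    "dibenzyltoluene": 65,
    "700-bar compressed H2 (approx.)": 42,
}

for name, density in carriers_kg_per_m3.items():
    h2_kg = density * TANK_M3
    energy_mwh = h2_kg * LHV_KWH_PER_KG / 1000
    print(f"{name:32s} {h2_kg:6.0f} kg H2  ~ {energy_mwh:5.1f} MWh")
```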

“Judges described [Ayrton’s approach] as a critical technology for the deployment of hydrogen at large scale.” —Katie Richardson, National Renewable Energy Lab

Ayrton’s LOHC fluid is as safe to handle as margarine, but it’s still a chemical, says Kostenuk. “I wouldn’t drink it. If you did, you wouldn’t feel very good. But it’s not lethal,” she says.

Kostenuk and fellow Ayrton cofounder Brandy Kinkead (who serves as the company’s chief technical officer) were originally trying to bring hydrogen generators to market to fill gaps in the electrical grid. “We were looking for fuel cells and hydrogen storage. Fuel cells were easy to find, but we couldn’t find a hydrogen storage method or medium that would be safe and easy to transport to fuel our vision of what we were trying to do with hydrogen generators,” Kostenuk says. During the search, they came across LOHC technology but weren’t satisfied with the trade-offs demanded by existing liquid hydrogen carriers. “We had the idea that we could do it better,” she says. The duo pivoted, adjusting their focus from hydrogen generators to hydrogen storage solutions.

“Everybody gets excited about hydrogen production and hydrogen end use, but they forget that you have to store and manage the hydrogen,” Kostenuk says. Incompatibility with current storage and distribution has been a barrier to adoption, she says. “We’re really excited about being able to reuse existing infrastructure that’s in place all over the world.” Ayrton’s hydrogenated liquid has fuel-cell-grade (99.999 percent) hydrogen purity, so there’s no advantage in using pure liquid hydrogen with its need for subzero temperatures, according to the company.

The main challenge the company faces is the set of issues that come along with any technology scaling up from pilot-stage production to commercial manufacturing, says Kostenuk. “A crucial part of that is aligning ourselves with the right manufacturing partners along the way,” she notes.

Asked about how Ayrton is dealing with some other challenges common to LOHCs, Kostenuk says Ayrton has managed to sidestep them. “We stayed away from materials that are expensive and hard to procure, which will help us avoid any supply chain issues,” she says. By performing the reactions at such low temperatures, Ayrton can get its carrier fluid to withstand 1,000 hydrogenation-dehydrogenation cycles before it no longer holds enough hydrogen to be useful. Conventional LOHCs are limited to a couple of hundred cycles before the high temperatures required for bonding and releasing the hydrogen break down the fluid and diminish its storage capacity, Kostenuk says.

Breakthrough in Hydrogen Storage Technology

In acknowledgement of what Ayrton’s nontoxic, oil-based carrier fluid could mean for the energy and transportation sectors, the U.S. National Renewable Energy Lab (NREL) at its annual Industry Growth Forum in May named Ayrton an “outstanding early-stage venture.” A selection committee of more than 180 climate tech and cleantech investors and industry experts chose Ayrton from a pool of more than 200 initial applicants, says Katie Richardson, group manager of NREL’s Innovation and Entrepreneurship Center, which organized the forum. The committee based its decision on the company’s innovation, market positioning, business model, team, next steps for funding, technology, capital use, and quality of pitch presentation. “Judges described Ayrton’s approach as a critical technology for the deployment of hydrogen at large scale,” Richardson says.

As a next step toward enabling hydrogen to push gasoline and diesel aside, “we’re talking with hydrogen producers who are right now offering their customers cryogenic and compressed hydrogen,” says Kostenuk. “If they offered LOHC, it would enable them to deliver across longer distances, in larger volumes, in a multimodal way.” The company is also talking to some industrial site owners who could use the hydrogenated LOHC for buffer storage to hold onto some of the energy they’re getting from clean, intermittent sources like solar and wind. Another natural fit, she says, is energy service providers that are looking for a reliable method of seasonal storage beyond what batteries can offer. The goal is to eventually scale up enough to become the go-to alternative (or perhaps the standard) fuel for cars, trucks, trains, and ships.








gen

UK needs to ‘update equipment’ and be ‘ready for threats we face’, says Tom Tugendhat

We spoke to the Conservative MP and former army officer Tom Tugendhat, who served in Iraq and Afghanistan.






gen

Gospel Legend Mavis Staples Comes 'Full Circle'

The gospel legend, whose new album is titled One True Vine, has a career spanning more than 60 years. She says of the record, made in collaboration with Wilco's Jeff Tweedy, "I've gone from the strictly gospel to folk to country, and here I am right back at home where I began."




gen

The Impact of GenAI on Data Loss Prevention

Data is essential for any organization. This isn’t a new concept, and it’s not one that should be a surprise, but it

The post The Impact of GenAI on Data Loss Prevention appeared first on Gigaom.




gen

Legendary Fighting Series ‘The King of Fighters’ Crosses Over into ‘Another Eden’ with the “Another Bout” Event that Kicks Off August 22nd

I did not have Another Eden, beloved mobile role-playing game, having a crossover event with The King of Fighters, legendary …