
Snoring isn't just a nuisance, it's dangerous. Why can't we treat it?

Snoring is often viewed as harmless, at least to the snorer, but we are now uncovering its potentially serious effects on cardiovascular health. And finding ways to stop it is surprisingly challenging





Stem cell transplant gives hope for treating age-related sight loss

A monkey that performed poorly on vision tests did much better after having a stem cell transplant to patch up holes in its retina





Next-generation technology is a critical mid-step in dementia care

New technologies will radically change the experience of living with and caring for someone with Alzheimer's, says Professor Fiona Carragher, chief policy and research officer at Alzheimer's Society, UK





Hospital hit by Hurricane Milton gets system to grab water from air

Systems that can harvest water from moisture in the atmosphere could offer a valuable water source in the wake of disasters





Listening to music after surgery seems to be an effective painkiller

People who listen to music after having surgery report lower levels of pain and require less morphine than those who don't





How bad is vaping for your health? We’re finally getting answers

As more of us take up vaping and concerns rise about the long-term effects, we now have enough data to get a grip on the health impact – and how it compares to smoking





Do certain foods suppress inflammation and help you live longer?

Recent research shows that anti-inflammatory diets are not as faddish as they might sound, with the power to reduce the risk of heart attacks and some cancers





One course of antibiotics can change your gut microbiome for years

Antibiotics can reduce diversity in the gut microbiome, raising the risk of infections that cause diarrhoea - and the effects may last years





Michelangelo's 'The Flood' seems to depict a woman with breast cancer

The Renaissance artist Michelangelo had carried out human dissections, which may have led him to include women with breast cancer in some of his pieces





Dazzling images illuminate research on cardiovascular disease

The British Heart Foundation’s Reflections of Research competition showcases beautiful images captured by researchers studying heart and circulatory disease





More people are living with pain today than before covid emerged

Chronic pain has increased among adults in the US since 2019, which could be due to a rise in sedentary lifestyles or reduced access to healthcare amid covid-19 restrictions





Israeli leader tells Biden 'we have to get hostages back' who are 'going through hell in dungeons of Gaza'

Israeli President Isaac Herzog says hostages are "going through hell in the dungeons of Gaza" during a meeting with President Biden at the White House.



  • Fox News


ICE nabs another illegal immigrant in Mass. charged with child sex crime, as gov snubs Trump deportations

Immigration and Customs Enforcement has arrested another illegal immigrant charged with child sex offenses, as the state's governor says police won't help the Trump administration.





Food recalls in the U.S. spike due to Listeria, Salmonella and allergens

An in-depth analysis in the United States, covering 2002 to 2023, reveals that biological contamination and allergens are the leading causes of food recalls. The study, recently published in the Journal of Food Protection, examined more than 35,000 food and beverage recalls overseen by the U.S. Food and Drug Administration...





No changes involving animals came about in Colorado elections

On Tuesday, three of nine ballot issues Denver voters had to decide dealt with animals and animal products. But nothing changed because all of them were slaughtered at the ballot box. One of the ballot issues called for prohibiting any slaughterhouse from operating in the City or County of Denver. That...





Israel plans changes to food licensing rules

Israel has proposed a revised system of food business licensing to ease the regulatory burden on industry and improve sanitary conditions. The Ministry of Health said the current regulation, regarding business licensing in general and food businesses in particular, is outdated and places a heavy regulatory burden on companies. This...





Germany sees outbreaks decline, but cases increase

Germany has reported a decline in outbreaks for 2023, but more people were sick than in the previous year. In 2023, the Federal Office of Consumer Protection and Food Safety (BVL) and the Robert Koch Institute (RKI) received 190 reports of foodborne outbreaks that caused 2,248 illnesses, 283 hospitalizations, and...





Australians urged to read labels as country marks Food Safety Week

Australians have been urged to look before they cook and read the safety advice on food labels. The Food Safety Information Council (FSIC) issued the call ahead of Australian Food Safety Week from Nov. 9 to 16. Lydia Buchtmann, FSIC CEO, said the charity’s research shows that only 3 in...





RFK Jr. and the Make America Healthy Again agenda could impact food safety

RFK Jr., a lawyer-politician, could replace lawyer-politician Xavier Becerra as Secretary of Health and Human Services. Or RFK Jr. could be the next Secretary of Agriculture, replacing Tom Vilsack, a lawyer. Deputy FDA Commissioners are sometimes lawyers. Dr. Robert Califf, a cardiologist, is the outgoing FDA Commissioner. The fact that...





Large EU-wide Salmonella outbreak linked to tomatoes from Italy

A multi-country Salmonella outbreak in Europe linked to tomatoes from Italy has sickened more than 250 people. From January 2023 to November 2024, 266 confirmed cases of Salmonella Strathcona have been identified in 16 European countries and the United Kingdom. Croatia, Czech Republic, Denmark, Estonia, Finland, France, Ireland, Luxembourg, the...





AIMCo expansion, Alberta's investment focus were sources of tension before purge, sources say

Pension veterans say there was more going on behind the scenes than scrutiny of costs





Major labour shortage looms in Atlantic Canada as immigration cuts take hold

Atlantic Canadians say the region has room to grow, but is facing a shrinking labour pool





Cowboys' Dak Prescott elects to have season-ending surgery to address injured hamstring, Jerry Jones says

The Dallas Cowboys quarterback got another opinion on his hamstring and decided that surgery would be the best way to address the injury.





British woman busted at Los Angeles airport with meth-soaked T-shirts: police

Myah Saakwa-Mante, a 20-year-old British university student, was caught at Los Angeles International Airport and arrested after allegedly attempting to smuggle T-shirts soaked with methamphetamine.





Georgia on outside of College Football Playoff bracket as wild week brings rankings shakeup

Georgia's loss to Ole Miss Saturday brought a wild shakeup to the college football rankings, and the Bulldogs find themselves out of the playoff picture.





Agencies tight-lipped on kickbacks

Australia’s leading media agencies have ducked questions about cash kickbacks.





How AI Will Change Chip Design



The end of Moore’s Law is looming. Engineers and designers can do only so much to miniaturize transistors and pack as many of them as possible into chips. So they’re turning to other approaches to chip design, incorporating technologies like AI into the process.

Samsung, for instance, is adding AI to its memory chips to enable processing in memory, thereby saving energy and speeding up machine learning. Speaking of speed, Google’s TPU V4 AI chip has doubled its processing power compared with that of its previous version.

But AI holds still more promise and potential for the semiconductor industry. To better understand how AI is set to revolutionize chip design, we spoke with Heather Gorr, senior product manager for MathWorks’ MATLAB platform.

How is AI currently being used to design the next generation of chips?

Heather Gorr: AI is such an important technology because it’s involved in most parts of the cycle, including the design and manufacturing process. There are a lot of important applications here, even in the general process engineering where we want to optimize things. I think defect detection is a big one at all phases of the process, especially in manufacturing. But even thinking ahead in the design process, [AI now plays a significant role] when you’re designing the light and the sensors and all the different components. There’s a lot of anomaly detection and fault mitigation that you really want to consider.

Heather Gorr (MathWorks)

Then, thinking about the logistical modeling that you see in any industry, there is always planned downtime that you want to mitigate; but you also end up having unplanned downtime. So, looking back at that historical data of when you’ve had those moments where maybe it took a bit longer than expected to manufacture something, you can take a look at all of that data and use AI to try to identify the proximate cause or to see something that might jump out even in the processing and design phases. We think of AI oftentimes as a predictive tool, or as a robot doing something, but a lot of times you get a lot of insight from the data through AI.

What are the benefits of using AI for chip design?

Gorr: Historically, we’ve seen a lot of physics-based modeling, which is a very intensive process. We want to do a reduced order model, where instead of solving such a computationally expensive and extensive model, we can do something a little cheaper. You could create a surrogate model, so to speak, of that physics-based model, use the data, and then do your parameter sweeps, your optimizations, your Monte Carlo simulations using the surrogate model. That takes a lot less time computationally than solving the physics-based equations directly. So, we’re seeing that benefit in many ways, including the efficiency and economy that are the results of iterating quickly on the experiments and the simulations that will really help in the design.
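The reduced-order workflow Gorr outlines can be sketched in a few lines. This is an illustrative toy, not MathWorks tooling: `physics_model` stands in for an expensive physics-based simulation, and a cubic polynomial plays the role of the surrogate that the Monte Carlo sweep actually runs on.

```python
import numpy as np

rng = np.random.default_rng(0)

def physics_model(x):
    # Stand-in for an expensive physics-based simulation.
    return np.sin(3 * x) + 0.5 * x ** 2

# 1. Evaluate the expensive model at only a handful of design points.
xs = np.linspace(0.0, 1.0, 11)
ys = physics_model(xs)

# 2. Fit a cheap surrogate -- here a cubic polynomial.
surrogate = np.poly1d(np.polyfit(xs, ys, deg=3))

# 3. Run the Monte Carlo sweep on the surrogate, not the physics model:
#    10,000 evaluations for the cost of 11 expensive ones.
samples = surrogate(rng.uniform(0.0, 1.0, size=10_000))
mean_estimate = samples.mean()
```

The same pattern scales up to Gaussian-process or neural-network surrogates; the point is that the parameter sweeps and simulations never touch the expensive solver.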

So it’s like having a digital twin in a sense?

Gorr: Exactly. That’s pretty much what people are doing, where you have the physical system model and the experimental data. Then, in conjunction, you have this other model that you can tweak and tune, trying different parameters and experiments that let you sweep through all of those different situations and come up with a better design in the end.

So, it’s going to be more efficient and, as you said, cheaper?

Gorr: Yeah, definitely. Especially in the experimentation and design phases, where you’re trying different things. That’s obviously going to yield dramatic cost savings if you’re actually manufacturing and producing [the chips]. You want to simulate, test, experiment as much as possible without making something using the actual process engineering.

We’ve talked about the benefits. How about the drawbacks?

Gorr: The [AI-based experimental models] tend to not be as accurate as physics-based models. Of course, that’s why you do many simulations and parameter sweeps. But that’s also the benefit of having that digital twin, where you can keep that in mind—it’s not going to be as accurate as that precise model that we’ve developed over the years.

Both chip design and manufacturing are system intensive; you have to consider every little part. And that can be really challenging. It’s a case where you might have models to predict something and different parts of it, but you still need to bring it all together.

One of the other things to think about too is that you need the data to build the models. You have to incorporate data from all sorts of different sensors and different sorts of teams, and so that heightens the challenge.

How can engineers use AI to better prepare and extract insights from hardware or sensor data?

Gorr: We always think about using AI to predict something or do some robot task, but you can use AI to come up with patterns and pick out things you might not have noticed before on your own. People will use AI when they have high-frequency data coming from many different sensors, and a lot of times it’s useful to explore the frequency domain and things like data synchronization or resampling. Those can be really challenging if you’re not sure where to start.
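As a sketch of that frequency-domain exploration (the trace below is synthetic, not real sensor data): pick out a dominant component with an FFT, then resample so streams from different sensors can share a common rate.

```python
import numpy as np

# Hypothetical sensor trace: a 50 Hz component buried in noise,
# sampled at 1 kHz for one second.
fs = 1000
t = np.arange(fs) / fs
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(fs)

# Explore the frequency domain to find the dominant component.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

# Resample (decimate 4x to 250 Hz) so this stream can be synchronized
# with slower sensors; a real pipeline would low-pass filter first.
resampled = signal[::4]
```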

One of the things I would say is, use the tools that are available. There’s a vast community of people working on these things, and you can find lots of examples [of applications and techniques] on GitHub or MATLAB Central, where people have shared nice examples, even little apps they’ve created. I think many of us are buried in data and just not sure what to do with it, so definitely take advantage of what’s already out there in the community. You can explore and see what makes sense to you, and bring in that balance of domain knowledge and the insight you get from the tools and AI.

What should engineers and designers consider when using AI for chip design?

Gorr: Think through what problems you’re trying to solve or what insights you might hope to find, and try to be clear about that. Consider all of the different components, and document and test each of those different parts. Consider all of the people involved, and explain and hand off in a way that is sensible for the whole team.

How do you think AI will affect chip designers’ jobs?

Gorr: It’s going to free up a lot of human capital for more advanced tasks. We can use AI to reduce waste, to optimize the materials, to optimize the design, but then you still have that human involved whenever it comes to decision-making. I think it’s a great example of people and technology working hand in hand. It’s also an industry where all people involved—even on the manufacturing floor—need to have some level of understanding of what’s happening, so this is a great industry for advancing AI because of how we test things and how we think about them before we put them on the chip.

How do you envision the future of AI and chip design?

Gorr: It’s very much dependent on that human element—involving people in the process and having that interpretable model. We can do many things with the mathematical minutiae of modeling, but it comes down to how people are using it, how everybody in the process is understanding and applying it. Communication and involvement of people of all skill levels in the process are going to be really important. We’re going to see less of those superprecise predictions and more transparency of information, sharing, and that digital twin—not only using AI but also using our human knowledge and all of the work that many people have done over the years.




ge

Andrew Ng: Unbiggen AI



Andrew Ng has serious street cred in artificial intelligence. He pioneered the use of graphics processing units (GPUs) to train deep learning models in the late 2000s with his students at Stanford University, cofounded Google Brain in 2011, and then served for three years as chief scientist for Baidu, where he helped build the Chinese tech giant’s AI group. So when he says he has identified the next big shift in artificial intelligence, people listen. And that’s what he told IEEE Spectrum in an exclusive Q&A.


Ng’s current efforts are focused on his company Landing AI, which built a platform called LandingLens to help manufacturers improve visual inspection with computer vision. He has also become something of an evangelist for what he calls the data-centric AI movement, which he says can yield “small data” solutions to big issues in AI, including model efficiency, accuracy, and bias.


The great advances in deep learning over the past decade or so have been powered by ever-bigger models crunching ever-bigger amounts of data. Some people argue that that’s an unsustainable trajectory. Do you agree that it can’t go on that way?

Andrew Ng: This is a big question. We’ve seen foundation models in NLP [natural language processing]. I’m excited about NLP models getting even bigger, and also about the potential of building foundation models in computer vision. I think there’s lots of signal to still be exploited in video: We have not been able to build foundation models yet for video because of compute bandwidth and the cost of processing video, as opposed to tokenized text. So I think that this engine of scaling up deep learning algorithms, which has been running for something like 15 years now, still has steam in it. Having said that, it only applies to certain problems, and there’s a set of other problems that need small data solutions.

When you say you want a foundation model for computer vision, what do you mean by that?

Ng: This is a term coined by Percy Liang and some of my friends at Stanford to refer to very large models, trained on very large data sets, that can be tuned for specific applications. For example, GPT-3 is an example of a foundation model [for NLP]. Foundation models offer a lot of promise as a new paradigm in developing machine learning applications, but also challenges in terms of making sure that they’re reasonably fair and free from bias, especially if many of us will be building on top of them.

What needs to happen for someone to build a foundation model for video?

Ng: I think there is a scalability problem. The compute power needed to process the large volume of images for video is significant, and I think that’s why foundation models have arisen first in NLP. Many researchers are working on this, and I think we’re seeing early signs of such models being developed in computer vision. But I’m confident that if a semiconductor maker gave us 10 times more processor power, we could easily find 10 times more video to build such models for vision.

Having said that, a lot of what’s happened over the past decade is that deep learning has happened in consumer-facing companies that have large user bases, sometimes billions of users, and therefore very large data sets. While that paradigm of machine learning has driven a lot of economic value in consumer software, I find that that recipe of scale doesn’t work for other industries.


It’s funny to hear you say that, because your early work was at a consumer-facing company with millions of users.

Ng: Over a decade ago, when I proposed starting the Google Brain project to use Google’s compute infrastructure to build very large neural networks, it was a controversial step. One very senior person pulled me aside and warned me that starting Google Brain would be bad for my career. I think he felt that the action couldn’t just be in scaling up, and that I should instead focus on architecture innovation.

“In many industries where giant data sets simply don’t exist, I think the focus has to shift from big data to good data. Having 50 thoughtfully engineered examples can be sufficient to explain to the neural network what you want it to learn.”
—Andrew Ng, CEO & Founder, Landing AI

I remember when my students and I published the first NeurIPS workshop paper advocating using CUDA, a platform for processing on GPUs, for deep learning—a different senior person in AI sat me down and said, “CUDA is really complicated to program. As a programming paradigm, this seems like too much work.” I did manage to convince him; the other person I did not convince.

I expect they’re both convinced now.

Ng: I think so, yes.

Over the past year as I’ve been speaking to people about the data-centric AI movement, I’ve been getting flashbacks to when I was speaking to people about deep learning and scalability 10 or 15 years ago. In the past year, I’ve been getting the same mix of “there’s nothing new here” and “this seems like the wrong direction.”


How do you define data-centric AI, and why do you consider it a movement?

Ng: Data-centric AI is the discipline of systematically engineering the data needed to successfully build an AI system. For an AI system, you have to implement some algorithm, say a neural network, in code and then train it on your data set. The dominant paradigm over the last decade was to download the data set while you focus on improving the code. Thanks to that paradigm, over the last decade deep learning networks have improved significantly, to the point where for a lot of applications the code—the neural network architecture—is basically a solved problem. So for many practical applications, it’s now more productive to hold the neural network architecture fixed, and instead find ways to improve the data.

When I started speaking about this, there were many practitioners who, completely appropriately, raised their hands and said, “Yes, we’ve been doing this for 20 years.” This is the time to take the things that some individuals have been doing intuitively and turn them into a systematic engineering discipline.

The data-centric AI movement is much bigger than one company or group of researchers. My collaborators and I organized a data-centric AI workshop at NeurIPS, and I was really delighted at the number of authors and presenters that showed up.

You often talk about companies or institutions that have only a small amount of data to work with. How can data-centric AI help them?

Ng: You hear a lot about vision systems built with millions of images—I once built a face recognition system using 350 million images. Architectures built for hundreds of millions of images don’t work with only 50 images. But it turns out, if you have 50 really good examples, you can build something valuable, like a defect-inspection system. In many industries where giant data sets simply don’t exist, I think the focus has to shift from big data to good data. Having 50 thoughtfully engineered examples can be sufficient to explain to the neural network what you want it to learn.

When you talk about training a model with just 50 images, does that really mean you’re taking an existing model that was trained on a very large data set and fine-tuning it? Or do you mean a brand new model that’s designed to learn only from that small data set?

Ng: Let me describe what Landing AI does. When doing visual inspection for manufacturers, we often use our own flavor of RetinaNet. It is a pretrained model. Having said that, the pretraining is a small piece of the puzzle. What’s a bigger piece of the puzzle is providing tools that enable the manufacturer to pick the right set of images [to use for fine-tuning] and label them in a consistent way. There’s a very practical problem we’ve seen spanning vision, NLP, and speech, where even human annotators don’t agree on the appropriate label. For big data applications, the common response has been: If the data is noisy, let’s just get a lot of data and the algorithm will average over it. But if you can develop tools that flag where the data’s inconsistent and give you a very targeted way to improve the consistency of the data, that turns out to be a more efficient way to get a high-performing system.

“Collecting more data often helps, but if you try to collect more data for everything, that can be a very expensive activity.”
—Andrew Ng

For example, if you have 10,000 images where 30 images are of one class, and those 30 images are labeled inconsistently, one of the things we do is build tools to draw your attention to the subset of data that’s inconsistent. So you can very quickly relabel those images to be more consistent, and this leads to improvement in performance.
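A minimal sketch of that kind of consistency tooling, using hypothetical annotation records rather than Landing AI's actual platform: group the labels applied to each image and surface the images where annotators disagree.

```python
from collections import defaultdict

# Hypothetical annotation log: (image_id, annotator, label).
annotations = [
    ("img_001", "ann_a", "scratch"),
    ("img_001", "ann_b", "scratch"),
    ("img_002", "ann_a", "pit_mark"),
    ("img_002", "ann_b", "scratch"),   # annotators disagree
    ("img_003", "ann_a", "ok"),
    ("img_003", "ann_b", "ok"),
    ("img_003", "ann_c", "scratch"),   # annotators disagree
]

# Collect the distinct labels applied to each image...
labels_by_image = defaultdict(set)
for image_id, _, label in annotations:
    labels_by_image[image_id].add(label)

# ...and flag the inconsistent images, so they can be relabeled first.
inconsistent = sorted(img for img, labels in labels_by_image.items()
                      if len(labels) > 1)
```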

Could this focus on high-quality data help with bias in data sets? If you’re able to curate the data more before training?

Ng: Very much so. Many researchers have pointed out that biased data is one factor among many leading to biased systems. There have been many thoughtful efforts to engineer the data. At the NeurIPS workshop, Olga Russakovsky gave a really nice talk on this. At the main NeurIPS conference, I also really enjoyed Mary Gray’s presentation, which touched on how data-centric AI is one piece of the solution, but not the entire solution. New tools like Datasheets for Datasets also seem like an important piece of the puzzle.

One of the powerful tools that data-centric AI gives us is the ability to engineer a subset of the data. Imagine training a machine-learning system and finding that its performance is okay for most of the data set, but its performance is biased for just a subset of the data. If you try to change the whole neural network architecture to improve the performance on just that subset, it’s quite difficult. But if you can engineer a subset of the data you can address the problem in a much more targeted way.

When you talk about engineering the data, what do you mean exactly?

Ng: In AI, data cleaning is important, but the way the data has been cleaned has often been in very manual ways. In computer vision, someone may visualize images through a Jupyter notebook and maybe spot the problem, and maybe fix it. But I’m excited about tools that allow you to have a very large data set, tools that draw your attention quickly and efficiently to the subset of data where, say, the labels are noisy. Or to quickly bring your attention to the one class among 100 classes where it would benefit you to collect more data. Collecting more data often helps, but if you try to collect more data for everything, that can be a very expensive activity.

For example, I once figured out that a speech-recognition system was performing poorly when there was car noise in the background. Knowing that allowed me to collect more data with car noise in the background, rather than trying to collect more data for everything, which would have been expensive and slow.
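That kind of error analysis amounts to slicing evaluation results by metadata and ranking the slices. A toy version, with invented records standing in for real evaluation logs:

```python
from collections import defaultdict

# Hypothetical evaluation records: (background condition, prediction correct?).
results = [
    ("quiet", True), ("quiet", True), ("quiet", True), ("quiet", False),
    ("car_noise", False), ("car_noise", False), ("car_noise", True),
    ("music", True), ("music", True),
]

# Error rate per slice shows where targeted data collection pays off.
totals, errors = defaultdict(int), defaultdict(int)
for condition, correct in results:
    totals[condition] += 1
    if not correct:
        errors[condition] += 1

error_rate = {c: errors[c] / totals[c] for c in totals}
worst_slice = max(error_rate, key=error_rate.get)  # collect more of this
```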


What about using synthetic data, is that often a good solution?

Ng: I think synthetic data is an important tool in the tool chest of data-centric AI. At the NeurIPS workshop, Anima Anandkumar gave a great talk that touched on synthetic data. I think there are important uses of synthetic data that go beyond just being a preprocessing step for increasing the data set for a learning algorithm. I’d love to see more tools to let developers use synthetic data generation as part of the closed loop of iterative machine learning development.

Do you mean that synthetic data would allow you to try the model on more data sets?

Ng: Not really. Here’s an example. Let’s say you’re trying to detect defects in a smartphone casing. There are many different types of defects on smartphones. It could be a scratch, a dent, pit marks, discoloration of the material, other types of blemishes. If you train the model and then find through error analysis that it’s doing well overall but it’s performing poorly on pit marks, then synthetic data generation allows you to address the problem in a more targeted way. You could generate more data just for the pit-mark category.
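A schematic of that targeted generation, assuming feature vectors rather than real images (a production pipeline would use image augmentation or a generative model): synthesize extra examples only for the class that error analysis flagged.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical feature vectors per defect class; error analysis found
# the model weak on "pit_mark", which also has far fewer examples.
dataset = {
    "scratch":  rng.normal(0.0, 1.0, size=(200, 8)),
    "pit_mark": rng.normal(2.0, 1.0, size=(30, 8)),
}

def synthesize(examples, n_new, jitter=0.05):
    # Perturb randomly chosen real examples with small noise -- a
    # stand-in for a real augmentation or generative pipeline.
    picks = examples[rng.integers(0, len(examples), size=n_new)]
    return picks + rng.normal(0.0, jitter, size=picks.shape)

# Generate data only for the weak class, not for everything.
extra = synthesize(dataset["pit_mark"], n_new=170)
dataset["pit_mark"] = np.vstack([dataset["pit_mark"], extra])
```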

“In the consumer software Internet, we could train a handful of machine-learning models to serve a billion users. In manufacturing, you might have 10,000 manufacturers building 10,000 custom AI models.”
—Andrew Ng

Synthetic data generation is a very powerful tool, but there are many simpler tools that I will often try first, such as data augmentation, improving labeling consistency, or just asking a factory to collect more data.


To make these issues more concrete, can you walk me through an example? When a company approaches Landing AI and says it has a problem with visual inspection, how do you onboard them and work toward deployment?

Ng: When a customer approaches us we usually have a conversation about their inspection problem and look at a few images to verify that the problem is feasible with computer vision. Assuming it is, we ask them to upload the data to the LandingLens platform. We often advise them on the methodology of data-centric AI and help them label the data.

One of the foci of Landing AI is to empower manufacturing companies to do the machine learning work themselves. A lot of our work is making sure the software is fast and easy to use. Through the iterative process of machine learning development, we advise customers on things like how to train models on the platform, when and how to improve the labeling of data so the performance of the model improves. Our training and software supports them all the way through deploying the trained model to an edge device in the factory.

How do you deal with changing needs? If products change or lighting conditions change in the factory, can the model keep up?

Ng: It varies by manufacturer. There is data drift in many contexts. But there are some manufacturers that have been running the same manufacturing line for 20 years now with few changes, so they don’t expect changes in the next five years. Those stable environments make things easier. For other manufacturers, we provide tools to flag when there’s a significant data-drift issue. I find it really important to empower manufacturing customers to correct data, retrain, and update the model. Because if something changes and it’s 3 a.m. in the United States, I want them to be able to adapt their learning algorithm right away to maintain operations.
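One simple way such a drift flag can work, sketched here with synthetic feature values rather than any tool Ng describes: compare a recent window of a monitored feature against the reference window captured at training time.

```python
import numpy as np

rng = np.random.default_rng(2)

# Reference window: a monitored feature (say, mean image brightness)
# captured when the model was trained.
reference = rng.normal(loc=100.0, scale=5.0, size=500)

# Recent window: lighting on the line has shifted.
recent = rng.normal(loc=108.0, scale=5.0, size=500)

def drift_score(ref, new):
    # Standardized mean shift: how many reference standard deviations
    # the new window's mean has moved.
    return abs(new.mean() - ref.mean()) / ref.std()

THRESHOLD = 1.0  # tune per feature; one std dev is an arbitrary choice
drifted = drift_score(reference, recent) > THRESHOLD
```

When the flag trips, the loop described above kicks in: the customer corrects data, retrains, and updates the deployed model.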

In the consumer software Internet, we could train a handful of machine-learning models to serve a billion users. In manufacturing, you might have 10,000 manufacturers building 10,000 custom AI models. The challenge is, how do you do that without Landing AI having to hire 10,000 machine learning specialists?

So you’re saying that to make it scale, you have to empower customers to do a lot of the training and other work.

Ng: Yes, exactly! This is an industry-wide problem in AI, not just in manufacturing. Look at health care. Every hospital has its own slightly different format for electronic health records. How can every hospital train its own custom AI model? Expecting every hospital’s IT personnel to invent new neural-network architectures is unrealistic. The only way out of this dilemma is to build tools that empower the customers to build their own models by giving them tools to engineer the data and express their domain knowledge. That’s what Landing AI is executing in computer vision, and the field of AI needs other teams to execute this in other domains.

Is there anything else you think it’s important for people to understand about the work you’re doing or the data-centric AI movement?

Ng: In the last decade, the biggest shift in AI was a shift to deep learning. I think it’s quite possible that in this decade the biggest shift will be to data-centric AI. With the maturity of today’s neural network architectures, I think for a lot of the practical applications the bottleneck will be whether we can efficiently get the data we need to develop systems that work well. The data-centric AI movement has tremendous energy and momentum across the whole community. I hope more researchers and developers will jump in and work on it.


This article appears in the April 2022 print issue as “Andrew Ng, AI Minimalist.”





New Carrier Fluid Makes Hydrogen Way Easier to Transport



Imagine pulling up to a refueling station and filling your vehicle’s tank with liquid hydrogen, as safe and convenient to handle as gasoline or diesel, without the need for high-pressure tanks or cryogenic storage. This vision of a sustainable future could become a reality if a Calgary, Canada–based company, Ayrton Energy, can scale up its innovative method of hydrogen storage and distribution. Ayrton’s technology could make hydrogen a viable, one-to-one replacement for fossil fuels in existing infrastructure like pipelines, fuel tankers, rail cars, and trucks.

The company’s approach is to use liquid organic hydrogen carriers (LOHCs) to make it easier to transport and store hydrogen. The method chemically bonds hydrogen to carrier molecules, which absorb hydrogen molecules and make them more stable—kind of like hydrogenating cooking oil to produce margarine.

A researcher pours a sample of Ayrton’s LOHC fluid into a vial. [Photo: Ayrton Energy]

The approach would allow liquid hydrogen to be transported and stored in ambient conditions, rather than in the high-pressure, cryogenic tanks (to hold it at temperatures below -252 °C) currently required for keeping hydrogen in liquid form. It would also be a big improvement on gaseous hydrogen, which is highly volatile and difficult to keep contained.

Founded in 2021, Ayrton is one of several companies across the globe developing LOHCs, including Japan’s Chiyoda and Mitsubishi, Germany’s Covalion, and China’s Hynertech. But toxicity, energy density, and input energy issues have limited LOHCs as contenders for making liquid hydrogen feasible. Ayrton says its formulation eliminates these trade-offs.

Safe, Efficient Hydrogen Fuel for Vehicles

Conventional LOHC technologies used by most of the aforementioned companies rely on substances such as toluene, which forms methylcyclohexane when hydrogenated. These carriers pose safety risks due to their flammability and volatility. Hydrogenious LOHC Technologies in Erlangen, Germany, and other hydrogen fuel companies have shifted toward dibenzyltoluene, a more stable carrier that holds more hydrogen per unit volume than methylcyclohexane, though it requires higher temperatures (and thus more energy) to bind and release the hydrogen. Dibenzyltoluene hydrogenation occurs at between 3 and 10 megapascals (30 and 100 bar) and 200–300 °C, compared with 10 MPa (100 bar) and just under 200 °C for methylcyclohexane.

Ayrton’s proprietary oil-based hydrogen carrier not only captures and releases hydrogen with less input energy than is required for other LOHCs, but also stores more hydrogen than methylcyclohexane can—55 kilograms per cubic meter compared with methylcyclohexane’s 50 kg/m³. Dibenzyltoluene holds more hydrogen per unit volume (up to 65 kg/m³), but Ayrton’s approach to infusing the carrier with hydrogen atoms promises to cost less. Hydrogenation or dehydrogenation with Ayrton’s carrier fluid occurs at 0.1 megapascal (1 bar) and about 100 °C, says founder and CEO Natasha Kostenuk. And as with the other LOHCs, after hydrogenation it can be transported and stored at ambient temperatures and pressures.
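A quick back-of-the-envelope comparison puts those volumetric densities in context: how much hydrogen each carrier delivers per tanker load. The 30 m³ compartment size below is an illustrative assumption, not a figure from Ayrton:

```python
# Hydrogen carried per tanker load for each carrier, using the volumetric
# densities quoted above (kg of H2 per cubic meter of carrier fluid).
densities_kg_per_m3 = {
    "methylcyclohexane": 50,
    "Ayrton carrier": 55,
    "dibenzyltoluene": 65,
}
tanker_volume_m3 = 30  # illustrative compartment size, not an Ayrton figure

load_kg = {name: rho * tanker_volume_m3 for name, rho in densities_kg_per_m3.items()}
for name, kg in load_kg.items():
    print(f"{name}: {kg} kg of hydrogen per load")  # e.g. Ayrton carrier: 1650 kg
```

On these assumptions, Ayrton's fluid carries 150 kg more hydrogen per load than methylcyclohexane, while dibenzyltoluene still carries the most per load but at a higher hydrogenation cost.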

“Judges described [Ayrton's approach] as a critical technology for the deployment of hydrogen at large scale.” —Katie Richardson, National Renewable Energy Lab

Ayrton’s LOHC fluid is as safe to handle as margarine, but it’s still a chemical, says Kostenuk. “I wouldn’t drink it. If you did, you wouldn’t feel very good. But it’s not lethal,” she says.

Kostenuk and fellow Ayrton cofounder Brandy Kinkead (who serves as the company’s chief technical officer) were originally trying to bring hydrogen generators to market to fill gaps in the electrical grid. “We were looking for fuel cells and hydrogen storage. Fuel cells were easy to find, but we couldn’t find a hydrogen storage method or medium that would be safe and easy to transport to fuel our vision of what we were trying to do with hydrogen generators,” Kostenuk says. During the search, they came across LOHC technology but weren’t satisfied with the trade-offs demanded by existing liquid hydrogen carriers. “We had the idea that we could do it better,” she says. The duo pivoted, adjusting their focus from hydrogen generators to hydrogen storage solutions.

“Everybody gets excited about hydrogen production and hydrogen end use, but they forget that you have to store and manage the hydrogen,” Kostenuk says. Incompatibility with current storage and distribution has been a barrier to adoption, she says. “We’re really excited about being able to reuse existing infrastructure that’s in place all over the world.” Ayrton’s hydrogenated liquid has fuel-cell-grade (99.999 percent) hydrogen purity, so there’s no advantage in using pure liquid hydrogen with its need for subzero temperatures, according to the company.

The main challenge the company faces is the set of issues that come along with any technology scaling up from pilot-stage production to commercial manufacturing, says Kostenuk. “A crucial part of that is aligning ourselves with the right manufacturing partners along the way,” she notes.

Asked about how Ayrton is dealing with some other challenges common to LOHCs, Kostenuk says Ayrton has managed to sidestep them. “We stayed away from materials that are expensive and hard to procure, which will help us avoid any supply chain issues,” she says. By performing the reactions at such low temperatures, Ayrton can get its carrier fluid to withstand 1,000 hydrogenation-dehydrogenation cycles before it no longer holds enough hydrogen to be useful. Conventional LOHCs are limited to a couple of hundred cycles before the high temperatures required for bonding and releasing the hydrogen break down the fluid and diminish its storage capacity, Kostenuk says.
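Those cycle counts translate directly into how often the carrier fluid must be replaced. A rough sketch, assuming one full hydrogenation-dehydrogenation cycle per day (an illustrative duty cycle, not a figure from the company):

```python
# Rough carrier-fluid lifetime from the cycle counts quoted above,
# assuming one full cycle per day (illustrative duty cycle).
cycle_limits = {"Ayrton carrier": 1000, "conventional LOHC": 200}
cycles_per_year = 365

lifetime_years = {name: n / cycles_per_year for name, n in cycle_limits.items()}
for name, years in lifetime_years.items():
    print(f"{name}: ~{years:.1f} years before the fluid needs replacing")
```

Under that assumption, Ayrton's fluid would last roughly five times as long in service as a conventional LOHC, which matters for operating cost as much as for chemistry.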

Breakthrough in Hydrogen Storage Technology

In acknowledgement of what Ayrton’s nontoxic, oil-based carrier fluid could mean for the energy and transportation sectors, the U.S. National Renewable Energy Lab (NREL) at its annual Industry Growth Forum in May named Ayrton an “outstanding early-stage venture.” A selection committee of more than 180 climate tech and cleantech investors and industry experts chose Ayrton from a pool of more than 200 initial applicants, says Katie Richardson, group manager of NREL’s Innovation and Entrepreneurship Center, which organized the forum. The committee based its decision on the company’s innovation, market positioning, business model, team, next steps for funding, technology, capital use, and quality of pitch presentation. “Judges described Ayrton’s approach as a critical technology for the deployment of hydrogen at large scale,” Richardson says.

As a next step toward enabling hydrogen to push gasoline and diesel aside, “we’re talking with hydrogen producers who are right now offering their customers cryogenic and compressed hydrogen,” says Kostenuk. “If they offered LOHC, it would enable them to deliver across longer distances, in larger volumes, in a multimodal way.” The company is also talking to some industrial site owners who could use the hydrogenated LOHC for buffer storage to hold onto some of the energy they’re getting from clean, intermittent sources like solar and wind. Another natural fit, she says, is energy service providers that are looking for a reliable method of seasonal storage beyond what batteries can offer. The goal is to eventually scale up enough to become the go-to alternative (or perhaps the standard) fuel for cars, trucks, trains, and ships.





Get to Know the IEEE Board of Directors



The IEEE Board of Directors shapes the future direction of IEEE and is committed to ensuring IEEE remains a strong and vibrant organization—serving the needs of its members and the engineering and technology community worldwide—while fulfilling the IEEE mission of advancing technology for the benefit of humanity.

This article features IEEE Board of Directors members ChunChe “Lance” Fung, Eric Grigorian, and Christina Schober.

IEEE Senior Member ChunChe “Lance” Fung

Director, Region 10: Asia Pacific

[Photo: Joanna Mai Yie Leung]

Fung has worked in academia and provided industry consultancy services for more than 40 years. His research interests include applying artificial intelligence, machine learning, computational intelligence, and other techniques to solve practical problems. He has authored more than 400 publications in the disciplines of AI, computational intelligence, and related applications. Fung currently works on the ethical applications and social impacts of AI.

A member of the IEEE Systems, Man, and Cybernetics Society, Fung has been an active IEEE volunteer for more than 30 years. As a member and chair of the IEEE Technical Program Integrity and Conference Quality committees, he oversaw the quality of technical programs presented at IEEE conferences. Fung also chaired the Region 10 Educational Activities Committee. He was instrumental in translating educational materials to local languages for the IEEE Reaching Locals project.

As chair of the IEEE New Initiatives Committee, he established and promoted the US $1 Million Challenge Call for New Initiatives, which supports potential IEEE programs, services, or products that will significantly benefit members, the public, the technical community, or customers and could have a lasting impact on IEEE or its business processes.

Fung has left an indelible mark as a dedicated educator at Singapore Polytechnic, Curtin University, and Murdoch University. He was appointed in 2015 as professor emeritus at Murdoch, and he takes pride in training the next generation of volunteers, leaders, teachers, and researchers in the Western Australian community. Fung received the IEEE Third Millennium Medal and the IEEE Region 10 Outstanding Volunteer Award.

IEEE Senior Member Eric Grigorian

Director, Region 3: Southern U.S. & Jamaica

[Photo: Sean McNeil/GTRI]

Grigorian has extensive experience leading international cross-domain teams that support the commercial and defense industries. His current research focuses on implementing model-based systems engineering, creating models that depict system behavior, interfaces, and architecture. His work has led to streamlined processes, reduced costs, and faster design and implementation of capabilities due to efficient modeling and verification. Grigorian holds two U.S. utility patents.

Grigorian has been an active volunteer with IEEE since his time as a student member at the University of Alabama in Huntsville (UAH). He saw it as an excellent way to network and get to know people. He found his personality was suited for working within the organization and building leadership skills. During the past 43 years as an IEEE member, he has been affiliated with the IEEE Aerospace and Electronic Systems (AESS), IEEE Computer, and IEEE Communications societies.

As Grigorian’s career has evolved, his involvement with IEEE has also increased. He has served the IEEE Huntsville Section as student activities chair, vice chair, and chair, and was also the section’s AESS chair. He served as IEEE SoutheastCon chair in 2008 and 2019, and served on the IEEE Region 3 executive committee as area chair and conference committee chair, enhancing IEEE members’ benefits, engagement, and career advancement. He has significantly contributed to initiatives within IEEE, including promoting preuniversity science, technology, engineering, and mathematics efforts in Alabama.

Grigorian’s professional achievements have been recognized with numerous awards from employers and local technical chapters, including the 2020 UAH Alumni of Achievement Award for the College of Engineering and the 2006 IEEE Region 3 Outstanding Engineer of the Year Award. He is a member of the IEEE–Eta Kappa Nu honor society.

IEEE Life Senior Member Christina Schober

Director, Division V

[Photo: Katie Fears/Brio Art]

Schober is an innovative engineer with a diverse design and manufacturing engineering background. With more than 40 years of experience, her career has spanned the research, design, and manufacture of sensors for space, commercial, and military aircraft navigation and tactical guidance systems. She was responsible for the successful transition from design to production for groundbreaking programs including an integrated flight management system, the Stinger missile’s roll frequency sensor, and the design of three phases of the DARPA atomic clock. She holds 17 U.S. patents and 24 other patents in the aerospace and navigation fields.

Schober started her career in the 1980s, at a time when female engineers were not widely accepted. The prevailing attitude required her to “stay tough,” she says, and she credits IEEE for giving her technical and professional support. Because of her experiences, she became dedicated to making diversity and inclusion systemic in IEEE.

Schober has held many leadership roles, including IEEE Division VIII Director, IEEE Sensors Council president, and IEEE Standards Sensors Council secretary. In addition to her membership in the IEEE Photonics Society, she is active with the IEEE Computer Society, IEEE Sensors Council, IEEE Standards Association, and IEEE Women in Engineering.

She is also active in her local community, serving as an invited speaker on STEM for the public school system and volunteering at youth shelters. Schober has received numerous awards, including the IEEE Sensors Council Lifetime Contribution Award and the IEEE Twin Cities Section’s Young Engineer of the Year Award. She is an IEEE Computer Society Gold Core member, a member of the IEEE–Eta Kappa Nu honor society, and a recipient of the IEEE Third Millennium Medal.





It’s Time to Redefine What a Megafire Is in the Climate Change Era

It's not the reach of a fire that matters most; it's the speed. Understanding this can help society better prepare.










Charger recall spells more bad news for Humane’s maligned AI Pin

Humane first reported overheating problems with the portable charger in June.








In a Landmark Study, Scientists Discover Just How Much Earth's Temperature Has Changed Over Nearly 500 Million Years

Researchers show the average surface temperature on our planet has shifted between 51.8 and 96.8 degrees Fahrenheit





How Scientists’ Tender Loving Care Could Save This Endangered Penguin Species

From fish smoothies to oral antibiotics, researchers are taking matters into their own hands in a radical effort to save New Zealand’s yellow-eyed penguins





Can Fungi Save This Endangered Hawaiian Tree?

By inoculating greenhouse na’u seedlings with mycorrhizal fungi, researchers hope to boost survival odds when the plants are returned to the wild









The Best Gel Nail Kits for At-Home Manicures

Due for a mani or pedi? No need for a salon when you have one of these DIY gel nail kits.






'AGT': Daredevil Annaliese Nock Terrifies Judges With Wheel of Death Stunt

'AGT' returned on Tuesday with live shows from Universal Studios Hollywood.
