Nova Scotia biologist adapting COVID-19 technology to detect oyster disease

A biologist at Cape Breton University is hoping a piece of technology used to keep people safe in the pandemic can help protect Nova Scotia's oysters against the effects of warming waters.




House Republicans demand Biden Cabinet members preserve all documents, communications

House Republicans on Tuesday demanded that each member of President Biden's Cabinet preserve all relevant documents and communications, a move that signals future investigations into the Biden administration.





Justice Department kept FBI employees in the dark for years about whistleblower protections

A new Government Accountability Office report says that the Justice Department kept FBI employees in the dark for seven years after Congress updated whistleblower protections for bureau personnel in 2016.





Score big on Amazon Black Friday 2024 with my insider tips

Amazon's Black Friday sales event starts Friday, Nov. 22. Kurt the CyberGuy offers some tips on how to get the best deals on merchandise.




Watch | Helen Glover: Bidding for glory five years and three children after her last Olympics










Exclusive: Liz Truss urged to act with Britain facing biggest loss of sports facilities in a generation





Hashtag Trending Mar.1- HP debacle; Humanoid robots closer to hitting our workplaces; Apple blew $10 billion on the electric car before pulling the plug

If rumours are true, and this one should be (I started it), we have a special edition of the Weekend show where we talk about the evolution of the role of the CIO with two incredible CIOs as the CIO Association of Canada turns 20. Don’t miss it. Can HP make you love […]


Hashtag Trending Mar.5- Apple Music fined for market dominance; LockBit back from the dead; OpenAI kills ChatGPT plugins

Apple Music gives a whole new meaning to the phrase “the hits just keep on coming.” It’s not the opposing candidates but public AI systems that are spreading election disinformation; LockBit, the cybercriminal gang, may be back from the dead; and we say so long to the ChatGPT plugins, which went from innovation to legacy […]


Perseid meteor shower peaks Sunday night, potentially giving stargazers big show

The annual Perseid meteor shower is set to peak on Sunday night into early Monday morning, giving stargazers the chance to see hundreds of meteors.




Dark matter may be behind wobble in Mars’ orbit, study suggests

A bold new study in Physical Review suggests that a wobble detected in Mars' orbit could be the result of dark matter made up of primordial black holes.




Engineered bacteria destroy antibiotic resistance DNA in wastewater

Wastewater is a major reservoir for antibiotic resistance genes, but modified bacteria can chop up this DNA before the dangerous microbes reach people.





Rejecting standard cancer treatment like Elle Macpherson is a big risk

People with cancer may have understandable reasons to follow Australian supermodel Elle Macpherson in declining chemotherapy, but the odds aren’t in their favour, warns Elle Hunt.





Antibiotic resistance forecast to kill 39 million people by 2050

The number of people worldwide directly killed by antibiotic resistance will rise to 1.9 million a year by 2050, according to the most comprehensive study so far.





Bird flu virus that infected a person in Missouri had a rare mutation

Genetic analysis of a bird flu virus detected in a person in Missouri with no known prior contact with animals offers more details on the case, but experts say there isn’t substantial evidence to suggest human-to-human transmission is happening.





The US is ramping up bird flu surveillance – but will it be enough?

Two more people in the US have tested positive for the H5N1 bird flu virus, highlighting the need for expanded influenza surveillance to prevent a potential pandemic.





The brain has its own microbiome. Here's what it means for your health

Neuroscientists have been surprised to discover that the human brain is teeming with microbes, and we are beginning to suspect they could play a role in neurodegenerative disorders like Alzheimer's.





France slashed bird flu outbreaks by vaccinating ducks

A vaccination campaign targeting ducks, the farm birds most at risk of getting and spreading bird flu, succeeded in greatly reducing outbreaks of the virus on poultry farms in France.





Neuroscientist finds her brain shrinks while taking birth control

A researcher who underwent dozens of brain scans discovered that the volume of her cerebral cortex was 1 per cent lower when she took hormonal contraceptives.





One course of antibiotics can change your gut microbiome for years

Antibiotics can reduce diversity in the gut microbiome, raising the risk of infections that cause diarrhoea, and the effects may last years.





Bird flu was found in a US pig – does that raise the risk for humans?

A bird flu virus that has been circulating in dairy cattle for months has now been found in a pig in the US for the first time, raising the risk of the virus evolving to become more dangerous to people.





Bird flu antibodies found in dairy workers in Michigan and Colorado

Blood tests have shown that about 7 per cent of workers on dairy farms that had H5N1 outbreaks had antibodies against the disease.





Israeli leader tells Biden 'we have to get hostages back' who are 'going through hell in dungeons of Gaza'

Israeli President Isaac Herzog says hostages are "going through hell in the dungeons of Gaza" during meeting with President Biden at White House.




Here's what happens to Sen. Rubio's seat if he becomes secretary of state and who could replace him

Speculation has already run rampant on who Florida Gov. Ron DeSantis will appoint to replace Sen. Marco Rubio if Rubio becomes President-elect Trump's secretary of state.




Biden supports bringing adversarial nations into new UN cyber crime alliance

The Biden administration will support a United Nations treaty this week creating a new cybercrime convention that includes China and Russia, a move that has not sat well with some lawmakers and critics.




Trump picking Cabinet at breakneck speed compared to 2016

President-elect Trump has made six selections to serve in his Cabinet in the week since the election, a faster pace than he set when elected to the presidency in 2016.




Trump's first Cabinet picks decidedly not isolationists: Ukraine, Israel breathe a sigh of relief

Despite his own isolationist musings, the first picks of President-elect Donald Trump's incoming administration hail from a decidedly more traditionalist wing of the Republican Party.




Jill Biden's apparent cold shoulder for Kamala Harris ignites social media

Social media commentators claimed Jill Biden refused to look at Vice President Harris as they were seated together at Arlington National Cemetery for a Veterans Day remembrance.




Bird flu study findings have CDC calling for more testing of dairy farm employees

A new study by the Centers for Disease Control and Prevention found that some dairy farm employees showed signs of infection even when they didn’t report feeling sick. The CDC concluded that more bird flu testing of dairy farm employees is required.





The game may have just tilted in favor of a new Farm Bill

Politics and pinball sometimes have a lot in common: both can produce surprising results. The lights and metal balls that pinball was known for before the digital age would sometimes make you an unexpected winner, and politics may do the same for all those who want to revive the Farm Bill.





Human bird flu leaps into Canada

A Canadian teenager has been hospitalized with bird flu at British Columbia Children’s Hospital, the first person in Canada to test positive for the virus. The teen likely acquired it through exposure to a bird or other animal.





Labour minister moves to end port lockouts in Montreal and British Columbia

Dispute risks damage to Canada's reputation as a reliable trade partner, says Steven MacKinnon.





Mark Cuban runs to 'less hateful' social media platform after scrubbing X account of Harris support

Dallas Mavericks minority owner Mark Cuban returned to the Bluesky social media platform with a post after weeks of contentious X posts.




Atomically Thin Materials Significantly Shrink Qubits



Quantum computing is a devilishly complex technology, with many technical hurdles impacting its development. Of these challenges, two critical issues stand out: miniaturization and qubit quality.

IBM has adopted a superconducting qubit road map that targets a 1,121-qubit processor by 2023, leading to the expectation that a 1,000-qubit machine with today’s qubit form factor is feasible. However, current approaches will require very large chips (50 millimeters on a side, or larger) at the scale of small wafers, or the use of chiplets on multichip modules. While this approach will work, the aim is to attain a better path toward scalability.

Now researchers at MIT have managed both to shrink the qubits and to do so in a way that reduces the interference between neighboring qubits. In doing so, they have increased the number of superconducting qubits that can be packed onto a device by a factor of 100.

“We are addressing both qubit miniaturization and quality,” said William Oliver, the director for the Center for Quantum Engineering at MIT. “Unlike conventional transistor scaling, where only the number really matters, for qubits, large numbers are not sufficient, they must also be high-performance. Sacrificing performance for qubit number is not a useful trade in quantum computing. They must go hand in hand.”

The key to this big increase in qubit density and reduction of interference comes down to the use of two-dimensional materials, in particular the 2D insulator hexagonal boron nitride (hBN). The MIT researchers demonstrated that a few atomic monolayers of hBN can be stacked to form the insulator in the capacitors of a superconducting qubit.

Just like other capacitors, the capacitors in these superconducting circuits take the form of a sandwich in which an insulator material is sandwiched between two metal plates. The big difference for these capacitors is that the superconducting circuits can operate only at extremely low temperatures—less than 0.02 degrees above absolute zero (-273.15 °C).

Superconducting qubits are measured at temperatures as low as 20 millikelvin in a dilution refrigerator. [Photo: Nathan Fiske/MIT]

In that environment, insulating materials that are available for the job, such as PE-CVD silicon oxide or silicon nitride, have quite a few defects that are too lossy for quantum computing applications. To get around these material shortcomings, most superconducting circuits use what are called coplanar capacitors. In these capacitors, the plates are positioned laterally to one another, rather than on top of one another.

As a result, the intrinsic silicon substrate below the plates and to a smaller degree the vacuum above the plates serve as the capacitor dielectric. Intrinsic silicon is chemically pure and therefore has few defects, and the large size dilutes the electric field at the plate interfaces, all of which leads to a low-loss capacitor. The lateral size of each plate in this open-face design ends up being quite large (typically 100 by 100 micrometers) in order to achieve the required capacitance.
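The footprint argument can be made concrete with the parallel-plate formula C = ε0·εr·A/d. In the sketch below, only the roughly 100-micrometer coplanar plate size comes from the article; the target capacitance, the hBN permittivity, and the dielectric thickness are illustrative assumptions, not values reported by the MIT team.

```python
# Rough footprint comparison using C = eps0 * eps_r * A / d.
# Assumed values: ~70 fF shunt capacitance, eps_r ~ 3.5 for hBN,
# ~10 nm of stacked hBN monolayers. Only the ~100 um coplanar
# plate size is taken from the article.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_side_um(c_farads: float, eps_r: float, d_m: float) -> float:
    """Side length in micrometers of a square parallel-plate capacitor."""
    area_m2 = c_farads * d_m / (EPS0 * eps_r)  # A = C * d / (eps0 * eps_r)
    return (area_m2 ** 0.5) * 1e6

side = plate_side_um(70e-15, 3.5, 10e-9)
print(f"hBN parallel-plate side: {side:.1f} um (vs ~100 um coplanar)")
```

Under these assumptions the plate shrinks from roughly 100 micrometers to a few micrometers on a side, an area reduction of about two orders of magnitude, consistent with the reported 100-fold increase in qubit density.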

In an effort to move away from the large lateral configuration, the MIT researchers embarked on a search for an insulator that has very few defects and is compatible with superconducting capacitor plates.

“We chose to study hBN because it is the most widely used insulator in 2D material research due to its cleanliness and chemical inertness,” said colead author Joel Wang, a research scientist in the Engineering Quantum Systems group of the MIT Research Laboratory for Electronics.

On either side of the hBN, the MIT researchers used the 2D superconducting material, niobium diselenide. One of the trickiest aspects of fabricating the capacitors was working with the niobium diselenide, which oxidizes in seconds when exposed to air, according to Wang. This necessitates that the assembly of the capacitor occur in a glove box filled with argon gas.

While this would seemingly complicate the scaling up of the production of these capacitors, Wang doesn’t regard this as a limiting factor.

“What determines the quality factor of the capacitor are the two interfaces between the two materials,” said Wang. “Once the sandwich is made, the two interfaces are ‘sealed’ and we don’t see any noticeable degradation over time when exposed to the atmosphere.”

This lack of degradation is because around 90 percent of the electric field is contained within the sandwich structure, so the oxidation of the outer surface of the niobium diselenide does not play a significant role anymore. This ultimately makes the capacitor footprint much smaller, and it accounts for the reduction in cross talk between the neighboring qubits.

“The main challenge for scaling up the fabrication will be the wafer-scale growth of hBN and 2D superconductors like [niobium diselenide], and how one can do wafer-scale stacking of these films,” added Wang.

Wang believes that this research has shown 2D hBN to be a good insulator candidate for superconducting qubits. He says that the groundwork the MIT team has done will serve as a road map for using other hybrid 2D materials to build superconducting circuits.





Andrew Ng: Unbiggen AI



Andrew Ng has serious street cred in artificial intelligence. He pioneered the use of graphics processing units (GPUs) to train deep learning models in the late 2000s with his students at Stanford University, cofounded Google Brain in 2011, and then served for three years as chief scientist for Baidu, where he helped build the Chinese tech giant’s AI group. So when he says he has identified the next big shift in artificial intelligence, people listen. And that’s what he told IEEE Spectrum in an exclusive Q&A.


Ng’s current efforts are focused on his company Landing AI, which built a platform called LandingLens to help manufacturers improve visual inspection with computer vision. He has also become something of an evangelist for what he calls the data-centric AI movement, which he says can yield “small data” solutions to big issues in AI, including model efficiency, accuracy, and bias.


The great advances in deep learning over the past decade or so have been powered by ever-bigger models crunching ever-bigger amounts of data. Some people argue that that’s an unsustainable trajectory. Do you agree that it can’t go on that way?

Andrew Ng: This is a big question. We’ve seen foundation models in NLP [natural language processing]. I’m excited about NLP models getting even bigger, and also about the potential of building foundation models in computer vision. I think there’s lots of signal to still be exploited in video: We have not been able to build foundation models yet for video because of compute bandwidth and the cost of processing video, as opposed to tokenized text. So I think that this engine of scaling up deep learning algorithms, which has been running for something like 15 years now, still has steam in it. Having said that, it only applies to certain problems, and there’s a set of other problems that need small data solutions.

When you say you want a foundation model for computer vision, what do you mean by that?

Ng: This is a term coined by Percy Liang and some of my friends at Stanford to refer to very large models, trained on very large data sets, that can be tuned for specific applications. For example, GPT-3 is an example of a foundation model [for NLP]. Foundation models offer a lot of promise as a new paradigm in developing machine learning applications, but also challenges in terms of making sure that they’re reasonably fair and free from bias, especially if many of us will be building on top of them.

What needs to happen for someone to build a foundation model for video?

Ng: I think there is a scalability problem. The compute power needed to process the large volume of images for video is significant, and I think that’s why foundation models have arisen first in NLP. Many researchers are working on this, and I think we’re seeing early signs of such models being developed in computer vision. But I’m confident that if a semiconductor maker gave us 10 times more processor power, we could easily find 10 times more video to build such models for vision.

Having said that, a lot of what’s happened over the past decade is that deep learning has happened in consumer-facing companies that have large user bases, sometimes billions of users, and therefore very large data sets. While that paradigm of machine learning has driven a lot of economic value in consumer software, I find that that recipe of scale doesn’t work for other industries.


It’s funny to hear you say that, because your early work was at a consumer-facing company with millions of users.

Ng: Over a decade ago, when I proposed starting the Google Brain project to use Google’s compute infrastructure to build very large neural networks, it was a controversial step. One very senior person pulled me aside and warned me that starting Google Brain would be bad for my career. I think he felt that the action couldn’t just be in scaling up, and that I should instead focus on architecture innovation.

“In many industries where giant data sets simply don’t exist, I think the focus has to shift from big data to good data. Having 50 thoughtfully engineered examples can be sufficient to explain to the neural network what you want it to learn.”
—Andrew Ng, CEO & Founder, Landing AI

I remember when my students and I published the first NeurIPS workshop paper advocating using CUDA, a platform for processing on GPUs, for deep learning—a different senior person in AI sat me down and said, “CUDA is really complicated to program. As a programming paradigm, this seems like too much work.” I did manage to convince him; the other person I did not convince.

I expect they’re both convinced now.

Ng: I think so, yes.

Over the past year as I’ve been speaking to people about the data-centric AI movement, I’ve been getting flashbacks to when I was speaking to people about deep learning and scalability 10 or 15 years ago. In the past year, I’ve been getting the same mix of “there’s nothing new here” and “this seems like the wrong direction.”


How do you define data-centric AI, and why do you consider it a movement?

Ng: Data-centric AI is the discipline of systematically engineering the data needed to successfully build an AI system. For an AI system, you have to implement some algorithm, say a neural network, in code and then train it on your data set. The dominant paradigm over the last decade was to download the data set while you focus on improving the code. Thanks to that paradigm, over the last decade deep learning networks have improved significantly, to the point where for a lot of applications the code—the neural network architecture—is basically a solved problem. So for many practical applications, it’s now more productive to hold the neural network architecture fixed, and instead find ways to improve the data.
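As a schematic illustration only (not Ng's actual workflow or any real training API), the loop he describes, where the architecture stays fixed while the data improves, might be sketched like this:

```python
# Schematic of the data-centric iteration loop: the model architecture
# stays fixed while the data improves. train_and_eval() is a toy
# stand-in whose accuracy simply tracks label consistency; it is not
# a real training API.
def train_and_eval(label_consistency: float) -> float:
    return min(0.99, 0.6 + 0.4 * label_consistency)

def data_centric_loop(consistency=0.5, target=0.95, max_rounds=10):
    history = []
    for _ in range(max_rounds):
        acc = train_and_eval(consistency)  # architecture never changes
        history.append(round(acc, 3))
        if acc >= target:
            break
        # "Improve the data": e.g. relabel the most inconsistent items.
        consistency = min(1.0, consistency + 0.15)
    return history

print(data_centric_loop())  # accuracy rises as data quality improves
```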

When I started speaking about this, there were many practitioners who, completely appropriately, raised their hands and said, “Yes, we’ve been doing this for 20 years.” This is the time to take the things that some individuals have been doing intuitively and make them a systematic engineering discipline.

The data-centric AI movement is much bigger than one company or group of researchers. My collaborators and I organized a data-centric AI workshop at NeurIPS, and I was really delighted at the number of authors and presenters that showed up.

You often talk about companies or institutions that have only a small amount of data to work with. How can data-centric AI help them?

Ng: You hear a lot about vision systems built with millions of images—I once built a face recognition system using 350 million images. Architectures built for hundreds of millions of images don’t work with only 50 images. But it turns out, if you have 50 really good examples, you can build something valuable, like a defect-inspection system. In many industries where giant data sets simply don’t exist, I think the focus has to shift from big data to good data. Having 50 thoughtfully engineered examples can be sufficient to explain to the neural network what you want it to learn.

When you talk about training a model with just 50 images, does that really mean you’re taking an existing model that was trained on a very large data set and fine-tuning it? Or do you mean a brand new model that’s designed to learn only from that small data set?

Ng: Let me describe what Landing AI does. When doing visual inspection for manufacturers, we often use our own flavor of RetinaNet. It is a pretrained model. Having said that, the pretraining is a small piece of the puzzle. What’s a bigger piece of the puzzle is providing tools that enable the manufacturer to pick the right set of images [to use for fine-tuning] and label them in a consistent way. There’s a very practical problem we’ve seen spanning vision, NLP, and speech, where even human annotators don’t agree on the appropriate label. For big data applications, the common response has been: If the data is noisy, let’s just get a lot of data and the algorithm will average over it. But if you can develop tools that flag where the data’s inconsistent and give you a very targeted way to improve the consistency of the data, that turns out to be a more efficient way to get a high-performing system.

“Collecting more data often helps, but if you try to collect more data for everything, that can be a very expensive activity.”
—Andrew Ng

For example, if you have 10,000 images where 30 images are of one class, and those 30 images are labeled inconsistently, one of the things we do is build tools to draw your attention to the subset of data that’s inconsistent. So you can very quickly relabel those images to be more consistent, and this leads to improvement in performance.
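A minimal sketch of that kind of consistency tooling, assuming a hypothetical format in which each image carries labels from several annotators, could rank items by annotator agreement so the most contested ones get relabeled first:

```python
# Flag items whose annotators disagree, most contested first.
# The data format (item id -> labels from different annotators)
# is hypothetical, for illustration only.
annotations = {
    "img_001": ["scratch", "scratch", "scratch"],
    "img_002": ["scratch", "dent", "scratch"],
    "img_003": ["dent", "dent"],
    "img_004": ["pit_mark", "scratch", "dent"],
}

def inconsistent_items(annotations):
    """Item ids with more than one distinct label, lowest agreement first."""
    flagged = []
    for item, labels in annotations.items():
        if len(set(labels)) > 1:
            # agreement = share of votes for the majority label
            majority_votes = max(labels.count(lab) for lab in set(labels))
            agreement = majority_votes / len(labels)
            flagged.append((agreement, item))
    return [item for _, item in sorted(flagged)]

print(inconsistent_items(annotations))  # → ['img_004', 'img_002']
```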

Could this focus on high-quality data help with bias in data sets? If you’re able to curate the data more before training?

Ng: Very much so. Many researchers have pointed out that biased data is one factor among many leading to biased systems. There have been many thoughtful efforts to engineer the data. At the NeurIPS workshop, Olga Russakovsky gave a really nice talk on this. At the main NeurIPS conference, I also really enjoyed Mary Gray’s presentation, which touched on how data-centric AI is one piece of the solution, but not the entire solution. New tools like Datasheets for Datasets also seem like an important piece of the puzzle.

One of the powerful tools that data-centric AI gives us is the ability to engineer a subset of the data. Imagine training a machine-learning system and finding that its performance is okay for most of the data set, but its performance is biased for just a subset of the data. If you try to change the whole neural network architecture to improve the performance on just that subset, it’s quite difficult. But if you can engineer a subset of the data you can address the problem in a much more targeted way.

When you talk about engineering the data, what do you mean exactly?

Ng: In AI, data cleaning is important, but the way the data has been cleaned has often been in very manual ways. In computer vision, someone may visualize images through a Jupyter notebook and maybe spot the problem, and maybe fix it. But I’m excited about tools that allow you to have a very large data set, tools that draw your attention quickly and efficiently to the subset of data where, say, the labels are noisy. Or to quickly bring your attention to the one class among 100 classes where it would benefit you to collect more data. Collecting more data often helps, but if you try to collect more data for everything, that can be a very expensive activity.

For example, I once figured out that a speech-recognition system was performing poorly when there was car noise in the background. Knowing that allowed me to collect more data with car noise in the background, rather than trying to collect more data for everything, which would have been expensive and slow.
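That kind of error analysis amounts to grouping evaluation results by a metadata slice and comparing error rates; the records and field names below are made up for illustration:

```python
# Group evaluation results by a metadata slice (here, background
# noise type) and compare error rates, so data collection can target
# the worst-performing slice. Records are hypothetical.
records = [
    {"noise": "car", "correct": False},
    {"noise": "car", "correct": False},
    {"noise": "car", "correct": True},
    {"noise": "quiet", "correct": True},
    {"noise": "quiet", "correct": True},
    {"noise": "quiet", "correct": False},
    {"noise": "babble", "correct": True},
    {"noise": "babble", "correct": True},
]

def error_rate_by_slice(records, key):
    totals, errors = {}, {}
    for r in records:
        v = r[key]
        totals[v] = totals.get(v, 0) + 1
        errors[v] = errors.get(v, 0) + (not r["correct"])  # bool adds as 0/1
    return {v: errors[v] / totals[v] for v in totals}

rates = error_rate_by_slice(records, "noise")
worst = max(rates, key=rates.get)
print(worst)  # → car
```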


What about using synthetic data, is that often a good solution?

Ng: I think synthetic data is an important tool in the tool chest of data-centric AI. At the NeurIPS workshop, Anima Anandkumar gave a great talk that touched on synthetic data. I think there are important uses of synthetic data that go beyond just being a preprocessing step for increasing the data set for a learning algorithm. I’d love to see more tools to let developers use synthetic data generation as part of the closed loop of iterative machine learning development.

Do you mean that synthetic data would allow you to try the model on more data sets?

Ng: Not really. Here’s an example. Let’s say you’re trying to detect defects in a smartphone casing. There are many different types of defects on smartphones. It could be a scratch, a dent, pit marks, discoloration of the material, other types of blemishes. If you train the model and then find through error analysis that it’s doing well overall but it’s performing poorly on pit marks, then synthetic data generation allows you to address the problem in a more targeted way. You could generate more data just for the pit-mark category.

“In the consumer software Internet, we could train a handful of machine-learning models to serve a billion users. In manufacturing, you might have 10,000 manufacturers building 10,000 custom AI models.”
—Andrew Ng

Synthetic data generation is a very powerful tool, but there are many simpler tools that I will often try first. Such as data augmentation, improving labeling consistency, or just asking a factory to collect more data.
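Targeted augmentation, one of the simpler tools mentioned above, can be sketched as expanding only the weak category; the toy 2-by-2 "images" and the flip-and-jitter ops below are illustrative stand-ins for a real augmentation pipeline:

```python
import random

# Expand only the category where error analysis showed weakness
# (here "pit_mark"), instead of augmenting everything. Images are
# toy 2x2 grids of pixel values; all names are illustrative.
random.seed(0)

def hflip(img):
    """Mirror each row horizontally."""
    return [row[::-1] for row in img]

def jitter(img, scale=0.05):
    """Add small random noise, clamped to the [0, 1] pixel range."""
    return [[max(0.0, min(1.0, v + random.uniform(-scale, scale)))
             for v in row] for row in img]

def augment_class(dataset, target_label, copies=3):
    extra = []
    for img, label in dataset:
        if label == target_label:
            for _ in range(copies):
                extra.append((jitter(hflip(img)), label))
    return dataset + extra

data = [([[0.1, 0.9], [0.4, 0.2]], "pit_mark"),
        ([[0.8, 0.3], [0.5, 0.7]], "scratch")]
augmented = augment_class(data, "pit_mark")
print(len(augmented))  # → 5 (2 originals + 3 augmented pit_mark examples)
```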


To make these issues more concrete, can you walk me through an example? When a company approaches Landing AI and says it has a problem with visual inspection, how do you onboard them and work toward deployment?

Ng: When a customer approaches us, we usually have a conversation about their inspection problem and look at a few images to verify that the problem is feasible with computer vision. Assuming it is, we ask them to upload the data to the LandingLens platform. We often advise them on the methodology of data-centric AI and help them label the data.

One of the foci of Landing AI is to empower manufacturing companies to do the machine learning work themselves. A lot of our work is making sure the software is fast and easy to use. Through the iterative process of machine learning development, we advise customers on things like how to train models on the platform, when and how to improve the labeling of data so the performance of the model improves. Our training and software supports them all the way through deploying the trained model to an edge device in the factory.

How do you deal with changing needs? If products change or lighting conditions change in the factory, can the model keep up?

Ng: It varies by manufacturer. There is data drift in many contexts. But there are some manufacturers that have been running the same manufacturing line for 20 years now with few changes, so they don’t expect changes in the next five years. Those stable environments make things easier. For other manufacturers, we provide tools to flag when there’s a significant data-drift issue. I find it really important to empower manufacturing customers to correct data, retrain, and update the model. Because if something changes and it’s 3 a.m. in the United States, I want them to be able to adapt their learning algorithm right away to maintain operations.

In the consumer software Internet, we could train a handful of machine-learning models to serve a billion users. In manufacturing, you might have 10,000 manufacturers building 10,000 custom AI models. The challenge is, how do you do that without Landing AI having to hire 10,000 machine learning specialists?

So you’re saying that to make it scale, you have to empower customers to do a lot of the training and other work.

Ng: Yes, exactly! This is an industry-wide problem in AI, not just in manufacturing. Look at health care. Every hospital has its own slightly different format for electronic health records. How can every hospital train its own custom AI model? Expecting every hospital’s IT personnel to invent new neural-network architectures is unrealistic. The only way out of this dilemma is to build tools that empower the customers to build their own models by giving them tools to engineer the data and express their domain knowledge. That’s what Landing AI is executing in computer vision, and the field of AI needs other teams to execute this in other domains.

Is there anything else you think it’s important for people to understand about the work you’re doing or the data-centric AI movement?

Ng: In the last decade, the biggest shift in AI was a shift to deep learning. I think it’s quite possible that in this decade the biggest shift will be to data-centric AI. With the maturity of today’s neural network architectures, I think for a lot of the practical applications the bottleneck will be whether we can efficiently get the data we need to develop systems that work well. The data-centric AI movement has tremendous energy and momentum across the whole community. I hope more researchers and developers will jump in and work on it.


This article appears in the April 2022 print issue as “Andrew Ng, AI Minimalist.”





The AI Boom Rests on Billions of Tonnes of Concrete



Along the country road that leads to ATL4, a giant data center going up east of Atlanta, dozens of parked cars and pickups lean tenuously on the narrow dirt shoulders. The many out-of-state plates are typical of the phalanx of tradespeople who muster for these massive construction jobs. With tech giants, utilities, and governments budgeting upwards of US $1 trillion for capital expansion to join the global battle for AI dominance, data centers are the bunkers, factories, and skunkworks—and concrete and electricity are the fuel and ammunition.

To the casual observer, the data industry can seem incorporeal, its products conjured out of weightless bits. But as I stand beside the busy construction site for DataBank’s ATL4, what impresses me most is the gargantuan amount of material—mostly concrete—that gives shape to the goliath that will house, secure, power, and cool the hardware of AI. Big data is big concrete. And that poses a big problem.

This article is part of our special report, “Reinventing Invention: Stories from Innovation’s Edge.”

Concrete is not just a major ingredient in data centers and the power plants being built to energize them. As the world’s most widely manufactured material, concrete—and especially the cement within it—is also a major contributor to climate change, accounting for around 6 percent of global greenhouse gas emissions. Data centers use so much concrete that the construction boom is wrecking tech giants’ commitments to eliminate their carbon emissions. Even though Google, Meta, and Microsoft have touted goals to be carbon neutral or negative by 2030, and Amazon by 2040, the industry is now moving in the wrong direction.

Last year, Microsoft’s carbon emissions jumped by over 30 percent, primarily due to the materials in its new data centers. Google’s greenhouse emissions are up by nearly 50 percent over the past five years. As data centers proliferate worldwide, Morgan Stanley projects that data centers will release about 2.5 billion tonnes of CO2 each year by 2030—or about 40 percent of what the United States currently emits from all sources.
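The scale of that projection is easy to sanity-check. Taking roughly 6.3 billion tonnes of CO2-equivalent as current total annual U.S. emissions (an outside figure assumed here, not from the article):

```python
projected_dc_emissions = 2.5e9  # tonnes CO2 per year by 2030 (Morgan Stanley projection)
us_total_emissions = 6.3e9      # tonnes CO2e per year, rough current U.S. total (assumption)

share = projected_dc_emissions / us_total_emissions
assert 0.35 < share < 0.45      # consistent with "about 40 percent"
```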

But even as innovations in AI and the big-data construction boom are boosting emissions for the tech industry’s hyperscalers, the reinvention of concrete could also play a big part in solving the problem. Over the last decade, there’s been a wave of innovation, some of it profit-driven, some of it from academic labs, aimed at fixing concrete’s carbon problem. Pilot plants are being fielded to capture CO2 from cement plants and sock it safely away. Other projects are cooking up climate-friendlier recipes for cements. And AI and other computational tools are illuminating ways to drastically cut carbon by using less cement in concrete and less concrete in data centers, power plants, and other structures.

Demand for green concrete is clearly growing. Amazon, Google, Meta, and Microsoft recently joined an initiative led by the Open Compute Project Foundation to accelerate testing and deployment of low-carbon concrete in data centers, for example. Supply is increasing, too—though it’s still minuscule compared to humanity’s enormous appetite for moldable rock. But if the green goals of big tech can jump-start innovation in low-carbon concrete and create a robust market for it as well, the boom in big data could eventually become a boon for the planet.

Hyperscaler Data Centers: So Much Concrete

At the construction site for ATL4, I’m met by Tony Qorri, the company’s big, friendly, straight-talking head of construction. He says that this giant building and four others DataBank has recently built or is planning in the Atlanta area will together add 133,000 square meters (1.44 million square feet) of floor space.

They all follow a universal template that Qorri developed to optimize the construction of the company’s ever-larger centers. At each site, trucks haul in more than a thousand prefabricated concrete pieces: wall panels, columns, and other structural elements. Workers quickly assemble the precision-measured parts. Hundreds of electricians swarm the building to wire it up in just a few days. Speed is crucial when construction delays can mean losing ground in the AI battle.

The ATL4 data center outside Atlanta is one of five being built by DataBank. Together they will add over 130,000 square meters of floor space. DataBank

That battle can be measured in new data centers and floor space. The United States is home to more than 5,000 data centers today, and the Department of Commerce expects that number to grow by around 450 a year through 2030. Worldwide, the number of data centers now exceeds 10,000, and analysts project another 26.5 million m2 of floor space over the next five years. Here in metro Atlanta, developers broke ground last year on projects that will triple the region’s data-center capacity. Microsoft, for instance, is planning a 186,000-m2 complex; big enough to house around 100,000 rack-mounted servers, it will consume 324 megawatts of electricity.

The velocity of the data-center boom means that no one is pausing to await greener cement. For now, the industry’s mantra is “Build, baby, build.”

“There’s no good substitute for concrete in these projects,” says Aaron Grubbs, a structural engineer at ATL4. The latest processors going on the racks are bigger, heavier, hotter, and far more power hungry than previous generations. As a result, “you add a lot of columns,” Grubbs says.

1,000 Companies Working on Green Concrete

Concrete may not seem an obvious star in the story of how electricity and electronics have permeated modern life. Other materials—copper and silicon, aluminum and lithium—get higher billing. But concrete provides the literal, indispensable foundation for the world’s electrical workings. It is the solid, stable, durable, fire-resistant stuff that makes power generation and distribution possible. It undergirds nearly all advanced manufacturing and telecommunications. What was true in the rapid build-out of the power industry a century ago remains true today for the data industry: Technological progress begets more growth—and more concrete. Although each generation of processor and memory squeezes more computing onto each chip, and advances in superconducting microcircuitry raise the tantalizing prospect of slashing the data center’s footprint, Qorri doesn’t think his buildings will shrink to the size of a shoebox anytime soon. “I’ve been through that kind of change before, and it seems the need for space just grows with it,” he says.

By weight, concrete is not a particularly carbon-intensive material. Creating a kilogram of steel, for instance, releases about 2.4 times as much CO2 as a kilogram of cement does. But the global construction industry consumes about 35 billion tonnes of concrete a year. That’s about 4 tonnes for every person on the planet and twice as much as all other building materials combined. It’s that massive scale—and the associated cost and sheer number of producers—that creates both a threat to the climate and inertia that resists change.
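The per-person figure follows directly from the production number, assuming a world population of roughly 8 billion:

```python
concrete_per_year = 35e9   # tonnes of concrete consumed globally each year
world_population = 8e9     # people (rough round figure, an assumption)

per_capita = concrete_per_year / world_population
assert 4.0 <= per_capita <= 4.5   # "about 4 tonnes for every person on the planet"
```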

At its Edmonton, Alberta, plant [above], Heidelberg Materials is adding systems to capture carbon dioxide produced by the manufacture of Portland cement. Heidelberg Materials North America

Yet change is afoot. When I visited the innovation center operated by the Swiss materials giant Holcim, in Lyon, France, research executives told me about the database they’ve assembled of nearly 1,000 companies working to decarbonize cement and concrete. None yet has enough traction to measurably reduce global concrete emissions. But the innovators hope that the boom in data centers—and in associated infrastructure such as new nuclear reactors and offshore wind farms, where each turbine foundation can use up to 7,500 cubic meters of concrete—may finally push green cement and concrete beyond labs, startups, and pilot plants.

Why cement production emits so much carbon

Though the terms “cement” and “concrete” are often conflated, they are not the same thing. A popular analogy in the industry is that cement is the egg in the concrete cake. Here’s the basic recipe: Blend cement with larger amounts of sand and other aggregates. Then add water, to trigger a chemical reaction with the cement. Wait a while for the cement to form a matrix that pulls all the components together. Let sit as it cures into a rock-solid mass.

Portland cement, the key binder in most of the world’s concrete, was serendipitously invented in England by William Aspdin, while he was tinkering with earlier mortars that his father, Joseph, had patented in 1824. More than a century of science has revealed the essential chemistry of how cement works in concrete, but new findings are still leading to important innovations, as well as insights into how concrete absorbs atmospheric carbon as it ages.

As in the Aspdins’ day, the process to make Portland cement still begins with limestone, a sedimentary mineral made from crystalline forms of calcium carbonate. Most of the limestone quarried for cement originated hundreds of millions of years ago, when ocean creatures mineralized calcium and carbonate in seawater to make shells, bones, corals, and other hard bits.

Cement producers often build their large plants next to limestone quarries that can supply decades’ worth of stone. The stone is crushed and then heated in stages as it is combined with lesser amounts of other minerals that typically include calcium, silicon, aluminum, and iron. What emerges from the mixing and cooking are small, hard nodules called clinker. A bit more processing, grinding, and mixing turns those pellets into powdered Portland cement, which accounts for about 90 percent of the CO2 emitted by the production of conventional concrete [see infographic, “Roads to Cleaner Concrete”].

Karen Scrivener, shown in her lab at EPFL, has developed concrete recipes that reduce emissions by 30 to 40 percent. Stefan Wermuth/Bloomberg/Getty Images

Decarbonizing Portland cement is often called heavy industry’s “hard problem” because of two processes fundamental to its manufacture. The first process is combustion: To coax limestone’s chemical transformation into clinker, large heaters and kilns must sustain temperatures around 1,500 °C. Currently that means burning coal, coke, fuel oil, or natural gas, often along with waste plastics and tires. The exhaust from those fires generates 35 to 50 percent of the cement industry’s emissions. Most of the remaining emissions result from gaseous CO2 liberated by the chemical transformation of the calcium carbonate (CaCO3) into calcium oxide (CaO), a process called calcination. That gas also usually heads straight into the atmosphere.
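The calcination emissions have a simple stoichiometric floor. Using standard molar masses, the reaction CaCO3 → CaO + CO2 releases about 44 percent of the limestone's mass as CO2, no matter what fuel heats the kiln:

```python
# Standard molar masses in g/mol
M_CaCO3 = 40.08 + 12.01 + 3 * 16.00   # calcium carbonate, 100.09
M_CO2 = 12.01 + 2 * 16.00             # carbon dioxide, 44.01
M_CaO = 40.08 + 16.00                 # calcium oxide (lime), 56.08

# Calcination: CaCO3 -> CaO + CO2. Mass balances exactly:
assert abs(M_CaCO3 - (M_CaO + M_CO2)) < 1e-9

# Fraction of the limestone's mass that must leave as CO2 gas:
co2_fraction = M_CO2 / M_CaCO3
assert 0.43 < co2_fraction < 0.45   # ~44 percent, independent of the fuel used
```

This is why calcination emissions can only be captured or avoided (by replacing the limestone), never reduced by cleaner combustion alone.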

Concrete production, in contrast, is mainly a business of mixing cement powder with other ingredients and then delivering the slurry speedily to its destination before it sets. Most concrete in the United States is prepared to order at batch plants—souped-up materials depots where the ingredients are combined, dosed out from hoppers into special mixer trucks, and then driven to job sites. Because concrete grows too stiff to work after about 90 minutes, concrete production is highly local. There are more ready-mix batch plants in the United States than there are Burger King restaurants.

Batch plants can offer thousands of potential mixes, customized to fit the demands of different jobs. Concrete in a hundred-story building differs from that in a swimming pool. With flexibility to vary the quality of sand and the size of the stone—and to add a wide variety of chemicals—batch plants have more tricks for lowering carbon emissions than any cement plant does.

Cement plants that capture carbon

China accounts for more than half of the concrete produced and used in the world, but companies there are hard to track. Outside of China, the top three multinational cement producers—Holcim, Heidelberg Materials in Germany, and Cemex in Mexico—have launched pilot programs to snare CO2 emissions before they escape and then bury the waste deep underground. To do that, they’re taking carbon capture and storage (CCS) technology already used in the oil and gas industry and bolting it onto their cement plants.

These pilot programs will need to scale up without eating profits—something that eluded the coal industry when it tried CCS decades ago. Tough questions also remain about where exactly to store billions of tonnes of CO2 safely, year after year.

The appeal of CCS for cement producers is that they can continue using existing plants while still making progress toward carbon neutrality, which trade associations have committed to reach by 2050. But with well over 3,000 plants around the world, adding CCS to all of them would take enormous investment. Currently less than 1 percent of the global supply is low-emission cement. Accenture, a consultancy, estimates that outfitting the whole industry for carbon capture could cost up to $900 billion.

“The economics of carbon capture is a monster,” says Rick Chalaturnyk, a professor of geotechnical engineering at the University of Alberta, in Edmonton, Canada, who studies carbon capture in the petroleum and power industries. He sees incentives for the early movers on CCS, however. “If Heidelberg, for example, wins the race to the lowest carbon, it will be the first [cement] company able to supply those customers that demand low-carbon products”—customers such as hyperscalers.

Though cement companies seem unlikely to invest their own billions in CCS, generous government subsidies have enticed several to begin pilot projects. Heidelberg has announced plans to start capturing CO2 from its Edmonton operations in late 2026, transforming it into what the company claims would be “the world’s first full-scale net-zero cement plant.” Exhaust gas will run through stations that purify the CO2 and compress it into a liquid, which will then be transported to chemical plants to turn it into products or to depleted oil and gas reservoirs for injection underground, where hopefully it will stay put for an epoch or two.

Chalaturnyk says that the scale of the Edmonton plant, which aims to capture a million tonnes of CO2 a year, is big enough to give CCS technology a reasonable test. Proving the economics is another matter. Half the $1 billion cost for the Edmonton project is being paid by the governments of Canada and Alberta.

ROADS TO CLEANER CONCRETE


As the big-data construction boom boosts the tech industry’s emissions, the reinvention of concrete could play a major role in solving the problem.

• CONCRETE TODAY Most of the greenhouse emissions from concrete come from the production of Portland cement, which requires high heat and releases carbon dioxide (CO2) directly into the air.

• CONCRETE TOMORROW At each stage of cement and concrete production, advances in ingredients, energy supplies, and uses of concrete promise to reduce waste and pollution.

The U.S. Department of Energy has similarly offered Heidelberg up to $500 million to help cover the cost of attaching CCS to its Mitchell, Ind., plant and burying up to 2 million tonnes of CO2 per year below the plant. And the European Union has gone even bigger, allocating nearly €1.5 billion ($1.6 billion) from its Innovation Fund to support carbon capture at cement plants in seven of its member nations.

These tests are encouraging, but they are all happening in rich countries, where demand for concrete peaked decades ago. Even in China, concrete production has started to flatten. All the growth in global demand through 2040 is expected to come from less-affluent countries, where populations are still growing and quickly urbanizing. According to projections by the Rhodium Group, cement production in those regions is likely to rise from around 30 percent of the world’s supply today to 50 percent by 2050 and 80 percent before the end of the century.

So will rich-world CCS technology translate to the rest of the world? I asked Juan Esteban Calle Restrepo, the CEO of Cementos Argos, the leading cement producer in Colombia, about that when I sat down with him recently at his office in Medellín. He was frank. “Carbon capture may work for the U.S. or Europe, but countries like ours cannot afford that,” he said.

Better cement through chemistry

As long as cement plants run limestone through fossil-fueled kilns, they will generate excessive amounts of carbon dioxide. But there may be ways to ditch the limestone—and the kilns. Labs and startups have been finding replacements for limestone, such as calcined kaolin clay and fly ash, that don’t release CO2 when heated. Kaolin clays are abundant around the world and have been used for centuries in Chinese porcelain and more recently in cosmetics and paper. Fly ash—a messy, toxic by-product of coal-fired power plants—is cheap and still widely available, even as coal power dwindles in many regions.

At the Swiss Federal Institute of Technology Lausanne (EPFL), Karen Scrivener and colleagues developed cements that blend calcined kaolin clay and ground limestone with a small portion of clinker. Calcining clay can be done at temperatures low enough that electricity from renewable sources can do the job. Various studies have found that the blend, known as LC3, can reduce overall emissions by 30 to 40 percent compared to those of Portland cement.

LC3 is also cheaper to make than Portland cement and performs as well for nearly all common uses. As a result, calcined clay plants have popped up across Africa, Europe, and Latin America. In Colombia, Cementos Argos is already producing more than 2 million tonnes of the stuff annually. The World Economic Forum’s Centre for Energy and Materials counts LC3 among the best hopes for the decarbonization of concrete. Wide adoption by the cement industry, the centre reckons, “can help prevent up to 500 million tonnes of CO2 emissions by 2030.”

In a win-win for the environment, fly ash can also be used as a building block for low- and even zero-emission concrete, and the high heat of processing neutralizes many of the toxins it contains. Ancient Romans used volcanic ash to make slow-setting but durable concrete: The Pantheon, built nearly two millennia ago with ash-based cement, is still in great shape.

Coal fly ash is a cost-effective ingredient that has reactive properties similar to those of Roman cement and Portland cement. Many concrete plants already add fresh fly ash to their concrete mixes, replacing 15 to 35 percent of the cement. The ash improves the workability of the concrete, and though the resulting concrete is not as strong for the first few months, it grows stronger than regular concrete as it ages, like the Pantheon.

University labs have tested concretes made entirely with fly ash and found that some actually outperform the standard variety. More than 15 years ago, researchers at Montana State University used concrete made with 100 percent fly ash in the floors and walls of a credit union and a transportation research center. But performance depends greatly on the chemical makeup of the ash, which varies from one coal plant to the next, and on following a tricky recipe. The decommissioning of coal-fired plants has also been making fresh fly ash scarcer and more expensive.

At Sublime Systems’ pilot plant in Massachusetts, the company is using electrochemistry instead of heat to produce lime silicate cements that can replace Portland cement. Tony Luong

That has spurred new methods to treat and use fly ash that’s been buried in landfills or dumped into ponds. Such industrial burial grounds hold enough fly ash to make concrete for decades, even after every coal plant shuts down. Utah-based Eco Material Technologies is now producing cements that include both fresh and recovered fly ash as ingredients. The company claims it can replace up to 60 percent of the Portland cement in concrete—and that a new variety, suitable for 3D printing, can substitute entirely for Portland cement.

Hive 3D Builders, a Houston-based startup, has been feeding that low-emissions concrete into robots that are printing houses in several Texas developments. “We are 100 percent Portland cement–free,” says Timothy Lankau, Hive 3D’s CEO. “We want our homes to last 1,000 years.”

Sublime Systems, a startup spun out of MIT by battery scientists, uses electrochemistry rather than heat to make low-carbon cement from rocks that don’t contain carbon. Similar to a battery, Sublime’s process applies a voltage between an anode and a cathode to create a pH gradient that isolates silicates and reactive calcium, in the form of lime (CaO). The company mixes those ingredients together to make a cement with no fugitive carbon, no kilns or furnaces, and binding power comparable to that of Portland cement. With the help of $87 million from the U.S. Department of Energy, Sublime is building a plant in Holyoke, Mass., that will be powered almost entirely by hydroelectricity. Recently the company was tapped to provide concrete for a major offshore wind farm planned off the coast of Martha’s Vineyard.

Software takes on the hard problem of concrete

It is unlikely that any one innovation will allow the cement industry to hit its target of carbon neutrality before 2050. New technologies take time to mature, scale up, and become cost-competitive. In the meantime, says Philippe Block, a structural engineer at ETH Zurich, smart engineering can reduce carbon emissions through the leaner use of materials.

His research group has developed digital design tools that make clever use of geometry to maximize the strength of concrete structures while minimizing their mass. The team’s designs start with the soaring architectural elements of ancient temples, cathedrals, and mosques—in particular, vaults and arches—which they miniaturize and flatten and then 3D print or mold inside concrete floors and ceilings. The lightweight slabs, suitable for the upper stories of apartment and office buildings, use much less concrete and steel reinforcement and have a CO2 footprint that’s reduced by 80 percent.

There’s hidden magic in such lean design. In multistory buildings, much of the mass of concrete is needed just to hold the weight of the material above it. The carbon savings of Block’s lighter slabs thus compound, because the size, cost, and emissions of a building’s conventional-concrete elements are slashed.

Vaulted, a Swiss startup, uses digital design tools to minimize the concrete in floors and ceilings, cutting their CO2 footprint by 80 percent. Vaulted

In Dübendorf, Switzerland, a wildly shaped experimental building has floors, roofs, and ceilings created by Block’s structural system. Vaulted, a startup spun out of ETH, is engineering and fabricating the lighter floors of a 10-story office building under construction in Zug, Switzerland.

That country has also been a leader in smart ways to recycle and reuse concrete, rather than simply landfilling demolition rubble. This is easier said than done—concrete is tough stuff, riddled with rebar. But there’s an economic incentive: Raw materials such as sand and limestone are becoming scarcer and more costly. Some jurisdictions in Europe now require that new buildings be made from recycled and reused materials. The new addition of the Kunsthaus Zürich museum, a showcase of exquisite Modernist architecture, uses recycled material for all but 2 percent of its concrete.

As new policies goose demand for recycled materials and threaten to restrict future use of Portland cement across Europe, Holcim has begun building recycling plants that can reclaim cement clinker from old concrete. It recently turned the demolition rubble from some 1960s apartment buildings outside Paris into part of a 220-unit housing complex—touted as the first building made from 100 percent recycled concrete. The company says it plans to build concrete recycling centers in every major metro area in Europe and, by 2030, to include 30 percent recycled material in all of its cement.

Further innovations in low-carbon concrete are certain to come, particularly as the powers of machine learning are applied to the problem. Over the past decade, the number of research papers reporting on computational tools to explore the vast space of possible concrete mixes has grown exponentially. Much as AI is being used to accelerate drug discovery, the tools learn from huge databases of proven cement mixes and then apply their inferences to evaluate untested mixes.
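The search those tools perform can be sketched in a few lines: learn a model that predicts strength and emissions from a mix's ingredients, then hunt the space of untested mixes for the lowest-carbon one that still meets the job's requirements. The surrogate model, grid, and strength floor below are toy assumptions, not Ozinga's or Meta's actual models:

```python
import itertools

# Toy surrogate model standing in for one trained on a database of proven mixes;
# the coefficients are invented to show plausible trends, not fitted to real data.
def predict(cement_frac, flyash_frac, water_binder):
    strength = 120 * (cement_frac + 0.7 * flyash_frac) * (1.6 - water_binder)  # MPa
    co2 = 900 * cement_frac + 30 * flyash_frac   # kg CO2 per cubic meter of concrete
    return strength, co2

# Evaluate untested mixes: minimize predicted CO2 subject to a strength floor.
MIN_STRENGTH = 14.0  # MPa, illustrative job requirement
grid = itertools.product(
    [0.09, 0.11, 0.13, 0.15],  # cement mass fraction
    [0.00, 0.03, 0.06],        # fly ash mass fraction
    [0.40, 0.45, 0.50],        # water/binder ratio
)
feasible = [mix for mix in grid if predict(*mix)[0] >= MIN_STRENGTH]
best = min(feasible, key=lambda mix: predict(*mix)[1])
# The winner trades cement for fly ash: best == (0.09, 0.03, 0.40)
```

Real tools replace the toy model with one trained on hundreds or thousands of measured mixes and search a far larger design space, but the optimization loop has this same shape.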

Researchers from the University of Illinois and Chicago-based Ozinga, one of the largest private concrete producers in the United States, recently worked with Meta to feed 1,030 known concrete mixes into an AI. The project yielded a novel mix that will be used for sections of a data-center complex in DeKalb, Ill. The AI-derived concrete has a carbon footprint 40 percent lower than the conventional concrete used on the rest of the site. Ryan Cialdella, Ozinga’s vice president of innovation, smiles as he notes the virtuous circle: AI systems that live in data centers can now help cut emissions from the concrete that houses them.

A sustainable foundation for the information age

Cheap, durable, and abundant yet unsustainable, concrete made with Portland cement has been one of modern technology’s Faustian bargains. The built world is on track to double in floor space by 2060, adding 230,000 km2, or more than half the area of California. Much of that will house the 2 billion more people we are likely to add to our numbers. As global transportation, telecom, energy, and computing networks grow, their new appendages will rest upon concrete. But if concrete doesn’t change, we will perversely be forced to produce even more concrete to protect ourselves from the coming climate chaos, with its rising seas, fires, and extreme weather.

The AI-driven boom in data centers is a strange bargain of its own. In the future, AI may help us live even more prosperously, or it may undermine our freedoms, civilities, employment opportunities, and environment. But solutions to the bad climate bargain that AI’s data centers foist on the planet are at hand, if there’s a will to deploy them. Hyperscalers and governments are among the few organizations with the clout to rapidly change what kinds of cement and concrete the world uses, and how those are made. With a pivot to sustainability, concrete’s unique scale makes it one of the few materials that could do most to protect the world’s natural systems. We can’t live without concrete—but with some ambitious reinvention, we can thrive with it.

This article was updated on 04 November 2024.





Students Tackle Environmental Issues in Colombia and Türkiye



EPICS in IEEE, a service learning program for university students supported by IEEE Educational Activities, offers students opportunities to engage with engineering professionals and mentors, local organizations, and technological innovation to address community-based issues.

The following two environmentally focused projects demonstrate the value of teamwork and direct involvement with project stakeholders. One uses smart biodigesters to better manage waste in Colombia’s rural areas. The other is focused on helping Turkish olive farmers protect their trees from climate change effects by providing them with a warning system that can identify growing problems.

No time to waste in rural Colombia

Proper waste management is critical to a community’s living conditions. In rural La Vega, Colombia, the lack of an effective system has led to contaminated soil and water, an especially concerning issue because the town’s economy relies heavily on agriculture.

The Smart Biodigesters for a Better Environment in Rural Areas project brought students together to devise a solution.

Vivian Estefanía Beltrán, a Ph.D. student at the Universidad del Rosario in Bogotá, addressed the problem by building a low-cost anaerobic digester that uses an instrumentation system to break down microorganisms into biodegradable material. It reduces the amount of solid waste, and the digesters can produce biogas, which can be used to generate electricity.

“Anaerobic digestion is a natural biological process that converts organic matter into two valuable products: biogas and nutrient-rich soil amendments in the form of digestate,” Beltrán says. “As a by-product of our digester’s operation, digestate is organic matter that can’t be transferred into biogas but can be used as a soil amendment for our farmers’ crops, such as coffee.

“While it may sound easy, the process is influenced by a lot of variables. The support we’ve received from EPICS in IEEE is important because it enables us to measure these variables, such as pH levels, temperature of the reactor, and biogas composition [methane and hydrogen sulfide]. The system allows us to make informed decisions that enhance the safety, quality, and efficiency of the process for the benefit of the community.”
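The kind of informed decision Beltrán describes can be sketched as a simple range check over the instrumented variables. The alarm limits below are illustrative assumptions drawn from typical anaerobic-digestion practice, not the project's calibrated values:

```python
# Illustrative alarm ranges (assumptions, not the project's calibrated values):
# anaerobic digesters typically run near-neutral pH at mesophilic temperatures.
LIMITS = {
    "ph":      (6.5, 7.8),    # acidification below ~6.5 can stall methanogens
    "temp_c":  (25.0, 40.0),  # mesophilic operating band
    "ch4_pct": (45.0, 75.0),  # healthy biogas is roughly half methane or more
    "h2s_ppm": (0.0, 500.0),  # hydrogen sulfide: corrosion and safety hazard
}

def check_reading(reading):
    """Return a list of (variable, value, message) alerts for out-of-range values."""
    alerts = []
    for var, (lo, hi) in LIMITS.items():
        value = reading[var]
        if not lo <= value <= hi:
            alerts.append((var, value, f"{var}={value} outside [{lo}, {hi}]"))
    return alerts

healthy = {"ph": 7.1, "temp_c": 33.0, "ch4_pct": 58.0, "h2s_ppm": 120.0}
souring = {"ph": 6.1, "temp_c": 33.0, "ch4_pct": 41.0, "h2s_ppm": 120.0}

assert check_reading(healthy) == []
assert [a[0] for a in check_reading(souring)] == ["ph", "ch4_pct"]
```

A souring reactor shows up as a low pH and falling methane fraction, exactly the sort of early signal that lets operators intervene before the process collapses.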

The project was a collaborative effort among Universidad del Rosario students, a team of engineering students from Escuela Tecnológica Instituto Técnico Central, Professor Carlos Felipe Vergara, and members of Junta de Acción Comunal (Vereda La Granja), which aims to help residents improve their community.

“It’s been a great experience to see how individuals pursuing different fields of study—from engineering to electronics and computer science—can all work and learn together on a project that will have a direct positive impact on a community.” —Vivian Estefanía Beltrán

Beltrán worked closely with eight undergraduate students and three instructors—Maria Fernanda Gómez, Andrés Pérez Gordillo (the instrumentation group leader), and Carlos Felipe Vergara-Ramirez—as well as IEEE Graduate Student Member Nicolás Castiblanco (the instrumentation group coordinator).

The team constructed and installed their anaerobic digester system in an experimental station in La Vega, a town located roughly 53 kilometers northwest of Bogotá.

“This digester is an important innovation for the residents of La Vega, as it will hopefully offer a productive way to utilize the residual biomass they produce to improve quality of life and boost the economy,” Beltrán says. Soon, she adds, the system will be expanded to incorporate high-tech sensors that automatically monitor biogas production and the digestion process.

“For our students and team members, it’s been a great experience to see how individuals pursuing different fields of study—from engineering to electronics and computer science—can all work and learn together on a project that will have a direct positive impact on a community. It enables all of us to apply our classroom skills to reality,” she says. “The funding we’ve received from EPICS in IEEE has been crucial to designing, proving, and installing the system.”

The project also aims to support the development of a circular economy, which reuses materials to enhance the community’s sustainability and self-sufficiency.

Protecting olive groves in Türkiye

Türkiye is one of the world’s leading producers of olives, but the industry has been challenged in recent years by unprecedented floods, droughts, and other destructive forces of nature resulting from climate change. To help farmers in the western part of the country monitor the health of their olive trees, a team of students from Istanbul Technical University developed an early-warning system to identify irregularities including abnormal growth.

“Almost no olives were produced last year using traditional methods, due to climate conditions and unusual weather patterns,” says Tayfun Akgül, project leader of the Smart Monitoring of Fruit Trees in Western Türkiye initiative.

“Our system will give farmers feedback from each tree so that actions can be taken in advance to improve the yield,” says Akgül, an IEEE senior member and a professor in the university’s electronics and communication engineering department.

“We’re developing deep-learning techniques to detect changes in olive trees and their fruit so that farmers and landowners can take all necessary measures to avoid a low or damaged harvest,” says project coordinator Melike Girgin, a Ph.D. student at the university and an IEEE graduate student member.

Using drones outfitted with 360-degree optical and thermal cameras, the team collects optical, thermal, and hyperspectral imagery of the groves from the air. The information is fed into a cloud-based, open-source database system.
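The team's deep-learning pipeline isn't detailed in the article, but a common first step with multispectral drone imagery is a vegetation index such as NDVI, which compares near-infrared and red reflectance: healthy foliage reflects strongly in near-infrared, so a low index can flag a stressed tree. A minimal sketch; the per-tree reading format and the 0.4 stress threshold are illustrative assumptions, not the project's actual method:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one reading:
    (NIR - Red) / (NIR + Red), in the range [-1, 1]."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def flag_stressed_trees(readings, threshold=0.4):
    """Given (tree_id, nir, red) reflectance readings, return the ids
    of trees whose NDVI falls below the (assumed) stress threshold."""
    return [tree for tree, nir, red in readings if ndvi(nir, red) < threshold]

readings = [
    ("tree-01", 0.80, 0.10),  # healthy canopy: NDVI ≈ 0.78
    ("tree-02", 0.30, 0.25),  # stressed canopy: NDVI ≈ 0.09
]
print(flag_stressed_trees(readings))  # → ['tree-02']
```

A deep-learning change detector would go further, comparing imagery over time, but an index like this illustrates the kind of per-tree signal such a system can extract.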

Akgül leads the project and teaches the team skills including signal and image processing and data collection. He says regular communication with community-based stakeholders has been critical to the project’s success.

“There are several farmers in the village who have helped us direct our drone activities to the right locations,” he says. “Their involvement in the project has been instrumental in helping us refine our process for greater effectiveness.

“For students, classroom instruction is straightforward, then they take an exam at the end. But through our EPICS project, students are continuously interacting with farmers in a hands-on, practical way and can see the results of their efforts in real time.”

Looking ahead, the team is excited about expanding the project to encompass other fruits besides olives. The team also intends to apply for a travel grant from IEEE in hopes of presenting its work at a conference.

“We’re so grateful to EPICS in IEEE for this opportunity,” Girgin says. “Our project and some of the technology we required wouldn’t have been possible without the funding we received.”

A purpose-driven partnership

The IEEE Standards Association sponsored both of the proactive environmental projects.

“Technical projects play a crucial role in advancing innovation and ensuring interoperability across various industries,” says Munir Mohammed, IEEE SA senior manager of product development and market engagement. “These projects not only align with our technical standards but also drive technological progress, enhance global collaboration, and ultimately improve the quality of life for communities worldwide.”

For more information on the program or to participate in service-learning projects, visit EPICS in IEEE.

On 7 November, this article was updated from an earlier version.





This Mobile 3D Printer Can Print Directly on Your Floor



Waiting for each part of a 3D-printed project to finish, taking it out of the printer, and then installing it on location can be tedious for multi-part projects. What if there was a way for your printer to print its creation exactly where you needed it? That’s the promise of MobiPrint, a new 3D printing robot that can move around a room, printing designs directly onto the floor.

MobiPrint, designed by Daniel Campos Zamora at the University of Washington, consists of a modified off-the-shelf 3D printer atop a home vacuum robot. First it autonomously maps its space—be it a room, a hallway, or an entire floor of a house. Users can then choose from a prebuilt library or upload their own design to be printed anywhere in the mapped area. The robot then traverses the room and prints the design.

It’s “a new system that combines robotics and 3D printing that could actually go and print in the real world,” Campos Zamora says. He presented MobiPrint on 15 October at the ACM Symposium on User Interface Software and Technology.

Campos Zamora and his team started with a Roborock S5 vacuum robot and installed firmware that allowed it to communicate with the open source program Valetudo. Valetudo disconnects personal robots from their manufacturer’s cloud, connecting them to a local server instead. Data collected by the robot, such as environmental mapping, movement tracking, and path planning, can all be observed locally, enabling users to see the robot’s LIDAR-created map.

Campos Zamora built a layer of software that connects the robot’s perception of its environment to the 3D printer’s print commands. The printer, a modified Prusa Mini+, can print on carpet, hardwood, and vinyl, with maximum printing dimensions of 180 by 180 by 65 millimeters. The robot has printed pet food bowls, signage, and accessibility markers as sample objects.
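The bridging code itself isn't published in the article, but its core job, turning a target point on the robot's LIDAR map into a park pose for the base plus a start offset on the printer bed, can be sketched roughly as follows. The map resolution, the coordinate convention, and the function names are illustrative assumptions; only the 180-millimeter bed size comes from the article:

```python
BED_MM = 180        # MobiPrint's usable print area: 180 x 180 mm
MAP_MM_PER_PX = 50  # assumed LIDAR occupancy-map resolution (50 mm per pixel)

def plan_print(target_px, map_origin_px=(0, 0)):
    """Plan a "park and print" job for a target point on the room map.

    Returns (robot_x_mm, robot_y_mm, bed_offset_x_mm, bed_offset_y_mm):
    the base parks so the center of the print bed sits over the target,
    and the design is printed centered on the bed.
    """
    # Convert map pixels to millimeters relative to the map origin.
    tx = (target_px[0] - map_origin_px[0]) * MAP_MM_PER_PX
    ty = (target_px[1] - map_origin_px[1]) * MAP_MM_PER_PX
    # Shift the base by half the bed so the bed center covers the target.
    robot_x, robot_y = tx - BED_MM / 2, ty - BED_MM / 2
    # Start the print at the center of the bed.
    return robot_x, robot_y, BED_MM / 2, BED_MM / 2

print(plan_print((10, 4)))  # → (410.0, 110.0, 90.0, 90.0)
```

In the real system the planner would also account for the robot's heading and obstacle clearance, but this captures the basic map-to-bed coordinate handoff.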


Currently, MobiPrint can only “park and print.” The robot base cannot move during printing to make large objects, like a mobility ramp. Printing designs larger than the robot is one of Campos Zamora’s goals in the future. To learn more about the team’s vision for MobiPrint, Campos Zamora answered a few questions from IEEE Spectrum.

What was the inspiration for creating your mobile 3D printer?

Daniel Campos Zamora: My lab is focused on building systems with an eye towards accessibility. One of the things that really inspired this project was looking at the tactile surface indicators that help blind and low vision users find their way around a space. And so we were like, what if we made something that could automatically go and deploy these things? Especially in indoor environments, which are generally a little trickier and change more frequently over time.

We had to step back and build this entirely different thing, using the environment as a design element. We asked: how do you integrate the real world environment into the design process, and then what kind of things can you print out in the world? That’s how this printer was born.

What were some surprising moments in your design process?

Campos Zamora: When I was testing the robot on different surfaces, I was not expecting the 3D printed designs to stick extremely well to the carpet. It stuck way too well. Like, you know, just completely bonded down there.

I think there’s also just a lot of joy in seeing this printer move. When I was doing a demonstration of it at this conference last week, it almost seemed like the robot had a personality. A vacuum robot can seem to have a personality, but this printer can actually make objects in my environment, so I feel a different relationship to the machine.

Where do you hope to take MobiPrint in the future?

Campos Zamora: There’s several directions I think we could go. Instead of controlling the robot remotely, we could have it follow someone around and print accessibility markers along a path they walk. Or we could integrate an AI system that recommends objects be printed in different locations. I also want to explore having the robot remove and recycle the objects it prints.





Microsoft reports big profits amid massive AI investments

Xbox hardware sales dropped 29 percent, but that barely made a dent.







Photos: Hail blankets Saudi Arabian desert creating winter-like landscape





Is Wildfire Smoke Causing Birds to Tend to Empty Nests?

New studies suggest smoke from western megafires may be damaging bird health and leading to strange behavior





Uncovering the Secrets Behind Hummingbirds' Extreme Lifestyle

Here's how the aerial acrobats are able to survive on a nearly all-sugar diet, fly higher than many helicopters can and migrate over the open ocean





Hurricane Helene Battered the 'Salamander Capital of the World' With Floods and Landslides. Will the Beloved Amphibians Survive the Aftermath?

The storm decimated a region rich with dozens of species already struggling with habitat loss and disease





This Parasitic Fungus Turns Flies Into Zombie Insects

The pathogen takes over the brains of its hosts and controls them for its own sinister ends