
Remote Sub Sustains Science Kilometers Underwater



The water column is hazy as an unusual remotely operated vehicle glides over the seafloor in search of a delicate tilt meter deployed three years ago off the west side of Vancouver Island. The sensor measures the shaking and shifting of tectonic plates that will eventually unleash another of the region’s magnitude-9.0 earthquakes (the last struck in 1700). Dwindling battery charge in the instruments’ data loggers threatens the continuity of the data.

The 4-metric-ton, C$8-million (US $5.8-million) remotely operated vehicle (ROV) is 50 meters from its target when one of the seismic science platforms appears on its sonar imaging system, the platform’s hard edges crystallizing from the grainy background like a surgical implant jumping out of an ultrasound image. After easing the ROV to the platform, operators 2,575 meters up at the Pacific’s surface instruct its electromechanical arms and pincer hands to deftly unplug a data logger, then plug in a replacement with a fresh battery.

This mission, executed in early October, marked an exciting moment for Josh Tetarenko, director of ROV operations at North Vancouver-based Canpac Marine Services. Tetarenko is the lead designer behind the new science submersible and recently dubbed it Jenny in homage to Forrest Gump, because the fictional character named all of his boats Jenny. Swapping out the data loggers west of Vancouver Island’s Clayoquot Sound was part of a weeklong shakedown to test Jenny’s unique combination of dexterity, visualization chops, power, and pressure resistance.


By all accounts Jenny sailed through. Tetarenko says the worst they saw was a leaky O-ring and the need to add some spring to a few bumpers. “Usually you see more things come up the first time you dive a vehicle to those depths,” says Tetarenko.

Jenny’s successful maiden cruise is just as important for Victoria, B.C.–based Ocean Networks Canada (ONC), which operates the NEPTUNE undersea observatory. The North-East Pacific Time-series Undersea Networked Experiments array boasts thousands of sensors and instruments, including deep-sea video cameras, seismometers, and robotic rovers sprawled across this corner of the Pacific. Most of these are connected to shore via an 812-kilometer power and communications cable. Jenny was custom-designed to perform the annual maintenance and equipment swaps that have kept live data streaming from that cabled observatory nearly continuously for the past 15 years, despite trawler strikes, a fault on its backbone cable, and insults from corrosion, crushing pressures, and fouling.

NEPTUNE remains one of the world’s largest installations for oceanographic science despite a proliferation of such cabled observatories since it went live in 2009. ONC’s open data portal has over 37,000 registered users tapping over 1.5 petabytes of ocean data—information that’s growing in importance with the intensification of climate change and the collapse of marine ecosystems.

Over the course of Jenny’s maiden cruise, her operators swapped devices in and out at half a dozen ONC sites, including at several of NEPTUNE’s five nodes and at one of NEPTUNE’s smaller sister observatories closer to Vancouver.

Inside Jenny

ROV Jenny aboard the Valour, Canpac’s 50-meter offshore workhorse, ahead of October’s NEPTUNE observatory maintenance cruise. Ocean Networks Canada

What makes Jenny so special?

  • Jenny is only the third science ROV designed for subsea work to a depth of 6,000 meters.
  • Motion sensors actively adjust her 7,000-meter-long umbilical cable to counteract topside wave action that would otherwise yank the ROV around at depth and, in rough seas, could damage or snap the cable.
  • Dual high-dexterity manipulator arms are controlled by topside operators via a pair of replica mini-manipulators that mirror the movements.
  • Each arm is capable of picking up objects weighing about 275 kilograms, and the ROV itself can transport equipment weighing up to 3,000 kilograms.
  • 11 high-resolution cameras deliver 4K video, supported by 300,000 lumens of lighting that can be tuned to deliver the soft red light needed to observe bioluminescence.
  • Dual multibeam sonar systems maximize visibility in turbid water.

Meghan Paulson, ONC’s executive director for observatory operations, says the sonar imaging system will be particularly invaluable during dives to shallower sites where sediments stirred up by waves and weather can cut visibility from meters to centimeters. “It really reduces the risk of running into things accidentally,” says Paulson.

To experience the visibility conditions for yourself, check out recordings of the live video broadcast from the NEPTUNE maintenance cruise. Tetarenko says that next year they hope to broadcast not only the main camera feed but also one of the sonar images.

3D video could be next, according to Canpac ROV pilot and Jenny codesigner, James Barnett. He says they would need to boost the computing power installed topside, to process that “firehose of data,” but insists that real-time 3D is “definitely not impossible.” Tetarenko says the science ROV community is collaborating on software to help make that workable: “3D imaging is kind of the very latest thing that’s being tested on lots of ROV systems right now, but nobody’s really there yet.”

More Than Science

Expansion of the cabled observatory concept is the more certain technological legacy for ONC and NEPTUNE. In fact, the technology has evolved beyond just oceanography applications.

ONC tapped Alcatel Submarine Networks (ASN) to design and build the NEPTUNE backbone, and the French firm delivered a system that has reliably carried multigigabit Ethernet plus 10 kilovolts of direct-current electricity to the deep sea. Today ASN deploys a second-generation subsea power and communications networking solution, developed with the Norwegian international energy company Equinor.

ASN’s “Direct Current/Fiber Optic” or DC/FO system provides the 100-km backbone for the ARCA subsea neutrino observatory near Sicily, in addition to providing control systems for a growing number of offshore oil and gas installations. The latter include projects led by Equinor and BP where DC/FO networks drive the subsea injection of captured carbon dioxide and monitor its storage below the seabed. Future oil and gas projects will increasingly rely on the cables’ power supply to replace the hydraulic lines that have traditionally been used to operate machinery on the seafloor, according to Ronan Michel, ASN’s product line manager for oil and gas solutions.

Michel says DC/FO incorporates important lessons learned from the Neptune installation. And the latter’s existence was a crucial prerequisite. “The DC/FO solution would probably not exist if Neptune Canada would not have been developed,” says Michel. “It probably gave confidence to Equinor that ASN was capable to develop subsea power and coms infrastructure.”





The Election Depleted Us. Storytelling Can Revive Us

As we share our truths and witness each other's, we build unity and community.





Nova Scotia biologist adapting COVID-19 technology to detect oyster disease

A biologist at Cape Breton University is hoping a piece of technology used to keep people safe in the pandemic can help protect Nova Scotia's oysters against the effects of warming waters.





Trump selects Fox host Pete Hegseth for defense secretary

President-elect Donald Trump said he will nominate Fox News host Pete Hegseth to be secretary of defense.





Heroes among us: Celebrating American bravery on Veterans Day

Tech expert Kurt “CyberGuy" Knutsson helps you honor our heroes with these powerful podcasts, audiobooks and documentaries this Veterans Day.






Powerboat racing returns to south coast as speedsters compete for famous Beaverbrook trophy










British Athletics want 'open' category for transgender women to compete with men






Are federal IT systems supporting the targeted service outcomes? Deloitte examines the future role of the government

In an interview with IT World Canada, consulting giant Deloitte highlighted the importance of an ecosystem-based approach to tackle issues around digital equity in Canada and service delivery challenges in the public sector. “Our strong view is that the people of Canada benefit when there’s effective collaboration between public and private organizations, including on critical […]






How to get the best view of the Perseids meteor shower

The annual Perseids meteor shower is expected to continue through September. Astronomers say it's one of the brightest and most visible meteor showers of the year.





Perseid meteor shower peaks Sunday night, potentially giving stargazers big show

The annual Perseid meteor shower is set to peak on Sunday night into early Monday morning, giving stargazers the chance to see hundreds of meteors.





Orionid meteor shower to light up night sky through most of November

The Orionids meteor shower peaks on Monday, but will continue to light up the sky through Nov. 22, as debris from Halley's Comet enters Earth's atmosphere.





The Eternal Cylinder Review

An unusual and fascinating survival game with one of the most memorable enemies in years.





Does eating meat really raise your risk of type 2 diabetes?

Red and processed meat, and even poultry, seem to raise the risk of developing type 2 diabetes, according to a study of nearly 2 million adults, but not everyone is convinced.





The complete guide to cooking oils and how they affect your health

From seed oils to olive oil, we now have an overwhelming choice of what to cook with. Here’s how they all stack up, according to the scientific evidence





Trump nominates Pete Hegseth to serve as defense secretary

Former Fox News host Pete Hegseth has been selected by President-elect Trump to serve as his secretary of defense. Hegseth served in the U.S. Army.





The AI Boom Rests on Billions of Tonnes of Concrete



Along the country road that leads to ATL4, a giant data center going up east of Atlanta, dozens of parked cars and pickups lean tenuously on the narrow dirt shoulders. The many out-of-state plates are typical of the phalanx of tradespeople who muster for these massive construction jobs. With tech giants, utilities, and governments budgeting upwards of US $1 trillion for capital expansion to join the global battle for AI dominance, data centers are the bunkers, factories, and skunkworks—and concrete and electricity are the fuel and ammunition.

To the casual observer, the data industry can seem incorporeal, its products conjured out of weightless bits. But as I stand beside the busy construction site for DataBank’s ATL4, what impresses me most is the gargantuan amount of material—mostly concrete—that gives shape to the goliath that will house, secure, power, and cool the hardware of AI. Big data is big concrete. And that poses a big problem.

This article is part of our special report, “Reinventing Invention: Stories from Innovation’s Edge.”

Concrete is not just a major ingredient in data centers and the power plants being built to energize them. As the world’s most widely manufactured material, concrete—and especially the cement within it—is also a major contributor to climate change, accounting for around 6 percent of global greenhouse gas emissions. Data centers use so much concrete that the construction boom is wrecking tech giants’ commitments to eliminate their carbon emissions. Even though Google, Meta, and Microsoft have touted goals to be carbon neutral or negative by 2030, and Amazon by 2040, the industry is now moving in the wrong direction.

Last year, Microsoft’s carbon emissions jumped by over 30 percent, primarily due to the materials in its new data centers. Google’s greenhouse emissions are up by nearly 50 percent over the past five years. As data centers proliferate worldwide, Morgan Stanley projects that data centers will release about 2.5 billion tonnes of CO2 each year by 2030—or about 40 percent of what the United States currently emits from all sources.
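That comparison is easy to sanity-check. A minimal back-of-envelope sketch, assuming total U.S. greenhouse gas emissions of roughly 6.3 billion tonnes of CO2-equivalent per year (a figure not stated in the article):

```python
# Back-of-envelope check of the Morgan Stanley comparison.
# Assumption: annual U.S. greenhouse gas emissions of ~6.3 Gt CO2e
# (approximate EPA inventory figure for the early 2020s).
data_center_co2_gt = 2.5   # projected annual data-center CO2 by 2030, in Gt
us_emissions_gt = 6.3      # assumed annual U.S. emissions, in Gt CO2e

share = data_center_co2_gt / us_emissions_gt
print(f"Data centers would emit {share:.0%} of current U.S. emissions")
# -> about 40 percent, matching the projection quoted above
```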

But even as innovations in AI and the big-data construction boom are boosting emissions for the tech industry’s hyperscalers, the reinvention of concrete could also play a big part in solving the problem. Over the last decade, there’s been a wave of innovation, some of it profit-driven, some of it from academic labs, aimed at fixing concrete’s carbon problem. Pilot plants are being fielded to capture CO2 from cement plants and sock it safely away. Other projects are cooking up climate-friendlier recipes for cements. And AI and other computational tools are illuminating ways to drastically cut carbon by using less cement in concrete and less concrete in data centers, power plants, and other structures.

Demand for green concrete is clearly growing. Amazon, Google, Meta, and Microsoft recently joined an initiative led by the Open Compute Project Foundation to accelerate testing and deployment of low-carbon concrete in data centers, for example. Supply is increasing, too—though it’s still minuscule compared to humanity’s enormous appetite for moldable rock. But if the green goals of big tech can jump-start innovation in low-carbon concrete and create a robust market for it as well, the boom in big data could eventually become a boon for the planet.

Hyperscaler Data Centers: So Much Concrete

At the construction site for ATL4, I’m met by Tony Qorri, the company’s big, friendly, straight-talking head of construction. He says that this giant building and four others DataBank has recently built or is planning in the Atlanta area will together add 133,000 square meters (1.44 million square feet) of floor space.

They all follow a universal template that Qorri developed to optimize the construction of the company’s ever-larger centers. At each site, trucks haul in more than a thousand prefabricated concrete pieces: wall panels, columns, and other structural elements. Workers quickly assemble the precision-measured parts. Hundreds of electricians swarm the building to wire it up in just a few days. Speed is crucial when construction delays can mean losing ground in the AI battle.

The ATL4 data center outside Atlanta is one of five being built by DataBank. Together they will add over 130,000 square meters of floor space. DataBank

That battle can be measured in new data centers and floor space. The United States is home to more than 5,000 data centers today, and the Department of Commerce expects that number to grow by around 450 a year through 2030. Worldwide, the number of data centers now exceeds 10,000, and analysts project another 26.5 million m2 of floor space over the next five years. Here in metro Atlanta, developers broke ground last year on projects that will triple the region’s data-center capacity. Microsoft, for instance, is planning a 186,000-m2 complex; big enough to house around 100,000 rack-mounted servers, it will consume 324 megawatts of electricity.

The velocity of the data-center boom means that no one is pausing to await greener cement. For now, the industry’s mantra is “Build, baby, build.”

“There’s no good substitute for concrete in these projects,” says Aaron Grubbs, a structural engineer at ATL4. The latest processors going on the racks are bigger, heavier, hotter, and far more power hungry than previous generations. As a result, “you add a lot of columns,” Grubbs says.

1,000 Companies Working on Green Concrete

Concrete may not seem an obvious star in the story of how electricity and electronics have permeated modern life. Other materials—copper and silicon, aluminum and lithium—get higher billing. But concrete provides the literal, indispensable foundation for the world’s electrical workings. It is the solid, stable, durable, fire-resistant stuff that makes power generation and distribution possible. It undergirds nearly all advanced manufacturing and telecommunications. What was true in the rapid build-out of the power industry a century ago remains true today for the data industry: Technological progress begets more growth—and more concrete. Although each generation of processor and memory squeezes more computing onto each chip, and advances in superconducting microcircuitry raise the tantalizing prospect of slashing the data center’s footprint, Qorri doesn’t think his buildings will shrink to the size of a shoebox anytime soon. “I’ve been through that kind of change before, and it seems the need for space just grows with it,” he says.

By weight, concrete is not a particularly carbon-intensive material. Creating a kilogram of steel, for instance, releases about 2.4 times as much CO2 as a kilogram of cement does. But the global construction industry consumes about 35 billion tonnes of concrete a year. That’s about 4 tonnes for every person on the planet and twice as much as all other building materials combined. It’s that massive scale—and the associated cost and sheer number of producers—that creates both a threat to the climate and inertia that resists change.
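The per-person figure follows directly from the tonnage. A quick check, assuming a world population of about 8 billion (the article does not state the population it used):

```python
# Per-capita concrete consumption implied by the figures above.
# Assumption: world population of roughly 8 billion people.
concrete_tonnes_per_year = 35e9  # global annual concrete use, tonnes
world_population = 8e9           # assumed population

per_capita = concrete_tonnes_per_year / world_population
print(f"{per_capita:.1f} tonnes of concrete per person per year")
# -> 4.4 tonnes, consistent with the "about 4 tonnes" in the text
```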

At its Edmonton, Alberta, plant [above], Heidelberg Materials is adding systems to capture carbon dioxide produced by the manufacture of Portland cement. Heidelberg Materials North America

Yet change is afoot. When I visited the innovation center operated by the Swiss materials giant Holcim, in Lyon, France, research executives told me about the database they’ve assembled of nearly 1,000 companies working to decarbonize cement and concrete. None yet has enough traction to measurably reduce global concrete emissions. But the innovators hope that the boom in data centers—and in associated infrastructure such as new nuclear reactors and offshore wind farms, where each turbine foundation can use up to 7,500 cubic meters of concrete—may finally push green cement and concrete beyond labs, startups, and pilot plants.

Why cement production emits so much carbon

Though the terms “cement” and “concrete” are often conflated, they are not the same thing. A popular analogy in the industry is that cement is the egg in the concrete cake. Here’s the basic recipe: Blend cement with larger amounts of sand and other aggregates. Then add water, to trigger a chemical reaction with the cement. Wait a while for the cement to form a matrix that pulls all the components together. Let sit as it cures into a rock-solid mass.

Portland cement, the key binder in most of the world’s concrete, was serendipitously invented in England by William Aspdin, while he was tinkering with earlier mortars that his father, Joseph, had patented in 1824. More than a century of science has revealed the essential chemistry of how cement works in concrete, but new findings are still leading to important innovations, as well as insights into how concrete absorbs atmospheric carbon as it ages.

As in the Aspdins’ day, the process to make Portland cement still begins with limestone, a sedimentary mineral made from crystalline forms of calcium carbonate. Most of the limestone quarried for cement originated hundreds of millions of years ago, when ocean creatures mineralized calcium and carbonate in seawater to make shells, bones, corals, and other hard bits.

Cement producers often build their large plants next to limestone quarries that can supply decades’ worth of stone. The stone is crushed and then heated in stages as it is combined with lesser amounts of other minerals that typically include calcium, silicon, aluminum, and iron. What emerges from the mixing and cooking are small, hard nodules called clinker. A bit more processing, grinding, and mixing turns those pellets into powdered Portland cement, which accounts for about 90 percent of the CO2 emitted by the production of conventional concrete [see infographic, “Roads to Cleaner Concrete”].

Karen Scrivener, shown in her lab at EPFL, has developed concrete recipes that reduce emissions by 30 to 40 percent. Stefan Wermuth/Bloomberg/Getty Images

Decarbonizing Portland cement is often called heavy industry’s “hard problem” because of two processes fundamental to its manufacture. The first process is combustion: To coax limestone’s chemical transformation into clinker, large heaters and kilns must sustain temperatures around 1,500 °C. Currently that means burning coal, coke, fuel oil, or natural gas, often along with waste plastics and tires. The exhaust from those fires generates 35 to 50 percent of the cement industry’s emissions. Most of the remaining emissions result from gaseous CO2 liberated by the chemical transformation of the calcium carbonate (CaCO3) into calcium oxide (CaO), a process called calcination. That gas also usually heads straight into the atmosphere.
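The calcination chemistry puts a hard floor under those process emissions: by mass, nearly half of the limestone leaves as CO2 regardless of what fuel fires the kiln. A short stoichiometric sketch using standard molar masses shows why:

```python
# Stoichiometry of calcination: CaCO3 -> CaO + CO2.
# Standard molar masses in g/mol.
M_Ca, M_C, M_O = 40.08, 12.011, 15.999

M_CaCO3 = M_Ca + M_C + 3 * M_O  # calcium carbonate, ~100.09 g/mol
M_CO2 = M_C + 2 * M_O           # carbon dioxide, ~44.01 g/mol

co2_fraction = M_CO2 / M_CaCO3
print(f"{co2_fraction:.0%} of the limestone's mass is released as CO2")
# -> 44 percent, before counting any combustion emissions at all
```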

Concrete production, in contrast, is mainly a business of mixing cement powder with other ingredients and then delivering the slurry speedily to its destination before it sets. Most concrete in the United States is prepared to order at batch plants—souped-up materials depots where the ingredients are combined, dosed out from hoppers into special mixer trucks, and then driven to job sites. Because concrete grows too stiff to work after about 90 minutes, concrete production is highly local. There are more ready-mix batch plants in the United States than there are Burger King restaurants.

Batch plants can offer thousands of potential mixes, customized to fit the demands of different jobs. Concrete in a hundred-story building differs from that in a swimming pool. With flexibility to vary the quality of sand and the size of the stone—and to add a wide variety of chemicals—batch plants have more tricks for lowering carbon emissions than any cement plant does.

Cement plants that capture carbon

China accounts for more than half of the concrete produced and used in the world, but companies there are hard to track. Outside of China, the top three multinational cement producers—Holcim, Heidelberg Materials in Germany, and Cemex in Mexico—have launched pilot programs to snare CO2 emissions before they escape and then bury the waste deep underground. To do that, they’re taking carbon capture and storage (CCS) technology already used in the oil and gas industry and bolting it onto their cement plants.

These pilot programs will need to scale up without eating profits—something that eluded the coal industry when it tried CCS decades ago. Tough questions also remain about where exactly to store billions of tonnes of CO2 safely, year after year.

The appeal of CCS for cement producers is that they can continue using existing plants while still making progress toward carbon neutrality, which trade associations have committed to reach by 2050. But with well over 3,000 plants around the world, adding CCS to all of them would take enormous investment. Currently less than 1 percent of the global supply is low-emission cement. Accenture, a consultancy, estimates that outfitting the whole industry for carbon capture could cost up to $900 billion.
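The scale of that estimate is easier to grasp per plant. A rough division, using the roughly 3,000 plants and up-to-$900-billion figures quoted above:

```python
# Rough per-plant cost implied by the Accenture estimate.
# Assumption: ~3,000 cement plants share the up-to-$900-billion total.
total_cost_usd = 900e9
num_plants = 3000

per_plant = total_cost_usd / num_plants
print(f"about ${per_plant / 1e6:.0f} million per plant")
# -> about $300 million per plant, on par with the Edmonton project's
#    $1 billion total (half of it government-funded)
```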

“The economics of carbon capture is a monster,” says Rick Chalaturnyk, a professor of geotechnical engineering at the University of Alberta, in Edmonton, Canada, who studies carbon capture in the petroleum and power industries. He sees incentives for the early movers on CCS, however. “If Heidelberg, for example, wins the race to the lowest carbon, it will be the first [cement] company able to supply those customers that demand low-carbon products”—customers such as hyperscalers.

Though cement companies seem unlikely to invest their own billions in CCS, generous government subsidies have enticed several to begin pilot projects. Heidelberg has announced plans to start capturing CO2 from its Edmonton operations in late 2026, transforming it into what the company claims would be “the world’s first full-scale net-zero cement plant.” Exhaust gas will run through stations that purify the CO2 and compress it into a liquid, which will then be transported to chemical plants to turn it into products or to depleted oil and gas reservoirs for injection underground, where hopefully it will stay put for an epoch or two.

Chalaturnyk says that the scale of the Edmonton plant, which aims to capture a million tonnes of CO2 a year, is big enough to give CCS technology a reasonable test. Proving the economics is another matter. Half the $1 billion cost for the Edmonton project is being paid by the governments of Canada and Alberta.

ROADS TO CLEANER CONCRETE


As the big-data construction boom boosts the tech industry’s emissions, the reinvention of concrete could play a major role in solving the problem.

• CONCRETE TODAY Most of the greenhouse emissions from concrete come from the production of Portland cement, which requires high heat and releases carbon dioxide (CO2) directly into the air.

• CONCRETE TOMORROW At each stage of cement and concrete production, advances in ingredients, energy supplies, and uses of concrete promise to reduce waste and pollution.

The U.S. Department of Energy has similarly offered Heidelberg up to $500 million to help cover the cost of attaching CCS to its Mitchell, Ind., plant and burying up to 2 million tonnes of CO2 per year below the plant. And the European Union has gone even bigger, allocating nearly €1.5 billion ($1.6 billion) from its Innovation Fund to support carbon capture at cement plants in seven of its member nations.

These tests are encouraging, but they are all happening in rich countries, where demand for concrete peaked decades ago. Even in China, concrete production has started to flatten. All the growth in global demand through 2040 is expected to come from less-affluent countries, where populations are still growing and quickly urbanizing. According to projections by the Rhodium Group, cement production in those regions is likely to rise from around 30 percent of the world’s supply today to 50 percent by 2050 and 80 percent before the end of the century.

So will rich-world CCS technology translate to the rest of the world? I asked Juan Esteban Calle Restrepo, the CEO of Cementos Argos, the leading cement producer in Colombia, about that when I sat down with him recently at his office in Medellín. He was frank. “Carbon capture may work for the U.S. or Europe, but countries like ours cannot afford that,” he said.

Better cement through chemistry

As long as cement plants run limestone through fossil-fueled kilns, they will generate excessive amounts of carbon dioxide. But there may be ways to ditch the limestone—and the kilns. Labs and startups have been finding replacements for limestone, such as calcined kaolin clay and fly ash, that don’t release CO2 when heated. Kaolin clays are abundant around the world and have been used for centuries in Chinese porcelain and more recently in cosmetics and paper. Fly ash—a messy, toxic by-product of coal-fired power plants—is cheap and still widely available, even as coal power dwindles in many regions.

At the Swiss Federal Institute of Technology Lausanne (EPFL), Karen Scrivener and colleagues developed cements that blend calcined kaolin clay and ground limestone with a small portion of clinker. Calcining clay can be done at temperatures low enough that electricity from renewable sources can do the job. Various studies have found that the blend, known as LC3, can reduce overall emissions by 30 to 40 percent compared to those of Portland cement.

LC3 is also cheaper to make than Portland cement and performs as well for nearly all common uses. As a result, calcined clay plants have popped up across Africa, Europe, and Latin America. In Colombia, Cementos Argos is already producing more than 2 million tonnes of the stuff annually. The World Economic Forum’s Centre for Energy and Materials counts LC3 among the best hopes for the decarbonization of concrete. Wide adoption by the cement industry, the centre reckons, “can help prevent up to 500 million tonnes of CO2 emissions by 2030.”

In a win-win for the environment, fly ash can also be used as a building block for low- and even zero-emission concrete, and the high heat of processing neutralizes many of the toxins it contains. Ancient Romans used volcanic ash to make slow-setting but durable concrete: The Pantheon, built nearly two millennia ago with ash-based cement, is still in great shape.

Coal fly ash is a cost-effective ingredient that has reactive properties similar to those of Roman cement and Portland cement. Many concrete plants already add fresh fly ash to their concrete mixes, replacing 15 to 35 percent of the cement. The ash improves the workability of the concrete, and though the resulting concrete is not as strong for the first few months, it grows stronger than regular concrete as it ages, like the Pantheon.

University labs have tested concretes made entirely with fly ash and found that some actually outperform the standard variety. More than 15 years ago, researchers at Montana State University used concrete made with 100 percent fly ash in the floors and walls of a credit union and a transportation research center. But performance depends greatly on the chemical makeup of the ash, which varies from one coal plant to the next, and on following a tricky recipe. The decommissioning of coal-fired plants has also been making fresh fly ash scarcer and more expensive.

At Sublime Systems’ pilot plant in Massachusetts, the company is using electrochemistry instead of heat to produce lime silicate cements that can replace Portland cement. Tony Luong

That has spurred new methods to treat and use fly ash that’s been buried in landfills or dumped into ponds. Such industrial burial grounds hold enough fly ash to make concrete for decades, even after every coal plant shuts down. Utah-based Eco Material Technologies is now producing cements that include both fresh and recovered fly ash as ingredients. The company claims it can replace up to 60 percent of the Portland cement in concrete—and that a new variety, suitable for 3D printing, can substitute entirely for Portland cement.

Hive 3D Builders, a Houston-based startup, has been feeding that low-emissions concrete into robots that are printing houses in several Texas developments. “We are 100 percent Portland cement–free,” says Timothy Lankau, Hive 3D’s CEO. “We want our homes to last 1,000 years.”

Sublime Systems, a startup spun out of MIT by battery scientists, uses electrochemistry rather than heat to make low-carbon cement from rocks that don’t contain carbon. Similar to a battery, Sublime’s process uses a voltage between an anode and a cathode to create a pH gradient that isolates silicates and reactive calcium, in the form of lime (CaO). The company mixes those ingredients together to make a cement with no fugitive carbon, no kilns or furnaces, and binding power comparable to that of Portland cement. With the help of $87 million from the U.S. Department of Energy, Sublime is building a plant in Holyoke, Mass., that will be powered almost entirely by hydroelectricity. Recently the company was tapped to provide concrete for a major offshore wind farm planned off the coast of Martha’s Vineyard.

Software takes on the hard problem of concrete

It is unlikely that any one innovation will allow the cement industry to hit its target of carbon neutrality before 2050. New technologies take time to mature, scale up, and become cost-competitive. In the meantime, says Philippe Block, a structural engineer at ETH Zurich, smart engineering can reduce carbon emissions through the leaner use of materials.

His research group has developed digital design tools that make clever use of geometry to maximize the strength of concrete structures while minimizing their mass. The team’s designs start with the soaring architectural elements of ancient temples, cathedrals, and mosques—in particular, vaults and arches—which they miniaturize and flatten and then 3D print or mold inside concrete floors and ceilings. The lightweight slabs, suitable for the upper stories of apartment and office buildings, use much less concrete and steel reinforcement and have a CO2 footprint that’s reduced by 80 percent.

There’s hidden magic in such lean design. In multistory buildings, much of the mass of concrete is needed just to hold the weight of the material above it. The carbon savings of Block’s lighter slabs thus compound, because the size, cost, and emissions of a building’s conventional-concrete elements are slashed.

Vaulted, a Swiss startup, uses digital design tools to minimize the concrete in floors and ceilings, cutting their CO2 footprint by 80 percent. Vaulted

In Dübendorf, Switzerland, a wildly shaped experimental building has floors, roofs, and ceilings created by Block’s structural system. Vaulted, a startup spun out of ETH, is engineering and fabricating the lighter floors of a 10-story office building under construction in Zug, Switzerland.

That country has also been a leader in smart ways to recycle and reuse concrete, rather than simply landfilling demolition rubble. This is easier said than done—concrete is tough stuff, riddled with rebar. But there’s an economic incentive: Raw materials such as sand and limestone are becoming scarcer and more costly. Some jurisdictions in Europe now require that new buildings be made from recycled and reused materials. The new addition of the Kunsthaus Zürich museum, a showcase of exquisite Modernist architecture, uses recycled material for all but 2 percent of its concrete.

As new policies goose demand for recycled materials and threaten to restrict future use of Portland cement across Europe, Holcim has begun building recycling plants that can reclaim cement clinker from old concrete. It recently turned the demolition rubble from some 1960s apartment buildings outside Paris into part of a 220-unit housing complex—touted as the first building made from 100 percent recycled concrete. The company says it plans to build concrete recycling centers in every major metro area in Europe and, by 2030, to include 30 percent recycled material in all of its cement.

Further innovations in low-carbon concrete are certain to come, particularly as the powers of machine learning are applied to the problem. Over the past decade, the number of research papers reporting on computational tools to explore the vast space of possible concrete mixes has grown exponentially. Much as AI is being used to accelerate drug discovery, the tools learn from huge databases of proven cement mixes and then apply their inferences to evaluate untested mixes.
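In spirit, these tools work like the toy screening loop below: a surrogate model learned from past mixes scores candidate recipes, and the lowest-carbon mix that clears a strength target wins. Every coefficient and emission factor here is an invented placeholder for illustration, not real materials data:

```python
# Illustrative sketch only: a toy screening loop in the spirit of
# ML-guided concrete mix design. Surrogate coefficients and CO2
# factors are invented placeholders, not real materials data.

def surrogate_strength(mix):
    # Hypothetical linear surrogate (MPa) "learned" from past mixes.
    return (55 * mix["cement"] + 30 * mix["fly_ash"]
            + 20 * mix["slag"] - 10 * mix["water"])

def embodied_co2(mix):
    # Placeholder embodied-carbon factors per unit binder fraction.
    factors = {"cement": 900, "fly_ash": 20, "slag": 80, "water": 0}
    return sum(factors[k] * v for k, v in mix.items())

def screen(candidates, min_strength=35.0):
    # Keep mixes that clear the strength target; pick the lowest-CO2 one.
    feasible = [m for m in candidates if surrogate_strength(m) >= min_strength]
    return min(feasible, key=embodied_co2) if feasible else None

candidates = [
    {"cement": 1.0, "fly_ash": 0.0, "slag": 0.0, "water": 0.5},
    {"cement": 0.6, "fly_ash": 0.4, "slag": 0.0, "water": 0.5},
    {"cement": 0.4, "fly_ash": 0.4, "slag": 0.2, "water": 0.5},
]
best = screen(candidates)
```

A real system replaces the hand-coded surrogate with a model trained on thousands of tested mixes, but the screening logic is the same.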

Researchers from the University of Illinois and Chicago-based Ozinga, one of the largest private concrete producers in the United States, recently worked with Meta to feed 1,030 known concrete mixes into an AI. The project yielded a novel mix that will be used for sections of a data-center complex in DeKalb, Ill. The AI-derived concrete has a carbon footprint 40 percent lower than the conventional concrete used on the rest of the site. Ryan Cialdella, Ozinga’s vice president of innovation, smiles as he notes the virtuous circle: AI systems that live in data centers can now help cut emissions from the concrete that houses them.

A sustainable foundation for the information age

Cheap, durable, and abundant yet unsustainable, concrete made with Portland cement has been one of modern technology’s Faustian bargains. The built world is on track to double in floor space by 2060, adding 230,000 km², or more than half the area of California. Much of that will house the 2 billion more people we are likely to add to our numbers. As global transportation, telecom, energy, and computing networks grow, their new appendages will rest upon concrete. But if concrete doesn’t change, we will perversely be forced to produce even more concrete to protect ourselves from the coming climate chaos, with its rising seas, fires, and extreme weather.

The AI-driven boom in data centers is a strange bargain of its own. In the future, AI may help us live even more prosperously, or it may undermine our freedoms, civilities, employment opportunities, and environment. But solutions to the bad climate bargain that AI’s data centers foist on the planet are at hand, if there’s a will to deploy them. Hyperscalers and governments are among the few organizations with the clout to rapidly change what kinds of cement and concrete the world uses, and how those are made. With a pivot to sustainability, concrete’s unique scale makes it one of the few materials that could do most to protect the world’s natural systems. We can’t live without concrete—but with some ambitious reinvention, we can thrive with it.

This article was updated on 04 November 2024.




été

Millimeter Waves May Not Be 6G’s Most Promising Spectrum



In 6G telecom research today, a crucial portion of wireless spectrum has been neglected: the Frequency Range 3, or FR3, band. The shortcoming is partly due to a lack of viable software and hardware platforms for studying this region of spectrum, ranging from approximately 6 to 24 gigahertz. But a new, open-source wireless research kit is changing that equation. And research conducted using that kit, presented last week at a leading industry conference, offers proof of viability of this spectrum band for future 6G networks.

In fact, it’s also arguably signaling a moment of telecom industry re-evaluation. The high-bandwidth 6G future, these researchers suggest, may not be entirely centered on difficult millimeter-wave technologies. Instead, 6G may leave plenty of room for higher-bandwidth microwave spectrum technologies that are ultimately more familiar and accessible.

The FR3 band is a region of microwave spectrum just shy of millimeter-wave frequencies (30 to 300 GHz). FR3 is also already very popular today for satellite Internet and military communications. For future 5G and 6G networks to share the FR3 band with incumbent players would require telecom networks nimble enough to perform regular, rapid-response spectrum-hopping.

Yet spectrum-hopping might still be an easier problem to solve than those posed by the inherent physical shortcomings of some portions of millimeter-wave spectrum—shortcomings that include limited range, poor penetration, line-of-sight operations, higher power requirements, and susceptibility to weather.
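One of those shortcomings, free-space range, is easy to quantify with the Friis path-loss formula. The 100-meter distance and carrier frequencies below are illustrative choices, not figures from the article:

```python
import math

def fspl_db(freq_hz, dist_m):
    # Friis free-space path loss in dB: 20 * log10(4 * pi * d * f / c).
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

# Loss over 100 m for a notional FR3 carrier (10 GHz) vs. a
# millimeter-wave carrier (60 GHz).
loss_fr3 = fspl_db(10e9, 100)
loss_mmw = fspl_db(60e9, 100)
delta_db = loss_mmw - loss_fr3  # 20 * log10(60/10), about 15.6 dB
```

All else being equal, moving from 10 GHz to 60 GHz costs roughly 15.6 dB of extra path loss, a gap that antenna gain or denser cell sites must make up.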

Pi-Radio’s New Face

Earlier this year, the Brooklyn, N.Y.-based startup Pi-Radio—a spinoff from New York University’s Tandon School of Engineering—released a wireless spectrum hardware and software kit for telecom research and development. Pi-Radio’s FR-3 is a software-defined radio system developed for the FR3 band specifically, says company co-founder Sundeep Rangan.

“Software-defined radio is basically a programmable platform to experiment and build any type of wireless technology,” says Rangan, who is also the associate director of NYU Wireless. “In the early stages when developing systems, all researchers need these.”

For instance, at the IEEE Signal Processing Society’s Asilomar Conference on Signals, Systems and Computers in Pacific Grove, Calif., on 30 October, the Pi-Radio team presented a new research finding: a method that infers the direction to an FR3 antenna from measurements taken by a mobile Pi-Radio receiver.

According to Pi-Radio co-founder Marco Mezzavilla, who’s also an associate professor at the Polytechnic University of Milan, the early-stage FR3 research that the team presented at Asilomar will enable researchers “to capture [signal] propagation in these frequencies and will allow us to characterize it, understand it, and model it... And this is the first stepping stone towards designing future wireless systems at these frequencies.”

There’s a good reason researchers have recently rediscovered FR3, says Paolo Testolina, a postdoctoral research fellow at Northeastern University’s Institute for the Wireless Internet of Things who is unaffiliated with the current research effort. “The current scarcity of spectrum for communications is driving operators and researchers to look in this band, where they believe it is possible to coexist with the current incumbents,” he says. “Spectrum sharing will be key in this band.”

Rangan notes that the work on which Pi-Radio was built was published earlier this year, both on the foundational aspects of building networks in the FR3 band and on the specific implementation of Pi-Radio’s unique frequency-hopping research platform for future wireless networks. (Both papers were published in IEEE journals.)

“If you have frequency hopping, that means you can get systems that are resilient to blockage,” Rangan says. “But even, potentially, if it was attacked or compromised in any other way, this could actually open up a new type of dimension that we typically haven’t had in the cellular infrastructure.” The frequency-hopping that FR3 requires for wireless communications, in other words, could introduce a layer of hack-proofing that might potentially strengthen the overall network.
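The resilience idea can be sketched in a few lines, assuming both ends share a seed: transmitter and receiver derive the same pseudo-random channel order and skip sub-bands known to be blocked or jammed. The channel plan and seed below are hypothetical:

```python
import random

def hop_sequence(seed, channels, n_hops, blocked=frozenset()):
    # Both ends derive the same pseudo-random channel order from a
    # shared seed, skipping sub-bands known to be blocked or jammed.
    rng = random.Random(seed)
    usable = [ch for ch in channels if ch not in blocked]
    return [rng.choice(usable) for _ in range(n_hops)]

channels = list(range(6_000, 24_000, 2_000))  # notional FR3 sub-bands, MHz
tx_hops = hop_sequence(seed=42, channels=channels, n_hops=8, blocked={8_000})
rx_hops = hop_sequence(seed=42, channels=channels, n_hops=8, blocked={8_000})
```

Because both sides compute the identical sequence, an interferer parked on any one sub-band only ever catches a fraction of the hops.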

Complement, Not Replacement

The Pi-Radio team stresses, however, that FR3 would not supplant or supersede other new segments of wireless spectrum. There are, for instance, millimeter-wave 5G deployments already underway today that will no doubt expand in scope and performance into the 6G future. That said, how FR3 will expand future 5G and 6G spectrum usage is an entirely unwritten chapter: Whether the band fizzles, takes off, or finds a comfortable place somewhere in between depends in part on how it’s researched and developed now, the Pi-Radio team says.

“We’re at this tipping point where researchers and academics actually are empowered by the combination of this cutting-edge hardware with open-source software,” Mezzavilla says. “And that will enable the testing of new features for communications in these new frequency bands.” (Mezzavilla credits the National Telecommunications and Information Administration for recognizing the potential of FR3, and for funding the group’s research.)

By contrast, millimeter-wave 5G and 6G research has to date been bolstered, the team says, by the presence of a wide range of millimeter-wave software-defined radio (SDR) systems and other research platforms.

“Companies like Qualcomm, Samsung, Nokia, they actually had excellent millimeter wave development platforms,” Rangan says. “But they were in-house. And the effort it took to build one—an SDR at a university lab—was sort of insurmountable.”

So releasing an inexpensive open-source SDR for the FR3 band, Mezzavilla says, could jump-start a whole new wave of 6G research.

“This is just the starting point,” Mezzavilla says. “From now on we’re going to build new features—new reference signals, new radio resource control signals, near-field operations... We’re ready to ship these yellow boxes to other academics around the world to test new features and test them quickly, before 6G is even remotely near us.”

This story was updated on 7 November 2024 to include detail about funding from the National Telecommunications and Information Administration.




été

A New Spacecraft Could Help Determine if There’s Life on a Moon of Jupiter

The Europa Clipper, set for launch in October, will explore a distant ocean world.







été

NASA completes spacecraft for TRACERS mission to investigate hazardous solar storms

Solar storms have the ability to harm astronauts and force massive blackouts




été

New LED camouflage can deter shark attacks, scientists say

Sharks less likely to interact as LED lights get brighter




été

Dem Rep. Torres: Biden Showed 'Incompetence' on Immigration Because He Catered to 'Far-Left Elites'


On Tuesday’s broadcast of MSNBC’s “Morning Joe,” Rep. Ritchie Torres (D-NY) stated that “the Biden administration demonstrated incompetence in managing the migrant crisis,” President Joe Biden “had the unilateral ability to issue an executive order restricting migration at the border, and he waited two-and-a-half years,” because the order “was unpopular among far-left elites who have outsized power over the policymaking and messaging of the Democratic Party.” Torres said, “[O]n the subject of immigration, there was genuine political malpractice. Since 2022, there has been an unprecedented wave of migration, whose impact was felt, not only at the border, but in cities like New York, where the shelter system and our municipal finances were completely overwhelmed. … Despite clear signs of popular discontent, it took the Biden administration two-and-a-half years to issue an executive order restricting migration at the border, and by then it was too late. The Republicans had won the issue, had weaponized it against us. And when the President issued the executive order, polling revealed that it was popular among the American people, among people from every racial category, blacks and whites, Latinos and Asians. So, if it was effective at reducing migration at the border and if it was

The post Dem Rep. Torres: Biden Showed ‘Incompetence’ on Immigration Because He Catered to ‘Far-Left Elites’ appeared first on Breitbart.




été

Trump Nominates Fox News Host Pete Hegseth for Secretary of Defense: 'Nobody Fights Harder for the Troops'


President-elect Donald Trump nominated Fox News host Pete Hegseth to serve as his Secretary of Defense, hailing him as a champion of his "peace through strength" policy.

The post Trump Nominates Fox News Host Pete Hegseth for Secretary of Defense: ‘Nobody Fights Harder for the Troops’ appeared first on Breitbart.




été

Delhi Ganesh (1944-2024): The very best of the veteran Tamil actor’s filmography in pictures

A prolific figure in Tamil cinema, the versatile actor appeared in over 400 films across Tamil, Telugu, and Malayalam languages




été

Out Now: ‘Punch Club 2: Fast Forward’, ‘Labyrinth: The Wizard’s Cat’, ‘TENSEI’, ‘Vampire: The Masquerade – Shadows of New York’, ‘Auto Pirates: Captains Cup’, ‘Jenny LeClue – Detectivu’ and More

Each and every day new mobile games are hitting the App Store, and so each week we put together a …





été

The Best Switch Visual Novels and Adventure Games in 2024 – From Fata Morgana and VA-11 Hall-A to Famicom Detective Club and Gnosia

After tackling the best party games on Switch in 2024, the recent release of Emio – The Smiling Man: Famicom …




été

Trump to nominate Fox News host Pete Hegseth to be defense secretary

President-elect Donald Trump has announced he will nominate Fox News host Pete Hegseth to serve as secretary of defense. Hegseth is a combat veteran who has long advocated for veterans.




été

LED lights on underside of surfboards may deter shark attacks!








été

Honor Veterans by Improving the Benefits of Military Service — and Reducing the Risks

Private ownership, consumer choice, and competition would deliver better benefits to veterans — and force policy-makers to confront the costs of military engagements.




été

Thousands gather in Ottawa for Remembrance Day tribute to Canada's veterans

Thousands of veterans, military personnel and their supporters gathered at Canada's National War Memorial in Ottawa to remember those who have fought and died to protect this country and its freedoms.




été

Zach Bryan reportedly offered his ex $12 million for her silence after their breakup

Singer Zach Bryan reportedly offered his ex-girlfriend Brianna LaPaglia $12 million not to speak about their relationship.




été

Energy smart meter issues creating north-south divide

Technology differences mean meters in northern England and Scotland may not work properly, energy firm body admits.




été

H5N1 Detected in Pig Highlights the Risk of Bird Flu Mixing with Seasonal Flu

Humans and pigs could both serve as mixing vessels for a bird flu–seasonal flu hybrid, posing a risk of wider spread




été

Brazen Scofflaws? Are Pharma Companies Really Completely Ignoring FDAAA?

Results reporting requirements are pretty clear. Maybe critics should re-check their methods?

Ben Goldacre has rather famously described the clinical trial reporting requirements in the Food and Drug Administration Amendments Act of 2007 as a “fake fix” that was being thoroughly “ignored” by the pharmaceutical industry.

Pharma: breaking the law in broad daylight?
He makes this sweeping, unconditional proclamation about the industry and its regulators on the basis of a single study in the BMJ, blithely ignoring the fact that a) the authors of the study admitted they could not adequately determine the number of studies meeting FDAAA requirements, and b) a subsequent FDA review identified only 15 trials potentially out of compliance, out of a pool of thousands.


Despite the fact that the FDA, which has access to more data, says that only a tiny fraction of studies are potentially noncompliant, Goldacre's frequently repeated claim that the law is being ignored seems to have caught on in the general run of journalistic and academic discussions of FDAAA.

And now there appears to be additional support for the idea that a large percentage of studies are noncompliant with FDAAA results reporting requirements, in the form of a new study in the Journal of Clinical Oncology: "Public Availability of Results of Trials Assessing Cancer Drugs in the United States" by Thi-Anh-Hoa Nguyen et al. In it, the authors report even lower levels of FDAAA compliance – a mere 20% of randomized clinical trials met the requirement of posting results on ClinicalTrials.gov within one year.

Unsurprisingly, the JCO results were immediately picked up and circulated uncritically by the usual suspects.

I have to admit not knowing much about pure academic and cooperative group trial operations, but I do know a lot about industry-run trials – simply put, I find the data as presented in the JCO study impossible to believe. Everyone I work with in pharma trials is painfully aware of the regulatory environment they work in. FDAAA compliance is a given, a no-brainer: large internal legal and compliance teams are everywhere, ensuring that the letter of the law is followed in clinical trial conduct. If anything, pharma sponsors are twitchily over-compliant with these kinds of regulations (for example, most still adhere to 100% verification of source documentation – sending monitors to physically examine every single record of every single enrolled patient - even after the FDA explicitly told them they didn't have to).

I realize that’s anecdotal evidence, but when such behavior is so pervasive, it’s difficult to buy into data that says it’s not happening at all. The idea that all pharmaceutical companies are ignoring a highly visible law that’s been on the books for 6 years is extraordinary. Are they really so brazenly breaking the rules? And is FDA abetting them by disseminating incorrect information?

Those are extraordinary claims, and would seem to require extraordinary evidence. The BMJ study had clear limitations that make its implications entirely unclear. Is the JCO article any better?

Some Issues


In fact, there appear to be at least two major issues that may have seriously compromised the JCO findings:

1. Studies that were certified as being eligible for delayed reporting requirements, but do not have their certification date listed.

The study authors make what I believe to be a completely unwarranted assumption:

In trials for approval of new drugs or approval for a new indication, a certification [permitting delayed results reporting] should be posted within 1 year and should be publicly available.

It’s unclear to me why the authors think the certifications “should be” publicly available. In re-reading FDAAA section 801, I don’t see any reference to that being a requirement. I suppose I could have missed it, but the authors provide a citation to a page that clearly does not list any such requirement.

But their methodology assumes that all trials that have a certification will have it posted:

If no results were posted at ClinicalTrials.gov, we determined whether the responsible party submitted a certification. In this case, we recorded the date of submission of the certification to ClinicalTrials.gov.

If a sponsor gets approval from FDA to delay reporting (as is routine for all drugs that are either not approved for any indication, or being studied for a new indication – i.e., the overwhelming majority of pharma drug trials), but doesn't post that approval on the registry, the JCO authors deem that trial “noncompliant”. This is not warranted: the company may have simply chosen not to post the certification despite being entirely FDAAA compliant.

2. Studies that were previously certified for delayed reporting and subsequently reported results

It is hard to tell how the authors treated this rather substantial category of trials. If a trial was certified for delayed results reporting but then subsequently published results, the certification date becomes difficult to find. Indeed, it appears that when there were results, the authors simply looked at the time from study completion to results posting. In effect, this would reclassify almost every one of these trials from compliant to noncompliant. Consider this example trial:


  • Phase 3 trial completes January 2010
  • Certification of delayed results obtained December 2010 (compliant)
  • FDA approval June 2013
  • Results posted July 2013 (compliant)


In looking at the JCO paper's methods section, it really appears that this trial would be classified as reporting results 3.5 years after completion, and therefore be considered noncompliant with FDAAA. In fact, this trial is entirely kosher, and would be extremely typical for many phase 2 and 3 trials in industry.
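The misclassification argument can be sketched as two competing compliance checks. The one-year window and the dates below are illustrative assumptions about the rules and the example trial, not the JCO authors' actual methodology:

```python
from datetime import date

ONE_YEAR = 366  # days; a generous leap-year allowance for this sketch

def naive_status(completed, results_posted):
    # The reading this post attributes to the JCO analysis: time from
    # study completion straight to results posting, ignoring certification.
    ok = (results_posted - completed).days <= ONE_YEAR
    return "compliant" if ok else "noncompliant"

def certification_aware_status(completed, certified, approved, results_posted):
    # The post's reading of FDAAA: certify delayed reporting within a
    # year of completion, then post results within a year of approval.
    cert_ok = certified is not None and (certified - completed).days <= ONE_YEAR
    results_ok = (results_posted - approved).days <= ONE_YEAR
    return "compliant" if cert_ok and results_ok else "noncompliant"

# The example trial's timeline (illustrative dates).
completed = date(2010, 1, 15)   # phase 3 trial completes
certified = date(2010, 12, 15)  # delayed-reporting certification obtained
approved = date(2013, 6, 15)    # FDA approval
posted = date(2013, 7, 15)      # results posted
```

Under the naive check the trial looks about 3.5 years late; under the certification-aware check it is compliant at every step.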

Time for Some Data Transparency


The above two concerns may, in fact, be non-issues. They certainly appear to be implied in the JCO paper, but the wording isn't terribly detailed and could easily be giving me the wrong impression.

However, if either or both of these issues are real, they may affect the vast majority of "noncompliant" trials in this study. Given the fact that most clinical trials are either looking at new drugs, or looking at new indications for new drugs, these two issues may entirely explain the gap between the JCO study and the unequivocal FDA statements that contradict it.

I hope that, given the importance of transparency in research, the authors will be willing to post their data set publicly so that others can review their assumptions and independently verify their conclusions. It would be more than a bit ironic otherwise.

[Image credit: Shameless lawlessness via Flickr user willytronics.]


Thi-Anh-Hoa Nguyen, Agnes Dechartres, Soraya Belgherbi, and Philippe Ravaud (2013). Public Availability of Results of Trials Assessing Cancer Drugs in the United States JOURNAL OF CLINICAL ONCOLOGY DOI: 10.1200/JCO.2012.46.9577




été

Retention metrics, simplified

[Originally posted on First Patient In]

In my experience, most clinical trials do not suffer from significant retention issues. This is a testament to the collaborative good will of most patients who consent to participate, and to the patient-first attitude of most research coordinators.

However, in many trials – especially those that last more than a year – the question of whether there is a retention issue will come up at some point while the trial’s still going. This is often associated with a jump in early terminations, which can occur as the first cohort of enrollees has been in the trial for a while.

It’s a good question to ask midstream: are we on course to have as many patients fully complete the trial as we’d originally anticipated?

However, the way we go about answering the question is often flawed and confusing. Here’s an example: a sponsor came to us with what they thought was a higher rate of early terminations than expected. The main problem? They weren't actually sure.

Here’s their data. Can you tell?

Original retention graph.
If you can, please let me know how! While this chart is remarkably ... full of numbers, it provides no actual insight into when patients are dropping out, and no way that I can tell to project eventual total retention.

In addition, measuring the “retention rate” as a simple ratio of active to terminated patients will not provide an accurate benchmark until the trial is almost over. Here's why: patients tend to drop out later in a trial, so as long as you’re enrolling new patients, your retention rate will be artificially high. When enrollment ends, your retention rate will appear to drop rapidly – but this is only because of the artificial lift you had earlier.

In fact, that was exactly the problem the sponsor had: when enrollment ended, the retention rate started dropping. It’s good to be concerned, but it’s also important to know how to answer the question.

Fortunately, there is a very simple way to get a clear answer in most cases – one that’s probably already in use by your biostats team around the corner: the Kaplan-Meier “survival” curve.

Here is the same study data, but patient retention is simply depicted as a K-M graph. The key difference is that instead of calendar dates, we used the relative measure of time in the trial for each patient. That way we can easily spot where the trends are.
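For readers who want to try this, a minimal Kaplan-Meier estimator is only a few lines of Python. Here an early termination is the "event" and active or completed patients are censored; the data below are toy numbers, not the sponsor's:

```python
def kaplan_meier(durations, dropped):
    # durations: each patient's time in the trial (e.g., days).
    # dropped:   True if the patient terminated early at that time;
    #            False if still active or completed (treated as censored).
    events = sorted(zip(durations, dropped), key=lambda e: (e[0], not e[1]))
    at_risk, surv, curve = len(events), 1.0, []
    for t, d in events:
        if d:  # an early termination is the "event"
            surv *= (at_risk - 1) / at_risk
        curve.append((t, surv))
        at_risk -= 1  # the patient leaves the risk set either way
    return curve

# Toy data: drop-outs at days 100 and 300; censored at days 200 and 400.
curve = kaplan_meier([100, 200, 300, 400], [True, False, True, False])
```

Plotting `curve` as a step function against time-in-trial gives the retention picture, and the final survival value is the basis for projecting eventual completions.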


In this case, we were able to establish quickly that patient drop-outs were increasing at a relatively small constant rate, with a higher percentage of drops coinciding with the one-year study visit. Most importantly, we were able to very accurately predict the eventual number of patients who would complete the trial. And it only took one graph!







été

Crop Parasites Can Be Deterred by “Electric Fences”



Imagine you’re a baby cocoa plant, just unfurling your first tentative roots into the fertile, welcoming soil.

Somewhere nearby, a predator stirs. It has no ears to hear you, no eyes to see you. But it knows where you are, thanks in part to the weak electric field emitted by your roots.

It is microscopic, but it’s not alone. By the thousands, the creatures converge, slithering through the waterlogged soil, propelled by their flagella. If they reach you, they will use fungal-like hyphae to penetrate and devour you from the inside. They’re getting closer. You’re a plant. You have no legs. There’s no escape.

But just before they fall upon you, they hesitate. They seem confused. Then, en masse, they swarm off in a different direction, lured by a more attractive electric field. You are safe. And they will soon be dead.

If Eleonora Moratto and Giovanni Sena get their way, this is the future of crop pathogen control.

Many variables are involved in the global food crisis, but among the worst are the pests that devastate food crops, ruining up to 40 percent of their yield before they can be harvested. One of these—the little protist in the example above, an oomycete formally known as Phytophthora palmivora—has a US $1 billion appetite for economic staples like cocoa, palm, and rubber.

There is currently no chemical defense that can vanquish these creatures without poisoning the rest of the (often beneficial) organisms living in the soil. So Moratto, Sena, and their colleagues in Sena’s group at Imperial College London settled on a non-traditional approach: They exploited P. palmivora’s electric sense, which can be spoofed.

All plant roots that have been measured to date generate external ion flux, which translates into a very weak electric field. Decades of evidence suggests that this signal is an important target for predators’ navigation systems. However, it remains a matter of some debate how much their predators rely on plants’ electrical signatures to locate them, as opposed to chemical or mechanical information. Last year, Moratto and Sena’s group found that P. palmivora spores are attracted to the positive electrode of a cell generating current densities of 1 ampere per square meter. “The spores followed the electric field,” says Sena, suggesting that a similar mechanism helps them find natural bioelectric fields emitted by roots in the soil.

That got the researchers wondering: Might such an artificial electric field override the protists’ other sensory inputs, and scramble their compasses as they tried to use plant roots’ much weaker electrical output?

To test the idea, the researchers developed two ways to protect plant roots using a constant vertical electric field. They cultivated two common snacks for P. palmivora—a flowering plant related to cabbage and mustard, and a legume often used as livestock feed—in tubes in a hydroponic solution.

Two electric-field configurations were tested: A “global” vertical field [left] and a field generated by two small nearby electrodes. The global field proved to be slightly more effective. Eleonora Moratto

In the first assay, the researchers sandwiched the plant roots between rows of electrodes above and below, which completely engulfed them in a “global” vertical field. For the second set, the field was generated using two small electrodes a short distance away from the plant, creating current densities on the order of 10 A/m2. Then they unleashed the protists.

With respect to the control group, both methods successfully diverted a significant portion of the predators away from the plant roots. They swarmed the positive electrode, where—since zoospores can’t survive for longer than about 2 to 3 hours without a host—they presumably starved to death. Or worse. Neil Gow, whose research presented some of the first evidence for zoospore electrosensing, has other theories about their fate. “Applied electrical fields generate toxic products and steep pH gradients near and around the electrodes due to the electrolysis of water,” he says. “The tropism towards the electrode might be followed by killing or immobilization due to the induced pH gradients.”

Not only did the technique prevent infestation, but some evidence indicates that it may also mitigate existing infections. The researchers published their results in August in Scientific Reports.

The global electric field was marginally more successful than the local. However, it would be harder to translate from lab conditions into a (literal) field trial in soil. The local electric field setup would be easy to replicate: “All you have to do is stick the little plug into the soil next to the crop you want to protect,” says Sena.

Moratto and Sena say this is a proof of concept that demonstrates a basis for a new, pesticide-free way to protect food crops. (Sena likens the technique to the decoys used by fighter jets to draw away incoming missiles by mimicking the signals of the original target.) They are now looking for funding to expand the project. The first step is testing the local setup in soil; the next is to test the approach on Phytophthora infestans, a meaner, scarier cousin of P. palmivora.

P. infestans attacks a more varied diet of crops; you may be familiar with its work during the Irish potato famine. The close genetic similarity between the two species makes it another promising candidate for electrical pest control. This next investigation may require more funding, however, since P. infestans research can be undertaken only under more stringent laboratory security protocols.

The work at Imperial ties into the broader—and somewhat charged—debate around electrostatic ecology; that is, the extent to which creatures including ticks make use of heretofore poorly understood electrical mechanisms to orient themselves and in other ways enhance their survival. “Most people still aren’t aware that naturally occurring electricity can play an ecological role,” says Sam England, a behavioral ecologist with Berlin’s Natural History Museum. “So I suspect that once these electrical phenomena become more well known and understood, they will inspire a greater number of practical applications like this one.”




été

The Trust-Building Playbook: 5 Tips Every Digital Health Marketer Needs to Know

Building trust while simultaneously building products, selling, recruiting, and fundraising can feel impossible. But it’s required whether you have the time or not, and it doesn’t stop no matter how big you grow.

The post The Trust-Building Playbook: 5 Tips Every Digital Health Marketer Needs to Know appeared first on MedCity News.




été

Canada detects its first presumptive human H5 bird flu case

OTTAWA - Canada has detected its first presumptive case of H5 bird flu in a person, a teenager in the western province of British Columbia, health officials said on Saturday (Nov 9).

The teenager likely caught the virus from a bird or animal and was receiving care at a children's hospital, the province said in a statement. The province said it was investigating the source of exposure and identifying the teenager's contacts.

The risk to the public remains low, Canada's Health Minister Mark Holland said in a post on X. "This is a rare event," British Columbia Health Officer Bonnie Henry said in a statement. "We are conducting a thorough investigation to fully understand the source of exposure here in B.C."

H5 bird flu is widespread in wild birds worldwide and is causing outbreaks in poultry and US dairy cows, with several recent human cases among US dairy and poultry workers. There has been no evidence of person-to-person spread so far, but if that were to happen, a pandemic could unfold, scientists have said.




été

190946: Interior Secretary provides terms of A.Q. Khan's modified detention

S.M. Zafar, Khan's prominent and highly respected lawyer, had pledged to the government that the meeting with the press would be Khan's "first and last" such encounter.




été

Ukraine's Foreign Minister Shares Insights on the War and Ukrainian Determination

The Belfer Center at Harvard Kennedy School hosted a virtual conversation with Ukraine's Foreign Minister Dmytro Kuleba Wednesday (Feb. 22) to discuss the war in Ukraine as it reaches a full year since Russia's invasion.