
Delaware Division of the Arts and The Biggs Museum of American Art Open Award Winners XXIV Exhibition

Twenty Delaware artists receive significant awards from the Delaware Division of the Arts

Dover, Del. (April 22, 2024) – The Biggs Museum of American Art has opened the exhibition Award Winners XXIV, marking the 24th consecutive year of hosting this exhibition in partnership with the Delaware Division of the Arts. Showcasing the exceptional work […]





Tetley Tea To BigBasket: Tata Group's Biggest Strides Under Ratan Tata

Industrialist Ratan Tata, who passed away at the age of 86, is credited with transforming the Tata Group's portfolio during his tenure as chairman to include products from salt to software to sports cars.





3 Biggest Changes Of iOS 16.2 Update That Every iPhone User Should Know!

In its latest announcement, Apple said it is preparing the iOS 16.2 update for iPhones worldwide. As with the previous release, a number of changes are coming to iPhones. iOS 16.2 Update Release Date: So far, Apple has not announced a release date for the iOS 16.2 update. Reportedly, the […]





Biggest-name world leaders missing at UN climate talks, others fill the void

BAKU, Azerbaijan — World leaders are converging Tuesday at the United Nations annual climate conference in Baku, Azerbaijan, although the big names and powerful countries are noticeably absent, unlike past climate talks, which had the star power of a soccer World Cup. But 2024's climate talks are more like the International Chess Federation world championship: lacking recognizable names but big on nerd power and strategy.

The top leaders of the 13 largest carbon dioxide-polluting countries, whose nations are responsible for more than 70% of 2023's heat-trapping gases, will not appear. The world's biggest polluters and strongest economies — China and the United States — aren't sending their No. 1s. India's and Indonesia's heads of state are also not in attendance, meaning the four most populous nations, home to more than 42% of the world's population, will have no leaders speaking.

“It’s symptomatic of the lack of political will to act. There’s no sense of urgency,” said climate scientist Bill Hare, CEO of Climate Analytics. He said this explains “the absolute mess we’re finding ourselves in.”

Transition to clean energy

The world has witnessed the hottest day, months and year on record “and a master class in climate destruction,” United Nations Secretary-General Antonio Guterres told the world leaders who did show up. But Guterres held out hope, saying, in a veiled reference to Donald Trump's re-election in the United States, that the “clean energy revolution is here. No group, no business, no government can stop it.” United Nations officials said that in 2016, when Trump was first elected, there were 180 gigawatts of clean energy and 700,000 electric vehicles in the world. Now there are 600 gigawatts of clean energy and 14 million electric vehicles.
Host Azerbaijan President Ilham Aliyev kicked off two scheduled days of world leaders' speeches by lambasting Armenia, western news media, climate activists and critics who highlighted his country's rich oil and gas history and trade, calling them hypocritical since the United States is the world's biggest oil producer. He said it was “not fair” to call Azerbaijan a “petrostate” because it produces less than 1% of the world's oil and gas. Oil and gas are “a gift of the God,” just like the sun, wind and minerals, Aliyev said. “Countries should not be blamed for having them. And should not be blamed for bringing these resources to the market because the market needs them.” As the host and president of the climate talks, called COP29, Aliyev said his country will push hard for a green transition away from fossil fuels, “but at the same time, we must be realistic.”

Lack of star power

Aliyev, the United Kingdom’s Prime Minister Keir Starmer and Turkey’s President Recep Tayyip Erdogan are the headliners of the roughly 50 leaders set to speak on Tuesday. There will also be a strong showing from the leaders of some of the world’s most climate-vulnerable countries: several small island nations' presidents and more than a dozen leaders from countries across Africa are set to speak over the two-day World Leaders’ Summit at COP29.

As a sign of how far the bar for celebrity has fallen, on Tuesday morning photographers and video cameras ran alongside one leader walking through the halls of the meeting. It was the emergency management minister for host country Azerbaijan. United Nations officials downplayed the lack of head-of-state star power, saying that every country is represented and active in the climate talks. One logistical issue is that next week the leaders of the most powerful countries have to be half a world away in Brazil for the G20 meetings. The United States' recent election, Germany's government collapse, natural disasters and personal illnesses have also kept some leaders away.
The major focus of the negotiations is climate finance: rich nations helping poor countries pay for transitioning their economies away from fossil fuels, coping with climate change's coming harms and compensating for damage from weather extremes. Nations are negotiating over huge amounts of money, anywhere from $100 billion a year to $1.3 trillion a year. That money “is not charity, it's an investment,” Guterres said. “Developing countries must not leave Baku empty-handed. A deal is a must.”





Global index for free and fair elections suffers biggest decline on record in 2023, democracy watchdog says

STOCKHOLM — Lower voter turnout and increasingly contested results globally are threatening the credibility of elections, an intergovernmental watchdog warned on Tuesday, as its sub-index for free and fair elections suffered its biggest decline on record in 2023.

In its report, the Stockholm-based International Institute for Democracy and Electoral Assistance (IDEA) said 2023 was the eighth consecutive year with a net decline in overall democratic performance, the longest consecutive fall since records began in 1975. The watchdog bases its Global State of Democracy indexes on more than 100 variables and uses four main categories (representation, rights, rule of law and participation) to assess performance. The sub-category of representation covering free and fair elections and parliamentary oversight suffered its worst year on record in 2023.

"This report is a call for action to protect democratic elections," IDEA's Secretary-General Kevin Casas-Zamora said in the report. "The success of democracy depends on many things, but it becomes utterly impossible if elections fail."

The think-tank said government intimidation and electoral-process irregularities, such as fraudulent voter registration and vote-counting, were increasing. It also said that threats of foreign interference, disinformation and the use of artificial intelligence in campaigns added to the challenges. Global voter participation fell to 55.5% of eligible voters in 2023 from 65.2% in 2008, and in almost 20% of elections worldwide between 2020 and 2024, one of the losing candidates or parties rejected the results.

IDEA said that democratic performance in the U.S., which holds a presidential election this year, had recovered somewhat in the past two years, but the assassination attempt on former President Donald Trump in July highlighted continued risks. "Less than half (47%) of Americans said the 2020 election was 'free and fair' and the country remains deeply polarized," IDEA said.





World's 10 Biggest Energy Gluttons

A look into which countries use the most energy per capita reveals some surprising results, from the Middle East to the Caribbean.





The Biggest Stories of 2023

The world was in turmoil in 2023, rife with stomach-churning news, with the international and U.S. press primarily focused on the same news stories. It seems that the world has never been more interconnected.





PVL: Collective effort gets it done for undermanned Chery Tiggo

MANILA, Philippines — After an exodus at Chery Tiggo, Ara Galang led the team’s collective effort to eke out a come-from-behind win over gritty Capital1 and jumpstart its 2024-25 PVL All-Filipino Conference campaign. The Crossovers were pushed to their limits right at the start of their post-Eya Laure era, as they also missed key members Mylene Paat, Jen Nierva and Jas Nabor, leaving an undermanned team that had also lost EJ Laure and Buding Duremdes. But Galang and the Crossovers stuck to their collective effort to fight back from a 1-2 deficit and survive the Solar Spikers, 20-25, 25-23, 22-25, 25-18, […]






Largest genome sequenced so far is 30 times bigger than a human's

The South American lungfish has a whopping 180 gigabases of DNA in each cell, compared with 6 gigabases in human cells





Listening to worms wriggle can help us monitor ecosystem health

The noises made by organisms like ants and worms as they move around in the soil can be used to assess whether an ecosystem is in good shape





There could be 30,000 species of earthworms wriggling around the world

Nearly 6000 species and subspecies of earthworms have been identified by scientists – but the true number could top 30,000





The physicist who wants to build a telescope bigger than Earth

Alex Lupsasca plans to extend Earth's largest telescope network beyond the atmosphere with a space-based dish. It could spot part of a black hole we've never seen before – and perhaps discover new physics





How clues in honey can help fight our biggest biodiversity challenges

There are secrets aplenty in a pot of honey – from information about bees' "micro-bee-ota" to DNA from the environment – that can help us fight food fraud and even monitor shifts in climate





Can we solve quantum theory’s biggest problem by redefining reality?

With its particles in two places at once, quantum theory strains our common sense notions of how the universe should work. But one group of physicists says we can get reality back if we just redefine its foundations





Peter Higgs, physicist who theorised the Higgs boson, has died aged 94

Nobel prizewinning theoretical physicist Peter Higgs has died aged 94. He proposed the particle that gives other particles mass – now named the Higgs boson and discovered by the Large Hadron Collider at CERN in 2012





How Peter Higgs revealed the forces that hold the universe together

The physicist Peter Higgs quietly revolutionised quantum field theory, then lived long enough to see the discovery of the Higgs boson he theorised. Despite receiving a Nobel prize, he remained in some ways as elusive as the particle that shares his name





Is the world's biggest fusion experiment dead after new delay to 2035?

ITER, a €20 billion nuclear fusion reactor under construction in France, will now not switch on until 2035 - a delay of 10 years. With smaller commercial fusion efforts on the rise, is it worth continuing with this gargantuan project?





We may finally know what caused the biggest cosmic explosion ever seen

The gamma ray burst known as GRB221009A is the biggest explosion astronomers have ever glimpsed and we might finally know what caused the blast





A slight curve helps rocks make the biggest splash

Researchers were surprised to find that a very slightly curved object produces a more dramatic splash than a perfectly flat one





Another blow for dark matter as biggest hunt yet finds nothing

The hunt for particles of dark matter has been stymied once again, with physicists placing constraints on this mysterious substance that are 5 times tighter than the previous best









Snow and rising sea levels may have triggered Japan's earthquake swarm

In an ongoing swarm of earthquakes that began hitting Japan in 2020, the shifting weight of surface water may have spurred the shaking





Why falling birth rates will be a bigger problem than overpopulation

Birth rates are projected to have fallen below the replacement level of 2.1 births per woman in more than three quarters of countries by 2050





Pollen, Fruits, Veggies Help Trigger Oral Allergy Syndrome

Title: Pollen, Fruits, Veggies Help Trigger Oral Allergy Syndrome
Category: Health News
Created: 8/30/2007 12:00:00 AM
Last Editorial Review: 8/30/2007 12:00:00 AM





Health Tip: Triggers for Hiatal Hernia

Title: Health Tip: Triggers for Hiatal Hernia
Category: Health News
Created: 8/26/2013 7:35:00 AM
Last Editorial Review: 8/26/2013 12:00:00 AM





Vaping May Trigger Lung Damage Like That Seen in Emphysema

Title: Vaping May Trigger Lung Damage Like That Seen in Emphysema
Category: Health News
Created: 8/26/2019 12:00:00 AM
Last Editorial Review: 8/27/2019 12:00:00 AM





Just Like COVID, Severe Flu Can Trigger Heart Crises

Title: Just Like COVID, Severe Flu Can Trigger Heart Crises
Category: Health News
Created: 8/25/2020 12:00:00 AM
Last Editorial Review: 8/25/2020 12:00:00 AM





Changes to CDC's COVID-19 Testing Guidelines Trigger Concern

Title: Changes to CDC's COVID-19 Testing Guidelines Trigger Concern
Category: Health News
Created: 8/26/2020 12:00:00 AM
Last Editorial Review: 8/27/2020 12:00:00 AM





RPG Cast – Episode 705: “Trip the Gold Digger”

Kelley's cat decided Christmas was over. Chris is scared to return to Vana'diel and hates Norrath, but is strangely interested in hanging out with river hobbits in Middle Earth. Meanwhile, Phil tries to recruit everyone into his multilevel Monster Hunter scheme.

The post RPG Cast – Episode 705: “Trip the Gold Digger” appeared first on RPGamer.



  • News
  • Podcasts
  • RPG Cast
  • Lord of the Rings
  • Monster Hunter: World
  • Path of Exile
  • Super Mario RPG Remake
  • World of Warcraft


RPG Cast – Episode 728: “Cats With Jiggle Physics”

Kelley makes a Morgen Wonk. Chris loves the smell of Gundam in the morning. Phil plays some Mathfinder while listening to his favorite new band, Creepy Lego Waifus. The cake has arrived.

The post RPG Cast – Episode 728: “Cats With Jiggle Physics” appeared first on RPGamer.



  • News
  • Podcasts
  • RPG Cast
  • Live A Live
  • SKALD: Against the Black Priory
  • World of Warcraft: Mists of Pandaria


Why the T in ChatGPT is AI's biggest breakthrough - and greatest risk

AI companies hope that feeding ever more data to their models will continue to boost performance, eventually leading to human-level intelligence. Behind this hope is the "transformer", a key breakthrough in AI, but what happens if it fails to deliver?





AIs get worse at answering simple questions as they get bigger

Using more training data and computational power is meant to make AIs more reliable, but tests suggest large language models actually get less reliable as they grow







Exclusive: Liz Truss urged to act with Britain facing biggest loss of sports facilities in a generation





Andrew Ng: Unbiggen AI



Andrew Ng has serious street cred in artificial intelligence. He pioneered the use of graphics processing units (GPUs) to train deep learning models in the late 2000s with his students at Stanford University, cofounded Google Brain in 2011, and then served for three years as chief scientist for Baidu, where he helped build the Chinese tech giant’s AI group. So when he says he has identified the next big shift in artificial intelligence, people listen. And that’s what he told IEEE Spectrum in an exclusive Q&A.


Ng’s current efforts are focused on his company Landing AI, which built a platform called LandingLens to help manufacturers improve visual inspection with computer vision. He has also become something of an evangelist for what he calls the data-centric AI movement, which he says can yield “small data” solutions to big issues in AI, including model efficiency, accuracy, and bias.


The great advances in deep learning over the past decade or so have been powered by ever-bigger models crunching ever-bigger amounts of data. Some people argue that that’s an unsustainable trajectory. Do you agree that it can’t go on that way?

Andrew Ng: This is a big question. We’ve seen foundation models in NLP [natural language processing]. I’m excited about NLP models getting even bigger, and also about the potential of building foundation models in computer vision. I think there’s lots of signal to still be exploited in video: We have not been able to build foundation models yet for video because of compute bandwidth and the cost of processing video, as opposed to tokenized text. So I think that this engine of scaling up deep learning algorithms, which has been running for something like 15 years now, still has steam in it. Having said that, it only applies to certain problems, and there’s a set of other problems that need small data solutions.

When you say you want a foundation model for computer vision, what do you mean by that?

Ng: This is a term coined by Percy Liang and some of my friends at Stanford to refer to very large models, trained on very large data sets, that can be tuned for specific applications. For example, GPT-3 is an example of a foundation model [for NLP]. Foundation models offer a lot of promise as a new paradigm in developing machine learning applications, but also challenges in terms of making sure that they’re reasonably fair and free from bias, especially if many of us will be building on top of them.

What needs to happen for someone to build a foundation model for video?

Ng: I think there is a scalability problem. The compute power needed to process the large volume of images for video is significant, and I think that’s why foundation models have arisen first in NLP. Many researchers are working on this, and I think we’re seeing early signs of such models being developed in computer vision. But I’m confident that if a semiconductor maker gave us 10 times more processor power, we could easily find 10 times more video to build such models for vision.

Having said that, a lot of what’s happened over the past decade is that deep learning has happened in consumer-facing companies that have large user bases, sometimes billions of users, and therefore very large data sets. While that paradigm of machine learning has driven a lot of economic value in consumer software, I find that that recipe of scale doesn’t work for other industries.


It’s funny to hear you say that, because your early work was at a consumer-facing company with millions of users.

Ng: Over a decade ago, when I proposed starting the Google Brain project to use Google’s compute infrastructure to build very large neural networks, it was a controversial step. One very senior person pulled me aside and warned me that starting Google Brain would be bad for my career. I think he felt that the action couldn’t just be in scaling up, and that I should instead focus on architecture innovation.

“In many industries where giant data sets simply don’t exist, I think the focus has to shift from big data to good data. Having 50 thoughtfully engineered examples can be sufficient to explain to the neural network what you want it to learn.”
—Andrew Ng, CEO & Founder, Landing AI

I remember when my students and I published the first NeurIPS workshop paper advocating using CUDA, a platform for processing on GPUs, for deep learning—a different senior person in AI sat me down and said, “CUDA is really complicated to program. As a programming paradigm, this seems like too much work.” I did manage to convince him; the other person I did not convince.

I expect they’re both convinced now.

Ng: I think so, yes.

Over the past year as I’ve been speaking to people about the data-centric AI movement, I’ve been getting flashbacks to when I was speaking to people about deep learning and scalability 10 or 15 years ago. In the past year, I’ve been getting the same mix of “there’s nothing new here” and “this seems like the wrong direction.”


How do you define data-centric AI, and why do you consider it a movement?

Ng: Data-centric AI is the discipline of systematically engineering the data needed to successfully build an AI system. For an AI system, you have to implement some algorithm, say a neural network, in code and then train it on your data set. The dominant paradigm over the last decade was to download the data set while you focus on improving the code. Thanks to that paradigm, over the last decade deep learning networks have improved significantly, to the point where for a lot of applications the code—the neural network architecture—is basically a solved problem. So for many practical applications, it’s now more productive to hold the neural network architecture fixed, and instead find ways to improve the data.

When I started speaking about this, there were many practitioners who, completely appropriately, raised their hands and said, “Yes, we’ve been doing this for 20 years.” This is the time to take the things that some individuals have been doing intuitively and make it a systematic engineering discipline.

The data-centric AI movement is much bigger than one company or group of researchers. My collaborators and I organized a data-centric AI workshop at NeurIPS, and I was really delighted at the number of authors and presenters that showed up.

You often talk about companies or institutions that have only a small amount of data to work with. How can data-centric AI help them?

Ng: You hear a lot about vision systems built with millions of images—I once built a face recognition system using 350 million images. Architectures built for hundreds of millions of images don’t work with only 50 images. But it turns out, if you have 50 really good examples, you can build something valuable, like a defect-inspection system. In many industries where giant data sets simply don’t exist, I think the focus has to shift from big data to good data. Having 50 thoughtfully engineered examples can be sufficient to explain to the neural network what you want it to learn.

When you talk about training a model with just 50 images, does that really mean you’re taking an existing model that was trained on a very large data set and fine-tuning it? Or do you mean a brand new model that’s designed to learn only from that small data set?

Ng: Let me describe what Landing AI does. When doing visual inspection for manufacturers, we often use our own flavor of RetinaNet. It is a pretrained model. Having said that, the pretraining is a small piece of the puzzle. What’s a bigger piece of the puzzle is providing tools that enable the manufacturer to pick the right set of images [to use for fine-tuning] and label them in a consistent way. There’s a very practical problem we’ve seen spanning vision, NLP, and speech, where even human annotators don’t agree on the appropriate label. For big data applications, the common response has been: If the data is noisy, let’s just get a lot of data and the algorithm will average over it. But if you can develop tools that flag where the data’s inconsistent and give you a very targeted way to improve the consistency of the data, that turns out to be a more efficient way to get a high-performing system.

“Collecting more data often helps, but if you try to collect more data for everything, that can be a very expensive activity.”
—Andrew Ng

For example, if you have 10,000 images where 30 images are of one class, and those 30 images are labeled inconsistently, one of the things we do is build tools to draw your attention to the subset of data that’s inconsistent. So you can very quickly relabel those images to be more consistent, and this leads to improvement in performance.
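The kind of tooling Ng describes can be sketched in a few lines. This is a minimal illustration, not Landing AI's actual implementation: assuming a hypothetical setup where each image carries labels from several annotators, it surfaces the items where the annotators disagree, most contested first:

```python
from collections import Counter

def flag_inconsistent(labels_by_image):
    """Return ids of images whose annotator labels disagree,
    sorted so the most contested items come first."""
    flagged = []
    for image_id, labels in labels_by_image.items():
        counts = Counter(labels)
        top_count = counts.most_common(1)[0][1]
        agreement = top_count / len(labels)  # fraction backing the majority label
        if agreement < 1.0:  # any disagreement at all
            flagged.append((agreement, image_id))
    return [image_id for _, image_id in sorted(flagged)]

# Hypothetical annotations: three annotators per image.
annotations = {
    "img_001": ["scratch", "scratch", "scratch"],  # consistent
    "img_002": ["scratch", "dent", "scratch"],     # contested
    "img_003": ["dent", "pit", "scratch"],         # highly contested
}

print(flag_inconsistent(annotations))  # ['img_003', 'img_002']
```

A reviewer would then relabel just these flagged images, the targeted relabeling step Ng describes.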

Could this focus on high-quality data help with bias in data sets? If you’re able to curate the data more before training?

Ng: Very much so. Many researchers have pointed out that biased data is one factor among many leading to biased systems. There have been many thoughtful efforts to engineer the data. At the NeurIPS workshop, Olga Russakovsky gave a really nice talk on this. At the main NeurIPS conference, I also really enjoyed Mary Gray’s presentation, which touched on how data-centric AI is one piece of the solution, but not the entire solution. New tools like Datasheets for Datasets also seem like an important piece of the puzzle.

One of the powerful tools that data-centric AI gives us is the ability to engineer a subset of the data. Imagine training a machine-learning system and finding that its performance is okay for most of the data set, but its performance is biased for just a subset of the data. If you try to change the whole neural network architecture to improve the performance on just that subset, it’s quite difficult. But if you can engineer a subset of the data you can address the problem in a much more targeted way.

When you talk about engineering the data, what do you mean exactly?

Ng: In AI, data cleaning is important, but the way the data has been cleaned has often been in very manual ways. In computer vision, someone may visualize images through a Jupyter notebook and maybe spot the problem, and maybe fix it. But I’m excited about tools that allow you to have a very large data set, tools that draw your attention quickly and efficiently to the subset of data where, say, the labels are noisy. Or to quickly bring your attention to the one class among 100 classes where it would benefit you to collect more data. Collecting more data often helps, but if you try to collect more data for everything, that can be a very expensive activity.

For example, I once figured out that a speech-recognition system was performing poorly when there was car noise in the background. Knowing that allowed me to collect more data with car noise in the background, rather than trying to collect more data for everything, which would have been expensive and slow.
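Ng's car-noise anecdote is error analysis by slice: break evaluation results down by a metadata tag, find the slice that underperforms, and collect data only for it. A minimal sketch, where the `noise` tag and the evaluation records are made up for illustration:

```python
from collections import defaultdict

def error_rate_by_slice(results, tag):
    """Group evaluation records by a metadata tag and
    return each slice's error rate, worst slice first."""
    totals, errors = defaultdict(int), defaultdict(int)
    for record in results:
        key = record[tag]
        totals[key] += 1
        errors[key] += 0 if record["correct"] else 1
    rates = {k: errors[k] / totals[k] for k in totals}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical speech-recognition eval records.
results = [
    {"noise": "quiet", "correct": True},
    {"noise": "quiet", "correct": True},
    {"noise": "quiet", "correct": False},
    {"noise": "car",   "correct": False},
    {"noise": "car",   "correct": False},
    {"noise": "car",   "correct": True},
]

print(error_rate_by_slice(results, "noise"))  # 'car' slice ranks worst
```

The worst-ranked slice tells you where extra data collection will pay off, instead of collecting more of everything.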


What about using synthetic data, is that often a good solution?

Ng: I think synthetic data is an important tool in the tool chest of data-centric AI. At the NeurIPS workshop, Anima Anandkumar gave a great talk that touched on synthetic data. I think there are important uses of synthetic data that go beyond just being a preprocessing step for increasing the data set for a learning algorithm. I’d love to see more tools to let developers use synthetic data generation as part of the closed loop of iterative machine learning development.

Do you mean that synthetic data would allow you to try the model on more data sets?

Ng: Not really. Here’s an example. Let’s say you’re trying to detect defects in a smartphone casing. There are many different types of defects on smartphones. It could be a scratch, a dent, pit marks, discoloration of the material, other types of blemishes. If you train the model and then find through error analysis that it’s doing well overall but it’s performing poorly on pit marks, then synthetic data generation allows you to address the problem in a more targeted way. You could generate more data just for the pit-mark category.

“In the consumer software Internet, we could train a handful of machine-learning models to serve a billion users. In manufacturing, you might have 10,000 manufacturers building 10,000 custom AI models.”
—Andrew Ng

Synthetic data generation is a very powerful tool, but there are many simpler tools that I will often try first. Such as data augmentation, improving labeling consistency, or just asking a factory to collect more data.
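Whether via synthetic generation or plain augmentation, the "targeted" part of these fixes can be sketched as adding examples only for the class that error analysis flagged. The dataset and the transform below are hypothetical stand-ins for real images and a real augmentation (or generation) routine:

```python
def augment_class(dataset, weak_class, factor, transform):
    """Append `factor` transformed copies of every example in the
    weak class; the rest of the dataset is left untouched."""
    extra = [
        (transform(x), y)
        for x, y in dataset
        if y == weak_class
        for _ in range(factor)
    ]
    return dataset + extra

# Hypothetical defect dataset of (image, label) pairs.
data = [("img_a", "scratch"), ("img_b", "pit_mark"), ("img_c", "scratch")]
flip = lambda img: img + "_flipped"  # stand-in for a real transform

augmented = augment_class(data, "pit_mark", factor=3, transform=flip)
print(len(augmented))  # 6: three originals plus three new pit-mark copies
```

The same shape of loop applies if `transform` is replaced by a synthetic-data generator for the weak category.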


To make these issues more concrete, can you walk me through an example? When a company approaches Landing AI and says it has a problem with visual inspection, how do you onboard them and work toward deployment?

Ng: When a customer approaches us we usually have a conversation about their inspection problem and look at a few images to verify that the problem is feasible with computer vision. Assuming it is, we ask them to upload the data to the LandingLens platform. We often advise them on the methodology of data-centric AI and help them label the data.

One of the foci of Landing AI is to empower manufacturing companies to do the machine learning work themselves. A lot of our work is making sure the software is fast and easy to use. Through the iterative process of machine learning development, we advise customers on things like how to train models on the platform, when and how to improve the labeling of data so the performance of the model improves. Our training and software supports them all the way through deploying the trained model to an edge device in the factory.

How do you deal with changing needs? If products change or lighting conditions change in the factory, can the model keep up?

Ng: It varies by manufacturer. There is data drift in many contexts. But there are some manufacturers that have been running the same manufacturing line for 20 years now with few changes, so they don’t expect changes in the next five years. Those stable environments make things easier. For other manufacturers, we provide tools to flag when there’s a significant data-drift issue. I find it really important to empower manufacturing customers to correct data, retrain, and update the model. Because if something changes and it’s 3 a.m. in the United States, I want them to be able to adapt their learning algorithm right away to maintain operations.

In the consumer software Internet, we could train a handful of machine-learning models to serve a billion users. In manufacturing, you might have 10,000 manufacturers building 10,000 custom AI models. The challenge is, how do you do that without Landing AI having to hire 10,000 machine learning specialists?

So you’re saying that to make it scale, you have to empower customers to do a lot of the training and other work.

Ng: Yes, exactly! This is an industry-wide problem in AI, not just in manufacturing. Look at health care. Every hospital has its own slightly different format for electronic health records. How can every hospital train its own custom AI model? Expecting every hospital’s IT personnel to invent new neural-network architectures is unrealistic. The only way out of this dilemma is to build tools that empower the customers to build their own models by giving them tools to engineer the data and express their domain knowledge. That’s what Landing AI is executing in computer vision, and the field of AI needs other teams to execute this in other domains.

Is there anything else you think it’s important for people to understand about the work you’re doing or the data-centric AI movement?

Ng: In the last decade, the biggest shift in AI was a shift to deep learning. I think it’s quite possible that in this decade the biggest shift will be to data-centric AI. With the maturity of today’s neural network architectures, I think for a lot of the practical applications the bottleneck will be whether we can efficiently get the data we need to develop systems that work well. The data-centric AI movement has tremendous energy and momentum across the whole community. I hope more researchers and developers will jump in and work on it.


This article appears in the April 2022 print issue as “Andrew Ng, AI Minimalist.”






The PS5 Pro’s biggest problem is that the PS5 is already very good

For $700, I was hoping for a much larger leap in visual impact.





Martin Garrix set to perform in ‘world’s biggest Holi celebration’ in India

Tickets for the event will go on sale on November 10, 2024, via BookMyShow





Tesla posts bigger-than-expected loss, bigger-than-expected revenue [Updated]

Company expects to be cash flow positive in the next two quarters.





Why thousands gathering around rancid 'dead whale' by world's biggest lake...








'YELLOWSTONE' First Episode Without Costner Scores Biggest Premiere Night Audience...








What Are the Biggest Lakes in the U.S.?

The United States is home to some truly spectacular lakes. Whether considering the massive Great Lakes themselves or deep alpine gems like Lake Tahoe, with its crystal-clear waters, America is well-stocked with many sizable bodies of water.





Pakistan's record smog triggers anguish and anxiety

Lahore, Pakistan (AFP) Nov 13, 2024
On the streets of Pakistan's second-biggest city, smog stings eyes and burns throats. Inside homes, few people can afford air purifiers to limit the damage of toxic particles that seep through doors and windows. Lahore, a city of 14 million people stuffed with factories on the border with India, regularly ranks among the world's most polluted cities, but it has hit record levels this month.





Apple’s biggest product since the iPhone

Apple could be set to make its biggest new product announcement since the iPhone, with the company believed to be working on a game changer.





AIOCD urges DCGI to immediately stop partnership between Swiggy & PharmEasy for rapid drug delivery

Raising deep concern over the partnership between Swiggy Instamart and PharmEasy for a rapid drug delivery model, the All India Organization of Chemists and Druggists (AIOCD) has apprised the Drug Controller





Actor/Comedian Rob Riggle Joins Easter Seals Dixon Center to Reinforce Value of Employing Veterans - What to Wear

Rob Riggle Stars alongside Brice Williams in the newest Easter Seals Dixon Center PSA, directed by Jim Fabio with support from Judd Apatow





Holiday Inn Express® Brand Introduces First All-Breakfast Emoji Keyboard - Rob Riggle Announces BREAKFA-mojis for Days

Actor/comedian Rob Riggle announces his next breakfast move as the Holiday Inn Express® brand's Creative Director: BREAKFA-mojis!





Grime To Shine Power Tour Lets Customers Demo Pressure Washer Cleaning Systems At Local Lowe's Stores, Sponsored By Briggs & Stratton - Briggs POWERflow+ Pressure Washer

Power washing is made even easier with POWERflow+ Technology by Briggs & Stratton. This pressure washer allows you to do deep cleaning, remove mold and mildew and reach second stories.





Holiday Inn Express® Brand Reunites With Actor/Comedian Rob Riggle For Latest Stay Smart® Campaign - Coffee Tasting Commercial

With a Keurig® in every room at Holiday Inn Express® hotels, smart travelers, like actor/comedian Rob Riggle, can get a great cup of coffee with just the push of a button.