What impact do generative AI tools have on the environment?
While questions about the dangers of AI, including misinformation and the threat of job displacement, still dominate the discussion, a Boston University professor is sounding the alarm about another possible side effect: that generative AI tools could have a considerable environmental impact.
"As an AI researcher, I often worry about the energy cost of building AI models," Kate Saenko, an associate professor of computer science at Boston University, wrote in an article for The Conversation. "The more powerful the AI, the more energy it takes."
While the energy consumption of blockchains like Bitcoin and Ethereum has become the focus of research and debate from Twitter to the halls of Congress, the impact of rapid advances in artificial intelligence on the planet has yet to receive the same attention.
Professor Saenko aims to change that. In her article, she acknowledges that data on the carbon footprint of a single generative AI query is limited, but notes that some estimates put it at four to five times the energy of a simple search engine query.
Citing a 2019 report, Professor Saenko noted that training BERT (Bidirectional Encoder Representations from Transformers), an AI language model with 110 million parameters, on graphics processing units (GPUs) consumed roughly as much energy as one person taking a round-trip transcontinental flight.
In AI models, parameters are variables learned from data that guide the model's predictions. More parameters usually mean a more complex model, which in turn requires more data and computing resources to train. During training, the parameters are tuned to minimize the model's error.
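That tuning loop is the heart of why training cost scales with model size. Here is a minimal sketch of the idea in Python, fitting a two-parameter linear model by gradient descent; models like BERT or GPT-3 apply the same kind of update to millions or billions of parameters at every step, which is where the energy goes.

```python
import numpy as np

# Toy training data: y = 2x + 1 plus a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=100)

# A two-parameter model: y_hat = w * x + b.
w, b = 0.0, 0.0
lr = 0.1  # learning rate

for step in range(500):
    y_hat = w * x + b
    err = y_hat - y
    loss = np.mean(err ** 2)        # mean squared error
    grad_w = 2 * np.mean(err * x)   # dLoss/dw
    grad_b = 2 * np.mean(err)       # dLoss/db
    w -= lr * grad_w                # tune each parameter to reduce error
    b -= lr * grad_b

print(f"w={w:.3f}, b={b:.3f}, loss={loss:.5f}")
```

Every training step touches every parameter, so compute (and energy) grows with both parameter count and the amount of training data.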
By comparison, Professor Saenko noted that OpenAI's GPT-3 model, with 175 billion parameters, consumed about 1,287 megawatt-hours of electricity during training and produced 552 tons of CO2, roughly what 123 gasoline-powered passenger cars emit in a year. She added that these figures cover training alone, before a single consumer has started using the model.
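Those three numbers are mutually consistent, which makes for a useful sanity check. A quick back-of-the-envelope verification, assuming the EPA's commonly cited figure of about 4.6 metric tons of CO2 per passenger car per year (an assumption, not a number from the article):

```python
# Back-of-the-envelope check of the GPT-3 training figures quoted above.
training_mwh = 1287        # megawatt-hours consumed during training
training_co2_tons = 552    # metric tons of CO2 emitted
co2_per_car_year = 4.6     # assumed tons CO2 per car per year (EPA estimate)

car_years = training_co2_tons / co2_per_car_year
grid_intensity = training_co2_tons / training_mwh  # tons CO2 per MWh

print(f"Equivalent car-years: {car_years:.0f}")                  # ~120, close to the quoted 123
print(f"Implied grid intensity: {grid_intensity:.2f} t CO2/MWh") # ~0.43, a plausible grid average
```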
"If chatbots become as popular as search engines, the energy costs of deploying these AIs could be very high," said Professor Saenko, citing Microsoft's addition of ChatGPT to its Bing web browser earlier this month, as an example.
Complicating matters further, more and more AI chatbots, such as Perplexity AI and OpenAI's popular ChatGPT, are releasing mobile apps, making them easier to use and exposing them to a wider user base.
Professor Saenko pointed to a study by Google which found that combining more efficient model architectures and processors with greener data centers can significantly reduce the carbon footprint of training.
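Those three levers multiply rather than add, which is why the combined savings can be large. A toy illustration of that multiplication; all of the numbers below are invented for illustration, not taken from the Google study:

```python
# Illustrative multiplication of the savings levers: a leaner architecture
# (fewer FLOPs), a more efficient processor (fewer joules per FLOP), and a
# cleaner data center (less CO2 per kWh). All values are hypothetical.
baseline = dict(flops=3.0e23, joules_per_flop=1.0e-10, gco2_per_kwh=450)
improved = dict(flops=1.0e23, joules_per_flop=2.5e-11, gco2_per_kwh=100)

def co2_tons(cfg):
    kwh = cfg["flops"] * cfg["joules_per_flop"] / 3.6e6  # joules -> kWh
    return kwh * cfg["gco2_per_kwh"] / 1e6               # grams -> metric tons

for name, cfg in [("baseline", baseline), ("improved", improved)]:
    print(f"{name}: {co2_tons(cfg):.0f} t CO2")
# The improved setup comes out roughly 50x lower, from three modest
# per-lever gains multiplied together.
```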
"A single large AI model won't devastate the environment, but if thousands of companies develop slightly different AI robots for different purposes, and each robot is used by millions of customers, the energy consumption could be a problem."
In the end, Saenko concluded that more research is needed to make generative AI more efficient, but she is optimistic.
"The good news is that AI can run on renewable energy," she wrote. "By placing computations where, or at times when, renewable energy is more plentiful, it is much easier to do it than to use electricity that is primarily fueled by fossil fuels." This could reduce emissions by a factor of 30 to 40 compared to the dominant grid.”