
Data Center and Climate Change Tango: How do they impact each other?


On a hot July day, Google and Oracle data centers in the U.K. struggled to keep up as Europe experienced record-breaking high temperatures. Several cloud services suffered outages, including networking, storage, and compute resources powered by servers in the south of England, where temperatures were among the warmest on record. According to the Met Office, the UK had its warmest day ever, with 104°F shattering a 2019 record. This is a preview of the impact of climate change on data centers.

Have Heatwaves Been Shutting Data Centers Down?

Both companies attributed the outages to problems with their cooling systems. Oracle's initial statement, posted on its service website in the late morning, New York time, noted that "unseasonal temperatures" had impacted cloud and networking equipment at its South London facility. Google reported a similar issue at its London location that afternoon: one of its buildings, which houses zone europe-west2-a for region europe-west2, experienced a cooling-related failure. According to the statement, the company curtailed preemptive Google Compute Engine (GCE) deployments and shut down a portion of the zone to avoid machine damage and an extended outage. Google Cloud later updated the incident to say the cooling problem had been fixed, though a small number of HDD-backed Persistent Disk volumes were still affected and would return IO errors; customers still experiencing problems were advised to contact Google Cloud Product Support. One of the key victims of the outage was web-hosting service WordPress, which lost service in Europe.

Regional storage services that replicate customer data across multiple zones, such as GCS and BigQuery, could not reach any replica for several storage objects because of a regional traffic-routing change, leaving customers unable to access those objects while the routing error persisted. It took 18 hours and 23 minutes for the affected cloud services to recover completely.
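As a rough sketch of the redundancy model involved (illustrative only, not Google's actual implementation), a regional read path tries each zone's replica in turn. The incident shows the limit of that design: when a routing error keeps traffic from reaching any healthy replica, redundancy alone does not help. The fetch_from_zone callback here is hypothetical:

```python
# Illustrative sketch only, not Google's implementation: a regional
# read path that falls back across zone replicas.
ZONES = ["europe-west2-a", "europe-west2-b", "europe-west2-c"]

def read_object(key, fetch_from_zone):
    for zone in ZONES:
        try:
            return fetch_from_zone(zone, key)  # may raise IOError
        except IOError:
            continue  # replica unreachable; try the next zone
    # A routing fault that blocks every zone defeats the fallback.
    raise IOError(f"no replica of {key!r} reachable in any zone")
```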

Google announced it would fix the problem and thoroughly re-test its failover mechanisms as a result of the incident. The cloud giant said it would research and develop more advanced methods for progressively shedding heat load within a single data center space, lowering the likelihood that a complete shutdown would be necessary. Google also reportedly plans to examine its processes, tools, and automatic recovery systems for vulnerabilities, and to audit cooling system standards and equipment across all data centers that host Google Cloud globally.

Amid the extreme heat in London and the area within the M25, where unusually high temperatures were recorded, staff were dispatched to spray air conditioning units with water. That water use comes on top of the already high consumption of data centers, which are most likely to need emergency hosing precisely during scorching periods.

To keep their facilities cool, data center operators are looking for more energy-efficient IT equipment, as severe temperatures are expected to become more common in the future.

Read More: Fujitsu and MIT Center for Brains, Minds, and Machines Build AI model to Detect OOD data

To function properly, data centers must be maintained within a specified temperature range. According to US standards body the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), enterprise servers and storage devices should run between 18°C and 27°C. Because this equipment generates heat, data centers must be cooled, usually by chilling the air, and rising outside temperatures put stress on that cooling equipment. This can lead to higher electricity bills or, in rare circumstances, system outages.
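As a rough illustration of what that operating envelope means in practice, here is a minimal Python sketch that flags inlet temperatures outside ASHRAE's recommended range; the sensor names and readings are hypothetical:

```python
# Minimal sketch: flag inlet temperatures outside ASHRAE's recommended
# 18-27 C envelope for enterprise servers and storage.
ASHRAE_MIN_C, ASHRAE_MAX_C = 18.0, 27.0

def check_inlet(sensor_name, temp_c):
    if not ASHRAE_MIN_C <= temp_c <= ASHRAE_MAX_C:
        print(f"ALERT {sensor_name}: {temp_c:.1f} C is outside the "
              f"recommended {ASHRAE_MIN_C}-{ASHRAE_MAX_C} C range")

# Hypothetical sensor readings.
for name, temp in [("rack-01-inlet", 24.5), ("rack-02-inlet", 31.2)]:
    check_inlet(name, temp)
```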

Older data centers may suffer during a UK heatwave, resulting in higher energy expenses. By contrast, newly constructed data centers can withstand temperatures between -40°C and 40°C, thanks to operators who designed their facilities to be as energy efficient as possible.

However, even brand-new data centers may see a loss of efficiency if temperatures rise beyond 40°C, increasing cooling expenses. Microsoft is therefore implementing novel strategies to improve water cooling and temperature control in its data centers. In Ireland, several teams are seeking to harness data center heat and develop more energy-efficient cooling technologies. To find innovative ways to recycle waste data center heat for essential public infrastructure, Dublin energy agency Codema teamed up with data center provider Equinix earlier this year.

The Cooling Concerns, the Power Hunger, the Carbon Emissions

A data center must either be kept in a constantly maintained, temperature-controlled environment or be built in a location with a naturally cold climate to stay operational. More cooling systems mean more energy demand, which means more fossil fuel burning. That in turn exacerbates global warming and produces more heatwaves, pushing companies to invest in and switch on even more powerful cooling systems, and so on.

According to studies from the International Energy Agency, the foundations of contemporary cloud hosting now use close to 1% of the world's electricity supply and contribute 0.3% of all global CO2 emissions. Projections made by the Natural Resources Defense Council in 2015 estimated that by 2020 American data centers alone would use about 140 billion kWh annually and produce a staggering 100 million metric tons of carbon dioxide. The global total is certainly far larger, with an estimated more than 5 million data centers worldwide. Studies show that cooling IT equipment accounts for about 40% of the energy data centers consume.
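A quick back-of-the-envelope calculation puts those figures in perspective. The snippet below derives the carbon intensity implied by the NRDC projection and the electricity attributable to cooling at the roughly 40% share; it is illustrative arithmetic only:

```python
# Back-of-the-envelope arithmetic on the figures cited above.
energy_twh = 140       # projected annual US data center electricity use
co2_mt = 100           # projected annual CO2, million metric tons
cooling_share = 0.40   # rough share of energy spent on cooling

# 1 Mt = 1e9 kg; 1 TWh = 1e9 kWh.
kg_per_kwh = (co2_mt * 1e9) / (energy_twh * 1e9)
print(f"implied intensity: {kg_per_kwh:.2f} kg CO2 per kWh")   # ~0.71
print(f"cooling energy:    {energy_twh * cooling_share:.0f} TWh per year")
```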

Though migrating data centers to regions with cold climates sounds exciting, it is not quite feasible on a large scale. 

In addition to these woes, a 2016 study found that the world's data centers consumed 416.2 terawatt-hours of power that year, substantially more than the UK's total of around 300 terawatt-hours. Data centers now have roughly the same carbon footprint as the aviation sector, using 3% of the world's electricity supply and emitting around 2% of all greenhouse gases.

This has motivated companies to explore more eco-friendly substitutes. In an effort to replace the notoriously polluting Haber-Bosch process, Fujitsu is collaborating with the Icelandic start-up Atmonia to produce sustainable ammonia, which could be burnt to power data centers efficiently. Trinity College Dublin spin-off Nexalus is developing a way for power-hungry data centers to stay cool and generate electricity rather than only consume it. Microsoft claims to have developed a zero-carbon-emissions replacement for the diesel generators that back up data centers during power outages and other service interruptions. Unlike diesel generators, hydrogen fuel cells combine hydrogen and oxygen to produce power, heat, and water instead of pollution. However, cost and the scale required for data center use have so far prevented hydrogen fuel cells from becoming a production alternative.

Things started to change in July 2020, when Microsoft announced its intention to stop using diesel fuel by 2030. Microsoft has since been enthusiastically testing a three-megawatt hydrogen fuel cell system that can power around 10,000 servers in a data center. The generator uses proton exchange membrane (PEM) fuel cell technology, which combines hydrogen and oxygen in a chemical reaction to produce power, heat, and water without burning anything or emitting carbon dioxide.
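Those two figures imply a simple sanity check: 3 MW spread across roughly 10,000 servers works out to about 300 W per server, as the one-liner below confirms:

```python
# Sanity check on the figures above: 3 MW across ~10,000 servers.
system_watts = 3_000_000
servers = 10_000
print(f"{system_watts / servers:.0f} W per server")  # -> 300 W
```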

This kind of stationary backup power could be used in a variety of settings, including data centers, office buildings, and hospitals, once green hydrogen becomes economically feasible and affordable. Because PEM fuel cells can follow a load up and down and, like diesel engines, are quick to turn on and off, they are widely employed in the automotive sector.

According to Mark Monroe, a lead infrastructure engineer on Microsoft's team for data center advanced development, those quick-response and load-following capabilities are ideally suited to backup power at data centers.

Will these developments be enough?

Resilience issues are becoming increasingly pressing as climate change intensifies and catastrophic events such as hurricanes, floods, record temperatures, and water shortages become more frequent. Outages are already getting longer and costing more because of adverse weather and the broader effects of climate change. These infrastructures are endangered on many fronts, including electricity supply, networking infrastructure, and internal resilience, as the climate worsens and extreme weather events grow more frequent. The UK's prolonged heatwave, which knocked data centers offline, is only the tip of the climate change iceberg.

Rising sea levels will also hit networking infrastructure and data centers, and coastal areas that house data centers will be adversely affected. Even climate change-induced drought can disrupt data center operations, as well as fuel highly destructive wildfires that directly endanger installations.

Additionally, there is a need to modernize the grid, develop autonomous microgrids, and research strategies for upgrading networks for greater resilience. Most Internet cabling is buried less than a meter below ground, posing a serious risk during storms with high winds and flooding.

Time for Action

As a result of the fourth industrial revolution's fast-rising demand for data, the data center business is steadily growing in size. However, it is also struggling with rising management costs and power usage. Though many operators report power use, the electricity and water required to keep data centers cool are frequently not documented or accounted for, according to the Uptime Institute's 2021 Global Data Center Survey.

Therefore, governments will need to set tighter carbon reduction targets and transparent reporting requirements, even as consumer action and awareness remain crucial for putting pressure on businesses. With its ambitious Green Deal, the European Union has taken the lead in this field, calling for data centers to become carbon neutral by 2030.

To guarantee sustainability, data center operators and designers must take climate change and its effects into account from the beginning of the asset construction process, as building a data center is a long-term investment.


NVIDIA launches GauGAN360 to turn sketches into 360° HDR environments


Nvidia has launched GauGAN360, a new experimental online art tool that turns rough sketches into 360° HDR environments for use in 3D scenes.

The tool is based on the same technology as Nvidia's original GauGAN AI painting app. It lets users paint the overall form of a landscape and have GauGAN360 generate a matching equirectangular image or cube map, turning quick doodles into synthetic 360° environments for use in 3D scenes.

Nvidia pitches GauGAN360 as the next evolution of its popular AI painting app, GauGAN. Launched in 2019, GauGAN uses an artificial intelligence model based on generative adversarial networks to generate photorealistic images matching users' rough sketches.

Read More: OpenAI Introduces A New-And-Improved Content Moderation Tool For API Developers

GauGAN360 does pretty much the same thing as GauGAN. However, instead of a conventional 2D image, the result is an equirectangular image or a cube map that can be used as an environment map inside 3D software.
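To see how the two output formats relate, here is a minimal NumPy sketch that samples one cube-map face from an equirectangular image. Face-orientation conventions vary between 3D engines, so the axis choices below are illustrative rather than any particular tool's convention:

```python
import numpy as np

def equirect_to_cubeface(equirect, face, size=256):
    """Sample one cube-map face from an equirectangular image (H x W x 3)."""
    a = np.linspace(-1.0, 1.0, size)
    u, v = np.meshgrid(a, -a)  # v grows upward; row 0 is the top

    # Direction vectors for each face of a unit cube (one convention).
    ones = np.ones_like(u)
    dirs = {
        "+x": (ones, v, -u), "-x": (-ones, v, u),
        "+y": (u, ones, -v), "-y": (u, -ones, v),
        "+z": (u, v, ones),  "-z": (-u, v, -ones),
    }
    x, y, z = dirs[face]

    # Directions -> spherical coordinates (longitude, latitude).
    lon = np.arctan2(x, z)                              # [-pi, pi]
    lat = np.arcsin(y / np.sqrt(x*x + y*y + z*z))       # [-pi/2, pi/2]

    # Spherical -> equirectangular pixels, nearest-neighbour sampling.
    h, w = equirect.shape[:2]
    px = ((lon / np.pi + 1.0) * 0.5 * (w - 1)).astype(int)
    py = ((0.5 - lat / np.pi) * (h - 1)).astype(int)
    return equirect[py.clip(0, h - 1), px.clip(0, w - 1)]

pano = np.random.rand(512, 1024, 3)          # stand-in panorama
front = equirect_to_cubeface(pano, "+z")     # one 256x256 face
```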

Workflow in GauGAN360 is similar to GauGAN: users paint the rough form of the image they want onto the online template, click a button, and the AI does the rest. Different brush colors represent different landscape elements, with 25 available, spanning ground materials like mud, sand, and land as well as sky and plants.

The controls are pretty basic: a choice of three brush shapes and a brush size slider, enough to generate images from very quick doodles. Users can also upload an actual image to use as a source, and GauGAN360 will generate a synthetic image with the same overall composition as the original.


OpenAI introduces a new-and-improved content moderation tool for API developers


OpenAI is introducing a more accurate Moderation endpoint, a new-and-improved update to its content moderation tool, to help API developers protect their applications. The company built the update to perform robustly across a wide range of applications, including social media, messaging systems, and AI chatbots.

The updated Moderation endpoint gives developers access, via a programming interface, to OpenAI's Generative Pre-trained Transformer-based classifiers, which detect unwanted content in applications.

Given an input text, the Moderation endpoint analyzes it for content to filter out, such as hate speech, sexual content, and abusive language. It blocks any content generated by OpenAI's API that goes against OpenAI's content policy, and it can also weed out and block harmful content written by humans.
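For developers, using the endpoint comes down to a single HTTPS call. The sketch below is a minimal illustration that assumes an API key in the OPENAI_API_KEY environment variable and follows the endpoint's documented request and response shape at the time of writing; treat it as a starting point rather than a definitive client:

```python
import os
import requests

# Minimal sketch of calling the Moderation endpoint over HTTPS.
resp = requests.post(
    "https://api.openai.com/v1/moderations",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={"input": "Sample text to screen before it reaches users."},
)
result = resp.json()["results"][0]
print("flagged:", result["flagged"])
print("categories:", [c for c, hit in result["categories"].items() if hit])
```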

Read More: UCLA: Robot dog with an AI brain taught itself to walk in just an hour

For instance, NGL, an anonymous messaging platform, utilizes OpenAI’s tool to filter out hateful language, bullying, racist remarks, etc. 

The company claims the update significantly reduces the chance of an AI model "saying" the wrong thing, making it suitable for more sensitive use cases like education. The Moderation endpoint is free for content generated on the OpenAI API; for non-API content, users have to pay a fee.

Developers can start using the tool by going through its documentation. OpenAI has also published a paper detailing the tool's training and performance analysis, along with an evaluation dataset, to encourage further research in AI-driven moderation.


Singapore-based cryptocurrency lender Hodlnaut suspends withdrawals


Singapore-based cryptocurrency borrower and lender Hodlnaut has suspended withdrawals, swaps, and deposits, the company said on Monday, in the latest sign of stress in the cryptocurrency industry.

Hodlnaut also said it would withdraw its application for a Monetary Authority of Singapore (MAS) license to provide digital payment token services; the crypto lender had received in-principle approval in March. A MAS spokesperson said it had canceled the approval following the request.

Hodlnaut attributed the move to current market conditions, saying it wants to focus on stabilizing its liquidity and preserving assets. It is the latest in a line of global crypto players to face financial crisis following a sharp sell-off in markets that began in May with the collapse of two paired tokens, Luna and TerraUSD.

Read More: Banks To Spend $31 Billion On AI To Reduce Frauds, Says IDC

US crypto lender Celsius and Singapore-based fund Three Arrows Capital both filed for bankruptcy last month; Hodlnaut was one of Celsius's institutional clients. Singapore, a major crypto and blockchain hub in Asia, has seen several crypto companies run into crisis in recent months.

Vauld, a Singapore-based crypto lending and trading platform, suspended withdrawals in July. Later that month, Zipmex, a Southeast Asia-focused crypto exchange, suspended withdrawals; it has recently resumed them for some products.

A MAS spokesperson said that digital payment token service providers licensed by MAS under the Payment Services Act are regulated for money laundering and terrorism financing risks as well as technology risks. They are not subject to risk-based capital or liquidity requirements, nor are they required to safeguard customer monies or digital tokens from insolvency risk.


28.8 crore personal records of EPS holders leaked


A Ukraine-based cybersecurity researcher and journalist, Bob Diachenko, has claimed that about 28.8 crore (288 million) personal records of holders of the Employees' Pension Scheme (EPS) under the Employees' Provident Fund Organisation (EPFO) were leaked online before being taken off the Internet. The records contained full names, bank account numbers, and nominee information.

The security researcher's claims about the exposed data were verified by the EPFO, the national cyber agency CERT-In, and the IT Ministry. Diachenko, cyber threat intelligence director at securitydiscovery.com, said his systems identified two separate IPs hosting Universal Account Number (UAN) data.

The Universal Account Number (UAN), allotted by the EPFO, is an integral part of the Indian government registry. Each provident fund account holder's record contained personal information such as marital status, gender, date of birth, and employment status.

Read More: Iran Initiated First Official Import Order Using Cryptocurrency

About 280 million records were exposed under one IP address, and the other IP address contained about 8.4 million publicly exposed records, Diachenko claimed.

Given the scale and evident sensitivity of the data, the researcher decided to tweet about it without giving any details about the source or associated information. Within 12 hours of the tweet, both IPs were taken down and are now unavailable; an IANS report confirmed that both have been removed from the public domain.


Iran initiated first official import order using cryptocurrency


Iran initiated its first official import order using cryptocurrency this week, the Tasnim news agency reported on Tuesday. The move could enable the Islamic Republic to evade US sanctions that have crippled its economy.

The $10 million order was a first step toward allowing the country to trade using digital assets that bypass the dollar-dominated global financial system, and to trade with other countries similarly limited by US sanctions, such as Russia. Which cryptocurrency was used in the transaction is not yet known.

The Ministry of Industry, Mine, and Trade said that by the end of September, cryptocurrencies and smart contracts would be widely used in foreign trade with target countries.

Read More: Biden Signs $52.7 Bn Bill To Boost US Chip Production And Compete With China

The United States has imposed a near-total economic embargo on Iran, including a ban on all imports and sanctions on the country's oil, banking, and shipping sectors. Tehran is one of the largest economies yet to embrace cryptocurrency, a technology established in 2008 as a payment tool that erodes governmental control over finance and economies.

Last year, a study found that 4.5% of all bitcoin mining was taking place in Iran, partly a result of the country's cheap electricity. Cryptocurrency mining could help Iran earn hundreds of millions of dollars to buy imports and lessen the impact of sanctions.


Biden signs $52.7 Bn bill to boost US chip production and compete with China


President Joe Biden has signed a bill providing $52.7 billion in subsidies for semiconductor production and research in the US. The bill also aims to make the United States more competitive with China's science and technology efforts.

The chief executives of Micron, Intel, Lockheed Martin, HP, and Advanced Micro Devices attended the signing. The governors of Pennsylvania and Illinois, the mayors of Detroit, Cleveland, and Salt Lake City, and several lawmakers were also present. 

The White House commented that the passage of the bill had already spurred new investments in chips. Qualcomm recently agreed to buy an additional $4.2 billion in semiconductor chips from GlobalFoundries' New York factory, bringing its total commitment to $7.4 billion in purchases through 2028.

Read More: Top Applications Of Quantum Computing

The White House also touted Micron's announcement of a $40 billion investment in memory chip manufacturing, which would boost US market share from 2% to 10%. The White House said the investment was planned with anticipated grants from the chips bill.

The bill also introduces a 25% investment tax credit for chip plants, estimated to be worth $24 billion, and authorizes $200 billion over ten years to boost US scientific research to compete with China. Congress still needs to pass appropriations legislation to fund those investments.

The legislation aims to address the persistent chip shortage that has affected everything from cars to weapons to washing machines. China lobbied against the semiconductor bill; the Chinese Embassy in Washington said the country firmly opposed it, calling it reminiscent of a Cold War mentality.


UC Berkeley: Robot dog with an AI brain taught itself to walk in just an hour


A robotic dog at the University of California, Berkeley, is a surprisingly quick learner: using an AI brain, it taught itself to walk in just an hour. The robot stands out from many others because it uses artificial intelligence to teach itself.

It begins on its back, waves its legs, and learns to flip over, stand up, and walk. It then takes as little as 10 minutes to learn new tricks, such as withstanding pushes from a roll of cardboard wielded by its handlers and recovering its footing.

It is not the first robot to use artificial intelligence to learn to walk, but what makes this one different is that it did not train by trial and error over many simulated iterations. The Berkeley bot, as it is being called, learned by itself in the real world.

Read More: Meta’s AI chatbot, BlenderBot 3, already making comments, starting with the CEO, Mark Zuckerberg


Researchers know that transferring algorithms trained in simulation to real-world scenarios is challenging: the small differences and details between simulations and the natural world can confuse robots. Technology has advanced with algorithms like Dactyl, trained in a simulation powered by 6,144 CPUs and V100 GPUs.

Even after these advancements, the problem more or less remains. The Berkeley team decided to attack it with an algorithm called Dreamer. Dreamer constructs a "world model" that can forecast the likelihood that a future action will achieve its objective, and its predictions get more precise with practice.

In other words, a world model can reduce the time it takes to train to the equivalent of a few harrowing hours in the actual world.
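The idea can be sketched in a few lines. Below is a toy, self-contained illustration of the world-model approach; the linear dynamics, the reward, and the brute-force action search are all invented for this example and are far simpler than Dreamer itself:

```python
import numpy as np

# Toy sketch of the world-model idea: fit a transition model on a
# little real experience, then "imagine" rollouts in the model to
# score actions cheaply instead of acting in the real world.
rng = np.random.default_rng(0)

def true_dynamics(s, a):          # the hidden real environment
    return 0.9 * s + 0.1 * a

# 1. Collect a small batch of real experience.
states = rng.normal(size=100)
actions = rng.normal(size=100)
next_states = true_dynamics(states, actions)

# 2. Fit a world model s' ~ w1*s + w2*a by least squares.
X = np.stack([states, actions], axis=1)
w, *_ = np.linalg.lstsq(X, next_states, rcond=None)

# 3. Plan by imagining rollouts inside the learned model.
def imagined_return(s0, action, horizon=5):
    s, total = s0, 0.0
    for _ in range(horizon):
        s = w[0] * s + w[1] * action   # model prediction, not reality
        total += -abs(s)               # reward: stay near zero
    return total

candidates = np.linspace(-1.0, 1.0, 21)
best = max(candidates, key=lambda a: imagined_return(1.0, a))
print("action chosen from imagined rollouts:", best)
```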


PaddlePaddle receives new updates to expand AI to industrial applications


PaddlePaddle, the deep learning platform created by Chinese AI juggernaut Baidu, recently received new updates to extend its AI to industrial applications. These updates come in addition to 10 large deep learning models catering to biology, natural language processing, and vision.

PaddlePaddle is the most widely used Chinese deep learning framework, with a base of more than 4.77 million developers and over 180,000 enterprises using its services. It has received some significant improvements from its parent company, including ERNIE 3.0 Zeus, ERNIE-GeoL, and HELIX GEM.

ERNIE 3.0 Zeus is an NLP model with 100 billion parameters; ERNIE-GeoL is a geography-and-language pre-trained model; and HELIX GEM is a compound representation learning model. PaddlePaddle has also created three new industry-specific models by tuning ERNIE 3.0 Titan for use cases like banking, aerospace, and electric power.

Read More: AI startup that improves public speaking by analyzing speech raises $6M

Deep learning frameworks bundle everything needed to create, train, and evaluate deep neural networks behind a high-level interface. Without these tools, deep learning systems would be challenging to build.
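As a flavor of that high-level interface, here is a minimal training loop written against PaddlePaddle's 2.x Python API; the toy model and random data are purely illustrative:

```python
import paddle
from paddle import nn

# Illustrative PaddlePaddle training loop: a toy regression model fit
# on random data, showing the create / train / evaluate cycle a
# framework provides out of the box.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = paddle.optimizer.Adam(learning_rate=1e-3,
                                  parameters=model.parameters())
loss_fn = nn.MSELoss()

x = paddle.randn([64, 10])   # random inputs
y = paddle.randn([64, 1])    # random targets

for step in range(200):
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    optimizer.clear_grad()

print("final loss:", float(loss))
```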

Baidu started working on such tools about a decade ago, creating PaddlePaddle based on a framework called Caffe, developed by a UC Berkeley student. PaddlePaddle supports both convolutional and recurrent neural networks, giving it an edge over competitors in the NLP space.


Rank One Computing aims to expand facial recognition software throughout West Virginia

Rank One Computing (ROC) has set up its East Coast headquarters in West Virginia to expand the use of facial recognition software in the United States. Officials hope the company's facial recognition software will find more customers in the state.

ROC’s East Coast headquarters are at Vantage Ventures in Morgantown, primarily stationed in Colorado. They offer facial, object, and text recognition software to help in various industries and scenarios.

CEO Scott Swann said, "We're mainly known for our face recognition capabilities, but we can detect many types of objects." He added that the technology is available on mobile phones, helping users identify particular encounters with other people.

Read more: Microsoft’s OneNote now features AI-powered voice commands via Dictate

Rank One Computing provides larger-scale "one-to-many" facial recognition systems for forensic use in law enforcement and for fintech companies in the commercial sector. Swann added that he and his team aim to be attractive to federal agencies in order to onshore the use of artificial intelligence, which he believes is currently led by overseas competition.
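Generically, a "one-to-many" (1:N) system reduces to comparing a probe face embedding against a gallery of enrolled embeddings. The sketch below uses cosine similarity over random vectors purely for illustration; it is not ROC's actual software, and the embedding model that would produce the vectors is assumed:

```python
import numpy as np

# Generic sketch of 1:N face identification by cosine similarity.
def identify(probe, gallery, names, threshold=0.6):
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = gallery @ probe              # one similarity per identity
    best = int(np.argmax(scores))
    if scores[best] < threshold:
        return None, float(scores[best])  # no confident match
    return names[best], float(scores[best])

rng = np.random.default_rng(1)
gallery = rng.normal(size=(1000, 512))    # 1,000 enrolled identities
names = [f"person_{i}" for i in range(1000)]
probe = gallery[42] + 0.05 * rng.normal(size=512)  # noisy re-capture
print(identify(probe, gallery, names))    # -> ("person_42", ~0.99)
```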

ROC’s offerings can benefit sectors beyond the government. They have been talking to numerous organizations and groups in the North Central West Virginia region about how our product can be used in new ways. 

For example, ROC has spoken with the Marion County Board of Education about using the company’s software with the county’s existing camera system. The software allows authorized personnel — such as county employees, parents, and other trusted individuals — to be whitelisted and quickly identified on a school’s camera system.
