The Creating Helpful Incentives to Produce Semiconductors (CHIPS) bill was passed by the US House of Representatives on July 28. The bill sets aside $52.7 billion for semiconductor research, development, and domestic manufacturing: $39 billion will go toward incentivizing manufacturers, $2 billion toward legacy chips used in automobiles and defense systems, and another $13.2 billion toward research and development and workforce development.
Although the bill is meant to boost semiconductor production in the US, what does it mean for the global chip industry? And how are international competitors responding? Let's discuss.
Reasons for the Bill
Several factors drove the bill's passage. The global shortage of semiconductors, or chips, last year made clear that the United States needed substantial domestic semiconductor manufacturing; the shortage left carmakers with unfinished vehicles. Since then, chip manufacturers have lobbied for such a bill.
The growing use of electronic devices like laptops and smartphones in homes has further increased the demand for chips. According to Forbes, about 75% of the world's semiconductor demand is met by East Asian manufacturers. In particular, Taiwan's foundries and South Korea's Samsung are critical producers of chips. China, meanwhile, has been ramping up semiconductor production on a large scale.
There is also a nationalistic argument here. The US has pointed to its dependence on China for semiconductor supply as a cause for worry. According to the summary of the CHIPS bill, only 12% of chips are currently manufactured domestically in the US, down from 37% in the 1990s. The bill also points out that many foreign competitors, including China, are investing heavily to dominate the semiconductor industry.
Joe Biden said that the CHIPS bill would strengthen national security by making the US less dependent on foreign semiconductor sources, and spoke of maintaining US leadership in the chip industry. The bill, then, is clearly intended to give the US an edge in the global semiconductor race. The president added that it would make cars, appliances, and computers cheaper and lower the cost of everyday goods.
China’s Response
China's state-run media organizations, such as China Daily and the Global Times, have criticized the provision of the CHIPS bill that would penalize American companies for dealing with China, calling it the latest manifestation of Washington's efforts to exclude China from global supply chains.
Chinese officials also warned the companies concerned that they risk losing market share or revenue in China if the CHIPS bill is implemented. China said it strongly opposes the legislation, arguing that it stokes anti-China sentiment and is reminiscent of a Cold War mindset. Supporters of the bill counter that China is simply upset about the advantage the US stands to gain.
Global Repercussions
The United States excels at researching and designing high-end chips, whereas China and some other countries excel at mass production, helped by lower labor costs. According to Gao Lingyun, an expert at the Chinese Academy of Social Sciences (CASS) in Beijing, the bill's short-term subsidies, aimed at moving the entire semiconductor sector to the US, will only add to manufacturing costs. This will make the products less competitive in the international market.
Furthermore, if the companies concerned accept US government subsidies and give up chip investment in the Chinese mainland, they will effectively abandon the massive Chinese market, which not only produces a large volume of chips for them at relatively low cost but also buys many of their semiconductor products. Seen this way, what the bill brings to the global chip industry in the long run is loss rather than benefit.
Conclusion
The CHIPS bill is precisely what the US needs to get its economy back on track. By expanding domestic semiconductor manufacturing, it promises to lower costs for US residents and to strengthen national security by reducing dependence on foreign sources of semiconductors.
When it comes to the global semiconductor industry, however, things look less rosy. Experts say the CHIPS bill reflects the US government's intention to set hurdles for China and other countries and obstruct their semiconductor development. Ambitious as it looks, critics find it unreasonable in several respects and doubt both its effectiveness and its sustainability.
NVIDIA Announces Omniverse Avatar Cloud Engine
NVIDIA has announced NVIDIA Omniverse Avatar Cloud Engine (ACE), a suite of cloud-native AI models and services that makes it easier to build and customize lifelike virtual assistants and digital humans.
ACE enables businesses of any size to instantly access the massive computing power needed to create and deploy assistants and avatars by bringing these models and services to the cloud. These avatars can respond to speech prompts, understand multiple languages, interact with the environment, and make intelligent recommendations.
Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA, said that ACE combines many sophisticated artificial intelligence technologies that allow developers to create digital assistants that are on a path to passing the Turing test.
ACE is built on top of NVIDIA's Unified Compute Framework. It provides access to the APIs and rich software tools needed to deliver the wide range of skills required for fully interactive, realistic avatars.
These skills include:
NVIDIA Metropolis for computer vision and intelligent video analytics
NVIDIA Riva for developing speech AI applications
NVIDIA Merlin for high-performing recommender systems
NVIDIA NeMo Megatron for large language models with natural language understanding
NVIDIA Omniverse for AI-enabled animation
The assistants and avatars enabled by ACE will transform interactions in gaming, banking, entertainment, transportation, and hospitality. Two applications built on ACE are NVIDIA's Project Maxine and Project Tokkio.
Project Maxine brings state-of-the-art audio and video features to virtual collaboration and content creation applications. Project Tokkio provides interactive avatars that intelligently see, perceive, converse, and provide recommendations to enhance customer service in places like restaurants.
Xiaomi Unveils CyberOne Humanoid Robot
Xiaomi launched its CyberOne humanoid robot at an event in Beijing on August 11. The debut comes ahead of Tesla's AI Day in September, where a working prototype of the Optimus bot is expected to be unveiled, and the back-to-back launches are creating an air of competition.
According to Xiaomi, CyberOne has arms and legs, reaches a peak joint torque of up to 300 Nm, and supports bipedal posture balancing in motion. The robot can also build a three-dimensional virtual reconstruction of the real world and detect human emotions, and it has advanced vision capabilities.
Xiaomi said that the robot's AI and mechanical capabilities were developed in-house by the Xiaomi Robotics Lab. The company has invested heavily in research and development spanning algorithm innovation, software, and hardware.
Xiaomi's CEO said that CyberOne is an exploration of the possibilities of Xiaomi's future technological ecosystem, with AI at its core and a full-size humanoid frame as its vessel. It is a new breakthrough for the company, he added.
Many have speculated that Xiaomi unveiled CyberOne to compete with Tesla's Optimus bot, and experts say that is most likely true given the timing of the launch. Although Tesla has yet to show a working prototype of Optimus, Elon Musk announced that Tesla AI Day had been moved to September 30 in the hope that the prototype would be ready by then.
UN Urges Action on Crypto in Developing Countries
The United Nations Conference on Trade and Development (UNCTAD) has urged action to curb the growth of cryptocurrencies in developing countries, publishing three policy briefs to that effect on Wednesday. The UN trade and development body warned that while private digital currencies have rewarded some holders and facilitated remittances, they remain unstable financial assets that carry social risks and costs.
The newly released policy briefs examine the costs and risks of cryptocurrencies, including the threats they pose to the security of monetary systems, financial stability, and domestic resource mobilization.
Global use of cryptocurrencies increased drastically during the pandemic, especially in developing countries. Reasons for the sudden uptake include easier remittances and crypto's use as a hedge against inflation and currency risk.
Recent price crashes in the crypto market show that holding cryptocurrencies carries private risks. It becomes a public problem, however, if a central bank has to step in to protect financial stability. And if crypto becomes a widespread means of payment, or even unofficially replaces domestic currencies, it could jeopardize countries' monetary sovereignty.
So-called stablecoins, digital currencies pegged to the US dollar, pose particular risks in developing countries where the demand for reserve currencies is unmet. The agency noted that the International Monetary Fund has expressed the view that cryptocurrencies pose risks as legal tender for some of these reasons.
UNCTAD called on authorities to act to halt the expansion of cryptocurrencies in developing countries, offering several recommendations that include restricting cryptocurrency-related advertising.
AI Algorithm Predicts Diabetes from ECG
A team of scientists at the Lata Medical Research Foundation, Nagpur, has developed an artificial intelligence (AI) algorithm that can accurately predict diabetes and pre-diabetes from the features of individual heartbeats recorded on an electrocardiogram (ECG).
The team used clinical data from 1,262 individuals, recording a standard 12-lead, 10-second ECG trace for each participant. A predictive algorithm named DiaBeats was then trained on 100 unique structural and functional features extracted from each of the 10,461 recorded heartbeats.
The DiaBeats algorithm detected diabetes and pre-diabetes based on the size and shape of individual heartbeats, with an overall accuracy of 97% irrespective of factors such as gender, age, and co-existing metabolic disorders.
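To make the setup concrete, here is a minimal sketch of this style of per-beat classification, using synthetic placeholder data shaped like the study's (10,461 beats × 100 features) and a gradient-boosted tree classifier, a common choice for tabular features. This is an illustration only, not the published DiaBeats code:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the study's data: 10,461 beats x 100 per-beat features,
# labels 0 = no diabetes, 1 = pre-diabetes, 2 = diabetes (random here, real in the study)
rng = np.random.default_rng(0)
X = rng.normal(size=(10_461, 100))
y = rng.integers(0, 3, size=10_461)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# Fit a gradient-boosted classifier on the per-beat features, report held-out accuracy
clf = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

With real labeled ECG features in place of the random placeholders, the same pipeline yields a per-beat screening model of the kind the paper describes.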
The most influential ECG features consistently matched known biological markers of the cardiac changes that accompany diabetes and pre-diabetes. If validated in larger studies, the method could be used to screen for the disease in low-resource settings, the team said.
In principle, the study offers a relatively non-invasive, inexpensive, and accurate alternative to current diagnostic methods, one that can detect diabetes and pre-diabetes early in their course. Even so, adopting the algorithm in routine practice will require solid validation on external, independent datasets.
The researchers acknowledged that the study participants were all at high risk of diabetes and other metabolic disorders, so the sample is unlikely to represent the general population. DiaBeats was also slightly less accurate for patients taking prescription medications for high blood pressure, diabetes, or high cholesterol.
UK Heatwave Takes Cloud Data Centers Offline
On a hot July day, Google and Oracle data centers in the UK struggled to keep up as Europe endured record-breaking temperatures. Networking, storage, and compute resources, all served from facilities in the south of England, where temperatures were among the warmest on record, were among the cloud services that suffered outages. According to the Met Office, the UK had its warmest day ever, with 104°F (40°C) shattering a 2019 record. This is a preview of climate change's impact on data centers.
Have Heatwaves Been Shutting Data Centers Down?
Both companies attributed the outages to cooling system issues. Oracle posted the initial statement on its service website in the late morning, New York time, noting that “unseasonal temperatures” had impacted cloud and networking equipment at its south London facility. Google reported a similar issue at its London location in the afternoon: one of its buildings, which houses zone europe-west2-a for region europe-west2, experienced a cooling-related failure. According to the statement, the company curtailed preemptive deployments of Google Compute Engine (GCE) and shut down a portion of the zone to avoid machine damage and an extended outage.

Google Cloud later updated the incident, stating that the cooling problem had been fixed, although a small number of HDD-backed Persistent Disk volumes were still affected and would show IO errors; customers still having problems were advised to contact Google Cloud Product Support. One of the key victims of the outage was web-hosting service WordPress, which lost service in Europe.
Regional storage services that replicate customer data across multiple zones, such as GCS and BigQuery, could not reach some replicas while the erroneous regional traffic routing was in place, leaving customers unable to access the affected objects. Full recovery of the cloud services took 18 hours and 23 minutes.
In response to the incident, Google announced it would fix the problem and thoroughly re-test its failover mechanisms. The cloud giant plans to research and develop more advanced methods of progressively shedding the heat load inside a single data center space, lowering the likelihood that a complete shutdown will be necessary. Google also reportedly plans to look for vulnerabilities in its processes, tools, and automatic recovery systems, and to audit cooling system standards and equipment across all data centers that host Google Cloud globally.
Amid the extreme heat in London and the area within the M25, staff were dispatched to spray air-conditioning units with water. That water use comes on top of the already high water consumption of data centers, which are most likely to need emergency hosing precisely during a scorching spell.
With severe temperatures expected to become more common, data center operators are looking for more energy-efficient IT equipment to keep their facilities cool.
To function properly, data centers must be kept within a specified temperature range. According to the US standards body, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), enterprise servers and storage devices should run between 18°C and 27°C. Because this equipment generates heat, data centers must be cooled, usually by chilling the air, and rising outside temperatures put stress on the cooling plant. That can mean higher electricity bills or, in rare cases, outages.
Older data centers may suffer during the UK heatwave, resulting in higher energy expenses. Newly constructed data centers, by contrast, can withstand temperatures between -40°C and 40°C because operators designed them to be as energy efficient as possible.
Even brand-new data centers, however, may lose efficiency when temperatures rise beyond 40°C, pushing up cooling costs. Microsoft is therefore pursuing novel strategies to improve water cooling and temperature control in its data centers. In neighboring Ireland, several teams are seeking to harness data center heat and develop more energy-efficient cooling technologies; earlier this year, Dublin energy agency Codema teamed up with data center provider Equinix to find innovative ways to recycle waste data center heat for essential public infrastructure.
The Cooling Concerns, the Power Hunger, the Carbon Emissions
To operate, a data center must either sit in a constantly maintained, temperature-controlled environment or be built somewhere with a naturally cold climate. More cooling means more energy demand, which means more fossil fuel burned; that in turn exacerbates global warming and produces more heatwaves, prompting companies to install and run even more powerful cooling systems, and so on.
According to studies from the International Energy Agency, the infrastructure underpinning contemporary cloud hosting now uses close to 1% of the world's electricity supply and contributes 0.3% of all global CO2 emissions. The Natural Resources Defense Council projected in 2015 that by 2020 American data centers alone would use about 140 billion kWh annually and produce a staggering 100 million metric tons of carbon dioxide. The global total is certainly far larger, with an estimated more than 5 million data centers worldwide. Studies show that cooling IT equipment accounts for about 40% of the energy data centers consume.
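As a rough back-of-the-envelope illustration, combining only the figures quoted above (not new data):

```python
# Figures cited above: 140 billion kWh/year and 100 million metric tons of CO2
energy_kwh = 140e9
co2_kg = 100e6 * 1000  # metric tons -> kg

print(f"Implied emissions factor: {co2_kg / energy_kwh:.2f} kg CO2 per kWh")   # ~0.71
print(f"Cooling share at ~40%: {0.4 * energy_kwh / 1e9:.0f} billion kWh/year") # ~56
```

In other words, those projections imply roughly 0.7 kg of CO2 per kWh consumed, with on the order of 56 billion kWh a year going to cooling alone.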
Though migrating data centers to regions with cold climates sounds exciting, it is not quite feasible on a large scale.
In addition, a 2016 survey found that the world's data centers consumed 416.2 terawatt-hours of electricity, substantially more than the UK's total of about 300 terawatt-hours. Data centers have a carbon footprint comparable to the aviation sector's, using 3% of the world's electricity supply and emitting around 2% of all greenhouse gases.
This has motivated companies to explore more eco-friendly substitutes. In an effort to replace the notoriously polluting Haber-Bosch process, Fujitsu is collaborating with the Icelandic start-up Atmonia to produce sustainable ammonia, which could be burnt to power data centers efficiently. Trinity spin-off Nexalus is developing a way for power-hungry data centers to maintain cooling and generate electricity rather than only consume it. Microsoft, meanwhile, claims to have developed a zero-carbon-emissions alternative to the diesel generators data centers rely on during power outages and other service interruptions. Unlike diesel generators, hydrogen fuel cells combine hydrogen and oxygen to produce power, heat, and water rather than pollution. Until now, however, cost and the scale required have kept hydrogen fuel cells out of production use in data centers.
Things started to change in July 2020, when Microsoft announced its intention to stop using diesel fuel by 2030. Microsoft has since been testing a three-megawatt hydrogen fuel cell system that can power around 10,000 servers in a data center. The generator uses proton exchange membrane (PEM) fuel cell technology, which combines hydrogen and oxygen in a chemical reaction to produce power, heat, and water without burning anything or emitting any carbon dioxide.
Once green hydrogen becomes economically feasible and affordable, this kind of stationary backup power could be used in a variety of settings, including data centers, office buildings, and hospitals. Because PEM fuel cells can follow a load up and down and, like diesel engines, are quick to turn on and off, they are already widely employed in the automotive sector.
According to Mark Monroe, a lead infrastructure engineer on Microsoft’s team for data center advanced development, the quick response and load following capabilities are ideally suited for backup power at data centers.
Will These Developments Be Enough?
Resilience issues are becoming increasingly pressing as climate change intensifies and catastrophic events such as hurricanes, floods, record temperatures, and water shortages become more frequent. Outages are already getting longer and costing more because of adverse weather and the broader effects of climate change. As the climate worsens and extreme weather events grow more frequent, these infrastructures are endangered on many fronts: electricity supply, networking infrastructure, and internal resilience. The UK's prolonged heatwave, which took data centers offline, is only the tip of the iceberg.
Rising sea levels will also hurt networking infrastructure and data centers, with coastal facilities particularly exposed. Even climate change-induced drought could affect data center operations and fuel highly destructive wildfires that directly endanger installations.
There is also a need to modernize the grid, develop autonomous microgrids, and research ways to upgrade networks for greater resilience. Most Internet cabling is buried less than a meter below ground, leaving it seriously exposed during storms with high winds and flooding.
Time for Action
The data center business keeps growing as the fourth industrial revolution's need for data rises rapidly, but it is also struggling with rising management costs and power usage. And although many operators report power use, the electricity and water required to keep data centers cool frequently go undocumented and unaccounted for, according to the Uptime Institute's 2021 Global Data Center Survey.
Therefore, governments will need to set tighter carbon reduction targets and transparent reporting requirements, even as consumer action and awareness remain crucial for putting pressure on businesses. With its ambitious Green Deal, the European Union has taken the lead here, calling for data centers to become carbon neutral by 2030.
To guarantee sustainability, data center operators and designers must take climate change and its effects into account from the beginning of the asset construction process, as building a data center is a long-term investment.
Nvidia Launches GauGAN360
Nvidia has launched GauGAN360, a new experimental online art tool that turns rough sketches into 360° HDR environments for use in 3D scenes.
The tool is based on the same technology as Nvidia's original GauGAN AI painting app: users paint the overall form of a landscape, and GauGAN360 generates a matching equirectangular image or cube map, turning quick doodles into synthetic 360° environments for 3D scenes.
Nvidia positions GauGAN360 as the next evolution of its popular AI painting app, GauGAN. Launched in 2019, GauGAN uses an AI model based on generative adversarial networks to generate photorealistic images that match users' rough sketches.
GauGAN360 does pretty much the same thing as GauGAN. However, instead of a conventional 2D image, the result is an equirectangular image or a cube map that can be used as an environment map inside 3D software.
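For readers unfamiliar with the format: an equirectangular image maps a sphere's longitude and latitude onto the x and y axes of a rectangle. Here is a minimal sketch (the standard latitude-longitude mapping, not Nvidia's code) of how 3D software turns a pixel position in such an image into a view direction:

```python
import numpy as np

def equirect_to_direction(u: float, v: float) -> np.ndarray:
    """Map normalized equirectangular coordinates (u, v) in [0, 1]
    to a unit 3D direction: u spans longitude, v spans latitude."""
    lon = (u - 0.5) * 2.0 * np.pi   # -pi (left edge) .. +pi (right edge)
    lat = (0.5 - v) * np.pi         # +pi/2 (top) .. -pi/2 (bottom)
    return np.array([
        np.cos(lat) * np.sin(lon),  # x
        np.sin(lat),                # y (up)
        np.cos(lat) * np.cos(lon),  # z (forward)
    ])

# The image center looks straight ahead; the top row looks straight up
print(equirect_to_direction(0.5, 0.5))  # -> [0, 0, 1]
print(equirect_to_direction(0.5, 0.0))  # -> ~[0, 1, 0]
```

Running this mapping in reverse is how a renderer samples the generated image as a surrounding environment.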
The workflow in GauGAN360 is similar to GauGAN's. Users paint the rough form of the image they want onto the online template, and the AI does the rest at the click of a button. Different brush colors represent different landscape elements; 25 are available, covering ground materials such as mud, sand, and land as well as sky, plants, and more.
The controls are fairly basic: a choice of three brush shapes and a brush-size slider, enough to generate images from very quick doodles. Users can also upload a real image to use as a source, and GauGAN360 will generate a synthetic image with the same overall composition as the original.
OpenAI Upgrades Its Moderation Endpoint
OpenAI is introducing a more accurate Moderation endpoint, a new-and-improved update to its content moderation tool that helps API developers protect their applications. The company designed the update to perform robustly across a wide range of applications, from social media and messaging systems to AI chatbots.
The updated Moderation endpoint gives developers programmatic access to OpenAI's Generative Pre-trained Transformer-based classifiers, which detect undesirable content in applications.
Given an input text, the Moderation endpoint checks it for content such as hate speech, sexual content, and abusive language so that it can be filtered out. It flags all content generated by OpenAI's API that violates OpenAI's content policy, and it can also weed out and block harmful content written by humans.
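As a rough illustration, a call to the endpoint through the openai Python library (a minimal sketch based on the library's interface at the time of writing; the key and input string are placeholders) looks like this:

```python
import openai

openai.api_key = "sk-..."  # placeholder; supply your own API key

# Submit a piece of text and read back the classifier's verdict
response = openai.Moderation.create(input="Some user-generated text to screen")
result = response["results"][0]

if result["flagged"]:
    # List the policy categories (e.g., hate, violence) that triggered the flag
    reasons = [name for name, hit in result["categories"].items() if hit]
    print("Blocked:", ", ".join(reasons))
else:
    print("OK to display")
```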
For instance, NGL, an anonymous messaging platform, utilizes OpenAI’s tool to filter out hateful language, bullying, racist remarks, etc.
The company claims the update significantly reduces the chance of an AI model "saying" the wrong thing, making it suitable for more sensitive use cases such as education. The Moderation endpoint is free for content generated through the OpenAI API; moderating non-API content carries a fee.
Developers can start using the tool by following its documentation. OpenAI has also published a paper detailing the tool's training and performance analysis, along with an evaluation dataset, to encourage further research into AI-driven moderation.
Crypto Lender Hodlnaut Suspends Withdrawals
Hodlnaut, a Singapore-based cryptocurrency lending and borrowing platform, has suspended withdrawals, swaps, and deposits, the company said on Monday, in the latest sign of stress in the cryptocurrency industry.
Hodlnaut also said it would withdraw its application for a license from the Monetary Authority of Singapore (MAS) to provide digital token payment services; the crypto lender had received in-principle approval in March. A MAS spokesperson said the approval had been canceled at the company's request.
Hodlnaut attributed the move to current market conditions, saying it wants to focus on stabilizing its liquidity and preserving assets. The company is the latest in a line of global crypto players to face a financial crisis following the sharp market sell-off that began in May with the collapse of the paired tokens Luna and TerraUSD.
US crypto lender Celsius and Singapore-based fund Three Arrows Capital both filed for bankruptcy last month; Hodlnaut was one of Celsius's institutional clients. Singapore, a major crypto and blockchain hub in Asia, has seen several crypto companies run into crisis in recent months.
Vauld, a Singapore-based crypto lending and trading platform, suspended withdrawals in July. Later that month, Zipmex, a Southeast Asia-focused crypto exchange, did the same; it has recently resumed withdrawals for some products.
A MAS spokesperson said that digital payment token service providers licensed under the Payment Services Act are regulated for money laundering and terrorism financing risks as well as technology risks. They are not subject to risk-based capital or liquidity requirements, nor are they required to safeguard customer monies or digital tokens from insolvency risk.
EPFO Pension Records Exposed Online
Bob Diachenko, a Ukraine-based cybersecurity researcher and journalist, has claimed that about 28.8 crore (288 million) personal records of Employees' Pension Scheme (EPS) holders at the Employees' Provident Fund Organisation (EPFO) were exposed online before being taken off the Internet. The records contained full names, bank account numbers, and nominee information.
The security researcher’s claims about the data exposed online were verified by the EPFO, national cyber agency CERT-In, and the IT Ministry. Cyber threat intelligence director at securitydiscovery.com, Bob Diachenko, claimed that their systems identified two separate IPs with Universal Account Number (UAN) data.
The Universal Account Number (UAN), allotted by the EPFO, is an integral part of the Indian government registry. Each provident fund account holder's record contained personal information such as marital status, gender, date of birth, and employment status.
About 280 million records were publicly exposed under one IP address, and a further 8.4 million under the other, Diachenko claimed.
Given the scale and evident sensitivity of the data, the researcher decided to tweet about it without revealing the source or any associated details. Within 12 hours of the tweet, both IPs were taken down; an IANS report confirmed that both have now been removed from the public domain.