
Meta stock hits lowest level since March 2020; other social-media stocks fall sharply too


Meta Platforms led a sharp turn lower in social-media stocks Tuesday after the consumer-price index unexpectedly moved higher during August, with continuing inflation reigniting fears of a pullback in online advertising.

Shares of the Facebook parent company fell 9.4% Tuesday, their worst day since February 3, amid broad declines for U.S. stocks, particularly big tech names. The Nasdaq Composite Index COMP fell 633 points, or 5.2%.

According to Dow Jones Market Data, Meta’s META stock closed Tuesday at $153.13, its lowest level since March 23, 2020. If the stock were to finish a session lower than $146.01, it would officially wipe out all of its pandemic-era gains.

Read More: Oxford Researchers Develop AI For AVs To Achieve Reliable Navigation In Adverse Weather Conditions

Other social-media names also fell sharply, including Snap Inc. SNAP, off 7%, and Pinterest Inc. PINS, down 4.3%. Ad-dependent Alphabet Inc. GOOG GOOGL, which runs the YouTube platform and the Google suite of services, slipped 5.9%.

Meta and its social-media peers have been on a rocky ride this year amid concerns about the impact of inflation and other economic challenges on advertiser activity. The company logged its first-ever year-over-year drop in revenue during the June quarter, with executives saying they expected the difficult climate to keep weighing on the advertising business and planned to cut costs accordingly.

Despite jitters about the advertising industry, payment-technology companies have indicated that spending remains strong, even if consumer confidence is pressured.


PEER: a Collaborative Language Model


Most existing language models are built and trained to produce textual outputs from an initial input. These models generate text efficiently but are restricted to left-to-right language modeling, predicting one next word at a time. They are not trained to perform the tasks that arise throughout the writing process, such as updating existing texts (even ones they generated themselves) and editing. Moreover, conventional models are difficult to control and cannot explain their actions. Consequently, they are insufficient for collaborative writing.

To address these shortcomings, Meta AI Research is introducing PEER, a language model designed to support collaborative writing by breaking the task into smaller subtasks. The PEER, or Plan, Edit, Explain, and Repeat, model is trained on the entire writing process, not just the final output. It can plan drafts, suggest and perform edits, and explain its editing actions, offering several advantages over standard left-to-right language models.

Training a language model for multiple subtasks like PEER's requires a dataset with an accessible history of edits, because the capability to suggest and explain edits is what sets PEER apart from other models. Edit histories are challenging to obtain via standard web crawls for most data sources, leading to data scarcity.

To overcome this data scarcity, Meta trained the model on Wikipedia edits, giving PEER the full editing process and edit histories, and then adapted it so it can be applied to domains for which edit histories are unavailable. Wikipedia provides a complete editing history, with comments and citations, at a large scale.
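
To make this concrete, here is a minimal sketch of how consecutive Wikipedia revisions and their comments could be turned into (source text, edited text, plan) training triples; the record format and field names are illustrative assumptions, not Meta's actual preprocessing pipeline.

```python
# Hypothetical revision records; real data would come from Wikipedia dumps.
revisions = [
    {"text": "Teh cat sat.", "comment": "create article"},
    {"text": "The cat sat.", "comment": "fix typo"},
    {"text": "The cat sat on the mat.", "comment": "add detail"},
]

# Pair consecutive revisions into (x_t, x_{t+1}, plan) training triples,
# treating the editor's comment on the newer revision as the plan.
triples = [
    (old["text"], new["text"], new["comment"])
    for old, new in zip(revisions, revisions[1:])
]
for x_t, x_next, plan in triples:
    print(f"{plan!r}: {x_t!r} -> {x_next!r}")
```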

While Wikipedia solves the data-scarcity problem, it poses a few others: noisy comments, missing citations, and data that is highly specific to Wikipedia's textual content and editing conventions. To solve these problems, Meta trained multiple PEER instances, not just one, which generate synthetic data as a proxy for the missing pieces. This synthetic data also replaces low-quality sections in the existing data.

Read More: Virtual Assets Regulatory Authority welcomes Blockchain.com to operate in Dubai

Framework

The model's core idea is to frame text editing as an iterative process that is repeated until the desired output is reached. Suppose you have a text sequence xₜ and a set of documents Dₜ containing the necessary background information. Based on xₜ and Dₜ, the model creates a plan pₜ that gives instructions for the next modification, like "fix spelling errors" or "simplify." Following the plan, the model edits the text to produce an updated version xₜ₊₁. Lastly, the model explains the intention behind the edit via a textual explanation eₜ based on (xₜ, xₜ₊₁, Dₜ).

The entire plan-edit-explain process is repeated to obtain a sequence xₜ, xₜ₊₁, xₜ₊₂, and so on, until some xₙ is the same as xₙ₋₁, meaning there are no more edits to make.
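
The loop can be summarized in a short Python sketch. The plan, edit, and explain callables below stand in for the trained model; they are illustrative stubs, not Meta's released code.

```python
from typing import Callable, List

def peer_loop(
    x0: str,
    docs: List[str],
    plan: Callable[[str, List[str]], str],
    edit: Callable[[str, str, List[str]], str],
    explain: Callable[[str, str, List[str]], str],
    max_iters: int = 10,
) -> str:
    """Repeat Plan-Edit-Explain until the text stops changing."""
    x = x0
    for _ in range(max_iters):
        p = plan(x, docs)             # plan p_t, e.g. "fix spelling errors"
        x_next = edit(x, p, docs)     # edited text x_{t+1}
        e = explain(x, x_next, docs)  # explanation e_t of the edit
        if x_next == x:               # converged: no more edits to make
            break
        x = x_next
    return x

# Toy stand-ins so the sketch runs end to end:
result = peer_loop(
    x0="Teh cat sat.",
    docs=[],
    plan=lambda x, d: "fix spelling errors" if "Teh" in x else "done",
    edit=lambda x, p, d: x.replace("Teh", "The"),
    explain=lambda x, xn, d: "corrected a typo" if x != xn else "no change",
)
print(result)  # -> "The cat sat."
```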

Each step makes PEER highly relevant to collaborative writing, where dividing the process into phases improves the quality and usefulness of the output. The explanation and planning phases can look similar, given that the model explains what it (or you) planned; the difference is timing: planning happens before the model edits, while explanations are provided at the end.

Besides editing, PEER also lets you write texts from scratch, starting from an empty sequence x₀.

Meta claims to have enhanced the quality and diversity of the plans, edits, and documents generated by PEER through several mechanisms it has implemented.

For quality

PEER prepends control tokens to its output sequences and uses these tokens to guide the model's generations; a minimal sketch of the idea follows the list below. Here are some examples:

  • Instruction: controls whether the output begins with a noun, verb, etc.
  • Length: serves as a proxy for the level of detail in explanations.
  • Word overlap: limits the overlap of words between the edit and the explanation, ensuring that generated plans do not produce trivial edits by copying the specified inputs exactly.
  • Number of words: controls the difference in word count between xₜ₊₁ and xₜ.
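
A rough sketch of the idea follows, with hypothetical token names and formatting; PEER's real control vocabulary is defined by its training data and is not shown in this article.

```python
def build_controlled_input(text: str, controls: dict) -> str:
    """Prepend control tokens to an input sequence.

    The <key:value> prefix format and the token names used below are
    illustrative assumptions, not PEER's actual tokens.
    """
    prefix = " ".join(f"<{key}:{value}>" for key, value in controls.items())
    return f"{prefix} {text}"

prompt = build_controlled_input(
    "fix spelling errors.",
    {"instruction": "true", "length": "s", "overlap": "false", "words": "+2"},
)
print(prompt)
# -> <instruction:true> <length:s> <overlap:false> <words:+2> fix spelling errors.
```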

For diversity

To improve diversity, Meta trained the model to perform edits on provided documents across multiple domains, especially those without edit histories. To evaluate this ability, PEER uses a collection of naturally occurring edits to texts from Wikipedia, Wikinews, and StackExchange subforums.

Limitations 

A significant drawback is that the model generates many false claims not backed by the provided documents, and people generally rely on such outputs without explicit fact-checking. In addition, since PEER represents edits by rewriting the entire paragraph, it cannot handle lengthy documents in a time-efficient manner.

Moreover, the evaluation is limited: PEER and similar models are compared only on a small subset of data from a few domains. PEER's collaborative potential has also been explored only minimally, and more extensive research on human-AI interaction remains to be undertaken.

Nevertheless, the model is a significant step forward in improving collaborative writing. It can be extended further to find more suitable ways to evaluate texts with human assistance and to improve PEER's efficiency in processing entire documents.


Virtual Assets Regulatory Authority welcomes Blockchain.com to operate in Dubai


According to reports, Dubai's Virtual Assets Regulatory Authority, or VARA, has granted provisional regulatory approval to Blockchain.com to operate in the region. The London-based cryptocurrency company, which offers both a blockchain wallet and a cryptocurrency exchange, is now the most recent in a line of digital-asset companies expanding in the Gulf as the region strives to become a center for blockchain technology.

As economic rivalry in the Gulf area escalates, the United Arab Emirates (UAE) has been pushing for the creation of virtual asset regulations to draw in new business models.

Since Dubai's prime minister and ruler Sheikh Mohammed bin Rashid Al Maktoum established a cryptocurrency regulator and corresponding legislation in March, VARA has cleared companies including Crypto.com, OKX, and FTX subsidiaries to offer cryptocurrency-focused services in the country. VARA also announced a new license program and rules for platforms that provide marketing and advertising services linked to cryptocurrencies. All virtual asset service providers must now adhere to the new regulations and disclose the advertising intentions behind their offerings so that safeguards can protect customers.

Read More: Columbia Unveils Guide for Implementing Blockchain for Public Projects

Blockchain.com already operates subsidiaries in Singapore, North America, Europe, and South America.


Oxford researchers develop AI for AVs to achieve reliable navigation in adverse weather conditions


In collaboration with colleagues from Bogazici University, researchers at Oxford University’s Department of Computer Science have created a novel artificial intelligence (AI) system that allows autonomous vehicles (AVs) to achieve more reliable and safer navigation capability, especially under adverse weather conditions.

Yasin Almalioglu, a member of the research team, said that AVs' difficulty in achieving precise positioning during challenging adverse weather is a significant reason why they have been limited to relatively small-scale trials. For example, snow or rain may lead an AV to detect itself in the wrong lane before a turn, or to stop too late at an intersection, because of imperfect positioning.

To overcome this issue, Almalioglu and his colleagues developed a new, self-supervised deep learning model for ego-motion estimation, a crucial component of an AV's driving system that estimates the car's moving position relative to objects observed from the car itself.

Read More: Zuckerberg Announces PyTorch Foundation To Accelerate Progress In AI Research

The model brings together highly detailed information from visual sensors (which can be distorted by adverse conditions) with data from weather-immune sources, so that the benefits of each can be exploited under various weather conditions.

The model was trained on several publicly available AV datasets, which include data from multiple sensors such as radar, cameras, and lidar under diverse settings, including variable darkness/light and precipitation. These were used to develop algorithms that reconstruct scene geometry and estimate the car's position from novel data.
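
As a loose illustration of the fusion idea, the PyTorch-style sketch below combines weather-sensitive camera features with weather-immune radar features to regress a 6-DoF pose; the architecture, dimensions, and concatenation-based fusion are assumptions for illustration, not the Oxford team's published model.

```python
import torch
import torch.nn as nn

class FusionEgoMotion(nn.Module):
    """Toy ego-motion head: fuse camera and radar features into a 6-DoF pose.

    Purely illustrative; the real model is self-supervised and far richer.
    """
    def __init__(self, cam_dim: int = 128, radar_dim: int = 64):
        super().__init__()
        self.cam_enc = nn.Sequential(nn.Linear(cam_dim, 64), nn.ReLU())
        self.radar_enc = nn.Sequential(nn.Linear(radar_dim, 64), nn.ReLU())
        # Concatenate both modalities so weather-immune radar features can
        # compensate when camera features degrade in rain, snow, or fog.
        self.head = nn.Linear(128, 6)  # 3 translation + 3 rotation params

    def forward(self, cam_feat: torch.Tensor, radar_feat: torch.Tensor):
        fused = torch.cat(
            [self.cam_enc(cam_feat), self.radar_enc(radar_feat)], dim=-1
        )
        return self.head(fused)

model = FusionEgoMotion()
pose = model(torch.randn(1, 128), torch.randn(1, 64))
print(pose.shape)  # torch.Size([1, 6])
```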

The researchers showed that the model delivers robust all-weather performance across various test situations, including fog, rain, and snow, during both day and night. The team anticipates that this work will bring AVs one step closer to safe and smooth all-weather autonomous driving and, ultimately, broader use within society.


Starbucks partners with Polygon to unveil NFT-based Starbucks Odyssey loyalty platform


Starbucks, a well-known international coffee chain, has teamed up with blockchain service provider Polygon to create a brand-new non-fungible token (NFT) loyalty program that will debut later this year. This innovative Web3 experience, Starbucks Odyssey, combines the company’s popular Starbucks Rewards loyalty program with an NFT platform, enabling users to earn and buy digital assets that unlock premium experiences and rewards. Starbucks Odyssey aims to foster community among its most loyal customers by enabling them to earn a wider range of benefits.

According to a blog post by the company, Starbucks Odyssey is an extension of the current Starbucks Rewards model, where users earn “stars” that can be redeemed for perks like free beverages and Wi-Fi. Customers in the US can acquire digital collector stamps in NFT form that provide rewards and immersive experiences through the NFT loyalty program. Rewards could include virtual classes on making espresso martinis, invitations to special events at Starbucks Reserve Roasteries, and even vacations to the Starbucks Hacienda Alsacia coffee farm in Costa Rica.

Despite being housed on the Polygon blockchain, these NFTs will be purchased with a credit or debit card instead of a crypto wallet. Starbucks believes that lowering the entry barrier this way will make it easier for customers to participate in the Web3 experience. Additionally, it will offer bundled pricing rather than adding extra costs, such as "gas fees," to consumers' transactions.

Starbucks selected Polygon because of its ability to provide low-cost, high-speed transactions while running on top of the Ethereum network. With its carbon footprint reduced by 99.91% as a result of the Ethereum Merge, the scaling solution is one step closer to achieving its goal of becoming carbon negative by the end of 2022.

Prior to the platform’s official debut later in 2022, Starbucks Odyssey’s waitlist started to take signups on Sept. 12.

Starbucks Rewards members can log in to the web app using their current reward program credentials in order to interact with the Starbucks Odyssey experience. Once in, users will be able to participate in a variety of activities, referred to by Starbucks as “journeys,” such as playing interactive games or completing tasks that will expand their understanding of the Starbucks brand or coffee in general. Members can accumulate digital collectibles in the form of NFTs as they accomplish these journeys. However, Starbucks Odyssey forgoes the technical jargon and refers to these NFT collections as “journey stamps.”

Read More: Microsoft’s Mojang bans NFTs within Minecraft: Reactions and Reality

A selection of limited-edition stamp NFTs will also be sold on the Starbucks Odyssey web app, which is compatible with mobile devices. Each journey stamp will have a point value depending on how rare it is, and the stamps may be purchased or transferred between users in the marketplace, with ownership verified on a blockchain. As stamps are collected, members' points will increase, giving them access to one-of-a-kind benefits and experiences.

Every stamp will include famous Starbucks artwork co-created with Starbucks partners as well as independent artists, allowing members and partners access to these prized assets for the first time. Additionally, a percentage of the profit generated through the sale of limited-edition stamps will be given to organizations important to Starbucks Rewards members and partners.

Since April, Starbucks has been exploring NFT collaborations in an effort to promote a novel convergence of blockchain and customer engagement. It will be fascinating to watch how Starbucks Odyssey performs as it connects Starbucks' coffee experience with the Web3 ecosystem.


Axie Infinity Teams up with Google Cloud to Validate Transactions on Ronin Network


Sky Mavis, the developer of the well-known blockchain game Axie Infinity, has partnered with Google Cloud, which will serve as a validator for the game's nonfungible token and blockchain gaming network. This collaboration will enable Axie Infinity to scale in a secure and sustainable way. The particular details of the agreement were not made public.

Sky Mavis announced at the AxieCon conference in Barcelona that it would work with Google Cloud to strengthen Ronin’s security and further its goal of creating a gaming universe with interconnected, immersive, and rewarding experiences. In other words, Google Cloud will aid in transaction processing and sidechain network security.

As the 18th validator for Ronin, Google Cloud will join a validator node pool that includes Animoca Brands, DappRadar, Nansen, and other independent enterprise validators for the Ronin network. It will take on the responsibility of managing validator uptimes and operating a validator node on the Ronin network, which will help to strengthen the network’s overall security and governance.

In the play-to-earn blockchain game Axie Infinity, users collect and trade virtual pets called Axies, modeled after axolotl salamanders. These creatures are represented by NFTs so that users can trade them for cash. Within the game, players can breed the creatures and pit them against one another in battles for prizes.

Google Cloud has reportedly been Sky Mavis’s “strategic cloud partner” since 2020, but this is a new development in their partnership, according to a press statement. Searce, a provider of cloud technologies, will support the collaboration as the implementation partner.

Previous collaborations between Google Cloud and other blockchains and distributed ledger-based networks include authenticating transactions for video platform company Theta Labs and joining the Hedera Hashgraph governing council.

Read More: Solana-based Phantom Wallet introduces Burn Factor to Discard Spam NFTs

The agreement should help Sky Mavis recover after it lost hundreds of millions of dollars in cryptocurrency in a well-publicized March hack, in which five of the network's nine validators were compromised using stolen private keys. The US Treasury has attributed the crime to Lazarus, a North Korean state-sponsored hacker outfit.

Sky Mavis has also chosen Google Cloud for its content delivery network's advanced load balancing and caching features, to provide seamless and reliable experiences to users everywhere on any device, including those who earn a living through Axie Infinity but reside in remote areas with poor online connectivity.

Sky Mavis aims to make Ronin the standard NFT scaling solution for gaming by developing core offerings like Ronin, its exclusive Axie Infinity game titles, and Mavis Hub launchpad on Google Cloud’s scalable, secure, and sustainable infrastructure and low-latency network. This will give users more opportunities to convert their time and effort into real-world value. This implies that independent and third-party game creators can register with Sky Mavis to release their games.

With the objectives of substantially cutting gas costs for its rapidly expanding user ecosystem and speeding up transactions, Sky Mavis created Ronin in February 2021, enabling millions of in-game micro-transactions to take place without any hiccups. The blockchain network is built using the Ethereum Virtual Machine (EVM).

Ronin runs on a Proof of Authority (PoA) consensus model, where validators are chosen based on their credibility and are responsible for validating, voting, and keeping track of transactions on the blockchain. Sky Mavis introduces new validators and puts them to a vote as part of its decentralization move. Overall, the company is seeking to have at least 21 validators for Ronin.
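
To illustrate the voting side of Proof of Authority, here is a minimal sketch in which only pre-approved validators may vote and a block is finalized on a two-thirds supermajority; the threshold and data structures are simplifying assumptions, not Ronin's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Block:
    height: int
    txs: list

# In PoA, the validator set is a fixed list of vetted identities.
APPROVED_VALIDATORS = {"sky_mavis", "google_cloud", "animoca", "nansen", "dappradar"}

def finalize(block: Block, votes: dict) -> bool:
    """Finalize a block when more than 2/3 of approved validators vote yes.

    Toy PoA logic: votes from identities outside the approved set are ignored.
    """
    yes = sum(1 for v, ok in votes.items() if ok and v in APPROVED_VALIDATORS)
    return yes * 3 > 2 * len(APPROVED_VALIDATORS)

votes = {"sky_mavis": True, "google_cloud": True, "animoca": True, "nansen": True}
print(finalize(Block(1, []), votes))  # True: 4 of 5 validators approved
```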


USPTO leverages data analytics, AI, and machine learning to increase efficiency and performance


The United States Patent and Trademark Office (USPTO) is leveraging data analytics and technologies such as AI and machine learning (ML) to increase its performance and improve the quality of systems and processes.

To supplement the technology, the agency relies on input from hundreds of experienced workers, captured actively and passively, to train and refine AI-driven models and ensure the technology delivers the expected outcomes.

The agency has awarded more than 11 million patents since its founding. It employs more than 12,000 people, including attorneys, engineers, analysts, and computer specialists. A constant flow of feedback from its patent examiners on the front lines is also used to improve AI/ML models to fuel the development of new products and support activities in two key areas: patent search and classification.

Read More: Zuckerberg Announces PyTorch Foundation To Accelerate Progress In AI Research

Performing a comprehensive patent search is challenging given the explosion in the volume of data and possible sources of the prior art. To meet the challenges, technology teams are rolling out an artificial intelligence component in a new patent search tool to assist examiners in finding the most relevant sources they need as they scrutinize applications. 

This is important because each of the over 600,000 applications received yearly by the USPTO contains approximately 20 pages of figures and text, or roughly 10,000 words describing the claimed innovations. The agency’s IT organization also developed a classification tool that matches the classification symbols identified with an invention from more than 250,000 possible categories.
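
As a rough sketch of what AI-assisted prior-art search can look like, the snippet below ranks a toy corpus of patent abstracts against a query using TF-IDF cosine similarity; it is an illustrative baseline, not the USPTO's actual tool, and the corpus is invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-corpus of prior-art abstracts (illustrative only).
corpus = [
    "A lithium-ion battery cell with a solid electrolyte layer.",
    "A neural network accelerator using systolic arrays.",
    "A drone navigation system fusing GPS and inertial sensors.",
]
query = "hardware accelerator for deep learning inference"

# Fit on corpus plus query so both share one vocabulary, then rank by
# cosine similarity: higher scores mean more relevant prior art.
vec = TfidfVectorizer().fit(corpus + [query])
scores = cosine_similarity(vec.transform([query]), vec.transform(corpus))[0]
for score, doc in sorted(zip(scores, corpus), reverse=True):
    print(f"{score:.2f}  {doc}")
```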

In both instances, the models were developed, and are continually enhanced, through input from human experts, who provide the human judgment needed to determine whether something is genuinely new and then apply law, facts, and expertise to reach a decision.


Germany’s KBA finds abnormalities in Tesla’s autopilot function 


Germany's federal motor transport authority, KBA, found abnormalities while investigating Tesla's autopilot function. According to WirtschaftsWoche, some of the problems found during the investigation, which has been running since the start of 2022, have been remedied, while for others further remedial measures are still being tested and validated.

Recently, Elon Musk-run Tesla was also hit by a class-action lawsuit in the US over a phantom braking problem that, the suit alleges, has turned a safety feature into a frightening and dangerous nightmare.

The lawsuit accused the electric car-maker of hiding the safety risks associated with the company’s Autopilot driver assist system, thus breaching its warranties, unfairly profiting from Autopilot, and violating California’s unfair competition law.

Read More: Zuckerberg Announces PyTorch Foundation To Accelerate Progress In AI Research

Phantom braking is when a self-driving system or an advanced driver assistance system (ADAS) applies the brakes for no good reason. The system can falsely detect an object on the road or anticipate a collision that will not happen and apply the brake in order to avoid it.
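
To see why a false detection leads to an unnecessary stop, consider a minimal time-to-collision check like the sketch below; the threshold and logic are simplifying assumptions for illustration, not Tesla's Autopilot code.

```python
from typing import Optional

def should_brake(detection: Optional[dict], ttc_threshold_s: float = 2.0) -> bool:
    """Brake when a detected object's time-to-collision is below a threshold."""
    if detection is None:
        return False  # nothing detected, nothing to avoid
    d, v = detection["distance_m"], detection["closing_speed_mps"]
    return v > 0 and d / v < ttc_threshold_s

# Genuine hazard: 20 m ahead, closing at 15 m/s -> TTC ~1.3 s, brake correctly.
print(should_brake({"distance_m": 20, "closing_speed_mps": 15}))  # True
# Phantom braking: the perception stack falsely reports an object (e.g. a
# shadow or overpass), and the same logic slams the brakes for no reason.
print(should_brake({"distance_m": 10, "closing_speed_mps": 20}))  # True, wrongly
```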

The lawsuit comes as Tesla already faces a federal investigation from the National Highway Traffic Safety Administration (NHTSA) over its phantom braking problem, which first surfaced last fall. The US transport agency investigated over 400,000 Tesla EVs for problems with their automated emergency braking systems.

The US government has received over 750 complaints of unexpected braking from Tesla owners. According to the lawsuit, many Tesla owners have reported significant, unexpected slow-downs and stops due to the false engagement of their Class Vehicles' braking systems, even though no objects were nearby.


Zuckerberg announces PyTorch Foundation to Accelerate Progress in AI Research


Mark Zuckerberg announced that the community-driven artificial intelligence (AI) research framework PyTorch would transition to a newly launched PyTorch Foundation. The latter will be part of the nonprofit Linux Foundation, a technology consortium whose core mission is the collaborative development of open-source software.

Since 2016, when Meta partnered with the AI community to create the PyTorch framework for AI research, open collaboration has been crucial to its success. With thousands of contributors who have built over 150,000 projects on it, PyTorch has become one of the top platforms for research and production across the AI community.

The creation of the PyTorch Foundation will ensure that decisions are made transparently and openly by a diverse group of board members for years to come. The governing body will consist of representatives from AMD, Meta, Microsoft Azure, Amazon Web Services, Google Cloud, and Nvidia, and is expected to expand further over time.

Read More: GitHub’s AI-Powered Copilot Increases Developer Flow And Satisfaction, Research Finds

PyTorch was built with an open-source, community-first philosophy that will not change with the transition to the Foundation. When researchers and developers open-source their code, others worldwide can share their work, learn from each other’s advances and then contribute back to the AI community.

Meta will continue to invest in PyTorch and use it as the primary framework for its AI research and production. The transition does not involve any changes to PyTorch's code, core project, or developer operating models.

In the future, the framework’s contributors will benefit from the robust governance, diverse leadership, and additional investments provided by the new PyTorch Foundation partners. The Foundation will adhere to four principles: remaining open, maintaining neutral branding, staying fair, and forging a solid technical identity. One of its top priorities will be to keep a clear separation between the business and technical governance of PyTorch.


Quentin Tarantino to settle lawsuit by Miramax over Pulp Fiction NFTs


Hollywood filmmaker Quentin Tarantino was sued by American producer Miramax in November of last year after base-layer blockchain company Secret Network announced the sale of unedited screenplay scenes from the 1994 movie Pulp Fiction as non-fungible tokens (NFTs). The movie company asserted ownership of all rights to Pulp Fiction other than those allocated to Tarantino, which, it argued, did not include nonfungible tokens. This case was one of the first intellectual-property conflicts involving well-known assets minted as NFTs.

An application submitted on Thursday in federal court in California states that the Academy Award-winning filmmaker and Miramax intend to file dismissal papers within two weeks. Meanwhile, Tarantino and Miramax released a joint statement announcing that they had resolved the issue and were looking forward to working together again, including on NFT projects.

In a June filing, Tarantino argued that the lawsuit should be dismissed because the NFTs are based purely on his script, which is protected by a separate copyright. He claimed that the NFTs were associated with media from the script, to which he still held the rights, while Miramax filed its lawsuit on the basis that NFTs were an "emerging technology" from which it could legally benefit, given that the deal was consummated in 1996, when NFT technology did not exist.

In the lawsuit, Miramax's lawyer stated that Tarantino's limited reserved rights were "far too limited for him to unilaterally produce, market, and sell the Pulp Fiction NFTs." The workings of the NFTs were also at the center of the legal dispute, because Miramax was unaware of the distribution of any token-gated content.

Read More: Solana-based Phantom Wallet introduces Burn Factor to Discard Spam NFTs

According to the filing, the NFTs include handwritten Pulp Fiction screenplays and Tarantino commentary. The NFTs would incorporate unique, never-before-seen content from Tarantino’s script that only the NFT’s holder could view via Secret Network’s privacy-focused Secret NFTs technology. As per Secret Network, the first of the seven tokens sold for $1.1 million in January. Secret Network canceled the remaining auctions a few days later, citing “extreme market volatility.”

Many people view the incident as a prime example of Web2 copyright regulations being used in a Web3 environment.
