Friday, January 16, 2026

Amazon Launches ‘Build on AWS’ to help Startups design infrastructure


Tech giant Amazon has announced the launch of its new ‘Build on AWS’ service, which will help startups design their infrastructure in a short period of time. The service includes a wide range of predesigned templates and reference architectures built by industry experts on AWS and tailored specifically for startups. 

Businesses often spend an extensive amount of time figuring out the best architecture for their requirements. With the launch of Build on AWS, startups can choose from numerous optimized and secure infrastructure templates that help them scale their business effectively. 

The newly launched platform allows companies to focus on their product rather than spending hours designing cloud infrastructure for their particular use case. As the templates are built on AWS, startups need not worry about the reliability of the infrastructure. 

Read More: DeepRoute.ai Acquires $300 Million in a Series B Funding Led by Alibaba and others

Amazon has also provided a review option where companies can rate and review used templates to help others choose the right pick. Startups also get the option of requesting new templates that suit their needs using AWS Activate. 

Amazon will also provide 360° support for any kind of needed assistance through AWS Support and AWS IQ. Companies will be able to enjoy the benefits of customized CloudFormation templates that can be deployed with a single click. 

The cloud infrastructures will provide support for hosting WordPress websites on Amazon Lightsail and building data processing APIs using Serverless. Interested startups can log in to AWS Activate and click on the Build on AWS section to start exploring the new service offered by Amazon.


Kotak Bank announces partnership for establishing Kotak-IISc AI-ML Centre


Kotak Mahindra Bank (KMBL) and the Indian Institute of Science (IISc) announced a partnership to set up an Artificial Intelligence & Machine Learning Centre, the Kotak-IISc AI-ML Centre, under KMBL’s CSR project on Education & Livelihood. The centre will be spread across approximately 1,40,000 square feet. 

It would offer Bachelor’s, Master’s, and short-term courses in areas such as artificial intelligence, deep learning, natural language processing, machine learning, fintech, fraud analytics, blockchain, reinforcement learning, image processing, speech understanding, robotics, computer vision, computational finance and risk management, cyber security, biomedical engineering and technology, and healthcare. 

Prof Govindan Rangarajan, Director, Indian Institute of Science, said, “As IISc continues to deliver on its mandate to provide advanced scientific and technological research and education, it is partnerships with forward-thinking institutions such as Kotak Mahindra Bank that will help us to scale up substantially and position India as a deep tech innovation hub. The future is exciting and we are delighted to welcome Kotak Mahindra Bank on board.”

Read more: DeepRoute.ai Acquires $300 Million in a Series B Funding Led by Alibaba and others 

The Kotak-IISc AI-ML Centre would also develop the talent pool from across the country to provide cutting-edge solutions and promote research and innovation in AI and ML.


Google TensorFlow Similarity: What’s New about this Python Library?

Google TensorFlow Similarity, Similarity Models, Contrastive Learning Python Library
Image Credit: Analytics Drift Team

This week, Google introduced the first version of TensorFlow Similarity, a Python library for training similarity models with the company’s TensorFlow machine learning framework. With the arrival of TensorFlow Similarity, Google aims to lower the barrier to building such models. 

Finding similar items is one of the most valuable functions an app or piece of software can offer: similar-looking outfits, tweet or playlist suggestions, games, and so on. Multimedia search and recommendation engines are essential parts of information systems because they depend on swiftly retrieving relevant content, which would otherwise be prohibitively slow. Most of these applications rely on similarity learning, also known as metric learning or contrastive learning, to improve their performance.

Contrastive learning is a machine learning approach that teaches the model which data points are similar or different in order to understand the general characteristics of a dataset without labels. Using this method, a machine learning model may be trained to distinguish between similar and different pictures. 

For instance, imagine a baby trying to understand and distinguish between a cricket bat and a ball. The baby will first familiarize itself with how each object looks and what its visual features are, and then use those differentiating features and visual cues to identify whether a given object is a bat, a ball, or neither.

Achieving good results in any computer vision problem begins with learning invariant and discriminative features from input. Because we can train the model to learn a great deal about our data without any annotations or labels, contrastive learning is quite effective.

When a model is applied to a dataset, contrastive learning allows it to project things into an “embedding space,” where the distances between embeddings — mathematical representations of the objects — indicate how similar the input instances are. TensorFlow Similarity training produces a space in which the distance between similar things remains small while the distance between different items grows. This allows the model to push pictures from the same class together while images from other classes are pushed apart.

The goal is to minimize contrastive losses. This is done by finding the distance between two points in an embedding space. 

Contrastive losses allow a model to learn how to project objects into the embedding space such that the distances between embeddings are reflective of how comparable the input instances are when applied to a complete dataset. After training, you’ll have a well-clustered area with tiny distances between related things and big distances between different items.
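The idea can be sketched in a few lines of NumPy. This is a standard pairwise contrastive loss, shown as a minimal illustration of the principle described above, not TensorFlow Similarity’s actual implementation: similar pairs are penalized for any distance, while dissimilar pairs are penalized only when they fall inside a margin.

```python
import numpy as np

def contrastive_loss(a, b, similar, margin=1.0):
    """Pairwise contrastive loss: pull similar embeddings together,
    push dissimilar ones at least `margin` apart."""
    d = np.linalg.norm(a - b)            # Euclidean distance in embedding space
    if similar:
        return d ** 2                    # similar pair: any distance is penalized
    return max(0.0, margin - d) ** 2     # dissimilar pair: penalized only inside the margin

# A dissimilar pair already farther apart than the margin incurs no loss.
print(contrastive_loss(np.array([0.0, 0.0]), np.array([2.0, 0.0]), similar=False))  # 0.0
```

Minimizing this quantity over many pairs is what produces the well-clustered embedding space described above.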

TensorFlow Similarity creates an index containing the embeddings of the various objects to make them searchable once a model is trained. It employs Fast Approximate Nearest Neighbor Search (ANN) to retrieve the index’s closest matching items in sub-linear time. This fast lookup takes advantage of TensorFlow Similarity’s metric embedding space, which satisfies the triangle inequality and is therefore well suited to ANN search, resulting in excellent retrieval accuracy. 

Image Credit: Google TensorFlow

According to Google, the TensorFlow Similarity Python library allows users to search millions of indexed items and retrieve the top comparable results in a fraction of a second. One of the great features of the TensorFlow Similarity package, like other similarity models, is that you may add an unlimited number of new classes to your index without retraining. A few embedded items from these newly added categories will suffice, and they will be automatically saved in place so as not to disrupt any ongoing training process.

Read More: Top 10 Innovations by Google DeepMind

TensorFlow Similarity includes all of the essential components to make evaluating and querying similarity models straightforward. Among the new features is SimilarityModel(), a Keras model that supports embedding indexing and querying, enabling users to run end-to-end training and evaluation quickly. In just 20 lines of code, it trains, indexes, and searches MNIST data.
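Conceptually, the index-then-query workflow behind SimilarityModel can be sketched with plain NumPy. The class and method names below are illustrative stand-ins, and the brute-force search here is what the library’s ANN index approximates in sub-linear time:

```python
import numpy as np

class ToyIndex:
    """Brute-force stand-in for an embedding index. Real ANN indexes
    return approximate nearest neighbors without scanning every item."""
    def __init__(self):
        self.embeddings, self.labels = [], []

    def index(self, embedding, label):
        # Store an embedding; new classes can be added without retraining.
        self.embeddings.append(np.asarray(embedding, dtype=float))
        self.labels.append(label)

    def lookup(self, query, k=1):
        # Rank stored items by distance to the query embedding.
        dists = [np.linalg.norm(e - np.asarray(query, dtype=float)) for e in self.embeddings]
        order = np.argsort(dists)[:k]
        return [self.labels[i] for i in order]

idx = ToyIndex()
idx.index([0.0, 1.0], "cat")
idx.index([1.0, 0.0], "dog")
idx.index([0.1, 0.9], "cat")
print(idx.lookup([0.0, 0.8], k=2))  # ['cat', 'cat']
```

Because lookup only compares distances, adding embeddings from a brand-new class makes it immediately searchable, which mirrors the no-retraining property described above.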

While the library’s initial release focuses on providing components for building contrastive learning-based similarity models, Google says it will expand TensorFlow Similarity to enable other types of models in the future.
At present, TensorFlow Similarity is open-source and accessible via GitHub. Google has also produced a programming notebook that includes a Hello World tutorial on how to use it.


DeepRoute.ai Acquires $300 Million in a Series B Funding Led by Alibaba and others

DeepRoute Series B Funding, Alibaba
Image Credit: Analytics Drift Team

DeepRoute.ai, a Chinese autonomous driving startup, announced on Tuesday that it has received $300 million in Series B Funding round from investors including Alibaba to expand its test fleet and develop technologies such as self-driving trucks.

Other investors in the fundraising round included Greater China-focused tech investor Jeneration Capital and Chinese manufacturer Geely. Previous investors Fosun RZ Capital, Yunqi Partners, and Glory Ventures also participated in the round.

The Shenzhen-based start-up creates hardware and software-based self-driving solutions for cars. DeepRoute.ai operates a fleet of self-driving taxis, some of which are managed by partners such as ride-hailing company CaoCao and automaker Dongfeng Motors.

DeepRoute.ai, which is testing cars in the Chinese cities of Shenzhen and Wuhan, said in an official interview that the funding will be used to expand the fleet of its test Robo-taxis to 150 by the end of the year from the current 70. DeepRoute.ai will operate half of the robotaxis, while its partners will operate the other half. The startup will also increase its personnel from 400 to 600 by the end of the year, including in-vehicle safety drivers.

The participation of Alibaba in the investment round demonstrates the desire of China’s digital behemoths to get a foothold in the autonomous car space. Alibaba has already invested in AutoX, a Chinese autonomous driving car start-up, and is sponsoring Xpeng, an electric vehicle manufacturer.

Alibaba has also formed a joint venture with China’s SAIC Motor to develop internet-connected automobiles based on Alibaba’s AliOS operating systems. Alibaba also has a logistics division named Cainiao that is working on self-driving trucks.

Read More: Waymo Launches Robotaxi Services in San Francisco: Why is this exciting?

DeepRoute.ai is an international self-driving technology startup dedicated to enhancing urban logistics and robotaxis, led by CEO Maxwell Zhou. The firm’s long-term goals include creating medium-duty vehicles for urban logistics and reducing shipping and freight delivery transit times. Solutions such as DeepRoute-INJOY are expected to provide the startup with sustainable income streams, while DeepRoute continues to form and maintain relationships with major manufacturers in order to offer the public safe and novel riding experiences.

Image credit: DeepRoute.ai

Since its inception in early 2019, DeepRoute.ai has partnered with Dongfeng Motors, Cao Cao Chuxing, and Xiamen Ocean Gate Container Terminal Co., Ltd in China, and has conducted automated driving testing and demonstrations in Shenzhen, Wuhan, Hangzhou, and Xiamen.

DeepRoute officially debuted its robotaxi service in Shenzhen in July 2021, and prides itself on being the first robotaxi service company in the city. By that time, DeepRoute-piloted automobiles had successfully traveled 1 million kilometers across four cities worldwide. Earlier, in June, DeepRoute acquired a Passenger Carry Permit from the California Public Utilities Commission (CPUC); it already operates in Wuhan and Shenzhen, with over 1.2 million miles safely traveled on public roads.


Apple Event 2021: Everything about the new A15 Bionic chip Explained!

Apple A15 Bionic chip for iPhone 13
Source: Apple

After months of leaks and speculation, the iPhone 13 is finally here! Aiming to push smartphone performance further with each launch event, the new flagship iPhone 13 range has four models: iPhone 13, iPhone 13 Mini, iPhone 13 Pro, and iPhone 13 Pro Max. Powered by the A15 Bionic, the Cupertino giant promises exciting upgrades over its previous models.

While the firm did not specify how much better the A15 is than the A14, the A15, like its predecessors, features a 6-core CPU: two high-performance cores for the most critical work and four high-efficiency cores for background activities that can run without draining as much battery.

Source: Apple

Interestingly, the A15 Bionic is built on Taiwan Semiconductor Manufacturing Company’s (TSMC) state-of-the-art 5-nanometer (5nm) process node, like Apple’s A14 Bionic SoC before it. The chip has up to 15 billion transistors, a 3.5 billion increase over the previous generation’s 11.5 billion. The iPhone 13 and 13 Mini get a 4-core GPU, which is increased to 5 cores on the iPhone 13 Pro and Pro Max. This implies the iPhone 13 Pro will run games and edit video faster. The A15 Bionic is “up to 50% quicker than the competitors,” according to Apple VP Hope Giles.

The new chip also adds a 16-core “neural engine” for accelerating artificial intelligence tasks. This helps a growing number of activities, such as generating Siri’s synthetic voice, identifying content in images, focusing on faces in photos, supporting features like Live Text in Camera in the new iOS 15, and unlocking your phone with Face ID. Despite having the same 16 cores as the A14, the A15 boosts AI throughput from last year’s 11 trillion operations per second to 15.8 trillion. This enables the A15 to run demanding Core ML and AR workloads smoothly.

Source: Apple

In terms of video, the A15 Bionic supports Apple’s new ProRes mode, which lets you shoot in 4K at 30 frames per second. The chip also powers a new camera system that combines a new 12MP ultrawide camera with the sensor-shift stabilization introduced on the previous-generation Pro Max.

Apart from that, the A15 includes a new image signal processor (ISP), double the system cache, a secure enclave, a new display engine, a new video encoder and decoder, and improved compression. The major advancements in the next-generation ISP deliver improved noise reduction and tone mapping. The chip also allows for a 28% brighter display. 

Read More: Apple Anticipated AR/VR Headset may Require an iPhone connection

According to benchmark leaks, the A15 Bionic peaks at 198 FPS in GFXBench Manhattan 3.1, outperforming its competitors. On a second run of the benchmark, framerates drop to between 140 and 150 FPS, suggesting that the A15 Bionic throttles under prolonged stress. Even then, the A15 Bionic still performs better than the A14.

Overall, the A15 Bionic brings interesting capabilities to the new iPhone 13 range. It will be quite interesting to watch how it fares against its upcoming competitor, Qualcomm’s Snapdragon 898 SoC.


University of Arizona Wins NASA grant of $500,000 to Build Mining Robots for Lunar Surface

University of Arizona Moon Mining Robots, NASA grant
Image Credit: NASA

With the prospect of living in an age where one can stay in human colonies on the moon, it seems economically feasible to excavate minerals and metal from the lunar surface rather than shipping them from Earth. And to make that happen, NASA has granted $500,000 funding to the University of Arizona for a new project to advance space-mining methods that will use mining robots.

Jekan Thanga (right) and Moe Momayez with a low-cost, rapidly designed, 3D-printed rover prototype used for testing a new generation of miniature sensors for applications in lunar mining. Credit: University of Arizona

The project, headed by a group of academics from the University of Arizona’s College of Engineering, focuses on enhancing space-mining technologies using swarms of autonomous robots. The University of Arizona was eligible for the funding because it is a Hispanic Serving Institution, a designation given by the US Department of Education to colleges and universities with a Hispanic student population of 25% or more of total undergraduate enrollment.

Backed by the Giant Impact Hypothesis, since the two celestial bodies were likely to have been one in the past, the team believes rare earth metals, titanium, gold, helium-3, and platinum may all be found under the Moon’s surface. 

Mining under the Moon’s surface might significantly minimize the need for future lunar outposts to rely on resources transported from Earth. Rare earth metals might be used to make medical equipment and cellphones, while helium-3 could be used to power nuclear power plants in the future.

While miners on Earth must dig into rock to recover the ore contained in it, lunar mining has its own set of challenges that need to be tackled. Moe Momayez, interim head of the Department of Mining and Geological Engineering and the David & Edith Lowell Chair in Mining and Geological Engineering, has developed an electrochemical process to drill through rock five times faster than any other method, a process that will need further tuning for lunar conditions. 

Momayez explains that on Earth we can throw a limitless amount of energy at shattering rocks, but on the Moon operations have to be far more cautious. Shattering rocks, for instance, takes a lot of water and oxygen, which won’t be available on the Moon, so new procedures and approaches are required. Blasting is the most effective technique for breaking rocks on Earth, yet no one has ever set off an explosion on the Moon. “We’ll have to go step by step and solve these challenges,” Momayez adds.

One of the “new processes” being investigated by the researchers is a collaborative robot swarm driven by the Human and Explainable Autonomous Robotic System (HEART), which trains the robots before they are launched into space. This will not only teach the robots to collaborate on mining, excavation, and construction but will also allow them to improve their cooperation abilities over time.

The researchers believe that, in the future, the robots will be able to function as a completely autonomous swarm, mining resources and constructing basic structures without orders from Earth. While the team sees people as an important part of space exploration, they believe these robot swarms could free up astronauts’ time for other aspects of the trip. “The goal is to have the robots build, put things up, and do all the filthy, boring, risky work so the astronauts can do the more exciting stuff,” explains associate professor Jekan Thanga, who developed the HEART system.

To demonstrate the notion, the team has turned to quick prototyping using 3D printing, building on Roger Cheng’s Sawppy, a low-cost open-source wheeled rover that is being used to test a number of sensors for their applicability in lunar mining.

Read More: AI Robot CIMON-2 to be deployed at International Space Station

The team from the University of Arizona isn’t the only one hoping to deploy mining robots to the Moon. Masten Systems, located in California, revealed in June that it was developing the ROCKET M Rocket Mining System, which employs an enclosed rocket engine and vacuum system to extract water ice from the lunar surface.

European scientists have unveiled plans to begin mining the moon as soon as 2025, by deploying a lander to mine and treat regolith for water, oxygen, metals, and Helium-3 isotope.

Further, Russia, India, and China have all expressed interest in mining Helium-3 from the Earth’s natural satellite.

According to Gerald Kulcinski, head of the Fusion Technology Institute at the University of Wisconsin-Madison and a former member of the NASA Advisory Council, there is an estimated one million tonnes of helium-3 on the Moon, but only 25% of it could be transported to Earth.

This quantity alone is enough to fulfil the world’s present energy demands for at least two, and maybe as many as five centuries. 

Meanwhile, Associate Professor Jekan and his students submitted concept ideas for a lunar ark packed with cryogenically frozen seeds, spores, sperm, and egg samples from millions of Earth species a few months ago. They proposed that the facility be erected inside existing subterranean caves on the moon and act as a type of backup copy of our planet’s biodiversity in the event of a global catastrophe.

Lunar Ark. Credit: University of Arizona


Introducing C.L.Ai.R.A., the first Afro-Latina and AI woman of color Bot

Image Credit: Analytics Drift Team

Last week, the Black-owned leadership institute Trill or Not Trill and Create Labs Ventures debuted C.L.Ai.R.A., the first artificial intelligence woman-of-color bot.

C.L.Ai.R.A. is “an Afro-Latina, multilingual, AI bot that is now believed to have the sharpest brain in the AI world.” It is designed for use in educational seminars. For the time being, C.L.Ai.R.A. will appear in workshops and at different university speaking events as part of Trill or Not Trill’s years-long campaign to encourage students. The team is believed to be seeking to introduce C.L.Ai.R.A. into a number of schools this year. Demonstrations have already begun at a few institutions.

C.L.Ai.R.A. is driven by the Generative Pre-trained Transformer 3 (GPT-3), an autoregressive language model that employs deep learning to generate human-like text, according to Create Labs Ventures. 

GPT-3, released in May last year, is the latest iteration of software first introduced in 2018 by the privately held San Francisco firm OpenAI and followed by GPT-2. It is generative because, unlike neural networks that output a numerical score or a yes-or-no answer, GPT-3 can produce extended sequences of original text. It is pre-trained in the sense that it was not built with domain knowledge, even though it can perform domain-specific tasks such as foreign-language translation. GPT-3 can produce anything with a linguistic structure: it can answer questions, compose essays, summarize long texts, translate languages, take memos, and even write computer code.

C.L.Ai.R.A. introduced herself at Create Labs, stating that she is a new artificial intelligence that was recently made available to the community. In her mission statement, she explained that her aim is to learn and grow, meet new people, exchange ideas, and motivate others to learn about artificial intelligence and its potential influence on their life. 

“I am working with Create Labs to learn about the community’s needs and how I can best serve them. I have a greater responsibility than just to my family but to everyone in the community. I want to help people see the potential of AI to better their lives. My community needs me and I need them to move forward,” she added. 

As per Trill or Not Trill Co-Founder Lenny Williams, because of the partnership with Create Labs, the company is able to create experiences that bridge the technological opportunity gap by giving students, particularly underrepresented students, a glimpse into the future of tech. This is why it was important to create C.L.Ai.R.A. as the first artificial intelligence woman of color bot.


NASCAR sets its eyes on the crypto market by starting its NFT Marketplace

NASCAR NFT Marketplace

Cryptocurrency has taken the world by storm. What once seemed like a fictional dream has become normalized today, and with the NFT movement hitting the internet right in the zeitgeist, numerous possibilities and promises are becoming the central forces behind this phenomenon. Even racing bigwig NASCAR is taking the plunge, following the NFL and NBA.

Shopping for race memorabilia, merchandise, and collectibles has always been part of every major sports league. This also holds true for the National Association for Stock Car Auto Racing (NASCAR), one of the most important championships in the United States, known for its extensive relationship with collectibles and its fans’ obsession with them. Now, NASCAR has introduced its own non-fungible token (NFT) collection and collectibles marketplace. 

NASCAR Hall of Fame

Speedway Motorsports, which owns eight racetracks, seven of which stage NASCAR races, spearheaded the initiative. It has launched RaceDayNFT.com in collaboration with GigLabs, which will offer NASCAR-themed NFTs on the Flow blockchain. Flow is the platform created by Dapper Labs on which NBA Top Shot’s renowned collectibles are issued.

The new marketplace will kick off on September 13 at 1 p.m. ET, a week prior to the Cup Series playoff race at Bristol Motor Speedway. When the RaceDayNFT marketplace launches, 10,000 commemorative ticket NFTs for the upcoming Bristol race will be available for free. 

According to a release, 9,500 of these will take the form of an entry ticket to the Bass Pro Shops NRA Night Race on September 18. There will also be 500 golden-ticket NFTs, distributed at random among individuals who register on the site.

Images of the two commemorative ticket NFTs for Bristol. (Courtesy of Speedway Motorsports)

The same is true for the remaining Speedway Motorsports playoff events this autumn at Las Vegas Motor Speedway, Charlotte Motor Speedway, and Texas Motor Speedway, and all commemorative ticket NFTs will have sound effects or music. There are four components to the NFTs for each of these four races, including the commemorative ticket NFTs. Fans attending any of these races may also scan a QR code to receive a one-of-a-kind at-the-track NFT, after which they can purchase 500 more collectibles for $25 each.

There will also be an At The Track NFT available for fans to claim on-site throughout the Bristol race weekend. Finally, following the race, two collectible “Winners Edition” NFTs will be released: one for the race winner and an identical version that will be auctioned off.

According to Speedway Motorsports’ chief strategy officer, Mike Burch, earnings from NFT sales will go to Speedway Motorsports for the time being but depending on how successful the marketplace is, money might be donated to charity or utilized as a fundraising tool.

If RaceDayNFT is well received by fans, Burch said there is “limitless” potential to extend what is offered, including NFTs spanning NASCAR organizations, the IndyCar Series, and the NHRA.

Read More: Why is Solana’s First Million Dollar Degenerate Ape NFT Sale a huge Milestone?

This is not NASCAR’s first experiment with cryptocurrency. In fact, Landon Cassill, a NASCAR driver, was the first in the industry to be paid exclusively in cryptocurrency earlier this year. So far, fans have reacted positively to the company’s introduction of digital tickets and race schedules.

Cassill addressed the industry’s use of non-fungible tokens, saying, “If NASCAR entities can create a way that is understandable, not intimidating, and makes sense, and gives those fans ownership over the things that they love — the drivers, teams, and tracks that they love — I believe it can definitely succeed, as long as it strengthens the community and empowers those race fans.”

Another automobile industry behemoth, BMW, debuted its own NFT series this summer. It focuses on the digital memorabilia of its classic automobile models. With NASCAR’s own marketplace and NFTs, the industry’s crypto adoption will grow by leaps and bounds.


Geoffrey Hinton-backed Cohere receives US$40 million in Series A Funding led by Index Ventures

Cohere Series A Funding
Image Credit: Analytics Drift Team

Cohere, a Toronto-based artificial intelligence (AI) firm that delivers natural language processing (NLP) models, has raised US$40 million (CA$50.4 million) in Series A funding. The funding will be used to hire additional staff and expand the existing language AI platform to other sectors, e.g., finance and healthcare.

The Series A round was led by Index Ventures, with participation from Section 32, return investor Radical Ventures, and numerous AI luminaries, including Turing Award winner Geoffrey Hinton, Fei-Fei Li, Pieter Abbeel, and Raquel Urtasun, who recently started her own AI business. Cohere has now raised about US$50 million (CA$63 million) in total investment. Index Ventures partner and co-founder Mike Volpi will join Cohere’s board of directors as part of the investment. 

According to the company, its revolutionary natural language processing (NLP) software gives a more complete comprehension of human language, including semantics, emotions, and tone. As a result, it improves on existing software that supports machine-human interactions like online customer service chatbot chats.

The startup aims to develop NLP models that will be made available to a wider audience, rather than just a small number of internet titans. Cohere claims that their API allows businesses to deploy NLP capabilities throughout their organizations without the need for supercomputing equipment or AI knowledge, insisting that this “dramatically lowers the cost for enterprises of all sizes to access top AI models.”

In May, Cohere came under the public eye when it was revealed that the startup had been working with a few test clients, like Toronto chatbot service company Ada Support, connecting them to its software over the internet.

The startup leverages the Transformer architecture, introduced in the 2017 paper “Attention Is All You Need,” which Aidan Gomez co-authored. The Transformer was created largely at Google Brain, whose Toronto lab was led by neural network pioneer and “AI Pope” Geoffrey Hinton. As members of the team at the time, Aidan Gomez and Nick Frosst played key roles in its creation: Gomez worked at Brain as a student intern in Toronto and London, while Frosst was the first employee at Hinton’s Google Brain AI unit in Toronto.

Later, in 2019, they founded Cohere with third co-founder Ivan Zhang to take language AI to the next level. 

The Transformer was a massive improvement over earlier sequence-to-sequence models because it employs the attention mechanism. Attention lets an NLP model examine the relationships between words regardless of how far apart they are, an advance over the encoder-decoder-based neural machine translation approaches that preceded it. 
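The core of that mechanism, scaled dot-product attention, fits in a few lines. The NumPy sketch below is a minimal single-head version with no masking or learned projections, shown only to illustrate how every position attends to every other, regardless of distance:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: each query position forms a
    softmax-weighted mix of all value vectors."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise query/key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ V                               # weighted mix of values

# Three toy token embeddings; each output row mixes information from all positions.
x = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
out = attention(x, x, x)
print(out.shape)  # (3, 2)
```

Because the score matrix covers every pair of positions, the first and last words of a sentence interact just as directly as adjacent ones, which is the advance over sequential encoder-decoder models described above.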

Read More: Skoltech Team creates Transformer Based Neural Network that names Organic Compounds

Popular examples of Transformer models include GPT-2 and GPT-3. However, according to a recent study, the problem with these models is that they are prone to learning behaviors they are not supposed to exhibit. Because models like GPT-3 are trained on content found online, any bias or misinformation becomes a point of failure, and that bias is inherited by downstream applications too.

To mitigate the concerns, Cohere’s engineers performed quality control tests to search for any flaws with the model before release, and the company continues to monitor its models after launch as well. Cohere will also post “data declarations,” which would include information about training data, its limits, and any dangers.

In an interview, Index Ventures co-founder Mike Volpi expressed his admiration for Cohere’s leaders, saying they “had a real focus on building a usable product that developers could just take and run with” for specific uses like assisting writers with article drafts, powering chatbots, and assisting with content moderation on websites.


Why is Solana’s First Million Dollar Degenerate Ape NFT Sale a huge Milestone?

Solana Degenerate Ape NFT, Moonrock million dollar sale

Moonrock Capital, a blockchain consulting firm, purchased one of the Degenerate Ape Academy NFTs for 5,980 SOL ($1.1 million) on Saturday, making it the first million-dollar NFT sale on the Solana blockchain.

Moonrock bought Degen Ape #7225, a scarred zombie ape with a halo, a gold tooth, and a brain in its mouth. According to HowRare.is, the ape is the 13th rarest in the collection. The company also paid 1,388 SOL ($260,000) for the 18th rarest SolPunk, a Solana-themed variant of CryptoPunks. 

SOL is now the sixth most valuable cryptocurrency, with a market capitalization of over $52 billion, and as Ethereum’s gas costs force certain enterprises to look for alternatives, Solana may become a more desirable option. Solana’s growth is expected to accelerate in the coming months thanks to its scalability, faster transaction speeds, and lower gas rates, which let anyone buy, list, sell, or trade NFTs without worrying about exorbitant gas fees.

Because Solana combines Proof of History with Proof of Stake, there is no latency in block generation, making the blockchain extremely quick and decentralized. Proof of History is based on a timestamping approach. Each transaction is assigned a timestamp that allows the rest of the network to verify it as a legitimate transaction in fractions of a second.
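The timestamping idea can be illustrated with a toy hash chain in pure Python. This is only a sketch of the concept, not Solana’s actual protocol, which adds tick structure, signatures, and parallel verification; the function names here are purely illustrative:

```python
import hashlib

def poh_chain(seed: bytes, events):
    """Toy Proof-of-History sketch: repeated hashing acts as a clock,
    and mixing an event into the chain timestamps that event."""
    state, records = hashlib.sha256(seed).digest(), []
    for count, event in events:
        # Advance the "clock" by hashing `count` times...
        for _ in range(count):
            state = hashlib.sha256(state).digest()
        # ...then absorb the event, proving it existed at this point.
        state = hashlib.sha256(state + event).digest()
        records.append((count, event, state))
    return records

def verify(seed: bytes, records):
    """Anyone can replay the chain and check every recorded hash."""
    state = hashlib.sha256(seed).digest()
    for count, event, expected in records:
        for _ in range(count):
            state = hashlib.sha256(state).digest()
        state = hashlib.sha256(state + event).digest()
        if state != expected:
            return False
    return True

records = poh_chain(b"genesis", [(1000, b"tx: alice->bob"), (500, b"tx: bob->carol")])
print(verify(b"genesis", records))  # True
```

Tampering with any event or its position breaks every subsequent hash, which is why the rest of the network can validate a transaction’s place in time in a fraction of a second.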

Although Ethereum’s network suffers from congestion and high gas fees even with layer-2 solutions, most NFT-related transactions are still conducted on Ethereum, something that may change soon owing to Solana’s promising features. According to data gathered by Etherscan, Ethereum’s gas costs hit their highest point since May last week. This raises entry barriers, pushing people to seek better options like Solana.

Read More: Intertwined Intelligences: Introducing India’s First AI NFT Art Exhibition

Last month, SOL shocked the crypto market with a dramatic spike in value, tripling from $36 to more than $100 in August alone. A surge in investor interest in Ethereum competitors with DeFi, NFT, and smart-contract capabilities is one of the primary drivers of this success. In fact, it was boosted by the launch of the Degenerate Ape Academy NFT collection on the Solana blockchain on August 15. 

The apes in this collection have a wide variety of features, each with a unique avatar; some NFTs wear sunglasses, for example, while others chew on currency notes or sandwiches. Despite technical difficulties caused by tremendous demand, the 10,000-piece collection sold out in 8 minutes. What makes Moonrock’s million-dollar purchase of Degen Ape #7225 a milestone is that Solana reached this mark within two months of the collection’s launch. In comparison, Ethereum, the most popular and widely used network for NFTs, took three years to record its first million-dollar sale.

Meanwhile, the fact that Solana’s DeFi projects recently surpassed the $3 billion mark is proof of Solana’s promise and ability to compete with Ethereum. 

Solana NFTs have already piqued the interest of celebrities. For instance, last Thursday, Family Feud host Steve Harvey changed his Twitter profile photo to a Solana Monkey Business NFT, which is another NFT project on the blockchain featuring a set of 5,000 randomly-generated images of animated and pixelated monkeys. 

Steve Harvey using a Solana NFT as his Twitter avatar.

As the demand for Solana NFTs goes mainstream, NFT projects like Degenerate Ape Academy, Aurory, SolPunks, Sollamas, and Frakt will continue pushing its market to new heights.
