Software company SAP India announces a partnership with tech giant Microsoft to launch a free tech-skilling program for underserved young women in India. The program, named TechSaksham, will enable underprivileged women to build careers in technology.
The companies aim to train more than 62,000 women in artificial intelligence, machine learning, cloud computing, web designing, and digital marketing. Microsoft has tied up with ATAL, the AICTE Training and Learning Academy, and several state education departments to provide faculty to train the enrolled students.
Chairman of AICTE, Anil D Sahasrabudhe, said, “More than 60,000 women being trained will create a massive impact. Moreover, with more than 1,000 women faculty certified, it will bring big-ticket changes not only in the employability of graduates but will encourage many more students to start their startup journey.”
He added that the new initiative would transform the landscape of workplaces across India. The TechSaksham initiative will train 1,500 teachers every year, and each teacher will educate over 50 students, resulting in the training of nearly 75,000 students each year.
The core curriculum will cover the application of these technologies through activity-based assignments under expert trainers. It will also focus on the holistic development of the students to ensure their job readiness.
The learners will get a chance to demonstrate their work to many industry professionals and government officials.
President of Microsoft India, Anant Maheshwari, said, “At Microsoft, we are deeply committed to democratizing access to technology. As we rapidly move towards a digital economy, the skills of the future will look very different from the skills that are needed today. Digital fluency will not just be a competitive advantage but a necessity to qualify for jobs.”
Telangana’s artificial intelligence accelerator program ‘Revv Up’ selects 42 startups that will work closely with the government to develop solutions for complex business problems.
The 42 startups selected under the Revv Up project will receive help from the government and industry to scale up their businesses. They will also receive expert mentorship, technology, and IP support from industry leaders.
The Telangana state government selected startups from eleven different industries across ten states. The NASSCOM-powered project handpicked selectors to scrutinize the applicants’ use cases before finalizing the results.
Principal Secretary of the Telangana government, Jayesh Ranjan, said, “The participating entrepreneurs and their startups represent innovations powered by artificial intelligence. From addressing noninvasive healthcare solutions to climate modeling and new ways to support learning, these founders will greatly impact how our future will be shaped.”
According to officials, more than 50% of the selected startups are based outside Telangana; these startups plan to open local offices in the region by 2022. Over 70% of the startups have already received funding, and 25% of the selected companies have at least one female founder.
The first phase of the Revv Up program will commence its operations this month, and the second phase is expected to be launched in the first quarter of the next year. According to the government report, the program will start accepting applications for its second round in February 2022.
The President of NASSCOM, Debjani Ghosh, said, “NASSCOM’s close partnership with Telangana will provide significant impetus for startups and their aspirations to become the artificial intelligence garage for the world.”
Courier and e-commerce services company FedEx is now using an artificial intelligence and robotics-powered fulfillment system. FedEx wants to revolutionize the e-commerce delivery system by deploying autonomous robots to handle the innumerable small package orders it processes on a daily basis.
The company has partnered with robotics solutions developer Berkshire Grey to build the Robotic Product Sortation and Identification (RPSi) system that has been on trial at FedEx Ground’s station in Queens, New York.
The artificial intelligence system is capable of identifying, sorting, and collecting small and medium-sized packages, including polybags, tubes, padded mailers, and various other objects. Traditionally, this process is done manually, which is quite time-consuming; the new technology enables FedEx to drastically reduce order processing and delivery times.
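As a rough illustration of automated sortation logic, the sketch below routes a scanned package by type and size. The categories, size thresholds, and chute names are hypothetical illustrations, not Berkshire Grey’s or FedEx’s actual system.

```python
# Toy sortation routing: package kinds, thresholds, and chute names
# are invented for illustration only.

def classify_package(kind: str, longest_side_cm: float) -> str:
    """Route a scanned package to a sort chute by type and size."""
    sortable = {"polybag", "tube", "padded_mailer", "box"}
    if kind not in sortable:
        return "manual_handling"      # unrecognized items fall back to humans
    if longest_side_cm <= 30:
        return "small_chute"
    if longest_side_cm <= 60:
        return "medium_chute"
    return "manual_handling"          # oversized items are not auto-sorted

print(classify_package("polybag", 25))   # small_chute
```

In a real system the `kind` and size would come from computer-vision scans rather than arguments, but the decision step reduces to a routing function like this one.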
Managing Director of Operations Technology and Innovations at FedEx, Ted Dengel, said, “As an industry leader in technology and automation, we see the significant benefits that next-generation innovation brings in terms of enabling increased safety and productivity, enhancing customer service and improving flexibility to adjust to changing package volumes and sizes.”
He further added that the company is satisfied with the accuracy the artificial intelligence system has shown and plans to deploy it at other FedEx shipment facilities, in locations such as Columbus, Ohio, and Las Vegas, in the near future.
Berkshire Grey is a Massachusetts-based material handling and omnichannel fulfillment automation company founded by Tom Wagner in 2013. The firm specializes in developing automation solutions for the e-commerce industry using artificial intelligence, machine learning, computer vision, and novel sensing.
Senior Vice President of Berkshire Grey, Jessica Moran, said, “With our patented HyperScanner optical identification modules, barcodes can be read from any angle in milliseconds – all without manual intervention.” She also added that their platform can seamlessly carry out the sorting process, which reduces labor costs considerably.
Humanity (Fall of the Damned), Second State by Scott Eaton (Source: PR Handout)
Recently, Terrain.art launched ‘Intertwined Intelligences,’ India’s first artificial intelligence non-fungible token (NFT) art exhibition, which showcases the relationship between artificial intelligence (AI) and human creativity. The exhibition is on display till August 20 and features six global artists pioneering AI — Pindar Van Arman, David Young, Scott Eaton, Harshit Agrawal, Sofia Crespo, and Feileacan McCormick from Entangled Others Studio.
Terrain.art is a blockchain-powered online platform that focuses on art from South Asia. Intertwined Intelligences, curated by Harshit Agrawal, is the initial phase in Terrain.art’s commitment to creating an environment for artists working in emerging art forms. These include generative art, neural art, machine learning, and AI-assisted art, which together aim to stimulate critical thinking about the kind of future the world of art should create using such technologies.
Harshit came into the limelight with his project “The Anatomy of Dr. Algorithm,” in which he fed photos of surgeries into an algorithm and employed artificial intelligence to produce Rembrandt-inspired art from images of everything from organs to fibroids. For the Terrain.art exhibition, he curated 3,000 landscape paintings and trained a model on the visual patterns within them to generate its own set of landscape paintings.
Latent Landscapes 4 by Harshit Agrawal (Source: PR Handout)
Sarah_15416 by Pindar Van Arman (Source: PR Handout)
Tiny_Networks_of Everything by Sofia Crespo. (Source: PR Handout)
He explains that the artist can reconfigure their artwork by feeding raw creative inputs into a GAN (generative adversarial network), a machine learning framework in which two competing models translate the artist’s inputs into visual media. The process involves a lot of back-and-forth, but the human has the final word. Through this method, artificial intelligence offers artists a fresh set of frontiers to discover and cross.
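The adversarial back-and-forth he describes can be sketched with a toy example. Below, a minimal GAN in plain Python imitates numbers drawn from a target distribution instead of images: a one-parameter-pair generator tries to fool a simple discriminator, which in turn learns to tell real samples from fakes. This is an illustrative toy with assumed hyperparameters, not the models the exhibition’s artists actually use.

```python
import math
import random

def sigmoid(u: float) -> float:
    """Numerically stable logistic function."""
    if u >= 0:
        return 1.0 / (1.0 + math.exp(-u))
    eu = math.exp(u)
    return eu / (1.0 + eu)

random.seed(0)

# "Real artwork" here is just numbers clustered around 4.0.
real = lambda: random.gauss(4.0, 0.5)

a, b = 1.0, 0.0      # generator G(z) = a*z + b
w, c = 0.1, 0.0      # discriminator D(x) = sigmoid(w*x + c)
lr = 0.01

for _ in range(5000):
    z = random.gauss(0.0, 1.0)
    x, g = real(), a * z + b

    # Discriminator ascent: push D(real) toward 1, D(fake) toward 0.
    d_real, d_fake = sigmoid(w * x + c), sigmoid(w * g + c)
    w += lr * ((1 - d_real) * x - d_fake * g)
    c += lr * ((1 - d_real) - d_fake)

    # Generator ascent (non-saturating loss): make D call fakes real.
    d_fake = sigmoid(w * g + c)
    a += lr * (1 - d_fake) * w * z
    b += lr * (1 - d_fake) * w

samples = [a * random.gauss(0.0, 1.0) + b for _ in range(1000)]
print(sum(samples) / len(samples))   # drifts toward the real mean of ~4
```

The generator’s outputs drift toward the real data as the two models push against each other; at a vastly larger scale, the same dynamic produces the neural landscapes described above.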
This is not the first time India has hosted an exhibition of art influenced by artificial intelligence. Aparajita Jain, Founder of Terrain.art and Co-Director of Nature Morte, organized India’s first AI exhibition, titled “Gradient Descent,” at Nature Morte in Delhi in August 2018, with all of the artworks created by AI in cooperation with artists. Today, Intertwined Intelligences on Terrain.art has added a new dimension by displaying works that have both physical and digital equivalents and are protected on blockchain using NFTs.
Aparajita says, “Human existence and digital technologies are no longer separable. Our lives are deeply intertwined within a web of technology, which itself is rapidly cultivating an intelligence of its own, seeded by human intelligence and mined data.”
The artwork on display includes 3D creatures that resemble living forms but were created using algorithms, as well as portraits painted by trained robots. This demonstrates how unlimited artistic ingenuity can be. Aparajita also predicts that gaming art will take over in the future when humans will be able to explore AI-generated augmented realities creatively.
The world of art is experiencing a quiet revolution, with non-fungible tokens (NFTs) allowing aesthetically talented artists to command top pay for their work. Meanwhile, the combination of Covid-19 isolation and cryptocurrency earnings provided a strong motivation for digital-positive collectors to compete for these NFTs, and some creators are profiting handsomely. NFTs also save time and money in procuring or selling artwork and avoid the extra costs of pieces damaged in the process. Additionally, since blockchain and cryptocurrencies operate in a decentralized marketplace, buyers of digital artwork and NFTs are largely unaffected by the traditional art and craft market.
A recent test conducted by Singapore’s Government Technology Agency found that an artificial intelligence (AI) system outperformed humans at writing phishing emails. The study was presented during the Black Hat and Defcon security conferences held in Las Vegas earlier this month.
Two hundred employees were sent two phishing emails, one generated by the artificial intelligence system and one written by humans. Surprisingly, most of the employees fell for the phishing email generated by the artificial intelligence.
Researchers used OpenAI’s deep learning model GPT-3 and other artificial intelligence technologies to build the program. Eugene Lim, a government cybersecurity specialist, said that it can take millions of dollars to train such artificial intelligence models to high levels of accuracy.
“But once you put it on AI-as-a-service, it costs a couple of cents, and it’s really easy to use—just text in, text out. You don’t even have to run code, you just give it a prompt, and it will give you output. So that lowers the barrier of entry to a much bigger audience and increases the potential targets for spearphishing.”
The artificial intelligence pipeline focuses on personality analysis to generate phishing emails: OpenAI’s GPT-3 platform analyzes an individual’s tendencies and generates text tailored accordingly.
Leveraging this capability, researchers developed tools for creating phishing emails that, to a certain extent, outperform human-written ones.
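The “text in, text out” workflow the researchers describe can be sketched as follows. The prompt template and the `generate()` stub are hypothetical placeholders used only to show the mechanism; the actual study called OpenAI’s hosted GPT-3 API, which requires an approved account and API key.

```python
# Hypothetical sketch of prompt assembly for a hosted text-generation
# service. generate() is a stand-in stub, not a real API client.

def build_prompt(name: str, role: str, interest: str) -> str:
    """Assemble a personalized email prompt from basic target traits."""
    return (
        f"Write a short, friendly email to {name}, a {role} "
        f"who cares about {interest}, asking them to review a report."
    )

def generate(prompt: str) -> str:
    """Stand-in for an AI-as-a-service endpoint: text in, text out."""
    return f"[model output for: {prompt[:40]}...]"

prompt = build_prompt("Alex", "finance manager", "quarterly budgets")
print(generate(prompt))
```

The point the researchers make is precisely that this step is trivial: no model training or code beyond string assembly is needed once generation is offered as a service, which is what lowers the barrier for attackers and defenders alike.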
OpenAI officials said, “We grant access to GPT-3 through our API, and we review every production use of GPT-3 before it goes live. We impose technical measures, such as rate limits, to reduce the likelihood and impact of malicious use by API users.”
They further mentioned that the misuse of language models is an industry-wide issue, and that they are working meticulously towards the deployment of safe and responsible artificial intelligence technologies.
The National Security Agency (NSA) of the United States partners with the Department of Defense (DoD) for a joint evaluation of federal artificial intelligence use.
The assessment will focus on the integration of artificial intelligence in strategic operations like insights collection from foreign communications and information related to weapon systems.
Both bodies will also scrutinize the country’s artificial intelligence framework and the ethical use of AI. A recent announcement on the DoD’s official website confirms these developments.
DoD officials said, “We may revise the objective as the evaluation proceeds, and we will also consider suggestions from DOD and National Security Agency management on additional or revised objectives.”
Last year, the DoD started a similar review program, but that review has now been terminated in favor of this new evaluation program in collaboration with the NSA.
An official said, “In this case, given the objective as stated in our announcement memo, we determined that it is a better use of taxpayer resources to conduct our oversight jointly with the National Security Agency.”
He further added that the Department of Defense Office of Inspector General considers a wide variety of factors when deciding when to conduct oversight operations or cancel previously announced programs.
Recently, the NSA also signed a secret deal with tech giant Amazon worth $10 billion. Under the deal, codenamed ‘WildandStormy,’ Amazon Web Services will provide the NSA with cloud computing services.
Chinese multinational electronics company Xiaomi launches its new open-source, bio-inspired quadruped robot, CyberDog. The robot was unveiled during Xiaomi’s launch event held on 10th August.
The launch marks Xiaomi’s entry into the entirely different domain of robotics. Robotics enthusiasts in the worldwide open-source community can now collaborate and compete with Xiaomi engineers on the development of quadruped robots.
During the launch event, Xiaomi said, “CyberDog can analyze its surroundings in real-time, create navigational maps, plot its destination, and avoid obstacles. Coupled with human posture and face recognition tracking, CyberDog is capable of following its owner and darting around obstructions.”
CyberDog weighs around 14 kilograms and is powered by NVIDIA’s Jetson Xavier AI platform, which has 384 CUDA cores and 48 Tensor cores. The robot carries an array of different sensors and multiple cameras, including Intel’s RealSense D450, enabling it to move autonomously.
It also has a GPS unit installed to aid its movements. Its motors can generate a peak torque of 32 Nm, allowing it to move at a maximum speed of 11 km/h. With these technologies, CyberDog can map and navigate its surroundings in real time while avoiding obstacles.
According to Xiaomi, the robot dog can perform many complex movements like backflips. CyberDog comes with one HDMI and three USB-C ports that allow it to connect with various tools like LiDAR sensors, panoramic cameras, and lots more.
Xiaomi plans to release 1,000 units of CyberDog in the initial stage at a price of $1,544. The company will also build a ‘Xiaomi Open Source Community’ that would enable developers to share their progress with a worldwide audience.
Real estate services firm JLL acquires Israeli real estate technology platform Skyline AI. No information has been shared regarding the valuation of the deal.
Skyline AI uses machine learning and artificial intelligence tools to streamline the fragmented data available in the commercial real estate sector, helping investors analyze and find real estate opportunities.
According to officials, the deal will close soon after the completion of a few remaining formalities. JLL plans to combine the expertise of both companies to provide the best possible insights to its consumers.
JLL’s Global CEO of Capital Markets, Richard Bloxam, said, “When you combine the intelligence of the best advisors on the ground with a quantitative expert team and AI data analysis, you get insights that are beyond human and create a competitive edge for JLL and our clients.”
He further added that this acquisition would enable JLL to provide innovative and strategic advice to its clients. Integrating Skyline AI’s technology with JLL’s services will also allow the company to accurately predict future property valuations, identify better investment opportunities, improve cost-saving, and help make informed business decisions.
JLL traces its roots to an auction house founded by Richard Winstanley in 1783. The company specializes in providing real estate services that help customers make better decisions about investments and future real estate trends.
Co-CEO of JLL Technologies, Yishai Lerner, said, “Our teams consist of knowledgeable real estate experts and world-class technologists who successfully bring new AI offerings like Skyline AI into the fold and provide the best insights to our clients, accelerating JLL’s leadership in CRE technology.”
He also mentioned that the acquisition is a step towards JLL’s goal of accelerating growth by investing in prop-tech enterprises.
The CEO of Skyline, Guy Zipori, said that JLL would help them achieve their vision of revolutionizing the real estate industry using artificial intelligence tools.
It has been nearly a decade since the first autonomous vehicle (AV) hit the road. Autonomous vehicles are attracting a lot of attention because of the convenience and safety benefits they promise. However, a fully autonomous vehicle is yet to progress beyond the testing stage. One of the biggest hurdles is not the artificial intelligence algorithms themselves but the scarcity of training data for fog, rain, and snow.
Today, hundreds of companies are testing self-driving cars, trucks, and other vehicles, but most rely on road-condition data captured on clear, sunny days. While the majority of autonomous vehicles have performed outstandingly on such test data, making an automobile navigate rapidly changing road and weather conditions, especially in heavy snowfall, fog, or rain, poses a tremendous challenge.
Driverless vehicles rely on sensors to view street signs and lane dividers, but inclement weather can make it harder for them to “see” the road and make correct decisions when cruising at high speeds. An autonomous car typically uses three sensor types, camera, radar, and LiDAR, to perceive everything around it. The cameras help it obtain a 360-degree view of its surroundings, recognize objects and people, and determine their distance. Radar aids lane keeping and parking by detecting moving objects and calculating distance and speed in real time. LiDAR uses lasers instead of radio waves to create and map 3D images of the surroundings, giving a 360-degree view around the car.
However, light showers or snowflakes can cause LiDAR systems to malfunction and lose accuracy. The vehicles also depend heavily on data gathered from optical sensors, which are less reliable in bad weather.
Therefore, bad-weather data will play a crucial role not only in safety-critical AV decisions like disengagement and operational design domain designation but also in more basic tasks like lane management.
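The trade-off between sensors in bad weather can be sketched as a simple weighted fusion. The reliability weights below are illustrative assumptions, not values from any production AV stack: in rain or fog, camera and LiDAR estimates are down-weighted, and radar, which penetrates precipitation, dominates.

```python
# Toy weather-aware sensor fusion: per-sensor reliability weights are
# invented for illustration only.

WEIGHTS = {
    "clear": {"camera": 1.0, "lidar": 1.0, "radar": 1.0},
    "rain":  {"camera": 0.3, "lidar": 0.2, "radar": 1.0},
    "fog":   {"camera": 0.1, "lidar": 0.3, "radar": 1.0},
}

def fuse_distance(readings: dict, weather: str) -> float:
    """Weighted average of per-sensor distance estimates (meters)."""
    w = WEIGHTS[weather]
    total = sum(w[s] for s in readings)
    return sum(readings[s] * w[s] for s in readings) / total

readings = {"camera": 48.0, "lidar": 50.0, "radar": 52.0}
print(fuse_distance(readings, "clear"))  # 50.0
print(fuse_distance(readings, "rain"))   # closer to the radar estimate
```

Real AV stacks use far richer probabilistic filters, but the core idea is the same: the fused estimate should lean on whichever sensors the current weather leaves trustworthy, and learning those reliabilities requires exactly the bad-weather data discussed here.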
Despite the pressing need to include bad-weather data in training datasets, little has been publicly available. To address this, the RADIATE (RAdar Dataset In Adverse weaThEr) project, directed by Heriot-Watt University, has released a new dataset that will aid the creation of autonomous vehicles that can drive safely in adverse conditions. The team drew inspiration from sensors that have already proven themselves in Scotland’s rain, snow, and fog. Their goal is to ease radar-based research on object identification, tracking, SLAM (Simultaneous Localization and Mapping), and scene comprehension in harsh weather.
The dataset comprises three hours of annotated radar images, multi-modal sensor data (radar, camera, 3D LiDAR, and GPS/IMU), and more. Professor Andrew Wallace and Dr. Sen Wang of Heriot-Watt University have been gathering data since 2019. First, they outfitted a van with LiDAR, radar, stereo cameras, and geopositioning devices. Then, they deliberately drove the vehicle across Edinburgh and the Scottish Highlands at all hours of the day and night, capturing urban and country road conditions in bad weather.
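One multi-modal sample from such a dataset might be represented along these lines. The field names and file paths below are hypothetical sketches and do not mirror the actual RADIATE file layout.

```python
# Hypothetical record structure for one multi-modal driving sample;
# field names and paths are illustrative, not the real dataset schema.
from dataclasses import dataclass, field

@dataclass
class DrivingSample:
    timestamp: float                 # seconds since start of the drive
    weather: str                     # e.g. "rain", "snow", "fog", "clear"
    radar_frame: str                 # path to the annotated radar image
    lidar_scan: str                  # path to the 3D LiDAR point cloud
    camera_frames: list = field(default_factory=list)  # stereo image paths
    gps: tuple = (0.0, 0.0)          # latitude, longitude from GPS/IMU
    labels: list = field(default_factory=list)         # road-actor boxes

sample = DrivingSample(
    timestamp=12.4, weather="snow",
    radar_frame="radar/000311.png", lidar_scan="lidar/000311.bin",
)
print(sample.weather)   # snow
```

Pairing each radar frame with synchronized LiDAR, camera, and positioning data is what makes cross-sensor benchmarking possible: a perception model can be evaluated on exactly the same snowy moment across all modalities.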
Wallace explains that such datasets are critical for developing and benchmarking autonomous vehicle perception systems. Though we are still a long way from having driverless cars on the roads, autonomous vehicles are already being tested in controlled environments and piloting zones.
Located on the outskirts of Edinburgh, Heriot-Watt houses the famous National Robotarium, a £3 million initiative that brings together robotics, cognitive science, and psychology experts with colleagues from Imperial College London and the University of Manchester. Wallace’s team is based at Heriot-Watt University’s Institute of Sensors, Signals, and Systems, which has previously pioneered conventional and deep learning techniques for sensory data interpretation.
According to Wallace, the duo successfully demonstrated how radar can assist autonomous cars in navigating, mapping, and interpreting their environment in bad weather, when vision and LiDAR are rendered useless. In addition, the team labeled about 200,000 road actors in the dataset, including bicycles, cars, pedestrians, and traffic signs, which could help researchers and manufacturers develop safe navigation for the autonomous vehicles of the future.
Dr. Wang says, “When a car pulls out in front of you, you try to predict what it will do – will it swerve, will it take off? That’s what autonomous vehicles will have to do, and now we have a database that can put them on that path, even in bad weather.”
Wallace claims that they need to improve the resolution of the radar, which is naturally fuzzy. Combining high-resolution optical images with improved weather-penetrating capabilities of radar would help autonomous vehicles see and map better and, ultimately, travel more safely.
The Indian Institute of Technology (IIT) Madras announces that it has started accepting applications for its online data science program. The Institute clarified that marks of JEE would not be needed to apply for this program.
This unique course was first launched in 2020 for students who have completed senior secondary education. IIT Madras will conduct a qualifying process in which applicants are trained through online video lectures and assignments for four weeks.
Applicants will also get the opportunity to interact with the course lecturers. Admission will be offered to learners who successfully clear the qualifying process. Any student who has passed the class 12 examination with English and studied mathematics up to class 10 is eligible to apply for this data science program.
The professor in charge of IIT Madras’s data science program, Prof Andrew Thangaraj, said, “With this program, IIT Madras is delivering the highest quality learning opportunity to a very large number of learners, without compromising the rigor of the process. The combination of online classes and in-person invigilated exams accomplishes this. At each stage, students will have the freedom to exit from the program and receive a Certificate, Diploma or a Degree, from IIT Madras.”
He further mentioned that applicants would be able to build careers in programming and data science after completing this course. With this diploma program, students will be able to learn from the best faculty of IIT Madras without needing to appear for the Joint Entrance Exam.
IIT Madras will provide scholarships of up to 75% to students belonging to underprivileged sections of society. Interested candidates can apply through the official website of IIT Madras. The last date to apply for this program is 30th August 2021.