
Julia Programming 1.5 Released – What’s New


The Julia programming language has quickly gained traction in the data science community thanks to advantages such as speed and usability over Python and other prominent languages. Continuing that effort, Julia Computing has further improved the language and released its latest version, 1.5, earlier this week.

The release of Julia 1.5 brings a number of significant advancements, including struct layout and allocation optimizations, multithreading API stabilization and improvements, per-module optimization levels and other latency improvements, implicit keyword argument values, the return of “soft scope” in the REPL, faster random numbers, and automated rr-based bug reports.

Julia releases do not usually target specific features; they follow a timed release schedule. This time around, however, there are major feature updates worth knowing about.

Also Read: OpenAI Invites Applications For Its Scholars Program, Will Pay $10K Per Month

Allocation Optimizations

One of the most notable updates is struct layout and allocation optimization. According to Julia Computing, this was a long-desired optimization that significantly reduces heap allocations. Julia has both mutable and immutable objects, which gives programmers the flexibility to choose the right kind of object for their needs, but handling immutable objects came with a significant limitation.

When an immutable object pointed to a heap-allocated mutable object, the immutable object had to be heap-allocated as well. To address this, Jameson Nash enhanced the compiler so that it now tracks multiple roots inside object fields, allowing such immutable objects to be allocated on the stack. This makes immutable objects cheap to use regardless of the kinds of objects they point to.

Multithreading API

Threading was introduced in v0.5 to enable parallelism and increase performance. Over the last couple of releases, threading has gained safety guarantees and new features. One of the major use cases is multithreaded CSV parsing, which gives Julia an edge over other data science-friendly languages.

After being labeled ‘experimental’ for years, the threading API has now been promoted to stable, along with further updates: improved thread safety for top-level expressions (type definitions, global assignments, modules), improved task-switch performance, and more.

Per-Module Optimization

Julia compiles code at the -O2 optimization level by default, which is roughly analogous to -O2 in GCC or Clang. While this is effective for benchmarks and performance-critical inner loops, for tasks such as plotting and other non-inner-loop support code, -O2 mostly adds compilation latency without a meaningful runtime payoff. Consequently, Julia 1.5 lets packages provide per-module optimization-level hints, so that such modules can be compiled at a lower level such as -O1.

While there are other updates, such as changes to the default package manager and faster random number generation, we have listed the most important enhancements. You can read more here.


OpenAI Invites Applications For Its Scholars Program

OpenAI, one of the leading AI research firms, is inviting people from groups underrepresented in science and engineering (S&E) to learn and innovate in deep learning for six months. The underrepresented groups include Black, Hispanic, and American Indian or Alaska Native applicants. You can check whether you fall into an underrepresented group here, and you can still apply for the OpenAI Scholars Program if your group is not on the list but you believe it is underrepresented in S&E; you will simply have to mention this in your application.

The six-month-long OpenAI Scholars Program aims to bring diversity to developments in the AI landscape. OpenAI will offer a stipend of $10K per month for the entire tenure.

This is the fourth edition of the program, which started in 2018 and wrapped up its 2020 cohort earlier this year. For the upcoming cohort, applications are open from 28 July and close on 8 August. A total of 10 scholars will be selected and notified on 21 August. The program will run from 12 October to 9 April 2021.

Read Also: Google Releases MCT Library For Model Explainability

Here are the prerequisites expected of applicants:

  1. Should have US work authorization and be physically located in the US
  2. Should have 2+ years of experience in software engineering
  3. Programming experience in Python (other languages are helpful too)
  4. Strong mathematics background
  5. Interest in the machine learning field
  6. Should have completed, or be able to finish, Practical Deep Learning for Coders (v3), the Deep Learning Specialization, or an equivalent course before the start of the program

If selected, scholars receive benefits such as access to computing resources, weekly one-on-one video calls with mentors, recruitment support, a stipend for AI-related conferences during the program, and access to a Slack workspace.

In a few cases, past scholars have gone on to work for OpenAI. Consequently, this is an exceptional opportunity to give your machine learning journey the wings it needs.

Click here to know more and apply for the OpenAI Scholars Program: Fall 2020.


Google Releases MCT Library For Model Explainability


Google, on Wednesday, released the Model Card Toolkit (MCT) to bring explainability to machine learning models. The information provided by the library will assist developers in making informed decisions while evaluating models for effectiveness and bias.

MCT provides a structured framework for reporting on ML models, usage, and ethics-informed evaluation. It gives a detailed overview of models’ uses and shortcomings that can benefit developers, users, and regulators.

To demonstrate the use of MCT, Google has also released a Colab tutorial that leverages a simple classification model trained on the UCI Census Income dataset.
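For context, a minimal sketch of the kind of classifier the tutorial describes (not the tutorial’s actual code) might look like this in Python, assuming scikit-learn and the OpenML copy of the Census Income dataset, which is published there under the name “adult”:

```python
# Minimal sketch of a simple Census Income ("adult") classifier, similar in
# spirit to the model used in the MCT Colab tutorial (not the actual code).
from sklearn.compose import ColumnTransformer
from sklearn.datasets import fetch_openml
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

bunch = fetch_openml("adult", version=2, as_frame=True)
df = bunch.frame.dropna()          # drop rows with missing values for simplicity
y = df.pop(bunch.target_names[0])  # income bracket (<=50K / >50K)
X = df

categorical = X.select_dtypes(include=["category", "object"]).columns
numeric = X.select_dtypes(include="number").columns

model = Pipeline([
    ("prep", ColumnTransformer([
        ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
        ("num", StandardScaler(), numeric),
    ])),
    ("clf", LogisticRegression(max_iter=1000)),
])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```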

You can use the information stored in ML Metadata (MLMD) to populate a JSON schema automatically with class distributions and model performance statistics. “We also provide a ModelCard data API to represent an instance of the JSON schema and visualize it as a Model Card,” note the authors of the blog. You can further customize the report by selecting which metrics, graphs, and performance deviations to display in the Model Card.
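Putting that together, a rough sketch of the workflow might look like the following; the method names are based on the announcement’s description of the model-card-toolkit package and may differ between releases, so treat this as an approximation rather than the definitive API:

```python
# Rough sketch of the Model Card Toolkit workflow described above; method and
# field names may vary between model-card-toolkit releases.
import model_card_toolkit as mct

toolkit = mct.ModelCardToolkit(output_dir="model_card_output")

# Scaffold a ModelCard object (the "ModelCard data API" mentioned in the post)
# and fill in a few human-authored fields.
model_card = toolkit.scaffold_assets()
model_card.model_details.name = "Census Income Classifier"
model_card.model_details.overview = (
    "Simple classifier trained on the UCI Census Income dataset."
)

# Write the populated card back, then render it as an HTML Model Card.
toolkit.update_model_card_json(model_card)
html = toolkit.export_format()
with open("model_card.html", "w") as f:
    f.write(html)
```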

Read Also: Microsoft Will Simplify PyTorch For Windows Users

Detailed reporting of limitations, trade-offs, and other information through Google’s MCT can enhance explainability for users and developers. Currently, there is only one default template for presenting this critical information, but you can create additional HTML templates according to your requirements.

Anyone using TensorFlow Extended (TFX) can use this open-source library to get started with explainable machine learning. Users who do not use TFX can still work with the JSON schema and custom HTML templates directly.
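As a hypothetical illustration of that TFX-free route, a card stored as plain JSON can be rendered through a custom HTML template with a generic templating library such as jinja2; the fields and template below are simplified stand-ins, not the official Model Card schema:

```python
# Rendering a model-card-style JSON document with a custom HTML template.
# The fields below are illustrative, not the official Model Card schema.
import json
from jinja2 import Template

card_json = """
{
  "model_name": "Census Income Classifier",
  "overview": "Simple classifier trained on the UCI Census Income dataset.",
  "metrics": {"accuracy": 0.85, "auc": 0.90},
  "limitations": ["May underperform on groups underrepresented in the data."]
}
"""

template = Template("""
<html><body>
  <h1>{{ card.model_name }}</h1>
  <p>{{ card.overview }}</p>
  <h2>Metrics</h2>
  <ul>{% for name, value in card.metrics.items() %}
    <li>{{ name }}: {{ value }}</li>{% endfor %}</ul>
  <h2>Limitations</h2>
  <ul>{% for item in card.limitations %}<li>{{ item }}</li>{% endfor %}</ul>
</body></html>
""")

html = template.render(card=json.loads(card_json))
with open("custom_model_card.html", "w") as f:
    f.write(html)
```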

Over the years, explainable AI has become one of the most discussed topics in technology, as artificial intelligence now touches various aspects of our lives. Explainability is essential for organizations to build trust in AI models among stakeholders. In finance and healthcare in particular, the importance of explainability is immense, as any deviation in a prediction can harm users. Google’s MCT could be a game-changer in the way it simplifies model explainability for everyone.

Read more here.


Microsoft Will Simplify PyTorch For Windows Users


Installing and using PyTorch will become easier for Windows users, as Microsoft is becoming the maintainer of the Windows version of PyTorch. Facebook, on Tuesday, announced that Microsoft will take ownership of the development and maintenance of the PyTorch build for Windows.

According to the Stack Overflow Developer Survey 2020, Windows is still the preferred choice for most developers: 45.8% of them use Windows, ahead of the 27.5% who prefer macOS and the 26.6% who use Linux-based operating systems.

Also Read: Intel’s Miseries: From Losing $42 Billion To Changing Leadership

However, installing PyTorch on Windows has been a tedious task, and on top of that, the Windows build has lacked functionality such as distributed training and the TorchAudio domain library. Microsoft has committed to improving the overall experience and addressing these gaps.

Earlier, Jiachen Pu attempted to bring PyTorch’s Windows support on par with other platforms such as Linux and macOS. But due to the lack of necessary resources, the initiative could not be sustained. “Lack of test coverage resulted in unexpected issues popping up every now and then. Some of the core tutorials, meant for new users to learn and adopt PyTorch, would fail to run. The installation experience was also not as smooth, with the lack of official PyPI support for PyTorch on Windows,” noted the authors of the announcement.

Alongside the announcement of Microsoft’s extended support, PyTorch 1.6 was released. In this version, overall core quality on Windows was improved by introducing test coverage for core PyTorch and its domain libraries, along with automated tutorial testing.

The release also added support for TorchAudio on Windows and brought test coverage to all three domain libraries: TorchVision, TorchText, and TorchAudio. In addition, Microsoft added GPU compute support to Windows Subsystem for Linux (WSL) 2 distros, which allows the use of NVIDIA CUDA for AI and ML workloads. You can now run native Linux-based PyTorch applications on Windows without needing a virtual machine or a dual-boot setup.
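As a quick sanity check of such a setup (a minimal sketch; whether CUDA shows up depends on your driver, WSL 2 configuration, and the wheel you installed):

```python
# Minimal sketch: verify a PyTorch install and CUDA/WSL 2 GPU visibility.
# Assumes torch and torchaudio were installed, e.g. via the selector on pytorch.org.
import torch
import torchaudio

print("PyTorch version:", torch.__version__)
print("TorchAudio version:", torchaudio.__version__)
print("Distributed package available:", torch.distributed.is_available())

if torch.cuda.is_available():
    # Inside WSL 2 with GPU compute enabled, the NVIDIA GPU shows up here.
    device = torch.device("cuda")
    print("CUDA device:", torch.cuda.get_device_name(0))
else:
    device = torch.device("cpu")
    print("CUDA not available; falling back to CPU.")

# Run a tiny matrix multiply on the selected device as a smoke test.
x = torch.randn(1024, 1024, device=device)
y = x @ x
print("Result tensor on:", y.device)
```

If the script reports a CUDA device from inside a WSL 2 distro, the GPU compute path described above is working.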


Intel’s Miseries: From Losing $42 Billion To Changing Leadership


Intel’s stock plunged around 18% after the company announced that it is considering outsourcing chip production due to delays in its manufacturing processes. The drop wiped out $42 billion of the company’s market value as the stock traded at a low of $49.50 on Friday. Intel’s misery with production is not new: its 10-nanometer chips were supposed to be delivered in 2017, but Intel failed to produce them in high volumes. Only now has the company ramped up production of its popular 10-nanometer chips.

Intel’s Misery In Chip Manufacturing

Everyone was expecting Intel’s 7-nanometer chips, as its competitor AMD is already offering processors at that node. But according to Intel CEO Bob Swan, manufacturing of the chips will be delayed by another year.

While warning about the production delay, Swan said the company would be ready to outsource chip manufacturing rather than wait to fix its production problems.

“To the extent that we need to use somebody else’s process technology and we call those contingency plans, we will be prepared to do that. That gives us much more optionality and flexibility. So in the event there is a process slip, we can try something rather than make it all ourselves,” said Swan.

This caused tremors among shareholders, as such a move is highly unusual for the world’s largest semiconductor company, now more than 50 years old. In-house manufacturing has given Intel an edge over its competitors; AMD’s 7nm processors are manufactured by Taiwan Semiconductor Manufacturing Company (TSMC). If Intel outsources manufacturing, it is highly likely that TSMC would get the contract, since it is among the best at producing chips.

But tapping TSMC would not be straightforward, as long-term Intel competitors such as AMD, Apple, MediaTek, NVIDIA, and Qualcomm would oppose the deal. TSMC will also be well aware that Intel would end the arrangement once it fixes the problems currently causing the delay. Irrespective of the complexities of a potential deal between TSMC and Intel, the stock of TSMC, the world’s largest contract chipmaker, rallied 10% to an all-time high, adding $33.8 billion to its market value.

Intel is head and shoulders above other chip providers in terms of market share in almost all categories. For instance, it holds 64.9% of the x86 CPU market (2020), and Xeon has a 96.10% share in server chips (2019). Consequently, Intel’s misery gives a considerable advantage to its competitors. Intel has been losing market share to AMD year over year (2018 to 2019): it lost 0.90% in x86 chips overall, 2% in server, 4.50% in mobile, and 4.20% in desktop processors. In addition, NVIDIA eclipsed Intel earlier this month by becoming the most valuable chipmaker for the first time.

Also Read: MIT Task Force: No Self-Driving Cars For At Least 10 Years

Intel’s Misery In The Leadership

Undoubtedly, Intel is facing heat from its competitors and is having a difficult time maneuvering in the competitive chip market. But the company is striving to make the changes needed to clean up its act.

On Monday, Intel’s CEO announced changes to the company’s technology organization and executive team to improve process execution. As mentioned earlier, the delay did not go down well within the company, which has led to a leadership revamp, including the ouster of Murthy Renduchintala, Intel’s hardware chief, who will leave on 3 August.

Intel poached Renduchintala from Qualcomm in February 2016 and gave him a prominent role managing the Technology, Systems Architecture and Client Group (TSCG).

The press release noted that TSCG will be separated into five teams, whose leaders will report directly to the CEO. 

List of the teams:

Technology Development will be led by Dr. Ann Kelleher, who will also lead the development of 7nm and 5nm processors

Manufacturing and Operations, which will be monitored by Keyvan Esfarjani, who will oversee the global manufacturing operations, product ramp, and the build-out of new fab capacity

Design Engineering will be led by an interim leader, Josh Walden, who will supervise design-related initiatives, along with his earlier role of leading Intel Product Assurance and Security Group (IPAS)

Architecture, Software, and Graphics will continue to be led by Raja Koduri, who will focus on architectures, software strategy, and the dedicated graphics product portfolio

Supply Chain will continue to be led by Dr. Randhir Thakur, who will be responsible for ensuring an efficient supply chain as well as relationships with key players in the ecosystem

Also Read: Top 5 Quotes On Artificial Intelligence

Outlook

With this, Intel has made a significant change to ensure it keeps to the timelines it sets. Beyond that, Intel will have to innovate and deliver on 7nm before AMD dominates the market with the microarchitectures powering Ryzen for mainstream desktops and Threadripper for high-end desktop systems.

Although the chipmaker has revamped its leadership, Intel’s misery might not end soon; unlike in software, changing direction and innovating in the hardware business takes time. Intel has a challenging year ahead.


Antitrust Hearing Of Apple, Amazon, Google, And Facebook Will Be Held On Wednesday


The CEOs of Facebook, Apple, Amazon, and Google were set to face a Congressional hearing on Monday over allegations of unfair dominance in the tech space. Lawmakers will question and assess these blue-chip companies on antitrust issues and submit a report that could be used to devise new rules to maintain competitiveness in the tech market.

Top executives of these organizations have faced backlash from critics, customers, and competitors over their market dominance. It is believed that the tech giants, especially Facebook, Amazon, and Google, are leveraging their broad market presence to eliminate competition and impose rules that degrade the customer experience.

Also Read: Tesla Sues Rivian And Accuses It Of Theft

For one, Apple’s App Store policies have long been under the scanner; many developers have complained about the firm withholding essential information, such as location data, and about other challenges in the app approval process. Similarly, Google will be questioned on its dominance of digital advertising, Facebook on its acquisitions of WhatsApp and Instagram, and Amazon on its treatment of third-party sellers.

House Judiciary Committee Chairman Jerrold Nadler and Antitrust Subcommittee Chairman David Cicilline said in a statement Saturday: “Since last June, the Subcommittee has been investigating the dominance of a small number of digital platforms and the adequacy of existing antitrust laws and enforcement. Given the central role these corporations play in the lives of the American people, it is critical that their CEOs are forthcoming. As we have said from the start, their testimony is essential for us to complete this investigation.”

However, the antitrust hearing has been rescheduled to Wednesday, as members of Congress will be paying their respects to the late Rep. John Lewis, who died on 17 July.

The hearing will be held remotely, with Mark Zuckerberg, Jeff Bezos, Tim Cook, and Sundar Pichai explaining their stands on the various antitrust issues Facebook, Amazon, Apple, and Google have been dealing with over the years.


Top Quotes On Artificial Intelligence By Leaders


Artificial intelligence is one of the most talked-about topics in the tech landscape due to its potential to revolutionize the world. Many thought leaders in the domain have spoken their minds on artificial intelligence on various occasions around the world. Today, we list the top artificial intelligence quotes that carry deep meaning and were ahead of their time.

Here is the list of top quotes about artificial intelligence:

Artificial Intelligence Quote By Jensen Huang

“20 years ago, all of this [AI] was science fiction. 10 years ago, it was a dream. Today, we are living it.”

JENSEN HUANG, CO-FOUNDER AND CEO OF NVIDIA

Jensen Huang delivered this quote on artificial intelligence at NVIDIA GTC 2021, while announcing several products and services at the event. Over the years, NVIDIA has become a key player in the data science industry, assisting researchers in furthering the development of the technology.

Quote On Artificial Intelligence By Stephen Hawking

“Success in creating effective AI, could be the biggest event in the history of our civilization. Or the worst. We just don’t know. So we cannot know if we will be infinitely helped by AI, or ignored by it and side-lined, or conceivably destroyed by it. Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilization. It brings dangers, like powerful autonomous weapons, or new ways for the few to oppress the many. It could bring great disruption to our economy.”

Stephen Hawking, 2017

Stephen Hawking’s quotes on artificial intelligence are far from optimistic. Some of his most famous remarks on the subject came in 2014, when the BBC interviewed him and he said artificial intelligence could spell the end of the human race.

Here are some of the other quotes on artificial intelligence by Stephen Hawking.

Also Read: The Largest NLP Model Can Now Generate Code Automatically

Elon Musk On Artificial Intelligence

“I have been banging this AI drum for a decade. We should be concerned about where AI is going. The people I see being the most wrong about AI are the ones who are very smart, because they can not imagine that a computer could be way smarter than them. That’s the flaw in their logic. They are just way dumber than they think they are.”

Elon Musk, 2020

Musk has been very vocal about artificial intelligence’s capacity to change the way we do our day-to-day tasks. Earlier, he stressed that AI could be the cause of a third world war. In a tweet, Musk wrote ‘it [war] begins’ while quoting a news story about Vladimir Putin, President of Russia, who had said that the nation that leads in AI would be the ruler of the world.

Mark Zuckerberg’s Quote

Unlike others’ negative quotes on artificial intelligence, Zuckerberg does not believe artificial intelligence will be a threat to the world. During a Facebook Live session, Zuckerberg answered a user who asked about the opinions of people like Elon Musk on artificial intelligence. Here’s what he said:

“I have pretty strong opinions on this. I am optimistic. I think you can build things and the world gets better. But with AI especially, I am really optimistic. And I think people who are naysayers and try to drum up these doomsday scenarios. I just don’t understand it. It’s really negative and in some ways, I actually think it is pretty irresponsible.”

Mark Zuckerberg, 2017

Larry Page’s Quote

“Artificial intelligence would be the ultimate version of Google. The ultimate search engine that would understand everything on the web. It would understand exactly what you wanted, and it would give you the right thing. We’re nowhere near doing that now. However, we can get incrementally closer to that, and that is basically what we work on.”

Larry Page

Larry Page, who stepped down as CEO of Alphabet in late 2019, has been passionate about integrating artificial intelligence into Google products. This was evident when the search giant announced that it was moving from ‘mobile-first’ to ‘AI-first’.

Sebastian Thrun’s Quote On Artificial Intelligence

“Nobody phrases it this way, but I think that artificial intelligence is almost a humanities discipline. It’s really an attempt to understand human intelligence and human cognition.” 

Sebastian Thrun

Sebastian Thrun is the co-founder of Udacity and earlier established Google X, the team behind Google’s self-driving car and Google Glass. He is one of the pioneers of self-driving technology; Thrun and his team won the Pentagon’s 2005 contest for self-driving vehicles, a massive leap in the autonomous vehicle landscape.


Tesla Sues Rivian And Accuses It Of Theft


Tesla, the world’s leading manufacturer of electric vehicles, has filed a lawsuit against its electric vehicle rival, Rivian.

According to Tesla, Rivian is encouraging Tesla employees to leave their jobs and to take the company’s trade secrets and confidential and proprietary information with them when they go.

Tesla also alleges that the rival startup has already recruited four of its employees, who stole highly sensitive information on their way out. The automaker mentioned in the suit that a total of 178 former Tesla employees are working at Rivian, around 70 of whom joined the rival startup directly from Tesla.

Also Read: MIT Task Force: No Self-Driving Cars For At Least 10 Years

While Tesla is blaming its rival, Rivian defended itself with a statement to Tech Crunch: “We admire Tesla for its leadership in resetting expectations of what an electric car can be. Rivian is made up of high-performing, mission-driven teams, and our business model and technology are based on many years of engineering, design, and strategy development. This requires the contribution and know-how of thousands of employees from across the technology and automotive spaces. Upon joining Rivian, we require all employees to confirm that they have not, and will not, introduce former employers’ intellectual property into Rivian systems. This suit’s allegations are baseless and run counter to Rivian’s culture, ethos, and corporate policies.”

With investments from Amazon and Ford, Rivian is a significant player in the electric vehicle industry and has bagged high-profile deals, including one from Amazon: the e-commerce giant has ordered 100,000 electric delivery vans from Rivian, which are yet to be delivered. R.J. Scaringe founded Rivian in 2009, and today the company has a footprint across several locations, including Plymouth (Michigan), Normal (Illinois), and San Jose (California) in the US, as well as Canada and the United Kingdom.

This is not the first time Tesla has alleged that departing employees stole sensitive information before joining rival organizations such as Xpeng Motors and Zoox Inc.


MIT Task Force: No Self-Driving Cars For At Least 10 Years


In 2018, MIT (Massachusetts Institute of Technology) launched the MIT Task Force on the Work of the Future. The Task Force’s main focus is to study how emerging technologies are transforming human work, and the skills humans will need in an increasingly digital global economy.

MIT’s faculty and student research team recently published a report stating that fully automated driving systems (cars, trucks, and buses) will take at least ten years to be deployed widely. According to the report’s latest brief, the rollout of automated vehicles will be gradual and will occur region by region in specific categories of transportation, meaning availability will vary across the world.

Also Read: Artificial Intelligence In Vehicles Explained

Visions of automation in mobility remain in question and will not be readily accepted by people who doubt the feasibility of self-driving technology. Many challenges could become hurdles on the path to fully driverless vehicles. Cost is one of the barriers, as suggested by coauthors John Leonard, MIT professor of mechanical and ocean engineering, and Erik Stayton, an MIT doctoral candidate. For instance, Starsky, a self-driving vehicle company, shut down its operations, and Zoox shed part of its workforce during the COVID-19 pandemic. In addition, the technology needed to give AI-based systems the cognitive capabilities required for fully autonomous vehicles has yet to be developed.

Although MIT’s report says that Level 5 autonomy is at least ten years away, Elon Musk said at a Chinese AI conference earlier this month that Tesla could achieve complete autonomy by the end of 2020. “I remain confident that we will have the basic functionality for level five autonomy complete this year. I think there are no fundamental challenges remaining for level five autonomy,” said Musk.

Nevertheless, MIT’s report also mentions that investing in local and national infrastructure, along with public-private partnerships, will encourage the integration of automated vehicles and strengthen their growth in urban areas.

Read the full report here.


Dataflex: A New Low-Code Data Platform For Microsoft Teams


Microsoft has announced a platform named Dataflex, a relational database that lets business developers create, organize, and manage chatbots and Power Platform apps without leaving the Microsoft Teams application.

Equipped with AI capabilities and security benefits, Dataflex gives users relational data storage, rich data types, and enterprise-grade governance with one-click setup. Dataflex is meant to help companies leverage organizational data to build low-code apps that solve their business problems, while leaving the complexity of managing the underlying relational database to the platform.

Microsoft Teams has been the company’s fastest-growing app since 2018; in April 2020, it reached 75 million daily active users. With Dataflex, Microsoft is all set to help business developers build their own apps for Teams.

AI capabilities

Dataflex is built on top of the Power Platform, which lets users tap into the platform’s artificial intelligence and predictive capabilities. Power Automate’s AI Builder helps your systems process structured and unstructured data from paper-based invoices, images, and other analog sources. AI Builder works with six elements in Dataflex: category classification, entity extraction, key phrase extraction, language detection, sentiment analysis, and prediction. In other words, AI Builder provides a low-code experience for using artificial intelligence and machine learning in business applications.

Also Read: The Largest NLP Model Can Now Generate Code Automatically

Security and other Performance benefits

The Power Platform supports over 350 data connectors, catering to the diverse needs of organizations. This lets business users connect to other business systems (a SQL Server database, an Excel file, a SharePoint list, or services such as OneDrive and Dropbox) to source the data for their app or chatbot, reducing the risk involved in migrating and storing data in yet another database. Removing the need for a dedicated security expert empowers businesses to turn their ideas into working apps rapidly, which should democratize adoption among organizations, as it can bring value to firms quickly.

Click here for more.
