Artificial intelligence algorithms hit a bottleneck, but the hardware revolution pushed them into the mainstream.

The hardware revolution has pushed artificial intelligence into the mainstream, slashing the training time and cost of AI systems while turning cutting-edge AI into an arms race that few can join. In recent years, algorithms have driven the breakthroughs in artificial intelligence, as computers demonstrated their superiority over humans in increasingly complex tasks. Now, however, another force could have a greater impact in moving AI forward.

Advances in specialized chips and other hardware have boosted the capabilities of state-of-the-art artificial intelligence systems, while also pushing the technology into the mainstream. Whether this produces tangible business benefits is another matter.

The Artificial Intelligence Index (AI Index), a project launched by a team at Stanford University, clearly demonstrates the importance of the AI hardware revolution. The latest AI Index report, which attempts to summarize the state of progress in artificial intelligence, captures a shift in where the greatest progress has come from over the past 18 months.

In many ways, the algorithms themselves have not made great leaps in recent years. That is partly because performance on some tasks has plateaued: in image recognition, for example, there has been little room left to improve once computers surpassed human accuracy.

It also reflects the fact that the remaining problems are harder, making progress more elusive. Language, widely seen as the next frontier of machine intelligence, has proved especially difficult. Although tasks such as speech recognition and translation have largely been cracked, understanding and reasoning remain domains where humans dominate.

Instead, the most dramatic advances have come from hardware: chips designed specifically to process the vast amounts of data that machine learning requires, along with dedicated systems the industry is building around them.

OpenAI, a US research group, points to a hardware inflection point in 2012. Until then, the computing used in artificial intelligence tracked Moore’s Law, the industry’s rule of thumb that processing power doubles roughly every two years.

Since then, artificial intelligence systems have far outpaced Moore’s Law: the computing power behind the most advanced AI systems has doubled roughly every 3.4 months, as new hardware and ever more resources are thrown at the problem.
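As a rough illustration (my own back-of-the-envelope arithmetic, not a figure from the AI Index), the gap between those two doubling rates compounds dramatically over just a few years:

```python
# Back-of-the-envelope comparison of the two doubling rates mentioned above.
# The six-year window and the resulting factors are illustrative assumptions.

def growth_factor(years: float, doubling_period_months: float) -> float:
    """Multiplicative growth after `years`, given a doubling period in months."""
    return 2 ** (years * 12 / doubling_period_months)

years = 6  # roughly the span from the 2012 inflection point onwards
moores_law = growth_factor(years, doubling_period_months=24)   # ~8x
ai_compute = growth_factor(years, doubling_period_months=3.4)  # ~2,400,000x

print(f"Moore's Law pace over {years} years:   ~{moores_law:.0f}x")
print(f"3.4-month doubling over {years} years: ~{ai_compute:,.0f}x")
```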

There is a paradox to this hardware acceleration. On the one hand, at the cutting edge, it has turned artificial intelligence into an arms race that few can join.

Only big companies and governments that command vast computing resources will be able to compete. OpenAI’s philosophy has long been that the AI researchers with the biggest computers will inherit the earth; the group recently secured a $1 billion investment from Microsoft to stay in the race.

Another effect of the hardware revolution, however, is that it has pushed the technology into the mainstream. Google’s TPU, one of the world’s most advanced machine learning chips, can be rented by the hour on the company’s cloud computing platform, from as little as $1.35 an hour if your workload is not time-sensitive and you do not mind queueing for capacity.

Silicon Valley is prone to over-hyping the “democratization” of new technologies, but in artificial intelligence the idea is justified. As cloud services such as Amazon Web Services (AWS) make low-cost hardware and machine learning tools widely available, training neural networks, the most computationally intensive part of artificial intelligence, is suddenly within reach.

Stanford University’s DAWNBench project provides a way to benchmark artificial intelligence systems. According to the project, the time required to train an image-recognition system on the widely used ImageNet dataset has dropped from three hours to 88 seconds in less than two years, cutting the cost of a training run from $2,323 to $12.
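For context (my own arithmetic from the quoted figures, not numbers taken from DAWNBench itself), both improvements work out to roughly two orders of magnitude:

```python
# Sanity-checking the DAWNBench figures quoted above; the derived
# speed-up and cost-reduction factors are my own calculations.

old_time_s, new_time_s = 3 * 3600, 88  # training time: 3 hours -> 88 seconds
old_cost, new_cost = 2323, 12          # cost per training run, in dollars

print(f"Training speed-up: ~{old_time_s / new_time_s:.0f}x")  # ~123x faster
print(f"Cost reduction:    ~{old_cost / new_cost:.0f}x")      # ~194x cheaper
```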

Whether the huge reduction in training time and cost will make advanced AI a practical business technology is another matter. The wider impact of machine learning is hard to pin down, but the AI Index points to one promising measure: in October, about 1.32 per cent of US job postings were related to artificial intelligence, up from 0.26 per cent in 2010. The number is still small, and the definition of an “AI job” is debatable, but the direction of travel is clear.

Erik Brynjolfsson, a Massachusetts Institute of Technology professor who studies the economic impact of new technologies, warns that companies hiring data scientists and machine learning experts should not expect immediate rewards: they first need to overcome internal bottlenecks by developing the new workflows required to make the most of the technology.

The race to reap tangible rewards from a much-hyped technology has begun.
