But in the past decade, thanks to advances in deep learning and artificial neural networks, the AI industry has undergone a dramatic change. Today, AI has found its way into many practical applications. Scientists, tech executives and world leaders have all touted AI in general, and machine learning in particular, as one of the most influential technologies of the next decade. The potential (and hype) surrounding AI has drawn the interest of commercial entities, nation states and militaries, all of which want to leverage the technology to maintain an edge over their competitors. This multi-faceted AI arms race has increased demand for AI talent, and there is now a severe shortage of people with the skills and knowledge to carry out major AI research projects across different industries. Under such circumstances, those with deeper pockets have managed to hire AI scientists for their projects. The result is an AI brain drain, drawing scientists and researchers away from the institutions where artificial intelligence was born and developed into the revolutionary technology it has become.
How deep learning ended the AI winter
Before the deep learning revolution, AI was mostly dominated by rule-based programs, in which engineers and developers manually encoded knowledge and operational logic into their software. During those years, artificial intelligence was widely known for overpromising and underdelivering, and had undergone several "AI winters" after failing to meet expectations. Around the turn of the decade, scientists managed to use neural networks to perform computer vision and natural language processing (NLP), two areas where rule-based software performed very poorly. This turn of events enabled AI to enter numerous fields that were previously considered off limits or extremely challenging for computers, including voice and face recognition, object detection and classification, machine translation and question-answering. That paved the way for many new commercial uses of AI. Many of the applications we use every day, such as smart speakers, voice-powered digital assistants, translation apps and phone face locks, are all powered by deep learning algorithms and neural networks. The revival of neural networks also created new inroads in other areas such as autonomous driving, where computer vision plays a key role in helping self-driving cars make sense of their surroundings. The renewed interest in neural networks triggered a race to poach AI scientists from academic institutions. And thus began the AI brain drain.
How AI scientists became MVPs
For all the hype surrounding them, neural networks are almost as old as artificial intelligence itself. But having fallen by the wayside in the decades that followed their invention, they lacked the support and tools available for rule-based software. Neural networks are also fundamentally different from other forms of programming, and discovering and developing new applications for them is often more akin to scientific research than to traditional software development. That's why AI research requires a combination of math and computer science skills, hardly the kind of knowledge you obtain from reading a programming book over the weekend. The sudden rise in popularity of deep learning created an equally sudden surge in demand for AI researchers and scientists. And as in any field where supply doesn't meet demand, those with stronger financial resources get the lion's share. In recent years, rich tech companies and research labs such as Google, Facebook and OpenAI have been using huge salaries, stock options and other bonuses to lure AI scientists away from academic institutions. A New York Times story from 2018 claimed that OpenAI paid some of its scientists more than $1 million per year. More recently, the expense report of DeepMind, the AI research outfit acquired by Google in 2014, stated that the lab had paid $483 million to its 700 employees, an average of $690,000 per employee (though the median is probably much lower, with a few very highly paid researchers skewing the average upward). Have AI professors and academics been able to resist the temptation to leave academia for commercial entities? A recent study by researchers at the University of Rochester found that over the last 15 years, 153 artificial intelligence professors at American and Canadian universities have left their posts for opportunities in the commercial sector. The trend has been accelerating, with 41 professors making the move in 2018 alone.
There are also many AI professors who have dual roles, maintaining their affiliation with their universities while also working for tech companies.
How research costs contribute to the AI brain drain
While handsome salaries play a large role in drawing AI professors and researchers away from universities and toward tech companies, they're not the only factor contributing to the AI brain drain. Scientists also face a cost problem when working on AI research projects. Some areas of AI research require access to huge amounts of data and compute resources. This is especially true of reinforcement learning, a technique in which AI agents develop their behavior through massive trial and error, such as playing hide-and-seek 500 million times or the equivalent of 45,000 years of Dota 2, all in super-fast forward. Reinforcement learning is a hot area of AI research, especially in robotics, game bots, resource management and recommendation systems. The computation costs of training reinforcement learning models can easily reach millions of dollars, the kind of money that only rich tech companies can spare. Moreover, other kinds of deep learning models often require access to large sets of training data that only big tech companies like Google and Facebook possess. This makes it very hard for AI researchers to pursue their own projects without the support and financial backing of big tech. And big tech's support seldom comes for free.
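To make the trial-and-error idea concrete, here is a minimal tabular Q-learning sketch on a toy five-cell world. Everything here (the environment, the rewards and the hyperparameters) is invented purely for illustration; experiments like the hide-and-seek and Dota 2 projects run enormously scaled-up, distributed versions of this same basic loop, which is where the multimillion-dollar compute bills come from.

```python
import random

# Toy environment: the agent starts at cell 0 on a line of 5 cells;
# reaching cell 4 yields a reward of 1 and ends the episode.
N_STATES = 5
ACTIONS = [-1, +1]  # move left or move right

def step(state, action):
    next_state = min(max(state + action, 0), N_STATES - 1)
    done = next_state == N_STATES - 1
    reward = 1.0 if done else 0.0
    return next_state, reward, done

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q-table: one row per state
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            # Epsilon-greedy choice: mostly exploit, sometimes explore
            # (ties broken randomly so the untrained agent still wanders).
            if rng.random() < epsilon or q[state][0] == q[state][1]:
                a = rng.randrange(2)
            else:
                a = 0 if q[state][0] > q[state][1] else 1
            next_state, reward, done = step(state, ACTIONS[a])
            # Q-learning update: nudge the estimate toward
            # reward + discounted best value of the next state.
            q[state][a] += alpha * (reward + gamma * max(q[next_state]) - q[state][a])
            state = next_state
    return q

q = train()
# The learned greedy policy should be "move right" in every non-terminal state.
policy = ["right" if q[s][1] > q[s][0] else "left" for s in range(N_STATES - 1)]
print(policy)
```

Even in this toy setting, the agent only discovers the rewarding behavior by repeatedly stumbling into it, which hints at why industrial-scale reinforcement learning burns through so much compute.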
What are the effects of the AI brain drain?
With more and more professors, scientists and researchers flocking to the commercial sector, the AI industry will face several challenges. First, universities will have a hard time hiring and keeping professors to train the next generation of AI scientists, which will in turn further widen the AI skills gap. Consequently, the wages of AI researchers will remain high. This might be pleasant for the researchers themselves, but not for smaller companies, which will struggle to hire AI talent for their projects.

The commercialization of artificial intelligence will also affect the kind of advances the field sees in the coming years. The commercial sector's interest in AI is primarily in developing products that have business value; it is less interested in projects that serve science and the welfare of humanity in general. One notable example is DeepMind, one of the handful of research labs that aims to create human-level AI. Since acquiring DeepMind, Google has given the research lab access to its vast compute, data and financial resources. But it has also restructured the lab to create a unit that builds commercial products. DeepMind is now in the midst of an identity crisis and must decide whether it is a scientific research lab or an extension of its for-profit owner.

Finally, the AI brain drain and the commercialization of artificial intelligence will mean less transparency in the industry. For-profit organizations seldom make their source code and AI algorithms available to the public. They tend to treat them as intellectual property and guard them closely behind their walled gardens. This will result in slower AI research and a lot of "reinventing the wheel," as companies share less knowledge to keep their edge over competitors.

This article was originally published by Ben Dickson on TechTalks, a publication that examines trends in technology, how they affect the way we live and do business, and the problems they solve.
But we also discuss the evil side of technology, the darker implications of new tech and what we need to look out for. You can read the original article here.