State of AI Report 2020


A most excellent report by two distinguished authors. I like the simple, concise, and usable definition of AI, "Artificial intelligence (AI) is a multidisciplinary field of science and engineering whose goal is to create intelligent machines."


The authors bring impressive backgrounds as founders, professors, venture capitalists, and entrepreneurs. The report is deep in insights, direct, and free of hyperbole and speculation. It covers research, talent, industry, and politics, makes predictions, and also scores how well their 2019 predictions fared. The executive summary highlights the most insightful research points:


- A new generation of transformer language models is unlocking new NLP use-cases.

- Huge models, large companies, and massive training costs dominate the hottest area of AI today: Natural Language Processing.

- Biology is experiencing its “AI moment”: from medical imaging, genetics, proteomics, and chemistry to drug discovery.

- AI is mostly closed source: only 15% of papers publish their code, which harms accountability and reproducibility in AI.


Competing with companies like Microsoft or Nvidia on building huge models (i.e., billions of parameters) may not be viable for most organizations, as the investment in on-premise GPU infrastructure and in the talent to construct and tune such models is cost-prohibitive for most, if not all. The key insight is that companies in data-rich industries, healthcare for example, that want to build their own clinical "BERT" will most likely need to partner with a technology company, or watch over time as that technology company builds a product such as a clinical language model serving many use cases (e.g., conversational AI for call centers), a product that rightfully should have been built first with the companies that own the vast amounts of data.
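To make that concrete, here is a minimal sketch of what building on someone else's pretrained clinical model looks like in practice, assuming the Hugging Face transformers library; the checkpoint name, the binary "readmission risk" label, and the example note are my own illustrative assumptions, not anything specified in the report:

```python
# Illustrative sketch only: reusing a pretrained clinical BERT checkpoint rather
# than training a billion-parameter model from scratch. The checkpoint name and
# the binary "readmission risk" label are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "emilyalsentzer/Bio_ClinicalBERT"  # example public clinical checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

note = "Patient presents with shortness of breath and a history of heart failure."
inputs = tokenizer(note, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

# The classification head is freshly initialized here; fine-tuning it on labeled
# clinical notes -- the data the report suggests partners should bring -- comes next.
print(torch.softmax(logits, dim=-1))
```

The point of the sketch is the division of labor: the expensive pretraining is inherited from whoever built the checkpoint, while the domain data and fine-tuning are where the data-owning company adds its value.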


The report also signals that the industry is approaching "outrageous" computational and environmental costs for modest improvements in machine learning model performance; without new research breakthroughs, the cost may become prohibitive. Their research also shows that BERT may not be a silver bullet for all NLP tasks. Another big insight is that this new generation of transformer language models is unlocking new NLP use-cases, like translating code from one programming language to another or repairing buggy code.
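As a small illustration of how accessible these pretrained transformers have become (a much simpler task than the code translation the report describes), the sketch below queries a masked language model through the Hugging Face pipeline API; the model name and prompt are my own example, not taken from the report:

```python
# Minimal sketch: asking a pretrained transformer language model to fill in a
# masked word. The model and sentence are illustrative choices, not the report's.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("AI is unlocking new [MASK] use-cases."):
    print(prediction["token_str"], round(prediction["score"], 3))
```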


I hope this blog post inspires you to download and read current and past issues of the "State of AI" report.

Kerrie Holley