Natural Language Processing (NLP) has evolved significantly over the last decade. This paper highlights the most important milestones of this period, aiming to pinpoint the contribution of each individual model and algorithm to the overall progress. Furthermore, it focuses on issues that remain to be solved, emphasizing the groundbreaking proposals of Transformers, BERT, and similar attention-based models.