23rd November | 3:30 PM - 4:45 PM (IST)

About the Masterclass

The field of natural language processing (NLP) is concerned with developing models that can understand and generate natural language text. NLP has undergone a revolution in the past decade due to (a) the use of neural networks as the primary vehicle for developing NLP systems, and (b) the rise of self-supervised learning, a technique for training neural networks that can harness large amounts of unlabeled data. In this talk, I will review these two paradigm shifts and their effect on NLP research and its applications.

Speaker

Prof. Jonathan Berant, Associate Professor, Blavatnik School of Computer Science; Research Scientist, The Allen Institute for Artificial Intelligence


Jonathan Berant is an associate professor at the School of Computer Science at Tel Aviv University and a research scientist at The Allen Institute for AI. He earned his Ph.D. in Computer Science at Tel Aviv University under the supervision of Prof. Ido Dagan, and was a post-doctoral fellow at Stanford University, working with Prof. Christopher Manning and Prof. Percy Liang, and subsequently a post-doctoral fellow at Google Research, Mountain View. He has received several awards and fellowships, including the Rothschild Fellowship, the ACL 2011 best student paper award, the EMNLP 2014 best paper award, and the NAACL 2019 best resource paper award, as well as several honorable mentions. He is currently an ERC grantee.


Register