Speaker: Paweł Wawrzyński, IDEAS NCBR
Abstract of the talk:
Recurrent neural networks (RNNs) are a basic tool for the analysis and prediction of sequential data. The most popular RNNs remain LSTM and GRU, despite the fact that modern RNNs outperform them by a large margin in both accuracy and training speed. This presentation will give an overview of modern recurrent neural networks. It will mostly focus on Deep Memory Update, which is arguably the simplest RNN while still being highly efficient.
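For context on the architectures the abstract compares against, below is a minimal sketch of one step of a standard GRU cell (the textbook update/reset-gate equations), written in plain numpy with randomly initialized weights. This is only an illustration of the GRU baseline mentioned above; it is not the Deep Memory Update architecture discussed in the talk, and all names and dimensions here are made up for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One step of a standard GRU: gates decide how much of the
    previous hidden state h to keep versus overwrite."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1.0 - z) * h + z * h_tilde         # blended new state

# Tiny usage example: run the cell over a short random sequence.
rng = np.random.default_rng(0)
d_in, d_h = 3, 4                               # arbitrary sizes
params = [rng.standard_normal((d_h, d_in)) if i % 2 == 0
          else rng.standard_normal((d_h, d_h)) for i in range(6)]
h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):
    h = gru_cell(x, h, params)
print(h.shape)
```

Because each step is a convex combination of the previous state and a tanh-bounded candidate, the hidden state stays in (-1, 1); the gating is what lets GRU carry information across many time steps.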
Bio of the speaker:
Paweł Wawrzyński, PhD, Eng, DSc, leads the learning in control, graphs, and networks team at IDEAS NCBR. From 2016 to 2022, he worked at the Institute of Computer Science, Warsaw University of Technology, where he served as Deputy Director for Research. He has authored more than 50 scientific publications, holds four patents, and is the author of five pending patent applications. Paweł Wawrzyński's research interests include machine learning and its practical applications. His theoretical research focuses in particular on reinforcement learning, graph neural networks, and continual learning. He applies his research results in the marketing, robotics, automotive, and energy industries.
Venue: A007, R&D Block
Time: 4:15 PM