Academic Lecture

Mini-Course

Published: 2023-11-20

Speaker: Sergey Nikolenko (St. Petersburg Department of Steklov Mathematical Institute of RAS)

Zoom ID: 849 3673 9875; Password: 123456

Beijing Time: 15:00-17:00 (Moscow Time: 10:00-12:00)

Schedule:

Nov. 20: Attention in neural networks. Self-attention and the Transformer architecture. BERT and GPT families.

Nov. 22: Variational autoencoders: idea and derivation.

Nov. 25: Discrete latent spaces: VQ-VAE. VAE + Transformer = DALL-E.

Nov. 27: Vision Transformers. Multimodal latent spaces: CLIP and BLIP, our recent work (LAPCA).

Nov. 29: Case study: video retrieval. How the field has developed in recent years. Postprocessing in video retrieval and our recent work (Sinkhorn transformations).

Dec. 1: Topological data analysis: extracting features with topology. Our recent work (TDA for HuBERT, TDA for artificial text detection).
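As background for the first lecture: scaled dot-product self-attention, the core building block of the Transformer, can be sketched in a few lines of NumPy. This is a minimal illustrative sketch for attendees, not code from the course itself; all names and dimensions below are hypothetical.

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Queries, keys, and values are linear projections of the input sequence.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Pairwise similarity scores, scaled by sqrt(d_k) for numerical stability.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted average of all value vectors.
    return weights @ V

# Toy usage: a sequence of 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)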

Biography:

Sergey Nikolenko is a computer scientist specializing in machine learning and the analysis of algorithms. He is the Head of AI at Synthesis AI, a San Francisco-based company specializing in the generation and use of synthetic data for modern machine learning models, and also serves as the Head of the Artificial Intelligence Lab at the Steklov Mathematical Institute in St. Petersburg, Russia. Dr. Nikolenko's interests include synthetic data in machine learning; deep learning models for natural language processing, image manipulation, and computer vision; and algorithms for networking. His previous research includes work on cryptography, theoretical computer science, and algebra.