

Workshop

Self-Supervised Learning - What is next?

Michael Dorkenwald

Space 2

Sun 29 Sep, midnight PDT

Keywords:  ML  

From GPT to DINO to diffusion models, the past years have seen major advances in self-supervised learning (SSL), with many new methods achieving astounding performance on standard benchmarks. Still, the field of SSL is evolving rapidly, with new learning paradigms emerging at an unprecedented pace. At the same time, work on coupled data, such as image-text pairs, has shown large potential for producing even stronger models capable of zero-shot tasks and benefiting from the methodology developed in SSL. Despite this progress, it is also apparent that major challenges remain unresolved and that it is not clear what the next step will be. In this workshop, we want to highlight and provide a forum to discuss potential research directions, from radically new self-supervision tasks, data sources, and paradigms to surprising counter-intuitive results. Through invited speakers and paper oral talks, our goal is to provide a forum to discuss and exchange ideas, where both the leaders in this field and the new, younger generation can contribute equally to discussing its future.
