Tutorial at PReMI 2025 - Beyond Transformers - Deep Dive into Mamba and SSMs


Date: Dec 11, 2025, 9:00 AM – 12:30 PM
Location: IIT Delhi, India

Presented at the 11th International Conference on Pattern Recognition and Machine Intelligence (PReMI 2025), IIT Delhi.

This tutorial session explored:

  • State Space Models (SSMs) fundamentals and evolution
  • Mamba architecture - a linear-time sequence modeling alternative to transformers
  • Selective state spaces and their advantages
  • Comparison of SSMs vs Transformers for various sequence modeling tasks
  • Nemotron Nano v3 as a case study
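To make the topics above concrete, the core idea behind an SSM layer can be sketched as a simple linear recurrence over the sequence. The snippet below is an illustrative NumPy sketch (not code from the tutorial): it runs a discretized single-input single-output SSM, `h_t = Ā h_{t-1} + B̄ x_t`, `y_t = C h_t`, with fixed matrices. Mamba's selective state spaces make `B`, `C`, and the discretization step input-dependent; that refinement is omitted here for clarity.

```python
import numpy as np

def ssm_scan(x, A_bar, B_bar, C):
    """Run a discretized linear SSM over a 1-D input sequence x.

    A_bar: (n, n) state transition matrix
    B_bar: (n,)   input projection
    C:     (n,)   output projection
    Returns the output sequence y, same length as x.
    """
    n = A_bar.shape[0]
    h = np.zeros(n)
    ys = []
    for x_t in x:                       # one pass: linear in sequence length
        h = A_bar @ h + B_bar * x_t     # state update
        ys.append(C @ h)                # readout
    return np.array(ys)

# Tiny example with a 2-dimensional diagonal state (values are arbitrary)
A_bar = np.array([[0.9, 0.0],
                  [0.0, 0.5]])
B_bar = np.array([1.0, 1.0])
C = np.array([0.5, 0.5])
y = ssm_scan(np.ones(4), A_bar, B_bar, C)
```

The single left-to-right pass is what gives SSMs their linear-time scaling in sequence length, in contrast to the quadratic cost of full self-attention.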

PReMI 2025 brought together researchers and practitioners in pattern recognition, machine intelligence, and related fields from across the globe.

Conference Dates: December 11–14, 2025
Tutorial Date: December 11, 2025 (Pre-conference Tutorial Day)

Ayush Maheshwari
Senior Solutions Architect at NVIDIA
Foundation Models, NLP, and AI for Science

I work on foundation models, multilingual NLP, machine translation, and AI for Science.