Explore Meta AI's LLaMA language models with Yannic Kilcher in under an hour. Learn about training data, hyperparameters, architecture modifications, and more.
Explore Facebook AI's DINO system, which combines self-supervised learning with a Vision Transformer architecture to achieve impressive results in computer vision. Presented by Yannic Kilcher in under an hour.
Explore the concept of Neural Radiance Fields (NeRF) and view synthesis with Yannic Kilcher in less than an hour. Learn about training NeRF from sparse views, volume rendering, and more.
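The volume rendering mentioned in the blurb above can be sketched in a few lines. This is a simplified, assumption-laden illustration in NumPy: per-sample densities are converted to opacities, accumulated transmittance weights each sample, and the weighted colors are composited along a single ray (the real NeRF pipeline predicts densities and colors with an MLP and renders many rays).

```python
import numpy as np

def volume_render(densities, colors, deltas):
    """Composite per-sample colors along one ray using the NeRF quadrature:
    alpha_i = 1 - exp(-sigma_i * delta_i), weight_i = T_i * alpha_i,
    where transmittance T_i is the product of (1 - alpha_j) for j < i."""
    alphas = 1.0 - np.exp(-densities * deltas)                      # segment opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))  # T_i
    weights = trans * alphas                                        # per-sample contribution
    return (weights[:, None] * colors).sum(axis=0), weights

# toy ray: 4 samples with hand-picked densities and RGB colors (illustrative values)
sigma = np.array([0.0, 5.0, 5.0, 0.0])                    # density at each sample
rgb = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], dtype=float)
delta = np.full(4, 0.5)                                   # spacing between samples
color, w = volume_render(sigma, rgb, delta)
```

Note how the first sample contributes nothing (zero density gives zero opacity) and occluded samples are down-weighted by the accumulated transmittance.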
Explore machine learning with Yannic Kilcher in less than an hour. Learn about DreamCoder, a system that solves problems by writing programs, and its potential in deep learning.
Explore DeepMind's Perceiver model with Yannic Kilcher in under an hour. Understand its architecture, built-in assumptions, and how it solves the quadratic bottleneck problem.
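The "quadratic bottleneck" fix mentioned above is cross-attention from a small latent array to the inputs. The sketch below, a minimal NumPy illustration with projection matrices and multi-head structure omitted, shows why the cost becomes O(N·M) for N latents and M inputs rather than the O(M²) of full self-attention over the inputs.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attend(latents, inputs):
    """One Perceiver-style cross-attention step: a fixed-size latent array
    queries a large input array. Score matrix is (N, M), not (M, M),
    so memory grows linearly with the input length."""
    scores = latents @ inputs.T / np.sqrt(latents.shape[-1])  # (N, M)
    return softmax(scores) @ inputs                           # (N, D)

rng = np.random.default_rng(0)
inputs = rng.normal(size=(1000, 16))   # M = 1000 input tokens
latents = rng.normal(size=(8, 16))     # N = 8 latents, fixed and small
out = cross_attend(latents, inputs)
```

Because the latent array size is a hyperparameter independent of the input, the same design handles images, audio, and point clouds without modality-specific attention patterns.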
Explore the potential of pretrained transformers in machine learning across different modalities with Yannic Kilcher. Learn about fine-tuning, modality transfer, and network architecture in under an hour.
Explore the future of AI with Yannic Kilcher's deep dive into Self-Supervised Learning (SSL), discussing its potential to reduce label dependency and enhance knowledge transfer in under an hour.
Explore the inner workings of OpenAI's CLIP model with Yannic Kilcher in less than an hour. Understand how neurons respond to distinct concepts across multiple modalities.
Explore Geoff Hinton's GLOM model for visual scene understanding in AI in this one-to-two-hour video by Yannic Kilcher. Learn about object recognition, capsule networks, and the GLOM architecture.
Explore DeBERTa, Microsoft's advanced BERT-style self-attention Transformer model, in this short but comprehensive video by Yannic Kilcher. Learn about disentangled attention, relative positional encodings, and more.
Explore advancements in model-based reinforcement learning in this under-one-hour video by Yannic Kilcher. Learn about world models and latent states, and how they achieve state-of-the-art single-GPU performance on Atari.
Explore TransGAN, the first GAN with both generator and discriminator built as transformers, in this under-one-hour video by Yannic Kilcher. Learn about its architecture, training tricks, and results.
Explore the benefits and drawbacks of Batch Normalization in deep learning with Yannic Kilcher. Learn about Normalizer-Free Networks and adaptive gradient clipping in under an hour.
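The adaptive gradient clipping mentioned above is simple to sketch: scale a gradient down whenever its norm is large relative to the norm of the parameter it updates. The NumPy snippet below is an illustrative simplification; the Normalizer-Free Networks paper applies the rule unit-wise (per output row), whereas this sketch clips whole tensors.

```python
import numpy as np

def adaptive_grad_clip(grad, weight, clip=0.01, eps=1e-3):
    """Rescale grad so that ||g|| / max(||w||, eps) never exceeds `clip`.
    Unlike fixed-threshold clipping, the allowed gradient magnitude
    adapts to the scale of the weights themselves."""
    g_norm = np.linalg.norm(grad)
    w_norm = max(np.linalg.norm(weight), eps)
    max_norm = clip * w_norm
    if g_norm > max_norm:
        return grad * (max_norm / g_norm)
    return grad

w = np.ones(10)            # ||w|| = sqrt(10)
g = np.full(10, 5.0)       # large gradient: gets rescaled
clipped = adaptive_grad_clip(g, w)
```

The ratio test is what lets normalizer-free networks train stably at large batch sizes without batch normalization's implicit regularization of gradient scale.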
Explore the Nyströmformer algorithm for approximating Self-Attention in Transformers with Yannic Kilcher. Learn about its linear memory and time requirements, and its application in Natural Language Processing.
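The linear memory and time claim above comes from a Nyström approximation: a small set of landmark queries and keys stands in for the full n × n attention matrix. This NumPy sketch uses segment means as landmarks and `np.linalg.pinv` for clarity, whereas the paper uses an iterative pseudo-inverse; treat it as an illustration of the factorization, not the production algorithm.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def nystrom_attention(Q, K, V, m=8):
    """Approximate softmax(Q K^T / sqrt(d)) V with three small matrices:
    F (n, m), A (m, m), B (m, n), built from m landmark queries/keys.
    No n x n matrix is ever formed, so cost is linear in n."""
    n, d = Q.shape
    Ql = Q.reshape(m, n // m, d).mean(axis=1)   # m landmark queries
    Kl = K.reshape(m, n // m, d).mean(axis=1)   # m landmark keys
    s = 1.0 / np.sqrt(d)
    F = softmax(Q @ Kl.T * s)                   # (n, m)
    A = softmax(Ql @ Kl.T * s)                  # (m, m)
    B = softmax(Ql @ K.T * s)                   # (m, n)
    return F @ np.linalg.pinv(A) @ (B @ V)      # (n, d)

rng = np.random.default_rng(1)
n, d = 64, 16
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
out = nystrom_attention(Q, K, V)
```

With m fixed, both the stored matrices and the matrix products scale as O(n·m) rather than O(n²), which is the source of the linear complexity claim.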
Explore the limitations of Autoregressive Transformers and learn about Feedback Transformers with Yannic Kilcher in less than an hour. Gain insights into complex reasoning and long-range dependency tasks.