SLM Inference on a Windows laptop 🤯 Intel Lunar Lake CPU/GPU/NPU + OpenVINO
It's Bastille Day 🇫🇷🇫🇷🇫🇷 So, how about some revolutionary action?
In this video (my first ever on Windows!), we transform an MSI Prestige 13+ Evo laptop running Windows 11 into a local AI powerhouse, running cutting-edge small language models like Llama-3.1-SuperNova-Lite (8B) with impressive speed and power efficiency.
Intel's Lunar Lake architecture brings together CPU, GPU, and the revolutionary NPU (Neural Processing Unit) in perfect harmony. Thanks to OpenVINO 2025.2, the latest version of Intel's toolkit for model optimization and inference, you'll see how to leverage each component for maximum AI performance. No more cloud dependencies or expensive API calls: everything runs locally on your Intel hardware.
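For the curious, here's a minimal sketch of the workflow shown in the video: export the model to OpenVINO's IR format with `optimum-cli`, then run it on the device of your choice with `openvino-genai`. The model ID, output directory, and generation parameters below are illustrative assumptions, not exact commands from the video.

```python
def export_command(model_id: str, weight_format: str, output_dir: str) -> str:
    """Build the optimum-cli command that converts a Hugging Face model
    to OpenVINO IR with the requested weight compression (fp16/int8/int4)."""
    return (
        f"optimum-cli export openvino --model {model_id} "
        f"--weight-format {weight_format} {output_dir}"
    )

# int4 weight compression keeps an 8B model comfortably within laptop memory
cmd = export_command("arcee-ai/Llama-3.1-SuperNova-Lite", "int4", "supernova-ov")
print(cmd)

# Once exported, inference takes only a few lines with openvino-genai;
# the device string selects the Lunar Lake engine ("CPU", "GPU", or "NPU"):
#
#   import openvino_genai as ov_genai
#   pipe = ov_genai.LLMPipeline("supernova-ov", "NPU")
#   print(pipe.generate("What is Bastille Day?", max_new_tokens=128))
```

Switching between CPU, GPU, and NPU is just a change of that one device string, which is what makes comparing the three engines so easy.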