Machine learning has made significant progress in recent years, with systems reaching human-level performance across diverse tasks. However, the main hurdle lies not only in developing these models, but in deploying them efficiently in everyday use cases. This is where AI inference takes center stage, emerging as a critical focus for scientists and