Machine learning has advanced considerably in recent years, with algorithms matching human performance on a growing range of tasks. However, the main hurdle lies not only in developing these models but in deploying them efficiently in real-world applications. This is where AI inference comes into play, emerging as a critical focus for researchers and tech leaders.