How to run LLMs locally: key tools, best practices and practical applications
Volodymyr Andrushchak
Want a secure and scalable AI tool on your terms? Join our session to learn how local LLMs offer unmatched privacy and flexibility and how to integrate them into your projects seamlessly.
Prerequisite
This session will be useful for participants with an engineering background, team leads, and heads of departments who are curious about implementing LLMs locally in their projects. Basic knowledge of AI and LLMs would be beneficial.
Outline
SaaS-based LLMs, like those from OpenAI, are undoubtedly powerful tools. However, they often fall short when it comes to handling sensitive data or ensuring privacy. OpenAI, for example, imposes restrictions on the types of data that can be analyzed, and for sectors that work with highly sensitive data, SaaS-based LLMs pose a significant privacy risk.
Luckily, there is another solution – local LLMs.
In our session, you will discover how locally deployed LLMs offer a secure, robust, and flexible alternative to SaaS models. We will break down the practical aspects of selecting and working with open-source LLMs by providing an overview of the best options on the market, such as LLaMA, Mistral, and Gemma. We will walk you through the major differences between these models in licensing, architecture, and deployment complexity, focusing specifically on how each of these factors affects the overall cost of integration.
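To make this concrete, here is a minimal sketch of querying an open-source model running locally. It assumes Ollama (https://ollama.com) as the local runtime and a model already pulled with `ollama pull mistral`; the model name, URL, and helper function are illustrative, not part of the session materials.

```python
# Minimal sketch: query a locally running open-source LLM via Ollama's
# REST API. Assumes Ollama is installed and serving on its default port,
# and that a model (here "mistral") has already been pulled.
import requests

def ask_local_llm(prompt: str, model: str = "mistral") -> str:
    """Send a prompt to the local Ollama server and return the reply."""
    response = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's default endpoint
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

print(ask_local_llm("Summarize the benefits of running LLMs locally."))
```

Because the model runs on your own hardware, the prompt and the response never leave your infrastructure, which is the core privacy advantage over SaaS APIs.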
Also, we will share insights on how to deploy LLMs locally within your Drupal environment and demonstrate real-world cases of local LLM usage, such as smart search implementation and seamless migration from a SaaS model to an open-source LLM.
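As a taste of the migration case, the sketch below shows one common pattern, assuming Ollama as the local backend: because Ollama exposes an OpenAI-compatible endpoint, switching an existing integration from the SaaS API to a local model can amount to changing the base URL and model name. The URL and model name here are illustrative assumptions.

```python
# Hedged sketch: migrating from a SaaS LLM to a local one with minimal
# code changes, using the official OpenAI Python client against Ollama's
# OpenAI-compatible endpoint.
from openai import OpenAI

# Before (SaaS): client = OpenAI(api_key="sk-...")  # data leaves your infra

# After (local): point the same client at the local server instead.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama server
    api_key="ollama",  # required by the client, but ignored by Ollama
)

completion = client.chat.completions.create(
    model="mistral",  # any locally pulled model
    messages=[{"role": "user", "content": "What is Drupal?"}],
)
print(completion.choices[0].message.content)
```

The session will cover the fuller picture behind this pattern, including prompt and quality differences you should expect when swapping a proprietary model for an open-source one.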
Learning Objectives
After the session, participants will know how to work with LLMs in a local environment and what to consider when choosing a model, and will have a general understanding of how to apply open-source LLMs in their projects.
Experience level
Intermediate