AI with Drupal - Using LLM technology is easy, but how do you actually build useful applications?
Using AI tools like ChatGPT from OpenAI is easy, and many people already have experience with them. Common uses such as creating or rewriting text are already well integrated with Drupal.
However, the real power of Large Language Models (LLMs) is in building custom applications that use client data and fit specific business needs.
Building these applications is challenging, as public LLMs have limitations, and customization is necessary to meet unique requirements.
In this session, we will provide an accessible overview of how LLMs work, showing when to use public models and when customization is needed, so attendees of all experience levels can follow along.
We’ll focus on in-context learning and Retrieval-Augmented Generation (RAG), as they provide the most value for custom applications. Other concepts like fine-tuning and embeddings will be covered briefly in the context of general LLM use.
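To make the idea concrete, here is a minimal sketch of in-context learning: instead of retraining the model, relevant client data is placed directly in the prompt. It assumes the OpenAI Python SDK (v1+); the model name and session data are illustrative placeholders, not the session's actual implementation.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical client data that a public model has never seen.
context = (
    "Session: 'Decoupled Drupal in Practice' covers JSON:API and Next.js.\n"
    "Session: 'Caching Deep Dive' covers BigPipe and the dynamic page cache.\n"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "Answer only from the provided sessions."},
        {"role": "user", "content": context + "\nWhich session covers JSON:API?"},
    ],
)
print(response.choices[0].message.content)

The key point is that the customization lives in the prompt, not in the model, which is why this pattern works with any public LLM.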
To demonstrate, we’ll showcase an AI application integrated with Drupal that recommends DrupalCon Atlanta 2025 sessions using live data. The demo will be available to all session attendees and conference participants.
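The recommender follows the RAG pattern described above. The sketch below, again using the OpenAI Python SDK with assumed model names and sample data, shows the general shape: embed the session descriptions once, retrieve the closest matches for a visitor's interests, and hand only those to the LLM as context. It is not the demo's actual code.

import math
from openai import OpenAI

client = OpenAI()

sessions = [
    "AI with Drupal: building custom LLM applications with RAG.",
    "Drupal theming with single directory components.",
    "Scaling Drupal Commerce for high-traffic events.",
]

def embed(texts):
    # Assumed embedding model choice.
    result = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in result.data]

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

session_vectors = embed(sessions)
query = "I want to learn how to add AI features to my Drupal site."
query_vector = embed([query])[0]

# Retrieve the two most similar sessions and use them as prompt context.
ranked = sorted(zip(sessions, session_vectors),
                key=lambda pair: cosine(query_vector, pair[1]), reverse=True)[:2]
context = "\n".join(text for text, _ in ranked)

answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Recommend sessions only from the provided list."},
        {"role": "user", "content": f"Sessions:\n{context}\n\nInterest: {query}"},
    ],
)
print(answer.choices[0].message.content)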
We will discuss the challenges of public LLMs, such as complexity and cost, and how to overcome them. Attendees can apply these insights to optimize their own applications.
The session will end with an ideation process for creating AI applications, offering ideas to inspire attendees when discussing AI use cases with colleagues and partners.