Room:
Room 5 (127-128)
Tags:
drupal showcase, other cms / beyond drupal, development & coding
Track:
agency & business

AI with Drupal - Using LLM technology is easy, but how do you actually build useful applications?

Christoph Breidert, Loredan Szocs

AI functionalities based on Large Language Models (LLMs), such as ChatGPT from OpenAI, are well integrated with Drupal. But finding the right use cases for AI in custom projects can be quite challenging.

In this session, an overview of the possibilities and shortcomings of LLMs will be presented to equip decision makers with the know-how they need to create useful applications.

To inspire attendees, an engaging AI demo application built with Drupal will be presented that recommends DrupalCon sessions through a conversational interface.

Lastly, a simple ideation process will be presented that helps uncover potential use cases for AI applications.

Prerequisite
No prerequisite - all technical parts will be presented in a way that both technical and non-technical people can follow.

Outline
Using AI applications such as ChatGPT from OpenAI is easy, and many people have already gained experience with them. Obvious applications such as creating or rewriting text are already well integrated with Drupal.

However, the real value of Large Language Models (LLMs) lies in building custom applications for clients that use their data and are tailored to their business processes.

Building such applications is difficult: you quickly run into the pitfalls of public LLMs and need to customise them to fit your needs.

In this session we will first give a high-level overview of how LLMs work, so that attendees understand where public models can be used and where customisation starts.

For customisation we will focus on “In-context Learning” and “Retrieval-Augmented Generation” (RAG), because these provide the highest value when building your own applications. Other concepts such as fine-tuning, embeddings, or pre-training will only be discussed at a high level, in the context of how to use LLMs in general.
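
As a rough illustration of the RAG idea, a minimal Python sketch is shown below. It assumes the official OpenAI Python SDK; the model names, the tiny in-memory list of session descriptions, and the helper functions are illustrative placeholders, not the implementation used in the session:

# Minimal RAG sketch (illustrative only): embed session descriptions,
# retrieve the closest matches for a question, and answer with that context.
from openai import OpenAI  # assumes the official OpenAI Python SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical corpus; in a real build this would come from Drupal content.
sessions = [
    "AI with Drupal: building useful LLM applications.",
    "Decoupled Drupal with a modern JavaScript front end.",
    "Scaling Drupal commerce for high-traffic shops.",
]

def embed(texts):
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [d.embedding for d in resp.data]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm

session_vectors = embed(sessions)

def recommend(question, top_k=2):
    # Retrieve the most similar session descriptions ("retrieval").
    q_vec = embed([question])[0]
    ranked = sorted(zip(sessions, session_vectors),
                    key=lambda pair: cosine(q_vec, pair[1]), reverse=True)
    context = "\n".join(text for text, _ in ranked[:top_k])
    # Put the retrieved context into the prompt ("augmented generation").
    answer = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Recommend DrupalCon sessions using only the context provided."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return answer.choices[0].message.content

print(recommend("Which session should I attend to learn about AI?"))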

To illustrate such a customised application, an engaging AI demo built with Drupal will be presented that suggests sessions from DrupalCon Barcelona 2024 to users (yes, using live session data!). The demo will be made available to attendees of the session and to all attendees of the conference.
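
To give a sense of how live session data could reach such a demo, here is a small, purely illustrative Python sketch that reads session nodes from a Drupal site through the core JSON:API module; the hostname, the "session" content type, and the field names are assumptions and will differ from the actual demo:

# Illustrative sketch: read session titles and descriptions from a Drupal
# site via the core JSON:API module. Endpoint and field names are assumed.
import requests

BASE_URL = "https://example.com"  # placeholder Drupal site

def fetch_sessions():
    sessions = []
    url = f"{BASE_URL}/jsonapi/node/session"  # assumes a 'session' content type
    while url:
        data = requests.get(url, timeout=30).json()
        for item in data.get("data", []):
            attrs = item.get("attributes", {})
            sessions.append({
                "title": attrs.get("title", ""),
                # 'body' is Drupal's default body field; may differ per site.
                "description": (attrs.get("body") or {}).get("value", ""),
            })
        # JSON:API paginates; follow the 'next' link until it is absent.
        url = data.get("links", {}).get("next", {}).get("href")
    return sessions

if __name__ == "__main__":
    for s in fetch_sessions()[:5]:
        print(s["title"])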

Based on the demo, we will discuss the shortcomings of public LLMs, such as complexity and cost, and how to overcome them. Attendees can use these findings to optimise their own applications.
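
One concrete way to reason about the cost side is to count tokens before sending a prompt. The following illustrative Python sketch uses the tiktoken library; the per-token price is a placeholder that you would replace with your provider's current price list:

# Rough cost estimate for a prompt (illustrative; the price is a placeholder).
import tiktoken

# cl100k_base is the tokenizer used by several OpenAI chat models;
# pick the encoding that matches the model you actually call.
encoding = tiktoken.get_encoding("cl100k_base")

PRICE_PER_1K_INPUT_TOKENS = 0.001  # placeholder, check your provider's pricing

def estimate_prompt_cost(prompt: str) -> float:
    tokens = len(encoding.encode(prompt))
    return tokens / 1000 * PRICE_PER_1K_INPUT_TOKENS

context = "...retrieved session descriptions..."
question = "Which sessions cover AI with Drupal?"
print(f"Estimated input cost: ${estimate_prompt_cost(context + question):.6f}")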

To wrap up the session, an ideation process for inventing useful AI applications will be presented, along with some example solutions to inspire attendees when they discuss potential AI use cases with colleagues and partners.

Note: I held a similar session with great success at a large IT trade fair in Iceland in February 2024, but without a focus on Drupal.

Learning Objectives
Attendees will learn
- how AI works (at a high level),
- what is possible with LLMs,
- what the shortcomings of pre-trained transformers are,
- what costs are associated with integrating AI and how they can be optimised,
- what the most cost- and resource-effective ways to use LLMs are,
- and which AI providers exist.
Attendees will also be inspired by an engaging demo application and may get ideas for their own AI applications.