Room:
Room 2 (131-132)
Tags:
drupal showcase, other cms / beyond drupal, development & coding
Track:
open web community

Content/Comment-Analyzer 2.0

Julien Hofer (Methodjules)

The City of Hamburg is pioneering an innovative approach to public engagement and content management through the integration of Large Language Models (LLMs) into its Drupal-based digital platforms. This use case explores the development and implementation of a Drupal module designed to leverage natural language processing (NLP) capabilities for analyzing public comments and content. The initiative aims to enhance public service delivery, foster greater community engagement, and streamline content management processes through advanced text analysis techniques.

Prerequisites
Attendees should have a fundamental understanding of module development in Drupal, including familiarity with Drupal's hook system, module architecture, and core APIs.
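To make that prerequisite concrete: the hook system is one natural place where a module like this can react to newly posted public comments. The following is a minimal sketch, assuming a hypothetical module named comment_analyzer and a hypothetical queue name; it illustrates the pattern, not the actual Hamburg implementation.

    <?php

    // comment_analyzer.module (hypothetical module name, for illustration only).

    use Drupal\comment\CommentInterface;

    /**
     * Implements hook_comment_insert().
     *
     * Defers the expensive language-model call by queueing the new comment
     * for later processing (e.g. on cron) instead of analyzing it inline
     * during the page request.
     */
    function comment_analyzer_comment_insert(CommentInterface $comment) {
      \Drupal::queue('comment_analyzer_nlp')->createItem([
        'comment_id' => $comment->id(),
        'text' => $comment->get('comment_body')->value,
      ]);
    }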

Outline
Introduction
Brief overview of the session and its relevance to Drupal users and developers.
Introduction to the City of Hamburg's initiative for using NLP within Drupal.
Section 1: Background and Motivation
Overview of Natural Language Processing (NLP) and Large Language Models (LLMs).
The challenges faced by the City of Hamburg in managing public engagement and content.
Section 2: Drupal Module Development for NLP
Detailed description of the custom Drupal module developed for integrating LLMs.
Architectural overview and integration points within Drupal.
Section 3: Utilizing the Batch API for NLP Tasks
Explanation of the Batch API and its importance for processing large volumes of data.
Strategies for using the Batch API to maintain system performance during NLP analysis.
Section 4: Use Case Scenario - Urban Development Projects
Detailed presentation of the urban development project feedback analysis.
Process from data collection through Drupal forms to insight generation using NLP.
Section 5: Results, Impact, and Lessons Learned
Overview of the results from implementing the NLP module in Hamburg’s urban development projects.
Discussion of the impact on public engagement, content management, and decision-making.
Lessons learned and best practices for integrating NLP into Drupal platforms.
Conclusion and Future Directions
Recap of the benefits of leveraging LLMs for Drupal content and comment analysis.
Exploration of potential future enhancements and integrations.
Q&A session.

Learning Objectives
By attending this presentation, participants will:

Understand the Basics of NLP and LLMs: Gain a foundational understanding of natural language processing and how large language models can be applied within Drupal to enhance content analysis and user engagement.

Learn How to Integrate LLMs with Drupal: Acquire practical knowledge of developing and integrating a Drupal module that interfaces with open-source large language models for text analysis.
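As a rough illustration of such an integration, the sketch below wraps the model behind a small service that posts text to an HTTP endpoint. The class name, endpoint URL, payload, and response shape are assumptions for illustration, not the module presented in the session.

    <?php

    namespace Drupal\comment_analyzer;

    use GuzzleHttp\ClientInterface;

    /**
     * Sends text to an external language-model API and returns its analysis.
     *
     * Endpoint, payload, and response format are illustrative assumptions.
     */
    class CommentAnalyzerClient {

      public function __construct(protected ClientInterface $httpClient) {}

      /**
       * Returns analysis results (e.g. sentiment, topics) for a piece of text.
       */
      public function analyze(string $text): array {
        $response = $this->httpClient->request('POST', 'https://llm.example.org/v1/analyze', [
          'json' => ['text' => $text],
          'timeout' => 30,
        ]);
        return json_decode((string) $response->getBody(), TRUE);
      }

    }

In practice, a class like this would be registered in the module's services.yml and handed Drupal core's http_client service, so the HTTP client remains swappable in tests.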

Discover the Use of the Batch API for Scalable NLP Processing: Understand how to utilize Drupal’s Batch API to manage the computational demands of NLP tasks, ensuring efficient processing of large data sets without compromising system performance.
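A minimal sketch of that Batch API pattern, reusing the hypothetical comment_analyzer module and CommentAnalyzerClient service from above; the chunk size, callback names, and field names are illustrative assumptions.

    <?php

    use Drupal\Core\Batch\BatchBuilder;

    /**
     * Builds a batch that analyzes stored comments in small chunks, so the
     * language-model calls never have to fit into a single request.
     */
    function comment_analyzer_build_batch(array $comment_ids): void {
      $builder = (new BatchBuilder())
        ->setTitle(t('Analyzing comments'))
        ->setFinishCallback('comment_analyzer_batch_finished');

      // One operation per chunk keeps each round trip to the model short.
      foreach (array_chunk($comment_ids, 20) as $chunk) {
        $builder->addOperation('comment_analyzer_batch_process', [$chunk]);
      }
      batch_set($builder->toArray());
    }

    /**
     * Batch operation callback: analyze one chunk of comments.
     */
    function comment_analyzer_batch_process(array $comment_ids, array &$context): void {
      $comments = \Drupal::entityTypeManager()->getStorage('comment')->loadMultiple($comment_ids);
      $client = \Drupal::service('comment_analyzer.client');
      foreach ($comments as $comment) {
        $context['results'][$comment->id()] = $client->analyze($comment->get('comment_body')->value);
      }
    }

    /**
     * Batch finish callback: report how many comments were analyzed.
     */
    function comment_analyzer_batch_finished($success, array $results, array $operations): void {
      \Drupal::messenger()->addStatus(t('Analyzed @count comments.', ['@count' => count($results)]));
    }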

See Real-world Application through the City of Hamburg Use Case: Learn from the City of Hamburg’s experience in implementing a Drupal-based NLP solution for analyzing public feedback on urban development projects, highlighting the module's impact on public service delivery and content management.

Identify Best Practices and Lessons Learned: Benefit from insights into the challenges, solutions, and best practices discovered through the City of Hamburg’s initiative, offering valuable takeaways for implementing similar projects.

Explore Future Possibilities: Stimulate thinking on the potential future applications and enhancements of integrating NLP technologies within Drupal, encouraging innovation and continuous improvement in Drupal content and user interaction strategies.