Drupal Planet

Drupalize.Me: SEO for Drupal Users: What You Need to Know

2 weeks ago
SEO for Drupal Users: What You Need to Know

When I was writing documentation for Drupal CMS’s SEO Tools recommended add-on (aka “recipe”), I realized that not every Drupal site user is up to date on the essentials of SEO, or on how Drupal can help you make your site discoverable by your target audiences.

While Drupal has long been a solid foundation for building search-friendly websites, that doesn't mean every Drupal user knows where to start with SEO.

In this post, I’ll define essential SEO concepts and highlight best practices direct from the documentation I wrote about promoting your site with SEO for the Drupal CMS User Guide.

Whether you're configuring a custom Drupal 11 site or using Drupal CMS, these tips apply to you. All of the tools mentioned below can be installed on any Drupal 11 site, or added via the SEO Tools recommended add-on in Drupal CMS.

Amber Matz Wed, 06/18/2025 - 14:41

Dries Buytaert: Automating alt-text generation with AI

2 weeks ago

Billions of images on the web lack proper alt-text, making them inaccessible to millions of users who rely on screen readers.

My own website is no exception, so a few weeks ago, I set out to add missing alt-text to about 9,000 images on this website.

What seemed like a simple fix became a multi-step challenge. I needed to evaluate different AI models and decide between local and cloud processing.

To make the web better, a lot of websites need to add alt-text to their images. So I decided to document my progress here on my blog so others can learn from it – or offer suggestions. This third post dives into the technical details of how I built an automated pipeline to generate alt-text at scale.

High-level architecture overview

My automation process follows three steps for each image:

  1. Check if alt-text exists for a given image
  2. Generate new alt-text using AI when missing
  3. Update the database record for the image with the new alt-text

The rest of this post goes into more detail on each of these steps. If you're interested in the implementation, you can find most of the source code on GitHub.

Retrieving image metadata

To systematically process 9,000 images, I needed a structured way to identify which ones were missing alt-text.

Since my site runs on Drupal, I built two REST API endpoints to interact with the image metadata:

  • GET /album/{album-name}/{image-name}/get – Retrieves metadata for an image, including title, alt-text, and caption.
  • PATCH /album/{album-name}/{image-name}/patch – Updates specific fields, such as adding or modifying alt-text.

I've built similar APIs before, including one for my basement's temperature and humidity monitor. That post provides a more detailed breakdown of how I build endpoints like this.

This API uses separate URL paths (/get and /patch) for different operations, rather than using a single resource URL. I'd prefer to follow RESTful principles, but this approach avoids caching problems, including content negotiation issues in CDNs.

Anyway, with the new endpoints in place, fetching metadata for an image is simple:

[code bash]curl -H "Authorization: test-token" \
  "https://dri.es/album/isle-of-skye-2024/journey-to-skye/get"[/code]

Every request requires an authorization token. And no, test-token isn't the real one. Without it, anyone could edit my images. While crowdsourced alt-text might be an interesting experiment, it's not one I'm looking to run today.

This request returns a JSON object with image metadata:

[code bash]{ "title": "Journey to Skye", "alt": "", "caption": "Each year, Klaas and I pick a new destination for our outdoor adventure. In 2024, we set off for the Isle of Skye in Scotland. This stop was near Glencoe, about halfway between Glasgow and Skye." } [/code]

Because the alt field is empty, the next step is to generate a description using AI.

Generating and refining alt-text with AI

In my first post on AI-generated alt-text, I wrote a Python script to compare 10 different local Large Language Models (LLMs). The script uses PyTorch, a widely used machine learning framework for AI research and deep learning. This implementation was a great learning experience.

The original script takes an image as input and generates alt-text using multiple LLMs:

[code bash]./caption.py journey-to-skye.jpg
{
  "image": "journey-to-skye.jpg",
  "captions": {
    "vit-gpt2": "A man standing on top of a lush green field next to a body of water with a bird perched on top of it.",
    "git": "A man stands in a field next to a body of water with mountains in the background and a mountain in the background.",
    "blip": "This is an image of a person standing in the middle of a field next to a body of water with a mountain in the background.",
    "blip2-opt": "A man standing in the middle of a field with mountains in the background.",
    "blip2-flan": "A man is standing in the middle of a field with a river and mountains behind him on a cloudy day.",
    "minicpm-v": "A person standing alone amidst nature, with mountains and cloudy skies as backdrop.",
    "llava-13b": "A person standing alone in a misty, overgrown field with heather and trees, possibly during autumn or early spring due to the presence of red berries on the trees and the foggy atmosphere.",
    "llava-34b": "A person standing alone on a grassy hillside with a body of water and mountains in the background, under a cloudy sky.",
    "llama32-vision-11b": "A person standing in a field with mountains and water in the background, surrounded by overgrown grass and trees."
  }
}[/code]

My original plan was to run everything locally for full control, no subscription costs, and optimal privacy. But after testing 10 local LLMs, I changed my mind.

I knew cloud-based models would be better, but wanted to see if local models were good enough for alt-texts. Turns out, they're not quite there. You can read the full comparison, but I gave the best local models a B, while cloud models earned an A.

While local processing aligned with my principles, it compromised the primary goal: creating the best possible descriptions for screen reader users. So I abandoned my local-only approach and decided to use cloud-based LLMs.

To automate alt-text generation for 9,000 images, I needed programmatic access to cloud models rather than relying on their browser-based interfaces — though browser-based AI can be tons of fun.

Instead of expanding my script with cloud LLM support, I switched to Simon Willison's llm tool: https://llm.datasette.io/. llm is a command-line tool and Python library that supports both local and cloud-based models. It takes care of installation, dependencies, API key management, and uploading images. Basically, all the things I didn't want to spend time maintaining myself.
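As a rough illustration (a sketch, not the exact commands used for this project), generating a caption directly with the llm CLI looks something like the following. The model alias and the -a attachment flag assume a recent llm release with image support and an OpenAI key already configured:

[code bash]# Sketch only: assumes llm >= 0.17 (image attachments via -a) and an
# OpenAI API key configured with `llm keys set openai`.
llm "Write concise alt-text for this photo. Location: Glencoe, Scotland." \
  -a journey-to-skye.jpg \
  -m gpt-4o[/code]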

Despite enjoying my PyTorch explorations with vision language models and multimodal encoders, I needed to focus on results. My weekly progress goal meant prioritizing working alt-text over building homegrown inference pipelines.

I also considered you, my readers. If this project inspires you to make your own website more accessible, you're better off with a script built on a well-maintained tool like llm rather than trying to adapt my custom implementation.

Scrapping my PyTorch implementation stung at first, but building on a more mature and active open-source project was far better for me and for you. So I rewrote my script, now in the v2 branch, with the original PyTorch version preserved in v1.

The new version of my script keeps the same simple interface but now supports cloud models like ChatGPT and Claude:

[code bash]./caption.py journey-to-skye.jpg --model chatgpt-4o-latest claude-3-sonnet \
  --context "Location: Glencoe, Scotland"
{
  "image": "journey-to-skye.jpg",
  "captions": {
    "chatgpt-4o-latest": "A person in a red jacket stands near a small body of water, looking at distant mountains in Glencoe, Scotland.",
    "claude-3-sonnet": "A person stands by a small lake surrounded by grassy hills and mountains under a cloudy sky in the Scottish Highlands."
  }
}[/code]

The --context parameter improves alt-text quality by adding details the LLM can't determine from the image alone. This might include GPS coordinates, album titles, or even a blog post about the trip.

In this example, I added "Location: Glencoe, Scotland". Notice how ChatGPT-4o mentions Glencoe directly while Claude-3 Sonnet references the Scottish Highlands. This contextual information makes descriptions more accurate and valuable for users. For maximum accuracy, use all available information!

Updating image metadata

With alt-text generated, the final step is updating each image. The PATCH endpoint accepts only the fields that need changing, preserving other metadata:

[code bash]curl -X PATCH \
  -H "Authorization: test-token" \
  "https://dri.es/album/isle-of-skye-2024/journey-to-skye/patch" \
  -d '{
    "alt": "A person stands by a small lake surrounded by grassy hills and mountains under a cloudy sky in the Scottish Highlands."
  }'[/code]

That's it. This completes the automation loop for one image. It checks if alt-text is needed, creates a description using a cloud-based LLM, and updates the image if necessary. Now, I just need to do this about 9,000 times.
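To give a sense of how these three steps might be chained for a batch run, here is a hypothetical wrapper (a sketch, not the actual code from the repository). It assumes jq is installed, an images.txt file listing album and image names, matching local JPEG files under photos/, and the real authorization token in TOKEN:

[code bash]#!/bin/bash
# Hypothetical batch wrapper (sketch only, not the author's script).
# images.txt contains lines like: isle-of-skye-2024 journey-to-skye
TOKEN="test-token"  # replace with the real authorization token
while read -r album image; do
  url="https://dri.es/album/$album/$image"

  # Step 1: skip images that already have alt-text
  alt=$(curl -s -H "Authorization: $TOKEN" "$url/get" | jq -r '.alt // empty')
  [ -n "$alt" ] && continue

  # Step 2: generate a description with the cloud-backed caption script
  caption=$(./caption.py "photos/$album/$image.jpg" --model chatgpt-4o-latest \
    | jq -r '.captions["chatgpt-4o-latest"]')

  # Step 3: write the new alt-text back via the PATCH endpoint
  curl -s -X PATCH -H "Authorization: $TOKEN" "$url/patch" \
    -d "$(jq -n --arg alt "$caption" '{alt: $alt}')"
done < images.txt[/code]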

Tracking AI-generated alt-text

Before running the script on all 9,000 images, I added a label to the database that marks each alt-text as either human-written or AI-generated. This makes it easy to:

  • Re-run AI-generated descriptions without overwriting human-written ones
  • Upgrade AI-generated alt-text as better models become available

With this approach I can update the AI-generated alt-text when ChatGPT 5 is released. And eventually, it might allow me to return to my original principles: to use a high-quality local LLM trained on public domain data. In the meantime, it helps me make the web more accessible today while building toward a better long-term solution tomorrow.
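As a rough sketch of how that label could drive a re-run (the alt_source field name here is hypothetical, not the actual schema), the update step might first check who wrote the current description:

[code bash]# Sketch only: "alt_source" is a hypothetical field, not the real schema.
TOKEN="test-token"
url="https://dri.es/album/isle-of-skye-2024/journey-to-skye"
if [ "$(curl -s -H "Authorization: $TOKEN" "$url/get" | jq -r '.alt_source')" = "ai" ]; then
  echo "AI-generated alt-text: safe to regenerate with a newer model"
else
  echo "Human-written alt-text: leave it untouched"
fi[/code]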

Next steps

Now that the process is automated for a single image, the last step is to run the script on all 9,000. And honestly, it makes me nervous. The perfectionist in me wants to review every single AI-generated alt-text, but that is just not feasible. So, I have to trust AI. I'll probably write one more post to share the results and what I learned from this final step.

Stay tuned.

Dries Buytaert: If a note can be public, it should be

2 weeks ago

A few years ago, I quietly adopted a small principle that has changed how I think about publishing on my website. It's a principle I've been practicing for a while now, though I don't think I've ever written about it publicly.

The principle is: If a note can be public, it should be.

It sounds simple, but this idea has quietly shaped how I treat my personal website.

I was inspired by three overlapping ideas: digital gardens, personal memexes, and "Today I Learned" entries.

Writers like Tom Critchlow, Maggie Appleton, and Andy Matuschak maintain what they call digital gardens. They showed me that a personal website does not have to be a collection of polished blog posts. It can be a living space where ideas can grow and evolve. Think of it more as an ever-evolving notebook than a finished publication, constantly edited and updated over time.

I also learned from Simon Willison, who publishes small, focused Today I Learned (TIL) entries. They are quick, practical notes that capture a moment of learning. They don't aim to be comprehensive; they simply aim to be useful.

And then there is Cory Doctorow. In 2021, he explained his writing and publishing workflow, which he describes as a kind of personal memex. A memex is a way to record your knowledge and ideas over time. While his memex is not public, I found his approach inspiring.

I try to take a lot of notes. For the past four years, my tool of choice has been Obsidian. It is where I jot things down, think things through, and keep track of what I am learning.

In Obsidian, I maintain a Zettelkasten system. It is a method for connecting ideas and building a network of linked thoughts. It is not just about storing information but about helping ideas grow over time.

At some point, I realized that many of my notes don't contain anything private. If they're useful to me, there is a good chance they might be useful to someone else too. That is when I adopted the principle: If a note can be public, it should be.

So a few years ago, I began publishing these kinds of notes on my site. You might have seen examples like Principles for life, PHPUnit tests for Drupal, Brewing coffee with a moka pot when camping, or Setting up password-free SSH logins.

These pages on my website are not blog posts. They are living notes. I update them as I learn more or come back to the topic. To make that clear, each note begins with a short disclaimer that says what it is. Think of it as a digital notebook entry rather than a polished essay.

Now, I do my best to follow my principle, but I fall short more than I care to admit. I have plenty of notes in Obsidian that could have made it to my website but never did.

Often, it's simply inertia. Moving a note from Obsidian to my Drupal site involves a few steps. While not difficult, these steps consume time I don't always have. I tell myself I'll do it later, and then 'later' often never arrives.

Other times, I hold back because I feel insecure. I am often most excited to write when I am learning something new, but that is also when I know the least. What if I misunderstood something? The voice of doubt can be loud enough to keep a note trapped in Obsidian, never making it to my website.

But I keep pushing myself to share in public. I have been learning in the open and sharing in the open for 25 years, and some of the best things in my life have come from that. So I try to remember: if notes can be public, they should be.

The Drop Times: A Look Under the Hood of Lupus Decoupled Drupal

2 weeks 1 day ago
Forget the usual trade-offs of headless architecture. Lupus Decoupled keeps Drupal’s powerful backend features intact while giving developers full control on the frontend. In this exclusive interview, Wolfgang Ziegler of Drunomics breaks down how the system works, why it matters, and what’s coming next for the project that’s redefining decoupled Drupal.

Metadrop: Metadrop April 2025: new releases for Drupal ecosystem, privacy and content editorial experience

2 weeks 1 day ago

In March, Metadrop continued its contributions to the Drupal ecosystem with a particular focus on privacy and content editorial experience. The team released new modules, updated existing ones, added integrations, and assisted clients with some internal issues not directly related to Drupal, while still having time to do research on AI.

New modules and releases

Iframe Consent

We developed a new module to manage IFrame consent, ensuring GDPR-compliant handling of embedded iframes by loading third-party content only after obtaining user consent. This effort enhances privacy, complementing existing modules like EXIF Removal and Youtube Cookies.

Watchdog Statistics 1.0.6

The release of version 1.0.6 added date filters, enabling users to generate reports from previous months and display log statistics for the last month — an…

Talking Drupal: Talking Drupal #507 - International Drupal Federation

2 weeks 2 days ago

In this episode of Talking Drupal, we delve into the International Drupal Federation Initiative with our guest Tim Doyle, CEO of the Drupal Association. We explore the goals, structure, and potential impact of this initiative on the global Drupal community. Additionally, we cover the Modeler API as our module of the week, discussing its functionalities and future potential. Joining the discussion are hosts John Picozzi, Norah Medlin, Nic Laflin, and Martin Anderson-Clutz, who bring their insights and perspectives to the table.

For show notes visit: https://www.talkingDrupal.com/507

Topics
  • Meet the Guest: Tim Doyle
  • Module of the Week: Modeler API
  • Deep Dive into Modeler API
  • Introducing the International Drupal Federation Initiative
  • Governance and Global Impact
  • Challenges and Future Prospects
  • Annual Meeting and Governance Structure
  • Challenges in Crafting Agreements
  • Local Associations and Their Needs
  • Engagement and Communication Strategies
  • Regional Organizations and Governance
  • US-Based Not-for-Profit Focus
  • International Federation and Local Support
  • Potential Risks and Governance Models
  • Implementation Timeline and Costs
  • Legal and Organizational Considerations
  • Community Involvement and Feedback
  • Conclusion and Contact Information
Resources

  • International Drupal Federation Initiative
  • Recent DA Video
  • Feature on The Drop Times
  • ASBL

Guests

Tim Doyle - Drupal.org Tim D.

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Norah Medlin - tekNorah

Module of the Week

with Martin Anderson-Clutz - mandclu.com mandclu

Modeler API

The Modeler API provides an API for modules like ECA (Events, Conditions, Actions), Migrate Visualize, AI Agents, and maybe others. The purpose is to allow those modules to utilize modelers like BPMN.iO (and maybe others in the future) to build diagrams constructed of their components (e.g. plugins) and write them back into module-specific config entities.

The Drop Times: Honoring the Balance

2 weeks 2 days ago

Dear Readers,

Something subtle is shifting in the Drupal space. Over the past year, there has been a clear move to consolidate around Drupal CMS as the central message of the project. The intention is understandable. Making Drupal more accessible through low-code and visual tooling lowers the barrier for new users and small teams. But this unified direction, while strategic, risks unintentionally simplifying the perception of what Drupal is and what it is still capable of.

That concern comes into focus when we look at how DrupalCon Atlanta was structured. The sessions and keynotes gave the impression that Drupal CMS is not just a major initiative, but the primary path forward. Yet Drupal has always been more than a product. It has been a framework that adapts to a wide range of use cases, especially in enterprise environments. There was noticeably less visibility for advanced architectures, decoupled implementations, or the tools that support complex digital ecosystems.

This is where the reflections of community members like Jesus Manuel Olivas add useful contrast. His take on DrupalCon highlighted the gap between the official storyline and what many agencies are actively building. For organizations that rely on multi-site strategies, custom front-end frameworks, and API-first infrastructure, the current messaging does not quite reflect their day-to-day reality. These are not theoretical edge cases. They are living, large-scale implementations shaping digital strategy across industries.

Drupal's strength has always come from its flexibility. As the project evolves, it’s important to keep that core identity intact. There is room for Drupal CMS to grow without overshadowing the more complex and less visible work happening across the ecosystem. Honoring that balance is not just a matter of inclusion. It is a matter of relevance.


We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now.

To get timely updates, follow us on LinkedIn, Twitter and Facebook. You can also join us on Drupal Slack at #thedroptimes.

Thank you, 
Sincerely 
Alka Elizabeth 
Sub-editor, The DropTimes.

Salsa Digital: Inside the Drupal AI Strategy

2 weeks 2 days ago
Strategy, Not Hype: The Real Plan for AI in Drupal

Let's be clear, Drupal's AI Initiative isn't a rushed bolt-on. It's a full architectural rethink, designed to embed AI into the platform with a level of governance, flexibility, and transparency that most digital experience platforms can't touch. The goal isn't just to keep up with the market, but to set the benchmark for open-source AI in government and enterprise.

Architectural Principles: Human Control, Open Choice

Human-in-the-Loop, Always: AI agents generate content, layouts, and optimisation suggestions, but every change is logged, auditable, and subject to human review or rollback. Public sector and regulated industries demand this; Drupal is making it non-negotiable.

Gbyte blog: Paginate a grouped Drupal view

2 weeks 3 days ago
Solving Pagination for Grouped Content

In Drupal Views, "grouping" refers to the feature found in the Format section's Style options. When configuring a view's format (Table, HTML List, Grid, etc.), you can select one or more fields to group results by. This creates visual sections with headers for each unique field value, organizing content into logical categories.

The Problem

Standard Views pagination in Drupal operates on individual rows, without awareness of the grouping structure. This results in a broken, inconsistent pager with groups split across multiple pages.

The Solution

Views Grouping Field Pager treats each group as a pagination unit rather than each row. This ensures:

  • Groups remain intact across pages
  • The pager navigation works as expected, displaying the correct number of items
Use Cases
  • Project libraries grouped by category
  • News articles organized by date
  • Product catalogs grouped by type
  • Event listings arranged by date
  • Research collections organized by topic
Technical Implementation

The module:

Drupalize.Me: What’s New in the Drupal CMS User Guide: June 2025 Update

2 weeks 5 days ago
What’s New in the Drupal CMS User Guide: June 2025 Update

Since the launch of the Drupal CMS earlier this year, we’ve been hard at work documenting everything you need to build and maintain a site using this new, streamlined Drupal experience. Our goal is to make the Drupal CMS User Guide a go-to reference for site builders of all experience levels — especially those coming to Drupal for the first time.

In this post, we'll share an update on the current state of the guide, highlight two new sections we’re especially excited about, and show you how your team or company can help move the project forward.

Amber Matz Fri, 06/13/2025 - 16:19

Palantir: Palantir is Now a Top Tier Drupal Certified Partner

2 weeks 5 days ago
Palantir is Now a Top Tier Drupal Certified Partner

demet Fri, 06/13/2025 - 10:53

How partnerships benefit open source projects and their customers

Palantir has been contributing to open source projects for over 25 years, and we’re one of the world's leading contributors to the Drupal project and community. Over the years we’ve channeled our expertise into creating key modules now incorporated into Drupal core and helping organizations across healthcare, government, education, and more with Drupal solutions tailored to their needs.

We’re proud to announce that Palantir was recently accredited as a Top Tier Drupal Certified Partner. There are currently only three other companies in the entire world who have achieved this status based on their contributions to the project and community.

What is a Drupal Certified Partner?

The Certified Partnership program arose within the Drupal community to recognize the efforts of agencies that significantly contribute to the platform. Drupal Certified Partners are service providers that demonstrate three key characteristics:

  • A high level of expertise in Drupal
  • Commitment to contributing to Drupal core
  • Building the Drupal community

The program is overseen by the Drupal Association, which requires firms to demonstrate how they meet these criteria. Organizations looking for Drupal partners can trust Drupal Certified Partnership status to signal that the company leverages best-in-class Drupal expertise.

Organizations that work with Drupal Certified Partners can trust they're collaborating with experts who not only build with Drupal, but help build Drupal itself.

What are some of Palantir’s Drupal contributions?

At Palantir, we have a decades-long track record of building, maintaining, extending and promoting Drupal. As a Top Tier Partner, our clients trust us to maximize the platform’s capabilities to deliver best-in-class web applications—learn more about our Drupal consulting services today.

EditTogether

One of our most recent Drupal innovations is EditTogether: a free, multi-user editing solution that allows you to write and edit exactly where you publish. Rather than relying on external tools for collaborative editing, EditTogether allows you to edit, comment on, and version control any field on your Drupal site. This eliminates the need for copy-pasting between your editor and publisher, avoids vendor lock-in, and allows you to take advantage of Drupal’s commitment to data sovereignty in your editing workflows.

EditTogether was conceived as part of our client work for a division of a state agency that provides Medicaid services to its residents. Their content workflows were high-friction and inefficient, relying on transferring content between several proprietary tools for writing, editing, and publishing. After auditing the available solutions, we realized there wasn’t a highly customizable, extensible editor in Drupal that could meet our client’s rigorous security requirements—so our team of developers and architects built a rich text editor using ProseMirror tools and connected it to Drupal via the Drupal Text Editor API.

After successfully delivering the EditTogether solution to the client, we’re currently validating EditTogether against other use cases and preparing for wider release back to the community.

Drupal.org modernization

When the Drupal Association needed to modernize their own site, they trusted us to deliver. In 2024, the Promote Drupal team wanted to migrate Drupal.org from Drupal 7 to Drupal 10 to show off Drupal’s most recent capabilities. Drupal.org is the online home of the Drupal community—so they needed to balance modernizing the site for new adopters and evaluators with making sure the 12,000 pages of documentation and 50,000 modules on the site remained continuously available for the entire community.

The Drupal Association wanted to work with a firm that not only knew Drupal inside out, but could oversee an incremental development plan to deliver the new content as fast as possible while still maintaining excellent service for the existing community. They chose Palantir.net to tackle the job: we delivered their needed site modernizations, dramatically improved test coverage, and built a new components-based design system to make adding new content a breeze.

We’re continuing to work with the Drupal Association to enhance their development team with added technical expertise and strategic consulting for the next phases of the modernization plan.

Why are partnerships vital to open source projects?

Open source technologies give developers and communities greater levels of transparency, flexibility, and financial accessibility in their digital projects—but maintaining open source projects can be challenging. Funding is frequently tight, and governance organizations are often overwhelmed with support requests. Open source solutions rely on organizations contributing their time, expertise, and money back into the technology to keep the lights on. However, many companies view open source contributions as simply a marketing opportunity or a “nice to have”—and the first thing to get cut when to-do lists start to get unwieldy.

The Drupal project wanted to incentivize more people to become active open source contributors—and this is where Certified Partnerships come in.

Partnerships reward consistent contribution

In exchange for meeting certain requirements (for example, a specified number of code commits, or donating a certain amount of money to the open source foundation), organizations can become certified partners of their communities—and enjoy a range of associated benefits.

Certified partnerships are also advantageous to organizations looking for agencies with proven expertise in a specific open source technology. When you work with a certified partner, you’re working with a company who actively helps to build and maintain the software you need. Partners have an intimate knowledge of the software they contribute to, and how to make the most of its capabilities. This knowledge and expertise also enable them to build custom solutions that work well with the core technology.

What makes the Drupal Certified Partner Program unique?

The Drupal Certified Partnerships program recognizes companies and organizations that have “gone above and beyond to sustain and grow the Drupal program”.

Drupal Certified Partnerships don’t just recognize code contributions—they also reward active community participation and efforts to encourage Drupal adoption. Organizations qualify for Certified Partnership through a number of criteria:

  • Contributions and commit credits, weighted towards the aspects of Drupal that are most frequently used and need the most work
  • Proven case studies and success stories of delivering Drupal solutions to clients
  • Annual financial sponsorship of the Drupal Association
  • Event organization, participation, and sponsorship
  • Staff acting in contributor roles, such as board members, mentors, or content moderators
  • Drupal Association members on staff

The partnership program bolsters the thriving community around Drupal. When organizations are looking to partner with service providers with deep technical expertise, they have a robust marketplace of partners to choose from across regional markets and industries. 

Drupal Certified Partners offer solutions you can trust

Open source software (OSS) offers rigorously maintained, transparent solutions, without the hefty price tags of proprietary software—offering organizations of all sizes many incentives to default to open. According to the MIT Technology Review, “The free and open-source software movement is not only alive and well; it has become a keystone of the tech industry.” Indeed, the US Digital Services Playbook encourages federal agencies to evaluate open source solutions at every level of their stack.

When organizations decide to partner with external agencies for their digital transformation and modernization projects, there’s good news and bad news. The good news: numerous agencies build with open source solutions. The bad news: numerous agencies build with open source solutions.

Working with Drupal offers you transparency, flexibility, built-in accessibility, and data sovereignty. Working with a Drupal Certified Partner means you’re working with the people who build Drupal—and who are uniquely positioned to make the most of its powerful capabilities. As Drupal Association CEO Tim Doyle puts it, “Drupal Certified Partners are esteemed agencies that exhibit a profound level of expertise in Drupal, representing a select group of contributors crucial to the vitality and future prosperity of the Drupal ecosystem.”

We’ve been trusted by clients across numerous industries—and by the Drupal Association themselves—to build and deliver highly customized, performant Drupal solutions prioritizing user-centered design, accessibility, and turnkey security.

Learn more about our modular approach to custom Drupal development, or get in touch to discover how we can tailor our expertise to your organization’s needs.

Drupal AI Initiative: Drupal AI 1.1.0 is out and brings major new features!

2 weeks 5 days ago

A huge joint stable release of AI was made yesterday, where 10 modules were updated to 1.1.0. This is a major milestone for Drupal AI and brings a host of improvements and significant new features.

Announcement by Marcus Johansson.

Some technical facts of the AI release:

  • 105 issues fixed on AI Core, 75 issues fixed on AI Agents and roughly 25 issues on providers.
  • 90 unique contributors on AI Core
  • 244 files changed, 6500 lines added, 15044 lines removed on AI Core.

The most important updates are function calling and a new agent framework, but here is a list of some other new features:

  • Custom Operation Types
  • Drush command to run AI
  • Make the chatbot tell you what it's doing, while it's doing it
  • Make it possible to add autocomplete fields to AI Automator Chains on CKEditor
  • AI Content Suggestion can be based on the rendered HTML of the entity
  • Normalized Structured Response
  • And many more

Some things directly connected to it:

  • Drupal CMS AI agents have been updated and are now 90% effective, up from 80%.
  • A new test framework has been created so their effectiveness can be seen more clearly, and non-developers can create new tests.
  • Improvements to AI search that make use of function calling mean it will search more accurately when you ask it to.
  • Over 1,000 people in the #ai Slack channel
  • An officially funded AI position and a strategic initiative with multiple companies to make this sustainable
  • 4,600 installs of AI, up from 3,000 at DrupalCon Atlanta

Full details of Drupal AI 1.1.0

🚀 Major New Feature: Agentic Framework

By far the largest feature is the new agentic framework, where anyone can build agents without writing a single line of code. Agents are stored as configuration, meaning you can build once, then export and ship anywhere. And you can trigger them from anywhere you want: chatbot, CLI, widgets, via an API, etc.

Because Drupal is such a flexible and stable CMF, it will be the perfect agent runner. It automatically adheres to your content's permissions, files, etc., while also keeping humans in the loop.

With the announcement of the new AI Initiative, the future looks bright for Drupal and AI; the announcement is here if you haven't seen it.

🔧 Standards-Based Tooling

We’ve implemented the tooling in an MCP-standardized way, which opens the door for seamless integration with external tools—and makes it easier to share and reuse agents and tools across systems and languages.

🧠 Visual Tools for Everyone

Beyond this, we have also added a visual AI Agents Testing Tool where you can set up complex scenarios and retest them over and over, without having to be a developer. An advanced agent tracing tool is also in the pipeline!

The 1.1.0 release also makes it possible to set up agents graphically via the Modeler API and BPMN.iO, thanks to Jürgen Haas! He also has a visual way of building tools on the way.

The hope is that everything we build should be usable directly from the browser!

🙌 Thank You

I thought I would thank a list of people who have contributed to this, but I checked the unique contributors via git and got 90 people. So I will not try to list everyone, because I will surely forget someone - but special thanks to James Abrahams and FreelyGive Ltd for giving me the opportunity to work on this full time (and more :) )!

It will be really exciting to see what kinds of agents people build. We have already tried everything from agents that check configuration diffs for you to agents that build components from images. Hopefully your imagination, and not the framework, will be the limit ;)

We have an MR that can be tested for using this within Drupal CMS as well.

For anyone interested in the new agent framework, I did a presentation on it at Drupal Days Leuven and also have a longer developer preview. More videos and documentation will follow.

The Drop Times: "We Really Want More People Contributing to Drupal"

2 weeks 5 days ago
Georgia’s Chief Digital and AI Officer, Nikhil Deshpande, shares insights with The DropTimes on how open-source technology and Drupal have transformed state services. In this interview, conducted by sub-editor Alka Elizabeth, Nikhil discusses accessibility, AI integration, community contribution, and the future of digital government platforms.

Drupal blog: Accelerating AI innovation in Drupal

2 weeks 6 days ago

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Drupal launches its AI Initiative with more than $100,000 in funding and a dedicated team to build AI tools for website creation and content management.
 

Imagine a marketer opening Drupal with a clear goal in mind: launch a campaign for an upcoming event.

They start by uploading a brand kit to Drupal CMS: logos, fonts, and color palette. They define the campaign's audience as mid-sized business owners interested in digital transformation. Then they create a creative guide that outlines the event's goals, key messages, and tone.

With this in place, AI agents within Drupal step in to assist. Drawing from existing content and media, the agents help generate landing pages, each optimized for a specific audience segment. They suggest headlines, refine copy based on the creative guide, create components based on the brand kit, insert a sign-up form, and assemble everything into cohesive, production-ready pages.

Using Drupal's built-in support for the Model Context Protocol (MCP), the AI agents connect to analytics tools and monitor performance. If a page is not converting well, the system makes overnight updates. It might adjust layout, improve clarity, or refine the calls to action.

Every change is tracked. The marketer can review, approve, revert, or adjust anything. They stay in control, even as the system takes on more of the routine work.

Why it matters

AI is changing how websites are built and managed faster than most people expected. The digital experience space is shifting from manual workflows to outcome-driven orchestration. Instead of building everything from scratch, users will set goals, and AI will help deliver results.

This future is not about replacing people. It is about empowering them. It is about freeing up time for creative and strategic work while AI handles the rest. AI will take care of routine tasks, suggest improvements, and respond to real-time feedback. People will remain in control, but supported by powerful new tools that make their work easier and faster.

The path forward won't be perfect. Change is never easy, and there are still many lessons to learn, but standing still isn't an option. If we want AI to head in the right direction, we have to help steer it. We are excited to move fast, but just as committed to doing it thoughtfully and with purpose.

The question is not whether AI will change how we build websites, but how we as a community will shape that change.

A coordinated push forward

Drupal already has a head start in AI. At DrupalCon Barcelona 2024, I showed how Drupal's AI tools help a site creator market wine tours. Since then, we have seen a growing ecosystem of AI modules, active integrations, and a vibrant community pushing boundaries. Today, about 1,000 people are sharing ideas and collaborating in the #ai channel on Drupal Slack.

At DrupalCon Atlanta in March 2025, I shared our latest AI progress. We also brought together key contributors working on AI in Drupal. Our goal was simple: get organized and accelerate progress. After the event, the group committed to align on a shared vision and move forward together.

Since then, this team has been meeting regularly, almost every day. I've been working with the team to help guide the direction. With a lot of hard work behind us, I'm excited to introduce the Drupal AI Initiative.

The Drupal AI Initiative builds on the momentum in our community by bringing structure and shared direction to the work already in progress. By aligning around a common strategy, we can accelerate innovation.

What we're launching today

The Drupal AI Initiative is closely aligned with the broader Drupal CMS strategy, particularly in its focus on making site building both faster and easier. At the same time, this work is not limited to Drupal CMS. It is also intended to benefit people building custom solutions on Drupal Core, as well as those working with alternative distributions of Drupal.

To support this initiative, we are announcing:

  • A clear strategy to guide Drupal's AI vision and priorities (PDF mirror).
  • A Drupal AI leadership team to drive product direction, fundraising, and collaboration across work tracks.
  • A funded delivery team focused on execution, with the equivalent of several full-time roles already committed, including technical leads, UX and project managers, and release coordination.
  • Active work tracks covering areas like AI Core, AI Products, AI Marketing, and AI UX.
  • USD $100,000 in operational funding, contributed by the initiative's founding companies.

For more details, read the full announcement on the Drupal AI Initiative page on Drupal.org.

Founding members and early support

Some of the founding members of the Drupal AI initiative during our launch call on Google Hangouts.

Over the past few months, we've invested hundreds of hours shaping our AI strategy, defining structure, and taking first steps.

I want to thank the founding members of the Drupal AI Initiative. These individuals and organizations played a key role in getting things off the ground. The list is ordered alphabetically by last name to recognize all contributors equally:

These individuals, along with the companies supporting them, have already contributed significant time, energy, and funding. I am grateful for their early commitment.

I also want to thank the staff at the Drupal Association and the Drupal CMS leadership team for their support and collaboration.

What comes next

I'm glad the Drupal AI Initiative is now underway. The Drupal AI strategy is published, the structure is in place, and multiple work tracks are open and moving forward. We'll share more details and updates in the coming weeks.

With every large initiative, we are evolving how we organize, align, and collaborate. The Drupal AI Initiative builds on that progress. As part of that, we are also exploring more ways to recognize and reward meaningful contributions.

We are creating ways for more of you to get involved with Drupal AI. Whether you are a developer, designer, strategist, or sponsor, there is a place for you in this work. If you're part of an agency, we encourage you to step forward and become a Maker. The more agencies that contribute, the more momentum we build.

Update: In addition to the initiative's founding members, Amazee.io already stepped forward with another commitment of USD $20,000 and one full-time contributor. Thank you! This brings the total operating budget to USD $120,000. Please consider joining as well.

AI is changing how websites and digital experiences are built. This is our moment to be part of the change and help define what comes next.

Join us in the #ai-initiative channel on Drupal Slack to get started.

Talking Drupal: TD Cafe #004 - Ivan Stegic & Randy Oest

3 weeks ago

In this episode, Ivan Stegic and Randy Oest discuss the impact of AI on junior developers and other roles, debating whether AI will be a disruptive force in the job market. They delve into the complexities of using LinkedIn for job hunting and effective networking strategies. The conversation shifts to new features in Figma, the potential of AI-driven coding tools like Cursor, and the importance of investing in junior developers. They also explore higher education design systems, innovative business strategies, and reflect on the balance between tactical and digital controls in modern cars. The episode wraps up with a light-hearted chat about slang, parental roles, and mentorship.

For show notes visit: https://www.talkingDrupal.com/cafe004

Topics

Ivan Stegic

Ivan is a prominent leader in the Drupal community and the founder of TEN7, a Minneapolis-based technology studio specializing in Drupal development, strategy, and digital transformation. With a background in physics and a passion for problem-solving, Ivan transitioned from science to tech, ultimately finding a perfect fit in the open-source world of Drupal. Since founding TEN7 in 2007, Ivan has championed Drupal as a powerful, scalable platform for mission-driven organizations, nonprofits, and enterprises. Under his leadership, TEN7 has delivered impactful Drupal solutions for clients across education, healthcare, and government sectors. Ivan is also known for fostering a people-first company culture grounded in trust, transparency, and continuous improvement. Beyond his work at TEN7, Ivan is an active contributor to the Drupal project, frequently speaking at DrupalCons and camps, hosting the ONE OF 8 BILLION podcast (formerly the TEN7 Podcast), and mentoring others in the community. His advocacy for open source and ethical tech underscores his commitment to using Drupal to make the internet—and the world—a better place.

Randy Oest

Randy is a design strategist, creative director, and accessibility advocate helping mission-driven organizations craft inclusive, user-centered digital experiences. With a background that spans visual design, front-end development, and content strategy, Randy specializes in building scalable design systems and digital platforms—particularly within the Drupal ecosystem. As the former Creative Director at Four Kitchens, Randy led cross-functional teams in developing cohesive design strategies, architecting front-end systems, and aligning user experience with organizational goals. He’s known for bridging the gap between high-level vision and implementation, ensuring that every project is both beautiful and deeply usable. Beyond his client work, Randy is a frequent speaker at DrupalCon, regional camps, and virtual events, where he shares insights on accessibility, usability, and design systems. A passionate advocate for open-source collaboration and digital equity, he is committed to making the web a more inclusive and empowering space for everyone.

  • Debunking AI Myths: Junior Developers Are Here to Stay
  • Casual Catch-Up: Podcast Conversations and AI Avatars
  • LinkedIn: A Wasteland or a Goldmine?
  • Creative Networking: From Fortune Tellers to Meaningful Connections
  • Figma Innovations: Draw and Sites
  • The Future of Coding: AI Tools and Junior Developers
  • Flying Cars and Spam Texts
  • Dealing with Spam Texts
  • Exploring Higher Education Design Systems
  • The Onion's Creative Agency
  • The Importance of Tactile Controls in Cars
  • Wrapping Up and Future Plans
Guests

Ivan Stegic - TEN7 ivanstegic
Randy Oest - amazingrando.com amazingrando
