Canonical’s Platform Engineering team has been hard at work crafting documentation in Rockcraft and Charmcraft around native support for web app frameworks like Flask and Django. It’s all part of Canonical’s aim to write high quality documentation and continuously improve it over time through design and development processes.
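For readers unfamiliar with this native support: the tutorials center on Rockcraft's framework extensions, which generate most of the container plumbing for you. As a rough sketch (the project name, version, and descriptions here are illustrative placeholders, not taken from the tutorials), a minimal rockcraft.yaml for a Flask app looks something like:

```yaml
# Illustrative sketch of a rockcraft.yaml using the flask-framework
# extension; names and field values are placeholders, not from the tutorial.
name: example-flask-app
base: ubuntu@22.04            # the Ubuntu base the rock is built on
version: "0.1"
summary: A minimal Flask app packed as a rock
description: |
  Example project demonstrating the flask-framework extension.
platforms:
  amd64:                      # target architecture; ARM64 is also supported
extensions:
  - flask-framework           # wires up the Flask app and its web server
```

Running `rockcraft pack` in a project directory with a file like this produces an OCI image (a "rock") containing the app, which is what the tutorials then deploy as a charm.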
We’ve focused on making this documentation user-friendly – but how do we ensure that our documentation truly benefits our readers?
Since last November, we’ve been testing tutorials for the various frameworks we support, conducting a total of 24 UX sessions (so far!). Participants spent their valuable time and energy working their way through our tutorials, allowing us to observe their attempts and collect their feedback on the instructions and explanations.
We created the web app framework support as an approachable introduction to Canonical products through a familiar entry point for most users: web app development. Our goal was to attract a wide variety of users, from seasoned engineers to newcomers. To do so, we collaborated with our internal teams, like Web, who use Canonical products every day, as well as reaching out to external developers through online communities and conferences. To make sure our documentation met real-world needs, we actively sought feedback from those who were unfamiliar with Canonical. We even tested the experience with university students, to confirm it would be accessible across all skill levels.
After recruiting each participant, we began the most important phase: the sessions themselves. We carefully crafted these sessions to provide a consistent, comfortable experience for the participant, encouraging their honest feedback about anything – and everything! – in the tutorial.
A typical session begins with a few quick questions to understand the participant’s background, so we can contextualize their experience. Then, we begin the tutorial. We observe what the participant notices, how they interpret the instructions, and what obstacles they run into. After they complete the tutorial, we ask a set of post-session questions to collect their overall feedback and to explore whether the tooling meets their expectations of the upstream framework.
I’ve felt the full spectrum of human emotions over the course of the 24 sessions. First, there’s a great deal of helplessness that comes from writing and publishing documentation – as soon as the documentation is out in the world, I’m powerless to help my readers! I found it surprisingly difficult to watch users run into problems that I couldn’t help them solve. Thankfully, the engineers were there to provide some aid, although even that wasn’t enough at some points. The sessions have been a learning opportunity for me to accept the helplessness that comes with the author role.
Along with helplessness, there were also plenty of moments where I felt panic. There’s an element of risk associated with documentation: Sometimes, I would argue for documentation changes, thinking that they would provide better UX or mitigate confusion, only for those changes to blow up in my face in real time. I’ve learned to keep a straight face, and I accept any criticism or feedback directed at the changes I pushed for. New ideas (at least in documentation) are definitely worth trying, but they only become quality ideas once proven through UX.
Most of the time, the sessions were silent, and I struggled to keep my attention on the participants and their actions. There are many points in the tutorials where the user has to wait – for software to download, for rocks and charms to pack, for their apps to deploy, and so on. It’s very tempting to look away in those moments and focus on other activities, but as I learned, important observations and details can emerge at any time and stage. Paying attention, even in the most innocuous moments, is a vital part of understanding the participant’s experience and their feedback.
The participants provided insightful feedback about both the tooling and the documentation. Here are some of the most common themes we noticed:
For each session, we compiled all of our observations into an individual document. We then collected all the direct feedback and suggestions into a main document; for the Flask tutorial, the main feedback document spans 16 pages. From there, the project lead, UX designer, technical author (myself!), and the engineers discuss the feedback to determine how we will incorporate it. While prioritizing feedback, we account for the following considerations:
We incorporate feedback in small batches over time, prioritizing major blockers and typos. This way, we can resolve issues more quickly, meaning our readers reap the benefits right away!
We’ve found that the changes proposed by earlier UX sessions have improved the quality and outcome of later sessions. Common pitfalls in the first couple of sessions are no longer an issue. Questions about how the tooling works come up less. And – some of you will be glad to hear – users with ARM64 machines can go through the entire tutorial.
There are always improvements to make in our documentation, and these UX sessions are a great way for us to include our community members and make our documentation more accessible. If you’re interested in getting involved, please reach out to us on our public Matrix channel!