Like professionals in every industry, UX writers and content designers were taken by surprise when generative AI tools were made widely public in 2022. What’s even more surprising is just how useful the technology has been for writers in design teams.
As it turns out, AI tools have been a fantastic addition to day-to-day writing work. In March 2024, we conducted a survey of more than 150 working content designers, and found that 82% use Large Language Models in their day-to-day work.
Over 50% of UX writers and content designers find LLMs either “very” or “somewhat” useful.
Although we’ve been writing about generative AI for years, no one could have predicted just how quickly Large Language Models would become a daily part of work. At first, it seemed as though AI might have a shrinking effect on the market, but that hasn’t been the case.
But there’s a reason why content designers and UX writers find AI useful: they already have a substantial amount of knowledge and skill to make AI work for them. Without that knowledge, LLMs are just a toy. It’s like handing someone a circular saw if they have no carpentry experience. What would be the point?
So when it comes to the question “Can AI replace UX writing?”, the answer is more complex than a simple “yes” or “no.” It really depends on the skill of the individual.
And we’re quickly discovering that there is a gap between UX writers and content designers who know how to embrace AI, and those who are being left behind.
It’s important to remember that any new technology brings a certain amount of hype that isn’t fully supported by the technology itself. Our imaginations run wild and we picture all types of drastic scenarios.
Immediately after OpenAI released ChatGPT in 2022, it was widely speculated that jobs would soon disappear, replaced by digital assistants. The truth is much more nuanced. Although the tech market did suffer a decline driven by earlier over-hiring, many jobs have returned, and unemployment in the United States remains low as of April 2024. Overall, AI hasn’t caused a surge in job losses.
But that doesn’t mean AI hasn’t had an impact. As we discovered in our survey of more than 150 UX writers and content designers, AI has had a substantial effect on people who write user interface text. Many of those people now rely on AI as a day-to-day design tool, much like Figma, Miro, or Notion.
That efficiency may lead to a consolidation of teams, but it hasn’t so far. In fact, the 2024 jobs market for content designers and UX writers remains strong.
But there’s a reason why these content designers and UX writers find AI useful. They already have the skills and knowledge to do the job well—and therefore know what to ask an AI assistant as part of a work process. That knowledge is half the reason why AI is useful in the first place.
Let’s talk about why some content designers and UX writers find AI useful for design, and why some don’t. But in order to do that, we also need to understand how language models work.
Unfortunately, the discussion surrounding Large Language Models (LLMs) and content has been fairly shallow. It’s easy to think AI can do it all for us, and that creating content is as easy as crafting the right prompt. But that assumption is flawed, especially as it relates to content design.
Why? Because any type of content—whether it’s code, prose, or an image—relies on human eyes to evaluate it and place it within a context that makes sense. It’s how content designers tell the difference between a message that delights a user and one that offends them.
Thus, at a time when AI relies on user-created prompts, skills like UX writing fundamentals are more important than ever.
Without any knowledge of UX writing or content design best practices, using AI to create content is like painting with your eyes closed. You have the tools, but you have absolutely no control, direction, or context.
That assumption ignores crucial questions: Which tasks should AI handle? Why those tasks and not others? What part does human intervention play in any editing process?
These are important questions any design leader or professional needs to grapple with. If your hope or intention is to use AI to create text for user interfaces, then simply jumping into a prompt window isn’t going to help you. The act of creating UI text is only part of the full content design process. And it is, indeed, a process with several steps that require full attention.
If you stroll into an AI prompt window with absolutely no knowledge of UX writing or content design best practices, you are relying on the AI itself to guide you and tell you what to do.
There are three critical aspects of any interaction with AI that define whether the output will be useful…or useless.
When you talk to an AI, whether it’s ChatGPT, Gemini, Claude, or a custom language model, you’re giving it a “prompt.” AI will take that prompt and analyze the various words in it to understand your meaning. Part of the way it will do that is by analyzing the relationships between the words you provide.
The more detail you provide in your prompt (and the better structured it is) the more likely you’ll receive a useful response. (We tested this out in a podcast episode.)
That means you need to spend time experimenting with these assistants to determine how to structure your prompts, what to include, and how long they should be. The answers change based on the tool you’re using and from one model release to the next. What works today may not work next week, which requires constant fine-tuning.
UX writers and content designers should treat AI models like Figma: ongoing tools that require practice and familiarity before you become proficient. It isn’t enough to treat them like chatting with a friend.
An underrated element of using AI to create design experiences is context. AI isn’t all-knowing. It isn’t able to write strings or strategies for you based on nothing. It doesn’t just require a well-written prompt, it also requires context on your specific design problems—which it won’t be able to gather on its own.
Very few organizations are able to rely on user personas that apply to a broad population set. Even in very generic examples—like, say, a supermarket—there will be user segments with very specific needs. AI assistants don’t have any knowledge of those personas until it’s been provided to them. (This, by the way, is behind the rise of assistants powered by a company’s proprietary knowledge and user documentation, rather than generic AI models.)
This means any prompt you provide AI needs to include critical information about your design phase, your users, and any relevant research.
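Those pieces of context can be sketched as a simple prompt template. The function below is a minimal illustration, not a prescribed format: the field names and example values are all assumptions, and a real team would adapt them to its own research and voice guidelines.

```python
def build_prompt(task, design_phase, audience, research_notes, constraints):
    """Assemble a structured prompt that carries design context alongside the task.

    All field names here are illustrative, not a standard format.
    """
    sections = [
        f"Task: {task}",
        f"Design phase: {design_phase}",
        f"Audience: {audience}",
        "Relevant research:\n" + "\n".join(f"- {note}" for note in research_notes),
        "Constraints:\n" + "\n".join(f"- {c}" for c in constraints),
    ]
    return "\n\n".join(sections)

# Hypothetical example for a supermarket app, echoing the scenario above.
prompt = build_prompt(
    task="Write an error message for a failed grocery order payment",
    design_phase="Design (prototyping)",
    audience="Busy shoppers ordering weekly groceries on mobile",
    research_notes=["Users abandon checkout when errors feel like their fault"],
    constraints=["Under 90 characters", "Offer a next step", "No technical jargon"],
)
```

The point is less the code than the discipline: every request to the model carries the phase, the users, and the research, rather than the task alone.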
This brings us to our third point…
Editing is an underrated skill. Any AI is going to provide you with an immense amount of output: you may get 20 or 30 pieces of content to choose from for any particular design. (Only asking for one is a fool’s errand, which we’ll touch on shortly.) With that much content to choose from, how are you going to decide which to use?
The answer is simple: you need the skills and grounding in best practices to understand which suggestions from AI to adopt, and which ones to discard.
Consider a simple request, like creating an error message. We could simply ask an AI to create an error message for our particular context, but that’s only part of the problem. The next is determining whether the string matches best practices. For instance:
If a message created by AI ignores these points, then it doesn’t matter that it was created faster. It ignores best practices, and is effectively useless. The user experience is still poor.
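As a sketch of that editing pass, here is a tiny screen that flags candidate error messages against a few heuristics. The rules are hypothetical stand-ins for a team’s real guidelines (say what to do next, don’t blame the user, keep it short), chosen purely for illustration:

```python
# Illustrative list of blame-y words a team might disallow.
BLAME_WORDS = {"invalid", "illegal", "forbidden", "you failed"}

def screen_message(msg, max_len=90):
    """Return a list of heuristic violations for one candidate message.

    The checks are illustrative stand-ins for real content guidelines.
    """
    issues = []
    if len(msg) > max_len:
        issues.append("too long")
    if not any(word in msg.lower() for word in ("try", "please", "check", "retry")):
        issues.append("no next step")
    if any(word in msg.lower() for word in BLAME_WORDS):
        issues.append("blames the user")
    return issues

candidates = [
    "Invalid input. Error 402.",
    "We couldn't process your payment. Please check your card details and try again.",
]
# Keep only candidates that pass every check.
keep = [m for m in candidates if not screen_message(m)]
```

A script like this narrows 30 AI suggestions down to a shortlist; the judgment call of which shortlisted message fits the product still belongs to the writer.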
This is why, when it comes to artificial intelligence and design, learning proper skills and best practices should always come before using tools.
Part of the problem with using AI as a tool to generate large amounts of text is that one can easily reduce the UX writing process to an output. Stating that UX writing is equal to an output is like attempting to judge a software engineer’s value by the number of lines of code they write. The output is not an accurate representation of the breadth of work that has gone into the project.
The truth is that by the time a UX writer or content designer gets to the point of writing strings, most of the work has already been done. Or, to put it more accurately, the work is an ongoing force that will require constant revision and attention. Strings and other types of UX content are not created in a vacuum, then left to serve a role with no review. Iteration is a process that any content designer will undertake.
As our survey shows, even very experienced content designers are able to use LLMs in their day-to-day work.
What’s important to remember, though, is that AI shouldn’t change the ways design is executed. It’s a tool, not a process. So how can content designers and UX writers implement it properly?
We like to follow the “Double Diamond” method of design, which features four distinct phases: Discover, Define, Design, and Deliver.
Step 1: Discover
During this phase, the team is researching as much as they can about a specific topic or problem. This may be a broad question, or it might be something specific to a particular user base or segment.
Some of the actions UX writers and content designers might take during this phase include researching keywords, speaking with users via surveys or interviews, analyzing competitors, and doing general research on a particular topic area.
Note how many of these actions don’t actually create any content yet. They’re in service of discovering what the content should be. Much of the “gold” to be found won’t lie in generic descriptions of types of users, but in the specifics of the user bases themselves.
For instance, a team working on a supermarket app doesn’t want to know about people who buy groceries in general. They might want to know about people who buy groceries who are very busy, or about people in a specific economic category.
That information can’t be found using AI alone, for several reasons, not the least of which is that AI doesn’t have a training set specific to your customers. It has general information. It can also hallucinate, and because it offers only general insights, any conclusions are hard to fact-check.
Instead, UX writers and content designers using AI during the discovery phase are more productive if they create tools, templates, or other material.
For instance, teams can use AI to:
Step 2: Define
During this phase, the design team begins to home in on the problem statement. At this point, the UX writer or content designer may engage in some early brainstorming and tests for content direction. It’s crucial that at this stage, the team has some type of insight to ground their direction.
But UX writers and content designers can’t just rely on brainstorms from generative AI tools. They need real feedback from users to understand what the direction should be.
Research like 5-second tests can be crucial here. Participants are given variants of text to examine for 5 seconds and are then asked a series of questions. Here, AI can be useful in creating variants for these types of tests (though keep in mind, as we explained earlier, that the AI will need as much context as possible to create usable variants).
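As one small sketch of the mechanics, variants can be rotated across participants so each person sees exactly one (a between-subjects setup). The function and names below are illustrative, not part of any particular testing tool:

```python
def assign_variants(participants, variants):
    """Round-robin each participant to one text variant for a 5-second test."""
    return {p: variants[i % len(variants)] for i, p in enumerate(participants)}

# Hypothetical participants and two AI-drafted confirmation messages.
assignments = assign_variants(
    ["p1", "p2", "p3", "p4"],
    ["Your order is on its way", "We've received your order"],
)
```

Rotating variants evenly keeps the comparison fair; each message is seen by the same number of fresh eyes.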
Step 3: Design
During this stage, teams take the problem statement and start developing prototypes to solve it. This is where much of the traditional building and testing takes place, along with the bulk of the writing for a user interface. Keep in mind, though, that the first two phases have already laid the groundwork for direction.
This is where artificial intelligence can play a large part in helping a content designer scale their efforts. Tools like ChatGPT or Gemini can create multiple variants, allowing teams to pick and choose what works best.
The picking and choosing can only work if content designers and UX writers create fully formed prompts with context. That context needs to include:
Any content designer or UX writer examining AI-created content during this phase needs to ensure it remains consistent with other parts of the product. Inserting the appropriate terminology and auditing content for consistency can take a significant amount of time, depending on the product.
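Part of that auditing can be scripted. A minimal sketch, assuming a team keeps a small glossary mapping discouraged words to approved product terms (the glossary entries here are invented for illustration):

```python
# Hypothetical glossary: discouraged term -> approved product term.
GLOSSARY = {"basket": "cart", "purchase": "order", "sign in": "log in"}

def audit_string(text):
    """Return (flagged_terms, suggested_rewrite) for one UI string."""
    flagged = []
    rewritten = text
    for bad, good in GLOSSARY.items():
        if bad in rewritten.lower():
            flagged.append(bad)
            # Simple lowercase-only replacement; a real audit would handle casing.
            rewritten = rewritten.replace(bad, good)
    return flagged, rewritten

flags, fixed = audit_string("Review your basket before you purchase")
```

A pass like this catches the mechanical inconsistencies in AI output quickly; the writer then reviews each suggested rewrite in context rather than hunting for stray terms by hand.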
So far, we’ve focused solely on how to use AI within the context of writing individual strings—the stuff that most content designers or UX writers will do day-to-day. But as design professionals rise in seniority, more of their work is focused on strategic goals.
How does AI play a part in such work?
This is an existential question. The very nature of artificial intelligence means the assumptions about how content strategy could or should operate are now in flux. This doesn’t necessarily mean that principles about UX content within a product change, but it does mean how we go about achieving those goals may very well change.
Let’s start here. This is a model created by Jesse James Garrett in his book “The Elements of User Experience” to describe how a product is built. As you create foundational structures (strategy), you move higher up the scaffold to add more, creating further clarity. Content plays a significant role here. From a structural perspective, the existence of AI changes little, if anything.
Each product still needs a foundational strategy, clearly articulated content requirements, and an information architecture, navigation, and surface design that make sense to users. (There is something to be said for the concept that AI can create individual interfaces, but that’s another topic!)
“The Elements of User Experience,” Jesse James Garrett
However, the closer we get to actually delivering content, the more we can start to see how AI can affect content strategy.
To look further into the concept of content strategy, let’s take a look at Kristina Halvorson’s definition: “We still define content strategy as guiding the creation, delivery, and governance of useful, usable content.”