Introducing the Content-Action Model for Web Systems

  Filed in: CAM for Web Systems, introducing

Content, IA and UX professionals are spoiled for choice when it comes to tooling. But we need to be clear about which ones are right for the job — and that means distilling our intentions.

Thanks to Ian D. Keating for the image of drill bits [CC licence]

Wiring up ‘ugly truth’ UX diagnostics for better web strategy & IA.

Gleaning truth about what should be done — and undone — on our websites is the central quest of website strategy. There are many roads to the destination, and an arsenal of tools to make the journey smoother:

  • surveys
  • anecdotal feedback
  • user journeys
  • metrics (e.g. Google Analytics)
  • you get the idea …

Diagnostic tools are essential for shaping our implementations and optimising them. But just as you don’t start a quest by buying shoes, you need to take a step back before jumping into your Big Website Fix. Strategy — motivation, destination and intent — is a much better place to begin a journey of discovery.

Building a better journey

Strategy (organisational direction, in-person interviews, big ideas™) ➡️ Diagnostic tools (surveys, anecdotal feedback, metrics) ➡️ Evaluation

This journey towards better IA, and in turn a better user experience, must be built on a clear strategy; otherwise we risk losing our way. To enrich the wider strategy, and to make the journey go well, we need to know when to use which diagnostic tools.

Let’s go.

I’m the Web Design Architect for the European Bioinformatics Institute (EMBL-EBI). This is a not-for-profit institute that serves data to all of the world’s life-science research communities, so it has a lot of different goals.

Building consensus on the direction and evolution of the core website is not a one-off thing. It is a long-term, complex relationship amongst people with different priorities and perspectives, and demands communication tools that are fit for purpose.

The main tool I’ve been working on to achieve this is our “Content-Action Model for Web Systems”. It helps decision-making at all levels, from layout adjustments to microsite building. (In the Web Development team we call it the CAM for Web Systems, or just CAM, because shortening things is our culture, for better or worse.)

Origin story?

This method grew out of using Core Content Model methods — those were great, but not quite what we needed. For more on how the CAM method does (and doesn’t) relate to them, check out the aside post: “Beyond the Core Content Model”.

How we use CAM for Web Systems

We use CAM for Web Systems to clearly model (sketched in code after this list):

  1. Needs: The organisation’s and our target users’
  2. Content: What it is, and what it isn’t
  3. Users: Who is visiting us now, and who we would like to join the party
  4. Connections, paths, actions: Wherever they are needed
  5. Meta stuff: Various contextual information about a model that helps us understand the affordances in motion
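
To make these five components concrete, here is a minimal sketch of how a record might be typed. The names are illustrative only, not our production schema:

    // A minimal, illustrative sketch of the five components a CAM record models.
    // Field names are hypothetical, not EMBL-EBI's production schema.
    interface CamModel {
      // 1. Needs: the organisation's and our target users'
      needs: { organisational: string[]; user: string[] };
      // 2. Content: what it is, and what it isn't
      content: { includes: string[]; excludes: string[] };
      // 3. Users: who visits now, and who we would like to join the party
      users: { current: string[]; desired: string[] };
      // 4. Connections, paths, actions: wherever they are needed
      connections: { from: string; to: string; action: string }[];
      // 5. Meta: contextual information about the model itself
      meta: Record<string, string>;
    }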

We isolate each component as an individual item, then tie and untie them in the same conversation. At EMBL-EBI we use the CAM for Web Systems to compare peer concerns. This goes a long way to helping people understand one another, and the sometimes conflicting frustrations/desires/priorities of user communities. In other words, this is a discussion tool to reach agreement.

We then record our shared understanding for future discussions and development work. Distilling our intentions in the CAM for Web Systems allows us to build our analytics appropriately, capturing metrics that are truly useful for understanding user journeys. Even documenting disagreements shows our reasoning clearly.
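
As a rough illustration of what “building our analytics appropriately” can mean: once a goal is written down in a CAM record, it can be tied to a named analytics event rather than a raw page view. The event names and shape below are hypothetical, not our actual tracking plan:

    // Hypothetical mapping from CAM goals to analytics events, so what we measure
    // is tied to the intent we agreed on, not just to page views.
    interface GoalEvent {
      goal: string;        // the goal as written in the CAM record
      eventName: string;   // the event logged when the goal is met
      userTypes: string[]; // the prioritised user types the goal serves
    }

    const exampleGoalEvents: GoalEvent[] = [
      { goal: "Understand what the institute does", eventName: "about_mission_read", userTypes: ["prospective collaborator"] },
      { goal: "Find out who funds the work", eventName: "about_funding_viewed", userTypes: ["funder", "journalist"] },
    ];

    // A goal with no event, or an event with no goal, is a gap worth a conversation.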

Once we isolate ‘purpose’ (this is harder than it looks), we can test, evaluate, and iterate.

Figure 1. The flow of the CAM for Web Systems. Yes, it is a riff on the Double Diamond.

What is in the CAM Record, and why

CAM for Web Systems contents

  1. Organisational documentation: a firm organisational record of the agreed purpose of a portion of a website (such as an /about section)
  2. Itemisation of components and actions: the specific components for each section, including:
       • Overall purpose: informed by organisational priorities and UX research
       • URLs
       • Goals (user and organisation)
       • User types (prioritised)
       • Content (and who will make it)
       • Preceding and following actions (which pages and actions)
       • Organisational placement (which parts of the organisation are surfaced)
       • Organisational purpose (how the content maps to overall mission purposes in tone or external focus)
       • Content owner (who is “responsible” for the content overall)
  3. Relations between components: which user types are seeking this goal, and which content supports it?
  4. Relations across CAM Records: how this content relates to the content, goals or users in other parts of the website (e.g. the jobs page). This exposes areas that may need a new CAM for Web Systems record.
  5. Record of change and knowledge transfer: to avoid repeating previous missteps, and to enable knowledge handover. The importance of capturing this cannot be overstated.
  6. Truth through perspectives: by asking the same question from many different perspectives, we expose gaps and missed opportunities (e.g. website content, external content) or prioritise areas for improvement (e.g. page components that are surplus to requirements).
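
To make the contents listed above tangible, here is a hypothetical record for an /about section, sketched in the same spirit. The field names, owners and dates are illustrative only:

    // A hypothetical CAM record for an /about section. Values are illustrative;
    // the real records live in our internal documentation, not in code.
    interface CamRecord {
      section: string;
      purpose: string;
      urls: string[];
      goals: { user: string[]; organisation: string[] };
      userTypes: string[];                         // in priority order
      content: { item: string; maker: string }[];
      precedingActions: string[];                  // pages/actions that typically lead here
      followingActions: string[];                  // pages/actions we want to lead to next
      organisationalPlacement: string;             // which part of the organisation is surfaced
      contentOwner: string;
      relatedRecords: string[];                    // links to other CAM Records (e.g. /jobs)
      changeLog: { date: string; note: string }[];
    }

    const aboutRecord: CamRecord = {
      section: "/about",
      purpose: "Explain who we are and why the institute exists",
      urls: ["/about", "/about/people", "/about/funding"],
      goals: {
        user: ["Understand what the institute does", "Find out who funds the work"],
        organisation: ["Build trust with funders and collaborators"],
      },
      userTypes: ["prospective collaborator", "funder", "journalist"],
      content: [
        { item: "Mission statement", maker: "Communications team" },
        { item: "Funding overview", maker: "External relations" },
      ],
      precedingActions: ["homepage", "search engine result"],
      followingActions: ["/about/people", "/jobs"],
      organisationalPlacement: "External relations",
      contentOwner: "Head of communications",
      relatedRecords: ["/jobs", "/contact"],
      changeLog: [{ date: "2019-01-15", note: "Initial record agreed in workshop" }],
    };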

Comparing CAM records

By collecting information consistently, we give ourselves — and our successors — clarity on how things have taken shape, what problems have been solved and what is in the pipeline (where the bodies are buried, essentially).

Figure 2. Logic model: Comparing different CAM records.
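
As a hedged sketch of what comparing records can look like in practice, the helper below reports the user types two records do and do not share. The record shape and data are illustrative, following the sketches above:

    // Sketch: compare the user types of two CAM Records to surface overlaps and gaps.
    interface ComparableRecord {
      section: string;
      userTypes: string[];
    }

    function compareUserTypes(a: ComparableRecord, b: ComparableRecord) {
      return {
        shared: a.userTypes.filter((u) => b.userTypes.includes(u)),
        onlyInA: a.userTypes.filter((u) => !b.userTypes.includes(u)),
        onlyInB: b.userTypes.filter((u) => !a.userTypes.includes(u)),
      };
    }

    // Example: does the jobs page serve user types the /about section ignores?
    const about = { section: "/about", userTypes: ["funder", "journalist"] };
    const jobs = { section: "/jobs", userTypes: ["job seeker", "journalist"] };
    console.log(compareUserTypes(about, jobs));
    // => { shared: ["journalist"], onlyInA: ["funder"], onlyInB: ["job seeker"] }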

Example: Components of a ‘Contact us’ system

Creating a network of CAM Records can pay dividends, as it reveals fragmented goals, content and user types. A common diagnostic evaluation:

Figure 3. Output of a CAM Record’s content, goals and users for a “Contact us” system.

In this simplified “Contact us” page for a shop, you can see clearly that the Twitter feed content needs looking at.

This CAM Record shows one of two things:

  • the Twitter feed is superfluous to perceived requirements, or
  • we’ve missed a goal or user type

So we can prioritise looking at the Twitter feed to see if people are using it. Then we can either strike it, or dig into which of our customers are using it, and why.

Figure 4. A content fragment often allows us to clearly identify a missed use case.
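
As a rough sketch of that diagnostic step: if each content item in the record lists the goals it supports, content that supports no recorded goal (the Twitter feed here) falls out mechanically, and the follow-up question becomes whether to remove it or to record the goal it actually serves. The names and data are illustrative:

    // Sketch: flag content items in a "Contact us" record that support no recorded goal.
    interface ContentItem {
      name: string;
      supportsGoals: string[]; // goals this content is believed to serve
    }

    const contactUsContent: ContentItem[] = [
      { name: "Postal address and map", supportsGoals: ["Find the shop"] },
      { name: "Contact form", supportsGoals: ["Report a problem with an order"] },
      { name: "Twitter feed", supportsGoals: [] }, // superfluous, or a missed goal?
    ];

    const orphans = contactUsContent.filter((c) => c.supportsGoals.length === 0);
    console.log(orphans.map((c) => c.name)); // => ["Twitter feed"]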

Where and when to use

By modelling our systems consistently, we routinely surface and resolve logic issues.

The CAM method allows us to develop hypotheses more clearly and to expose important knowledge gaps. It helps us construct a shared narrative with legitimacy within the organisation, so we can cooperate as a group and avoid internal nitpicking.

In the ‘Contact us’ example, looking closely at how people used the Twitter feed (which seemed superfluous) allowed us to add a new, validated goal: “Get info on sales and last-minute updates”. Very handy. But is that a genuine need of our two user types?

Our framework allows us to see and prioritise areas we need to investigate. It helps us articulate our questions, and select the right tools to investigate and test our hypotheses.

Importantly, we can share our findings easily with one another, and feed them back into iterations of the model. By doing so, we are continually improving the model and making it easier to make decisions based on evidence.

What’s next for the CAM for Web Systems?

This is the first of what I hope will be many posts exploring this new approach.

Depending on feedback on this post, and developments in our many digital communications fora, I’ll be writing about:

  • CAM in Action: A how-to on getting started with this methodology;
  • Visualising insights: A dashboard to give knowledgeable stakeholders a ‘forest from the trees’ view of our strategy, interconnected content and user activity;
  • Goal-driven metrics: Looking beyond page views, and using our CAM for Web Systems to capture metrics of success and failure that relate directly to our objectives;
  • Programmatic solutions: Going beyond paper, manipulating our data in lightweight databases like Coda and surfacing it in context through RDFa in our website HTML markup;
  • Specialising: How we take the general model and adapt it to bespoke use cases, such as EMBL-EBI’s free-to-use scientific data services;
  • UX + the CAM: What does it mean to ‘do UX’ with our Content-Action Model? Beyond the buzz. (Published: read it here.)

Figure 5. The goal: clearly see and show connections to power dashboards and gather new insights.

Kudos specific to this post

For the growth of this CAM method, I owe thanks to many people for inspiration, support and indulgence. But for the writing of this article, I'd like to pass on some specific thanks:

  • Mary Todd-Bergman (@themarytodd): For editing this, sure, but also for ongoing support, testing and sounding out of this approach.
  • Ivan Labra (@ilabra): For bouncing around ideas about what I’m really getting at here, and for getting my language out of the organisational bubble.

If you're also interested in chatting about this evolving model or improving it: 🐦 @khawkins98