This page contains a list of pitfalls to avoid in future workshops. We document the pitfalls in the context of the guidelines for future workshops and the TACTICs for effective workshops. Yet, the concepts in these pitfalls are interdependent. We encourage you to identify relationships and connections between them.


Here are pitfalls that we have encountered in the context of the framework’s guidelines.

Recruit diverse and creative participants.

A lack of diversity. — Not having a sufficiently diverse group of participants will limit outputs considerably. Ensure that you have recruited participants who represent the varying perspectives of stakeholders and have a deep understanding of domain challenges. Beware of recruiting fellow tool builders — they may not understand the real data analysis challenges.

What do you do? — Not knowing the roles of collaborators in the meeting can be problematic. In one of our workshops, a collaborator did not engage at all in some of the methods. This bothered us — why was she not participating? Was she just not engaged? Were we doing something wrong? In the middle of the workshop, we learned that the collaborator was a lab manager — not a data analyst. Because she did not work with data analysis, it was entirely reasonable for her not to contribute to discussions about analysis software. Regardless, this participant raised interesting ideas about how data is a product, which contributed to the direction of our collaboration. If we had known her specialized role, we could have adapted the workshop methods more appropriately to foster her interest.

Design within constraints.

What does that mean? — Insufficient knowledge of domain vocabulary can limit workshop effectiveness. Although we have become confident in our ability to design and execute effective workshops, it is easy to forget that effective workshops require facilitators to select methods that elicit interesting opportunities and to lead discussions about the domain topic. Both of these require a command of the domain vocabulary. In some cases, we have relied on collaborators to help us learn vocabulary. However, some of our workshops could have been more effective — e.g., they may have elicited more interesting ideas, established stronger rapport with collaborators, and ensured shared context — had we been better prepared with the domain vocabulary. There is scope for using mutual learning as a means to establish rapport, but there is no replacement for effective preparation.

Test the methods and materials.

Wrong affordances. — The workshop materials, and even its location, may hinder workshop effectiveness. We have fallen into this pitfall in a number of ways: 1) in one workshop we bought wrong-sized sticky notes, which allowed participants to write more than one idea on a note and made it impossible to reorder or cluster ideas; 2) in another workshop, we used a method that required sketching ideas, but we provided pens with too small of a tip, which encouraged participants to draw details when we intended for them to draw highly abstracted ideas; 3) in a workshop held in a small conference room, small groups had to leave the conference room so that they had room to work — but this made it hard to reconvene the workshop participants for discussions. In all three cases, testing the methods and materials more thoroughly would have avoided these pitfalls.

Distracting visualization. — The dynamic, colourful, morphing graphics used in presentations can distract participants from the task at hand. Pithy written prompts can summarise what is expected and required of participants during each method. During some methods, we have failed to describe the methods appropriately or noted that participants have forgotten the prompt. This has had discouraging results. In visualization analogies, one response included: “this doesn’t work with our data”, while we were hoping to get our colleagues to comment on what traits they liked or disliked about the visualizations. In subsequent workshops, providing written prompts seemed to keep participants on track more appropriately. Testing the workshop methods can help to discover errors and refine the ideas contained in workshop prompts.

Prepare to execute.

Forgetting the team. — As a group of researchers, designers, developers, visualization experts, and perhaps professional facilitators — you’re a team. Not preparing as such can be a serious pitfall: if colleagues involved in methods and facilitation are not fully aware of what is expected during each method, the outputs may not be as tailored or structured as expected, hindering their analysis and utility.

You’re an expert? — All of the workshop facilitators should be prepared to talk clearly and with confidence throughout the methods. We strongly encourage all facilitators to rehearse what they are going to talk about. This is particularly important in workshop methods that are intended to build trust. For example, being unprepared to talk about the examples in visualization analogies may cause participants to question your ability to design useful visualizations for their problem.

Set the Stage — engage.

We didn’t explain the rules of the game. — Defining creativity guidelines in the workshop opening but not explaining why these are important can be problematic. For example, asking participants to think about ‘big picture’ ideas can be challenging, but important. In our experience, talking about specific technical problems can be tempting, but trying to push thinking on longer-term opportunities and goals is one of the main workshop benefits. Thus, we should clearly state, and restate, the guidelines for creative thinking throughout the workshop.

Colleagues lose focus. — In at least one workshop, we failed to set the expectations of co-facilitators adequately. During the first method of the day, all collaborators and co-facilitators were engaged and interested. As the day went on, co-facilitators who were not actively participating took out laptops and phones and started working on other activities. This was distracting. A number of participants also took out their laptops and phones. The expectation of no distracting laptops and phones should have been explicitly set and clearly communicated to participants and facilitators alike.

Create physical and visual artifacts.

Generating artifacts with no longevity. — The artifacts created during the workshop express the needs, concerns, opinions, and aspirations of the workshop stakeholders. Merely generating the artifacts, however, is not sufficient for effective workshops. The artifacts should be organized in a way that allows the workshop outputs to be analyzed. We recommend that researchers consider the longevity of the artifacts being produced. This can include asking participants to write legibly, with the expectation that someone else will eventually read their ideas. More importantly, planning for longevity means selecting methods that organize the artifacts into a format that is easy to transport and revisit. For example, asking participants to cluster sticky notes onto large pieces of paper can help preserve concepts that emerge from their physical ordering.

Encourage reflection for validation.

Forgetting the theme. — Use the workshop theme as the basis for discussion, learning, and continuing creative collaboration. Questions to consider for the workshop closing include: How well have you addressed the theme? Have the identified opportunities expanded, contracted, or shifted it? Was it the right theme? Have other themes emerged? Failing to discuss the theme can fail to provide closure, and participants may leave the workshop wondering if their objectives have been achieved.

Promote continued collaboration.

Looks like we’re done here — I’m off! — Ending too abruptly, without clarity over next steps, is potentially disastrous. Everybody wants to get home at the end of the day, but effective workshops require agreement over next steps. So, rather than leaving participants confused, plan to finish early. But schedule the wrap-up and make it clear that this is one of the most important sessions of the day. This will make it easier to analyze workshop output and promote continued collaboration.

Analyse with an open mind.

Don’t count on it. — Counting the frequency of ideas contained in the workshop artifacts is tempting, but it should be approached with caution because the frequency of an idea says little about its potential importance or impact. Consider qualitative analysis methods instead. Better yet, intertwine analysis into the workshop by asking participants to rank or evaluate ideas based on certain factors, e.g., potential impact, novelty, or development costs.

Revisit, reflect, and report on the workshop.

Methods don’t matter. — When writing about the results of research projects, omitting details about the workshop methods, participants, and results means that we throw away useful knowledge about visualization design in practice. We strongly encourage researchers to report and reflect on whether the workshop helped to achieve their goals or how it could be improved. This valuable information can help the visualization community better understand how and why to use workshops.


The six TACTICs for effective workshops pervade the CVO workshop framework. Here, we describe pitfalls related to each of the TACTICs.


A sequence of irrelevant examples. — In methods that display visualization examples (e.g., visualization analogies), the order of visualization examples is important, and failing to make successive examples relevant to collaborators may reduce interest in the workshop. Although there is a wealth of visualizations that facilitators can show participants, we encourage mixing obscure visualizations with those that are more relevant to the workshop topic. This can maintain interest and potentially reduce the challenge of discovering analogies.

Getting stuck in the details. — Details about ideas should be approached with caution during the workshop. For example, discussions of the technology, design, data, or implementation can distract participants from the workshop’s theme. Becoming too focused on the details of a visualization design too early in a workshop can close down thinking too soon. It is important to focus first on what possibilities any new visualization design opens up and might achieve. Similarly, we also recommend that conversations focus on the problem space, as opposed to the solution space.

Me! Me! Me! Me! Me! — Visualization examples should not be biased towards your own work too heavily. Show a variety of examples. Also, avoid examples that are too complex or too long. Instead, use short descriptions that allow participants to see structure in the data and then think about the patterns and possibilities.

The pie-in-the-sky chart. — Remember: methods should show visualizations with a clear purpose. Providing inappropriate or particularly challenging examples of visualizations may lead participants down unhelpful paths and result in less useful workshop results. This requires judgement to balance visualizations appropriately. Furthermore, it requires us to move on from examples that are not generating ideas as expected.


Sticking to the plan. — The plan provides a general structure for the workshop. But following the plan too strictly can hinder agency and discourage active participation. Effective workshops encourage creativity, and this can result in participants breaking boundaries in unexpected ways. So, allow participants to deviate from the plan if it will result in useful ideas. If participants ask to change something, demonstrate an openness to listen and a willingness to change.


Forcing cross-group communication at all times. — At breaks or lunch, groups of colleagues who know one another may want to discuss what has been happening. It’s important to allow this, so that ideas can be exchanged and allowed to develop. A well structured workshop will have given most participants time in groups with each other and time to focus on individual activity, so time with colleagues during the breaks is likely to be important and effective.

Allowing small groups to persist through the day. — Mix things up to keep it interesting and different. If participants regularly work together, ask them to separate — remind them of the advantages of getting a diverse set of perspectives to inform their ideas. Doing this brings different perspectives to the discussions. You can even ask participants to pair up with the person they have spoken to least often. This will keep people interested and maintain the fresh perspectives that occur through new interactions.


Needing to know everything. — Recognize the respective expertise of workshop facilitators and participants. Facilitators are often visualization researchers with expertise in visualization and computation. Participants are likely experts in their domain. Acknowledging the limitations of your expertise will open up possibilities for communication. A well structured workshop and the focused expertise that you have in visualization design will give participants confidence in your ability to contribute positively to the project. Pretending that you know about the domain, or being creative rather than transparent in workshop discussions, may erode trust in the workshop and the collaboration.


Expecting everybody to be interested in everything. — Because it is hard to predict how participants will react to certain methods, maintaining appropriate levels of agency, challenge, topic, and interest is critically important. Yet, it is a mistake to expect everybody to be interested in everything all of the time. Encourage participants to do their best in methods, and if they are not interested or question the relevance of an activity, then listen to their view. They may be able to define methods, examples, or lines of enquiry that are more interesting. If a workshop method flops, asking participants “how could we make this more interesting or relevant?” may help to appropriately focus the workshop on relevant ideas.


I can’t do that! — Challenging methods may hinder participation and reduce workshop effectiveness. To increase the likelihood of success, select methods that vary the levels of challenge throughout the workshop. Although we may not always predict what participants will find challenging, in general, methods that involve visualization analogy or drawing ideas may be difficult for some participants, even though sketching might be natural for visualization researchers. Forcing participants to engage in methods they find too challenging is unlikely to be constructive. If they would rather spend time sitting out, or participate in other ways, be flexible. Try to use workshop time effectively while maintaining a focus on the topic.

Scoping methods inadequately. — Failure to provide clear and definitive explanations of workshop methods — including example outputs, where applicable — may discourage participation. Because participants invest significant time and energy into the workshop, it is important to appropriately prepare and scope methods for a productive day. This often requires a few iterations to test the methods before the workshop. Preparing example outputs of workshop methods can also help to scope the methods through examples.