Analyzing the results with your team
This step-by-step guide explains how to analyze the results with your team once sufficient team members, stakeholders, and supporters have participated in the survey. It focuses on the Team Report, the web-based report your team receives after people participate. The Demo Report is available here.
Who is this guide for?
- Teams who have (recently) run a survey and are ready to analyze the results together
⚠️ Teams should take the lead in analyzing results ⚠️
Columinity is designed for teams, and the Team Report data is theirs. If teams sense that people outside the team are misusing the data, they will start giving the answers those people want to hear, and Columinity will lose its usefulness. Using Columinity is a change initiative, and as with any change initiative, it is necessary to clarify its intent, how it helps people, and who will benefit from it. We recommend you do at least the following:
- Before starting with Columinity, create buy-in with teams and schedule a team workshop to analyze the results. A facilitation guide can be downloaded in the Team Report (in the box "Interpret the results together") and other places.
- Create work agreements about who - beyond the team - can access the Team Report. This can be enforced by setting a "Team Pincode" in the Team Report under "Settings" > "Team settings/privacy".
- Review how we protect anonymity and privacy if your teams have questions about this.
- Create a plan/structure for what happens after you've done the first survey. Will more surveys be run? Will other teams be involved? How will new teams be introduced?
1. Go to your Team Report
Open the Team Report for the team. The report is available in several ways:
- The person who started the survey receives the report by email, provided they entered a notification email address.
- Every participating team member receives a personalized Team Report upon completing the survey. If they provide an email address, it is also sent there. The Team Report without personal scores can be found by clicking "Invite" in the sidebar and then under "Share team report."
- If you are on a paid plan, you can always find your Team Reports in the Teams Dashboard.
The Demo Team Report we use in this guide is available here.
2. Review participation
First, it is helpful to understand how many people participated in the survey. The Team Report hides results by default unless at least three people participated. Beyond that, the more people participate, the more reliable the results will typically be. (A minimal sketch of this visibility rule follows the list below.)
- Click "Report".
- Locate the box that says "[XX] Participants" (where XX is a number).
- Ideally, all team members participated. We also recommend inviting stakeholders and supporters to get a more complete view.
- You can always invite additional participants with the invitation links. The Team Report will update automatically.
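For readers who like to see rules spelled out, here is a minimal sketch of the visibility rule mentioned above. The function name and shape are our own illustration, not Columinity's API; only the three-participant threshold comes from the product.

```python
# Minimal illustration of the visibility rule described above.
# The function and constant are our own naming, not Columinity's API.
MIN_PARTICIPANTS = 3

def results_visible(participant_count: int) -> bool:
    """The Team Report hides results until at least three people participated."""
    return participant_count >= MIN_PARTICIPANTS

print(results_visible(2))  # False -- results stay hidden
print(results_visible(5))  # True -- and more participants means more reliable results
```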
In addition to team members, Columinity encourages teams to include perspectives from people outside the team:
- Stakeholders: Stakeholders fill in a survey to evaluate their satisfaction with the outcomes of this team. Stakeholders have a stake in your team's work: they are satisfied when your team does well and worried when it struggles. For product teams, these are typically customers, end-users, key users, and product managers. External stakeholders tend to give the most honest and unbiased feedback.
- Supporters: Supporters fill in a survey to evaluate the support for your team. They are the people outside the team who support it in being effective, such as line managers, team managers, or team coaches.
3. Review Team Effectiveness
With Columinity, we want to help your team become more effective. Teams are effective when they can satisfy their stakeholders through a work process that is also enjoyable to team members. Team Effectiveness is the starting point for any analysis in Columinity. All the other factors we measure contribute to team effectiveness in one form or another, according to scientific studies.
- Click "Report".
- Locate the "Team Effectiveness Score" box and review your team's overall score (75). The score is plotted on a scale from 1 to 100.
- The benchmark is currently set to "Most teams in our database". Click the icon to change it. Teams on our paid plans can choose more benchmarks, such as "Most effective teams" or teams from certain sectors, regions, or organization sizes.
- The color of the bar for "Your team" indicates whether it is similar to the benchmark (yellow/orange), lower (red), or higher (green). (A minimal sketch of this color logic follows below.)
You can use this score to see if your interventions are helping your team improve.
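As a minimal sketch of how such a color coding could work: the classification below assumes that "similar" means the team score falls inside the benchmark's range, which is our assumption; the function is illustrative, not Columinity's actual implementation.

```python
# Illustrative sketch only -- not Columinity's actual implementation.
# Assumption: "similar" means the team score falls within the
# benchmark's band (low to high).

def score_color(team_score: float, bench_low: float, bench_high: float) -> str:
    """Classify a team score against a benchmark band."""
    if team_score < bench_low:
        return "red"     # lower than the benchmark
    if team_score > bench_high:
        return "green"   # higher than the benchmark
    return "yellow"      # similar to the benchmark

print(score_color(75, bench_low=55, bench_high=72))  # -> "green"
```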
4. Review core factors that contribute to Team Effectiveness
Team effectiveness is the outcome we're after with Columinity. But many factors contribute to it. The scientific model you picked when you set up the survey determines which factors are surveyed and analyzed. Each model identifies a handful of core factors that research has shown to be relevant, with each core factor typically having several sub-factors.
With this in mind, let's review the high-level results. The examples use the "Agile Team Effectiveness" model. If you pick another model, the factors will be different, but the principles are the same.
- Click "Report".
- Enable "Show range of scores in my team" (this feature is only available on our paid plans).
- Below, the "Core Factors" of the scientific model you picked will be visible. You can expand each core factor ("+") to review the sub-factors.
What it means
Let's use the factor "Responsiveness" as an example.
- Click the factor to see its definition.
- The arrows indicate the scores by the various segments (team members, stakeholders, and supporters):
Green indicates team members. In this case, three team members gave this factor a score of 65 (on a scale from 1 to 100). The green bar shows the range of scores among team members (15th to 85th percentile), from 58 to 68. This modest spread indicates that team members mostly agree.
Yellow indicates stakeholders. In this case, four stakeholders gave it a score of 64 (on a scale from 1 to 100). The yellow bar shows a substantial spread in scores, meaning stakeholders have different perspectives. A high spread suggests disagreement, or at least differing perspectives, so interpret the single number cautiously. In such cases, we recommend discussing what caused the spread and identifying whether there are real issues underneath. (A sketch of how such a spread can be computed follows this list.)
Red indicates supporters. This example does not show scores by supporters.
- The pink bar represents the selected benchmark (15th to 85th percentile). To change it, click the icon. The benchmark gives you a sense of how your team scores compared to other teams. Because every team is different, we always recommend interpreting this with caution.
- The arrow next to each factor shows the change since the previous scan for this team (if any). A small change like this reflects natural variation and is no cause for concern. Larger changes, such as for "Refinement" (-18), are more meaningful. Each large improvement is cause for celebration with your team, particularly if it concerns core factors.
- Each factor has a yellow, red, or green icon that indicates how it compares to either the benchmark or the previous survey for this team, depending on what is set under "What should we compare with" in the settings.
- The link "How to improve" leads you to actionable tips and improvements to improve this factor with your team.
ℹ️ Tip: Columinity hides a lot of statistical complexities from you. This can sometimes lead to seemingly counter-intuitive results. If you find something odd, please consult our FAQ, which lists many such scenarios.
ℹ️ Tip: If you want to know which questions in the survey link to each factor, please consult this page.
A possible interpretation
We recommend that you look over these results with your team. A facilitation guide in the Team Report shows exercises on how to do this in a way that maximizes inclusion. Here is one way to read the results:
- Team effectiveness has remained constant (delta is 0). The team has not become more or less effective, so any improvement actions we took between surveys have not yet improved our effectiveness.
- However, we see improvements in some core factors. Management support has significantly increased (+14), followed by Team Autonomy (+5). Stakeholder Concern has gone up by 3, which is probably not significant.
- While Responsiveness is stable, Continuous Improvement has decreased significantly (-7). This is reflected in most of the subfactors, and the scores for many are also below the benchmark.
- Our best bet for improving overall team effectiveness is to invest in Continuous Improvement, particularly in how effectively we learn as a team.
5. Review "Insights" to understand how factors interact
We offer several views in Columinity to analyze your team. In the previous step, we used a tabular overview. Let's consult a visual representation of the same results in this step. This is useful for understanding how factors fit together (based on scientific research) and guiding improvement actions.
- Click "Insights".
- The visualization shows all core factors and subfactors in one overview. If you only see core factors, your browser window is too small; enlarge the window or use a bigger screen to see the whole model.
Below is a screenshot of what you see. This example is based on the Agile Team Effectiveness model. If you choose another model, the factors will be different, but the principles will be the same.
What it means
- Team Effectiveness is shown on the right, in the blue circle. Any analysis should start here and work back through the core factors (circles) and their sub-factors (squares).
- If you click on a factor, a dialog with the definition, a breakdown by segment, and quick actions will appear.
- Each factor shows the average score of team members, stakeholders, and supporters (50). Click on the factor to see a breakdown by segment. The red bar behind "Continuous Improvement" shows how this score compares to the previous snapshot. The top bar shows a legend for the colors. If you click "Settings," you can compare against a selected benchmark. The colors of the bars and the legend are updated accordingly.
- The arrow below each factor shows the difference since the previous snapshot. If no previous snapshot exists, a question mark is shown instead.
- The dots below the factor reflect how many months ago the data was surveyed. Each dot reflects one month. The older the data, the less reliable it is for deciding what to do now.
- Each factor shows the improvement actions the team created for it. We will talk about this in more detail below. If any actions are marked as impeded by the team, their number is shown in an orange circle. The number of open and completed actions for a factor is shown in a dark circle. Click on either circle to go directly to the relevant actions.
- Some factors are measured only among team members (like "Refinement", "Team Morale" or "Sprint Goals"), whereas others are measured only among stakeholders (like "Stakeholders: Team Value"). Other factors are measured across multiple segments of participants, like "Support for Team Autonomy". Consult the breakdown of scores by clicking a factor to understand which groups of participants provided their perspective on it.
ℹ️ Tip: Customize this view with "Settings". These settings are tied to your session and not shared with other visitors. Consult this page to learn more about how the various settings impact the visualization.
Effects between factors
The arrows between the core factors reflect the effects we've found across them in our academic research. The thicker the line, the stronger the effect. So, Stakeholder Concern has a strong positive impact on Team Effectiveness. Continuous Improvement, in turn, has a strong effect on Stakeholder Concern. On the other hand, Team Autonomy has a small impact on Stakeholder Concern, but a much stronger one on Continuous Improvement. Team Autonomy improves the ability of teams to improve continuously, which in turn improves Stakeholder Concern and Team Effectiveness. These arrows offer an advanced way to think about what you want to improve first.
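As an illustration of this kind of reasoning, the sketch below models the arrows as a weighted graph and multiplies weights along a path to estimate an indirect effect. The weights are made-up placeholders, not the coefficients from our research.

```python
# Illustrative sketch only. The effect weights below are made-up
# placeholders, NOT the coefficients from Columinity's research.
effects = {
    ("Team Autonomy", "Continuous Improvement"): 0.6,
    ("Team Autonomy", "Stakeholder Concern"): 0.1,
    ("Continuous Improvement", "Stakeholder Concern"): 0.5,
    ("Stakeholder Concern", "Team Effectiveness"): 0.7,
}

def path_effect(path: list[str]) -> float:
    """Multiply edge weights along a path to estimate an indirect effect."""
    weight = 1.0
    for a, b in zip(path, path[1:]):
        weight *= effects[(a, b)]
    return weight

# The indirect route via Continuous Improvement outweighs the direct edge:
print(path_effect(["Team Autonomy", "Continuous Improvement",
                   "Stakeholder Concern", "Team Effectiveness"]))  # 0.21
print(path_effect(["Team Autonomy", "Stakeholder Concern",
                   "Team Effectiveness"]))                         # 0.07
```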
6. Review badges to celebrate successes
Columinity awards badges to your team for specific accomplishments. Click "Report" to view the badges in the right sidebar. Click on a badge to see why you earned it or how you can earn it. Badges can be gained and lost over time.
ℹ️ Tip: Make a habit of celebrating new and relevant badges with your team. Teams often focus so much on what isn't going well that they forget to celebrate what is going well.
7. Review tips for improvement actions
The first step in continuous improvement is to know what needs improvement. The second step is discovering how to improve, which is often more challenging. Columinity helps you in both areas. The previous steps helped you identify where issues exist. In this step, we review how Columinity helps you by suggesting actionable improvements:
- Click "Tips".
- Review the tips suggested by Columinity. The tips are grouped by factor and sorted by their expected impact (on a scale from 0 to 100). The impact scores take the model-based effect into account, as well as trends in the results. A detailed explanation is provided here.
- Each tip includes insights from research (expand with "+") and concrete improvement actions (expand with "+"). We always include three broad strategies for improving, followed by a handful of quick tips that are easy to start with. Most tips also include a selection of facilitation guides for self-run team workshops to focus on improving this factor.
- We always recommend that teams do what makes sense to them. You know your team much better than Columinity ever can, so use tips that seem useful and relevant to you and discard others.
ℹ️ Tip: The free version of Columinity is limited to three tips, hides facilitation guides, and shows only one quick tip per factor. Your team can unlock the full feedback if your organization acquires a paid plan.
8. Create improvement actions
In this final step, we create and track improvement actions. These actions typically result from the analysis you conduct with your team. If a new scan is performed a month or two later, we can assess the effectiveness of these actions and adjust accordingly.
- Click "Actions".
- Create, edit, or delete actions as needed. Tie each action to one or more factors that you expect will improve because of it. This makes it much easier to test whether your actions actually achieved that.
- Clicking the flame icon marks an action as impeded. This indicates that your team needs help from others, such as management or a coach, to resolve the issue. If your team is on a paid plan, impeded actions will appear in the Teams Dashboard so that team coaches and management can act proactively.
ℹ️ Tip: Many teams use other tools, like JIRA, to manage improvement tickets. With the paid plan, it is possible to export actions (CSV), which could then be imported into JIRA. However, we recommend that teams keep things simple and track model-based improvement actions in Columinity. With the paid plan, actions from all teams are also available in the Teams Dashboard.
9. Repeat
Now that you have completed the scan, analyzed the results, and identified improvements, it is time to implement them. We recommend that you repeat the scan (with at least the topics that are issues now) one or two months later. That way, you can see if your improvements worked out and celebrate success or adjust course.
- Click "Repeat".
- Either repeat now if you're ready or choose "Set reminder" to be notified later.
ℹ️ Tip: If you repeat the scan later and measure the same topics, Columinity will show the changes for each factor. If you measure only a subset of factors again, Columinity will only show the changes for the re-scanned factors. You can also merge snapshots from the past 12 months and use the most recent data for each factor under "Settings" ("Combine snapshots").
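Here is a minimal sketch of what "Combine snapshots" conceptually does, under our assumption that it keeps the most recent score per factor across the merged snapshots; Columinity's actual merge logic is not shown here.

```python
# Illustrative sketch only -- our reading of what "Combine snapshots"
# conceptually does: keep the most recent score per factor.
from datetime import date

snapshots = [  # hypothetical (date, {factor: score}) pairs within 12 months
    (date(2024, 3, 1), {"Responsiveness": 65, "Refinement": 70}),
    (date(2024, 5, 1), {"Responsiveness": 68}),  # partial re-scan
]

combined: dict[str, int] = {}
for _, scores in sorted(snapshots, key=lambda s: s[0]):  # oldest first
    combined.update(scores)  # later scans overwrite older scores

print(combined)  # {'Responsiveness': 68, 'Refinement': 70}
```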
10. Analyze trends
Once you start accumulating more scans for your team, Columinity allows you to analyze trends over time.
- Click "Trends".
- Select the factors you wish to analyze trends for under "Select Factors".
- Choose a summarization period. The default is 3 months: all snapshots for this team from the past three months are aggregated into one point, the three months before that into another, and so on. You can shorten the period, but shorter periods require more frequent scans (such as monthly). (A sketch of this bucketing follows the list.)
- Under "Action count", choose what type of action count summary you'd like to include. Completed actions should go along with improvements over time.
- Select a date range to limit to.
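To illustrate the summarization, here is a minimal sketch that buckets snapshot scores into three-month periods, counting back from today, and averages each bucket. The data is hypothetical, and Columinity's aggregation details may differ.

```python
# Illustrative sketch only -- hypothetical data; Columinity's actual
# aggregation may differ. Buckets snapshots into 3-month periods,
# counting back from today, and averages each bucket.
from collections import defaultdict
from datetime import date

snapshots = [(date(2024, 1, 10), 60), (date(2024, 2, 20), 62),
             (date(2024, 4, 5), 66), (date(2024, 6, 15), 70)]
today = date(2024, 6, 30)
period_months = 3

buckets: dict[int, list[int]] = defaultdict(list)
for day, score in snapshots:
    months_ago = (today.year - day.year) * 12 + (today.month - day.month)
    buckets[months_ago // period_months].append(score)

for bucket in sorted(buckets, reverse=True):  # oldest period first
    scores = buckets[bucket]
    print(f"{bucket * period_months}-{(bucket + 1) * period_months} months ago: "
          f"avg {sum(scores) / len(scores):.1f}")
```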