How Healthy is your Enterprise Social Network?

At the heart of any Enterprise Social Network (ESN) are the groups or communities formed within them. Understanding the health and productivity of these groups should therefore be front of mind. For ESNs we can look again to the more mature experiences with consumer and external customer communities for guidance. We have written previously about the need to take care when translating consumer network metrics to the Enterprise. But in the case of community health, we believe the mapping from external community to internal community can be fairly close.

What can we learn from consumer and customer networks?

Arguably the gold standard for community health measures was published several years ago by Lithium, a company that specialises in customer facing communities. Lithium used aggregate data from a decade’s worth of community activity (15 billion actions and 6 million users) to identify key measures of a community’s health:

  • Growth = Members (registrations)
  • Useful = Content (post and page views)
  • Popular = Traffic (visits)
  • Responsiveness (how quickly community members respond to each other)
  • Interactivity = Topic Interaction (depth of discussion threads, taking into account the number of contributors)
  • Liveliness (tracking a critical threshold of posting activity in any given area)
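Measures like these can be approximated from a raw activity log. A minimal Python sketch, with an invented log format and our own formula choices (not Lithium's actual definitions):

```python
from collections import defaultdict
from datetime import datetime

# Illustrative only: a tiny activity log with invented fields
# (timestamp, user, action, thread). Not Lithium's actual schema.
events = [
    ("2017-03-01T09:00", "ana", "post",  "t1"),
    ("2017-03-01T09:20", "ben", "reply", "t1"),
    ("2017-03-01T10:05", "cal", "reply", "t1"),
    ("2017-03-02T11:00", "ana", "post",  "t2"),
]

def community_measures(events):
    threads = defaultdict(list)
    for ts, user, action, thread in events:
        threads[thread].append((datetime.fromisoformat(ts), user, action))
    members = {user for _, user, _, _ in events}
    # Interactivity: mean number of distinct contributors per thread
    interactivity = sum(len({u for _, u, _ in t}) for t in threads.values()) / len(threads)
    # Responsiveness: mean minutes from a thread's opening post to its first reply
    lags = []
    for t in threads.values():
        t.sort(key=lambda e: e[0])
        first_reply = next((ts for ts, _, a in t if a == "reply"), None)
        if first_reply is not None:
            lags.append((first_reply - t[0][0]).total_seconds() / 60)
    responsiveness = sum(lags) / len(lags) if lags else None
    return {"members": len(members),
            "interactivity": interactivity,
            "responsiveness_min": responsiveness}
```

In this toy log, one thread draws three contributors with a first reply after 20 minutes, while the other sits unanswered, which is exactly the kind of contrast these measures are meant to surface.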


At the time of publishing, Lithium was hoping to facilitate the creation of an industry standard for measuring community health.

Other contributors to the measurement of online community health include the online community consultancy Feverbee, whose preferred measures are:

  • New visitors – a form of growth measure
  • New visitors who become registered members – a conversion rate measure
  • % of members who make a contribution – active participants
  • Members active within the past 30 days – time-based activity
  • Contributions per active member per month – a diversity and intensity measure
  • Visits per active member per month – a traffic measure
  • Content popularity – useful content

Marketing firm Digital Marketer's recommended health measures include:

  • Measuring the total number of active members, rather than including passive members.
  • Number of members who made their first contribution as a proxy for growth.
  • A sense of community (using traditional survey methods).
  • Retention of active members i.e. minimal loss of active members (churn rate).
  • Diversity of membership, especially with respect to innovation communities.
  • Maturity, with reference to the Community Roundtable Maturity Model.

Using SWOOP for Assessing Enterprise Community/Group Health

SWOOP is focused on the Enterprise market, so we are very interested in what we can usefully draw from the experience of online consumer and customer networks. The following summarises the measures identified above and how SWOOP currently addresses them (or does not):

Customer community health measure – SWOOP enterprise health measure:

  • Growth in Membership – SWOOP measures active membership and provides a trend chart to monitor both growth and decline.
  • Useful Content – A most engaging posts widget assesses the usefulness of content posted. We are currently developing a sentiment assessment for content.
  • Popularity/Traffic – SWOOP does not currently measure views or reads. Our focus is more on the connections that may result from content viewing.
  • Responsiveness – A response rate widget identifies the overall response rate, the type of response (e.g. like or reply) and the time period within which responses are made.
  • Interactivity – Several rich measures, including network connectivity and a network map, give–receive balance and two-way connections. The Topic tab also identifies interactivity around tagged topics.
  • Liveliness – The activity per user widget provides the closest thing to a liveliness (or lack of liveliness) indicator.
  • Activity over time – The Active Users and Activity per User widgets report on this measure.
  • Contributions per member – The Activity per User widget provides this. The new Community Health Index provides a 12-month history as well as alarms when certain thresholds are breached.
  • Sense of community – Requires a survey, which is outside the scope of SWOOP.
  • Retention – Not currently measured directly. The active members trend chart gives a sense of retention, but does not specifically measure individual retention rates.
  • Diversity – Not provided on the SWOOP dashboard, but now included in the SWOOP benchmarking service. Diversity can be measured across several dimensions, depending on the profile data provided to SWOOP, e.g. formal lines of business, geography, gender. In the absence of profile data, diversity is measured by the diversity of individuals' membership of groups.
  • Maturity – The Community Roundtable maturity assessment is a generic one for both online and offline communities. Our preference is a maturity framework more aligned to ESNs, which we have reported on earlier. How the SWOOP measures relate to this maturity curve is shown below.
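As an illustration of a diversity measure of the kind described above, a normalised Shannon-entropy score over one profile attribute can be used. The formula choice here is ours, not necessarily SWOOP's:

```python
from collections import Counter
from math import log

def diversity(attributes):
    """Normalised Shannon entropy of a group's attribute values.

    0.0 means every member shares one value; 1.0 means an even spread.
    Illustrative formula only, not necessarily SWOOP's actual measure.
    """
    counts = Counter(attributes)
    n = len(attributes)
    if len(counts) <= 1:
        return 0.0
    entropy = -sum((c / n) * log(c / n) for c in counts.values())
    return entropy / log(len(counts))

# e.g. the geography attribute of a group's members
score = diversity(["EU", "EU", "US", "APAC"])
```

The same function works unchanged for any dimension, whether line of business, geography or gender, which is why profile data makes the benchmarking richer.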

[Chart: SWOOP measures mapped to the ESN maturity curve]

Thresholds for What's Good, Not So Good and Bad

We know that health measures are important, but they are of little use without providing some sense of what a good, bad or neutral score is. In the human health scenario, it is easy to find out what these thresholds are for basic health measures like BMI and Blood Pressure. This is because the medical research community has been able to access masses of data to correlate with actual health outcomes, to determine these thresholds with some degree of confidence. Online communities have yet to reach such a level of maturity, but the same ‘big data’ approach for determining health thresholds still applies.

As noted earlier, Lithium has gone furthest in achieving this, thanks to the large data sets available to it on its customer platform. At SWOOP we are collecting similar data for ESNs, though not yet at the scale Lithium has achieved. Nevertheless, we believe we now have a starting point with our new Community Health Index widget. While we are only using a single 'activity per active user' measure, we have been able to establish some initial thresholds by analysing hundreds of groups across several Yammer installations.
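One simple way to derive such thresholds is to take percentile bands from the observed distribution of a measure across many groups. A sketch, assuming one 'activity per active user' figure per group; the band logic and labels are invented for illustration, not SWOOP's actual Community Health Index:

```python
# Toy 'activity per active user' figures for a set of groups (invented numbers).
group_activity = [0.4, 0.9, 1.1, 1.6, 2.2, 2.8, 3.5, 4.1, 6.0]

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an already-sorted list."""
    k = round(p / 100 * (len(sorted_vals) - 1))
    return sorted_vals[k]

def health_bands(values):
    """Cut-offs from the observed distribution across many groups."""
    vals = sorted(values)
    return {"low": percentile(vals, 25), "high": percentile(vals, 75)}

def rate_group(value, bands):
    """Invented labels: flag groups below the 25th percentile for attention."""
    if value < bands["low"]:
        return "needs attention"
    if value > bands["high"]:
        return "healthy"
    return "typical"

bands = health_bands(group_activity)
```

The key design point is that the cut-offs come from the data itself, so they sharpen automatically as more groups are benchmarked, much as medical thresholds firmed up with more outcome data.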

[Community Health Index widget]

Our intent is to provide community/group leaders with an early warning system for when their groups may require some added attention. The effects of this attention can then be monitored in the widget itself, or more comprehensively through the suite of SWOOP measures identified in the table above.

Communities are the core value drivers of any ESN. Healthy enterprise communities lead to healthy businesses, so it's worth taking the trouble to actively monitor their health.


Data-Driven Collaboration Part 1: How Rich Data Can Improve Your Communication

Originally published on Carpool.

This is the first of a series, coauthored by Laurence Lock Lee of Swoop Analytics and Chris Slemp of Carpool Agency, in which we will explain how you can use rich, people-focused data to enhance communication, increase collaboration, and develop a more efficient and productive workforce.

It’s safe to say that every enterprise hungers for new and better ways of working. It’s even safer to say that the path to those new and better ways is often a struggle.

Many who struggle do so because they are starting from a weak foundation. Some are simply following trends. Others believe they should adopt a new tool or capability simply because it was bundled with another service. Then there are those organizations that focus primarily on “reining in” non-compliant behaviors or tools.

But there’s a way to be innovative and compliant that also improves your adoption: focus instead on the business value of working in new ways—be data-driven. When you incorporate information about your usage patterns to set your goals, you are better positioned to track the value of your efforts and drive the behavior changes that will help you achieve your business objectives.

While market research is assumed to be critical when marketing to customers, internal audience research has gained far less traction, even though it yields the same kinds of returns. Data-driven internal communication planning starts at the very beginning of your project.

Here we will demonstrate—using real-world examples—how Carpool and Swoop use data to create better communications environments, nurture those environments, and make iterative improvements to ensure enterprises are always working to their full potential.

Use Data to Identify Your Actual Pain Points

One team Carpool worked with was focused on partnering with customers and consultants to create innovations. They thought they needed a more effective intranet site that would sell their value to internal partners. However, a round of interviews with key stakeholders and end consumers revealed that a better site wasn't going to address the core challenge: there were too many places to go for information, and each source seemed to tell a slightly different story. We worked with the client to consolidate communication channels and implemented a more manageable content strategy that focused on informal discussion and formal announcements from trusted sources.

In the end, the research we had gathered allowed us to identify the client's real pain point and help them address it.

Use Data to Identify New Opportunities

Data can drive even the earliest strategy conversations. In Carpool's first meeting with a global retail operation, the client explained that they wanted to create a new Yammer network to curb activity in another, unapproved network. Not only did we agree, but we brought data to that conversation that illustrated the exact size and shape of their compliance situation and the nature of the collaboration already happening. This set the tone for a project that is now laser-focused on demonstrating business value, not just bringing their network into compliance.

Use Data to Identify and Enhance Your Strengths

In-depth interviews can be added to the objective data coming from your service usage. Interviews reveal the most important and effective channels, and the responses can be mapped visually to highlight where a communication ecosystem has broadcasters without observers, or groups of catalysts who are sharing knowledge without building any broader consensus or inclusion.

Below, you see one of Carpool’s chord chart diagrams we use to map the interview data we gather. We can filter the information to focus on specific channels and tools, which we then break down further to pinpoint where we have weaknesses, strengths, gaps, and opportunities in our information flow.

[Chord chart diagram]

Turning Data Into Action

These kinds of diagnostic exercises can reveal baselines and specific strategies that can be employed with leaders of the project or the organization.

One of the first activities organizations undertake when implementing an Enterprise Social Networking (ESN) platform is to encourage staff to form collaborative groups and then move their collaboration online. This is the first real signal of ‘shop floor empowerment’, where staff are free to form groups and collaborate as they see fit, without the oversight of their line management. As these groups form, the inevitable ‘long tail’ effect kicks in, where the vast majority of these groups fall into disuse, in contrast to a much smaller number that are wildly successful, and achieving all of the expectations for the ESN. So how can organizations increase their Win/Loss ratio? At Swoop Analytics we have started to look at some of the ‘start-up’ patterns of the Yammer installations of our benchmarking partners. These patterns can emerge after as little as 6 months of operations.

Below, we show a typical first-6-months network performance chart, which measures group performance on the dimensions of Diversity (group size), Cohesion (mean 2-way relationships formed), and Activity (postings, replies, likes, etc.). We then overlay the chart with 'goal state' regions reflecting the common group types typically found in ESN implementations. Each region reflects the anticipated networking pattern for a well-performing group of the given type. If a group sits in the goal-state region matching its stated purpose, we would suggest it is well positioned to deliver tangible business benefits aligned with that purpose. If it sits outside the goal state, the framework provides implicit guidance on what has to happen to move it there.

[Bubble graph: group performance at 6 months]

At launch, all groups start in the bottom left-hand corner. As you can see, a select few have 'exploded out of the blocks', while the majority are still struggling to make an impact. The 6-month benchmark provides an early opportunity for group leaders to assess their group against peer groups, learn from each other, and then begin to accelerate their own performance.
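The three dimensions used in this chart can be derived from raw interaction records. A minimal Python sketch, with invented record fields and our own formula choices (not necessarily SWOOP's):

```python
from collections import defaultdict

# Invented interaction records: (group, sender, receiver).
interactions = [
    ("sales", "ana", "ben"), ("sales", "ben", "ana"),
    ("sales", "ana", "cal"), ("sales", "cal", "ben"),
]

def group_dimensions(interactions):
    by_group = defaultdict(list)
    for group, sender, receiver in interactions:
        by_group[group].append((sender, receiver))
    out = {}
    for group, pairs in by_group.items():
        members = {person for pair in pairs for person in pair}
        directed = set(pairs)
        # A 2-way relationship exists when both directions are present
        two_way = {frozenset(p) for p in directed if (p[1], p[0]) in directed}
        out[group] = {
            "size": len(members),                     # Diversity proxy: group size
            "cohesion": len(two_way) / len(members),  # reciprocated ties per member
            "activity": len(pairs),                   # raw interaction count
        }
    return out
```

Separating the three dimensions matters: a group can be large and busy yet still score poorly on cohesion if nobody's interactions are ever reciprocated.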

Painting the Big Picture

The convergence of multiple data sources paints a holistic picture of communication and collaboration that extends beyond team boundaries. This new picture extends across platforms and prescribes the design for an ecosystem that meets user and business needs, aligns with industry trends, and is informed by actual usage patterns.

[Ecosystem design diagram]

The discussion about the ROI of adopting new ways of working, such as ESNs, hasn’t disappeared. While we believe it’s a waste of resources to try measuring a return from new technologies that have already been proven, it’s clear that developing business metrics and holding these projects accountable to them is just as critical as any effort to increase productivity.

The nature of these metrics also needs to shift from a focus on “counts and amounts” to measures of a higher order that tie more closely to business value. For example, knowing that posting activity has risen by 25% in a year may make you feel a little better about your investment in a collaboration platform. Knowing that there is a higher ratio of people engaging vs. those who are simply consuming is much better. Showing a strong correlation in departments that have higher percentages of engaged users with lower attrition rates … that’s gold.
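Higher-order measures of this kind are straightforward to compute once per-department figures exist. A toy sketch with invented department names and numbers, illustrating an engaged-vs-consuming ratio and its correlation with attrition:

```python
# Hypothetical per-department figures; names and numbers are invented.
departments = {
    "sales":   {"engaged": 40, "consumers": 60, "attrition": 0.12},
    "finance": {"engaged": 15, "consumers": 85, "attrition": 0.22},
    "rnd":     {"engaged": 55, "consumers": 45, "attrition": 0.08},
}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Ratio of engaged users vs. those simply consuming, per department
ratios = [d["engaged"] / (d["engaged"] + d["consumers"]) for d in departments.values()]
attrition = [d["attrition"] for d in departments.values()]
r = pearson(ratios, attrition)  # strongly negative in this toy data
```

In this made-up data the departments with more engaged users have lower attrition, which is the "gold" correlation the paragraph above describes; real data would of course need far more departments and proper statistical care.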

So now is the time to look at your own organization and wonder: “Do I track how my people are connecting? Do I know how to help them become more engaged and productive? When was the last time I measured the impact of my internal communication ecosystem?”

Then take a moment to imagine the possibilities of what you could do with all of that information.

Stay tuned in the coming weeks for Part 2 and Part 3 when we address the topics of driving engagement by identifying types of enterprise social behavior in individuals, and the results we’ve seen from being data-driven in how we shape internal communications and collaboration.

Need to convince someone? Bring Data (and a good story)


As Daniel Pink suggests, "to sell is human". Even if we do not have a formal selling role, we are always looking to 'sell' someone on our point of view, our recommendations, or our need for their help. As data analysts we live and breathe data every day, whether we are looking to develop new insights, prove a case or simply explore possibilities. In the end we are doing it to influence someone or some group. In these days of 'evidence-based decision-making' I am wary that one person's 'evidence' is another person's 'garbage'. You don't have to look much further than climate change sceptics to appreciate that. I was therefore intrigued when I came across Shawn Callahan's recent blog post on "The role of stories in data storytelling". Shawn talks about the use of 'story' before, during and after data analysis.

Before data analysis stories

Before data analysis is about understanding the dominant 'story' that exists before your analysis. For us, a good example is our recent work comparing relationship analytics with activity analytics. The dominant storyline was (and probably still is) that the social analytics used in the consumer world, i.e. activity measures, are sufficient for use inside the enterprise.

During data analysis stories

The 'during data analysis' story is about the stories that evolve from the act of data analysis itself. Our story in the interactions vs. activity debate came from one of our clients observing some analytics provided by Swoop and finding that the measure for social cohesion was far more reflective of their view of how different communities were collaborating and performing than the activity measures reported beside them. For us, 'stories during data analysis' are continual. We are always looking for the 'story behind the data', and this usually comes when we can talk directly to the owners of the data, in what we call 'sense-making' sessions. As an example, we are currently looking at adoption patterns for Yammer using some of the benchmarking data we have collected. We have learnt from experience that collaboration happens best within 'groups'. Our prior analysis showed that social cohesion varies a lot between groups and follows a typical 'power curve' distribution when sorted from best to worst. We are now looking at how these groups evolved over time. What patterns existed for the highly cohesive groups versus the less cohesive ones? Is there a story behind these different groups? Our evolving stories are merely speculations at the moment, until we can validate them with the owners of the data.
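The 'power curve' shape is easy to check: sort per-group cohesion scores from best to worst and see how much of the total the top few groups account for. A toy sketch with invented scores:

```python
# Invented cohesion scores for ten groups, keyed by made-up group names.
cohesion = {"g1": 9.1, "g2": 4.0, "g3": 1.8, "g4": 0.9, "g5": 0.4,
            "g6": 0.2, "g7": 0.1, "g8": 0.1, "g9": 0.0, "g10": 0.0}

# Sort from best to worst, as in the analysis described above.
ranked = sorted(cohesion.items(), key=lambda kv: kv[1], reverse=True)

# Share of all cohesion accounted for by the top two groups.
total = sum(cohesion.values())
top_share = sum(score for _, score in ranked[:2]) / total
# In a long-tail ('power curve') distribution, a handful of groups
# account for most of the cohesion, as top_share shows here.
```

If real benchmarking data shows the same concentrated shape, the interesting follow-up question is the one posed above: what did the few highly cohesive groups do differently in their first months?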

After data analysis stories

Shawn Callahan identifies these stories as the bridge between what the data analyst 'knows' and what the decision makers need to act on. He goes on to describe types of data stories: chronological change, explanation and discovery stories. He recommends that if you are trying to displace a dominant current story, your story has to be better than that one. Thankfully, in our case we don't believe there is a dominant story for the use of activity analytics in Enterprise Social Networking (ESN) implementations. Of course there are supporters, some quite passionate, but the majority view is that activity analytics are insufficient for the needs of the enterprise. That said, you still need to come up with a good story, and that is still a work in progress for us. We can use a discovery story to relate the trigger for our data analysis: a simple comment from a client. But our sense is that we will need even more data (evidence), couched in some powerful stories told by individuals who have changed their interaction behaviours for the better based on the analytics they were provided with.

I should finish by giving Shawn’s recent book “Putting Stories to Work” a plug, since I have just completed reading it to help us develop that story. So watch this space!