How Healthy is your Enterprise Social Network?

At the heart of any Enterprise Social Network (ESN) are the groups or communities formed within it. Understanding the health and productivity of these groups should therefore be front of mind. For ESNs, we can again look to the more mature experience of consumer and external customer communities for guidance. We have written previously about the need to take care when translating consumer network metrics to the enterprise, but in the case of community health, we believe the mapping from external community to internal community can be fairly close.

What can we learn from consumer and customer networks?

Arguably the gold standard for community health measures was published several years ago by Lithium, a company that specialises in customer-facing communities. Lithium used aggregate data from a decade's worth of community activity (15 billion actions and 6 million users) to identify key measures of a community's health:

  • Growth = Members (registrations)
  • Usefulness = Content (post and page views)
  • Popularity = Traffic (visits)
  • Responsiveness = Speed of response of community members to each other
  • Interactivity = Topic interaction (depth of discussion threads, taking into account the number of contributors)
  • Liveliness = A critical threshold of posting activity in any given area
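Most of these measures can be approximated from a raw activity log. The sketch below is illustrative only: the event schema and the interactivity formula are our own assumptions, not Lithium's published definitions.

```python
from collections import defaultdict

def community_health(events):
    """Rough analogues of Lithium-style health measures, computed from
    an event log of (user, action, thread_id, timestamp) tuples."""
    members = set()
    visits = 0
    views = 0
    posts_per_thread = defaultdict(int)
    contributors_per_thread = defaultdict(set)
    for user, action, thread, ts in events:
        members.add(user)
        if action == "visit":
            visits += 1
        elif action == "view":
            views += 1
        elif action == "post":
            posts_per_thread[thread] += 1
            contributors_per_thread[thread].add(user)
    # Interactivity: average thread depth weighted by contributor count
    interactivity = (
        sum(posts_per_thread[t] * len(contributors_per_thread[t])
            for t in posts_per_thread) / len(posts_per_thread)
        if posts_per_thread else 0.0
    )
    return {
        "growth": len(members),          # membership
        "popular": visits,               # traffic
        "useful": views,                 # content views
        "interactivity": interactivity,  # depth x breadth of threads
    }
```

Responsiveness and liveliness would need timestamps per reply and per-area posting rates, which are omitted here for brevity.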


At the time of publishing, Lithium was hoping to facilitate the creation of an industry standard for measuring community health.

Other contributors to the measurement of online community health include the online community consultancy Feverbee, whose preferred measures are:

  • New visitors – a growth measure
  • New visitors who become registered members – a conversion rate measure
  • % of members who make a contribution – active participants
  • Members active within the past 30 days – a time-based activity measure
  • Contributions per active member per month – a diversity and intensity measure
  • Visits per active member per month – a traffic measure
  • Content popularity – useful content
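Most of Feverbee's measures reduce to simple ratios over counts a platform already collects. The sketch below uses our own simplified definitions, not Feverbee's exact formulas.

```python
def feverbee_measures(visitors, new_members, contributors,
                      active_30d, contributions, visits, total_members):
    """Simple ratio versions of the Feverbee-style measures.
    Inputs are monthly counts; definitions are our simplification."""
    active = max(active_30d, 1)  # guard against empty communities
    return {
        "conversion_rate": new_members / max(visitors, 1),
        "pct_contributing": contributors / max(total_members, 1),
        "contributions_per_active": contributions / active,
        "visits_per_active": visits / active,
    }
```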

Marketing firm Digital Marketer's recommended health measures include:

  • Measuring the total number of active members, rather than including passive members.
  • Number of members who made their first contribution as a proxy for growth.
  • A sense of community (using traditional survey methods).
  • Retention of active members, i.e. minimal loss of active members (churn rate).
  • Diversity of membership, especially with respect to innovation communities.
  • Maturity, with reference to the Community Roundtable Maturity Model.

Using SWOOP for Assessing Enterprise Community/Group Health

SWOOP is focused on the enterprise market, so we are keen to draw what we usefully can from the experience of online consumer and customer networks. The following table summarises the measures identified above and how SWOOP currently addresses them (or not):

Customer community health measures and the corresponding SWOOP enterprise health measures:

  • Growth in Membership – SWOOP measures active membership and provides a trend chart to monitor both growth and decline.
  • Useful Content – A Most Engaging Posts widget assesses the usefulness of content posted. We are currently developing a sentiment assessment for content.
  • Popularity/Traffic – SWOOP does not currently measure views or reads; our focus is more on the connections that may result from content viewing.
  • Responsiveness – A Response Rate widget identifies the overall response rate, the type of response (e.g. like, reply) and the time period within which responses are made.
  • Interactivity – Several rich measures, including network connectivity and a network map, give-receive balance and two-way connections. The Topic tab also identifies interactivity around tagged topics.
  • Liveliness – The Activity per User widget provides the closest thing to a liveliness (or lack of liveliness) indicator.
  • Activity over time – The Active Users and Activity per User widgets report on this measure.
  • Contributions per member – The Activity per User widget provides this, and the new Community Health Index provides a 12-month history as well as alarms when certain thresholds are breached.
  • Sense of community – Requires a survey, which is outside the scope of SWOOP.
  • Retention – Not currently measured directly. The active members trend chart gives a sense of retention, but does not specifically measure individual retention rates.
  • Diversity – Not provided on the SWOOP dashboard, but now included in the SWOOP benchmarking service. Diversity can be measured across several dimensions, depending on the profile data provided to SWOOP (e.g. formal lines of business, geography, gender). In the absence of profile data, diversity is measured by the diversity of individuals' group memberships.
  • Maturity – The Community Roundtable maturity assessment is a generic one for both online and offline communities. Our preference is a maturity framework more closely aligned to ESNs, which we have reported on earlier. How the SWOOP measures relate to this maturity curve is shown below.

[Figure: SWOOP measures mapped to the ESN maturity curve]

Thresholds for What's Good, Not So Good, and Bad

We know that health measures are important, but they are of little use without some sense of what a good, bad or neutral score is. For human health, it is easy to find the thresholds for basic measures like BMI and blood pressure, because the medical research community has been able to correlate masses of data with actual health outcomes and so determine these thresholds with some confidence. Online communities have yet to reach that level of maturity, but the same 'big data' approach to determining health thresholds still applies.

As noted earlier, Lithium has gone furthest in achieving this, using the large data sets available from its customer platform. At SWOOP we are collecting similar data for ESNs, though not yet at the scale Lithium has achieved. Nevertheless, we believe we now have a starting point with our new Community Health Index widget. While we are only using a single 'activity per active user' measure, we have been able to establish some initial thresholds by analysing hundreds of groups across several Yammer installations.
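An early-warning indicator of this kind reduces to banding a single score against thresholds. The cut-off values below are illustrative placeholders, not SWOOP's actual thresholds.

```python
def health_band(activities_per_active_user, low=1.0, high=3.0):
    """Band a group's monthly 'activities per active user' score into a
    traffic-light health rating. Cut-offs are illustrative, not SWOOP's."""
    if activities_per_active_user >= high:
        return "healthy"
    if activities_per_active_user >= low:
        return "watch"      # worth some added attention
    return "at risk"        # raise an alarm to the group leader
```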

[Figure: Community Health Index widget]

Our intent is to provide community/group leaders with an early warning system for when their groups may require some added attention. The effects of this attention can then be monitored in the widget itself, or more comprehensively through the suite of SWOOP measures identified in the table above.

Communities are the core value drivers of any ESN. Healthy enterprise communities lead to healthy businesses, so it's worth taking the trouble to actively monitor their health.


Data-Driven Collaboration Part 3: Sustaining Performance through Continuous Value Delivery

In Part 1 of our series on Data-Driven Collaboration, “How Rich Data Can Improve Your Communication,” we identified how to plan for collaboration by ensuring that goals were established and aligned with our organizational strategy. We then moved on to Part 2, “Recognizing Personas and Behaviors to Improve Engagement,” to explain how you can build engagement by managing behaviors. In this, the final post in our series, co-authored by Swoop Analytics and Carpool Agency, we will identify how to sustain the momentum to ensure that value is continuously delivered as a matter of course.

Previously, we identified the importance of migrating from simple activity measures to those that signify when collaborative relationships are being formed. It is through these relationships that tangible outcomes are achieved. Therefore, it is not surprising that analytics—as applied to sustained relationship-building—plays an important role in continuous value delivery from collaboration.

For example, the CEO of one of Carpool's clients had been using Yammer to receive questions for a regular Q&A session, but the team had grown concerned that the CEO's infrequent posts in the group were creating an echo chamber among the same small set of contributors. Careful analysis showed that this was more perception than reality: the group showed a great deal of variety in cross-organization conversation. As this was precisely the executive's goal in forming the group, the team doubled down on their investment in this executive-to-company relationship.

Monitoring Maturation Using Analytics

At SWOOP, we have been benchmarking Yammer installations from start-up to 'normal operations' for some time. With Yammer, the typical start-up pattern is a bottom-up use of 'free' Yammer, which, for some, lasts for many years. Without exception, however, sustained usage only occurred after a formal launch and the tacit approval of senior management. We observed different start-up patterns, from the 'big bang' public launch through to more organic, yet managed, approaches. Whatever strategy is used, organizations always reach a stage of steady-state operations or, at worst, slow decline.

[Figure: Classic Yammer]

For an Enterprise Social Network (ESN) like Yammer, we have found that the average engagement rate (i.e., non-observers) across the 35+ organizations in our benchmark set is around 29%, with the best at around 75%. It is evident from our benchmarking that for larger organizations (more than, say, 5,000 participants) it can be hard to achieve engagement levels above 30%. However, this doesn't mean that staff aren't collaborating.
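The engagement rate quoted here is simply the share of members whose persona is anything other than Observer. A minimal sketch:

```python
def engagement_rate(personas):
    """Share of ESN members who are more than Observers, i.e. the
    'non-observer' engagement rate used in the benchmarks."""
    total = len(personas)
    if total == 0:
        return 0.0
    engaged = sum(1 for p in personas if p != "Observer")
    return engaged / total
```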

We are seeing a proliferation of offerings that make up the digital office. For a small organization, Yammer may be their main collaboration tool, where team level activities take place. For larger organizations, however, Yammer may be seen as a place to explore opportunities and build capabilities, rather than as an execution space. Increasingly, tools like Slack, HipChat, and now Microsoft Teams are being used to fill this space for some teams that depend on real-time conversations as their primary mode of communication.

A Collaboration Performance Framework

As organizations mature in their use of collaboration tools, it is critical not to be caught in the 'collaboration for collaboration's sake' cycle. As we indicated in "How Rich Data Can Improve Your Communication," collaboration must happen with a purpose and goals in mind. The path to achieving strategic goals is rarely linear; more often, we need to adopt a framework of continuous improvement toward our stated goals. For many organizations, this will take the form of a 'Plan, Do, Check, Act' cycle. However, in this age of digital disruption and transformation, we need a framework that can accommodate transformational as well as incremental innovation.

At SWOOP, we have developed a collaboration performance framework drawn from Network Science.

[Figure: Diversity and cohesion framework]

The framework balances two important dimensions for collaborative performance: diversity and cohesion. It identifies a continuous cycle of value delivery, whether it be radical or incremental. Let’s consider an innovation example, with an organizational goal of growing revenue by 200%:

Individuals may have their own ideas for how this radical target could be achieved. By ‘Exploring’ these ideas with others, we can start to get a sense of how feasible our ideas might be, but also have the opportunity to combine ideas to improve their prospects. The important ‘Engaging’ phase would see the ideas brokered between the originators and stakeholders. These stakeholders may be the key beneficiaries and/or providers of the resources needed to exploit a highly prospective idea. Finally, the ‘Exploiting’ phase requires the focus and strong cooperation of a smaller group of participants operating as a team to deliver on the idea.

The performance framework can be deployed at all levels, from enterprise-wide to individual business units, informal groups, teams, and right down to the individual. In a typical Carpool engagement, we work with smaller teams to demonstrate this cycle and then use the success stories to replicate the pattern more broadly. A current client started with a smaller community of interest of 400 people, and is now expanding the pattern to their global, 4,000-member division.

Deploying Analytics and the Performance Framework

Like any performance framework, it can't operate without data. While traditional outcome measures still need to be present, the important predictors of collaborative success are relationship-centered measures. For example, your personal network can be assessed for its diversity by profiling the members of your network. Your personal network's cohesiveness can be measured, firstly, by how many of your connections are connected to each other and, secondly, by how many of those connections are two-way (reciprocated). We can then add layers provided by HR systems, such as gender, geography, organizational role, age, and ethnicity, to provide a complete picture of diversity beyond the typical dimensions.
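The two cohesion measures described above can be sketched directly from an interaction edge list. This is a minimal illustration of the idea, not SWOOP's implementation.

```python
def network_cohesion(ego, edges):
    """Two simple cohesion measures for a personal network, given a set
    of directed (source, target) interaction edges: the density of ties
    among the ego's contacts, and the share of the ego's outgoing ties
    that are reciprocated."""
    contacts = ({t for s, t in edges if s == ego} |
                {s for s, t in edges if t == ego})
    n = len(contacts)
    # Share of possible contact-to-contact ties that actually exist
    among = sum(1 for s, t in edges if s in contacts and t in contacts)
    possible = n * (n - 1)
    density = among / possible if possible else 0.0
    # Share of the ego's outgoing ties that are returned
    out = {t for s, t in edges if s == ego}
    reciprocated = sum(1 for t in out if (t, ego) in edges)
    reciprocity = reciprocated / len(out) if out else 0.0
    return density, reciprocity
```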

In the example below, we show the collaboration performance of participants in a large Yammer network over a 12-month period. You can see how challenging it might be to become an ‘Engager’, maximizing both diversity and cohesion.

[Figure: Bubble graph of collaboration performance]

We profiled their personal networks for diversity, cohesion, and size, and plotted them on the performance framework. Interestingly, the data showed that this Yammer network is a place for exploring and, for some, engaging. There is a gap, however, in the Exploiting region. This is not to say that these individuals were poor at putting projects into motion. More likely, at least in this organization, the ESN is not the usual place to collaborate as a team. If there is no easy transition from the ESN to a team environment, then we have a problem that many ESNs experience: lots of activity but a perception of few tangible results directly from the ESN. Carpool's approach puts this data together with data from other services and sources to create a holistic picture of the results and impact of the organization's collaboration evolution.

Continuous Monitoring

For many organizations, continuous monitoring simply means monitoring activity on digital platforms. As we indicated in “Recognizing Personas and Behaviors to Improve Engagement,” activity monitoring can be a poor predictor of performance. At SWOOP, we look at activity that establishes or strengthens a relationship. In the screenshot below, you can see measures such as the number of two-way reciprocated relationships; the degree to which relationships are forming between the formal organizational departments; and who is influential, based on the size of their network, not how frequently they contributed. We identify key player risk by looking at how polarized a network may be among a selected few leaders. Even the Activity/User measure inside groups predicts how cohesive that group may be. By providing this data in real-time, we have the best opportunity for both leaders and individuals to adapt their patterns of collaboration as they see fit.

[Figure: Collaboration chart]

At Carpool, our engagements use a set of such dashboards to regularly check in on all the various channels and stakeholders, and make recommendations on an ongoing basis that accounts for the holistic communication picture.

Final Thoughts

In this series, we have taken you on a journey from planning for, launching, and productively operating a digital office. At the very beginning we emphasized the need to collaborate for a purpose. We then emphasized the need to ‘engage’ through relationships and adopting appropriate behavioral personas. Finally, we have explained the importance of adopting a collaboration performance framework that can facilitate continuous delivery of value.

To do all of this effectively, we not only need analytics, but interventions triggered by those analytics to improve the way we work. Analytics on their own don't create change, but in the hands of skilled facilitators, analytics and rich data provide a platform for productive change. Collaboration is not simply about getting better results for your organization; it is also about getting better results for yourself, by helping you become a better collaborator.

Want More?

We hope these insights into data-driven collaboration give you new ideas to innovate your own approach to internal communication. If you have any questions, or would like to learn how to establish, nurture, and grow deep internal communities, Carpool and SWOOP have a team ready to help you grow your business and drive collaboration today.

Yammer Benchmarking Insights #3 – Collaboration at the Personal Level

In this episode we drill down to the most detailed level: you, the individual collaborator.

At SWOOP we have designed behavioural Personas that characterise individual collaboration patterns based on your pattern of activity. For example, if you are a Catalyst, you are good at getting responses to your posts. Catalysts are important for energising a community and driving the engagement of others. If you are a Responder, you are good at responding to other people's posts. Responders are important for sustaining a community and extending discussions. An Engager balances Catalyst and Responder behaviour and is seen as the Persona to aspire to: the Engager effectively balances what they give to others, in the form of posts, replies, likes and the like, with what they receive from others, and is therefore well placed to broker new relationships. Broadcasters tend to post without engaging in conversations. Observers are simply not very active, with less than one activity every two weeks. We see Broadcasting and Observing as negative Personas.
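To make the Persona definitions concrete, here is a toy classifier. Apart from the 'one activity every two weeks' Observer rule stated above, the specific ratios and cut-offs are illustrative assumptions, not SWOOP's actual rules.

```python
def classify_persona(posts, replies, responses_received, weeks):
    """Classify a user's collaboration pattern into one of the five
    Personas. Thresholds other than the Observer rule are illustrative."""
    total_activity = posts + replies
    # Observer: less than one activity every two weeks
    if total_activity / weeks < 0.5:
        return "Observer"
    # Broadcaster: posts, but no conversation in either direction
    if posts > 0 and replies == 0 and responses_received == 0:
        return "Broadcaster"
    give, receive = replies, responses_received
    # Engager: roughly balanced give and receive
    if give and receive and 0.5 <= give / receive <= 2.0:
        return "Engager"
    # Catalyst provokes more responses than they give; Responder the reverse
    return "Catalyst" if receive > give else "Responder"
```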

[Figure: Behavioural Personas]

What does an organisation's portfolio of Personas typically look like? The results below are generated from our benchmarking of close to 40 organisations. The lines indicate the minimum-maximum range and the blue square is the average score.

[Figure: Persona proportions]

The large range of % Observers, from less than 10% to over 70%, may reflect the large variation in maturity among the organisations we have benchmarked. Maturity may not be the only factor, though: it is fair to say that smaller organisations have an easier time engaging a higher proportion of their staff with the Enterprise Social Network (ESN). The break-up of the active (non-Observer) Personas shows that Catalysts lead the way with just over 40%, followed by Responders at just under 30%, Engagers at just over 20% and Broadcasters at 10%. This would indicate that, in general, ESNs rely on Catalysts to drive participation and Responders to sustain it.

Personas within Groups

Given that groups are the space where most of the intense collaboration is likely to happen, we were interested in the Persona patterns of the leaders of the best-performing groups. We used a combination of two-way connection scores and activity scores to identify the strongest groups, then applied the same measures to the group members to identify the group leaders. In other words, a group leader is someone who has a high number of two-way connections with other group members and meets a threshold level of overall activity.
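The selection rule above can be stated in a few lines; the two threshold values here are illustrative placeholders.

```python
def find_group_leaders(members, min_two_way=5, min_activity=10):
    """Identify group leaders as members with a high number of two-way
    connections inside the group AND a threshold level of activity.
    `members` maps name -> (two_way_connections, activity_count)."""
    return [
        name for name, (two_way, activity) in members.items()
        if two_way >= min_two_way and activity >= min_activity
    ]
```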

Firstly, we plotted all members on a graph, locating them by the size of their network within the group (y-axis) and the number of two-way connections they have in the group (x-axis). Each bubble is sized by the member's relative level of interactions (activity). As you can see, the group leaders are clearly identified in the top right-hand corner of the graph as differently coloured nodes.

[Figure: Persona tracking]

Secondly, we plotted the top five leaders' Persona movements in one-week intervals over a six-month period. In the example above you can see that the leaders primarily played the Catalyst, Engager and Responder roles. The size of the bubbles reflects their relative number of connections made (breadth of influence) for that week. Not all leaders were active every week. What becomes interesting is that some leaders have preferred Personas that are sustained over time: Leaders 1 and 4 have a preference for Catalysing and Engaging, Leader 5 prefers Responding, and Leaders 2 and 3 appear comfortable switching between Personas.

What appears to be important here is that high performing groups need leaders that can cover the spectrum of positive Personas i.e. Catalyst, Engager, Responder. While it’s fine to have leaders who have a preference for a certain behavioural Persona, it is useful to have leaders who can adapt their Persona to the situation or context at hand.

Personal Networking Performance

At SWOOP we use a fundamental network performance framework, which measures performance against the complementary dimensions of cohesion and diversity. We have indicated that individuals with a large number of two-way connections are likely to have more closed and cohesive networks. Cohesive networks are good for getting things done (executing/implementing). From an innovation perspective however, closed networks can be impervious to new ideas. The best ideas come from more open and diverse networks. In our view therefore, maximum network performance occurs by optimising diversity and cohesion. In other words, it’s good to be part of a strong cohesive network, but this should not be at the expense of maintaining a healthy suite of more diverse connections.

In the graphic below we have plotted the members of one large group on the Network Performance graph. In this case the diversity is measured by the number of different groups that an individual has participated in. The size of the bubbles reflects the size of the individual’s network (breadth of influence).

[Figure: Personal network performance]

We have labelled regions in the graph according to our Explore/Engage/Exploit model of innovation through networks. The majority of group members sit in the 'High Diversity/Low Cohesion' Explore region, which is consistent with the reasons many people give for joining a group. The 'Engage' region shows those members who are optimising their diversity/cohesion balance; these are the most important leaders in the group, and in an innovation context they are best placed to broker the connections required to take a good idea into implementation. The bottom right corner is the Exploit region, which for this group is fairly vacant. This might suggest that the group would have difficulty organically deploying an innovation: it would need to take explicit steps to engage an implementation team to execute on the new products, services or practices it initiates.
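A sketch of how members might be assigned to these regions from normalised diversity and cohesion scores. The region names follow the model described above, but the 0.5 cut-off is our illustrative assumption.

```python
def innovation_region(diversity, cohesion, cut=0.5):
    """Place an individual on the Explore/Engage/Exploit map using
    normalised (0-1) diversity and cohesion scores."""
    if diversity >= cut and cohesion >= cut:
        return "Engage"    # balances both dimensions; brokers ideas
    if diversity >= cut:
        return "Explore"   # high diversity, low cohesion
    if cohesion >= cut:
        return "Exploit"   # cohesive, focused execution
    return "Inactive"      # low on both dimensions
```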

The Innovation Cycle – Create New Value for Your Organisation

We conclude this third edition of Yammer Benchmarking Insights by reinforcing the role that individuals can play in creating new value for their organisations. For many organisations, ESNs like Yammer are seen as a means of accelerating the innovation that is often stagnating within the formal lines of business.

As individuals, we may have a preference for a given style of working, as characterised by our Personas. Your personal network may be large, open and diverse; or smaller, closed and cohesive; or somewhere in between. It is important, however, to see how your collaboration behaviours contribute to the innovation performance of your organisation. Innovation is a collaborative activity, and we therefore recommend that in your groups you:

  1. Avoid lone work (Observing/Broadcasting) and look to explore new ideas and opportunities collaboratively, online (Catalysing/Engaging/Responding).
  2. Recognise that implementing good ideas needs resources, and those resources are owned by the formal lines of business. Use your network to engage with the resource holders. Make the connections. Influence on-line and off-line.
  3. When you have organisational resources behind you, it’s time to go into exploit mode. Build the cohesive focussed teams to execute/implement, avoiding distractions until the job is done.


Data-Driven Collaboration Part 2: Recognizing Personas and Behaviors to Improve Engagement

In Part 1 of this series, “Data-Driven Collaboration Design”—a collaboration between Swoop Analytics and Carpool Agency—we demonstrated how data can be used as a diagnostic tool to inform the goals and strategies that drive your business’ internal communication and collaboration. 

In this post, we will take that thought one step further and show how, after your course is charted to improve internal communication and collaboration, your data continues to play a vital role in shaping your journey.

Monitoring More Than Participation

Only in the very initial stages of the launch of a new Enterprise Social Network (ESN) or group do we pay any attention to how much activity we see. Quickly, we move to watching such metrics as average response time; breadth of participation across the organization, teams, roles, or regions; and whether conversations are crossing those boundaries. We focus on measures that show something much closer to business value and motivate organizations to strengthen communities.
For our purposes in this post, it will be useful to pivot our strategy to one that focuses on influential individuals. The community or team, whether it's a community of practice, a community of shared interest, or a working team, isn't a "group" or "site," but a collection of individuals, with all the messiness, pride, altruism, and politics that implies. Data can be used to layer some purpose and direction over that messiness.

Patterns Become Personas

The Swoop Social Network Analytics dashboard uniquely provides analytics customized to each person in an organization's ESN. Using the principle of "when you can see how you work, you are better placed to change how you work," the intent is for individual collaborators to receive real-time feedback on their online collaboration patterns so they can respond appropriately.
We analyzed the individual online collaboration patterns across several organizations and identified a number of distinct trends that reflect the majority of personal collaboration behaviors. With that data, we were able to identify five distinct personas: Observers, Engagers, Catalysts, Responders, and Broadcasters.

In addition to classifying patterns into personas, we developed a ranking of the personas most needed to enhance an organization's overall collaboration performance. At the top we place the Engager, a role that can grow and sustain a community or team through its balance of posting and responding. This is closely followed by the Catalyst, who can energize a community by provoking responses and engaging with a broad network of colleagues. The Responder ensures that participants get feedback, an important role in sustaining a community. The Broadcaster is mostly seen as a negative persona: they post content but tend not to engage in the conversations that are central to productive collaboration. Finally, we have Observers, sometimes also called 'lurkers', who are seen as a negative persona with respect to collaboration: while they may well be learning from the contributions of others, they are not explicitly collaborating.
Using Personas to Improve Your Online Collaboration Behavior
Individuals who log in to the Swoop platform are provided with a privacy-protected personal view of their online collaboration behaviors. The user is provided with their persona classification for the selected period, together with the social network of relationships that they have formed through their interactions:

You may notice that the balance between what you receive and what you contribute is central to determining persona classification. Balanced contributions amongst collaboration partners have been shown to be a key characteristic of high performing teams, hence the placement of the ‘Engager’ as the preferred persona.

Our benchmarking of some 35 Yammer installations demonstrates that 71% of participants, on average, are Observers. Of the positive personas, the Catalyst is the most common, followed by Responders, Engagers, and Broadcasters. It’s therefore not surprising that an organization’s priority often involves converting Observers into more active participants. Enrolling Observers into more active personas is a task that falls on the more-active Engagers and Catalysts, with Responders playing a role of keeping them there.
At Carpool, during a recent engagement with a client, we encountered a senior leadership team that was comprised of Broadcasters who relied on traditional internal communications. Through our coaching—all the while showing them data on their own behavior and the engagement of their audience—they have since transformed into Catalysts.
One team, for example, had been recruiting beta testers through more traditional email broadcasts. But after just a few posts in a more interactive and visible environment, where we taught them how to invite an active conversation, they have seen not only the value of more immediate feedback, but a larger turnout for their tests. Now, it’s all we can do to provide them with all the data they’re asking for!
Identifying the Key Players for Building Increased Participation

When Swoop looks at an organization overall, we will typically find that a small number of participants are responsible for the lion’s share of the connecting and networking load. In the social media world, these people are called ‘influencers’ and are typically measured by the size of the audience they can attract. In our Persona characterization, we refer to them as Catalysts. Unlike the world of consumer marketing—and this point is critical—attracting eyeballs is only part of the challenge. In the enterprise, we need people to actively collaborate and produce tangible business outcomes. This can only happen by engaging the audience in active relationship-building and cooperative work. This added dimension of relationship-building is needed to identify who the real key players are.
In our work with clients, Carpool teaches this concept by coaching influencers to focus on being “interested” in the work of others rather than on being “interesting” through the content they share, whether that’s an interesting link or pithy comment. With one client, our strategy is to take an organization’s leader, a solid Engager in the public social media space, and “transplant” him into the internal communications environment where he can not only legitimize the forum, but also model the behavior we want to see.
In the chart below, we show a typical ‘Personal Network Performance’ chart, using Enterprise Social Networking data from the most active participants in an enterprise. The two dimensions broadly capture an individual’s personal network size (number of unique connections) against the depth of relationships they have been able to form with them (number of reciprocated two-way connections). They reflect our Engager persona characteristics. Additionally, we have sized the bubbles by a diversity index assessed by their posting behavior across multiple groups.
The true ‘Key Players’ on this chart can be seen in the top right-hand corner. These individuals have not only been able to attract a large audience, but also engaged with that audience and reciprocated two-way interactions. And the greater their diversity of connections (bubble size), the more effective they are likely to be.

Data like this is useful in identifying current and potential key players and organizational leaders, and helps us shift those online collaboration personas from Catalyst to Engager and scale up as far and as broadly as they can go.

Continuous Coaching

Having data and continuous feedback on your online collaboration performance is one thing, but effectively using that feedback to build both your online and offline collaboration capability requires planning and, of course, other people to collaborate with! Carpool believes in a phased approach: change the behavior of a local team first, then, like ripples in a pond, expand the movement to new ways of working through compelling storytelling, using the data that drove previous waves of change.
To get started now, think about your own teams. Would you be prepared to have your team share their collaboration performance data and persona classifications? Are you complementing each other, or competing? If that’s a little too aggressive, why not form a “Working Out Loud” circle with some volunteers where you can collectively work on personal goals for personal collaboration capability, sharing, and critiquing one another’s networking performance data as you progress?
Think about what it takes to move from one behavior Persona to another. How would you accomplish such a transformation, personally? What about the teams you work in and with? Then come back for the next, and final, part of this co-authored series between Swoop and Carpool, where we will explain the value in gaining insights from ongoing analytics and the cycle of behavior changes, analysis, and pivoting strategies.

Data-Driven Collaboration Part 1: How Rich Data Can Improve Your Communication

Originally published on Carpool.

This is the first of a series, coauthored by Laurence Lock Lee of Swoop Analytics and Chris Slemp of Carpool Agency, in which we will explain how you can use rich, people-focused data to enhance communication, increase collaboration, and develop a more efficient and productive workforce.

It’s safe to say that every enterprise hungers for new and better ways of working. It’s even safer to say that the path to those new and better ways is often a struggle.

Many who struggle do so because they are starting from a weak foundation. Some are simply following trends. Others believe they should adopt a new tool or capability simply because it was bundled with another service. Then there are those organizations that focus primarily on “reining in” non-compliant behaviors or tools.

But there’s a way to be innovative and compliant that also improves your adoption: focus instead on the business value of working in new ways—be data-driven. When you incorporate information about your usage patterns to set your goals, you are better positioned to track the value of your efforts and drive the behavior changes that will help you achieve your business objectives.

While it’s assumed that doing market research is critical when marketing to customers, investments in internal audience research have gained less traction, yet they yield the same kinds of return. Data-driven internal communication planning starts at the very beginning of your project.

Here we will demonstrate—using real-world examples—how Carpool and Swoop use data to create better communications environments, nurture those environments, and make iterative improvements to ensure enterprises are always working to their full potential.

Use Data to Identify Your Actual Pain Points

One team Carpool worked with was focused on partnering with customers and consultants to create innovations. They thought they needed a more effective intranet site that would sell their value to internal partners. However, a round of interviews with key stakeholders and end-of-line consumers revealed that a better site wasn’t going to address the core challenge: There were too many places to go for information and each source seemed to tell a slightly different story. We worked with the client to consolidate communications channels and implemented a more manageable content strategy that focused on informal discussion and formal announcements from trusted sources.

In the end, we were able to identify the real pain point for the client and help them address it accordingly because of the research we obtained.

Use Data to Identify New Opportunities

Data can drive even the earliest strategy conversations. In Carpool’s first meeting with a global retail operation, they explained that they wanted to create a new Yammer network as they were trying to curb activity in another, unapproved network. Not only did we agree, but we brought data to that conversation that illustrated the exact size and shape of their compliance situation and the nature of the collaboration that was already happening. This set the tone for a project that is now laser-focused on demonstrating business value and not just bringing their network into compliance.

Use Data to Identify and Enhance Your Strengths

In-depth interviews can be added to the objective data coming from your service usage. Interviews reveal the most important and effective channels, and the responses can be mapped visually to highlight where a communication ecosystem has broadcasters without observers, or groups of catalysts who are sharing knowledge without building any broader consensus or inclusion.

Below, you see one of Carpool’s chord chart diagrams we use to map the interview data we gather. We can filter the information to focus on specific channels and tools, which we then break down further to pinpoint where we have weaknesses, strengths, gaps, and opportunities in our information flow.

CHORD CHART

Turning Data Into Action

These kinds of diagnostic exercises can reveal baselines and specific strategies that can be employed with leaders of the project or the organization.

One of the first activities organizations undertake when implementing an Enterprise Social Networking (ESN) platform is to encourage staff to form collaborative groups and then move their collaboration online. This is the first real signal of ‘shop floor empowerment’, where staff are free to form groups and collaborate as they see fit, without the oversight of their line management. As these groups form, the inevitable ‘long tail’ effect kicks in: the vast majority of groups fall into disuse, in contrast to a much smaller number that are wildly successful and achieve all of the expectations for the ESN. So how can organizations increase their win/loss ratio? At Swoop Analytics we have started to look at some of the ‘start-up’ patterns in the Yammer installations of our benchmarking partners. These patterns can emerge after as little as six months of operation.

Below, we show a typical first 6 months’ network performance chart, which measures group performance on the dimensions of Diversity (Group Size), Cohesion (Mean 2-Way Relationships formed), and Activity (postings, replies, likes etc.). We then overlay the chart with ‘goal state’ regions reflecting the common group types typically found in ESN implementations. The regions reflect the anticipated networking patterns for a well-performing group of the given type. If a group’s stated purpose positions them in the goal-state region, then we would suggest that they are well positioned to deliver tangible business benefits, aligned with their stated purpose. If they are outside of the goal state, then the framework provides them with implicit guidance as to what has to happen to move them there.

BUBBLE GRAPH

At launch, all groups start in the bottom left-hand corner. As you can see, a select few have ‘exploded out of the blocks’, while the majority are still struggling to make an impact. The six-month benchmark provides an early opportunity for group leaders to assess their group against peer groups, learn from each other, and then begin to accelerate their own performance.
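For illustration only, the three group dimensions described above could be computed from a group’s interaction log along these lines. The `(sender, recipient)` input format and the cohesion definition (mean two-way relationships per member) are assumptions of this sketch, not the SWOOP implementation.

```python
from collections import defaultdict

def group_metrics(events):
    """Summarise a group's performance on the three chart dimensions.

    `events` is a list of (sender, recipient) interaction pairs within the
    group (replies, likes, mentions). Returns:
      diversity - number of unique participants (group size)
      cohesion  - mean two-way relationships formed per member
      activity  - total interactions
    """
    out = defaultdict(set)
    for s, r in events:
        if s != r:
            out[s].add(r)
    members = set(out) | {r for tos in out.values() for r in tos}
    # Unordered pairs where both sides have interacted with each other.
    two_way_pairs = {
        frozenset((a, b))
        for a, tos in out.items()
        for b in tos
        if a in out.get(b, set())
    }
    cohesion = (2 * len(two_way_pairs) / len(members)) if members else 0.0
    return {
        "diversity": len(members),
        "cohesion": round(cohesion, 2),
        "activity": len(events),
    }
```

A group could then be placed against its ‘goal state’ region by comparing these three numbers with the target ranges for its stated group type.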

Painting the Big Picture

The convergence of multiple data sources paints a holistic picture of communication and collaboration that extends beyond team boundaries. This new picture extends across platforms and prescribes the design for an ecosystem that meets user and business needs, aligns with industry trends, and is informed by actual usage patterns.

ECOSYSTEM DESIGN

The discussion about the ROI of adopting new ways of working, such as ESNs, hasn’t disappeared. While we believe it’s a waste of resources to try measuring a return from new technologies that have already been proven, it’s clear that developing business metrics and holding these projects accountable to them is just as critical as any effort to increase productivity.

The nature of these metrics also needs to shift from a focus on “counts and amounts” to measures of a higher order that tie more closely to business value. For example, knowing that posting activity has risen by 25% in a year may make you feel a little better about your investment in a collaboration platform. Knowing that there is a higher ratio of people engaging vs. those who are simply consuming is much better. Showing a strong correlation in departments that have higher percentages of engaged users with lower attrition rates … that’s gold.

So now is the time to look at your own organization and wonder: “Do I track how my people are connecting? Do I know how to help them become more engaged and productive? When was the last time I measured the impact of my internal communication ecosystem?”

Then take a moment to imagine the possibilities of what you could do with all of that information.

Stay tuned in the coming weeks for Part 2 and Part 3 when we address the topics of driving engagement by identifying types of enterprise social behavior in individuals, and the results we’ve seen from being data-driven in how we shape internal communications and collaboration.

Are we Getting Closer to True Knowledge Sharing Systems?

knowledge-systems

(image credit: https://mariaalbatok.wordpress.com/2015/02/10/religious-knowledge-systems/)

First generation knowledge management (KM) systems were essentially re-labelled content stores. Labelling such content as ‘knowledge’ did much to discredit the whole Knowledge Management movement of the 1990s. During this time, I commonly referred to knowledge management systems as needing to comprise both “collections and connections”, but we had forgotten about the “connections”. This shortcoming was addressed with the advent of Enterprise Social Networking (ESN) systems like Yammer, Jive, IBM Connections and now Workplace from Facebook. So now we do have both collections and connections. But do we now have true knowledge sharing?

Who do we Rely on for Knowledge Based Support?

A common occupation for KM professionals is to try and delineate a boundary between information, that can be effectively managed in an information store, and knowledge, which is implicitly and tacitly held by individuals. Tacit knowledge, arguably, can only be shared through direct human interaction. In our Social Network Analysis (SNA) consulting work we regularly surveyed staff on who they relied on to get their work done. We stumbled on the idea of asking them to qualify their selections by choosing only one of:

  • They review and approve my work (implies a line management connection)
  • They provide information that I need (implies an information brokering connection)
  • They provide advice to help me solve difficult problems (implies a knowledge-based connection)

The forced choice was key. It proved to be a great way of delineating the information brokers from the true knowledge providers and the pure line managers. When we created our ‘top 10’ lists for each role, there was regularly very little overlap. For organisations, the critical value in these nominations is that the knowledge providers are the hardest people to replace, and it is therefore critical to know who they are. And who they are is not always apparent to line management!

So how do staff distribute their connection needs amongst line managers, information brokers and knowledge providers? We collated the results of several organisational surveys using this identical question, comprising over 35,000 nominations, and came up with the following:

work-done

With knowledge providers attracting 50% of the nominations, the results reinforce the perception that knowledge holders are critical to any organisation.

What do Knowledge Providers Look Like?

So what is special about these peer-identified knowledge providers? Are they the ‘wise owls’ of the organisation, with long experience spanning many different areas? Or are they technical specialists with deep knowledge of fairly narrow areas? We took one organisation’s results and assessed the leaders in each of the categories of Approve/review, Information and Knowledge/Advice, looking for their breadth, or diversity, of influence. We measured this by calculating the percentage of connections nominating them as an important resource that came from outside their home business unit. Here are the results:

external-links

As we might anticipate, the inferred line managers had the broadest diversity of influence. The lowest percentage, for the knowledge providers, suggests that people look for knowledge and advice not from the broadly experienced wise old owls, but from those specialising in relatively narrow areas.
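The diversity-of-influence measure described above is straightforward to compute: the percentage of a person’s inbound nominations that come from outside their home business unit. A minimal sketch, assuming nominations arrive as a mapping of nominator to business unit (names hypothetical):

```python
def external_influence(nominations, home_unit):
    """Percentage of nominations coming from outside the home business unit.

    `nominations` maps each nominator to that nominator's business unit;
    `home_unit` is the nominee's own business unit.
    """
    if not nominations:
        return 0.0
    external = sum(1 for unit in nominations.values() if unit != home_unit)
    return round(100.0 * external / len(nominations), 1)
```

Computing this per person for each category (Approve/review, Information, Knowledge/Advice) and averaging would reproduce the comparison in the chart.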

Implications for Knowledge Sharing Systems

We have previously written about our Network Performance Framework, where performance is judged based on how individuals, groups, or even full organisations balance diversity and cohesion in their internal networks:

personal-networking

The above framework identifies ‘Specialists’ as those who have limited diversity but a strong following, i.e. many nominations as a key resource. These appear to be the people identified as critical knowledge providers.

The question now is whether online systems are identifying and supporting specialists to share their knowledge. At SWOOP we have begun to explore this question by applying a modification of this performance framework to interaction data drawn from Microsoft Yammer installations:

performance

We measured each individual’s diversity of connections (y-axis) from their activities across multiple Yammer groups. The x-axis identifies the number of reciprocated connections an individual has, i.e. stronger ties, with the size of each bubble representing the size of that individual’s personal network. We can see here that we have been able to identify, from their Yammer activities, that select few ‘Specialists’ in the lower-diversity/stronger-cohesion quadrant. These specialists all have relatively large networks of influence.

What we might infer from the above analysis is that an ESN like Yammer can identify the most prospective knowledge providers whom staff are seeking out for knowledge transfer. But the bigger question is whether actual knowledge transfer can happen solely through an ESN like Yammer.

Is Having Systems that Provide Connections and Collections Enough to Ensure Effective Knowledge Sharing?

The knowledge management and social networking research is rich with studies addressing how social network structure affects effective knowledge sharing. While an exhaustive literature review is beyond the scope of this article, for those inclined, this article on Network Structure and Knowledge Transfer: The Effects of Cohesion and Range is representative. Essentially this research suggests that ‘codified’ knowledge is best transferred through weak ties, but tacit knowledge sharing requires strong-tie relationships. Codified knowledge commonly relates to stored artefacts like best-practice procedural documents, lessons-learned libraries, case studies and perhaps even archived online Q&A forums. Tacit knowledge by definition cannot be codified, and therefore can only be shared through direct personal interactions.

I would contend that relationships formed solely through ESN interactions, or in fact through any electronic system like chat or email, are substantially weaker than those built through regular face-to-face interactions. Sharing complex tacit knowledge needs frequent and regular human interaction, and it is unlikely that the strength of tie required can be achieved solely through commonly available digital systems. What ESNs can do effectively is help identify who you should be targeting as a knowledge sharing partner. Of course this situation is changing rapidly, as more immersive collaboration experiences are developed. But right now: for codified knowledge, yes; for tacit knowledge, not yet.

 

Getting “Liked”: Is Content Overrated?

We are regularly bombarded with the message that “Content is King”, quickly followed by a plethora of methods, tips and even tricks on how to make our content more attractive, i.e. “Liked” by many. Social media introduced the “Like” button so we can more explicitly signal our appreciation of the content we are exposed to. But how much of that appreciation is directed at the “content” of the message, and how much at the messenger? We have some recent analytics that provide new insights on this.

Content or Messenger?

content-image

Doubt about the true value of content was first flagged by Canadian philosopher Marshall McLuhan, with his often-quoted “the medium is the message” statement in the 1960s. In the age of social media, this has now morphed into “the messenger is the message”, with the rise to prominence of the “Influencer”. Influencers are those rare individuals who can influence the buying behaviours of many, simply through the power of their personal recommendation. Think about your own “liking” behaviour on Facebook. How often would you “like” a passive Facebook advertising page, as opposed to “liking” a posting made by a human influencer linking back to that very same page? This is a clear example of the power of the messenger being more important than the message itself.

 

Enterprise “Liking”

I have recently written about how the “Like Economy” we experience in consumer social networks may not map well when social networks move inside the enterprise in the form of Enterprise Social Networks (ESN). Unlike consumer social networks, we are unlikely to see advertisements tolerated in the ESN. But enterprises often do want to send messages to “all staff”, particularly for major change initiatives they want staff to “buy into”. Regularly, corporate communications staff are keen to look at statistics on how often the message is read and even ‘liked’. But is this a true reflection of engagement with a message?

Our benchmarking of ESNs has identified that “Likes” make up well over 50% of all activities undertaken on ESNs. In the absence of carefully crafted advertising sites, just what is driving our “liking” behaviour in the enterprise? We decided to explore this not by looking at every message posted (for privacy reasons Swoop does not access message content), but by looking at patterns of who “Likes” were directed at. We aggregated the “Likes” from three of our benchmarking partner organisations, for individuals who had posted more than 500 “Likes” over a 12-month period. Collectively, over 4,000 individuals met the criteria. We then categorised their “Likes” according to:

“Like” characteristic and our interpretation:

  • One-off (‘Like’ recipient was a once-only occurrence): attraction is largely based on the content of the message alone.
  • Repeat recipient (‘Like’ recipient was a repeat recipient from this individual): recipients are potentially ‘influencers’, so the motivation may come from the person more so than the message content.
  • Reciprocated (‘Like’ recipient has also been a ‘Like’ provider for this individual): recipients have a ‘relationship’ with the ‘Liker’, which drives this behaviour.
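A minimal sketch of this categorisation, assuming the “Likes” are available as `(liker, recipient)` pairs. The precedence given to ‘reciprocated’ over ‘repeat’ is an assumption of this sketch, not a statement of the actual analysis method.

```python
from collections import Counter

def classify_likes(likes):
    """Classify 'likes' as one-off, repeat, or reciprocated.

    `likes` is a list of (liker, recipient) pairs. A like is counted as
    'reciprocated' if the recipient has ever liked the liker back,
    'repeat' if the same liker liked this recipient more than once,
    and 'one-off' otherwise.
    """
    counts = Counter(likes)
    totals = {"one-off": 0, "repeat": 0, "reciprocated": 0}
    for (liker, recipient), n in counts.items():
        if counts.get((recipient, liker), 0) > 0:
            totals["reciprocated"] += n
        elif n > 1:
            totals["repeat"] += n
        else:
            totals["one-off"] += n
    return totals
```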


‘Like’ Analysis Results

The results of our analysis are shown below:

like-analysis

The results show clearly that in the enterprise context, the driver for ‘liking’ behaviour is the relationship. The data suggest that you are nearly three times as likely to attract a ‘like’ to your message from someone if you have previously ‘liked’ a posting of theirs.

So what are the implications for the enterprise?

If an enterprise is relying on counting ‘likes’ as a measure of staff engagement, it needs to encourage the formation of relationships through reciprocated actions as a priority, over spending time ‘crafting the perfect message’ or relying on influencers to build engagement. Specifically, one could:

  • Acknowledge a “Like”, in particular, if you have never responded to this person before.
  • Craft your important messages as a means to start a conversation, more so than a statement of opinion. Explicitly frame your statement as a question or explicitly ask for feedback.
  • Start to think about ‘engagement’ as more than a ‘read’ or a ‘like’, and more from a relationship perspective. How deeply and broadly is your issue being discussed?
  • When you read advice from social media experts on “how to generate more ‘Likes’ for your content”, replace this with “how to generate more ‘relationships’ using your content”.

As I am writing this post I’m painfully reminded of the need to ‘eat your own dog food’. So I’m making a commitment that if you respond or ‘like’ this article, I will at least try to respond in kind!

likeimage

 

How do these results map with your own experiences?

Yammer Benchmarking Edition 1

 

First in a series of SWOOP Yammer Benchmarking video blogs. Swoop has benchmarked some 36 Yammer installations to date. This first video blog shares some insights gained on the important measures that influence collaboration performance.

 

Video script:

SLIDE 1

Hello there

My Name is Laurence Lock Lee, and I’m the Co-Founder and Chief Scientist at Swoop Analytics.

If you are watching this you probably know what we do, but just in case you don’t, Swoop is a social analytics dashboard that draws its raw data from enterprise social networking tools like Yammer and provides collaboration intelligence to its users, who can be anyone in the organisation.

Our plan is to provide an ongoing series of short video blogs specifically on our Yammer benchmarking insights, as we work with the data we collect. We will aim to use this format to keep you appraised of developments as they happen. We have also recently signed a joint research agreement with the Digital Disruption Research Group at the University of Sydney in Australia. So expect to see the results of this initiative covered in future editions.

The Swoop privacy safeguards mean it’s pure context-free analysis: no organisational names, group names or individual names…we don’t collect them.

SLIDE 2

This is the “Relationships First” benchmarking framework we designed for our benchmarking. We also measure traditional activity, which we tend not to favour as a collaboration performance measure…but more about that later. The 14 measures help us characterise the organisations we benchmark by comparing them against the maximum, minimum and average scores of those in our sample set, which currently sits at 36 organisations and is growing rapidly. They represent organisations large and small from a full cross-section of industries and geographies.

SLIDE 3

For those of you who have not been exposed to the Swoop behavioural online personas, you will find a number of articles on our blog.

Because I will be referring to them it’s useful to know the connection patterns inferred by each of them. We don’t include the ‘Observer’ persona here as they are basically non-participants.

Starting with the Responder: Responders make connections by responding to other people’s posts or replies. This can be a simple ‘like’, mention or notification…and it often is, but sometimes it is a full written reply.

In contrast, the Catalyst makes connections through people replying to their posts. A good Catalyst can make many connections through a single good post. Responders have to work a bit harder; they mostly get only one connection per interaction.

The Engager, as you can see, is able to mix their giving and receiving. This is a bit of an art, but important, as Engagers are often the real connectors in a community or group.

And what about the Broadcaster? Well, if your posts don’t attract any response, then we can’t identify any connections for you.

SLIDE 4

This is how we present our benchmarking results to the participants. You can see that we have the 14 dimensions normalized such that the ‘best in class’ results are scored at 100 points and the worst performance at zero. The orange points are the scores for the organisation, with lines connecting them to the average scores.

A few points to note: we only count ‘active users’, being those who have had at least one activity in Yammer over the period we analyze, which is the most recent six months.

Some of the measures have asterisks (*), which means that the score has been reversed for comparison purposes. For example, a high score for %Observers is actually a bad result, so it is reversed for comparison.
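The normalization described here is a standard min-max scaling, with an optional flip for the asterisked measures where a high raw value is bad. A minimal sketch, not the SWOOP implementation:

```python
def normalize_scores(values, reverse=False):
    """Min-max normalize raw benchmark values so best-in-class = 100, worst = 0.

    For measures where a high raw value is actually bad (marked * in the
    chart, e.g. %Observers), set reverse=True to flip the scale.
    """
    lo, hi = min(values), max(values)
    if hi == lo:  # all organisations identical; treat everyone as best-in-class
        return [100.0 for _ in values]
    scaled = [100.0 * (v - lo) / (hi - lo) for v in values]
    return [100.0 - s for s in scaled] if reverse else scaled
```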

Finally, not all of the measures are independent of each other, so it is possible to see recurring patterns amongst organisations, and we can therefore tell the story of their journey to date through these patterns. For example, a poor post/reply ratio indicates to us that the network is immature, so we would also expect a high %Observers score.

SLIDE 5

One way of understanding which of the 14 measures are most important to monitor is to look at the relative variances of each measure across the full sample set. Where we see a large relative variance, we might assume that this is the area offering the most opportunity for improvement. In our sample to date, the two-way connections measure leads the way; I’ll go into a bit more detail on this later. The %Direction measure relies solely on the use of the ‘notification’ type, which we know some organisations have asked users to avoid, as it’s really just like a cc in an email; so perhaps we can ignore this one to some extent. The post/reply measure is, we believe, an indicator of maturity. For a new network we would expect a higher proportion of posts to replies, as community leaders look to grow activity. Over time, however, we would expect the ratio to move toward favoring replies, as participants become more comfortable with online discussions.

It’s not surprising that this measure shows up, as we have quite a mix of organisations at different maturity stages in our sample to date. The areas where we have seen less variance are the behavioural personas, perhaps with the exception of %Broadcasters. This suggests that, at least at the enterprise level, organisations are behaving similarly.

SLIDE 6

This slide is a little more complex, but it is important if you are to gain an appreciation of some of the important relationship measures that SWOOP reports on.

Follow this simple example:

Mr Catalyst here makes a post in Yammer. It attracts a response from Ms Responder and Mr Engager. These responses we call interactions, or activities. By undertaking an interaction, we have also created a connection for all three participants.

Now, Mr Engager’s response was a written reply that mentions Ms Responder, because that’s the sort of guy he is. Mr Catalyst responds in kind, so now you can see that Mr Catalyst and Mr Engager have created a two-way connection.

And Ms Responder responds to Mr Engager’s mention with an appreciative like, thereby creating a two-way connection between Mr Engager and Ms Responder. Mr Engager is now placed as a broker of the relationship between Mr Catalyst and Ms Responder. Mr Catalyst could create his own two-way connection with Ms Responder, but perhaps she just responded to Mr Catalyst with a like…leaving little opportunity for a return response.

So after this little flurry of activity each individual can reflect on connections made…as Mr Engager is doing here.

So in summary: an interaction is any activity on the platform. A connection is created by an interaction and, of course, strengthened by further interactions with that connection. Finally, we value two-way connections because they represent reciprocity, which we know leads to trust and more productive collaboration.
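The worked example above can be replayed in a few lines. This sketch counts two-way connections per participant from a hypothetical list of `(actor, target)` interactions mirroring the slide; the event list and names are illustrative only.

```python
def two_way_connections(interactions):
    """Count reciprocated (two-way) connections per person from a list of
    (actor, target) interactions."""
    out = {}
    for a, b in interactions:
        out.setdefault(a, set()).add(b)
    return {
        p: sum(1 for q in targets if p in out.get(q, set()))
        for p, targets in out.items()
    }

# The slide's scenario as interaction events (illustrative):
events = [
    ("responder", "catalyst"),  # Ms Responder responds to Mr Catalyst's post
    ("engager", "catalyst"),    # Mr Engager replies, mentioning Ms Responder
    ("engager", "responder"),   # ...the mention itself
    ("catalyst", "engager"),    # Mr Catalyst responds in kind
    ("responder", "engager"),   # Ms Responder likes Mr Engager's mention
]
```

Running `two_way_connections(events)` shows Mr Engager with two two-way connections, confirming his position as the broker between the other two.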

SLIDE 7

Finally, I want to show you how the two-way connections score varies amongst the 36 participants to date. Typically, we would look to build the largest and most cohesive Yammer network possible, though we accept this might not always be the case. While the data shows that the top four most cohesive networks were relatively small, there are also three organisations with quite large networks and quite respectable two-way connections scores.

So there is definitely something to be learnt here between the participants.

SLIDE 8

So, in summing up: as of September we have 36 participants in our benchmark, and the number is growing rapidly. The two-way connections measure, which is arguably the most important predictor of collaborative performance, was also the most varied amongst the participants.

By looking at the patterns between the measures we can start to see emerging patterns. We hope to explore these patterns in more detail with our research partners in the coming year.

Finally, we show that network size should not be seen as a constraint on building a more cohesive network. We have reported previously that another common measure, network activity level, is also unreliable for predicting collaboration performance.

SLIDE 9

In the next video blog we will be looking at Yammer groups in more detail. We are aware that for many organisations, it’s the Yammer groups that form the heart of the network, so it makes sense to take a deeper dive into looking at them.

Thank you for your attention, and we look forward to seeing you next time.

Q&A: Start-ups vs Large Corporates

start-up-versus-corporate

SWOOP Analytics celebrated its second birthday late last month, bringing together our distributed workforce face to face, many for the first time, along with many of our early-adopter partners and clients. Unlike most start-ups addressing the consumer market, SWOOP Analytics targets the ‘big end of town’, i.e. large corporates and public institutions whose procurement practices go far beyond someone simply pushing the ‘buy’ button. We have been fortunate to have several highly experienced executives and consultants advising us on our product start-up journey, so we took advantage of their presence to conduct a mock Q&A panel session, modelled on the ABC show Q&A. We chose our panel members based on their experience working with and advising both start-ups and large corporations. Our panel topic was “How can Start-ups work Effectively with Large Corporates?”

Here were our selected panel members:

Dr. Eileen Doyle

Eileen is an experienced executive and company director for big end of town companies like BHP, OneSteel, Boral, GPT, Port Waratah Services, Oil Search and the CSIRO. We also identified Eileen as one of the most connected female company directors on the ASX in our ASX networking studies. But most importantly she is also an Angel investor in Swoop and a former chair of Hunter Angels, so she was well qualified to join our panel.

Ross Dawson

Ross is recognized as one of the world’s leading futurists. He is regularly engaged for keynote speeches and consulting advice by ‘big end of town’ clients like Macquarie Bank, Ernst & Young, Procter & Gamble, News Ltd and many more, on what is coming ‘down the pipeline of future technologies’. A long-term friend of the Swoop founders, Ross is an entrepreneur himself, with several start-up initiatives on the go.

Allan Ryan

Allan is the founding director of the Hargraves Institute, which celebrates its 10th birthday this year as a leading community for major corporations focusing on innovation. Many of Australia’s leading organisations have shared their innovation experiences and practices in the Hargraves community, and Allan has had a front-row seat observing how large and complex organisations address the innovation challenge.

swoop-panelists

The panel was actively ‘grilled’ by an enthusiastic audience and, to its credit, responded in good spirit. Here are some nuggets of wisdom shared by our panel:

  1. How can big corporations work more effectively with start-ups?

Eileen shared that the mindset is different in a large corporate, where you have to look at risk in a different way. The balance between risk and reward tilts toward risk in a large corporate and toward reward in a start-up, which is why the majority of start-ups fail. Interaction between the two works well when the large corporate has a genuine need that aligns with what the start-up is doing. Her advice: investors will not be rewarded if corporations don’t take risks; it’s OK to fail, and that is something we need to learn to celebrate.

Ross shared that it’s key for big corporations to set up mechanisms to deal with start-ups, like accelerators, incubators and hackathons. There need to be more structures and governance to support transformation. As a futurist, he helps people think about the future so they can make better decisions today that will make a difference in the future.

From his work at the Hargraves Institute, Allan shared that large organisations are maturing rapidly. His advice to start-ups was to find the most mature area with a need for your service and offer a solution they can test and try without taking on great risk.

  2. Quality versus innovation?

An audience member asked about the importance of IT security for start-ups, and another observed that it can be boring to get the basics right; how crucial is this for successful innovation? Panellists shared:

  • Start-ups need to get their disaster recovery and IT security right, at least to the level of the organisation they’re engaging with.
  • Start-up products need to have their quality right and be tried and tested. Quality is more important than innovation where there are winners and losers.
  • Start-ups need to adopt a philosophy of forever getting better at the basics and making sure they’re improving.

  3. Can Australia become the Silicon Valley of the Southern Hemisphere?

For Australia to further foster the success of start-ups, panellists’ suggestions included:

  • Linking the quality of Australia’s research to effective commercialisation on a global scale
  • Promoting innovation as ‘invention accepted by the market’, with private and public businesses spending more in this area.
  • The Government providing tax breaks and recognition of greater risk.
  • Universities taking a ‘what’s best for the whole country’ mindset rather than following what individual academics might want to do.
  • Encouraging small businesses to be more innovative and teaching kids how to have fun doing new things.

Our takeaway message was that large corporates have multiple entry points, so it’s important not to get discouraged, and to keep looking for the people whose roles carry a larger risk profile.

Image citation: https://www.tnooz.com/article/startup-chic-vs-corporate-geek-can-gen-y-retention-predict-success/

Tyranny of the ‘Long Tail’

Longtail

The advent of Internet-enabled e-commerce brought an increased focus on ‘Long Tail’ distributions. Internet organisations like Amazon are able to exploit their low marginal costs by selling low volumes to the Long Tail of buyers with unique, non-mainstream needs. The Long Tail has therefore been celebrated as the new opportunity of the Internet age. Even knowledge-sharing systems, e.g. blogs, podcasts and video, have celebrated the increased reach that the Internet facilitates. The ubiquitous 90/9/1 rule acknowledges that 90% of participants are simply consumers of content.
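As a toy illustration of the 90/9/1 rule (our own sketch, not drawn from any cited dataset, with an assumed posting threshold for ‘heavy’ contributors), per-member posting counts can be bucketed into lurkers, occasional contributors and heavy contributors:

```python
def participation_split(post_counts, heavy_threshold=10):
    """Classify members per the 90/9/1 heuristic: lurkers (no posts),
    occasional contributors, and heavy contributors (>= threshold posts).
    Returns the percentage of members in each bucket."""
    n = len(post_counts)
    lurkers = sum(1 for c in post_counts if c == 0)
    heavy = sum(1 for c in post_counts if c >= heavy_threshold)
    occasional = n - lurkers - heavy
    return tuple(round(100 * x / n, 1) for x in (lurkers, occasional, heavy))

# A community of 100: 90 lurkers, 9 occasional posters, 1 heavy poster
counts = [0] * 90 + [2] * 9 + [50]
print(participation_split(counts))  # → (90.0, 9.0, 1.0)
```

The threshold of 10 posts is arbitrary; the point is simply that a 90/9/1-shaped community will show up starkly in any such bucketing.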

The Pervasive “Long Tail” Distribution

Our own work with communities and social networks confirms the Long Tail effect. Our benchmarking of ‘Key Players’ in ‘off-line’ social networks identified that large personal networks are confined to a select few. Our Key Player index measures how concentrated the core of the network is: the percentage of participants that account for 50% of all connections. For off-line communities we found that the Key Player index is typically between 11% and 32%. However, when we applied this measure to online Enterprise Social Networks (ESNs), the range dropped to between 4% and 12%, meaning as few as 4% of community members are responsible for 50% of all connections, accentuating how online communities amplify Long Tail effects.

To further demonstrate how pervasive this Long Tail distribution is, in an earlier post we showed how the social cohesion within Yammer groups in one enterprise followed the long tail power curve distribution. In a follow-up analysis we dug deeper into the group we identified as the most cohesive, to better understand what was happening inside. And what we found was another Long Tail distribution. Of the 243 staff who had been active in this group over the 18 months since its launch, 70 had only a single interaction, while 12 members (5%) were responsible for nearly 68% of all interactions. So even in what are perceived to be the ‘best’ community groups, most of the connecting is done by only a select few.
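The Key Player index described above can be sketched in a few lines. This is an illustrative calculation only, not SWOOP’s production implementation, and the sample connection counts are invented:

```python
def key_player_index(connection_counts):
    """Percentage of participants who, counting from the most-connected
    member down, together account for 50% of all connections.
    `connection_counts` holds one connection total per member."""
    total = sum(connection_counts)
    running, members = 0, 0
    for count in sorted(connection_counts, reverse=True):
        running += count
        members += 1
        if running >= total / 2:
            break
    return 100.0 * members / len(connection_counts)

# A steep long-tail distribution: a few hubs hold most connections,
# so only 2 of 10 members (20%) cover half of all connections.
counts = [40, 30, 20, 5, 5, 3, 3, 2, 1, 1]
print(round(key_player_index(counts), 1))  # → 20.0
```

A lower index means a more concentrated core: the 4%–12% range we observed in ESNs corresponds to connection counts far more skewed than this small example.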

Knowledge Sharing is not Enough?

Here is the issue: just because more people are exposed to new information and knowledge, can we assume that new enterprise value is being generated? Perhaps for those organisations that measure their success through increased readership this is fine. But I would argue that increased readership, if it doesn’t result in increased actions, is a shallow benefit at best. We experienced the same issues with Knowledge Management (KM) in the 1990s. In those days KM solutions were largely content-centric, and it was common to celebrate shared content. Those of us at the centre of KM programmes of the day were, however, continuously challenged by our executives to demonstrate real value. I can still recall our CEO addressing the knowledge team by saying “I’m not interested in awards or newspaper articles about how great our KM programme is. What I want to see is real, on the ground, impacts”. While we could see a real change in the level of knowledge sharing that was happening, evidence of real impact was limited to selected anecdotes and one-off case studies. As impressive as some of these were, they were far from representative of a sustainable enterprise-wide change. Interestingly, this is where many Enterprise Social Network community managers find themselves today.

Engaging the ESN “Long Tail”

It appears that we cannot escape the ESN “Long Tail”, so what can we do to engage it in more active collaboration? We will address this more comprehensively in future posts, but suffice it to say that simply appreciating the extent of its existence, and then creating some targeted interventions, is a good start. Taking a leaf out of Amazon’s playbook, we need to accept that the needs of the “Tail” are not the same as those of the core; their needs will likely be more diverse and unique. It is therefore incumbent on community leaders to ensure there is sufficient richness and diversity in the conversations they seed, to attract greater participation from the ‘Tail’.

Image citation: http://www.longtail.com/about.html