Data-Driven Collaboration Part 1: How Rich Data Can Improve Your Communication

Originally published on Carpool.

This is the first of a series, coauthored by Laurence Lock Lee of Swoop Analytics and Chris Slemp of Carpool Agency, in which we will explain how you can use rich, people-focused data to enhance communication, increase collaboration, and develop a more efficient and productive workforce.

It’s safe to say that every enterprise hungers for new and better ways of working. It’s even safer to say that the path to those new and better ways is often a struggle.

Many who struggle do so because they are starting from a weak foundation. Some are simply following trends. Others believe they should adopt a new tool or capability simply because it was bundled with another service. Then there are those organizations that focus primarily on “reining in” non-compliant behaviors or tools.

But there’s a way to be innovative and compliant that also improves your adoption: focus instead on the business value of working in new ways—be data-driven. When you incorporate information about your usage patterns to set your goals, you are better positioned to track the value of your efforts and drive the behavior changes that will help you achieve your business objectives.

Market research is assumed to be critical when marketing to customers, yet investments in internal audience research have gained far less traction, even though they yield the same kinds of return. Data-driven internal communication planning starts at the very beginning of your project.

Here we will demonstrate—using real-world examples—how Carpool and Swoop use data to create better communications environments, nurture those environments, and make iterative improvements to ensure enterprises are always working to their full potential.

Use Data to Identify Your Actual Pain Points

One team Carpool worked with was focused on partnering with customers and consultants to create innovations. They thought they needed a more effective intranet site that would sell their value to internal partners. However, a round of interviews with key stakeholders and end-of-line consumers revealed that a better site wasn’t going to address the core challenge: There were too many places to go for information and each source seemed to tell a slightly different story. We worked with the client to consolidate communications channels and implemented a more manageable content strategy that focused on informal discussion and formal announcements from trusted sources.

In the end, the research we conducted allowed us to identify the client’s real pain point and help them address it.

Use Data to Identify New Opportunities

Data can drive even the earliest strategy conversations. In Carpool’s first meeting with a global retail operation, they explained that they wanted to create a new Yammer network as they were trying to curb activity in another, unapproved network. Not only did we agree, but we brought data to that conversation that illustrated the exact size and shape of their compliance situation and the nature of the collaboration that was already happening. This set the tone for a project that is now laser-focused on demonstrating business value and not just bringing their network into compliance.

Use Data to Identify and Enhance Your Strengths

In-depth interviews can be added to the objective data coming from your service usage. Interviews reveal the most important and effective channels, and the responses can be mapped visually to highlight where a communication ecosystem has broadcasters without observers, or groups of catalysts who are sharing knowledge without building any broader consensus or inclusion.

Below is one of the chord diagrams Carpool uses to map the interview data we gather. We can filter the information to focus on specific channels and tools, then break it down further to pinpoint weaknesses, strengths, gaps, and opportunities in the flow of information.

CHORD CHART
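To make this concrete, here is a minimal sketch of the kind of aggregation that sits behind such a chord diagram, assuming interview responses have already been reduced to simple (source, audience, mention count) records. The channel names and counts below are entirely illustrative, not client data; the output matrix is what a chord-chart renderer would consume.

```python
# Hypothetical interview responses: (information source, audience that relies on it,
# number of interviewees who mentioned that flow). Channel names and counts are made up.
responses = [
    ("Email newsletter", "Field consultants", 12),
    ("Yammer group",     "Field consultants", 7),
    ("Intranet site",    "Account managers",  9),
    ("Team meetings",    "Account managers",  4),
    ("Yammer group",     "Engineering",       11),
]

# Build the square flow matrix that a chord diagram is rendered from:
# rows are sources, columns are audiences, cells are total mentions.
labels = sorted({name for source, audience, _ in responses for name in (source, audience)})
index = {name: i for i, name in enumerate(labels)}
matrix = [[0] * len(labels) for _ in labels]
for source, audience, weight in responses:
    matrix[index[source]][index[audience]] += weight

# Print the matrix; any chord-chart renderer (D3, Plotly, etc.) can take it from here.
print("\t" + "\t".join(labels))
for name in labels:
    print(name + "\t" + "\t".join(str(v) for v in matrix[index[name]]))
```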

Turning Data Into Action

These kinds of diagnostic exercises can reveal baselines and specific strategies that can be employed with leaders of the project or the organization.

One of the first activities organizations undertake when implementing an Enterprise Social Networking (ESN) platform is to encourage staff to form collaborative groups and then move their collaboration online. This is the first real signal of ‘shop floor empowerment’, where staff are free to form groups and collaborate as they see fit, without the oversight of their line management. As these groups form, the inevitable ‘long tail’ effect kicks in: the vast majority of groups fall into disuse, in contrast to a much smaller number that are wildly successful and achieve all of the expectations set for the ESN.

So how can organizations increase their win/loss ratio? At Swoop Analytics we have started to look at some of the ‘start-up’ patterns of the Yammer installations of our benchmarking partners. These patterns can emerge after as little as 6 months of operation.

Below, we show a typical first 6 months’ network performance chart, which measures group performance on the dimensions of Diversity (group size), Cohesion (mean two-way relationships formed), and Activity (postings, replies, likes, etc.). We then overlay the chart with ‘goal state’ regions reflecting the common group types typically found in ESN implementations. The regions reflect the anticipated networking patterns for a well-performing group of the given type. If a group’s networking pattern places it inside the goal-state region for its stated purpose, we would suggest it is well positioned to deliver tangible business benefits aligned with that purpose. If it sits outside the goal state, the framework provides implicit guidance as to what has to happen to move it there.

BUBBLE GRAPH

At launch, all groups start in the bottom left-hand corner. As you can see, a select few have ‘exploded out of the blocks’, while the majority are still struggling to make an impact. The 6-month benchmark gives group leaders an early opportunity to assess their group against its peers, learn from each other, and then begin to accelerate their own performance.
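For readers who want to experiment with a similar benchmark on their own data, here is a minimal sketch of how the three dimensions might be computed from raw activity events. The event format, field names, and values are assumptions made purely for illustration; they are not Swoop’s actual data model or scoring method.

```python
from collections import defaultdict

# Hypothetical six months of ESN activity: (group, actor, event type, target actor or None).
events = [
    ("Sales Community", "ann", "post",  None),
    ("Sales Community", "bob", "reply", "ann"),
    ("Sales Community", "ann", "like",  "bob"),
    ("Sales Community", "cat", "reply", "ann"),
    ("IT Help",         "dan", "post",  None),
]

def group_metrics(events):
    members = defaultdict(set)       # group -> people seen interacting
    activity = defaultdict(int)      # group -> total events
    directed = defaultdict(set)      # group -> set of (from, to) interactions
    for group, actor, _event, target in events:
        members[group].add(actor)
        activity[group] += 1
        if target and target != actor:
            members[group].add(target)
            directed[group].add((actor, target))

    results = {}
    for group, people in members.items():
        # Cohesion: reciprocated (two-way) relationships, averaged over group size.
        pairs = {frozenset(edge) for edge in directed[group]}
        two_way = sum(
            1 for pair in pairs
            for a, b in [tuple(pair)]
            if (a, b) in directed[group] and (b, a) in directed[group]
        )
        results[group] = {
            "diversity_group_size": len(people),
            "cohesion_two_way_per_member": two_way / len(people),
            "activity_events": activity[group],
        }
    return results

for group, metrics in group_metrics(events).items():
    print(group, metrics)
```

Plotting diversity against cohesion, with activity as the bubble size, reproduces the general shape of the chart above.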

Painting the Big Picture

The convergence of multiple data sources paints a holistic picture of communication and collaboration that extends beyond team boundaries. This new picture extends across platforms and prescribes the design for an ecosystem that meets user and business needs, aligns with industry trends, and is informed by actual usage patterns.

ECOSYSTEM DESIGN

The discussion about the ROI of adopting new ways of working, such as ESNs, hasn’t disappeared. While we believe it’s a waste of resources to try measuring a return from new technologies that have already been proven, it’s clear that developing business metrics and holding these projects accountable to them is just as critical as any effort to increase productivity.

The nature of these metrics also needs to shift from a focus on “counts and amounts” to measures of a higher order that tie more closely to business value. For example, knowing that posting activity has risen by 25% in a year may make you feel a little better about your investment in a collaboration platform. Knowing that there is a higher ratio of people engaging vs. those who are simply consuming is much better. Showing a strong correlation in departments that have higher percentages of engaged users with lower attrition rates … that’s gold.
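As a sketch of how such a higher-order measure might be checked, the following assumes you already have per-department counts of engaged users (people who post, reply, or like), read-only consumers, and annual attrition rates. The figures are invented purely to illustrate the calculation.

```python
# Hypothetical per-department figures: engaged users actively post, reply, or like;
# consumers only read. Attrition is the annual leaver rate.
departments = {
    "Sales":       {"engaged": 120, "consumers": 180, "attrition": 0.14},
    "Engineering": {"engaged": 200, "consumers": 150, "attrition": 0.09},
    "Finance":     {"engaged": 40,  "consumers": 210, "attrition": 0.19},
    "Marketing":   {"engaged": 90,  "consumers": 110, "attrition": 0.12},
}

# Higher-order metric: share of users who engage rather than merely consume.
engagement = {
    name: d["engaged"] / (d["engaged"] + d["consumers"])
    for name, d in departments.items()
}

def pearson(xs, ys):
    # Plain Pearson correlation, to avoid any external dependencies.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

names = list(departments)
r = pearson([engagement[n] for n in names],
            [departments[n]["attrition"] for n in names])
for n in names:
    print(f"{n}: {engagement[n]:.0%} engaged, {departments[n]['attrition']:.0%} attrition")
print(f"Correlation between engagement share and attrition: {r:+.2f}")
```

A strongly negative correlation in a sample like this is the kind of “gold” signal described above; real data would of course need proper sample sizes and controls before drawing conclusions.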

So now is the time to look at your own organization and wonder: “Do I track how my people are connecting? Do I know how to help them become more engaged and productive? When was the last time I measured the impact of my internal communication ecosystem?”

Then take a moment to imagine the possibilities of what you could do with all of that information.

Stay tuned in the coming weeks for Part 2 and Part 3 when we address the topics of driving engagement by identifying types of enterprise social behavior in individuals, and the results we’ve seen from being data-driven in how we shape internal communications and collaboration.

Social Physics: Oxymoron or Big ‘Social’ Data Tipping Point?

I’ve long been a fan of Sandy Pentland’s work at the MIT Media Lab. Pentland is perhaps best known for his ‘social tags’, used for monitoring individual human interactions to identify those interactions most associated with productive teams. When his new book “Social Physics: How Good Ideas Spread – Lessons from a New Science” coincided with the Easter holiday break, what better way to spend the break than to consume another Pentland tome? The title “Social Physics” is itself notable as Pentland’s attempt to put a name to what he is doing, that is, identifying social interaction patterns from logged activity data. I once had a similar aspiration, based on my early explorations into mapping discussion-log and email data, long before even the Internet and social media/networking were around. My term was “Net Mining”, which I cheekily tried to introduce into Wikipedia, but it was rightly rejected by the Wikipedia moderators as having ‘no recognised history’. Perhaps Pentland, from the loftier heights of the MIT Media Lab, will have more success. The use of the ‘Physics’ term is an attempt to align his activities with a fundamental science, and perhaps, in the future, to inspire a few Pentland-style “Social Physics laws” that underpin our daily lives.

For some, placing the words ‘Social’ and ‘Physics’ side by side is an oxymoron only to be used in the context of sex education (but let’s not go there!). An early critic of Social Physics is none other than Nicholas Carr, who famously stated a decade ago that “IT Doesn’t Matter” and who has now written an article in the MIT Technology Review (to add insult to injury). Referring to Pentland’s claim that social norms, behaviours, and influences can be accurately modelled mathematically, Carr writes that “what he fails to see is that those norms and influences are themselves shaped by history, politics, and economics, not to mention power and prejudice”.

While it’s hard to argue with Carr’s points here, it is fair to say that Pentland’s ‘experimental rig’ is unquestionably the most sophisticated we have seen to date, generating copious data for measuring social interactions. While the social tags were viewed as a little extreme when first introduced, the smartphone has since become a proxy social tag in Pentland’s research, which brings the practicality of his work eerily close. So even if the ‘physics’ analogy is a bit of a stretch, as Carr suggests, perhaps the Pentland crusade could still prove the tipping point for the legions of social network analysis (SNA) academics, who have been analysing social relationship networks with mathematical methods for over 80 years now: the point where mainstream business finally ‘gets it’.

The book essentially builds on his earlier work with social tags and effective teams. Whereas that earlier work identified correlations between team effectiveness and conversational interaction patterns, i.e. the to and fro of dialogue, the equality (or not) of participation, and the body-language attributes identified by the social tags, this book goes a little further to claim a degree of predictive capability. The claim to be able to infer cause and effect, rather than simply observe a correlation, is facilitated by the longitudinal data that the social tags, and now mobile phone apps, can collect. Pentland rightly points out that traditional survey-based collections are mostly ‘single point in time’ (marked as ‘1’ in the graph below) and therefore make it difficult to claim any cause-and-effect relationships from the results.

SP Graph

This graph from the book is notable in identifying where the social physics efforts sit when compared with the more traditional survey-based social science studies (and our own social network analysis surveys, I might add). As much as we could be offended by being placed close to the (0,0) axis point, I still feel there is a big difference in the quality of the intelligence you might gain from an astute survey question versus any number of social-tagging data points. But one can’t argue with the duration-of-observation measure. Surveys are a point in time, and pragmatically the same survey is unlikely to be repeated more frequently than annually at best. My long-term view, therefore, has been that they are complementary instruments, so I am happy to be a Social Physics follower at this point in time. I might add here that Pentland does not rely solely on ‘mined data’, or what he calls ‘reality mining’. He describes a mobile phone app his team has developed to turn the device into a social tag, which includes surveys to capture personal feedback as an adjunct to the logged data, reinforcing the view that the two are complementary.

The predictive models Pentland and his colleagues developed for predicting consumer behaviours, influencing behaviours, or citizen behaviours, as identified in the book, are mathematically as sophisticated as traditional physics and therefore mostly out of reach of 99% of the population. But this sophistication is a necessary requirement that is often understated by ‘Big Data’ proponents. To come up with the conclusions that Pentland does means that terabytes of data have to be filtered and analysed. This is a non-trivial task, and it is prone to the same erroneous conclusions that poor physics or economics can produce if not undertaken with care. Just one slight criticism here: while Pentland starts to identify some of the mathematics behind his predictive models, the book gives scant recognition to the decades of quantitative social network analysis undertaken by researchers around the world, with perhaps the SNA ‘bible’ by Wasserman and Faust and the UCINET software being the best representatives of its application.

As I worked through the chapters I became more excited by how closely Pentland’s key findings, from his sophisticated big-data modelling, reinforced our own more speculative assumptions, gained from our more traditional survey-driven social network analyses. The most prominent finding relates to idea flows travelling through an exploration phase, where diversity of input is a key success indicator, followed by an ‘engagement’ phase, where more intense interactions are required to turn ideas into actions. These findings reinforce the concepts we introduced in our white paper ‘The Three E’s of Innovation’ (Explore, Engage, Exploit), which was designed for application within traditional organisations, where specific departments and individual roles might take on one or more of these modes as part of their organisational mandate. Pentland’s Explore/Engage framing is more generic, though he does acknowledge the organisational context in his work on identifying what makes great teams.

In the book he makes the same plea as many of us working in the social business world: put the formal organisational hierarchy aside and focus on the interaction patterns. He suggests that one way of achieving this is to make those interaction patterns visible to the individuals themselves, in the hope that, by viewing their own patterns, they will be motivated to adjust their own behaviours, in contrast to being coached or instructed from above. We are comforted by this suggestion, as it again reinforces some of our own experiences in using network visualisations to influence behaviour change. For example, we collaborated with the University of Technology, Sydney in visualising project interdependencies to influence project manager decision-making. More recently we have been using our interactive web maps as a vehicle for prospective leaders to ‘discover’ their own better networking strategies.

For us, Pentland has added the science, which reinforces our own less data-intensive findings and field experiences, and therefore adds to our confidence in pursuing our networked approach to innovation.

The third reinforcement comes from Pentland’s findings on the activity around ideas (as assessed by the social tag information). The diversity of participants in these more open conversations was shown to characterise the exploration phase, while denser interaction patterns were shown to characterise the engagement phase. We had conducted some preliminary research on data collected from a Spigit implementation. Spigit is an ideas platform that uses an innovative gaming engine to attract participation in a more open and transparent ideas marketplace; ideas platforms largely reflect Pentland’s exploration phase. In our research exercise we had data on ideas that had actually been accepted for implementation by management, and could therefore look for interaction patterns on the Spigit conversation platform that correlated with accepted ideas as opposed to rejected ones. Our tentative finding was that the key predictors were the level of conversation activity around an idea and the diversity of participants in the conversation. Happily, Pentland finds exactly the same thing from his more copious data and undoubtedly more sophisticated analyses. Again, we can now move forward with this proposition with much more confidence. Thank you, Sandy!
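For illustration, here is a minimal sketch of that kind of check, assuming each idea has been reduced to a comment count, a count of distinct departments participating, and the accept/reject decision. The data is invented, and the comparison of group averages is a deliberately simple stand-in for the analysis described above.

```python
# Hypothetical per-idea data from an ideas platform: comment count, number of distinct
# departments commenting, and whether management accepted the idea. Values are illustrative.
ideas = [
    {"comments": 34, "departments": 6, "accepted": True},
    {"comments": 21, "departments": 5, "accepted": True},
    {"comments": 8,  "departments": 2, "accepted": False},
    {"comments": 5,  "departments": 1, "accepted": False},
    {"comments": 27, "departments": 4, "accepted": True},
    {"comments": 11, "departments": 2, "accepted": False},
]

def mean(values):
    return sum(values) / len(values)

# Compare accepted vs. rejected ideas on the two candidate predictors:
# conversation activity (comments) and participant diversity (departments).
for field in ("comments", "departments"):
    accepted = [i[field] for i in ideas if i["accepted"]]
    rejected = [i[field] for i in ideas if not i["accepted"]]
    print(f"{field}: accepted ideas average {mean(accepted):.1f}, "
          f"rejected ideas average {mean(rejected):.1f}")
```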

Finally, the book addresses the use of social physics in planning cities. Geo-tagging is now a common feature of smartphones, so it becomes a relatively simple task to visualise social interactions in a geographic layout as an alternative to the common force-directed layout for social networks. Thanks to Google Maps, it is now possible to overlay social network connections onto geographic maps. “Location” is a common attribute used to characterise the actors in a social network when identifying the level of collaboration (or otherwise) between, say, geographic regions. We have reported elsewhere how important co-location can be for effective knowledge sharing. We have also recently been experimenting with geo-mapped visualisations of social networks, with the example below taken from a social network survey of a consulting organisation:

SP Geomap

So, Oxymoron or Tipping Point? Personally, I have seen many terms accused of being oxymorons survive and go on to a healthy existence, ‘knowledge management’ being one notable example. We acknowledge that many an oxymoron is more a statement of aspiration than a legal claim. We can live with ‘healthy tans’ and ‘business ethics’, so why not Social Physics? As for the use of big ‘social’ data for gaining new performance insights, no doubt there are many HR professionals with their carefully crafted HR information systems shaking in their boots at the thought of a bunch of data scientists, mining data outside the formal HRIS, making them redundant. For us, the bigger question invokes the other physics analogy: the contrast between Newtonian thinking and quantum thinking.

Newtonian vs quantum

Is it enough to break mainstream business’s penchant for reductionist thinking, for breaking businesses down into individual processes and/or people, in favour of looking at networks as a whole?

We certainly hope so!