Why privacy-first social data analytics matters
In an era where each of us generates ever more online data about our lives, how that data is used concerns us all. Edward Snowden alerted us to the extent to which the State has access to anything and everything about our online lives.
Despite people’s misgivings about governments spying on them, they seem even more concerned about how commercial organisations share and use the data created about them on social sites.
The challenge isn’t with the social sites themselves collecting and sharing the data, but with how such data is harvested and processed by firms using the massive data aggregation and data insight services now available.
Aggregated data makes us increasingly identifiable online
Sites like Facebook, Twitter, and LinkedIn not only have our primary data — which we input and which we “know” — but also a huge amount of secondary data which is generated as a consequence of the existence and lifecycle of our primary data.
For example, Facebook uses the primary and the secondary data to derive an idea of “why” you liked certain things, so that it can serve you more content that you like, and more ads relevant to your needs. This is an example of “human data” created by, and because of, your social media activities. Facebook also sells a lot of this data to third parties — in fact people on Facebook generate 500x more data each day than the New York Stock Exchange. Although Facebook imposes strict terms of service on those accessing its data, there is still a danger that firms using the data do not comply with the precise terms of service.
And this is only one example. Streams of largely unstructured data from all social media sites are being aggregated and analysed on a massive scale — think of it streaming in from Amazon, Twitter, Linkedin, Google, Pinterest, Tumblr, Bitly, Fitbit etc. Firms using this aggregated data know you better than your friends — and they want you to know and trust them, and to buy from them.
The imperative of ‘Privacy By Design’
Most people feel uneasy about being personally identified and linked to every action they take online, yet this ability is technically within reach.
Preventing abuse requires a cultural and moral commitment by organisations using Human Data analytics: first, embed “Privacy by Design” into their culture, strategy, and practices; then implement a technical strategy that makes it an essential component of every system that processes Human Data.
You may be surprised to know there are established techniques to de-tune social (and other) data. These techniques aim to anonymise the data while retaining its value as a behaviour-prediction tool.
They include obscuring personal identity data by encrypting or hashing the fields of records; generalising the data by assigning it to categories rather than retaining exact details; aggregating certain sets of records so that no individual is statistically significant; and even adding noise records and fields to the data set. These are some of the technical strategies that privacy-led Human Data analytics requires.
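As an illustration only, here is a minimal Python sketch of two of these techniques: pseudonymising a direct identifier and generalising exact values into categories, plus padding the data set with noise records. All field names and the salt are hypothetical, and a production system would need proper key management and a formal anonymity model rather than this toy version.

```python
import hashlib
import random

def anonymise_record(record, salt="example-salt"):
    """De-tune one user record while keeping its behavioural value.

    - Pseudonymise the direct identifier with a salted hash,
      so it cannot simply be read back.
    - Generalise the exact age into a coarse 10-year band.
    - Keep the interests field, which carries the predictive signal.
    """
    hashed_id = hashlib.sha256((salt + record["user_id"]).encode()).hexdigest()
    decade = (record["age"] // 10) * 10
    return {
        "user_id": hashed_id,
        "age_band": f"{decade}-{decade + 9}",
        "interests": record["interests"],
    }

def add_noise_records(records, n_noise, interests_pool, rng=None):
    """Pad the data set with synthetic decoy records so that real
    individuals are harder to single out. (In a real system the decoys
    would be indistinguishable from genuine records; the 'noise-' prefix
    here is only to make the sketch easy to follow.)"""
    rng = rng or random.Random(0)
    noise = []
    for _ in range(n_noise):
        noise.append({
            "user_id": f"noise-{rng.randrange(10**9):09d}",
            "age_band": rng.choice(["20-29", "30-39", "40-49"]),
            "interests": rng.sample(interests_pool, 2),
        })
    return records + noise
```

Applied to a record like `{"user_id": "alice@example.com", "age": 34, "interests": ["cycling"]}`, this yields an opaque 64-character identifier and the band "30-39" — enough for behavioural segmentation, but no longer a readable personal identity.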
You might also be surprised to know that firms using data taken from social providers often do not fully understand the service provider’s terms of service with respect to the distribution and re-use of that data. Social providers like Facebook, Twitter and LinkedIn are on the leading edge of personal privacy protection, and are regularly updating and refining their terms of service. In particular Facebook now has very privacy-focused terms of service to help users control their information.
Three steps for users of Human Data
Where are we going with all this?
Well the bottom line is this: if you are a user of Human Data analytics to enhance your own business’s performance then you need to:
(1) ensure you have established a “Privacy By Design” organisational environment,
(2) go back to your provider of Human Data and reassure yourself that they are privacy-focused, and
(3) ask them to provide assurance that they explicitly respect the terms of service for each and every social network.
Companies are still often tempted to “fly under the radar” of the terms of service in order to extract “competitive” data and insights. However social networks are actively responding to prevent misuse of their members’ information. If trust is lost by end-user companies abusing the terms of service then those terms are certain to become more restrictive, thus limiting the potential insights and benefits available to third parties.
Therefore every company involved in Human Data analytics needs to have active governance and audit policies that at least reflect the best practice of any social provider in their dataset.
They should establish a privacy-led social analytics practice, and embed “Privacy By Design” into their culture and strategy. That way, trust is maintained and social data will continue to be made available for driving new insights to improve the customer experience.
Award-winning Australian recruitment agency, Firebrand Talent, ignites the careers of digital, marketing, creative, communications, advertising, & media talent. If you are looking for your next career move, check out the jobs we currently have on in Sydney & Melbourne.