I'm lucky to be based in rural Dorset. I'm probably best described as a digital strategist. I explore digital and support businesses all over the UK in using it to their and their customers' advantage.

The real algorithm that Cambridge Analytica used and our part in it.

All right-minded people would agree that gathering data through seemingly innocent and simple questions and interfaces, then analysing and interpreting that data to build a view of behaviour and wants, is fine - but that using that data to inform and drive unrelated, unlinked parts of people's lives is completely unacceptable.

But I wonder whether, up until the point that they ‘removed’ the data from Facebook, all Cambridge Analytica did was exploit human behaviour - you know, our own inherent laziness and weakness?

Let’s start by trying to understand whether the majority of people really care if what they see as ‘harmless’ personal data is used for various unrelated news, opinion and content creation and curation. If I were to ask anyone in the street to tell me their favourite colour, animal, country and food, and then claimed that with those answers I could influence every political decision they make over the next six months, they would basically laugh in my face.

I mean - it’s completely and utterly absurd; an individual’s political beliefs are just that - there is no way such data could influence any final decision an individual made, right? So why would an individual care if companies try to use harmless data for other means - they are not going to fall for that.

But that’s the accusation aimed at Cambridge Analytica and Facebook.

How about adding to the harmlessness of ‘soft’ data the inherent human condition of laziness?

Sure, there are a great many people for whom the complexities of the issues facing society politically, economically and culturally are understood and of great concern. Many of these people are well informed and take a deep interest in these subjects - going out of their way to educate themselves to make sure that they have visibility of the full picture.

But a huge majority of people do not spend much time working through detail - life gets in the way - and this is where you could actually argue that Cambridge Analytica were being smart: playing on the fact that they knew the vast majority of people were lazy and didn’t like reading long-winded newspaper articles. They then identified Facebook as the ideal place for people to see ‘news’ in a very passive way.

They know that social media in general doesn’t just play on people’s desire to be social but, in many cases, on their laziness and inertia - after all, it is an ever-scrolling world of indifference. They know that this plays straight into the hands of those able to gain access to deep levels of personal information and constantly refine messages until they hit a nerve - like Cambridge Analytica.

This isn’t insight gained through analytical excellence - it’s a very simple understanding of human behaviour and laziness - which you can gain by talking to anyone over a drink. 

So we have the following inputs into an algorithm: harmless data and human laziness.

Now we add groupthink laziness. Just as the Remain campaign were lazy and didn’t challenge the Brexit camp openly enough, so the Democrats really didn’t believe that Trump could win and consequently waged the wrong campaign - or at least didn’t push their messages across in the right way to connect with the less educated or lazy groups (quote from Mr Mike Rundle).

We therefore have the algorithm of:

Asking for and taking harmless personal data.

Applying the fundamentals of human laziness.

Identifying and adding groupthink outputs.

Together, these allow you to identify and define the level of manipulation/content required to influence.
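The ‘algorithm’ above is rhetorical rather than mathematical, but purely for illustration it could be sketched as a toy scoring function. Everything here - the function name, the weights, the sample profile - is a hypothetical assumption of mine, not anything Cambridge Analytica actually ran:

```python
# Purely illustrative sketch - hypothetical names, weights and data throughout.

def influence_level(harmless_data: dict, laziness: float, groupthink: float) -> float:
    """Combine 'harmless' data richness, laziness and groupthink into a
    rough 0-1 score for how susceptible someone is to tailored content."""
    # More 'innocent' answers collected means a richer profile (capped at 1.0).
    data_richness = min(len(harmless_data) / 10, 1.0)
    # Lazier, more conformist audiences take less effort to influence,
    # so the score rises with all three inputs.
    susceptibility = data_richness * laziness * groupthink
    return round(susceptibility, 2)

# Favourite colour, animal, country and food - the 'harmless' street-survey data.
profile = {"colour": "blue", "animal": "dog", "country": "Italy", "food": "pizza"}
score = influence_level(profile, laziness=0.8, groupthink=0.9)
```

The point of the sketch is only that each ‘harmless’ ingredient multiplies the others - none is dangerous alone, but combined they produce a usable target score.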

An algorithm as old as time, because - however long the circumference - life and civilisation go in circles. The Athenians and the ancient Chinese (specifically the Han dynasty) knew that playing on people’s inertia can create a false sense of dependency and trust, allowing messages and influence to be (depending on individual opinion) manipulated.

Putin knows this, Cambridge Analytica know this, Mark Zuckerberg knows this - and today, although the delivery method has changed, the algorithm is the same as it ever was.
