/ 31 July 2015

Your purse is the Apple of adverts’ eyes

Hey

Last week, Apple filed a patent for a novel type of ad format that will only serve ads to users “for goods and services which particular users can afford”.

Apple describes a system that resides on your phone and, with your permission, has access to the balances of your credit and debit cards. It then sends you only ads for goods you can afford, according to those balances.

Apple explains that an ad “delivered to the user includes only one or more objects having a purchase price less than or equal to the available credit for that user … an advantage of such targeted advertising is that advertisements for goods and services which particular users cannot afford are not delivered to these users”.
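
Mechanically, what the filing describes is just a price filter. The sketch below is a minimal illustration of that rule, not Apple’s implementation; all the names (Ad, available_credit) are my own invention.

```python
# A minimal sketch of the filtering rule the patent describes; not Apple's
# actual implementation. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Ad:
    product: str
    price: float

def affordable_ads(ads: list[Ad], available_credit: float) -> list[Ad]:
    """Return only ads whose purchase price is less than or equal to the
    user's available credit, per the patent's wording."""
    return [ad for ad in ads if ad.price <= available_credit]

inventory = [Ad("budget earbuds", 29.99), Ad("new laptop", 1299.00)]
print(affordable_ads(inventory, available_credit=150.00))
# -> only the earbuds ad is served
```

The filter itself is trivial; the controversy lies entirely in what “available credit” is allowed to stand in for.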

On first read, this approach makes logical sense. Why show me ads for something I have no means to buy? It represents the next step in contextual advertising – serving the right ad to the right person at the right time. Based on my financial data, I would see only what lies within my realm of possibility. This approach is a way station on the modern-day marketer’s itinerary to true direct marketing.

But this approach represents a social disaster. For those of you who are unaware, your available credit is not the same thing as what you can afford. For example, Americans are already saddled with an average of more than $15 000 in credit card debt. Promoting further spending and adding to this burden pushes people deeper into a financial hole they are unable to climb out of.
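
To put a rough number on that hole: assuming a typical 18% APR and the common interest-plus-1%-of-balance minimum payment (both my assumptions, not figures from the filing), clearing that average balance takes decades. A hypothetical illustration:

```python
# Hypothetical illustration: paying only the minimum on $15 000 of credit
# card debt. The 18% APR and minimum-payment formula are assumed, typical
# values, not figures from the article.
balance = 15_000.00
apr = 0.18
months = 0
total_paid = 0.0
while balance > 0 and months < 600:
    interest = balance * apr / 12
    # Common minimum-payment formula: interest plus 1% of the balance,
    # with a $25 floor; never pay more than what is owed.
    payment = max(interest + balance * 0.01, 25.00)
    payment = min(payment, balance + interest)
    balance = balance + interest - payment
    total_paid += payment
    months += 1
print(f"{months} months (~{months / 12:.0f} years), ${total_paid:,.0f} paid in total")
```

Under those assumptions the payoff runs to well over two decades – and every additional “affordable” purchase the ad system coaxes out resets the clock.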

This type of ad system is particularly troubling for the poorest segment of society, which is already susceptible to predatory financial practices. It is a codified, digital extension of the systemic inequality in financial education and access. It promotes the idea that you should max out your credit.

Apple’s patent filing raises the question: Who decides the framework on which our digital lives are based? There is already serious concern around who ultimately “owns” my personal data and how it can be accessed. Now we have an additional worry about how that data is used to reinforce inequalities. If my current context is poor, do contextual computing platforms bind me further to the cycle I’m in? Is a contextual system a never-ending regurgitation of my circumstances and habits?

Lest you think algorithmic inequality is a problem exclusive to Apple, many algorithms have already demonstrated human bias, racism and inequitable behaviour: Google Photos tagging African-Americans as gorillas, auto-complete search results suggesting transgender people are “going to hell”, and ads for higher-paying jobs being shown to men more often than to women.

It is a mistake to assume computers are intrinsically neutral, cold and unbiased. In reality they are dumb: they only do what we programme them to do. Unfortunately, we all come with perspective and bias, and those world views become the de facto standards by which contextual systems operate. This is usually unconscious and unintended on the developer’s part, but it is a real phenomenon.
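
To make that concrete, here is a deliberately innocuous-looking sketch – every name and number in it is invented – of how one developer’s unexamined assumption becomes a system’s de facto standard:

```python
# Hypothetical sketch of how a developer's world view gets frozen into a
# system. Nothing here is labelled "bias", yet the hard-coded default
# quietly decides who is shown which financial products.
def select_offer(available_credit: float) -> str:
    # An arbitrary threshold, chosen once and never revisited: below this
    # line you are treated as "subprime" and shown the costlier product.
    SUBPRIME_CUTOFF = 2_000.00  # invented value, not from any real system
    if available_credit < SUBPRIME_CUTOFF:
        return "payday-style loan ad"
    return "low-interest rewards card ad"

print(select_offer(500.00))    # -> payday-style loan ad
print(select_offer(5_000.00))  # -> low-interest rewards card ad
```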

This problem won’t be eliminated as we move into the era of artificial intelligence either. In fact, it may be exacerbated as we’ll need to train the computer rather than programme it. Who gets to decide how and what data to train the artificial intelligence with? What ethical framework will the system operate from?

We often read about systemic inequality in our societies. But what does it mean when we literally codify inequality into our digital systems? The danger of algorithmic inequality is that it instantly raises systemic inequality to a global scale.

Algorithmic inequality also has the potential to dwarf the issue of the digital divide. Even if you manage to jump across that chasm, you land only in further embedded disadvantage: on the elite side of the digital divide lies an entire host of hidden programmatic snares, unobservable from the surface.

These algorithms, codified behind the scenes, continue to make the digital landscape a perilous journey for the financially disadvantaged, who will have no idea that their contextual experience is completely different to that of others.

Eric Schmidt, chairman of Google, wrote in the Huffington Post recently: “In the next 10 years, we believe that computers will move beyond their current role as our assistants and become our advisers.”

What does it mean when the advice we are given is biased and holds us to patterns that ensure some of us will never be on a level playing field? If you knew, you would reject it and find better advice. But what if you had no idea that the advice you were getting was detrimental to your financial best interests?

Apple’s patent filing is a wake-up call that as we build the age of contextual computing we have to avoid building algorithmic inequality into the foundations. Our real challenge isn’t solving technical algorithms, but understanding social ones.

Benjamin Robbins is a co-founder of tech consultancy Palador