#Australia #MasterCard to mine #Facebook user data #PricelessEngine

…concludes in Perth this week.

Appetite for such targeted offers is on the rise. In its latest Digital Shopper Relevancy survey, released last week, Capgemini found that 54 per cent of online shoppers want personalised shopping offers; 61 per cent want online stores to remember them to speed up shopping; and consumers in developing markets are keen to use social media to research purchases before buying – which is precisely the insight MasterCard is looking to tap with its Priceless Engine.

Facebook Australia sought to clarify the company’s position on Tuesday, saying no information on individual users was sold to advertisers.

“We applaud what Mastercard is doing with their ‘Priceless Engine’ but, to clarify, we are not providing them, or any other company, personal data of Facebook users. We are working with them to create targeting clusters using Custom Audiences — a tool that matches anonymised data from Facebook with their own anonymised data for optimising ad delivery on Facebook to their users,” a company spokeswoman said.
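Facebook's publicly documented approach to this kind of matching works on hashed identifiers rather than raw personal data: each side normalises its contact details and hashes them with SHA-256, and only the digests are compared. A minimal Python sketch of that idea follows — the lists and email addresses here are hypothetical, not anything from MasterCard or Facebook:

```python
import hashlib

def normalise_and_hash(email: str) -> str:
    # Normalise first (trim, lowercase) so both parties derive
    # identical digests from the same underlying address.
    cleaned = email.strip().lower()
    return hashlib.sha256(cleaned.encode("utf-8")).hexdigest()

# Hypothetical customer lists held separately by an advertiser and a platform
advertiser_list = ["Alice@Example.com ", "bob@example.com"]
platform_list = ["alice@example.com", "carol@example.com"]

advertiser_hashes = {normalise_and_hash(e) for e in advertiser_list}
platform_hashes = {normalise_and_hash(e) for e in platform_list}

# Only the hash sets are intersected; raw addresses never change hands.
matched = advertiser_hashes & platform_hashes
print(len(matched))  # 1 — only the shared address matches
```

The design point is that the intersection identifies a targeting cluster without either party disclosing its raw customer list to the other.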

She said Facebook works with many clients in a similar fashion, with compelling results. “What Mastercard is doing is a really smart way to do marketing,” she said.

Australia’s big banks already analyse their own data reserves in order to develop customer-specific banking products and services. The MasterCard arrangement will let them leverage even broader reserves of data in order to drive online payments and take another slice of the online action.

Dr Ian Opperman, chief executive of Sirca, a not-for-profit organisation established by a consortium of Australian and New Zealand universities to conduct financial sector research and innovation, explained why big data is such a hot topic for banks.

Speaking at a big data seminar organised by the Centre for International Finance and Regulation (CIFR) last week, Opperman noted that “everything can be digitised and everything digitised can be used to make money.”

Progress is, however, patchy. Technology analyst Gartner surveyed 302 of its international Research Circle members to find that 73 per cent have invested or plan to invest in big data initiatives by June 2016, up from 64 per cent a year ago, but few have progressed much beyond the pilot stage.

Maria Garcia de la Banda, deputy dean of the IT Faculty at Monash University, which last week held a big data forum attracting speakers from pioneers such as Walmart, IBM and Telstra, said there were three stages in big data application: first, when organisations start collecting data and issuing alerts; second, when they start performing analysis; and finally the “age of discovery”, where organisations use big data to support evidence-based decision making, asset management or testing.

Australia’s big banks, she said, were now at the stage of “data enrichment … putting their data together with social media, with data about geographical location and that gives them an enormous amount of influence.”

Issues still holding back the financial sector’s progress with big data include poor data quality, data hoarding by individual departments, lack of standardisation and limited access to skills, although Professor de la Banda said universities had a wealth of data analytics capability and that from next year Monash would offer a Masters in data science.

The challenge of managing data quality and standards was highlighted by CIFR research fellow Dr Kingsley Jones who said that a survey of one financial registry had uncovered 140 different ways to describe a single corporate entity – making it impossible to properly analyse the data associated with that entity.
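A common remedy for the problem Jones describes is canonicalisation: mapping every variant spelling of an entity to one normalised key before any analysis runs. A rough sketch in Python — the company name and the suffix rules are purely illustrative, not drawn from the registry he surveyed:

```python
import re

def canonicalise(name: str) -> str:
    # Collapse the usual sources of variation: case, punctuation,
    # whitespace, and long-form vs abbreviated legal suffixes.
    n = name.upper()
    n = re.sub(r"[.,]", "", n)
    n = re.sub(r"\s+", " ", n).strip()
    n = re.sub(r"\bPROPRIETARY LIMITED\b", "PTY LTD", n)
    return n

# Four hypothetical spellings of the same corporate entity
variants = [
    "Acme Pty. Ltd.",
    "ACME PTY LTD",
    "Acme  Proprietary Limited",
    "acme pty ltd",
]

canonical = {canonicalise(v) for v in variants}
print(canonical)  # all four collapse to a single canonical form
```

Real registries need far richer rules (and often fuzzy matching), but without some such step the 140 spellings Jones found would be analysed as 140 unrelated entities.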

The funds management sector was singled out for having particularly poor data.

Michael Berg, a senior consultant with Rice Warner, which provides analysis services to the wealth management sector, said its Super Insights study had taken data from 18 funds and 10 million records to help the funds benchmark their performance and identify growth opportunities. That uncovered a swathe of data quality and standards issues: for example, 14 different data formats were used across the 18 funds, all of which needed to be reconciled.
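Reconciling that many export formats typically means trying each known layout in turn until one parses, then emitting a single normalised representation. A hedged sketch of the idea, using date fields as the example — the formats and values here are invented for illustration, not taken from the Rice Warner study:

```python
from datetime import datetime

# Hypothetical: each fund exports member join dates in its own format
FORMATS = ["%d/%m/%Y", "%Y-%m-%d", "%d-%b-%Y"]

def parse_any(raw: str) -> datetime:
    # Try each known format in turn; fail loudly on anything unrecognised
    # so bad records are flagged rather than silently mis-parsed.
    for fmt in FORMATS:
        try:
            return datetime.strptime(raw, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognised date format: {raw!r}")

# The same date as three different funds might export it
rows = ["01/07/2014", "2014-07-01", "01-Jul-2014"]
parsed = {parse_any(r).date().isoformat() for r in rows}
print(parsed)  # all three normalise to the single value '2014-07-01'
```

Benchmarking across funds only becomes possible once every record has been pushed through a normalisation step like this.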

Some of the underlying data quality was also questionable, he said, pointing to superannuation records that were clearly erroneous.

This article has been amended to include Facebook’s comments.

Source: www.smh.com.au

See on Scoop.it: Quand l’assurance apprivoise internet – Ronan de Bellecombe

