The Perfect Storm That Made Machine Learning a Reality for Payments

Guest Bio

Paulo Marques

Paulo is CTO at Feedzai, a data science and machine learning platform for managing payment risk, where he heads product development and technology strategy. A frequent speaker on artificial intelligence, Paulo was, before founding Feedzai, a professor at the University of Coimbra and at Carnegie Mellon University, and served as Director of the CMU|Portugal Professional Master's Program in Software Engineering. He has led projects for the European Space Agency, Microsoft Research, Siemens and SciSys, among others, holds a PhD in distributed systems, has authored more than 40 peer-reviewed papers and has published a book.

Machine learning is to the cognitive revolution what silicon was to the computing revolution.

Like the silicon semiconductor chip, machine learning is launching a revolution in computing that will make life over the next 100 years hardly recognizable compared with the last century. From self-driving cars to medical diagnoses, and from smart cities to personalized classroom education, machine learning stands to revolutionize the way we live, work and pay.

But why now? Although machine learning has existed since the 1950s, only now has the technology become a practical reality for the payments industry, with its potential finally being fulfilled. But to understand what's in store for the future, let's first pause to understand how we got here.

In 1959, artificial intelligence pioneer Arthur Samuel defined machine learning as a "field of study that gives computers the ability to learn without being explicitly programmed." However, it is the key developments from the aughts and beyond that have enabled businesses of any size to tap into machine learning. Here are five key technological trends that, together, created the perfect storm for making machine learning a reality:

1. Affordable parallel computing
In 2004, Google introduced MapReduce, a programming model for processing massive amounts of data using many commodity computers working in parallel. In the decade that followed, Hadoop and other open-source implementations of Google's key technology became widely available and brought down the cost of large-scale computing.
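The model's two phases can be illustrated with a toy, single-machine word count. This is a sketch of the idea only, not Google's or Hadoop's implementation; real MapReduce systems distribute these phases across clusters of machines:

```python
from collections import defaultdict

# Toy illustration of MapReduce: a map phase emits key-value pairs,
# a shuffle groups them by key, and a reduce phase aggregates each
# group. Real frameworks run these phases on many machines at once.

def map_phase(documents):
    # Emit (word, 1) for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    # Group all emitted values by key, as the framework does
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each group; here, sum the counts per word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["machine learning at scale", "learning at scale pays"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)
```

Because each phase operates on independent chunks of data, the same program scales from one laptop to thousands of commodity machines, which is what drove down the cost of parallel computing.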

2. Faster processing
Around 2007, the Graphics Processing Unit (GPU), a single-chip processor initially used to manage and boost the performance of video and graphics, became widely available. Unlike CPUs, which have only a few processing cores, GPUs have hundreds of cores and a highly parallel structure, making them more efficient than general-purpose CPUs for algorithms that process massive blocks of data in parallel. For such workloads, a GPU can run computations up to 100 times faster than a CPU alone, dramatically increasing the speed of computing.

3. Cheaper data storage
Around 2008, a new way to store data, called NoSQL data storage technology, became increasingly available. Older, traditional relational databases were unable to meet the scale, speed and data variability of a modern economy powered by the internet, the cloud, mobile, social networks and big data. By contrast, the newer NoSQL databases solve the scalability and big data performance issues that relational databases weren't designed to address, by being able to store and analyze massive amounts of unstructured data. NoSQL made cheap storage of data at scale widely available.
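The document-store model behind many NoSQL databases can be sketched in plain Python and JSON (no particular database is assumed, and the event fields below are invented for illustration): each record is a self-describing document, so records with different shapes coexist without declaring a fixed schema up front.

```python
import json

# Toy illustration of the schemaless document model used by many
# NoSQL stores: each record carries its own field names, so payment
# events and login events can live side by side in one collection.
events = [
    {"type": "payment", "amount": 42.00, "currency": "USD"},
    {"type": "login", "device": "mobile", "ip": "203.0.113.7"},
    {"type": "payment", "amount": 10.00, "currency": "EUR", "promo": "SPRING"},
]

# Serialize and later query without any column definitions.
stored = [json.dumps(e) for e in events]
payments = [d for d in (json.loads(s) for s in stored) if d["type"] == "payment"]
total = sum(p["amount"] for p in payments)
print(len(payments), total)
```

A relational schema would force every record into the same columns; the document model trades that rigidity for the flexibility and horizontal scalability that big, varied data demands.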

4. Big data in a connected world
More than 90 percent of the world's stored data has been produced within the last few years, more than was created in the entire previous history of the human race, and this data footprint continues to double each year. By the year 2020, our accumulated digital "dataverse" will balloon from around 4.4 zettabytes to more than 44 zettabytes (and keep in mind that one zettabyte equals one trillion gigabytes). Most importantly, these billions and billions of interlinked data points correlate with one another in this interconnected world, meaning that different data streams provide a more unified view of reality. This mesh of data is what powers machine learning systems, supplying ever more computational fuel for computer learning.

5. Better and more accessible algorithms
In the 2000s, new and better machine learning algorithms and techniques became available that made the work of manipulating data much simpler for data scientists. With esoteric names like Random Forests and Deep Learning, these new algorithms are easier to tune, faster to train, and require less dataset clean-up to produce results quickly. They provide the hardcore data science behind computer learning.
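To illustrate how little setup such algorithms need today, here is a minimal Random Forest sketch using scikit-learn (a library not named in the article). The transaction features (amount, hour of day, foreign merchant) and labels are hypothetical, chosen purely to echo the payments setting:

```python
# Minimal sketch: train a Random Forest to separate toy "fraudulent"
# from "legitimate" transactions. All data here is invented.
from sklearn.ensemble import RandomForestClassifier

# Each row: [transaction amount, hour of day, foreign merchant? 0/1]
X = [
    [12.50, 14, 0], [89.99, 10, 0], [45.00, 16, 0], [23.10, 11, 0],
    [999.00, 3, 1], [1500.00, 4, 1], [875.50, 2, 1], [1200.00, 3, 1],
]
y = [0, 0, 0, 0, 1, 1, 1, 1]  # 0 = legitimate, 1 = fraudulent

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X, y)

# Score a new transaction: large amount, 3 a.m., foreign merchant.
print(clf.predict([[1100.00, 3, 1]])[0])
```

The ensemble of decision trees averages away much of the tuning burden that older methods carried, which is exactly the ease-of-use gain described above.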

Promise ahead

Although the machine learning story started more than a half century ago, it is only in this last decade that these key trends have aligned to exponentially accelerate progress. The combination of affordability, speed, increased (and cheap) storage of computing data, fuel for computer learning and the power of data science has forged a bright reality and future for machine learning. What machine learning did for companies such as Google, Amazon and Facebook, it now has the capability to do for the payments industry. The shift in payment ecosystems away from a world that is batched, physical, complex and analog toward one that is real-time, data-rich, simple and digital is underpinned by machine learning.

The statements and opinions of the writer do not necessarily reflect those of TSYS.

