Transcript
Initially, the story about Taylor Swift’s concert or the BBC reporter’s experience in China does not seem to be related to FinTech, right? So where is the connection?
Well, the advances in surveillance we have shared with you are not just about more and better cameras, but really about the facial recognition and identity-analysis software that is growing more capable thanks to advances in artificial intelligence (“AI”) and the technologies that fall under its broad umbrella, like machine learning. If that phrase is vague to you right now, don’t worry; we’re going to get to that soon.
Now, people have been working on facial recognition software and forms of AI for a while. In fact, a trio of early technologists, Charles Bisson, Woody Bledsoe, and Helen Chan, researched how computers could be used for facial recognition as early as the 1960s. So today’s “hot” concepts did not just pop up; rather, increases in computer processing power have allowed the potential of AI to begin to be realized, which has propelled AI into the public discourse, and rightly so.
So what that means is that for those of us participating in this book, you and me, many of the big leaps in FinTech in our lifetimes will be enabled because growing computing power has produced more mature, developed AI. Thus, a major theme of the still-developing FinTech story is the increasing influence and applicability of machine learning and, more broadly, artificial intelligence. This is what we want to explore in this chapter.
To help us get started, let’s consider a few terms, and some buzzwords, so we have the right vocabulary for our discussion.
Now keep in mind that the definitions of many of these terms are not yet uniformly consistent, and even experts may have slightly different approaches or views, but we went with a few definitions that we think are not just comprehensive but also comprehensible even if you’re not a technology expert.
What is Artificial Intelligence?
So what is artificial intelligence or AI? AI is really an umbrella term that encompasses a number of technologies, but before jumping into that let’s start with some history.
Alan Turing, the pioneering English computer scientist and mathematician, and at least one of the grandfathers of AI, first started considering AI concepts even before 1950. His eponymous Turing Test moved beyond the question “Can machines think?” to the more nuanced question “Can a machine imitate a human?” Basically, if a computer and a person were both answering questions that you asked, but you didn’t know which answers came from the human and which from the computer, would you be able to identify the computer from its answers alone, or could the computer trick you into thinking it was a person?
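To make the setup concrete, the imitation game can be sketched in a few lines of code. This is not from the source; it is a minimal toy in which two unlabeled respondents (hypothetical placeholder functions) answer the interrogator’s questions, and the interrogator must judge from the answers alone which respondent is the machine.

```python
import random

def imitation_game(human_answer, machine_answer, questions):
    """Minimal sketch of Turing's imitation game: the interrogator sees two
    unlabeled answer streams ("A" and "B") and must guess which is the machine."""
    # Randomly assign the two respondents to the anonymous slots A and B.
    slots = {"A": human_answer, "B": machine_answer}
    if random.random() < 0.5:
        slots = {"A": machine_answer, "B": human_answer}
    # Collect (question, answer from A, answer from B) for the interrogator.
    return [(q, slots["A"](q), slots["B"](q)) for q in questions]

# Toy respondents, purely illustrative: here the machine happens to answer
# exactly like the human, so the interrogator has nothing to go on.
human = lambda q: "I'd say " + q.lower().rstrip("?")
machine = lambda q: "I'd say " + q.lower().rstrip("?")

transcript = imitation_game(human, machine, ["Do you enjoy poetry?"])
```

The point of the sketch is that the labels are hidden: if the machine’s answers are indistinguishable from the human’s, it “passes” the test in Turing’s sense.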
John McCarthy, long-time Stanford professor and one of the fathers of AI, who is widely credited with coining the term “artificial intelligence,” expanded further. To “Uncle John,” as he was referred to by many students, AI is the “science and engineering of making intelligent machines.”
But what then is intelligence?
Stephen Hawking is widely credited with saying, “Intelligence is the ability to adapt to change.” The increasing capacity of machines to learn and react as new data is presented embodies exactly this process of adapting, which is at the core of Hawking’s view of intelligence.
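That idea of updating as new data arrives can be illustrated with a deliberately simple example, which is not from the source: an estimate (here, a running average of some observed values) that adapts incrementally each time a new observation comes in, rather than being fixed in advance.

```python
class OnlineMean:
    """Toy illustration of 'adapting to change': the estimate is revised
    incrementally as each new observation arrives."""

    def __init__(self):
        self.n = 0        # number of observations seen so far
        self.mean = 0.0   # current estimate

    def update(self, x):
        # Standard incremental-mean update: nudge the estimate toward
        # the new value in proportion to how much data we already have.
        self.n += 1
        self.mean += (x - self.mean) / self.n
        return self.mean

m = OnlineMean()
for price in [100.0, 102.0, 98.0, 104.0]:
    m.update(price)
# m.mean now reflects everything seen so far: (100 + 102 + 98 + 104) / 4 = 101.0
```

Real machine learning systems update far richer models than a single average, but the core loop is the same: observe new data, revise the internal state, respond accordingly.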
Increases in computing power, coupled with the creation, collection, and analysis of ever-growing collections of data, will continue to enhance the capabilities of artificial intelligence.