In this conversation, we chat with Kevin Levitt, who currently leads global business development for the financial services industry at NVIDIA. He focuses on global trends in accelerated compute and AI for consumer finance, including fintech, retail banking, credit cards, and insurance. Prior to joining NVIDIA, Kevin served as Vice President of Business Development at Credit Karma and Vice President of Sales at Roostify.
More specifically, we touch on the role data plays in the financial industry, how the needs of financial institutions have changed, the age of big data, the distinction between artificial intelligence and machine learning, how to train an AI algorithm, the reasoning behind the enormous number of parameters machine learning models consume, the fundamental purpose of AI/ML in financial services, what NVIDIA's platforms comprise, and lastly the future of AI/ML.
Instead, we are going to tap again into a new development in art and neural networks as a metaphor for where AI progress sits today, and what is feasible in the years to come. For our 2019 "initiation" on this topic with foundational concepts, see here. Today, let's talk about OpenAI's CLIP model, which connects natural language inputs with image search and navigation, and generative neural art models like VQGAN.
Compared to GPT-3, which is really good at generating language, CLIP is really good at associating language with images: it is trained on image–caption pairs to place both text and images in a shared embedding space, so it can match an image against arbitrary natural-language categories rather than being limited to a fixed set of labels from a single image data set.
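The matching idea behind CLIP can be sketched in a few lines. The snippet below is a toy illustration, not the real model: the random vectors stand in for the outputs of CLIP's text and image encoders, and `cosine_similarity` is a hypothetical helper. The point is only the mechanism, which is that classification reduces to finding the caption embedding closest to the image embedding.

```python
import numpy as np

def cosine_similarity(a, b):
    # Normalize rows to unit length, then take dot products between every pair.
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

# Toy stand-ins for CLIP's two encoders: in the real model, a text encoder
# and an image encoder each map their input into the same embedding space.
rng = np.random.default_rng(0)
image_embedding = rng.normal(size=(1, 8))   # one "image" embedding
text_embeddings = rng.normal(size=(3, 8))   # three candidate caption embeddings
labels = ["a photo of a dog", "a photo of a cat", "a painting of a ship"]

# Zero-shot classification: pick the caption whose embedding is closest
# to the image embedding in the shared space.
scores = cosine_similarity(image_embedding, text_embeddings)[0]
best = labels[int(np.argmax(scores))]
print(best)
```

Because the candidate captions are just strings encoded at query time, swapping in a new category is as cheap as writing a new sentence, which is what makes this style of model useful for natural-language image search.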