The financial services company has taken a containerized approach to achieve agility and flexibility with its workloads, while exploring the long-term benefits of generative AI.
Discover Financial Services has moved aggressively to the cloud in 2024 with a migration strategy focused on retaining hybrid flexibility and making the most of cloud elasticity.
EVP and CIO Jason Strle, who joined Discover 18 months ago after CIO and CTO roles at Wells Fargo and JPMorgan Chase & Co., has opted to migrate mission-critical workloads using Red Hat OpenShift on AWS. Moving these containerized workloads to AWS offers Discover greater flexibility and agility to handle the spikes and dips of seasonal consumer spending far more efficiently, he says.
Now that much of the migration is complete, the benefits of cloud elasticity have “paid off,” Strle says.
Discover’s implementation is unusual in that it runs its OpenShift platform inside AWS virtual private clouds (VPCs) on AWS’s multi-tenant public cloud infrastructure; with this approach, OpenShift acts as an abstraction layer over the cloud, explains Ed Calusinski, Discover’s VP of enterprise architecture and technology strategy.
For many years, the Riverwoods, Ill.-based finserv hosted workloads on a cloud platform within its own data centers. The OpenShift hybrid approach gives Discover the choice to run workloads on private or public clouds, enabling it to better manage and move workloads across multiple clouds and avoid vendor lock-in.
“More workloads were moved [to the cloud] in the first six months of this year than in all the years before, by far, orders of magnitude more,” Strle says. “Due to the elasticity of the environment, we were able to handle circumstances such as big surges, and that’s very important to us because of the way we do marketing and campaigns and different ways people interact with our rewards. That can lead to very spiky consumer behavior, and we can dynamically grow our capacity on public clouds.”
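Discover hasn’t published the specifics of its scaling setup, but on an OpenShift or Kubernetes platform the elasticity Strle describes is typically handled by autoscalers. The sketch below is a hypothetical illustration rather than Discover’s configuration: it uses the open-source Kubernetes Python client to attach a CPU-based horizontal pod autoscaler to an invented “rewards-api” deployment, so replica counts grow during marketing-driven spikes and shrink afterward. The namespace, deployment name, and thresholds are all assumptions.

```python
# Hypothetical illustration of elasticity for spiky consumer demand: a CPU-based
# horizontal pod autoscaler attached to an invented "rewards-api" deployment.
# Uses the open-source Kubernetes Python client; names and thresholds are made up.
from kubernetes import client, config

config.load_kube_config()  # assumes a kubeconfig with access to the cluster

hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="rewards-api-hpa", namespace="rewards"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="rewards-api"
        ),
        min_replicas=3,    # baseline capacity outside campaign windows
        max_replicas=50,   # ceiling for promotion-driven surges
        target_cpu_utilization_percentage=70,
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="rewards", body=hpa
)
```

In practice, pod-level autoscaling like this is usually paired with cluster- or machine-level autoscaling so the underlying public cloud capacity expands along with the pods, which is what makes the public cloud side of a hybrid setup attractive for spiky workloads.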
The container-based approach also provides Discover with connectivity to on-prem systems and a gateway that allows access to Discover’s core SaaS vendors — ServiceNow and Workday — as well as integration with external vendors, says Strle, who is also considering alternative container-based architectures as cloud options expand.
Banking on hybrid cloud
Discover’s decision to take a container-based approach as early as 2018 reflects the hybrid strategy many consumer financial services firms have adopted to retain maximum control over their workloads. For example, by leveraging OpenShift, Discover and other enterprises can achieve portability across AWS, Microsoft Azure, Google Cloud Platform, and IBM Cloud.
But a container-based approach to cloud computing can introduce complexities and challenges, analysts note. Still, the openness and capabilities outweigh the risks for those using OpenShift on AWS, says Sid Nag, VP of cloud, edge and AI infrastructure at Gartner.
“They’re using AWS for basic compute services but not for upper-layer compute services,” Nag explains. “They want to have the ability to run OpenShift anywhere — on the public cloud, on premise, or in a private cloud and they can move workloads around across different hybrid environments.”
Gartner predicts 90% of enterprises will adopt a hybrid cloud approach through 2027. The research firm notes that one major challenge all enterprises face in deploying generative AI will be data synchronization across the hybrid cloud environment.
Gearing up for generative AI
In terms of gen AI, Strle and his teams are exploring the potential long-term benefits, beginning with the company’s use of Microsoft’s Copilot for Office and GitHub Copilot.
But Discover is taking a measured approach to the technology, with a centralized AI governance function within the company responsible for evaluating risk management around developing gen AI solutions, Strle says.
Another part of the organization that oversees data and decision analytics, dubbed DNA, is experimenting with Google’s Vertex AI platform for possible gen AI contact center use cases, he adds. Some Vertex capabilities are in production, and the “ecosystem approach” to managing generative AI solutions, as opposed to “stitching together a bunch of different AI tools,” is the current game plan, the CIO says.
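Discover hasn’t detailed which Vertex capabilities are in production, but a contact center experiment of the kind described often starts with something as simple as summarizing a call for the next agent. The sketch below is a hypothetical example using Google’s Vertex AI Python SDK; the project ID, model choice, transcript, and prompt are illustrative assumptions, not Discover’s setup.

```python
# Hypothetical contact-center experiment on Vertex AI: summarize a call transcript
# for the next agent. Project, model, transcript, and prompt are assumptions.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="example-project", location="us-central1")  # placeholder project
model = GenerativeModel("gemini-1.5-flash")  # any Vertex-hosted model would do

transcript = """Customer called about a duplicate charge on their cashback card.
Agent confirmed the charge, opened a dispute, and promised a follow-up in 5 days."""

response = model.generate_content(
    "Summarize this call in two sentences for the next agent, "
    "and list any promised follow-ups:\n" + transcript
)
print(response.text)
```

Notably, the output of an experiment like this goes to an employee rather than a customer, which is consistent with the human-in-the-loop stance Strle describes below.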
“We are intentional about allowing some organic exploration of gen AI capabilities,” Strle says, emphasizing that Discover is not yet exposing customers to gen AI capabilities.
The financial services company is also evaluating open-source models based on Meta’s Llama and is considering more advanced gen AI models that make decisions autonomously, but Discover is not embracing agentic AI yet.
“We are still focused on that ‘human in the loop’ with our deployment because we still have to manage all the risks and compliance associated with these solutions,” Strle says of the current gen AI models, which assist employees with internal tasks or validate and double-check human activity to eliminate errors.
Initially, Discover’s foray into gen AI will be limited to large language models (LLMs) performing document summarization and possibly supporting customer agents, but there will be nothing directly customer-facing for the foreseeable future.
“We’re not going to go there,” the CIO says. “Anything that could potentially be making an important decision for the customer or could cause harm or confusion, those are things in the ‘Do Not Touch’ category.”
But in this era of speedy transformation, Strle won’t count anything out. “I’m not seeing an imminent opportunity, but I know that could change quickly, so we’re not closing the door on anything,” the CIO says.
The finserv AI playbook
That approach appears to be a common one among the larger financial services players.
In a recent interview with CIO.com, Gill Haus, Chase CIO at JPMorgan Chase, said he is evaluating use of generative AI to improve internal operations, the contact center, and Chase’s travel business, with some gen AI use cases in production. But he will not deploy the technology in customer-facing applications until it is battle-tested and errors such as hallucinations are gone.
Like Discover, Chase has embarked on a major digital transformation, including the development of a new deposit platform, as well as a modernization of its legacy applications into microservices deployed on private clouds and on AWS and other public cloud providers.
“We will be doing [a] use case-based approach,” Haus said. “It’s not going to be geared for a particular line of business. It’s geared for solving a type of problem or action.”
Their cautious approach to cloud and generative AI is typical for consumer lenders, one analyst says.
“While these companies continue to operate a significant number of financial systems in on-premises datacenters, they have been adopting cloud services for customer-facing websites and mobile apps,” says Dave McCarthy, VP of cloud and edge services at IDC.
“The excitement of implementing gen AI capabilities is tempered by the fact that much of this technology is new and unproven [and] this causes risk-averse companies in financial services to take a cautious approach,” McCarthy says. “Most companies start by experimenting with gen AI to improve internal process before adding customer-facing features.”