PegaWorld 2025: Shifting into High Gear: Citibank's Path to Scalable, High-powered Marketing Operations
Gain insights from Citibank on how shifting from an execution-focused approach to a strategic, AI-driven model has streamlined marketing operations. Citibank's Omni-Channel Decision Engine, powered by Pega Customer Decision Hub™, enabled them to optimize creative testing and compliance management, and to accelerate delivery across programs and stakeholders. They discuss the challenges of aligning people, processes, and technology, and lessons learned in democratizing access to powerful tools.
Well, welcome. We're so happy to have you here. Um, I am joined on stage by Jay Gao and Sathyanath Parthasarthy from Citi, and we're so excited to have you guys here and talking about your, um, marketing operations journey. Can you each tell us a little bit about yourself and your role at Citi? Uh, sure. I'm Jay Gao. I've been at Citi for more than 20 years. Currently, I'm part of the analytics organization, and we drive and own products that support analytics needs.
And as related to Pega, we own the functionality that is called the omnichannel decision engine, and today, Pega CDH serves the role for that. I'm Sathyanath Parthasarthy. I've been at Citi for 19 years now. Um, I'm again part of the analytics organization, and I lead all channel strategies. Um, I would say we are on a journey to drive customer engagement across all customer touch points, and my team is instrumental in driving the transformation. Amazing.
Um, and you guys work really closely together. So starting with Jay, can you talk a little bit about how your teams work in concert to execute against your marketing and customer engagement goals? Sure. Uh, as you would expect at a large bank, our organization is typically pretty complex. Uh, and in that sense, we have multiple teams involved in the whole operation of Pega CDH, or the omnichannel decision engine, which we call ODE.
So while our team drives the capabilities, making sure the product is working as intended, and promotes new product features to drive more productivity and more efficiency across the board, we have two sides of the table. On one side, we have a whole group of marketing partners, and there are many of them, who are providing us with content to be presented and recommended by the ODE as an engine.
And then on the other side, we have our analytical power users who are driving the arbitration logic, the fundamental principles of how we make decisions. So that's how we collaborate. So as Jay mentioned, there is a platform side of things, which is more on the operational side, and then there is the intelligence that the decision engine has, primarily powered by AI models, rules, and arbitration logic.
So my team works very closely with Jay's team in translating the capability to the business in terms of how it drives value. And in instances where we feel we need a custom-built solution, then we again partner with Jay in terms of delivering the custom product. So that's how we interact. And Jay, at a high level, how are you using marketing technology to solve some of your challenges? What are those challenges that you're using tech to solve for? Sure.
So instead of calling them challenges, I would like to call them focus areas. Yes. So focus areas. And actually there's a good reason for that, because challenges are not static. They do evolve. But so long as you have your eyes on the right areas, ideally and hopefully you don't miss the big ones. So for us, to make the whole, if you will, operations and decision engine capability work, we focus on a couple of things. One is data. That's the most fundamental piece.
We need the data at the quality that we need, at the time that we need it to be there, and with the clarity that we need for the users to understand it. Which sounds very straightforward and easy; it's not easy. And second is intelligence. How do you translate a raw data element into something that is relevant for the decision engine, as well as for the users? And in that case, we also want to mention the intelligence that will help us discover issues and problems as quickly as we can. Hate to break it to folks here, but things don't always work accurately, right? So you can have a lot of processes, you can have a lot of maker-checker, but you also need to make sure that when things break, you have the intelligence in the speedy fashion that you want. So we spend a lot of time building these capabilities to inform our users if something is wrong, if an error message is created, or if the system has broken down and data got stuck somewhere.
Things are not populating as intended, so we want that to be caught as quickly as we can. So these are the things that we focus on, to make sure that we build a healthy and evolving system. So as a follow-on to that, I mean, those marketing ops people in the room here know that it's a complex job, right? So, um, how does your team navigate the complexity of working with other groups within the organization? Uh, wow. That's a really big question.
So first, I want to start by calling out something: when you talk with other teams who are in need of using certain aspects of the tool, there is a recognition that not all teams have the same amount of knowledge and familiarity with the capability. So I hate to call it education, but I'm going to call it education, to make sure that the people you work with at least have a fundamental understanding of what it means and what it takes.
And then the other part, I think, is very important for large organizations and small organizations alike: to have a very clear set of KPIs. What does success mean to all of us as a business? That also is a very important part for us to set a foundation, because technology evolves and advances, people come and go, organizations change. Ideally, the KPIs do not change that frequently. And that's the baseline when you have discussions, debates, or arguments.
That's where you draw all the way back to: are we attaining our objectives better or worse? And that's how you align across organizations. And that to me is the most fundamental piece for us to start and conclude any conversation. So I think in many large orgs there are competing priorities across groups, right? So Satya, how does your team deal with competing priorities across the different orgs that you work with? That's a loaded question.
It is a loaded question. But let's just take a step back, right? Citi, I mean, in comparison with other banks, you know, is branch-light and digital-first, right? So the first channel where we integrated the Pega CDH engine was our digital platforms, which is our desktop and mobile app platform. And I know it's perceived as a free channel, I would say. So every program manager, every product owner wants to play in that space.
And they want to maximize the share of eyeballs for their offerings, which is a classic problem to solve. So we took a measured approach. I would say it was a more structured approach, where we wanted to first prove out the value of the capability. So we defined metrics around return on investment, because you're investing in a capability; we want to go back to the business and say, here is the ROI on the capability.
So hence we started focusing more on products and offerings that we can directly quantify, which would probably fit the bill of marketing offers. And then once we checked that box, we started loading more content from engagement marketing and proactive servicing. And again, that opens up a different kind of debate, because whenever any new offering comes into the ecosystem, it can gain impressions only at the cost of programs that are already running in the channel, right? And that's where, I would say, the key learning that we've had is making those decisions which are right for the firm as a whole, which is where we established a steering committee with representatives from all lines of business. They are executives who, you know, would report into the CEOs.
They would walk in on a bimonthly basis to review the strategy and make decisions on what the strategy should be going forward: what is the optimal mix, what should be the frequency or the intensity of marketing? So all those decisions are made there, and the key from my perspective is being transparent. So everything is data-driven. It's A/B testing. You prove with data whether it works or it doesn't work.
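As a hedged sketch of the A/B discipline described here (illustrative numbers, not Citi's data), a two-proportion z-test is one standard way to "prove with data" that one treatment outperforms another:

```python
from math import sqrt

def ab_click_test(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test on the click-through rates of two treatments."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled CTR under the null hypothesis that both treatments perform equally
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return p_a, p_b, (p_b - p_a) / se

# Hypothetical test: treatment B lifts CTR from 2.0% to 2.6% over 50k views each
p_a, p_b, z = ab_click_test(1000, 50_000, 1300, 50_000)
print(round(z, 2))  # |z| > 1.96 means significant at the 5% level
```

A governance forum like the one described would look at this kind of readout before deciding whether to keep, scale, or roll back a treatment.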
And I think once you're transparent, it cuts away a lot of the noise in the system, right? So that to me is probably how we manage the challenge. And to me personally, it works as well because it elevates the role that analytics plays in influencing decisions and driving business outcomes. So that's how I view it. That's great. You've got a lot of hungry mouths to feed from the marketers. Jay, what's the value of the tools that you use?
Like, for example, 1:1 Operations Manager, which helps you centralize the types of things that your team does. Well, actually, that's a great point. And I would use actually the opposite word: it's decentralizing how we get more actions into the action library for Pega CDH, or for the omnichannel decision engine. So, um, I think all CDH practitioners have different journeys in terms of their experiences with Ops Manager.
We opted to do that because we recognized that one team alone will not be able to generate or accumulate as many actions, with as many diversified flavors, as quickly as we want. So we need to make sure there is a user-friendly way for people to contribute their ideas and actions into the system, and then, with Satya's team, help to level-set
how these messages or actions get decisioned and prioritized on an equal footing, and then, with the intelligence, people can go in and understand why actions are working or not working. So coming back to 1:1 Ops Manager, that is a tool that we use. We think of it as, how do I say, opening up our development with a very user-friendly interface that everyone can contribute to: I have a journey idea, I have a flow, and here's how I define it.
And then there's the fact that Ops Manager links to the core Pega CDH capability with a low-code capability. And quite frankly, if we really push it, it can become no-code. Meaning you define your action, you define your eligibility, you define the models that you'll be using, you define your resting rules, all of those. Once you define everything, the code is generated on the back end already, and it's generated at the speed that you want it to be. So we have a weekly release schedule.
If you want daily, you can do that. You can do all of these things. However, you cannot do any of these things at scale without Ops Manager. So I'm glad you brought up scale, because one of the things we hear from our clients all the time is that they want to learn how to better build out their action library. So can you talk a little bit about how you scaled the action library using Ops Manager?
I will talk about Ops Manager, but I'm also going to talk a little bit about the action governance. Great. And don't think about it as sequential. So we encourage people to bring as many actions as possible. And we as a team help these teams to understand where the areas in the digital channels are where you can present these messages, and why we, at a high level, feel these could be relevant for the customers. So there is an encouragement of building out as many actions as we want, and also with different flavors, because different messages have different associated eligibility criteria. So that's one. Once you do that, we as a team, meaning more or less our team, want to make sure that we're not building redundant actions across different treatments or channels or placements, whatever you want to call it.
The reason is, if you build the same action across different treatments as uniquely different actions, anytime you maintain, update, or change those actions, it becomes a lot more work. So because of that concept, we actually at one point reduced the number of actions by almost 60%, just to accomplish the operational efficiency that we wanted.
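The consolidation idea can be pictured with a small sketch (a hypothetical data model, not Citi's actual schema): one logical action owns its eligibility once, and channel placements hang off it as treatments, so a change is one edit instead of several:

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """One logical action; channel placements are treatments, not clones."""
    name: str
    eligibility: str                      # single source of truth
    treatments: list[str] = field(default_factory=list)

# Before consolidation this offer existed as three near-duplicate actions,
# one per placement, each carrying its own copy of the eligibility rule.
offer = Action("travel_card_offer", "credit_score >= 700")
offer.treatments += ["web_banner", "app_tile", "email_hero"]

# A compliance change to eligibility is now one edit, not three.
offer.eligibility = "credit_score >= 720"
print(offer.name, len(offer.treatments))
```

The action and placement names here are invented for illustration; the point is only that deduplicating actions turns every maintenance task from N edits into one.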
And with the reduction in actions, actually, that creates the opportunity for us to get more actions into the pipe, because otherwise you're going to have too much, if you will, fluff that you need to carry through all those development cycles, right? The reduction actually creates the opportunity for us to take in and invite more. So that's how we think about the whole action library. And Satya, how do you experience it on your side?
Do you see a correlation between business outcomes and having those right actions, having a robust action library? What is your team's experience? Yeah. So maybe I'll touch upon, you know, the consolidation of actions that Jay was talking about first, prior to getting into the business outcomes. Um, I think when we started, we went one way, where we just loaded actions across all real estate, and we know what that can achieve.
And then we started the process of consolidating the actions, again through a robust test-and-learn, right? So marketers on the Citi side would be able to talk about how we've essentially evaluated the performance of every single program in the ecosystem by doing one versus multiple actions. And that really then is a governance forum conversation. In some instances, consolidation might lead to some softness in performance.
But again, it has got, you know, a larger benefit from an operational efficiency perspective. So those are calls that we again make in the broader governance forum. And with the consolidation of actions, getting to where we are today, what we've actually seen is that it not only simplifies your process, it also makes your models more robust and accurate, because the models aren't learning on unproductive impressions. You're serving some impressions where you know the value of the impression is absolutely nothing. By eliminating those from the system, you're making your models better. And that would mean stronger models competing from an arbitration perspective, which eventually leads to better business outcomes. So that's how I see all of this, the action library and consolidation, feeding into business outcomes. Great. Jay, so, you know, you've been in the business a long time implementing technology.
It can have its focus areas, you know, we won't call them challenges anymore. But can you talk a little bit about some of your experience in implementing the technology, and give the folks in the audience some key takeaways from your journey of doing that implementation that they can take forward into their own organizations?
So I'm sure that everyone who has worked on technology deployment has the framework: you want to pilot, you want to test, you want to confirm to the business the value that you are driving. All of these also apply at Citi. On top of that, I want to just call out a little bit about CDH, which is quite unique from Citi's standpoint. It is one of those early, if you will, cloud-based solutions that we adopted at Citi. As a bank, we had not been very comfortable with cloud-based solutions, but CDH was one of the first ones, and there was a lot of initial, if you will, negotiation with the organization to make sure that they got comfortable. But once we got there, I think the implementation with a cloud-based solution really benefited us in two ways.
One is the product upgrades and all of these things that we sort of offload to Pega, and Pega does a good job releasing new capabilities to, in a way, encourage, or I should use another word, force us to upgrade our product solutions to the latest level, which honestly, had we been using an on-prem solution, we would not have kept up with that easily. And second, because of that implementation, primarily what we need to worry about is the connecting points.
And over time we have consolidated the connecting points to fewer than five. So right now my biggest worry is really these fewer-than-five things. So long as they work, I'm good. Everything else is in the Pega box. Sorry, that's your job. Fair enough, fair enough. Um, so this question is for both of you, but there are cultural challenges at large organizations when you bring in new ways of working or new technologies.
Can you give us a little bit of background on what you guys experienced and how you navigated those, and what advice you would give to those who are trying to educate on the benefits in their own orgs? Do you want to start? Yeah. So again, um, like what I mentioned before, I think for us, from a decisioning perspective and really influencing the right outcomes, it's been about being more transparent. So that to me is a big, I would say, no-compromise situation, because once you are extremely transparent in the decisions, why certain decisions are made and the implications of those decisions, I think a lot of the resistance in the system automatically fades away, right? Um, to me, the aha moment was that we've actually evolved to a position where we're talking about making the right decision for the customer slash firm, as against pushing a product and, you know, driving a product.
So to me, if I were to just look at the last three years' evolution and the point of arrival, that to me is a big success in our entire transformation roadmap, because we now have a buttoned-up forum, senior alignment; everybody knows why decisions are made, and most importantly, we are making the right decisions for the customer and the firm, as against pushing any particular product. So that is a big one.
So if I look at the journey that we had with CDH and how we implemented and operationalized it, the one thing that we could have done differently was to get your broader user group engaged, educated, and familiarized as early as possible. Because with any organizational challenges or changes that you're implementing from day one, the larger user group will always say, hey, I'm comfortable with what I'm doing now, I don't want to change. But you have to.
And I admittedly would say that we made a mistake not getting them, if you will, onboarded soon enough. So when we reached the scale where we offloaded all those, if you will, change-request sorts of responsibilities to the user groups, undoubtedly errors happened. So we had to, you know, have some emergency measures to get those fixed, and then realign organizationally how we are going to operationally execute all of those change requests.
So it becomes a lot more reactive, versus what we could have done, which was to make it proactive. As you started to onboard folks, were there things that worked really well that you could discuss a little bit? When we came to the realization that people were challenged with the user interfaces or the usage of the tool, we had very, very open and transparent dialogs in terms of hearing from the user groups: Where do you get confused? Where do you get stuck?
And then what we did was, at that time, we brought in a group of designers who literally designed our mobile app, the one facing the consumers, who are not supposed to need to know anything, right? So it's supposed to be a very intuitive user experience. So we got all the user inputs. We truly understood the operational needs of using the tool.
And then we brought in this group of brilliant designers who revamped the whole Ops Manager user interface. Not the back-end technology connections, but the user interface was streamlined in a way that is intuitive and simplified, meaning you're not seeing the screens that you don't need to provide input to. We have, if you will, decision rules that say, for these programs you need to provide these 25 elements; for those programs, these 22, something like that. So it becomes dynamic, intuitive, simplified.
And we have received great feedback from the users. So I think that's a good way for us. Very, very cool. I think it's more taking an inclusive approach, versus, here is a product and here is the manual, go use the product. So I think that makes a big difference. Absolutely. Not just from a marketer and a user perspective, but also even from an internal regulator perspective.
Because when we're talking about models, real time, and things that could refresh three, four times a day, you also need to get your internal regulators a lot more comfortable about the nuances. How do you containerize your attributes? How do you play within the guardrails? Just getting them comfortable, and, you know, also communicating the point that you're not breaching any controls, is probably something that's very, very critical in this journey.
So an inclusive approach is how I would summarize it. Amazing. Okay. So if you had to give one piece of advice to the folks in this crowd that are just getting started with their marketing operations, what would it be? You know what I always want to say? Technology, marketing technology, any technology, they don't drive value. The usage of technology drives value. So before you implement anything, know what you want to get. Without that, you're not going to see value.
What about you? I would say transparency. Transparency. Be very transparent in what you're driving and why you're making a decision, because I think that will remove a lot of resistance from the system, because everybody is now aware of why you're making a decision. I'm a fan of transparency as well. Yeah. All right. Well, we wanted to leave enough time for you guys to ask Jay and Satya questions if you want. So I think we have a microphone right here, if anyone has any questions.
I think we have one. Oh, right here in the back. Yep. Stand up. Oh. Hi. Hi. So, uh, my question is about how, when we try to implement some solution in one org, it kind of interacts with or affects the other org, you know, checks and balances. I had some experience in the financial industry, and I know in a bank such as Citibank, there is a significant amount of resources and human power invested in risk management, especially since the 2007 financial crisis.
So when you're trying to automate the marketing operations with the AI and everything, uh, I would be interested to hear from you how you also made sure the same automation was in tandem, in sync with the guardrails, along with the, uh, you know, credit risk modeling and all those folks. Thank you. Okay. Okay. So, I mean, because you're familiar with the financial services sector again.
So I would probably anchor back to, you know, we've got a robust process from an evaluation perspective, um, where there are core KPIs that we would track as a business on a very frequent basis. Uh, and bringing it home, that could mean: what is the level of balances that you drive? What's your default rate? Early warning indicators, and a whole bunch of other stuff. Now, when we scaled, you know, when we rolled out the strategy from a digital perspective.
And of course, digital was outperforming, uh, which means that we were actually driving and bringing in more balances, uh, than what we had baked into our outlook. So then the question was, hey, is this the right decision? Is this what we want to do? And then, you know, it's a collective decision. So you could approach it both ways.
One is you could say, whatever incremental is coming in, I'm going to book that incremental, provided it meets all your hurdles and other stuff. And the other way of looking at it is, hey, if a free channel or a less costly channel is going to bring in those dollars, is there a way I can look at all my, you know, high-cost channels and dial them down? That's the other way of looking at it.
And that's where, you know, a lot of A/B testing comes in, around, let's reevaluate our direct mail strategy. You know, let's look at it through the lens of: if a customer is calling in and he's been routed to a sales agent on the phone side, is there a way that, you know, I need not sell on the phone channel, because I know my digital is going to still deliver? That way you could reduce the average handle time, and the cost associated with it. It's a different way of looking at it.
So that's predominantly the framework that we used in essentially solving for it, where we ensured the guardrails are met. And that's where it becomes a big cross-functional initiative as well.
So we've got P&L owners, the product owners, folks from finance and risk, everybody in that cross-functional forum and the steering committee, making a decision and signing off on whatever strategy we all collectively align to, because it then fits the bill from an overall firm's risk appetite framework, I would call it. So that's, in a nutshell, how we approach this. If you want to add anything. Let's see if there are more questions. Hi. Hi.
Um, do you have any, like, specific challenges or focus areas that you had to overcome when adopting the use of 1:1 Ops Manager? I think there are better people in the audience, also from Citi, who can answer that, but I will share my perspective. I think really, uh, you know, people's knee-jerk reaction to any change is, I don't want the change.
And the other thing I want to say is, just like filling out a credit card application, there are many different ways of filling out that application. You know, different companies have different layouts and a different sequence in terms of the questions you get asked. But my recommendation to anybody who is rethinking whether Ops Manager should be in the picture is: think about what you need to bring to the table for this program to work. And those things always exist.
You always need them, regardless of whether you're using user interface A or B or C; it does not matter, right? So what does it boil down to? Yes, people are resisting; everybody resists changes. But think about the business needs. If you need to have a lot of actions introduced at scale, that is about the only way to accomplish that. Then you break it down into two buckets. One is, regardless of what you do, if you go straight to it, you still need to bring to the table: What's your message?
What's your creative? What's your offer? What's your digital journey? All of those. Nothing is going to be negated, right? The only thing now we are asking is, hey, you put all of these things in the now user-friendly template, and then you don't even need to create code for the most part. Then we realign with the business: is that something better or worse? And I think the answer is pretty obvious. And that sort of forced, if you will, the change to accept the usage of the tool.
Um, so this morning we saw the keynotes, right? The, uh, CDH Blueprint, which has been out for a while. Have you guys given that consideration? And how do you see that fitting with Ops Manager in the future, if you have given it the thought of using it at some point? It's a great question. We have given it a lot of thought. Uh, and I'm going to share it from purely my personal view.
So Blueprint, from a content generation perspective, uh, is powerful. Uh, but as a bank, unless we have figured out a way to automate some of the compliance and legal approval process, it's still not going to be very helpful. Uh, on the other hand, I think as a bank, we have a tremendous amount of opportunity in leveraging GenAI for our back office operations. We're subject to a huge number of regulations, 22-plus, and each regulation has a vast amount of operational implications.
We have lots of procedure manuals, a lot of things people need to extract from database A and B and C. So personally, my answer to that is, instead of trying to use generative AI like Blueprint for a one-to-one decision on a customer-facing message, I think a better starting point for us as a bank is something that's facing your employees. Give them a better tool, give them a more efficient way to extract information and intelligence that they could not without GenAI.
And that's going to, A, generate immediate impacts and benefits for the business, and B, during this time, we're going to let legal and compliance see it and make them believe GenAI is not going to mess things up. Then you naturally progress to the next stage, which is to face your end users or your end consumers. Great question. Uh, hi. Um, can you talk a little bit more about the journey you took to adopt Pega's adaptive modeling and next-best-action capabilities?
Um, and just speak a little bit to some of the greatest challenges you've encountered and maybe some of the things that you've learned along the way. And just as an extension of that question, have you found, um, that the models tend to work better for some marketing channels over others, for example, letters versus emails? Yeah. I know it's a loaded question again. So.
I would say from an adaptive model perspective, I think the biggest challenge to begin with was, um, that it gets perceived as a black box. So I think getting the regulators comfortable that, you know, there are no unintended consequences to actually using this modeling framework was the biggest roadblock. I think, um, it did require a lot of effort. Um, and again, we did a lot of work, uh, you know, in partnership with Jay's team and even the Pega team.
Um, we brought all the facts to the table in terms of how the models work, you know, what are the core attributes that the models would leverage? And how would this drive, say, propensity to click? That's probably a basic functionality within your CDH.
And then you could probably elevate it and say, hey, if I move away from click and I do a response-driven, or if I want to do a value-driven model, there are scenarios that you could build around, and you could essentially lay down scenarios and influence the business on different outcomes. I look at the entire capability as a dial: you could technically turn the dial to the left and maximize clicks, or you could turn the dial to the right and maximize value.
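The "dial" idea can be sketched as a weighted blend of click propensity and expected value inside an arbitration score. This is a hypothetical illustration of the concept, not Pega's actual arbitration formula (CDH's real arbitration also factors in context weighting and business levers), and the actions and numbers are invented:

```python
def arbitration_score(propensity, value, dial=0.5):
    """Blend click propensity and expected value into one priority score.

    dial = 0.0 ranks purely on clicks; dial = 1.0 ranks purely on value.
    Geometric weighting keeps both factors multiplicative, in the spirit
    of propensity-times-value arbitration.
    """
    return propensity ** (1 - dial) * value ** dial

# Hypothetical actions: (click propensity, expected value in dollars)
actions = {"cashback_offer": (0.08, 20.0), "balance_transfer": (0.03, 90.0)}

for dial in (0.0, 1.0):
    best = max(actions, key=lambda name: arbitration_score(*actions[name], dial))
    print(dial, "->", best)
```

Turning the dial flips the winner: at 0.0 the high-propensity cashback offer ranks first, at 1.0 the high-value balance transfer does, which is exactly the clicks-versus-value trade-off described above.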
So those are different ways of looking at the capability. But coming back to your question, I think one was on getting our regulators comfortable. And the other one was, how do you actually prove the value of the tool to the business and align it with the overall financials? Because when we're talking about Citigroup, billions of dollars from a scale perspective, how do you essentially go back to the business and say, look, we've onboarded this tool?
There is a new capability, and this new capability is going to drive a delta over your baseline, which is already running in the billions. So that to me required a lot of effort, not just from a test-and-learn perspective, but also in terms of, you know, answering a lot of questions from the business on the key drivers behind what we see from a decision engine perspective. So to me, I would maybe summarize it as two things. One is regulators.
The other one is, how do you go back to the business and convince them that this is really the capability to invest in? And I think we've come a long way in overcoming both. I would say, walk into Citi and then talk to them about the decision engine, and everybody would say, we know it works. So I think that probably to me is a big success story. So one thing I think that helps Citi is the models within Pega CDH; we internally call them dynamic AI models.
And we've been using these models for more or less 20 years. Is it 20? No, ten. Sorry, I exaggerated, because in 2015 we launched our first version of the decision engine for pre-login, which, by the way, was not Pega. But such models were deployed back then, so internally we have familiarity with them.
And the other thing I want to add to what Satya commented on, which is my responsibility: I want to make sure that for each action we are supplying the model with a more or less equal amount of intelligence, so the model can pick and choose the predictors for each action on an equal footing, because at the back end, in the core, the model is treating all actions and all predictions equally.
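The point about giving every action a comparable amount of intelligence can be sketched as a simple coverage check over predictor counts. This is an illustrative sketch, not Citi's actual tooling; the action names, predictor counts, and the threshold rule are all hypothetical.

```python
# Illustrative sketch: flag actions whose models are fed far fewer
# predictors than their peers, so no program ends up structurally
# "depressed". Action names and counts are hypothetical.

predictors_per_action = {
    "cash_back_offer": 180,
    "balance_transfer": 175,
    "travel_upgrade": 60,   # far fewer inputs than its peers
}

def under_equipped(counts, tolerance=0.5):
    """Return actions with fewer than tolerance x the median predictor count."""
    ordered = sorted(counts.values())
    median = ordered[len(ordered) // 2]
    return [name for name, n in counts.items() if n < tolerance * median]

print(under_equipped(predictors_per_action))  # ['travel_upgrade']
```

A check like this would run before actions compete, since an action starved of inputs loses arbitration for reasons that have nothing to do with its real performance.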
However, if you don't equip the models with the right level of intelligence, then certain programs will be, if you will, depressed. Did that answer your question fully? Okay. Hi, my name is Rakesh, I'm from Citizens Bank. Just a follow-up on the question from the previous gentleman. You mentioned exploring Blueprint for CDH, and I'd like some insight around using it for back-end next-best-actions (NBAs) for your teams.
I just wanted to understand, given the regulatory environment we work in: how do you vet or validate the auto-generated NBAs, treatments, or presentation content before it is shown to your back-end staff, or before it is presented to the customer? That's one question. And the second question I had is around the use of adaptive models.
With the focus on clicks and click-through rates, have you explored, or are you actually using, conversion outcomes as well? What's your take on that? How do you balance click-through rate versus conversions? Thank you. I'll start, and then Satya will conclude. First of all, in terms of the Blueprint application generating either content or actions, I'll be very clear.
We're not doing that today. We have enough operational challenges with human beings doing that; I cannot imagine using GenAI to do it today. It's a feature we're not ignoring, but we are waiting for it to be a little more mature. From an adaptive modeling perspective, at least from a capability perspective, we really want to build the capability so we can track all the way to conversion. But there are a couple of challenges.
One is that technologically we're a little bit stuck, but I think we'll get over that. And two, there's the recognition that not all programs or actions have a conversion event. So when you get to that stage, you need to make sure your decisioning framework is considering all these different varieties. Satya, see if you have anything to add. I think you pretty much covered it.
To answer your very specific question on click versus conversion: beyond the technological challenges, it's about whether you want to promote a message that is more a value prop of a product, where there is no funnel involved. It's just about showing a banner; let's assume, hypothetically, Double Cash is a 2% cash-back product. There's no call to action really involved other than the customer saying learn more.
So how do you balance that against, say, a credit card application or a lending application that would probably have a 1 to 2% funnel throughput rate? That's really the challenge in moving directly to a conversion-driven model.
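The click-versus-conversion tension described here can be illustrated with back-of-envelope arithmetic, using hypothetical rates rather than Citi's actual numbers.

```python
# Back-of-envelope illustration with hypothetical rates: why a pure
# conversion objective is hard to compare across a no-funnel banner
# and a lending application with a 1-2% funnel throughput.

def conversions_per_impression(click_rate, funnel_rate):
    """Expected conversions per impression = clicks x funnel completion."""
    return click_rate * funnel_rate

# Value-prop banner: "learn more" is the only call to action,
# so there is no downstream funnel to convert through.
banner = conversions_per_impression(click_rate=0.10, funnel_rate=0.0)

# Card application: fewer clicks, and only ~1.5% complete the funnel.
card_app = conversions_per_impression(click_rate=0.03, funnel_rate=0.015)

print(banner)    # 0.0 -- the banner scores zero on pure conversions
print(card_app)  # ~0.00045
```

Under a conversion-only objective the banner can never win an impression, even when showing it is the right business decision, which is why the decisioning framework has to handle actions with no conversion event.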
Um, having said that, we've onboarded a capability where we could actually model for conversions, but we still need to go through all the governance and other processes to be able to say, hey, if you do conversion, it's again turning the dial; the outcome is going to be different. And whether this is something we really want to do is probably a broader conversation. Nice. Hello. Um, my question is in relation to 1:1 Operations Manager.
So you mentioned that you've got it looking and feeling, I guess, like Citibank, but behaving as Pega designed behind the scenes. Do you see 1:1 Operations Manager as a custom build, or is it still very much out of the box? Okay. Admittedly, I don't understand all the back-end technology and how it works, but the way I like to describe it is: there is an integration between Ops Manager and Pega CDH. That part we don't want to break, and to the degree we can, we don't want to touch it.
What we're really touching is just the front end: how people enter their inputs, which then get logged, if you will, captured by the back end. Regardless of whether you're listing out five questions in this order or that order, it doesn't matter, because the back end is still capturing those five. So all we're doing is the user interface, the front end.
We try not to touch the back end because we know, as Pega would advise, every time you over-customize, you can break things you don't know about. So we try our best not to do that. I actually have a second question, so whilst I'm here I'll take the opportunity. In relation to your digital channels, on the content side of things: do you use 1:1 Operations Manager to author the content that's then used in the digital channels, or do you have a CMS? We do have a CMS.
So CDH today really just gets a creative ID, that's all. When the decision is made, we push the decision back to digital with the creative ID included, along with a bunch of other information. Digital, in that moment, based on the ID, pulls the content out of the CMS and presents it, knowing all the back-end tracking needs: what is your offer code, what is your URL, what does the URL lead to, and so on. So essentially Ops Manager, or Pega, contains just an ID for each of the elements, and how the placement gets painted is performed by digital, with the CMS in the background serving up the content. Thank you. Sure.

Yeah. Hi. I have another question on Ops Manager. You said you federate out access to Ops Manager for business users?
How do you govern the quality of the actions being put into the tool, I guess at the plan stage? Not saying people build rubbish, but how do you make sure that the actions being built, the eligibility, and all those sorts of things meet certain criteria, and that they are good actions you want to take forward and deploy? So there are two aspects to the quality of the actions.
One is, and that's my team's job from an action-governance perspective: if the action is creating inefficiency, or if the action is trying to be placed in an area that already has very high competition, we would call it out and advise, don't do it this way. That's one. The other side of quality relates to the performance of the action, and that's where, honestly, we would like the action owners to step up and take more accountability.
You should be familiar with how you get access to the performance report, have some assumptions about the performance, understand who is competing against the same real estate and why they're performing better than you, and then think about ways to improve. Thank you. Thanks, everyone. I think we're at time, so we'll call that a wrap. If anyone has additional questions, I'm pretty easy to find; please find me after this and get my contact info.
Thank you, everyone.