Cindy Moehring visits with Kara Hill, Corporate Chief Information Officer and Transformation Lead at FIS, to learn more about ethics in the fintech world. Join their conversation that covers convenience vs privacy, machine learning, and the challenges of creating ethical regulations in an ever-changing landscape.
Episode Transcript
Cindy M. 0:03
Hi, everyone. I'm Cindy Moehring, the Founder and Executive Chair of the Business
Integrity Leadership Initiative at the Sam M. Walton College of Business, and this
is The BIS, The Business Integrity School podcast. Here we talk about applying ethics,
integrity and courageous leadership in business, education, and most importantly,
your life today. I've had nearly 30 years of real world experience as a senior executive.
So if you're looking for practical tips from a business pro who's been there, then
this is the podcast for you. Welcome. Let's get started.
Hi everybody, and welcome back to another episode of The BIS, The Business Integrity School. This season we are talking all about tech ethics. And today we're going to be exploring what that looks like in a FinTech company. And we have with us a very special guest for that discussion, Kara Hill. Hi, Kara. How are you?
Kara Hill 1:00
Hi, Cindy. How are you? Thanks for having me today.
Cindy M. 1:03
Absolutely. So let me tell you a little bit about Kara before we jump into the conversation.
Kara is the corporate Chief Information Officer and transformation lead at FIS Global.
FIS Global is a FinTech company. And there she leads a global team of technology professionals
who deliver information technology and support services to over 70,000 colleagues
around the world. So she's dealing with a lot of really big issues. These issues include
hardware and software applications and mobile devices that are used by the FIS Global
workforce. She is also currently leading what's called Workspace Next, a transformational
program focused on creating a workspace for the future of FIS. Kara comes to this
position with over 20 years of experience in the FinTech industry. And let me tell
you a little bit about FIS and what a FinTech company really does. So FIS Global creates
technology, solutions, and services for merchants, banks, and capital markets,
so that they can advance the way the world pays, banks, and invests. So a very important
connector, a very important kind of part of that entire system. That financial system
is where FIS sits, and Kara sits as the corporate Chief Information Officer. So we
are just thrilled to have you here today. Kara, thank you so much.
Kara Hill 2:31
Oh, great. I'm thrilled to be here and really excited about what you're doing in the
Business Integrity program at the University of Arkansas. Really happy to be a part
of this.
Cindy M. 2:40
Thank you. And yeah, you are also on my external advisory board for the Business Integrity
Leadership Initiative. And we are a better board because of your leadership and participation.
I always like the audience to get to know the guest just a little bit themselves beyond
what it says on paper. So tell us a little bit about your journey Kara, and how you
ended up in the position you're in for a FinTech company?
Kara Hill 3:02
Sure, absolutely. So I started out in finance. I majored in finance at Boston College
and spent about 10 years in finance. During that time, I very quickly learned that
in order to be successful in finance, I really needed to understand the technology.
Technology was moving to the forefront of everything we were doing. A lot of transitions
from early computing applications to more advanced applications. Networking became
possible. Email came into being, the Internet came into being. A lot of the things
that we take for granted today were brand new then. And then I figured out that not
only did I need to understand it, I really liked it. I found that I was really
energized by it. I liked the people I was working with in technology, I liked working
with the applications, and just the overall concepts. Around the time that I
was making the transition into technology, information security exploded onto the
scene as something critical that we needed to understand, and data privacy
laws started to be passed. That's how it all came together: I found my way into
tech through the security, data privacy, and compliance areas.
Cindy M. 4:29
Well, they're all so interconnected these days. I mean, it's like where does one end
and the other start, right? So let's start with that point about data as a FinTech
company and a connector for banks and merchants and the capital markets. I imagine
that a company like FIS deals with a lot of data. So how do you manage all of that
responsibly and ethically? It's a big worry of a lot of companies and individuals
these days.
Kara Hill 4:58
The highest priority at FIS is making sure we protect the data. It really starts
at the top. And with any organization, I think that's always true, right, and that
the expectations are set by our board and our executive leadership team. So we have
three values: win as one team, lead with integrity, and be the change. Leadership
with integrity is ingrained in everything we do. In addition to that, we're always
very focused on the three C's: our clients, our colleagues, and the communities
that we live and work in. On a very regular basis we talk about the impact of what
we do on the three C's, right? How does this impact our clients? How does this
impact our colleagues? How does this impact our communities? And from there, it's
about establishing governance practices.
The right visibility and transparency into what we're doing so that the board and
the executive leadership team are able to manage and make sure that we're meeting
all of our requirements. And certainly, we're audited against those requirements from
a data security and data privacy perspective, but I can't say enough about our values
and our focus on the three C's.
Cindy M. 6:14
Yeah, I know, and you know what, I'm really glad you started there. Because I think
that sometimes individuals and companies can give short shrift to the importance of
values and things like the three C's that have grounded FIS, treating them as just
words. But they aren't, if a company truly believes in its values, you see it playing
out in the way the leaders lead every day, and in how they bring those values to
life so they actually are meaningful.
Kara Hill 6:53
Absolutely. It sets the stage, it sets the expectation, and then it just becomes part
of the way we do business. It's foundational.
Cindy M. 7:01
Yeah. And then the governance that you set up all the processes, all the procedures
and all the controls, they're grounded in something that speaks to an individual's
heart and can be inspiring to them. Because governance can feel a little heavy at
times. So what are some of the evolving privacy issues that you have to deal with
now? I know, we are dealing with GDPR, and California law and all of that, but what's
kind of, what's next, what's coming next? What are the evolving issues you focus on?
Kara Hill 7:30
Yeah. So as the technology continues to advance, we have new capabilities that are
more readily available and accessible to us, such as the use of biometrics for
access and authentication. It's about making sure that they're being protected,
that they're being rolled out appropriately, and that you're thinking through the
implications of collecting or storing biometrics. It's just an area where we're seeing
a lot of focus right now, across all industries, not just in financial technology,
but all industries.
Cindy M. 8:06
Yeah, but I imagine in financial technology, it's sort of like the bar has been raised
a bit when you're talking about it in that context, because you're talking about
money: individuals' money and companies' money. So when you say biometrics,
do you mean like face ID and fingerprint ID for access?
Kara Hill 8:24
Biometrics are measures or calculations related to an individual's human
characteristics, right, so the dimensions of your face, your fingerprint,
and your voiceprint. Those are the three main biometrics that
are used most often in technology. So for example, to get into your phone you might
use your facial recognition. Even to get into amusement parks these days, sometimes
if you have a multi day pass, they'll take your fingerprint and use it to get into
an amusement park. I've seen that over the years. So yeah, that's exactly what that
is.
Cindy M. 9:03
Sometimes when you're looking at financial information online, like even your own
account information, it's not exactly intuitive. And so sometimes I find myself looking
at the screen, and I'm sort of like, wait a minute... what is that? And I'm crinkling
up my eyebrows. Is that information useful?
Kara Hill
It certainly could be. I think it requires a lot of transparency, that the client or the individual knows that that's what's happening, right, and that you're not doing it without their awareness.
Cindy M.
Right. Somebody's watching me and I didn't know it and yeah, yeah.
Kara Hill 9:36
And that's really an important point because some people are very comfortable with
that, and they like the convenience of what technology can bring them: using a
fingerprint, using facial recognition software, or even analyzing their facial
expressions or their location. Some people are super comfortable with that, and
they choose convenience over privacy in that regard, right?
They say, "Listen, I want you to use that information to help me."
Other people are much less comfortable, right? And they just are not interested in
sharing that kind of information. So that's why the transparency is so important.
Right? Making sure that we always go back to those three C's, how does this impact
our clients? How does this impact our colleagues as well? Because in my role, most
of what I do is manage the computer systems that our employees use, making sure that
they have a convenient way of accessing their devices, right? So some of them are
really comfortable. They want to use the fingerprint, they want to use facial recognition.
And others will say, "Yeah, no, I'm okay. I'll memorize the password, that's okay."
And they're not interested.
Cindy M. 10:53
Do you find that breaks down on generational lines, or not so much?
Kara Hill 10:56
Anecdotally, I don't have any research on it, but anecdotally, yes. I do think that
younger generations, folks who are newer to the workforce, do seem
to be a bit more comfortable because they've grown up around the technology; it
wasn't necessarily new to them during their working life. But I do think that there
is awareness around it, there's a lot of discussion about it. So I think that's
important, and that people can then make choices based on good information that's available to
them. Which is key... it's really key.
Cindy M. 11:32
So then it becomes not about the individual saying, okay, yes, you can use my
fingerprint, you can use my facial information. It's kind of behind
the curtain, if you will, about an individual's information being used by companies
in certain ways. And so that involves a lot of artificial intelligence, and some deep
machine learning. And there have unfortunately been some misses along the way,
some cases that have popped up showing that when a mortgage was offered
by a particular bank, not implicating any in particular, it came up with
a different result for a woman versus a man, or someone who's Black versus white,
or, with a credit card, the amount of credit they were given. What responsibility rests
with companies who are working with individual's data, to make sure that they aren't
injecting bias, for example, into the decisions?
Kara Hill 12:32
Certainly there is risk of bias in any model, any financial model, artificial
intelligence, any algorithm that's being written. And I think that's where AI, machine
learning, and financial models are all meant to be tools, right, that are used by humans.
It's one tool, right? It's not the only tool. And so I think it's really important,
first of all, to recognize that it's critical that we understand
the algorithms and that we test them. In the financial services space, there's a concept
of model validation. You're looking at your financial models, and there are ways to
test and determine if something is leaning one way or another in terms of bias, if
there's an unintended result. The machine is looking at history, right,
and learning from history, so the historical bias may be built into the algorithm.
So that's where the testing comes in, the QA, the quality assurance, and then having
a human, again, operating the artificial intelligence or the machine learning at the
end of the day, and keeping the checks and balances there to make sure that the end
result is what you want it to be.
Cindy M. 13:48
And you're talking a lot about the governance there, right? Before you roll out
AI. And I would imagine that in this new world, where we're using AI more and machine
learning more that the governance around how a product actually or a system actually
comes to market may look a little different than it did in the past. It doesn't sound
to me, like when you're talking about potential bias and other things, that it's just
an engineering project or just an information systems project that I would imagine
it's a broad cross section of people across the company that would get involved in
looking at different aspects.
Kara Hill 14:23
Absolutely. At FIS, we have a risk organization, RISC. And as new types of
risks emerge, they add those additional risk factors to their governance practices,
right, to look at and say, okay, what could be a negative or a possible unintended consequence
of this particular product, and they assess it for both the impact on clients, colleagues,
community, right? And then they assess it for the likelihood, right? How likely is
that to happen? And then how do we adjust the product or service to make sure that
we're considering all of that and taking it into account. So yes, absolutely. And
I think once those governance structures and functions are there, then any type
of risk can be assessed using that structure.
Right?
Cindy M. 15:23
Right. Which then all ties back to why you have the governance: because one of
your core values is leading with integrity, right? And so it kind of ties it all
together and makes the governance meaningful. I've even heard, Kara, that there
could be some new jobs. We may be moving into an area where there are AI auditors.
So you do all the testing and all the governance that you've talked about on the
front end, but you also have to have some ongoing monitoring. But that's a
different skill set, right? Somebody who can audit an algorithm, as opposed to a
set of financials or a business process, and look at the output rather than just
saying, yes, you have the right governance. Have you guys ventured into that area
at all? Or is that still evolving?
Kara Hill 16:05
Our audit function audits everything. And they use the same set of principles to perform
those audits. I was a deputy general auditor as well, and so I learned how to audit
any process, as long as you set out from the beginning with the goal: what are
you trying to assess here? Right. So what's the risk? What are the controls that are
in place to prevent or detect? So yeah, I mean, we're doing that all the time. So
we have audits of products, audits of our services, audits of our technology. And
we also have audits of security and privacy and all of that. Even though we have the
risk program in place, our Chief Audit Executive and his organization are very much a
trusted advisor and partner, but also a function that challenges and looks for potential
issues that need to be addressed. For sure.
Cindy M. 17:09
Yeah, I think that's just so important for students in particular, to understand that
when you're developing some new technology like this, it isn't like it's a one and
done. You have to do all the testing and all the governance on the front end, but
even after you roll it out, you still have to maintain it and monitor it and tinker
with it and adjust it.
Kara Hill 17:27
Absolutely. And to be honest, this role that I've been in for about the past year
is one of the first times in my career where I wasn't on the risk or audit side of
the business. I'm now on what's called the first line of defense. You have a totally
different perspective depending on which part of the lines of defense you're working
in, right?
Cindy M. 17:50
That's right, yeah.
Kara Hill 17:51
But at the end of the day, the great part is we have those core values and our focus
on our three C's that ground all of us together. So even though I see things from
a different perspective, because I'm concerned about speed and rolling things out,
and making sure that we get good adoption and that people are able to use all the
technology conveniently and easily, the security, the privacy, the risk
side of that is with me all the time.
Cindy M. 18:22
Right. It's baked into the way you lead as a business manager, which I think is also
really important for students to understand that even as a business leader, you have
to have these other principles about privacy and data protection, all those things
foremost in your mind because that's considered the first line of defense. So there
is a responsibility there for that.
So let's turn to the topic of regulation here for a minute. And technology, it just develops at the speed of light. I mean it truly does. And if it isn't used responsibly all the time by companies, you may find yourself at a place where we often do, where there's regulation that's needed. But it doesn't keep up. So it creates this really weird construct where you've got technology that arguably needs to be regulated, in many instances, there is no regulation, so you have a void. And then meanwhile, the regulators are working to come up with something that makes sense. So how does one address the void? Like how do we as a system, let's say, address the void? And then two, how do we make sure that when the regulation does come, that it's meaningful and it has the intended effect and impact?
Kara Hill 19:32
We as business leaders, and you all as educators, we have a responsibility to make
sure that we're training people in how to think about business ethics. That
allows us to draw from a talent pool of people who have been educated in how to
process and think about that information. Certainly the speed of technology
change and change in general is moving at a faster pace than ever before. And you're
absolutely right. I think our governance structures in the US at the federal and state
government level move at a different pace. But we also have access to a lot more information
than we have had in the past, right? When you have the foundation of good governance
within a company, then while the problems may change and the technology may be different,
your value system, the way that you analyze a problem, or potential new product, or
new technology or really anything new that you're developing, that really stays pretty
much the same. In terms of the way you analyze it and the things you're thinking about
in terms of the impacts.
But I also think it's really important that the connection between the private sector and the public sector is very strong. I think you get the best regulation when the private sector voices and explains its views. In the absence of that, you're right, you can end up with a law or regulation that wasn't fully understood and is being implemented or interpreted in a way that doesn't meet the original purpose, or creates some other unintended issue, right? So getting back to it, I can't say enough about the importance of training our young people on business ethics and how to think about these issues and challenges we have in the world, whether it be tech ethics or really any area of ethical business management. I think that is critical.
Cindy M. 21:38
Yeah, well, I couldn't agree with you more. And so I am thrilled again, that you're
on my external advisory board, and I'm just thrilled to be doing what we're doing.
Kara, this has been a fabulous conversation about FinTech. I'm afraid we're out of
time, and we're gonna have to leave it there. But I do want to ask you one last question.
For those who want to go a bit deeper on this topic of tech ethics, maybe as it
specifically applies in the FinTech area, do you have any good resources? A good
documentary, a podcast, a book, or even just a reference site that someone can go to?
Kara Hill 22:11
I've been really watching what the World Economic Forum is doing right now. And they're
leading a project called The Responsible Use of Technology. And there's quite a bit
of work going on there. There have been some business cases with some pretty large
companies, and it's really interesting to watch that project and see what they're doing.
It's a huge group of businesses, and some educators and universities coming together
as well, to tackle these questions and these challenges. So that's something that I
think the students and anyone listening to the podcast might want to check out to
see what's going on. Like I said, there are some really interesting case studies on
there.
Cindy M. 22:57
And I think a lot of it's just available on their website. So we'll put the link to
that in the show notes for everyone so they've got access to that. That's great. That's
great. Well, Kara, this has been a really great conversation, and I just always enjoy
visiting with you. Thank you for sharing your wisdom and your knowledge and your thoughts
and your experience with the audience today. I know everyone will learn a lot.
Kara Hill 23:18
Oh, thank you so much for having me. Appreciate it.
Cindy M. 23:21
All right. Thanks, Kara. Bye bye.
Thanks for listening to today's episode of the BIS, the Business Integrity School. You can find us on YouTube, Google, SoundCloud, iTunes or wherever you find your podcasts. Be sure to subscribe and rate us and you can find us by searching TheBIS. That's one word, t h e b i s, which stands for The Business Integrity School. Tune in next time for more practical tips from a pro.