
Why Salesforce Aims to Build Products That Are ‘Ethical by Design’

Evolving privacy rules, ever-shifting terms of service, the emergence of Web 3.0, and even AI-turbocharged chatbots writing college essays all raise the same question: How can we advance game-changing innovation while protecting people from potential harm? 

Salesforce has long been focused on this topic. We recently met with Paula Goldman, Salesforce’s Chief Ethical and Humane Use Officer, to learn how Salesforce is tackling tech ethics both inside its walls and with its customers, and what risks — and opportunities — are emerging in today’s digital economy.

Q: What does your team at Salesforce look like and what does it do?

In 2018, Salesforce established the Office of Ethical and Humane Use to address questions and concerns around the use of technology. Today, this team guides the responsible design, development, and use of Salesforce technologies. 

A huge focus of our office is Ethics by Design — what we call product ethics, which includes AI ethics. We aim to minimize the harms and maximize the benefits of our technology by partnering with product teams to mitigate unintended consequences.

I also have a policy team that looks at how customers use our products — they think about product responsibility, specifically — and a team that works on product accessibility and inclusive design. They make sure our products are accessible to people with disabilities, both as an equality issue and as a source of innovation. 

Q: How do you approach this work?

We don’t have all the answers at Salesforce, but we know we need to ask the questions, and we know we need to work with our communities to find the answers.

PAULA GOLDMAN, Chief Ethical and Humane Use Officer, Salesforce

For example, we have an ethical use advisory board that has folks from academia, from civil society, and from Salesforce – notably, both frontline employees and executives. We ask them: What does responsible technology look like? What areas should we go into (or not)? What guardrails should we be setting for our technologies? 

We try to come up with the answers together. We know we’re not going to get it right all of the time. But we hope that by engaging our communities in these discussions, when we don’t get it right, we’ll be able to pivot more quickly, and have a little grace in learning and then reapplying those lessons as we move forward.

Q: How did this thinking come about? How did your background prepare you for such a unique role?

I was born in Singapore, lived in Indonesia as a kid, and have also lived in East and West Africa, South Asia, and Eastern Europe, which gives me a global perspective. I also have a doctorate in anthropology. Part of earning that degree was studying how ideas become mainstream and how movements grow. These experiences helped instill in me the belief that tech can be an amazing force for good.

But I’ve also seen that there is an underbelly in tech – a set of risks that need to be addressed. Addressing these risks is not just about what you do with product teams, it’s about changing norms within a company, an industry, and our society. Every customer has a different circumstance when it comes to ethical use of technology. So how do we create and use technologies in a way that respects people and empowers consumers? 

Salesforce is laser-focused on finding those answers. That gives me a lot of energy.

Q: How do you work with product teams to ensure Salesforce technology is used and designed ethically?

Trust must be at the center of every digital transformation and the technology that powers it. One recent example is Data Cloud. It’s this massively powerful tool that helps companies unify and leverage their data in real time.

But to truly realize the value of this innovation, consumers need to know that their data is being protected. That’s why we’ve been working with the Genie team over the last two years to develop this incredible tech with privacy and data ethics at the forefront.

Our team advised on guardrails and safeguards to protect people’s privacy, developing checklists, resources, and features that help customers manage their data ethically. Our platform, for example, guides customers to consider segmenting their audiences by behavior and interest, instead of gender or race.

These ethical practices not only protect people’s privacy, they actually help companies realize the value of their data and unlock more savings – both critically important in today’s economy.

PAULA GOLDMAN, Chief Ethical and Humane Use Officer, Salesforce

A recent study from Valoir found that ethical data practices can help companies hold onto more first-party data and free up 10% of marketers’ time and 40-50 hours of engineering time every couple of weeks.

Q: With changing regulations and policy on top of a very rapidly evolving tech space, what are your big areas of concern or focus?

I’m encouraged by discussion here in the United States about a national privacy bill, as well as the AI regulation that’s being developed in Europe. I serve on the National AI Advisory Committee, which is advising on these topics, and while there are a lot of important details that still need to be worked out, it’s great to be moving toward a set of guardrails that everyone can follow.

This will ultimately make acting responsibly – in the tech industry and beyond – much easier.

Q: How does Salesforce influence humane and ethical use of technology outside of its own four walls?

A huge focus for our office right now is how we can more effectively partner with our customers. 

One way we build that trust with our customers is by designing a lot of intentional defaults into our technology. These defaults help customers and their end users prioritize inclusion, data ethics, and privacy from the start. We then work with customers to help them customize our technology in ethical and responsible ways – helping them understand the impact of the choices they’re making.

By intentionally building ethical guardrails and product defaults into our technology, we’re pushing our industry forward in a responsible way and creating products that people can trust. 

This interview has been edited for clarity and length.
