While strategists argue over the ethical dimensions of artificial intelligence, investors have a role to play in steering the technology in a positive direction as they take innovations from ‘lightbulb moment’ to IPO or acquisition.
But industry veteran Tom Siebel warns that, when it comes to AI, there will be unintended consequences from sincere intentions. More on that later.
So, what are the key issues for investors, now that AI’s buzz has reached the ears of an engaged public? How can backers invest with care? Chairing a panel at the World Economic Forum, Ina Fried, Chief Technology Correspondent for Axios, put it neatly:
Investors can play a really big role in saying, ‘This is the kind of thing I would put my money in, and this is the kind of thing I wouldn’t.’
Among those joining her on the panel were: CRM prime mover Siebel, now Chair and CEO of enterprise provider C3 AI; Hanzade Dogan, Chairwoman of Turkish e-commerce giant Hepsiburada; Jim Breyer, founder and CEO of investment firm Breyer Capital; and Lauren Woodman, CEO of AI non-profit DataKind, which is committed to advancing the technology for social good.
Woodman said that good investors can help solve some thorny problems. She said:
The challenge is, how do we make sure we don't leave behind organizations that have huge amounts of data that could advance and address some of the societal challenges we're facing?
And, how do we make sure that, as AI is developed, we're not losing sight of the fact that, while there are very good commercial applications, there are also very good applications in the worlds of trying to achieve sustainable development goals or addressing poverty.
Non-profits, by and large, don't create software tools, it's really not our strength. So, we have to rely on investors, plus the tech sector and regulators, to think through what the implications of all this are. There are lots of issues to resolve and we must make sure that we don’t lose sight of the good we can do.
Siebel focused on investors’ payback, saying:
Rough numbers. When I went to work, the information technology business globally was about $200 billion. Today, I think it serves $7 trillion. When we started building enterprise application software in the 80s, that turned out to be a pretty good idea. That's about a $600 billion market today.
But it is predicted that the enterprise segment of the AI market will also be a $600 billion market in just a few years. So, it looks like this is the fastest-growing market opportunity that I will have seen in my professional career.
Siebel was speaking a couple of days before a class action lawsuit against C3 AI was filed by shareholders, alleging misleading statements prior to IPO. More from Siebel later.
As the “first and only NASDAQ-listed company out of Turkey”, Hepsiburada comes from a very different culture to the ‘big bread and circuses’ of Silicon Valley enterprise software. Dogan said:
From day one, we said we will use our technology power not to destroy and disrupt and get rid of industries, but instead to be a catalyst, an enabler to industries, including retail and banking, to lead their digital transformation.
Other companies share that agenda, of course. But for Dogan, personal and societal transformation were equally high on the agenda. She said:
Five years ago, the woman merchants sharing our GMV [gross merchandise value] were less than one percent. So, I said, ‘This has to change’ and we started a big programme called Take Power to Woman [sic], and we give incentives, we train. But that wasn't enough.
So, we said ‘OK, we're going to hack our algorithms. And we will positively discriminate towards woman merchants, so that our buy-box algorithm doesn't stay biased’. Because the data set is biased. It's always man merchants getting the buy box. If we don't interfere, AI amplifies that.
Today, [women merchants] are eight percent of our GMV, so it's a big jump. Companies like ours who are not Big Tech, but who have data and who use technologies like machine learning and AI, can make a difference. As long as we make sure the model stays within our values.
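Dogan didn’t disclose how Hepsiburada’s intervention actually works, but the idea she describes – adjusting a biased buy-box ranking so underrepresented merchants aren’t perpetually shut out – can be sketched in a few lines. Everything below (the names, the scoring, the boost factor) is a hypothetical illustration, not the company’s real algorithm:

```python
# Hypothetical sketch of debiasing a buy-box ranking.
# All names and weights are illustrative assumptions, not
# Hepsiburada's actual (undisclosed) algorithm.
from dataclasses import dataclass

@dataclass
class Offer:
    merchant: str
    base_score: float       # e.g. a price/delivery/rating composite
    underrepresented: bool  # group the platform chooses to support

def buy_box_winner(offers: list[Offer], boost: float = 1.1) -> Offer:
    """Pick the buy-box winner, applying a modest multiplicative
    boost to offers from underrepresented merchants, so a base
    score trained on biased historical data doesn't always win."""
    def adjusted(offer: Offer) -> float:
        return offer.base_score * (boost if offer.underrepresented else 1.0)
    return max(offers, key=adjusted)

offers = [
    Offer("incumbent", base_score=0.90, underrepresented=False),
    Offer("newcomer", base_score=0.85, underrepresented=True),
]
# With the boost, 0.85 * 1.1 = 0.935 beats 0.90, so "newcomer" wins.
print(buy_box_winner(offers).merchant)
```

The key design point Dogan raises is that without such an interference term, a model learning from historical sales data simply reproduces the existing imbalance.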
An impressive follow-through on principles. For serial investor Breyer, meanwhile, the impetus to invest was personal in a different way: wanting to help fight cancer.
George Daley, current Dean of Harvard Medical School [said some years ago] that, without AI, there is absolutely no chance that ten years from now, the best doctors, the not-as-good doctors, the nurses, the practitioners, will be able to do the job they want to do.
The investor shared the story of a friend who had a “terrible experience” of treatment for prostate cancer, in terms of wait times and “data going back and forth”. So, Breyer made the first of what would be over a dozen investments at the meeting point of AI and medicine. In every case, “exceptional data” is critical to successful funding and ventures, he explained:
Most of the unique healthcare data is not in the insurance companies, it's in our best hospitals – mostly research hospitals – and in the medical schools. So, I've been on a mission for certain forms of cancer, such as prostate and breast, to do whatever I can to eradicate it with these tools and companies.
Playing catch up
Fantastic news. But should nations have to rely on philanthropic capitalists to advance technology – and society – towards collective good? Ideally no. But the problem is that, in some VC offices, more funding and sector expertise are now available than most governments could muster. Breyer said:
The job of a venture capitalist is pretty simple. We try to have the ability to look around corners over the next many years and identify extraordinarily large markets. And constantly be meeting many of the best individuals, founders and co-founders, in the world.
But one of the biggest challenges in these AI start-ups is, from the ground up, building the ethical framework. Don't try to tack it on four or five years later!
But at what point does society say, ‘That's not the role of a private investor, we need another institution to step in?’ DataKind’s Woodman responded:
I don't think we get to wash our hands just because we aren't primarily responsible.
One of the things I have been thinking about is that non-profits can't see around the corner. That's not the role we have – we see around different corners, perhaps. So, we rely on investors to think about that on some level.
It’s not okay to come back later and say, ‘We built this and it's 75% right. So, let's tweak around the margins’. We can't do that. And frankly, things move so quickly these days, there's already such a gap between the sectors. You will never have the social sector being able to catch up if responsible investors aren't an active part of the conversation.
But should they be shouldering so much of that burden? Woodman added:
Absolutely not. We all have a role to play in this. It's like any other ecosystem. […] We all have to be good actors in society. But investors are the first line of defence in saying, ‘How do we construct companies, or construct technologies so that, as they evolve, they will have started off in the right direction?’
Fair enough, but we should acknowledge that some investors may be more interested in turning a quick profit than in making society fairer and more equitable.
Hepsiburada’s Dogan stressed her idealism about AI:
I’m a tech enthusiast and an optimist. I do believe AI can decrease the marginal cost of services to zero and make the quality better, so we can have more of every service. So, everyone gets access to legal advice, everyone gets access to diagnostic health, and more, at a lower cost.
An admirable goal. But there’s a problem: much of that cost is employees’ wages. So, the flipside of using AI to slash marginal costs may be mass unemployment.
That said, AI will also create jobs, services, and new opportunities – just as mobile and other technologies did before it. A 2018 WEF report estimated a net gain to the global economy of nearly 60 million jobs from Industry 4.0 technology – though much has happened since then, of course.
But if we get it wrong, it can be the dystopia of our world. So, it's that serious what we are facing. In this context, we have to expect more from investors. We have to expect more from corporate leaders, more from civil society, and more from the consumer.
It can't just be, ‘investors shouldn't invest in companies that don't have the right internal control mechanisms to serve their values’. A consumer should also be more informed, more alert, and more demanding about what's happening with their data.
And corporate leaders have to be more socially aware too. This era of obsession with profit maximization, of shareholder maximization [being the only motive]. You know, it could be publicly owned algorithms monitoring private algorithms. It can be quality ownership, social equity ownership of Big Tech companies. It can be a change in our social contract.
“I’m not going with you”
Indeed. So, did a corporate leader, Siebel, have any insights on this? He certainly did, appearing to warn that ethical investment may have unintended effects that run counter to its socially minded aims. He said:
This topic of ethical AI is very troubling.
I believe the largest commercial application will be precision medicine, hard stop. We have the capability today to aggregate the genome sequences and medical care records of the population of, say, the United States into a unified, federated image. We can build machine learning models that are enormously efficacious.
So, yes, we will be able to serve historically underserved constituencies. It is within our grasp today for a population the size of France, the UK, or the United States, to not only engage in early detection, but also disease prediction. And this is huge, right? This is all motherhood and apple pie. We will deliver lower-cost, more efficacious healthcare into a healthier community. What could possibly go wrong?
Well, the idea that […] these people who control this data are gonna act beneficially, get over that: see Facebook for details.
If you think that they're not going to use this data to ration healthcare, get over it. Because they are. They will in the UK, they will in China, and they will in the United States. They're gonna decide at that time you’re too old for this procedure, it's not in the best interests of the country, so get to the back of the line.
To revolutionize the NHS, where we have queues of seven million people waiting for elective surgery, this is troubling. Because with issues related to priority, anytime we have the intersection of AI and sociology, it goes real bad real fast.
Siebel shared a conversation he had with a US Army leader, who he advised not to touch a planned AI system. He said:
This was gonna use AI to decide who to promote. And I said, We could solve this problem, but we're not going to touch it. And my recommendation is you don't touch it either.
Because the problem is, due to the bias in the data, no matter what the question is, the answer is going to be ‘white male who went to West Point’. And in 2023 that is not going to fly. And then you gotta read about yourselves on the front page of The New York Times, and then we get dragged before Congress to testify. And I'm not going with you.
So, the ethical issues are very troubling and we're looking to regulators to bail us out. And when it comes to regulators, the only thing we have worse than the United States might be the EU: you know, the solution is worse than the problem.
Then he added:
This is AI’s core problem: historic data from flawed human society, automated by new technology and given a false veneer of trust.
Silicon Valley itself has done a poor job of promoting women and ethnic minorities: tech remains a white-male-dominated industry. But, as Dogan and Woodman both noted, innovators and investors can help fix that through active, principled engagement.
Breyer shared the example of US-based marketplace Etsy, in which he invested, where the majority of buyers and sellers are women (the current figure is thought to be about 80% of merchants). He mentioned this to Amazon’s Jeff Bezos, who had tried to buy the company.
Bezos reportedly said:
That’s a jewel. Whatever you do, don’t change it.