For better or worse, Tim O'Reilly has become known as something of an oracle for the technology industry in his 40-year career as a technical publisher, author and venture capitalist, credited with coining terms like "open source" and "Web 2.0".
Today, O'Reilly finds himself in the interesting position of being at once a techno-optimist – believing, for instance, that artificial intelligence (AI) could augment human workers and help solve existential problems like climate change – and a fierce critic of the new power centres technology has created, particularly in Silicon Valley.
Finding a new class of problem
"I totally think that there is a massive opportunity for us to augment humans to do things, we need the machines," O'Reilly told InfoWorld last week, from his home in Oakland, California.
"There are such enormous challenges facing our society. Inequity and inequality is a huge part of it. But for me, one of the really big ones is climate change," he says. "We have to solve this problem or we're all toast. We're going to need every bit of ingenuity to do that. I think it will become the focus of innovation."
That change in focus could also create an enormous raft of new jobs – provided the planet shifts away from fossil fuels, and away from what he describes as the "Ponzi scheme" of startup valuations.
O'Reilly stops short of pushing for the sweeping radicalism of "a new socialism", but he insists that "we have to design this system for human flourishing."
End of the golden age of the programmer
But what does that look like? How do we re-skill the workforce to focus on this new class of problems, while ensuring the spoils are spread evenly rather than concentrated in the hands of big tech companies – or of entrepreneurs like Elon Musk, whom O'Reilly admires?
Short of telling people to "learn to code", O'Reilly sees a new set of literacies being required if the workforce of the future is to take advantage of the oncoming "augmentation" that intelligent systems could enable.
"I think the golden age of the last couple of decades where you can become a programmer and you'll get a job... is sort of over," O'Reilly says. "Programming is now more like being able to read and write. You just have to be able to do it to be able to get the most out of the tools and the environments that you're presented with, whatever they are."
"Every working scientist today is a programmer," he adds. "Programming can make a journalist more successful, programming can make a marketer more successful, programming can make a salesperson more successful, programming can make an HR person more successful. Having technical literacy is on the same level as being good at reading, writing, and speaking."
No silver bullets
O'Reilly isn't blind to the trade-offs that society has made for the convenience that certain technologies bring. How does he maintain such a sunny disposition when it comes to the potential of technology in the face of growing inequality, the erosion of privacy, and the disinformation crisis that Silicon Valley has wrought?
"It's quite clear that we're now really aware of the enormous risks of these technologies, the risks for abuse," he says, adding that he doesn't believe government should be singled out to solve all of these issues.
Although O'Reilly recognises that Congress legislating to regulate facial recognition is a step in the right direction, he notes that it's not nearly comprehensive enough to truly mitigate the risks. "We're not really getting to the root of our engagement with the question of what is the governance structure for technologies that are really changing our society," he says.
Complex problems require complex solutions. Take the recent exodus of advertising revenue from Facebook, where brands such as Unilever and Ben & Jerry's have pulled their marketing dollars from the social network over its policies surrounding hate speech.
O'Reilly argues that Facebook is only doing what it is designed to do and has been thus far rewarded by the market for doing: attract as many eyeballs as possible and sell ads against that attention using algorithmic systems.
"If you understand how algorithmic systems work, you realise they are curatorial systems, they represent choices," O'Reilly says. "We need to have a completely different conversation about it. So too with facial recognition, it's on a continuum with all kinds of other technologies that take away people's privacy. On that continuum are things that people like and embrace and want, and things that they don't want."
There is no silver bullet to solve these issues, but there are some steps that could be taken to realign the priorities of technology companies with those of society at large.
"Until we build ethical principles more broadly into our company governance – which things like the B Corp movement have tried to do – we have to take this as a comprehensive problem, with comprehensive solutions," O'Reilly says.
What next for open source?
As a long-time exponent of the power of open source, where does this community fit into O'Reilly's vision for technology to help solve society's biggest problems?
"Open source is really challenged in this world, it's not going to be the same thing that it was in the PC era," he says.
Trace open source back to its roots and there has always been a plethora of opinions about what it truly means, from the Free Software Foundation's definition, to the computer scientists at UC Berkeley, to the MIT X Window System, with which O'Reilly is most closely aligned.
The central idea is that all code should be openly available to be modified and copied, with the overall aim of pushing forward the state of the art.
"If you look at where open source is really thriving it is in areas like science, where there's not that desire to make a lot of money off of this, they just want other people to be able to use this and benefit from it," he says.
"That's why, for example, very early on in the open source discussion, I was saying data is going to be the new source of lock-in, we shouldn't be so focused on source code," he adds. "If we had focused a lot more on issues of what it means when somebody controls the data, when somebody controls the algorithms which shape what data people see? That's where the open source discussion needs to be now."