The risk of outsourcing our future to the private sector


By Katharina Pistor


Last September, California Governor Gavin Newsom vetoed an artificial intelligence safety bill, and the Royal Swedish Academy of Sciences awarded the Nobel Prize in Chemistry to David Baker, a professor at the University of Washington, and to Demis Hassabis and John M. Jumper, employees of Google's DeepMind subsidiary and its spin-off company Isomorphic Labs. These two episodes may seem to have little in common, but together they suggest that outsourcing humanity's future to profit-maximizing private corporations is being treated as something to celebrate.

While the California bill was not perfect, it represented the first substantial effort to hold developers accountable for the harms their artificial intelligence (AI) models could cause. Moreover, it focused not on every conceivable risk but on “critical harms,” such as the development of weapons of mass destruction or damage amounting to at least $500 million. The technology industry, including Google, lobbied fiercely against the bill, invoking a very old argument. As the Financial Times noted in an editorial, the new rules could “slow down the emergence of a type of innovation that would help diagnose diseases, accelerate scientific research and boost productivity.” Once again, such opportunity costs are deemed more damaging than any harm AI could cause to people's ability to control their own destiny, or even to live peacefully in their societies.

The 2024 prize marks the first time a Nobel in a natural science has been awarded to employees of a multinational corporation. All previous winners were, or had been, university professors or researchers at government-funded institutes who published their results in peer-reviewed journals and made their findings available to the world.
Whether or not it was the Swedish Academy's intention, its decision to include the Google researchers legitimizes the privatization of science, which thereby ceases to be part of humanity's common goods. Like many resources before it, AI science is being locked in a walled garden accessible only to those who pay for admission.

True, AlphaFold2, the AI model that won Hassabis and Jumper the award, is available to the public along with its source code. According to AlphaFold.com, “Google's DeepMind and the EMBL European Bioinformatics Institute (EMBL-EBI) have partnered to create AlphaFold DB so that the scientific community can access these predictions for free.” Then again, DeepMind holds multiple patents on AlphaFold. Under the logic of property rights, the company, not the public, will always have the final say over how the technology is used. The AlphaFold website is a “.com,” denoting something fundamentally different from, say, the Human Genome Project, with its “.gov” URL.

In the world of information technology, “free” is never free; payments are made in data, not dollars. The data that allow AlphaFold to predict the three-dimensional structure of a protein come from the public domain. DeepMind's partner in developing AlphaFold is an intergovernmental research organization funded by more than 20 European Union member states. As Jumper himself put it, “public data was essential to the development of AlphaFold.” Without the data compiled and curated by scientists supported by taxpayer money, AlphaFold would not exist.

Despite the foresight of the public officials who created this gigantic database, governments are routinely disparaged as lacking the knowledge, capabilities, resources, and foresight needed to foster innovation and advance scientific and economic progress. We are constantly told that only the private sector, with its powerful monetary incentives, can do what is necessary to move the world forward.
In reality, the private sector typically builds on the work of scientists at publicly funded universities and public research institutes. Governments, not Elon Musk, launched the first satellites; the US military developed the internet before it was commercialized; and pharmaceutical companies rarely invest in basic research. Why bother, when you can wait for scientists funded by the US National Institutes of Health or similar bodies to advance a field to the point where profitable investments can be made?

That is the logic of profit-driven corporations. Their goal is financial returns (the bigger, the better), not human progress. Once in the game, they try to monopolize scientific knowledge by securing patents or hiding their findings behind the barriers provided by trade-secret law. Without the state's help, they would have neither the basic science nor the legal protections for the monopolies that yield their huge profits, which they then hold up as proof of their superiority over government.

It is not hard to understand why private companies enjoy this game. The mystery is why governments willingly play along, handing over years of publicly funded research without ensuring that the public has a say in how it is used. The California legislation would have required AI models to include a full-shutdown capability in case things went wrong, but that stipulation died with the rest of the bill.

There is nothing new in the argument that if we do not know enough about future harms, we should refrain from interfering in “private” markets, which supposedly always work better without government “interference.” Oil and gas companies relied on the same argument when they denied the risks of, and their own contribution to, climate change, even as their own research told them otherwise. Yet here we are again.
Apparently, we should put our future in the hands of private corporations whose only goal is to maximize shareholder value. What could go wrong?

Katharina Pistor, Professor of Comparative Law at Columbia Law School, is the author of The Code of Capital: How the Law Creates Wealth and Inequality (Princeton University Press, 2019).
© Project Syndicate 1995–2024