
AI Should Not Be Seen as a “Divine Machine”

Karen Hao, author of Empire of AI, and Dr. Timnit Gebru, an AI ethics expert, warn that in the race toward AGI, the exploitation of natural resources and human labor is increasingly treated as acceptable.




The problem is not AI as a technology, but the way it has been marketed as an all-powerful entity, argue journalist Karen Hao, author of the bestseller Empire of AI, and Dr. Timnit Gebru, a computer scientist who made headlines in 2020 after being dismissed from Google and accusing the company of racism.

Lumping very different technologies together under the same label is one of the biggest problems in the current debate, says Dr. Gebru, a specialist in AI ethics. “That kind of mixing makes any serious conversation about impacts and risks much harder,” she notes.

According to Hao, technological development should start not with the product, but with the objective: “What challenges do we want to solve to improve people’s lives? Only after that should we decide which tools—including non-technological solutions—actually make sense.”

Instead, AI has been pushed into nearly every domain in the name of AGI (Artificial General Intelligence). “The implicit argument is that if we are building a ‘divine machine,’ then any cost becomes justifiable,” Hao argues—“including the exploitation of natural resources or human labor.”


Gebru adds: “What used to be considered a marginal or speculative goal is now guiding much of the sector as if it were a religion.”

If the debate around AI feels deeply polarized, Hao says that is exactly what Silicon Valley wants. “They frame the future as either wonderful or catastrophic, and then argue that a small group should be responsible for controlling it.” And that small group, of course, would be themselves.

AI companies often aim to build systems designed to serve the entire world. The problem, however, is that these models inevitably carry the values, culture, and perspectives of the people who built them—while ignoring the diversity of languages, contexts, and ways of life across the planet.

Despite the concentration of power in the industry, both Hao and Gebru see growing forms of resistance: artists suing AI companies, communities protesting against data centers, and journalists learning to cover the sector more critically. They also point to encouraging initiatives from researchers and organizations working to develop smaller technologies better aligned with the specific needs of communities.
