Ultimately, it was something Blania said, in passing, during our interview in early March that helped us finally begin to understand Worldcoin.
“We’ll let privacy experts tear down our systems, again and again, before deploying them on a large scale,” he said, responding to a question about the privacy backlash last fall.
Blania had just shared how his company had onboarded 450,000 people to Worldcoin, which meant his orbs had scanned 450,000 pairs of eyes, faces, and bodies, and stored all that data to train its neural network. The company acknowledged that this data collection was problematic and aimed to stop doing so. Yet it did not provide these early adopters with the same privacy protections. We were perplexed by this apparent contradiction: were we the ones lacking vision, unable to see the big picture? After all, compared to the company’s stated goal of registering a billion users, maybe 450,000 is small.
But each of those 450,000 is a person, with their own hopes, lives and rights that have nothing to do with the ambitions of a Silicon Valley startup.
Speaking to Blania clarified something we had struggled to understand: how a company could talk so passionately about its privacy protocols while clearly violating the privacy of so many people. Our interview allowed us to see that, for Worldcoin, these legions of test users were, for the most part, not its intended end users. Rather, their eyes, their bodies, and their very patterns of life were simply grist for Worldcoin’s neural networks. Lower-level orb operators, meanwhile, received pennies to feed the algorithm, often grappling privately with their own moral qualms. The massive effort to teach Worldcoin’s AI to recognize who or what was human was, ironically, dehumanizing for those involved.
When we put seven pages of reporting findings and questions to Worldcoin, the company’s response was that almost everything negative we discovered was simply an “isolated incident[s]” that wouldn’t ultimately matter anyway, because the next (public) iteration would be better. “We believe that the rights to privacy and anonymity are fundamental, which is why, in the coming weeks, anyone signing up to Worldcoin will be able to do so without sharing any of their biometric data with us,” the company wrote. That almost half a million people had already served as test subjects seemed unimportant.
Rather, what really matters is the result: that Worldcoin will have an attractive user number to bolster its sales pitch as Web3’s preferred identity solution. And whenever the real monetizable products – be it the orbs, the Web3 passport, the currency itself, or all of the above – are launched for their intended users, everything will be ready, with no messy signs of the work, or the human body parts, behind it.