The breathless pace of AI development means data protection regulators must be ready for another scandal like Cambridge Analytica, says Wojciech Wiewiórowski, the EU’s data watchdog.
Wiewiórowski is the European Data Protection Supervisor, and he is a powerful figure. His office is responsible for holding the EU accountable for its own data protection practices, monitoring the cutting edge of technology, and helping coordinate enforcement across the union. I spoke with him about the lessons we should learn from the past decade in technology, and what Americans need to understand about the EU’s data protection philosophy. Here’s what he had to say.
What tech companies should learn: Products should have privacy features built in from the start. However, “it’s not easy to convince companies that they should embrace privacy-by-design models when they have to deliver very quickly,” he says. Cambridge Analytica remains the best lesson in what can happen when companies push the envelope on data protection, says Wiewiórowski. The firm, at the center of one of Facebook’s biggest scandals, had harvested the personal data of tens of millions of Americans from their Facebook accounts in an attempt to influence how they voted. It’s only a matter of time before we see another scandal, he adds.
What Americans need to understand about the EU’s data protection philosophy: “The European approach is related to the purpose for which you use the data. So when you change the purpose for which the data is used, and especially if you do it against the information you provide to people, you are infringing the law,” he says. Take Cambridge Analytica. The company’s biggest legal breach was not that it collected data, but that it claimed to be collecting the data for scientific purposes and then used it for something else entirely, chiefly to build political profiles of people. That is the same argument made by data protection authorities in Italy, which have temporarily banned ChatGPT there. They say OpenAI collected the data it used unlawfully and did not tell people how it intended to use it.
Does regulation hold back innovation? That’s a common claim among technologists. Wiewiórowski says the real question we should be asking is: are we really sure we want to give companies unlimited access to our personal data? “I don’t think the regulations … are really stopping innovation. They try to make it more civilized,” he says. After all, the GDPR protects not only personal data but also commerce and the free flow of data across borders.
Big Tech hell on earth? Europe is not the only place cracking down on technology. As I reported last week, the White House is mulling rules for AI accountability, and the Federal Trade Commission has even required companies to delete their algorithms and any data they collected and used illegally, as happened with Weight Watchers in 2022. Wiewiórowski says he’s glad to see President Biden calling on tech companies to take more responsibility for the safety of their products, and finds it encouraging that U.S. political thinking is converging with European efforts to prevent the risks of AI and put companies on the hook for damages. “One of the big players in the tech market once said, ‘The definition of hell is European legislation with American enforcement,’” he says.
Read more about ChatGPT
The inside story of how ChatGPT was created from the people who made it