EU consumer protection groups urge governments to investigate AI
Consumer protection groups from the European Union (EU) have urged regulators to investigate the artificial intelligence (AI) models behind popular chatbots.
According to a Tuesday report from The Associated Press, 13 watchdog groups warned their national consumer, data protection, competition and product safety authorities about the risks posed by generative AI.
The groups said regulators should investigate the AI models behind systems such as OpenAI’s ChatGPT to assess the risks and vulnerabilities they pose to consumers before the EU’s AI regulation takes effect.
In addition to EU officials, the coalition wrote to United States President Joe Biden with similar concerns about the potential harm generative AI poses to consumers.
Their call to action urged leaders to apply existing legislation and introduce new laws to address AI concerns. They cited a report from the Norwegian Consumer Council highlighting the dangers of AI chatbots, including disinformation, data harvesting and manipulation.
Related: AI has a ‘symbiotic relationship’ with blockchain: Animoca Brands CEO
The warnings come shortly after EU lawmakers passed the landmark AI Act on June 14. The bill passed in the European Parliament with 499 votes in favor, 28 against and 93 abstentions.
The law is expected to take effect within the next two to three years, following negotiations with individual EU member states over the details of the act. It will serve as a comprehensive set of rules for AI development and deployment in the EU.
In the U.S., officials are also mulling regulations targeting AI. On June 9, two new bipartisan bills were introduced addressing transparency and innovation in the industry.
Regulators in the United Kingdom have called for AI to be regulated as rigorously as medicine and nuclear power.
Magazine: AI Eye: 25K traders bet on ChatGPT’s stock picks, AI sucks at dice throws, and more