OpenAI has a big plan to fight election misinformation

If we are not careful, OpenAI’s tools could have a huge impact on the 2024 election. New safeguards are vital for protecting democracy.

OpenAI, out of necessity, has been thinking about this issue, and today it updated its policies to start addressing it.

The Wall Street Journal reported on the new policy, which was first published on OpenAI’s blog. Users and makers of ChatGPT, DALL-E, and other OpenAI products will no longer be allowed to use the tools to impersonate candidates or local government officials, to run campaigns or lobby, or to misrepresent the voting process or discourage people from voting.

OpenAI also plans to embed the Coalition for Content Provenance and Authenticity’s (C2PA) digital credentials in images generated by DALL-E “early this summer”. Microsoft, Amazon, Adobe, and Getty likewise work with C2PA to combat misinformation from AI image generation.

The digital credential system encodes images with their provenance, making it much easier to identify artificially generated images without having to hunt for strange hands or improbably stylish outfits.

OpenAI will also begin directing voting-related questions in the United States toward one of the most reliable sources on the internet for information about where and how to cast a ballot.

All of these tools are still in development, and they rely heavily on users reporting bad actors. AI changes rapidly, and it can be used to produce beautiful poetry or outright lies, so it is unclear how effective these measures will be at combating misinformation during the election. For now, your best bet is media literacy: question any news or image that seems too good to be true, and do a quick Google search if ChatGPT’s results seem utterly bizarre.

