People are using Google's new AI model to remove watermarks from images


Users on social media have discovered a controversial use case for Google's new Gemini AI model: removing watermarks from images, including images published by Getty Images and other well-known media outlets.

Last week, Google expanded access to the image generation feature of its Gemini 2.0 Flash model, which allows the model to generate and edit image content. It's a powerful capability, by all accounts. But it also appears to have few guardrails. Gemini 2.0 Flash will readily create images depicting celebrities and copyrighted characters, and, as mentioned above, remove watermarks from existing photos.

As several X and Reddit users have noted, Gemini 2.0 Flash will not only remove watermarks but also attempt to fill in the gaps left by a watermark's removal. Other AI-powered tools do this as well, but Gemini 2.0 Flash appears to be exceptionally skilled at it, and it's free to use.

To be clear, Gemini 2.0 Flash's image generation feature is labeled "experimental" and "not for production use" at the moment, and it is only available in Google's developer-facing tools such as AI Studio. The model isn't a perfect watermark remover, either. Gemini 2.0 Flash seems to struggle with certain semi-transparent watermarks and with watermarks that cover prominent parts of an image.
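For readers curious how the experimental feature is accessed programmatically, the sketch below uses the google-genai Python SDK to request image output from Gemini 2.0 Flash for a benign edit. The model ID, API key placeholder, prompt, and file names are illustrative assumptions and may differ from what Google currently exposes; this is not a definitive implementation.

```python
# pip install google-genai pillow
# A minimal sketch of image editing with Gemini 2.0 Flash, assuming the
# experimental model id "gemini-2.0-flash-exp" and an API key from AI Studio.
from io import BytesIO

from google import genai
from google.genai import types
from PIL import Image

client = genai.Client(api_key="YOUR_API_KEY")  # placeholder key

response = client.models.generate_content(
    model="gemini-2.0-flash-exp",  # illustrative model id; may have changed
    contents=[
        "Add a small red boat to this lake scene.",  # example edit prompt
        Image.open("lake.png"),                      # example input image
    ],
    config=types.GenerateContentConfig(
        response_modalities=["TEXT", "IMAGE"],  # ask for image output
    ),
)

# The response interleaves text and image parts; save any returned image.
for part in response.candidates[0].content.parts:
    if part.inline_data is not None:
        Image.open(BytesIO(part.inline_data.data)).save("edited.png")
    elif part.text is not None:
        print(part.text)
```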

Still, some copyright holders will surely take issue with Gemini 2.0 Flash's lack of usage restrictions. Models such as Anthropic's Claude 3.7 Sonnet and OpenAI's GPT-4o explicitly refuse to remove watermarks; Claude calls removing a watermark from an image "unethical and potentially illegal."

Removing a watermark without the consent of the original owner is considered illegal under U.S. copyright law (according to law firms like this one), outside of rare exceptions.

Google did not immediately respond to a request for comment sent outside normal business hours.

