EyeEm photo app will use inactive users’ photos to train AI, sparking privacy concerns.
EyeEm, the Berlin-based photo-sharing community that went bankrupt and was acquired by Spanish firm Freepik last year, is now licensing its users’ photos to train AI models. Earlier this month, the company notified users by email that it was adding a new provision to its Terms & Conditions granting it the right to use uploaded content for “developing, training, and improving machine-learning models, algorithms, and software.” Users were given 30 days to opt out by deleting all of their content from EyeEm’s platform; those who did not were deemed to have consented to this use of their work.
At the time of the 2023 acquisition, EyeEm’s photo library comprised roughly 160 million images from approximately 150,000 users. The company has since begun integrating its community with Freepik’s. Though in decline, the app still sees nearly 30,000 monthly downloads, according to data from Appfigures.
EyeEm, once seen as a potential Instagram competitor, or “Europe’s Instagram,” had shrunk to a staff of three by the time of its sale to Freepik, as TechCrunch’s Ingrid Lunden previously reported. Freepik CEO Joaquin Cuenca Abela hinted at the company’s plans for EyeEm at the time, saying it would explore ways to bring more AI to the platform for its creators.
As it turns out, that meant selling their work to train AI models.
EyeEm’s updated Terms & Conditions now read:
8.1 Grant of Rights – EyeEm Community
By uploading Content to EyeEm Community, you grant us regarding your Content the non-exclusive, worldwide, transferable and sublicensable right to reproduce, distribute, publicly display, transform, adapt, make derivative works of, communicate to the public and/or promote such Content.
This specifically includes the sublicensable and transferable right to use your Content for the training, development and improvement of software, algorithms and machine learning models. In case you do not agree to this, you should not add your Content to EyeEm Community.
The rights granted in this section 8.1 regarding your Content remains valid until complete deletion from EyeEm Community and partner platforms according to section 13. You can request the deletion of your Content at any time. The conditions for this can be found in section 13.
Section 13, the company notes, lays out a complicated deletion process. It begins with deleting photos directly, which does not affect content previously shared on social media or EyeEm Magazine. To remove content from the EyeEm Market, where photographers sell their images, or from other platforms, users must email a request to [email protected] that includes the Content ID numbers of the photos they want deleted and specifies whether the content should be removed only from the EyeEm Market or from the user’s account as well.
Notably, the notice states that these deletions may take up to 180 days to take effect across the EyeEm Market and partner platforms. Yes, requested deletions can take up to 180 days, while users were given only a 30-day window to opt out. That leaves deleting photos one by one, by hand.
The company further states:
You hereby acknowledge and agree that your authorization for EyeEm to market and license your Content according to sections 8 and 10 will remain valid until the Content is deleted from EyeEm and all partner platforms within the time frame indicated above. All license agreements entered into before complete deletion and the rights of use granted thereby remain unaffected by the request for deletion or the deletion.
Section 8 details the licensing rights for AI training. In Section 10, EyeEm informs users that deleting their accounts forfeits their right to any payouts for their work — and deleting an account is exactly what users might do to keep their data out of AI models. The irony is hard to miss.
EyeEm’s move is the latest example of AI models being trained on users’ content, sometimes without their explicit consent. Although EyeEm did offer a way to opt out, any photographer who missed the announcement lost the ability to control how their photos would be used in the future. Given how far EyeEm has faded as an Instagram alternative, many photographers may not even remember they once used it. And they may well have ignored the email, if it didn’t land in a spam folder first.
Critics were unhappy that users received only 30 days’ notice and that contributions could not be deleted in bulk, which made opting out more difficult.
Requests for comment sent to EyeEm were not immediately returned; given the ticking 30-day clock, we chose to publish without waiting for a response.
In response to this conduct, some users are now weighing a move to the open social web. Pixelfed, a federated platform built on the ActivityPub protocol that also powers Mastodon, is seizing on the EyeEm situation to attract users.
“We will never use your images to train AI models,” Pixelfed declared in an official statement. “Privacy First, Permanent Pixels.”