In early April, AI-generated action figure images began appearing on social media, including LinkedIn and X. Each figure depicts the person who created it with uncanny precision, complete with personal accessories such as reusable coffee cups, yoga mats, and headphones.
All of this is possible thanks to OpenAI's new GPT-4o-powered image generator, which is adept at editing photos, rendering text, and more. OpenAI's ChatGPT image tool can also create pictures in the style of the Japanese animation studio Studio Ghibli, a trend that has also gone viral.
The images are fun and easy to make: all you need is a free ChatGPT account and a photo. However, to create an action figure or a Ghibli-style image, you also need to hand over a lot of data to OpenAI, which can be used to train its models.
Hidden data
The data you give away when using an AI image editor is often hidden. Every time you upload a photo to ChatGPT, you are likely handing over "a full set of metadata," says Tom Vazdar, head of cybersecurity at the Open Institute of Technology. "This includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot."
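To see what that means in practice, you can inspect a photo's embedded metadata yourself before uploading it anywhere. The snippet below is a minimal sketch, assuming the Python Pillow library is installed and using a placeholder filename (photo.jpg); it prints the EXIF tags stored in the file, including the timestamp and GPS block Vazdar refers to.

```python
# Minimal sketch: list the EXIF metadata embedded in a photo before uploading it.
# Assumes the Pillow library is installed; "photo.jpg" is a placeholder filename.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def dump_exif(path: str) -> None:
    exif = Image.open(path).getexif()
    if not exif:
        print("No EXIF metadata found.")
        return
    for tag_id, value in exif.items():
        # Translate numeric tag IDs into readable names such as DateTime or Model.
        print(f"{TAGS.get(tag_id, tag_id)}: {value}")
    # GPS coordinates live in a nested directory; 0x8825 is the standard GPSInfo tag.
    for tag_id, value in exif.get_ifd(0x8825).items():
        print(f"GPS {GPSTAGS.get(tag_id, tag_id)}: {value}")

dump_exif("photo.jpg")
```

Re-saving the pixels to a new file without copying this metadata, or using your phone's option to remove location data when sharing, strips most of it, though anything visible in the frame is still sent along with the upload.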
OpenAI also collects data about the device you use to access the platform: your device type, operating system, browser version, and unique identifiers, says Vazdar. "And because platforms like ChatGPT are conversational, there is also behavioral data, such as what you typed, what kind of images you asked for, how you interacted with the interface, and how often you did so."
It's not just your face, either. If you upload a high-resolution photo, you are giving OpenAI whatever else is in the image too: the background, other people, objects in your room, and anything readable such as documents or badges, says Camden Woolven, head of the AI product group at GRC International Group.
Vazdar says this kind of voluntarily provided data is "a gold mine for training generative models," especially multimodal models that rely on visual inputs.
OpenAI denies that it orchestrates viral image trends as a ploy to collect user data, but the company certainly gains an advantage from them. OpenAI doesn't need to scrape the web for your face if you're happily uploading it yourself, Vazdar points out. "This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from diverse age groups, ethnicities, and geographies."
OpenAI says it does not actively seek out personal information to train its models, and that it does not use public data from the internet to build profiles about people in order to advertise to them or sell their data, an OpenAI spokesperson tells WIRED. However, under OpenAI's current privacy policy, images submitted through ChatGPT can be retained and used to improve its models.
Any data, prompts, or requests you share help teach the algorithm, and personalized information helps fine-tune it further, says Jake Moore of security outfit ESET, who created his own action figure on LinkedIn to demonstrate the privacy risks of the trend.
Alien
In some markets, your photos are protected by regulation. In the UK and the EU, data protection rules, including the GDPR, offer strong protections, including the right to access or delete your data. At the same time, the use of biometric data requires explicit consent.
However, photographs become biometric data only when processed through a specific technical means that allows the unique identification of a specific individual, says Melissa Hall, a senior associate at MFMac. Processing an image to create a cartoon version of the subject in the original photo "is unlikely to meet this definition," she says.