Alb_x_008@lemm.ee to Technology@lemmy.world · English · 8 months ago

**ChatGPT provides false information about people, and OpenAI can’t correct it** (noyb.eu)
NeoNachtwaechter@lemmy.world · 8 months ago

> LLMs don’t actually store any of their training data

Data protection law covers all kinds of data processing. For example, input is processing, and output is processing, too (Article 4 of the GDPR).

If you really want to rely on excuses, you would need wayyy better ones.
vithigar@lemmy.ca · 8 months ago

Right, so keep personal data out of the training set and use it only in the easily readable and editable context. It’ll still “hallucinate” details about people if you ask it for details about people, but those people are fictitious.
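A minimal sketch of the setup vithigar is describing: personal data lives only in an editable store and is injected into the prompt context at query time, never baked into model weights. Everything here (the `personal_records` store, the `build_prompt` helper, the record fields) is hypothetical and not from the thread; the actual model call is left out.

```python
# Hypothetical editable store of personal data. Correcting or deleting a record
# directly changes what the model is shown, unlike data frozen into weights.
personal_records = {
    "jane.doe": {"name": "Jane Doe", "date_of_birth": "1970-01-01"},
}

def build_prompt(person_id: str, question: str) -> str:
    """Compose a prompt that contains only the stored, correctable record."""
    record = personal_records.get(person_id)
    if record is None:
        # No record on file: the model receives no personal data at all.
        return f"Answer without using any personal data: {question}"
    facts = "; ".join(f"{key}: {value}" for key, value in record.items())
    return f"Use only these verified facts ({facts}) to answer: {question}"

# Rectification becomes an ordinary database update...
personal_records["jane.doe"]["date_of_birth"] = "1971-02-03"
print(build_prompt("jane.doe", "When was Jane Doe born?"))

# ...and erasure is a delete, after which the model sees nothing about the person.
del personal_records["jane.doe"]
print(build_prompt("jane.doe", "When was Jane Doe born?"))
```

The model can still invent details when asked to speculate, but any real personal data it is given comes from a store that can be read, corrected, or erased on request.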