It feels like a new privacy threat has emerged in the past few years, and this year especially. I think of the privacy threats of the past few decades as arriving in waves:
So for that third one…what do we do? Anything that’s online is fair game to be used to train the new crop of GPTs. Is this a battle that you personally care a lot about, or are you okay with GPTs being trained on stuff you’ve provided? If you do care, do you think there’s any reasonable way we can fight back? Can we poison their training data somehow?
In the digital age, protecting your personal information might seem like an impossible task. We’re here to help.
This is a community for sharing news about privacy, posting information about cool privacy tools and services, and getting advice about your privacy journey.
You can subscribe to this community from any Kbin or Lemmy instance:
Check out our website at privacyguides.org before asking your questions here. We’ve tried to answer the common questions and provide recommendations there!
Want to get involved? The website is open-source on GitHub, and your help would be appreciated!
This community is the “official” Privacy Guides community on Lemmy, which can be verified here. Other “Privacy Guides” communities on other Lemmy servers are not moderated by this team or associated with the website.
The biggest problem, to me, is what I just saw you post in another reply: these models, built on our collective knowledge, exist almost solely within proprietary ecosystems.
The Washington Post published a great piece that lets you search which websites were included in the “C4” dataset published in 2019. I searched for my personal blog, jonaharagon.com, and sure enough it was included. And C4 is practically minuscule compared to what is being compiled for larger models like ChatGPT. If my tiny website was included, Mastodon and Lemmy posts (which are actually very visible and SEO-optimized, tbh) are 100% being scraped as well; there’s no maybe about it.
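For site owners who want to at least opt out going forward, one common (if imperfect) measure is a robots.txt rule targeting known AI crawlers. A minimal sketch, assuming the publicly documented `GPTBot` (OpenAI) and `CCBot` (Common Crawl) user-agent strings; note this only affects future crawls and relies entirely on the crawler choosing to honor it:

```txt
# robots.txt — ask known AI training crawlers to skip the entire site.
# Compliance is voluntary; this does not remove already-scraped content.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

This file goes at the web root (e.g. `https://example.com/robots.txt`). It won’t stop a crawler that ignores robots.txt, but Common Crawl (the source of C4) does respect it, so it cuts off one of the largest pipelines into future training datasets.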