Don't they often train the program on adult porn, and then the AI just puts a child's face onto bodies generated from that training data? I imagine these AI companies are scraping data from popular porn sites or just paying for the data, and those sites work hard not to host CP. The result is a child's face on a body too mature for it. Remember that some actual adult actresses have body proportions many would consider underdeveloped, and someone generating these pictures could regenerate until the AI uses those proportions.
The point is that you don't need CP to train an AI to make CP. I'm not justifying any moral position here, just pointing out a fact about the technology.
In this case the guy did have real images, but you don't need them. AI is intelligent in a sort of hard-to-define way; it picks up on stuff.
It picked up that people like younger individuals in pornography, so it took that to the logical extreme. AI is weird because it's intelligence without any actual thought. But it can totally generate variations on things it's already seen, and a kid is just a variation on a young adult.
Ah yes, I don't know what I'm talking about; it's just that this guy happened to have real images, like they do every time, because it's impossible to get your garbage model to produce CP otherwise.
Removed by mod
You don’t know what you’re talking about.
You genuinely don’t think CSAM is used in the training of these AI models…?
Why exactly did you feel the need to jump in and defend something like this?
Yes, AI can create tons of content it’s not trained on.