themachinestops to Technology@lemmy.world · English · 3 months ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
Cross-posted to: technology@hexbear.net, hackernews@lemmy.bestiver.se, pulse_of_truth@infosec.pub, technology@lemmit.online, technology@lemmy.zip, fuck_ai@lemmy.world, 404media@ibbit.at
bobzer@lemmy.zip · English · 3 months ago
Why say "sexual abuse material images," which is grammatically incorrect, instead of "sexual abuse images," which is what you mean, and shorter?
Material. Type of material: Image