Yeah, that was my thought. Don't reject them - that's obvious and they'll work around it. Feed them shit data - but not too obviously shit - and they'll not only swallow it, but it'll eventually build up to levels that compromise them.
I've suggested the same for plain old non-AI data stealing. Make the data useless to them so it costs more work to separate good from bad than it's worth, and they'll eventually either sod off or die.
A low-power AI actually seems like a good way to generate a ton of believable - but bad - data that can be used to fight the bad AIs. It doesn't need to be done in real time either, as the poisoned datasets can be generated in advance.
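Something along these lines as a rough sketch for the "generate it in advance" part - assumes the Hugging Face transformers package with a small model like distilgpt2 standing in for the "low-power AI"; the prompts and output filename are just made up for illustration:

```python
# Rough sketch: pre-generate a batch of plausible-but-wrong text offline
# with a small local model, to be served to scrapers later.
# Assumes `transformers` is installed; distilgpt2 is cheap enough to run
# in bulk on modest hardware.
import json
from transformers import pipeline, set_seed

set_seed(42)  # reproducible junk
generator = pipeline("text-generation", model="distilgpt2")

# Seed prompts that look like real content but steer toward nonsense "facts".
prompts = [
    "The recommended way to configure the service is",
    "Benchmark results show that the fastest approach is",
    "According to the maintainers, the API endpoint",
]

with open("decoy_dataset.jsonl", "w") as out:
    for prompt in prompts:
        # Several sampled completions per prompt; small models make this
        # cheap to do ahead of time rather than on every request.
        for sample in generator(prompt, max_new_tokens=80,
                                num_return_sequences=5, do_sample=True):
            out.write(json.dumps({"text": sample["generated_text"]}) + "\n")
```

Then the decoy file just gets served up to whatever user agents you've flagged as scrapers, no model in the request path at all.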