Monday, 7 August 2017

"Adversarial preturbations" reliably trick AIs about what kind of road-sign they're seeing

An "adversarial preturbation" is a change to a physical object that is deliberately designed to fool a machine-learning system into mistaking it for something else. (more…)


