Apple suspends program that used recordings of customers’ sexcapades and medical secrets


Last week, we told you that Apple, like Google and Amazon, has a third-party firm listening to clips recorded by its virtual digital assistant. Apple and the other companies say this is necessary to improve the user experience of their AI-driven helpers. One whistleblower who works for the contractor employed by Apple described how private medical information is sometimes heard in these Siri snippets, and occasionally the contractors are titillated by the sounds of two (or more) people engaging in sexual activity. In such situations, Siri has been activated by mistake. The whistleblower …
