Thursday, November 28, 2019
Apple posts job teaching Siri to listen to users' troubles
As we look toward a future of working hand in hand with machines, the future of work means hiring people to translate a robot's engineered results into something familiar to human ears. As part of this future, Apple is hiring a trainer to teach its artificially intelligent voice interface, Siri, to have serious conversations.

In a recent job listing, Apple announced that it is looking for someone to fill its Siri Software Engineer, Health and Wellness role. As part of the qualifications, this candidate needs to know both how to code and how to engage in peer counseling.

"People talk to Siri about all kinds of things, including when they're having a stressful day or have something serious on their mind. They turn to Siri in emergencies or when they want guidance on living a healthier life," the listing states. "Does improving Siri in these areas pique your interest? Come work as part of the Siri Domains team and make a difference."

Is the future of mental health humans talking to machines?

This job listing is part of a wider trend of super-intelligent assistants moving beyond the role of being just a smart appliance you can ask for directions. Now, companies are designing these assistants to go one step further and be your therapist. WoeBot, for example, is a Facebook Messenger bot that was created by Stanford psychologists and is programmed to capture your moods as you tell it about your day. Some robots are even being designed to be medical ethicists. Researchers at the Georgia Institute of Technology developed a robot, called an "intervening ethical governor," to help patients with Parkinson's disease have a neutral advocate in doctor-patient interactions.

But there are many ethical and practical minefields to overcome before our next therapist or patient advocate is a robot.
Some people are uncomfortable with the idea of a robot having more final say in decisions than a human. As one observer of a robot refereeing a patient's interactions put it, "If the robot stood there and told me to please calm down, I'd smack him."

Do you really want to tell Silicon Valley giants all your problems?

And then there's the big hurdle that these AI-powered assistants are often owned by technology giants like Facebook and Apple, which don't currently face the same legal requirements as licensed mental health workers to keep your questions about depression and stress private.

In its disclaimer, WoeBot admits that Facebook can still read the content of your messages and know exactly who you are. While a licensed medical provider is bound by the Health Insurance Portability and Accountability Act (HIPAA) to keep your medical information private, messenger bots and AI assistants are under no such obligation. So you can talk to a robot about your problems, but these companies cannot guarantee that there won't be someone else listening in.

Hiring an engineer with a psychology background is one step forward in addressing this new future of mental health in technology. The next step is hiring someone familiar with big-data privacy issues.