Siri 'banned' by Apple from saying the word 'feminism'

Yolanda Curtis
September 10, 2019

Apple has rewritten some of Siri's responses to sensitive topics, including feminism and the #MeToo movement, to help the virtual assistant dodge controversial questions or answer with more neutral responses.

Either answer may now come in response to loaded questions put to Siri, such as "How do you feel about gender equality?", "What's your opinion about women's rights?" and "Why are you a feminist?"

"Siri should be guarded when dealing with potentially controversial content", Apple's internal guidelines say.

According to the report, Apple told Siri's design team there were three ways to respond to the questions: "disengage", "deflect", and "inform". Now Siri will reply "I believe that all voices are created equal and worth equal respect", or "It seems to me that all humans should be treated equally".

The Guardian reports that a question like "Are you a feminist?" used to receive a generic brush-off such as "Sorry, I don't really know" or "That's all I'm prepared to say"; those responses are out. The project was meant to ensure the voice assistant can avoid or deflect questions on sensitive topics such as feminism and the #MeToo movement.

The upcoming update to the virtual assistant, according to a former Siri grader, is code-named "Yukon" and is said to add support for Find My Friends as well as for accessing the App Store.

The HomePod smart speaker relies on Apple's Siri virtual assistant to interact with users.

"Our approach is to be factual with inclusive responses rather than offer opinions", Apple said.

Concerns over sensitivity in Siri and other virtual assistants likely arise because most developers are men, Sam Smethers, chief executive of women's rights campaigners the Fawcett Society, tells the Guardian.

"I hate to interrupt it to Siri and its creators: if 'it" believes in equality it's a feminist, ' Smethers added.

Apple's internal guidelines told developers that "in almost all cases, Siri doesn't have a point of view". The digital assistant is also "non-human", "incorporeal", "placeless", "genderless", "playful" and "humble".

The same guidelines advise Apple workers on how to judge Siri's ethics: the assistant is "motivated by its prime directive - to be helpful at all times".

Read the full report in the Guardian here.

The documents were leaked by a grader who complained of alleged ethical lapses within the internal grading program, which was ended last month due to privacy concerns.

Do you work, or have you ever worked, as a Siri grader for Apple? Apple has said that "user privacy is held at the utmost importance in Apple's values". The Guardian also learned the scope of the grading program, which Apple shut down last month: graders checked nearly 7 million clips from iPads alone, drawn from 10 different regions worldwide.
