Why I’m expecting big things for Apple’s Siri

“Apple’s machine learning teams are working to transition the Siri voice assistant away from being a service into becoming a platform in its own right, working seamlessly across every Apple device,” Jonny Evans writes for Computerworld.

“When it ships, iOS 11 will see public introduction of a series of enhancements likely to extend what Siri on Apple devices can achieve,” Evans writes. “These include new machine learning APIs third-party developers will use to create intelligent and personalized apps with Siri support.”

“Now [that] the company seems to be seeing Siri as a platform, rather than a service, we should see the company begin to make more use of those insights across all its other products,” Evans writes. “What’s the end game? I don’t know for sure, but I suspect we’re looking to the creation of an intelligent, personalized, contextually-aware voice- and gesture-based user interface for use in next-generation devices. Apple’s cheeky little assistant is about to grow up.”

Read more in the full article here.

MacDailyNews Note: Here’s what Apple had to say about Siri at WWDC 2017 in June:

Siri is the most popular personal assistant in the world with more than 375 million active devices accessing Siri each month across 36 countries. Using the latest advancements in machine learning and artificial intelligence, new male and female voices are more natural and expressive, adjusting intonation, pitch, emphasis and tempo while speaking, and can translate English words and phrases into Mandarin, French, German, Italian or Spanish. Siri extends beyond voice, using on-device learning to deliver more personal experiences and offers suggestions based on personal usage of Safari, News, Mail, Messages and more. For example, as Siri learns topics or places a user is interested in while browsing Safari, they will be suggested when typing in Mail, Messages and other apps.

[Video: https://www.youtube-nocookie.com/embed/dBb_Bq8FVD4]

[Thanks to MacDailyNews Readers “Fred Mertz” and “Dan K.” for the heads up.]

12 Comments

  1. I wonder how many old posts I can find on MDN with this same storyline over the years.

    I’ll be happy if the “new Siri” in iOS 11 can just turn Notifications for specific apps on/off.

    1. IMHO Siri's concept is wrong. The idea of natural human voice input is misdirected, unfocused and unnecessary. Humans easily understand and incorporate command words as shortcuts when operating devices, such as our computers. Human language is extremely adaptable; Siri is not. We use slang, acronyms, buzzwords, tags, etc. without batting an eyelash, so why make Siri try to adapt to human atonality, nuance and continuous redefinition? Instead, let humans use basic structure and command words to direct Siri however we want to achieve something. What I hate is second-guessing whether Siri understands my voice and what I need, which will NEVER be accomplished given the progressively complex overhead Apple is attempting, due to the flawed concept set in motion by top-down, disconnected management under Cue and a hands-off Cook.

  2. Truth be told, I have issues with Siri…she's as much frustration as assistant. But add “Windows” to the WWDC quote above and the magnitude would fit: widespread use proves nothing about quality or desirability. I hope I'm surprised with the next release.

  3. In my personal experience with Siri (which is pretty vast) I have to wonder if Tim, Eddie, Phil and all the other people in Apple higher management have ever even used Siri because the rosy picture they paint is far from the actual experience of using Siri, which is a frustrating mess. Siri is a great concept but in actuality Siri sucks.

    1. Steve, AMEN BROTHER:

      I ask myself this question every time I use Siri, even as I remind myself that the last time I used her I promised I never would again. It actually seems like she is less useful now than when she first appeared.

      It is so bad that I figure I must be doing something wrong.

      1. Exactly. Here’s an example: walking into the grocery store I always try to remember to launch the app called Keyring – which holds my grocery store bonus card. So I hold down the button, when it vibrates I say ‘Launch Keyring’. Used to work like a charm. Now? Pfffft, I either get ‘you’ll need to unlock your iPhone first’, ‘I’m sorry but I don’t see anything for Lunch Catering’, or (upon unlocking) I get search results for Lunch Catering. Almost never works now, no matter how slowly and clearly I speak it.

        Here’s another – I try, while driving, to use my vehicle’s hands free feature to send a text message. ‘I’m sorry I cannot do that while you’re in the car’ (or however that is worded). Has worked for years. Now all of a sudden I get this 50% of the time.

        Personally I couldn’t possibly care any less about the voice improvements – make the f*cking thing work first then worry about how she sounds.

        Functionality THEN Form.

        1. Apple's efforts to make Siri sound better seem like a return to Siri as the fun ‘novelty’ it was when first introduced. I agree that Siri development should focus first on making it work well: keep it at least partially functional when no data connection is available, and ‘hide’ search redirections. Failing that, have the Siri AI parse the search result and produce something ‘new’ that makes it appear Siri ‘knows’ the answer, rather than “I don't know, but here's a (possibly) relevant site.”
