8 Features Apple Needs to Copy from Google Assistant for Siri in iOS 12

By Khamosh Pathak

Published 12 May 2018

Google Assistant is now two years old, and at Google I/O we just saw how advanced its AI and natural-speech abilities are. That phone-call demo (Google Duplex) is interesting in many ways, from an ethics point of view as well as a technology one. It also goes to show just how far ahead Google Assistant is of Siri and Alexa.

Apple and Google have a rich history of give and take (the latest example is Android P integrating iPhone X-like gestures). So here’s a list of features we’d like to see Apple copy from Google Assistant.

1. Keyboard Support


Currently, Type to Siri is an accessibility option, and when you turn it on, you can't speak to Siri at all. There's a simple solution, Apple, and Google has already built it: put a little keyboard icon in the bottom-left corner (or even the bottom-right, we're not picky about the button's position) and let users type to Siri when they can't speak.

There are many situations where you can't speak out loud to your personal assistant but still need her help to set a calendar appointment, create a reminder, or message someone really quickly. In times like these, a quick Type to Siri option would be a huge help. And while we're on the topic, this feature would be at least 10 times more useful on the Mac.

Read more: How to Type to Siri on iPhone and iPad in iOS 11

2. Multiple Timers


If you're cooking, multiple timers are a godsend. Google Assistant does this really well: you can set multiple timers, name them, see them all on one card, and a lot more. Siri does none of that. This is especially glaring now that the HomePod is out in the wild. We hope iOS 12 brings this functionality to all devices; it would be really useful on the iPhone and Apple Watch.

Read more: Top 10 Google Assistant Tips and Tricks for iPhone

3. Extensive Extension Ecosystem

Google Assistant has an amazing ecosystem of third-party apps and services, and it's quite easy to develop for Google's platform as well. Users can even pick the default app for playing music via Google Assistant.

Siri, in comparison, is very limited. It only supports half a dozen categories of apps (things like messaging, finance, travel, and so on). And even there, developers don't have much freedom when it comes to functionality. SiriKit does most of the heavy lifting of understanding and interpreting commands (and as we know, this is not Apple's strong suit).

Apple needs to realize that Siri should be a cloud-based system. Developers should be able to write extensions for Siri that live in iCloud and work across all devices that support Siri, and users should be able to interact with those apps and actions easily.
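For context, here's roughly what a third-party Siri integration looks like today: a minimal, hypothetical sketch of a SiriKit Intents extension in the messaging domain (the class and activity names below are ours, not Apple's).

```swift
import Intents

// A minimal sketch of a SiriKit Intents extension in the messaging
// domain, one of the few categories open to third parties. The class
// and activity names are hypothetical.
class IntentHandler: INExtension, INSendMessageIntentHandling {

    // By the time this runs, Siri has already parsed the user's speech;
    // the app only ever receives a structured INSendMessageIntent.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the parsed message off to the app for actual sending,
        // then tell Siri the request succeeded.
        let activity = NSUserActivity(activityType: "com.example.sendMessage")
        completion(INSendMessageIntentResponse(code: .success,
                                               userActivity: activity))
    }
}
```

The key limitation is visible right in the sketch: the developer never sees the user's words. Siri does all the parsing and only hands over one of Apple's predefined intents, which is exactly why extensions are boxed into those half-dozen domains.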

4. Same Siri On All Platforms

Right now, Siri on the iPhone, Apple Watch, and HomePod is drastically different from one device to the next; Google Assistant on Android and Google Home is much more consistent. Siri on the HomePod is severely limited, and Siri on the Mac can't control your HomeKit devices.

Apple needs to unify Siri's capabilities. And once again, Apple could solve this problem by tying Siri's functionality to each user's iCloud account instead of to individual devices.

5. Object Recognition in Camera

It's amazing what Google is pulling off with Google Lens, which is backed by Google Assistant. Google Lens is integrated into the camera app and the Photos app, and it can recognize objects, products, and text.

It acts like a heads-up display, and soon it will even show you information about the things you're looking at. Point the camera at a busy town square and Google Lens will overlay information about nearby shops and points of interest right on the camera view.

Apple already does this for QR codes in the Camera app, so there's a precedent for something like Google Lens functionality. Apple could integrate data from Apple Maps right inside the Camera app. Apple also has object recognition in the new Photos app, and we know Apple is getting serious about ARKit as well. This feature would serve as a nice midpoint between the iPhone and Apple's AR/VR goggles, if those come out in the next 5 years.
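The building blocks are already shipping, in fact. Since iOS 11, any app can run a Core ML image classifier through the Vision framework. Here's a minimal sketch, assuming a classifier model is bundled with the app (we're using Apple's downloadable MobileNet model as a stand-in):

```swift
import UIKit
import Vision

// A minimal sketch of on-device object recognition with the Vision
// framework (iOS 11+). Assumes a Core ML classifier is bundled with
// the app; MobileNet here is a stand-in for any classifier model.
func classify(image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNet().model)
    else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // The first result is the model's best guess at what it saw.
        guard let best = request.results?.first as? VNClassificationObservation
        else { return }
        print("Saw: \(best.identifier) (\(Int(best.confidence * 100))%)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

What's missing isn't the recognition itself; it's wiring results like these into the live Camera app with Maps data on top, the way Lens does.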

6. Siri Intelligence Across iOS and Apple Apps

Google Assistant is everywhere on Android, from the Photos app to Google Lens to Google Maps.

We saw a glimpse of something similar in iOS 11: Siri is now integrated into the Safari app and the Apple News app. But its functionality is pretty limited. So limited, in fact, that in almost a year of using iOS 11 (from the first beta), I have yet to see a single Siri suggestion.

Apple needs to get serious about Siri intelligence, and it needs to be sprinkled across all of Apple's stock apps like Calendar, Reminders, Notes, Safari, and more. And later, we should see integration with third-party apps as well.

Siri could even be one of the features that saves us from using our phones too much.
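For what it's worth, the plumbing for this already exists in a modest form: an app can donate an NSUserActivity whenever the user views something, and the system can resurface it in Spotlight or hand it off to another device. A minimal sketch, with hypothetical names:

```swift
import UIKit

// A minimal sketch of how apps feed the system's intelligence today:
// donating an NSUserActivity when the user views something, so iOS
// can resurface it later. All names here are hypothetical.
class RecipeViewController: UIViewController {

    func showRecipe(named name: String) {
        let activity = NSUserActivity(activityType: "com.example.viewRecipe")
        activity.title = name
        activity.isEligibleForSearch = true   // surfaces in Spotlight
        activity.isEligibleForHandoff = true  // continues on other devices

        // Attaching the activity to the view controller and marking it
        // current is what the system actually learns from.
        userActivity = activity
        userActivity?.becomeCurrent()
    }
}
```

What we're asking for is Siri actually doing something smart with all these donations, instead of letting them sit unused.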

7. Detach Siri Updates from iOS Updates

iOS updates roll out once a year, and a point update only comes out every three months or so. Siri should work like a cloud feature and be updated every week. That sounds crazy, but it's doable: Amazon does exactly this with Alexa. Every Friday, Echo users get an email about new Alexa functionality. Not every week has a big banner feature, but when you look at it over a couple of months, it really adds up.

Apple is a hardware company and is obsessed with not releasing features until they're absolutely ready. Recently, this has resulted in the delay of Messages in iCloud and AirPlay 2. This is where Apple needs to learn from Google. Google doesn't shy away from releasing unfinished products. This can't work for everything all the time, but a software product can be helped immensely by quick user feedback and iteration.

8. The Hard One: Siri Understands All Kinds of Speech

This is one of the hardest things on the list, but we want Apple to copy how good Google Assistant is at understanding speech, no matter what your accent is. Google Assistant now even supports languages like Hindi. But Siri should start by nailing English speech recognition across the hundreds of accents that are out there.

This is a simple thing to describe but incredibly hard to do. Google is so good at this because it has the best big-data and data-analysis technology (which comes from Google's quest to analyze all your personal information). Apple, on the other hand, collects little personal data, which means it doesn't have real-life data to train its speech recognition system. Apple is starting to collect more data with its differential privacy feature, which adds statistical noise to data so individual users can't be identified.
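To give a flavor of how that works: Apple's real pipeline reportedly uses fancier machinery (count-mean-sketch and Hadamard transforms), but the core idea of local differential privacy can be shown with the classic randomized-response trick. This is a toy sketch, not Apple's algorithm:

```swift
import Foundation

// A toy sketch of the idea behind local differential privacy.
// This is NOT Apple's actual algorithm; it's the classic
// "randomized response" trick that captures the core idea: each
// device randomizes its answer before it ever leaves the phone,
// so no single report can be trusted, yet aggregate counts can
// still be estimated.
func randomizedResponse(truth: Bool) -> Bool {
    // Flip a coin: heads, report the truth;
    // tails, report a second, independent coin flip instead.
    if Bool.random() {
        return truth
    }
    return Bool.random()
}

// If q is the observed fraction of "true" reports across many users,
// the server can estimate the true rate p as p ≈ 2q - 0.5,
// without ever knowing any individual's real answer.
```

The catch, and the reason this is "the hard one", is that noisy aggregate counts are a far cry from the rich per-user data Google trains on.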

Apple needs to figure out the best way forward here without violating its own privacy policies, because this is the biggest thing holding Siri back. No matter how many features Apple builds on top of Siri, people won't use it if Siri just can't understand them.

Your iOS 12 Wishlist

What are some of the features you wish to see in iOS 12? How do you want to see Siri get better in the next couple of years? Share with us in the comments below.

Check out the other features that we'd like to see in iOS 12.