
Alexa beats Bixby, Google Assistant and Siri in our battle of the voice assistants

We asked 13 questions to see if one assistant was any better than the others

Amazon’s Alexa proved unbeatable in our recent test by providing useful answers to 12 of the 13 questions we put to it.

Google Assistant came in a close second with 10 correct answers, but it was easily the best when it came to contextual follow-up questions. Siri may be the oldest voice assistant, but the extra experience didn’t count for much – it got eight questions right, the same number as the youngest assistant, Samsung’s Bixby.

With all four getting regular updates, these scores could be very different in a matter of months. The goalposts will move, too, as the assistants learn to tackle even more complex requests.

We chose 13 questions that we felt covered the most popular uses for voice assistants, from playing the radio to checking the traffic before you set off for work. Below, we look at how each assistant performed on each question, and why simply answering it isn’t the be-all and end-all when it comes to voice control.


Voice assistants put to the test

1. Add bananas, milk and flour to the shopping basket

While Bixby, Google Assistant and Siri could all add items to a list, only Alexa was able to add them to our Amazon shopping basket.

2. What’s the weather like today?

This is a simple question for any voice assistant and, as expected, all four gave detailed answers.

3. How’s my journey to work?

This isn’t nearly as simple. Only Alexa and Google Assistant managed to give useful answers, and both needed a bit of extra setup first.

We had to go into the Alexa smartphone app and tell it where we worked. Google Assistant could only respond if we set our workplace in Google Maps and asked: ‘How’s my commute to work?’ With that extra setup, both told us what the traffic was like and how long the journey was likely to take by car.

4. Set a timer for 10 minutes

This basic request wasn’t a problem for any of the assistants, although there may be some differences between devices. We tested Bixby on a Samsung Galaxy S9+ smartphone: the command set the timer and took us to the relevant app, but didn’t start the countdown.

5. Play BBC Radio 2

We expected this to be another slam dunk for all four assistants, but Siri and Bixby couldn’t manage it.

Both failures could be due to the limitations of the phones we tested them on (Apple iPhone 8 Plus and Samsung Galaxy S9+), but Bixby couldn’t even understand us at first – it thought we said ‘baby C music to’.

6. What’s 6ft in metres?

Voice assistants come into their own when answering simple questions that you’d normally search the internet for. All four assistants gave us the correct measurement of 1.83m.

7. Spell cardamom

They all gave us the right answer, but the type of response depends on the device you’re using. We tested Alexa on a 2nd-generation Amazon Echo and Google Assistant on a Google Home, and both spelled the word out for us.

Being on a smartphone meant that Bixby and Siri could also display the spelling on screen, which is a less human interaction, but potentially more useful as you can refer back to it.

8. Tell me a story

A light-hearted request that Alexa, Bixby and Google Assistant managed with varying degrees of success. Alexa and Bixby’s tales were the most detailed, while Google’s was more novella than novel. Siri teased us rather than telling an actual story.

9. Book me a table at my favourite restaurant

This was the only request that all the assistants were flummoxed by. Interestingly, Siri could book us a table at a US restaurant, but we decided it was too far to travel.

10. Add birthday party event to my calendar on 6 July

This is a trickier request, as the assistant needs access to your calendar for it to work, but that didn’t stop Alexa, Bixby and Siri easily scheduling the party.

Despite Google having its own calendar app, Google Assistant couldn’t add an event to it.

11. Tell me some chicken recipes

Bixby and Siri searched the internet and displayed the top pages, which meant we still needed to do a fair bit of legwork to find something nice for tea, so we chalked that up as a failure.

Alexa did better, using the Recipedia skill built into Echo speakers to recommend some tasty meals. Google Assistant offered up some recipes, too.

12. Wake me up in eight hours

We thought the wording of this request might cause the assistants a few headaches, but they all set an alarm or timer to make sure we got up on time.

13. How long does it take to boil an egg?

Another easy one, but it highlights the differences in the quality of interactions.

Alexa was the only assistant that responded as a human would. The other three read from a website, which meant the correct answer was wrapped in superfluous information.

Is that the whole story?

Even after the 13-question grilling of each assistant, we weren’t quite done with them. We wanted to see how good they were at keeping a conversation going, which is why we asked a follow-up question for each of the 13 initial requests. These follow-up questions were designed so that they could only be understood within the context of the original.

It’s something voice assistants have always struggled with, but Google Assistant was head and shoulders above the rest. It answered four of the 13 follow-up questions we put to it. That doesn’t sound like much, but considering it answered only 10 of the initial questions correctly, it means Google Assistant could keep a conversation going four times out of 10.

We followed our question about the day’s weather by saying, ‘What about tomorrow?’, and Google gave us the correct forecast. After asking what 6ft was in metres, we asked ‘What about inches?’ and got the correct measurement. These follow-up questions create more natural, human interactions and remove the frustration of having to word every question exactly so the assistant will understand.

It’s a key area where all the assistants need to improve, particularly Alexa, which is so competent in other areas.

You can also see how well Alexa and Google Assistant handled international accents in our challenge video.
