In September 2023, Amazon seemed ready to usher in its own AI revolution. At an event in Washington, D.C., the company’s executives showed off an updated version of Alexa that invited people to chat with it about all sorts of things. Amazon wanted Alexa to stop being as dumb as a rock, and it looked like it might finally pull it off. But then, silence.
Where’s the new Alexa? After introducing the latest version of the assistant, which seemed promising, Amazon said it would roll it out to the roughly 500 million devices with Alexa built in. However, the company hasn't provided any news about the project’s development since then. In January 2024, users on social media posted rumors that the new Alexa could make an appearance this year, but only through a subscription. What’s going on? Why hasn't Amazon launched this latest version?
Chaos at Amazon. According to company sources quoted in Fortune, the problem lies with Amazon. Employees claim the company is plagued by “structural dysfunction” that has repeatedly delayed the launch of a new generative AI-powered Alexa.
The demo was just that: a demo. As flashy as Alexa’s unveiling was last September, it was just a demo. That version wasn’t ready for users, and it still isn’t. The LLM for this new Alexa is “far from state of the art” for several reasons.
No data, no GPUs. Scientists who worked on the project told Fortune that Amazon didn’t have enough data to train the model or enough chips to train and run it competitively.
Amazon is now focused on developing a chatbot for the cloud. Instead of continuing to work on Alexa full-time, Amazon appears to have shifted its focus to developing a generative AI model for Amazon Web Services. The company invested $4 billion in Anthropic, the creator of Claude, but so far, that investment hasn’t helped either.
Amazon denies there are problems. An Amazon spokeswoman told Fortune that the information the scientists provided was outdated, adding that the company has access to hundreds of thousands of GPUs. She also denied that Alexa has lost priority or that it isn’t using Claude, but didn’t elaborate further.
Siri is coming out ahead. That’s what it looks like after this week’s announcements at WWDC 2024. At the event, Apple showed off an improved version of Siri with a more natural synthetic voice and the ability to perform actions through the apps it connects to. The integration with ChatGPT is another eye-catching addition for a voice assistant that Apple first launched in 2011. Siri now has a new opportunity to make a splash, even though Apple seems to have left the upgrade out of devices where it could really make a difference, such as the HomePod and the Apple Watch. As such, Alexa could be left behind.
Former Amazon employees aren’t optimistic. Mihail Eric, a machine learning expert who worked on Alexa AI, recently published a long post on X about what went wrong with Alexa. Eric left the company in July 2021, claiming that although Amazon had a slew of resources and many talented people, the company was “plagued by technical and bureaucratic problems.”
What about Olympus? Last November, we found out that Amazon was working on a project to create a giant LLM called Olympus with two trillion parameters. That’s twice the size of GPT-4, but one Fortune interviewee said the number is “a joke.” He said Amazon’s largest model is about 470 billion parameters (470B), roughly a quarter of what Olympus should theoretically have. Meanwhile, the LLM behind this updated version of Alexa is about 100B parameters, and its developers are still working on fine-tuning it.
Slow development. The Alexa LLM’s progress is still modest. According to published data, Meta's Llama 3 was pre-trained on 15 trillion tokens, while the Alexa LLM only had 3 trillion tokens. Its fine-tuning also trails Meta’s model, which used 10 million “data points” compared to just 1 million in Amazon’s case.
Too many people, too much complexity. The project far exceeds Amazon founder Jeff Bezos' famous "two-pizza rule," which states that internal teams should be small enough to be fed with two pizzas. The AI-powered Alexa project brings together about 10,000 employees working on different usage scenarios, such as home, shopping, music, and entertainment. That doesn’t make things any easier, but Amazon is confident, as are some of the people interviewed, that this LLM-powered Alexa will eventually come to market.
We'll have to wait and see.
Image | Jonathan Borba
Related | The 12 Best Alexa Games: The Ultimate List and How to Install Them