A Google Innovation Was Crucial for the GPT Models. The Company Reportedly Has a New Plan to Stop Benefiting Its Rivals

  • According to the Financial Times, DeepMind has opted to tighten its publication policies to maintain its competitive edge.

  • Google’s Transformer architecture is a fundamental component of OpenAI’s AI models.


Javier Márquez

Writer
  • Adapted by: Alba Mora


Google acquired AI company DeepMind in 2014 and fully integrated it into a single division in April 2023. DeepMind has long been a prominent player in the AI sector. Its reputation has been built on the quality of its scientific publications and the contributions of some of the field’s brightest researchers. However, this approach may change soon.

According to sources cited by the Financial Times, Google DeepMind is delaying the publication of some papers it deems “strategic” or sensitive in the field of generative AI. This strategy aims to protect the company’s competitive edge and prevent rivals like OpenAI from capitalizing on its latest and most valuable developments.

The Transformer Architecture: The Star of the Show

You can’t fully understand the evolution of generative AI without acknowledging Google’s contributions. One pivotal moment was the 2017 publication of the Attention Is All You Need paper. Authored by eight researchers, the paper introduced the Transformer architecture. This framework significantly enhanced the ability of AI models to process vast amounts of data efficiently.

The Transformer architecture served as the foundation for models like Google’s Bidirectional Encoder Representations from Transformers (BERT). The company integrated BERT into its search engine in 2019 to improve natural language understanding. The Transformer architecture also played a crucial role in developing pre-trained systems like OpenAI’s GPT, including the current GPT-4 and GPT-4.5 models.
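At the heart of the Transformer described in the paper is scaled dot-product attention: each query vector is compared against every key vector, the similarities are normalized with a softmax, and the result weights an average over the value vectors. The sketch below is a minimal, illustrative implementation in plain Python (the function names `softmax` and `attention` are our own, not from any library), not production code:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention, as introduced in 'Attention Is All You Need'.

    queries, keys, values: lists of d-dimensional vectors (lists of floats).
    Each query attends to every key; its output is a softmax-weighted
    average of the value vectors.
    """
    d = len(keys[0])
    outputs = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        # Weighted sum of the value vectors.
        outputs.append([sum(w * v[i] for w, v in zip(weights, values))
                        for i in range(len(values[0]))])
    return outputs
```

Because every query can attend to every position at once, this computation parallelizes far better than the sequential recurrent networks it replaced, which is what lets Transformer-based models process vast amounts of data efficiently.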

As one of the world’s largest publicly traded companies, Google boasts substantial financial resources and access to critical technology. However, the launch of ChatGPT, also built on the Transformer architecture, took Google by surprise. This led the tech giant to declare a “code red” and reorganize internally to regain its footing in an AI sector currently led by OpenAI.


It’s widely recognized that when a company becomes a major player in the tech industry, it tends to lose some of the dynamism that characterized its early days as a startup. Big Tech giants often find it challenging to move fast and break things. They need to protect substantial assets and manage complex operations that can’t afford failures. Taking risks becomes significantly more complex.

Despite this, it’s impressive how quickly Google has managed to catch up in the AI race. In a short time, the company has launched a wide array of AI products based on advanced language models. Notable offerings include Gemini, a direct competitor to ChatGPT, and Gemini Live, aimed at rivaling OpenAI’s advanced speech mode. Meanwhile, the Gems function like custom GPTs, and NotebookLM is a groundbreaking tool.

Google’s New Approach

In recent years, Google has implemented significant internal changes. One of the most notable changes affects its policy on publishing scientific papers. The company now imposes a six-month embargo on content deemed strategic before it can be released to the public. The group, led by Nobel Prize winner Demis Hassabis, has also tightened internal processes with stricter reviews.

Google’s new approach may create some unease within the scientific community. A current researcher shared their concerns about this shift with the Financial Times. “I cannot imagine us putting out the Transformer paper for general use now,” they said. Meanwhile, a former DeepMind researcher told the outlet, “The company has shifted to one that cares more about product and less about getting research results out for the general public good.”

Images | Pawel Czerwinski | Google

Related | OpenAI Just Secured the Largest Funding Round in History: Investors Seem to Have Faith in AI After All
