
Luke Jones | Unsplash
Google is probably one of the companies that diversifies its technology the most, and many of its applications already include artificial intelligence, mainly through the integration of Gemini. Following the Google Cloud Next 2025 conference, relevant details have emerged about many of the innovations the company is preparing to bring to market as soon as possible. The truth is that we are looking at one of the most influential companies of recent times, so other companies will be taking notes on these advances to mount their own competition; this undoubtedly works in our favor, because the technology will not stop growing.

Google Cloud Next | Blog
Code Assist | A coding assistant that implements AI agents
Programming is an area that has improved substantially with the arrival of AI-integrated tools. In fact, many people have worried that such tools could displace programmers, since a machine can do much of the same work. From that perspective, one way to ease the concern is to position AI in the workspace as a complement: just another assistant.
So far, this kind of assistance has come from direct competitors such as GitHub Copilot, Cursor and Cognition Labs, which ultimately exert some pressure on the pace of the technology. Google does not want to be left behind and plans to revolutionize, or at least take a big step in, this area. This is where Gemini Code Assist enters the game, but in the form of an agent.
Gemini Code Assist is an assistant focused on making coding easier through artificial intelligence, so that users can increase their productivity. With that in mind, it can also take on other tasks that could be automated, made possible through the implementation of AI agents. Code Assist is being updated to support more complex tasks, such as creating applications from product specifications written in Google Docs. It will also be possible to convert an entire block of code from one language to another, which gives some flexibility to the project being built.
In addition, Code Assist can be used from other programming environments beyond Android Studio. It is also worth noting that Gemini Code Assist's Kanban board lets agents generate work plans and report on the progress of work requests. Among other things, agents can also perform code reviews, unit test generation and documentation, as well as software generation, code migration, and the implementation of application features. As far as we can see, the possibilities will only grow: as the technology advances, there will be new ways to take advantage of it.
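To make the unit-test-generation idea concrete, here is a minimal sketch of the kind of output such an agent typically produces. The function under test and the test names are purely illustrative assumptions on my part, not actual Gemini Code Assist output:

```python
# Illustrative example of agent-style unit test generation.
# The slugify() function and the tests below are hypothetical,
# not real Gemini Code Assist output.

def slugify(title: str) -> str:
    """Turn an article title into a URL-friendly slug."""
    return "-".join(title.lower().split())

# Tests an agent might generate: one normal case, two edge cases.
def test_slugify_basic():
    assert slugify("Google Cloud Next 2025") == "google-cloud-next-2025"

def test_slugify_extra_whitespace():
    assert slugify("  Ironwood   TPU ") == "ironwood-tpu"

def test_slugify_empty():
    assert slugify("") == ""
```

In practice, the value of an agent here is less in writing any single test and more in systematically covering normal and edge cases the developer might skip.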

Google DeepMind | Unsplash
Ironwood | Impressive AI accelerator chip
It was also revealed that Google has been working on an advanced chip that serves as an artificial intelligence accelerator: a TPU. Under the name Ironwood hides raw power that reflects an intention to lead the AI accelerator sector. This is Google's seventh-generation TPU, designed to optimize the inference and execution of AI models.
Going into detail, this chip should enable extreme performance in how AI handles incoming requests, with a focus on inference and reasoning at scale, an area that is attracting a great deal of attention. As we have seen, this reasoning mode takes longer, but it produces vastly better responses, simulating human-like cognitive processes. In my opinion, Google is trying to shrink the time it takes to complete a request while maintaining the quality of the response.
These features are expected to reach Google Cloud customers by the end of the year, in two cluster configurations: a 256-chip cluster and a 9216-chip cluster. Amin Vahdat, vice president of Google Cloud, says it is the most powerful, capable and energy-efficient TPU the company has built to date.
It is clear that competition is increasing. Major players are leading this growth, such as Nvidia, a company known for offering chips that seem to outdo themselves again and again (you probably know it for its graphics cards). The competition is so intense that companies like Amazon and Microsoft are investing enormous sums to revolutionize this sector. It is no accident that such huge investments are going into solutions built on top-tier processors: Amazon has dedicated its efforts to chips like Trainium, Inferentia and Graviton, all running on AWS, while Microsoft has done its own thing with its Cobalt 100 AI chip, available through Azure instances.
When we mentioned raw power, we meant that Ironwood can deliver a peak processing power of 4614 TFLOPs, a figure obtained through rigorous testing. The chip also boasts an incredible 7.4 Tbps of bandwidth, along with 192 GB of dedicated memory. According to Vahdat, Ironwood is expected to be integrated into an artificial intelligence hypercomputer, a modular computing cluster that integrates with Google Cloud. To cope with all kinds of workloads, the TPU has been designed to shorten communication times, improve data movement and chip latency, and save a great deal of energy.
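To put those figures in perspective, here is a back-of-the-envelope calculation of my own, based only on the numbers quoted above (not an official benchmark): multiplying the per-chip peak by the two announced cluster sizes gives the theoretical aggregate throughput of each configuration.

```python
# Theoretical peak throughput for the two announced Ironwood
# cluster configurations, using the per-chip figure quoted above.
PER_CHIP_TFLOPS = 4614  # peak TFLOPs per Ironwood chip

for chips in (256, 9216):
    total_tflops = chips * PER_CHIP_TFLOPS
    # 1 exaFLOP = 1_000_000 TFLOPs
    print(f"{chips:>5} chips -> {total_tflops:,} TFLOPs "
          f"(~{total_tflops / 1e6:.1f} exaFLOPs peak)")
```

By this arithmetic, the 256-chip cluster tops out around 1.2 exaFLOPs and the 9216-chip cluster around 42.5 exaFLOPs; real-world sustained performance would, of course, be lower than these peak numbers.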
