Ameer Abbas (00:05):
Gemini Code Assist is a product that helps them, that augments them, to write code of their choice in their tooling the best way that we can. The way to think about it is that developers come in different shapes and sizes. There are back-end developers, front-end developers, mobile developers. Those are traditionally what we would consider people that write software, but then there are data scientists and data analysts, they’re writing SQL, and there are DB admins.
Simon Margolis (00:35):
You are listening to another episode of the Cloud and Clear podcast. Thank you all so much for joining. I’m going to be your host today, Simon Margolis. I am the associate CTO here for AI and ML. And with me I’ve got Ameer. Ameer, do you want to say hi?
Ameer Abbas (00:48):
Hey, everybody. Ameer Abbas, product manager at Google. Great to be here. It’s my first Cloud and Clear?
Simon Margolis (00:54):
That’s right.
Ameer Abbas (00:55):
Yeah, very excited to talk. Thank you for having me.
Simon Margolis (00:57):
Awesome. Thank you so much for being here. I think this is going to be really exciting. And so first things first, we’re going to talk about this really niche topic that nobody’s talking about, which of course is AI and ML. A really niche, weird thing. Super, super popular stuff. There’s a lot that’s been happening so far, and you and I have been talking a bunch about all this cool stuff happening in the AI space, especially as it pertains to software development, and we thought, hey, why don’t we turn this into a podcast? So here we are. I’m glad that we’re here. So I think what we really want to dive into is that the whole world, of course, is changing because of the generative AI boom and everything that’s happening in AI right now, but in particular there’s a huge impact being made on software developers and the software development lifecycle. And before we go too much further, maybe you can give a little bit of background about your role and what you do as it pertains to all of this.
Ameer Abbas (01:51):
Yeah, certainly. As I mentioned, I’m a product manager in Google Cloud. I work on a product called Gemini Code Assist, which is basically our version of helping developers write better code faster. I focus on enterprises quite a bit, so it’s an enterprise-grade product. I work with large financial-industry customers and different types of customers. I also focus on the roadmap quite a bit. I’m very hands-on, so I actually test the product. I think you’ve seen me at IO and seen some of my demos. Unfortunately, we’re not going to have any demos on this one, but hopefully you’ll invite me back.
Simon Margolis (02:27):
That’s right.
Ameer Abbas (02:29):
But I’ve been on this product for probably the past year and a half or so, and I’ve learned quite a bit, because my past is actually from the admin, from the ops side. I used to be an infrastructure person writing Terraform for building Kubernetes clusters. So it’s really good to see the other side, the developer side of things, while having that empathy of the operator as well.
Simon Margolis (02:47):
Yeah, that’s awesome. You and I share that background, because I sometimes masquerade as a developer, but I like to tell people I’m not, I really come from the operations side of things. But I think tools like Gemini Code Assist and everything else that’s happening right now are really interesting because they enable us to do that, we can play that role a little bit. And I think a lot of enterprise software developers and even some operations folks are starting to see that light. But before we get into all the details and specifics, maybe you can talk a little bit about what feels like an inflection point, where we are right now in terms of the impact that AI is having on the software development lifecycle.
Ameer Abbas (03:26):
I did want to say that when we talk about AI, and specifically maybe for the next 20, 25 minutes that we’re talking, we’re specifically talking about generative AI, right?
Simon Margolis (03:34):
That’s right. Good call out. Yes.
Ameer Abbas (03:35):
Generative AI changed a lot of things. It made AI a lot more approachable from a consumer perspective. Because AI has been around for a very, very long time, but it’s always been pre-trained models that did very specific things. But with generative AI, if there’s anything you want to write or generate in any language, and I’m talking any language, not just code, though coding is part of that, and we’re going to talk about it, it became very, very easy for consumers, enterprise ops, and developers to really get into it. And then one other thing I know we’re going to really get into is that we see this not just as another tool in your tool belt, but as a transformative way of, let’s just say, coding, and we can talk about that.
(04:18):
I’ve gotten my hands dirty with this sort of a thing, and it’s better to not think about it as just a tool inside of your developer environment, but more so think about it as you have to cognitively change the way you think about even writing software or even planning to write software, reviewing software, and those types of things. So it is a very exciting time. I’m really excited to be here. And I’m learning a ton as we’re going around with it.
Simon Margolis (04:43):
That’s awesome. Let’s dive a little bit deeper into it. First of all, for folks that don’t know what Gemini Code Assist is, maybe you can talk about what the… I don’t even want to call it a product, what it is and what problems you intend to solve with it.
Ameer Abbas (05:02):
Yeah, big fan of the what. I’m a big fan of the why too, but let’s start with what it is. When you hear the words Gemini Code Assist, it is a portfolio of products. You can think of it as a brand that covers multiple products underneath. The way to think about it is that developers come in different shapes and sizes. There are back-end developers, front-end developers, mobile developers. Those are traditionally what we would consider people that write software, but then there are data scientists and data analysts, they’re writing SQL, there are DB admins, there are business technologists that are writing processes and workflows, there are API developers, and such. So when we think about developers, we’re thinking about the breadth, all the personas there.
(05:46):
And for pretty much any persona you pick among the ones that I’ve mentioned, Gemini Code Assist is a product that helps them, that augments them, to write code of their choice in their tooling the best way that we can. And we can talk about it a little bit more too. So you can think of Gemini Code Assist as giving you an entitlement, of course the word Gemini is in the product name, to use the Gemini models to help you do your job, plus the enterprise context to help you write higher-quality code and write code faster. The goal is to raise developer productivity, but also to really improve your software delivery performance, because that’s what enterprises ultimately want: better troubleshooting, better code writing, better test coverage, those types of things.
Simon Margolis (06:28):
Yeah, that makes sense. I think we heard on one of the recent Microsoft earnings calls, Satya Nadella was talking about the fact that 30 or 40% of Microsoft’s code is now written with AI assistance, at least, and so it sounds like this is yet another suite of tools to help either achieve that or push it even higher in terms of a business’s ability to do that. I got the privilege of hearing your talk at IO where you talked about a lot of this stuff in detail and had all sorts of really cool demos, which, again, hopefully we get to do at some point in the future, but are there any call-outs you want to make to the audience, specific things that you heard at IO as being big and important for developers or for people involved in that software development lifecycle?
Ameer Abbas (07:13):
Before we even get to the software development tool chain, I noticed from IO 2024 to 2025 how fast things have progressed. We’re already on the third or fourth generation of models, so lots of new multi-modal things were produced. I think, again, this is just us geeking out a little bit, but things like Project Astra were very interesting to me, and that’s where you combine the multi-modality. I think there was a video of somebody fixing their bike, and they were basically asking questions, and then they were getting YouTube videos and they could pause this stuff, and everything was voice-driven or video-driven, they were showing the camera what part they were working on, and they were using AI-based Google Search.
(07:53):
So I think combining the way human cognition works, which is very multi-modal, I thought that was very interesting, because in 2024 we just introduced a bunch of singular models that were multi-modal, but in 2025 we started to mix a lot of those modalities into use cases, like how a human would work. Specifically from a Code Assist perspective, we were very excited, of course, for the new model, Gemini 2.5, and the context window, which we’ll talk about, a one-million-token context window, and we’re upping that to a two-million-token context window. I’m maybe getting a little ahead of myself.
Simon Margolis (08:23):
That’s awesome.
Ameer Abbas (08:23):
And then from a software development perspective, I think we introduced something called Jules, which I call the Waymo of software development. So it is an autonomous, self-driving software developer, and it has a very niche sort of a use case. But I feel very privileged to be working at a company like Google where we can experiment with these things, where we can see whether there is room for a totally autonomous software delivery robot, essentially. That’s been very exciting. Like I said, you and I have both talked to a lot of customers, and there’s a lot of interest in Jules, lots of interest in Gemini 2.5. So IO was a very interesting time for both of these launches this year.
Simon Margolis (09:05):
Yeah, I got to say that after that announcement around Jules, I came home to my hotel room that night and took a code base that I was working on and basically said, “I know this is insecure, I know I’ve made mistakes that are going to affect production, can you go through these and just fix all that stuff?” And then I went to bed and I woke up the next morning with a PR solving all these… It felt like magic. And I know that’s an extreme example of a lot of what we’re going to be doing. At least a lot of my customers, they’re not looking to completely offload things to an autonomous agent to go do things, but it is great to see that full picture, and you can then use things like Gemini Code Assist to pick and choose the parts that you do want to implement and work with.
(09:46):
I do want to ask you in a moment about the non-hardcore developer tooling that you have, but if we can focus on that for a moment, would love to know some of the practical applications that you’re seeing where a developer is starting to gain real benefits and what that might look like.
Ameer Abbas (10:02):
Sure. First, I think there are two words that are going around, automation and augmentation.
Simon Margolis (10:08):
Sure.
Ameer Abbas (10:08):
I think we’re in the augmentation phase right now. There are two schools of thought about these code assistants, especially for enterprise developers, because software development is a very lucrative business for a lot of people. A lot of people moved to Silicon Valley, I did that, you did that, to be in technology, and there’s a very visceral fear in terms of, am I going to lose my job? And that’s totally understandable, because this is, again, transformative. This is not just a tool that you’re plugging into your tool chain. So the way I’ve talked to software developers is that this is augmenting you. It’s just like how the combine augments a farmer.
(10:48):
There’s no sort of hate between the combine and the farmer. It just makes them more productive. And so right now we’re in this augmentation sort of phase where I see a lot of use cases, and I always say this to my customers, I say, “100% of the code base that exists is existing code.” So there’s a lot of existing code, and it’s growing exponentially, because we are getting much better with tooling, we’re writing code much faster. And by the way, automation already exists with a lot of this tooling. It’s much more deterministic, I get it, but the fact is that we’ve already automated a lot of things in the way Google writes code or the way Microsoft perhaps writes code. So that’s the first thing I wanted to make clear: when I talk to development leaders and managers especially, I talk about this as augmenting your development lifecycle.
(11:36):
From a developer perspective, among the things that we see, modernization comes up all the time. That’s been coming up since I joined Google. So I’ve been at Google for about eight years now. And like I said, I used to work in the field as a platform person, as an infrastructure person, and everybody was trying to modernize, whether it was going from monoliths to microservices, or from COBOL to a modern language, or mainframe modernization, and that story has not changed. And if you start with those end goals, what an enterprise is trying to do is basically push features faster to gain a competitive advantage over their competitors. The more nimble, the more modern your software delivery practices are, the better off you’re going to be. So I see a lot of existing-code use cases, a lot of modernization use cases, I see a lot of test coverage use cases. I also see a lot of use cases where developers just don’t want to do the work.
(12:28):
So nobody likes to document things, nobody likes to comment stuff, nobody likes to write unit tests and stuff like that. So when we talk about augmentation, there is all this low-hanging fruit that AI is extremely good for. And I’m not even talking about writing new code and new things, and we can get into that too, but a lot of people are sitting on a lot of technical debt and there’s just not enough human power by itself. It’s like pushing the plow instead of using the combine. So that’s the example I use.
Simon Margolis (12:59):
I love that distinction too, because it makes a lot of sense, and there are opportunities for automation.
Ameer Abbas (13:05):
We can talk about that too.
Simon Margolis (13:06):
But there is so much value to be uncovered in just augmenting what we do today. The example of using a different tool I think is right, in farming or anything else. When I saw your talk at IO, I was thinking Gemini Code Assist, so I was thinking code completion, help me write, things like this. What you just talked about, things that affect the developer directly. What I was not prepared for, and what I think I was most blown away by, was everything else. I don’t want to limit us by saying just DevOps-type stuff, but the integrations with GitHub, the surrounding stuff, because that’s where I get caught up in minutiae, and I hear this from customers all the time, it’s the process involving so much human involvement that slows things down. Can you tease a little bit of the Gemini Code Assist components outside that core developer experience that you think are pretty cool?
Ameer Abbas (13:56):
Yeah, and I think that’s why I started by saying Gemini Code Assist is not a single product. It is a portfolio of products. One of the things we’re doing from a product strategy standpoint is we want to focus on the entire software delivery lifecycle, or from now on we can just say SDLC, so we don’t have to make this podcast 40 minutes long. And if you think of an SDLC, and I’ll use another analogy, it’s like an assembly line where you’re preparing things and you’re boxing things and you’re testing things. It’s literally just like an assembly line in a factory. And what we realized is that if you’re just focused on one aspect of that assembly line, you can make that one aspect extremely efficient, let’s just say developer productivity, how you write code, the things that you mentioned, code completion, code generation, those types of things. That’s not going to improve the entire assembly line. That’s just going to improve one aspect of the assembly line.
(14:48):
So this is why we’re focused not just on developer productivity, because we feel that that is one aspect of the assembly line. We’re focused on software delivery performance, which is the entire assembly line. As a matter of fact, one of the things we did notice is that overall throughput started to decline a little bit, which was very surprising to us, because we thought that with AI assistance your throughput, the rate at which you’re shipping software, would go up. That number came from research done by DORA, which is a research organization owned by Google, and they’ve been performing this State of DevOps research for over a decade.
(15:20):
In 2024, they measured the effects of AI on software development. And they noticed two things that were very much of note. One was that throughput was declining. And we don’t fully know exactly why that is. We do have hypotheses. One is that generative AI is fairly chatty, it’s fairly verbose, so the size of the PRs is going up and maybe the number of PRs is also going up. Like I said, these are hypotheses, we need to prove them. Because the number and size of PRs are going up, you’re causing a bottleneck in the code review aspect of your assembly line, somebody has to do that quality check of, is the box fully ready to be shipped, and so that’s where the bottleneck is. So what you saw me showing was this idea that developers don’t just spend their time inside the IDE.
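As a rough illustration of the DORA measures discussed here, the numbers boil down to simple arithmetic over a log of deployments: throughput (deployments per day), change failure rate, and mean time to recovery. This is a toy sketch, not DORA’s actual survey methodology; the `Deployment` record and its field names are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Deployment:
    shipped_at: datetime
    failed: bool                             # did this change cause an incident?
    restored_at: Optional[datetime] = None   # when service recovered, if it failed

def dora_metrics(deploys, window_days=30):
    """Rough DORA-style numbers: throughput, change failure rate, MTTR in hours."""
    throughput = len(deploys) / window_days  # deployments per day over the window
    failures = [d for d in deploys if d.failed]
    change_failure_rate = len(failures) / len(deploys) if deploys else 0.0
    restore_hours = [
        (d.restored_at - d.shipped_at).total_seconds() / 3600
        for d in failures if d.restored_at
    ]
    mttr_hours = sum(restore_hours) / len(restore_hours) if restore_hours else 0.0
    return throughput, change_failure_rate, mttr_hours
```

The "stability" decline described in the episode corresponds to the last two numbers rising: more failed changes, and longer recovery times.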
(16:08):
As a matter of fact, multiple studies have shown that you only spend maybe 20% of your time writing code. And we can link some of those in the show notes too. So what is happening for the rest of the 80%? You’re planning things, you’re designing things, obviously you’re in meetings all day long and stuff like that. Then you’re reviewing things, then you’re doing the test coverage thing, and then you’re feeding that back into the lifecycle. So one of the things we’re thinking about is how do we affect each stage of the software delivery lifecycle? And the example that I showed was being able to, for example, pull a design doc using tools. MCP, the Model Context Protocol, enables us as developers to access tools, because developers don’t just use one tool. They’re using the CLI, they’re using GitHub, like you said, they’re using Google Docs or whatever document tool, they’re using Jira. So it’s about giving them access to those things.
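The tool-access pattern behind MCP can be sketched with a toy registry and dispatcher. To be clear, this is not the real Model Context Protocol wire format; the `fetch_design_doc` tool and the JSON shape are hypothetical. It only shows the core idea: tools advertise a name and description, and a model-issued call gets routed to the matching function:

```python
import json

# Toy tool registry, loosely mirroring how a tool server advertises its tools.
TOOLS = {}

def tool(name, description):
    """Decorator that registers a function as a callable tool."""
    def wrap(fn):
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return wrap

@tool("fetch_design_doc", "Fetch a design document by ID (hypothetical tool).")
def fetch_design_doc(doc_id: str) -> str:
    # Stand-in for a real call out to Google Docs, Jira, GitHub, etc.
    docs = {"dd-42": "Design doc: split the billing monolith into services."}
    return docs.get(doc_id, "not found")

def dispatch(call_json: str) -> str:
    """Route a model-issued tool call (name + arguments) to its function."""
    call = json.loads(call_json)
    return TOOLS[call["name"]]["fn"](**call["arguments"])
```

In a real MCP setup, the registry would live in an MCP server and the assistant would discover the tool list over the protocol rather than importing it directly.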
(16:59):
And then writing some code, and we can talk about what we do a little bit differently in terms of code writing too. And then having that code reviewed inside the source code management system of their choice, because developers also shift surfaces. Maybe you’re in a Jira or GitHub issue and you’re having some conversations, then you’re in your IDE writing some code, then you’ve pushed it through CI/CD, and then you’re inside your SCM, your source code management, reviewing code. So it’s very important for us to not just focus on one aspect, which is what a lot of other vendors are doing.
Simon Margolis (17:31):
No, it makes great sense. Again, that was the biggest eye-opener for me that got me so excited, and it’s what I’ve now heard in the weeks since IO from customers: oh, well, we always just thought of Code Assist as this really narrow thing, and, oh, it can be so much more. So on that topic, I’d love to hear where you think… I don’t know how to describe it, not the limits, but where Code Assist fits into the rest of the Google Cloud ecosystem, where those interface points are. And if you can tease a little bit, we didn’t discuss this beforehand, but how this does or doesn’t work outside of the Google Cloud ecosystem. If we have a customer that’s got all their infrastructure at AWS or they’re a Microsoft shop, what does that look like?
Ameer Abbas (18:16):
Like I mentioned, the first piece of learning that we got from the DORA study in 2024 was that throughput was going down. I don’t know if I mentioned the second one. The second one we noticed was a bigger number; there was only a slight decrease in throughput, but there was a much larger decrease in what DORA calls software stability. And all that means is how stable your software is when you push a feature out: how many times does that feature error out, how many times are you pushing errors into the world. So they just call that stability. It’s the number of errors pushed and then the mean time to recovery. That number declined fairly substantially; statistically, it was significant. And that really caught our eye, because it basically means that just throwing AI at something is not going to improve the quality. So from a product strategy perspective, one thing I already mentioned is we’re focused on the entire software delivery lifecycle, and we can tease that out. The second thing we’re focused on is the quality of code.
(19:11):
So we don’t want to just write code faster, because if you don’t raise the quality while your throughput is going up, then you’re linearly also increasing the number of errors being pushed. So you almost have to augment that by producing higher-quality code. There are a few things that we do. For example, we have a feature called code customization that allows you to set up a RAG (retrieval-augmented generation) database with your code base, with the enterprise code base. And I think this is where the ecosystem portion you were talking about comes in: again, we’re privileged being a cloud vendor where we can set up these environments very securely, very safely. We’re also the only vendor that I know of that owns the entire stack, soup to nuts, top to bottom. From the time you type your prompt to the time that prompt and your context get inferenced on a TPU or a GPU, you never leave the cozy confines of Google’s backbone network.
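The retrieval-augmented generation idea behind code customization reduces to three steps: embed the code base, retrieve the snippets closest to the developer’s query, and prepend them to the model prompt. A minimal sketch, using a toy bag-of-words "embedding" in place of a real embedding model (all function names here are illustrative, not Gemini Code Assist’s actual implementation):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words count vector. Real systems use a learned model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, snippets: list, k: int = 2) -> list:
    """Return the k code snippets most similar to the query."""
    q = embed(query)
    ranked = sorted(snippets, key=lambda s: cosine(q, embed(s)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, snippets: list) -> str:
    """Prepend retrieved enterprise code to the task, the core RAG move."""
    context = "\n".join(retrieve(query, snippets))
    return f"Context from our code base:\n{context}\n\nTask: {query}"
```

The point of grounding generation in retrieved enterprise code, as described above, is that the model’s suggestions follow the conventions already present in the code base rather than generic patterns.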
(20:05):
We’re never sending your data to a black box. And that affords us the ability to do a lot of interesting things. First, we can focus on other things: we can focus on the ops side, we can focus on the pre-planning side, we can focus on custom sorts of things. So at a high level, since you talked about the ecosystem, I work on a very specific portion of the AI software assistance ecosystem, which is Gemini Code Assist. But if you take a step back, we also have a product called Agentspace. And I think of Agentspace in a very simple way: it is a way to create custom agents. Where Gemini Code Assist is focused just on software delivery use cases, and we can talk about agents too, Agentspace is an open world for creating agents.
(20:54):
So you can create agents. For example, I was talking to a customer and they wanted to create an agent that would help create better PRDs. They said, “I don’t even want to start writing code until my product requirement documents are good.” So that augments that part. We have Vertex AI, which is basically an AI platform where you can write any code. This is where you can pick and choose your own model. You can fine-tune your own model. You pick your own architecture, you pick your own developer kit. So you can go from very customized with Vertex AI, to semi-customized with Agentspace where you build your own experiences, to fully software-as-a-service with Gemini Code Assist, and they all augment each other depending upon your use case, and they work well together.
(21:34):
And then the last thing, since we had teased this idea of Jules, which, like I said, is the Waymo of software development. Like you mentioned, you gave it a task, you walked away, you were not intervening with it, and it eventually just gave you some diffs and gave you a PR, which you could review. That’s a very specific use case. There are use cases where you don’t want human intervention. Maybe you’re updating 100 Java applications from Java 8 to 21. You don’t want to have 100 human hours going through that. You just want to automate those types of things. So all these tools hit upon different aspects of the software delivery lifecycle or different use cases, whether it’s human in the loop, whether it’s interactive chat, whether it’s just knowledge-based, or whether it’s a fully autonomous, self-driving software delivery taxi.