Speaker 1
A personal example I'll never forget: we were doing a diligence in the fall of 2020. COVID had really struck and affected the world at the beginning of 2020. We were doing a diligence and the company had what looked like about a 20% productivity decline in June of that year. Now, some organizations did have a huge decline, so it's not necessarily a red flag. Code is a craft; you can't just apply numbers and assume they're right. But it does lead to questions. So we were curious. It turned out the company had taken Friday furloughs because of a decrease in sales due to COVID, so it was literally 20% less productive because everyone was conserving cash. And when I was doing school district work, one of the lessons from the data about school districts and schools is that there is so much more intra-variation than inter-variation: the variation in how much learning is going on within a school is much larger than the variation in learning across schools. We observed the same within companies over that time. The conventional lessons say big companies trend a certain way and small companies trend a certain way, but there's a lot of variation within those groups.
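The intra- versus inter-variation point can be illustrated with a toy variance decomposition in Python (all of the numbers below are hypothetical, purely for illustration):

```python
# Toy illustration (hypothetical data): per-team productivity scores
# for three companies. Each list is one company's teams.
from statistics import mean, pvariance

groups = {
    "A": [55, 90, 40, 75, 60],
    "B": [50, 85, 45, 80, 65],
    "C": [60, 95, 35, 70, 55],
}

# Within-group (intra) variance: average of each company's own variance.
within = mean(pvariance(scores) for scores in groups.values())

# Between-group (inter) variance: variance of the company averages.
between = pvariance([mean(scores) for scores in groups.values()])

print(f"within-group variance:  {within:.1f}")
print(f"between-group variance: {between:.1f}")
```

With data like this, the spread inside each company dwarfs the spread between company averages, which is the "more intra- than inter-variation" pattern described for schools and companies.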
Speaker 2
Gotcha. And in your view, what are the hot-topic items happening right now in the software world? For me, I think GenAI is taking over everything. Every single company is pushing GenAI from the C-suite down, and even companies that aren't AI-centric are diving into GenAI. So my viewpoint is that the big hot topic right now is GenAI. From your point of view, what's the hottest thing going on in software analytics that everyone should be paying attention to?
Speaker 1
I'm not telling you this just because you're a gracious host, but because I really agree: it is definitely generative AI. The lens we spend most of our time looking through is generative AI in the software development lifecycle. It has the opportunity to deliver unbelievable quality-of-life gains for engineers, who can experiment faster, who can add tests faster with less tedium, who can stay in their flow state, because it gives such benefits to them. And it comes with a huge productivity impact for the organization; a McKinsey study said it could be a trillion dollars of impact from generative AI in the SDLC. So those are all the benefits, but it's also coming with a huge amount of risk: legal risk, security risk, compliance risk, maintainability risk. That is, above and beyond, the number one topic that our clients are thinking about today.
Speaker 2
So how do you see the role of AI evolving in software development in the near future? I'll tell you my point of view. Pre-GenAI, everybody was using very cheap labor offshore; you could pay an offshore junior software developer, like, ten grand a year to build something. I think those jobs are going to go away, because all of the technology advancements in GenAI are going to push those jobs off, and you can do the work with a much smaller team rather than hiring a large team offshore. You could just have a couple of onshore, mid-level software developers who have the intuition to know "we should be using this package, these frameworks," and couple that with GenAI. That could actually replace entire software departments. Those are my viewpoints based on what I'm seeing right now: big shifts in the market, a lot of companies moving from offshore coders to onshore or nearshore coders using GenAI, and they're not using junior developers anymore, because a lot of that work can be, yeah, that's a good word, replaced with GenAI. So where do you see the role of GenAI going in the near future, say the next 12 months?
Speaker 1
So certainly we are in the middle of rapid adoption of GenAI tools at the individual level, which is to say that almost all organizations have at least some developers using GenAI at some varied rate. If 2021 to 2023 were the years of experimentation, and then rapid experimentation, 2024 is going to be about standardization: bringing everyone to not necessarily identical levels, but to similar levels of usage. Much in the same way that people encourage and manage and monitor open source usage. GenAI is such a bionic arm, an extension of your ability to code. Open source was and is; Stack Overflow was and is; GenAI is one as well. We're seeing companies monitoring usage, encouraging usage, coaching and guiding developers to capture more of the productivity gains, but also paying much more attention to the legal risks and the compliance risks. Just as with open source, where mature medium- and large-size companies collect information about the different kinds of licenses in use, because using the wrong kind of license runs legal risk, so too is that coming for GenAI risk. So standardizing it, and really driving adoption, not just experimentation, is what we see in the next, frankly, 12 to...
Speaker 2
Yeah, I think those are pretty good timelines. This year there's going to be even more mass adoption. I think we're going to move past the POC stage; that was last year, when everyone was experimenting. Now it's into production. And once people fiddle around with GenAI and move it from POC to production, they're going to realize it's very expensive. There's a big shift happening right now in understanding the cost of deploying GenAI in production. It is extremely expensive, people, so you have to be very careful and watch and monitor your GenAI development and deployment, because it can burn a huge hole in your wallet. I've already had a couple of companies reach out and say, "Hey, when we deployed it, we didn't realize it was, like, $20,000 a month in just API calls. Holy cow, that's crazy. We've got to do something about this." So you really have to monitor your deployment costs. Just an FYI to everyone.
Speaker 1
Very good to know. Very good to know.
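A minimal sketch of the kind of API spend monitoring being described (the model names, per-token prices, and budget are hypothetical placeholders, not real published rates):

```python
# Hypothetical sketch: track LLM API spend per request and flag a blown budget.
PRICE_PER_1K_TOKENS = {"big-model": 0.06, "small-model": 0.002}  # illustrative prices

class CostMonitor:
    def __init__(self, monthly_budget_usd):
        self.monthly_budget_usd = monthly_budget_usd
        self.spent_usd = 0.0

    def record(self, model, tokens):
        """Record one API call's cost and return it."""
        cost = PRICE_PER_1K_TOKENS[model] * tokens / 1000
        self.spent_usd += cost
        return cost

    def over_budget(self):
        return self.spent_usd > self.monthly_budget_usd

monitor = CostMonitor(monthly_budget_usd=500)
monitor.record("big-model", 10_000_000)  # 10M tokens on the expensive model
print(f"spent so far: ${monitor.spent_usd:.2f}, over budget: {monitor.over_budget()}")
```

In practice you would hang something like this off every API call so the $20,000-a-month surprise shows up as an alert instead of an invoice.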
Speaker 2
I'll give you an example. Actually, I'll give you a couple of examples. When you're in the POC stage for GenAI, you're typically using a frontier model like OpenAI's GPT-4. What you don't realize is that if you deploy that in production along with, say, a vector database like Pinecone, the two combined are very expensive. What you should be doing is prompt engineering your inputs so that you can use something less capable, like OpenAI's GPT-3.5 Turbo, which is one-tenth the price. So imagine GPT-4 is $100 a month in API usage; if you understand prompt engineering, you could drop that down to $10. And if you're able to run on 3.5 Turbo, you can often use open source models like Llama 2 instead and run them on, say, Together.ai, which is one-fifth the price of 3.5 Turbo, so you can drop it down to $2. You can go from $100 down to $2 in inference costs alone just using that technique. Most people don't realize that because they've never deployed GenAI solutions into production. So these are just words of warning for everyone this year, as you take all of the fun stuff you built last year, or as you start to build things: understand that there's big cost into
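The tiered-cost idea above can be sketched as a simple model router (the dollar figures are the illustrative $100/$10/$2 ratios from the conversation, and the complexity score and thresholds are hypothetical):

```python
# Hypothetical monthly costs mirroring the ratios mentioned:
# frontier = 10x the mid-tier model; open-source hosting = 1/5 of mid-tier.
MONTHLY_COST = {"frontier": 100.0, "mid_tier": 10.0, "open_source": 2.0}

def cheapest_capable_model(task_complexity):
    """Route to the cheapest model that can handle the task.

    task_complexity is a made-up 0-1 score; the thresholds are illustrative.
    """
    if task_complexity < 0.3:
        return "open_source"
    if task_complexity < 0.7:
        return "mid_tier"
    return "frontier"

tasks = [0.1, 0.2, 0.5, 0.9]
routed = [cheapest_capable_model(t) for t in tasks]
print(routed)  # ['open_source', 'open_source', 'mid_tier', 'frontier']

naive_cost = MONTHLY_COST["frontier"]  # sending everything to the frontier model
routed_cost = sum(MONTHLY_COST[m] for m in routed) / len(routed)
print(f"avg cost per task: frontier-only ${naive_cost:.2f} vs routed ${routed_cost:.2f}")
```

The point is simply that only the genuinely hard tasks need frontier-model pricing; everything else can ride the cheaper tiers.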
Speaker 1
it, and there are ways to get the cost down, and there are software vendors out there that can help you. When you're deploying anything expensive at scale, it behooves you to go get help from people who know how to do the scaling part, because the scaling part is very different from the experimentation phase.
Speaker 2
Oh yeah, extremely different. So don't do it by yourself. There's a buy-versus-build question in the GenAI world right now, and I would probably recommend buying rather than building, just so everyone can get up to speed. So what I'm really interested in is some insider knowledge on what you guys are building. Can you give us some sneak peeks on upcoming products, anything on the horizon, anything cool?
Speaker 1
Yeah, for sure. After spending so much time working with enterprises on understanding code bases at the executive level, and then hearing from all of our clients the importance of GenAI, it was natural to help enterprises think at the executive level about GenAI in the software development lifecycle. That's what we've been spending a lot of our time on, and it's been a ton of fun. Coming attractions include a single go-to spot to keep track of all of the compliance changes that are coming for GenAI for software. Let me give you a tiny example. The US Patent and Trademark Office, excuse me, the Copyright Office, has already said you can't copyright works that come solely from prompting. If I say, "give me an elephant riding a rhinoceros over a rainbow, paint that image for me," I don't own it; I can't copyright it. That is also the rule for code. So if it's just coming out of prompts, it's not gonna be yours from a legal perspective. Those offices have also said, we're not sure where the line is. We know that if you write it yourself, it's yours. We know that if it's 100% AI, it's not. But that line is coming, and for organizations in many national jurisdictions, which is most organizations, the line is probably gonna be drawn differently in different places. So one of the things is not just keeping track of those laws, but keeping a composition analysis to know: okay, how much was totally written by a human, how much was partially written by a human, which we call blended, and how much is purely from GenAI. Then the regulators and the courts will be coming in the months ahead to establish just how much has to be human or blended for it to count as your own. So that's a huge thing on the compliance side. Sorry, Tony, you were gonna say something.
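The "composition analysis" idea can be sketched like this; the human/blended/pure categories come from the conversation, while the per-region provenance tags are a hypothetical stand-in for what a real tool would derive from editor telemetry or from diffing AI suggestions against committed code:

```python
# Hypothetical provenance tags for regions of a codebase.
regions = [
    {"lines": 120, "origin": "human"},    # written from scratch
    {"lines": 40,  "origin": "blended"},  # AI-suggested, then modified by a human
    {"lines": 40,  "origin": "pure_ai"},  # AI-generated, accepted verbatim
]

total = sum(r["lines"] for r in regions)
composition = {
    origin: sum(r["lines"] for r in regions if r["origin"] == origin) / total
    for origin in ("human", "blended", "pure_ai")
}
print(composition)  # {'human': 0.6, 'blended': 0.2, 'pure_ai': 0.2}

# Check against a hypothetical jurisdiction's rule, e.g. "at most 25% pure AI":
assert composition["pure_ai"] <= 0.25, "too much unmodified AI code"
```

With the per-origin fractions in hand, the same report can be checked against however many different jurisdictional thresholds end up applying.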
Speaker 2
Yeah, I was about to say, I remember reading an article about that. There was a guy named Thaler who used AI to create something, and he tried to take it through the patent office in the UK, all the way to the UK Supreme Court. They ruled no: under the definition of an inventor, it must be a natural person for it to go through the patent office. So there was a big case there. He tried to bring it to the States as well, but that also got shut down. I remember it distinctly, because I thought, wow, that's going to set precedent for all future GenAI patents: you cannot patent GenAI-generated output moving forward. That's just me Googling as well while we're talking.
Speaker 1
Yeah. We know that 100% pure GenAI, zero human, is not gonna cut it. We know that 100% human, zero GenAI, is gonna cut it. The line will be drawn, and it will probably be drawn in multiple different places, so folks will need to be able to measure compliance against multiple different standards as the laws get written. And the broader point, for everybody who is in the middle of implementing GenAI in the SDLC, meaning not necessarily as part of your product, but your developers using it: you've got to teach them and encourage them to modify the code to make it theirs, rather than just pure copy-paste. We say blend rather than go pure. Most developers, understandably, have learned that for open source code they should leave it alone. It's been well vetted, trusted by the community, likely more secure, more safe, and some open source licenses are only triggered if you modify the code. All of that has to be unlearned for GenAI. GenAI code is less secure than an individual human's. It is more legally risky if you don't modify it. It may not be as understandable. So a huge part of the adoption era is not only measuring and encouraging usage generally, but pushing as much blended rather than pure usage as possible on all of the code that's fundamental to the health and value of the business.
Speaker 2
Yeah, I'd say for all of the coders out there, listen: using GenAI for coding is fine, but you have to make sure the output matches your skill level. What I mean is, I'll receive code from a junior developer and I can spot where GenAI produced the code, because, for instance, this person doesn't know how to use classes, or doesn't know how to use lambda functions, so how did those suddenly appear in here? Going back to what you're saying, you have to be careful with the output and make sure this is not a one-and-done, general fix-all solution. A human has to go through and peer review it, to make sure it's compliant and there's no leakage of any information, PII data, for instance, which it could leak. Humans will never go away, in the near future at least, in my opinion, because we have to be there to check the output of GenAI-generated content. Going full auto is so terrible. An example would be, I think, Microsoft getting into a lot of heat with the Guardian. The Guardian is a news outlet. Microsoft, backstory, had basically fired all of their editors, and they'd been using GenAI to produce columns and articles. Something they did very foolishly was that the GenAI read a Guardian article and then posted a poll next to it. You might think that's okay, but the shitty part is that the Guardian article was about a lady who died, and the poll asked, hey, how did this person die? Was it murder? Was it suicide? That was the biggest news for two weeks last year.
And Microsoft got so much heat from the Guardian because of that. It all goes back to the point that, as of right now, there has to be a human to check the output of GenAI-generated content, because you can't have stuff like that Microsoft-Guardian incident going around; that's gonna kill your reputation. So remember, people: always have a human in the loop somewhere in your pipeline, preferably at the end, so they can check the output. What are your thoughts on that?
Speaker 1
Yeah, it was such a shocking story. Even though it's from the world of human language, it hits home. I'd say GenAI is incredibly powerful, but in some ways it's really no different from Stack Overflow, and it's no different from open source: it's a really powerful tool. And junior developers, you've gotta learn your craft first. Start with that, get right on that, and then over time you become more powerful with those tools at your disposal. So really hone the craft, and take code reviews incredibly fricking seriously, because they make such a difference. In organizations, you really need a plan to help get junior developers, to get everyone, using GenAI the right way, which includes code reviews, and includes a developer council on GenAI; it's really that important to get their input on this. And the security tooling and other kinds of dev tooling you would use are fundamentally, deeply important in this space.
Speaker 2
Yeah, I just sent you an article on that topic. I'll read it out loud real quick: "The Guardian has accused Microsoft of damaging its journalistic reputation by publishing an AI-generated poll speculating on the cause of a woman's death next to an article by the news publisher." That is some crazy reputation damage right there. So that's just a warning to everybody: you have to have someone in the loop. So, moving on: for our listeners who are aspiring tech leaders or software developers, do you have any best practices or tips, other than, obviously, checking the work if you're using GenAI? Anything you would tell your younger self if you had to go back?