Speaker 3
So should we open that up a little bit? Obviously the AI coding assistants, or copilots as we call them, not to use the specific branded name, but it's one of the most popular ones, so it's not like we can avoid it. How do you feel that is changing the development lifecycle, for better or worse?
Speaker 1
Yeah, I mean, there are a lot of AI copilots out there. Typically the ones people are most familiar with are code generators, GitHub Copilot being the big one, and there are a lot of others. They range from "I only generate code and have chat functionality" to "I'm a full-on development environment," you know, meant to save workflows or suggest things to you in that way. So I will say, at least from a GitHub Copilot perspective, which is, or at least was, the most widely adopted AI coding assistant and code generator at the time, and it's what I've done a bunch of presentations on, it has absolutely revolutionized development lifecycles in both good and bad ways. And I do think that even though bread-and-butter developers were probably the first to flock to these tools, really we're talking about something that's going to change things for anyone who codes. And so, you know, I know we've got some network and cloud people, so I'll speak more to that. When I give talks, the main thing I'm asked, again and again, is: do I actually have to learn how to code now? Is it just a matter of prompt engineering? And then there are people on the other side of the fence, right, who are like, you shouldn't be able to touch an AI coding assistant until, I don't know, you're a senior developer or whatever. Oh, yeah.
Speaker 2
The gatekeeping. Yeah, all right.
Speaker 1
And what's interesting, at least what I've found through my own research and opinions, is that I think it's really going to be not the code generation per se, especially for network engineers, but the chat functionality, believe it or not, that is going to be the most revolutionary. Let me back up for a second. The way developers are taught and hired, and coincidentally this bleeds a little bit into teaching network engineers how to code and assessing their skill, is essentially: we give you some assignment, we see how you execute said assignment or how you code it, and then, are you able to explain it, right? And we do some form of that in the hiring process when we're evaluating people too, right? It might be on a whiteboard, it might be solving some stupid riddle, but essentially we just give an assignment, ask people to code, and then whatever. So, you know, that's completely turned on its head now that there are AI coding assistants, right? Because you have an assignment, you can generate the answer to that assignment, and you can hit slash explain. It's that easy. Slash explain. Explain to me what was just written, explain it to me line by line. And that's your student assignment. That's your take-home for your developer interview. So it's really not an effective way to...
Speaker 2
to gauge ability anymore.
Speaker 1
Right. And because it's not an effective way to gauge ability, it poses a lot of challenges for teaching. How do you know when somebody actually knows how to code, and how reliant should we be on these tools? So I feel like I'm rambling a little bit, I'm sorry.
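To make that slash-explain point concrete, here's a minimal, hypothetical sketch, not from the episode: a take-home-style TypeScript function of the kind an assistant can generate in seconds, with the network-flavored scenario, names, and data all invented for illustration. Asking the chat panel to explain it line by line produces a plausible-sounding write-up without the candidate ever having reasoned through it.

```typescript
// Hypothetical take-home prompt: "Return the top N talkers from a list of flow records."
interface FlowRecord {
  srcIp: string;
  bytes: number;
}

// An assistant can produce a working version of this in seconds, and a
// "/explain" in the chat panel will then walk through it line by line,
// which is why "write it, then explain it" no longer proves much on its own.
function topTalkers(flows: FlowRecord[], n: number): FlowRecord[] {
  // Sum bytes per source IP.
  const totals = new Map<string, number>();
  for (const flow of flows) {
    totals.set(flow.srcIp, (totals.get(flow.srcIp) ?? 0) + flow.bytes);
  }
  // Sort descending by byte count and keep the top n.
  return [...totals.entries()]
    .map(([srcIp, bytes]) => ({ srcIp, bytes }))
    .sort((a, b) => b.bytes - a.bytes)
    .slice(0, n);
}

console.log(
  topTalkers(
    [
      { srcIp: "10.0.0.1", bytes: 5000 },
      { srcIp: "10.0.0.2", bytes: 12000 },
      { srcIp: "10.0.0.1", bytes: 3000 },
    ],
    1,
  ),
); // -> [ { srcIp: "10.0.0.2", bytes: 12000 } ]
```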
Speaker 3
No, no, this is good. This is actually really good, I think. But it's funny, because I've read on social media, I've never had to go through one myself, but I've read a lot about these software engineering interviews where people feel like the process and the interviewing criteria are so strict and so over the top that they hate the interview process, because it's all, oh, do this live in front of all of us, write this code, do this thing on a whiteboard, blah, blah, blah. And they think it should be simplified, and that the hiring manager should almost be more trusting about things, in a sense. But now it almost sounds like, from what you're telling me, it's kind of spinning back in the other direction. Like, how do you validate that people know what the fuck they're talking about, and that they haven't just done this through a chatbot? Other than doing that, like, where do you think it fits into that?
Speaker 1
So I won't spill all the tea, but I'm going to be presenting on this in February. Oh, all right. Yeah, a roadmap for how I think we should now be evaluating and teaching and all that. But yeah, I mean, I think it's an opportunity. People see it as, you know, ruining engineers, they're never going to learn, blah, blah, blah. But I think it's a great opportunity to revamp how we teach people to code and how we assess it. Kind of like you mentioned, I think the model was completely broken. I mean, even when I was just learning to code, right? I overuse the word traumatic, but it was traumatic. There were many tears shed. It wasn't fun. I didn't learn in the most efficient or effective way. Yeah.
And so I think we can focus more on things like people's past experiences, what they learned from them, and how they would apply that to new problems, and on talking through people's thought processes and the way they think about problems, right? Because that's what engineers do. It's not about, do you know C# or Python? I mean, a little bit, it depends on the level of the role. But it's about, can you actually think? Do you know how to solve a problem? Have you encountered enough problems that you can deduce how to solve a similar problem at our company? That's generally the direction I hope interview assessments take. And again, from a chat perspective, really it's just Google on steroids. Developers learn through Google, right? We do a lot of teaching ourselves, but Copilot chat has been absolutely revolutionary. I taught myself TypeScript in a couple of weeks using Copilot chat. Absolutely phenomenal, in my opinion, at least based on my work speed. So, yeah, very long story short, and I could talk about this forever, it's really making us rethink how we teach people to code and how we assess it, for network engineers as well.
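As a flavor of what teaching yourself a language through the chat side can look like, here's one hypothetical exchange and the kind of snippet it tends to produce. The question, the names, and the network-flavored example are assumptions for illustration, not anything from the episode.

```typescript
// Hypothetical chat question while learning TypeScript: "How do I model an
// interface status that can only be 'up', 'down', or 'admin-down', and have
// the compiler enforce it?" The usual answer is a string union type, which
// you paste in, run, and poke at until it clicks.

type InterfaceStatus = "up" | "down" | "admin-down";

interface NetworkInterface {
  name: string;
  status: InterfaceStatus;
}

function describe(iface: NetworkInterface): string {
  // Exhaustive switch over the union: forgetting a case becomes a compile-time
  // error, because the function would no longer return a string on every path.
  switch (iface.status) {
    case "up":
      return `${iface.name} is forwarding traffic`;
    case "down":
      return `${iface.name} is down, check the physical layer`;
    case "admin-down":
      return `${iface.name} was shut down on purpose`;
  }
}

console.log(describe({ name: "GigabitEthernet0/1", status: "down" }));
```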
Speaker 2
What do you think about not just AI as a code assistant, which is kind of what we're really talking about, you know, hey, generate this code for me? You mentioned that it changes the way we teach people. Does AI have a place in the teaching process as well, or is it really better kept at the assistant level? And if it does have a place, where do you see that fitting in?
Speaker 1
Well, I can answer yes to both your questions, because it should always be kept at the assistant level, in my opinion. It is not the brain. And that is where most people's disappointment comes from, especially if you don't know how to code: you're just like, do this for me, you know? So it is most effective as an assistant. And that makes sense, right? Because if it's the brain, then why are we even here? And then, yeah, 100 percent, I think it's essentially going to be akin to using a calculator for math. So again, it doesn't matter what your background is, you're going to want to learn certain foundations. You're going to want to learn how to think programmatically, because that is a skill, right? Thinking from a discrete-math, logical perspective. And again, I won't spill all the tea, but there are certain concepts I think you should learn, some building blocks. And from there, I think you use your calculator to continue learning and to, I mean, not generate all your code, but at least to continue learning.
Speaker 3
Yeah, I think that's valuable. I hope you call your talk Software Engineering Reimagined or something like that. Just kidding. No, but I think that is a great approach, because it kind of takes a page out of the network engineering interviews I've done in the past, whether I'm the interviewee or the interviewer. You know, I interview a lot of people, and a lot of times when I'm asking questions, I'm not asking to find out whether you know the answer. I don't care if you know the answer. I want to know how you react when you don't know it, how you explain it. And like you said, you want to know if people have seen enough problems to build some theory around what their thought process is, how they troubleshoot, whether they can draw correlations. I want to see how they build that mind map, live, right? And then it depends on what questions they ask me in response, right? That's how you really...
Speaker 2
Tell, yeah. Like, what questions do they ask you back to try to get more information? Because you can teach anybody network engineering. You can teach anybody TypeScript or code or, you know, whatever. I can teach structured data, JSON, to somebody; all that stuff is just stuff in a book that you can teach someone. You can't teach someone how to think. You can't teach someone how to solve problems. Not really. I mean, you can, but the actual thought process of solving problems is something people have to come up with, figure out how it works for them, and do for themselves.