

Federal Tech Podcast: for innovators, entrepreneurs, and CEOs who want to increase reach and improve brand awareness
John Gilroy
The federal government spends $90 billion on technology every year.
If you are a tech innovator and want to expand your share of the market, this is the podcast for you to find new opportunities for growth.
Every week, Federal Tech Podcast sits down with successful innovators who have solved complex computer system problems for federal agencies. They cover topics like Artificial Intelligence, Zero Trust, and the Hybrid Cloud. You can listen to the technical issues that concern federal agencies to see whether your company's capabilities fit.
The moderator, John Gilroy, is an award-winning lecturer at Georgetown University and has recorded over 1,000 interviews. His interviews are humorous and entertaining despite handling a serious topic.
The podcast answers questions like . . .
How can software companies work with the federal government?
What are federal business opportunities?
Who are the cloud providers who work with the federal government?
Should I partner with a federal technology contractor?
What is a federal reseller?
Connect to John Gilroy on LinkedIn
https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes?
www.Federaltechpodcast.com
Episodes

Jul 16, 2024 • 21min
Ep. 159 Role of Strategy in Federal Cybersecurity
Want to leverage your next podcast appearance? https://content.leadquizzes.com/lp/fk1JL_FgeQ
Connect to John Gilroy on LinkedIn https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes? www.Federaltechpodcast.com

In baseball, one way to rattle an opposing hitter is to say, "Ya can't hit what ya can't see." Today, cyber-attacks arrive at a pace beyond any human's ability to detect; we must consider applying artificial intelligence and automation to meet the current threat.

During today's interview, Palo Alto's Eric Trexler outlines the correct approach to cyber defense. First, it is not enough to identify a threat. One needs to collect the data, normalize it, and sort it quickly to have actionable intelligence. From there, actions can be taken to stop the attack.

Brigadier General Greg Touhill (retired) was the first Federal Chief Information Security Officer. He once said that if you prioritize everything, you prioritize nothing. Eric Trexler expands on this concept by emphasizing that each agency must have an effective strategy: prioritize the data, automate the response, and then have a formal incident response plan in place.

Eric suggests that artificial intelligence can provide capabilities like anomaly detection, capacity prediction, threat intelligence, and even data classification to execute an effective strategy. Each agency is at a different level of cyber defense maturity; Eric emphasizes that a company with the resources of Palo Alto can meet you where you are in your journey.
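Anomaly detection, one of the AI capabilities Eric mentions, can be illustrated with a simple statistical baseline: flag any data point that strays too far from normal behavior. A minimal sketch in Python (the traffic numbers and threshold are invented for illustration, not Palo Alto's method):

```python
import statistics

def detect_anomalies(values, threshold=2.5):
    """Flag points more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # no variation, nothing stands out
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Simulated requests-per-minute on a network segment, with one spike
traffic = [120, 118, 125, 122, 119, 121, 950, 117, 123, 120]
print(detect_anomalies(traffic))  # → [950]
```

Real systems build this baseline per asset and per time window, which is why the interview stresses normalizing and sorting the data before any detection can be actionable.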

Jul 16, 2024 • 24min
Ep. 164 What is Proactive Cyber Security?
The volume of cyber attacks on federal organizations has reached the level where traditional methods have lost their efficacy. If you merely react to an intrusion, the malicious actor has already gotten what he wants and left.

Today, we sat down with Vinay Anand, the Chief Product Officer for a company called NetSPI. The company was founded back in 2001 to improve server, network, and application penetration services. Its initial offering of penetration testing has become so successful that it is used by nine of the top ten banks in the United States.

Over the decades, they have learned that true security goes beyond penetration testing; they had to take a more proactive approach. For example, the attack surface back in 2001 was minuscule compared to what exists today. COVID encouraged remote access, sensors are everywhere, and cheap storage has given malicious actors the opportunity to place code in unimaginable places. A tech leader must be able to identify and protect the unknown.

The first step is to protect both the external-facing network and the internal network. The internal side can be covered by tools classified as Cyber Asset Attack Surface Management; the external side can be examined by an External Attack Surface Management system.

That may be a terrific beginning, but this knowledge must be augmented by simulating an attack. NetSPI can assist an agency in developing an attack plan and narrative so it can understand its risk profile and optimize methods to recover from an attack.

During the interview, Vinay Anand gives a terrific overview of the development of the different methodologies behind system protection.

Jul 11, 2024 • 30min
Ep. 162 Managing Kubernetes can Increase Security and Reduce Cost
The military likes to use the phrase "situational awareness." It is obviously important in an anticipated conflict; it also applies when managing complex federal IT systems.

For example, we have seen federal systems move to the cloud. This transition allows for more flexible ways to manage applications, especially with units that bundle code, commonly called containers. However, the ease of scaling in this cloud environment presents challenges in managing all of those containers. Kubernetes was developed to offer a solution for managing replication, load balancing, and scheduling; however, Kubernetes has its limitations.

Today, we sit down with Dan McGuan from Rancher Government Solutions. During the interview, he describes how the company has worked with many agencies over the years to help them with the complex management of virtual systems.

For example, we see malicious actors targeting containers, and basic Kubernetes was not designed for cyber protection. Dan McGuan describes how Rancher Government Solutions worked with MITRE to design a hardening guide for Kubernetes.

Another example is controlling energy consumption. Some have described ships in the U.S. Navy as "floating data centers," and every data center has a challenge with energy consumption. Rancher Government Solutions is collaborating with companies like NVIDIA on solutions that drastically reduce energy consumption in constrained environments like warships.

Managing a complex, abstract environment can yield more security, more control over data, and reduced infrastructure costs.

Jul 11, 2024 • 34min
Ep. 163 Beyond the SBOM for Secure Software Development
Everyone likes to hit the "Easy" button, especially software developers. Rather than laboriously generate code line by line, today's software professionals may just grab code from a repository and repurpose it. Why reinvent the wheel?

Malicious actors have noticed this process and have inserted code into many libraries, acting like a Trojan Horse. As a result, some organizations offer code that has been inspected: they check it against known vulnerability lists, and if it is clean, it is given a seal of approval. Frequently, this is recorded in a "Software Bill of Materials," or SBOM. A convenient solution; however, upon inspection, SBOMs can be problematic.

The weakness of the SBOM

During today's interview, Joel Krooswik, Federal CTO for GitLab, described in detail some of the ways software must be continuously protected. According to the SBOM approach, the code is clean when it leaves the "shelf." However, due to continuous improvement, code changes hourly. All an SBOM provides is a certification, at a specific point in time, against known vulnerabilities.

Joel Krooswik gives listeners an enterprise architect's perspective. He indicates that digital transformation introduces new code, new architectures, and innovative approaches. At any step along the way, security can be compromised.

The unknown unknowns

Donald Rumsfeld famously said, "There are unknown unknowns." This applies directly to what GitLab calls "fuzz" testing, which allows professionals to throw random inputs at a system to see what happens. Finally, you get a view of potential problems that are not obvious.

Joel Krooswik presents many insights when it comes to protecting software. He notes that just because a system is identified as needing a patch does not mean the patch will be applied in a flash. Understanding all the risk factors will allow a federal leader to make a prudent choice when it comes to protecting software systems.
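Fuzz testing, as described above, can be as simple as feeding random strings to a program and recording what breaks. A minimal sketch in Python (the `parse_date` target is an invented stand-in, not GitLab's tooling):

```python
import random
import string

def parse_date(s):
    # Toy target: expects strings shaped like "2024-07-16"
    year, month, day = s.split("-")
    return int(year), int(month), int(day)

def fuzz(target, trials=1000):
    """Throw random inputs at a function and record which ones crash."""
    failures = []
    for _ in range(trials):
        candidate = "".join(
            random.choice(string.printable)
            for _ in range(random.randint(0, 20))
        )
        try:
            target(candidate)
        except Exception as exc:
            failures.append((candidate, type(exc).__name__))
    return failures

crashes = fuzz(parse_date)
print(f"{len(crashes)} of 1000 random inputs raised an exception")
```

Nearly every random input crashes this toy parser, which is the point: the crashing inputs reveal failure modes a developer never thought to test.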

Jul 9, 2024 • 26min
Ep. 161 How to Overcome the Challenge of Modernizing Legacy Systems
All the headlines would make you think the federal government is spending millions of dollars on bleeding-edge, innovative technologies. However, a more detached look at the funding leads to the conclusion that roughly 80% of technology spending goes to operations and maintenance. If this continues, we will have unreliable systems that cannot effectively manage the current volume of data.

Today, we sit down with Badri Sriraman, Senior Vice President at Karsun Solutions. He has years of experience helping federal agencies make this important strategic transition, and he has a deep, thorough understanding of many federal systems, including records, engagement, and intelligence.

Badri suggests that one potential use of artificial intelligence is to gain a better understanding of existing legacy systems. You may discover which data is duplicative or useless and what serious dependencies the existing system has built in. From there, a plan can be devised in which a segment of the legacy system is transitioned and evaluated. One tool that Karsun Solutions has used successfully is called "goredux.ai." It was designed to give an enterprise architect an idea of how to make a strategic transition.

The interview ends on a serious note. Legacy systems are more likely to have known vulnerabilities, older systems carry inherent added costs, and they may not allow federal leaders to reach agency goals. Finally, if the entire budget is assigned to operations and maintenance, you can paint yourself into a corner with no budget left for modernization.

Jul 5, 2024 • 1min
Trailer
The federal government purchases over $90 billion a year in information technology products and services. The purpose of the Federal Tech Podcast is to share strategies and tactics for tapping that market for your company. Each week, we sit down with technology leaders to hear how their solutions fit into the complex federal technology world.

For more information on the Federal Tech Podcast, follow John Gilroy on LinkedIn.

Jul 2, 2024 • 29min
Ep. 158 Safe Use of AI with Privacy Enhancing Technology
Every technology has a maturation cycle; today, we see Artificial Intelligence transitioning from parlor trick to serious applications. The federal government wants secure and reliable solutions to solve problems in the military and healthcare.

Our guest today is Dr. Ellison Anne Williams, who holds a PhD in mathematics and is the founder of Enveil. She provides an overview of AI security, suggesting that a model is only as good as the data over which you train and use it. AI is exposed to large data sets, and models encode the data on which they were trained. This process can leave a model vulnerable and open to attack. She describes one attack, called "model inversion": a machine learning technique that examines a model's output and infers personal information about its data subjects.

Dr. Williams suggests a group of technologies called Privacy Enhancing Technology (PET). During the interview, she gives an overview of how PET can securely and privately train a model to produce richer insights. PET allows leaders to securely use a wider range of data sources; for example, homomorphic encryption lets you safely train a model over sensitive data.

This interview is an overview of a technology that can allow federal agencies that handle sensitive information to leverage the speed and insights that AI can provide.
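The idea behind homomorphic encryption is that certain operations carried out on ciphertexts survive decryption, so a third party can compute over data it never sees in the clear. A deliberately insecure toy scheme in Python illustrates the additive case (an illustration of the concept only, not Enveil's technology or a real cryptosystem):

```python
# Toy additively homomorphic "encryption": multiply by a secret factor.
# Real schemes (e.g., Paillier) achieve this securely; this one is NOT secure.
SECRET = 7919  # stand-in key, chosen arbitrarily for the demo

def encrypt(m: int) -> int:
    return m * SECRET

def decrypt(c: int) -> int:
    return c // SECRET

# A third party can sum the ciphertexts without ever seeing 12 or 30...
total_ciphertext = encrypt(12) + encrypt(30)

# ...and only the key holder decrypts the correct sum.
assert decrypt(total_ciphertext) == 42
```

Production schemes rest on hard lattice or number-theoretic problems rather than a shared multiplier, but the contract is the same: compute on encrypted data, decrypt a correct result.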

Jun 25, 2024 • 24min
Ep. 157 What Value is a Software Bill of Materials?
Years ago, people would laboriously write code character by character. This tedious process took hours and introduced errors. Over the years, libraries of prewritten code evolved that allow software developers to "grab" some code, modify it, and finish a project earlier. Malicious actors have taken advantage of this shortcut and have injected code into these software libraries that gets taken along for the ride.

One proposed solution is borrowed from the shipping industry: a commercial invoice may be packaged with a bill of lading to indicate the contents of the package. This "assurance" has been transferred to the world of prewritten code and is now called a "Software Bill of Materials," or SBOM.

In a world where you are shipping a ton of Portland Type II cement overseas, a bill of lading works fine; it has some challenges transferring to the dynamic world of software. In a typical federal environment, the code itself changes continuously. It would be difficult to change one ton of a manufactured product like Portland Type II cement; however, a once-approved software package may have accumulated so many changes that its Software Bill of Materials no longer has any validity.

During today's interview, David Jurkiewicz unpacks the concept of an initial SBOM and then explains how software packages can evolve over time and still retain compliance. His company can take this basic guarantee and examine the software for many concerns, including:

· Vulnerabilities
· Dependencies
· Integrity
· Malware
· Foreign presence
· Licenses

David Jurkiewicz provides details on how companies can resolve vulnerabilities and ensure safe operations in a world where code is grabbed off the shelf and slipped into a package.
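At its simplest, an SBOM-based check compares each declared component and version against a list of known vulnerabilities. A minimal sketch in Python (the component names and the vulnerability entry are hypothetical, invented for illustration):

```python
# A tiny SBOM: just component names and versions, the way a bill of
# lading lists cargo. Real SBOMs use formats such as CycloneDX or SPDX.
sbom = {
    "components": [
        {"name": "fast-parser", "version": "1.2.0"},
        {"name": "net-utils", "version": "0.9.1"},
    ]
}

# Hypothetical known-vulnerability list keyed by (name, version).
known_vulns = {
    ("net-utils", "0.9.1"): "CVE-XXXX-0001 (hypothetical)",
}

def flag_vulnerable(sbom: dict, vulns: dict) -> list:
    """Return every component that matches a known-vulnerability entry."""
    return [
        (c["name"], c["version"], vulns[(c["name"], c["version"])])
        for c in sbom["components"]
        if (c["name"], c["version"]) in vulns
    ]
```

Note the staleness problem the episode describes: the check is only valid for the exact versions recorded, so once the shipped code changes, the SBOM must be regenerated or it silently certifies the wrong cargo.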

Jun 20, 2024 • 22min
Ep. 156 Applying Telemetry to Federal Networks
One of the most practical applications of Artificial Intelligence (AI) is assisting with network observability. The big move to Zero Trust is predicated on a thorough understanding of network assets, and that is a significant issue for federal information technology: we have legacy systems, shadow IT, and a deluge of data, in addition to the confusion a hybrid network can bring.

Riverbed takes a phrase from science: telemetry. Initially, it was used to troubleshoot the original network, the power grid. Since then, the term has come to describe standard data collection for analyzing information on a digital network. The fact that 98% of the Fortune 100 use Riverbed to determine network status makes the company the de facto leader in the market.

Today, we sat down with Jeff Waters to help us understand how Riverbed can be applied to federal systems. You would expect Jeff to emphasize network management; however, he shows how the basic "telemetry" approach can also improve user experience. The approach is simple: if a technology can observe movement on a network, it can be applied to understanding how federal sites are used by citizens.

At the end of the interview, we move from DevOps to Artificial Intelligence Ops, or AI Ops. This concept allows Riverbed to understand a situation and offer remediation; because the network is so well understood, the solution is effectuated quickly.

Telemetry: from old-school electrical troubleshooting to improving user experience on a federal website.

Jun 18, 2024 • 27min
Ep. 155 For Federal Data, Data is the Gold
In 1848, gold was found at Sutter's Mill in California. According to the CEO of Cohesity, data is the new gold. Today, we sit down with David Kushner from Cohesity to unpack what this "gold" reference means for federal technology leaders.

In a sense, this has always been the case; today, however, cheap storage, fast Internet, and Generative AI are producing an overwhelming number of data sets, and the challenge is how to protect them. This "perfect storm" has not gone unnoticed by federal leaders. Look casually at a few recent federal mandates and you constantly see references to security and Artificial Intelligence:

· White House: President Biden issues Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence
· Homeland Security: Promoting AI Safety and Security
· OMB: Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence

Cohesity has been in business for over a decade and has garnered a reputation for efficient management of critical assets. We sit down with David Kushner to unpack what these innovations mean for the federal government.

For example, we all know that the federal government gets attacked thousands of times a day, and we all know the "usual suspects." What is starting to happen now is that malicious code is being injected into backups. It is conceivable that a systems manager could reach for the backups and introduce compromised data into a sensitive data set.

During the discussion, David mentions that Cohesity has worked with over three hundred federal agencies across a wide variety of services. In February 2024, Cohesity launched a search assistant called Gaia, which allows enterprise-level organizations to use Large Language Models and Retrieval Augmented Generation in a manner that satisfies compliance mandates.

Listen to learn about Gaia, backups, and a new world where data is the gold being exfiltrated.


