

Federal Tech Podcast: for innovators, entrepreneurs, and CEOs who want to increase reach and improve brand awareness
John Gilroy
The federal government spends $90 billion on technology every year.
If you are a tech innovator and want to expand your share of the market, this is the podcast for you to find new opportunities for growth.
Every week, Federal Tech Podcast sits down with successful innovators who have solved complex computer system problems for federal agencies. They cover topics like Artificial Intelligence, Zero Trust, and the Hybrid Cloud. You can listen to the technical issues that concern federal agencies to see if your company’s capabilities fit.
The moderator, John Gilroy, is an award-winning lecturer at Georgetown University and has recorded over 1,000 interviews. His interviews are humorous and entertaining despite handling a serious topic.
The podcast answers questions like . . .
How can software companies work with the federal government?
What are federal business opportunities?
Who are the cloud providers who work with the federal government?
Should I partner with a federal technology contractor?
What is a federal reseller?
Connect with John Gilroy on LinkedIn
https://www.linkedin.com/in/john-gilroy/
Want to listen to other episodes?
www.Federaltechpodcast.com
Episodes

Mar 14, 2023 • 24min
Ep. 55 The value of Synthetic Data for the Software Development Lifecycle
Leaders at NASA know the value of simulation. Before the Rover was sent to Mars, designers ran it through many scenarios on Earth. Of course, they couldn’t reproduce the climatic conditions of the Red Planet, but they knew they had to try to simulate the environment before it left on a 103.3-million-mile journey. Their pioneering work was probably developed out of training simulations for pilots. Why crash a $200 million plane if you can rehearse in a “life-like” cockpit at a training facility? Software developers face a similar problem, but with different variables. Instead of a pilot failing to land a plane, they may design a system that exposes private information or lacks the interoperability the designers planned for. The solution is to artificially generate data that closely resembles “live” information. That way, they can run simulations to learn about unexpected events when systems collide. Thomas George, a data scientist from Vidoori, explains many of the concepts behind synthetic data and its applications. For example, a simulation can be run with synthetic data that shows what the expected value of a financial transaction should be. If a set of data is very close to real data, one can tell whether there are any waste, fraud, or abuse possibilities. From a data management perspective, one can take a large data set and “pressure test” the workflow to see if there is latency in the architecture. Thomas George observes that applying artificial data should be an essential practice for large organizations with sensitive data that needs to be protected.

Mar 9, 2023 • 26min
Ep. 54 Understanding the strengths and limitations of Robotic Process Automation for Federal Systems
Many phrases are bandied about that give the impression they will solve the world’s problems. You know them as well as I do: artificial intelligence, machine learning, and even Robotic Process Automation. This is an interview with Brian Baney from Aeyon. He has worked with many varying applications of RPA to solve federal problems and can offer an objective perspective. During the interview, Brian gives a well-rounded analysis of the strengths and weaknesses of applying RPA to federal systems. The discussion begins with defining terms. Rather than a panacea for all ills, Brian reminds listeners that an RPA system merely responds to programming languages, not human languages. As a result, much consideration must be given to the design of the RPA system. This includes identifying data sources, organizing data, and knowing where a human must step into the process. Brian’s company, Aeyon, has a fantastic track record with organizations ranging from the Air Force to the Marines to NASA. He gives unfiltered answers to the questions presented in the interview. For example, he willingly admits bots can add complexity to a system. One bot may be difficult to manage; what kind of unintended consequences will accrue when a group of bots interacts with each other? The answer is to have experience with process mining and business process management in order to take advantage of the benefits RPA brings to the table. These include reducing errors, eliminating mundane tasks, and freeing up time for federal technology professionals to spend on value-added tasks like determining the impact of multiple bots on a federal system.

Mar 6, 2023 • 22min
Ep. 53 How Appian solves the Process Automation Problem for the Federal Government
Big problems need big solutions. Let us take the Veterans Benefits Administration as an example. It has been reported that the VBA completed a record 1.7 million total claims in fiscal 2022 and disburses about $100 billion in benefits. That would certainly qualify as a major challenge for any workflow scenario. No management course in the world would teach that anyone can manually handle that level of complexity. The role of process automation is to look at an overwhelming situation like the one above, determine what the repetitive tasks are, and design automation to speed up the delivery process. That sentence is easy to write; in practice, however, there are many permutations to automation. Let’s start with how the data is stored before automation is even considered. If a large organization chooses an architecture that involves data lakes or data warehouses, it limits itself to a central storage facility. During the interview, Mike Beckley from Appian recommends beginning with streamlining the intake process. From there, the manual processes can be automated, and then the workflow can be made more efficient. Process automation can be established, but one has to remember that workflows can change and compliance requirements may be updated as well. A systems analyst must have a way to easily visualize the automation process so it can adapt to a changing world. Appian has been a leader in helping federal systems automate; it is reported to be used in over 200 government agencies. Results have included a drastic reduction in claims processing time, increased accuracy, and the elimination of redundant systems. Listen to the interview to hear how Mike responds to questions about artificial intelligence, continuous improvement, and the Software Bill of Materials.

Mar 2, 2023 • 28min
Ep. 52 Cloud security for large federal organizations: “If it’s reachable, it’s breachable”
Today’s interview focuses on how the commercial success of Zscaler and its hundreds of patents can help large military organizations reduce costs and increase the security of a cloud transition. A good way to understand the challenges the U.S. Army is facing is through some of the comments that Raj Iyer, the Army’s Chief Information Officer, made during his recent exit interview. When he started his position, he characterized the information technology the Army used as decentralized. This resulted in duplicate systems and no standard way to prevent attacks. In making the transition to the cloud, he had to look above the stovepipes to spot duplicate systems and optimize any investment the Army made. Steve Kovac from Zscaler outlines how its technology can help leaders like Raj Iyer reduce the costs of a cloud transition from massive systems. Steve starts by discussing Zscaler’s achievement of FedRAMP High authorization across the board. This is unique because it allows Zscaler to reach all levels of secure data in a military application. When that is combined with Zscaler’s Security Cloud, the DoD can provide communications that are not only secure but fast. Essentially, Zscaler provides a secure “first hop” during an interaction with a federal system. It is secure because it can completely obfuscate the ability of a malicious actor to intercept the communication. In a humorous and entertaining phrase, Hansang Bae from Zscaler sums up the threat by saying, “If it’s reachable, it’s breachable.” Zscaler provides a solution that eliminates the ability of a malicious actor to “reach” a system. The message is clear: use a trusted intermediary technology to provide secure, flexible, and scalable access. If you would like to hear more about federal applications, Zscaler is holding its inaugural Public Sector Summit on March 8, 2023, where federal leaders talk about overcoming the challenges of large systems so they can implement Zero Trust.

Feb 28, 2023 • 22min
Ep. 51 Putting it all together: the dark web, the FBI, and federal cybersecurity.
ReliaQuest is a very successful commercial company, and it has a set of skills that can be directly applied to the federal government. Today’s interview is with Michael McPherson. He worked for the FBI for over 25 years, and he chose to join ReliaQuest because he believes its technology offers the best hope for securing all networks, including federal ones. During the interview, he explains why he believes this combination works. First, his background can help federal agencies prioritize existing risks. Second, ReliaQuest’s track record in the commercial world gives it skills in scaling and security optimization that are in demand inside the Beltway. Finally, ReliaQuest recently acquired a company called Digital Shadows. This gives ReliaQuest an unusual viewpoint on activity on the Dark Web. Malicious actors, the origins of attacks, and attack methods can be conveyed to the ReliaQuest platform to give federal leaders a wider range of threat information. Continuous improvement is a phrase that is popular in the Agile and DevOps world. The idea applies to your agency’s cyber posture as well. Once a system that can manage, detect, and respond to threats is in place, it must be updated regularly. ReliaQuest has proven in the highly competitive commercial world that its system can act as a force multiplier in responding to cyberattacks.

Feb 14, 2023 • 23min
Ep. 50 Data Storage Strategies for Complex Federal Systems
Learn how Pure Storage provides solutions for federal technology leaders, including reduced costs for data centers, managing unstructured data with Flash Blade, and the importance of abstracting data before it is outdated.

Feb 14, 2023 • 21min
Ep. 49 How to Discover, Manage, and Secure a Federal Network
The podcast discusses the challenges of managing federal technology systems, including the need for visibility and asset management. It explores the risks of software sprawl and the importance of managing unused software assets. The concept of technical debt is also explored, along with the role of Elastic in efficient resource management.

Feb 8, 2023 • 34min
Ep. 48 Deadlines, CMMC, and the Defense Industrial Base
When the concept of Cybersecurity Maturity Model Certification (CMMC) was first developed, nobody envisioned the roller-coaster ride it would take since its origins in Executive Order 13556 of 2010, with its emphasis on Controlled Unclassified Information. The goal was to assess and enhance the cybersecurity posture of contractors who serve the DoD. The target framework was a document from NIST called SP 800-171. Over the years, the CMMC guidelines have evolved, and so have the recommendations from NIST. During this period, communication from the DoD about CMMC has ranged from constant briefings to stretches when the DoD was incommunicado. The result of that unusual series of events is a deadline in November of 2023, or possibly earlier, when companies will be expected to comply with the revised regulations. Today, we sat down with Igor Volovich from Qmulos to put a framework around CMMC and give the 300,000 members of the Defense Industrial Base a handle on today’s status. During the interview, Igor repeats his core message: don’t wait until the last minute to begin the process. You could end up looking at your competition in full compliance while your company runs out of time. He suggests you start with a thorough understanding of the basis for CMMC, the NIST SP 800-171 document. Next, don’t forget your company is part of a matrix of vendors; you should contact your partners and affiliates to see where the shared responsibility lies. Finally, Igor suggests you speak to vendors who may be able to help. Chances are, if you wait, you will be overwhelmed with work. The normal reaction is to seek help at that point; however, you may encounter CMMC compliance experts with a serious backlog. The lesson: understand the requirements, seek help from affiliates, contact people with expertise to help with the rough spots, and most of all . . . DO NOT DELAY.

Feb 6, 2023 • 24min
Ep. 47 – Understanding Federal Security: FedRAMP High, segmentation, and Zero Trust
The podcast discusses Akamai's role in federal cybersecurity, including their expansion into Zero Trust and segmentation. It explores edge computing and FedRAMP accreditation, as well as the US government's proactive approach to cybersecurity. The future of federal security is predicted to focus on zero trust architecture, cloud, APIs, and edge computing.

Feb 2, 2023 • 26min
Ep. 46 Software Project Management and the Shift Left
Federal leaders will attest to the statement, “Security must be top of mind throughout an application’s development.” Today, we sat down with Jeff Gallimore, Chief Technology and Innovation Officer at Excella, to see how this noble concept can be applied to the amazingly complex and ever-changing world of federal technology. During the interview, Jeff highlights the areas of continuous improvement, naming conventions, and the shift left. If you were to watch a police movie, you would undoubtedly encounter the abbreviation CI, which stands for Confidential Informant. In today’s discussion of cybersecurity and software, however, CI takes on a new meaning: Continuous Improvement. Jeff Gallimore describes CI as integral to keeping a software project safe. The concept was broached in 2001 with the Manifesto for Agile Software Development, when a group of developers met on a mountaintop and set down principles for improving software development. Near the top of the list was “responding to change,” what we now call continuous improvement. Chances are, those experienced developers could not have anticipated the drastic increase in Internet usage and attacks. All this highlights the need to adapt code. Moving on to other terms: when asked to differentiate between DevOps and DevSecOps, Jeff did not want to engage in the latest nomenclature debate. He thinks federal leaders should focus on outputs, not on defining processes. In the time a team spends debating DevOps, it could be moving on to another issue. One more phrase was defined: Shift Left. No, it has nothing to do with politics; it refers to the traditional way software developers would write code, diagramming a process on a large whiteboard from left to right. In this context, a “shift left” indicates an interest in including cybersecurity at earlier stages of the software development life cycle.
Jeff also commented on the role of automation in managing large hybrid cloud projects. Automation is often offered as the remedy for this complicated circumstance. However, the range of point solutions and platforms merely reinforces the importance of humans understanding the flow of a project.