The POWER Podcast

POWER
May 6, 2021 • 25min

87. How Artificial Intelligence Is Improving the Energy Efficiency of Buildings

A lot of energy is consumed by buildings. In fact, the Alliance to Save Energy, a nonprofit energy efficiency advocacy group, says buildings account for about 40% of all U.S. energy consumption and a similar proportion of greenhouse gas emissions. Some estimates suggest about 45% of the energy used in commercial buildings is consumed by heating, ventilation, and air conditioning (HVAC) systems, of which as much as 30% is often wasted.

Most power companies these days have energy efficiency programs that help customers identify waste and implement energy-saving measures, but there are also non-utility providers working on solutions. Montreal, Canada–based BrainBox AI is one of them. It’s using artificial intelligence (AI) to significantly reduce energy consumption in buildings. “We’ve developed an autonomous artificial intelligence technology that applies to commercial buildings in order to render their heating and cooling needs, which is typically the single largest consumer of energy in a building, much more efficient and certainly much more flexible to outside demands and occupant demands,” Sam Ramadori, president of BrainBox AI, said as a guest on The POWER Podcast.

The company’s autonomous AI HVAC technology studies how a building operates and analyzes the external factors affecting it. It identifies potential improvement opportunities and then acts to optimize the building’s system. It requires no human intervention and reacts to changes in the built environment immediately to maintain the highest tenant comfort and energy efficiency at all times. “What’s exciting is you don’t have to picture a room full of dozens of engineers managing and monitoring these buildings. It’s truly the AI optimizing the building in real time without human intervention,” Ramadori said.
Surprisingly, the BrainBox technology does not require any changes to be made to most buildings’ HVAC systems. It simply connects to what’s already installed and utilizes existing sensors and data, along with third-party resources such as weather forecasts and occupancy information, to drive decision-making.

It’s easy to imagine how a building’s HVAC needs change through the course of a day. For example, east-facing offices may require more cooling in earlier parts of the day as the sun rises, while west-facing offices may require more cooling later in the day as the sun shines through windows in the afternoon. The BrainBox technology accounts for those sorts of changes and adjusts dampers to keep each zone optimally heated or cooled.

But it doesn’t end there; the AI is constantly learning and evolving. Ramadori explained how changes in a building’s surroundings would also be picked up and accounted for by the technology. “What happens if across the street on the south-facing side, right now there’s a parking lot, and then in a year, they build a tower right next to it? Well, what happens is that tower is now throwing shade onto part of your building for part of the day. So suddenly, the behavior of those rooms has changed,” Ramadori said. “What’s exciting is no one has to tell the AI that there’s a building that just went up next door. It will just learn that ‘Wait a second, those rooms that used to get hot at noon, you know, for the bottom half of my building, no longer are getting that hot anymore.’ It doesn’t know why, but it doesn't matter. It just knows. It’ll relearn—by itself, without a human reprogramming it—the new behavior caused by that building built next door.”

“We’re cutting energy consumption in a building typically by 20 to 25%—so, it’s a large reduction—and we do so without turning one screw, which makes it super exciting and powerful,” said Ramadori.
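Ramadori’s shading example hints at how such a system can adapt without reprogramming. As a rough illustration only (this is not BrainBox AI’s actual algorithm, and every name and number below is hypothetical), a controller could track each zone’s observed midday temperature rise with a simple moving average and size its damper response accordingly:

```python
# Minimal sketch of a controller relearning a zone's solar heat gain
# from observations alone -- illustrative only, NOT BrainBox AI's
# actual algorithm. All names and numbers are hypothetical.

class ZoneLearner:
    """Tracks a zone's expected noon temperature rise with an
    exponential moving average, so a change outside (e.g. a new
    tower shading the facade) is absorbed automatically."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha          # learning rate
        self.expected_rise = None   # learned deg-C rise at noon

    def observe(self, measured_rise):
        if self.expected_rise is None:
            self.expected_rise = measured_rise
        else:
            self.expected_rise += self.alpha * (measured_rise - self.expected_rise)

    def damper_opening(self, max_rise=5.0):
        """More cooling airflow for zones expected to heat up more."""
        if self.expected_rise is None:
            return 0.5              # neutral default before any data
        return max(0.0, min(1.0, self.expected_rise / max_rise))

zone = ZoneLearner()
for _ in range(30):                 # a month of sunny noons: ~4 C rise
    zone.observe(4.0)
sunny = zone.damper_opening()
for _ in range(60):                 # tower built next door: rise drops to ~1 C
    zone.observe(1.0)
shaded = zone.damper_opening()
print(round(sunny, 2), round(shaded, 2))   # -> 0.8 0.2
```

When the observed noon heat gain drops (the new tower’s shade), the learned estimate, and with it the damper opening, drifts down on its own, with no one reprogramming the controller.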
Apr 29, 2021 • 24min

86. Serious Power Transmission without Wires Is Closer Than You Think

Most people are aware that wireless charging technology is available today for small electronic devices, such as cell phones and watches, but when it comes to larger-scale power systems, the concept of wireless transmission of electricity probably seems like science fiction. The truth, however, is that systems have been developed and are being tested that could result in kilowatts of power being transmitted over distances of kilometers very soon. “We are looking to have these sort of higher-power, kilowatt-class devices at kilometer-scale distances out for early customer testing and use in the next couple of years,” Tom Nugent, co-founder and CTO of PowerLight Technologies, said as a guest on The POWER Podcast.

Unlike most wireless cell phone chargers, which produce a magnetic field from which a small coil in the device harvests energy to charge the battery, PowerLight uses optical power beaming technology, which converts electricity into high-intensity light. PowerLight’s system then shapes, directs, and beams the light to a specialized solar cell receiver that converts the light back to direct-current power. Through the beam, the company says, “power can travel over long distances, at high altitudes, and in the deep sea—maintaining uptime, from near and far.” The innovative beam-shaping design “optimizes the energy of the beam at the start, to minimize losses across the transfer medium and maximize power in the end.” “This is a way to take energy from somewhere where it’s easy to generate or access, whether that’s a generator or an electrical outlet, and we convert that electricity into light, and then project it either through the air or through optical fibers to some remote location where it may be very difficult to get power to,” Nugent said.
“What this really is, is a wireless extension cord.” PowerLight has already conducted demonstrations in which it delivered as much as a kilowatt of continuous power. “One of the advantages of using near-infrared light, as we do, is that it allows you to go very long distances—kilometers or even more,” said Nugent. In fact, the company has delivered power over distances of one kilometer in demonstrations.

Currently, PowerLight is focused on providing solutions for the telecommunications and construction industries, and for the military. Some of the applications that seem particularly promising include powering communication nodes, security sensors, and drones. However, as the technology evolves, Nugent envisions scenarios in which megawatts of power could be delivered over hundreds of kilometers to remote military bases or small islands—places where it would be impractical to run wires.

Nugent said PowerLight is getting very close to releasing some new products to the market. “It’s something that many people haven’t heard of, or don’t realize where the technology is, and it’s actually much, much closer to reality than a lot of people may have thought,” he said.
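The chain Nugent describes (electricity to light, light through the air, light back to electricity) multiplies the efficiency of each stage, which is why beam shaping to minimize losses matters so much. The stage figures below are illustrative assumptions for a back-of-envelope estimate, not PowerLight’s published numbers:

```python
# Back-of-envelope end-to-end efficiency of an optical power beaming
# link. Every stage efficiency below is an assumed, illustrative
# figure -- PowerLight has not published these exact numbers.

stages = {
    "electricity_to_laser": 0.50,   # electrical-to-optical conversion
    "beam_propagation":     0.90,   # losses through air over ~1 km
    "pv_receiver":          0.50,   # specialized cell tuned to the laser line
}

eta = 1.0
for name, e in stages.items():
    eta *= e                        # overall efficiency is the product

p_in_kw = 5.0                       # power drawn at the transmitter
p_out_kw = p_in_kw * eta
print(f"end-to-end efficiency: {eta:.1%}, delivered: {p_out_kw:.2f} kW")
```

Under these assumptions roughly a fifth of the input power arrives at the receiver, which is why the technology targets places where running a wire is impractical rather than bulk transmission.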
Apr 15, 2021 • 33min

85. What's Been Holding Hydrogen Fuel Cells Back, and How to Change That

The technology used in modern hydrogen fuel cells is not new. In fact, NASA used fuel cells for its manned space missions in the 1960s. But fuel cells have not really “taken off” (pardon the pun) in earthly applications since that time. Some industry insiders believe that will change very soon. “We’ve been sort of monitoring hydrogen for a number of years and doing some research in it, and it became clear to us over the past few years that hydrogen can play a huge role in fighting the climate crisis and decarbonizing hard-to-decarbonize sectors,” Amy Adams, vice president of Fuel Cell and Hydrogen Technologies with Cummins, said as a guest on The POWER Podcast. Among the ways Adams envisions hydrogen being utilized is in fuel cells powering such things as trucks, buses, trains, and ships. There are also stationary applications, including electric power generation, that could be a good fit.

So, what’s been hindering deployment of fuel cells to date? Adams suggested there were four main things holding back widespread adoption of the technology. “First of all is just technical readiness,” said Adams. However, she noted that fuel cell technology has been evolving, and advancements have led to longer-lasting, better-performing, more-efficient, and larger-scale fuel cell systems. “They’re now ready for primetime, if you will, in several applications.”

Another barrier has been infrastructure readiness. “That’s got two pieces,” Adams said. “One is the availability of hydrogen, so having hydrogen refueling stations, and then the cost of the hydrogen at the pump.” Adams noted that Cummins has been involved in a number of refueling station projects that use electrolyzers to produce hydrogen. The company has also partnered with ETC in a joint venture called NPROXX, which is based in Europe and will provide customers with hydrogen products for both on-highway and rail applications.
Adams said many companies within the industry are working to address the infrastructure challenge, so she expects that to build out over time. A third obstacle has been regulation, but policymakers around the world are beginning to help on that front too. “We continue to see a lot of government activity to accelerate the rate of adoption, both through mandates and incentives, tax credits, carbon taxes, etc. So, that’s going to help accelerate investment in both innovation and R&D [research and development], as well as larger-scale deployments,” she said.

Lastly, in the past, total cost of ownership has not been where it needed to be. “With any technology adoption, it has to make sense for the customer from a business perspective,” said Adams. But that is also changing. “The costs have come down significantly, and will continue to go down as we go throughout this decade,” she said. According to Cummins’ total cost of ownership analysis, fuel cells will reach parity with diesel engines in heavy-duty truck applications by 2030 or sooner. “We’ve seen positive progress in all of those areas, which is why we see increased interest now and what we believe will be increased adoption over the next few years,” Adams said.

One country that has already seen significant growth in fuel cell usage is South Korea. POWER reported on three new electricity generating facilities based on fuel cell technology that were deployed in South Korea last summer: a 50-MW power plant placed in service by Hanwha Energy at its Daesan Industrial Complex in Seosan, a 19.8-MW installation in Hwasung, and an 8.1-MW facility in Paju. “Part of the magic that we’re seeing in Korea as it relates to stationary power using fuel cells is incentives,” said Joe Cargnelli, director of engineering for Cummins’ Fuel Cell and Hydrogen Technologies division. “So, they have incentives that promote the deployment of stationary fuel cells and [they’ve been] highly successful, and I think it’s a great strategy.”
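The kind of total-cost-of-ownership comparison Adams refers to can be sketched in a few lines. Every figure below is a made-up placeholder rather than Cummins data; the point is only the structure of the calculation: a higher upfront cost can still win once fuel and maintenance are counted over a vehicle’s life.

```python
# Toy total-cost-of-ownership comparison for a heavy-duty truck, in
# the spirit of the analysis Adams describes. Every number is an
# invented placeholder, NOT Cummins data.

def tco(capex, fuel_per_km, maint_per_km, km_per_year, years):
    """Lifetime cost = purchase price + (fuel + maintenance) per km."""
    return capex + (fuel_per_km + maint_per_km) * km_per_year * years

km_per_year, years = 120_000, 8

diesel    = tco(capex=150_000, fuel_per_km=0.45, maint_per_km=0.15,
                km_per_year=km_per_year, years=years)
fuel_cell = tco(capex=280_000, fuel_per_km=0.30, maint_per_km=0.10,
                km_per_year=km_per_year, years=years)

print(f"diesel: ${diesel:,.0f}  fuel cell: ${fuel_cell:,.0f}")
```

With these assumed inputs the fuel cell truck costs more up front but less overall, which is the crossover the industry expects as hydrogen prices and capital costs fall.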
Apr 8, 2021 • 20min

84. Solar Energy in the Sunshine State: FPL Leads the Way

Florida is known as “The Sunshine State,” so it’s no surprise that solar energy is growing rapidly across the state. Among the utilities adding solar resources to their energy mixes is Florida Power & Light Co. (FPL). FPL claims to be the largest energy company in the U.S. as measured by retail electricity produced and sold. The company serves more than 5.6 million customer accounts supporting more than 11 million residents across Florida. FPL—a subsidiary of Juno Beach, Florida–based NextEra Energy—says it operates “one of the cleanest power generation fleets in the U.S.”

“We are big fans of solar energy, and we’ve been working to advance solar in the state for more than a decade,” Jill Dvareckas, senior director of development with FPL, said as a guest on The POWER Podcast. “We currently have 37 solar energy centers in operation, with seven more under construction, which makes FPL the largest producer of solar power in Florida.” FPL stuck its proverbial “toe in the water” back in 1984 when it constructed a 10-kW photovoltaic (PV) facility in Miami, but it didn’t really get serious about solar until 2009, when it built a 25-MW solar energy center in DeSoto County. Since then, 35 similarly sized installations (74.5 MW each) have been added. “Our commitment to clean energy is evidenced by our groundbreaking ’30-by-30’ goal to install 30 million solar panels by the year 2030,” Dvareckas said. If the company succeeds in reaching that target, solar energy will make up about 20% of FPL’s power capacity by the end of the decade.

In her position, Dvareckas is also responsible for the deployment of other cutting-edge technology, including electric vehicle (EV) and battery storage programs. “There’s no doubt that the electric transportation revolution is underway already,” she said. “FPL has been investing in clean transportation for over a decade.
We were the first electric company in America to place the hybrid electric bucket truck into service in 2006.” Today, the company has one of the largest “green” fleets in the nation, with nearly 1,800 vehicles that are either biodiesel-fueled, plug-in hybrids, or EVs.

FPL also has an EV charging infrastructure pilot program, called FPL EVolution. “Our goal with the program is to install 1,000 charging ports in 100 locations in our service area across the state to increase the availability of universal EV charging by 50%,” Dvareckas said. Ultimately, more chargers mean less range anxiety for EV owners, which many consumers cite as a reason for not wanting to purchase an EV. “From our perspective, this is a pilot program that is really enabling us to learn as the utility ahead of mass adoption to ensure that the infrastructure upgrades and placement that we’re making in the future is done in a thoughtful manner that benefits all of our customers,” said Dvareckas.
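Some quick arithmetic puts the “30-by-30” goal in perspective against FPL’s 74.5-MW solar energy centers. The per-panel rating below is an assumed figure for illustration; FPL has not tied the goal to a specific module wattage here:

```python
# Rough arithmetic behind FPL's "30-by-30" goal. The per-panel
# wattage is an illustrative assumption, not an FPL specification.

panels = 30_000_000
watts_per_panel = 400            # assumed modern utility-scale module
site_mw = 74.5                   # FPL's typical solar energy center size

total_mw = panels * watts_per_panel / 1_000_000
panels_per_site = site_mw * 1_000_000 / watts_per_panel
sites_needed = total_mw / site_mw

print(f"{total_mw:,.0f} MW total, ~{panels_per_site:,.0f} panels per site, "
      f"~{sites_needed:.0f} sites")
```

At an assumed 400 W per module, 30 million panels works out to roughly 12 GW of capacity, on the order of 160 of FPL’s standard-size solar energy centers.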
Mar 25, 2021 • 34min

83. Understanding Energy Crises of the 1970s and Avoiding Problems Today

If you were alive and living in the U.S. during the 1970s, you probably remember waiting in long lines to fill your car with fuel. Yet, gasoline wasn’t the only item in short supply during the “Me Decade”—natural gas was seemingly running out and electricity demand was growing so much that new power plants were going up all over the country. “I would argue, and I think a lot of historians would agree with me, that the 1970s was the most important decade in U.S. energy history, and I say that because of the gasoline interruptions. We had three big crises in the Middle East that reduced our supplies of oil, and that got so bad that at one point, in some states, less than 50% of the stations had any gasoline to sell at all,” Jay Hakes, author of the forthcoming book Energy Crises: Nixon, Ford, Carter, and Hard Choices in the 1970s, said as a guest on The POWER Podcast.

“It was also a time where electric demand was expanding at a very rapid rate. There was a lot of optimism that nuclear would fill most of that void,” Hakes said. However, as fate would have it, the Three Mile Island (TMI) accident in 1979 pretty much put an end to the nuclear power construction heyday.

In addition to writing books, Hakes served as the administrator of the U.S. Energy Information Administration during the Clinton administration and as director for Research and Policy for President Obama’s BP Deepwater Horizon Oil Spill Commission. He was also the director of the Jimmy Carter Presidential Library for 13 years, and he has had access to some of President Carter’s personal diaries, giving him unique insight into the events that occurred during Carter’s presidency. “Jimmy Carter worked for Admiral Rickover when they developed the first nuclear submarine,” Hakes pointed out.
“So, he actually knew the technology of nuclear reactors—obviously better than any president and better than some of the people that worked at the Atomic Energy Commission.” Carter had also spent time on recovery efforts after the world’s first nuclear accident, which occurred at the Chalk River site in Ontario, Canada, in 1952. Carter was part of a group that was sent into the containment vessel to clean it up. “So, he would be the best president you’d want to have if there was a nuclear accident.”

Hakes noted that reports being sent to the president during the first couple of days after the TMI accident were mostly positive. However, on the third day, Carter decided he needed someone with technical expertise at the site to provide him with better details, so he had a direct phone line set up with Harold Denton, who was onsite following the situation as the head of nuclear reactors for the Nuclear Regulatory Commission. “The short story is the coolant system, which keeps the core from melting, broke down, but the containment vessel—that four-foot-thick concrete structure that is around the reactor—did its job, and so, very little contamination reached the public,” Hakes said.

Following the incident, Carter formed a commission to investigate and recommend reforms for the nuclear industry. “I think that commission did an excellent job,” said Hakes, noting that many improvements were made based on the lessons learned. “The industry and the government both did a good job of fixing those safety problems. So, you know, in that sense, it’s a good model for dealing with energy crises.” Hakes explained some of the policies, not only of Carter’s administration, but also of Nixon’s, that exacerbated the energy crises of the 1970s, and he shared his insight on how President Biden’s agenda could affect the energy industry going forward. He noted that Biden has put a pause on leasing on federal lands, but said he doesn’t expect that to affect production for at least several years.
Mar 18, 2021 • 41min

82. Is It Safe to Invest in Mexican Energy Projects?

In late 2013, Mexico embarked on a path to transform its energy markets. Then-President Enrique Peña Nieto oversaw constitutional reforms that ended state-run monopolies and opened Mexico’s power market to competition and investment from foreign and private companies. By most accounts, the policies were highly effective in spurring investments in renewable energy and efficient natural gas-fired power projects. A great deal of money has been funneled into Mexico by investors from as many as 45 countries since the law was enacted. “The result of that was dramatically successful. I mean, you have millions and millions of dollars that were sunk into the power sector bringing in modern equipment, environmentally friendly, because there were a lot of renewable projects that went online. You see how the percentage of renewables changed in the last 10 years—you can see that it has been successful,” Roberto Aguirre Luzi, a partner with King & Spalding, said as a guest on The POWER Podcast.

However, Peña Nieto is no longer in office, and President Andrés Manuel López Obrador wants the state-owned power company, the Federal Electricity Commission (CFE), to get special treatment in the market. Under the previously enacted reform measures, dispatch priority was based on price, with the lowest-cost generation being delivered first. Earlier this month, Mexican policymakers passed legislation that would change the order in which electricity is dispatched, giving priority to CFE at the expense of private operators.

“There was a wave of amparos to challenge this law,” said Fernando Rodriguez-Cortina, senior associate with King & Spalding. An amparo is a protection provided for under Mexico’s constitutional law. It may be filed in federal court by Mexicans and by foreigners in an attempt to guarantee protection of the claimant’s constitutional rights.
“The judge granted the amparo with general effects, and now the law is stayed,” said Rodriguez-Cortina. “With general effects” means the stay applies to everyone affected by the law, rather than simply to the amparo filer.

President López Obrador is not standing idly by, however. He asked the Mexican Supreme Court to open an investigation into the judge’s conduct, claiming that the judge, who was appointed under the previous administration, acted inappropriately. “This is obviously a political maneuver, because this is not how you initiate a proceeding. I mean, if you want the judge to be investigated, you follow a different route. You don’t go to the Supreme Court,” said Rodriguez-Cortina. The chief justice ultimately referred the case to the proper court for resolution.

Aguirre Luzi suggested the actions taken by Mexico’s policymakers should be very concerning to all stakeholders and will have wide-ranging implications for future investments. He said when two branches of government make important energy policy changes with the intention of helping two state-owned entities—CFE and PEMEX, which is the fuel supplier to many of CFE’s power plants—it’s going to have long-term effects. “It’s a 180-degree change,” said Aguirre Luzi. “How do you come back from that?” Only time will tell. Rodriguez-Cortina suggested court proceedings could go on for a while. “It usually takes around six months for the amparo to be resolved,” he said, and appeals could take the dispute all the way to the Supreme Court. “So, this is going to be a process that is going to take years to see the actual outcomes,” said Aguirre Luzi.
Mar 11, 2021 • 38min

81. Are 1-in-10-Year Events Really 1-in-10-Year Events Anymore?

When evaluating resource adequacy requirements, many power companies and grid operators have used a methodology that originated more than 70 years ago. This probabilistic reliability approach has generally performed adequately through the years, evaluating loss-of-load events occurring at frequencies of one-day-in-10-years (1-in-10) as acceptable in terms of system reliability. However, it’s not without risk, as incidents in Texas, California, and other parts of the country and world have demonstrated in recent history. In light of these events, it’s worth asking: have risks changed? It could be that the method used to evaluate what constitutes a 1-in-10 event is no longer sound.

“When you have a 1-in-5 or 1-in-10-year event that’s happening every year, most likely those are not 1-in-10 or 1-in-100-year events,” Electric Power Research Institute (EPRI) CEO Arshad Mansoor said as a guest on The POWER Podcast. “Really, where we need to go is beyond that. We need to look forward to a future, and not really just back-cast, but forecast. What is the resiliency of the grid that we need when maybe societal dependence on electricity has doubled because of electrification, where extreme weather is becoming frequent, and severity is becoming a norm? And, our resource mix is changing pretty rapidly, and these changes are profound. So, taking all those three trends into consideration, we just need to step back—and resource adequacy is one part of the planning process,” Mansoor said.

In rather prescient timing, EPRI published a technical update (or white paper) on Jan. 28—about two weeks before uncharacteristically cold weather caused widespread blackouts all across Texas. “That timing was not by design,” Mansoor said, noting that EPRI has long been working on ways to enhance grid design, planning, and operation to help navigate the energy transition.
According to the abstract, “This white paper focuses on planning for resource adequacy given a world in which supply disruptions are correlated and no longer limited to the outage of independent units and may be due to widespread or long-duration events with significant economic impacts on consumers.” The 72-page paper highlights several attributes of planning for resource adequacy in an environment of increasing numbers of extreme events. Among the items addressed are:

• Supply disruptions that are common-mode events caused by weather, cyber and/or physical attacks, natural gas constraints, or combinations of factors.
• The occurrence of an event (zero/one), consideration of its physical impacts (the amount of unserved energy, breadth of customer base impacted, and duration), and its economic costs to consumers.
• The need for the definition of probabilistic metrics and methodologies that over time can be used to incorporate consideration of common-mode and high-impact supply disruptions.

The paper concludes with an identification of strategies that individual utilities and independent system operators/regional transmission organizations (ISOs/RTOs) could follow based on their unique situations. “I would encourage all of your audience to go to our website www.epri.com and you should be able to download the paper—we have made it available to all,” Mansoor said (see https://www.epri.com/research/products/000000003002019300).
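The paper’s core point, that correlated (common-mode) outages break the independent-unit assumption behind traditional 1-in-10 analysis, can be illustrated with a toy Monte Carlo model. Every number below is invented for illustration; this is not EPRI’s methodology, just the shape of the argument:

```python
# Toy Monte Carlo loss-of-load sketch: independent unit outages give
# one shortfall risk, but adding a rare common-mode event (e.g. a
# freeze that knocks out 40% of the fleet at once) inflates it
# sharply. All numbers are invented for illustration.

import random

random.seed(42)

UNITS = [200] * 25                 # 25 units x 200 MW = 5,000 MW fleet
PEAK_LOAD = 4_000                  # MW
FOR_INDEP = 0.05                   # independent forced-outage rate
P_COMMON = 0.01                    # chance of a common-mode event
COMMON_HIT = 0.4                   # fraction of fleet lost in that event

def shortfall_prob(p_common, trials=100_000):
    short = 0
    for _ in range(trials):
        avail = sum(mw for mw in UNITS if random.random() > FOR_INDEP)
        if random.random() < p_common:
            avail *= (1 - COMMON_HIT)      # correlated loss across the fleet
        if avail < PEAK_LOAD:
            short += 1
    return short / trials

indep = shortfall_prob(0.0)
common = shortfall_prob(P_COMMON)
print(f"independent outages only: {indep:.4f}")
print(f"with common-mode events:  {common:.4f}")
```

With only independent outages, a shortfall requires six of 25 units to fail at once, which is vanishingly rare; adding even a 1% chance of a correlated event dominates the total risk, which is exactly why the paper argues the metrics need to change.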
Mar 4, 2021 • 27min

80. Battery Technology Used in Outer Space Could Be a Gamechanger on Earth

Lithium-ion has become the dominant battery technology used in energy storage applications around the world, but that doesn’t mean it’s the only, or even the best, technology available. Many companies are working on different battery chemistries that could provide safer, longer-lasting, and ultimately more cost-effective options. One alternative that has gotten little exposure until now is a battery chemistry with a 30-plus-year history of successful operation. It’s a metal-hydrogen battery, which has been used by NASA on space missions, including in the Hubble Space Telescope, the Mars Curiosity rover, and the International Space Station.

“[The battery was] designed for a use case where these aerospace satellites and so forth needed a battery that would withstand the harsh climate of outer space, meaning super high temperatures, super low temperatures, and then have basically an infinite cycle life and require no maintenance,” Jorg Heinemann, CEO of EnerVenue, said as a guest on The POWER Podcast. “They worked very successfully with over 30,000 cycles—30,000 cycles is like charging the battery and discharging it three times per day for 30 years,” he said. For the sake of comparison, Heinemann said the longest-lasting lithium-ion batteries can handle about 3,000 cycles, about one-tenth the cycle life.

The metal-hydrogen battery contains no toxic materials, and unlike lithium-ion technology, it has no fire risk. “There are no safety issues. It’s a really safe device. There’s no thermal runaway risk, which is the primary concern with lithium-ion. Our battery operates in a very broad—what I call a ‘happy’—temperature range,” Heinemann said. Specifically, EnerVenue’s battery has been proven to operate reliably in ambient temperatures from –40°F to +140°F.
That means, whether in arctic or desert conditions, it doesn’t require large-scale heating and air conditioning systems, which can be expensive and maintenance-intensive. Cost has been the main reason metal-hydrogen chemistry has not been more fully developed for use on Earth. The batteries used in space were very expensive, costing as much as $20,000/kWh, according to Heinemann. However, about two years ago, EnerVenue’s founder, Yi Cui, a professor at Stanford University who was leading a research lab focused on materials innovations for sustainability, came up with a new set of materials to replace the high-cost elements. “It uses Earth-abundant materials—nothing but—there’s nothing that is either rare or problematic. There’s no lithium, no cobalt, no platinum-group metals. It’s just Earth-abundant stuff that you can find virtually on every continent,” Heinemann said.

That means the cost has come way down, and the kicker is, it even performs better. “We believe that we can match the cost trajectory for lithium-ion battery packs, which is going to continue to go down over time based on the scale effects,” he said. “We can match their CAPEX [capital expenditures], and then, we can give the customer a significantly better value proposition in terms of the capabilities of the battery, especially the high temperature range, the durability, the flexibility, and a very significant economic savings because of the fact that there’s no maintenance costs associated with this battery. It’s basically an install-and-forget battery.”

Metal-hydrogen batteries are not particularly well-suited for mobile applications, such as electric vehicles or cellphones, so for now, EnerVenue’s target market is the utility-scale energy storage sector. “Our battery is really good for a super broad range of stationary uses,” he said.
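Heinemann’s cycle arithmetic is easy to check, and it drives the economics: at capital-cost parity, roughly ten times the cycle life means roughly one-tenth the cost per kilowatt-hour cycled. The $/kWh figure below is an assumed placeholder, not an EnerVenue or lithium-ion vendor quote:

```python
# Checking the cycle-life arithmetic behind Heinemann's "30,000
# cycles" comparison, plus a simple cost-per-cycle view. The $/kWh
# figure is an illustrative assumption, not vendor pricing.

cycles_per_day, years = 3, 30
metal_h2_cycles = cycles_per_day * 365 * years   # 32,850, i.e. ~30,000
li_ion_cycles = 3_000                            # long-lived lithium-ion

def cost_per_kwh_cycled(capex_per_kwh, cycles):
    """Capital cost spread over every kWh the battery ever cycles."""
    return capex_per_kwh / cycles

li  = cost_per_kwh_cycled(200, li_ion_cycles)    # assumed $200/kWh pack
mh2 = cost_per_kwh_cycled(200, metal_h2_cycles)  # assumed CAPEX parity

print(f"{metal_h2_cycles:,} cycles; "
      f"li-ion ${li:.3f}/kWh-cycle vs metal-hydrogen ${mh2:.3f}/kWh-cycle")
```

Three cycles a day for 30 years is 32,850 cycles, consistent with the “over 30,000” figure, and at equal capital cost the longer-lived chemistry works out to about a tenth the cost per cycled kilowatt-hour.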
Feb 25, 2021 • 24min

79. Hydrogen and the Energy Transition

Power systems around the world are changing. Renewable energy, mainly in the form of wind and solar generation, is being added everywhere, while more traditional forms of power, such as coal-fired and nuclear generation, are being retired from the grid. Meanwhile, natural gas-fired generation has taken the lead role in facilitating the transition by providing relatively quick ramping capability and stable baseload power to back up intermittent renewables. However, there is a lot of research and development work underway that could eventually push natural gas out of the mix. The reason is that gas, like other fossil fuels, releases CO2 and other emissions to the atmosphere, albeit in lower quantities than coal, fuel oil, and diesel on a per-kWh-generated basis. One of the potential supplements or replacements for natural gas could be hydrogen.

The concept of a hydrogen economy is not new. It was first contemplated at least as far back as the 1970s, but the economics associated with producing hydrogen at the time made it impractical. That is changing as countries around the world implement decarbonization goals and the share of renewable energy in the power mix increases. Going forward, there are likely to be situations in which the supply of solar and wind power is high, but demand for the electricity is low. Rather than curtailing production, the surplus energy could be used to produce “green hydrogen” through electrolysis at a very reasonable cost. “There’s no CO2 emissions associated with [green hydrogen],” Megan Reusser, hydrogen development lead at Burns & McDonnell, said as a guest on The POWER Podcast.
“So, bringing hydrogen to the forefront as a potential way to meet decarbonization goals, coupled with other types of renewable energy such as solar or wind, that’s what’s really giving [hydrogen] kind of a new life and a really big interest currently in the market.”

Seeing the writing on the wall, the major gas turbine original equipment manufacturers (OEMs) have jumped aboard the hydrogen bandwagon. Siemens, GE, and Mitsubishi Power all have programs underway to make their combustion turbines 100% hydrogen capable. The intention is to “future proof” investments in new power plants. “All the major OEMs have advanced-class gas turbines that are available and can blend up to 30% hydrogen. Where it gets interesting is you see and hear about the concept of hydrogen-ready for the future, and 100% hydrogen capable for the future,” Joey Mashek, business development manager at Burns & McDonnell, said on the podcast. “The plan to develop those technologies to get near 100%, or 100%, is still about 10 years out. And I think all the OEMs will say they can do that and will do that, but it’ll be market driven.”

Reusser said Burns & McDonnell has seen a lot of interest in hydrogen pilot projects. “By that I mean small-scale applications where people are just trying to understand how all this is going to come together,” she said. One example she mentioned was a system installed by the Orlando Utilities Commission. “They are developing a pilot facility that has a little bit of everything. It’s got [an] electrolyzer, some storage, and a fuel cell. So, they’re kind of doing the whole spectrum of generating their hydrogen, storing their hydrogen, and then converting it back to power,” said Reusser. “Only thing I can say is, it’s an exciting, really exciting time in the energy industry,” Mashek said.
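The surplus-to-hydrogen path Reusser describes can be sized with back-of-envelope numbers. The electrolyzer consumption of roughly 50 kWh per kilogram of hydrogen is a commonly cited ballpark assumption, not a figure from the podcast:

```python
# Rough sizing of "green hydrogen" from otherwise-curtailed renewable
# output. The ~50 kWh/kg electrolyzer figure is a commonly cited
# ballpark assumption, not a number from the podcast.

surplus_mwh = 100                      # renewable energy that would be curtailed
kwh_per_kg_h2 = 50                     # assumed electrolyzer consumption
lhv_kwh_per_kg = 33.3                  # lower heating value of hydrogen

kg_h2 = surplus_mwh * 1_000 / kwh_per_kg_h2
energy_stored_mwh = kg_h2 * lhv_kwh_per_kg / 1_000

print(f"{kg_h2:,.0f} kg H2, {energy_stored_mwh:.1f} MWh of chemical energy")
```

Under these assumptions, 100 MWh of surplus wind or solar yields about two metric tons of hydrogen holding roughly two-thirds of the input energy, which is the trade-off: some energy is lost, but energy that would otherwise have been curtailed becomes a storable, carbon-free fuel.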
Feb 18, 2021 • 35min

78. Dirty Electricity, but Not the Kind You Think

Dirty Electricity, but Not the Kind You Think. When most people hear the term “dirty electricity,” they probably think of power generated from sources considered more polluting, such as coal, natural gas, or other fossil fuels. However, Satic Inc., an electronics manufacturer and professional engineering firm based in Missoula, Montana, says electricity in homes and businesses is filled with “electrical pollution” that is not necessarily associated with dirty fuels. In fact, the company claims solar power is one of the main sources of dirty electricity. “Dirty electricity specifically comes from three different main culprit places. Number one, it’s delivered to our panel. Number two, we make it with our electronics—our solar inverters, our LED lighting, our DC devices. And, the wiring in our home—maybe half a mile of high-quality copper wiring—acts as a super antenna. So, that’s how we get dirty electricity into our house. What defines it specifically is, it’s electricity that has distortion or interference, low power factor, etcetera, on it,” B.D. Erickson, Satic’s CEO, said as a guest on The POWER Podcast. Dirty electricity may affect more than just electrical devices. Some people claim to have a hypersensitivity to electromagnetic fields (EMFs), and they report symptoms such as fatigue, dizziness, headaches, problems with concentration and memory, and sleep disturbances as a result of exposure to dirty electricity. While studies on the effects of exposure to EMFs have in some cases been conflicting, Erickson said his son experienced symptoms when the family moved into a home located near large power transmission lines, which is what led him to research the topic. “Electricity has eight attributes that need to be within an acceptable realm, and if they’re not within that acceptable realm, they are considered dirty,” Erickson said. 
He explained the eight attributes are volts, amps, watts, electromagnetic fields, total harmonic distortion, interference, resistance (Ohm’s law), and frequency. Erickson said when electricity leaves a power plant, it’s properly regulated and is typically within an acceptable range for all eight attributes. But as it flows out to customers, it can degrade or get distorted, usually as a result of the devices everyone uses. “We live in an alternating current world [but] half the stuff we plug in nowadays isn’t alternating current. Anything with [a] battery is DC,” he said. In today’s world, cell phones, computers, tablets, and some other electronic devices are often powered by batteries. Furthermore, lighting has changed from incandescent bulbs, which were essentially resistors that used to act as “energy cleaners,” to compact fluorescent bulbs, and now, LED lighting, which adds electrical pollution. Lastly, Erickson said solar power, and specifically solar inverters, create a lot of dirty electricity. What Erickson and his team of engineers came up with is a product that provides system-wide power conditioning, robust surge protection, and power factor correction with advanced EMF, interference, and harmonics filtration. The system is easy to install in homes and businesses, and the effects are immediate. “You don't have to wait a month like with solar to see your bill. You can see it, you can feel it, you can hear it in real time. The amp draws—your air conditioner might go from five amps to two, and running better and running quieter,” he said. Erickson said the cost savings on electric bills will usually pay for the device in about two years, and there are other benefits, such as robust surge protection, less heat generation, and longer operating lives for appliances and devices, not to mention possibly improving the health of people with EMF sensitivities.
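Two of the attributes above, power factor and total harmonic distortion (THD), can be illustrated numerically from sampled waveforms. This is a generic textbook-style sketch, not Satic's method; the waveform parameters (a 30-degree phase lag and a 20% third harmonic on the current) are illustrative assumptions.

```python
import math

N = 1000                       # samples over exactly one fundamental cycle
t = [i / N for i in range(N)]

# Voltage: a clean sine wave (normalized amplitude).
v = [math.sin(2 * math.pi * x) for x in t]
# Current: fundamental lagging 30 degrees, plus a 20% third harmonic
# of the kind nonlinear loads (inverters, LED drivers) inject.
i = [math.sin(2 * math.pi * x - math.pi / 6)
     + 0.2 * math.sin(3 * 2 * math.pi * x) for x in t]

def rms(samples):
    """Root-mean-square value of a sampled waveform."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

real_power = sum(a * b for a, b in zip(v, i)) / N   # mean of v(t) * i(t)
apparent_power = rms(v) * rms(i)
power_factor = real_power / apparent_power          # < 1 means "dirty"

# THD of the current: harmonic RMS relative to the fundamental RMS.
thd = (0.2 / math.sqrt(2)) / (1 / math.sqrt(2))

print(f"power factor = {power_factor:.2f}, THD = {thd:.0%}")
```

With these assumed distortions the power factor comes out around 0.85 and the current THD is 20%, which is the kind of deviation Erickson's "acceptable realm" framing refers to: a purely resistive load on a clean supply would show a power factor of 1.0 and zero THD.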
