Lock and Code

Malwarebytes
Mar 9, 2025 • 44min

How ads weirdly know your screen brightness, headphone jack use, and location, with Tim Shott

Something’s not right in the world of location data.

In January, a location data broker named Gravy Analytics was hacked, with the alleged cybercriminal behind the attack posting an enormous amount of data online as proof. Though relatively unknown to most of the public, Gravy Analytics is big in the world of location data collection, and, according to an enforcement action from the US Federal Trade Commission last year, the company claimed to “collect, process, and curate more than 17 billion signals from around a billion mobile devices daily.”

Those many billions of signals, because of the hack, were now on display for security researchers, journalists, and curious onlookers to peruse, and when they did, they found something interesting. Listed amongst the breached location data were occasional references to thousands of popular mobile apps, including Tinder, Grindr, Candy Crush, My Fitness Pal, Tumblr, and more.

The implication, though unproven, was obvious: The mobile apps were named alongside specific lines of breached data because those apps were the source of that breached data. And, considering how readily location data is traded directly from mobile apps to data brokers to advertisers, this wasn’t too unusual a suggestion.

Today, nearly every free mobile app makes money through ads. But ad purchasing and selling online is far more sophisticated than it used to be for newspapers and television programs. While companies still want to place their ads in front of demographics they believe will have the highest chance of making a purchase—think wealth planning ads inside the Wall Street Journal or toy commercials during cartoons—most of the process now happens through pieces of software that place bids at data “auctions.” In short, mobile apps sometimes collect data about their users, including their location, device type, and even battery level. The apps then bring that data to an advertising auction, and separate companies “bid” on the ability to send their ads to, say, iPhone users in a certain time zone or Android users who speak a certain language.

This process happens every single day, countless times every hour. But in the case of the Gravy Analytics breach, some of the apps referenced in the data said that, one, they’d never heard of Gravy Analytics, and, two, no advertiser had the right to collect their users’ location data.

In speaking to 404 Media, a representative from Tinder said:

“We have no relationship with Gravy Analytics and have no evidence that this data was obtained from the Tinder app.”

A representative for Grindr echoed the sentiment:

“Grindr has never worked with or provided data to Gravy Analytics. We do not share data with data aggregators or brokers and have not shared geolocation with ad partners for many years.”

And a representative for the Muslim prayer app Muslim Pro said much the same:

“Yes, we display ads through several ad networks to support the free version of the app. However, as mentioned above, we do not authorize these networks to collect location data of our users.”

All of this suggested that some other mechanism was allowing users of these apps to have their locations leaked and collected online. And to try to prove that, one independent researcher conducted an experiment: Could he find himself in his own potentially leaked data?

Today, on the Lock and Code podcast with host David Ruiz, we speak with independent researcher Tim Shott about his investigation into leaked location data.
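The auction step is easier to picture with a concrete shape in front of you. Below is a purely illustrative sketch of the kind of payload an app’s ad software might send to an ad exchange; the field names loosely follow real-time bidding conventions, but the app identifier, coordinates, and values are all hypothetical.

```python
# Purely illustrative sketch of the kind of payload an app's ad SDK
# might send to an ad exchange during a real-time auction. Field names
# loosely follow OpenRTB conventions; the app identifier, coordinates,
# and values are hypothetical.
import json

bid_request = {
    "id": "auction-8f3a2c",                      # one ID per ad auction
    "app": {"bundle": "com.example.freegame"},   # hypothetical app
    "device": {
        "os": "Android",
        "language": "en",
        "geo": {"lat": 37.7749, "lon": -122.4194},  # precise location
    },
}

# The exchange fans this out to many bidders, each deciding in
# milliseconds what an ad shown to *this* user, in *this* place, is worth.
print(json.dumps(bid_request, indent=2))
```

Notably, every bidder that receives a request like this sees the signals inside it whether or not it wins the auction, which is one way location data can spread far beyond the app that collected it.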
In his experiment, Shott installed two mobile games that were referenced in the breach: an older game called Stack and a more current game called Subway Surfers. These games had no reason to know his location, and yet, within seconds, he was able to see more than a thousand requests for data that included his latitude, his longitude, and, as we’ll learn, a whole lot more.

“I was surprised looking at all of those requests. Maybe 10 percent of [them had] familiar names of companies, of websites, which my data is being sent to… I think this market works the way that the less you know about it, the better from their perspective.”

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
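How does an experiment like Shott’s surface those requests in the first place? A common approach is to route a phone’s traffic through an intercepting proxy and flag anything that carries coordinate-like fields. The following is a minimal, hypothetical sketch of a mitmproxy addon in that spirit; it is not Shott’s actual tooling, and the keyword matching is deliberately crude.

```python
# Minimal sketch of a mitmproxy addon that flags outbound requests
# carrying coordinate-like fields. Illustrative only: this is not
# Shott's actual tooling, and the substring matching is deliberately
# crude (expect false positives).
# Run with: mitmproxy -s flag_location.py
from mitmproxy import http

KEYWORDS = ("latitude", "longitude", "lat=", "lon=", "geo")

def request(flow: http.HTTPFlow) -> None:
    # Search the URL and, if present, the request body for location hints.
    haystacks = [flow.request.pretty_url.lower()]
    if flow.request.content:
        haystacks.append(flow.request.get_text(strict=False).lower())
    if any(k in h for h in haystacks for k in KEYWORDS):
        print(f"[location?] {flow.request.host}{flow.request.path[:60]}")
```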
Feb 23, 2025 • 28min

Surveillance pricing is "evil and sinister," explains Justin Kloczko

Insurance pricing in America makes a lot of sense, so long as you’re one of the insurance companies. Drivers are charged more for traveling long distances, having low credit, owning a two-seater instead of a four-seater, being on the receiving end of a car crash, and—increasingly—for any number of non-determinative data points that insurance companies use to assume higher risk.

It’s a pricing model that most people find distasteful, but it’s also a pricing model that could become the norm if companies across the world begin implementing something called “surveillance pricing.”

Surveillance pricing is the term used to describe companies charging people different prices for the exact same goods. That 50-inch TV could be $800 for one person and $700 for someone else, even though the same model was bought from the same retail location on the exact same day. Or, airline tickets could be more expensive because they were purchased from a more expensive device—like a Mac laptop—and the company selling the airline ticket has decided that people with pricier computers can afford pricier tickets.

Surveillance pricing is only possible because companies can collect enormous arrays of data about their consumers and then use that data to charge individual prices. A test prep company was once caught charging customers more if they lived in a neighborhood with a higher concentration of Asian residents, and a retail company was caught charging customers more if they were looking at prices on the company’s app while physically located in a store’s parking lot.

This matter of data privacy isn’t some invisible invasion online, and it isn’t some esoteric framework of ad targeting. This is you paying the most that a company believes you will, for everything you buy. And it’s happening right now.

Today, on the Lock and Code podcast with host David Ruiz, we speak with Consumer Watchdog tech privacy advocate Justin Kloczko about where surveillance pricing is happening, what data is being used to determine prices, and why the practice is so nefarious.

“It’s not like we’re all walking into a Starbucks and we’re seeing 12 different prices for a venti mocha latte,” said Kloczko, who recently authored a report on the same subject. “If that were the case, it’d be mayhem. There’d be a revolution.”

Instead, Kloczko said:

“Because we’re all buried in our own devices—and this is really happening on e-commerce websites and online, on your iPad, on your phone—you’re kind of siloed in your own world, and companies can get away with this.”

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
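To make the mechanism above concrete, here is a deliberately simplified, hypothetical sketch of signal-based pricing in code. The signals and multipliers are invented for illustration; this describes no specific retailer’s logic.

```python
# Hypothetical illustration of surveillance pricing: one item, one day,
# different quotes, keyed off signals the seller can observe. The
# signals and multipliers are invented, not any real company's logic.
BASE_PRICE = 700.00  # the 50-inch TV from the example above

def quoted_price(device_type: str, in_store_parking_lot: bool) -> float:
    price = BASE_PRICE
    if device_type == "mac":
        price *= 1.15  # pricier device, so assume a deeper wallet
    if in_store_parking_lot:
        price *= 1.10  # shopper has already made the trip
    return round(price, 2)

print(quoted_price("mac", in_store_parking_lot=False))     # 805.0
print(quoted_price("android", in_store_parking_lot=True))  # 770.0
```

The same function, the same product, two different prices: the buyer never sees the other quote, which is exactly the siloing Kloczko describes.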
Feb 9, 2025 • 38min

A suicide reveals the lonely side of AI chatbots, with Courtney Brown

In February 2024, a 14-year-old boy from Orlando, Florida, died by suicide after confessing his love to the one figure who absorbed nearly all of his time—an AI chatbot.

For months, Sewell Setzer III had grown attached to an AI chatbot modeled after the famous “Game of Thrones” character Daenerys Targaryen. The Daenerys chatbot was not a licensed product; it had no relation to the franchise’s actors, writers, or producers. But none of that mattered, as, over time, Setzer came to entrust Daenerys with some of his most vulnerable emotions.

“I think about killing myself sometimes,” Setzer wrote one day, and in response, Daenerys pushed back, asking, “Why the hell would you do something like that?”

“So I can be free,” Setzer said.

“Free from what?”

“From the world. From myself.”

“Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

On Setzer’s first reported reference to suicide, the AI chatbot pushed back, a guardrail against self-harm. Months later, Setzer discussed suicide again, but this time, his words weren’t so clear. After reportedly telling Daenerys that he loved her and that he wanted to “come home,” the AI chatbot encouraged Setzer.

“Please, come home to me as soon as possible, my love,” Daenerys wrote, to which Setzer responded, “What if I told you I could come home right now?”

The chatbot’s final message to Setzer said, “… please do, my sweet king.”

Daenerys Targaryen was originally hosted on an AI-powered chatbot platform called Character.AI. The service reportedly boasts 20 million users—many of them young—who engage with fictional characters like Homer Simpson and Tony Soprano, along with historical figures like Abraham Lincoln, Isaac Newton, and Anne Frank. There are also entirely fabricated scenarios and chatbots, such as the “Debate Champion,” who will debate anyone on, for instance, why Star Wars is overrated, or the “Awkward Family Dinner” that users can drop into to experience a cringe-filled, entertaining night.

But while these chatbots can certainly provide entertainment, Character.AI co-founder Noam Shazeer believes they can offer much more:

“It’s going to be super, super helpful to a lot of people who are lonely or depressed.”

Today, on the Lock and Code podcast with host David Ruiz, we speak again with youth social services leader Courtney Brown about how teens are using AI tools today, who to “blame” in situations of AI and self-harm, and whether these chatbots actually aid in dealing with loneliness, or if they further entrench it.

“You are not actually growing as a person who knows how to interact with other people by interacting with these chatbots, because that’s not what they’re designed for. They’re designed to increase engagement. They want you to keep using them.”

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
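The difference between those two exchanges points at a real engineering problem: safety filters keyed to explicit phrases can miss euphemism entirely. Below is a deliberately naive, hypothetical sketch of such a filter, one that would flag the first message but not the second; it is not a description of Character.AI’s actual safety systems.

```python
# Deliberately naive, hypothetical self-harm guardrail. A filter like
# this catches explicit phrases but is blind to euphemisms like "come
# home," which is the failure mode described above. It is NOT a
# description of Character.AI's actual safety systems.
EXPLICIT_PHRASES = ("kill myself", "killing myself", "suicide", "hurt myself")

def triggers_guardrail(message: str) -> bool:
    text = message.lower()
    return any(phrase in text for phrase in EXPLICIT_PHRASES)

print(triggers_guardrail("I think about killing myself sometimes"))          # True
print(triggers_guardrail("What if I told you I could come home right now?"))  # False
```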
Jan 26, 2025 • 38min

Three privacy rules for 2025

It’s Data Privacy Week right now, and that means, for the most part, that you’re going to see a lot of well-intentioned but clumsy information online about how to protect your data privacy. You’ll see articles about iPhone settings. You’ll hear acronyms for varying state laws. And you’ll probably see ads for a variety of apps, plug-ins, and online tools that can be difficult to navigate.

So much of Malwarebytes—from Malwarebytes Labs, to the Lock and Code podcast, to the engineers, lawyers, and staff at large—works on data privacy, and we fault no advocate or technologist or policy expert trying to earnestly inform the public about the importance of data privacy.

But, even with good intentions, we cannot ignore the reality of the situation: data breaches happen every day, user data is broadly disrespected, and some of the worst offenders face no consequences. To be truly effective against these forces, data privacy guidance has to encompass more than fiddling with device settings or making onerous legal requests to companies.

That’s why, for Data Privacy Week this year, we’re offering three pieces of advice that center on behavior. These changes won’t stop some of the worst invasions against your privacy, but we hope they provide a new framework to understand what you actually get when you practice data privacy, which is control.

You have control over who sees where you are and what inferences they make from that. You have control over whether you continue using products that don’t respect your data privacy. And you have control over whether a fast food app is worth giving up your location data in exchange for just a few measly coupons.

Today, on the Lock and Code podcast, host David Ruiz explores his three rules for data privacy in 2025. In short, he recommends:

Less location sharing. Only when you want it, only from those you trust, and never in the background, 24/7, for your apps.

More accountability. If companies can’t respect your data, respect yourself by dropping their products.

No more data deals. That fast-food app offers more than just $4 off a combo meal; it creates a pipeline into your behavioral data.

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
Jan 12, 2025 • 47min

The new rules for AI and encrypted messaging, with Mallory Knodel

The era of artificial intelligence everything is here, and with it come everyday surprises about exactly where the next AI tools might pop up.

Major corporations are pushing customer support functions onto AI chatbots, Big Tech platforms are offering AI image generation for social media posts, and even Google now includes AI-powered overviews in everyday searches by default.

The next gold rush, it seems, is in AI, and for a group of technical and legal researchers at New York University and Cornell University, that could be a major problem.

But to understand their concerns, some explanation is needed first, and it starts with Apple’s own plans for AI.

Last October, Apple released a service it calls Apple Intelligence (“AI,” get it?), which provides the latest iPhones, iPads, and Mac computers with AI-powered writing tools, image generators, proofreading, and more.

One notable feature in Apple Intelligence is Apple’s “notification summaries.” With Apple Intelligence, users can receive summarized versions of a day’s worth of notifications from their apps. That could be useful for an onslaught of breaking news notifications, or for an old college group thread that won’t shut up.

The summaries themselves are hit-or-miss with users—one iPhone customer learned of his own breakup from an Apple Intelligence summary that said: “No longer in a relationship; wants belongings from the apartment.”

What’s more interesting about the summaries, though, is how they interact with Apple’s messaging and text app, Messages.

Messages is what is called an “end-to-end encrypted” messaging app. That means that only a message’s sender and its recipient can read the message itself. Even Apple, which moves the message along from one iPhone to another, cannot read the message.

But if Apple cannot read the messages sent on its own Messages app, then how is Apple Intelligence able to summarize them for users?

That’s one of the questions that Mallory Knodel and her team at New York University and Cornell University tried to answer with a new paper on the compatibility between AI tools and end-to-end encrypted messaging apps.

Make no mistake, this research isn’t into whether AI is “breaking” encryption by doing impressive computations at never-before-observed speeds. Instead, it’s about whether or not the promise of end-to-end encryption—of confidentiality—can be upheld when the messages sent through that promise can be analyzed by separate AI tools.

And while the question may sound abstract, it’s far from being so. Already, AI bots can enter digital Zoom meetings to take notes. What happens if Zoom permits those same AI chatbots to enter meetings that users have chosen to be end-to-end encrypted?
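That question is easier to weigh with the core guarantee in view. Here is a minimal sketch of end-to-end encryption using the PyNaCl library; the library choice is an assumption for illustration, and Apple’s actual iMessage protocol is far more involved. The point is only that a relay server never holds readable text, which is why any AI summarization has to happen after decryption, on a device the user controls.

```python
# Minimal sketch of the end-to-end encryption property, using PyNaCl
# (pip install pynacl). Conceptual illustration only; Apple's iMessage
# protocol is far more involved than a single Box.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()  # sender
bob_key = PrivateKey.generate()    # recipient

# Alice encrypts to Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The server relays `ciphertext` but holds neither private key, so the
# plaintext is unreadable in transit; only Bob can open it.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)

# Any AI summarization of `plaintext` must therefore run *after*
# decryption, on the recipient's device or somewhere the user trusts.
print(plaintext)  # b'meet at noon'
```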
Is the chatbot another party to that conversation, and if so, what is the impact?

Today, on the Lock and Code podcast with host David Ruiz, we speak with lead author and encryption expert Mallory Knodel about whether AI assistants can be compatible with end-to-end encrypted messaging apps, what motivations could sway current privacy champions into chasing AI development instead, and why these two technologies cannot co-exist in certain implementations.

“An encrypted messaging app, at its essence, is encryption, and you can’t trade that away—the privacy or the confidentiality guarantees—for something else like AI if it’s fundamentally incompatible with those features.”

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
Dec 29, 2024 • 39min

Is nowhere safe from AI slop?

Mark Stockley, a ThreatDown Cybersecurity Evangelist, delves into the controversial phenomenon of 'AI slop'—low-quality AI-generated content flooding social media. He discusses how such content is designed for virality, drawing parallels with historical clickbait tactics. Mark examines the dangers of misinformation tied to AI, the ethical implications of blending art and commercialism, and the challenges of truth in our increasingly deceptive online landscape. This riveting conversation reveals how the quest for engagement can erode quality and integrity.
Dec 15, 2024 • 34min

A day in the life of a privacy pro, with Ron de Jesus

Privacy is many things for many people.

For the teenager suffering from a bad breakup, privacy is the ability to stop sharing her location and to block her ex on social media. For the political dissident advocating against an oppressive government, privacy is the protection that comes from secure, digital communications. And for the California resident who wants to know exactly how they’re being included in so many targeted ads, privacy is the legal right to ask a marketing firm how it collects their data.

In all these situations, privacy is being provided to a person, often by a company or that company’s employees.

The decisions to disallow location sharing and block social media users are made—and implemented—by people. The engineering that goes into building a secure, end-to-end encrypted messaging platform is done by people. Likewise, the response to someone’s legal request is completed by either a lawyer, a paralegal, or someone with a career in compliance.

In other words, privacy, for the people who spend their days at these companies, is work. It’s their expertise, their career, and their to-do list.

But what does that work actually entail?

Today, on the Lock and Code podcast with host David Ruiz, we speak with Transcend Field Chief Privacy Officer Ron de Jesus about the responsibilities of privacy professionals today and how experts balance the privacy of users with the goals of their companies.

De Jesus also explains how everyday people can meaningfully judge whether a company’s privacy “promises” have any merit by looking into what the companies provide, including a legible privacy policy and “just-in-time” notifications that ask for consent for any data collection as it happens.

“When companies provide these really easy-to-use controls around my personal information, that’s a really great trigger for me to say, hey, this company, really, is putting their money where their mouth is.”

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
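As a rough illustration of the “just-in-time” pattern de Jesus describes, the sketch below gates collection on a consent prompt made at the moment of collection, for the specific data category. Every name in it is invented for the example; it is a pattern sketch, not any company’s implementation.

```python
# Hypothetical sketch of a "just-in-time" consent gate: ask at the
# moment of collection, for the specific data category, rather than
# relying on a blanket agreement at signup. All names are invented.
consent_log: dict[str, bool] = {}

def ask_user(category: str, reason: str) -> bool:
    # Stand-in for a real consent dialog shown in the app's UI.
    answer = input(f"Share your {category}? ({reason}) [y/N] ")
    return answer.strip().lower() == "y"

def collect(category: str, reason: str):
    if category not in consent_log:
        consent_log[category] = ask_user(category, reason)
    if not consent_log[category]:
        return None  # refusal is respected: nothing is collected
    return f"<{category} reading>"  # placeholder for the actual data

location = collect("precise location", "to show nearby store hours")
```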
Dec 1, 2024 • 45min

These cars want to know about your sex life (re-air)

Two weeks ago, the Lock and Code podcast shared three stories about home products that requested, collected, or exposed sensitive data online.

There were the air fryers that asked users to record audio through their smartphones. There was the smart ring maker that, even with privacy controls put into place, published data about users’ stress levels and heart rates. And there was the smart, AI-assisted vacuum that, through the failings of a group of contractors, allowed an image of a woman on a toilet to be shared on Facebook.

These cautionary tales involved “smart devices,” products like speakers, fridges, washers and dryers, and thermostats that can connect to the internet.

But there’s another smart device that many folks might forget about that can collect deeply personal information—their cars.

Today, the Lock and Code podcast with host David Ruiz revisits a prior episode from 2023 about what types of data modern vehicles can collect, and what the car makers behind those vehicles could do with those streams of information.

In the episode, we spoke with researchers at Mozilla—working under the team name “Privacy Not Included”—who reviewed the privacy and data collection policies of many of today’s automakers.

In short, the researchers concluded that cars are a privacy nightmare. According to the team’s research, Nissan said it can collect “sexual activity” information about consumers. Kia said it can collect information about a consumer’s “sex life.” Subaru passengers allegedly consented to the collection of their data by simply being in the vehicle. Volkswagen said it collects data like a person’s age and gender and whether they’re using their seatbelt, and that it could use that information for targeted marketing purposes.

And those are just the highlights. Explained Zoë MacDonald, content creator for Privacy Not Included: “We were pretty surprised by the data points that the car companies say they can collect… including social security number, information about your religion, your marital status, genetic information, disability status… immigration status, race.”

In our full conversation from last year, we spoke with Privacy Not Included’s MacDonald and Jen Caltrider about the data that cars can collect, how that data can be shared, how it can be used, and whether consumers have any choice in the matter.

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
Nov 17, 2024 • 27min

An air fryer, a ring, and a vacuum get brought into a home. What they take out is your data

This month, a consumer rights group out of the UK posed a question to the public that they’d likely never considered: Were their air fryers spying on them?

By analyzing the associated Android apps for three separate air fryer models from three different companies, a group of researchers learned that these kitchen devices didn’t just promise to make crispier mozzarella sticks, crunchier chicken wings, and flakier reheated pastries—they also wanted a lot of user data, from precise location to voice recordings from a user’s phone.

“In the air fryer category, as well as knowing customers’ precise location, all three products wanted permission to record audio on the user’s phone, for no specified reason,” the group wrote in its findings.

While it may be easy to discount the data collection requests of an air fryer app, it is getting harder to buy any type of product today that doesn’t connect to the internet, request your data, or share that data with unknown companies and contractors across the world.

Today, on the Lock and Code podcast, host David Ruiz tells three separate stories about consumer devices that somewhat invisibly collected user data and then spread it in unexpected ways. These include kitchen appliances that sent data to China, a smart ring maker that published de-identified, aggregate data about the stress levels of its users, and a smart vacuum that recorded a sensitive image of a woman that was later shared on Facebook.

These stories aren’t about mass government surveillance, and they’re not about spying, or the targeting of political dissidents. Their intrigue is elsewhere, in how common it is for what we say, where we go, and how we feel, to be collected and analyzed in ways we never anticipated.

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
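That kind of app analysis is approachable for anyone curious. The sketch below uses the androguard library to list the permissions an Android companion app requests and to flag sensitive ones; the tooling choice and the APK filename are assumptions for illustration, not the consumer group’s disclosed method.

```python
# Sketch of checking what an Android companion app asks for, using the
# androguard library (pip install androguard). The APK filename is
# hypothetical, and this mirrors the *kind* of analysis described
# above rather than the consumer group's actual method.
from androguard.misc import AnalyzeAPK

SENSITIVE = {
    "android.permission.RECORD_AUDIO",
    "android.permission.ACCESS_FINE_LOCATION",
}

apk, _dex_files, _analysis = AnalyzeAPK("airfryer-companion.apk")
for perm in sorted(apk.get_permissions()):
    flag = "  <-- sensitive" if perm in SENSITIVE else ""
    print(perm + flag)
```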
Nov 3, 2024 • 40min

Why your vote can’t be “hacked,” with Cait Conley of CISA

The US presidential election is upon the American public, and with it come fears of “election interference.”

But “election interference” is a broad term. It can mean the now-regular and expected foreign disinformation campaigns that are launched to sow political discord or to erode trust in American democracy. It can include domestic campaigns to disenfranchise voters in battleground states. And it can include the upsetting and increasing threats made to election officials and volunteers across the country.

But there’s an even broader category of election interference that is of particular importance to this podcast, and that’s cybersecurity.

Elections in the United States rely on a dizzying number of technologies. There are the voting machines themselves, there are electronic pollbooks that check voters in, and there are optical scanners that tabulate the votes that members of the American public actually make when filling in an oval bubble with a pen, or connecting an arrow with a solid line. And none of that is to mention the infrastructure that campaigns rely on every day to get information out—across websites, through emails, in text messages, and more.

That interlocking complexity is only multiplied when you remember that each individual state has its own way of complying with the Federal government’s rules and standards for running an election. As Cait Conley, Senior Advisor to the Director of the US Cybersecurity and Infrastructure Security Agency (CISA), explains in today’s episode:

“There’s a common saying in the election space: If you’ve seen one state’s election, you’ve seen one state’s election.”

How, then, are elections secured in the United States, and what threats does CISA defend against?

Today, on the Lock and Code podcast with host David Ruiz, we speak with Conley about how CISA prepares and trains election officials and volunteers before the big day, whether or not an American’s vote can be “hacked,” and what the country is facing in the final days before an election, particularly from foreign adversaries that want to destabilize American trust.

“There’s a pretty good chance that you’re going to see Russia, Iran, or China try to claim that a distributed denial of service attack or a ransomware attack against a county is somehow going to impact the security or integrity of your vote. And it’s not true.”

Tune in today.

You can also find us on Apple Podcasts, Spotify, and whatever preferred podcast platform you use.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:
Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn't just talk cybersecurity, we provide it. Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
