Deep dives
Use ChatGPT without Personal Information
When using ChatGPT or similar generative AI tools, avoid providing any personal information, including your name. Start by using the tool without any identifying details, and introduce only non-sensitive information as you become more comfortable. Remember that these tools do not require personal identification and that your inputs may be used to fine-tune the underlying models. Avoid uploading sensitive content such as resumes, as they may contain personal or proprietary information that could be accessed by others.
Beware of Privacy Risks
Be cautious when using ChatGPT or similar tools, as the personal information you provide may be retained by the provider. Read the terms of service carefully; they often grant the provider the right to keep and use your data. Minimize the personal information you include, and do not treat these tools as therapists or providers of confidential advice.
Filter and Control Data
Organizations can implement measures to filter and control the data sent to ChatGPT and similar tools. Much as companies block access to certain websites in a corporate environment, they can deploy filters that prevent sensitive or proprietary information from being included in prompts. This helps mitigate the risk of data exposure and misuse.
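As a rough illustration of the filtering idea above, the sketch below scans outbound prompt text for a few common PII patterns and replaces them with placeholders before the text would leave the organization. The pattern names and the `redact_prompt` function are hypothetical; real data-loss-prevention tooling is far more sophisticated than simple regular expressions.

```python
import re

# Illustrative patterns an organization might flag before text is sent
# to an external generative AI service. These are simplified examples,
# not production-grade PII detectors.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact_prompt(text: str) -> str:
    """Replace matched PII with labeled placeholders so the redacted
    prompt, not the original, is what reaches the provider."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text

print(redact_prompt("Contact jane.doe@example.com or 555-867-5309."))
# → Contact [REDACTED EMAIL] or [REDACTED PHONE].
```

In practice such a filter would sit in a proxy or browser plugin between employees and the AI service, and could block a prompt outright rather than redact it, depending on policy.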
Secure Digital Identity with Reusable Verification
Trua offers a solution for secure digital identity with reusable verification. Rather than repeatedly sharing personal information, Trua enables individuals to verify their identity or background without disclosing personally identifiable information (PII). The goal is to store personal information securely in one place and reuse it without sharing it directly with others.
Raj Ananthanpillai from Trua joins Dave to discuss privacy concerns and what you shouldn't share with ChatGPT. Dave and Joe share some listener follow-up from Clayton, who comments on a previous episode in which Dave discussed bomb threats to retail stores for ransom. Dave's story follows Google rapidly trying to correct bogus airline phone numbers that were discovered this week. Joe's story is on an Android app called "Spyhide," a phone surveillance app that has been collecting private phone data from tens of thousands of Android devices around the world. Our catch of the day is from listener Isak, who writes in to share a comedic spam email he received.