What Not to Share With AI Chatbots (2024)

ChatGPT was the very first AI chatbot to gain global recognition. Since its launch, many other capable AI chatbots have emerged, providing a wider range of options to meet your specific needs. AI chatbots have become extremely popular and useful tools for obtaining information, advice, and assistance on various topics. You can use them to create a business plan, plan your garden, write articles or code, compose emails, generate art, images, videos, and pretty much anything you can imagine.

However, the more advanced and integrated these AI assistants become in our lives, the more cautious we must be about sharing personal information with them. Why? Because they cannot be trusted with sensitive data.

Behind the friendly chatbot

In order to understand the privacy risks associated with AI chatbots, it's important to know how they work. These chatbots gather and store the full transcripts of conversations that take place when you interact with them. This includes all the questions, prompts, messages you send to the chatbot, and the chatbot's responses. The companies behind these AI assistants analyze and process this conversational data to train and improve their large language models.

Think of the chatbot as a student who takes notes during class. The chatbot writes down everything you say verbatim, so the full context and details are captured. The AI company then reviews these "notes" to help the chatbot learn, much like a student studies class notes to deepen their knowledge. While the intent is to improve the AI's language understanding and dialogue abilities, it means your raw conversational data, which may include personal information, opinions, and sensitive details you disclose, is being collected, stored, and studied by the AI companies, at least temporarily.
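To see why stored transcripts matter, here is a purely illustrative Python sketch of a backend that records a conversation verbatim. This is not any vendor's actual code; the class and field names are invented for the example:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Conversation:
    """Hypothetical server-side record of one chat session."""
    user_id: str
    messages: list = field(default_factory=list)

    def add(self, role: str, text: str) -> None:
        # Every prompt and reply is stored verbatim with a timestamp,
        # like the student writing down everything said in class.
        self.messages.append({
            "role": role,
            "text": text,
            "at": datetime.now(timezone.utc).isoformat(),
        })

convo = Conversation(user_id="user-123")
convo.add("user", "Hi, I'm Jane Doe, my card ends in 4242.")
convo.add("assistant", "Thanks, Jane! How can I help?")

# The personal details now sit in the provider's data store,
# available for later review and model training.
print(len(convo.messages))  # → 2
```

Anything the user typed, including the name and card digits, persists in `messages` until the provider decides to delete it.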

If you're on the Internet, your personal data is already all over the place. Do you want to see where it is? Use Bitdefender Digital Identity Protection to locate and manage your personal information online.

The risks of oversharing

When you share personal or sensitive information with an AI chatbot, you lose control over where that data goes or how it may be used. AI chatbots store data on servers, which can become vulnerable to hacking attempts or breaches. These servers hold a wealth of information that cybercriminals can exploit in various ways. They can infiltrate the servers, steal the data, and sell it on dark web marketplaces. Additionally, hackers can use this data to crack passwords and gain unauthorized access to your devices.

Any data you provide could potentially be exposed, hacked, or misused, leading to identity theft, financial fraud, or the public exposure of intimate information you would rather keep private. Protecting your privacy means being selective about the details you disclose to AI chatbots.

What not to tell a chatbot

To protect your privacy and personal information, be selective about what you share with AI chatbots.

Be extremely careful with these types of data:

1. Personal Identifying Information: Avoid sharing key pieces of personal identifying information such as your full name, home address, phone number, date of birth, social security number, or other government ID numbers. Any of these can be exploited to impersonate you, leading to identity theft, financial fraud, or other criminal misuse of your personal details.

2. Usernames and passwords: Never share passwords, PINs, authentication codes, or other login credentials with AI chatbots. Even providing hints about your credentials could help hackers access your accounts.

3. Your financial information: You should never share any financial account information, credit card numbers, or income details with AI chatbots.

You can ask them for general finance tips and advice, broad questions to help you budget, or even tax guidance, but keep your sensitive financial information private, as it could easily lead to your financial accounts and assets being compromised.

4. Private and intimate thoughts: While AI chatbots can serve as a sympathetic ear, you should avoid revealing deeply personal thoughts, experiences, or opinions that you wouldn't feel comfortable sharing publicly. Anything from political or religious views to relationship troubles or emotional struggles could be exposed if conversational logs are hacked or mishandled.

5. Confidential work-related information: If you work with proprietary information, trade secrets, insider knowledge, or confidential workplace data of any kind, do not discuss this with public AI chatbots. Avoid using AI chatbots to summarize meeting minutes or automate repetitive tasks, as this poses the risk of unintentionally exposing sensitive data or violating confidentiality agreements and intellectual property protections of your employer.

A Bloomberg report highlighted a case where Samsung employees used ChatGPT for coding purposes and accidentally uploaded sensitive code onto the generative AI platform. This incident resulted in the disclosure of confidential information about Samsung, prompting the company to enforce a ban on AI chatbot usage.

Major tech companies like Apple, Samsung, JPMorgan, and Google have even implemented policies to prohibit their employees from using AI chatbots for work.
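Before pasting work material into any public chatbot, it helps to scrub obvious secrets and identifiers first. The sketch below is a minimal illustration in Python; the pattern names and regular expressions are simplified assumptions, not a complete secret scanner:

```python
import re

# Illustrative patterns only; real secret scanners use much
# larger rule sets and entropy checks.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "API_KEY": re.compile(r"\b(?:sk|pk)-[A-Za-z0-9]{16,}\b"),
}

def scrub(text: str) -> str:
    """Replace likely-sensitive substrings with placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

prompt = "Fix this: Client('sk-abc123DEF456ghi789'), then email dev@corp.com"
print(scrub(prompt))
# → Fix this: Client('[API_KEY REDACTED]'), then email [EMAIL REDACTED]
```

Running a scrubber like this first means the chatbot never sees the raw key or address, which limits the damage if the conversation log is later breached or used for training.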

6. Your original creative work: Never share your original ideas with chatbots unless you're happy to have them potentially shared with all other users.

7. Health-related Information:

A survey conducted by health tech company Tebra revealed that:

  • 1 in 4 Americans are more likely to talk to an AI chatbot than to attend therapy.
  • Over 5% of Americans have turned to ChatGPT for a medical diagnosis and followed its advice.

Protecting your health data means protecting your access to proper medical care, maintaining confidentiality, and safeguarding against privacy breaches or misuse of sensitive medical information. So, never disclose your medical conditions, diagnoses, treatment details, or medication regimens to AI chatbots. Instead, discuss them with qualified healthcare professionals in a secure and private setting.

How to use chatbots safely

Here are 3 things you can do to safely use chatbots and protect your privacy.

1. Be cautious about the information you provide.

2. Read privacy policies and look for chatbot privacy settings.

3. Use the option to opt out of having your data used for training language models when available.

In general, clearing conversation history, adjusting data-sharing settings, and opting out of model training are the main ways to limit data collection by AI chatbots. Most major AI chatbot providers offer at least some of these options.

Here are some examples:

OpenAI (ChatGPT):

- Turn off chat history and model training under Settings > Data Controls

- Clear your conversation history regularly

Anthropic (Claude):

- Delete conversations you no longer need

- Review Anthropic's privacy policy to see how conversation data is handled

Google (Bard, now Gemini):

- Review, pause, or delete your chatbot activity in your Google Account's activity controls

- Review and adjust your broader Google data settings

Exact setting names and menus change often, so check each provider's current privacy documentation.

Next level: Make AI keep you safe

While being cautious is wise, what if you could take privacy protection even further by leveraging an AI assistant explicitly designed to safeguard you?

Scamio is our next-gen AI-powered chatbot, designed to detect scams and fraudulent activities. You can send Scamio the tricky text, email, instant message, link, or even QR code you received, and it will provide an instant analysis to determine whether it's a scam attempt.

To start a conversation, visit scamio.bitdefender.com/chat or chat directly with Scamio on Facebook Messenger.


FAQs

What should I not share with ChatGPT?

Your name, your address, your telephone number, even the name of your first pet: all big no-nos when it comes to ChatGPT. Anything personal like this can be exploited to impersonate you, which fraudsters could use to infiltrate private accounts or carry out impersonation scams, none of which is good news for you.

What not to tell AI?

"Never share any private or personally identifying information with an AI chatbot," said Paul Bischoff, consumer privacy advocate at Comparitech, speaking with The U.S. Sun.

Is it safe to share personal information with a chatbot?

Don't give out any sensitive or personal information to any bot, such as names, addresses, phone numbers, emails, passwords, credit card numbers, or social security numbers. Also, don't upload any files that contain such information or that are confidential, proprietary, or copyrighted.

What information should you not put into ChatGPT?

Don't share personal information or content

Interactions with ChatGPT are not private. OpenAI can use your chat history for research and model-improvement purposes, which is why you should never share personal, confidential, or sensitive information, such as passwords or financial details.

Should I tell ChatGPT my real name?

Chatbots like OpenAI's ChatGPT are designed to make your life easier. They can answer complex questions and even perform tasks – in an eerily humanlike way. So it may be tempting from time to time to reveal things about yourself, including your name, but you must avoid it at all costs.

What to never say to AI?

Six Things You Should Never Ask An AI Assistant
  • Don't ask voice assistants to perform any banking tasks. ...
  • Don't ask voice assistants to be your telephone operator. ...
  • Don't ask voice assistants for any medical advice. ...
  • Don't ask voice assistants for any illegal or harmful activities.

What questions can't AI answer?

Unfamiliar: AI models are also not always able to understand questions that are unfamiliar to them. If a question is about a topic that the model has not been trained on, it may not be able to give a meaningful answer.

What not to say to ChatGPT?

Avoid these kinds of requests, which can be improper, unsafe, or even criminal:
  • Personal Information. ...
  • Offensive or Inappropriate Content. ...
  • Illegal Activities. ...
  • Medical Advice. ...
  • Financial Advice. ...
  • Future Prediction. ...
  • Personal Opinions. ...
  • Malicious Intent.

Is it safe to give ChatGPT my email?

Justifiably, the potential for data misuse is a valid safety concern. OpenAI's ChatGPT FAQs suggest you don't share sensitive information and warn users that specific prompts can't be deleted. The same FAQs state that ChatGPT saves conversations, which are reviewed by OpenAI for training purposes.

Does a chatbot keep your history?

You should always know how long a chatbot stores user data and delete it when it's no longer necessary. OpenAI has a 30-day retention policy, but other companies may keep your data for longer or shorter periods. Check any chatbot provider's terms of service to understand its data storage policies.

How to make sure ChatGPT gives correct answers?

Be Specific!

One of the most crucial elements of a well-written prompt is specificity. The more specific you are, the better the chance ChatGPT has at generating relevant and accurate responses. When asking for a response on a particular topic or subject, provide as much context and as many details as possible.

How to get the best answers from AI?

Read on to unlock the full potential of your AI tools.
  1. Ask the AI Model to Role-Play a Persona. ...
  2. Describe the Persona's Manner. ...
  3. Use Questioning Words. ...
  4. Ask For Output at a Specific Reading Level. ...
  5. Request a Step-By-Step Response. ...
  6. Avoid Jargon and Acronyms. ...
  7. Use Exclusions. ...
  8. Crucially, Progress.

What questions should a chatbot answer?

A customer service chatbot should always be trained to answer questions about shipping, specific products, and refunds. An HR chatbot should always be trained to answer questions about leave policies, organisational charts, etc.

What is not allowed on ChatGPT?

Disallowed usage includes: illegal activity, violent content, adult content, fraudulent activity, and more.

Is it safe to upload data to ChatGPT?

OpenAI states that it does not sell user data and that collected conversations are used to improve the chatbot's performance. Even so, avoid uploading anything sensitive, confidential, or proprietary.

Is it safe to share your phone number with ChatGPT?

Providing your mobile number while signing up is generally safe, as it is used for account verification. However, this should not be confused with providing your number in conversation with the chatbot: any personal information you share in a chat may not be secure and could be exposed in a data leak.

Article information

Author: Delena Feil
