Post Content
Would you take financial advice from a sociopath? If you’re among the majority of Americans who use AI for money management, it’s likely you already have.
MIT finance professor Andrew Lo co-authored a 2023 paper that equated large language models (LLMs) like ChatGPT to “a human sociopath” due to their inability to express empathy (1).
Yet that doesn’t seem to bother the 55% of Americans who told a March 2026 TD Bank survey that they “use AI to help manage their finances (2).” That’s up from just 10% the year before.
Must Read
-
Thanks to Jeff Bezos, you can now become a landlord for as little as $100 — and no, you don’t have to deal with tenants or fix freezers. Here’s how
-
Robert Kiyosaki says this 1 asset will surge 400% in a year and begs investors not to miss this ‘explosion’
-
Dave Ramsey warns nearly 50% of Americans are making 1 big Social Security mistake — here’s how to fix it ASAP
Of course, using AI for financial assistance isn’t new. Americans turn to it for everything from financial literacy questions to savings plans, stock market tips and retirement advice (3). Lo himself has spoken about how to use AI as a financial advisor. The rising concern around AI and finances, however, is more about what information people are providing LLMs to secure the advice they’re looking for.
A 2024 Cisco survey confirmed that up to 29% of AI users input account numbers and other financial information — despite the vast majority acknowledging that their data could be shared (4).
But the sharing of data isn’t the only concern. Thieves have discovered devious ways to lift your personal info from LLMs — data that they can then use to steal your identity and your money.
How criminals steal your data from AI tools
Last year, Stanford University researchers studied six major U.S. LLMs: Nova, Claude, Gemini, Meta AI, Copilot, and ChatGPT (5).
Lead author Jennifer King said they found that any “sensitive information” shared with such LLMs “may be collected and used for training” — a big problem, she added, because “almost no research has been conducted to examine the privacy practices for these emerging tools.”
NordPass, a division of Nord Security, warned that security breaches at LLMs can enable “malicious actors” to “view your complete chat history, including any sensitive data that you shared with the AI tool (6).”
Others, meanwhile, note that personal information you upload is used to train AI models, which then “could reproduce verbatim text, exposing personal information (7).”
Cybercriminals can also steal your data through a process called an indirect prompt injection — where they hide prompts for LLMs inside anything from web pages to images, emails and documents (8).
Imagine that you’re using an LLM and unknowingly upload a web link or document that has a prompt injection hidden within it — one that tells the LLM to ignore your request and instead reveal your password and other sensitive information to the cybercriminal who set the trap.
The LLM will complete the prompt request to “expose sensitive data, trigger unwanted actions, or bypass safety filters,” according to Norton (9). That data breach could include financial information that you entrusted to the AI tool.
Once cybercriminals have your financial information, they can do everything from steal your identity to borrow money and open accounts under your name to simply selling your data to other bad actors on the dark web (10).
What you should never reveal to AI
Most of the information that experts advise you never share with AI seems like common sense, but it bears repeating.
Always avoid sharing personal financial information, including bank account, credit card and Social Security numbers, passwords, your banking institutions and even specific dollar amounts — such as the amount of debt you’re looking to pay off.
“If your chat is exposed in a breach,” the Washington Post said, “and a scamming operation can connect your conversation back to you, they may have enough to persuade you they’re calling from your lender (11).”
Others recommend keeping your name and address, phone number, birthdays and other general personal info out of LLMs, along with medical histories and work-related data (12).
Norton even warned that inputting creative works, like a script you wrote and need help editing, could result in replication of that content in other searches while stripping you of ownership and “leaving you vulnerable to legal and financial risks.”
To be safe, never share your real name or identifying or legal information with an AI tool either, and always check the URL to ensure you’re on a legitimate LLM site rather than a malicious imposter (13).
As well, if the LLM has a privacy setting that allows you to prevent the sharing of your data, enable it (14).
If you do need to ask questions about personal issues, use made-up but realistic-sounding data and financial info so that you can get approximate answers without revealing anything specific about yourself (15). Then clear your chat history, just to be safe.
Lastly, don’t forget to double-check any LLM information or advice with a legitimate source. Remember, they are sociopaths after all.
You May Also Like
-
Millionaires under 43 are reshaping investing — just 25% of their portfolios are in stocks. Here’s where their money is going
-
Taxes are going to change for retirees under Trump’s ‘big beautiful bill’ — here are 4 reasons you can’t afford to waste time
-
Robert Kiyosaki issues grim warning for baby boomers: many could be ‘wiped out’ and homeless ‘all over’ the country
-
Vanguard’s outlook on U.S. stocks is raising alarm bells for retirees. Here’s why and how to protect yourself
Join 250,000+ readers and get Moneywise’s best stories and exclusive interviews first — clear insights curated and delivered weekly. Subscribe now.
Article Sources
We rely only on vetted sources and credible third-party reporting. For details, see our ethics and guidelines.
MIT GenAI (1); TD Bank (2); Credit Karma (3); Cisco (4); Stanford University (5); NordPass (6),(13); Milvus (7); EC-Council (8); Norton (9),(14); Aura (10); The Washington Post (11),(12); Sticky Password (15)
This article originally appeared on Moneywise.com under the title: More Americans are asking AI for money advice – and revealing too much personal information. What you should not share
This article provides information only and should not be construed as advice. It is provided without warranty of any kind.
Terms and Privacy Policy