Will artificial intelligence guide our financial lives one day? Provide us with personalized advice and help us make the correct wealth-building decisions?
ChatGPT, Alexa, Siri, Google Assistant and new tools called Ernie, Andi and Bard (were those Sesame Street characters?), along with the new chat-enabled Bing, all scrape the internet to find answers to the questions you ask. Then they (sometimes) rephrase the answer to make it sound a bit more custom. Often, they just quote what they believe to be a reliable source.
Plain ol' Google has been doing this for years, but lately, it seems to be offering results that are top-heavy with ads, barely on topic and less valuable than ever.
Machine-learning chatbots promise straight answers in conversational language.
The Nerds over at the Smart Money podcast have put ChatGPT to the test with some interesting results. Check it out.
But let's walk through this process a bit more.
At capacity and sometimes wrong
The first thing you'll notice about ChatGPT is that it's a busy little bot. You may often get the message, "ChatGPT is at capacity right now." And you'll have to wait. And wait. Or you can click the link to "Get notified when we're back." I've done that several times and have never been notified. Rude.
A premium service, ChatGPT Plus ("plus" these days means "paid"), is accepting waitlist registrations and will initially cost $20 per month. OpenAI says it will continue to offer the free service.
Another ChatGPT caveat: A list of "limitations" says the service "may occasionally generate incorrect information. May occasionally produce harmful instructions or biased content." And it has "limited knowledge of world and events after 2021."
There's a vote of confidence, huh?
Without providing details, you'll get an if/then answer
Let's ask ChatGPT a specific financial question and look at the response.
Should I use my tax refund to pay down debt, put it in savings or invest it?
"It depends on your financial goals and the specifics of your situation," ChatGPT said. That's a prudent disclaimer. "If you have high-interest debt, paying it down can be a good way to reduce the amount of interest you pay over time, which can save you money in the long run."
ChatGPT's 200-word response also mentioned:
Building up emergency savings if you have low or no debt.
Splitting the tax refund between debt and an emergency fund.
Investing in a diversified portfolio if you have long-term goals.
"Ultimately, the best choice for you will depend on your individual financial goals, needs and circumstances. Consider seeking the advice of a financial professional if you're unsure about the best choice for you."
Not a bad answer, but really generic advice, right?
But here's the thing. If you sent an email and asked me — for all intents and purposes, a real human being — the same limited-information-available question, my answer would be very similar, just as nonspecific, and still pretty useless, I'll admit. (But please don't bother emailing me a question. My financial advisory days are over, and I no longer provide personal advice.)
Whether it's artificial or human intelligence answering a general, no-details-provided money query, you'll likely get a series of "if, then" responses — just like the ones ChatGPT delivered.
That could change.
Better information equals better advice
One day soon, you may be able to provide specific information to a machine-learning device and get a better answer. But it will require the input of quite a bit of information.
For example, with our question concerning paying off debt or saving/investing a tax refund, the AI of the near future — if the developers allow the bot to give personalized advice, and that's a big 'if' we'll cover in a moment — would likely ask:
What is your age?
How much is your tax refund?
What is your annual income?
How much do you have in savings and investing accounts, and how is it positioned?
How much debt do you have, and what are the interest rates you're paying?
Do you have an emergency fund? How big is it?
When do you want to retire?
What level of risk are you comfortable with?
With that information, you would likely get a reasonable and actionable answer. Basically, these are the questions a human financial planner would ask to provide you with sensible options.
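To make the idea concrete, the if/then triage those answers would feed can be sketched in a few lines of code. This is purely illustrative: the thresholds, function name and logic below are made-up assumptions for the sake of the example, not financial advice and not how any real advisory bot works.

```python
# Purely illustrative sketch of the kind of if/then triage a future
# advice bot might run on the answers above. All thresholds here are
# made-up assumptions, not financial advice.

HIGH_INTEREST_RATE = 0.08      # assumed cutoff for "high-interest" debt
EMERGENCY_FUND_MONTHS = 3      # assumed minimum emergency cushion

def triage_refund(refund, debt_rate, emergency_fund, monthly_expenses):
    """Return a rough suggestion for a tax refund: debt, savings or invest."""
    if debt_rate >= HIGH_INTEREST_RATE:
        return "pay down high-interest debt"
    if emergency_fund < EMERGENCY_FUND_MONTHS * monthly_expenses:
        return "build your emergency fund"
    return "invest in a diversified portfolio"

# Example: 5% debt, one month of expenses saved -> shore up the cushion first.
print(triage_refund(refund=3000, debt_rate=0.05,
                    emergency_fund=2500, monthly_expenses=2500))
# -> build your emergency fund
```

The point of the sketch is how little it takes to turn those eight questions into an actionable answer — and, equally, how much judgment hides in the thresholds a bot's developers would have to choose.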
A wall of regulation could stand in the way
If/when AI moves to this more advanced capability, your choice for guidance may boil down to how you want to obtain your advice: from a human or a machine.
When the stock market gets rocky, and you want reassurance that your investments are still well-positioned, you might want to seek a reasonable, calm and logical voice in a machine — or talk to your human advisor. Or maybe ask both. A checks-and-balances type thing.
However, there is a wall of regulation to be built before AI is allowed to give you specific personal financial advice.
Those regulatory considerations include things like:
Who's responsible if the bot gives you wrong or misleading advice?
Are the developers behind the bot making money off the advice, and if so, how?
And is the advice truly objective — or does it favor the profit motive of the company behind the bot?
You see, until the machines run themselves and don't need human intervention (gahrrgg!), the fleshy minds among us might complicate things with money motivations.
A fiduciary advisor bound by laws, licenses and regulations to put your needs first, above profit and products, is the right choice for individualized financial advice.