We all use ChatGPT for our own reasons. For some, it’s a powerful brainstorming partner. For others, it becomes someone closer to a friend, a space to think out loud and have conversations that feel surprisingly human. Everyone finds their own way of using it, and honestly, there’s no right or wrong approach. I thought I had figured out mine, too, until curiosity nudged me in a different direction.
I don’t really believe in astrology, but that hasn’t stopped me from checking my zodiac reading whenever it pops up on Instagram or sneaks into a newspaper column. It’s harmless curiosity, mostly. I like to know what the day has in store, strictly the good stuff, of course. If the prediction sounds gloomy, I simply choose to believe it’s wrong. Selective belief has its perks.
So, when AI became the thing over the past few years, and ChatGPT quietly wove itself into everyday life, the idea almost felt obvious. If I could casually check my horoscope for reassurance, why not ask AI about my future instead? What harm could it do? As it turns out, the answers weren’t dramatic or alarming. They were unsettling in a quieter, more lingering way, the kind that stays with you longer than you expect.
The beginning of my curiosity
Feeding the machine my fate
I didn’t overthink it. I jumped right in and asked ChatGPT to predict my future using my birth chart. Almost instantly, it came back with a list of details it needed from me: my date of birth, time of birth, and place of birth, down to the city and country. It felt oddly familiar, like the same questions any astrologer would ask before drawing conclusions about your life.
Based on these inputs, ChatGPT explained that it could map out different aspects of who I am and where I might be headed. It would break down my personality traits, career and money tendencies, patterns in love and relationships, and even the general direction my future seems to lean toward. The process sounded structured, almost clinical, yet uncomfortably personal, as if a few data points were enough to sketch out a version of my life.
What followed once I shared my information
The AI started sounding a little too familiar
Once I shared my information, ChatGPT began with what it called a core chart snapshot, breaking it down into my sun sign, moon sign, and rising (or ascendant) sign. To be honest, I only know my sun sign, so I have no real way of verifying the rest. Still, I found myself trusting the process, or maybe trusting the AI more than I expected to.
What followed was a breakdown of my personality, and that’s where things started to feel a little too accurate. It described me as confident, opinionated, and drawn to work. So, that definitely checked out. It also mentioned my resistance to being told what to do, which I won’t deny. Interestingly, it added that when someone tries to control the outcome, I tend to do an even better job than expected. That one hit close to home.
It also touched on my emotional needs and what it called my “put-together energy.” On the surface, I may seem calm and composed, but internally, my mind is rarely quiet. There are always a hundred thoughts running at once, neatly organized chaos. That balance between appearing steady while constantly overthinking felt familiar.
What genuinely caught me off guard, though, was the career insight. ChatGPT claims that I thrive in writing, media, tech, reviews, and analysis, which is exactly what I do. At that point, I couldn’t help but wonder whether it had somehow pulled information about me from the internet or if I had accidentally walked into a very accurate reflection of my own choices. Either way, it was unsettling. How could an AI be so precise about what I’m doing or what I’m supposedly meant to be doing?
It wrapped up by outlining my near-future energy, and thankfully, it leaned positive.
I liked it overall. Still, I’m taking it with a healthy pinch of salt, and I’d recommend the same to you. Predictions, whether from astrology or artificial intelligence, can only go so far. In the end, it’s fate, effort, and a bit of chaos that actually decide what unfolds in your future.
The line between insight and illusion
AI is undeniably making our lives easier. I hate to admit it, but it’s true. Most of us have leaned on tools like ChatGPT or Google Gemini to get through tasks, from quick answers to deeper problem-solving. And more often than not, they do the heavy lifting well. That said, AI isn’t infallible. It hallucinates, confidently fills in the gaps, and can sound convincing even when it’s wrong. Trusting it is fine, but only to a point. I’ve seen people treat ChatGPT like a therapist, sharing thoughts and emotions as if it were a substitute for real, human support. While that might feel comforting in the moment, it can’t replace the nuance, empathy, and accountability of a real professional.
The same applies to roles AI seems poised to disrupt. Take astrology, for instance. As this little experiment proved, AI can do a surprisingly good job, sometimes even accurately. But accuracy isn’t the same thing as understanding. A human, flawed as they may be, brings context, intuition, and lived experience that an AI simply can’t replicate. The human element is what gives meaning to things we’re trying to understand about ourselves. So yes, ChatGPT impressed me. I’ll give credit where it’s due. But its confidence can be misleading, especially when it feels right. Hence, I’m not handing over my life’s direction to an AI anytime soon, no matter how well it performs. And with that in mind, here are six more things you probably shouldn’t trust ChatGPT with, even if you rely on it every single day.