Since January, I’ve worked part-time as a prompt engineer for a platform that trains AI chatbots and large language models (LLMs). Additionally, I’ve subscribed to ChatGPT and have a complimentary subscription to Grok 3 by virtue of being a Premium X user. I have used ChatGPT and Grok 3 to perform research and to learn more about their capabilities. I’ve generated AI images on several different platforms (including Substack), and I’ve used ChatGPT to edit and critique my fiction writing and to help refine a role-playing novel game I’m creating. Along the way, I’ve picked up a few lessons worth sharing.

## **Top 5 Lessons I’ve Learned About AI in 2025**
Below are the top five lessons I’ve learned about generative artificial intelligence since I started working with it in January 2025.
1. **Hallucinations happen** — We all have that friend we adore for multiple reasons but who has one huge habit we find ultra annoying. It’s not unheard of to love a friend so much that we overlook those annoying habits rather than simply tolerate them; we accept them as part of the package. Well, I’ve noticed, particularly with ChatGPT, that hallucinations are a regular occurrence. With AI models, a hallucination is when the chatbot, LLM, image generator, or other tool simply makes up a fact. The AI isn’t consciously lying (it has no consciousness), but in performing the tasks assigned to it, it may simply present something as true when it isn’t. For instance, if you ask an LLM to research the history of the United States, it might get that history 99 percent correct, but buried somewhere in its analysis you might find a claim that Hollywood was once the capital of the country, a “fact” that simply isn’t true (this is not a real example; it’s just an illustration of what a hallucination looks like). Because AI models can hallucinate, it’s important to double-check their output for accuracy.
2. **Different AI models have different strengths** — In addition to ChatGPT Pro and Grok 3, I’ve had a chance to play around with Claude.ai’s free model. What I’ve learned is that every AI model is good at something, and each is better at certain tasks than the others. Claude is known for its creative writing chops. ChatGPT, I’ve learned, is okay at creative writing, but it’s better with short-form writing than long-form writing. It is also prone to memory loss: if I ask it to do something early in a chat and later ask it to do the same thing, it forgets specific details and I have to remind it of them. Grok 3 is better at researching more technical subjects, whereas ChatGPT is better for producing marketing content. Understanding these differences is important for choosing which model to use for a specific task.
3. **Each AI also has a variety of different models** — It’s important to understand that within each platform, individual models are designed for specific tasks. Some, such as ChatGPT o3, are built for advanced reasoning. Others, such as ChatGPT 4.5, are excellent research tools but aren’t designed for the heavy reasoning required by STEM projects, coding, and science. Grok has similar models for similar purposes. On top of that, these tools can be used for deep research, which takes longer than ordinary search prompts but digs further into a topic. If you want to build an app or a website using AI tools, Grok, ChatGPT, and Claude.ai all have models geared toward coding to assist with those types of projects.
4. **AI is more efficient at search than I am** — These AI tools, particularly Grok 3 and ChatGPT, are becoming necessary research tools for me. As a freelance writer, I need to conduct proper research for most client projects. Until now, my primary research tool was Google. Whether I performed a general search, used Google Trends, filtered through the latest news, or consulted scholarly articles, some information retrieval process was necessary, and it usually involved a search engine. With artificial intelligence, I can engineer a prompt and have the AI perform the search. It retrieves better information much faster than I can on my own, even with my 30 years of experience using search engines. AI is already an indispensable tool for me as a writer, whether I’m working on a client project or a personal one (such as gathering background information for a novel).
5. **With AI, a non-coder can program an app or a website on their own** — As an experiment, I asked ChatGPT to code a web app for me. The result was [this flash fiction writing tool](https://authorallentaylor.com/write-a-flash-fiction-story/). The tool is very simple, but I’m confident I could turn it into a WordPress plugin with ChatGPT’s help. Not only that, but I could build an entire website or a more complex publishing tool with ChatGPT, and probably Grok 3, without being a coder, developer, or programmer. That’s how powerful these AI tools are (the sketch below gives a sense of how little code such a tool can involve).
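The actual code behind my flash fiction tool isn’t published here, but here’s a minimal sketch, in TypeScript, of the kind of single-purpose logic an AI can generate from one prompt: pick a random character, setting, and conflict, then combine them into a writing prompt. The word lists, function names, and behavior are placeholders of my own, not the code behind the linked tool.

```typescript
// Hypothetical sketch of a flash fiction prompt generator — the kind of
// single-purpose web tool described above. None of these lists or names
// come from the actual tool; they are illustrative placeholders.

const characters = ["a retired astronaut", "a night-shift librarian", "a street magician"];
const settings = ["an abandoned lighthouse", "a 24-hour diner", "a generation ship"];
const conflicts = [
  "finds a letter addressed to someone long dead",
  "must return something that was never theirs",
  "has 24 hours to undo a promise",
];

// Pick one random element from an array.
function pick<T>(items: T[]): T {
  return items[Math.floor(Math.random() * items.length)];
}

// Assemble a one-line writing prompt with a flash-fiction word cap.
function generatePrompt(maxWords = 1000): string {
  return `Write a story of no more than ${maxWords} words in which ` +
         `${pick(characters)} in ${pick(settings)} ${pick(conflicts)}.`;
}

console.log(generatePrompt());
// Example output: "Write a story of no more than 1000 words in which
// a night-shift librarian in an abandoned lighthouse finds a letter
// addressed to someone long dead."
```

Even this toy version captures the pattern: the AI writes the boilerplate, and the non-coder only has to describe what the tool should do.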
These are not the only lessons I’ve learned about generative AI, but they are the most powerful ones so far. And I’m enjoying the process of learning about these tools while getting paid to do so. Using them daily has me thinking about ways to monetize my knowledge and my ability to use them productively. If you’re looking for a prompt engineer with an excellent working knowledge of ChatGPT and a growing working knowledge of Grok 3, look no further. I’m at your service.
[First published at Substack](https://thetaylorkarass.substack.com/p/5-things-ive-learned-about-ai-this). Image created by Substack's AI image generator.
Posted Using [INLEO](https://inleo.io/@allentaylor/5-lessons-ive-learned-about-ai-in-2025-b4o)