Musings about ChatGPT and LLMs – How I use these tools today in 2025

I have been paying for ChatGPT at $20 a month for close to a year, and since I’m averse to subscriptions, it’s time to think about the value and decide whether I want to continue. It’s also a chance to muse for a bit about technology on the larger personal scale: web searches, the web in general, Apple Intelligence, and yes, LLMs like Anthropic’s Claude, ChatGPT, Llama, and Gemini.

First, the web is not dying, but websites and the traditional properties we think of on the Internet, like Google.com and Facebook.com, are dying. This is nothing new and not impacted at all by LLMs and AI…it’s mobile. The Internet is apps and services that run on our devices. You don’t open a web browser to view the weather on your Samsung Internet-connected fridge. You have a weather widget powered by a service. You use a weather app on your iPhone tailored to your preferences. Mobile and IoT killed websites more than LLMs have (so far). I’m still a traditional Internet user in that I have 2-3 tabs open in Safari and use web pages for research, news, and some services, but I’m privileged to have a fast computer and a large display, plus an AppleTV and a 4K Sony A75L television where I consume all my video, so I don’t have to suffer watching videos on my computer, iPad, or iPhone.

I haven’t used any LLMs except ChatGPT, at least not in a first-party way; Kagi.com, which I pay for, may have some LLM-generated content when I add a question mark to a query, but ChatGPT on my Mac is what I open for researching tougher situations where it’d be less efficient to use the web. For my home’s remodel, I wanted to estimate HVAC options and went through a few thousand words of back and forth, uploading 2D drawings, sharing my location, current situation, and some thoughts, and getting solid recommendations, which would not have been possible even with hours of reading web pages that are mostly commercial companies just trying to sell me an HVAC project.

I use ChatGPT about 10 times a day, and for the most part it’s deep research or situations where I don’t want to build out a calculation in Excel and just want it done for me. ChatGPT is part calculator, part financial planner (not adviser), and part research assistant: determining what chain I need for my motorcycle and calculating the top speed of a sprocket/driveline combo, researching sunscreen options, or comparing motorcycle specifications with power-to-weight ratio, resale, and cost of ownership added in. These are the more analytical tasks where combining everything into a spreadsheet and then doing the calculations would require many dozens of web pages and queries along with the math on top, and ChatGPT will do that for me within 10-15 seconds.
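To give a sense of the sprocket math I’m happy to hand off, here’s a minimal sketch of the kind of top-speed calculation involved; every number in it is a hypothetical placeholder, not my actual bike’s gearing:

```python
import math

# All values below are hypothetical placeholders, not real specs.
engine_rpm = 9500          # redline in top gear
primary_ratio = 1.822      # crank-to-clutch reduction
top_gear_ratio = 0.96      # top (6th) gear ratio
front_sprocket = 16        # teeth
rear_sprocket = 45         # teeth
tire_diameter_in = 24.6    # rear tire outer diameter, inches

final_drive = rear_sprocket / front_sprocket
wheel_rpm = engine_rpm / (primary_ratio * top_gear_ratio * final_drive)
tire_circumference_mi = math.pi * tire_diameter_in / 63360   # 63,360 inches per mile
top_speed_mph = wheel_rpm * tire_circumference_mi * 60       # wheel RPM -> miles per hour

print(f"Theoretical top speed: {top_speed_mph:.0f} mph")
```

ChatGPT does this math plus the chain and sprocket research in one pass, which is the whole appeal.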

Is ChatGPT saving me $20 a month? I think so, and it’s decent enough that I have no interest in trying another product at this time. I only use Google for YouTube and nothing else, so I have no interest in anything else they have to offer, nor do I have an interest in Facebook’s LLM, and I’m disappointed by Microsoft’s Copilot that’s built into their Office apps, so OpenAI’s services really are where I’m defaulting.

Am I using ChatGPT through Apple Intelligence? I only use Siri to trigger a phone call, an iMessage, or a HomeKit command, and I haven’t found myself using Siri for anything else, so handing a prompt off to ChatGPT isn’t something I’ve had to do.

I’ll add, though, that I’m considering maxing out the RAM on my next MacBook Pro so I can run an LLM locally. That might save me the $240 a year I pay ChatGPT, but probably not, since many of these models don’t have the agents / services ChatGPT offers, and spending $900 on RAM to run something locally probably isn’t worth it on an ROI basis. Still, I’m considering trying it out, even if I only go with 64GB of RAM on my next laptop.
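For the ROI gut check, the arithmetic is simple; here’s the back-of-the-envelope version using the numbers above (and ignoring that a local model isn’t a one-for-one replacement for ChatGPT’s agents and services):

```python
# Rough break-even check: maxed-out RAM versus the ChatGPT subscription.
ram_upgrade_cost = 900        # dollars for the RAM bump
chatgpt_per_year = 20 * 12    # $20/month subscription

payback_years = ram_upgrade_cost / chatgpt_per_year
print(f"Break-even: {payback_years:.2f} years")  # 3.75 years
```

Close to four years to break even, which is why I say it probably doesn’t pencil out.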

Where does this leave the web, apps, services, and me? Well, I will say that I haven’t been pulled into ChatGPT like some other people. I don’t talk to it or share photos or exact locations, but I do think it’s helpful that ChatGPT knows my title, role, background, and interests as persistent memory. The personalized experience is helpful, and it does strengthen their moat / lock-in against my trying other services. I have found, like with photography, that I balance things pretty well. I still use Safari and iPhone apps and AppleTV, and, for research or complex things, ChatGPT, but it’s like having a librarian in a piece of software on my Mac. I don’t really use ChatGPT on my iPhone or iPad because those aren’t devices where I want to do deep research. The iPhone is music/navigation/payments/email/iMessage/photos, the iPad is light productivity/notes/reading, the AppleTV is purely television / movie consumption, and the Mac is still my productivity workhorse where I sit 8+ hours a day, and that’s where I use ChatGPT as an extension of my web research or data compiling. I know ChatGPT can do more, and sometimes I use it for data analysis or summarizing things, but frankly not as much as I should, and that’s not just trust, it’s habit and how I’ve worked for over 20 years.

There have been a few situations where ChatGPT knows things that it shouldn’t know. I can’t really get too detailed, but it has successfully referenced slide numbers in a presentation that’s only available to certain people behind a paywall, meaning someone else has uploaded that entire presentation to ChatGPT in the past; as soon as I quoted something from it, it knew what I was doing and started recommending a slide number to me. When I asked how it knew that, it played dumb. So humans are uploading way more than they should, and ChatGPT is using that information to benefit all users, which makes me nervous about how much copyrighted/personal information is in the LLM that really shouldn’t be.

As for AI browsers and AI built into every productivity app or operating system, I think that’s moving too fast for humans to really grasp. We should all get better at using the web and doing our own research or learning how to use our tools without an assistant. Humans are lazy, so that battle is lost.

Where does that leave our traditional tech giants? Well, I’m happy Apple is behind. I don’t need them to follow Adobe and Microsoft in building AI into every piece of software, even though I know that’s coming. Google is screwed…I mean google.com is screwed. Alphabet will be fine, but I think web search is over, and I like that Kagi aggregates web results from everywhere with an LLM on top to give quality results, so I’m happy that company and product exist. Microsoft is failing with Copilot, but they’ll figure it out eventually, and Adobe is going too hard, too fast, but will be monetizing “credits,” which will force humans to balance things a bit. You can edit a video in Adobe Premiere manually or pay for credits to have things done for you. I like this model a lot. Facebook, I honestly don’t care what Facebook is up to. I deleted my Facebook account in 2011 and my Instagram accounts in 2019 and 2024, and I’ve never used WhatsApp or any of their other services. My relationship with Google has only been maintained by occasionally watching YouTube videos I find through my RSS reader (which I don’t use AI for; I read or mark as read everything posted on the blogs I follow). As for YouTube, I use it to publish my own videos, which make me $3-5K a year, and I have YouTube recommendations and all watch history turned off since I watch YouTube via the amazing Play app on macOS/iOS, where I queue videos I find in RSS and then watch them later on my AppleTV, but I don’t use YouTube for anything but playing things I’ve saved.

Over time, my LLM relationship and use will evolve, but I thought it would be good to think through where I am in 2025 and come back to this later, maybe in 10 years, to see where I ended up once these services mature and become more integrated into life.