You Probably Don’t Need to Pay for ChatGPT Anymore
The big tech story last week was GPT-4o, the latest model from OpenAI. However, GPT-4o is more than just an update to ChatGPT. The headline feature is a new voice mode, built on GPT-4o, that essentially turns ChatGPT into Samantha, the smart assistant from the 2013 film Her (even if “she” no longer sounds like ScarJo). While demos showcasing the voice mode have understandably gone viral, another piece of GPT-4o news was just as big: the latest model, including that futuristic (and slightly dystopian) voice mode, will be offered to all users for free.
This is a first for OpenAI. While you’ve always been able to access some ChatGPT services for free, paying for ChatGPT Plus has always been the way to get the best experience. Previously, a subscription gave access to the latest and greatest models that free users couldn’t use, as well as other perks such as ChatGPT plugins and their successors, GPTs. Now, however, OpenAI appears to be leveling the playing field for everyone, offering many of the previously exclusive Plus features to free users. It seems like there should be a catch here – and there sort of is – but even so, free users have never had access to this much of ChatGPT before, and it’s probably no longer worth paying $20 a month for ChatGPT Plus.
What does a free ChatGPT account get you?
Whether you create a free ChatGPT account or log in with the one you’ve had since 2022, you now have (or will soon have) access to the GPT-4o model. It may be enabled by default, or you may need to switch to it manually: on mobile or desktop, tap the model name at the top of the screen (for example, ChatGPT 3.5) and make sure GPT-4o is selected.
Once selected, you get access to the same GPT-4o model as paid users. OpenAI claims it understands the images you upload better than previous models and, in turn, gives improved responses. Additionally, while you should see roughly the same performance on English text with GPT-4o as with previous models, OpenAI says it handles non-English text much better. (Check out some anecdotal evidence of how much faster GPT-4o can be compared to GPT-4.)
More features and better performance
This is a big deal, because the previous GPT-4 and GPT-4 Turbo models were not available in the free tier. (However, you’ve been able to access GPT-4 Turbo through Copilot.) And it will matter even more once OpenAI releases the new GPT-4o voice mode: Plus users will get voice mode first, but free users should receive it in the near future as well.
But even before voice mode arrives, free users have plenty of new features to try. With the new model, you now get data analysis, file uploads, vision, and web browsing, all features tied to GPT-4o. GPT-3.5 can’t read uploaded files, view photos and videos, or search the web, so for the first time these options are available to users who don’t pay for OpenAI’s services.
Additionally, the free plan now includes access to GPTs and the GPT Store. GPTs are custom versions of ChatGPT built around specific skills or services. The feature was previously exclusive to Plus, but now free users can browse the GPT Store for bots that specialize in everything from translation to cooking, and they can create their own GPTs if they don’t find what they’re looking for. Memory is another formerly paid exclusive that is now free: ChatGPT will remember past conversations for free users too, so a new chat will no longer be the blank slate it once was.
Why pay for ChatGPT at all?
If you’re really serious about GPT-4o, the answer is simple: message limits. OpenAI says paid users get a GPT-4o message limit five times higher than free users, but doesn’t specify what the limits actually are for each plan. According to Reddit, though, free users can send somewhere between 10 and 16 messages every three hours with GPT-4o. That’s not much, especially if you chat with OpenAI’s bot often. Once you hit the limit, you’re dropped back to GPT-3.5, which isn’t bad, but is likely to be far more limited than the latest model.
And while you’re locked out, you lose more than just better performance: GPT-4o is what gives free users access to data analysis, file uploads, vision, and web browsing. Once you run out of GPT-4o messages, you can’t use those features either until your limit resets after three hours.
That’s not to say paid users get unlimited access to these features: as of May 13 of this year, Plus subscribers can send 80 messages every three hours on GPT-4o and 40 messages every three hours on GPT-4. (Why the discrepancy? GPT-4o is actually 50% cheaper than GPT-4.) Those numbers also line up with the advertised multiplier: 80 messages is five times the free tier’s reported 16. So while paying $20 a month won’t give you uninterrupted access to GPT-4o and GPT-4, you’ll definitely get more time with these models and their capabilities than free users do.
The same will likely apply to the new voice mode when it rolls out: free users may burn through their messages quickly when chatting with the model, while paid users could get perhaps as much as 800% more conversation time.
Most users will benefit from the free tier
In a sense, this new OpenAI approach adds a sort of “ChatGPT Plus demo” to the free tier: at any time, you can switch to the best OpenAI has to offer and see what life is like on the other side. If you only use ChatGPT in short bursts, that may be all you need: you can ask a question that requires a web search, upload a photo and ask ChatGPT about it, or have the model analyze a dataset for you, then close the app without paying a penny. But if you need to do this often enough to hit your message limit, you might be tempted to pull out a credit card.
However, I think most of us can get by without paying for ChatGPT: GPT-3.5 is still quite useful for simple tasks, and GPT-4o can help with more complex or multimodal queries. Plus, if you really need more access to a GPT-4-class model, there’s always Copilot.