ChatGPT might not be using the model you think — and it’s also hiding others in settings

There's a whole rotating cast of models working in the background

· TechRadar

News · By Eric Hal Schwartz · Published 20 March 2026



  • ChatGPT now uses a simplified model picker that can automatically switch between different AI models
  • Users may not realize that different prompts can trigger different underlying models
  • Additional models and controls are hidden in settings that most never access

Don't worry, you haven't gone crazy: ChatGPT's model picker (at the top of the screen) is looking a little cleaner this week, with fewer model names cluttering up the interface.

Forget model names like 5.4, 4o, and o3; those are a thing of the past. ChatGPT Plus subscribers will now see just three choices, labeled Instant, Thinking, and Pro. However, the apparent shift from complexity to transparency isn't quite what it seems.

The update changes the nature of ChatGPT's options from a choice of specific models to something closer to a broad style request. The actual model used to answer will more often be decided by ChatGPT itself, based on the complexity of your prompt and other settings.


Those factors determine whether ChatGPT's answers come from a faster, more lightweight model or a more powerful, compute-hungry LLM. You might not even be told which one handled your request.

It moves control of ChatGPT a step further away from the user. Selecting a ChatGPT model used to mean exactly that. Now, selecting any of the three modes might route your request to any of ChatGPT's stable of models, depending on other factors.

You might get answers almost instantly in short, conversational form. Or there might be a pause and a longer, more structured answer. That difference is not just tone. It reflects how much computational effort the system has decided to spend.

Model switching

The change isn't arbitrary; it helps OpenAI solve a real problem. Though powerful, the most advanced AI models are also slower and more expensive to run. Using them for every single request would make ChatGPT sluggish for you and costly for OpenAI.
