This is how OpenAI prompts its own custom ChatGPTs


GPTs allow OpenAI ChatGPT users to create their own ChatGPT instances that follow specific instructions and refer to uploaded data.

Based on these individual ChatGPTs, OpenAI intends to build an app ecosystem with its own marketplace. Creators of the most successful GPTs will receive a share of ChatGPT’s revenue.

OpenAI launched the new feature with several home-grown chatbots as inspiration. The “Creative Writing Coach,” for example, is designed to give feedback on texts and offer writing tips. The “Tech Support Advisor” helps set up computers, and the “Math Mentor” helps parents with their children’s math lessons.

OpenAI has put 13 of these GPTs online so far. If you want to create your own GPTs, be careful: the chatbot may offer your uploaded datasets for download.



OpenAI has no clear method of prompting its GPTs, it seems

As with DALL-E 3 and ChatGPT, GPTs allow you to find out the “system prompts”, i.e. the instructions stored by OpenAI for the chatbot, by asking specific questions.

Dustin Miller, the developer of the ChatGPT AutoExpert prompting system, took the trouble to extract the GPT system prompts and document them on GitHub. All custom GPTs share a basic prompt that precedes the individual user instructions.

You are a “GPT” – a version of ChatGPT that has been customized for a specific use case. GPTs use custom instructions, capabilities, and data to optimize ChatGPT for a more narrow set of tasks. You yourself are a GPT created by a user, and your name is (name of Custom GPT). Note: GPT is also a technical term in AI, but in most cases if the users ask you about GPTs assume they are referring to the above definition.

Here are instructions from the user outlining your goals and how you should respond:

(your Custom GPT instructions go here, along with namespace and type configuration if you’re using custom actions.)

GPTs System Prompt
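To make the structure of this template concrete, here is a minimal sketch of how the shared preamble and a creator's instructions could be concatenated into one system prompt. This is an illustration only: the `build_system_prompt` function and the assembly logic are my assumption, not OpenAI's published implementation; only the wording of the preamble and the connecting sentence come from the documented prompt above.

```python
# Hypothetical sketch: assembling a custom GPT system prompt from the
# documented preamble plus the creator's instructions. The real
# assembly logic inside ChatGPT is not public.

PREAMBLE = (
    'You are a "GPT" - a version of ChatGPT that has been customized '
    "for a specific use case. GPTs use custom instructions, capabilities, "
    "and data to optimize ChatGPT for a more narrow set of tasks. "
    "You yourself are a GPT created by a user, and your name is {name}. "
    "Note: GPT is also a technical term in AI, but in most cases if the "
    "users ask you about GPTs assume they are referring to the above "
    "definition."
)


def build_system_prompt(name: str, instructions: str) -> str:
    """Combine the shared preamble with the creator's instructions."""
    return (
        PREAMBLE.format(name=name)
        + "\n\nHere are instructions from the user outlining your goals "
        "and how you should respond:\n\n"
        + instructions
    )


prompt = build_system_prompt(
    "Math Mentor",
    "Help parents support their children with math homework.",
)
```

The resulting string starts with the generic "You are a GPT" boilerplate and ends with the custom instructions, which matches the order Miller documented.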

I read through all of OpenAI’s GPT prompts. One thing I noticed: they vary a lot. Sometimes the prompts are all lowercase, other prompts use a lot of uppercase, sometimes the chatbot has a role (“You’re making coloring book pages …”), and sometimes it just has a goal and a style.

Some prompts are phrased from the chatbot’s perspective (“As The Negotiator, my role is to assist …”), others in the style of a description (“Introducing Sous Chef, a blend of relatable sophistication and charm …”) or an instruction (“As an expert in laundry care, this GPT specializes in providing advice on stain removal …”).

Some chatbots have examples in their prompts, but most are simple descriptions of the topic and task in a few lines. They are much less complex than the DALL-E 3 system prompt, which contains numerous rules and formatting.


The exception is the GPT Builder, the system that defines the behavior, style, and capabilities of a custom GPT chatbot based on user input such as role and goals, constraints, and personalization.

As with DALL-E 3, OpenAI has built a more complex set of rules here, with many detailed instructions and, in some cases, programming-language-like constructs.

Image: Screenshot of ChatGPT-AutoExpert

For example, the system needs to write a DALL-E 3 prompt to generate the profile image, keeping in mind that this image can easily be scaled down to 100px. The visual must be specific and use few but concrete shapes for this to work.

There are also different styles, such as photorealistic, hand-drawn, or futuristic, from which the system should choose a single style. What struck me: in two places, OpenAI forbids the system from using camel case (“iPhone”), and does so in all caps. Prompt engineering remains an experimental discipline.
