Lessons from Structuring GPTs for Customer-Led Growth


Summary

Here is a quick look at the lessons (conveniently, these match the section headers):
Give it mixed data
Use the GPT to troubleshoot itself
Don't have it make decks
Be specific in defining template usage
Give it supporting information

By: Tom Swanson, Senior Engagement Manager

If you read our B2B Reads, then you know I have been digging into custom GPT-building for some of our internal operations.  This post is a compilation of the lessons learned on properly structuring these things for customer marketing.  The goal is to speed you up in getting to a workflow that integrates AI tools effectively.  Before we get too far into it, here are a few other things to note:

  1. If you are unfamiliar with custom GPTs, I will try to sum them up. They are essentially modules within ChatGPT (or other platforms) that you can load with prompts, knowledge-base documents, and instructions for how to use them.  They are great for repeatable processes such as persona creation, campaign planning, messaging design, and plenty of other functions.
  2. This is not a guide on the mechanics of building them; if you need that, I recommend you check out Andy Crestodina’s excellent guide. He has lots of other practical ideas and advice on this too.
  3. I am still pretty new to this function, but we are on this journey together.
  4. This is from doing “customer-led growth” work, but the concepts could easily be adapted to other areas of marketing.
  5. Know your AI usage policies before uploading data/templates/materials into ChatGPT. Take proper care to avoid divulging any sensitive data.

There is a lot of trial and error in building these tools.  I have gone back and tweaked/adjusted GPTs that we have used for campaign development, persona creation, and others.

While not a true agent, this is closer to that concept than just talking to ChatGPT.  Access to targeted information, templates, and docs brings it much closer to a purpose-built function.

And it can all be built in plain English.  Sweet.

 

If you are looking for ideas on what to do with a new CLG GPT, here are a few relevant posts from our blog:

Let’s get into it.

Give it mixed data

For purposes like designing campaigns, generating user personas, or identifying opportunities, custom GPTs function best when given access to both quantitative and qualitative data sets.

This is an old soapbox for me: LLMs really open the doors for marketing to embrace mixed-methods analysis.  With clean and compliant data sets, ChatGPT is great at digging through qualitative data like survey responses, call transcripts, and account notes.  This stuff is time-consuming and often subjective.

Link this with nice, objective, quantitative data about things like product adoption, usage, and churn, and your GPT gets a great grasp of the behavioral patterns and the context that triggers them.
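One lightweight way to link the two is to merge them into a single document per account before uploading, so the GPT sees the numbers and the narrative side by side.  Here is a minimal sketch of that idea; the field names, accounts, and notes are all hypothetical, and in practice the inputs would come from your (cleaned, compliant) CRM and product-analytics exports:

```python
import csv
import io

# Hypothetical quantitative metrics and qualitative account notes,
# keyed by a shared account_id.
quant_csv = """account_id,monthly_active_users,churn_risk
acme,140,low
globex,12,high
"""

qual_notes = {
    "acme": "QBR call: team loves the reporting module, wants more integrations.",
    "globex": "Support ticket: onboarding stalled; champion left the company.",
}

def build_knowledge_doc(quant_csv: str, qual_notes: dict) -> str:
    """Merge quantitative rows with qualitative notes into one document
    that a custom GPT can ingest as a single knowledge-base file."""
    sections = []
    for row in csv.DictReader(io.StringIO(quant_csv)):
        account = row["account_id"]
        sections.append(
            f"## Account: {account}\n"
            f"- Monthly active users: {row['monthly_active_users']}\n"
            f"- Churn risk: {row['churn_risk']}\n"
            f"- Notes: {qual_notes.get(account, 'none on file')}\n"
        )
    return "\n".join(sections)

doc = build_knowledge_doc(quant_csv, qual_notes)
print(doc)
```

The point is less the code than the shape of the output: one file where behavior and context sit together, rather than two uploads the GPT has to cross-reference on its own.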

Admittedly, this one is tough to activate.  Cleaning is the challenge here.  How much time this takes depends on the policies, volume, and types of data you are working with, but expect to spend a lot of time on this part.  The good news is you don’t have to do this very often.

I imagine it is good to update this data at a regular interval (say, every 6 or 12 months, depending on your cycle length), but I haven’t gotten there yet.

Use the GPT to troubleshoot itself

This one was pretty fun.  Obviously you don’t want a front-facing GPT to reveal its inner workings, but while working on it internally, I found it helpful to have it analyze its own actions.  More specifically, building some troubleshooting specifications into the instructions was a great way to get to answers faster.
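To make that concrete, the kind of troubleshooting specification I mean might read something like this; the wording is purely illustrative, and you would adapt it to your own GPT:

```text
Troubleshooting mode (internal use only):
- When I say "debug that answer," explain step by step which knowledge-base
  documents and which instructions you used to produce it.
- If the answer deviated from the expected structure, identify which
  instruction was ambiguous or missing.
- Propose a revised instruction, written as if for yourself, that would
  reproduce the desired outcome next time.
```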

I would then work through my campaign strategist GPT.  Each time an answer was wrong or inconsistent, we would dig in together.  In fact, the most useful thing I found was to have the GPT write instructions for itself on how to reach and replicate the desired outcome.

It is a pretty meta thing to have it writing itself when the outcome is incorrect, but it makes sense.  Who knows you better than yourself?

You can also take this one step further and even have it optimize its own prompts and instructions.  Andy suggests this in his guide, and I found that to be an excellent tip.

Don’t have it make decks

Ok, this one I decided wasn’t worth the troubleshooting time.  Making decks seems like a low ROI activity.

A custom GPT can read decks and pull out information, but I have had a real bear of a time trying to get it to input information into a deck template.  A solution must exist, but I have not found a time-efficient way to make it consistently produce a template-correct deck.  My lesson is to just avoid it altogether.

Technical difficulties aside, converting to the deck yourself is a good exercise to review the work that the GPT did.  It forces some critical thought. If you remove too many of these human checkpoints, things get real same-y, real fast.

Be specific in defining template usage

“Be specific” was something my high school teachers said when search engines were just getting started.  Some things never change.

Even though it struggles to fill out slide templates, it can still use templates quite effectively.  Specificity is key to effective usage.  For example, I want our GPT to put out information that is relevant to our campaign template without going too far astray.  However, I don’t want it to make a deck, since future users won’t know to expect it to mess that up somehow.  So I got specific, and now the responses are structured in a way that aligns with my needs, and are review-ready faster.
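For instance, a template-usage instruction in that spirit might look like the following; the section names are hypothetical stand-ins, not our actual campaign template:

```text
When asked for a campaign plan, structure your answer using the sections of
our campaign template, in order: Objective, Audience, Key Message, Channels,
Timeline, Success Metrics.  Write plain text under each heading.  Do not
generate slides, decks, or file attachments; the user will transfer the
content into the deck template manually.
```

Spelling out both what to produce and what not to produce is what keeps the output aligned with the template without drifting into deck-building.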

Give it supporting information

Think about what you ultimately need from the GPT, and give it supporting documents.

For example, on a whim I fed our CMO’s Guide to Marketing Orchestration into the knowledge base of our Campaign Strategist GPT.  Now, when it spits out campaign strategy thoughts, it includes next steps that are aligned with our orchestration processes and workflows.  I did not ask for this; it just did it.

Operational documents are really solid adds to the knowledge base as the GPT will naturally look to connect the dots.

Here are some other ideas that will help get you further, faster:

  1. High-performing ad creative. You could target this by customer segment, with ads named according to the segments they performed well in.
  2. Project plan templates. This will help it suggest immediate next-steps.
  3. Your org chart.  This helps it assign roles for next steps.  It misses the boat here sometimes, but it has gotten me more than halfway most times.
    1. You can specify that it assign next steps to a role rather than an individual; it does this pretty well.
  4. Email/report templates if you want it to suggest points to add to your next QBR or whatever.

Conclusion

So there we go: 5 lessons I have learned from building custom GPTs to help with customer marketing.  Happy to talk through this with you; if you want to schedule a conversation, please reach out to us.