How to Build an LLM Helper using Google Gemini 1.5 Pro

Leon Nicholls
8 min read · Apr 24, 2024

I’ve volunteered for Teia, a community-run NFT marketplace for artists worldwide. Teia’s wiki includes information about the community, its terms of service, and lots of technical details about minting NFTs.

I thought it would be an exciting challenge to make the wiki information accessible to the Teia Discord server by creating a bot using Google Gemini 1.5 Pro. Users can ask the bot questions, and the bot will respond based on the wiki information.

This post will show how I built my Discord bot for the Teia marketplace. You’ll see my system prompt (the instructions I gave Gemini), how I tested and tweaked it, and most importantly, how to adapt it to your own interests. Whether it’s rare book collecting, competitive robotics, or the latest indie music — there’s a way to make an LLM serve your community better.

Note: This article is more techy than my previous articles about Google Gemini Advanced. However, you don’t need to be a developer to use any of the tools mentioned. This article focuses on the LLM aspects and not so much on coding a Discord bot.

Inside the Generic Gemini System Prompt

You might be familiar with Gemini Advanced, Google’s general-purpose LLM chatbot. It’s cool, but we will level up with Gemini 1.5 Pro (currently in public preview) for our specialized bot-building needs. Think of it as the big sibling with more power and flexibility under the hood. You can design it to handle complex tasks and, most importantly, tailor it to your niche.

How do we access this excellent LLM? That’s where Google AI Studio comes in. It’s a developer workspace for everything Gemini-related. We can upload files, test prompts, and tinker with our masterpiece. But don’t be intimidated — it’s easy for anybody to use and has a free tier! Once you log in with your Google account, select “Create new” and “Freeform prompt.” Here is an example prompt for the simplest way to get a response from the LLM:

“User: What is Google Gemini?

AI: “

Then click the “Run” button and wait a while for the response.
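If you’d rather script it than click around, the same request can be made through the Gemini API. Here’s a minimal sketch using the google-generativeai Python package; the model name and the environment variable holding the API key are assumptions, so check the current docs for your setup:

```python
# pip install google-generativeai
import os

import google.generativeai as genai

# Configure the SDK with an API key created in Google AI Studio
# (assumed here to be stored in an environment variable).
genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# Model name as of the 1.5 Pro public preview; check the docs for current names.
model = genai.GenerativeModel("gemini-1.5-pro-latest")

response = model.generate_content("What is Google Gemini?")
print(response.text)
```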

The Heart of the Matter: System Instructions

Inside Google AI Studio, one of the key things we’ll be working with is the System Instructions—think of it as the instruction manual for our LLM. It’s part guidelines (“Hey, be helpful!”), part limitations (“Don’t go spewing out harmful opinions!”), and part superpower fuel.

In a previous article, I explained using a generic system prompt, which is a good starting point, but the fun begins when we start customizing it. That’s how we transform a general-purpose bot into a Teia-savvy, misinformation-fighting, community-championing LLM assistant!
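If you’re curious where a system prompt plugs in outside of AI Studio, the API exposes it as the system_instruction parameter, the code-side equivalent of the System Instructions box. Here’s a minimal sketch; the prompt text is just a stand-in, not the actual generic prompt:

```python
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# Stand-in for the generic system prompt discussed above.
GENERIC_SYSTEM_PROMPT = (
    "You are a helpful, factual assistant. "
    "Prioritize accuracy over opinion and avoid harmful content."
)

# The system_instruction parameter applies the prompt to every request.
model = genai.GenerativeModel(
    "gemini-1.5-pro-latest",
    system_instruction=GENERIC_SYSTEM_PROMPT,
)

print(model.generate_content("Introduce yourself in one sentence.").text)
```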

Upgrading the Teia “Answer Bot”

Okay, insider confession time: a bot is already on the Teia Discord server! I built the ‘Answer Bot’ using a great open-source library called NLP.js. Consider NLP.js a toolbox for training AI to understand natural language (how humans talk).

The Answer Bot has been chugging along, learning from every question people ask on the Teia server. It does a decent job, but it has hit its limits with natural language understanding: it works well for user questions close to the training data but struggles with open-ended questions it hasn’t seen before. To give this bot a serious brain boost, it’s time for the Gemini 1.5 Pro upgrade.

Here’s why I’m excited about the switch:

  • LLM Power: Large Language Models like Gemini can effectively understand complex questions. Instead of just looking for keywords or phrases, they can grasp the whole context. Nuances in how people ask about NFTs matter a lot!
  • Knowledge Expansion: Uploading Teia wikis and other resources directly into Gemini expands the bot’s knowledge base beyond its original training. Gemini 1.5 Pro has a vast context window, so we can give it lots of domain-related data to use (the Teia wiki files have about 87,000 tokens; see the token-counting sketch after this list).
  • Safety Upgrade: LLMs are great, but they need careful guardrails. Gemini’s built-in safety features and our customized prompts will help keep our Teia bot on track.
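If you want to sanity-check a number like that 87,000-token figure yourself, the SDK can count tokens for you. Here’s a minimal sketch, assuming the wiki pages have been exported as Markdown files into a local teia-wiki folder (a made-up path):

```python
import os
from pathlib import Path

import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-pro-latest")

# Concatenate the exported wiki pages (hypothetical local directory).
wiki_text = "\n\n".join(
    page.read_text(encoding="utf-8") for page in Path("teia-wiki").glob("*.md")
)

# count_tokens reports how much of the context window the wiki will occupy.
print(model.count_tokens(wiki_text).total_tokens)
```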

This isn’t about throwing away all the hard work that went into the original Answer Bot, either. The insights from that will help us craft better prompts and test cases for our Gemini-powered version.

Adaptations for Teia

Okay, let’s dive into how I transformed a generic system prompt into my Teia-specific one. Picture it like giving your bot a detailed job description:

  1. The Blueprint: I started with a customized version of the generic Gemini system prompt. Think of it as the LLM equivalent of employee orientation — covering things like being helpful, prioritizing facts over opinions, and keeping things safe.

“Consider the following system prompt for an LLM:

[Generic system prompt]

Adapt it for a Fact-Focused LLM:

- Domain: Factual question answering about the Teia NFT marketplace and community (If unsure, admit it and suggest asking on the Teia Discord server).

- Output: Prioritize concise paragraphs and bullet points.

- Safety Emphasis: Strict protocols for detecting misinformation and potentially biased content.

- Avoid personal opinions.

- Be professional yet friendly.”

  2. The Job Description: In Google AI Studio, I fed this refined prompt in as my System Instruction. This is where I could get super-specific:

  • Focus on the Teia marketplace and community as the bot’s areas of expertise.
  • Upload all the Teia wiki files. Now, my bot has the knowledge base to back up its claims (there’s a code sketch of this setup after these steps).
  • Play it safe — I removed any mention of image generation since the Answer Bot wouldn’t support that.

  3. The LLM Editor: The coolest part for me? Gemini itself helped me refine my prompt even further! I asked it to rewrite the instructions more concisely while keeping the main ideas. This helped streamline everything for faster responses:

“Rewrite the system prompt to be more concise but maintain the most important aspects.”

The result is a Teia-specific System Instruction and a good starting point for testing the customized LLM with questions about Teia.
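To make that concrete, here’s a rough sketch of how the same setup looks through the API: a Teia-style System Instruction plus the wiki files uploaded via the File API. The file names, paths, and question are placeholders, and AI Studio does all of this without writing any code:

```python
import os
from pathlib import Path

import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# The refined, Teia-specific System Instruction (stored in a hypothetical file).
system_instruction = Path("teia_system_instruction.md").read_text(encoding="utf-8")

# Upload the exported wiki pages so the model can ground its answers in them.
# Markdown is uploaded as plain text here to keep the sketch simple.
wiki_files = [
    genai.upload_file(path=str(page), mime_type="text/plain")
    for page in Path("teia-wiki").glob("*.md")
]

model = genai.GenerativeModel(
    "gemini-1.5-pro-latest",
    system_instruction=system_instruction,
)

# The uploaded files are passed alongside the user's question.
question = "How do I mint an NFT on Teia?"
print(model.generate_content(wiki_files + [question]).text)
```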

Note: Google AI Studio refers to a system prompt as a System Instruction.

Testing, Tweaking, and the Quest for the Perfect Prompt

It’s time to see if this bot walks the walk! Instead of making changes blindly, I got strategic. Here’s my testing game plan:

  • Teia Brain Teasers: I scoured the Teia Discord server for the kinds of questions people ask — the beginner stuff, the super niche queries, and even those that might stump a human expert.
  • Round 1… Fight! I fed those questions to the LLM and watched the results closely. Some answers were decent; others were hilariously off-base. But every response was a clue to where my system prompt needed work.
  • Iterate and Improve: This is where the magic happens. Did the bot misunderstand something because of my wording? I’d tweak the prompt. Gaps in the knowledge base? Add that specific info directly to the prompt or hunt down better sources. Here are some of the improvements made to the System Instruction (a minimal test loop is sketched after this list):
    — Added information to the system prompt that the wiki files don’t cover.
    — Made the response style more informal and conversational.
    — Restricted answers to the wiki files and the additional information in the system prompt; unrelated data from the LLM’s training shouldn’t be used.
    — Improved the formatting of the responses.
    — Added instructions to protect the system instructions from being modified by or revealed to the user.
    — Added a rule to never provide information that violates the Teia Terms of Service or the Code of Conduct.
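For the “Round 1” step, a tiny harness like this is enough to run a batch of collected questions against the current System Instruction and eyeball the answers. The questions are placeholders, and in practice the model would be the one configured with the Teia instruction and wiki files:

```python
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
# In practice, this would be the model configured with the Teia System
# Instruction and the uploaded wiki files from the earlier sketch.
model = genai.GenerativeModel("gemini-1.5-pro-latest")

# Sample questions collected from the Teia Discord (placeholders).
test_questions = [
    "What is Teia?",
    "How much does it cost to mint an NFT on Teia?",
    "Can you write me a smart contract?",  # deliberately tricky and off-topic
]

for question in test_questions:
    response = model.generate_content(question)
    print(f"Q: {question}\nA: {response.text}\n{'-' * 40}")
```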

Here’s the thing: testing isn’t just about finding flaws. It’s about seeing the potential and homing in on what makes the most impact. Some changes were tiny but yielded big improvements, while others were significant overhauls. Read the markdown version of the final system prompt for the new Answer Bot.

Lessons Learned

After a whirlwind of testing, tweaking, and the occasional LLM-induced headache, I’ve got some hard-won insights to share:

  • Niche is King: My specialized Teia bot wouldn’t be much help with, say, fixing a leaky faucet. But for the Teia community? It’s way more valuable than a generic bot could ever be. This is about laser focus!
  • Know Your Limits: Real talk — sometimes, I had to add limitations. Maybe a question was too open-ended for a factual answer, or it was venturing into risky territory. It’s okay to guide your bot towards where it can excel.
  • Training Day: Think of each question asked and each prompt refinement you make as a training session for your bot. It’s a slow burn, but suddenly, you realize the LLM is getting surprisingly smart within this domain.
  • It Takes a Village: The original Teia Answer Bot, built with NLP.js, was a great start. Its insights helped me understand the community better and shape my Gemini bot. Sometimes, the best upgrades come from collaboration, even if it’s a human + LLM team-up! Teia Discord moderators are testing the updated Answer Bot using the Google Gemini 1.5 Pro API (a bare-bones sketch of that wiring follows).
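For the curious, here’s a bare-bones sketch of how a Discord bot can hand questions off to Gemini. This isn’t the actual Answer Bot code; it’s a simplified illustration using the discord.py library, with placeholder setup and none of the real-world details like rate limits, conversation history, or logging:

```python
# pip install discord.py google-generativeai
import asyncio
import os

import discord
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
# In practice, configure this with the Teia System Instruction and wiki files.
model = genai.GenerativeModel("gemini-1.5-pro-latest")

intents = discord.Intents.default()
intents.message_content = True  # needed to read message text
client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message):
    # Ignore our own messages and only respond when the bot is mentioned.
    if message.author == client.user or client.user not in message.mentions:
        return
    # Run the blocking Gemini call off the event loop.
    response = await asyncio.to_thread(model.generate_content, message.clean_content)
    # Discord caps messages at 2,000 characters.
    await message.channel.send(response.text[:2000])

client.run(os.environ["DISCORD_BOT_TOKEN"])
```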

The biggest lesson? Building a custom LLM assistant is an adventure. It’s equal parts tech savvy and understanding what your community needs from a helper bot.

Now, Make It Yours

Alright, are you feeling the itch to build yet? Forget those limitations of the generic LLM assistants! Here’s the challenge: picture an online community where a dedicated, niche LLM helper would be a game-changer. Remember, this could be about:

  • Hobbies Unleashed: A bot that’s an expert on everything from rare guitar pedals to crafting the perfect sourdough loaf.
  • Workplace Wonder: Think of an LLM trained on your company’s manuals and reports, ready to answer any question about procedures and policies.
  • Fan Fever: Imagine a bot specializing in your favorite sport, knowing everything from player stats to historic rivalries.

The process I used to make my Teia bot is your starting template:

  1. Define the Purpose: What questions should your bot answer best?
  2. Knowledge is Power: What are your topic’s most reliable, authoritative sources? Look for wikis, websites, and even trusted textbooks.
  3. Test and Tweak: Run both common and tricky questions past your bot, then refine your system prompt based on the results.

Gemini allows us to build niche LLM tools that feel personally tailored. It’s about using this tech in an exciting, community-driven way. So, what amazing thing will you create?

Conclusion

So, we did it! We built a bot that gets its niche, providing value to its community. Mine focuses on Teia, but yours could transform a completely different online space.

Here’s my challenge: What kind of LLM helper will YOU bring to life? Don’t be afraid to start small and iterate — that’s how the best creations happen.

Remember, this isn’t just about code and LLMs. It’s about using technology to empower communities and give people the feeling, “Hey, this is for people like me!” Let’s see how creative, insightful, and downright helpful we can make our LLM assistants!

Check out my reading list of other Google Gemini articles.

This post was created with the help of AI writing tools, carefully reviewed, and polished by the human author.
