Code & Pepper built Peppy, an AI-powered Slack assistant that pulls answers from Confluence, Google Sheets, and Firebase to handle the repetitive HR, policy, and operations questions that slow teams down every day. Here’s why we built it, what we learned, and what it means for companies thinking about internal AI tools.

The Problem Every Growing Company Has
According to a Gartner survey, 47% of knowledge workers struggle to find the information they need to do their jobs. At Code & Pepper, a software house with 19 years of experience and over 500 completed projects, we saw the same pattern. The same questions appeared in Slack every week: “How many vacation days do I have left?” “What’s the process for submitting invoices?” “Who handles equipment requests?”
These questions had answers. They lived in Confluence pages, Google Sheets, and HR documents. But nobody could find them fast enough. So instead of searching, people asked in Slack. And someone from HR or operations would stop what they were doing to answer the same thing for the fifth time that month.
We decided to build a solution instead of living with the problem.
What Peppy Does: An AI Slack Bot for Internal Company Knowledge
Peppy is a Slack bot, built by Code & Pepper’s engineering team, that serves as an internal AI assistant for company knowledge. You send it a direct message in Slack, ask a question in plain language, and get an accurate answer in seconds.
Peppy handles three categories of daily questions:
Vacation and HR data. Peppy pulls remaining vacation days, overtime hours, and leave balances directly from Google Sheets. This data is displayed in Slack’s Home tab, so employees can check it without sending a single message. Before Peppy, checking a vacation balance required opening a spreadsheet or asking HR. Now it takes one glance.
Company policies and procedures. Training budgets. Multisport card rules. Invoice submission steps. Time-tracking guidelines. Contact lists. Peppy reads these directly from Confluence and answers in conversational language. If the policy document in Confluence gets updated, Peppy’s answers reflect that change immediately. No manual sync. No redeployment.
Routing questions to the right person. “Who do I talk to about X?” is one of the most common questions in any company with more than 30 people. Peppy answers it instantly, eliminating the chain of Slack messages that usually follows.
Key takeaway: Peppy reduces the time employees spend searching for internal information from minutes (or hours) to seconds, while freeing HR and operations teams from answering the same questions repeatedly.
How We Built It: Tech Stack and Architecture
Peppy’s architecture uses four components, each chosen for a specific reason:
Slack Bolt framework handles the bot’s core functionality. Bolt is Slack’s own open-source framework for building Slack apps. It manages message events, the Home tab interface, and direct message handling. We chose Bolt because it integrates natively with Slack’s API and supports the Home tab feature we use to display HR data.
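Bolt’s real SDK needs workspace tokens and a running Slack app, so here is a stdlib-only sketch that mimics its decorator-based event routing: handlers register for an event type (Bolt’s actual event names, like app_home_opened and message, are used), and incoming event payloads dispatch to the matching handler. The MiniApp class and handler bodies are illustrative stand-ins, not Bolt’s API.

```python
from typing import Callable, Dict

class MiniApp:
    """Stdlib stand-in for slack_bolt.App: decorator-registered event handlers."""
    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[dict], str]] = {}

    def event(self, event_type: str):
        # Mimics Bolt's @app.event("...") decorator.
        def register(fn: Callable[[dict], str]):
            self._handlers[event_type] = fn
            return fn
        return register

    def dispatch(self, event: dict) -> str:
        # Route an incoming Slack event payload to its registered handler.
        return self._handlers[event["type"]](event)

app = MiniApp()

@app.event("app_home_opened")   # real Slack event name for the Home tab
def show_home(event: dict) -> str:
    # In Peppy this would publish a Home tab view with HR balances.
    return f"Home tab rendered for {event['user']}"

@app.event("message")           # direct messages arrive as message events
def answer_dm(event: dict) -> str:
    # In Peppy this would forward the question to the AI layer.
    return f"Answering: {event['text']}"

print(app.dispatch({"type": "message", "user": "U1", "text": "How many vacation days do I have?"}))
```

The same shape carries over to real Bolt code: one decorated handler per event type, with Slack delivering the payloads.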
Google Sheets stores vacation days, overtime hours, and other HR data. Peppy reads this data and displays it in Slack’s Home tab. Employees see their current balances without asking anyone.
Firebase stores submitted questions and logs. When someone asks a question through Peppy’s form interface, it gets recorded in Firebase for tracking.
Google Gemini powers the AI layer. This is where Peppy becomes more than a simple data-retrieval bot.
Why Gemini Function Calling Changes the Approach
Most internal AI chatbots work by dumping company documents into a system prompt. That approach creates two problems: the AI works from a static snapshot (not current data), and you hit context window limits fast.
Peppy takes a different approach. We use Gemini’s function calling capability, which works like this:
1. An employee asks a question in Slack.
2. Gemini analyzes the question and decides whether it needs company documentation to answer it.
3. If yes, the model calls a predefined function (like “get list of Confluence documents” or “get document content”).
4. The function runs, fetches the relevant document from Confluence, and returns the content to Gemini.
5. Gemini reads the document and formulates an answer based on current data.
This means company knowledge isn’t hardcoded into Peppy’s system prompt. The AI pulls documents dynamically, on demand, every time someone asks a question. If someone updates a policy in Confluence at 2 PM, Peppy’s answers reflect that change at 2:01 PM.
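The loop above can be sketched end to end. Everything here is illustrative: CONFLUENCE_DOCS, the two tool functions, and model_decide stand in for the real Gemini function-calling API and Confluence REST calls, which are not shown.

```python
# Sketch of the function-calling loop with a stubbed model decision.
# In production, Gemini chooses the tool call and the tools hit
# Confluence's REST API; both are faked here for illustration.

CONFLUENCE_DOCS = {  # hypothetical in-memory stand-in for Confluence
    "vacation-policy": "Employees accrue 26 vacation days per year.",
    "invoice-guide": "Submit invoices by the 5th business day of each month.",
}

def list_documents() -> list:
    """Tool 1: 'get list of Confluence documents'."""
    return sorted(CONFLUENCE_DOCS)

def get_document_content(doc_id: str) -> str:
    """Tool 2: 'get document content'."""
    return CONFLUENCE_DOCS[doc_id]

TOOLS = {"list_documents": list_documents,
         "get_document_content": get_document_content}

def model_decide(question: str):
    """Stub for Gemini: decide whether a tool call is needed."""
    if "vacation" in question.lower():
        return ("get_document_content", {"doc_id": "vacation-policy"})
    return None  # the model answers without documentation

def answer(question: str) -> str:
    call = model_decide(question)
    if call is None:
        return "No documentation needed for this one."
    name, args = call
    content = TOOLS[name](**args)  # run the tool, fetch live content
    # A second model call would normally compose the final answer from `content`.
    return f"Per current policy: {content}"

print(answer("How many vacation days do I get?"))
```

Because the tool reads CONFLUENCE_DOCS at answer time, editing an entry changes the next answer immediately, which is the same property the live Confluence integration gives Peppy.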
According to Google’s AI documentation, function calling allows models to interact with external systems through structured function declarations, making it possible to build AI applications that access real-time data rather than relying on training data alone.
Key takeaway: Gemini’s function calling enables Peppy to work with live company data from Confluence rather than static document dumps, so answers are always current and context-aware.
Model Selection Matters More Than Most Teams Realize
We tested Peppy with two Google Gemini models: a lighter, faster model and a stronger, more capable one. The performance gap was significant.
The lighter model answered questions quickly but skipped checking Confluence before responding roughly 30-40% of the time. It would rely on whatever it had already “seen” and give an answer that sounded right but wasn’t based on the latest documentation. For casual questions, this was fine. For policy-related answers where accuracy matters, it was a problem.
The stronger model checked Confluence for relevant documentation before answering nearly every time. It treated company documents as its primary source of truth, which is exactly the behavior you need when people rely on a bot for HR and compliance information.
The cost difference between the two models is small. The trust difference is enormous. When an employee asks “what’s the process for expensing client dinners?” and gets an outdated answer, they lose confidence in the tool. One wrong answer can mean weeks of rebuilding adoption.
For any team building an internal AI assistant, this is the most practical lesson we can share: test your model selection against accuracy requirements, not just speed or cost. Internal knowledge tools need to be right, not fast.
Key takeaway: Stronger AI models check source documents before answering and produce more reliable results for internal knowledge applications. The cost difference is marginal; the trust difference is critical.
Data Security: Peppy Passed Its First Penetration Test
When you build an AI tool that accesses employee data (vacation balances, overtime hours, personal leave records), data boundaries are not a feature for version two. They’re a requirement for version one.
During internal testing at Code & Pepper, several team members deliberately tried to get Peppy to reveal other employees’ timesheet data. The bot refused every attempt.
This wasn’t accidental. The system is designed so that Peppy only surfaces data the requesting user has permission to see. The AI layer doesn’t override access controls.
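A minimal sketch of that boundary, with a hypothetical timesheet store keyed by Slack user ID: the check lives in the data layer, so even a cleverly worded prompt asking for someone else’s balance returns nothing for the model to leak.

```python
# Illustrative access check at the data layer (all names hypothetical).
# The AI layer only ever receives rows the requesting user may see.

TIMESHEETS = {
    "U_ALICE": {"vacation_left": 12, "overtime_h": 4},
    "U_BOB":   {"vacation_left": 20, "overtime_h": 0},
}

def fetch_hr_record(requesting_user: str, target_user: str):
    # Enforce the boundary before any data reaches the model:
    # users may only read their own record.
    if requesting_user != target_user:
        return None
    return TIMESHEETS.get(target_user)

# Alice can see her own balance...
print(fetch_hr_record("U_ALICE", "U_ALICE"))
# ...but a request for Bob's data, however the prompt is phrased, yields nothing.
print(fetch_hr_record("U_ALICE", "U_BOB"))
```

The point of the design is that the model never has to be trusted to refuse: the data it could leak is filtered out before it arrives.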
For companies in regulated industries (and Code & Pepper works with FCA-regulated FinTech platforms and HIPAA-compliant HealthTech products daily), this principle applies to every internal tool, not just client-facing products. Employee data requires the same rigor.
Key takeaway: Any AI assistant that accesses employee data must implement access controls from day one. At Code & Pepper, Peppy was tested against data leakage attempts before internal launch.
What We Hit: Slack API Limitations for Internal Apps
Not everything went smoothly. Slack imposes significant API restrictions on apps that aren’t registered in the Slack App Directory (their public marketplace).
The biggest limitation: reading conversation content. Slack restricts how unregistered apps can access message data, likely to prevent large-scale content scraping. Since Peppy is an internal tool and we’re not registering it publicly, some features we originally planned had to be redesigned around these constraints.
This is a real constraint that any engineering team building internal Slack bots should evaluate before starting development. If your bot needs to read and process existing Slack conversations (not just respond to direct messages), check Slack’s current API permissions for unregistered apps early in your planning.
Key takeaway: Slack heavily limits API access for apps not listed in their marketplace. Plan your internal bot’s architecture around these restrictions from the start.
What’s Next: Skills Matching and Broader Data Access
Peppy’s roadmap includes two expansions:
Connecting to the Cosmos platform for skills matching. Cosmos is Code & Pepper’s internal HR and skills database. Once integrated, Peppy will answer questions like “who in the company has TypeScript experience?” or “which engineers have worked with React Native recently?” This turns Peppy from a knowledge bot into an internal talent directory, useful for project staffing and cross-team collaboration.
Expanding data sources beyond Confluence. The function calling architecture means any system with an API can become a knowledge source. The same pattern we use for Confluence (define a function, let the model decide when to call it) works for project management tools, CRM systems, or any internal database.
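That extension pattern is essentially a function registry: adding a data source means declaring one more function the model is allowed to call, while the dispatch loop stays untouched. A stdlib sketch, with all tool names and return values hypothetical (including the planned Cosmos integration):

```python
from typing import Callable, Dict

# Hypothetical tool registry: each data source contributes one function
# declaration; the model picks among them at answer time.
TOOL_REGISTRY: Dict[str, Callable[..., str]] = {}

def tool(name: str):
    """Register a callable as a model-invocable tool."""
    def register(fn: Callable[..., str]):
        TOOL_REGISTRY[name] = fn
        return fn
    return register

@tool("confluence_get_page")
def confluence_get_page(page_id: str) -> str:
    # Real implementation: Confluence REST API call.
    return f"<Confluence page {page_id}>"

@tool("cosmos_find_skill")   # sketch of the planned Cosmos integration
def cosmos_find_skill(skill: str) -> str:
    # Real implementation: query the internal skills database.
    return f"<engineers with {skill} from Cosmos>"

def run_tool(name: str, **kwargs) -> str:
    # The dispatch loop never changes; new sources just register themselves.
    return TOOL_REGISTRY[name](**kwargs)

print(run_tool("cosmos_find_skill", skill="TypeScript"))
```

With this shape, “which engineers have worked with React Native recently?” is just one more registered function away, and the model decides when to call it.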
Why This Matters Beyond Code & Pepper
We build software for clients in FinTech and HealthTech every day: payment platforms, digital banking apps, telemedicine systems, compliance layers. But Peppy wasn’t built for a client. We built it for ourselves, because the problem was annoying enough to fix.
That’s what practical AI looks like in 2025. Not a six-month enterprise rollout. Not a board-level “AI strategy” initiative. A focused tool that solves one specific, recurring problem well.
If your team spends hours every week answering the same internal questions on Slack, you probably don’t need an enterprise knowledge management platform. You need a bot, a connection to your documentation, and an AI model that knows when to check its sources before answering.
The tools to build this exist today. Slack’s Bolt framework is open source. Google Gemini’s function calling API is production-ready. Confluence has a well-documented REST API. The technical barriers are low. The real challenge is building it with the right data boundaries and choosing a model that prioritizes accuracy over speed.
Code & Pepper has 19 years of experience building software that works under real-world constraints, from FCA-regulated payment platforms to HIPAA-compliant patient portals. Building Peppy for our own team gave us another proof point: AI works best when it’s focused, practical, and connected to real data sources.
Frequently Asked Questions
What is an AI Slack assistant for company knowledge?
An AI Slack assistant for company knowledge is a bot that connects to internal documentation (like Confluence, Google Sheets, or internal wikis) and answers employee questions in natural language directly inside Slack. Instead of searching through documents, employees message the bot and get accurate answers in seconds.
How does Gemini function calling work in a Slack bot?
Gemini function calling lets developers define external functions (like “search Confluence” or “get document content”) that the AI model can invoke on its own. When a user asks a question, Gemini decides whether it needs external data, calls the appropriate function, receives the results, and generates an answer based on live data rather than pre-loaded context.
Can you build a Slack bot without registering it in the Slack Marketplace?
Yes. Slack allows internal app development without marketplace registration. However, unregistered apps face stricter API rate limits and reduced access to certain features, particularly reading conversation content. Plan your bot’s architecture around these limitations.
How do you prevent an AI Slack bot from leaking employee data?
Implement access controls at the data layer, not just the AI layer. The bot should only query data that the requesting user has permission to see. Test with deliberate attempts to access other users’ data before any internal rollout.
What tech stack does Code & Pepper’s Peppy bot use?
Peppy uses Slack’s Bolt framework for the bot interface, Google Sheets for HR data, Firebase for question logging, and Google Gemini with function calling for AI-powered document retrieval from Confluence.