Backchat GenAI is a simple Backstage plugin that brings private generative AI chat into your portal. It embeds a chat app that you run on your own stack. You choose the UI you prefer, such as Chatbot UI, Ollama Web UI, Text Generation Web UI, or Big AGI. The plugin loads that UI inside Backstage, so teams can experiment with local or self-hosted models without leaving the developer portal. When you point it at servers you control, your prompts and responses can stay on your network.
At a high level, Backchat adds a page in Backstage that frames your chosen chat client. This keeps the plugin small and easy to reason about. It is meant for trials and proofs of concept where you want to see how chat fits your workflows. Common use cases include drafting TechDocs content, summarizing runbooks, sketching ADRs, helping service owners write better descriptions, brainstorming test ideas, or trying prompts with different local models. It is a good way to explore AI features before you commit to deeper integration.
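Because the integration is deliberately thin, the page is conceptually little more than a frame around whatever URL you configure. The sketch below illustrates that idea in TypeScript, assuming a component that reads ai_server.url through Backstage's config API and renders it in an iframe; the component name and styling are illustrative, not the plugin's actual source.
// Hypothetical sketch of an embedded chat page, not the plugin's real code.
import React from 'react';
import { useApi, configApiRef } from '@backstage/core-plugin-api';
import { Content, Header, Page } from '@backstage/core-components';

export const EmbeddedChatPage = () => {
  // Read the URL configured under ai_server in app-config.local.yaml.
  const config = useApi(configApiRef);
  const chatUrl = config.getString('ai_server.url');

  return (
    <Page themeId="tool">
      <Header title="Backchat AI" />
      <Content>
        {/* The self-hosted chat UI is framed inside the portal. */}
        <iframe
          src={chatUrl}
          title="Backchat AI"
          style={{ width: '100%', height: '100%', border: 'none' }}
        />
      </Content>
    </Page>
  );
};
Because the page only frames a URL, swapping between Chatbot UI, Ollama Web UI, Text Generation Web UI, or Big AGI is a one-line config change.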

Installation Instructions
These instructions apply to self-hosted Backstage only.
Step 1: Install the frontend package
Run this in the root of your Backstage repo.
yarn add --cwd packages/app @benbravo73/backstage-plugin-backchat
Step 2: Configure the AI server URL
Create a local config file if you do not have one yet.
touch app-config.local.yaml
Add this to app-config.local.yaml.
# Mandatory: Choose one URL to use
ai_server:
  # Chatbot UI with LocalAI server
  url: "http://localhost:3001"
  # Ollama Web UI
  # url: "http://localhost:3100"
  # Text Generation Web UI
  # url: "http://localhost:7860"
  # Big AGI UI
  # url: "http://localhost:3456"
# Optional: Backchat catalog and TechDocs
catalog:
  locations:
    - type: url
      target: https://github.com/benwilcock/backstagecon-2023/blob/main/backchat-catalog.yaml
  rules:
    - allow: [Component, API, Resource, System, Domain, Location, Group, User]
Step 3: Add the Backchat route
Edit packages/app/src/App.tsx. Import the page then add a route under FlatRoutes.
// packages/app/src/App.tsx
import React from 'react';
import { BackchatPage } from '@benbravo73/backstage-plugin-backchat';
import { FlatRoutes } from '@backstage/core-app-api';
import { Route } from 'react-router-dom';
// inside your App component render
export const AppRoutes = () => (
  <FlatRoutes>
    {/* other routes */}
    <Route path="/backchat" element={<BackchatPage />} />
  </FlatRoutes>
);
If your file uses a different structure, place the Route inside the same FlatRoutes block as your other pages.
Step 4: Add Backchat to the sidebar
Edit packages/app/src/components/Root/Root.tsx. Import the icon then add a SidebarItem.
// packages/app/src/components/Root/Root.tsx
import React from 'react';
import ChatIcon from '@material-ui/icons/Chat';
import { SidebarItem, SidebarGroup } from '@backstage/core-components';
// inside the "Menu" SidebarGroup with your other items
export const Root = () => (
  <>
    {/* other layout */}
    <SidebarGroup label="Menu">
      {/* other items */}
      <SidebarItem icon={ChatIcon} to="backchat" text="Backchat AI" />
    </SidebarGroup>
  </>
);
If your app uses a different sidebar layout, add the SidebarItem in the same group as your other navigation links.
Step 5: Start the AI server stack
Use the project that provides LocalAI, Ollama, Text Generation Web UI, or Big AGI UI. That project includes a Docker Compose file. Complete the setup it needs, then bring the stack up.
docker compose up
After it starts, the common defaults are:
- Chatbot UI on http://localhost:3001
- Text Generation Web UI on http://localhost:7860
Pick one URL and make sure it matches the url you set in app-config.local.yaml.
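Before wiring that URL into Backstage, it helps to confirm the chat UI actually answers on it. Here is a small, hypothetical reachability check in TypeScript (assuming Node 18+ so a global fetch is available; check-chat-ui.ts and AI_SERVER_URL are illustrative names, not part of the plugin):
// check-chat-ui.ts: hypothetical sanity check, not part of the plugin.
const url = process.env.AI_SERVER_URL ?? 'http://localhost:3001';

async function main() {
  try {
    // Any HTTP response means the UI is listening on the configured URL.
    const res = await fetch(url);
    console.log(`Chat UI at ${url} responded with HTTP ${res.status}`);
  } catch (err) {
    console.error(`Could not reach ${url}:`, err);
    process.exit(1);
  }
}

main();
Opening the URL in a browser works just as well; the point is to rule out a wrong port before you start debugging the Backstage side.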
Step 6: Run Backstage
From the repo root
yarn dev
Your Backstage app serves the Backchat page at the path you added. The sidebar link Backchat AI points to it. The page embeds the AI chat UI you configured.
Backend on the legacy system
No backend plugin is required. The Backchat page loads the configured AI web UI directly in the browser.
Backend on the new backend system
No backend plugin is required. The Backchat page loads the configured AI web UI directly in the browser.
Notes on AI UI choices
- Chatbot UI with LocalAI server looks good for quick tests
- Ollama Web UI is easy to start and switch models
- Text Generation Web UI has many features
- Big AGI UI can switch between multiple servers in one place
