BrinqaIQ MCP
This article details BrinqaIQ's Model Context Protocol (MCP) integration and provides instructions for connecting MCP-compatible AI clients to BrinqaIQ.
MCP support for BrinqaIQ was introduced in Brinqa Platform version 11.33. Brinqa strongly recommends upgrading to the latest platform version to take full advantage of the functionality.
What is MCP?
MCP is an open standard that enables AI applications to securely connect to external data sources and tools. It provides a universal way for AI systems to access contextual information, allowing them to interact with various platforms and services through a standardized interface.

By implementing MCP, BrinqaIQ enables you to use compatible AI clients (like Claude Desktop, Visual Studio Code, and other MCP-enabled applications) to interact with BrinqaIQ capabilities such as writing BQL queries, exploring CVEs, and accessing documentation.
Available Tools
BrinqaIQ provides an OpenAPI specification that documents all available MCP tools and their capabilities. To view the interactive API documentation and explore the available endpoints, go to:

`https://<your-brinqa-platform-url>/brinqamax/docs`
The documentation includes detailed information about each tool, including parameters, request/response schemas, and authentication requirements.
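As a sketch of what that specification contains, the snippet below enumerates the operations declared in an OpenAPI document. The two endpoint paths in `example_spec` are hypothetical placeholders; in practice you would fetch `https://<your-brinqa-platform-url>/brinqamax/openapi.json` with your API token and pass the parsed JSON to `list_operations()`.

```python
def list_operations(spec: dict) -> list[str]:
    """Return 'METHOD /path' entries for every operation in an OpenAPI spec."""
    ops = []
    for path, methods in spec.get("paths", {}).items():
        for method in methods:
            ops.append(f"{method.upper()} {path}")
    return sorted(ops)

# Hypothetical two-endpoint spec, for illustration only.
example_spec = {
    "openapi": "3.0.0",
    "paths": {
        "/bql/query": {"post": {"summary": "Run a BQL query"}},
        "/docs/search": {"get": {"summary": "Search documentation"}},
    },
}

print(list_operations(example_spec))  # ['GET /docs/search', 'POST /bql/query']
```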

Security and Access Control
The BrinqaIQ MCP integration uses API token-based authentication to access the Brinqa Platform on your behalf. MCP only accesses the specific BrinqaIQ endpoints documented in the Available Tools section above; no other Brinqa Platform APIs are accessible through the MCP integration.
All API calls made through MCP inherit the permissions of the user account associated with the API token, including any role-based or data-level access controls. This ensures that MCP can only access data that the user already has permission to view in the Brinqa Platform.
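Conceptually, each MCP-proxied call is an ordinary HTTPS request carrying your token in an Authorization header, which the server evaluates against that user's permissions. The sketch below constructs (but does not send) such a request; the URL values are placeholders.

```python
import urllib.request

token = "<your-brinqa-platform-access-token>"  # placeholder, not a real token
base_url = "https://<your-brinqa-platform-url>"  # placeholder

# Construct the kind of request the MCP proxy issues on your behalf.
# The server checks this token's permissions on every call, so MCP can
# never see more than the token's user could see in the Brinqa Platform.
req = urllib.request.Request(
    base_url + "/brinqamax/openapi.json",
    headers={"Authorization": f"Bearer {token}"},
)

print(req.get_header("Authorization"))
```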
Supported AI Clients
The following AI clients have been tested and are known to operate successfully with the BrinqaIQ MCP Server:

- Claude Desktop
- GitHub Copilot in VS Code
- Goose
- Open WebUI
Brinqa does not provide official support for third-party AI clients. Compatibility depends on each client's implementation of the MCP standard and may vary.
Connect Claude Desktop to BrinqaIQ MCP
This section demonstrates how to connect Claude Desktop to BrinqaIQ MCP.
Prerequisites
Before connecting to BrinqaIQ MCP, you must have these prerequisites in place:

- A Brinqa Platform API key. You need an API key to authenticate with the Brinqa Platform. See Obtain an API token for instructions.
- Claude Desktop application. Download and install Claude Desktop from claude.ai/download.
- Python package manager (`uv`). `uv` is a Python package and project manager. For installation instructions, see the uv documentation. After installing `uv`, find the full path to `uvx` by running `which uvx` in your terminal. Save this path for the MCP server configuration.
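The `which uvx` command works in POSIX shells. If `which` is not available (for example, on Windows), Python's standard-library `shutil.which` performs the same PATH lookup, as this small sketch shows:

```python
import shutil

# shutil.which searches your PATH the same way a shell's `which` does.
uvx_path = shutil.which("uvx")

if uvx_path:
    print(f"Use this as <your-uvx-path>: {uvx_path}")
else:
    print("uvx was not found on PATH; check your uv installation.")
```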
Configure the MCP Server
After installing the prerequisites, configure Claude Desktop to connect to BrinqaIQ MCP by adding your Brinqa Platform URL and API token to its MCP settings file.
To configure the MCP server in Claude Desktop, follow these steps:
1. Open Claude on your machine.
2. Click your profile icon in the bottom-left corner of the sidebar.
3. Click Settings and then Developer.
4. Click Edit Config.
5. If prompted, select a text editor to open the `claude_desktop_config.json` file.
6. Add this configuration to the file:

   ```json
   {
     "mcpServers": {
       "brinqaMCP": {
         "command": "<your-uvx-path>",
         "args": [
           "mcp-openapi-proxy"
         ],
         "env": {
           "OPENAPI_SPEC_URL": "https://<your-brinqa-platform-url>/brinqamax/openapi.json",
           "API_KEY": "<your-brinqa-platform-access-token>"
         }
       }
     }
   }
   ```

   Replace `<your-uvx-path>`, `<your-brinqa-platform-url>`, and `<your-brinqa-platform-access-token>` with your actual values.

   Note: The configuration uses `mcp-openapi-proxy`, a Python package that implements an MCP server using the OpenAPI specification.

7. Save the file.
8. Restart Claude.
Test the Connection
After Claude Desktop restarts successfully, test the connection:
1. Open Claude.
2. Ask a question related to BrinqaIQ, such as "Are you connected to BrinqaIQ MCP?".
3. The MCP server should connect automatically and provide an answer.

Connect GitHub Copilot to BrinqaIQ MCP
This section demonstrates how to connect GitHub Copilot in VS Code to BrinqaIQ MCP.
Prerequisites
Before connecting to BrinqaIQ MCP, you must have these prerequisites in place:
- A Brinqa Platform API key. You need an API key to authenticate with the Brinqa Platform. See Obtain an API token for instructions.
- GitHub Copilot extension in VS Code. To install the extension:
  1. Open VS Code.
  2. Go to Extensions and search for "GitHub Copilot".
  3. Click Install.

  GitHub Copilot offers a free plan for individuals with limited access, as well as paid plans. For more information, see GitHub Copilot plans.
- Python package manager (`uv`). `uv` is a Python package and project manager. For installation instructions, see the uv documentation. After installing `uv`, find the full path to `uvx` by running `which uvx` in your terminal. Save this path for the MCP server configuration.
Configure the MCP Server
After installing the prerequisites, configure Copilot to connect to BrinqaIQ MCP by adding your Brinqa Platform URL and API token to its MCP settings file.
To configure the MCP server in VS Code, follow these steps:
1. Open your workspace in VS Code.
2. Create a `.vscode` folder in your workspace root if it doesn't already exist.
3. Create a file named `mcp.json` in the `.vscode` folder.
4. Add this configuration to the file:

   ```json
   {
     "servers": {
       "brinqaMCP": {
         "type": "stdio",
         "command": "<your-uvx-path>",
         "args": [
           "mcp-openapi-proxy"
         ],
         "env": {
           "OPENAPI_SPEC_URL": "https://<your-brinqa-platform-url>/brinqamax/openapi.json",
           "API_KEY": "<your-brinqa-platform-access-token>",
           "OPENAPI_SIMPLE_MODE": "true"
         }
       }
     }
   }
   ```

   Replace `<your-uvx-path>`, `<your-brinqa-platform-url>`, and `<your-brinqa-platform-access-token>` with your actual values.

   Note: The configuration uses `mcp-openapi-proxy`, a Python package that implements an MCP server using the OpenAPI specification.

5. Save the file.
6. In VS Code, you should see clickable actions (code lenses) above the server configuration in `mcp.json`. Click Start to start the MCP server.
7. If prompted, confirm that you trust the MCP server configuration.
This configuration creates a workspace-level MCP server that only applies to the current workspace. For other configuration options (user-level, dev container, or automatic discovery), see the VS Code MCP documentation.
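A common slip when copying the configuration is leaving one of the `<your-...>` placeholders unreplaced. The sketch below scans a config body for any such leftover placeholder; the helper function and sample config are illustrative only.

```python
import json
import re

def unreplaced_placeholders(text: str) -> list[str]:
    """Find <your-...> placeholders left in a config file body."""
    return sorted(set(re.findall(r"<your-[a-z-]+>", text)))

# Sample mcp.json body where the command path was never filled in.
sample = json.dumps({
    "servers": {
        "brinqaMCP": {
            "type": "stdio",
            "command": "<your-uvx-path>",
            "args": ["mcp-openapi-proxy"],
            "env": {
                "OPENAPI_SPEC_URL": "https://example.invalid/brinqamax/openapi.json",
                "API_KEY": "redacted",
                "OPENAPI_SIMPLE_MODE": "true",
            },
        }
    }
})
print(unreplaced_placeholders(sample))  # ['<your-uvx-path>'] still needs replacing
```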
Test the Connection
After the MCP server starts successfully, test the connection:
1. Open Copilot Chat in VS Code.
2. Ask a question related to BrinqaIQ, such as "Are you connected to BrinqaIQ MCP?" or "What BrinqaIQ tools are available?".
3. Copilot should respond using the BrinqaIQ MCP tools.
To view the MCP server logs or troubleshoot issues:
- Open the Output panel (View > Output).
- Select MCP: brinqaMCP from the dropdown menu.
- Review any error messages or connection logs.
Connect Goose to BrinqaIQ MCP
This section demonstrates how to connect Goose to BrinqaIQ MCP.
Prerequisites
Before connecting to BrinqaIQ MCP, you must have these prerequisites in place:
- A Brinqa Platform API key. You need an API key to authenticate with the Brinqa Platform. See Obtain an API token for instructions.
- Goose. Download and install Goose following the instructions in the Goose documentation. Goose is available as a desktop application or CLI tool.
- Language model provider. Goose requires a language model to process queries and interact with BrinqaIQ tools. Configure a language model provider (such as Ollama, GitHub Copilot, OpenAI, or another compatible provider) following the Goose documentation.
- Python package manager (`uv`). `uv` is a Python package and project manager. For installation instructions, see the uv documentation. After installing `uv`, you can use `uvx` directly if it's in your system PATH, or use the full path. To find the full path, run `which uvx`. In the configuration steps below, you can use either `uvx` or the full path returned by this command.
Configure the MCP Extension
After installing the prerequisites, configure Goose to connect to BrinqaIQ MCP by creating an extension.
To configure the MCP extension in Goose, follow these steps:
1. Open Goose Desktop.
2. Go to Extensions and click Add custom extension.
3. Configure the extension with these settings:
   - Name: BrinqaMCP
   - Type: STDIO
   - Command: `<your-uvx-path> mcp-openapi-proxy`
   - Environment Variables:
     - API_KEY = `<your-brinqa-platform-access-token>`
     - OPENAPI_SPEC_URL = `https://<your-brinqa-platform-url>/brinqamax/openapi.json`

   Replace `<your-uvx-path>`, `<your-brinqa-platform-url>`, and `<your-brinqa-platform-access-token>` with your actual values.

   Note: The configuration uses `mcp-openapi-proxy`, a Python package that implements an MCP server using the OpenAPI specification.

4. Click Add Extension.
Test the Connection
After configuring the MCP extension, test the connection:
1. In Goose, start a new chat.
2. Ask a question related to BrinqaIQ, such as "Are you connected to BrinqaIQ MCP?" or "What BrinqaIQ tools are available?".

Goose should respond using the BrinqaIQ MCP tools.
Using MCP extensions with local language models (like Ollama) requires significant computational resources. If you experience timeouts or slow responses, consider using a cloud-based provider (like OpenAI or GitHub Copilot) or a machine with more available resources.
If the connection is not working, verify that:
- Your language model provider is configured correctly.
- The BrinqaMCP extension is saved and enabled in Goose.
- The API key you entered is correct.
- The Brinqa Platform URL is accessible from your machine.

Check the Goose logs for detailed error messages.
Connect Open WebUI to BrinqaIQ
This section demonstrates how to connect Open WebUI to BrinqaIQ using OpenAPI integration.
Prerequisites
Before connecting to BrinqaIQ, you must have these prerequisites in place:
- A Brinqa Platform API key. You need an API key to authenticate with the Brinqa Platform. See Obtain an API token for instructions.
- Open WebUI. Install Open WebUI following the instructions in the Open WebUI documentation. Once installed, start Open WebUI and access it in your browser (typically at http://localhost:8080).
- Language model. Open WebUI requires a language model to process queries and interact with BrinqaIQ tools. You can use various LLM backends such as Ollama, OpenAI, or other compatible providers. For language model setup instructions, see the Open WebUI documentation. If using Ollama, smaller models like `llama3.2:3b` provide faster responses while still effectively using BrinqaIQ tools.
Configure the OpenAPI Connection
After installing the prerequisites, configure Open WebUI to connect to BrinqaIQ by adding an OpenAPI connection.
Open WebUI connects to BrinqaIQ through its OpenAPI specification rather than through MCP's stdio transport. This is due to Open WebUI's web-based, multi-tenant architecture.
To configure the connection in Open WebUI, follow these steps:
1. Access Open WebUI in your browser (typically http://localhost:8080).
2. Click your user icon in the top-right corner.
3. Click Settings, and then Admin Settings.
4. Go to External Tools and click + (Add Connection).
5. Configure the connection with these settings:
   - Type: OpenAPI
   - URL: `https://<your-brinqa-platform-url>`
   - OpenAPI Spec:
     - URL: `brinqamax/openapi.json`
     - Auth: None
   - Headers:

     ```json
     {
       "Authorization": "Bearer <your-brinqa-platform-access-token>"
     }
     ```

   - Name: BrinqaMCP
   - Description: BrinqaIQ API integration
   - Visibility: Public

   Replace `<your-brinqa-platform-url>` and `<your-brinqa-platform-access-token>` with your actual values.

6. Click Save.
7. Ensure the connection toggle is enabled (switched on).
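Because the OpenAPI Spec URL above is relative, it is resolved against the connection URL. The sketch below illustrates that resolution under standard URL-joining semantics (Open WebUI's exact internals may differ, and `brinqa.example.com` is a placeholder for your platform URL):

```python
from urllib.parse import urljoin

base_url = "https://brinqa.example.com"   # stands in for <your-brinqa-platform-url>
spec_path = "brinqamax/openapi.json"      # the relative OpenAPI Spec URL

# The relative spec path resolves against the connection URL, so the
# connection fetches the same spec the MCP clients above use directly.
spec_url = urljoin(base_url + "/", spec_path)
print(spec_url)  # https://brinqa.example.com/brinqamax/openapi.json
```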
Test the Connection
After configuring the OpenAPI connection, test it to verify BrinqaIQ tools are available:
1. In Open WebUI, start a new chat.
2. Select your language model from the model dropdown (for example, llama3.2:3b).
3. At the bottom of the message input field, select Integrations > Tools > BrinqaMCP. This directs the language model to use the BrinqaIQ connection when responding to your query.
4. Ask a question to test the BrinqaIQ connection, such as "What tools does BrinqaIQ support?" or "Can you query BrinqaIQ data models?".

The model should respond by accessing BrinqaIQ tools and providing information about available capabilities.

If the connection is not working, verify that:
- Your language model backend is running (if using Ollama, run `ollama list` to check installed models).
- The OpenAPI connection is enabled in Open WebUI.
- The Bearer token in the Headers field is correct.
- The Brinqa Platform URL is accessible from your machine.