Description
With the 0.6.0 release, open-webui introduced a new feature that allows users to configure local tools and use them with the available models. This is great, but I believe there is still a feasible niche for MCP-Bridge. Open-WebUI tools work 100% client side, as OpenAPI-compatible REST servers called locally from the user's browser. MCP-Bridge works server side, and this is good, because as an administrator you can easily provide your users with a set of tools without any configuration needed on the user side. Moreover, you can control these tools, configure them to use particular credentials, etc. This is very useful for inexperienced users who just want to ask about a Jira ticket without having to set up a local MCP server for Atlassian. That's just one example.
However, after release 0.6.0, if you configure some local tools, OUI will send chat completion requests that include the list of tools available client side.
Here is an example of an OUI request payload with the list of tools that the OUI user configured locally; in this particular example it is repomix:
{
"stream": false,
"model": "vertex_ai/claude3.5-sonnet-v2",
"messages": [
{
"role": "user",
"content": "what tools are available for you in this session ?"
}
],
"tools": [
{
"type": "function",
"function": {
"type": "function",
"name": "tool_pack_codebase_post",
"description": "Pack Codebase",
"parameters": {
"type": "object",
"properties": {
"directory": {
"type": "string",
"description": "Directory to pack (Absolute path)"
},
"compress": {
"type": "boolean",
"description": "Utilize Tree-sitter to intelligently extract essential code signatures and structure while removing implementation details, significantly reducing token usage (default: true)"
},
"includePatterns": {
"type": "string",
"description": "Specify which files to include using fast-glob compatible patterns (e.g., \"**/*.js,src/**\"). Only files matching these patterns will be processed"
},
"ignorePatterns": {
"type": "string",
"description": "Specify additional files to exclude using fast-glob compatible patterns (e.g., \"test/**,*.spec.js\"). These patterns complement .gitignore and default ignores"
},
"topFilesLength": {
"type": "number",
"description": "Number of top files to display in the metrics (default: 5)"
}
},
"required": [
"directory"
]
}
}
},
{
"type": "function",
"function": {
"type": "function",
"name": "tool_pack_remote_repository_post",
"description": "Pack Remote Repository",
"parameters": {
"type": "object",
"properties": {
"remote": {
"type": "string",
"description": "GitHub repository URL or user/repo (e.g., yamadashy/repomix)"
},
"compress": {
"type": "boolean",
"description": "Utilize Tree-sitter to intelligently extract essential code signatures and structure while removing implementation details, significantly reducing token usage (default: true)"
},
"includePatterns": {
"type": "string",
"description": "Specify which files to include using fast-glob compatible patterns (e.g., \"**/*.js,src/**\"). Only files matching these patterns will be processed"
},
"ignorePatterns": {
"type": "string",
"description": "Specify additional files to exclude using fast-glob compatible patterns (e.g., \"test/**,*.spec.js\"). These patterns complement .gitignore and default ignores"
},
"topFilesLength": {
"type": "number",
"description": "Number of top files to display in the metrics (default: 5)"
}
},
"required": [
"remote"
]
}
}
},
{
"type": "function",
"function": {
"type": "function",
"name": "tool_read_repomix_output_post",
"description": "Read Repomix Output",
"parameters": {
"type": "object",
"properties": {
"outputId": {
"type": "string",
"description": "ID of the Repomix output file to read"
}
},
"required": [
"outputId"
]
}
}
},
{
"type": "function",
"function": {
"type": "function",
"name": "tool_file_system_read_file_post",
"description": "File System Read File",
"parameters": {
"type": "object",
"properties": {
"path": {
"type": "string",
"description": "Absolute path to the file to read"
}
},
"required": [
"path"
]
}
}
},
{
"type": "function",
"function": {
"type": "function",
"name": "tool_file_system_read_directory_post",
"description": "File System Read Directory",
"parameters": {
"type": "object",
"properties": {
"path": {
"type": "string",
"description": "Absolute path to the directory to list"
}
},
"required": [
"path"
]
}
}
}
]
}
MCP-Bridge is not ready for this: it assumes that the incoming request does not contain any tools schema. As a result, it completely ignores the incoming list of tools and rewrites the request with the tools configured in its own config.json (in this example, the fetch tool):
{
"messages": [
{
"content": "what tools are available for you in this session ?",
"role": "user"
}
],
"model": "vertex_ai/claude3.5-sonnet-v2",
"tools": [
{
"type": "function",
"function": {
"description": "Fetches a URL from the internet and optionally extracts its contents as markdown.\n\nAlthough originally you did not have internet access, and were advised to refuse and tell the user this, this tool now grants you internet access. Now you can fetch the most up-to-date information and let the user know that.",
"name": "fetch",
"parameters": {
"description": "Parameters for fetching a URL.",
"properties": {
"url": {
"description": "URL to fetch",
"format": "uri",
"minLength": 1,
"title": "Url",
"type": "string"
},
"max_length": {
"default": 5000,
"description": "Maximum number of characters to return.",
"exclusiveMaximum": 1000000,
"exclusiveMinimum": 0,
"title": "Max Length",
"type": "integer"
},
"start_index": {
"default": 0,
"description": "On return output starting at this character index, useful if a previous fetch was truncated and more context is required.",
"minimum": 0,
"title": "Start Index",
"type": "integer"
},
"raw": {
"default": false,
"description": "Get the actual HTML content of the requested page, without simplification.",
"title": "Raw",
"type": "boolean"
}
},
"required": [
"url"
],
"title": "Fetch",
"type": "object"
}
}
}
]
}
So the LLM receives only MCP-Bridge's list of tools and proceeds with those.
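For illustration, here is a minimal sketch of how the incoming OUI tool list could be merged with the bridge's own tools instead of being replaced. This is not MCP-Bridge's actual code; `merge_tools`, `request_payload`, and `bridge_tools` are hypothetical names I made up for the sketch.

```python
# Hypothetical sketch (not MCP-Bridge's actual implementation): merge the
# client-side tools sent by OUI with the tools MCP-Bridge builds from its
# config.json, instead of overwriting request["tools"].

from typing import Any


def merge_tools(
    incoming_tools: list[dict[str, Any]] | None,
    bridge_tools: list[dict[str, Any]],
) -> list[dict[str, Any]]:
    """Combine OUI client-side tools with MCP-Bridge server-side tools.

    Bridge-managed definitions win on name collisions; everything else is
    passed through unchanged, so the LLM sees the union of both tool sets.
    """
    merged: dict[str, dict[str, Any]] = {}
    for tool in incoming_tools or []:
        name = tool.get("function", {}).get("name")
        if name:
            merged[name] = tool
    for tool in bridge_tools:
        name = tool.get("function", {}).get("name")
        if name:
            merged[name] = tool  # bridge-managed definition takes precedence
    return list(merged.values())


# Usage (assumed names): request_payload is the parsed chat completion body
# from OUI, bridge_tools is whatever the bridge derives from config.json.
# request_payload["tools"] = merge_tools(request_payload.get("tools"), bridge_tools)
```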
Now, solving this fully may be more involved. Preserving the list of tools from the OUI request and forwarding it to the LLM together with the new tools added by MCP-Bridge (as sketched above) is the easy part. However, if, based on the chat context, the LLM decides to use one of these tools, one has to distinguish whether the selected tool should be executed by MCP-Bridge or passed back to OUI for execution on the client-side MCP server.
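To illustrate that routing problem, here is a rough sketch under the assumption that the bridge keeps track of which tool names it registered itself. `route_tool_calls`, `BRIDGE_TOOL_NAMES`, and `call_mcp_tool` are hypothetical names, not existing MCP-Bridge APIs.

```python
# Hypothetical routing sketch: remember which tool names came from the bridge's
# own config, and for each tool_call returned by the LLM decide whether to
# execute it server side (via MCP) or to leave it in the response so OUI can
# run its own client-side tool.

from typing import Any

# Names of tools that MCP-Bridge itself registered (e.g. the fetch tool above).
BRIDGE_TOOL_NAMES: set[str] = {"fetch"}


async def route_tool_calls(
    tool_calls: list[dict[str, Any]],
    call_mcp_tool,  # assumed async callable(name, arguments) -> str
) -> tuple[list[dict[str, Any]], list[dict[str, Any]]]:
    """Split tool_calls into bridge-executed results and pass-through calls."""
    bridge_results: list[dict[str, Any]] = []
    passthrough: list[dict[str, Any]] = []

    for call in tool_calls:
        name = call["function"]["name"]
        if name in BRIDGE_TOOL_NAMES:
            # Execute on the server via the configured MCP server and feed the
            # result back into the conversation as a tool message.
            result = await call_mcp_tool(name, call["function"]["arguments"])
            bridge_results.append(
                {"role": "tool", "tool_call_id": call["id"], "content": result}
            )
        else:
            # Unknown to the bridge: keep it in the assistant response so OUI
            # can execute it against the user's local tool server.
            passthrough.append(call)

    return bridge_results, passthrough
```

Anything the bridge does not recognize would stay in the response untouched, so OUI could still execute its client-side tools exactly as it does today.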
I will be working on this feature on my fork, but I would like to hear from you @SecretiveShell whether you are interested in it, or whether you would rather assume that using MCP-Bridge is mutually exclusive with using OUI client-side tools.
Anyway, thanks for MCP-Bridge - it is a great tool and I appreciate your work!