Conversation

@h4x3rotab
Contributor

Summary

  • Add confidential-ai/ directory with three examples:
    • inference - Private LLM inference with vLLM on Confidential GPU
    • training - Confidential fine-tuning using Unsloth
    • agents - Secure AI agent with TEE-derived wallet keys
  • Update README with Confidential AI section

🤖 Generated with Claude Code

In confidential-ai/agents/agent.py, the / (index) route:

            }
        )
    except Exception as e:
        return jsonify({"status": "running", "error": str(e)})

Check warning

Code scanning / CodeQL

Information exposure through an exception (Medium)

Stack trace information flows to this location and may be exposed to an external user.

Copilot Autofix

AI 2 days ago

In general, the fix is to avoid returning raw exception messages to the client and instead return a generic error while logging the detailed error server-side. This preserves debuggability without exposing stack traces or potentially sensitive internal information to external users.

Concretely, in confidential-ai/agents/agent.py:

  • For the / (index) route, change the except block so that it:
    • Logs the exception (with stack trace) using Python’s standard logging module.
    • Returns a generic JSON response such as {"status": "error", "message": "Internal server error"} with an appropriate HTTP status code (e.g., 500).
  • To implement logging, add an import for logging at the top and optionally configure a basic logger (if one is not already configured elsewhere). Because the fix cannot assume any external configuration and should rely only on the standard library, the built-in logging module is the right tool here.
  • Do not change the normal (non-exceptional) behavior of the endpoint; only change what is returned when an exception occurs.

We will:

  1. Add import logging near the other imports.
  2. Replace the except Exception as e: body in index() so that:
    • It calls logging.exception("Error while handling index endpoint") (which logs the exception with a stack trace).
    • It returns a generic error JSON and a 500 status code, without including str(e).

No other routes need changes for this specific alert, as CodeQL only flagged line 144.
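
As a side note, the patch logs through the root logger via logging.exception. A minimal sketch of the optional configuration mentioned above is shown below; it is illustrative only (the level and format string are arbitrary choices, not part of the suggested patch), but it makes the destination and format of the logged tracebacks explicit instead of relying on Python's last-resort stderr handler.

import logging

# Sketch only: configure the root logger once at startup so that
# logging.exception(...) records (message plus traceback) go to a
# known place with a known format. The values here are illustrative.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)

try:
    1 / 0
except Exception:
    # Emits the message and the full ZeroDivisionError traceback on the
    # server side; the HTTP response in the real handler stays generic.
    logging.exception("Error while handling index endpoint")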

Suggested changeset 1
confidential-ai/agents/agent.py

Autofix patch

Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/confidential-ai/agents/agent.py b/confidential-ai/agents/agent.py
--- a/confidential-ai/agents/agent.py
+++ b/confidential-ai/agents/agent.py
@@ -10,6 +10,7 @@
 """
 
 import os
+import logging
 
 from dstack_sdk import DstackClient
 from dstack_sdk.ethereum import to_account
@@ -140,8 +141,14 @@
                 "app_id": info.app_id,
             }
         )
-    except Exception as e:
-        return jsonify({"status": "running", "error": str(e)})
+    except Exception:
+        logging.exception("Error while handling index endpoint")
+        return jsonify(
+            {
+                "status": "error",
+                "message": "Internal server error",
+            }
+        ), 500
 
 
 @app.route("/attestation")
EOF

In confidential-ai/agents/agent.py, the /chat endpoint:

            }
        )
    except Exception as e:
        return jsonify({"error": str(e)}), 500

Check warning

Code scanning / CodeQL

Information exposure through an exception (Medium)

Stack trace information flows to this location and may be exposed to an external user.

Copilot Autofix

AI 2 days ago

In general, the problem is that the /chat endpoint returns the string form of an arbitrary exception directly to the client, potentially exposing internal details. To fix this, the handler should return a generic, non‑sensitive error message to the user while optionally logging the detailed exception on the server for diagnostics.

The best minimal fix without changing functionality is: (1) introduce a logger for this module using Python’s standard logging library, and (2) in the /chat endpoint’s except block, log the exception (ideally with traceback) and replace {"error": str(e)} with a generic message like {"error": "Internal server error"}. This preserves the HTTP 500 status code and the basic contract that an "error" field is present, but avoids leaking specifics. We only need to modify confidential-ai/agents/agent.py: add a logging import and basic configuration near the top (or just get a logger if configuration is handled elsewhere), and update lines 172‑173 to log and send a generic response.

Concretely:

  • At the top of agent.py, add import logging and a module‑level logger, e.g., logger = logging.getLogger(__name__). If the project doesn’t configure logging elsewhere, we can also add a simple basicConfig.
  • In the /chat endpoint, change the except Exception as e: block to call logger.exception("Error while processing /chat request") (which logs the full traceback) and then return jsonify({"error": "Internal server error"}), 500 instead of using str(e).
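
An alternative to per-route try/except blocks, not part of the suggested patch, is a single application-wide error handler. The sketch below is self-contained and assumes the same kind of Flask app object; the handler name and log message are illustrative. HTTPException instances are returned unchanged so ordinary 404/405 responses keep their status codes, while only unexpected exceptions are logged and masked:

import logging

from flask import Flask, jsonify
from werkzeug.exceptions import HTTPException

logger = logging.getLogger(__name__)
app = Flask(__name__)

@app.errorhandler(Exception)
def handle_unexpected_error(e):
    # Pass Flask/Werkzeug HTTP errors (404, 405, ...) through unchanged.
    if isinstance(e, HTTPException):
        return e
    # Log the full traceback server-side; expose only a generic message.
    logger.exception("Unhandled exception while processing request")
    return jsonify({"error": "Internal server error"}), 500

Whether to keep per-route handlers, as this patch does, or to centralize them is a design choice; the per-route version keeps each endpoint's error contract (here, the "error" field and 500 status) visible at the call site.
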
Suggested changeset 1
confidential-ai/agents/agent.py

Autofix patch

Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/confidential-ai/agents/agent.py b/confidential-ai/agents/agent.py
--- a/confidential-ai/agents/agent.py
+++ b/confidential-ai/agents/agent.py
@@ -10,6 +10,7 @@
 """
 
 import os
+import logging
 
 from dstack_sdk import DstackClient
 from dstack_sdk.ethereum import to_account
@@ -22,6 +23,8 @@
 
 app = Flask(__name__)
 
+logger = logging.getLogger(__name__)
+
 # Lazy initialization - only connect when needed
 _client = None
 _account = None
@@ -170,7 +173,8 @@
             }
         )
     except Exception as e:
-        return jsonify({"error": str(e)}), 500
+        logger.exception("Error while processing /chat request")
+        return jsonify({"error": "Internal server error"}), 500
 
 
 @app.route("/sign", methods=["POST"])
EOF

In confidential-ai/inference/verify.py:

import argparse
import hashlib
import json

Check notice

Code scanning / CodeQL

Unused import (Note)

Import of 'json' is not used.

Copilot Autofix

AI 2 days ago

To fix an unused-import issue, you remove the import statement for the module that is not referenced anywhere in the file. This reduces unnecessary dependencies and slightly simplifies the code without changing runtime behavior.

In this specific case, in confidential-ai/inference/verify.py, the json module imported on line 13 is not used anywhere in the shown code. The best fix is to delete that single import json line and leave the rest of the imports (argparse, hashlib, sys, requests) unchanged. No additional methods, definitions, or imports are required, and no other parts of the file need to be modified.

Concretely:

  • Edit confidential-ai/inference/verify.py.
  • Remove the line import json (currently line 13 in the snippet).
  • Ensure all remaining imports and code remain exactly as they are.

Suggested changeset 1
confidential-ai/inference/verify.py

Autofix patch

Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/confidential-ai/inference/verify.py b/confidential-ai/inference/verify.py
--- a/confidential-ai/inference/verify.py
+++ b/confidential-ai/inference/verify.py
@@ -10,7 +10,6 @@
 
 import argparse
 import hashlib
-import json
 import sys
 
 import requests
EOF
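
For reference, after applying this patch together with the import sys fix in the next alert, the import block at the top of confidential-ai/inference/verify.py would look like the sketch below (assuming, as both alerts state, that nothing else in the file uses json or sys):

import argparse
import hashlib

import requests
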
In confidential-ai/inference/verify.py:

import argparse
import hashlib
import json
import sys

Check notice

Code scanning / CodeQL

Unused import (Note)

Import of 'sys' is not used.

Copilot Autofix

AI 2 days ago

To fix an unused import, the correct approach is simply to delete the import statement, as long as the imported module is not referenced anywhere in the file. This removes an unnecessary dependency and slightly improves readability and startup time.

For this file, confidential-ai/inference/verify.py, the single best fix is to remove the line import sys at line 14. No other code changes are needed, and no replacement functionality is required because the module was not used. Concretely: in verify.py, delete line 14 containing import sys while leaving all other imports and code unchanged.
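
Both unused-import notes in this PR can also be caught locally before CodeQL runs, for example with pyflakes or ruff. As a rough illustration of what such a check does, the stdlib-only sketch below (not how CodeQL is implemented, and intentionally simplified: it ignores __all__, re-exports, and names used only inside string annotations; the function name unused_imports is ours) parses a file and reports imported names that are never referenced:

import ast
import sys


def unused_imports(path: str) -> list[str]:
    """Rough check: names imported in `path` that are never referenced."""
    with open(path, encoding="utf-8") as f:
        tree = ast.parse(f.read(), filename=path)

    imported = set()
    for node in ast.walk(tree):
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            for alias in node.names:
                if alias.name == "*":  # star imports cannot be tracked here
                    continue
                # "import a.b" binds the name "a"; honour "as" renames.
                imported.add(alias.asname or alias.name.split(".")[0])

    # Attribute access such as json.dumps(...) still starts with an
    # ast.Name node for "json", so collecting Name nodes is sufficient
    # for this approximation.
    used = {node.id for node in ast.walk(tree) if isinstance(node, ast.Name)}
    return sorted(imported - used)


if __name__ == "__main__":
    for name in unused_imports(sys.argv[1]):
        print(f"unused import: {name}")

Run against verify.py as committed here, it should report json and sys, matching the two CodeQL notes; dedicated linters such as pyflakes or ruff perform the same check far more robustly.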

Suggested changeset 1
confidential-ai/inference/verify.py

Autofix patch

Run the following command in your local git repository to apply this patch
cat << 'EOF' | git apply
diff --git a/confidential-ai/inference/verify.py b/confidential-ai/inference/verify.py
--- a/confidential-ai/inference/verify.py
+++ b/confidential-ai/inference/verify.py
@@ -11,7 +11,6 @@
 import argparse
 import hashlib
 import json
-import sys
 
 import requests
 
EOF