🐛 Bug Description
Analyzing an email with AI throws these errors:

```
[Ollama] Tab injection chat failed: HTTP 404: Not Found    background.js:704:25
analyzeEmailContent    moz-extension://937918cb-fbf4-4c04-965a-e5572a735921/background.js:704
Error analyzing email: Error: HTTP 404: Not Found
ollamaChatViaTab    moz-extension://937918cb-fbf4-4c04-965a-e5572a735921/background.js:101
```
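An HTTP 404 on the chat call usually means Ollama is reachable but the requested path does not exist, e.g. a mistyped base URL or an Ollama build that predates the `/api/chat` endpoint. A minimal sketch of a status-to-diagnosis helper (the function name and message texts are hypothetical, not taken from background.js):

```javascript
// Hypothetical helper: translate an Ollama HTTP error status into a
// human-readable diagnosis. Not part of the actual background.js.
function diagnoseOllamaError(status, path) {
  switch (status) {
    case 404:
      // Server reached, but the path is unknown: wrong URL, or an
      // Ollama build that predates the /api/chat endpoint.
      return `HTTP 404: ${path} not found - check the endpoint URL ` +
             `and update Ollama if it predates /api/chat`;
    case 403:
      // Ollama rejects requests from origins not listed in
      // the OLLAMA_ORIGINS environment variable.
      return "HTTP 403: request blocked - check OLLAMA_ORIGINS";
    default:
      return `HTTP ${status}: unexpected Ollama error`;
  }
}
```

Surfacing a message like this in the extension's error notification would make reports such as this one easier to triage.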
📋 Steps to Reproduce
1. Right-click any email.
2. Select "Analyze with AI".
3. Observe the error in the Browser Console (Ctrl+Shift+J).
❌ Expected Behavior
The email should be analyzed and assigned labels, or at least produce some visible result.
👀 Actual Behavior
Nothing visible; apart from the console error, the analysis fails silently.
📸 Console Output
Browser Console Log (Ctrl+Shift+J in Thunderbird):
<img width="1327" height="429" alt="Image" src="https://github.com/user-attachments/assets/44c3930e-4f4d-4ce5-8649-e25a6425a1e0" />
🔧 Environment Details
Thunderbird
- Version: 147.0
- OS: Windows
- OS Version: Windows 11
Ollama Setup
- Ollama Version: 0.15.1
- Model Used: tinyllama
- Running on: CPU / GPU (which GPU model?)
- Memory Available: (e.g., 8GB, 16GB)
AutoSort+
- Extension Version: e.g., 1.2.3.1-ollama-test
- Install Method: XPI
✅ Debugging Checklist
- Ollama is running: `curl http://localhost:11434/api/tags` returns models
- Model installed: `ollama list` shows your model
- Test connection passes: Settings → Test Connection works
- Thunderbird restarted after AutoSort+ install
- Console logs checked (Ctrl+Shift+J shows `[Ollama]` messages)
- Email is plaintext (not HTML-only)
- Model responds locally: `ollama run tinyllama "test"`
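The first two checklist items can also be verified in code: `GET /api/tags` returns JSON shaped like `{"models":[{"name":"tinyllama:latest"}]}`, so a small helper (illustrative, not part of the extension) can confirm the model is installed:

```javascript
// Check whether a model appears in the JSON body returned by
// GET http://localhost:11434/api/tags.
// Ollama stores models with a tag suffix (e.g. "tinyllama:latest"),
// so match either the exact name or the name plus any tag.
function modelInstalled(tagsJson, model) {
  return (tagsJson.models || []).some(
    (m) => m.name === model || m.name.startsWith(model + ":")
  );
}

// Example:
const tags = { models: [{ name: "tinyllama:latest" }] };
modelInstalled(tags, "tinyllama"); // true
modelInstalled(tags, "llama3");    // false
```

Note that a missing model would normally surface as a different error than a 404 on the endpoint itself, which is another hint that the request path, not the model, is the problem here.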
🔍 Manual Testing Steps
1. Verify the Ollama API works:
   ```shell
   curl http://localhost:11434/api/tags
   ```
   Should list your installed models.
2. Test a direct API call:
   ```shell
   curl -X POST http://localhost:11434/api/chat \
     -H "Content-Type: application/json" \
     -d '{
       "model": "tinyllama",
       "messages": [{"role": "user", "content": "What is email classification?"}],
       "stream": false
     }'
   ```
   Should return a response from the model.
3. Check model performance:
   ```shell
   ollama run tinyllama "Classify this email: [subject line here]"
   ```
4. Enable verbose logging:
   - Press Ctrl+Shift+J in Thunderbird
   - Analyze an email
   - Copy all `[Ollama]` log entries
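For reference when comparing the curl call in step 2 against what the extension sends, the request/response shapes can be sketched in JavaScript (helper names are illustrative; the response shape is that of a non-streaming Ollama `/api/chat` call):

```javascript
// Build the non-streaming /api/chat request body used in step 2.
function buildChatRequest(model, userContent) {
  return {
    model,
    messages: [{ role: "user", content: userContent }],
    stream: false,
  };
}

// A non-streaming chat response looks like:
// { "message": { "role": "assistant", "content": "..." }, "done": true }
// Pull the model's reply out of it, defaulting to "" if absent.
function extractReply(chatResponse) {
  return chatResponse?.message?.content ?? "";
}
```

If the curl call succeeds but the extension still gets a 404, the difference between the two requests (URL, method, headers) is the place to look.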
📝 Error Message (if applicable)
The full error message and stack trace are included in the Bug Description above.
🎯 Additional Context
- What were you trying to do?
- Does it happen consistently or randomly?
- Have you tried other models?
- Any recent Thunderbird or Ollama updates?
📌 For Developers
How to investigate tab injection issues:
- Check whether the hidden tab at `http://localhost:11434` opens and closes
- Verify that `window.__ollama_result` is populated
- Check the Network tab for a POST to `/api/chat`
- Inspect the JSON structure returned by the Ollama API
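As a point of comparison while debugging, a background script with host permission for `http://localhost:11434` can often call the API directly instead of injecting into a hidden tab. A sketch under that assumption (this is not the actual `ollamaChatViaTab` implementation from background.js):

```javascript
// Sketch of a direct background-script call to Ollama, as an
// alternative to tab injection. Assumes the extension manifest grants
// host permission for http://localhost:11434.
async function ollamaChat(baseUrl, model, prompt) {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  });
  if (!res.ok) {
    // Produces the same "HTTP 404: Not Found" text seen in the log.
    throw new Error(`HTTP ${res.status}: ${res.statusText}`);
  }
  const data = await res.json();
  return data.message.content;
}
```

Running this with the reporter's `baseUrl` would quickly show whether the 404 comes from the path itself or from the tab-injection machinery.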