Embedding Blueprint

Decentralized text embeddings on Tangle — operators serve embedding models via HuggingFace TEI or any OpenAI-compatible endpoint.

Overview

A Tangle Blueprint enabling operators to serve text embedding models with anonymous payments through shielded credits. High-volume, low-cost embeddings for RAG, search, and classification workloads.

Dual payment paths:

  • On-chain jobs via TangleProducer — verifiable results on Tangle
  • x402 HTTP — fast private embeddings at /v1/embeddings

Compatible with the OpenAI Embeddings API. Built with the Blueprint SDK, with optional TEE support.
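Because the endpoint is OpenAI-compatible, any OpenAI embeddings client can talk to an operator's `/v1/embeddings` route. A minimal stdlib-only sketch is below; `OPERATOR_URL` and the model name are illustrative assumptions, and the x402 payment handshake is omitted:

```python
import json
import urllib.request

# Hypothetical operator URL -- substitute a real operator's advertised endpoint.
OPERATOR_URL = "http://localhost:9000"

# The request body follows the OpenAI embeddings schema: a model name plus
# one or more input strings. The model shown is one from the supported list.
payload = {
    "model": "BAAI/bge-large-en-v1.5",
    "input": ["tangle network", "decentralized embeddings"],
}

req = urllib.request.Request(
    f"{OPERATOR_URL}/v1/embeddings",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Against a running operator, the OpenAI-style response carries one
# {"embedding": [...], "index": i} object per input string:
# with urllib.request.urlopen(req) as resp:
#     vectors = [item["embedding"] for item in json.load(resp)["data"]]
```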

Components

| Component | Language | Description |
| --- | --- | --- |
| `operator/` | Rust | Operator binary — wraps TEI or any embedding endpoint, HTTP server, SpendAuth billing |
| `contracts/` | Solidity | `EmbeddingBSM` — dimension validation, per-1K-token pricing |

Supported Models

Any model compatible with HuggingFace Text Embeddings Inference:

  • BGE (large, base, small)
  • E5 (large, base, small)
  • Jina Embeddings v3
  • GTE (Qwen)
  • Nomic Embed

Pricing

Billing is per 1,000 tokens. Operators compete on price — typically 5-10x cheaper than OpenAI's embedding API.
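To make the per-1K-token model concrete, a small sketch — the rate below is purely illustrative, not a quoted operator price:

```python
# Illustrative operator rate in USD per 1,000 tokens -- NOT a real quote.
PRICE_PER_1K_TOKENS = 0.00002

def embedding_cost(total_tokens: int) -> float:
    """Cost of embedding a workload billed at a per-1K-token rate."""
    return total_tokens / 1_000 * PRICE_PER_1K_TOKENS

# Embedding a 5M-token corpus: 5,000 * $0.00002 = $0.10
print(f"${embedding_cost(5_000_000):.2f}")
```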

TEE Support

Add `features = ["tee"]` to the `blueprint-sdk` dependency in `Cargo.toml`. The `TeeLayer` middleware transparently attaches attestation metadata when running in a Confidential VM, and passes requests through unchanged when no TEE is configured.
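In `Cargo.toml`, that amounts to something like the following (the version is a placeholder, not a pinned release):

```toml
[dependencies]
# "tee" enables the TeeLayer attestation middleware; version is illustrative.
blueprint-sdk = { version = "*", features = ["tee"] }
```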

Quick Start

```shell
# Start TEI (example with BGE-large)
docker run -p 8080:80 ghcr.io/huggingface/text-embeddings-inference:latest \
  --model-id BAAI/bge-large-en-v1.5

# Configure operator
cp config/operator.example.toml config/operator.toml

# Run operator
EMBEDDING_ENDPOINT=http://localhost:8080 cargo run --release
```
