Built in Rust for Mission-Critical Apps

Institutional Financial
Sentiment Intelligence.

Institutional-grade sentiment engine built for speed. Analysis cascades through a Three-Tier Intelligence Failover: local FinBERT ONNX pipelines, real-time news feeds, and LLM fallbacks.

Engine Status: Optimal
Throughput: 1.2M events/sec
Tier 1: Local ONNX Inference (FinBERT) ACTIVE
Tier 2: Real-time News Feeds (Sequential) STANDBY
Tier 3: LLM Intelligence (Grok/DeepSeek) STANDBY
[SYS] Initializing high-scale batch processing...
[SYS] Connection established with Intelligence Cluster.
[DATA] Batch ID 9283-X processed in 0.04ms.
[DATA] Sentiment Vector: [0.892, -0.12, 0.45]

The Intelligence Failover Strategy

Trading bots cannot afford downtime or hallucinations. TierPulse ensures reliability by cascading analysis across three distinct layers.

High-Scale Batching

Processes millions of tokens per second while Rust's ownership model keeps the high-concurrency pipeline memory safe. Requests to news providers and LLMs are batched automatically to minimize latency.

Multi-Tier Caching

Hybrid caching strategy: Moka for fast in-memory lookups, Redis for distributed coordination. Scalable performance without bottlenecks.

Traffic Control

A built-in "Token Bucket" rate limiter protects upstream API quotas and supports the engine's "Five Nines" (99.999%) availability target for critical missions.

Why Rust?

Memory Safety: Elimination of data races in high-concurrency sentiment streams.

Zero-Bloat Deployment: Multi-stage distroless builds resulting in a container footprint around 400MB.

Targeted Availability: Engineering towards "Five Nines" (99.999%) uptime for institutional missions.
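As a rough sketch of what such a multi-stage distroless build can look like (the binary name `tierpulse`, the base image tags, and the exposed port are assumptions for illustration, not taken from the repository):

```dockerfile
# Stage 1: build a release binary with the full Rust toolchain
FROM rust:1 AS builder
WORKDIR /app
COPY . .
RUN cargo build --release

# Stage 2: ship only the binary on a distroless base (no shell, no package manager)
FROM gcr.io/distroless/cc-debian12
COPY --from=builder /app/target/release/tierpulse /tierpulse
EXPOSE 8080
ENTRYPOINT ["/tierpulse"]
```

The toolchain, sources, and intermediate artifacts stay in the builder stage; only the compiled binary and the minimal runtime libraries reach the final image.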

// TierPulse Engine Orchestration Layer
pub struct AppState {
  engine: InferenceEngine, // Tier 1: Local ONNX
  limiter: RateLimiter, // governor protection
  cache: Cache<String, SentimentResult>,
}

impl AppState {
  pub async fn analyze(&self, req: AnalyzeRequest) -> Result<SentimentResult, Error> {
    // 1. Enforce Traffic Control
    self.limiter.check().await?;
    
    // 2. Serve from cache when possible
    if let Some(cached) = self.cache.get(&req.ticker).await {
      return Ok(cached);
    }
    
    // 3. Sequential Intelligence Failover:
    //    Tier 1 (Local ONNX) -> Tier 2 (Providers) -> Tier 3 (LLMs)
    self.fetch_with_failover(req).await
  }
  }
}

Deploy with Compose

The recommended way to deploy TierPulse is using Docker Compose. A ready-to-use docker-compose.yml is included at the repository root for quick orchestration of the engine and local cache layers.

Recommended: Launch Stack
docker compose up -d
Verify Health
curl -s http://localhost:8080/health/ready
Quick Start Environment (.env)
TP_TIINGO_KEY=your_key_here
TP_AUTH_MODE=api_key
TP_AUTH_API_KEYS=tenant:key
TP_PRIMARY_LLM=grok
# Optional: TP_REDIS_URL, TP_GROK_KEY, TP_DEEPSEEK_KEY, etc.

API Request Payload

POST /api/v1/analyze

{
  "symbols": [
    { "ticker": "AAPL", "name": "Apple Inc." },
    { "ticker": "TSLA", "name": "Tesla, Inc." },
    { "ticker": "BTC", "name": "Bitcoin" }
  ],
  "lookback_hours": 24,
  "max_articles_per_symbol": 5
}

Standardized Response (JSON, HTTP 200)

{
  "request_id": "tp_550e8400-e29b-41d4-a716-446655440000",
  "results": [
    {
      "symbol": "AAPL",
      "sentiment_score": 0.82,
      "label": "bullish",
      "confidence": 0.94,
      "source_tier": "tier_1_local_onnx"
    },
    {
      "symbol": "BTC",
      "sentiment_score": -0.45,
      "label": "bearish",
      "confidence": 0.88,
      "source_tier": "tier_3_llm",
      "reasoning": "Recent regulatory tightening in EU..."
    }
  ],
  "execution_time_ms": 450
}

Support the Maintainers

Help us keep TierPulse the fastest financial sentiment engine in the open-source ecosystem.

☕ Support TierPulse Development