A Modern Usenet Binary Downloader Written in Rust
Fast, efficient, and built for self-hosters. Full NNTP pipeline with yEnc decoding, PAR2 verification & repair, archive extraction, and a clean web UI. Multi-server failover, connection pooling, and NNTP pipelining for maximum throughput.
Everything you need for Usenet downloads, built from the ground up in Rust
Send multiple ARTICLE commands per connection before reading responses. Configurable pipeline depth per server eliminates round-trip latency.
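The two-phase pattern described here can be sketched in a few lines. This is an illustrative sketch, not rustnzbd's actual client code: `pipeline_articles` and its signature are hypothetical, and a real client would also parse the multi-line article body terminated by `.\r\n`.

```rust
use std::io::{BufRead, Write};

/// Send a batch of ARTICLE commands before reading any responses,
/// saving one round trip per article. Sketch only: a real client also
/// parses the multi-line article body terminated by ".\r\n".
fn pipeline_articles<W: Write, R: BufRead>(
    wire: &mut W,
    responses: &mut R,
    ids: &[&str],
    depth: usize,
) -> std::io::Result<Vec<String>> {
    let mut statuses = Vec::new();
    for batch in ids.chunks(depth.max(1)) {
        // Phase 1: queue every command in the batch without waiting.
        for id in batch {
            write!(wire, "ARTICLE <{id}>\r\n")?;
        }
        wire.flush()?;
        // Phase 2: read one status line per command, in send order.
        for _ in batch {
            let mut line = String::new();
            responses.read_line(&mut line)?;
            statuses.push(line.trim_end().to_string());
        }
    }
    Ok(statuses)
}
```

Because the server answers in order, responses can be matched back to commands by position alone, which is what makes the pipeline depth a simple tuning knob.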
Priority-ordered server list with automatic failover. Articles not found on one server are retried on the next. Optional servers for fill providers.
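The failover loop amounts to walking the server list in priority order and stopping at the first hit. A minimal sketch, with hypothetical types (`Fetch`, `fetch_with_failover`) that stand in for rustnzbd's internals:

```rust
/// Result of asking a single server for an article (hypothetical type;
/// rustnzbd's internals are not shown on this page).
#[derive(Debug, PartialEq)]
enum Fetch {
    Found(Vec<u8>),
    NotFound,
}

/// Walk the priority-ordered server list and return the first hit.
/// Optional fill servers simply sit last in the list.
fn fetch_with_failover(servers: &[(&str, fn(&str) -> Fetch)], id: &str) -> Option<Vec<u8>> {
    for (name, fetch) in servers {
        match fetch(id) {
            Fetch::Found(body) => return Some(body),
            Fetch::NotFound => eprintln!("{name}: <{id}> not found, failing over"),
        }
    }
    None // missing everywhere: a candidate for PAR2 repair later
}
```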
Fast, correct yEnc decoder with CRC32 validation. Handles multi-part articles, escape sequences, and assembles files from decoded segments.
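The core of yEnc decoding is a single pass over the article body: subtract 42 from every byte, with `=` escaping the next byte (subtract a further 64). A sketch of just that inner loop — header/trailer (`=ybegin`/`=yend`) parsing and the CRC32 check that rustnzbd performs are omitted:

```rust
/// Core yEnc byte decoder: each output byte is (input - 42) mod 256,
/// with '=' escaping the next byte as (input - 64 - 42) mod 256.
/// CR/LF line breaks carry no data and are skipped.
fn yenc_decode(encoded: &[u8]) -> Vec<u8> {
    let mut out = Vec::with_capacity(encoded.len());
    let mut iter = encoded.iter().copied();
    while let Some(b) = iter.next() {
        match b {
            b'\r' | b'\n' => continue, // line breaks are transport-only
            b'=' => {
                // Escaped critical byte (NUL, CR, LF, '=' after offset).
                if let Some(esc) = iter.next() {
                    out.push(esc.wrapping_sub(64).wrapping_sub(42));
                }
            }
            _ => out.push(b.wrapping_sub(42)),
        }
    }
    out
}
```

`wrapping_sub` gives the mod-256 arithmetic the format requires without any branching on overflow.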
Automatic PAR2 verification after download. Damaged files are repaired using recovery blocks before extraction. Full pipeline automation.
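Since the page lists `par2` as an external requirement, the repair step presumably shells out to it. A hedged sketch of building such an invocation — the `repair` verb and `-q` (quiet) flag are standard par2cmdline options, but whether rustnzbd passes exactly these arguments is an assumption:

```rust
use std::process::Command;

/// Build an external `par2` repair invocation of the kind post-processing
/// would run when verification reports damage. Sketch only: the exact
/// flags rustnzbd uses are an assumption.
fn par2_repair_cmd(par2_file: &str) -> Command {
    let mut cmd = Command::new("par2");
    cmd.arg("repair").arg("-q").arg(par2_file);
    cmd
}
```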
Automatic extraction of RAR, 7z, and ZIP archives after download and repair. Supports multi-part RAR (new and old naming) with cleanup.
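Handling both RAR naming schemes means recognising `archive.part01.rar` (new) alongside `archive.rar`, `archive.r00`, `archive.r01`, … (old). A sketch of the classification logic — `rar_volume_index` is a hypothetical helper, not rustnzbd's actual code, and it ignores rarities like `.s00` volumes:

```rust
/// Map a filename to its position in a multi-part RAR set, counting from 1.
/// New scheme: archive.partNN.rar. Old scheme: archive.rar is volume 1,
/// then archive.r00, archive.r01, ...
fn rar_volume_index(name: &str) -> Option<u32> {
    let lower = name.to_ascii_lowercase();
    // New scheme: ".partNN.rar"
    if let Some(stem) = lower.strip_suffix(".rar") {
        if let Some(pos) = stem.rfind(".part") {
            let digits = &stem[pos + 5..];
            if !digits.is_empty() && digits.bytes().all(|b| b.is_ascii_digit()) {
                return digits.parse().ok();
            }
        }
        return Some(1); // plain .rar: first volume of the old scheme
    }
    // Old scheme continuation volumes: ".rNN" maps r00 -> 2, r01 -> 3, ...
    if let Some(pos) = lower.rfind(".r") {
        let digits = &lower[pos + 2..];
        if digits.len() == 2 && digits.bytes().all(|b| b.is_ascii_digit()) {
            return digits.parse::<u32>().ok().map(|n| n + 2);
        }
    }
    None
}
```

Sorting volumes by this index gives the extraction order regardless of which naming convention the poster used.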
Responsive single-page web interface. Queue management, download history, server configuration, real-time logs, and drag-and-drop NZB upload.
Full HTTP API with Swagger/OpenAPI documentation. Queue, history, server management, status, and log endpoints. SABnzbd-compatible API layer.
Built-in tracing and metrics export via OpenTelemetry. Ship logs and metrics to Grafana, Jaeger, or any OTLP-compatible backend.
A modular Rust workspace with clean separation of concerns
Built-in benchmarking suite for head-to-head comparison with SABnzbd
rustnzbd ships with benchnzb, a comprehensive benchmarking harness that runs both rustnzbd and SABnzbd through identical scenarios using a mock NNTP server. It generates test data with yEnc-encoded articles, PAR2 recovery files, and 7z archives, then measures download speed, CPU usage, memory consumption, and post-processing time.
Nine scenarios covering raw download, PAR2 repair, and archive extraction
Tests raw NNTP download throughput with 5 GB of yEnc-encoded articles. No post-processing. Measures connection pooling and pipeline efficiency.
Larger raw download to test sustained throughput and memory stability over extended transfers.
Stress test with 50 GB of raw data. Tests memory management and disk I/O at scale.
5 GB download with 5% missing articles. Tests PAR2 verification and repair pipeline after download.
Larger PAR2 repair scenario testing recovery performance at scale.
50 GB with 5% missing articles. Full end-to-end pipeline including download, verify, and repair.
5 GB download followed by 7z extraction. Tests archive detection and extraction pipeline.
Larger extraction scenario testing decompression speed and disk I/O coordination.
50 GB download and extraction. Tests the complete pipeline under heavy load.
All benchmarks run in Docker containers with a mock NNTP server generating yEnc-encoded articles on the fly. Metrics collected via Docker stats API. Full suite and scripts available on GitHub.
Interactive demo with simulated download queue
| Name | Status | Progress | Speed | Size |
|---|---|---|---|---|
| Ubuntu.24.04.Desktop.x64 | Downloading | 67% | 32.1 MB/s | 4.7 GB |
| LibreOffice.7.6.Full.Pack | — | 45% | — | 2.1 GB |
| Blender.4.0.Benchmark.Scenes | Downloading | 23% | 16.1 MB/s | 8.3 GB |
| PostgreSQL.16.Docs.Pack | Completed | 100% | — | 892 MB |
Port: 8080 (Web UI + API)
Config: Edit `config.toml` to add your Usenet server(s), or configure via the web UI
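As a sketch, a server entry in `config.toml` might look like the following. Every key name here is an assumption for illustration; check the shipped `config.toml` for the real schema.

```toml
# Hypothetical server entry — key names are illustrative,
# not rustnzbd's actual schema; consult the shipped config.toml.
[[servers]]
host = "news.example.com"
port = 563
tls = true
username = "user"
password = "secret"
connections = 20
priority = 0       # lower values are tried first
optional = true    # fill server: only queried for missing articles
```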
Requirements: Rust 1.85+ (2024 edition), `par2` and `p7zip` for post-processing
Benchmarks: Run `cd benchnzb && ./run.sh --scenarios quick` to compare against SABnzbd
From NZB file to extracted content — fully automated
XML parsing extracts article message IDs, file segments, groups, and metadata. Password support via `<meta>` tags.
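To make the parsing step concrete, here is a deliberately naive sketch that pulls only the segment message-IDs out of an NZB document with string scanning. It is illustrative, not rustnzbd's parser: a real implementation uses proper XML parsing and also reads file subjects, groups, segment byte counts and numbers, and the `<meta>` password tags.

```rust
/// Pull article message-IDs out of an NZB document. Illustrative only:
/// real code needs a proper XML parser and the rest of the metadata.
fn nzb_message_ids(nzb: &str) -> Vec<String> {
    let mut ids = Vec::new();
    let mut rest = nzb;
    while let Some(start) = rest.find("<segment") {
        rest = &rest[start..];
        // End of the opening tag (skips the bytes/number attributes).
        let Some(open_end) = rest.find('>') else { break };
        let Some(close) = rest.find("</segment>") else { break };
        ids.push(rest[open_end + 1..close].trim().to_string());
        rest = &rest[close + "</segment>".len()..];
    }
    ids
}
```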
NNTP connections fetch articles with pipelining. Multi-server failover retries missing articles on backup servers.
yEnc-encoded article bodies are decoded with CRC32 validation. Segments are assembled into complete files.
PAR2 checks file integrity. If blocks are missing, recovery data is used to reconstruct damaged files.
Archives (RAR, 7z, ZIP) are extracted. Temporary files (par2, rar volumes) are cleaned up. Output moved to final directory.
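The five stages above can be summarized as a simple state machine. The names below are illustrative, not rustnzbd's internal types:

```rust
/// The stages of the pipeline described above, in order.
/// Illustrative names only — not rustnzbd's internal types.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Stage {
    ParseNzb,     // read message IDs, segments, groups, metadata
    Download,     // pipelined NNTP fetch with multi-server failover
    Decode,       // yEnc decode + CRC32, assemble segments into files
    VerifyRepair, // PAR2 verify, repair from recovery blocks if needed
    Extract,      // unpack RAR/7z/ZIP, clean up, move to final directory
    Done,
}

impl Stage {
    /// Advance to the next stage; Done is terminal.
    fn next(self) -> Stage {
        match self {
            Stage::ParseNzb => Stage::Download,
            Stage::Download => Stage::Decode,
            Stage::Decode => Stage::VerifyRepair,
            Stage::VerifyRepair => Stage::Extract,
            Stage::Extract => Stage::Done,
            Stage::Done => Stage::Done,
        }
    }
}
```

Modeling the pipeline as explicit states is one straightforward way to resume a job after a restart: persist the current stage and re-enter the machine there.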