

This guide will walk you through setting up SABnzbd to work with UseNetServer news servers in a parallelized configuration.

I run Sabnzbd for downloading large files from usenet on a VPS at Digital Ocean, where I'm limited to about 40MB/s on downloads. It doesn't take long to download at all, but out of curiosity I wanted to see if I could parallelize this and download multiple files at the same time.

I use Sonarr to search usenet for freely distributable training videos, and it sends what it finds to SABnzbd for downloading. Since Sonarr can send multiple files to sabnzbd, where they get queued up, I figured I could reduce the queue by downloading them at the same time.

Using Ansible and Terraform (devops automation tools), I can spin up VPS instances on demand, provision them, configure them as sabnzbd download nodes, and then destroy the instances when complete. The instances all run the same sabnzbd config and use haproxy for round-robin distribution. I will probably change this to Consul, but I just wanted something quick, so I used a basic haproxy config. It configures a VIP which I point Sonarr to.

Terraform builds 4 sabnzbd, 1 haproxy, and 1 ELK instance. Here's the terraform config that builds a Sabnzbd server, and a terraform play to provision 6 new hosts (1 Elasticsearch, 1 HAproxy and 4 Sabnzbd).

Here is the ansible playbook that configures the sabnzbd instances. Its tasks:

- Yum: name=epel-release state=present update_cache=yes
- name: TASK | Create /apps/data/.sabnzbd directory
- name: TASK | Change owner of SAB to usenet
- name: TASK | Install systemd unit file for SAB
- Template: src="templates/2" dest="/etc/systemd/system/rvice" mode=0644
- Template: src="templates/sabnzbd_2" dest="/apps/data/.sabnzbd/sabnzbd_config.ini" mode=0644
- name: TASK | Enable and start SAB service
- Template: src="templates/2" dest="/etc/filebeat/filebeat.

If you haven't already signed up for Usenet access, then take a look at UNS.
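The original Terraform listings did not survive, but the text describes building 6 DigitalOcean hosts (4 Sabnzbd, 1 HAproxy, 1 ELK). A minimal sketch of what that might look like with the DigitalOcean provider's `digitalocean_droplet` resource follows; the region, sizes, image, and variable names are assumptions, not taken from the original:

```hcl
# Hypothetical sketch: 4 sabnzbd nodes, 1 haproxy, 1 ELK droplet.
# Region, size, image, and var.ssh_key_fingerprint are assumed values.
resource "digitalocean_droplet" "sabnzbd" {
  count    = 4
  name     = "sabnzbd-${count.index}"
  region   = "nyc3"
  size     = "s-2vcpu-4gb"
  image    = "centos-7-x64"
  ssh_keys = [var.ssh_key_fingerprint]
}

resource "digitalocean_droplet" "haproxy" {
  name     = "haproxy-0"
  region   = "nyc3"
  size     = "s-1vcpu-2gb"
  image    = "centos-7-x64"
  ssh_keys = [var.ssh_key_fingerprint]
}

resource "digitalocean_droplet" "elk" {
  name     = "elk-0"
  region   = "nyc3"
  size     = "s-2vcpu-4gb"
  image    = "centos-7-x64"
  ssh_keys = [var.ssh_key_fingerprint]
}
```

With a layout like this, `terraform apply` brings the 6 hosts up and `terraform destroy` tears the download nodes down when the queue is drained, which is the spin-up/destroy workflow the post describes.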
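The basic haproxy config mentioned above is also lost. A round-robin VIP in front of the 4 sabnzbd nodes could be sketched as below; the backend addresses are assumptions, and 8080 is SABnzbd's default web/API port:

```
# Hypothetical haproxy.cfg fragment: round-robin VIP for 4 sabnzbd nodes.
# Backend IPs are placeholders; Sonarr points at the frontend address.
frontend sabnzbd_vip
    bind *:8080
    default_backend sabnzbd_nodes

backend sabnzbd_nodes
    balance roundrobin
    server sab1 10.0.0.11:8080 check
    server sab2 10.0.0.12:8080 check
    server sab3 10.0.0.13:8080 check
    server sab4 10.0.0.14:8080 check
```

Because all nodes run the same sabnzbd config, Sonarr can keep sending NZBs to the one VIP and each submission lands on the next node in rotation.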
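Only the task fragments listed above survive from the playbook, so the following is a hedged reconstruction, not the author's file: module argument values for the `file` and `systemd` tasks are inferred, the unnamed template task is given an assumed name, and the truncated template/dest paths are kept exactly as they appear in the fragments:

```yaml
# Hypothetical reconstruction from the surviving task fragments.
# Truncated src/dest paths are preserved as-is from the source.
- hosts: sabnzbd
  become: yes
  tasks:
    - name: TASK | Install EPEL repo            # name inferred for this task
      yum: name=epel-release state=present update_cache=yes

    - name: TASK | Create /apps/data/.sabnzbd directory
      file: path=/apps/data/.sabnzbd state=directory

    - name: TASK | Change owner of SAB to usenet
      file: path=/apps/data/.sabnzbd owner=usenet group=usenet recurse=yes

    - name: TASK | Install SAB config           # name inferred for this task
      template: src=templates/sabnzbd_2 dest=/apps/data/.sabnzbd/sabnzbd_config.ini mode=0644

    - name: TASK | Install systemd unit file for SAB
      template: src=templates/2 dest=/etc/systemd/system/rvice mode=0644

    - name: TASK | Enable and start SAB service
      systemd: name=sabnzbd enabled=yes state=started daemon_reload=yes
```

The Elasticsearch/filebeat template fragment (`dest="/etc/filebeat/filebeat.`) suggests a separate play shipped sabnzbd logs to the ELK instance, but too little of it survives to reconstruct.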
