
subsyt

description

subsyt is a wrapper around yt-dlp to download YouTube channels based on your subscriptions. Downloads land in an isolated staging directory before being organised into {show}/{season} folders under your configured media library. During the organising step the tool generates nfo files, extracts thumbnails, and downloads posters, banners, and fanart, so the media plugs cleanly into media libraries such as Jellyfin and Kodi.

A quick rundown on how to use it:

  • download subsyt or build the binary yourself
  • install yt-dlp
  • set it up with POT support (optional but recommended)
  • set up a config file
  • run subsyt

Subscriptions are managed via the subsytctl CLI or HTTP API and stored in a local SQLite database.

install

go install git.meatbag.se/varl/subsyt@latest

yt-dlp

Install uv on your system.

curl -LsSf https://astral.sh/uv/install.sh | sh

Then install yt-dlp:

uv tool install yt-dlp

Plugins (e.g. yt-dlp-ejs, curl-cffi) are configured via the plugins field in config.json and installed automatically on each sync cycle.

running

Configuration is loaded from a file specified either by the CONFIG environment variable or the --config flag. The --config flag takes priority over the CONFIG environment variable.

CONFIG="/path/to/config.json" ./subsyt

./subsyt --config="/path/to/config"

./subsyt    # assumes "./config.json"

build

We want a statically linked binary, so disable CGO:

CGO_ENABLED=0 go build

Or use the justfile to build both subsyt and subsytctl:

just build            # outputs to build/
just install          # installs subsytctl to /usr/local/bin

config

Full config.json:

{
    "daemon": true,
    "dry_run": true,
    "log_file": "/data/log/subsyt.log",
    "db_file": "/data/db/subsyt.db",
    "download_dir": "./vids/_staging",
    "media_dir": "./vids",
    "http_api": {
        "enable": true,
        "listen": "0.0.0.0:6901",
        "queue_file": "./api-queue.json"
    },
    "provider": {
        "youtube": {
            "verbose": false,
            "cmd": "./yt-dlp",
            "plugins": ["yt-dlp-ejs", "curl-cffi", "bgutil-ytdlp-pot-provider"],
            "format": "best",
            "format_sort": "res:1080",
            "output_path_template": "s%(upload_date>%Y)s/%(channel)s.s%(upload_date>%Y)Se%(upload_date>%m%d)S.%(title)s.%(id)s.%(ext)s",
            "url": "https://www.youtube.com",
            "video_url_template": "",
            "match_filters": "!is_live & duration>?60",
            "throttle": 5,
            "cookies_file": "",
            "subscription_file": "",
            "po_token": "",
            "bgutil_server": "http://127.0.0.1:4416",
            "player_client": ""
        }
    }
}

Minimal config.json:

{
    "download_dir": "./vids/_staging",
    "media_dir": "./vids",
    "provider": {
        "youtube": {
            "cmd": "./yt-dlp",
            "url": "https://www.youtube.com",
            "throttle": 5
        }
    }
}

migration

Existing deployments that used to read media directly from download_dir should be migrated manually:

  1. Stop the daemon or API workers so new downloads pause.
  2. Back up your current media_dir and staging tree.
  3. Move the contents of the legacy download_dir/shows and download_dir/episodes directories into the new media_dir, sorted by show and season as desired.
  4. Optionally re-run subsyt with dry_run=true to confirm the new layout before enabling writes again.
  5. Once satisfied, delete the obsolete per-show/per-episode staging directories under download_dir.

The application now keeps raw downloads inside download_dir and writes the organised library exclusively to media_dir.

subsytctl

subsytctl is the CLI for managing the server. It requires a running subsyt daemon with the HTTP API enabled.

export SUBSYT_HOST=subsyt.home.arpa:6901
export SUBSYT_KEY=<auth-token>

Commands:

subsytctl status                        show server status
subsytctl sync-rss                      trigger RSS feed discovery
subsytctl sync-dl                       trigger download of scheduled episodes
subsytctl list [id]                     list subscriptions, or episodes for a channel
subsytctl subscribe [--provider NAME] [--title TITLE] <channel_id|url>
                                        add a subscription (default provider: youtube)
subsytctl unsubscribe <id>              remove a subscription by ID
subsytctl add <url> [out_dir]           enqueue a video for download
subsytctl scheduled                     list episodes scheduled for download
subsytctl skip <video_id>               skip a scheduled episode
subsytctl scrub [--dry-run]             remove watched, non-favourite episodes
subsytctl rename <id> <new-title>       rename a channel's media directory
subsytctl redownload <video_id>         re-download a video by ID
subsytctl rotate-key                    rotate the API auth token

cookies

Warning

Your account MAY be banned when using cookies! Consider using a throw-away account.

Install an extension that can download cookies per site, e.g. for firefox: https://addons.mozilla.org/en-US/firefox/addon/cookies-txt/

The steps in the browser are:

  1. install cookie export extension, allow in private mode
  2. open a private browsing session (e.g. incognito)
  3. go to youtube.com and login using a (throw-away) account
  4. export the cookies using extension, save to disk
  5. close private browsing session
  6. point cookies_file in config.json to the cookies-file

Cookies may need to be refreshed when they expire; if so, repeat steps 2-5.

You can also have yt-dlp do it for you, though that exports all the cookies in the browser:

yt-dlp --cookies-from-browser {browser} --cookies cookies.txt

pot

YouTube has started requiring proof-of-origin tokens (POT) for some player clients; supplying one together with cookies may help avoid the "sign in to confirm you are not a bot" error.

Either add a manually generated POT to the config: "po_token": "{TOKEN}", or set up bgutil with the YouTube extractor to generate POTs automatically, in which case leave po_token as an empty string ("").

Add bgutil-ytdlp-pot-provider to the plugins list in your config.json (it is included in the full config example above).

On the same machine, run the bgutils http server, e.g. with compose:

bgutil:
  image: brainicism/bgutil-ytdlp-pot-provider
  container_name: bgutil
  restart: unless-stopped
  ports:
    - 4416:4416

If it runs on the default port and is reachable on localhost, yt-dlp picks up the plugin automatically; this can be verified in the logs.

scheduling

When daemon is true, subsyt runs its own scheduler internally: RSS feeds are checked and downloads are triggered on a jittered daily cycle (around 0400). Use subsytctl sync-rss or subsytctl sync-dl to trigger either manually.

systemd (non-daemon)

For one-shot mode (daemon is false), a systemd timer can drive the schedule:

~/.config/systemd/user/subsyt.service

[Unit]
Description=subsyt archival

[Service]
Type=oneshot
ExecStart=/home/varl/yt/subsyt --config=/home/varl/yt/config.json
WorkingDirectory=/home/varl/yt

~/.config/systemd/user/subsyt.timer

[Unit]
Description=subsyt archival on boot and daily

[Timer]
OnCalendar=*-*-* 4:00:00
Persistent=true
AccuracySec=1us
RandomizedDelaySec=30

[Install]
WantedBy=timers.target

container

podman run --rm \
    --volume=path/to/data:/data \
    registry.meatbag.se/varl/subsyt

compose

Runs in daemon mode (scheduled sync around 0400 daily), with automatic POT generation via bgutil.

services:
  subsyt:
    image: registry.meatbag.se/varl/subsyt:latest
    container_name: subsyt
    user: "1000:1000"
    ports:
      - 6901:6901
    volumes:
      - /opt/subsyt/data:/data

  bgutil:
    image: brainicism/bgutil-ytdlp-pot-provider
    container_name: bgutil
    restart: unless-stopped
    ports:
      - 4416:4416

http api

Enable the built-in HTTP API to manage subscriptions and queue ad-hoc downloads. An auth token is generated on first run and printed to the log.

{
    "http_api": {
        "enable": true,
        "listen": "0.0.0.0:6901",
        "queue_file": "./api-queue.json"
    }
}

Submit new downloads with bearer authentication:

curl \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  --data '{"url":"https://youtu.be/VIDEO","out_dir":"Channel"}' \
  http://127.0.0.1:6901/v1/videos

Check server status (sync timing, subscription/episode counts, queue):

curl -H "Authorization: Bearer <token>" http://127.0.0.1:6901/status

Example response:

{
  "rss_sync": {
    "running": false,
    "next": "2025-03-12T04:15:30Z",
    "last_started": "2025-03-11T14:32:15Z",
    "last_ended": "2025-03-11T14:48:32Z",
    "last_duration_seconds": 977
  },
  "download_sync": {
    "running": false,
    "next": "2025-03-12T07:22:10Z",
    "last_started": "2025-03-11T15:01:00Z",
    "last_ended": "2025-03-11T15:12:45Z",
    "last_duration_seconds": 705,
    "last_downloaded": 3,
    "last_failed": 0
  },
  "subscriptions": 12,
  "episodes": 487,
  "scheduled": 5,
  "queue": { "count": 0, "items": [] }
}

result

.
├── Technology Connextras
│   ├── archive.txt
│   ├── fanart.jpg
│   ├── poster.jpg
│   ├── s2024
│   │   ├── Technology_Connextras.s2024e0611.Connextras_dishwasher_follow_up_the_sequel.0Kp3bjm55xw-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2024e0611.Connextras_dishwasher_follow_up_the_sequel.0Kp3bjm55xw-1080p.nfo
│   │   ├── Technology_Connextras.s2024e0611.Connextras_dishwasher_follow_up_the_sequel.0Kp3bjm55xw-1080p.webm
│   │   ├── Technology_Connextras.s2024e0712.Here_s_what_Numitron_tubes_in_an_actual_product_look_like.XgzL05Gojfw-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2024e0712.Here_s_what_Numitron_tubes_in_an_actual_product_look_like.XgzL05Gojfw-1080p.nfo
│   │   ├── Technology_Connextras.s2024e0712.Here_s_what_Numitron_tubes_in_an_actual_product_look_like.XgzL05Gojfw-1080p.webm
│   │   ├── Technology_Connextras.s2024e0909.Answering_your_pinball_questions_-_Williams_Aztec_Q_A.P3Y4d2aHnNE-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2024e0909.Answering_your_pinball_questions_-_Williams_Aztec_Q_A.P3Y4d2aHnNE-1080p.nfo
│   │   └── Technology_Connextras.s2024e0909.Answering_your_pinball_questions_-_Williams_Aztec_Q_A.P3Y4d2aHnNE-1080p.webm
│   ├── s2025
│   │   ├── Technology_Connextras.s2025e0330.Renewable_energy_means_we_can_stop_setting_money_on_fire_silly_billy.Y2qSaD1v4cQ-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2025e0330.Renewable_energy_means_we_can_stop_setting_money_on_fire_silly_billy.Y2qSaD1v4cQ-1080p.nfo
│   │   ├── Technology_Connextras.s2025e0330.Renewable_energy_means_we_can_stop_setting_money_on_fire_silly_billy.Y2qSaD1v4cQ-1080p.webm
│   │   ├── Technology_Connextras.s2025e0331.An_unplanned_trip_from_Chicago_to_Milwaukee_in_an_electric_car.3GUQdrpduo0-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2025e0331.An_unplanned_trip_from_Chicago_to_Milwaukee_in_an_electric_car.3GUQdrpduo0-1080p.nfo
│   │   └── Technology_Connextras.s2025e0331.An_unplanned_trip_from_Chicago_to_Milwaukee_in_an_electric_car.3GUQdrpduo0-1080p.webm
│   └── tvshow.nfo