# subsyt

subsyt is a wrapper around yt-dlp to download YouTube channels based on
your subscriptions. Downloads land in an isolated staging directory
before being organised into {show}/{season} folders under your
configured media library. During the organising step the tool generates
NFO files, extracts thumbnails, and downloads posters, banners, and
fanart, so the media plugs into media libraries such as Jellyfin and
Kodi well enough.
A quick rundown on how to use it:

- download subsyt, or build it into a binary yourself
- install yt-dlp
- patch it with POT support (optional, yet recommended)
- set up a config file
- run subsyt
Subscriptions are managed via the subsytctl CLI or HTTP API and stored
in a local SQLite database.
## install

```sh
go install git.meatbag.se/varl/subsyt@latest
```

### yt-dlp

Install pipx on your system.

```sh
sudo apt install pipx           # debian
sudo pacman -Syu python-pipx    # archlinux
```

Then install yt-dlp with pipx:

```sh
pipx install yt-dlp
```
## running

Configuration can be loaded from a file specified either by the CONFIG
environment variable or the --config flag. The --config flag takes
priority over the CONFIG environment variable.

```sh
CONFIG="/path/to/config.json" ./subsyt
./subsyt --config="/path/to/config.json"
./subsyt                        # assumes "./config.json"
```
## build

We want a statically linked binary, so disable CGO:

```sh
CGO_ENABLED=0 go build
```
## config

Full config.json:

```json
{
  "daemon": true,
  "dry_run": true,
  "log_file": "/data/log/subsyt.log",
  "db_file": "/data/db/subsyt.db",
  "download_dir": "./vids/_staging",
  "media_dir": "./vids",
  "http_api": {
    "enable": true,
    "listen": "0.0.0.0:6901",
    "queue_file": "./api-queue.json"
  },
  "provider": {
    "youtube": {
      "verbose": false,
      "cmd": "./yt-dlp",
      "format": "best",
      "format_sort": "res:1080",
      "output_path_template": "s%(upload_date>%Y)s/%(channel)s.s%(upload_date>%Y)Se%(upload_date>%m%d)S.%(title)s.%(id)s.%(ext)s",
      "url": "https://www.youtube.com",
      "throttle": 5,
      "cookies_file": "",
      "subscription_file": "",
      "po_token": "",
      "bgutil_server": "http://127.0.0.1:4416",
      "player_client": ""
    }
  }
}
```

Minimal config.json:

```json
{
  "download_dir": "./vids/_staging",
  "media_dir": "./vids",
  "provider": {
    "youtube": {
      "cmd": "./yt-dlp",
      "url": "https://www.youtube.com",
      "throttle": 5
    }
  }
}
```
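Before starting subsyt, it can be worth sanity-checking that the config
file parses as JSON at all. A minimal sketch, assuming python3 is
installed (its stdlib json.tool is used purely as a JSON validator);
the file written here is the minimal config above:

```shell
# Write the minimal config to a temp file and check it parses as JSON.
cfg="$(mktemp)"
cat > "$cfg" <<'EOF'
{
  "download_dir": "./vids/_staging",
  "media_dir": "./vids",
  "provider": {
    "youtube": {
      "cmd": "./yt-dlp",
      "url": "https://www.youtube.com",
      "throttle": 5
    }
  }
}
EOF
python3 -m json.tool "$cfg" > /dev/null && echo "config OK"
```

A trailing comma or missing brace makes the check fail before subsyt
ever sees the file.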
## migration

Existing deployments that used to read media directly from
download_dir should be migrated manually:

1. Stop the daemon or API workers so new downloads pause.
2. Back up your current media_dir and staging tree.
3. Move the contents of the legacy download_dir/shows and
   download_dir/episodes directories into the new media_dir, sorted by
   show and season as desired.
4. Optionally re-run subsyt with dry_run=true to confirm the new
   layout before enabling writes again.
5. Once satisfied, delete the obsolete per-show/per-episode staging
   directories under download_dir.

The application now keeps raw downloads inside download_dir and writes
the organised library exclusively to media_dir.
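The move itself can be sketched as a small script. This is illustrative
only: it recreates a hypothetical legacy layout in a temp directory,
and the show name and paths are made up, not taken from a real
deployment.

```shell
# Recreate a hypothetical legacy layout in a temp dir.
root="$(mktemp -d)"
mkdir -p "$root/download_dir/shows/Some_Channel/s2024" "$root/media_dir"
touch "$root/download_dir/shows/Some_Channel/s2024/episode.webm"

# Move each per-show directory out of the legacy staging area
# into the new media_dir.
for show in "$root/download_dir/shows"/*; do
  mv "$show" "$root/media_dir/"
done

# The obsolete staging directory is now empty and can be removed.
rmdir "$root/download_dir/shows"
ls "$root/media_dir"
```

On a real deployment, run it against a backup first and compare the
resulting tree against what subsyt produces with dry_run=true.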
## subsytctl

subsytctl is the CLI for managing the server. It requires a running
subsyt daemon with the HTTP API enabled.

```sh
export SUBSYT_HOST=subsyt.home.arpa:6901
export SUBSYT_KEY=<auth-token>
```

Commands:

```
subsytctl status                       show sync timing, counts, and queue
subsytctl sync                         trigger a subscription sync
subsytctl list [id]                    list subscriptions, or episodes for a channel
subsytctl subscribe <channel_id|url>   add a subscription
subsytctl unsubscribe <id>             remove a subscription by ID
subsytctl add <url> [out_dir]          enqueue a video for download
subsytctl scrub [--dry-run]            remove watched, non-favourite episodes
subsytctl rename <id> <new-title>      rename a channel's media directory
subsytctl redownload <video_id>        re-download a video by ID
subsytctl rotate-key                   rotate the API auth token
```
## cookies

> **Warning**
> Your account MAY be banned when using cookies! Consider using a
> throw-away account.

Install an extension that can export cookies per site, e.g. for
Firefox: https://addons.mozilla.org/en-US/firefox/addon/cookies-txt/

The steps in the browser are:

1. install the cookie export extension, and allow it in private mode
2. open a private browsing session (e.g. incognito)
3. go to youtube.com and log in using a (throw-away) account
4. export the cookies using the extension, and save them to disk
5. close the private browsing session
6. point cookies_file in config.json to the cookies file

Cookies may need to be refreshed if/when they expire; if so, repeat
steps 2-5.

You can also use yt-dlp to do it for you, though that exports all the
cookies in the browser:

```sh
yt-dlp --cookies-from-browser {browser} --cookies cookies.txt
```
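A quick way to sanity-check an exported file is its first line: yt-dlp
expects the Netscape cookie file format, which starts with a fixed
header. A small sketch, using a dummy file and a made-up cookie line
rather than real exported data:

```shell
# Create a dummy cookies file in Netscape format, then check its header.
f="$(mktemp)"
printf '# Netscape HTTP Cookie File\n' > "$f"
# Fields: domain, include-subdomains, path, secure, expiry, name, value.
printf '.youtube.com\tTRUE\t/\tTRUE\t0\tSESSION\tdummy\n' >> "$f"

if head -n 1 "$f" | grep -q 'Netscape HTTP Cookie File'; then
  echo "cookies file header looks valid"
fi
```

If a browser extension exports JSON instead, yt-dlp will reject the
file; this check catches that before a download run fails.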
## pot

YouTube has started requiring proof-of-origin tokens (POT) for some
players; together with cookies, a POT may help avoid the "sign in to
confirm you are not a bot" error.

Either add a manually generated POT to the config, po_token = "{TOKEN}",
or set up bgutil with the YouTube extractor to do POT generation
automatically, in which case leave po_token as an empty string ("").

```sh
# assumes pipx was used to install yt-dlp
pipx inject yt-dlp bgutil-ytdlp-pot-provider
```

Then change the provider.youtube cmd option to the yt-dlp binary in
the modified venv, e.g. /home/varl/.local/bin/yt-dlp.
On the same machine, run the bgutil HTTP server, e.g. with compose:

```yaml
bgutil:
  image: brainicism/bgutil-ytdlp-pot-provider
  container_name: bgutil
  restart: unless-stopped
  ports:
    - 4416:4416
```

If the default port is used and the server is available on localhost,
yt-dlp picks up the plugin automatically, which can be verified in the
logs.
## scheduling

### systemd

> **Tip**
> Remember to change the ExecStart path to the venv'ed yt-dlp binary
> if using it.

~/.config/systemd/user/subsyt-archival.service:

```ini
[Unit]
Description=subsyt archival of yt subscriptions

[Service]
Type=oneshot
ExecStart=/home/varl/yt/yt-dlp -U
ExecStart=/home/varl/yt/subsyt
WorkingDirectory=/home/varl/yt
```

~/.config/systemd/user/subsyt-archival.timer:

```ini
[Unit]
Description=subsyt archival on boot and daily

[Timer]
OnCalendar=*-*-* 4:00:00
Persistent=true
AccuracySec=1us
RandomizedDelaySec=30

[Install]
WantedBy=timers.target
```
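Once both unit files are in place, reload the user manager and enable
the timer. These are standard systemd user-session commands, not
something subsyt runs for you:

```shell
# Pick up the new unit files, then enable and start the timer.
systemctl --user daemon-reload
systemctl --user enable --now subsyt-archival.timer

# Confirm the next scheduled run.
systemctl --user list-timers subsyt-archival.timer
```

For the timer to fire while you are logged out, lingering must be
enabled for the user (loginctl enable-linger).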
### container

```sh
podman run --rm \
  --volume=path/to/data:/data \
  registry.meatbag.se/varl/subsyt
```
### compose

Runs in daemon mode (scheduled sync around 04:00 daily), with automatic
POT generation via bgutil.

```yaml
services:
  subsyt:
    image: registry.meatbag.se/varl/subsyt:latest
    container_name: subsyt
    user: 1000:1000
    volumes:
      - /opt/subsyt/data:/data
  bgutil:
    image: brainicism/bgutil-ytdlp-pot-provider
    container_name: bgutil
    restart: unless-stopped
    ports:
      - 4416:4416
```
## http api

Enable the built-in HTTP API to manage subscriptions and queue ad-hoc
downloads. An auth token is generated on first run and printed to the
log.

```json
{
  "http_api": {
    "enable": true,
    "listen": "0.0.0.0:6901",
    "queue_file": "./api-queue.json"
  }
}
```

Submit new downloads with bearer authentication:

```sh
curl \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  --data '{"url":"https://youtu.be/VIDEO","out_dir":"Channel"}' \
  http://127.0.0.1:6901/v1/videos
```

Check server status (sync timing, subscription/episode counts, queue):

```sh
curl -H "Authorization: Bearer <token>" http://127.0.0.1:6901/status
```

Example response:

```json
{
  "sync": {
    "next": "2025-03-12T04:15:30Z",
    "last_started": "2025-03-11T14:32:15Z",
    "last_ended": "2025-03-11T14:48:32Z",
    "last_duration_seconds": 977
  },
  "subscriptions": 12,
  "episodes": 487,
  "queue": { "count": 0, "items": [] }
}
```
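To use the status output in scripts, fields can be pulled out with jq
where available; the sketch below avoids that dependency and extracts
the subscriptions count with POSIX sed. The response body is inlined
here for illustration; in practice it would come from the curl call
above.

```shell
# Example /status response body (inlined for illustration).
status='{"sync":{"next":"2025-03-12T04:15:30Z"},"subscriptions":12,"episodes":487,"queue":{"count":0,"items":[]}}'

# Extract the subscriptions count with POSIX sed.
subs=$(printf '%s' "$status" | sed -n 's/.*"subscriptions":\([0-9]*\).*/\1/p')
echo "subscriptions: $subs"    # subscriptions: 12
```

This is fragile against reformatting of the JSON, so prefer a real
JSON parser for anything beyond a one-off check.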
## result

```
.
├── Technology Connextras
│   ├── archive.txt
│   ├── fanart.jpg
│   ├── poster.jpg
│   ├── s2024
│   │   ├── Technology_Connextras.s2024e0611.Connextras_dishwasher_follow_up_the_sequel.0Kp3bjm55xw-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2024e0611.Connextras_dishwasher_follow_up_the_sequel.0Kp3bjm55xw-1080p.nfo
│   │   ├── Technology_Connextras.s2024e0611.Connextras_dishwasher_follow_up_the_sequel.0Kp3bjm55xw-1080p.webm
│   │   ├── Technology_Connextras.s2024e0712.Here_s_what_Numitron_tubes_in_an_actual_product_look_like.XgzL05Gojfw-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2024e0712.Here_s_what_Numitron_tubes_in_an_actual_product_look_like.XgzL05Gojfw-1080p.nfo
│   │   ├── Technology_Connextras.s2024e0712.Here_s_what_Numitron_tubes_in_an_actual_product_look_like.XgzL05Gojfw-1080p.webm
│   │   ├── Technology_Connextras.s2024e0909.Answering_your_pinball_questions_-_Williams_Aztec_Q_A.P3Y4d2aHnNE-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2024e0909.Answering_your_pinball_questions_-_Williams_Aztec_Q_A.P3Y4d2aHnNE-1080p.nfo
│   │   └── Technology_Connextras.s2024e0909.Answering_your_pinball_questions_-_Williams_Aztec_Q_A.P3Y4d2aHnNE-1080p.webm
│   ├── s2025
│   │   ├── Technology_Connextras.s2025e0330.Renewable_energy_means_we_can_stop_setting_money_on_fire_silly_billy.Y2qSaD1v4cQ-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2025e0330.Renewable_energy_means_we_can_stop_setting_money_on_fire_silly_billy.Y2qSaD1v4cQ-1080p.nfo
│   │   ├── Technology_Connextras.s2025e0330.Renewable_energy_means_we_can_stop_setting_money_on_fire_silly_billy.Y2qSaD1v4cQ-1080p.webm
│   │   ├── Technology_Connextras.s2025e0331.An_unplanned_trip_from_Chicago_to_Milwaukee_in_an_electric_car.3GUQdrpduo0-1080p-thumb.jpg
│   │   ├── Technology_Connextras.s2025e0331.An_unplanned_trip_from_Chicago_to_Milwaukee_in_an_electric_car.3GUQdrpduo0-1080p.nfo
│   │   └── Technology_Connextras.s2025e0331.An_unplanned_trip_from_Chicago_to_Milwaukee_in_an_electric_car.3GUQdrpduo0-1080p.webm
│   └── tvshow.nfo
```