Reigniting My Star Wars Data Project
Moving from brittle API scraping and data loss to a sustainable local MediaWiki mirror using dumps, Docker, and Unraid.
I shelved this project for a while after a string of avoidable problems: unstable containers, MongoDB volume mishaps, API overuse worries, and a general sense that I was “hammering” the Star Wars Fandom wiki harder than felt respectful. This post covers how I rebooted the whole effort around a radically simpler (and more ethical) model: pull an official dump, stand up a local MediaWiki, and iterate offline.
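The dump-and-mirror model is easy to reproduce locally. A minimal sketch of the stack, assuming the official `mediawiki` and `mariadb` Docker images (service names, credentials, ports, and versions here are illustrative, not the exact setup from the post):

```yaml
# docker-compose.yml — throwaway local MediaWiki mirror for offline iteration
services:
  mediawiki:
    image: mediawiki:1.41
    ports:
      - "8080:80"
    volumes:
      - images:/var/www/html/images   # persist uploaded/imported media
    depends_on:
      - database
  database:
    image: mariadb:11
    environment:
      MYSQL_DATABASE: my_wiki
      MYSQL_USER: wikiuser
      MYSQL_PASSWORD: example
      MARIADB_ROOT_PASSWORD: example
volumes:
  images:
```

Once the initial web install is done, a wiki XML dump can be loaded entirely offline with MediaWiki's bundled maintenance scripts, e.g. `docker compose exec mediawiki php maintenance/importDump.php /tmp/dump.xml` followed by `php maintenance/rebuildrecentchanges.php` — no API calls against the live site at all.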
My Home Server Setup
A deep dive into building a powerful home server for VMs, Docker, game servers, and local AI workflows—covering hardware choices, software, and lessons learned.
Hardware, Configuration, and Lessons Learned
Portable CI/CD with Dagger
Building portable, containerised CI/CD pipelines with Dagger.
Heroes-Decode is a .NET CLI tool built to parse .StormReplay files. Developed and maintained by HeroesToolChest, led by Kevin Oliva, it is an indispensable utility for processing Heroes of the Storm replay data.
C# (4)
Docker (3)
Unraid (3)
AI (2)
Plex (2)
1Password (2)
Blazor (2)
SVG Files (2)
Engineering (1)
Leadership (1)
Vibe Coding (1)
LLMs (1)
Star Wars (1)
MediaWiki (1)
Data Engineering (1)
ETL (1)
Backup (1)
AV1 (1)
HEVC (1)
Transcoding (1)
NVENC (1)
Quick Sync (1)
Media Server (1)
Compression (1)
Storage (1)
Home Lab (1)
Kubernetes (1)
ZFS (1)
Networking (1)
Game Servers (1)
CI/CD (1)
Dagger (1)
GitHub Actions (1)
GitLab CI (1)
SWTOR (1)
MMORPG (1)
SSH (1)
JavaScript (1)
Heroes of the Storm (1)
Razor (1)
Pagination (1)