---
title: "An Introduction to Huginn"
date: 2024-05-23T00:00:00-05:00
---

I've been playing around a lot with [Huginn](https://github.com/huginn/huginn), a service that lets you run "agents" for automation, similar to IFTTT. A lot of people see Huginn and think it's cool, but don't know what to do with it. I didn't really either when I first heard of the project a few years ago. Hopefully you can get some ideas from this blog.

I have [previously written about my "On This Day" software](https://marks.kitchen/blog/on_this_day/), a webserver I handcrafted. It ran a bunch of scripts to pull in data from various sources and created a daily digest of the items. Since I put it together fairly quickly several years ago, it usually runs without issue, but it's annoying to extend and annoying to debug. I wanted to try migrating the functionality to Huginn.

Huginn has a bit of a learning curve. At first, I had a hard time understanding the agent configuration options, and I couldn't even figure out how to use the HTTP request agent (called the "Website Agent"). With some light reading of the source code, I figured everything out.

Currently, my "On This Day" scenario features 14 agents (rough sketches of some of their configurations follow below):

- 7 "source" agents. Most of these are Website Agents, which scrape some data from the web and parse out the information I want. Several of these are for comics, and they simply output the image source for a comic image. One of the agents is a JavaScript Agent, which runs a script I wrote that generates a different result based on the date.
- 5 formatting agents, which take in varied JSON input and normalize it. Each source has a different input, but the outputs all have just a "message" and a "type".
- 1 digest agent, which takes in the normalized JSON and combines it into one HTML template. It listens for events over the course of a day and then, when scheduled, outputs the result of the templating.
- 1 "data output" agent, which basically just means an RSS feed output. For every new input, it adds an item to the feed.

So far, this solution works really well, and it's really easy to add new things. A few of the odd data sources from "On This Day" use JavaScript to fill in functionality gaps. For my journal data source, which pulls entries from historic journals, I ended up making my own API service. I finished most of this API in an evening, and adding it into Huginn was really simple.

For fun, I also added a "reblog" feed. I connected a Webhook Agent to my RSS reader, Miniflux: I can "save" stories in Miniflux, and their links will get added to a daily list of "reblogged" items. I used a manual agent as well so that I can add links that don't originate from Miniflux (i.e. if I just find something elsewhere on the web). You can follow [this feed here](https://huginn.marks.kitchen/users/1/web_requests/23/reblogs.xml).

I've also been using Huginn for my personal tracking stuff. I previously had a bunch of cronjobs running on a Raspberry Pi, which collected data from my AirGradient Arduino and used a weather API. Some of this I've moved fully to Huginn, which massively simplifies the configuration. One benefit of having all of this in Huginn is that it's much easier to set it all up again on a new server: as much as possible, everything is together in one place, rather than spread around in cronjobs on various nodes. I've been putting everything in Docker and exposing it via Traefik, and it's very simple to set this all up.
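As a taste of the configuration, here's roughly what one of the comic-scraping Website Agents looks like. This is close to the stock example from Huginn's own Website Agent documentation, which scrapes xkcd; the URL and CSS selectors are whatever the page you're scraping calls for:

```json
{
  "expected_update_period_in_days": "2",
  "url": "https://xkcd.com",
  "type": "html",
  "mode": "on_change",
  "extract": {
    "url": { "css": "#comic img", "value": "@src" },
    "title": { "css": "#comic img", "value": "@alt" }
  }
}
```

Each key under `extract` becomes a field on the emitted event, so this agent emits something like `{"url": "...", "title": "..."}` whenever the comic changes.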
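The formatting step is an Event Formatting Agent per source. A minimal sketch, with a made-up template; the real instructions depend on each source's fields:

```json
{
  "mode": "clean",
  "instructions": {
    "type": "comic",
    "message": "<img src=\"{{ url }}\" alt=\"{{ title }}\">"
  }
}
```

The `instructions` hash is rendered with Liquid against the incoming event, and `"mode": "clean"` drops the original fields so only `message` and `type` flow downstream.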
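The Digest Agent's configuration is basically one Liquid template over the collected `events` array, plus a schedule. Mine is something along these lines, where the HTML here stands in for my actual template:

```json
{
  "expected_receive_period_in_days": "2",
  "message": "{% for event in events %}<h2>{{ event.type }}</h2><div>{{ event.message }}</div>{% endfor %}"
}
```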
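The Data Output Agent takes a feed-level template plus a list of secrets, and each secret becomes part of the feed URL (which is how the `reblogs.xml` link above gets its name). A sketch, with placeholder titles and fields:

```json
{
  "secrets": ["reblogs"],
  "expected_receive_period_in_days": "2",
  "template": {
    "title": "Reblogs",
    "description": "Links I've saved or submitted",
    "item": {
      "title": "{{ title }}",
      "description": "{{ message }}",
      "link": "{{ url }}"
    }
  }
}
```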
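For completeness, here's roughly what the Docker/Traefik side looks like. This is a sketch assuming the all-in-one `huginn/huginn` image (which bundles its own MySQL) and Traefik v2 label routing; the hostname and volume name are placeholders:

```yaml
services:
  huginn:
    image: huginn/huginn
    restart: unless-stopped
    volumes:
      # the all-in-one image keeps its bundled MySQL data here
      - huginn_mysql:/var/lib/mysql
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.huginn.rule=Host(`huginn.example.com`)"
      - "traefik.http.services.huginn.loadbalancer.server.port=3000"

volumes:
  huginn_mysql:
```

Huginn listens on port 3000 inside the container, so the only Traefik-specific part is pointing a router at it.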
One downside of Huginn is that it's been hogging a lot of memory on my VPS. This isn't a huge deal, as I was already using a very minimal server. Additionally, as I mentioned before, it does take some time to figure everything out. I'm still not 100% sure of all of the settings, but functionally I was able to get a lot out of it with minimal tinkering. I was also disappointed in the Docker setup, which doesn't have as much documentation as I expected.

Overall, I really enjoy Huginn! It's taken a lot of the scripts I've written over the last few years and simplified their deployment and configuration. It's so much easier to update them, and I can do things I never attempted before.