I've started building tiny little tools to help do tasks that computers are good at (tracking information, injecting randomness, etc.). This article is about the latest one: it's called JitterTime, and it's implemented in Rust using Web Assembly.
JitterTime exists to address the following scenario: you're scheduling an automated task to run every hour/day/week/etc. at (or near) a particular time (for example, running automated back-ups every night at midnight, or sending a weekly email on Tuesday at 10am). However, this task interacts with a system that might get overwhelmed at the "obvious" time.
This can happen with big systems that lots of people interact with (scraping news or sports scores, sending backups to the cloud, etc.) but it can also happen with smaller systems that serve a few people. For example, if you schedule a dozen simultaneous tasks to run every morning on your personal computer, you could burden the processor or saturate your home network.
One possible solution is to offset the task by a random amount of time: instead of scheduling all the tasks at 10:00 sharp, put one at 10:02, one at 10:06, and one at 10:25.
JitterTime does just that: it uses a random number generator to produce a genuinely random offset within the window you choose.
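The core idea can be sketched in a few lines of plain Rust. The names here are illustrative, not JitterTime's actual API: a scheduled time is expressed as minutes since midnight, and a caller-supplied random value (which, in practice, would come from a proper random number generator) picks an offset bounded by a maximum you choose.

```rust
/// Offset a scheduled time (minutes since midnight) by a bounded,
/// caller-supplied random offset. In a real tool, `random` would come
/// from a proper random number generator (e.g. the `rand` crate).
fn jittered_minutes(base: u32, max_offset: u32, random: u32) -> u32 {
    // Clamp the offset into [0, max_offset], then wrap around midnight
    // so that 23:50 + 30 minutes lands at 00:20, not 24:20.
    (base + random % (max_offset + 1)) % (24 * 60)
}

fn main() {
    // 10:00 is 600 minutes past midnight; jitter by up to 30 minutes.
    let t = jittered_minutes(600, 30, 17);
    println!("{:02}:{:02}", t / 60, t % 60); // prints "10:17"
}
```

This keeps the randomness injectable, which makes the scheduling logic trivially testable; the choice of RNG becomes a separate concern.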
The algorithm is straightforward and un-demanding, so why did I implement it in Rust?
For starters, I like Rust. This was an afternoon's side-project on my own time, so I picked the tech that I wanted. That might punch a big hole in the rest of this argument for you, and that's fine--you can close the tab now, I won't be mad.
Second, it kept the toolchain simple: going the JavaScript route would have meant interacting with NPM, choosing a datetime library, setting up a bundler, etc. If I'm building with Rust, cargo's already in the mix. Cargo is nice enough to work with, and pulling in chrono wasn't much trouble. This is especially appealing as a starting point: once you're comfortable building web things with Rust, it's very easy to extend your project with whatever crates you might need. JitterTime's needs were minimal, but my next project could be more ambitious.