One of the authors here. I can't exactly deny that line was added to sound impressive, so guilty as charged. However, the savings are much higher than $20/day for a few reasons:
* Many tasks run on expensive instances (hardware acceleration, Windows)
* We have OSX/Android pools that run on physical devices in a data centre (these are an order of magnitude more expensive than Linux)
* There are ancillary costs. For example, each task generates artifacts, which incur storage costs; those artifacts are then downloaded, which incurs transfer costs.
* There are also overhead costs (idle time, rebooting, etc.) that aren't counted in the 10 years / day stat.
All these things see a corresponding decrease in costs with fewer tasks.
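To make the argument concrete, here's a back-of-the-envelope sketch of how those components add up. Every number in it is a hypothetical placeholder I picked for illustration, not an actual rate from our infrastructure:

```python
# Rough daily savings from skipping CI tasks.
# ALL rates below are hypothetical placeholders, not real billing figures.

def daily_savings(tasks_skipped,
                  avg_compute_cost=0.02,    # $/task: blended rate (Windows/OSX/Android
                                            # pools cost far more than Linux)
                  artifact_storage=0.005,   # $/task: artifact storage
                  artifact_transfer=0.005,  # $/task: artifact download transfer
                  overhead_factor=1.2):     # idle time, reboots, etc. not counted
                                            # in the "10 years / day" stat
    per_task = avg_compute_cost + artifact_storage + artifact_transfer
    return tasks_skipped * per_task * overhead_factor

# Skipping 10,000 tasks/day at these made-up rates is roughly $360/day:
print(daily_savings(10_000))
```

Even with modest per-task rates, the storage, transfer, and overhead multipliers push the total well past naive compute-only estimates.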