My favorite use was during my PhD. My thesis could be regenerated from the source data: plots were created with gnuplot/GRI, and the final PDF was assembled from the LaTeX and EPS files.
It was quite simple really, but really powerful: tweak or replace a dataset, hit make, and have a fully updated version of my thesis ready to go.
> My thesis could be regenerated from the source data, through to creating plots with gnuplot/GRI and finally assembled from the LaTeX and EPS files into the final PDF.
All research should be like this. Explaining things in words is so imprecise that you spend forever showing someone something a program can tell them quickly.
"I used xyz transformation with abc parameters" can be gleaned easily from code.
I do the same with LaTeX when generating contracts for my company. The Makefile and its corresponding Python program ask for various arguments (company name, xxx, yyy), then generate the contract with the associated prices. I even put some automatic discounts/gifts in place (a free service) if the amount is bigger than X or Y.
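A minimal sketch of that kind of generator: fill a LaTeX template from arguments and apply the automatic reduction above a threshold. The template text, threshold, and discount rate here are all hypothetical placeholders, not the parent poster's actual setup.

```python
from string import Template

# Hypothetical contract template; $company, $service, $total are filled in below.
CONTRACT_TEMPLATE = Template(r"""
\documentclass{article}
\begin{document}
Contract for $company.
Service: $service \\
Total: $total~EUR
\end{document}
""")

DISCOUNT_THRESHOLD = 10_000  # hypothetical amount above which a reduction applies
DISCOUNT_RATE = 0.10         # hypothetical 10% automatic reduction

def render_contract(company: str, service: str, amount: float) -> str:
    """Fill the LaTeX template, applying the automatic discount if due."""
    if amount > DISCOUNT_THRESHOLD:
        amount *= 1 - DISCOUNT_RATE
    return CONTRACT_TEMPLATE.substitute(
        company=company, service=service, total=f"{amount:.2f}"
    )
```

The output string would then be written to a .tex file and compiled by the Makefile.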
I do something similar, here's my Makefile -- I have scripts that build figures in a separate directory, /figures. I'm sure it could be terser, but it does the job for me.
I noticed that you have some file dependencies not encoded in the targets. Also, you might like reading up on Automatic Variables ($@, $^, $<, etc). Anyway, just for fun I tried rewriting your script in a way that should Just Work a little better.
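For anyone unfamiliar with them, automatic variables let a recipe refer to its own target and prerequisites instead of repeating file names. A minimal sketch of a figure-building rule in that style (the file layout here is hypothetical, not the parent's actual Makefile):

```make
# Hypothetical layout: one .eps figure per gnuplot script in figures/
FIGS := $(patsubst figures/%.gp,figures/%.eps,$(wildcard figures/*.gp))

all: thesis.pdf

# $@ = the target, $< = the first prerequisite
figures/%.eps: figures/%.gp data.csv
	gnuplot $<

# thesis.pdf depends on the LaTeX source and every generated figure
thesis.pdf: thesis.tex $(FIGS)
	pdflatex $<

.PHONY: all
```

Encoding data.csv as a prerequisite is what makes "tweak a dataset, hit make" work: make rebuilds only the figures and documents downstream of the change.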
Check out latexmk: it keeps track of having to run bibtex etc., and runs latex "enough times" that all equation refs and the like have stabilised. (Use latexmk -pdf to build a PDF; the default output is DVI.)
For what it's worth, I did something similar for my master's dissertation, but couldn't be bothered to learn make, so I used a Python library called pydoit:
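In pydoit, tasks are plain Python functions in a dodo.py file that return dicts describing file dependencies, targets, and actions, and doit then rebuilds only what is out of date, much like make. A minimal sketch of the thesis-style pipeline (file names are hypothetical):

```python
# dodo.py -- run with `doit` from the same directory.
# data.csv, plot.gp, thesis.tex are hypothetical placeholder names.

def task_figures():
    """Regenerate the plot whenever the dataset or gnuplot script changes."""
    return {
        'file_dep': ['data.csv', 'plot.gp'],
        'targets': ['plot.eps'],
        'actions': ['gnuplot plot.gp'],
    }

def task_thesis():
    """Rebuild the PDF when the LaTeX source or the figure changes."""
    return {
        'file_dep': ['thesis.tex', 'plot.eps'],
        'targets': ['thesis.pdf'],
        'actions': ['pdflatex thesis.tex'],
    }
```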