Hi,
I meant reliability with respect to temperature (industrial grade) and timing accuracy.
I used to drive myself half-mad trying to get high linearity over temperature and 0.1% timing accuracy using the usual assortment of methods; in the end I accepted reality and feel much better these days.
You could select a high-accuracy timer IC (go pricey or go home), or get a 555 with 1% timing repeatability, use a PPS timing capacitor of as low a value as possible, and go through the temperature compensation exercise, paying close attention to each part's ppm/°C and keeping a total error budget. It sucks.
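To make that error budget concrete, here is a minimal sketch of the arithmetic. The part values and tempcos below are illustrative assumptions, not numbers from anyone's actual design; the monostable relationship T = 1.1·R·C means fractional errors in R and C add directly to first order.

```python
# Rough RC timing error-budget sketch. All part values/tempcos here are
# ASSUMED for illustration, not from a specific design.
# Monostable 555: T = 1.1 * R * C, so dT/T ~ dR/R + dC/C to first order.

def total_error_ppm(parts, delta_t_c):
    """Root-sum-square the tempco drifts over a temperature swing,
    then add the worst-case sum of initial tolerances."""
    tempco = sum((p["ppm_per_c"] * delta_t_c) ** 2 for p in parts) ** 0.5
    tolerance = sum(p["tol_ppm"] for p in parts)  # worst-case initial error
    return tempco + tolerance

parts = [
    # 100 ppm/degC, 0.1% timing resistor (assumed)
    {"name": "timing R", "ppm_per_c": 100, "tol_ppm": 1000},
    # PPS film timing cap, ~150 ppm/degC magnitude, 1% (assumed)
    {"name": "timing C", "ppm_per_c": 150, "tol_ppm": 10000},
]

err = total_error_ppm(parts, delta_t_c=50)  # e.g. a 25 to 75 degC swing
print(f"total error ~ {err / 1e4:.2f} %")  # -> total error ~ 2.00 %
```

Even with decent parts, the capacitor's 1% initial tolerance dominates, which is why trimming (or a pricier part) is unavoidable at the 0.1% level.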
A 1 to 60 second range is very hard to cover at that accuracy and repeatability - you end up with tiny and enormous resistors (some of which will run warmer than others) or, worse, tiny and enormous, drifty capacitors.
I reckon that if the 1 second setting is ±1% accurate, the 60 second setting will be about ±5% accurate, at best.
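The component-value spread behind that complaint is easy to show. Assuming a monostable 555 (T = 1.1·R·C, the standard datasheet relationship) and a fixed, hypothetical 1 µF film timing cap:

```python
# Why a 1-60 s range from one RC is awkward: either R or C ends up extreme.
# Monostable 555: T = 1.1 * R * C (standard datasheet relationship).
# The 1 uF film cap is an assumed example value.

def r_for(t_seconds, c_farads):
    """Timing resistor needed for a given monostable pulse width."""
    return t_seconds / (1.1 * c_farads)

c = 1e-6  # assumed 1 uF film timing cap
for t in (1, 60):
    print(f"T = {t:2d} s  ->  R = {r_for(t, c) / 1e6:6.2f} Mohm")
```

That lands around 0.91 MΩ for 1 s but about 55 MΩ for 60 s - at which point board leakage, humidity, and the timer's threshold input current start eating into the accuracy, so the long settings drift worse than the short ones.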
Good timer ICs are way better than a 555. And laboriously matching diodes to transistor ΔVbe, getting PTC and NTC resistors to offset each other, and so on... Better to buy a proven part instead.
For a homebrew, loose-ish accuracy tolerance: maybe a CD4049UB ring oscillator running as slow as possible with the smallest timing capacitor (and best dielectric) you can get, divided down by something like a CD4060, but one with configurable outputs.
I have a homemade 1-second timebase for a frequency counter, built from SE555s, 100 ppm resistors and expensive-ish timing capacitors, open-frame to keep it cool, and IMHO it's worthless junk - a gleeful newbie design to be fond of, but not to use seriously.