The lateral drain-source field that can be stood off at Lmin
will determine the working voltage (subject to the many
process features that mitigate hot-carrier effects). Any gate
oxide thicker than what's needed to reliably stand off that
maximum working voltage (with similar caveats) is just leaving
performance on the table. You can't really optimize one
without the other; you could try, but the result won't be
optimal.
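To make the oxide/voltage coupling concrete, here's a back-of-envelope sketch of the thinnest oxide that stands off a given working voltage. The field limit and derating factor are illustrative assumptions, not real process data:

```python
# Back-of-envelope: minimum reliable gate-oxide thickness for a given
# working voltage. Both constants below are assumed for illustration.

E_OX_MAX = 5e6      # V/cm, assumed long-term reliable oxide field
DERATE   = 1.25     # assumed reliability margin on working voltage

def min_tox_nm(v_work):
    """Thinnest oxide (nm) that stands off v_work volts with margin."""
    t_cm = v_work * DERATE / E_OX_MAX   # thickness in cm
    return t_cm * 1e7                   # convert cm -> nm

for v in (1.8, 3.3, 5.0):
    print(f"{v:.1f} V -> ~{min_tox_nm(v):.1f} nm oxide")
```

Anything thicker than that minimum gives up drive current (and hence speed) for voltage margin you don't need, which is the "leaving performance on the table" point.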
Where density is king, geometry goes where it's led. Shrinking
the FET gate is a proxy for many other co-shrinks.
As to your final question: you could, but then it would be
(advertised as) a 35nm technology and would not run at a
1.8V core voltage. Maybe that's a don't-care (it often
is). You trade mask & wafer cost, electrical performance
and functional density for the product under development,
to pick a capable least-cost (you hope) manufacturing
solution. If 180nm does the job, why go to 35nm at 4X
the wafer cost (rough guess) if you are not going to see
the billion-devices-per-year economies of scale on the
back end?
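The economics can be sketched numerically. All the dollar figures, die size, and mask-set costs below are guesses for illustration (only the 4X wafer-cost ratio comes from the rough guess above); the point is how volume amortizes the fixed cost:

```python
import math

def cost_per_die(wafer_cost, die_area_mm2, wafer_diam_mm=200):
    """Crude per-die silicon cost: ignores edge loss and yield."""
    wafer_area = math.pi * (wafer_diam_mm / 2) ** 2
    return wafer_cost / (wafer_area / die_area_mm2)

def unit_cost(nre, volume, wafer_cost, die_area_mm2):
    """Amortized unit cost = mask/NRE spread over volume + silicon."""
    return nre / volume + cost_per_die(wafer_cost, die_area_mm2)

die_180 = 25.0                      # mm^2, assumed die size at 180nm
die_35  = die_180 * (35 / 180)**2   # ideal full linear shrink (optimistic)

for vol in (10_000, 1_000_000, 1_000_000_000):
    c180 = unit_cost(100e3, vol, 1000.0, die_180)   # assumed NRE & wafer $
    c35  = unit_cost(5e6,   vol, 4000.0, die_35)    # 4X wafer cost
    print(f"{vol:>13,} units: 180nm ${c180:.3f}  35nm ${c35:.3f}")
```

With numbers like these the finer node only wins at very high volume, which is exactly the billion-devices-per-year argument: at modest volume the mask/NRE bill at 35nm swamps the per-die silicon savings.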