However, it does not seem to increase the temperature. How can I detect the temperature variation if the resistor dissipates 5 mW? Thank you.
Of course the temperature is increasing. The question is how much?
If the resistor is well insulated, the heat produced will have nowhere to go and the temp will rise linearly with time. But no insulation is perfect and heat is lost by conduction, convection and radiation.
When the heat lost is exactly balanced by the heat produced, the temp will come to a steady value.
Your resistor is (we assume) a small body with some heat capacity, so it will absorb some heat initially before it gets hot (just like a soldering iron).
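As a rough sketch of that warm-up behaviour, here is a first-order (lumped) thermal model in Python. The thermal resistance and heat capacity are assumed round values for a small axial resistor, not datasheet figures:

```python
import math

# Illustrative first-order thermal model (all parameter values assumed):
P = 0.005        # dissipated power, W (5 mW from the question)
R_th = 200.0     # assumed thermal resistance to ambient, K/W
C_th = 0.25     # assumed heat capacity of the resistor body, J/K

tau = R_th * C_th       # thermal time constant, s
dT_final = P * R_th     # steady-state temperature rise above ambient, K

for t in (0.0, tau, 3 * tau, 5 * tau):
    # Classic first-order exponential approach to steady state
    dT = dT_final * (1.0 - math.exp(-t / tau))
    print(f"t = {t:5.0f} s -> rise = {dT:.2f} K")
```

With these assumed values the time constant is R_th × C_th = 50 s, so the resistor settles to its final temperature within a few minutes rather than rising forever.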
I guess steady state will be reached (accounting for all forms of heat loss) with a temperature rise of the resistor of about 1 °C (if the resistor is about the size of a typical 1/4 W resistor).
Much of the heat will be conducted away through the leads; most of the rest will be carried away by convection (the surrounding air), and a rather small amount by radiation.
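To see why radiation contributes so little, a quick Stefan-Boltzmann estimate can be sketched. The emissivity, surface area, and the 1 °C rise are all assumed values for a 1/4 W-sized body:

```python
# Rough check that radiation is a minor heat path (parameter values assumed):
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
eps = 0.9         # assumed emissivity of the resistor coating
A = 6e-5          # assumed surface area of a 1/4 W resistor body, m^2
T_amb = 298.0     # ambient temperature, K
dT = 1.0          # assumed steady-state temperature rise, K

# Net radiated power between the resistor surface and the surroundings
P_rad = eps * sigma * A * ((T_amb + dT) ** 4 - T_amb ** 4)
print(f"radiated power ~ {P_rad * 1e3:.2f} mW of the 5 mW dissipated")
```

Under these assumptions radiation carries away only a few tenths of a milliwatt, a small fraction of the 5 mW; conduction and convection handle the rest.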
It will be tough to measure. What kind of thermometer are you using to measure the temperature rise? You will need a sensor with a very low heat capacity (say a thermistor or a Pt100 temperature sensor).
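An NTC thermistor works well here because its resistance changes by roughly 4 % per kelvin near room temperature, which an ordinary ohmmeter can resolve. A sketch using the common Beta model (the part parameters are assumed, illustrative only):

```python
import math

# Assumed NTC thermistor parameters (illustrative, not a specific part):
B = 3950.0      # Beta constant, K
R25 = 10_000.0  # resistance at 25 degC, ohms
T0 = 298.15     # 25 degC in kelvin

def ntc_resistance(T_kelvin):
    """Beta-model NTC resistance at the given absolute temperature."""
    return R25 * math.exp(B * (1.0 / T_kelvin - 1.0 / T0))

for dT in (0.0, 0.5, 1.0):
    print(f"rise {dT:.1f} K -> {ntc_resistance(T0 + dT):.0f} ohm")
```

With these assumed values, a 1 K rise drops the resistance by around 400 ohms out of 10 k, so even the small rise estimated above is detectable if the thermistor is in good thermal contact with the resistor body.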
Rest assured that it does increase the temp of the resistor. The heat produced has to go somewhere.