A well-designed transformer will be greater than 90% efficient; however, if the transformer is run near or above its maximum flux-handling capability, its efficiency will drop severely. There can be other negative ramifications as well, such as excessive current spikes in the MOSFETs. A well-designed transformer usually distributes the design losses roughly equally between core loss and winding (copper) loss.
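As a rough illustration of that equal-split guideline, here is a back-of-the-envelope efficiency calculation. All the numbers (100 W output, 3 W core loss, 3 W copper loss) are made-up examples, not from any real design:

```python
# Illustrative only: a transformer with losses split equally
# between the core and the windings, per the guideline above.

P_out = 100.0     # delivered output power, W (assumed)
P_core = 3.0      # core (hysteresis + eddy) loss, W (assumed)
P_copper = 3.0    # winding I^2*R loss, W (assumed)

P_in = P_out + P_core + P_copper
efficiency = P_out / P_in
print(f"Efficiency: {efficiency:.1%}")
```

With those figures the transformer comes out a little above 94%; push the core into saturation and the core-loss term balloons, dragging that number down fast.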
Generally, losses are anything that heats up. A voltage drop across a component with current flowing through it is a loss. MOSFET Rds(on) is one of the significant factors. If the output is low-voltage DC at high amperage, then the rectifier diode drops can be significant. Many low-voltage-output switchers now use synchronous rectifiers, which are MOSFETs with a lower voltage drop compared to a diode.
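To see why synchronous rectification matters at low voltage and high current, compare conduction losses for a hypothetical 20 A output. The part values (0.45 V Schottky drop, 5 mΩ Rds(on)) are illustrative assumptions, not from any particular datasheet:

```python
# Rough rectifier conduction-loss comparison for a low-voltage,
# high-current output. All part values are assumed for illustration.

I_out = 20.0          # output current, A (assumed)
V_f = 0.45            # Schottky diode forward drop, V (assumed)
R_ds_on = 0.005       # synchronous MOSFET on-resistance, ohms (assumed)

P_diode = V_f * I_out            # diode loss: V_f * I
P_sync = I_out ** 2 * R_ds_on    # MOSFET loss: I^2 * Rds(on)

print(f"Diode rectifier loss:       {P_diode:.1f} W")
print(f"Synchronous rectifier loss: {P_sync:.1f} W")
```

Here the diode dissipates 9 W against the MOSFET's 2 W; on a 5 V output that difference alone is worth several points of efficiency.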
There are many considerations depending on use conditions. For example, a DC-to-AC inverter that runs with intermittent loads may need to be designed to reduce no-load or light-load losses in order to reduce the overall average power consumed. In this case, the power it takes to drive the input capacitance of the switching MOSFETs will be a significant factor. There are inverters that change how many parallel MOSFETs they chop depending on the load. This trades total Rds(on) conduction loss against total gate-drive overhead, extending good efficiency down to lower load levels.
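That trade-off can be sketched numerically: with n MOSFETs in parallel, conduction loss scales as Rds(on)/n while gate-drive loss scales as n, so the optimal n grows with load current. The component values and switching frequency below are illustrative assumptions, not real design data:

```python
# Sketch of the parallel-MOSFET trade-off described above.
# All parameter values are assumed for illustration only.

def total_loss(n, I_load, R_ds_on=0.01, Q_g=50e-9, V_gs=12.0, f_sw=50e3):
    """Total switch loss with n parallel MOSFETs (conduction + gate drive)."""
    P_cond = I_load ** 2 * (R_ds_on / n)   # conduction loss falls with n
    P_gate = n * Q_g * V_gs * f_sw         # gate-drive loss grows with n
    return P_cond + P_gate

for I_load in (1.0, 10.0):
    best = min(range(1, 9), key=lambda n: total_loss(n, I_load))
    print(f"{I_load:4.0f} A load -> lowest total loss with n = {best} MOSFET(s)")
```

With these assumed numbers, one MOSFET is best at a 1 A light load but six are best at 10 A, which is exactly why some inverters enable or disable parallel devices as the load changes.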