Hi,
there is no general rule.
Usually a 32 bit machine will have more throughput, but not always. It depends on a lot of things, like
* hardware
* clock frequency
* microcontroller type (floating point support...)
* code
* compiler (options)
* peripherals
... and so on
Examples:
* Polling a port pin: usually needs one instruction ... independent of 8 bit or 32 bit
* Transferring one 32 bit variable of data from A to B ... usually takes 4 x (R & W) on an 8 bit machine and 1 x (R & W) on a 32 bit machine.
* doing I2C communication: will be almost equally fast on both machines, because the bottleneck is the I2C bus.
(unless the machine uses ISR or FIFO or DMA ... which depends on microcontroller type and code)
But in typical applications the 32 bit machine will be faster than an 8 bit machine.
In the 1980s many home computers ran on an 8 bit machine. Keyboard, maybe a mouse, monitor, memory, printer...
And you could write a letter with it, or play a game.
Nowadays with extremely high resolution monitors and streaming videos it makes no sense to try to run this on an 8 bit machine.
There is no strict limit and there is no application that strictly needs 8 bit or 32 bit... the question is:
* what are the requirements? (Very, very important! Decide on the requirements carefully)
* and how much effort you want to put in
If you are concerned about processing power
... then it often makes more sense to start with the 32 bit machine and maybe pay a bit more for hardware,
... than to start with 8 bit ... find out it's too slow ... and then switch to 32 bit.
Klaus