Has anyone implemented a 1-D DCT on any Digital Signal Processor?
I need to know how long it takes to calculate the DCT. A large transform length is preferable.
Thanks
I actually don't have any chipset; I implemented the algorithm on an FPGA and wanted to compare its performance against a DSP implementation. However, I found an application note for a DCT implemented on a TI TMS320-series DSP that had useful information on the number of cycles it took to perform a 1024-point DCT, which is the length I implemented on the FPGA.
Thanks anyway.
If you are using the C language, then it will be easy to program the DCT and cross-compile it for any target DSP chip. Check the TI site; there are quite a few examples.