If I'm developing a fully associative cache of significant size, which replacement algorithm is best suited to it?
I've seen that for smaller fully associative structures, such as victim caches, algorithms like FIFO and LRU tend to work quite well. But as the size increases, these replacement policies tend to perform poorly or incur large overheads.
My hunch is that random replacement would be the best bet, but I still have some doubts.
Could someone help me in this regard?
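To make the overhead concern concrete, here is a minimal behavioral sketch (my own illustration, not from any particular design) of a fully associative cache with random replacement. The point is that a hit requires no metadata update and an eviction is a single random pick, whereas true LRU would need per-access recency bookkeeping across every line, which is what becomes expensive at large associativity:

```python
import random

class FullyAssociativeCache:
    """Hypothetical fully associative cache model with random replacement."""

    def __init__(self, num_lines):
        self.num_lines = num_lines
        self.lines = {}  # tag -> cached data (data omitted in this sketch)

    def access(self, tag):
        """Return True on hit, False on miss (filling the line on miss)."""
        if tag in self.lines:
            return True  # hit: no recency metadata to update, unlike LRU
        if len(self.lines) >= self.num_lines:
            # miss with full cache: evict a uniformly random victim, O(1)
            victim = random.choice(list(self.lines))
            del self.lines[victim]
        self.lines[tag] = None
        return False
```

In hardware terms, the analogous win is that random replacement needs only a pseudo-random number source (e.g. an LFSR), while exact LRU over N ways needs on the order of N log N state bits updated on every access.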