I haven't heard anything like that before. Could you please send a reference that says "equal numbers of 0s and 1s give the maximum capacity", or tell me where you heard that?
Added after 3 hours 31 minutes:
Sorry for answering again in a separate reply, but something has occurred to me about your question. In information theory, capacity is closely tied to entropy, and the maximum entropy depends on the type of random variable: for a continuous variable with a fixed variance it is the Gaussian distribution, while for a discrete alphabet it is the uniform distribution, where every symbol is equally probable. In your question the symbols are 0s and 1s, which is the discrete case, so to get maximum entropy (and thus maximum capacity) you need to make the 0s and 1s equiprobable (the same number of 1s and 0s on average). If you think this might be the answer, then I suggest some reading on capacity and entropy, and on the entropy of the uniform and Gaussian distributions.
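To make this concrete, here is a quick sketch in Python (the helper name `binary_entropy` is my own) that evaluates the binary entropy function H(p) = -p log2(p) - (1-p) log2(1-p) and shows it peaks at p = 0.5, i.e. equiprobable 0s and 1s:

```python
import math

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) source:
    H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic source carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy rises toward 1 bit as p approaches 0.5 and falls off symmetrically.
for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p}: H = {binary_entropy(p):.4f} bits")
```

Running it shows H = 1 bit exactly at p = 0.5, which is why a source that emits 0s and 1s with equal probability achieves the maximum entropy for a binary alphabet.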