How can I make a one-dimensional array with 256^256 elements?
I've researched big integers, but I haven't found anything about using them as indexes in arrays.
An alternative would be a multidimensional array. However, even using 64-bit integers as indexes, 32 dimensions would be needed. Is there a way to add dimensions to an array after its definition?
EDIT:
I wanted to sort all 256-byte combinations in order of entropy. I may try something more realistic, like 4-byte combinations. But it may be useless. Thanks for the help.
As others have said, the universe isn't big enough for such an array. However, if it's a sparse array (i.e. only some elements are present, not all of them), then you can use a std::map (or std::unordered_map) with the number as the key (now all you have to do is find a big-integer library).
ANY form of array with 256^256 elements would be so insanely big that storing it is impossible. For example, even if every element were just 1 bit, you would still require
36740182248776144775450662437685823488714166466838246415284567578424520364555850043413103562
12820063739027086844759880875030780674761777060562980339733290193252159652649992960022040932
74368143588572566368203155983301169483668442764631414483329492332706895988986119676188622814
73799459727851572551030506458030134050875026721792761435138532498848299128480137280869346120
26576704260231561001336360572266150611167852685380064696955468929349115301864049308461884866
94145775864907486950525312262263093217864817217361867923249690716116542182574428528882508424
4005613936921626330968666174894520389915236768940032
terabytes of storage.
Perhaps refactoring your algorithm would help.
Use a tree and keep only the parts in memory that you really need. Since 256^256 elements are beyond anything that you will ever be able to store on hard disks, so you need to cut down your algorithm anyway.
It seems to me the array you are talking about would require roughly 3 x 10^610 MB of storage (256^256 bytes is about 3.2 x 10^616 bytes). I don't think having no means to index such an array is the worst of your problems.
Well, first you'll have to invent a computer with more bytes of memory than there are atoms in the universe. Then, for each byte in that computer's memory, build another such computer. Then make as many of those as there are atoms in the universe, hook them all up into one large cluster, and... you've nearly got enough memory.
In other words, 256^256 is a big number. You can't make an array that big.
The index of your array would run from 0 to 2^2048 - 1. 2^2048 is a number that cannot be represented natively on today's computers, where the biggest representable integer is around 2^128. So you have two solutions:
- One, wait 20 or 30 years until computers can represent that number natively (and I'm being optimistic).
- Two, find another representation. Your index cannot be represented as a native number, so you will need something else: a string, a bunch of ints, etc. Use a hash structure to store your index <-> value correspondences. You will never have enough memory to fill such an array completely anyway, so a hash table will do the job.
This can be done only using functions, or classes with member functions, that compute each element on demand. Arrays, or anything else that stores the data in memory, will not do. Performing even a single operation on every element of such an array would take longer on current computers than the time elapsed since the Big Bang.