I'm trying to make a program that converts a number into its binary representation.
Code:
#include <iostream>
#include <bitset>
#include <climits>  // for CHAR_BIT
using namespace std;
int main()
{
    int a;
    bitset<CHAR_BIT> n;
    cin >> a;
    n = bitset<CHAR_BIT>(a);
    cout << n << " ";
    return 0;
}
The program gives the wrong answer for 585, since 585 needs more than 8 binary digits. How can I handle such larger numbers?
585 mod 256 = 73 (assuming CHAR_BIT is 8)
73 in base 2 = 0b01001001
The program does print 01001001; I don't see anything wrong.
If you want to store the whole range of a, the bitset should be declared as bitset<CHAR_BIT * sizeof(a)> n(a);
A bitset has a fixed number of bits. You specified bitset<CHAR_BIT>; on most systems CHAR_BIT is 8, so you get an 8-bit bitset. When you try to stuff a bigger number into the bitset, the most significant bits are discarded.
If you know in advance the largest numbers you will have to deal with, you can specify e.g. bitset<16> or bitset<32>. If you don't, you may have to use some other datatype.