#include <iostream>
#include <climits>
int main(void) {
    std::cout << SHRT_MIN << std::endl;
    std::cout << SHRT_MAX << std::endl;
    // -32768 ~ 32767 is the range
    short a;
    std::cin >> a;
    std::cout << "decimal: " << a << std::endl; // decimal
    return 0;
}
Look at the code above, which I used to observe what happens when I enter an integer outside the range of short.
When I enter any integer smaller than -32768, like -3333333, -32768 is printed.
When I enter any integer bigger than 32767, like 3333333, 32767 is printed.
Now I want to see what happens when I enter an integer outside the range of unsigned short.
Here's the code:
#include <iostream>
#include <climits>
int main(void) {
    std::cout << USHRT_MAX << std::endl;
    unsigned short a;
    std::cin >> a;
    std::cout << "decimal: " << a << std::endl; // decimal
    return 0;
}
There seems to be no USHRT_MIN in <climits>, but since an unsigned short cannot be negative, its minimum must be 0.
So the range of unsigned short is 0 ~ 65535.
When I enter any integer bigger than 65535, like 3333333, 65535 is printed.
Now comes the part that confuses me a lot.
When I enter an integer smaller than 0:
-1, decimal: 65535
-2, decimal: 65534
-3, decimal: 65533
-4, decimal: 65532
...
-65534, decimal: 2
-65535, decimal: 1
-65536, decimal: 65535
-65537, decimal: 65535
-65538, decimal: 65535
-65539, decimal: 65535
-3333333, decimal: 65535
From -1 to -65535, the printed value follows a clear pattern: it is always 65536 plus the input.
From -65536 down to any smaller negative integer, the printed value stays at 65535.
Entering an integer outside the range of short seems to follow a very easy rule.
Why is entering an integer outside the range of unsigned short so hard to understand, and is there a rule for it?