In SQL Server, the BIT_COUNT() function (introduced in SQL Server 2022) returns the number of bits set to 1 in its argument, based on that argument's binary representation.
Different values naturally have a different number of 1s in their binary representation. Less obviously, the same value can produce a different count depending on its data type.
Example
Here’s a simple example to demonstrate how the function works:
SELECT BIT_COUNT(12345);
Result:
6
In this case, six bits of the value's binary representation are set to 1. The binary equivalent of 12345 is 0011 0000 0011 1001, which contains six 1 bits, so BIT_COUNT() returns 6.
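The same count can be reproduced outside SQL Server. Here's a quick Python sketch (not T-SQL) of the underlying popcount operation:

```python
# Count the 1 bits in 12345, mirroring SQL Server's BIT_COUNT(12345)
n = 12345
print(bin(n))             # 0b11000000111001
print(bin(n).count("1"))  # 6 bits are set to 1
```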
Passing a Binary Value
Here's the same value passed as a binary literal (0x00003039 is the hexadecimal form of 12345):
SELECT BIT_COUNT(0x00003039);
Result:
6
Same result.
Data Type Matters
The same value can return a different result depending on the data type.
Example:
SELECT
    BIT_COUNT(CAST(-12345 AS int)) AS "int",
    BIT_COUNT(CAST(-12345 AS smallint)) AS "smallint",
    BIT_COUNT(CAST(-12345 AS bigint)) AS "bigint";
Result:
int smallint bigint
-------------------- -------------------- --------------------
27 11 59
Here, I changed the argument to a negative value and cast it to three different data types. Each cast returns a different count, even though the integer value is the same. This is because SQL Server stores negative integers in two's complement: the wider the type, the more leading sign bits are set to 1, so -12345 has more 1 bits in a 64-bit bigint than in a 32-bit int or a 16-bit smallint.
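We can verify these counts by hand. A Python sketch (not T-SQL) that masks -12345 to each type's width reproduces all three results:

```python
def popcount(value, bits):
    """Count the 1 bits of `value` as stored in a two's-complement field `bits` wide."""
    mask = (1 << bits) - 1          # e.g. 0xFFFF for 16 bits
    return bin(value & mask).count("1")

# Same widths as smallint, int, and bigint in SQL Server
for name, bits in [("smallint", 16), ("int", 32), ("bigint", 64)]:
    print(name, popcount(-12345, bits))
# smallint 11, int 27, bigint 59 -- matching the query above
```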