#include <iostream>

struct Bitfields {
    unsigned int a : 3;
    unsigned int b : 29;
};

int main() {
    Bitfields b;
    b.a = 2;
    b.b = 3;
    const unsigned int& r = b.a;
    std::cout << sizeof(Bitfields) << ' ' << r << std::endl;
    b.a = 4;
    std::cout << r << std::endl;
}

Go ahead, try it. A few days ago I would still have been convinced that this doesn't even compile. Then a student of mine ran into a problem with bitfields, and in tracking the problem down, we found out that it does - but with a surprising twist.
This is a story about one of the dark corners of C++.
Bitfields are a convenient feature. When you want to run a graph algorithm like Dijkstra and its extensions on a graph with tens of millions or even billions of nodes, reducing your node data structure from, say, 24 bytes to 16 bytes is a big deal. Sometimes this is possible because that 8-bit char really only needs 3 bits, and together with alignment issues these kinds of savings can add up. Bitfields are convenient syntax for what one would otherwise have to do manually using bit shifts and masking operations.
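To make the saving concrete, here is a minimal sketch; the field names are made up for illustration, and the exact 24-versus-16-byte numbers are an assumption that holds on common 64-bit ABIs:

#include <cstdint>
#include <iostream>

struct NodeWide {
    std::uint64_t tentative_distance;  // 8 bytes
    std::uint64_t predecessor;         // 8 bytes
    std::uint8_t  flags;               // 1 byte, but really only needs 3 bits
};                                     // padded to 24 bytes on typical 64-bit ABIs

struct NodePacked {
    std::uint64_t tentative_distance;  // 8 bytes
    std::uint64_t predecessor : 61;    // packed into a single 64-bit word
    std::uint64_t flags : 3;           //   together with the 3 flag bits
};                                     // 16 bytes on typical 64-bit ABIs

int main() {
    std::cout << sizeof(NodeWide) << ' ' << sizeof(NodePacked) << '\n';
    // Typically prints "24 16".
}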
Unfortunately, bitfields don't compose very well with other language features. For example, they don't have a byte-level address, so you cannot take pointers to them. Since you cannot take pointers to them, you cannot take references to them.
Wait, hold on, you can. But only const references. And what happens then is completely at odds with how references to variables work everywhere else in C++: a temporary object of the underlying non-bitfield type is allocated and initialized with the value of the bitfield. The reference then refers to that temporary. What?!
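To make this concrete, here is a minimal sketch of what does and does not compile, reusing the Bitfields struct from above:

#include <iostream>

struct Bitfields {
    unsigned int a : 3;
    unsigned int b : 29;
};

int main() {
    Bitfields bf{2, 3};
    // unsigned int* p = &bf.a;       // error: cannot take the address of a bit-field
    // unsigned int& r = bf.a;        // error: non-const reference cannot bind to a bit-field
    const unsigned int& cr = bf.a;    // OK: binds to a temporary copy of the value
    bf.a = 4;
    std::cout << cr << '\n';          // still prints 2: cr refers to the temporary
}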
Once you recover from the shock, a perfectly plausible and sane story explains this insane behavior. Bitfields don't compose well with the rest of the language. This could be changed. The type system could be extended to cover bitfields properly. References to a bitfield of type unsigned int : 7 could be represented as a pointer plus a bit-shift offset. Everything would make perfect theoretical sense... except that bitfields are only rarely used, and in mostly-C-looking code. They are already one of the premier sources of compiler bugs, and making everything involving a rarely-used feature like bitfields more complex is probably not going to help. Hence there isn't much interest in complicating things just to integrate bitfields properly. That's a perfectly reasonable and pragmatic decision to make.
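For illustration, here is a hypothetical, hand-rolled sketch of what such a "fat" bitfield reference could look like - a pointer plus a shift and a width. This is not how any compiler actually represents references; it is just a way to visualize the idea:

#include <cstdint>
#include <iostream>

// Hypothetical proxy for a reference to a bitfield inside a 32-bit word.
class BitfieldRef {
    std::uint32_t* word_;   // address of the containing word
    unsigned shift_;        // bit offset within the word
    unsigned width_;        // number of bits

public:
    BitfieldRef(std::uint32_t* word, unsigned shift, unsigned width)
        : word_(word), shift_(shift), width_(width) {}

    // Read: shift the containing word and mask off the relevant bits.
    operator std::uint32_t() const {
        std::uint32_t mask = (width_ == 32) ? ~0u : ((1u << width_) - 1);
        return (*word_ >> shift_) & mask;
    }

    // Write: clear the field's bits, then merge in the new value.
    BitfieldRef& operator=(std::uint32_t value) {
        std::uint32_t mask = (width_ == 32) ? ~0u : ((1u << width_) - 1);
        *word_ = (*word_ & ~(mask << shift_)) | ((value & mask) << shift_);
        return *this;
    }
};

int main() {
    std::uint32_t storage = 0;
    BitfieldRef a(&storage, 0, 3);   // plays the role of "unsigned int a : 3"
    a = 4;
    std::cout << static_cast<std::uint32_t>(a) << '\n';  // prints 4
}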
Except that with generic programming and templates, a common pattern is that functions take const T& parameters - including functions like std::max, which one would very plausibly want to use with a value from a bitfield. If you cannot take references to bitfields, that's not possible - and that would become really annoying really quickly for the people who actually do use bitfields. So a pragmatic compromise is made: the bitfield value is copied to a temporary, and a reference to that temporary is passed to the function.
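As a sketch of why this compromise matters, the following (again reusing the Bitfields struct from above) compiles and does what you would hope, precisely because each bitfield value is copied into a temporary that the const unsigned int& parameters of std::max can bind to:

#include <algorithm>
#include <iostream>

struct Bitfields {
    unsigned int a : 3;
    unsigned int b : 29;
};

int main() {
    Bitfields bf{2, 3};
    // Each argument is copied into a temporary unsigned int; the
    // const unsigned int& parameters of std::max bind to those temporaries.
    std::cout << std::max(bf.a, bf.b) << '\n';  // prints 3
}

Note that std::max returns a reference to one of those temporaries, which only lives until the end of the full-expression, so the result must be used right away, as above.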
In fact, one might think of this as a special case of what happens when you combine an implicit type conversion with the fact that const references can bind to temporary return values of functions. Consider the following sample program:
#include <iostream>

struct Ops {
    unsigned int a;
    operator unsigned int() const { return a; }
};

int main() {
    Ops o;
    o.a = 11;
    const unsigned int& s = o;
    std::cout << s << std::endl;
    o.a = 4;
    std::cout << s << std::endl;
}

This program prints 11 twice, because s ends up being bound to the temporary object that is returned by the custom type conversion operator. In 99% of all cases, this is exactly what you want, and suddenly the behavior seems downright logical - except that when you see only the first example program above, printing the same number twice is a pretty crazy outcome.
I actually like C++ as a language. I think it occupies an important part of the design space for languages, and especially with the more recent developments, it does so quite well - but it definitely has its share of weirdness. The saving grace is that there is a reasonable story for why it behaves the way it does, and while the outcome is a bit crazy, it doesn't actually matter much in practice - unlike, say, JavaScript's Array.prototype.sort.