C specifies minimum sizes. That's all you need 99% of the time. I'm always annoyed by people who assume int is 32 bits. You can't assume that in portable code: use long, or ensure the code works with 16-bit ints. That is how the type system was meant to be used. int was supposed to reflect the natural word size of the machine, so you could work with the optimal integral type across dissimilar platforms. 64-bit platforms have mostly abandoned that idea and 8-bit ones never got to participate, but the principle is embedded in the language.
> The programmer should rather prescribe intent and shouldn't constantly think about what size this should exactly have.
You still have to constantly think about size! Except now you have to think about _minimum_ size, and possibly reach for an oversized type because the one that fits your platform exactly only guarantees a minimum that's too small for what you want to do.
It does agree with what I intended to say. The values a type needs to be able to represent are very much part of the intent of a variable. What the programmer doesn't need to specify is with what bit pattern, and with which exact bits, those values are represented. There are use cases where you in fact do want to specify that, but then that implies you actually care about the wrapping semantics and are going to manipulate bit patterns.