[cxx-abi-dev] Run-time array checking
John McCall
rjmccall at apple.com
Fri Sep 7 06:43:51 UTC 2012
On Sep 6, 2012, at 7:40 PM, Dennis Handly wrote:
>> From: John McCall <rjmccall at apple.com>
>> Clang handles large unsigned. This is compiler-generated code, so
>> we do know whether the value has signed type.
>
> It seems strange that the code for signed is different from unsigned, but
> the Standard says that signed could overflow and that it's implementation defined.
This conversation is about how to handle various possible values that the
first size expression in an array-new expression might take. That expression
must be of integer type, but it's permitted to have signed integer type, and
can therefore be negative. In that case, C++11 demands that we throw
an exception of a certain type, std::bad_array_new_length.
This is unrelated to the semantics of overflow in signed arithmetic.
>> Yeah, the assumption that SIZE_MAX is invalid to allocate is valid on
>> basically every flat-addressed platform; it's just not guaranteed by the
>> standard. But you can imagine a platform where individual allocations
>> can't exceed some size that's significantly smaller than a pointer …
>
> I thought you got that backwards but if sizeof(size_t) is < sizeof(uintmax),
> then that would truncate that -1 to a much smaller value.
>
> But do we care? For that architecture, the implementation-defined limit
> can be set to < SIZE_MAX.
I'm not totally comfortable with the ABI making that decision; it seems like
a decision that platform owners should make. On a platform where
size_t is as large as the address space, sure. On a platform with an
intentionally constrained size_t, maybe not.
>> I guess you could make an abstract argument that an
>> array allocation which could have succeeded with a different bound
>> should always produce std::bad_array_new_length
>
> But isn't that what bad_alloc also says, not enough memory, you greedy
> pig?
The point is that if the spec says "throw a std::bad_array_new_length",
we can't just throw a normal std::bad_alloc, because that's not compliant.
A normal std::bad_alloc means "we couldn't allocate that for some reason";
std::bad_array_new_length is basically a clarification that the failure was
inherent: an allocation with that bound could never have succeeded.
> Or is this the difference between "new []" and operator new/operator new[]?
> The latter two know nothing about "bounds".
It's part of the semantics of new[], yes. operator new[] is not required to
throw this specific exception type. Also, as I read it, the standard implies
that we shouldn't even be calling operator new[] if we have an invalid
size, so we can't handle this by just having operator new[] always
throw the more specific exception.
>> You could make a serious argument that the only allocations which
>> *must* throw std::bad_array_new_length rather than just std::bad_alloc
>> are the cases where you can't call the allocator because the size_t
>> argument would be negative or otherwise mathematically wrong.
>
> Which means you have to be careful for overflows in the evaluation.
>
>> if we're creating a new, constant-sized array of PODs,
>
> (Compile time constant?)
Possibly only constant after optimization, but yes, that's what I meant.
John.