System.Text.UTF32Encoding.GetCharCount Method

Calculates the number of characters produced by decoding a sequence of bytes starting at the specified byte pointer.

Syntax

[System.CLSCompliant(false)]
public override int GetCharCount (byte* bytes, int count)

Parameters

bytes
A pointer to the first byte to decode.
count
The number of bytes to decode.

Returns

The number of characters produced by decoding the specified sequence of bytes.
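A minimal sketch of calling this pointer-based overload from a `fixed` block; it assumes a little-endian `UTF32Encoding` without a byte order mark, and must be compiled with unsafe code enabled (`/unsafe`).

```csharp
using System;
using System.Text;

class PointerExample
{
    static unsafe void Main()
    {
        UTF32Encoding utf32 = new UTF32Encoding();

        // "héllo" is 5 code points, so UTF-32 encodes it in 20 bytes.
        byte[] bytes = utf32.GetBytes("héllo");

        // Pin the managed array to obtain a byte* for the pointer overload.
        fixed (byte* p = bytes)
        {
            int charCount = utf32.GetCharCount(p, bytes.Length);
            Console.WriteLine(charCount);  // 5
        }
    }
}
```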

Remarks

To calculate the exact array size required by UTF32Encoding.GetChars(Byte[], int, int, Char[], int) to store the resulting characters, use UTF32Encoding.GetCharCount(Byte[], int, int). To calculate the maximum array size, use UTF32Encoding.GetMaxCharCount(int). The UTF32Encoding.GetCharCount(Byte[], int, int) method generally lets the application allocate less memory, while the UTF32Encoding.GetMaxCharCount(int) method generally executes faster because it does not examine the byte sequence.
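The trade-off above can be sketched with the safe array-based overloads: the exact count walks the bytes, while the worst-case count is derived from the byte count alone and may overallocate.

```csharp
using System;
using System.Text;

class SizingExample
{
    static void Main()
    {
        UTF32Encoding utf32 = new UTF32Encoding();
        byte[] bytes = utf32.GetBytes("UTF-32 example");

        // Exact count: inspects the byte sequence, so the buffer has no slack.
        int exact = utf32.GetCharCount(bytes, 0, bytes.Length);

        // Worst-case count: computed from the byte count only; faster, but
        // typically larger than the exact count.
        int max = utf32.GetMaxCharCount(bytes.Length);

        char[] chars = new char[exact];
        utf32.GetChars(bytes, 0, bytes.Length, chars, 0);
        Console.WriteLine($"exact={exact}, max={max}");
    }
}
```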

With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
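A short sketch of both modes, using the UTF32Encoding constructor's throwOnInvalidCharacters parameter; the byte sequence 0xFF 0xFF 0xFF 0xFF is not a valid UTF-32 code point, so a strict encoding rejects it while a lenient one lets the decoder fallback handle it.

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // 0xFFFFFFFF is outside the valid UTF-32 range.
        byte[] invalid = { 0xFF, 0xFF, 0xFF, 0xFF };

        // throwOnInvalidCharacters: true -> invalid sequences raise an exception.
        UTF32Encoding strict = new UTF32Encoding(false, true, true);
        try
        {
            strict.GetCharCount(invalid, 0, invalid.Length);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine("Invalid sequence detected: " + e.GetType().Name);
        }

        // throwOnInvalidCharacters: false -> no exception; the invalid bytes
        // are handled by the decoder fallback instead.
        UTF32Encoding lenient = new UTF32Encoding(false, false, false);
        Console.WriteLine(lenient.GetCharCount(invalid, 0, invalid.Length));
    }
}
```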

Requirements

Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)
Assembly Versions: 2.0.0.0, 4.0.0.0
Since: .NET 2.0