Calculates the number of characters produced by decoding a sequence of bytes starting at the specified byte pointer.
The number of characters produced by decoding the specified sequence of bytes.
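A minimal sketch of calling the byte-pointer overload is shown below; it assumes a project compiled with unsafe code enabled, and the sample string and variable names are illustrative only.

```csharp
using System;
using System.Text;

class GetCharCountPointerExample
{
    static unsafe void Main()
    {
        UTF32Encoding utf32 = new UTF32Encoding();

        // Encode a sample string; UTF-32 uses four bytes per code point.
        byte[] bytes = utf32.GetBytes("UTF-32 sample");

        // Pin the array and pass a byte pointer plus a byte count to GetCharCount.
        fixed (byte* pBytes = bytes)
        {
            int charCount = utf32.GetCharCount(pBytes, bytes.Length);
            Console.WriteLine(charCount);   // 13 characters for this input
        }
    }
}
```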
To calculate the exact array size required by UTF32Encoding.GetChars(Byte[], Int32, Int32, Char[], Int32) to store the resulting characters, use UTF32Encoding.GetCharCount(Byte[], Int32, Int32). To calculate the maximum array size, the application should use UTF32Encoding.GetMaxCharCount(Int32). The UTF32Encoding.GetCharCount(Byte[], Int32, Int32) method generally allows allocation of less memory, while the UTF32Encoding.GetMaxCharCount(Int32) method generally executes faster.
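The following sketch contrasts the two sizing approaches; the input string and variable names are assumptions for illustration.

```csharp
using System;
using System.Text;

class BufferSizingExample
{
    static void Main()
    {
        UTF32Encoding utf32 = new UTF32Encoding();
        byte[] bytes = utf32.GetBytes("exact versus maximum");

        // Exact size: requires examining the byte sequence, but wastes no memory.
        int exact = utf32.GetCharCount(bytes, 0, bytes.Length);

        // Maximum size: computed from the byte count alone, so it is faster,
        // but it can overestimate the buffer that is actually needed.
        int max = utf32.GetMaxCharCount(bytes.Length);

        // Allocate the exact-sized buffer and decode into it.
        char[] chars = new char[exact];
        utf32.GetChars(bytes, 0, bytes.Length, chars, 0);

        Console.WriteLine($"Exact: {exact}, Maximum: {max}, Decoded: {new string(chars)}");
    }
}
```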
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
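A minimal sketch of the two behaviors is shown below, using the UTF32Encoding constructor's throwOnInvalidCharacters parameter to toggle error detection; the invalid byte sequence (a value above U+10FFFF) is chosen only for illustration.

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // 0x00110000 is above U+10FFFF, so these four bytes form an invalid
        // little-endian UTF-32 sequence.
        byte[] invalid = { 0x00, 0x00, 0x11, 0x00 };

        // throwOnInvalidCharacters: true enables error detection.
        UTF32Encoding throwing = new UTF32Encoding(false, true, true);
        try
        {
            throwing.GetCharCount(invalid, 0, invalid.Length);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine($"Error detection enabled: {e.GetType().Name}");
        }

        // Without error detection, the same input does not cause an exception.
        UTF32Encoding lenient = new UTF32Encoding(false, true, false);
        int count = lenient.GetCharCount(invalid, 0, invalid.Length);
        Console.WriteLine($"Error detection disabled: {count} character(s) reported");
    }
}
```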