Calculates the number of characters produced by decoding a sequence of bytes from the specified byte array.
The number of characters produced by decoding the specified sequence of bytes.
To calculate the exact array size required by UTF32Encoding.GetChars(Byte[], Int32, Int32, Char[], Int32) to store the resulting characters, the application uses UTF32Encoding.GetCharCount(Byte[], Int32, Int32). To calculate the maximum array size, the application should use UTF32Encoding.GetMaxCharCount(Int32). The UTF32Encoding.GetCharCount(Byte[], Int32, Int32) method generally allows allocation of less memory, while the UTF32Encoding.GetMaxCharCount(Int32) method generally executes faster.
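The following is a minimal sketch of the two sizing strategies; the input string and the variable names (exactCount, maxCount) are illustrative:

```csharp
using System;
using System.Text;

class GetCharCountExample
{
    static void Main()
    {
        UTF32Encoding utf32 = new UTF32Encoding();
        byte[] bytes = utf32.GetBytes("UTF-32 sample");

        // Exact count: scans the bytes, but allocates no more than needed.
        int exactCount = utf32.GetCharCount(bytes, 0, bytes.Length);
        char[] chars = new char[exactCount];
        utf32.GetChars(bytes, 0, bytes.Length, chars, 0);

        // Maximum count: computed from the byte count alone, so it is
        // faster to obtain but may over-allocate.
        int maxCount = utf32.GetMaxCharCount(bytes.Length);

        Console.WriteLine($"Exact: {exactCount}, Max: {maxCount}");
    }
}
```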
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
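A minimal sketch of the difference, using the UTF32Encoding(Boolean, Boolean, Boolean) constructor's throwOnInvalidCharacters parameter to toggle error detection; the three-byte input is a deliberately truncated UTF-32 code unit chosen for illustration:

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // Only three bytes of a four-byte UTF-32 code unit: an invalid sequence.
        byte[] invalidBytes = { 0x41, 0x00, 0x00 };

        // With error detection enabled, GetCharCount throws on the invalid
        // sequence (a DecoderFallbackException, which derives from ArgumentException).
        UTF32Encoding throwing = new UTF32Encoding(false, true, true);
        try
        {
            throwing.GetCharCount(invalidBytes, 0, invalidBytes.Length);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine($"Caught: {e.GetType().Name}");
        }

        // Without error detection, the invalid sequence does not raise an
        // exception; the default fallback handles it and a count is returned.
        UTF32Encoding lenient = new UTF32Encoding(false, true, false);
        int count = lenient.GetCharCount(invalidBytes, 0, invalidBytes.Length);
        Console.WriteLine($"Count without error detection: {count}");
    }
}
```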