Calculates the number of characters produced by decoding a sequence of bytes starting at the specified byte pointer.
The number of characters produced by decoding the specified sequence of bytes.
To calculate the exact array size required by UTF8Encoding.GetChars(Byte[], Int32, Int32, Char[], Int32) to store the resulting characters, the application uses UTF8Encoding.GetCharCount(Byte[], Int32, Int32). To calculate the maximum array size, the application uses UTF8Encoding.GetMaxCharCount(Int32). The UTF8Encoding.GetCharCount(Byte[], Int32, Int32) method generally lets the application allocate less memory, while the UTF8Encoding.GetMaxCharCount(Int32) method generally executes faster. A sketch of both sizing strategies follows.
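The following sketch is not part of the original reference text; it contrasts the two sizing strategies when decoding from a byte pointer. The sample string and variable names are illustrative only, and the code assumes compilation with unsafe code enabled.

using System;
using System.Text;

class CharCountExample
{
    static unsafe void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();

        // Illustrative multi-byte UTF-8 data.
        byte[] encoded = utf8.GetBytes("za\u0306\u01FD\u03B2");

        fixed (byte* pBytes = encoded)
        {
            // Exact size: requires scanning the bytes, but usually allows a
            // smaller allocation.
            int exactCount = utf8.GetCharCount(pBytes, encoded.Length);

            // Maximum size: computed from the byte count alone, so it runs
            // faster but may over-allocate.
            int maxCount = utf8.GetMaxCharCount(encoded.Length);

            char[] chars = new char[exactCount];
            fixed (char* pChars = chars)
            {
                utf8.GetChars(pBytes, encoded.Length, pChars, chars.Length);
            }

            Console.WriteLine($"Exact count: {exactCount}, maximum count: {maxCount}");
            Console.WriteLine(new string(chars));
        }
    }
}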
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
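The fragment below is an illustrative addition (hypothetical variable names, using the array-based GetCharCount overload) showing the difference: a UTF8Encoding constructed with throwOnInvalidBytes set to true raises an exception on a truncated byte sequence, while one constructed without error detection completes the call.

using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // 0xC2 begins a two-byte sequence, but its continuation byte is missing.
        byte[] invalid = { 0x61, 0xC2 };

        UTF8Encoding withDetection = new UTF8Encoding(false, true);     // throwOnInvalidBytes = true
        UTF8Encoding withoutDetection = new UTF8Encoding(false, false); // throwOnInvalidBytes = false

        try
        {
            withDetection.GetCharCount(invalid, 0, invalid.Length);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine($"With error detection: {e.GetType().Name} thrown");
        }

        // Without error detection the call completes; the invalid byte is handled
        // by the encoding's fallback rather than causing an exception.
        int count = withoutDetection.GetCharCount(invalid, 0, invalid.Length);
        Console.WriteLine($"Without error detection: {count} character(s) reported");
    }
}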