Calculates the number of characters produced by decoding a sequence of bytes from the specified byte array.
The number of characters produced by decoding the specified sequence of bytes.
Exceptions:
- ArgumentNullException: bytes is null.
- ArgumentOutOfRangeException: index or count is less than zero, or index and count do not specify a valid range in bytes (that is, (index + count) > bytes.Length).
- ArgumentException: Error checking is turned on for the current instance and bytes contains an invalid surrogate sequence.
To calculate the exact array size required by UTF8Encoding.GetChars(Byte[], Int32, Int32, Char[], Int32) to store the resulting characters, the application uses UTF8Encoding.GetCharCount(Byte[], Int32, Int32). To calculate the maximum array size, the application uses UTF8Encoding.GetMaxCharCount(Int32). The GetCharCount method generally allows allocation of less memory, while the GetMaxCharCount method generally executes faster.
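The exact-versus-maximum trade-off above can be sketched as follows. This is an illustrative example, not part of the reference text; the string literal and variable names are chosen for demonstration only:

```csharp
using System;
using System.Text;

class GetCharCountExample
{
    static void Main()
    {
        var utf8 = new UTF8Encoding();

        // "héllo" encodes to 6 bytes: 'é' takes two bytes in UTF-8.
        byte[] bytes = utf8.GetBytes("héllo");

        // Exact count: decodes the input to measure it, so it is slower
        // but yields the smallest sufficient buffer size (5 here).
        int exact = utf8.GetCharCount(bytes, 0, bytes.Length);

        // Maximum count: a fast worst-case bound based only on the byte
        // count; always >= the exact count.
        int max = utf8.GetMaxCharCount(bytes.Length);

        // Allocate using the exact count, then decode into the buffer.
        char[] chars = new char[exact];
        utf8.GetChars(bytes, 0, bytes.Length, chars, 0);

        Console.WriteLine(exact);             // 5
        Console.WriteLine(max >= exact);      // True
        Console.WriteLine(new string(chars)); // héllo
    }
}
```

Using GetCharCount keeps the buffer as small as possible; using GetMaxCharCount avoids a decoding pass at the cost of over-allocating.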
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
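The two error-detection modes correspond to the throwOnInvalidBytes constructor parameter. A minimal sketch, assuming a deliberately invalid input byte (0xFF is never valid in UTF-8); note that on newer runtimes the non-throwing encoding may substitute the replacement character U+FFFD for invalid bytes rather than silently dropping them, so the lenient count is not asserted here:

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        byte[] invalid = { 0x61, 0xFF, 0x62 }; // 'a', invalid byte, 'b'

        // throwOnInvalidBytes: false — no exception is thrown for
        // invalid input; the count reflects whatever fallback applies.
        var lenient = new UTF8Encoding(false, false);
        Console.WriteLine(lenient.GetCharCount(invalid, 0, invalid.Length));

        // throwOnInvalidBytes: true — invalid input raises ArgumentException.
        var strict = new UTF8Encoding(false, true);
        try
        {
            strict.GetCharCount(invalid, 0, invalid.Length);
        }
        catch (ArgumentException)
        {
            Console.WriteLine("ArgumentException thrown");
        }
    }
}
```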