The maximum number of characters produced by decoding the specified number of bytes.
Exceptions:
ArgumentOutOfRangeException: byteCount is less than zero.
To calculate the exact array size required by UTF8Encoding.GetChars(Byte[], int, int, Char[], int) to store the resulting characters, the application uses UTF8Encoding.GetCharCount(Byte[], int, int). To calculate the maximum array size, the application should use UTF8Encoding.GetMaxCharCount(int). Using UTF8Encoding.GetCharCount(Byte[], int, int) generally allows a smaller buffer to be allocated, while UTF8Encoding.GetMaxCharCount(int) generally executes faster because it does not have to scan the input.
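To illustrate the trade-off, the sketch below contrasts an exact count (which must decode and measure the actual bytes) with a constant-time worst-case bound. The formula byte_count + 1 mirrors what UTF8Encoding.GetMaxCharCount returns under the default single-character fallback, but treat it as an assumption of this sketch rather than a contract.

```python
def exact_char_count(data: bytes) -> int:
    # Exact: decode the actual bytes and count the resulting characters.
    # Analogous to UTF8Encoding.GetCharCount -- requires scanning the input.
    return len(data.decode("utf-8", errors="replace"))

def max_char_count(byte_count: int) -> int:
    # Worst case: every byte could decode to one character, plus one extra
    # for a character completed from bytes buffered in a previous call.
    # (byte_count + 1 is assumed here; it matches UTF8Encoding.GetMaxCharCount
    # with the default fallback.)
    if byte_count < 0:
        raise ValueError("byte_count < 0")  # analogue of ArgumentOutOfRangeException
    return byte_count + 1

data = "héllo".encode("utf-8")     # 6 bytes, 5 characters
print(exact_char_count(data))      # 5 -- tight buffer, but input was scanned
print(max_char_count(len(data)))   # 7 -- safe upper bound, no scan needed
```

The bound is never smaller than the exact count, so a buffer sized with it always suffices; the cost is the extra memory.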
UTF8Encoding.GetMaxCharCount(int) returns a worst-case number, including the worst case for the currently selected System.Text.DecoderFallback. If a fallback with a potentially long replacement string is chosen, UTF8Encoding.GetMaxCharCount(int) can return a very large value.
In most cases, this method returns reasonable numbers for small strings. For large strings, you might have to choose between using very large buffers and catching errors in the rare case that a more reasonable buffer is exceeded. You might also want to consider a different approach using UTF8Encoding.GetCharCount or Encoder.Convert.
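The effect of a long fallback string on the bound can be sketched as follows. The model here, multiplying the worst-case slot count by the longest string the fallback can emit, is a hypothetical illustration of how DecoderFallback.MaxCharCount feeds into the bound, not the documented formula.

```python
def max_char_count_with_fallback(byte_count: int, fallback_max_chars: int) -> int:
    # Hypothetical model: each of the (byte_count + 1) worst-case slots may
    # expand to the longest string the fallback can produce (illustrating the
    # role of DecoderFallback.MaxCharCount in the bound).
    return (byte_count + 1) * max(1, fallback_max_chars)

print(max_char_count_with_fallback(1024, 1))   # 1025 -- default-style fallback
print(max_char_count_with_fallback(1024, 16))  # 16400 -- buffers grow quickly
```

This is why a custom fallback with a long replacement string can make worst-case buffers impractical for large inputs.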
UTF8Encoding.GetMaxCharCount(int) has no relation to UTF8Encoding.GetBytes(Char[], int, int, Byte[], int). If your application needs a similar function to use with UTF8Encoding.GetBytes(Char[], int, int, Byte[], int), it should use UTF8Encoding.GetMaxByteCount(int).
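The pairing between the two directions can be sketched as below. Both worst-case formulas (byte_count + 1 characters for decoding; up to 3 UTF-8 bytes per UTF-16 code unit plus leftover-surrogate headroom for encoding) are illustrative assumptions of this sketch.

```python
def max_char_count(byte_count: int) -> int:
    # Bound for decoding bytes -> chars; pairs with GetChars.
    return byte_count + 1

def max_byte_count(char_count: int) -> int:
    # Bound for encoding chars -> bytes; pairs with GetBytes.
    # Assumes up to 3 UTF-8 bytes per UTF-16 code unit, plus one extra
    # slot for a high surrogate buffered from a previous call.
    return (char_count + 1) * 3

print(max_char_count(10))  # 11 -- size a char[] before calling GetChars
print(max_byte_count(10))  # 33 -- size a byte[] before calling GetBytes
```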
GetMaxCharCount(N) is not necessarily the same value as N * GetMaxCharCount(1).
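To see why the bound is not linear, assume for illustration that the worst case is byte_count + 1 characters (which matches the default fallback). Then GetMaxCharCount(1) is 2, so N * GetMaxCharCount(1) is 2N, while GetMaxCharCount(N) is only N + 1.

```python
def max_char_count(byte_count: int) -> int:
    # Assumed worst-case model: byte_count + 1 (default fallback).
    return byte_count + 1

n = 100
print(max_char_count(n))      # 101 -- one bound for the whole buffer
print(n * max_char_count(1))  # 200 -- summing per-byte bounds overestimates
```

The per-byte headroom (the "+ 1") is paid once for the whole buffer, not once per byte, which is why summing per-byte bounds roughly doubles the allocation.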