The maximum number of characters produced by decoding the specified number of bytes.
Exceptions: ArgumentOutOfRangeException if byteCount is less than zero.
To calculate the exact array size required by UnicodeEncoding.GetChars(Byte[], Int32, Int32, Char[], Int32) to store the resulting characters, use UnicodeEncoding.GetCharCount(Byte[], Int32, Int32). To calculate the maximum array size, use UnicodeEncoding.GetMaxCharCount(Int32). The UnicodeEncoding.GetCharCount method generally lets the application allocate less memory, while the UnicodeEncoding.GetMaxCharCount method generally executes faster.
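A minimal sketch contrasting the two sizing strategies; the input string and variable names are illustrative:

using System;
using System.Text;

class SizingExample
{
    static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        byte[] bytes = unicode.GetBytes("Hello, world!");

        // Exact sizing: one extra pass over the data, but no wasted space.
        int exactSize = unicode.GetCharCount(bytes, 0, bytes.Length);
        char[] exactBuffer = new char[exactSize];
        unicode.GetChars(bytes, 0, bytes.Length, exactBuffer, 0);

        // Worst-case sizing: computed from the byte count alone, so it is
        // faster to obtain, but the buffer may exceed what the output needs.
        int maxSize = unicode.GetMaxCharCount(bytes.Length);
        char[] maxBuffer = new char[maxSize];
        int charsWritten = unicode.GetChars(bytes, 0, bytes.Length, maxBuffer, 0);

        Console.WriteLine($"exact: {exactSize}, max: {maxSize}, written: {charsWritten}");
    }
}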
UnicodeEncoding.GetMaxCharCount(Int32) returns a worst-case number, including the worst case for the currently selected System.Text.DecoderFallback. If a fallback whose replacement string is potentially long is chosen, UnicodeEncoding.GetMaxCharCount(Int32) returns correspondingly large values.
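The following sketch illustrates that effect; the replacement string is arbitrary:

using System;
using System.Text;

class FallbackExample
{
    static void Main()
    {
        // UTF-16 with the default decoder fallback.
        Encoding plain = Encoding.Unicode;

        // The same encoding, but every invalid byte sequence is replaced by a
        // long string, which inflates the worst-case character count.
        Encoding verbose = Encoding.GetEncoding(
            "utf-16",
            EncoderFallback.ReplacementFallback,
            new DecoderReplacementFallback("[invalid byte sequence]"));

        Console.WriteLine(plain.GetMaxCharCount(100));
        Console.WriteLine(verbose.GetMaxCharCount(100));   // typically much larger
    }
}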
In most cases, this method returns reasonable numbers for small strings. For large strings, you might have to choose between using very large buffers and catching errors in the rare case that a more reasonable buffer is exceeded. You might also want to consider a different approach that uses the UnicodeEncoding.GetCharCount or Decoder.Convert overloads.
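As an illustration of the Decoder.Convert approach, this sketch decodes a large input through a small fixed-size buffer instead of allocating a worst-case array; the input and buffer size are arbitrary:

using System;
using System.Text;

class ConvertExample
{
    static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        byte[] bytes = unicode.GetBytes(new string('x', 100_000));

        // A 256-character buffer replaces one sized by GetMaxCharCount.
        Decoder decoder = unicode.GetDecoder();
        char[] buffer = new char[256];
        int byteIndex = 0;
        bool completed = false;

        while (!completed)
        {
            // flush: true is safe here because the entire input is in hand;
            // Convert consumes only as many bytes as fit into the buffer.
            decoder.Convert(bytes, byteIndex, bytes.Length - byteIndex,
                            buffer, 0, buffer.Length, flush: true,
                            out int bytesUsed, out int charsUsed, out completed);
            byteIndex += bytesUsed;
            // Process the first charsUsed characters of buffer here.
        }
    }
}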
UnicodeEncoding.GetMaxCharCount(Int32) has no relation to UnicodeEncoding.GetBytes(Char[], Int32, Int32, Byte[], Int32). If your application needs similar functionality to use with UnicodeEncoding.GetBytes, it should use UnicodeEncoding.GetMaxByteCount(Int32).
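For example, a buffer for GetBytes is sized with GetMaxByteCount, mirroring the GetMaxCharCount pattern above; the input string is arbitrary:

using System;
using System.Text;

class EncodeSizingExample
{
    static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        char[] chars = "example".ToCharArray();

        // GetMaxByteCount plays the same role for GetBytes that
        // GetMaxCharCount plays for GetChars.
        byte[] buffer = new byte[unicode.GetMaxByteCount(chars.Length)];
        int bytesWritten = unicode.GetBytes(chars, 0, chars.Length, buffer, 0);
        Console.WriteLine($"buffer: {buffer.Length}, written: {bytesWritten}");
    }
}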
GetMaxCharCount(N) is not necessarily the same value as N * GetMaxCharCount(1).
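A quick way to observe this:

using System;
using System.Text;

class NonLinearExample
{
    static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();

        // The per-call worst-case overhead (for example, a possible trailing
        // partial character) is counted once on the first line but ten times
        // on the second, so the two values generally differ.
        Console.WriteLine(unicode.GetMaxCharCount(10));
        Console.WriteLine(10 * unicode.GetMaxCharCount(1));
    }
}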