The maximum number of bytes produced by encoding the specified number of characters.
Exceptions:
ArgumentOutOfRangeException: charCount is less than zero.
To calculate the exact array size required by UnicodeEncoding.GetBytes(string, int, int, Byte[], int) to store the resulting bytes, the application should use UnicodeEncoding.GetByteCount(Char[], int, int). To calculate the maximum array size, it should use UnicodeEncoding.GetMaxByteCount(int). The UnicodeEncoding.GetByteCount(Char[], int, int) method generally allows allocation of less memory, while the UnicodeEncoding.GetMaxByteCount(int) method generally executes faster.
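For example, a minimal sketch along these lines contrasts the two sizing strategies (the sample text and console output are illustrative assumptions, not part of the API documentation):

```csharp
using System;
using System.Text;

class BufferSizing
{
    static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        char[] chars = "Pi (\u03C0) and Sigma (\u03A3).".ToCharArray();

        // Exact byte count: cheaper on memory, but costs a pass over the data.
        int exact = unicode.GetByteCount(chars, 0, chars.Length);

        // Worst-case byte count: no pass over the data, but may overallocate.
        int max = unicode.GetMaxByteCount(chars.Length);

        // Sizing the buffer with the worst case guarantees GetBytes fits.
        byte[] buffer = new byte[max];
        int written = unicode.GetBytes(chars, 0, chars.Length, buffer, 0);

        Console.WriteLine($"Exact: {exact}, Max: {max}, Written: {written}");
    }
}
```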
UnicodeEncoding.GetMaxByteCount(int) returns a worst-case number that includes the worst case for the currently selected System.Text.EncoderFallback. If a fallback with a potentially long replacement string is chosen, UnicodeEncoding.GetMaxByteCount(int) can return correspondingly large values.
In most cases, this method returns reasonable numbers for small strings. For large strings, you might have to choose between using very large buffers and catching errors in the rare case that a more reasonable buffer is exceeded. You might also want to consider a different approach that uses one of the UnicodeEncoding.GetByteCount overloads or System.Text.Encoder.Convert, as sketched below.
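One such sketch, assuming the chunked-conversion pattern that Encoder.Convert is designed for (the workload size, 4 KB buffer, and MemoryStream sink are illustrative choices): a small fixed-size buffer is reused across calls, so no worst-case allocation is ever needed.

```csharp
using System;
using System.IO;
using System.Text;

class ChunkedEncode
{
    static void Main()
    {
        // Hypothetical workload: a string large enough that
        // GetMaxByteCount would suggest an uncomfortably big buffer.
        char[] chars = new string('\u03A9', 100_000).ToCharArray();

        Encoder encoder = new UnicodeEncoding().GetEncoder();
        byte[] buffer = new byte[4096];          // small, fixed-size buffer
        using MemoryStream output = new MemoryStream();

        int charIndex = 0;
        bool completed = false;
        while (!completed)
        {
            // Convert as many characters as fit in the buffer; 'completed'
            // becomes true once all input is consumed and flushed.
            encoder.Convert(chars, charIndex, chars.Length - charIndex,
                            buffer, 0, buffer.Length, true /* flush */,
                            out int charsUsed, out int bytesUsed, out completed);
            charIndex += charsUsed;
            output.Write(buffer, 0, bytesUsed);
        }

        Console.WriteLine($"Encoded {output.Length} bytes in 4 KB chunks.");
    }
}
```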
UnicodeEncoding.GetMaxByteCount(int) has no relation to UnicodeEncoding.GetChars(Byte[], int, int, Char[], int). If your application needs a similar function to use with UnicodeEncoding.GetChars(Byte[], int, int, Char[], int), it should use UnicodeEncoding.GetMaxCharCount(int).
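A brief sketch of that decoding-side pairing (the sample input is an illustrative assumption):

```csharp
using System;
using System.Text;

class DecodeSizing
{
    static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();
        byte[] bytes = unicode.GetBytes("sample \u03C0 text");

        // Size the char buffer with GetMaxCharCount, the decoding-side
        // counterpart of GetMaxByteCount, then decode with GetChars.
        char[] chars = new char[unicode.GetMaxCharCount(bytes.Length)];
        int decoded = unicode.GetChars(bytes, 0, bytes.Length, chars, 0);

        Console.WriteLine(new string(chars, 0, decoded));
    }
}
```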
GetMaxByteCount(N) is not necessarily the same value as N * GetMaxByteCount(1).
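A quick check makes this concrete. The values in the comments were observed on current .NET with the default fallback, where each call pads the count for a possible leftover surrogate from a previous encoder call; treat them as illustrative, since the worst case depends on the configured fallback:

```csharp
using System;
using System.Text;

class MaxByteCountIsNotLinear
{
    static void Main()
    {
        UnicodeEncoding unicode = new UnicodeEncoding();

        // The per-call padding makes the result non-linear in charCount.
        Console.WriteLine(unicode.GetMaxByteCount(10));     // 22 on current .NET
        Console.WriteLine(10 * unicode.GetMaxByteCount(1)); // 40
    }
}
```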