The maximum number of bytes produced by encoding the specified number of characters.
To calculate the exact array size that UTF7Encoding.GetBytes(String, int, int, Byte[], int) requires to store the resulting bytes, the application should use UTF7Encoding.GetByteCount(Char[], int, int). To calculate the maximum array size, the application should use UTF7Encoding.GetMaxByteCount(int). The UTF7Encoding.GetByteCount(Char[], int, int) method generally lets the application allocate less memory, while the UTF7Encoding.GetMaxByteCount(int) method generally executes faster.
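The trade-off between exact and worst-case sizing can be illustrated with a short sketch. Python's built-in `utf-7` codec is used here as a stand-in for System.Text.UTF7Encoding (an assumption; the byte-level UTF-7 output is the same), and the worst-case bound uses the formula documented for UTF7Encoding.GetMaxByteCount, charCount * 3 + 2:

```python
# Exact sizing: encode (or count) first, then allocate exactly that much.
# Worst-case sizing: allocate up front without inspecting the characters.
text = "HelloWorld"

# Exact byte count for this particular string (ASCII encodes 1 byte/char).
exact = len(text.encode("utf-7"))

# Worst-case bound, per the formula documented for
# UTF7Encoding.GetMaxByteCount (assumption: charCount * 3 + 2).
worst_case = len(text) * 3 + 2

print(exact, worst_case)  # the exact count is far below the worst case
```

The exact count requires a pass over the data, while the worst-case bound is a constant-time computation, which is why the latter is faster but wastes memory.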
UTF7Encoding.GetMaxByteCount(int) returns a worst-case number, including the worst case for the currently selected System.Text.EncoderFallback. If a fallback with a potentially long replacement string is chosen, UTF7Encoding.GetMaxByteCount(int) can return a large value.
In most cases, this method returns reasonable numbers for small strings. For large strings, you might have to choose between using very large buffers and catching errors in the rare case that a more reasonable buffer is exceeded. You might also want to consider a different approach that uses an overload of UTF7Encoding.GetByteCount or Encoder.Convert. While UTF-7 is very efficient at encoding ASCII data (one byte per character), it is extremely inefficient for other data. As noted above, UTF7Encoding.GetMaxByteCount(int) deals with the worst case. If the data to be encoded is largely ASCII, and especially if the ASCII characters cluster together, the actual encoded output is significantly smaller than the number returned by this method suggests.
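The effect of clustering can be seen directly in the encoded byte counts. This sketch again uses Python's `utf-7` codec as a stand-in (an assumption): UTF-7 encodes a run of consecutive non-ASCII characters as a single shifted base64 block, so the same characters cost fewer bytes when grouped than when interleaved with ASCII:

```python
# Same six characters in both strings; only the ordering differs.
clustered = "abc\u03b1\u03b1\u03b1"  # "abcααα": non-ASCII grouped together
scattered = "a\u03b1b\u03b1c\u03b1"  # "aαbαcα": non-ASCII interleaved

clustered_len = len(clustered.encode("utf-7"))  # one base64 run: "abc+A7EDsQOx-"
scattered_len = len(scattered.encode("utf-7"))  # three runs: "a+A7E-b+A7E-c+A7E-"

print(clustered_len, scattered_len)
```

Each shift into and out of base64 mode costs extra bytes ('+' and '-'), so scattering non-ASCII characters multiplies that overhead; both counts are still below the worst-case bound of 6 * 3 + 2 = 20.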
UTF7Encoding.GetMaxByteCount(int) has no relation to UTF7Encoding.GetChars(Byte[], int, int, Char[], int). If your application needs a similar function to use with UTF7Encoding.GetChars(Byte[], int, int, Char[], int), it should use UTF7Encoding.GetMaxCharCount(int).
GetMaxByteCount(N) is not necessarily the same value as N * GetMaxByteCount(1).
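The non-linearity follows from the shape of the worst-case formula. Assuming the charCount * 3 + 2 formula documented for UTF7Encoding.GetMaxByteCount (the "+2" covers shift-in/shift-out overhead that is paid once per call, not once per character), a minimal sketch:

```python
def get_max_byte_count(char_count: int) -> int:
    # Assumed worst-case formula from the .NET reference for
    # UTF7Encoding.GetMaxByteCount: 3 bytes per character plus a
    # constant 2 bytes of shift overhead for the whole buffer.
    return char_count * 3 + 2

print(get_max_byte_count(10))       # 32
print(10 * get_max_byte_count(1))   # 50: summing per-character bounds
                                    # double-counts the constant overhead
```

This is why sizing a buffer as N * GetMaxByteCount(1) overallocates: the constant term is counted N times instead of once.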