System.Text.UTF32Encoding.GetMaxByteCount Method

Calculates the maximum number of bytes produced by encoding the specified number of characters.

Syntax

public override int GetMaxByteCount (int charCount)

Parameters

charCount
The number of characters to encode.

Returns

The maximum number of bytes produced by encoding the specified number of characters.

Remarks

To calculate the exact array size required by UTF32Encoding.GetBytes(string, int, int, Byte[], int) to store the resulting bytes, the application uses UTF32Encoding.GetByteCount(Char[], int, int). To calculate the maximum array size, the application should use UTF32Encoding.GetMaxByteCount(int). The UTF32Encoding.GetByteCount(Char[], int, int) method generally allows allocation of less memory, while the UTF32Encoding.GetMaxByteCount(int) method generally executes faster.
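The trade-off between the two sizing strategies can be sketched as follows. This is an illustrative example, not part of the reference itself; the string literal is arbitrary, and the exact worst-case value depends on the currently selected encoder fallback:

```csharp
using System;
using System.Text;

UTF32Encoding encoding = new UTF32Encoding();
char[] chars = "Hello, world".ToCharArray();

// Exact size: scans the actual characters, allocates the minimum.
int exact = encoding.GetByteCount(chars, 0, chars.Length);

// Worst case: computed from the count alone, without scanning the data.
int max = encoding.GetMaxByteCount(chars.Length);

// A buffer sized with GetMaxByteCount is always large enough for GetBytes.
byte[] bytes = new byte[max];
int written = encoding.GetBytes(chars, 0, chars.Length, bytes, 0);

Console.WriteLine($"exact={exact} max={max} written={written}");
```

For this BMP-only string, `exact` and `written` are both 48 (12 characters at 4 bytes each), while `max` is somewhat larger because it must cover the worst case for any input of the same length.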

UTF32Encoding.GetMaxByteCount(int) is a worst-case number, including the worst case for the currently selected System.Text.EncoderFallback. If a fallback with a potentially long replacement string is chosen, UTF32Encoding.GetMaxByteCount(int) can return large values.

In most cases, this method returns reasonable numbers for small strings. For large strings, you might have to choose between using very large buffers and catching errors in the rare case that a more reasonable buffer is exceeded. You might also want to consider a different approach and use UTF32Encoding.GetByteCount or Encoder.Convert.
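The Encoder.Convert alternative mentioned above avoids sizing a buffer for the whole input at once: it encodes as much as fits into a fixed buffer per call and reports its progress through output parameters. A minimal sketch, assuming a single pass over an in-memory char array (the 256-byte buffer size is arbitrary):

```csharp
using System;
using System.Text;

UTF32Encoding encoding = new UTF32Encoding();
Encoder encoder = encoding.GetEncoder();

char[] chars = new string('a', 1000).ToCharArray();
byte[] buffer = new byte[256];   // fixed, modest buffer; size chosen for illustration

int charIndex = 0;
int totalBytes = 0;
bool completed = false;
while (!completed)
{
    // Convert encodes as many characters as fit in the buffer and
    // reports how many characters and bytes it consumed/produced.
    encoder.Convert(chars, charIndex, chars.Length - charIndex,
                    buffer, 0, buffer.Length,
                    flush: true,
                    out int charsUsed, out int bytesUsed, out completed);
    charIndex += charsUsed;
    totalBytes += bytesUsed;
    // ... write buffer[0..bytesUsed) to a stream here ...
}

Console.WriteLine(totalBytes);
```

Each ASCII character encodes to 4 bytes in UTF-32, so the loop produces 4000 bytes in total while never allocating more than the small fixed buffer.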

UTF32Encoding.GetMaxByteCount(int) has no relation to UTF32Encoding.GetChars(Byte[], int, int, Char[], int). If your application needs a similar function to use with UTF32Encoding.GetChars(Byte[], int, int, Char[], int), it should use UTF32Encoding.GetMaxCharCount(int).
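For the decoding direction, the corresponding pattern uses GetMaxCharCount to size a char buffer from a byte count alone. A short sketch (the string literal is arbitrary):

```csharp
using System;
using System.Text;

UTF32Encoding encoding = new UTF32Encoding();
byte[] bytes = encoding.GetBytes("sample");

// Size the char buffer from the byte count alone, then decode into it.
char[] chars = new char[encoding.GetMaxCharCount(bytes.Length)];
int charCount = encoding.GetChars(bytes, 0, bytes.Length, chars, 0);

Console.WriteLine(charCount);
```

As with GetMaxByteCount, the buffer may be larger than strictly needed, but GetChars is guaranteed to fit.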

Note:

GetMaxByteCount(N) is not necessarily the same value as N * GetMaxByteCount(1).
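This inequality can be observed directly. The exact values are implementation details that also depend on the selected fallback, but with the defaults the per-call overhead (such as room for a pending surrogate) is counted once per call rather than once per character, so scaling GetMaxByteCount(1) overestimates:

```csharp
using System;
using System.Text;

UTF32Encoding encoding = new UTF32Encoding();

int maxOne = encoding.GetMaxByteCount(1);
int maxTen = encoding.GetMaxByteCount(10);

// With the default fallback, maxTen is smaller than 10 * maxOne,
// because the worst-case overhead is per call, not per character.
Console.WriteLine($"{maxTen} vs {10 * maxOne}");
```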

Requirements

Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)
Assembly Versions: 2.0.0.0, 4.0.0.0
Since: .NET 2.0