System.Text.Encoding.GetMaxByteCount Method

When overridden in a derived class, calculates the maximum number of bytes produced by encoding the specified number of characters.

Syntax

public abstract int GetMaxByteCount (int charCount)

Parameters

charCount
The number of characters to encode.

Returns

The maximum number of bytes produced by encoding the specified number of characters.

Remarks

The charCount parameter specifies the number of char values that represent the Unicode characters to encode, because the .NET Framework internally uses UTF-16 to represent Unicode text. Most Unicode characters are represented by a single char value, but a Unicode character represented by a surrogate pair requires two char values.
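For example, the following sketch (the sample strings are illustrative) shows that a character outside the Basic Multilingual Plane contributes two char values to charCount:

using System;
using System.Text;

class CharCountExample
{
    static void Main()
    {
        string bmp = "A";             // one Unicode character, one char value
        string astral = "\U0001F600"; // one Unicode character, two char values (a surrogate pair)

        Console.WriteLine(bmp.Length);    // 1
        Console.WriteLine(astral.Length); // 2

        // GetMaxByteCount is driven by the char count, not by the number of Unicode characters.
        Console.WriteLine(Encoding.UTF8.GetMaxByteCount(astral.Length));
    }
}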

To calculate the exact array size required by Encoding.GetBytes(Char[]) to store the resulting bytes, the application should use Encoding.GetByteCount(Char[]). To calculate the maximum array size, it should use Encoding.GetMaxByteCount(int). The Encoding.GetByteCount(Char[]) method generally allows allocation of less memory, while the Encoding.GetMaxByteCount(int) method generally executes faster.
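For example, the following sketch (the input string is arbitrary) contrasts the two sizing strategies:

using System;
using System.Text;

class BufferSizingExample
{
    static void Main()
    {
        Encoding utf8 = Encoding.UTF8;
        char[] chars = "Hello, world".ToCharArray();

        // Exact size: GetByteCount walks the input, so the buffer is no larger than necessary.
        byte[] exact = new byte[utf8.GetByteCount(chars, 0, chars.Length)];
        utf8.GetBytes(chars, 0, chars.Length, exact, 0);

        // Maximum size: GetMaxByteCount is computed from the length alone,
        // so it is fast but may allocate more than is actually written.
        byte[] max = new byte[utf8.GetMaxByteCount(chars.Length)];
        int written = utf8.GetBytes(chars, 0, chars.Length, max, 0);

        Console.WriteLine("exact buffer: {0} bytes", exact.Length);
        Console.WriteLine("maximum buffer: {0} bytes, {1} bytes used", max.Length, written);
    }
}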

Encoding.GetMaxByteCount(int) retrieves a worst-case number, including the worst case for the currently selected System.Text.EncoderFallback. If a fallback with a potentially long replacement string is chosen, Encoding.GetMaxByteCount(int) retrieves large values, particularly in cases where the worst case for the encoding involves switching modes for every character. For example, this can happen for ISO-2022-JP. For more information, see the blog entry at http://go.microsoft.com/fwlink/?LinkId=153702 (http://blogs.msdn.com/shawnste/archive/2005/03/02/383903.aspx).
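For example, the following sketch (the replacement string "[invalid]" is an arbitrary illustration, and the printed values reflect the .NET Framework formula) shows how the encoder fallback inflates the worst case reported by Encoding.GetMaxByteCount(int):

using System;
using System.Text;

class FallbackExample
{
    static void Main()
    {
        // Default ASCII encoding: the replacement fallback is "?", a single character.
        Encoding plain = Encoding.ASCII;

        // Same encoding, but every unencodable character may expand to nine characters.
        Encoding verbose = Encoding.GetEncoding(
            "us-ascii",
            new EncoderReplacementFallback("[invalid]"),
            DecoderFallback.ReplacementFallback);

        Console.WriteLine(plain.GetMaxByteCount(10));   // 11 with the default one-character fallback
        Console.WriteLine(verbose.GetMaxByteCount(10)); // 99 = (10 + 1) * 9 with the longer fallback
    }
}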

In most cases, this method retrieves reasonable values for small strings. For large strings, you might have to choose between using very large buffers and catching errors in the rare case when a more reasonable buffer is too small. You might also want to consider a different approach using System.Text.Encoding.GetByteCount or System.Text.Encoder.Convert.

When using Encoding.GetMaxByteCount(int), your application should allocate the output buffer based on the maximum size of the input buffer. If the output buffer is constrained in size, the application might use the System.Text.Encoder.Convert method instead.
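For example, the following sketch (buffer size, input, and output stream chosen arbitrarily) streams a large input through a small fixed output buffer with Encoder.Convert:

using System;
using System.IO;
using System.Text;

class ConvertExample
{
    static void Main()
    {
        char[] chars = new string('\u00E9', 10000).ToCharArray(); // large input
        byte[] buffer = new byte[256];                            // constrained output buffer

        Encoder encoder = Encoding.UTF8.GetEncoder();
        using (Stream output = Stream.Null)
        {
            int charIndex = 0;
            bool completed = false;
            while (!completed)
            {
                int charsUsed, bytesUsed;
                encoder.Convert(
                    chars, charIndex, chars.Length - charIndex,
                    buffer, 0, buffer.Length,
                    true,            // flush: the remaining input is the last of the data
                    out charsUsed, out bytesUsed, out completed);

                output.Write(buffer, 0, bytesUsed);
                charIndex += charsUsed;
            }
        }
    }
}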

Note that Encoding.GetMaxByteCount(int) considers potential leftover surrogates from a previous encoder operation. Because of the encoder, passing a value of 1 to the method returns 2 for a single-byte encoding, such as ASCII. Your application should use the Encoding.IsSingleByte property if this information is necessary.

Note:

GetMaxByteCount(N) is not necessarily the same value as N * GetMaxByteCount(1).
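For example, the following sketch (the printed values reflect typical .NET Framework behavior for the ASCII encoding) illustrates both the surrogate allowance described above and this note:

using System;
using System.Text;

class MaxByteCountExample
{
    static void Main()
    {
        Encoding ascii = Encoding.ASCII;

        Console.WriteLine(ascii.IsSingleByte);        // True: each encoded character is one byte
        Console.WriteLine(ascii.GetMaxByteCount(1));  // 2: allows for a leftover surrogate
        Console.WriteLine(ascii.GetMaxByteCount(10)); // 11, not 10 * GetMaxByteCount(1) = 20
    }
}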

Requirements

Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)
Assembly Versions: 1.0.5000.0, 2.0.0.0, 4.0.0.0