System.Text.UTF32Encoding.GetByteCount Method

Calculates the number of bytes produced by encoding a set of characters from the specified character array.

Syntax

public override int GetByteCount (char[] chars, int index, int count)

Parameters

chars
The character array containing the set of characters to encode.
index
The index of the first character to encode.
count
The number of characters to encode.

Returns

The number of bytes produced by encoding the specified characters.

Remarks

To calculate the exact array size required by UTF32Encoding.GetBytes(Char[], int, int, byte[], int) to store the resulting bytes, the application uses UTF32Encoding.GetByteCount(Char[], int, int). To calculate the maximum array size, the application should use UTF32Encoding.GetMaxByteCount(int). The UTF32Encoding.GetByteCount(Char[], int, int) method generally allows less memory to be allocated, while the UTF32Encoding.GetMaxByteCount(int) method generally executes faster because it does not inspect the input.
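The trade-off above can be sketched as follows. This is a minimal illustration using top-level statements; for the ASCII characters used here, each character encodes to exactly four bytes in UTF-32.

```csharp
using System;
using System.Text;

UTF32Encoding enc = new UTF32Encoding();
char[] chars = { 'H', 'e', 'l', 'l', 'o' };

// Exact size: requires scanning the input (5 chars -> 20 bytes here).
int exact = enc.GetByteCount(chars, 0, chars.Length);

// Worst-case size: computed from the count alone, so it is faster
// but returns a larger value than the exact count.
int max = enc.GetMaxByteCount(chars.Length);

// Allocate the exact buffer and encode into it.
byte[] bytes = new byte[exact];
int written = enc.GetBytes(chars, 0, chars.Length, bytes, 0);

Console.WriteLine("{0} {1} {2}", exact, max, written);
```

Using GetByteCount here allocates a 20-byte array; GetMaxByteCount would have reserved more space than needed.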

With error detection enabled, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
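The two behaviors can be compared by constructing UTF32Encoding with the throwOnInvalidCharacters parameter set to true and to false. A sketch, using a lone high surrogate as the invalid sequence (with error detection, the exception raised derives from ArgumentException):

```csharp
using System;
using System.Text;

// bigEndian: false, byteOrderMark: true, throwOnInvalidCharacters: varies.
UTF32Encoding strict = new UTF32Encoding(false, true, true);
UTF32Encoding lenient = new UTF32Encoding(false, true, false);

char[] invalid = { 'a', '\uD800', 'b' }; // lone high surrogate is invalid

bool threw = false;
try
{
    strict.GetByteCount(invalid, 0, invalid.Length);
}
catch (ArgumentException)
{
    // Error detection rejects the invalid sequence.
    threw = true;
}

// Without error detection the call succeeds; the invalid code unit is
// handled by the encoder's fallback rather than rejected.
int count = lenient.GetByteCount(invalid, 0, invalid.Length);

Console.WriteLine("{0} {1}", threw, count);
```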

Note:

To ensure that the encoded bytes are decoded properly, the application should prefix the encoded bytes with a preamble, that is, the byte order mark returned by the GetPreamble() method.
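A minimal sketch of prefixing the encoded bytes with the preamble, assuming the encoding was constructed with byteOrderMark set to true so that GetPreamble() returns the 4-byte UTF-32 byte order mark:

```csharp
using System;
using System.Text;

// bigEndian: false, byteOrderMark: true -> GetPreamble() returns
// the little-endian UTF-32 BOM (FF FE 00 00).
UTF32Encoding enc = new UTF32Encoding(false, true);
byte[] preamble = enc.GetPreamble();

char[] chars = "Hi".ToCharArray();
byte[] body = new byte[enc.GetByteCount(chars, 0, chars.Length)];
enc.GetBytes(chars, 0, chars.Length, body, 0);

// Prefix the payload with the preamble so a decoder can detect
// the encoding and its byte order.
byte[] output = new byte[preamble.Length + body.Length];
Buffer.BlockCopy(preamble, 0, output, 0, preamble.Length);
Buffer.BlockCopy(body, 0, output, preamble.Length, body.Length);

Console.WriteLine(output.Length);
```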

Requirements

Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)
Assembly Versions: 2.0.0.0, 4.0.0.0
Since: .NET 2.0