System.Text.UTF32Encoding.GetChars Method

Decodes a sequence of bytes starting at the specified byte pointer into a set of characters that are stored starting at the specified character pointer.

Syntax

[System.CLSCompliant(false)]
public override int GetChars (byte* bytes, int byteCount, char* chars, int charCount)

Parameters

bytes
A pointer to the first byte to decode.
byteCount
The number of bytes to decode.
chars
A pointer to the location at which to start writing the resulting set of characters.
charCount
The maximum number of characters to write.

Returns

The actual number of characters written at the location indicated by chars.
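Because this overload takes raw pointers, it can only be called from an unsafe context. The following is a minimal sketch (compile with /unsafe) showing one way to obtain the pointers by pinning managed arrays with fixed; the string literal and variable names are illustrative, not part of the API.

```csharp
using System;
using System.Text;

class Example
{
    static unsafe void Main()
    {
        UTF32Encoding encoding = new UTF32Encoding();
        byte[] bytes = encoding.GetBytes("Pi \u03C0");
        char[] chars = new char[encoding.GetCharCount(bytes, 0, bytes.Length)];

        // Pin the arrays so the garbage collector cannot move them
        // while the pointer-based overload reads and writes memory.
        fixed (byte* pBytes = bytes)
        fixed (char* pChars = chars)
        {
            int charsWritten = encoding.GetChars(pBytes, bytes.Length, pChars, chars.Length);
            Console.WriteLine(charsWritten);
        }
    }
}
```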

Remarks

To calculate the exact array size that UTF32Encoding.GetChars(Byte[], int, int, Char[], int) requires to store the resulting characters, the application uses UTF32Encoding.GetCharCount(Byte[], int, int). To calculate the maximum array size, the application uses UTF32Encoding.GetMaxCharCount(int). The UTF32Encoding.GetCharCount(Byte[], int, int) method generally allows the application to allocate less memory, while the UTF32Encoding.GetMaxCharCount(int) method generally executes faster.
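The trade-off between the two sizing methods can be sketched as follows; the input string and variable names are illustrative only.

```csharp
using System;
using System.Text;

class SizingExample
{
    static void Main()
    {
        UTF32Encoding encoding = new UTF32Encoding();
        byte[] bytes = encoding.GetBytes("example");

        // Exact count: scans the byte sequence, so it costs time
        // proportional to the input but yields the smallest buffer.
        int exact = encoding.GetCharCount(bytes, 0, bytes.Length);

        // Maximum count: computed from byteCount alone, so it is
        // fast but may overallocate.
        int max = encoding.GetMaxCharCount(bytes.Length);

        char[] chars = new char[exact]; // or new char[max]
        int written = encoding.GetChars(bytes, 0, bytes.Length, chars, 0);
        Console.WriteLine("{0} exact, {1} max, {2} written", exact, max, written);
    }
}
```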

With error detection enabled, an invalid sequence causes this method to throw an ArgumentException. With error detection disabled, invalid sequences are ignored, and no exception is thrown.
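Error detection is controlled by the throwOnInvalidCharacters argument of the UTF32Encoding constructor. A hedged sketch (the byte values below are one example of an invalid UTF-32 sequence, chosen for illustration):

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // throwOnInvalidCharacters: true enables error detection.
        UTF32Encoding strict = new UTF32Encoding(false, true, true);

        // 0xFFFFFFFF (little-endian) is not a valid Unicode code point.
        byte[] invalid = { 0xFF, 0xFF, 0xFF, 0xFF };

        try
        {
            strict.GetChars(invalid, 0, invalid.Length);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine(e.GetType().Name);
        }
    }
}
```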

Data to be converted, such as data read from a stream, might be available only in sequential blocks. In this case, or if the amount of data is so large that it needs to be divided into smaller blocks, the application uses the System.Text.Decoder or System.Text.Encoder object provided by the UTF32Encoding.GetDecoder method or the UTF32Encoding.GetEncoder method, respectively.
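Unlike GetChars, a Decoder retains state between calls, so a UTF-32 code unit split across two blocks is decoded correctly. A minimal sketch of this, with an arbitrary split point chosen to fall mid-unit:

```csharp
using System;
using System.Text;

class DecoderExample
{
    static void Main()
    {
        UTF32Encoding encoding = new UTF32Encoding();
        Decoder decoder = encoding.GetDecoder();

        byte[] data = encoding.GetBytes("AB"); // 8 bytes: two 4-byte UTF-32 units
        char[] chars = new char[2];

        // Feed the bytes in two blocks; the first block ends in the
        // middle of a unit, which the decoder buffers internally.
        int written = decoder.GetChars(data, 0, 5, chars, 0);
        written += decoder.GetChars(data, 5, 3, chars, written);

        Console.WriteLine(new string(chars, 0, written));
    }
}
```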

Requirements

Namespace: System.Text
Assembly: mscorlib (in mscorlib.dll)
Assembly Versions: 2.0.0.0, 4.0.0.0
Since: .NET 2.0