Encodes a set of characters starting at the specified character pointer into a sequence of bytes that are stored starting at the specified byte pointer.
The actual number of bytes written at the location indicated by bytes.
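Below is a minimal sketch of calling this pointer-based overload from C#. The class and variable names, and the sample characters, are illustrative only; the output buffer is sized with GetByteCount so it is exactly large enough for this input, and the code must be compiled with unsafe blocks enabled.

```csharp
using System;
using System.Text;

class PointerExample
{
    static unsafe void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();
        char[] chars = { 'z', 'a', '\u0306', '\u01FD', '\u03B2' };

        // Size the output buffer exactly for this particular input.
        byte[] bytes = new byte[utf8.GetByteCount(chars, 0, chars.Length)];

        // Pin both arrays so raw pointers can be passed to the encoder.
        fixed (char* pChars = chars)
        fixed (byte* pBytes = bytes)
        {
            int bytesWritten = utf8.GetBytes(pChars, chars.Length, pBytes, bytes.Length);
            Console.WriteLine($"{bytesWritten} bytes written.");
        }
    }
}
```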
To calculate the exact array size required by UTF8Encoding.GetBytes(string, int, int, Byte[], int) to store the resulting bytes, the application should use UTF8Encoding.GetByteCount(Char[], int, int). To calculate the maximum array size, it should use UTF8Encoding.GetMaxByteCount(int). The UTF8Encoding.GetByteCount(Char[], int, int) method generally allows allocation of less memory, while the UTF8Encoding.GetMaxByteCount(int) method generally executes faster.
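The following sketch contrasts the two sizing strategies; the sample string is arbitrary. GetByteCount walks the actual data and returns a tight size, while GetMaxByteCount is pure arithmetic on the character count and returns a worst-case size.

```csharp
using System;
using System.Text;

class SizingExample
{
    static void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();
        string text = "za\u0306\u01FD\u03B2";

        // Exact size for this specific input (slower: inspects every character).
        int exact = utf8.GetByteCount(text);

        // Worst-case size for any input of this length (faster: simple arithmetic).
        int worstCase = utf8.GetMaxByteCount(text.Length);

        Console.WriteLine($"Exact: {exact} bytes, worst case: {worstCase} bytes");
    }
}
```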
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
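A sketch of this behavior, assuming error detection is switched on through the UTF8Encoding(Boolean, Boolean) constructor's throwOnInvalidBytes argument; the unpaired surrogate used as invalid input is illustrative.

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // '\uD800' is an unpaired high surrogate: an invalid UTF-16 sequence.
        char[] invalid = { 'a', '\uD800', 'b' };

        var strict = new UTF8Encoding(encoderShouldEmitUTF8Identifier: false,
                                      throwOnInvalidBytes: true);
        try
        {
            strict.GetBytes(invalid);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine($"Strict encoding threw: {e.GetType().Name}");
        }

        // Without error detection, no exception is thrown for the same input.
        var lenient = new UTF8Encoding(false, false);
        byte[] bytes = lenient.GetBytes(invalid);
        Console.WriteLine($"Lenient encoding produced {bytes.Length} bytes.");
    }
}
```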
Data to be converted, such as data read from a stream, might be available only in sequential blocks. In this case, or if the amount of data is so large that it needs to be divided into smaller blocks, the application should use the System.Text.Decoder or the System.Text.Encoder object provided by the UTF8Encoding.GetDecoder method or the UTF8Encoding.GetEncoder method, respectively.
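A sketch of block-wise encoding with the Encoder returned by GetEncoder; the sample blocks and buffer size are assumptions. Unlike a one-shot GetBytes call, the Encoder keeps state between calls, so data arriving in arbitrary chunks encodes correctly.

```csharp
using System;
using System.Text;

class BlockEncodingExample
{
    static void Main()
    {
        Encoder encoder = new UTF8Encoding().GetEncoder();

        // Simulated sequential blocks, e.g. as read from a stream.
        char[][] blocks = { "Hello, ".ToCharArray(), "\u4E16\u754C!".ToCharArray() };
        byte[] buffer = new byte[32];

        for (int i = 0; i < blocks.Length; i++)
        {
            // Flush only on the last block so pending state is emitted.
            bool isLastBlock = i == blocks.Length - 1;
            int written = encoder.GetBytes(blocks[i], 0, blocks[i].Length,
                                           buffer, 0, flush: isLastBlock);
            Console.WriteLine($"Block {i}: {written} bytes");
        }
    }
}
```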
To ensure that the encoded bytes are decoded properly, the application should prefix them with a preamble.
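A sketch of prepending the preamble returned by GetPreamble (the UTF-8 byte order mark, EF BB BF) to the encoded output; the payload string and buffer handling are illustrative. The preamble is only returned when the constructor requests the UTF-8 identifier.

```csharp
using System;
using System.Text;

class PreambleExample
{
    static void Main()
    {
        var utf8 = new UTF8Encoding(encoderShouldEmitUTF8Identifier: true);

        byte[] preamble = utf8.GetPreamble();   // EF BB BF
        byte[] payload = utf8.GetBytes("text");

        // Write the preamble first, then the encoded bytes.
        byte[] output = new byte[preamble.Length + payload.Length];
        Buffer.BlockCopy(preamble, 0, output, 0, preamble.Length);
        Buffer.BlockCopy(payload, 0, output, preamble.Length, payload.Length);

        Console.WriteLine(BitConverter.ToString(output));
    }
}
```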