Calculates the number of bytes produced by encoding a set of characters from the specified character array.
The number of bytes produced by encoding the specified characters.
Exceptions:

ArgumentNullException
chars is null.

ArgumentOutOfRangeException
index or count is less than zero.
-or-
index and count do not specify a valid range in chars (that is, (index + count) > chars.Length).
-or-
The return value is greater than int.MaxValue.

ArgumentException
Error detection is enabled for the current instance and chars contains an invalid surrogate sequence.
To calculate the exact array size required by UTF8Encoding.GetBytes(Char[], int, int, Byte[], int) to store the resulting bytes, the application should call UTF8Encoding.GetByteCount(Char[], int, int). To calculate the maximum possible array size, it should call UTF8Encoding.GetMaxByteCount(int). GetByteCount generally allows allocation of less memory, because it inspects the actual characters, while GetMaxByteCount generally executes faster, because it computes a worst-case bound from the character count alone.
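The trade-off above can be sketched as follows. This example assumes a short mixed ASCII and non-ASCII input; the exact value returned by GetMaxByteCount depends on the runtime's worst-case formula, so only the exact count is asserted in the comments.

```csharp
using System;
using System.Text;

class ByteCountExample
{
    static void Main()
    {
        UTF8Encoding utf8 = new UTF8Encoding();

        // "Pi Π": three 1-byte characters plus U+03A0, which encodes as 2 bytes in UTF-8.
        char[] chars = { 'P', 'i', ' ', '\u03A0' };

        // Exact size: scans the characters, returns the minimal buffer length (5 here).
        int exact = utf8.GetByteCount(chars, 0, chars.Length);

        // Upper bound: computed from the count alone, may over-allocate but is cheaper.
        int max = utf8.GetMaxByteCount(chars.Length);

        byte[] bytes = new byte[exact];
        int written = utf8.GetBytes(chars, 0, chars.Length, bytes, 0);

        Console.WriteLine($"exact = {exact}, max = {max}, written = {written}");
        // exact = 5; written = 5; max is always >= exact.
    }
}
```

Allocating with GetByteCount is preferable for long-lived buffers; GetMaxByteCount suits hot paths where avoiding a scan of the input matters more than buffer size.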
With error detection enabled, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
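The difference can be demonstrated with the UTF8Encoding constructor overload that takes a throwOnInvalidBytes flag; a high surrogate with no matching low surrogate serves as the invalid sequence. A minimal sketch:

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // Error detection off (default) vs. on (throwOnInvalidBytes: true).
        UTF8Encoding lenient = new UTF8Encoding();
        UTF8Encoding strict = new UTF8Encoding(
            encoderShouldEmitUTF8Identifier: false,
            throwOnInvalidBytes: true);

        // An unpaired high surrogate is an invalid UTF-16 sequence.
        char[] invalid = { 'a', '\uD800', 'b' };

        // Lenient instance: no exception is thrown for the invalid sequence.
        Console.WriteLine(lenient.GetByteCount(invalid, 0, invalid.Length));

        try
        {
            // Strict instance: the invalid surrogate triggers an exception.
            strict.GetByteCount(invalid, 0, invalid.Length);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine($"Caught: {e.GetType().Name}");
        }
    }
}
```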
To ensure that the encoded bytes are decoded properly, the application should prefix the encoded bytes with a preamble (see UTF8Encoding.GetPreamble).