Calculates the number of bytes produced by encoding the characters in the specified string.
- chars
- The string containing the set of characters to encode.
The number of bytes produced by encoding the specified characters.
- ArgumentNullException: chars is null.
- ArgumentException: Error detection is enabled for the current instance, and chars contains an invalid surrogate sequence.
- ArgumentOutOfRangeException: The return value is greater than int.MaxValue.
To calculate the exact array size that UTF8Encoding.GetBytes(string, int, int, Byte[], int) requires to store the resulting bytes, the application uses UTF8Encoding.GetByteCount(Char[], int, int). To calculate the maximum possible array size, the application should use UTF8Encoding.GetMaxByteCount(int). The UTF8Encoding.GetByteCount(Char[], int, int) method generally allows allocation of less memory, while the UTF8Encoding.GetMaxByteCount(int) method generally executes faster.
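The trade-off above can be sketched as follows. This is an illustrative example (the sample string and variable names are not from the original documentation): GetByteCount returns the exact size needed for one particular string, while GetMaxByteCount returns a worst-case size for any string of the same length.

```csharp
using System;
using System.Text;

class ByteCountExample
{
    static void Main()
    {
        // Sample string: three one-byte characters plus U+03C0 (pi),
        // which UTF-8 encodes in two bytes.
        string chars = "Pi \u03C0";
        UTF8Encoding utf8 = new UTF8Encoding();

        // Exact number of bytes required for this specific string.
        int exact = utf8.GetByteCount(chars);

        // Worst-case number of bytes for any string of this length;
        // larger, but computed without scanning the characters.
        int max = utf8.GetMaxByteCount(chars.Length);

        // Allocate exactly what is needed and encode into it.
        byte[] bytes = new byte[exact];
        utf8.GetBytes(chars, 0, chars.Length, bytes, 0);

        Console.WriteLine($"Exact: {exact}, Max: {max}");
    }
}
```

For short-lived buffers, the faster GetMaxByteCount estimate is often acceptable; for long-lived allocations, the exact count avoids wasted memory.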
With error detection enabled, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
To ensure that the encoded bytes are decoded properly, the application should prefix encoded bytes with a preamble.
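A minimal sketch of writing the preamble ahead of the encoded bytes follows. The stream and sample string are illustrative, not part of the original documentation; the preamble itself comes from UTF8Encoding.GetPreamble, which returns the UTF-8 byte order mark (EF BB BF) when the encoding was constructed to emit one.

```csharp
using System;
using System.IO;
using System.Text;

class PreambleExample
{
    static void Main()
    {
        // Pass true so that GetPreamble returns the UTF-8 byte order mark.
        UTF8Encoding utf8 = new UTF8Encoding(encoderShouldEmitUTF8Identifier: true);

        byte[] preamble = utf8.GetPreamble();        // EF BB BF
        byte[] payload = utf8.GetBytes("sample text");

        using (var stream = new MemoryStream())
        {
            // Write the preamble first, then the encoded bytes,
            // so a decoder can identify the stream as UTF-8.
            stream.Write(preamble, 0, preamble.Length);
            stream.Write(payload, 0, payload.Length);
        }
    }
}
```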