Calculates the number of bytes produced by encoding a set of characters from the specified character array.
The number of bytes produced by encoding the specified characters.
Type: ArgumentNullException
Reason: chars is null.

Type: ArgumentOutOfRangeException
Reason: index is less than zero.
-or-
count is less than zero.
-or-
index and count do not specify a valid range in chars (that is, (index + count) > chars.Length).
To calculate the exact array size required by UnicodeEncoding.GetBytes(string, int, int, byte[], int) to store the resulting bytes, the application uses UnicodeEncoding.GetByteCount(char[], int, int). To calculate the maximum array size, the application uses UnicodeEncoding.GetMaxByteCount(int). The UnicodeEncoding.GetByteCount(char[], int, int) method generally allows the application to allocate less memory, while the UnicodeEncoding.GetMaxByteCount(int) method generally executes faster.
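The following sketch contrasts the two sizing strategies before a call to GetBytes; the sample character data and encoding settings are illustrative only.

```csharp
using System;
using System.Text;

class GetByteCountExample
{
    static void Main()
    {
        char[] chars = { 'U', 'n', 'i', 'c', 'o', 'd', 'e', ' ', '\u00e9' };
        UnicodeEncoding unicode = new UnicodeEncoding();

        // Exact number of bytes needed to encode the specified range of characters.
        int exactSize = unicode.GetByteCount(chars, 0, chars.Length);

        // Worst-case size for the same number of characters; larger, but cheaper to compute.
        int maxSize = unicode.GetMaxByteCount(chars.Length);

        // Allocate the exact-size buffer and encode into it.
        byte[] bytes = new byte[exactSize];
        int bytesWritten = unicode.GetBytes(chars, 0, chars.Length, bytes, 0);

        Console.WriteLine($"Exact: {exactSize} bytes, maximum: {maxSize} bytes, written: {bytesWritten} bytes");
    }
}
```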
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
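A minimal sketch of the difference follows; it assumes error detection is enabled through the UnicodeEncoding(bool, bool, bool) constructor and uses an unpaired surrogate as the invalid sequence.

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // Little-endian, no byte order mark; the third argument enables error detection.
        UnicodeEncoding throwing = new UnicodeEncoding(false, false, true);
        UnicodeEncoding lenient = new UnicodeEncoding(false, false, false);

        // An unpaired high surrogate is an invalid sequence.
        char[] invalid = { 'a', '\ud800', 'b' };

        try
        {
            throwing.GetByteCount(invalid, 0, invalid.Length);
        }
        catch (ArgumentException e)
        {
            Console.WriteLine($"With error detection: {e.GetType().Name}");
        }

        // Without error detection, no exception is thrown.
        Console.WriteLine($"Without error detection: {lenient.GetByteCount(invalid, 0, invalid.Length)} bytes");
    }
}
```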
To ensure that the encoded bytes are decoded properly, the application should prefix encoded bytes with a preamble.
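One way to do this, sketched below, is to copy the bytes returned by GetPreamble() ahead of the encoded output; the encoding settings shown are illustrative.

```csharp
using System;
using System.Text;

class PreambleExample
{
    static void Main()
    {
        // Little-endian encoding that supplies a byte order mark as its preamble.
        UnicodeEncoding unicode = new UnicodeEncoding(false, true);
        char[] chars = "Unicode".ToCharArray();

        byte[] preamble = unicode.GetPreamble();
        byte[] bytes = new byte[preamble.Length + unicode.GetByteCount(chars, 0, chars.Length)];

        // Prefix the encoded bytes with the preamble so a decoder can detect the byte order.
        Buffer.BlockCopy(preamble, 0, bytes, 0, preamble.Length);
        unicode.GetBytes(chars, 0, chars.Length, bytes, preamble.Length);

        Console.WriteLine(BitConverter.ToString(bytes));
    }
}
```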