Calculates the number of bytes produced by encoding the characters in the specified string.
- s
- The string containing the set of characters to encode.
The number of bytes produced by encoding the specified characters.
To calculate the exact array size required by UTF32Encoding.GetBytes(String, Int32, Int32, Byte[], Int32) to store the resulting bytes, the application uses UTF32Encoding.GetByteCount(String). To calculate the maximum array size, the application should use UTF32Encoding.GetMaxByteCount(Int32). The UTF32Encoding.GetByteCount(String) method generally allows allocation of less memory, while the UTF32Encoding.GetMaxByteCount(Int32) method generally executes faster.
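The following sketch contrasts the two approaches; the input string is chosen for illustration only:

```csharp
using System;
using System.Text;

class GetByteCountExample
{
    static void Main()
    {
        // Example input; it ends with a valid surrogate pair.
        string s = "za\u0306\u01FD\u03B2\uD8FF\uDCFF";
        UTF32Encoding utf32 = new UTF32Encoding();

        // Exact size: requires scanning the string, but the buffer
        // is no larger than necessary.
        int exact = utf32.GetByteCount(s);
        byte[] buffer = new byte[exact];
        utf32.GetBytes(s, 0, s.Length, buffer, 0);

        // Maximum size: computed from the length alone, so it is faster,
        // but the buffer may be larger than necessary.
        int max = utf32.GetMaxByteCount(s.Length);

        Console.WriteLine($"Exact byte count: {exact}, maximum byte count: {max}");
    }
}
```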
With error detection, an invalid sequence causes this method to throw an ArgumentException. Without error detection, invalid sequences are ignored, and no exception is thrown.
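For example, error detection is enabled through the throwOnInvalidCharacters parameter of the UTF32Encoding constructor. A minimal sketch, assuming an unpaired surrogate as the invalid sequence:

```csharp
using System;
using System.Text;

class ErrorDetectionExample
{
    static void Main()
    {
        // An unpaired high surrogate is an invalid UTF-16 sequence.
        string invalid = "a\uD800b";

        // throwOnInvalidCharacters: true enables error detection.
        UTF32Encoding strict = new UTF32Encoding(false, true, true);
        try
        {
            strict.GetByteCount(invalid);
        }
        catch (ArgumentException e)
        {
            // At run time this is an EncoderFallbackException,
            // which derives from ArgumentException.
            Console.WriteLine($"Caught: {e.GetType().Name}");
        }

        // Without error detection, no exception is thrown.
        UTF32Encoding lenient = new UTF32Encoding(false, true, false);
        Console.WriteLine($"Lenient count: {lenient.GetByteCount(invalid)}");
    }
}
```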
To ensure that the encoded bytes are decoded properly, the application should prefix the encoded bytes with a preamble.
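One way to do this (a sketch, assuming the byte order mark returned by UTF32Encoding.GetPreamble is the desired preamble) is to write the preamble before the encoded bytes:

```csharp
using System;
using System.IO;
using System.Text;

class PreambleExample
{
    static void Main()
    {
        // byteOrderMark: true makes GetPreamble return the UTF-32 BOM
        // (FF FE 00 00 for little-endian).
        UTF32Encoding utf32 = new UTF32Encoding(false, true);

        byte[] preamble = utf32.GetPreamble();
        byte[] payload = utf32.GetBytes("sample text");

        // Prefix the encoded bytes with the preamble before writing them out.
        using (var stream = new MemoryStream())
        {
            stream.Write(preamble, 0, preamble.Length);
            stream.Write(payload, 0, payload.Length);
        }
    }
}
```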