Underscores can appear only between digits. They cannot be used at the beginning or end of a literal, adjacent to the decimal point, adjacent to a prefix such as 0x for hexadecimal format and 0b for binary format, or adjacent to a suffix such as L for a long literal and F for a float literal. The following examples show invalid uses of underscores in numeric literals:
int y1 = _1969;         // An error. Underscore at the beginning
int y2 = 1969_;         // An error. Underscore at the end
int y3 = 0x_7B1;        // An error. Underscore after the prefix 0x
int y4 = 0_x7B1;        // An error. Underscore inside the prefix 0x
long z1 = 1969_L;       // An error. Underscore before the suffix L
double d1 = 1969_.0919; // An error. Underscore before the decimal point
double d2 = 1969._0919; // An error. Underscore after the decimal point
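For contrast, the following sketch shows some valid placements, which all follow the rule that underscores appear only between digits (the variable names are illustrative):
int y = 1_969;          // OK. Underscore between decimal digits
int h = 0x7_B1;         // OK. Underscore between hexadecimal digits
long z = 1_969L;        // OK. The suffix L is attached directly to the digits
double d = 1_969.09_19; // OK. Underscores between digits on both sides of the decimal point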
Tip
You can write the int literal 1969 in octal format as 03661. The zero at the beginning of an int literal in octal format is considered a digit, not a prefix, so you are allowed to use an underscore after that first zero. For example, you can write 03661 as 0_3661.
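A quick check of the octal forms (an illustrative fragment, assumed to run inside a main method):
int o1 = 03661;  // 1969 written in octal format
int o2 = 0_3661; // OK. Underscore after the leading zero, which is a digit
System.out.println(o1 == 1969 && o1 == o2); // prints true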
Java Compiler and Unicode Escape Sequence
Recall that any Unicode character in a Java program can be expressed in the form of a Unicode escape sequence. For example, the character 'A' can be replaced by '\u0041'. The Java compiler first converts every occurrence of a Unicode escape sequence to a Unicode character. A Unicode escape sequence starts with \u followed by four hexadecimal digits. '\\u0041' is not a Unicode escape sequence. To make uxxxx a valid part of a Unicode escape sequence, the u must be preceded by an odd number of backslashes, because two contiguous backslashes (\\) represent one backslash character. Therefore, "\\u0041" represents a 6-character string composed of '\', 'u', '0', '0', '4', and '1'. However, "\\\u0041" represents a two-character string ("\A").
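You can verify the difference with an illustrative fragment (assumed to run inside a main method):
String s1 = "\\u0041";  // The backslash is escaped: six characters \, u, 0, 0, 4, 1
String s2 = "\\\u0041"; // An escaped backslash followed by the escape \u0041 ('A')
System.out.println(s1.length()); // prints 6
System.out.println(s2.length()); // prints 2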
Sometimes, inappropriate use of a Unicode escape sequence in Java source code may result in a compile-time error. Consider the following declaration of a char variable:
char c = '\u000A';
The programmer intends to initialize the variable c with a linefeed character, whose Unicode escape sequence is \u000A. When this code is compiled, the Java compiler converts \u000A into an actual linefeed character, and the declaration is split into two lines as follows:
char c = '
'; // After the actual linefeed is inserted
Since a character literal cannot span two lines, the above piece of code generates a compile-time error. The correct way to initialize the variable c is to use the character escape sequence \n, as shown:
char c = '\n'; // Correct
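Note that Unicode escapes for characters other than line terminators remain perfectly usable in literals; for example:
char a = '\u0041'; // OK. The compiler converts this to char a = 'A';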
In character and String literals, the linefeed and carriage return should always be written as \n and \r, respectively, not as \u000A and \u000D. Even a comment may generate a compiler error if you do not use the linefeed and carriage return characters correctly. Suppose you comment out the above incorrect declaration of the char variable, as shown:
// char c = '\u000A';
The error does not go away, because the compiler converts \u000A into an actual linefeed before it recognizes comments. The comment ends at the linefeed, leaving the stray '; on a line of its own, which is a compile-time error.
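After the compiler's translation, the commented line is effectively split like this:
// char c = '
'; // After the actual linefeed is inserted; the stray '; is now outside the comment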