}
protected final void pad(StringBuffer to, int howMany) {
    for (int i = 0; i < howMany; i++)
        to.append(' ');
}

/** Convenience Routine */
String format(String s) {
    return format(s, new StringBuffer(), null).toString();
}

/** ParseObject is required, but not useful here. */
public Object parseObject(String source, ParsePosition pos) {
    return source;
}
}
See Also
The alignment of numeric columns is considered in Chapter 5.
Converting Between Unicode Characters and Strings
Problem
You want to convert between Unicode characters and Strings.
Solution
Unicode is an international standard that aims to represent all known characters used by
people in their various languages. Though the original ASCII character set is a subset,
Unicode is huge. At the time Java was created, Unicode was a 16-bit character set, so it
seemed natural to make Java char values 16 bits wide, and for years a char could hold any
Unicode character. Over time, however, Unicode has grown: it now includes over a million
"code points," or characters, more than the 65,536 that can be represented in 16 bits.[14]
Not all possible 16-bit values were defined as characters in UCS-2, the 16-bit version of
Unicode originally used in Java. A few were reserved as "escape characters," allowing
multicharacter-length mappings to less common characters. For-
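To make the char/code-point distinction concrete, here is a minimal sketch using only the standard java.lang APIs (the class name UnicodeDemo and the chosen example character are illustrative, not from the original recipe). A character in the Basic Multilingual Plane fits in one char, while a supplementary character such as U+1D11E (MUSICAL SYMBOL G CLEF) occupies two chars in a String:

```java
// Sketch: converting between chars, Strings, and Unicode code points.
public class UnicodeDemo {
    public static void main(String[] args) {
        // A char and a one-character String convert trivially.
        char c = 'A';
        String s = String.valueOf(c);   // one-char String
        char back = s.charAt(0);        // and back to a char

        // A supplementary character (beyond the 16-bit range) needs
        // two chars -- a surrogate pair -- inside a Java String.
        int clef = 0x1D11E;             // MUSICAL SYMBOL G CLEF
        String t = new String(Character.toChars(clef));
        System.out.println(t.length());                      // number of char values: 2
        System.out.println(t.codePointCount(0, t.length())); // number of code points: 1
        System.out.println(Integer.toHexString(t.codePointAt(0))); // 1d11e
    }
}
```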