Fig. 6.1 Intel 1103, the first commercial DRAM chip, with a capacity of 1024 bits
Nowadays, the role of memory devices in the semiconductor industry is even
clearer. Applications such as computer graphics, digital signal processing, and rapid
retrieval of huge volumes of data demand an exponentially increasing amount of
memory. A constantly growing percentage of Integrated Circuit (IC) area is thus
dedicated to implementing memory structures. According to the International Technology
Roadmap for Semiconductors (ITRS) (ITRS 2007), a leading authority in the
field of semiconductors, memories occupied 20% of the area of an IC in 1999 and 52%
in 2002, and are forecast to occupy up to 90% of the area by the year 2011.
Given this considerable usage of memories in ICs, any improvement in the design
and fabrication process of these devices has a considerable impact on overall
IC characteristics. Reducing the energy consumption, increasing the reliability and,
above all, reducing the cost of memories directly benefit the systems into which they
are integrated.
This continuous quest for improvement has historically pushed memory
technology to its limits, making these devices extremely sensitive to physical defects
and environmental influences that may severely compromise their correct behavior.
Efficient and detailed testing of memory components is therefore mandatory. Today, a large
portion of the price of a memory derives from the high cost of memory testing,
which has to satisfy very stringent quality constraints, ranging from 50 failing parts
per million (ppm) for computer systems to fewer than 10 ppm for mission-critical
applications (such as those in the automotive industry).
Because physical examination of memory designs is too complex, working with models
capable of precisely representing memory behaviors, architectures, and fault
mechanisms, while keeping the overall complexity of the testing problem under control,
is mandatory to guarantee high-quality memory products and to reduce the test cost.
This is even more important as we fully enter the very deep sub-micron (VDSM) era.
This chapter provides an overview of the models and notations currently used in
memory testing practice, and concludes by highlighting challenging and still open
problems.