
SuperResolution Image Processing Lab.

Introduction to Section One - Technical Overview

2003.09.06 11:21

srip Views: 7892

Section One

Technical Overview

Perception of light, color, and shape is essential for people to understand the world around them and to communicate with others. Just as the eyes are the organs of such visual perception, imaging sensors give computers and electronic machines an ability comparable to that of human eyes. In recent years, computer vision, image analysis, and image processing have become widely used tools in various fields such as astronomical imaging, medical imaging, remote sensing, and a variety of consumer electronic applications.

There are two types of imaging sensors that convert light energy into an electrical signal: the vacuum tube camera and the semiconductor imaging sensor. The vacuum tube camera is a photoconductive device which employs a photosensitive layer consisting of several million mosaic cells insulated from one another on a transparent metal film. Each cell acts as a small capacitor whose charge is a function of the incident light. The semiconductor imaging sensor is generally called a solid state imaging sensor. A typical solid state imaging sensor consists of many discrete photosensing elements, some form of charge transport mechanism, and an output circuit. The photosensitive sites convert the incoming photons into electrical charges. These charges are integrated and transferred through the transport mechanism to the output circuit, where they are converted into a measurable voltage. Of the two sensor types, this section focuses on the solid state imaging sensor and describes the historical progress of its technical development.
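To make the photon-to-voltage signal chain described above more concrete, the following is a minimal sketch in Python of how a solid state imaging sensor could be modeled in software. The quantum efficiency, full-well capacity, and conversion gain values are illustrative assumptions, not parameters taken from the text.

import numpy as np

# Minimal sketch of the solid state imaging sensor signal chain:
# photons -> photogenerated charge -> charge transfer -> output voltage.
# All numerical parameters below are assumed for illustration only.

rng = np.random.default_rng(0)

QUANTUM_EFFICIENCY = 0.5      # fraction of incident photons converted to electrons (assumed)
FULL_WELL_CAPACITY = 20000    # maximum electrons a photosite can hold (assumed)
CONVERSION_GAIN_UV = 5.0      # microvolts of output per electron (assumed)

def sense(photon_flux, exposure_s):
    """Integrate the photon flux (photons/s per photosite) over the exposure
    time and return the electrons collected at each photosite."""
    expected_photons = photon_flux * exposure_s
    photons = rng.poisson(expected_photons)              # photon shot noise
    electrons = rng.binomial(photons, QUANTUM_EFFICIENCY)
    return np.minimum(electrons, FULL_WELL_CAPACITY)     # well saturation

def read_out(electrons):
    """Transfer the charge packets to the output circuit and convert them
    into a measurable voltage (in microvolts)."""
    return electrons * CONVERSION_GAIN_UV

# Example: a 4x4 array of photosites under a simple illumination gradient.
flux = np.linspace(1e4, 1e5, 16).reshape(4, 4)
charge = sense(flux, exposure_s=0.01)
voltage = read_out(charge)
print(voltage)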

Yawcheng Lo reviewed the recent advancements and industry trends of both CCD and CMOS imaging sensors in his paper (#2). He discussed sensor performance, device scaling, and fabrication processes, and also described some applications and future prospects of solid state imaging sensors.

The history of the solid state imaging sensor began with the invention of the Photo Scanner, an X-Y addressed silicon junction photosensing device, by S. R. Morrison at Honeywell Co. in 1963. In 1967, P. K. Weimer first suggested the MOS-switched type of solid state imaging sensor. The CCD was invented as a new signal transport device by Bill Boyle and George E. Smith at Bell Labs in 1969 (#1). In the same year, Sangster and Teer suggested the Bucket-Brigade Device (BBD) for delay, time-axis conversion, and scanning (#3). In fact, the BBD and the CCD are based on the same physical principle. Although the CCD was originally conceived as a signal transport device, the interline transfer type CCD imaging sensor was introduced in 1973. Afterwards, much research on imaging sensors was conducted. Carnes et al. analyzed free charge transfer in CCDs (#4), and Burke and Michon examined the performance of the charge-injection device (CID) imaging sensor (#6). In contrast to CCDs, CID imaging sensors use an X-Y addressed array of charge storage capacitors, as CMOS imaging sensors do. Gruw