ABSTRACT

Many versatile point-and-shoot digital cameras, offering attractive functions with satisfactory system performance, have been introduced in the consumer electronics market [1], [2], [3], [4]. However, camera system designers still face the lack of hardware architecture standards and a sound software design methodology. Based on the product definition, camera designers must carefully select several key hardware components, such as the camera signal processor (CSP) [5], [6], [7], [8], [9], [10], [11], the lens module [12], [13], [14], [15], [16], the charge-coupled device (CCD) [17], [18], [19] or complementary metal-oxide-semiconductor (CMOS) image sensor [20], [21], [22], the analog front-end (AFE) chip [23], [24], and the liquid crystal display (LCD). Among these components, the CSP plays the most important role in the entire system. A typical CSP consists of an embedded microprocessor (EMP), hardware engines, peripherals, and other programmable computing units such as digital signal processors (DSPs) for real-time image/video processing. Scheduling and allocating these heterogeneous computational resources is a complex issue in embedded software design.
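
To make the allocation problem concrete, the following is a minimal sketch of dispatching image-pipeline tasks onto heterogeneous CSP units (EMP, hardware engine, DSP) with a greedy earliest-finish-time policy. All names, cycle costs, and the policy itself are illustrative assumptions for exposition, not part of any actual CSP software kit or of the method described in this work.

```c
/* Illustrative sketch: greedily assign each pipeline task to the unit that
 * would finish it earliest. Task names and cycle costs are hypothetical. */
#include <stdio.h>

enum unit { UNIT_EMP, UNIT_HW_ENGINE, UNIT_DSP, UNIT_COUNT };

struct task {
    const char *name;
    /* Estimated cycles on each unit; 0 means the unit cannot run the task. */
    unsigned cost[UNIT_COUNT];
};

int main(void) {
    struct task pipeline[] = {
        { "demosaic",    {  900, 100, 300 } },
        { "white_bal",   {  400,  80, 150 } },
        { "jpeg_encode", { 1200,   0, 350 } },  /* no hardware-engine support */
    };
    unsigned busy_until[UNIT_COUNT] = { 0 };
    const char *unit_name[UNIT_COUNT] = { "EMP", "HW engine", "DSP" };

    for (size_t i = 0; i < sizeof pipeline / sizeof pipeline[0]; i++) {
        int best = -1;
        unsigned best_finish = 0;
        for (int u = 0; u < UNIT_COUNT; u++) {
            if (pipeline[i].cost[u] == 0)
                continue;  /* task not supported on this unit */
            unsigned finish = busy_until[u] + pipeline[i].cost[u];
            if (best < 0 || finish < best_finish) {
                best = u;
                best_finish = finish;
            }
        }
        busy_until[best] = best_finish;  /* reserve the chosen unit */
        printf("%-11s -> %-9s (finishes at cycle %u)\n",
               pipeline[i].name, unit_name[best], best_finish);
    }
    return 0;
}
```

Even this toy policy ignores data dependencies, bus contention, and power constraints, which is precisely why resource scheduling on a real CSP is far more involved.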