
CN100515041C - Method for automatically controlling exposure and device for automatically compensating exposure - Google Patents

  • Wed Jul 15 2009

Info

Publication number
CN100515041C
CN100515041C CNB2005101343391A CN200510134339A
Authority
CN
China
Prior art keywords
image
state zone
value
exposure
those
Prior art date
2005-12-14
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CNB2005101343391A
Other languages
Chinese (zh)
Other versions
CN1984260A (en)
Inventor
吴政育
林哲弘
陈鸿仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ICatch Technology Inc
Original Assignee
Sunplus Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2005-12-14
Filing date
2005-12-14
Publication date
2009-07-15
2005-12-14 Application filed by Sunplus Technology Co Ltd
2005-12-14 Priority to CNB2005101343391A
2007-06-20 Publication of CN1984260A
2009-07-15 Application granted
2009-07-15 Publication of CN100515041C
Status: Active
2025-12-14 Anticipated expiration



Landscapes

  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)

Abstract

The method comprises: establishing a coordinate domain corresponding to multiple sub-ranges of exposure compensation, and setting threshold values for the bright area and the dark area; calculating the ratio of the number of pixels whose brightness exceeds the bright-area threshold to the total number of pixels, to obtain the bright-area ratio; calculating the ratio of the number of pixels whose brightness is below the dark-area threshold to the total number of pixels, to obtain the dark-area ratio; determining the location of the image in the coordinate domain according to the bright-area ratio and the dark-area ratio; and compensating the exposure of the image according to the sub-range of the coordinate domain in which the image is located.

Description

Automatic exposure control method and automatic exposure compensation device

Technical field

The invention relates to an exposure control method and an exposure compensation device, and in particular to an automatic exposure control method and an automatic exposure compensation device for determining the exposure bias value of an image.

Background technology

A color image processing device consists mainly of an image acquisition unit, an image restoration and processing unit, an image display unit, and an image compression unit. The image acquisition unit is built around an image sensor, a two-dimensional array of photo-diodes; the photosensitive assembly converts the sensed light intensity into electrical signals for subsequent processing by the image restoration and processing unit. Because the direction and intensity of the light source differ from scene to scene, the subject in the picture may come out too bright or too dark. Exposure control regulates the intensity and duration of the admitted light, correcting, under different scene conditions, pictures whose subject is too bright or too dark because of the difference between subject and background luminance.

Therefore, many metering methods have been developed to improve pictures whose subject is too bright or too dark, yet common metering methods cannot solve this problem effectively. One prior art example is United States Patent No. 5703644, proposed by Matsushita Electric Industrial Co., entitled "Automatic Exposure Control Apparatus". Its approach cuts the picture into many small unit blocks, calculates the mean luminance of each block, sorts the block mean luminances by magnitude, builds a histogram of block counts against mean luminance, and from this classifies each small unit block as subject or background; exposure correction for back-lit and strongly front-lit scenes is then performed from the mean luminance of the chosen blocks. This approach, however, easily misjudges the scene when the subject moves or the camera rotates. Fig. 1 illustrates such a scene-judgment failure of United States Patent No. 5703644. In that patent, the scene 102 is divided into two zones: the subject region 104 in the middle of the scene, and the background region 106 around its periphery. When the photographed person (the subject) moves left from position 108 to position 110, the subject region 104 is then occupied by background; if that background is a bright area (for example, sky), the patented technique mistakes the sky for an overly bright subject and applies unnecessary negative compensation (dimming) to the photo, producing the scene-judgment failure described above.

To improve pictures whose subject is too bright or too dark, Sony Corporation also proposed United States Patent No. 6690424, entitled "Apparatus for Controlling the Exposure of an Image Pickup Plane in a Camera". It is an exposure control method for back-lit scenes: two luminance reference values are derived from the average luminance of the whole image and used to divide the picture into subject and background regions, and the difference between the average luminance of the whole image and that of the subject region then determines the gain value of the exposure compensation. The method improves judgment accuracy when the subject moves or the camera rotates, but it fails when the subject is under strong front light, over-exposing the subject region. Fig. 2 illustrates such a scene-judgment failure of United States Patent No. 6690424. The horizontal axis represents the mean luminance (Y), the vertical axis the total pixel count (Pixel counts), the two downward arrows 202 and 204 mark the luminance reference values, and ΔY is the luminance difference between the two reference values. When a curve such as 206 appears in the figure, the patented technique judges the photographed scene to be back-lit (Back-lighting) and applies positive compensation. But a curve similar to 206 also appears when the scene is under strong front light; the technique still judges the scene as back-lit and compensates positively, making an already strongly front-lit scene even brighter, which again causes the scene-judgment failure described above.

In addition, United States Patent No. 6853806, proposed by Olympus Optical Co., is entitled "Camera with an Exposure Control Function". It is an exposure control method for back-lit scenes: a distance-measuring sensor and the average luminance of the whole image are used to determine the position of the subject in the picture, and a back-light judging unit then determines the exposure time and intensity. Because the camera structure must include a distance-measuring sensor, it is more complex and its manufacturing cost rises accordingly, so for most camera manufacturers, development companies, and consumers it offers little economic value.

Summary of the invention

An object of the present invention is to provide an automatic exposure control method for determining the exposure bias value of an image. The method uses a ratio-based scene analysis so that, among the photographed scenes, suitable exposure compensation can be applied when back-lighting, strong front-lighting, or a dark scene occurs, while scenes under normal front-lighting or high-light keep their original exposure.

Another object of the present invention is to provide an automatic exposure compensation device that, in cooperation with the proposed automatic exposure control method, applies reasonable exposure compensation to captured images of special scenes.

To these ends, the present invention proposes an automatic exposure control method comprising: setting a bright-area threshold and a dark-area threshold and establishing a coordinate domain containing a plurality of sub-ranges corresponding to different exposure bias values, the coordinate domain reflecting the relation between an image and those sub-ranges; calculating the ratio of the number of pixels whose brightness exceeds the bright-area threshold to the total number of pixels of the image, to obtain a bright-area ratio; calculating the ratio of the number of pixels whose brightness is below the dark-area threshold to the total number of pixels, to obtain a dark-area ratio; determining the position of the image in the coordinate domain from the bright-area ratio and the dark-area ratio; and, when the position of the image in the coordinate domain falls within one of the sub-ranges, compensating the exposure of the image according to the exposure bias value corresponding to that sub-range.
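As a sketch of the ratio steps just described (an illustrative reconstruction, not the patented implementation; the 8-bit threshold values 200 and 50 are assumptions for the example):

```python
def area_ratios(pixels, bright_threshold=200, dark_threshold=50):
    """Return (bright_ratio, dark_ratio) for a flat list of 8-bit luminance values."""
    total = len(pixels)
    bright = sum(1 for p in pixels if p > bright_threshold)  # pixels above the bright-area threshold
    dark = sum(1 for p in pixels if p < dark_threshold)      # pixels below the dark-area threshold
    return bright / total, dark / total

# A toy 8-pixel "image": two bright pixels, two dark pixels, four mid-tones.
x, y = area_ratios([255, 240, 10, 20, 120, 130, 140, 150])
print(x, y)  # 0.25 0.25
```

The two ratios are what locate the image in the coordinate domain in the later steps.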

The present invention further proposes an automatic exposure control method for determining the exposure bias value of an image, comprising: setting a bright-area threshold and a dark-area threshold and establishing a lookup table that maps a plurality of sub-ranges to their corresponding exposure bias values; calculating the ratio of the number of pixels whose brightness exceeds the bright-area threshold to the total number of pixels of the image, to obtain a bright-area ratio; calculating the ratio of the number of pixels whose brightness is below the dark-area threshold to the total number of pixels, to obtain a dark-area ratio; looking up the exposure bias value corresponding to the image in the lookup table according to the bright-area ratio and the dark-area ratio; and compensating the exposure of the image according to that exposure bias value.

The present invention also proposes an automatic exposure compensation device comprising a brightness statistics unit, an index calculator, and a lookup table. The brightness statistics unit counts the bright-area pixels of the image whose brightness exceeds a preset bright-area threshold and the dark-area pixels whose brightness is below a preset dark-area threshold. The input of the index calculator is coupled to the output of the brightness statistics unit, producing at least one index from the bright-area pixel count and the dark-area pixel count. The input of the lookup table is coupled to the output of the index calculator; the table obtains the exposure bias value according to the index and outputs it.

As described above, the present invention uses a ratio-based scene analysis so that the exposure for every scene condition can be properly determined. The scene conditions that occur in photography (high-light, normal front-lighting, back-lighting, strong back-lighting, strong front-lighting, and dark scenes) are all contained in the sub-ranges of the established coordinate domain. Each photographed image is located in the coordinate domain by its bright-area ratio and dark-area ratio, and its exposure is then suitably compensated according to the exposure bias value of the sub-range in which it falls. The method of the present invention therefore rarely fails to judge the scene, and every scene condition receives suitable exposure compensation.

So that the above objects, features, and advantages of the present invention may become more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.

Description of drawings

Fig. 1 illustrates a scene-judgment failure of United States Patent No. 5703644.

Fig. 2 illustrates a scene-judgment failure of United States Patent No. 6690424.

Fig. 3 is a flow chart of the automatic exposure control method according to an embodiment of the invention.

Fig. 4 is a flow chart of the step of establishing the coordinate domain according to an embodiment of the invention.

Fig. 5a, Fig. 5b, Fig. 5c, Fig. 5d, Fig. 5e, and Fig. 5f are brightness histograms under different scene conditions according to an embodiment of the invention.

Fig. 6 is a coordinate domain diagram according to an embodiment of the invention.

Fig. 7 is an angle-distance (θ-γ) coordinate domain plane diagram according to an embodiment of the invention.

Fig. 8 is an angle-distance coordinate domain plane diagram according to an embodiment of the invention.

Fig. 9 is an exposure difference table according to an embodiment of the invention.

Fig. 10 is a flow chart for defining the exposure bias value according to an embodiment of the invention.

Fig. 11 is a diagram explaining the weighted averaging of exposure bias values according to an embodiment of the invention.

Fig. 12 is a block diagram of a digital image pickup device adopting the automatic exposure compensation device according to an embodiment of the invention.

Fig. 13 is a flow chart according to another embodiment of the invention.

Fig. 14 is a flow chart of establishing the lookup table according to another embodiment of the invention.

Fig. 15 is a coordinate domain plane diagram according to another embodiment of the invention.

Embodiment

To reduce scene-judgment failures, so that every scene condition receives suitable exposure compensation, the invention provides an automatic exposure control method different from the known art. Its technical content is explained through the embodiments disclosed in the accompanying drawings:

Please refer to Fig. 3, the flow chart of the automatic exposure control method according to an embodiment of the invention. Fig. 3 is divided into five steps, in the order 302, 306, 308, 310, 312. Step 302 sets a bright-area threshold and a dark-area threshold and establishes the coordinate domain; its details are explained with the flow chart of Fig. 4, which shows the step of establishing the coordinate domain according to an embodiment of the invention.

Fig. 4 is divided into six steps, in the order 402, 404, 406, 408, 410, 412. Step 402 provides a plurality of sample images. Step 404 assigns the exposure value corresponding to each scene class. The reason for assigning corresponding exposure values is explained with the brightness histograms under different scene conditions shown in Fig. 5a to Fig. 5f according to an embodiment of the invention. The horizontal axis in Fig. 5a to Fig. 5f represents the gray-scale value and the vertical axis the pixel count. Two downward arrows 502 and 504 appear in each of Fig. 5a to Fig. 5f, marking the bright-area threshold and the dark-area threshold respectively, thereby defining the bright area 518 and the dark area 520. Photographed scenes can be roughly classified as: normal front-lighting, dark scene, back-lighting, strong front-lighting, strong back-lighting, and high-light, corresponding respectively to curves 506, 508, 510, 512, 514, 516 in the brightness histograms of Fig. 5a to Fig. 5f. As Fig. 5a to Fig. 5f show clearly, the pixel values of images captured under different scene conditions yield brightness histograms with distinct features in the bright area and the dark area. Curve 506 represents the ideal scene condition, so when a scene condition represented by curve 508, 510, 512, 514, or 516 appears, a correction can be made according to the assigned exposure value to achieve the effect of the ideal scene condition.

Step 406 calculates the bright-area ratio of each sample image: the ratio of the number of pixels whose brightness exceeds the bright-area threshold (that is, the pixels of bright area 518) to the total number of pixels of the image. Step 408 calculates the dark-area ratio of each sample image: the ratio of the number of pixels whose brightness is below the dark-area threshold (that is, the pixels of dark area 520) to the total number of pixels. Step 410 determines the positions of the samples in the coordinate domain. Please refer to the coordinate domain diagram of Fig. 6 according to an embodiment of the invention, whose horizontal axis represents the bright-area ratio of an image and whose vertical axis represents its dark-area ratio. Image data from a large number of different scenes are accumulated on this coordinate plane; that is, each image forms a corresponding coordinate point on the plane from its calculated bright-area ratio and dark-area ratio. Since a pixel cannot be both brighter than the bright-area threshold and darker than the dark-area threshold, the two ratios sum to at most 1, so all image coordinate points fall in the lower half below the oblique line 602 shown in Fig. 6, whose mathematical expression is X+Y=1.

Next, to widen the separation between the scenes, the coordinate points that all images form in this coordinate domain can be converted, through a coordinate conversion, onto the angle-distance (θ-γ) coordinate domain plane shown in Fig. 7 according to an embodiment of the invention. Here θ in Fig. 7 is the angle between the horizontal axis and the straight line from the origin to the coordinate point of Fig. 6, and γ in Fig. 7 is the sum of the bright-area ratio (horizontal coordinate) and the dark-area ratio (vertical coordinate) of the point in Fig. 6; on this angle-distance coordinate plane, the image coordinate points of the different scene conditions are distributed in clusters.
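A minimal sketch of this coordinate conversion, under the reading that θ is the angle of the point seen from the origin and γ is the sum of the two ratios (expressing θ in degrees is an assumption of the example):

```python
import math

def to_angle_distance(x, y):
    """Convert a (bright_ratio, dark_ratio) point to (theta, gamma).

    theta: angle between the horizontal axis and the line from the
           origin to (x, y), per Fig. 6 / Fig. 7 of the description.
    gamma: sum of the two ratios (at most 1, since the bright and
           dark pixel sets are disjoint).
    """
    theta = math.degrees(math.atan2(y, x))
    gamma = x + y
    return theta, gamma

print(to_angle_distance(0.3, 0.3))  # approximately (45.0, 0.6)
```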

Step 412 defines each sub-range. Since the image coordinate points of the different scene conditions are distributed in clusters on the coordinate domain plane of Fig. 7, the angle-distance plane can be divided into blocks, and each block identified as a sub-range for high-light, normal front-lighting, back-lighting, strong back-lighting, strong front-lighting, or dark scene. The arrangement of the sub-ranges is shown in Fig. 7, where the blocks 702, 704, 706, 708, 710 correspond respectively to high-light; back-lighting and strong back-lighting; strong front-lighting; dark scene; and normal front-lighting. In this embodiment, back-lighting and strong back-lighting both belong to block 704 and therefore share the same compensation value, and the exposure bias values of these scenes are +0, +1, -1, +2, +0 respectively.

Beyond the sub-range layout above, to make the compensation more precise, those skilled in the art may define the sub-ranges freely according to the required accuracy, for example by dividing the whole angle-distance coordinate plane into more blocks. The angle-distance plane of Fig. 8, divided into 25 blocks according to an embodiment of the invention, is one example; it performs exposure compensation according to the exposure difference table of Fig. 9 according to an embodiment of the invention. Each block in Fig. 8 corresponds to one exposure difference in Fig. 9; for example, 802 corresponds to 902, 804 to 904, and 806 to 906. In this embodiment, the normal front-lighting and high-light regions are in a normal lighting state, so their exposure value is not adjusted; the back-lit and strongly back-lit regions need fill light because the subject is too dark, so their exposure differences are positive, with the strongly back-lit region having the larger positive difference; the strong front-lighting region needs dimming because the subject is too bright, so its exposure difference is negative; and the dark-scene region needs fill light to raise the overall brightness of the picture because its overall brightness is too low, so its exposure difference is positive. As the above shows, by establishing the corresponding exposure difference table, accurate exposure correction control can be performed for every scene condition.
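The block lookup can be sketched as a small table indexed by quantized (θ, γ). The grid boundaries and bias values below are invented for illustration and only follow the signs described above (positive for back-lit and dark regions, negative for strong front light, zero otherwise); the text does not give the numeric contents of Fig. 9.

```python
# Hypothetical 5x5 quantization of the angle-distance plane (Fig. 8 uses
# 25 blocks); the bias values only follow the signs described in the text.
THETA_BINS = 5          # theta quantized into 5 bands over 0..90 degrees
GAMMA_BINS = 5          # gamma quantized into 5 bands over 0..1
EXPOSURE_DIFF = [
    [0, 0, 0, 0, 0],    # bright-dominated band: high-light, no adjustment
    [0, 0, -1, -1, -1], # strong front-light band: negative (dim)
    [0, 1, 1, 2, 2],    # back-lit band: positive (fill light)
    [0, 1, 2, 2, 2],    # strong back-light band: larger positive
    [0, 0, 1, 1, 2],    # dark-dominated band: positive (fill light)
]

def exposure_bias(theta, gamma):
    """Quantize (theta, gamma) into a block and look up its exposure difference."""
    ti = min(int(theta / 90 * THETA_BINS), THETA_BINS - 1)
    gi = min(int(gamma * GAMMA_BINS), GAMMA_BINS - 1)
    return EXPOSURE_DIFF[ti][gi]

print(exposure_bias(50.0, 0.75))  # block (2, 3) of this toy table -> 2
```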

After the sub-ranges have been defined, refer again to Fig. 3. Once the image data is input, the flow proceeds to step 306, which calculates the bright-area ratio of the image, that is, the ratio of the number of pixels whose brightness exceeds the bright-area threshold to the total number of pixels. The flow then proceeds to step 308, which calculates the dark-area ratio of the image, that is, the ratio of the number of pixels whose brightness is below the dark-area threshold to the total number of pixels. Step 310 then determines the position of the image in the coordinate domain from the bright-area ratio and the dark-area ratio.

Next is step 312, which compensates the exposure of the image according to the exposure bias value of the sub-range in which the image lies. This step can be subdivided into four sub-steps; please refer to the flow chart for defining the exposure bias value according to an embodiment of the invention shown in Fig. 10. The first sub-step 1002 defines the border region of each sub-range. The reason for defining border regions, and the method of definition, are explained as follows:

When the coordinate point of an image falls within one of the defined sub-ranges, exposure compensation is performed according to the exposure difference of that sub-range. But if the coordinate point falls within a sub-range whose scene class is not the scene class of the image, the weighted average of the exposure bias value of the sub-range containing the point and the exposure bias value of the adjacent sub-range is used as the exposure bias value of the image. The method is explained with the weighted-average diagram of exposure bias values according to an embodiment of the invention shown in Fig. 11. The scene of sub-range 1102 is high-light; the scene of sub-range 1104 is back-lighting and strong back-lighting. Border region 1106 is the maximum extent to which sample images of back-lit and strongly back-lit scenes are distributed inside sub-range 1102; border region 1108 is the maximum extent to which sample images of high-light scenes are distributed inside sub-range 1104. The border region of a sub-range is thus defined by the maximum extent to which sample images of the adjacent sub-range's scene class are distributed inside the present sub-range, and other sub-ranges in the same situation define their mutual border regions in the same way. Coordinate point 1110 is the coordinate point of a high-light image that falls within sub-range 1104 (back-lighting and strong back-lighting). C1 is the exposure bias value of sub-range 1102; C2 is that of sub-range 1104. Curve 1112 is the C2 ratio line: its value inside sub-range 1104 is 1, and after crossing the boundary line 1116 between sub-ranges 1102 and 1104 it decreases linearly to zero at the maximum extent to which sample images of the scene class of sub-range 1104 are distributed inside sub-range 1102. Curve 1114 is likewise the C1 ratio line: its value inside sub-range 1102 is 1, and after crossing boundary line 1116 it decreases linearly to zero at the maximum extent to which sample images of the scene class of sub-range 1102 are distributed inside sub-range 1104. U1 is the ratio value of coordinate point 1110 on curve 1114; U2 is its ratio value on curve 1112.

When the coordinate point 1110 of a high-light image falls within sub-range 1104 (back-lighting and strong back-lighting), its weighted-average exposure bias value can be calculated from formula (1) below:

(C1 × U1 + C2 × U2) / (U1 + U2) ......... (1)

The calculation of formula (1) is not limited to coordinate points falling in the border region between sub-ranges 1102 and 1104; when a coordinate point falls within any other border region, its exposure bias value can likewise be calculated by the weighted-average method above.
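Formula (1) itself is a one-liner. In this sketch, C1 = +0 (high-light) and C2 = +1 (back-lighting) follow the bias values of the embodiment above, while the ratio-line values u1 and u2 are assumed for the example.

```python
def weighted_bias(c1, u1, c2, u2):
    """Formula (1): blend the bias values of two adjacent sub-ranges
    by the ratio-line values of the coordinate point."""
    return (c1 * u1 + c2 * u2) / (u1 + u2)

# A high-light image point just inside the back-lit sub-range 1104:
# mostly weighted toward the high-light bias C1 = 0, partly toward C2 = +1.
print(weighted_bias(0, 0.75, 1, 0.25))  # 0.25
```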

When the border regions have been defined (that is, step 1002 of Fig. 10 is done), the next step 1004 can be carried out. Referring to Fig. 10, step 1004 judges whether the position of the image in the coordinate domain falls outside the border regions. If the judgment is yes, the flow proceeds to step 1006, which performs exposure compensation according to the exposure bias value of the sub-range containing the image; if the judgment is no, the flow proceeds to step 1008, which performs exposure compensation according to the weighted average of the exposure bias value of the sub-range containing the image and that of the adjacent sub-range.

In the automatic exposure control method described above, steps 306 and 308 of Fig. 3 have no fixed order relative to each other and may be interchanged; likewise, steps 406 and 408 of Fig. 4 may be interchanged. Moreover, the judgement in step 1004 of Figure 10 is not limited to testing whether the position of the image in the coordinate domain falls outside the boundary regions; it may instead test whether the position falls inside a boundary region, with steps 1006 and 1008 executed according to that result.
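Steps 1004 to 1008 amount to a simple branch; the following is a hedged Python sketch of that decision, where the function name and the `in_boundary_region` flag are illustrative assumptions rather than terms from the patent:

```python
def compensated_bias(home_bias, neighbour_bias, u1, u2, in_boundary_region):
    """Steps 1004-1008: outside the boundary region (step 1006), use the
    home sub-range's exposure bias value directly; inside it (step 1008),
    use the weighted average of the home and adjacent sub-ranges' bias
    values, with the ratio-line values u1 and u2 as weights."""
    if not in_boundary_region:
        return home_bias
    return (home_bias * u1 + neighbour_bias * u2) / (u1 + u2)
```

The branch mirrors the flow chart: the weighted average is only paid for when the image actually lies near a sub-range boundary.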

Figure 12 is a partial block diagram of a digital image pickup device according to an embodiment of the invention. The device comprises a digital signal processor 1202, an automatic exposure compensation apparatus 1204, a light metering unit 1206 and an adder 1216. The automatic exposure compensation apparatus 1204 comprises a brightness statistics unit 1210, an index calculation unit 1212 and a lookup table 1214. In Figure 12, Input is the image-data input, and θ and γ are the indices, i.e. the abscissa and ordinate values on the angle-distance coordinate plane.

In Figure 12, the digital signal processor 1202 receives the image-data input Input, performs image processing, and outputs the processed image data to the light metering unit 1206 and the brightness statistics unit 1210. The brightness statistics unit 1210 counts the bright-state pixels in the image data whose brightness is greater than the bright-state threshold, and counts the dark-state pixels whose brightness is less than the dark-state threshold; both thresholds are set in advance. The index calculation unit 1212 is coupled to the brightness statistics unit 1210 and produces at least one index according to the bright-state pixel count and the dark-state pixel count.

In another embodiment, the brightness statistics unit 1210 further calculates the ratio of the bright-state pixel count to the total pixel count of the image, and the ratio of the dark-state pixel count to the total pixel count, obtaining the image's bright-state ratio value and dark-state ratio value respectively. The index calculation unit 1212 then produces at least one index according to the bright-state ratio value and dark-state ratio value output by the brightness statistics unit 1210.
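The brightness statistics just described reduce to two threshold counts and two normalisations; the following is a minimal sketch under the assumption that the image arrives as a flat list of per-pixel brightness values (the function name and argument layout are illustrative, not taken from the patent):

```python
def brightness_statistics(pixels, bright_threshold, dark_threshold):
    """Brightness statistics unit 1210: count bright-state and
    dark-state pixels against the preset thresholds, then normalise by
    the total pixel count to obtain the two ratio values."""
    total = len(pixels)
    bright = sum(1 for p in pixels if p > bright_threshold)  # bright-state count
    dark = sum(1 for p in pixels if p < dark_threshold)      # dark-state count
    return bright / total, dark / total                      # ratio values
```

In hardware this would be two comparators and two counters per frame; the Python form only illustrates the arithmetic.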

In this embodiment, the index calculation unit 1212 produces the indices θ and γ according to the bright-state ratio value and dark-state ratio value, and outputs them to the lookup table 1214, which returns the corresponding exposure bias value. The adder 1216 then adds the exposure bias value output by the lookup table 1214 to the output of the light metering unit 1206 to produce the exposure control value Output1. The digital image pickup device (for example a digital camera) can then decide how to adjust the exposure according to Output1 (for example, by using this exposure control value to control the camera's shutter and aperture). The content of the lookup table 1214 must be established in advance; the way it is built is described in detail later with reference to Figure 14.
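The adder stage can be sketched in a few lines; here the lookup table 1214 is modelled as a plain Python dict keyed by a quantised index pair (θ, γ), which is an assumption about the representation, not a description of the patent's hardware:

```python
def exposure_control_value(metered_value, theta, gamma, bias_table):
    """Adder 1216: the metered exposure from unit 1206 plus the bias
    value the lookup table returns for the index pair (theta, gamma)."""
    return metered_value + bias_table[(theta, gamma)]
```

Keeping the table lookup and the addition as the only per-frame work is what makes the scheme cheap: all the scene-classification effort is spent once, when the table is built.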

In the spirit of the present invention, the automatic exposure control method can also be applied according to another embodiment, shown in Figure 13, which is divided into six steps. Step 1302 sets a bright-state threshold and a dark-state threshold and builds a lookup table; it can itself be subdivided into seven sub-steps. Please refer to Figure 14, the flow chart for building the lookup table according to this embodiment. Steps 1402, 1404, 1406, 1408, 1410 and 1412 have the same content as steps 402, 404, 406, 408, 410 and 412 of Fig. 4, so they are not repeated here. After step 1412 defines the sub-ranges, this embodiment further executes step 1414, which completes the lookup table according to the lookup relation between the sub-ranges and their corresponding exposure bias values. After this step, the following steps are steps 1306 and 1308.

Referring back to Figure 13, steps 1306 and 1308 have the same content as steps 306 and 308 of Fig. 3, so they too are not repeated here. After the bright-state and dark-state ratio values are calculated, step 1310 looks up the image's corresponding exposure bias value in the lookup table according to the bright-state ratio value and the dark-state ratio value. The final step 1312 then compensates the exposure of the image according to that exposure bias value.

Figure 15 is a coordinate-plane diagram according to another embodiment of the present invention. In Figure 15, image data from a large number of different scenes are accumulated on the coordinate plane formed by the image's bright-state ratio value (horizontal axis) and dark-state ratio value (vertical axis); this plane is divided into several blocks, identified respectively as high-light, normal front-lighting, back-lighting and strong back-lighting, strong front-lighting, and dark scenes, as shown in the figure. The automatic exposure correction first meters with an initially selected exposure value to decide whether the scene is a dark scene; if so, it is handled by the fill-light procedure for dark scenes. If not, the block the image belongs to is determined from the sum of the bright-state and dark-state ratio values (horizontal-axis value + vertical-axis value) and their difference (horizontal-axis value − vertical-axis value), and the exposure difference is obtained by looking it up in the exposure-difference table built for the corresponding block. Automatic exposure correction is thus achieved without any coordinate conversion (from the rectangular-axis coordinate plane to the angle-distance plane), simplifying the computation and reducing the related hardware required.
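The sum/difference classification of Figure 15 can be sketched as follows; the block edge values and the dark-scene flag are illustrative assumptions, since the patent leaves the concrete partition to the accumulated scene data:

```python
import bisect

def pick_block(bright_ratio, dark_ratio, is_dark_scene,
               sum_edges=(0.3, 0.6), diff_edges=(-0.2, 0.2)):
    """Classify a scene directly on the rectangular-axis plane.

    Dark scenes are handled separately (fill-light path); otherwise the
    block is selected from the sum and difference of the two ratio
    values, so no conversion to the angle-distance plane is needed.
    """
    if is_dark_scene:
        return "dark"
    s = bright_ratio + dark_ratio   # horizontal-axis value + vertical-axis value
    d = bright_ratio - dark_ratio   # horizontal-axis value - vertical-axis value
    # The pair of interval indices selects which per-block
    # exposure-difference table to consult.
    return (bisect.bisect(sum_edges, s), bisect.bisect(diff_edges, d))
```

The returned pair would then index the exposure-difference table established for the corresponding block; only two additions and two interval searches are needed per frame.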

In summary, with the automatic exposure control method of the present invention, suitable exposure compensation can be applied when the photographed scene involves back-lighting, strong front-lighting (strong front-lighting) or a dark scene (dark), while scenes with normal front-lighting (normal) or high light (high-light) keep their original exposure. The resulting pictures are therefore not only clearer, but also closer to the image the eye actually perceives.

Although the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit it. Anyone skilled in the art may make minor changes and refinements without departing from the spirit and scope of the invention; the protection scope of the invention shall therefore be defined by the appended claims.

Claims (19)

1. An automatic exposure control method for determining the exposure bias value of an image, the method comprising:

setting a bright-state threshold and a dark-state threshold and establishing a coordinate domain, wherein the coordinate domain contains a plurality of sub-ranges corresponding to different exposure bias values and reflects the relation between the image and those sub-ranges;

calculating the ratio of the number of pixels in the image whose brightness is greater than the bright-state threshold to the total pixel count of the image, to obtain a bright-state ratio value;

calculating the ratio of the number of pixels in the image whose brightness is less than the dark-state threshold to the total pixel count of the image, to obtain a dark-state ratio value;

determining the position of the image in the coordinate domain according to the bright-state ratio value and the dark-state ratio value; and

when the position of the image in the coordinate domain falls within one of the sub-ranges, compensating the exposure of the image according to the exposure bias value corresponding to the sub-range in which the image lies.

2. The automatic exposure control method as claimed in claim 1, wherein the step of setting the bright-state threshold and the dark-state threshold and establishing the coordinate domain comprises:

providing a plurality of sample images, wherein each sample image belongs to one of a plurality of scene categories;

assigning the exposure bias value corresponding to each of the scene categories;

calculating, for each sample image, the ratio of the number of pixels whose brightness is greater than the bright-state threshold to its total pixel count, to obtain the bright-state ratio value of each sample image;

calculating, for each sample image, the ratio of the number of pixels whose brightness is less than the dark-state threshold to its total pixel count, to obtain the dark-state ratio value of each sample image;

determining the positions of the sample images in the coordinate domain according to the dark-state ratio value and the bright-state ratio value of each sample image; and

defining the sub-ranges according to the distribution in the coordinate domain of the sample images belonging to the same scene category.

3. The automatic exposure control method as claimed in claim 2, wherein the scene categories comprise high-light, normal front-lighting, back-lighting, strong back-lighting, strong front-lighting and dark scenes.

4. The automatic exposure control method as claimed in claim 2, wherein the step of compensating the exposure of the image according to the exposure bias value corresponding to the sub-range in which the image lies comprises:

defining, for each sub-range, a boundary region near its boundary;

when the position of the image in the coordinate domain falls within one of the sub-ranges and outside the boundary region of that sub-range, taking the exposure bias value corresponding to that sub-range as the exposure bias value of the image; and

when the position of the image in the coordinate domain falls within one of the sub-ranges and inside the boundary region of that sub-range, taking the weighted average of the exposure bias value corresponding to that sub-range and the exposure bias value corresponding to the adjacent sub-range as the exposure bias value of the image.

5. The automatic exposure control method as claimed in claim 4, wherein, for a given one of the sub-ranges, the maximum extent to which the sample images of the scene category corresponding to the adjacent sub-range are distributed within the given sub-range is defined as the boundary region of the given sub-range.

6. The automatic exposure control method as claimed in claim 1, which is used for the exposure control of a digital image pickup device.

7. The automatic exposure control method as claimed in claim 6, wherein the digital image pickup device comprises a digital camera.

8. An automatic exposure control method for determining the exposure bias value of an image, the method comprising:

setting a bright-state threshold and a dark-state threshold and establishing a lookup table, the lookup table recording the lookup relation between a plurality of sub-ranges and their corresponding exposure bias values;

calculating the ratio of the number of pixels in the image whose brightness is greater than the bright-state threshold to the total pixel count of the image, to obtain a bright-state ratio value;

calculating the ratio of the number of pixels in the image whose brightness is less than the dark-state threshold to the total pixel count of the image, to obtain a dark-state ratio value;

looking up the exposure bias value corresponding to the image in the lookup table according to the bright-state ratio value and the dark-state ratio value; and

compensating the exposure of the image according to the exposure bias value.

9. The automatic exposure control method as claimed in claim 8, wherein the step of establishing the lookup table comprises:

providing a plurality of sample images, wherein each sample image belongs to one of a plurality of scene categories;

assigning the exposure bias value corresponding to each of the scene categories;

calculating, for each sample image, the ratio of the number of pixels whose brightness is greater than the bright-state threshold to its total pixel count, to obtain the bright-state ratio value of each sample image;

calculating, for each sample image, the ratio of the number of pixels whose brightness is less than the dark-state threshold to its total pixel count, to obtain the dark-state ratio value of each sample image;

determining the positions of the sample images in a coordinate domain according to the dark-state ratio value and the bright-state ratio value of each sample image;

defining the sub-ranges according to the distribution in the coordinate domain of the sample images belonging to the same scene category; and

completing the lookup table according to the lookup relation between the sub-ranges and their corresponding exposure bias values.

10. The automatic exposure control method as claimed in claim 9, wherein the scene categories comprise high-light, normal front-lighting, back-lighting, strong back-lighting, strong front-lighting and dark scenes.

11. The automatic exposure control method as claimed in claim 8, which is used for the exposure control of a digital image pickup device.

12. The automatic exposure control method as claimed in claim 11, wherein the digital image pickup device comprises a digital camera.

13. An automatic exposure compensation apparatus for determining the exposure bias value of an image, the apparatus comprising:

a brightness statistics unit for counting the bright-state pixels in the image whose brightness is greater than a preset bright-state threshold, and for counting the dark-state pixels whose brightness is less than a preset dark-state threshold;

an index calculation unit, coupled to the brightness statistics unit, for producing at least one index according to the bright-state pixel count and the dark-state pixel count; and

a lookup table, coupled to the index calculation unit, for obtaining an exposure bias value according to the index and outputting the exposure bias value.

14. The automatic exposure compensation apparatus as claimed in claim 13, wherein

the brightness statistics unit further calculates the ratio of the bright-state pixel count to the total pixel count of the image, to obtain a bright-state ratio value, and calculates the ratio of the dark-state pixel count to the total pixel count of the image, to obtain a dark-state ratio value; and

the index calculation unit produces the index according to the bright-state ratio value and the dark-state ratio value.

15. The automatic exposure compensation apparatus as claimed in claim 13, wherein the method by which the lookup table obtains the exposure bias value according to the index comprises:

establishing a coordinate domain, wherein the coordinate domain contains a plurality of sub-ranges corresponding to different exposure bias values and reflects the relation between the image and those sub-ranges;

determining the position of the image in the coordinate domain according to the index; and

when the position of the image in the coordinate domain falls within one of the sub-ranges, outputting the exposure bias value corresponding to the sub-range in which the image lies.

16. The automatic exposure compensation apparatus as claimed in claim 15, wherein the step of establishing the coordinate domain comprises:

providing a plurality of sample images, wherein each sample image belongs to one of a plurality of scene categories;

assigning the exposure bias value corresponding to each of the scene categories;

calculating, for each sample image, the ratio of the number of pixels whose brightness is greater than the bright-state threshold to its total pixel count, to obtain the bright-state ratio value of each sample image;

calculating, for each sample image, the ratio of the number of pixels whose brightness is less than the dark-state threshold to its total pixel count, to obtain the dark-state ratio value of each sample image;

determining the positions of the sample images in the coordinate domain according to the dark-state ratio value and the bright-state ratio value of each sample image; and

defining the sub-ranges according to the distribution in the coordinate domain of the sample images belonging to the same scene category.

17. The automatic exposure compensation apparatus as claimed in claim 16, wherein the scene categories comprise high-light, normal front-lighting, back-lighting, strong back-lighting, strong front-lighting and dark scenes.

18. The automatic exposure compensation apparatus as claimed in claim 13, which is used for the exposure compensation control of a digital image pickup device.

19. The automatic exposure compensation apparatus as claimed in claim 18, wherein the digital image pickup device comprises a digital camera.

CNB2005101343391A 2005-12-14 2005-12-14 Method for automatically controlling exposure and device for automatically compensating exposure Active CN100515041C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2005101343391A CN100515041C (en) 2005-12-14 2005-12-14 Method for automatically controlling exposure and device for automatically compensating exposure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2005101343391A CN100515041C (en) 2005-12-14 2005-12-14 Method for automatically controlling exposure and device for automatically compensating exposure

Publications (2)

Publication Number Publication Date
CN1984260A CN1984260A (en) 2007-06-20
CN100515041C true CN100515041C (en) 2009-07-15

Family

ID=38166443

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2005101343391A Active CN100515041C (en) 2005-12-14 2005-12-14 Method for automatically controlling exposure and device for automatically compensating exposure

Country Status (1)

Country Link
CN (1) CN100515041C (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111726506A (en) * 2020-06-30 2020-09-29 深圳市精锋医疗科技有限公司 Image processing method, device and storage medium for endoscope

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101873433B (en) * 2009-04-27 2012-06-06 上海乐金广电电子有限公司 Automatic backlight adjusting device of camera and method thereof
CN102420944B (en) * 2011-04-25 2013-10-16 展讯通信(上海)有限公司 High dynamic-range image synthesis method and device
CN102148936B (en) * 2011-05-04 2012-10-17 展讯通信(上海)有限公司 High dynamic range imaging optimization method and device
CN102752512B (en) * 2011-11-30 2017-06-13 新奥特(北京)视频技术有限公司 A kind of method for adjusting image exposure effect
US8982237B2 (en) * 2011-12-09 2015-03-17 Htc Corporation Portable electronic device with auto-exposure control adaptive to environment brightness and image capturing method using the same
CN102724405B (en) * 2012-06-21 2015-04-08 无锡鸿图微电子技术有限公司 Method and device for automatic exposure compensation of backlit scenes in video imaging system
CN102846328B (en) * 2012-08-23 2014-07-02 上海奕瑞光电子科技有限公司 Automatic exposure controlling device and controlling method for digital photography
CN103826066B (en) * 2014-02-26 2017-05-03 芯原微电子(上海)有限公司 Automatic exposure adjusting method and system
CN105227841A (en) * 2015-10-13 2016-01-06 广东欧珀移动通信有限公司 To take pictures reminding method and device
US10306152B1 (en) * 2018-02-14 2019-05-28 Himax Technologies Limited Auto-exposure controller, auto-exposure control method and system based on structured light
CN111050087B (en) * 2019-12-26 2021-03-09 杭州涂鸦信息技术有限公司 Infrared anti-overexposure method based on RGB statistical information and system and device thereof
CN112804461A (en) * 2021-02-01 2021-05-14 深圳瑞为智能科技有限公司 Method for automatically adjusting AE target brightness
CN115883983A (en) * 2021-09-28 2023-03-31 瑞昱半导体股份有限公司 Image processing method, processor, and non-transitory computer-readable storage medium
CN118175430B (en) * 2024-05-11 2024-08-02 杭州海康威视数字技术股份有限公司 Automatic exposure control method and device and image pickup equipment


Also Published As

Publication number Publication date
CN1984260A (en) 2007-06-20

Similar Documents

Publication Publication Date Title
US7535511B2 (en) 2009-05-19 Automatic exposure control method and automatic exposure compensation apparatus
CN100515041C (en) 2009-07-15 Method for automatically controlling exposure and device for automatically compensating exposure
CN100377575C (en) 2008-03-26 Image processing method, image processing apparatus, and computer program used therewith
CN100594736C (en) 2010-03-17 Image capture apparatus and control method thereof
US7791652B2 (en) 2010-09-07 Image processing apparatus, image capture apparatus, image output apparatus, and method and program for these apparatus
CN102801918B (en) 2015-08-05 Picture pick-up device and the method for controlling this picture pick-up device
CN1732682B (en) 2010-04-21 Image processing device
US20070047803A1 (en) 2007-03-01 Image processing device with automatic white balance
JP4960605B2 (en) 2012-06-27 Automatic exposure compensation method and compensation device
CN101166240A (en) 2008-04-23 Image processing device, image forming device and image processing method
US20140063288A1 (en) 2014-03-06 Imaging apparatus, electronic device and method providing exposure compensation
US20110115898A1 (en) 2011-05-19 Imaging apparatus, imaging processing method and recording medium
EP1052848A1 (en) 2000-11-15 Image-sensing apparatus
CN106358030A (en) 2017-01-25 Image processing apparatus and image processing method
Hertel et al. 2007 Image quality standards in automotive vision applications
TW201919385A (en) 2019-05-16 Defective pixel compensation method and device
CN101534375B (en) 2013-09-11 Method for correcting chromatic aberration
US8599280B2 (en) 2013-12-03 Multiple illuminations automatic white balance digital cameras
CN101616254B (en) 2012-05-23 Image processing apparatus and image processing method
US6944398B2 (en) 2005-09-13 Photometric device and camera
JP2003087646A (en) 2003-03-20 Imaging apparatus
JP3554069B2 (en) 2004-08-11 Imaging device
US11496660B2 (en) 2022-11-08 Dual sensor imaging system and depth map calculation method thereof
US7898591B2 (en) 2011-03-01 Method and apparatus for imaging using sensitivity coefficients
JP4231599B2 (en) 2009-03-04 Imaging device

Legal Events

Date Code Title Description
2007-06-20 C06 Publication
2007-06-20 PB01 Publication
2007-08-15 C10 Entry into substantive examination
2007-08-15 SE01 Entry into force of request for substantive examination
2009-07-15 C14 Grant of patent or utility model
2009-07-15 GR01 Patent grant
2013-02-27 ASS Succession or assignment of patent right

Owner name: ICATCH TECHNOLOGY INC.

Free format text: FORMER OWNER: LINGYANG SCIENCE AND TECHNOLOGY CO., LTD.

Effective date: 20130124

2013-02-27 C41 Transfer of patent application or patent right or utility model
2013-02-27 TR01 Transfer of patent right

Effective date of registration: 20130124

Address after: 4F, No. 19-1, Innovation 1st Road, Science Park, Hsinchu, Taiwan, China

Patentee after: iCatch Technology, Inc.

Address before: Hsinchu, Taiwan, China

Patentee before: Lingyang Science and Technology Co., Ltd.