CN101673399A - Calibration method of coded structured light three-dimensional vision system - Google Patents
Publication number
- CN101673399A (application CN200910153612A) Authority
- CN
- China Prior art keywords
- calibration
- points
- camera
- point
- projector Prior art date
- 2009-09-29 Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
A calibration method for a coded structured light three-dimensional vision system comprises the following steps: 1) camera calibration: a two-dimensional calibration template is used; the target consists of a standard black-and-white checkerboard whose corner points form the feature points; the camera captures several images of the target at different positions, the checkerboard corners are selected as feature points in these images, and the camera is calibrated with the plane calibration method to obtain the intrinsic parameter matrix K and the mean focal length f; 2) projector calibration: the coded light template is projected onto the same coplanar target, the black-and-white checkerboard corners are used as anchor points, and, using the images acquired during camera calibration, the image coordinates of these corners are extracted and the corresponding world coordinates are generated. The method achieves high camera calibration accuracy and satisfies the requirements of practical applications.
Description
Technical field
The present invention relates to computer vision, data processing and image processing, and in particular to a calibration method for a structured light three-dimensional vision system.
Background art
Structured light three-dimensional vision recovers three-dimensional shape from the two-dimensional images captured by a camera, and a structured light system has the advantage of high precision; the calibration of the camera and the projector plays an important role in this. The positional relationships of the camera and the projector to the object, and the relationship between the projector and the camera, are determined by the imaging geometry models of the camera and the projector. The parameters of these two geometric models are called the system parameters, and they must be determined by experiment and computation; this process is called structured light system calibration. System calibration obtains the relationships between the sensors and their own parameters, and the calibration accuracy determines the reconstruction error of the whole system; moreover, the nonlinear characteristics of the camera and the projector are usually neglected completely, which makes the model inaccurate. Calibration methods for structured light systems fall mainly into direct calibration and step-by-step calibration. In the prior art, camera calibration technology is very mature: the classical nonlinear model is used directly to calibrate the camera, and the calibration accuracy satisfies practical applications.
In the prior art, however, the calibration of the projector depends on the relationship between the projector and the camera; its calibration accuracy is low and cannot meet the needs of practical applications.
Summary of the invention
To overcome the shortcoming of existing calibration methods for coded structured light three-dimensional vision systems, namely that the projector calibration accuracy is low and cannot meet the needs of practical applications, the present invention provides a calibration method for a coded structured light vision imaging system that has high camera calibration accuracy and satisfies the needs of practical applications.
The technical solution adopted by the present invention to solve the technical problem is:
A calibration method for a coded structured light vision imaging system, the calibration method comprising the following steps:
1), camera calibration process:
A two-dimensional calibration template is used. The target consists of a standard black-and-white checkerboard, and its corner points form the feature points. The camera captures several images of the target at different positions; the checkerboard corners are selected as feature points in these images, and the camera is calibrated with the plane calibration method to obtain the intrinsic parameter matrix K and the mean focal length f.
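The claim does not fix an implementation, but a minimal sketch of this plane calibration step, assuming Zhang's method as implemented in OpenCV, might look as follows; the board dimensions, square size and file names are illustrative placeholders rather than values required by the claim:

```python
# Sketch of the plane (Zhang) camera calibration step, assuming OpenCV.
# Board geometry and file names are illustrative placeholders.
import glob
import cv2
import numpy as np

BOARD = (9, 6)       # inner corners per row and column (assumed)
SQUARE = 21.0        # checkerboard square size in mm (value from the embodiment)

# World coordinates of the corners on the planar target (Z = 0)
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib_*.png"):        # images of the target at different poses
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)
    size = gray.shape[::-1]

# Intrinsic matrix K, distortion coefficients and per-view extrinsics
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)

# Mean focal length (in pixels), averaging alpha_x and alpha_y as in the text
f_mean = 0.5 * (K[0, 0] + K[1, 1])
print("K =\n", K, "\nmean focal length (px):", f_mean)
```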
2), projector calibrating process:
The coded light template is projected onto the common target, with the black-and-white checkerboard corners serving as anchor points. Using the images acquired during camera calibration, the image coordinates of these corners are extracted and the corresponding world coordinates are generated;
When a single white light template is projected, the image coordinates (u_A, v_A) and (u_B, v_B) of the two corner points A_1 and B_1 are obtained, with world coordinates (X_A, Y_A, Z_A) and (X_B, Y_B, Z_B). As shown in Fig. 7 of the accompanying drawings, the intersection of line L_1 with L is computed to obtain the image coordinates of point P_1, denoted (u_P, v_P); its world coordinates are defined as (X_P, Y_P, Z_P). Because the three points A_1, B_1 and P_1 are coplanar, Z_P = Z_A = Z_B. According to the cross-ratio invariance:

$$\frac{X_A - X_P}{X_A - X_B} = \frac{v_A - v_P}{v_A - v_B}, \qquad \frac{Y_A - Y_P}{Y_A - Y_B} = \frac{u_A - u_P}{u_A - u_B} \qquad (1)$$

Let

$$\lambda_x = \frac{v_A - v_P}{v_A - v_B}, \qquad \lambda_y = \frac{u_A - u_P}{u_A - u_B} \qquad (2)$$

From (1) and (2):

$$X_P = X_A + \lambda_x (X_B - X_A), \qquad Y_P = Y_A + \lambda_y (Y_B - Y_A), \qquad Z_P = Z_A = Z_B \qquad (3)$$

λ_x and λ_y are the cross-ratio coefficients in the X and Y directions respectively, and (X_P, Y_P, Z_P) are the world coordinates of a light-stripe edge feature point.
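Equations (1) to (3) translate directly into a small helper function; a sketch follows, in which the corner and edge-point coordinates are hypothetical values used only to show the computation:

```python
import numpy as np

def stripe_edge_world_coords(uvA, uvB, uvP, XYZA, XYZB):
    """World coordinates of stripe edge point P from corners A, B via eqs. (1)-(3)."""
    uA, vA = uvA
    uB, vB = uvB
    uP, vP = uvP
    XA, YA, ZA = XYZA
    XB, YB, ZB = XYZB
    lam_x = (vA - vP) / (vA - vB)          # eq. (2)
    lam_y = (uA - uP) / (uA - uB)
    XP = XA + lam_x * (XB - XA)            # eq. (3)
    YP = YA + lam_y * (YB - YA)
    ZP = ZA                                # A, B, P coplanar, so ZP = ZA = ZB
    return np.array([XP, YP, ZP])

# Hypothetical data: image coordinates in pixels, world coordinates in mm
P_world = stripe_edge_world_coords(
    uvA=(400.0, 300.0), uvB=(430.0, 420.0), uvP=(407.5, 330.0),
    XYZA=(0.0, 0.0, 990.0), XYZB=(21.0, 21.0, 990.0))
print(P_world)    # -> [  5.25   5.25 990.  ]
```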
As a further preferred scheme: in step 2), the projector model is established as follows:
The model relating the projector coordinate x_d and the world coordinate X_w is:

$$\gamma \begin{bmatrix} x_d \\ 1 \end{bmatrix} = \begin{bmatrix} m_1 & m_{14} \\ m_2 & m_{24} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (4)$$

γ is an arbitrary scale factor; eliminating γ and transforming (4) gives:

$$\frac{m_1 [X_w \; Y_w \; Z_w]^T + m_{14}}{m_2 [X_w \; Y_w \; Z_w]^T + m_{24}} = x_d \qquad (5)$$

Using (5), a system of homogeneous linear equations with the m_ik as unknowns is constructed, and the projector parameters are then solved by singular value decomposition (SVD);
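A sketch of how this homogeneous system could be assembled and solved is given below, assuming N calibration points with known world coordinates and decoded stripe coordinates x_d; the function and variable names are illustrative, not part of the claimed method:

```python
import numpy as np

def calibrate_projector_1d(world_pts, xd):
    """Estimate the 2x4 one-dimensional projector matrix M = [m1 m14; m2 m24].

    Rearranging eq. (5), each point gives one homogeneous equation
    m1.Xw + m14 - xd*(m2.Xw + m24) = 0 in the eight unknowns m_ik.
    """
    world_pts = np.asarray(world_pts, dtype=float)   # shape (N, 3)
    xd = np.asarray(xd, dtype=float)                 # shape (N,)
    A = np.zeros((len(xd), 8))
    A[:, 0:3] = world_pts
    A[:, 3] = 1.0
    A[:, 4:7] = -xd[:, None] * world_pts
    A[:, 7] = -xd
    # Least-squares solution of A m = 0: right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(2, 4)                      # rows: [m1 | m14], [m2 | m24]

def project_to_stripe(M, Xw):
    """Apply eq. (5): map a world point to its projector (stripe) coordinate x_d."""
    num = M[0, :3] @ np.asarray(Xw) + M[0, 3]
    den = M[1, :3] @ np.asarray(Xw) + M[1, 3]
    return num / den
```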
A simplified perspective cross-ratio invariance geometric model is established. Suppose the spatial points A, B and C lie on the same straight line L; taking B as the reference point, the position ratio is defined as:

PR(A, B, C) = AB/AC    (6)

Similarly, A′, B′ and C′ are the image points of A, B and C under the perspective centre O and lie on the same straight line L1; the position ratio in image coordinates is:

PR(A′, B′, C′) = A′B′/A′C′    (7)

According to the cross-ratio invariance of perspective geometry:

PR(A, B, C) = PR(A′, B′, C′)    (8)

In the image, the points A′, B′ and C′ also lie on the same straight line L′.
The image coordinates of the feature points in the light template are extracted, the world coordinates of the reference point B are obtained from (8), and the mapping from three-dimensional space to the one-dimensional projector coordinate is established to calibrate the projector.
The beneficial effect of the present invention is mainly that the camera calibration accuracy is high and satisfies the needs of practical applications.
Description of drawings
Fig. 1 is a schematic diagram of the cross-ratio invariance principle.
Fig. 2 is a schematic diagram of the high-precision target.
Fig. 3 is a schematic diagram of reading in the original calibration images.
Fig. 4 is a schematic diagram of the camera calibration projection error.
Fig. 5 shows the light stripes projected onto the target surface.
Fig. 6 is a schematic diagram of edge extraction and straight-line fitting.
Fig. 7 is a schematic diagram of feature point acquisition.
Fig. 8 is a schematic diagram of the projector calibration error.
Fig. 9 is a schematic diagram of the three-dimensional reconstruction of the calibration points.
Embodiment
The present invention is further described below with reference to the accompanying drawings.
Referring to Figs. 1 to 9, a calibration method for a coded structured light vision imaging system comprises the following steps:
1), camera calibration process:
A two-dimensional calibration template is used. The target consists of a standard black-and-white checkerboard, and its corner points form the feature points. The camera captures several images of the target at different positions; the checkerboard corners are selected as feature points in these images, and the camera is calibrated with the plane calibration method to obtain the intrinsic parameter matrix K and the mean focal length f.
2), projector calibrating process:
The coded light template is projected onto the common target, with the black-and-white checkerboard corners serving as anchor points. Using the images acquired during camera calibration, the image coordinates of these corners are extracted and the corresponding world coordinates are generated;
When a single white light template is projected, the image coordinates (u_A, v_A) and (u_B, v_B) of the two corner points A_1 and B_1 are extracted directly, with world coordinates (X_A, Y_A, Z_A) and (X_B, Y_B, Z_B). As shown in Fig. 7 of the accompanying drawings, the intersection of line L_i with L is computed to obtain the image coordinates of point P_1, denoted (u_P, v_P); its world coordinates are defined as (X_P, Y_P, Z_P). Because the three points A_1, B_1 and P_1 are coplanar, Z_P = Z_A = Z_B. According to the cross-ratio invariance:

$$\frac{X_A - X_P}{X_A - X_B} = \frac{v_A - v_P}{v_A - v_B}, \qquad \frac{Y_A - Y_P}{Y_A - Y_B} = \frac{u_A - u_P}{u_A - u_B} \qquad (1)$$

Let

$$\lambda_x = \frac{v_A - v_P}{v_A - v_B}, \qquad \lambda_y = \frac{u_A - u_P}{u_A - u_B} \qquad (2)$$

From (1) and (2):

$$X_P = X_A + \lambda_x (X_B - X_A), \qquad Y_P = Y_A + \lambda_y (Y_B - Y_A), \qquad Z_P = Z_A = Z_B \qquad (3)$$

λ_x and λ_y are the cross-ratio coefficients in the X and Y directions respectively, and (X_P, Y_P, Z_P) are the world coordinates of a light-stripe edge feature point.
In step 2), the projector model is established as follows:
The model relating the projector coordinate x_d and the world coordinate X_w is:

$$\gamma \begin{bmatrix} x_d \\ 1 \end{bmatrix} = \begin{bmatrix} m_1 & m_{14} \\ m_2 & m_{24} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (4)$$

γ is an arbitrary scale factor; eliminating γ and transforming (4) gives:

$$\frac{m_1 [X_w \; Y_w \; Z_w]^T + m_{14}}{m_2 [X_w \; Y_w \; Z_w]^T + m_{24}} = x_d \qquad (5)$$

Using (5), a system of homogeneous linear equations with the m_ik as unknowns is constructed, and the projector parameters are then solved by singular value decomposition (SVD);
A simplified perspective cross-ratio invariance geometric model is established. As shown in Fig. 1, suppose the spatial points A, B and C lie on the same straight line L; taking B as the reference point, the position ratio is defined as:

PR(A, B, C) = AB/AC    (6)

Similarly, A′, B′ and C′ are the image points of A, B and C under the perspective centre O and lie on the same straight line L1; the position ratio in image coordinates is:

PR(A′, B′, C′) = A′B′/A′C′    (7)

According to the cross-ratio invariance of perspective geometry:

PR(A, B, C) = PR(A′, B′, C′)    (8)

In the image, the points A′, B′ and C′ also lie on the same straight line L′.
The image coordinates of the feature points in the light template are extracted, the world coordinates of the reference point B are obtained from (8), and the mapping from three-dimensional space to the one-dimensional projector coordinate is established to calibrate the projector.
In the present embodiment, the projector model is conceptually the inverse of the camera model; because the spatial coding is one-dimensional, it can be simplified here to a one-dimensional model. The model relating the projector coordinate x_d and the world coordinate X_w is:

$$\gamma \begin{bmatrix} x_d \\ 1 \end{bmatrix} = \begin{bmatrix} m_1 & m_{14} \\ m_2 & m_{24} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (4)$$

γ is an arbitrary scale factor; eliminating γ and transforming (4) gives:

$$\frac{m_1 [X_w \; Y_w \; Z_w]^T + m_{14}}{m_2 [X_w \; Y_w \; Z_w]^T + m_{24}} = x_d \qquad (5)$$

Using (5), a system of homogeneous linear equations with the m_ik as unknowns is constructed, and the projector parameters are then solved by singular value decomposition (SVD).
In the three-dimensional reconstruction, the light-stripe edge points serve as feature points. Unlike the checkerboard corners, their world coordinates cannot be obtained directly and must be derived indirectly from the world coordinates of known points. A simplified perspective cross-ratio invariance geometric model is therefore established. As shown in Fig. 1, suppose the spatial points A, B and C lie on the same straight line L; taking B as the reference point, the position ratio is defined as:

PR(A, B, C) = AB/AC    (6)

Similarly, A′, B′ and C′ are the image points of A, B and C under the perspective centre O and lie on the same straight line L1; the position ratio in image coordinates is:

PR(A′, B′, C′) = A′B′/A′C′    (7)

According to the cross-ratio invariance of perspective geometry:

PR(A, B, C) = PR(A′, B′, C′)    (8)

In the experiment, the three points A, B and C can be found on a spatial straight line, so the corresponding image points A′, B′ and C′ also lie on the same straight line L′.
The image coordinates of the feature points in the light template are extracted, and the world coordinates of the reference point B are obtained from (8). The mapping from three-dimensional space to the one-dimensional projector coordinate is then established to calibrate the projector.
In the structured light vision system, calibration is carried out in two separate steps, camera calibration and projector calibration. The two steps are mutually independent and do not rely on each other, so no iteration error arises and the system accuracy is high.
Camera calibration: the three-dimensional coordinates of the feature points must be highly accurate, so a two-dimensional calibration template is used. To keep the system operation uniform, the projector projects a single white light during camera calibration. The coplanar target, shown in Fig. 2, is simple to manufacture.
The target consists of a standard 21 × 21 mm black-and-white checkerboard, printed by high-precision laser printing and then attached flat to an aluminium plate; the feature points are the checkerboard corner points. The camera captures 10 images of the target at different positions, as shown in Fig. 3.
The checkerboard corners in the 10 images are selected as feature points, and the camera is calibrated with the plane calibration method described above. The intrinsic parameter calibration result is:
$$K = \begin{bmatrix} 3872.6 & 0 & 692.2 \\ 0 & 3858.1 & 443.3 \\ 0 & 0 & 1 \end{bmatrix}$$
The distortion coefficients are k_c = (0.16762, −0.74282, 0.00092, −0.01760, 0)^T. The physical pixel size of the CCD used in the experiment is d_x = d_y = 6.45 μm. Neglecting the non-perpendicularity factor, the mean focal length f is obtained by averaging α_x and α_y, and its relative error with respect to the standard camera focal length f_0 = 25 mm is then computed.
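As a quick check of the arithmetic, a sketch is given below; it assumes the metric focal lengths are taken as f_x = α_x·d_x and f_y = α_y·d_y before averaging, which the text does not state explicitly:

```python
# Mean focal length and relative error, assuming f = alpha * pixel size
alpha_x, alpha_y = 3872.6, 3858.1   # from K, in pixels
d = 6.45e-3                         # CCD pixel size in mm
f_mean = 0.5 * (alpha_x + alpha_y) * d
rel_err = abs(f_mean - 25.0) / 25.0
print(f"f = {f_mean:.2f} mm, relative error = {rel_err:.2%}")
# -> f = 24.93 mm, relative error = 0.27%
```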
The target feature points are re-projected; the resulting error ranges from 0.49708 to 0.65712 pixels. The re-projection error of each image is shown in Fig. 5, where x and y denote the image coordinates in pixels.
Projector calibration: the projector serves as the light source projection device in the structured light system, and its calibration affects the accuracy of the system; here it is calibrated with the one-dimensional simplified model set forth above. Projector calibration requires the world coordinates and the light-stripe coordinates of the calibration feature points. The light template generated by programming is projected onto the coplanar target, with the black-and-white checkerboard corners as anchor points, as shown in Fig. 6; four images are captured at the depth values Z = 990 mm, 982 mm, 974 mm and 966 mm respectively.
Straight lines are fitted to the edge points, and the corner points lying on those lines also need to be fitted.
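One way to carry out this fitting and the subsequent intersection of a stripe edge line with a grid line is sketched below, using a total least-squares line fit; all coordinates are hypothetical:

```python
import numpy as np

def fit_line(pts):
    """Total least-squares 2D line fit: returns a point on the line and a unit direction."""
    pts = np.asarray(pts, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - centroid)
    return centroid, Vt[0]                  # dominant direction of the point cloud

def intersect(p1, d1, p2, d2):
    """Intersection of the lines p1 + t*d1 and p2 + s*d2 in image coordinates."""
    t, _ = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t * d1

# Hypothetical data: stripe edge pixels (line L_i) and grid-line corner pixels (line L)
edge_pts = np.array([[100.0, 50.0], [100.5, 80.0], [101.0, 110.0], [101.4, 140.0]])
grid_pts = np.array([[60.0, 95.0], [140.0, 97.0], [220.0, 99.0]])
p1, d1 = fit_line(edge_pts)
p2, d2 = fit_line(grid_pts)
print(intersect(p1, d1, p2, d2))            # image coordinates of the point P
```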
Using the images acquired during camera calibration, the image coordinates of the black-and-white checkerboard corners are extracted; based on the grid geometry, the world coordinates can be generated automatically.
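A sketch of this automatic world-coordinate generation, assuming 21 mm squares on a coplanar target at a known depth Z; the grid dimensions are placeholders:

```python
import numpy as np

def grid_world_coords(rows, cols, square_mm, z_mm):
    """World coordinates of checkerboard corners on a coplanar target at depth Z."""
    ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    X = jj.ravel() * square_mm
    Y = ii.ravel() * square_mm
    Z = np.full(X.shape, float(z_mm))
    return np.column_stack([X, Y, Z])

# e.g. a 6 x 9 corner grid of 21 mm squares on the target plane at Z = 990 mm
world = grid_world_coords(6, 9, 21.0, 990.0)
print(world.shape)     # (54, 3)
```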
As shown in Fig. 8, when a single white light template is projected, the image coordinates (u_A, v_A) and (u_B, v_B) of the two corner points A_1 and B_1 are obtained, with world coordinates (X_A, Y_A, Z_A) and (X_B, Y_B, Z_B). The intersection of line L_i with L is computed to obtain the image coordinates of point P_1, denoted (u_P, v_P); its world coordinates are defined as (X_P, Y_P, Z_P). Because the three points A_1, B_1 and P_1 are coplanar, Z_P = Z_A = Z_B, and from the cross-ratio invariance (7) we obtain:

$$\frac{X_A - X_P}{X_A - X_B} = \frac{v_A - v_P}{v_A - v_B}, \qquad \frac{Y_A - Y_P}{Y_A - Y_B} = \frac{u_A - u_P}{u_A - u_B} \qquad (1)$$

Let

$$\lambda_x = \frac{v_A - v_P}{v_A - v_B}, \qquad \lambda_y = \frac{u_A - u_P}{u_A - u_B} \qquad (2)$$

From (1) and (2):

$$X_P = X_A + \lambda_x (X_B - X_A), \qquad Y_P = Y_A + \lambda_y (Y_B - Y_A), \qquad Z_P = Z_A = Z_B \qquad (3)$$
The world coordinates of the light-stripe edge feature points are thus obtained by applying the principle of cross-ratio invariance; the stripe code coordinate is then decoded, and the projector is calibrated with the projector model. The calibration error lies in the range 0 to 0.2889; the error is shown in the figure.
The baseline length of the structured light vision system in this embodiment is 230 mm. The four target planes at the depth values Z = 990 mm, 982 mm, 974 mm and 966 mm are reconstructed, and the reconstruction errors are given in Table 1. As can be seen from Table 1, when the calibration uses the four equidistantly translated target positions, the mean absolute error of the system is 1.38 mm and the mean relative error is 0.14 %.
Z /mm | Standard deviation /mm | Absolute error /mm | Relative error /%
---|---|---|---
990.0 | 1.0074 | 1.3209 | 0.13
982.2 | 1.0374 | 1.4086 | 0.14
974.4 | 1.0293 | 1.3943 | 0.14
966.7 | 1.0881 | 1.3988 | 0.14

Table 1
Using the calibrated parameters, the calibration points of the four planes are back-projected; the reconstruction result is shown in Fig. 9.
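The back projection itself is not spelled out in the text; one plausible implementation, assuming a 3×4 camera projection matrix P_c (from the camera calibration) and the 2×4 one-dimensional projector matrix M calibrated above, is a homogeneous least-squares triangulation in which the camera pixel contributes two linear constraints and the decoded stripe coordinate a third:

```python
import numpy as np

def reconstruct_point(Pc, M, uv, xd):
    """Triangulate a world point from a camera pixel (u, v) and the decoded
    projector stripe coordinate x_d, via homogeneous least squares (SVD)."""
    u, v = uv
    A = np.vstack([
        u * Pc[2] - Pc[0],      # camera constraint for u
        v * Pc[2] - Pc[1],      # camera constraint for v
        xd * M[1] - M[0],       # one-dimensional projector constraint from eq. (5)
    ])
    _, _, Vt = np.linalg.svd(A)
    Xh = Vt[-1]
    return Xh[:3] / Xh[3]       # de-homogenise to (X, Y, Z)

# Pc = K @ [R | t] from the camera calibration, M from calibrate_projector_1d above.
```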
The measurement accuracy is closely related to the precision of the hardware. In the system calibration experiment there are many sources of error; the main ones are the following:
1. The high-precision printing of the checkerboard template has a certain error.
2. In the corner extraction process, the points on the checkerboard are assumed to be coplanar during camera calibration, which requires the template plane to be perfectly flat. In an actual experiment the template plane cannot be absolutely flat, so the captured images deviate slightly from the assumption, and introducing this small deviation into the computation causes an error.
3. In the projector calibration process, the cross-ratio invariance is used to obtain the world coordinates, and the straight-line fitting of the feature points introduces an error.
4. Errors caused by the calibration system itself. First, the resolution and acquisition precision of the vision system affect the error of the whole system; second, the choice of calibration method and the way the parts of the calibration system are combined also cause errors. The camera calibration in this work is mainly based on Zhang Zhengyou's plane calibration method, whose result is refined by nonlinear iteration to minimize the error; the decomposition of the projector model is likewise based on error minimization.
The above four aspects are the main causes of calibration error. Owing to the limited experimental conditions and the restricted resolution and acquisition precision of the camera and the projector, the template surface is not perfectly flat during checkerboard corner extraction, which causes corner extraction errors. In addition, both the system model above and the computation process may be imperfect in theory, which can also cause errors.
Claims (2)
1. A calibration method for a coded structured light three-dimensional vision system, the calibration method comprising the following steps:

1) Camera calibration: a two-dimensional calibration template is used; the target consists of a standard black-and-white checkerboard whose corner points form the feature points; the camera captures several images of the target at different positions, the checkerboard corners are selected as feature points in these images, and the camera is calibrated with the plane calibration method to obtain the intrinsic parameter matrix K and the mean focal length f; characterized in that the calibration method further comprises:

2) Projector calibration: the coded light template is projected onto the common target, with the black-and-white checkerboard corners serving as anchor points; using the images acquired during camera calibration, the image coordinates of these corners are extracted and the corresponding world coordinates are generated;

when a single white light template is projected, the image coordinates (u_A, v_A) and (u_B, v_B) of the two corner points A_1 and B_1 are extracted directly, with world coordinates (X_A, Y_A, Z_A) and (X_B, Y_B, Z_B); the intersection of line L_i with L is computed to obtain the image coordinates of point P_1, denoted (u_P, v_P), whose world coordinates are defined as (X_P, Y_P, Z_P); because the three points A_1, B_1 and P_1 are coplanar, Z_P = Z_A = Z_B; according to the cross-ratio invariance:

$$\frac{X_A - X_P}{X_A - X_B} = \frac{v_A - v_P}{v_A - v_B}, \qquad \frac{Y_A - Y_P}{Y_A - Y_B} = \frac{u_A - u_P}{u_A - u_B} \qquad (1)$$

Let

$$\lambda_x = \frac{v_A - v_P}{v_A - v_B}, \qquad \lambda_y = \frac{u_A - u_P}{u_A - u_B} \qquad (2)$$

From (1) and (2):

$$X_P = X_A + \lambda_x (X_B - X_A), \qquad Y_P = Y_A + \lambda_y (Y_B - Y_A), \qquad Z_P = Z_A = Z_B \qquad (3)$$

λ_x and λ_y are the cross-ratio coefficients in the X and Y directions respectively, and (X_P, Y_P, Z_P) are the world coordinates of a light-stripe edge feature point.

2. The calibration method of a coded structured light three-dimensional vision system according to claim 1, wherein in step 2) the projector model is established as follows:

the model relating the projector coordinate x_d and the world coordinate X_w is:

$$\gamma \begin{bmatrix} x_d \\ 1 \end{bmatrix} = \begin{bmatrix} m_1 & m_{14} \\ m_2 & m_{24} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (4)$$

γ is an arbitrary scale factor; eliminating γ and transforming (4) gives:

$$\frac{m_1 [X_w \; Y_w \; Z_w]^T + m_{14}}{m_2 [X_w \; Y_w \; Z_w]^T + m_{24}} = x_d \qquad (5)$$

using (5), a system of homogeneous linear equations with the m_ik as unknowns is constructed, and the projector parameters are then solved by singular value decomposition;

a simplified perspective cross-ratio invariance geometric model is established: suppose the spatial points A, B and C lie on the same straight line L; taking B as the reference point, the position ratio is defined as:

PR(A, B, C) = AB/AC    (6)

similarly, A′, B′ and C′ are the image points of A, B and C under the perspective centre O and lie on the same straight line L1; the position ratio in image coordinates is:

PR(A′, B′, C′) = A′B′/A′C′    (7)

according to the cross-ratio invariance of perspective geometry:

PR(A, B, C) = PR(A′, B′, C′)    (8)

in the image, the points A′, B′ and C′ also lie on the same straight line L′;

the image coordinates of the feature points in the light template are extracted, the world coordinates of the reference point B are obtained from (8), and the mapping from three-dimensional space to the one-dimensional projector coordinate is established to calibrate the projector.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN200910153612A CN101673399A (en) | 2009-09-29 | 2009-09-29 | Calibration method of coded structured light three-dimensional vision system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN200910153612A CN101673399A (en) | 2009-09-29 | 2009-09-29 | Calibration method of coded structured light three-dimensional vision system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN101673399A true CN101673399A (en) | 2010-03-17 |
Family
ID=42020612
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910153612A Pending CN101673399A (en) | 2009-09-29 | 2009-09-29 | Calibration method of coded structured light three-dimensional vision system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101673399A (en) |
Cited By (14)
- * Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101876532A (en) * | 2010-05-25 | 2010-11-03 | 大连理工大学 | Camera Field Calibration Method in Measuring System |
CN102567989A (en) * | 2011-11-30 | 2012-07-11 | 重庆大学 | Space positioning method based on binocular stereo vision |
CN102842117A (en) * | 2012-07-13 | 2012-12-26 | 浙江工业大学 | Method for correcting kinematic errors in microscopic vision system |
CN103020946A (en) * | 2011-09-21 | 2013-04-03 | 云南大学 | Camera self-calibration method based on three orthogonal direction end points |
CN103503027A (en) * | 2011-03-04 | 2014-01-08 | Lbt创新有限公司 | Colour calibration method for an image capture device |
CN103778640A (en) * | 2014-03-07 | 2014-05-07 | 中国工程物理研究院激光聚变研究中心 | Microsphere-target-based objective image space telecentric microscopic vision system calibration method |
CN104299218A (en) * | 2013-07-17 | 2015-01-21 | 南京邮电大学 | Projector calibration method based on lens distortion rule |
CN105551039A (en) * | 2015-12-14 | 2016-05-04 | 深圳先进技术研究院 | Calibration method and calibration device for structured light 3D scanning system |
CN107004278A (en) * | 2014-12-05 | 2017-08-01 | 曼蒂斯影像有限公司 | Mark in 3D data captures |
WO2019041652A1 (en) * | 2017-08-30 | 2019-03-07 | 广州视源电子科技股份有限公司 | Image correction method, apparatus and device, and computer readable storage medium |
CN110163918A (en) * | 2019-04-24 | 2019-08-23 | 华南理工大学 | A kind of line-structured light scaling method based on projective geometry |
CN110443856A (en) * | 2019-08-12 | 2019-11-12 | 广州图语信息科技有限公司 | A kind of 3D structure optical mode group scaling method, storage medium, electronic equipment |
WO2021147670A1 (en) * | 2020-01-23 | 2021-07-29 | 华为技术有限公司 | Image processing method and apparatus |
CN116342718A (en) * | 2023-05-26 | 2023-06-27 | 合肥埃科光电科技股份有限公司 | Calibration method, device, storage medium and equipment of line laser 3D camera |
- 2009-09-29 CN CN200910153612A patent/CN101673399A/en active Pending
Cited By (22)
- * Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101876532A (en) * | 2010-05-25 | 2010-11-03 | 大连理工大学 | Camera Field Calibration Method in Measuring System |
CN101876532B (en) * | 2010-05-25 | 2012-05-23 | 大连理工大学 | Camera on-field calibration method in measuring system |
CN103503027B (en) * | 2011-03-04 | 2017-02-08 | Lbt创新有限公司 | Colour calibration method for an image capture device |
CN103503027A (en) * | 2011-03-04 | 2014-01-08 | Lbt创新有限公司 | Colour calibration method for an image capture device |
CN103020946A (en) * | 2011-09-21 | 2013-04-03 | 云南大学 | Camera self-calibration method based on three orthogonal direction end points |
CN102567989A (en) * | 2011-11-30 | 2012-07-11 | 重庆大学 | Space positioning method based on binocular stereo vision |
CN102842117B (en) * | 2012-07-13 | 2015-02-25 | 浙江工业大学 | Method for correcting kinematic errors in microscopic vision system |
CN102842117A (en) * | 2012-07-13 | 2012-12-26 | 浙江工业大学 | Method for correcting kinematic errors in microscopic vision system |
CN104299218B (en) * | 2013-07-17 | 2017-02-22 | 南京邮电大学 | Projector calibration method based on lens distortion rule |
CN104299218A (en) * | 2013-07-17 | 2015-01-21 | 南京邮电大学 | Projector calibration method based on lens distortion rule |
CN103778640A (en) * | 2014-03-07 | 2014-05-07 | 中国工程物理研究院激光聚变研究中心 | Microsphere-target-based objective image space telecentric microscopic vision system calibration method |
CN107004278B (en) * | 2014-12-05 | 2020-11-17 | 曼蒂斯影像有限公司 | Tagging in 3D data capture |
CN107004278A (en) * | 2014-12-05 | 2017-08-01 | 曼蒂斯影像有限公司 | Mark in 3D data captures |
CN105551039B (en) * | 2015-12-14 | 2017-12-08 | 深圳先进技术研究院 | The scaling method and device of structural light three-dimensional scanning system |
CN105551039A (en) * | 2015-12-14 | 2016-05-04 | 深圳先进技术研究院 | Calibration method and calibration device for structured light 3D scanning system |
WO2019041652A1 (en) * | 2017-08-30 | 2019-03-07 | 广州视源电子科技股份有限公司 | Image correction method, apparatus and device, and computer readable storage medium |
CN110163918A (en) * | 2019-04-24 | 2019-08-23 | 华南理工大学 | A kind of line-structured light scaling method based on projective geometry |
CN110163918B (en) * | 2019-04-24 | 2023-03-28 | 华南理工大学 | Line structure cursor positioning method based on projective geometry |
CN110443856A (en) * | 2019-08-12 | 2019-11-12 | 广州图语信息科技有限公司 | A kind of 3D structure optical mode group scaling method, storage medium, electronic equipment |
WO2021147670A1 (en) * | 2020-01-23 | 2021-07-29 | 华为技术有限公司 | Image processing method and apparatus |
CN113240744A (en) * | 2020-01-23 | 2021-08-10 | 华为技术有限公司 | Image processing method and device |
CN116342718A (en) * | 2023-05-26 | 2023-06-27 | 合肥埃科光电科技股份有限公司 | Calibration method, device, storage medium and equipment of line laser 3D camera |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101673399A (en) | 2010-03-17 | Calibration method of coded structured light three-dimensional vision system |
CN109029284B (en) | 2019-10-22 | A 3D Laser Scanner and Camera Calibration Method Based on Geometric Constraints |
CN111028297B (en) | 2023-04-28 | Calibration method of surface structured light three-dimensional measurement system |
CN104616292B (en) | 2017-07-11 | Monocular vision measuring method based on global homography matrix |
CN105551039B (en) | 2017-12-08 | The scaling method and device of structural light three-dimensional scanning system |
CN100489446C (en) | 2009-05-20 | Method for measuring three-dimensional contour based on phase method |
CN104007444B (en) | 2017-02-08 | Ground laser radar reflection intensity image generation method based on central projection |
CN101667303A (en) | 2010-03-10 | Three-dimensional reconstruction method based on coding structured light |
CN105118040B (en) | 2017-12-01 | File and picture distortion correction method based on structure laser rays |
CN106091984A (en) | 2016-11-09 | A kind of three dimensional point cloud acquisition methods based on line laser |
CN105913439A (en) | 2016-08-31 | Large-view-field camera calibration method based on laser tracker |
Ahmadabadian et al. | 2019 | An automatic 3D reconstruction system for texture-less objects |
CN104376558A (en) | 2015-02-25 | Cuboid-based intrinsic parameter calibration method for Kinect depth camera |
CN110163918A (en) | 2019-08-23 | A kind of line-structured light scaling method based on projective geometry |
CN113505626B (en) | 2024-07-12 | Quick three-dimensional fingerprint acquisition method and system |
US20140015929A1 (en) | 2014-01-16 | Three dimensional scanning with patterned covering |
CN110378967B (en) | 2023-09-22 | Virtual target calibration method combining grating projection and stereoscopic vision |
CN104048649B (en) | 2016-08-03 | A kind of multi-view images and the rapid registering method of threedimensional model |
CN112050752B (en) | 2022-03-18 | Projector calibration method based on secondary projection |
CN102750698B (en) | 2014-12-03 | Texture camera calibration device, texture camera calibration method and geometry correction method of texture image of texture camera |
Wei et al. | 2008 | Flexible calibration of a portable structured light system through surface plane |
CN103632334B (en) | 2017-02-08 | Infinite image alignment method based on parallel optical axis structure cameras |
CN102789644B (en) | 2014-10-15 | Novel camera calibration method based on two crossed straight lines |
CN114792345A (en) | 2022-07-26 | Calibration method based on monocular structured light system |
CN111145246B (en) | 2023-11-14 | Foot type scanning method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2010-03-17 | C06 | Publication | |
2010-03-17 | PB01 | Publication | |
2010-04-28 | C10 | Entry into substantive examination | |
2010-04-28 | SE01 | Entry into force of request for substantive examination | |
2012-03-21 | C02 | Deemed withdrawal of patent application after publication (patent law 2001) | |
2012-03-21 | WD01 | Invention patent application deemed withdrawn after publication |
Open date: 20100317 |