(Chinese text: 2,880 characters)

Graduation Design (Thesis) Foreign Literature Translation

Title: Matching between 3D Coordinate and its Color Information in 3D Color Sensor
Chinese title: 三維坐標(biāo)和顏色信息匹配的3D顏色傳感器
College: College of Electronic and Information Engineering
Major: Electronic Information Engineering
Class: 電信114
Student No.: 11401180419
Supervisor:
Enterprise supervisor:
Date of final draft: December 30, 2014

Matching between 3D Coordinate and its Color Information in 3D Color Sensor

Abstract: The matching between the three-dimensional (3D) coordinates and the color information measured by a 3D color sensor can be calibrated by applying the Linear Partition Method to the 3D sensor and the BP Neural Network Method to the color sensor respectively. The calibration principle, including the formulas and the solution procedure, is deduced, and the information matching process is discussed in detail. Calibration experiments show that the mean relative measuring error of the 3D sensor calibrated with the Linear Partition Method is 0.26%, while the testing accuracy of the color sensor calibrated with the BP Neural Network Method reaches 0.5-0.6 pixels. Based on the calibration results, a real object is measured, and the 3D color point cloud obtained can represent the object truly and vividly.

Keywords: 3D color sensor; camera calibration; information matching; linear partition calibration; BP neural network; color point cloud

1. Introduction

Acquiring the 3D coordinates and color information of a real object is a research focus in the digitization field. Up to now, various techniques based on different principles have been proposed[1] and widely applied in many fields such as CAD and CAM, reverse engineering, rapid prototyping, virtual reality, human engineering and the preservation of cultural relics[2, 3]. Among these techniques, non-contact optical methods, especially the structured light method, have become more and more popular because of their simple principle, fast measurement, non-contact operation and high accuracy.

A real object can be digitized together with its color information by a 3D color sensor whose key parts are black-and-white (B&W) cameras and color cameras, and the matching between the 3D coordinates and the color information digitized by the 3D color sensor can be realized by camera calibration. Many calibration techniques have been proposed for this purpose, such as the Direct Linear Transformation Method, the Full-Scale Nonlinear Optimization Method and the Two Stage Method[4, 5]. In most of these methods, however, a complicated camera model always has to be set up and many intrinsic and extrinsic camera parameters have to be computed, which may lead to an unstable solution procedure. In fact, in many applications computing the mapping relationship between the space coordinates of points and their pixel coordinates is already sufficient, and many of the camera's intrinsic and extrinsic parameters are redundant. For these reasons, this paper calibrates the matching between the 3D coordinates and the color information by applying the Linear Partition Method to the 3D sensor and the BP Neural Network Method to the color sensor respectively.
2. Principle of Calibration and Information Matching

2.1 Linear partition calibration and its solution procedure

The mapping relationship between an object's space coordinates (Xw, Yw, Zw) and the corresponding pixel coordinates (Xf, Yf) obtained from the image capture process can be formulated in matrix form with homogeneous coordinates, as shown in equation (1) below.

$$\rho \begin{bmatrix} X_f \\ Y_f \\ 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = M \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (1)$$

where ρ is a scale factor. It is apparent from equation (1) that the matrix M contains all of the mapping information. If the number of calibration points is sufficient, M can be determined by solving the linear system of equations created from the calibration points' space coordinates and their corresponding pixel coordinates. Equation (1) can be expanded as the following equation.

$$\begin{cases} m_{11}X_w + m_{12}Y_w + m_{13}Z_w + m_{14} - m_{31}X_f X_w - m_{32}X_f Y_w - m_{33}X_f Z_w = m_{34}X_f \\ m_{21}X_w + m_{22}Y_w + m_{23}Z_w + m_{24} - m_{31}Y_f X_w - m_{32}Y_f Y_w - m_{33}Y_f Z_w = m_{34}Y_f \end{cases} \qquad (2)$$

Theoretically, the parameters m11 to m34 can be determined from 6 points. In practical applications, however, m34 is treated as one, and dozens of calibration points are introduced to reduce the error by solving overdetermined equations. So when the number of points is N, 2N equations can be obtained, written in the form of equation (3), and the matrix M is solved by a least-squares procedure.

$$Ax = B \qquad (3)$$

where each calibration point contributes two rows to A and B, and x = [m11 m12 m13 m14 m21 m22 m23 m24 m31 m32 m33]^T.

If the matching calibration is simply computed by the above method, much error will result, because the method does not take the distortion of the lens and other nonlinear factors into consideration. So another method is proposed: the whole image is divided into several parts, which also means that the data pairs (space point coordinates and their corresponding pixel coordinates) are divided into several sets, and the linear method above is applied to each set of data pairs or each image region. Several transformation matrixes are obtained in this way, and during measurement they are applied to the input data according to the partition rule. This is the basic idea of the Linear Partition Method, and the measuring error can be reduced remarkably by it.

2.2 BP neural network calibration technique

The BP neural network is a one-way-transmission, multi-layer artificial neural network. Every layer contains one or more nodes; the output of each layer is connected only with the input of the next layer and has no connection with the nodes of any other layer or with itself. A standard network is composed of one input layer, one or more hidden layers and one output layer.
For the input nodes, the output is simply equal to the input. The behavior of the hidden layers and the output layer can be described by equation (4).

$$\mathrm{net}_{pj} = \sum_i w_{ji}\, o_{pi}, \qquad o_{pj} = f_j(\mathrm{net}_{pj}) \qquad (4)$$

where p is the present input sample, wji is the connection weight from node i to node j, and opi and opj are the input and the output of node j. fj is the excitation function, which should be differentiable everywhere in a BP neural network, so an S-shaped function is always preferred, such as equation (5).

$$f(x) = \frac{1}{1 + e^{-x}} \qquad (5)$$

The training process of the network begins with the preparation of training samples, which contain input samples and ideal output samples. Two directions, forward and backward, are defined in the process. In the forward direction, the behavior of each layer influences only the next layer; if the output is not ideal, the process turns to the backward direction, in which the error signal is returned along the connection paths back to the input layer while the weights are modified. This process is repeated until the error meets the demand. In the backward direction, the weights are adjusted by equation (6).

$$\Delta_p w_{ji} = \eta\, \delta_{pj}\, o_{pi} \qquad (6)$$

where η is the learning rate, whose value should be selected between zero and one. When the nodes are in the output layer, definition (7) is used; otherwise equation (8) is used.

$$\delta_{pj} = (t_{pj} - o_{pj})\, f'_j(\mathrm{net}_{pj}) \qquad (7)$$

$$\delta_{pj} = f'_j(\mathrm{net}_{pj}) \sum_k \delta_{pk}\, w_{kj} \qquad (8)$$

where tpj is the ideal output and opj is the real output. The color sensor is calibrated by the BP neural network described above, with the input and output selected as the 3D coordinates of the calibration points and their corresponding 2D pixel coordinates, as Fig. 1 shows.

2.3 Information matching process

As Fig. 2 shows, the 3D sensor is composed of a B&W CCD and a line-structured laser and can move up and down, while the color CCD serves as the color sensor. The light plane intersects the object and generates a light stripe. The measuring process is the process of scanning the light stripe over the object while the 3D sensor records the contour information. The horizontal and depth coordinates are recorded by the 3D sensor, the vertical coordinate is obtained from a precise mechanical scanning system, and the color sensor records the color information of the object.

The matching between the 3D coordinates and the color information is realized by the following process. The calibration of the 3D sensor yields the transformation matrixes of the different partitions, which translate the light stripe information on the B&W CCD into the space coordinates (Xw, Yw). With the remaining coordinate Zw obtained from the scanning system, the complete 3D information of the object can be expressed as (Xw, Yw, Zw). Furthermore, the mapping relationship between a 3D coordinate (Xw, Yw, Zw) and its corresponding pixel coordinate on the color image captured by the color CCD is obtained through the color sensor calibration, and the RGB values of the pixels can be read from the true color image. Thus the method described above realizes the matching between the 3D information and its corresponding color information.

3. Experimental Results

Based on the above theory, coplanar and non-coplanar calibration points are used to calibrate the B&W CCD and the color CCD respectively; the cameras are an MTV-0360 and a 73X11HP made by the MINTRON Company.
The B&W CCD, with a 6 mm lens, is sampled by an image capture board at a resolution of 640×480, and the color CCD, with an 8 mm lens, is sampled at a resolution of 768×576. The laser is an SNF-501L670 from the LASIRIS Company of Canada; the light is line-structured and the wavelength is 670 nm.

The B&W image is divided in a 1×4 form over the area where most of the points are distributed, and four transformation matrixes are obtained. To verify the calibration precision, an object with an edge length of 149.50 mm is measured according to the calibration results. Four images are captured, and the computed mean relative error is only 0.26%.

A double-layer BP neural network (not counting the input layer) with six nodes in the hidden layer is adopted to calibrate the color sensor. The numbers of calibration points and testing points are 60 and 48 respectively; the mean absolute error is 0.61 pixels in the x direction and 0.55 pixels in the y direction.

A real 3D color object, made by pasting yellow, red and green paper on a cylinder, is measured based on the calibration results of the B&W and color CCDs, with the B&W images captured by a vertical scanning system. Fig. 3 is the color image of the object; Fig. 4 is the 3D point cloud calculated from the transformation matrixes and the image series; Fig. 5 is the 3D color point cloud obtained from the color sensor calibration and the information matching process. It represents the 3D and color information of the object truly and vividly.

4. Conclusion

The theoretical analysis and the experimental results above indicate that the matching technique based on camera calibration is feasible and gives satisfying precision and results. Neither method requires the camera's intrinsic and extrinsic parameters, such as the scale factor and the image center, to be set up in advance; the space point coordinates and their corresponding pixel coordinates are enough, which makes the methods easy to adopt and still high in precision. In addition, when linear partition calibration is used, the number and form of the partitions should be determined by the practical application, and the image can also be divided in other ways, such as into concentric circles or rectangles; linear or even other nonlinear calibrations can be used within each partition. Therefore, the method can solve the information matching problem effectively, which lays a good foundation for future 3D color reconstruction and texture mapping.
References

[1] Sun Yuchen, Ge Baozhen, Zhang Yimo. Review for the 3D information measuring technology[J]. Journal of Optoelectronics · Laser, 2004, 15(2): 248-254. (in Chinese)
[2] Petrov M., Talapov A. et al. Optical 3D digitizers: bringing life to the virtual world[J]. IEEE Computer Graphics and Applications, 1998, 18(3): 28-37.
[3] Borghese N.A., Ferrigno G. et al. Autoscan: a flexible and portable 3D scanner[J]. IEEE Computer Graphics and Applications, 1998, 18(3): 38-41.
[4] Tsai R.Y. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses[J]. IEEE Journal of Robotics and Automation, 1987, RA-3(4): 323-344.
[5] Faig W. Calibration of close-range photogrammetry systems: Mathematical formulation[J]. Photogrammetric Eng. Remote Sensing, 1975, 41: 1479-1486.

Matching between 3D Coordinate and its Color Information in 3D Color Sensor

GE Bao-zhen, SUN Yu-chen, MU Bing, SUN Ming-rui, LV Qie-ni

College of Precision Instrument and Opto-electronics Engineering, Tianjin University, Tianjin 30072, China
Key Laboratory of Opto-electronics Information and Technical Science (Tianjin University), Ministry of Education, Tianjin 30072, China

ABSTRACT
The matching between three dimensional (3D) coordinate and its color information in 3D color sensor is realized by the calibration technique, which applies Linear Partition Method and BP Neural Network Method to 3D sensor and color sensor respectively. The principle of the calibration technique, including the formulas and the solution procedure, is deduced, and the procedure of the information matching is discussed in detail. Calibration experiment results indicate that the use of Linear Partition Method on the 3D sensor enables its mean relative measuring error to reach 0.26 percent, and the use of BP Neural Network Method on the color sensor enables its testing accuracy to reach 0.5-0.6 pixels. Based on the calibration results, a real object is measured and the 3D color point cloud obtained can represent the object truly and vividly.

Keywords: 3D color sensor; camera calibration; information matching; linear partition calibration; BP neural network; color point cloud

1. INTRODUCTION

Acquisition of 3D and color information of a real object is a research focus in the digitization field. Up to now, different kinds of techniques based on different principles have been proposed[1], which are widely applied in many fields such as CAD and CAM, reverse engineering, rapid prototyping, virtual reality, human engineering and preservation of cultural relics[2, 3]. Among these techniques, the non-contact optical method, especially the structured light method, is becoming more and more popular due to its simple principle, fast measurement, non-contact nature and high accuracy.

A real object can be digitized with color information by a 3D color sensor whose key parts are black and white (B&W) cameras and color cameras. The matching between 3D coordinate and its color information in the 3D color sensor can be realized by camera calibration. Many calibration techniques have been proposed, such as the Direct Linear Transformation Method, the Full-Scale Nonlinear Optimization Method, the Two Stage Method and so on[4, 5]. In most of these methods, a complicated camera model always needs to be set up and many intrinsic and extrinsic camera parameters need to be computed, which may lead to an unstable solution procedure. In fact, in many applications the computation of the mapping relationship between the space coordinates of points and their pixel coordinates is sufficient, and many of the camera's intrinsic and extrinsic parameters are often redundant. For these reasons, the matching between the 3D coordinate and its color information is calibrated by applying the Linear Partition Method to the 3D sensor and the BP Neural Network Method to the color sensor respectively.
2. PRINCIPLE OF CALIBRATION AND INFORMATION MATCHING

2.1 Linear partition method and its solution procedure

The mapping relationship between an object's space coordinates (Xw, Yw, Zw) and their corresponding pixel coordinates (Xf, Yf), obtained from the image capture process, can be formulated in matrix form with homogeneous coordinates as the following equation.
$$\rho \begin{bmatrix} X_f \\ Y_f \\ 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = M \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (1)$$

Where ρ is a scale factor. Apparently, the matrix M contains all of the mapping information, and if the number of the calibration points is enough, M can be determined by solving a linear system of equations, which can be created by using the calibration points' 3D coordinates and their corresponding image coordinates. Equation (1) can be expanded as the following equation.

$$\begin{cases} m_{11}X_w + m_{12}Y_w + m_{13}Z_w + m_{14} - m_{31}X_f X_w - m_{32}X_f Y_w - m_{33}X_f Z_w = m_{34}X_f \\ m_{21}X_w + m_{22}Y_w + m_{23}Z_w + m_{24} - m_{31}Y_f X_w - m_{32}Y_f Y_w - m_{33}Y_f Z_w = m_{34}Y_f \end{cases} \qquad (2)$$
Theoretically, the parameters from m11 to m34 can be determined by 6 points. However, in practical application, m34 is always treated as one, and dozens of calibration points are introduced to reduce the error by solving overdetermined equations. So when the number of the points is N, 2N equations can be obtained, expressed as the following, and the matrix M can be got from a least-squares procedure.
$$Ax = B \qquad (3)$$

where each calibration point i contributes two rows to A and B:

$$A_i = \begin{bmatrix} X_{wi} & Y_{wi} & Z_{wi} & 1 & 0 & 0 & 0 & 0 & -X_{fi}X_{wi} & -X_{fi}Y_{wi} & -X_{fi}Z_{wi} \\ 0 & 0 & 0 & 0 & X_{wi} & Y_{wi} & Z_{wi} & 1 & -Y_{fi}X_{wi} & -Y_{fi}Y_{wi} & -Y_{fi}Z_{wi} \end{bmatrix}, \qquad B_i = \begin{bmatrix} X_{fi} \\ Y_{fi} \end{bmatrix}$$

$$x = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} & m_{21} & m_{22} & m_{23} & m_{24} & m_{31} & m_{32} & m_{33} \end{bmatrix}^{T}$$
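For concreteness, the following is a minimal numpy sketch (not from the paper) of assembling and solving equation (3); the function names, array layout and the use of `numpy.linalg.lstsq` are illustrative assumptions.

```python
import numpy as np

def calibrate_projection_matrix(world_pts, pixel_pts):
    """Least-squares solution of equation (3), with m34 fixed to 1.

    world_pts : (N, 3) array of calibration points (Xw, Yw, Zw)
    pixel_pts : (N, 2) array of corresponding pixels (Xf, Yf)
    returns   : the 3x4 transformation matrix M
    """
    rows, rhs = [], []
    for (xw, yw, zw), (xf, yf) in zip(world_pts, pixel_pts):
        # each calibration point contributes two equations, as in equation (2)
        rows.append([xw, yw, zw, 1, 0, 0, 0, 0, -xf * xw, -xf * yw, -xf * zw])
        rows.append([0, 0, 0, 0, xw, yw, zw, 1, -yf * xw, -yf * yw, -yf * zw])
        rhs += [xf, yf]
    A = np.asarray(rows, dtype=float)
    B = np.asarray(rhs, dtype=float)
    x, *_ = np.linalg.lstsq(A, B, rcond=None)  # least-squares solution of Ax = B
    return np.append(x, 1.0).reshape(3, 4)     # m11 ... m33, plus m34 = 1

def project(M, point):
    """Map a space point to pixel coordinates through M, as in equation (1)."""
    xf, yf, rho = M @ np.append(point, 1.0)
    return xf / rho, yf / rho                  # divide out the scale factor
```

With more than six points the system is overdetermined and `lstsq` returns the error-minimizing solution, matching the least-squares procedure described above.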
The above discussion does not take the distortion of the lens and other nonlinear factors into consideration, and a calibration technique simply based on this method will cause much error. So another method is proposed, which divides the whole image into several parts; that also means the data pairs (space points' coordinates and their corresponding pixels' coordinates) are divided into several sets, and the linear method mentioned above is applied to each set of data pairs or each image region. Several transformation matrixes are obtained in this way, and during measurement they are applied to the input data according to the partition rule. This is the basic idea of the Linear Partition Method, by which the measuring error can be reduced remarkably, as the following sketch illustrates.
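As an illustration only, the partition bookkeeping might look like the sketch below, which reuses `calibrate_projection_matrix` from the previous sketch; the 1×4 split along the image x axis, the 640-pixel width and the helper names are all assumptions rather than details given in the paper.

```python
import numpy as np

def partition_of(pixel, n_parts=4, width=640):
    """Index of the vertical strip a pixel falls into (1x4 split assumed)."""
    return min(int(pixel[0] * n_parts / width), n_parts - 1)

def calibrate_partitions(world_pts, pixel_pts, n_parts=4, width=640):
    """Group the data pairs by image strip and calibrate one matrix per strip."""
    groups = {k: ([], []) for k in range(n_parts)}
    for w_pt, p_pt in zip(world_pts, pixel_pts):
        ws, ps = groups[partition_of(p_pt, n_parts, width)]
        ws.append(w_pt)
        ps.append(p_pt)
    # one transformation matrix per partition; at measurement time the matrix
    # is selected by applying the same partition rule to the incoming pixel
    return {k: calibrate_projection_matrix(np.asarray(ws), np.asarray(ps))
            for k, (ws, ps) in groups.items() if ws}
```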
2.2 BP neural network calibration technique

BP Neural Network is a one-way-transmission, multi-layer artificial network. Every layer contains one or more nodes; the output of each layer is connected only with the input of the next layer and has no relationship with the nodes of other layers or itself. A standard network is composed of one input layer, one or more hidden layers and one output layer.

For input nodes, the output is just the same as the input; for the hidden layers and the output layer, the behavior can be described as follows.
$$\mathrm{net}_{pj} = \sum_i w_{ji}\, o_{pi}, \qquad o_{pj} = f_j(\mathrm{net}_{pj}) \qquad (4)$$

Where p is the present input sample, wji is the connection weight from node i to node j, and opi and opj are the input and the output of node j. fj is the excitation function, which should be differentiable everywhere in a BP Neural Network, so functions with an S shape are always preferred, such as the following.

$$f(x) = \frac{1}{1 + e^{-x}} \qquad (5)$$
63、nctions with S shape is always preference, such as following.</p><p> f = 1/(1 + e ? x )</p><p><b> (5)</b></p><p> The training process of the network begins with t
64、he preparation of training sample that contains input sample and ideal output sample. There are two directions defined as forward and backward direction in the process. In</p><p> forward direction, the beh
65、avior of each layer only has influence on the next layer and if the output is not ideal, the process will turn into backward direction, which return the error signal along the connection paths and transmit it to input la
66、yer by modifying the weight. Repeat this process until the error meets the demand. In backward direction, the weight is adjusted by this formula.</p><p><b> ? pω ji</b></p><p> = η
67、δ pj o pi</p><p><b> (6)</b></p><p> Where η is learning rate and its value should be selected between zero and one. When the nodes are in output</p><p> layer, follo
When the nodes are in the output layer, the following definition is used; otherwise equation (8) is used.

$$\delta_{pj} = (t_{pj} - o_{pj})\, f'_j(\mathrm{net}_{pj}) \qquad (7)$$

$$\delta_{pj} = f'_j(\mathrm{net}_{pj}) \sum_k \delta_{pk}\, w_{kj} \qquad (8)$$

Where tpj is the ideal output and opj is the real output.
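As a concrete (and deliberately minimal) reading of equations (4)-(8), the sketch below implements one training step of such a network in numpy, using the sigmoid of equation (5), for which f'(net) = o(1 − o). The layer sizes follow Fig.1 (three inputs, one hidden layer, two outputs), but the initialization range, the learning rate and the need to normalize pixel targets into (0, 1) are assumptions of this sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))              # equation (5)

class BPNetwork:
    """Minimal BP network: 3 inputs (Xw, Yw, Zw) -> hidden -> 2 outputs (Xf, Yf)."""

    def __init__(self, sizes=(3, 6, 2), eta=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.eta = eta                           # learning rate, 0 < eta < 1
        self.W = [rng.uniform(-0.5, 0.5, (m, n))
                  for n, m in zip(sizes[:-1], sizes[1:])]

    def forward(self, x):
        outs = [np.asarray(x, dtype=float)]      # input nodes: output = input
        for W in self.W:
            outs.append(sigmoid(W @ outs[-1]))   # equation (4)
        return outs

    def train_step(self, x, t):
        outs = self.forward(x)
        # output layer: delta = (t - o) f'(net), equation (7)
        delta = (np.asarray(t) - outs[-1]) * outs[-1] * (1.0 - outs[-1])
        for i in reversed(range(len(self.W))):
            grad = np.outer(delta, outs[i])      # delta_pj * o_pi
            if i > 0:                            # hidden layers: equation (8)
                delta = (self.W[i].T @ delta) * outs[i] * (1.0 - outs[i])
            self.W[i] += self.eta * grad         # weight update, equation (6)

# usage sketch: repeat until the error meets the demand
# net = BPNetwork()
# for xyz, uv in zip(world_pts_normalized, pixel_pts_normalized):
#     net.train_step(xyz, uv)
```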
The color sensor is calibrated by the BP Neural Network mentioned above, with the input and output selected as the 3D coordinates of the calibration points and their corresponding 2D pixel coordinates, as Fig.1 shows.

[Fig.1 BP Neural Network model adopted by color sensor: input layer (Xw, Yw, Zw) -> hidden layers -> output layer (Xf, Yf)]

[Fig.2 Schematic diagram of 3D color sensor: laser, B&W CCD, object and color CCD]

2.3 Information matching process
As Fig.2 shows, the 3D sensor is composed of a B&W CCD and a line-structured laser and moves up and down, while the color CCD is the color sensor. The light plane intersects with the object and generates a light stripe. The measuring process is the process of scanning the light stripe over the object and recording the contour information by the 3D sensor. The horizontal and depth coordinates are recorded by the 3D sensor, the vertical coordinate is got from a precise mechanical scanning system, and the color sensor records the color information of the object.

The matching between the 3D coordinate and its color information can be realized by the following process. Transformation matrixes of the different partitions are got from the calibration of the 3D sensor, and they translate the light stripe information on the B&W CCD to the space coordinates (Xw, Yw). With another coordinate Zw got from the scanning system, the whole 3D information of the object can be expressed by (Xw, Yw, Zw). Furthermore, the mapping relationship between the 3D coordinate (Xw, Yw, Zw) and its corresponding pixel coordinate on the color image got from the color CCD is obtained by the color sensor calibration, and the RGB values of the pixels can be got from the true color image. So the matching between the 3D information and its corresponding color information can be realized, as sketched below.
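Read as pseudocode, the matching step then amounts to looking up each scanned point's pixel in the color image. The sketch below is illustrative only: it assumes a calibrated 3D-to-pixel mapping such as the `BPNetwork` above (with outputs normalized to (0, 1)), and the names and normalization handling are not from the paper.

```python
import numpy as np

def build_color_point_cloud(points_3d, color_image, net):
    """Attach RGB values from the color image to each scanned 3D point.

    points_3d   : (N, 3) array of (Xw, Yw, Zw) from the 3D sensor and scanner
    color_image : (H, W, 3) RGB array captured by the color CCD
    net         : calibrated 3D -> pixel mapping with outputs in (0, 1)
    """
    h, w = color_image.shape[:2]
    cloud = []
    for p in points_3d:
        xf, yf = net.forward(p)[-1] * np.array([w, h])  # undo the normalization
        u, v = int(round(xf)), int(round(yf))
        if 0 <= u < w and 0 <= v < h:                   # pixel inside the image?
            r, g, b = color_image[v, u]
            cloud.append((*p, r, g, b))                 # (Xw, Yw, Zw, R, G, B)
    return np.asarray(cloud, dtype=float)
```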
3. EXPERIMENTAL RESULTS

Based on the above theory, coplanar and non-coplanar calibration points are used to calibrate the B&W CCD and the color CCD respectively; the cameras are an MTV-0360 and a 73X11HP made by the MINTRON Company. The B&W CCD with a 6 mm lens is sampled by an image capture board at a resolution of 640×480, and the color CCD with an 8 mm lens is sampled at a resolution of 768×576. The laser is an SNF-501L670 from the LASIRIS Company in Canada; the light is line-structured and the wavelength is 670 nm.

The B&W image is divided in the form of 1×4 over the area where most of the points are distributed, and four transformation matrixes are obtained. In order to verify the calibration precision, an object with an edge length of 149.50 mm is measured according to the calibration results. Four images are captured, and the mean relative error after computation is 0.26%.

A double-layer BP Neural Network (without considering the input layer) with six nodes in the hidden layer is adopted to calibrate the color sensor. The numbers of calibration points and testing points are 60 and 48 respectively. The mean absolute error is 0.61 pixels in the x direction and 0.55 pixels in the y direction.

A real 3D color object, made by pasting yellow, red and green paper on a cylinder, is measured based on the calibration results of the B&W and color CCDs, and 30 B&W images are captured by a vertical scanning system. Fig.3 is the color image of the object; Fig.4 is the 3D point cloud calculated by the transformation matrixes and the image series; Fig.5 is the 3D color point cloud obtained from the color sensor calibration and the information matching process. It can represent the 3D and color information of the object truly and vividly.

[Fig.3 Color image of the object]  [Fig.4 3D point cloud of the object]  [Fig.5 3D color point cloud of the object]
4. CONCLUSION

From the theory analysis and experimental results mentioned above, it can be concluded that the matching technique based on camera calibration is feasible and gives satisfying precision and results. Both methods work without the camera's intrinsic and extrinsic parameters, such as the scale factor and the image center; the space point coordinates and their corresponding pixel coordinates are enough, which makes the methods easy to adopt and still high in precision. In addition, when the Linear Partition Method is used, the number and form of the partitions should be determined by the practical application, and the image can be divided in other ways, such as into concentric circles or rectangles; linear or even other nonlinear calibrations can be used within each partition. Therefore, the method can solve the information matching problem effectively, which lays a good foundation for future 3D color reconstruction and texture mapping.

5. ACKNOWLEDGEMENTS

This paper is supported by the National Nature Scientific Research Foundation (No.60277009).
REFERENCES

[1] Sun Yuchen, Ge Baozhen, Zhang Yimo. Review for the 3D information measuring technology[J]. Journal of Optoelectronics · Laser, 2004, 15(2): 248-254. (in Chinese)
[2] Petrov M., Talapov A. et al. Optical 3D digitizers: bringing life to the virtual world[J]. IEEE Computer Graphics and Applications, 1998, 18(3): 28-37.
[3] Borghese N.A., Ferrigno G. et al. Autoscan: a flexible and portable 3D scanner[J]. IEEE Computer Graphics and Applications, 1998, 18(3): 38-41.
[4] Tsai R.Y. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses[J]. IEEE Journal of Robotics and Automation, 1987, RA-3(4): 323-344.
[5] Faig W. Calibration of close-range photogrammetry systems: Mathematical formulation[J]. Photogrammetric Eng. Remote Sensing, 1975, 41: 1479-1486.

Contact: Ge Baozhen, e-mail: gebz@tju.edu.cn; phone: 022