Research on Stable Virtual View Generation for 3D Video Based on Optical Flow
Published: 2018-04-14 16:01
Topic: 3D + optical flow tracking; Source: Beijing University of Posts and Telecommunications, master's thesis, 2016
【Abstract】: The rapid development of 3D technology has made 3D video widely popular. Compared with glasses-based 3D, glasses-free (autostereoscopic) 3D offers a more comfortable viewing experience. Autostereoscopic display schemes typically generate multiple intermediate views from an existing binocular 3D video and then combine the generated views into the final 3D video. During intermediate-view generation, however, factors such as unstable stereo matching often introduce varying degrees of jitter. How to generate stable intermediate views efficiently and guarantee their continuity is therefore a meaningful research problem. Building on a study of virtual view generation techniques, this thesis analyzes the causes of instability in virtual view generation and, targeting the main cause, uses optical-flow tracking to suppress jitter. Two schemes for stable virtual view generation are proposed; the specific contents and results are as follows:

1. A stable virtual view generation scheme based on sparse optical-flow tracking. The scheme ensures stability in three respects: feature point extraction, intermediate view generation, and temporal tracking of feature points. The SIFT algorithm yields a relatively stable feature point set; combining Delaunay triangulation with Direct3D enables efficient, fast generation of intermediate views. To keep successive generated views stable and continuous, the Lucas-Kanade sparse optical-flow method tracks the feature point set, with an image pyramid incorporated so that the method can handle large, incoherent motion; a tracking period is further introduced to keep the tracking reliable. Experiments show that this scheme is fast, produces intermediate views of high quality, and effectively reduces instability and discontinuity between intermediate views. Because it avoids depth-map generation and hole filling, it is convenient for practical use.

2. A stable virtual view generation scheme based on dense optical-flow tracking. The scheme comprises optical-flow tracking between the left and right stereo images, intermediate view generation, inter-frame tracking, and horizontal disparity computation. The TV-L1 optical-flow method is used for tracking, combined with an image pyramid to improve its stability. Applying it between the left and right images yields the horizontal disparity of the stereo pair, from which multiple stable intermediate views are generated by disparity shifting. The same optical-flow method tracks across video frames, and the horizontal disparity of the current stereo pair is computed from the motion between consecutive frames; a tracking period is again introduced to keep inter-frame tracking reliable. To address the high cost of dense tracking, the source images are first downsampled, and the generated intermediate views are then interpolated back to the original size. Experiments show that this scheme produces good intermediate views, compensates for the blurred regions left by the sparse method, and also provides a degree of inter-frame stability and continuity.

The two schemes each have their strengths: both generate stable, continuous intermediate views effectively, they complement each other in certain scenes, and together they have practical value.
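The disparity-shifting step of the dense scheme can be illustrated with a minimal sketch. This is not the thesis's implementation: the function name, the disparity sign convention, and the zero-fill hole handling are all assumptions made for illustration.

```python
import numpy as np

def synthesize_view(left, disparity, alpha):
    """Forward-warp the left image toward the right view.

    left      -- (H, W) grayscale image
    disparity -- (H, W) horizontal disparity: the matching right-view
                 pixel of left[y, x] sits at column x - disparity[y, x]
    alpha     -- view position: 0.0 = left view, 1.0 = right view
    Pixels that receive no source sample remain 0 (disocclusion holes).
    """
    H, W = left.shape
    view = np.zeros_like(left)
    xs = np.arange(W)
    for y in range(H):
        # shift every source column by a fraction alpha of its disparity
        xt = np.rint(xs - alpha * disparity[y]).astype(int)
        ok = (xt >= 0) & (xt < W)        # keep targets inside the image
        view[y, xt[ok]] = left[y, xs[ok]]
    return view
```

At `alpha = 0.5` with a constant disparity of 4, content shifts left by 2 columns. A real pipeline would also fill the holes, e.g. by blending a symmetric warp from the right image — which the sparse scheme in the thesis sidesteps entirely.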
【Degree-granting institution】: Beijing University of Posts and Telecommunications
【Degree level】: Master's
【Year conferred】: 2016
【CLC number】: TP391.41
【Similar literature】
Related journal articles (first 10):
1 孫正, 郁道銀, 陳曉冬, 徐智, 黃家祥, 謝洪波; Estimation of vessel motion in coronary angiography image sequences based on optical flow [J]; Journal of Engineering Graphics; 2003(03)
2 韓雷, 王洪慶, 林隱靜; Application of optical flow in nowcasting of severe convective weather [J]; Acta Scientiarum Naturalium Universitatis Pekinensis; 2008(05)
3 龔大墉, 汪治華, 楊數(shù)強(qiáng), 劉巖; Object-based locally constrained optical flow analysis [J]; Journal of Chongqing Institute of Technology (Natural Science); 2009(07)
4 嚴(yán)強(qiáng), 黃增喜, 曹麗萍, 黃蓉剛; Application of optical flow to safe locomotive driving [J]; Application Research of Computers; 2013(04)
5 馬鵬飛, 楊金孝; Particle image velocimetry based on optical flow [J]; Science Technology and Engineering; 2012(32)
6 張永亮, 盧煥章, 賀興華, 謝耀華; A line correspondence algorithm based on optical flow prediction [J]; Journal of Signal Processing; 2010(05)
7 孫承志, 熊田忠, 吉順平, 張家海; Application of difference-based optical flow to object detection and tracking [J]; Machine Tool & Hydraulics; 2010(14)
Article ID: 1750008
Link: http://sikaile.net/kejilunwen/ruanjiangongchenglunwen/1750008.html