Error Prediction Simulation of Athletes' Starting Motion Based on Three-Dimensional Images
Published: 2018-07-15 16:29
[Abstract]: During the support phase of a track-and-field start, the accuracy of the starting motion lets an athlete reach optimal speed in the shortest time and directly affects competition results, so the error of the starting motion needs to be predicted. Existing methods have difficulty tracking the joint points of the starting motion accurately, which lowers the precision of trajectory prediction. A method for predicting athletes' starting-motion error based on three-dimensional images is therefore proposed. The method describes the three-dimensional motion pattern of the starting movement with a star-skeleton structure and uses the ISOMAP nonlinear dimensionality-reduction algorithm to compute the projection of the motion states onto a three-dimensional subspace, projecting the three-dimensional starting-motion data into a nonlinear low-dimensional subspace. After the intrinsic structure of the starting-motion states is identified, the joint points of the whole starting motion are analysed: a Mean-Shift search algorithm locates each joint point, and a Kalman filter predicts the starting-motion error and determines the motion trajectory during the start. Simulation results show that the proposed method effectively improves the prediction accuracy of the starting-motion trajectory and achieves high prediction efficiency.
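The abstract's first technical step is projecting the star-skeleton pose data into a low-dimensional nonlinear subspace with ISOMAP. The following is a minimal Python sketch of that step using scikit-learn's Isomap; the feature dimensionality, neighbour count and random placeholder data are illustrative assumptions, not values from the paper.

import numpy as np
from sklearn.manifold import Isomap

# Assume each frame is summarised by a star-skeleton feature vector
# (e.g. limb angles and lengths relative to the torso centroid).
n_frames, n_features = 200, 15                         # assumed sizes
pose_sequence = np.random.rand(n_frames, n_features)   # placeholder pose data

# Embed the pose sequence into a 3-dimensional nonlinear subspace.
isomap = Isomap(n_neighbors=10, n_components=3)
low_dim_states = isomap.fit_transform(pose_sequence)   # shape: (n_frames, 3)
print(low_dim_states.shape)

The low-dimensional coordinates stand in for the "motion state projection" mentioned in the abstract; the downstream joint analysis would operate on these embedded states.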
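The joint points of the starting motion are then located with a Mean-Shift search. The sketch below is an assumption-laden illustration rather than the paper's exact procedure: it runs the standard Mean-Shift iteration on a per-joint likelihood map, shifting a circular window toward the likelihood-weighted centroid until it converges on the joint position.

import numpy as np

def mean_shift_joint(likelihood, start, bandwidth=8, max_iter=30, tol=0.5):
    """Shift a circular window toward the likelihood-weighted centroid."""
    ys, xs = np.indices(likelihood.shape)
    center = np.asarray(start, dtype=float)            # (row, col) initial guess
    for _ in range(max_iter):
        mask = (ys - center[0]) ** 2 + (xs - center[1]) ** 2 <= bandwidth ** 2
        w = likelihood * mask
        if w.sum() == 0:                               # empty window: give up
            break
        new_center = np.array([(ys * w).sum(), (xs * w).sum()]) / w.sum()
        if np.linalg.norm(new_center - center) < tol:  # converged
            return new_center
        center = new_center
    return center

# Synthetic likelihood map with a peak near (40, 60),
# standing in for a joint response computed from the 3-D image data.
rows, cols = np.indices((80, 120))
lik = np.exp(-(((rows - 40) ** 2 + (cols - 60) ** 2) / (2 * 5.0 ** 2)))
print(mean_shift_joint(lik, start=(30, 50)))           # moves toward (40, 60)

How the likelihood map itself is built (colour, depth or skeleton matching) is not specified in the abstract, so it is left as an input here.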
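Finally, a Kalman filter predicts the starting-motion error and reconstructs the joint trajectory. The sketch below is a standard constant-velocity predict/update cycle for one 3-D joint; the frame interval, noise covariances and placeholder measurements are assumed values, not taken from the paper. The innovation (measured minus predicted position) plays the role of the per-frame motion error.

import numpy as np

dt = 1.0 / 120.0                                   # assumed capture interval (120 fps)
F = np.block([[np.eye(3), dt * np.eye(3)],
              [np.zeros((3, 3)), np.eye(3)]])      # constant-velocity state transition
H = np.hstack([np.eye(3), np.zeros((3, 3))])       # only positions are measured
Q = 1e-4 * np.eye(6)                               # process noise (assumed)
R = 1e-2 * np.eye(3)                               # measurement noise (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle; returns state, covariance and innovation."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    innovation = z - H @ x_pred                    # prediction error for this frame
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ innovation
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new, innovation

x, P = np.zeros(6), np.eye(6)
for z in np.random.rand(5, 3):                     # placeholder measured joint positions
    x, P, err = kalman_step(x, P, z)
    print("prediction error:", np.round(err, 3))

Stacking the filtered position components of x over all frames gives the predicted trajectory of that joint during the start.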
[Author affiliation]: School of Physical Education, Xizang Minzu University (西藏民族大學)
[CLC number]: G822; TP391.41
Article No.: 2124691
Article link: http://sikaile.net/jiaoyulunwen/tylw/2124691.html