An Aircraft Ranging Algorithm Based on Two Frames of Image in Monocular Image Sequence

We propose a novel passive ranging algorithm based on a rotation-invariant feature to estimate the distance from an imaged non-cooperative target to the camera. The improved algorithm avoids the physically unreasonable results, such as complex or negative values, that sometimes occur when solving the existing quartic equation. The method uses three matched points in two adjacent frames of an image sequence to extract a depth-dependent line feature of the target. Combining this line feature with the observer's displacement and the imaging directions, a quadratic equation is built to estimate the distance. Analysis shows that the proposed passive ranging equation is solvable whenever the observer has a non-zero displacement between adjacent sampling instants. A reduced-model experiment also demonstrates that the proposed algorithm is not only simple and feasible but also achieves a relative ranging error of no more than 4 per cent in most cases.


Keywords:    Invariants, Scramjet, depth cues, feature representation, passive ranging, imaging geometry

Although stereo vision can find the depth information of an imaged object[1], the motion-stereo depth measurement[2] scheme is more attractive for its simplicity of construction. From industrial robots[3] to air-launched weapons[4], motion stereo has been widely used. It is characterized by a monocular imaging system mounted on a maneuvering platform.

In motion stereo, feature extraction and feature matching are problems that must be solved[5]. These features include points[3,6-9], lines or edges[5,10,11], and regions[12] or colours[13]. Among these three kinds of features, regions and colours are less used than the former two, while lines and edges have found the most applications. This is because lines are easy to extract from contour images, and their characterization by means of polygonal approximation is more reliable than point features in the presence of noise[11]. In general, edges come from edge-based methods[5], while straight lines may come from either edge-based methods or two unique feature points.

In the past few decades, linear features, which include straight lines and edges, have been used in image-distortion correction[5], 3-D image matching[14], target tracking[7,11], range finding[8,15], vision-based navigation guidance[16,17], robot simultaneous localization and mapping (SLAM)[9,18], pose estimation[19,20], and so on. Among these applications, passive ranging based on motion stereo is very important[5,8,21]. Sometimes the feature lines are wire poles or street trees in the background, but in most cases the feature lines lie on the target itself[5,8,21]. In this study, a new method is proposed for ranging a non-cooperative aircraft.

2.1  Existing Ranging Model

In this study, the authors take the geographic coordinate system o-xyz as the host coordinates, in which north, west, and up are assigned as the positive directions of the x-, y-, and z-axes, respectively. This is a reasonable assumption, since the state of the measurement platform on which the camera is fixed can be known from other detection systems, such as GPS and other onboard sensors; this information includes the azimuth, pitch, radial distance to point o, velocity, acceleration, and attitude.

The platform itself also constitutes a coordinate system O-XYZ, i.e. the platform coordinates. If we take an aircraft as the platform, then the nose, the right wing, and the top of the engine room are the positive directions of the Y-, X-, and Z-axes, respectively. The airborne measurement platform in the geographic coordinates is shown in Fig. 1.


Figure 1. The airborne measurement-platform in geography coordinate.


At the n-th sampling time, the position of point O in the o-xyz coordinates is O(xn, yn, zn). Suppose the moving target is expressed in the measurement-platform coordinates in spherical form as (rn, αn, βn). Here, αn and βn are the azimuth and pitch from the camera to the target, and the camera's sightline to the target in the geographic coordinates can be expressed as the direction vector (ln, mn, nn) as below:


$$\begin{pmatrix} l_n \\ m_n \\ n_n \end{pmatrix} = \begin{pmatrix} t_{11}^n & t_{12}^n & t_{13}^n \\ t_{21}^n & t_{22}^n & t_{23}^n \\ t_{31}^n & t_{32}^n & t_{33}^n \end{pmatrix} \begin{pmatrix} \cos\alpha_n \cos\beta_n \\ \sin\alpha_n \cos\beta_n \\ \sin\beta_n \end{pmatrix}$$            (1)

Here, the matrix $(t_{ij}^n)_{3\times 3}$ in Eqn. (1) is the transpose of the matrix formed by the direction vectors of the X-, Y-, and Z-axes in the o-xyz coordinates.
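As a concrete illustration of Eqn. (1), the Python sketch below (the helper name `sightline_direction` and all values are hypothetical) rotates the camera-frame unit sightline, built from the azimuth and pitch, into the geographic o-xyz frame:

```python
import numpy as np

def sightline_direction(alpha_n, beta_n, T):
    """Eqn. (1): rotate the camera-frame unit sightline into o-xyz.

    alpha_n, beta_n : azimuth and pitch of the target (radians)
    T               : 3x3 matrix whose entries are the t_ij^n of Eqn. (1)
    """
    unit = np.array([np.cos(alpha_n) * np.cos(beta_n),
                     np.sin(alpha_n) * np.cos(beta_n),
                     np.sin(beta_n)])
    l, m, n = T @ unit
    return l, m, n
```

Because T is a rotation, the resulting vector (ln, mn, nn) stays a unit vector, which is a quick sanity check on the matrix.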

Suppose there exists a one-dimensional scale x0 on the target that is invariant to the rotation of the camera between two adjacent sampling times. Let us call the scale's projection on the camera focal plane the target's characteristic linearity. Under normal conditions both the target and the measurement platform are in motion. Figure 2 illustrates the recursive ranging model based on the characteristic linearity.

In Fig. 2, T and S denote the target and the surveyor (camera), respectively, and the subscripts n and (n+1) represent the sampling times. Therefore, TnTn+1 and SnSn+1 are the moving traces of the target and the platform between the n-th and (n+1)-th sampling times, while φn and φn+1 are the angles between the target's trace and the camera's sightline at each sampling time.


Figure 2. The ranging model based on characteristic linearity.


Assume the focal lengths of the camera's optical system to be fn and fn+1 at the n-th and (n+1)-th sampling times, and the lengths of the characteristic linearity in the camera focal plane to be Ln and Ln+1. Obviously, Ln and Ln+1 belong to a kind of depth-dependent line feature. According to the geometric imaging principle, the following equation can be derived:


$$\frac{r_{n+1}}{r_n} = \frac{f_{n+1}}{f_n}\cdot\frac{L_n}{L_{n+1}}\cdot\frac{\sin\varphi_{n+1}}{\sin\varphi_n}$$            (2)
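Equation (2) is a direct ratio computation; a minimal Python sketch under the same notation (the function name and inputs are illustrative only):

```python
import math

def range_ratio(f_n, f_n1, L_n, L_n1, phi_n, phi_n1):
    """Eqn. (2): ratio r_{n+1}/r_n from focal lengths, projected
    characteristic-linearity lengths, and trace angles (radians)."""
    return (f_n1 / f_n) * (L_n / L_n1) * (math.sin(phi_n1) / math.sin(phi_n))
```

When the focal length and the trace angle are unchanged between frames, the ratio reduces to Ln/Ln+1, i.e. a shrinking characteristic linearity indicates a receding target.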

2.2   Existing Algorithm

Based on Eqn. (2), the following recursive passive ranging equation is derived[15]:
$$C_4 r_{n+1}^4 + C_3 r_{n+1}^3 + C_2 r_{n+1}^2 + C_1 r_{n+1} + C_0 = 0$$            (3)

where
$$C_4 = H\left[1 - (l_{n+1} l_n + m_{n+1} m_n + n_{n+1} n_n)^2\right]$$            (4)

$$\begin{aligned} C_3 = 2H\{ & l_{n+1}(x_{n+1}-x_n) + m_{n+1}(y_{n+1}-y_n) + n_{n+1}(z_{n+1}-z_n) \\ & - (l_{n+1}l_n + m_{n+1}m_n + n_{n+1}n_n)\left[l_n(x_{n+1}-x_n) + m_n(y_{n+1}-y_n) + n_n(z_{n+1}-z_n)\right]\} \end{aligned}$$            (5)

$$C_2 = H\left\{\left[l_n(x_{n+1}-x_n) + m_n(y_{n+1}-y_n) + n_n(z_{n+1}-z_n)\right]^2 + (x_{n+1}-x_n)^2 + (y_{n+1}-y_n)^2 + (z_{n+1}-z_n)^2\right\}$$            (6)

$$C_1 = 0$$            (7)

$$C_0 = k_2 r_n^2 + k_1 r_n + k_0$$            (8)

$$H = \left(\frac{f_n}{f_{n+1}}\right)^2 \left(\frac{L_{n+1}}{L_n}\right)^2 \frac{1}{r_n^2}$$            (9)

and,
$$k_2 = (l_{n+1}l_n + m_{n+1}m_n + n_{n+1}n_n)^2 - 1$$            (10)

$$\begin{aligned} k_1 = 2\{ & l_n(x_{n+1}-x_n) + m_n(y_{n+1}-y_n) + n_n(z_{n+1}-z_n) \\ & - (l_{n+1}l_n + m_{n+1}m_n + n_{n+1}n_n)\left[l_{n+1}(x_{n+1}-x_n) + m_{n+1}(y_{n+1}-y_n) + n_{n+1}(z_{n+1}-z_n)\right]\} \end{aligned}$$            (11)

$$k_0 = \left[l_{n+1}(x_{n+1}-x_n) + m_{n+1}(y_{n+1}-y_n) + n_{n+1}(z_{n+1}-z_n)\right]^2 - (x_{n+1}-x_n)^2 - (y_{n+1}-y_n)^2 - (z_{n+1}-z_n)^2$$            (12)

In the distance-estimating Eqn. (3), each n-th distance from the target to the camera, rn, can be obtained if the initial distance r0 is known, e.g., from radar or lidar.
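Given rn, Eqn. (3) can be solved numerically for rn+1. A sketch of one way to do this (a hypothetical helper using NumPy's general polynomial root finder, keeping only real, positive roots as physically reasonable ranges):

```python
import numpy as np

def solve_quartic_range(C4, C3, C2, C1, C0):
    """Solve C4*r^4 + C3*r^3 + C2*r^2 + C1*r + C0 = 0 for r_{n+1},
    discarding complex and negative roots (Eqn. (3))."""
    roots = np.roots([C4, C3, C2, C1, C0])
    real = roots[np.abs(roots.imag) < 1e-9].real
    return sorted(r for r in real if r > 0)
```

Even after this filtering, more than one candidate root may remain; selecting the right one is exactly the ambiguity that motivates the quadratic model later in this paper.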

Nevertheless, passive ranging without an initial distance is more attractive. To our knowledge, when a certain length x0 on the target is known, the distance difference between two sampling times can be obtained as below:


$$\Delta = r_{n+1} - r_n = f x_0 \frac{L_n - L_{n+1}}{L_{n+1} \cdot L_n}$$            (13)

Substituting Eqn. (13) into Eqn. (3), we obtain the nonlinear Eqn. (14)[15]:

$$D_4 r_n^4 + D_3 r_n^3 + D_2 r_n^2 + D_1 r_n + D_0 = 0$$            (14)

where

$$D_4 = C_4$$            (15)

$$D_3 = 4C_4\Delta + C_3$$            (16)

$$D_2 = 6C_4\Delta^2 + 3C_3\Delta + C_2 + k_2$$            (17)

$$D_1 = 4C_4\Delta^3 + 3C_3\Delta^2 + 2C_2\Delta + C_1 + k_1$$            (18)

$$D_0 = C_4\Delta^4 + C_3\Delta^3 + C_2\Delta^2 + C_1\Delta + k_0$$            (19)

Equation (14) is essentially a fourth-order (quartic) equation in rn. By solving Eqn. (14), rn can be obtained, and consequently rn+1 = rn + Δ. Thus a passive distance-finding scheme without an initial distance has been achieved.
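The mapping of Eqns. (15)-(19) is the coefficient change produced by substituting rn+1 = rn + Δ into Eqn. (3), with C0 = k2·rn² + k1·rn + k0 contributing the k-terms. A sketch (helper name hypothetical):

```python
def shift_coefficients(C4, C3, C2, C1, k2, k1, k0, delta):
    """Eqns. (15)-(19): coefficients of the quartic in r_n obtained by
    substituting r_{n+1} = r_n + delta into Eqn. (3)."""
    D4 = C4
    D3 = 4 * C4 * delta + C3
    D2 = 6 * C4 * delta**2 + 3 * C3 * delta + C2 + k2
    D1 = 4 * C4 * delta**3 + 3 * C3 * delta**2 + 2 * C2 * delta + C1 + k1
    D0 = C4 * delta**4 + C3 * delta**3 + C2 * delta**2 + C1 * delta + k0
    return D4, D3, D2, D1, D0
```

Setting C4 = 1, the other C's and k's to zero, and Δ = 2 reproduces the binomial coefficients of (rn + 2)^4, which is a quick consistency check on the mapping.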

Compared with Eqn. (3), the initial distance r0 is no longer needed in Eqn. (14), which is particularly convenient in practical applications. According to Fu[15], et al., the error resulting from a reduced-model experiment using Eqn. (14) is close to 4 per cent. However, because of the introduction of the variable Δ, which requires a known length x0 on the target, Eqn. (14) is not valid for passive ranging of a non-cooperative target. In addition, pathological solutions sometimes occur when solving the current quartic Eqn. (14). This problem needs to be overcome.

A factor worth pointing out is that the ranging algorithm needs only two frames of images in Eqn. (3) or Eqn. (14); this merit helps prevent the diffusion of ranging errors. As we know, Eqn. (3) and Eqn. (14) use not only the image information but also the imaging directions; that is, they contain some redundant information. This redundancy is favourable for improving the associated algorithm.

Multiplying both sides of Eqn. (3) by rn2, we get Eqn. (20):


$$C_4 r_n^2 r_{n+1}^4 + C_3 r_n^2 r_{n+1}^3 + C_2 r_n^2 r_{n+1}^2 + C_1 r_n^2 r_{n+1} + C_0 r_n^2 = 0$$            (20)

Let
$$\begin{cases} C_{40} = C_4 r_n^2 \\ C_{30} = C_3 r_n^2 \\ C_{20} = C_2 r_n^2 \\ C_{10} = C_1 r_n^2 = 0 \quad (\because C_1 = 0) \\ C_{00} = C_0 r_n^2 = k_2 r_n^4 + k_1 r_n^3 + k_0 r_n^2 \end{cases}$$            (21)

Then Eqn. (20) turns into the equation below:
$$C_{40} r_{n+1}^4 + C_{30} r_{n+1}^3 + C_{20} r_{n+1}^2 + k_2 r_n^4 + k_1 r_n^3 + k_0 r_n^2 = 0$$            (22)

Substituting the distance ratio between adjacent sampling times, ρ = rn/rn+1, into Eqn. (22), we get Eqn. (23):
$$(C_{40} + k_2\rho^4) r_{n+1}^4 + (C_{30} + k_1\rho^3) r_{n+1}^3 + (C_{20} + k_0\rho^2) r_{n+1}^2 = 0$$            (23)

After reducing Eqn. (23), we get Eqn. (24):
$$A_2 r_{n+1}^4 + A_1 r_{n+1}^3 + A_0 r_{n+1}^2 = 0$$            (24)

Since the target’s distance rn+1≠0, Eqn. (24) is re-written as Eqn. (25):
$$A_2 r_{n+1}^2 + A_1 r_{n+1} + A_0 = 0$$            (25)

From Eqn. (3) through Eqn. (12) and Eqn. (21), the coefficients of the ranging equation are determined as below:
$$A_2 = (\rho^4 - H')\left[(l_{n+1}l_n + m_{n+1}m_n + n_{n+1}n_n)^2 - 1\right]$$            (26)

$$\begin{aligned} A_1 ={} & 2H'\{l_{n+1}(x_{n+1}-x_n) + m_{n+1}(y_{n+1}-y_n) + n_{n+1}(z_{n+1}-z_n) \\ & \quad - (l_{n+1}l_n + m_{n+1}m_n + n_{n+1}n_n)\left[l_n(x_{n+1}-x_n) + m_n(y_{n+1}-y_n) + n_n(z_{n+1}-z_n)\right]\} \\ & + 2\rho^3\{l_n(x_{n+1}-x_n) + m_n(y_{n+1}-y_n) + n_n(z_{n+1}-z_n) \\ & \quad - (l_{n+1}l_n + m_{n+1}m_n + n_{n+1}n_n)\left[l_{n+1}(x_{n+1}-x_n) + m_{n+1}(y_{n+1}-y_n) + n_{n+1}(z_{n+1}-z_n)\right]\} \end{aligned}$$            (27)

A_0 = H'\{ -[l_n(x_{n+1}-x_n) + m_n(y_{n+1}-y_n) + n_n(z_{n+1}-z_n)]^2 + (x_{n+1}-x_n)^2 + (y_{n+1}-y_n)^2 + (z_{n+1}-z_n)^2 \} + \rho^2 \{ [l_{n+1}(x_{n+1}-x_n) + m_{n+1}(y_{n+1}-y_n) + n_{n+1}(z_{n+1}-z_n)]^2 - (x_{n+1}-x_n)^2 - (y_{n+1}-y_n)^2 - (z_{n+1}-z_n)^2 \}            (28)

H' = H r_n^2 = \left( \frac{f_n}{f_{n+1}} \right)^2 \left( \frac{L_{n+1}}{L_n} \right)^2            (29)

In Eqn. (26) through Eqn. (28), ρ = r_n/r_{n+1} = L_{n+1}/L_n; this ratio is an approximation of Eqn. (2) for a sufficiently small sampling interval. The procedure for obtaining L_{n+1} and L_n is detailed below.
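As an illustrative sketch (the function and argument names are ours, not from the paper), ρ and H′ of Eqn. (29) can be evaluated directly from the measured focal lengths and characteristic-linearity lengths:

```python
def rho_and_H_prime(f_n, f_n1, L_n, L_n1):
    """Scale ratio rho and auxiliary constant H' per Eqn. (29).

    f_n, f_n1 : focal lengths at sampling instants n and n+1
    L_n, L_n1 : characteristic-linearity lengths (circumcircle
                diameters) measured in frames n and n+1
    """
    rho = L_n1 / L_n                      # rho = r_n / r_{n+1} = L_{n+1} / L_n
    H_prime = (f_n / f_n1) ** 2 * rho ** 2
    return rho, H_prime
```

With a fixed-focal-length camera (f_n = f_{n+1}), H′ reduces to ρ², so only the two measured diameters matter.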

Let points A, B, C and A', B', C' be the matched points in two adjacent frames. We determine the line-segment feature from these points; taking A, B, and C as an example (Fig. 3), the circumcircle of triangle ABC, with centre O', is uniquely determined, and we take its diameter MN as the characteristic linearity. With this improvement, the characteristic linearity can be obtained more easily than in Fu15, et al.


Figure 3. Feature points A, B, C and the selected line-segment feature.
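The construction above can be sketched in a few lines; the minimal helper below (the name `circumcircle_diameter` is hypothetical) computes the characteristic linearity MN as the circumcircle diameter of the three matched image points, using d = abc/(2K), where a, b, c are the side lengths of triangle ABC and K is its area:

```python
import math

def circumcircle_diameter(p1, p2, p3):
    """Diameter MN of the circumcircle through three matched image
    points, each given as (x, y) pixel coordinates.

    Uses d = a*b*c / (2K): a, b, c are the triangle's side lengths
    and K its area, obtained from the cross product of two edges.
    """
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    # |cross| equals twice the triangle area (2K).
    cross = (p2[0] - p1[0]) * (p3[1] - p1[1]) \
          - (p3[0] - p1[0]) * (p2[1] - p1[1])
    area2 = abs(cross)
    if area2 == 0:
        raise ValueError("A, B, C are collinear; no circumcircle exists")
    return a * b * c / area2
```

Applying it to A, B, C in frame n and to A', B', C' in frame n+1 yields L_n and L_{n+1}, whose ratio gives ρ.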


This method needs only three matched points to extract the line feature, and three is the minimum number of points required for image matching. To obtain at least three matched points, an effective approach is to adjust the image contrast within a certain range before matching. Equations (25), (14), and (3) are homologous, but the former is of lower order, so its ranging error should be no greater than the latter's.

As for the distance-estimation Eqn. (25), it always has a solution provided the observer has non-zero displacement between adjacent sampling instants, i.e., the target's distance is always estimable. This condition can be satisfied by continuous observer manoeuvre. Even if the distance difference δ is zero, the discriminant of the quadratic equation, A1² − 4A2A0, remains greater than zero; that is, Eqn. (25) can still be solved.
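The root selection implied here can be mirrored in code. The sketch below (names are ours) solves A2·r² + A1·r + A0 = 0 and keeps only the positive real roots, since a distance must be positive; which positive root to prefer when two exist is scenario-dependent and is not fixed here:

```python
import math

def solve_range(A2, A1, A0):
    """Solve the quadratic ranging equation A2*r**2 + A1*r + A0 = 0
    (the form of Eqn. (25)) and return its positive real roots, sorted.

    Complex and non-positive roots are discarded as unphysical:
    a target distance must be a positive real number.
    """
    if A2 == 0:
        raise ValueError("A2 must be non-zero for a quadratic equation")
    disc = A1 * A1 - 4.0 * A2 * A0
    if disc < 0:
        raise ValueError("negative discriminant: no real solution")
    sq = math.sqrt(disc)
    roots = ((-A1 + sq) / (2.0 * A2), (-A1 - sq) / (2.0 * A2))
    return sorted(r for r in roots if r > 0)
```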

So far, our method needs no prior knowledge about the target, so it is well suited to ranging non-cooperative targets. Figure 4 shows the flow chart of the method as applied. Our experiments confirm that the step 'contrast adjustment in the target and its adjacent region' is indeed necessary for image matching and characteristic-linearity extraction. Building a quadratic equation is our main improvement.
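The contrast-adjustment step can be illustrated with a simple percentile-based linear stretch. The paper does not specify the exact adjustment used, so this is only one plausible choice, with hypothetical names and default percentiles:

```python
import numpy as np

def stretch_contrast(region, lo_pct=2.0, hi_pct=98.0):
    """Linearly stretch the grey levels of the target region and its
    neighbourhood to the full 8-bit range before feature matching.

    region : 2-D uint8 array covering the target and adjacent area.
    lo_pct, hi_pct : percentiles clipped away to suppress outliers.
    """
    lo, hi = np.percentile(region, (lo_pct, hi_pct))
    if hi <= lo:                      # flat region: nothing to stretch
        return region.copy()
    out = (region.astype(np.float64) - lo) * (255.0 / (hi - lo))
    return np.clip(out, 0, 255).astype(np.uint8)
```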


Figure 4. Flow chart of our method for application.


To verify the passive ranging algorithm based on Eqn. (25), we conducted reduced-model experiments at a scale of 1:2300. As space is limited, Fig. 5 presents only the 16 even-numbered frames of the sequence. The photographing conditions and measurement results are given in Table 1. As can be seen in Fig. 5, the target's appearance changes considerably; even so, the ranging error remains acceptable.


Figure 5. Experiment image sequence A; (2), (4), ..., (32) are frame numbers.


All the pictures in this paper were taken with a Sony ExwavePRO CCD of 768 × 576 pixels; the aircraft in the pictures is an F-16 model 24 cm in length, and background removal was performed before the experiment. For work in natural scenes, we use a moving-target tracking technology from VisionLab23, which makes contour detection and background removal straightforward. Moreover, our method requires only three matched points, the minimum requirement in image tracking.

In this experiment, the aircraft moves along an arc while the surveyor moves in a straight line; the image sequence shows a significant change in the target's attitude. In Table 1, the relative error of the target's distance estimate is less than ±4 % in most cases, the largest relative error being 7.81 %; such errors can meet the demands of practical application.

For a static observer, ranging Eqn. (14) can still be used to estimate the distance. The results fully show that our improved algorithm adapts to changes in the target's attitude. As a control, another group of pictures, sequence B, is given in Fig. 6, with the corresponding ranging errors in Table 2; only the odd-numbered frames are presented. The errors in Table 2 are smaller than those in Table 1, owing to the smaller change in the target's attitude in Fig. 6. In most cases, experimental errors lie between those of Table 1 and Table 2.


Figure 6. Experiment image sequence B; (1), (3), ..., (41) are frame numbers.



Table 1. Data and ranging result of experiment A




Table 2. Data and ranging result of experiment B*



We proposed an algorithm for passive ranging of non-cooperative targets, in which distance estimation is reduced to solving a quadratic equation built from the target's imaging features and the camera positions. In contrast with the former algorithm, which solves the quartic Eqn. (15), solving a quadratic equation avoids pathological solutions such as complex or negative roots. Hence the new algorithm is of much greater practical value than the former ones. Theoretical analysis indicates that the new algorithm always has a solution provided the observer platform has non-zero displacement between adjacent sampling instants. The algorithm was also examined by indoor reduced-model experiments, which show that it can be implemented in practical passive ranging with a relative ranging error of less than ±4% in most cases. We also showed that the distance estimate takes a much simpler mathematical form for a moving observer.

This work was supported by the National Natural Science Foundation of China under Grant No. 60872136 and by the Natural Science Basic Research Plan in Shaanxi Province of China (Program No. 2011JM8002). The authors thank the anonymous reviewers for their valuable advice toward improving this article.

1. Reilly, J.P.; Klein, T. & Ilver, H. Design and demonstration of an infrared passive ranging. Johns Hopkins APL Technical Digest, 1999, 20(2), 1854-1859.

2. Suhr, J.K.; Jung, H.G.; Bae, K.H. & Kim, J.H. Monocular motion stereo-based free parking space detection apparatus and method. US Patent 8134479, 13 March 2012.

3. Olson, C.F. & Abi-Rached, H. Wide-baseline stereo vision for terrain mapping. Machine Vision Appl, 2010, 21(5), 713-725.[Full text via CrossRef]

4. Hewson, R. Taurus KEPD 350 (KEPD 150). Jane’s Air-Launched Weapons (Air-to-surface Missiles-Stand-off and Cruise). 24 April, 2012. (Accessed on 12 September, 2012).

5. Lepetit, V. & Fua, P. Monocular Model-Based 3D Tracking of Rigid Objects: A Survey. Foundations Trends Comput. Graphics Vision, 2005, 1(1), 1-89.[Full text via CrossRef]

6. Tuytelaars, T. & Mikolajczyk, K. Local invariant feature detectors: a survey. Foundations Trends Comp. Graphics Vision, 2008, 3(3), 177-280.[Full text via CrossRef]

7. Yilmaz, A.; Javed, O. & Shah, M. Object Tracking: A survey. ACM Computing Surveys, 2006, 38(4), pp. 1-45. [Full text via CrossRef]

8. Newcombe, R.A.; Davison, A.J. & Izadi, S. KinectFusion: Real-time dense surface mapping and tracking. In the IEEE International Symposium on Mixed and Augmented Reality (ISMAR'11), Basel, Switzerland, October 2011, pp. 127-136.[Full text via CrossRef]

9. Zhang, Z.; Huang, Y.; Li, C. & Kang, Y. Monocular Vision Simultaneous Localization and Mapping using SURF. In 7th World Congress on Intelligent Control and Automation (WCICA’ 2008), Chongqing, China, June 2008, pp. 1651-1656. [Full text via CrossRef]

10. Baker, P. & Kamgar-Parsi, B. Using shorelines for autonomous air vehicle guidance. Comp. Vision Image Understanding, 2010, 114(6), 723-729.[Full text via CrossRef]

11. Le, M.H. & Jo, K.H. Building detection and 3D reconstruction from two-view of monocular camera. In Computational Collective Intelligence. Technologies and Applications, Berlin Heidelberg, September 2011, pp. 428-437.[Full text via CrossRef]

12. Cannons, K. & Wildes, R. P. A Unifying Theoretical Framework for Region Tracking, York University Technical Report, CSE-2013-04, February 8, 2013.

13. Xiong, T. & Debrunner, C. Stochastic car tracking with line-and colour-based features. IEEE Trans. Intell. Transp. Syst., 2004, 5(4), 324-328.[Full text via CrossRef]

14. Zhang, Y.; Wang, Y. & Qu, H. Rotation and Scaling Invariant Feature Lines for Image Matching. In the 2011 International Conference on Mechatronic Science, Electric Engineering and Computer, Jilin, China, August 2011, pp.1135-1138.[Full text via CrossRef]

15. Fu, X.; Liu, S. & Li, E. A real time image sequence procession algorithm for target ranging. In Proceedings of SPIE 6279: High-Speed Photography and Photonics, Xi'an, China, 2006, Part 2, p. 62793A.

16. Troiani, C. & Martinelli, A. Vision-aided inertial navigation using virtual features. In the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 2012, pp. 4828-4834.[Full text via CrossRef]

17. Priyanka, D.D. & Dhok, G.P. Analysis of distance measurement system of leading vehicle. Inter. J. Instrumentation Control Sys., 2012, 2(1), 11-23.[Full text via CrossRef]

18. Smith, P.; Reid, I. & Davison, A. Real-Time Monocular SLAM with Straight Lines. In the Proceeding of 2nd International Symposium on Visual Computing, Edinburgh, GB, November 2006. pp. 1-10.[Full text via CrossRef]

19. Rosten, E. & Drummond, T. Fusing points and lines for high performance tracking. In Proceedings of the 10th IEEE International Conference on Computer Vision, Beijing, China, October 2005, 2, pp. 1508-1515.[Full text via CrossRef]

20. Ansar, A. & Daniilidis, K. Linear pose estimation from points or lines. IEEE Trans. Pattern Anal. Mach. Intell., 2003, 25(4), 1-12.

21. Von Gioi, R. G.; Jakubowicz, J.; Morel, J. M. & Randall, G. LSD: A fast line segment detector with a false detection control. IEEE Trans. Pattern Anal. Mach. Intell., 2010, 32(4), 722-732.[Full text via CrossRef]

22. Gong, J.; Fan, G.; Yu, L.; Havlicek, J. P. & Chen, D. Joint view-identity manifold for target tracking and recognition. In the 19th IEEE International Conference on Image Processing (ICIP), Orlando FL, USA, September 2012, pp.1357-1360.[Full text via CrossRef]

23. VisionLab VCL + Source code 4.5, URL:http://visionlab-vcl-source-code.en.softonic.com/.

Dr Xiaoning Fu received his PhD in Technology Physics from Xidian University in 2005. He is currently working at School of Electromechanical Engineering, Xidian University. He is in charge of photoelectric detection technology and system, video signal processing and electronic countermeasures in Xidian University. His research interests include imaging detection, signal processing, electro-optic ranging and countermeasure.

Lixia Wang received her bachelor degree in Automation from Anhui Polytechnic University in 2011. She is now pursuing her master’s degree in College of Electromechanical Engineering, Xidian University. Her research interests include photoelectric guidance system and electro-optic countermeasure.