c++ - How to get the distance of an object and how to use the camera calibration matrix correctly?


I successfully calibrated my camera using OpenCV. The camera lens I am using: https://www.baslerweb.com/en/products/vision-components/lenses/basler-lens-c125-0418-5m-f1-8-f4mm/

The internal and external camera parameters are given below.

```cpp
cv::Mat cameraMatrix(3, 3, cv::DataType<double>::type);
cameraMatrix.at<double>(0) = 1782.80;                 // fx // 432.2 in mm
cameraMatrix.at<double>(1) = 0;
cameraMatrix.at<double>(2) = 3.0587694283633488e+002; // cx
cameraMatrix.at<double>(3) = 0;
cameraMatrix.at<double>(4) = 1782.80;                 // fy
cameraMatrix.at<double>(5) = 3.0535864258476721e+002; // cy
cameraMatrix.at<double>(6) = 0;
cameraMatrix.at<double>(7) = 0;
cameraMatrix.at<double>(8) = 1;

cv::Mat distCoeffs(1, 5, cv::DataType<double>::type);
distCoeffs.at<double>(0) = -8.1752937039996709e-001; // k1
distCoeffs.at<double>(1) = -2.5660653367749450e+001; // k2
distCoeffs.at<double>(2) = -1.5556922931812768e-002; // p1
distCoeffs.at<double>(3) = -4.4021541217208054e-002; // p2
distCoeffs.at<double>(4) = 1.5042036073609015e+002;  // k3
```

I know the formula used to calculate the distance of an object, but I am confused about how to use it properly.

[image: formula for computing the object distance]

The resolution of the camera is 640x480.

Focal length = 1782.80 (px). I do not know how to correctly convert it to mm.

I know the focal length is the distance from the sensor to the image plane. What does this value represent? Does a pixel unit represent a dot on the screen?

The object I am using is a circle. Radius = 22 px (width and height 44x44). Circle center point: (300, 300) (x, y).

Sensor height: I do not know how to get it?

Where do I use the principal points?

How do I get the distance from the camera to the object? How do I get the real-world coordinates of the circle?

I know this is a lot to ask. I have been trying for one month and have not found a proper solution.

I use the solvePnP function to get the camera translation and rotation matrix, but I have a problem: how do I calculate the object points?

Your cx and cy seem wrong, because they should be roughly half the resolution: 640/2 and 480/2. fx and fy come out of the calibration process in pixel units. To convert them to mm, use this formula:

```
fx (px) = (image width in px)  x (focal length in mm) / (sensor width in mm)
fy (px) = (image height in px) x (focal length in mm) / (sensor height in mm)
```

When you calibrate the camera, use these formulas to make sure you have the right values. To me your cx and cy look wrong, because they should represent the center of the image (and they shouldn't be nearly equal unless the image is square, which is not the case here). For fx and fy I can't tell, because I don't know the CCD of your camera; they can be equal if the CCD pixels are square.

Don't change the parameters manually; let the calibration software compute them.

Now that you have the parameters, how do you compute the distance?

The formula you presented is not useful, in the sense that if you can already measure the real height, you can most likely also measure the distance (at least in your case). That is exactly why you are using a camera!

So to compute the distance in the real world, you need two more things: the extrinsic parameters (your cameraMatrix holds the intrinsic parameters) and at least 4 points (the more points the better) in real-world coordinates. Once you have those, you can use the solvePnP function to find the pose of the object. The pose represents the translation and rotation with respect to the camera frame.

http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#solvepnp 

This piece of code can do that:

```cpp
// four points in real-world (x, y, z) coordinates
vector<Point3f> vec3d;
vec3d.push_back(Point3f(0, 0, 0));
vec3d.push_back(Point3f(0, 211, 0));
vec3d.push_back(Point3f(295, 211, 0));
vec3d.push_back(Point3f(295, 0, 0));
```

z = 0 because the real points lie in a plane.

```cpp
// the same 4 points in the image plane; there is no z and they are in pixel units
vector<Point2f> vec2d;
vec2d.push_back(Point2f(532, 412)); // (x, y)
vec2d.push_back(Point2f(583, 594));
vec2d.push_back(Point2f(927, 535));
vec2d.push_back(Point2f(817, 364));

// the pose of the object: rvec is the rotation vector, tvec the translation vector
cv::Mat rvec, tvec;
solvePnP(vec3d, vec2d, cameraMatrix, distCoeffs, rvec, tvec);
```

Finally, you can compute the real distance from tvec as the Euclidean distance: d = std::sqrt(tx*tx + ty*ty + tz*tz).

To your questions:

Sensor height: I do not know how to get it?

Look up the camera specification on the internet or in the manual and you'll find it.

Where do I use the principal points?

They are intrinsic parameters. You are not going to use them separately.

How do I get the distance from the camera to the object? How do I get the real-world coordinates of the circle?

I explained this above. You need at least 4 points, and your circle gives you only 1, which is not enough to compute the pose.

But I have a problem: how do I calculate the object points?

objectPoints in solvePnP are the real-world coordinates. For example, a chessboard has corners, and you know the exact position in mm of each one with respect to the world frame you choose on the chessboard. The origin can be the top-left corner or anywhere else, with z = 0 because the chessboard is printed on paper, just like your circle!

EDIT:

You can find more specifications in the manual (page 13, linked here). It says the pixel size is 7.4 x 7.4 µm:

```
f (mm) = f (px) x pixel_size (mm)  =>  f (mm) = 1782.80 x 7.4e-3 ≈ 13.19 (mm)
```

which is not 4 mm!! You need to do the calibration again; something is wrong!

3D points: vector<Point3f> vec3d; vec3d is going to store the 3D coordinates of the points. I gave an example where the first point is the origin:

```cpp
vec3d.push_back(Point3f(0, 0, 0)); // (x, y, z)
```

EDIT 3:

If you take a pattern like this:

[image: calibration pattern of circles]

then choose, for example, the circle in the top-left (or top-right) corner and give it the coordinate (0, 0, 0): that is your origin. The circle next to it is your second point, with coordinate (x, 0, 0), where x is the distance in mm between the two circles. Do the same for 4 points in your pattern. You can choose any pattern you want, as long as you can detect it in the image and retrieve its coordinates in pixels. If you still don't understand, I advise you to take a course on projective geometry and camera models, so you can understand what every parameter means.

