Robot navigation depends on accurate scene analysis from camera data. This paper investigates the refinement of inherently falsified depth maps generated by a 3D SwissRanger camera, which emits beams of modulated infrared light through a channel affected by environmental noise. The SR4000 time-of-flight camera produces streams of depth maps projected as 2.5D images on an x-y plane, which are refined using a dynamic convolution filter method coupled with a hypergraph-type model. Our findings indicate that the range of the camera is experimentally confirmed as nine metres; extreme impulse-noise pixel values are detected outside this range, while uniform noise in valid pixel values affects the depth maps of objects formed within it. Decreasing the filtering window size to the pixel level reduces both the falsification of corrupted frames and the dominant effect of noise pixels to an acceptable level. In the absence of complementing the time-of-flight (ToF) camera with other camera types, our approach produces reliable depth maps that are promising for field work in terms of visual quality, mean squared error (MSE), root mean squared error (RMSE), and peak signal-to-noise ratio (PSNR).
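The paper's refinement uses a dynamic convolution filter coupled with a hypergraph-type model, which is not reproduced here. The sketch below is a minimal stand-in, assuming a simple windowed median replacement of out-of-range pixels and the stated nine-metre range, only to illustrate how the reported metrics (MSE, RMSE, PSNR) would be computed on a depth map before and after refinement; the function names, window size, and synthetic scene are illustrative assumptions.

```python
import numpy as np

def refine_depth_map(depth, max_range=9.0, window=3):
    """Replace out-of-range (impulse-noise) pixels with the median of valid
    neighbours in a small window. Illustrative stand-in only; it does not
    implement the paper's dynamic convolution / hypergraph-type method."""
    refined = depth.copy()
    invalid = (depth <= 0.0) | (depth > max_range)
    h, w = depth.shape
    r = window // 2
    for y, x in zip(*np.nonzero(invalid)):
        patch = depth[max(0, y - r):min(h, y + r + 1),
                      max(0, x - r):min(w, x + r + 1)]
        valid = patch[(patch > 0.0) & (patch <= max_range)]
        if valid.size:
            refined[y, x] = np.median(valid)
    return refined

def mse(a, b):
    return float(np.mean((a - b) ** 2))

def rmse(a, b):
    return mse(a, b) ** 0.5

def psnr(a, b, peak=9.0):
    # Peak value taken as the assumed 9 m camera range.
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.full((120, 160), 3.0)                      # synthetic flat scene at 3 m
    noisy = clean + rng.uniform(-0.02, 0.02, clean.shape) # uniform noise on valid pixels
    spikes = rng.random(clean.shape) < 0.05               # 5% impulse (out-of-range) pixels
    noisy[spikes] = 15.0
    refined = refine_depth_map(noisy)
    print(f"MSE  before/after: {mse(clean, noisy):.4f} / {mse(clean, refined):.4f}")
    print(f"RMSE before/after: {rmse(clean, noisy):.4f} / {rmse(clean, refined):.4f}")
    print(f"PSNR before/after: {psnr(clean, noisy):.2f} dB / {psnr(clean, refined):.2f} dB")
```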
Reference:
Osunmakinde, I.O. 2010. Refinement of falsified depth maps for the SwissRanger time-of-flight 3D camera on autonomous robots. In: 21st Annual Symposium of the Pattern Recognition Association of South Africa (PRASA 2010), Stellenbosch, South Africa, 22-23 November 2010, 6 pp. http://hdl.handle.net/10204/4710