1. Introduction

Three-dimensional (3D) scanners are used in an increasing number of applications, and range accuracy is the main challenge for manufacturers. Knowing this accuracy helps users better exploit the results and the information delivered by a scanner. When the manufacturer does not provide enough information, the user has to characterize the scanner before using it. This letter describes preliminary results we have obtained during the characterization of a commercial 3D scanner based on structured light.1 More precisely, we focus on the study of range accuracy with respect to the illuminant, using a reproducible experimental setup. This letter is organized as follows: Section 2 presents related works. The scanner and the experimental setup are described in Sec. 3. We present and discuss the results in Sec. 4. Conclusions and future works are given in Sec. 5.

2. Related Works

Two categories of methods emerge from the literature on range accuracy: methods based on the optical transfer function2, 3, 4 and methods based on the measurement of known objects.5, 6, 7, 8 We have also reviewed color accuracy characterization methods, which can likewise be classified in two categories: colorimetric-based camera characterization9 and spectral-based characterization.10, 11, 12 Building on this knowledge, we have investigated the influence of color on the range accuracy of a 3D scanner based on structured light. During our initial acquisitions, we also observed that ambient light influences the color information, which is consistent with results from two-dimensional imaging experiments. So far, this phenomenon has not been studied for 3D scanners based on structured light.

3. Experiments

Usually, 3D scanners are chosen based on their ability to digitize under particular conditions. Notably, the scanner used in this study is designed to capture objects under ambient illuminant and gives a 3D textured mesh as output.
It is based on structured light; more details can be found in Refs. 13, 14. Basically, the projected pattern is composed of vertical spatiotemporally modulated stripes, and the field of view is a boxlike area, , whose center is away from the front of the scanner. The setup was constant for all the experiments. It consisted of placing a Macbeth ColorChecker chart, a color grid composed of 24 colored patches, in a light booth displaying different illuminants. The chart was placed in the light booth following the proprietary recommended setup, as perpendicular as possible with respect to the scanner. In addition, we fixed the chart on a special support to keep it flat. The experiments were done sequentially without modifying the system setup; the only varying parameter was the illuminant.

4. Results and Discussion

To evaluate the results for each patch, we manually selected the faces in the middle area ( instead of per patch) to avoid human perception bias and possible inaccurate overlaying of the color information. For a concise presentation, Fig. 1 only shows the results of 2 acquisitions among 7 and a single patch among 24.

Fig. 1 Graphical representations of the bluish-green patch of the Macbeth ColorChecker under two different illuminants: (a) dark night 1, (b) daylight.

For meaningful measurements of the range accuracy, we had to know the exact orientation of the chart with respect to the scanner. Due to the design of the commercial scanner we used, we could not perfectly know its relative position. Therefore, we statistically chose a reference patch and considered its orientation to be the same as that of the chart with respect to the scanner. From this reference patch, we obtained a reference plane P_R that we used to compute the geometric deviation Δ for each patch under each illuminant using Eq. (1):

Δ = (1/N) Σ_{i=1..N} d(p_i, P_R).   (1)

In this equation, d(p_i, P_R) is the signed distance between the reference plane P_R and a point p_i on the patch surface.
The deviation Δ shows the range accuracy of the scanner because it represents the deviation of the measured points from their theoretical positions; the results are shown in Table 1. Δ would be equal to 0 if the scanner were perfect.

Table 1 The deviation Δ of each patch under each illuminant with respect to the reference plane P_R (given in millimeters).
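The computation behind this deviation can be sketched in code. The following is a minimal illustration, assuming Δ is the mean signed point-to-plane distance and that the reference plane is fit to the reference patch by least squares; the function names and the synthetic points are hypothetical, not taken from the letter.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an (N, 3) array of points.
    Returns (centroid, unit normal); the normal is the singular
    vector associated with the smallest singular value."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def deviation(patch_points, ref_centroid, ref_normal):
    """Mean signed distance from patch points to the reference plane P_R."""
    return float(np.mean((patch_points - ref_centroid) @ ref_normal))

# Synthetic check: a patch lying 0.5 above the z = 0 reference plane.
ref_patch = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
c, n = fit_plane(ref_patch)
patch = ref_patch + np.array([0.0, 0.0, 0.5])
delta = deviation(patch, c, n)  # magnitude 0.5; sign depends on normal orientation
```

A perfectly scanned patch coplanar with the reference plane would yield a deviation of zero; the sign ambiguity of the SVD normal only flips the sign of Δ, not its magnitude.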
We have observed that the illuminant influences the range accuracy to a degree that depends on the original color. As we can see in Table 1, the daylight illuminant creates a small deviation equal to for the neutral 6.3 patch and a large deviation equal to for the white 9.5 patch. We have also observed that the deviation range varies with the illuminant. For instance, the dark night 1 illuminant induces a range of from for the neutral 6.3 patch to for the black 2 patch; the horizon illuminant induces a huge range of from for the neutral 6.3 patch to for the light skin patch; and the cool white illuminant implies a range of from for the blue flower patch to for the white 9.5 patch. In addition, we have observed that a systematic error appears for certain colors under certain illuminants, as shown in Fig. 1. We denote by systematic error the repetitive wavy effect visible in Fig. 1(b) along the patch surface, in distinct contrast to the quasi-flat appearance of the same colored patch under a different illuminant in Fig. 1(a). This error seems to come from the projected pattern itself and to depend on the color, but more statistical studies are necessary to be conclusive. However, this phenomenon can be physically explained with the response model of the digital camera given in Eq. (2):

ρ_k(x) = ∫_Λ E(λ) S(x, λ) Q_k(λ) dλ.   (2)

This equation represents the digital camera response ρ_k(x) for each channel k (in our case, the three red-green-blue channels) at the pixel x with respect to the spectral power distribution E(λ) (how a light source is distributed across the different wavelengths), the surface reflectance S(x, λ) (the amount of light reflected by the surface) at the pixel x, and the spectral sensitivity Q_k(λ) of the sensor (sensor sensitivity with respect to wavelength), over the visible spectrum Λ. In our case study, the spectral power distribution, Eq. (3), is composed of two components, the illuminant spectral power distribution E_i(λ) and the projector spectral power distribution E_p(λ):

E(λ) = E_i(λ) + E_p(λ).   (3)

As long as a scanner uses only one wavelength to reconstruct the 3D information, E(λ) is the same for each pixel under a given illuminant. This case was investigated by Clark and Robson.6 When different wavelengths are projected or, in our case, vertical stripes of different gray levels, E(λ) varies nonlinearly from one pixel column to another. The correspondence between the wavelength and the angle is then no longer the one established by the manufacturer's calibration. Therefore, the computation of the depth information, the distance between a point on the object surface and the sensor, is no longer accurate; this depth is computed with Eq. (4), which is used for standard triangulation. To summarize, a false correspondence between the wavelength and the angle leads to a wrong computation of the depth information.

5. Conclusion and Future Work

In this letter, we have shown that illumination exhibits a strong influence on the range accuracy of structured light scanners, in addition to its well-known influence on color accuracy. We have statistically evaluated the orientation of the Macbeth ColorChecker with respect to the scanner to estimate the range accuracy as the deviation Δ. We have proposed a physical explanation for the systematic error observed on some colored patches. Future work will investigate this systematic error in more detail to define possible reduction or elimination actions.

Acknowledgments

This work is supported by the University Research Program in Robotics under Grant No. DOE-DE-FG52-2004NA25589 and by the DOD/RDECOM/NAC/ARC Program under Grant No. W56HZV-04-2-2001.

References

1. J. Salvi,
J. Pages, and J. Battle, "Pattern codification strategies in structured light systems," Pattern Recogn. 37(2), 827–849 (2004). https://doi.org/10.1016/j.patcog.2003.10.002
2. S. Dore and Y. Goussard, "Experimental determination of CT point spread function anisotropy and shift-variance," 788–791 (1997).
3. M. Goesele, C. Fuchs, and H.-P. Seidel, "Accuracy of 3D scanners by measurement of the slanted edge modulation transfer function," 37–44 (2003).
4. S. E. Reichenbach, S. K. Park, and R. Narayanswamy, "Characterizing digital image acquisition devices," Opt. Eng. 30(2), 170–177 (1991). https://doi.org/10.1117/12.55783
5. J.-A. Beraldin and M. Gaiani, "Evaluating the performance of close range 3D active vision systems for industrial design applications," Proc. SPIE 5665, 7–77 (2005).
6. J. Clark and S. Robson, "Accuracy of measurements made with a Cyrax 2500 laser scanner against surfaces of known colour," 1031–1036 (2004).
7. S. El-Hakim, J.-A. Beraldin, and F. Blais, "A comparative evaluation of the performance of passive and active 3D vision systems," Proc. SPIE 2646, 14–25 (1995).
8. G. Sansoni, M. Carocci, and R. Rodella, "Calibration and performance evaluation of a 3D imaging sensor based on the projection of structured light," IEEE Trans. Instrum. Meas. 49(3), 628–636 (2000). https://doi.org/10.1109/19.850406
9. T. Johnson, "Methods for characterizing colour scanners and digital cameras," Displays 16(4), 183–192 (1996). https://doi.org/10.1016/0141-9382(96)01012-8
10. G. D. Finlayson, S. Hordley, and P. M. Hubel, "Recovering device sensitivities with quadratic programming," J. Imaging Sci. Technol. 6, 90–95 (1998).
11. J. Y. Hardeberg, H. Brettel, and F. Schimitt, "Spectral characterization of electronic cameras," 100–109 (1998).
12. L. MacDonald and W. Ji, "Colour characterization of a high-resolution digital camera," J. Imaging Sci. Technol. 1, 433–437 (2002).
13. J. Geng, P. Zhuang, P. May, S. Yi, and D. Tunnell, "3D FaceCam™: A fast and accurate 3D facial imaging device for biometrics applications," Proc. SPIE 5404, 316–327 (2004).
14. Z. J. Geng, "Rainbow 3-dimensional camera: New concept of high-speed 3-dimensional vision system," Opt. Eng. 35(2), 376–383 (1996). https://doi.org/10.1117/1.601023