
Generating NURBS Surfaces 
through 3D Silhouette Scanning


 
Christopher Nielsen

 

 

 



 



Project Information

Abstract (Project Summary)

In this project a 3D (three-dimensional) silhouette scanning system is proposed and implemented that allows the automatic computational generation of 3D NURBS (Non-Uniform Rational B-Spline) surfaces representing real world objects for use in 3D computer graphics applications. The system was designed to accurately describe a real world object as a NURBS surface based only on the silhouette images recorded of the object as it rotates.

 

VSF 2008 Project Information

Grade Grouping: 7-9 (Grade 9)
Team Size: 1
Subject Area: Engineering/Computer Science
Language: English
Project Type: Descriptive Type III (Engineering)

Software Tools

  • Website:
    • Microsoft Expression Web
    • Adobe Photoshop CS2
    • Adobe Acrobat 6
    • Microsoft Office
  • Project:
    • Matlab
    • Maya

Special Skills

  • Website:
    • HyperText Markup Language (HTML)
    • Cascading Style Sheets (CSS)
  • Project:
    • Knowledge of Matlab
    • Knowledge of Maya
    • Software Debugging
    • Hardware Calibration
    • Algorithm Design and Testing

 

Hardware Tools

  • Website:
    • Acer Desktop PC with Windows Vista
    • Olympus Stylus digital camera
  • Project:
    • Acer Desktop PC with Windows Vista
    • Input/output cables
    • Olympus digital camera
    • Stepper motor
    • LCD monitor
    • Signal generator
    • Various components

Source of Idea

For quite some time I have been interested in modeling 3D objects in 3D applications. I have come to realize that modeling 3D objects is a very repetitive process and can be very time consuming, so I wanted to find a way to automate the modeling process. As an initial solution I researched commercial 3D scanners, but they were prohibitively expensive. So I decided to build my own.

Awards Received

  • Canada Wide Science Fair (Ottawa, Ontario, May 2008)
    • Place on Team Calgary, awarded by the Calgary Youth Science Fair (the Canada Wide fair has not yet taken place)

 

 



Problem

This project was originally developed as a solution to a problem with modeling 3-dimensional (3D) non-uniform rational B-spline (NURBS) surfaces. NURBS are 3D surfaces generated through mathematical interpolation and are used in a variety of computer graphics applications. Modeling 3D NURBS surfaces by hand is generally a monotonous and repetitive process, so the construction of a custom scanning system was undertaken to automate the surface modeling process.




Purpose

In this project an inexpensive 3D silhouette scanning system was proposed and implemented for the purpose of automatically generating 3D NURBS surfaces from real world 3D objects.

 



Hypothesis

A NURBS surface of a real world 3D object can be accurately generated from a series of 2D silhouette images taken of the object as it is being rotated.



Background

 

3D Silhouette Scanning

3D scanning is conceptually the process of representing a real world object as a matrix of 3D coordinate points or a 3D mesh. A wide variety of image processing techniques are applied when scanning a 3D object. A 3D silhouette scanner examines the exterior contour or silhouette of an object to deduce how the object should appear in 3D space. Figure 1 illustrates the design of the 3D silhouette scanner implemented for this project. Figure 2 displays how the silhouette of an object would look from the camera's perspective.  


Figure 1.  3D Silhouette Scanner Configuration[11]



Figure 2. Silhouette of Object from the Camera’s Perspective in Figure 1[11]

 

NURBS Surfaces

In many 3D applications NURBS are used to represent a wide range of objects because of their ability to accurately describe smooth, flowing curvature. NURBS curves are splines generated by mathematically interpolating between control vertices (3D points)[5]. NURBS 3D surfaces are generated by interpolating between NURBS curves. Figure 3 shows a series of NURBS curves with the purple dots representing the control vertices. Figure 4 shows the surface resulting from the interpolation of the curves shown in Figure 3.

Figure 3. Series of NURBS Curves[11]


Figure 4. NURBS Surface Generated from Curves[11]
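
To make the interpolation concrete, the short Matlab sketch below evaluates a simple B-spline curve, the non-rational special case of a NURBS curve in which all control vertex weights are equal, using the Cox-de Boor recursion. The control vertices, degree and knot vector are illustrative values and are not taken from this project, which relies on Maya's NURBS engine for the actual interpolation.

function bsplineDemo
% Evaluate and plot a degree-3 B-spline curve from five control vertices
P = [0 1 2 3 4;                  % control vertex x coordinates
     0 2 0 2 0];                 % control vertex y coordinates
d = 3;                           % curve degree
n = size(P,2);                   % number of control vertices
% Clamped knot vector (n+d+1 knots) so the curve touches the end vertices
u = [zeros(1,d) linspace(0,1,n-d+1) ones(1,d)];
t = linspace(0, 1-1e-6, 200);    % parameter samples (stay below final knot)
C = zeros(2, numel(t));
for k = 1:numel(t)
    for i = 1:n
        C(:,k) = C(:,k) + basis(i,d,t(k),u)*P(:,i);
    end
end
plot(C(1,:), C(2,:), 'b-', P(1,:), P(2,:), 'mo--');
legend('B-spline curve','control vertices');
end

function N = basis(i,d,t,u)
% Cox-de Boor recursion for the i-th B-spline basis function of degree d
if d == 0
    N = double(u(i) <= t && t < u(i+1));
else
    a = 0; b = 0;
    if u(i+d) > u(i)
        a = (t - u(i))/(u(i+d) - u(i)) * basis(i,d-1,t,u);
    end
    if u(i+d+1) > u(i+1)
        b = (u(i+d+1) - t)/(u(i+d+1) - u(i+1)) * basis(i+1,d-1,t,u);
    end
    N = a + b;
end
end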

 

Thresholding

Thresholding involves examining each pixel's color intensity in an image. If the intensity is above a certain threshold, that pixel is given a value of 1; likewise, if it is below, it is given a value of 0. This technique can be used to remove background noise and distortion in images. The left side of Figure 5 shows an image before thresholding; the right side of Figure 5 shows the image after thresholding.

 

   
Figure 5. Example of Thresholding[11]
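
The threshold() helper called by the program in the Computer Programs section below is not listed on this page. The following Matlab sketch shows only the core thresholding operation under the assumption of a single global intensity level; the project's helper additionally takes region-of-interest bounds.

function BW = simpleThreshold(I, level)
% simpleThreshold returns a binary image: 1 where the intensity of I is
% above LEVEL and 0 elsewhere (a minimal stand-in for the project's
% threshold() helper, not its exact implementation)
if ndims(I) == 3
    I = rgb2gray(I);       % collapse RGB video frames to one channel
end
BW = double(I) > level;    % compare every pixel against the threshold
end

For example, BW = simpleThreshold(frame, 70); would binarize one video frame at the intensity level used in the project code.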

 

Radon Transform

The Radon transform is an image processing technique that uses a series of 2D projections of an object from different rotation angles to generate a sinogram. A set of sinograms is produced, one for each vertical slice section of the object[7][6]. The 2D silhouette images are equivalent to measuring the absorption of particles passing through an object: an emitter is placed on one side of the object and a sensor on the opposite side, and the sensor records the absorption as the object is rotated, as seen in Figure 6.

 

Figure 6. Overview of Radon Transform[11]

The left side of Figure 7 shows a thresholded image of an object. The green line running through the image represents a horizontal slice. The right side of Figure 7 shows the sinogram of the object outputted by the Radon transform with the green line representing the slice indicated in the left side of Figure 7.


   
Figure 7. Radon Transform of Object[11]
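
Matlab's Image Processing Toolbox also provides the Radon transform directly. The sketch below illustrates the idea on the toolbox's built-in phantom test image rather than this project's data (the project instead assembles its sinograms from the recorded silhouette frames, as the Computer Programs section shows):

P = phantom(128);            % built-in head phantom test image
theta = 0:359;               % one projection per degree of rotation
R = radon(P, theta);         % sinogram: one column per rotation angle
imagesc(theta, 1:size(R,1), R); colormap gray;
xlabel('Rotation angle (degrees)'); ylabel('Position along detector');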

 

Inverse Radon Transform

The inverse Radon transform is an image processing technique that uses the sinograms generated by the Radon transform to reconstruct a projection of the object at each slice location[8]. The process the inverse Radon transform uses is called filtered back projection, which works by projecting the slice data in the sinogram back along the rotation angle at which it was recorded. Filtering is then used to minimize the noise in the newly generated projections. The left side of Figure 8 shows a sinogram outputted by the Radon transform; the right side of Figure 8 shows the projection generated by the inverse Radon transform of that sinogram.


Figure 8. Inverse Radon Transform[11]
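
Continuing the sketch from the Radon transform section, the slice can be reconstructed from its sinogram with Matlab's iradon function, which performs the filtered back projection described above. The Ram-Lak filter named here is one common choice; the project code instead calls iradon with filtering disabled and thresholds the result afterwards.

slice = iradon(R, theta, 'linear', 'Ram-Lak');   % filtered back projection
imagesc(slice); axis image; colormap gray;
title('Slice reconstructed by the inverse Radon transform');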



Canny Edge Detection
The Canny operator is considered an optimal edge detector because of its ability to find as many real edges in the image as attainable and to position those edges as close as possible to the true edges in the image[2][3]. The technique first minimizes noise by applying Gaussian filtering, which blurs the image and removes high intensity points. Next, the intensity gradient of the image is found by applying a series of masks to the blurred image to determine the edge strengths. From the gradient, the edge direction can be found by examining the gradient angles. The last step is edge thinning, where the edges are thinned and aligned to fit the real edges in the image as closely as possible. The left side of Figure 9 shows an image before Canny edge detection has been used; the right side of Figure 9 shows the image after Canny edge detection.


Figure 9. Canny Edge Detection[9][12]
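
In Matlab, Canny edge detection is available through the edge function that the project code uses. A minimal self-contained example on a sample image shipped with the toolbox:

I = imread('cameraman.tif');   % sample grayscale image shipped with Matlab
E = edge(I, 'canny');          % Gaussian smoothing, gradient, edge thinning
subplot(1,2,1); imshow(I); title('Input image');
subplot(1,2,2); imshow(E); title('Canny edges');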

 

Boundary Detection

Boundary detection is a method that can be used to find the exterior edges of an object in a binary image. This technique works by analyzing an image point's connectivity neighbours, as seen in the left side of Figure 10; by examining the neighbours of an image point, the next point on the boundary can be found. Notice how in the left side of Figure 10 the starting point of the boundary trace, coloured gray, has no surrounding neighbours with the same colour intensity value, which means no boundary would start at that point. The left side of Figure 11 shows a binary image of an object; the right side of Figure 11 shows the boundary of the object found using boundary detection.

                      
Figure 10. Neighbour Tracing[10]


 
Figure 11. Boundary Detection[11]
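
The bwtraceboundary routine used in the project code performs this neighbour tracing. A minimal self-contained example on a sample binary image shipped with the toolbox (the starting point is simply the first object pixel found in column order, so a westward initial search direction is valid):

BW = imread('circles.png');              % sample binary image shipped with Matlab
[r, c] = find(BW, 1);                    % first object pixel in column order
B = bwtraceboundary(BW, [r c], 'W', 8);  % trace the 8-connected boundary
imshow(BW); hold on;
plot(B(:,2), B(:,1), 'g', 'LineWidth', 2);   % overlay the traced boundary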


Maya
Maya is a commercial 3D application that can be used for 3D modeling and most aspects of 3D production. Maya has a very powerful NURBS modeling engine.


Matlab

Matlab is a technical computing software package that has a wide range of features including image processing and algorithm development.



Procedure

Hardware Overview

The hardware configuration for this project was designed as seen in Figure 12. This setup uses a stepper motor to rotate a 3D object through 360 degrees while a stationary camera, positioned directly in front of the object, records it along its trajectory. The camera can record a 20 second video at 15 frames per second, giving 300 frames, or 1.2 degrees of rotation per frame, over a full revolution. Directly behind the object, an LCD monitor was used as a uniform light source for generating the silhouette of the object.


Figure 12. Hardware Overview[11]


Algorithm Overview

The algorithm used to generate the 3D model from the silhouette images of an object was written in Matlab[1]. The steps that make up the algorithm, illustrated in Figure 13, are as follows.

1.  Bring video from camera into Matlab.

2.  Split video into separate frames and use thresholding function to minimize background noise.

3.  Generate a Radon transform sinogram matrix that contains each slice from all the rotations of the object.

4.  Use inverse Radon transform to reconstruct slice data stored in the sinogram as a projection image.

5.  Use thresholding function, edge detection and boundary detection on the projection images to locate where the exterior edges of the slice reside.

6.  Export each slice into Maya as a NURBS curve, and then generate a 3D surface by interpolating between the curves.



Figure 13. Algorithm Overview[11]

 

Post Processing

After the vertical slice projections were outputted by the inverse Radon transform, thresholding was used to remove the effects of the noise and minor distortion. Then edge detection was used on the thresholded projection images to isolate the exterior edges. Boundary detection was used to find the boundary that flows around the exterior detected edges. These boundaries were then exported as NURBS curves to the 3D application Maya where the curves were then used to generate a 3D surface representing the object[4].  

Experimentation

To address the hypothesis, two sets of experiments were conducted using the algorithm designed in this project. The first set used silhouette images of real objects recorded with the hardware configuration developed in this project. The second set used a 3D application to render silhouette images of prebuilt 3D models; this data was used because it was rendered in a perfect environment, allowing the algorithm to be tested to its full potential. Multiple objects were tested with both hardware and simulation generated images.

 

Computer Programs

Program written to generate the points representing the object

% Read in video data from camera
video = aviread('C:\N_3d\bottleExperiment\bottle.avi');

% Predefined variables
frames = 300;        % 20 s of video at 15 frames per second
xMin = 10;           % vertical extent of the object in each frame
xMax = 220;
yMin = 90;           % horizontal extent of the object in each frame
yMax = 225;
iRadFilter = 1.57;   % intensity cutoff applied to the iradon output

% Split video into separate frames and apply threshold
for j = 1:frames
    I(:,:,:,j) = video(j).cdata;
    BW(:,:,j) = threshold(I(:,:,:,j),70,xMin,xMax,yMin,yMax,1);
end

% Radon transform: each silhouette frame is already a projection, so the
% sinogram for every horizontal slice is assembled by rearranging the
% thresholded frames (rows: position along the slice, columns: frame)
for y = xMin:xMax
    for x = yMin:yMax
        for i = 1:frames
            R(x-yMin+1,i,y-xMin+1) = BW(y,x,i);
        end
    end
end

% Inverse Radon transform: reconstruct each slice by back projection;
% 360/frames is the rotation angle between consecutive frames
for i = 1:xMax-xMin
    II(:,:,i) = iradon(R(:,:,i),360/frames,'linear','none');
end

% Average filter: clamp reconstructed intensities above the cutoff
D = find(II > iRadFilter);
II(D) = 2;

% Thresholding and edge detection for slice projections
for i = 1:xMax-xMin
    BB(:,:,i) = threshold(II(:,:,i),2,0,0,0,0,2);
    EE(:,:,i) = edge(BB(:,:,i),'canny');
end

% Boundary detection: trace every closed edge in each 96x96 reconstructed
% slice and store it as a contour
for i = 1:length(EE)
    c = 1;
    for x = 1:96
        for y = 1:96
            if EE(x,y,i) == 1
                H = bwtraceboundary(EE(:,:,i),[x,y],'W',8,1000,'counterclockwise');
                if length(H) > 10   % ignore short traces caused by noise
                    contour(c,i).x = H(:,2);
                    contour(c,i).y = H(:,1);
                    % every point in a slice shares the same height
                    contour(c,i).z = linspace(abs(i-210),abs(i-210),length(H))';
                    contour(c,i).length = length(H);
                    % clear the traced edge so it is only recorded once
                    EE(contour(c,i).y(:),contour(c,i).x(:),i) = 0;
                    c = c + 1;
                end
            end
        end
    end
end

% Plot the point cloud
for i = 1:length(contour)
    plot3(contour(i).x(:),contour(i).y(:),contour(i).z(:),'.');
    hold on;
end

 

Program written to export the points representing the object as NURBS curves in a file that could be read into Maya

function savNurb(contour,name,degree)
% savNurb saves a group of points as NURBS curves in a Maya ASCII file
% contour contains the points representing the object in X,Y,Z
% name is the location where the resulting file is saved
% degree is the degree of the exported NURBS curves

% Maya ASCII scene header
S = strvcat('//Maya ASCII 7.0 scene','//Name: f.ma','//Last modified: Mon, Feb 25, 2008 10:55:08','requires maya "7.0";',...
    'currentUnit -l centimeter -a degree -t film;','fileInfo "application" "maya";','fileInfo "product" "Maya Unlimited 7.0";',...
    'fileInfo "version" "7.0.1";','fileInfo "cutIdentifier" "200511200915-660870";','fileInfo "osv" "Professional  (Build 6000)\n";');

G = '';
H = '';
for j = 1:length(contour)
    % Create one nurbsCurve node per contour
    G = strvcat(G,['createNode transform -n "curve',int2str(j),'";'],['createNode nurbsCurve -n "curveShape',int2str(j),'" -p "curve',int2str(j),'";'],...
        '   setAttr -k off ".v";',' setAttr ".cc" -type "nurbsCurve"',...
        ['       ',int2str(degree),' ',int2str(length(contour(j).x)-degree),' 0 no 3']);

    % Build the knot vector, repeating knot values at both ends so the
    % curve is clamped to its first and last control vertices
    c = 0;
    H = ['      ',int2str(length(contour(j).x)+2)];
    for i = 1:length(contour(j).x)+2
        if i < 4
            H = strcat([H,' ',int2str(c)]);
            continue;
        end
        if i > length(contour(j).x)-1
            H = strcat([H,' ',int2str(length(contour(j).x)-3)]);
            continue;
        end
        c = c + 1;
        H = [H,' ',int2str(c)];
    end
    H = strvcat(H,['        ',int2str(length(contour(j).x))]);

    G = strvcat(G,H);

    % Write the control vertices; Maya's Y axis points up, so the slice
    % height (z) is written as the second coordinate
    for i = 1:length(contour(j).x)
        G = strvcat(G,['        ',int2str(contour(j).x(i)),' ',int2str(contour(j).z(i)),' ',int2str(contour(j).y(i))]);
    end
    G = strvcat(G,'     ;');
end

% Maya ASCII scene footer
F = strvcat('createNode lightLinker -n "lightLinker1";','   setAttr -s 2 ".lnk";','select -ne :time1;','    setAttr ".o" 1;','select -ne :renderPartition;'...
    ,'  setAttr -s 2 ".st";','select -ne :renderGlobalsList1;','select -ne :defaultshaderList1;','  setAttr -s 2 ".s";','select -ne :postProcessList1;','   setAttr -s 2 ".p";'...
    ,'select -ne :lightList1;','select -ne :initialshadingGroup;',' setAttr ".ro" yes;','select -ne :initialParticlesSE;',' setAttr ".ro" yes;','select -ne :hardwareRenderGlobals;'...
    ,'  setAttr ".ctrs" 256;',' setAttr ".btrs" 512;','select -ne :defaultHardwareRenderGlobals;',' setAttr ".fn" -type "string" "im";  '...
    ,'  setAttr ".res" -type "string" "ntsc_4d 646 485 1.333";','connectAttr ":defaultLightSet.msg" "lightLinker1.lnk[0].llnk";','connectAttr ":initialshadingGroup.msg" "lightLinker1.lnk[0].olnk";','connectAttr ":defaultLightSet.msg" "lightLinker1.lnk[1].llnk";','connectAttr ":initialParticlesSE.msg" "lightLinker1.lnk[1].olnk";','connectAttr "lightLinker1.msg" ":lightList1.ln" -na;','// End of f.ma');

D = strvcat(S,G,F);

% Write the assembled scene to disk one line at a time
fid = fopen(name,'w');
for i = 1:size(D,1)
    fprintf(fid,'%s\n',D(i,:));
end
fclose(fid);
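
A hypothetical call tying the two programs together; the output path and curve degree below are illustrative assumptions, as neither is specified on this page.

% Export the traced contours as degree-3 NURBS curves in a Maya ASCII file
% (the path and degree are illustrative, not taken from the project)
savNurb(contour, 'C:\N_3d\bottleExperiment\bottle.ma', 3);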

 



Results 

Figures 14a-17a show photos of the objects that were scanned using images generated by the hardware configuration; Figures 14b-17b show the resulting NURBS surfaces representing those objects. Figures 18a-19a show the computer generated objects that the algorithm was applied to; Figures 18b-19b show the resulting NURBS surfaces representing those objects.


Hardware Generated Images


Bottle
Figure 14. (a) Real World 3D Object, (b) Generated NURBS Surface[11]

 

Sun Tan Bottle
Figure 15. (a) Real World 3D Object, (b) Generated NURBS Surface[11]

 

Nail
Figure 16. (a) Real World 3D Object, (b) Generated NURBS Surface[11]

Cup
Figure 17. (a) Real World 3D Object, (b) Generated NURBS Surface[11]


Computer Generated Images

 

 

Hand
Figure 18. (a) Computer Generated 3D Object, (b) Generated NURBS Surface[11]


Gear
Figure 19. (a) Computer Generated 3D Object, (b) Generated NURBS Surface[11]

 

 

          



 

Sources of Distortion

Figure 14:

There was general visual distortion in the resulting NURBS surface seen in Figure 14b. This was caused by the limited resolution of the camera, which reduced the amount of visual detail the silhouette images could capture.

Figure 15:

There was general distortion in the NURBS surface seen in Figure 15b. It can also be seen that the NURBS surface tapers inward even though the actual object does not. This is caused by light from the LCD monitor diffracting as it bends around the object and scattering as it reflects off the object's rough surface.

Figure 16:

The object seen in Figure 16a was very small, and due to the resolution of the recording camera (320x240 pixels) the resulting NURBS surface had a very distorted shape. Also, the bottom of the object (the head of the nail) could not be scanned without distortion in the resulting NURBS surface, because the mounting bracket on the stepper motor blended with the bottom of the object in the silhouette images.

Figure 17:

There was general visual distortion in the resulting NURBS surface seen in Figure 17b. It can be seen that part of the extruded section of the object in Figure 17a (the cup's handle) was not represented in the NURBS surface. This was because the extruded section extended far beyond the base of the cup, so some of the concave geometry could not be interpreted from the silhouette images.

Figure 18:

Again it can be seen in Figure 18b that the resulting NURBS surface was deformed. It can also be seen that there was shape distortion around outcroppings in the NURBS surface (the extrusion for the thumb). This is because concave shapes cannot be interpreted from silhouette images without distortion.

Figure 19:

There was general distortion in the NURBS surface seen in Figure 19b. The sharp extrusions of the object (the gear teeth) are not clearly defined in the NURBS surface because of a limitation of the silhouette scanning method when describing objects with concave features.


 



Conclusion

It was demonstrated that the system designed in this project can be used to generate 3D NURBS surfaces representing real world 3D objects, and that this process can be executed based only on silhouette images recorded of an object rotating. The system was introduced as an innovative, low cost alternative to commercial 3D scanners, which use other methods and are expensive. Distortion was experienced as a limitation of this system: the most distortion occurred when objects with concave geometry were scanned, and this was shown for both the hardware generated images and the computer generated images. There were also limitations due to the resolution of the camera, which caused some detail to be lost when recording the silhouette images with the hardware configuration. In future work a solution for minimizing distortion in the resulting NURBS surface will be addressed. In conclusion, it was shown that the 3D silhouette scanning system designed in this project can be applied to represent real world objects with 3D NURBS surfaces.



 

Applications

 

1. Automotive Design

In the automotive design phase, 3D scanning can be used to scan physical models of prototype cars for enhancement in CAD applications. 3D scanning can also be used to represent old, unavailable car parts as 3D surfaces that can be modified and reproduced using various CAD applications.

2.  Medical Research

3D scanning can be applied to medical research to help visualize, for diagnostic purposes, internal parts of the body that cannot be seen by the human eye. An example of this application is CT (computed tomography) scanning.

3. CG Entertainment

As game engines and computer generated scenes in movies get closer to photorealism, 3D scanning can be used to generate 3D models of real world objects for this purpose. Some of these models might be humans, cars, or even clay models created by artists.

4.  CAD design

3D scanners can be used to scan machine parts for use in a CAD application.

5.  Restoration of Historical Objects

Historical objects such as sculptures can be recreated as 3D models using a 3D scanner.



References

[1] "Matlab", The MathWorks, Inc., 2007.

[2] Pratt, William K., "Digital Image Processing", John Wiley & Sons, Inc., 2001.

[3] Green, Bill, "Canny Edge Detection Tutorial", Autonomous Systems Lab, 2002, http://www.pages.drexel.edu/~weg22/can_tut.html

[4] "Maya ASCII format", Alias|Wavefront, 1999.

[5] Hill, F.S. Jr., "Computer Graphics Using OpenGL", Prentice Hall, Inc., 2001.

[6] Ma, Yi, Soatto, Stefano, Kosecka, Jana, Sastry, S. Shankar, "An Invitation to 3-D Vision", Springer Science + Business Media LLC, 2006.

[7] Toft, Peter, "The Radon Transform", 1996, http://eivind.imm.dtu.dk/staff/ptoft/Radon/Radon.html

[8] Hayden, Brendan F., "Slice Reconstruction", 10/02/2005, http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/AV0405/HAYDEN/Slice_Reconstruction.html

[9] Left side of Figure 9: Olympus Stylus camera image, http://www.dcresource.com/reviews/olympus/stylus_300-review/camera-front-angled.jpg

[10] Image of boundary detection from the Matlab documentation, http://www.mathworks.com/access/helpdesk/help/toolbox/images/bwtraceboundary.html

[11] Images created by Christopher Nielsen.

[12] Right side of Figure 9 created by Christopher Nielsen.

  



Acknowledgements

Special thanks are extended to the following people who supported this project:

  • Dr. John Nielsen of the University of Calgary, for the hardware contributions and for discussing my ideas about the algorithm development
  • My mother for general help and support

 


 

Copyright Christopher Nielsen 2008

All Rights Reserved