Proceedings of the 2004 IEEE International Conference on Robotics & Automation
New Orleans, LA, April 2004

Spatial Motion Constraints in Medical Robot Using
Virtual Fixtures Generated by Anatomy
Ming Li and Russell H. Taylor
Department of Computer Science
NSF Engineering Research Center for Computer Integrated Surgical Systems and Technology
The Johns Hopkins University
Baltimore, Maryland 21218, USA
{liming & rht}@cs.jhu.edu

Abstract - In Ear, Nose and Throat (ENT) surgery, the operating volume is very limited. This is especially true in sinus surgery, when the instrument passes through the nasal and sinus cavity to reach the pathological area. The nasal and sinus bones impose geometric constraints on the work volume. During the surgery, the surgeon needs to control the motion of the instrument tip to accomplish some delicate procedure; meanwhile he/she needs to avoid hitting anatomic constraints. In this paper, we present a method to assist the path following task in a constrained area. The system reads the user's force input and combines it with the planned tip trajectory to create the tip motion constraints; meanwhile it generates the tool-shaft boundary constraints based on a 3-D geometric model. We map instrument tip motion and boundary information to joint displacements via robot kinematics, and then use a constrained quadratic optimization algorithm to compute the optimal set of corresponding joint displacements. In a preliminary study, we show that robot guidance using cooperative control and virtual fixtures derived from complex geometry can assist users in skilled manipulation tasks, while maintaining desirable properties such as collision avoidance and safety.

Index Terms - collaborative manipulation; virtual fixture; optimization robot control; geometry constraint

I. INTRODUCTION

In sinus surgery, medical instruments or an endoscope camera are inserted through the nose into a sinus cavity. The surgeon must precisely manipulate these instruments based on visual feedback from the endoscope and on information from preoperative 3D images such as CT. Although surgical navigation systems can track instruments relative to preoperative data [1], the difficulty of precise surgical manipulation with endoscopic instruments makes endoscopic sinus surgery a natural candidate application domain for cooperatively controlled surgical robots.

The goal of human-machine collaborative systems (HMCS) research is to create mechanisms that selectively provide cooperative assistance to a human user (for us, a surgeon), while allowing the user to retain ultimate control of the procedure. Taylor et al. [2] developed an augmentation system for fine manipulation, and Kumar et al. [3] applied it to microsurgical tasks such as inserting a needle into a 100 micron retinal vessel. Virtual fixtures in a collaborative system provide cooperative control of the manipulator by "stiffening" a hand-held guidance mechanism against certain directions of motion or forbidden regions of the workspace. Recent research on motion constraints using Kumar's system [4,5] has focused on simple techniques for "guidance virtual fixtures". This prior work focused on 2D geometric guidance of the tool tip or camera motion and assumed that the tool or camera itself did not have any other environmental constraints.

The focus of this paper is automatic generation of spatial motion constraints associated with complex 3D anatomy, based on preoperative medical images. In endoscopic sinus surgery, the endoscope and other instruments have some degree of translational and rotational freedom, but their motion is constrained by anatomic structures. During surgery, the instruments or the camera should avoid collisions or excessive force on delicate anatomy while still moving in the desired manner to accomplish the intended task. Constrained robot control has been discussed previously in both tele-manipulation and cooperative manipulation contexts. Funda, Taylor, et al. [6] formulated desired motions as sets of task goals in any number of coordinate frames relevant to the task, optionally subject to additional linear constraints in each of the task frames for redundant and deficient robots. The geometric complexity of medical workspace constraints in ENT makes this approach attractive for our current research. In preliminary work [7] we applied the method of [6] to surgical environments in which motion constraints on a simple path-following task are automatically derived from registered pre-operative models created from 3D images. In this paper, we generalize these techniques to more general cooperative control cases in which virtual fixtures are automatically generated from registered preoperative medical images. First, we describe the system and algorithm for generating spatial motion constraints derived from geometry. We then describe implementation and experiments.

Fig. 1 conceptually illustrates the relationship between the instrument, the 3D path, and the approach aperture to the workspace cavity in our experiments. The surgical instrument is a sharp-tipped pointer held either by a robot or freehand. In other cases it might be a surgical endoscope or a grasping instrument. We use the term "tip frame" to refer to a coordinate system whose origin is at the tip of the pointer and whose orientation is parallel to the tool holder of the robot. The "tool boundary frame" is a coordinate system whose origin corresponds to the point on the tool that is closest to the surrounding anatomy and whose orientation is again parallel to the tool holder.



Fig. 1 Relationship between the tool, cavity, approach aperture, and 3D path in our sample task

II. VIRTUAL FIXTURE GENERATION SYSTEM

Fig. 2 shows the system of virtual fixture generation for our task. In the pre-operative stage, from medical images (e.g., CT, MRI), the surgeon defines the tool-tip path by selecting positions in the image slices. A 3D geometry model is also created from the pre-operative images. After the robot is calibrated and registered, the planned tool-tip path and 3D model are transformed into the robot coordinate system. This paper focuses on a method to generate virtual fixtures derived from complicated geometry, which is the shaded part of Fig. 2.

In our collaborative system, the surgeon is in the control loop; s/he is able to control the progress of the tool along the constrained path. The system reads the surgeon's input and combines it with the planned tool-tip trajectory and the current tool-tip position to create the spatial motion constraint for the tool tip. Meanwhile, tool-shaft boundary motion constraints are generated from the registered 3D geometry model and the current tool position. Together with other constraints, such as the joint limits of the robot, all these constraints are fed into the constrained optimization control algorithm to obtain the robot joint velocities.

Fig. 2 Virtual fixture generation system

III. CONSTRAINED CONTROL ALGORITHM OVERVIEW

It is important to be able to place absolute bounds on the motion of the instrument in the constrained working environment. Within these bounds, the controller should try to place the instrument tip as close to the desired position as possible. The basic control loop may be summarized as follows:

Step 0: We assume that the robot is holding the surgical instrument, and that a model of the patient's anatomy has been obtained and registered to the coordinate system of the robot.

Step 1: Describe a desired incremental motion of the surgical instrument, based upon (a) surgeon inputs, such as may be obtained from a joystick or hands-on cooperative force control; (b) an a priori surgical task description; (c) real-time sensor feedback, such as might be obtained from a vision sensor. This description may include both an objective function describing desired outcomes (e.g., move as close as possible to a target) and motion constraints (e.g., avoid collisions, do not exceed robot joint limits, do not permit position errors to exceed specified limits, restrict tip motion to remain within a desired envelope, etc.). The desired incremental motion is described as the solution to a constrained optimization problem.

Step 2: Use the robot and task kinematic equations to produce a new linearized optimization problem, in which instrument motion variables and other task variables have been projected onto incremental joint variables (see the sketch following Step 4). This problem has the general form:

$\arg\min_{\Delta q} \; \| W \cdot (\Delta \vec{x} - \Delta \vec{x}_d) \|$
$H \cdot \Delta \vec{x} \geq \vec{h}$    (1)
$\Delta \vec{x} = J \cdot \Delta q$

where $\Delta q$ is the desired incremental motion of the joint variables and $\Delta \vec{x}$ is an arbitrary vector of task variables. Different components of the optimization function may be assigned different relative weights, so that the errors of critical motion elements are close to zero, while errors in other, non-critical motions simply stay as low as possible within the tolerances allowed by the constraint set.

Step 3: Use known numerical methods [8] to compute the incremental joint motions $\Delta q$, and use these results to move the robot.

Step 4: Go back to Step 1.
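To make Step 2 concrete, the sketch below poses one control cycle in the general form of (1) as a linearly constrained least-squares problem and solves it with SciPy's SLSQP routine. The matrices, the toy numbers, and the solver choice are illustrative assumptions only; the implementation described in this paper relies on the Lawson-Hanson least-squares machinery [8] referenced in Step 3.

```python
import numpy as np
from scipy.optimize import minimize

def solve_incremental_motion(J, W, dx_desired, H, h, n_joints):
    """One control cycle of the linearized problem (1):
    minimize ||W (J dq - dx_desired)||  subject to  H (J dq) >= h.
    Illustrative sketch; not the paper's Lawson-Hanson implementation [8]."""
    def objective(dq):
        r = W @ (J @ dq - dx_desired)
        return r @ r                          # squared weighted error norm

    constraints = [{"type": "ineq",           # SLSQP convention: fun(dq) >= 0
                    "fun": lambda dq: H @ (J @ dq) - h}]
    res = minimize(objective, np.zeros(n_joints), method="SLSQP",
                   constraints=constraints)
    return res.x                              # incremental joint motion dq

# Toy example: 3 task variables, 2 joints, one half-space constraint.
J = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
W = np.diag([1.0, 1.0, 0.1])
dx_d = np.array([0.01, 0.0, 0.0])
H = np.array([[0.0, 0.0, 1.0]])               # keep the third task-variable
h = np.array([-0.005])                        # increment above -0.005
print(solve_incremental_motion(J, W, dx_d, H, h, n_joints=2))
```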
IV. SYSTEM IMPLEMENTATION

A. Tip Spatial Motion Constraint Generation

Our method for implementing guidance virtual fixtures for tool-tip spatial motion is described in an earlier paper by Marayong, Li, et al. [5]. For completeness, we summarize the key details here. We model the tool tip as a Cartesian "robot" whose position $x(t) \in SE(3)$ and velocity $v(t)$ are related to the robot's joint positions $q(t)$ by the relations

$x(t) = \mathrm{Kins}(q(t))$
$v(t) = J(q(t)) \cdot \dot{q}(t)$    (2)

The tip's motion is to be controlled by applying forces and torques on the handle of the instrument, transformed to produce forces $f \in R^3$ resolved into the coordinate system of the tool tip. We define a reference direction of motion $D = D(t)$, and span and kernel projection operators

$\mathrm{Span}(D) \equiv [D] = D (D^T D)^{-1} D^T$
$\mathrm{Ker}(D) \equiv \langle D \rangle = I - [D]$    (3)

We then define a function $u = u(x, S)$ computing a signed distance from the tool tip to the task motion target (e.g., an anatomical feature or desired tool path). The new preferred direction is then defined as

(4)

where $k_d$ is a blending coefficient with $0 \leq k_d \leq 1$, which governs how quickly the tool is moved toward the reference direction. We then define an admittance control law

(5)

where $v$ is the desired tip velocity, $k$ is the admittance gain, and $k_a$ ($0 < k_a < 1$) is an admittance ratio that attenuates the non-preferred component of the force input.
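The full expressions for (4) and (5) are not reproduced above; the sketch below follows the formulation used in the companion work [5], in which the desired velocity is obtained by projecting the handle force onto the preferred direction. The specific blending rule in preferred_direction and the exact form of the admittance law are assumptions, not quotations from the paper.

```python
import numpy as np

def span(D):
    """Span projection [D] = D (D^T D)^(-1) D^T from (3)."""
    D = D.reshape(3, -1)                         # treat D as a column vector
    return D @ np.linalg.inv(D.T @ D) @ D.T

def ker(D):
    """Kernel projection <D> = I - [D] from (3)."""
    return np.eye(3) - span(D)

def preferred_direction(D_ref, u, k_d=0.2):
    """Assumed form of (4): blend the reference direction with the
    signed-distance correction u, weighted by k_d (0 <= k_d <= 1)."""
    d = (1.0 - k_d) * D_ref + k_d * u
    return d / np.linalg.norm(d)

def admittance_velocity(f, D, k=1.0, k_a=0.0):
    """Assumed form of (5): v = k ([D] + k_a <D>) f, where k_a attenuates
    the force component orthogonal to the preferred direction D."""
    return k * (span(D) + k_a * ker(D)) @ f

# Example: with k_a = 0 (the setting used in Section V) only the force
# component along the preferred direction produces motion.
f = np.array([1.0, 0.3, 0.0])
D = np.array([1.0, 0.0, 0.0])
print(admittance_velocity(f, D, k=0.2, k_a=0.0))
```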
Our desired 3D curve D is generated by B-spline interpolation. As mentioned, the tool-tip trajectory is planned by picking positions on the pre-operative medical image slices. After registration, all these points are transformed into the robot coordinate system. We then interpolate a B-spline curve to fit these sample points. At each control loop, the robot encoders read out the current tool-tip position. We search for the closest point on the B-spline to the current tool-tip position and compute the tangent direction of the B-spline at that point. If $t_x$, $t_y$ and $t_z$ are the components of the tangent to the curve, and $n_x$, $n_y$ and $n_z$ are the components of the vector from the current position to the closest point on the curve, then we have $D = (t_x\; t_y\; t_z)^T$ and $u = (n_x\; n_y\; n_z)^T$.
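A minimal sketch of this step, assuming SciPy's spline routines and a dense-sampling search for the closest curve point (the paper does not say how that search is implemented):

```python
import numpy as np
from scipy.interpolate import splprep, splev

def fit_path(points, degree=5):
    """Interpolate a B-spline through the picked path points (already
    transformed to robot coordinates); degree=5 matches Section V."""
    tck, _ = splprep(points.T, k=degree, s=0.0)
    return tck

def closest_point_direction(tck, tip, n_samples=2000):
    """Return D (unit tangent at the closest curve point) and u (vector from
    the tip to that point).  Dense sampling over the curve parameter is an
    assumption; any 1-D minimization would serve."""
    t = np.linspace(0.0, 1.0, n_samples)
    curve = np.array(splev(t, tck)).T                # n_samples x 3
    i = np.argmin(np.linalg.norm(curve - tip, axis=1))
    tangent = np.array(splev(t[i], tck, der=1))
    return tangent / np.linalg.norm(tangent), curve[i] - tip

# Toy path: a gentle 3D arc sampled at 10 points.
s = np.linspace(0.0, 1.0, 10)
pts = np.stack([s, np.sin(s), 0.1 * s], axis=1)
D, u = closest_point_direction(fit_path(pts), tip=np.array([0.5, 0.4, 0.05]))
print(D, u)
```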
B. Boundary Constraint Generation

We use 3D triangulated surface models of anatomy to develop real-time constraints on tool motion. Production of such anatomic models from 3D medical images and registration of the models to robotic workspaces are well-established techniques [9]. However, the models are geometrically complex, and generating constraints in real time can be a challenge. In the current work, we model the surgical tool as one or more "fat" line segments representing the tip and tool shaft. Boundary constraints are generated from closest-point pairs $P_b$ and $P_t$ on the geometry boundary and the tool, respectively.

In our work, we use a covariance tree data structure [10] to search for the closest point on the surface to the tool. A covariance tree is a variant of a k-dimensional binary tree (k-D tree). The traditional k-D tree structure partitions space recursively along principal coordinate axes. In our covariance tree, each sub-space is defined in the orthogonal coordinate system of the eigenvectors centered at the center of mass of the point set, and is recursively partitioned along this local coordinate frame. We rotate each local reference frame so that the x-axis is parallel to the eigenvector with the largest eigenvalue and the z-axis is parallel to the eigenvector with the smallest eigenvalue. An important advantage of covariance trees is that the bounding boxes tend to be much tighter than those found in conventional k-D trees and tend to align with surfaces, thus producing a more efficient search.

The tree is built recursively as follows. Given a set of points $\{s_i\}$ sampled from the surface, we find the centroid and moments

$\bar{T} = \frac{1}{N} \sum_i s_i, \qquad M = \sum_i \bar{s}_i \bar{s}_i^T$, where $\bar{s}_i = s_i - \bar{T}$    (6)

We then compute the eigenvalue decomposition of $M$ and determine a rotation $R$ with columns $R_x$ and $R_z$ corresponding to the largest and smallest eigenvalues, as discussed above. We then rotate the $\bar{s}_i$ to compute $\hat{s}_i = R \cdot \bar{s}_i$. Finally, we compute the bounding box of the $\hat{s}_i$ and partition the $\hat{s}_i$ about the cutting plane $x = 0$. Each of these sets is then recursively subdivided until only a small number of points remain in each box.
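A compact sketch of this construction, assuming the surface has already been reduced to a point set; the class name, leaf size, and the guard against degenerate splits are illustrative choices:

```python
import numpy as np

class CovNode:
    """One covariance-tree node: a local frame (R, T), a bounding box in that
    frame, and either two children or a small leaf set of points."""
    def __init__(self, points, leaf_size=10):
        self.T = points.mean(axis=0)                   # centroid
        centered = points - self.T
        M = centered.T @ centered                      # moment matrix
        _, V = np.linalg.eigh(M)                       # eigenvalues ascending
        # local frame: x-axis = largest eigenvector, z-axis = smallest
        self.R = np.column_stack([V[:, 2], V[:, 1], V[:, 0]])
        local = centered @ self.R                      # coordinates in local frame
        self.box_min, self.box_max = local.min(axis=0), local.max(axis=0)
        left = local[:, 0] < 0.0                       # partition at local x = 0
        if len(points) <= leaf_size or left.all() or not left.any():
            self.points, self.children = points, None  # terminal node
        else:
            self.points = None
            self.children = (CovNode(points[left], leaf_size),
                             CovNode(points[~left], leaf_size))

# Example: points scattered in a thin, stretched slab; the local boxes align
# with the slab rather than with the world axes.
rng = np.random.default_rng(0)
pts = rng.normal(size=(1000, 3)) * np.array([5.0, 2.0, 0.1])
root = CovNode(pts)
print(root.box_min, root.box_max)
```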
In our case, we must modify this procedure because our surface model is composed of small triangles, not sample points. To do this, we fit a circumsphere about each triangle. We then construct a covariance tree of the centers of each circumsphere. At each node (i.e., for each bounding box) of the tree, we note the maximum radius $r_{max}$ of all circumspheres associated with that node. We compute the corners of an expanded bounding box

$(\hat{b}_{min}, \hat{b}_{max}) = (\min_i \hat{s}_i - r_{max},\; \max_i \hat{s}_i + r_{max})$    (7)

where the min & max operations are performed element-wise.

Tree searching proceeds as follows. Given a line segment $(\bar{a}, \bar{b})$ between two points $\bar{a}$ and $\bar{b}$, we wish to find all possible surface patches that may be closer than some threshold distance $\varepsilon_{bound}$ to the line segment. At each level of the tree, we first transform the line segment to the local coordinates $(\bar{a}', \bar{b}') = (R^{-1} \cdot (\bar{a} - \bar{T}),\; R^{-1} \cdot (\bar{b} - \bar{T}))$. If we are at a terminal node of the tree, then for each triangle $t$ in the node, we compute the point pair $(\bar{p}_t, \bar{q}_t)$ corresponding to the closest approach between the line segment and the triangle, where $\bar{p}_t$ is on the triangle and $\bar{q}_t$ is on $(\bar{a}', \bar{b}')$. We return any such pairs with $\|\bar{p}_t - \bar{q}_t\| < \varepsilon_{bound}$.

For a non-terminal node, we compute the point of closest approach $\bar{p}_{box}$ and a corresponding point $\bar{q}_{box}$ on the line segment. If the distance $\|\bar{p}_{box} - \bar{q}_{box}\| < \varepsilon_{bound}$, we recursively search the left and right sub-trees for points of close approach to $(\bar{a}', \bar{b}')$. When the search is complete, we transform all point pairs back into the world coordinate system. If there are no point pairs closer than $\varepsilon_{bound}$, we generate no boundary constraints for the current time step of the control.

One difficulty with this approach is that it tends to produce many point pairs from the same patch of anatomy that are almost equivalent, thus producing an excessive number of motion control constraints. Therefore, in practice we modify our search as follows: for any non-terminal node with $\|\bar{p}_{box} - \bar{q}_{box}\| < \varepsilon_{bound}$ that is very flat (i.e., with $\hat{z}_{max} - \hat{z}_{min}$ smaller than a specified value) and for which the line segment does not penetrate the bounding box, we simply return the point pair $(\bar{p}_{box}, \bar{q}_{box})$ without further recursion.
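A simplified companion sketch of this search, building on the hypothetical CovNode class above: leaf points stand in for the circumsphere centres, the segment-to-box closest pair is approximated by sampling along the segment, and the flat-node shortcut returns a single representative pair. All names and tolerances are assumptions.

```python
import numpy as np

def _seg_box_pair(a, b, lo, hi, n=25):
    """Approximate closest pair between segment (a, b) and box [lo, hi], all in
    the node's local frame, by sampling along the segment (the paper computes
    this pair exactly)."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    seg = (1 - t) * a + t * b                       # samples along the segment
    box = np.clip(seg, lo, hi)                      # nearest box point to each
    gaps = np.linalg.norm(seg - box, axis=1)
    i = np.argmin(gaps)
    return seg[i], box[i], gaps[i]

def search(node, a, b, eps, flat_tol=1e-3):
    """Return (surface point, segment point) pairs closer than eps."""
    a_l, b_l = (a - node.T) @ node.R, (b - node.T) @ node.R
    q_seg, q_box, gap = _seg_box_pair(a_l, b_l, node.box_min, node.box_max)
    if gap >= eps:
        return []                                   # box farther than eps: prune
    if node.children is None:                       # terminal node: test points
        t = np.linspace(0.0, 1.0, 50)[:, None]
        seg = (1 - t) * a + t * b                   # world-frame segment samples
        pairs = []
        for p in node.points:
            d = np.linalg.norm(seg - p, axis=1)
            j = np.argmin(d)
            if d[j] < eps:
                pairs.append((p, seg[j]))
        return pairs
    flat = (node.box_max[2] - node.box_min[2]) < flat_tol
    if flat and gap > 0.0:                          # flat node, segment outside box:
        return [(node.R @ q_box + node.T,           # one representative pair,
                 node.R @ q_seg + node.T)]          # no further recursion
    return (search(node.children[0], a, b, eps, flat_tol)
            + search(node.children[1], a, b, eps, flat_tol))
```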
C. Control Algorithm Implementation

We have experimented with both velocity and incremental-step position control. In the discussion below, we will use the latter. Thus, at each time step, the goal is to compute incremental joint motions $\Delta q$, which are then fed to low-level position servos.

We compute the desired tool tip velocity using the admittance law described above, and convert this to an incremental 6-DOF tool motion $\Delta \vec{x}_{d,tip} = ((v_{d,tip} \cdot \Delta t)^T, (0,0,0))^T$, where $\Delta t$ is the sample interval. We identify three classes of requirements: in the tip frame, in the tool boundary frame, and in joint space. For each, we define an objective function to be minimized and a set of linearized constraints.

Tool tip motion: We require that an incremental tool tip motion be as close as possible to some desired value. We express this as:

$\xi_{tip} = \| \Delta \vec{x}_{tip} - \Delta \vec{x}_{d,tip} \|$
$\Delta \vec{x}_{tip}^{\,T} \cdot u_d \geq 1 - \varepsilon$    (8)

where $\varepsilon$ is a small positive number (0.01 in our experiments). We relate tool frame motion to joint motion via the Jacobian relationship $\Delta \vec{x}_{tip} = J_{tip}(q) \Delta q$. We rewrite (8) as

$\xi_{tip} = \| W_{tip} (J_{tip}(q) \Delta q - \Delta \vec{x}_{d,tip}) \|$
$H_{tip} \cdot J_{tip}(q) \Delta q \geq h_{tip}$    (9)

where $H_{tip} = u_d^T$, $h_{tip} = 1 - \varepsilon$, and $W_{tip} = \mathrm{diag}\{w_{tip}\}$ denotes a diagonal matrix of weighting factors specifying the relative importance of each component of $\Delta \vec{x}_{d,tip}$. Since we want to track the path tightly, we set $w_{tip}$ to a fairly high value (1 in the current experiments).
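A small sketch of how the tip term of (9) could be assembled for the solver; the function and variable names are illustrative, not the paper's:

```python
import numpy as np

def tip_constraint_rows(J_tip, dx_d_tip, u_d, w_tip=1.0, eps=0.01):
    """Objective block A_obj dq ~ b_obj (the weighted error of (9)) plus one
    inequality row (u_d^T J_tip) dq >= 1 - eps."""
    W_tip = w_tip * np.eye(len(dx_d_tip))
    A_obj = W_tip @ J_tip                 # minimize ||A_obj dq - b_obj||
    b_obj = W_tip @ dx_d_tip
    H_row = u_d @ J_tip                   # inequality: H_row . dq >= h_val
    h_val = 1.0 - eps
    return A_obj, b_obj, H_row, h_val
```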

Boundary constraints: Since the instrument is inserted into a cavity, we want to ensure that the instrument itself will not collide with the cavity boundary as a result of the motion. For each potential contact point pair we get a constraint of the general form

$n_b^T \cdot (P_t + \Delta P_t - P_b) \geq \varepsilon$    (10)

where $P_t$ is the position of the potential collision point on the tool and $P_b$ is the position of the potential contact point on the surface; $P_t$ and $P_b$ are the closest point pair generated by the method described in the last section. $n_b$ is the normal of the contact point on the surface, and $\varepsilon$ is a small positive number (0.01 in our experiments). Constraint (10) indicates that the angle between $(P_t + \Delta P_t - P_b)$ and $n_b$ is less than 90°. We can also define an objective function $\xi_b = \| W_b \Delta P_t \|$ expressing the desirability of minimizing extraneous motion of the tool near the boundary, and can again rewrite these formulae in terms of $\Delta q$:

$\xi_b = \| W_b J_b(q) \Delta q \|$
$H_b \cdot J_b(q) \Delta q \geq h_b$    (11)

Currently, we use very low values (0.001) for $w_b$ and rely mainly on the inequality constraints. An alternative would have been to leave the $\xi_b$ term out of the optimization altogether. The number of boundary constraints changes dynamically because it depends on how many closest-point pairs we generate based on the relative position of the tool and the geometric constraint.
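One way the per-pair rows of (10)/(11) could be assembled for the solver; the names are illustrative assumptions:

```python
import numpy as np

def boundary_constraint_rows(pairs, normals, J_b, w_b=0.001, eps=0.01):
    """For each closest-point pair (P_t on the tool, P_b on the surface) with
    surface normal n_b, rewrite (10) in terms of dq using dP_t = J_b dq:
        (n_b^T J_b) dq >= eps - n_b^T (P_t - P_b).
    The number of rows changes every cycle with the pairs found."""
    H_rows, h_vals = [], []
    for (P_t, P_b), n_b in zip(pairs, normals):
        H_rows.append(n_b @ J_b)
        h_vals.append(eps - n_b @ (P_t - P_b))
    A_obj = w_b * J_b                      # lightly weighted objective of (11)
    return np.array(H_rows), np.array(h_vals), A_obj
```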
Joint limits: Finally, we want to ensure that none of the joint limits is exceeded as a result of the motion. This requirement can be stated as $q_{min} - q \leq \Delta q \leq q_{max} - q$, where $q$ is the vector of the current values of the joint variables, and $q_{min}$ and $q_{max}$ denote the vectors of lower and upper bounds on the joint variables, respectively. We also want to minimize the total motion of the joints. This can be rewritten in the form

$\xi_{joint} = \| W_{joint} \Delta q \|$
$H_{joint} \cdot \Delta q \geq h_{joint}$    (12)

where $H_{joint}$ and $h_{joint}$ encode the bounds above. Again, we set $w_{joint}$ to 0.001 and simply enforce the inequality constraints.
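A sketch of how the bounds and the motion-minimizing term of (12) could be encoded (illustrative names):

```python
import numpy as np

def joint_limit_rows(q, q_min, q_max, w_joint=0.001):
    """Express q_min - q <= dq <= q_max - q as H_joint dq >= h_joint, plus a
    lightly weighted objective penalizing total joint motion."""
    n = len(q)
    H_joint = np.vstack([np.eye(n), -np.eye(n)])
    h_joint = np.concatenate([q_min - q, -(q_max - q)])
    A_obj = w_joint * np.eye(n)            # objective term ||w_joint * dq||
    return H_joint, h_joint, A_obj
```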
Putting it together: We combine all the task constraints and objective functions to obtain the overall optimization problem, which is:

(14)
subject to

(15)

which can be solved numerically using the method of Lawson and Hanson [8] for the set of joint displacements $\Delta q$ satisfying the constraints (15) and minimizing the error norm of (14).
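A plausible stacked form of (14)-(15), implied by (9), (11) and (12), is sketched below, with SciPy's SLSQP standing in for the Lawson-Hanson routine [8]. Both the stacking and the solver choice are assumptions rather than the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def solve_combined(A_objs, b_objs, H_blocks, h_blocks, n_joints):
    """Stack the tip, boundary, and joint-space blocks from (9), (11), (12) and
    solve  min ||A dq - b||  subject to  H dq >= h  for the joint increment."""
    A, b = np.vstack(A_objs), np.concatenate(b_objs)
    H, h = np.vstack(H_blocks), np.concatenate(h_blocks)
    res = minimize(lambda dq: np.sum((A @ dq - b) ** 2),
                   np.zeros(n_joints), method="SLSQP",
                   constraints=[{"type": "ineq", "fun": lambda dq: H @ dq - h}])
    return res.x
```

The boundary block is rebuilt every control cycle, so the row counts of H and A change with the number of closest-point pairs found.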
V. EXPERIMENT AND RESULTS

We have evaluated our approach using the experimental setup shown in Fig. 3. A thin wire attached inside the nasal cavity of a plastic skull served as the target path, and the benchmark task was to trace this path with the tip of a bent pointer without colliding with the walls of the cavity. Five small radio-opaque fiducials were implanted in the skull, which was then CT-scanned. The positions of the fiducials in the CT image were determined by standard image processing methods. We used a Northern Digital Optotrak 3D tracking system to perform registration of CT coordinates to robot coordinates and (coincidentally) to the Optotrak coordinates by standard methods. Essentially, an Optotrak probe was used to locate each fiducial while observing additional Optotrak markers on the robot, and then the appropriate transformations were computed using the method of [12].

Fig. 3 Experimental apparatus showing the JHU Steady-hand robot and skull phantom (left) and close-up of the pointer tool (right)
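Reference [12] solves this absolute-orientation problem with unit quaternions; the sketch below computes the same rigid CT-to-robot transformation from the matched fiducial positions with the equivalent SVD-based formulation, as an illustrative stand-in rather than the authors' code:

```python
import numpy as np

def rigid_register(ct_points, robot_points):
    """Least-squares rigid transform (R, t) with robot ~= R @ ct + t, computed
    from matched fiducials; returns the mean residual as a registration check."""
    ct_c = ct_points - ct_points.mean(axis=0)
    rb_c = robot_points - robot_points.mean(axis=0)
    U, _, Vt = np.linalg.svd(rb_c.T @ ct_c)          # cross-covariance SVD
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])   # guard against reflection
    R = U @ D @ Vt
    t = robot_points.mean(axis=0) - R @ ct_points.mean(axis=0)
    residual = np.linalg.norm(robot_points - (ct_points @ R.T + t), axis=1)
    return R, t, residual.mean()
```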
We used 3D-SLICER [11] for anatomy modeling and interactive display. 3D-SLICER's built-in segmentation functionality was used to threshold, segment and render the skull CT dataset to create a triangulated model of the skull surface. For the current experiment, we only used the nose and sinus portion of the resulting skull model. The 3D-SLICER skull surface model and the model of the sinus portion are shown in Fig. 4. There are about 99,000 vertices and 182,000 triangles in this surface model, all of which were transformed to robot coordinates after registration.

Fig. 4 3D-SLICER [11] surface models of the test phantom: (left) whole skull surface model and (right) the model of the nasal cavity portion used to generate constraints

Our current implementation uses the JHU Steady-hand robot [2]. Steady-hand robotic mechanisms are coupled with computation to enhance human capabilities at highly skilled, interactive tasks. It is a 7-DOF remote-center-of-motion (RCM) manipulator with high position resolution. A force sensor is integrated at the end-effector, and the operator holds a surgical pointer mounted to the force sensor.

In our path following experiment, we defined the target path with respect to CT space by tracing the wire with the tip of an Optotrak pointer. We gathered multiple sample points along the path and then interpolated a 5th-degree B-spline curve to fit these sample points.

The current tool-tip position with respect to the robot can be determined from the robot joint encoders and the forward kinematics of the Steady-hand robot. First, we manually choose a point on the curve close to the current tip position and guide the tool tip to the curve. Then the system reads the user's force input and guides the user to follow the B-spline curve. The sampling frequency of position and force data used in our algorithm is 30 Hz, although a PID controller for servoing the robot position runs on a dedicated motion control board at a much higher rate. The tool-tip position is computed from the robot encoders and transformed back to CT space. As a comparison, we also gathered freehand data. A user held the tool with an Optotrak rigid body affixed, and moved the tool through the sinus cavity to follow the wire as closely as possible. The tip position was also transformed into CT space. In both cases, the position of the tool is displayed on the 3D-SLICER interface. This provides the user with visual feedback of the task.

We chose the admittance ratio $k_a$ as 0, to enforce tool-tip motion only along the preferred direction, and the control gain as 0.2. Our experimental results (TABLE I) show that an experienced user can perform the guidance task with twice the accuracy and in 65% of the time with robot assistance, compared to freehand execution. The average error of 5 trials with robot guidance is 0.76 mm and the average time is 17.24 s; the average error for freehand is 1.82 mm while the average time is 26.61 s. During the path following task, the tool itself did not hit the bone. The robot calibration and system registration errors are the two main sources of tip motion error. In our experiment, the residual registration error measured across the 5 fiducials is 0.425 mm. Although it is small, the backlash of the robot also contributes to the robot-guidance tip motion error. Fig. 5 shows tool trajectories and the relative position of the tool with respect to the nasal cavity model. Fig. 6 shows the complete error of the system. In our task, the size of $\Delta q$ in (1) is 7x1; $H \cdot J$ varies from 20x7 to 39x7 and $W \cdot J$ varies from 13x7 to 37x7 according to the number of boundary constraints.

Fig. 5 Trajectories of the tool during the path following procedure: (left) the swept volume of the tool path, (right) the relative position between the tool and the nasal cavity

Fig. 6 Magnitude of position error using robot guidance (solid line) and freehand (dashed line). x-axis: the parameter in the B-spline parameter domain; y-axis: the magnitude of the position error

With our 2 GHz Pentium IV PC, the average time in each control interval for the boundary search and optimization problem solution was 5.54 ms.

VI. CONCLUSION

ENT surgery presents a constrained working environment for both the surgeon and the mechanical devices designed to assist them. The control of such devices must reflect these constraints. In this paper we outlined and implemented guidance virtual fixtures generated from complex 3D anatomy.

Robot guidance using cooperative control and virtual fixtures derived from complex geometry can assist users in skilled manipulation tasks, while maintaining desirable properties such as collision avoidance and safety. The experiments reported here used a simple task with constraints similar to those that might be encountered in ENT surgery. The results showed significant improvement in both accuracy and execution time, compared to freehand instrument manipulation.

We are continuing to evaluate these methods and to integrate these components into a full frontal sinus surgery system. Meanwhile, we are evaluating other robot registration methods to improve the registration accuracy. In the future, we will introduce these types of spatial motion constraints to tele-manipulation, which is another popular surgical robot control mode. We will compare the performance of such virtual fixtures between collaborative and tele-manipulative robot systems.

TABLE I  EXPERIMENTAL RESULTS - ROBOT ASSISTANCE VS. FREEHAND

                  Robot guidance                   Freehand
Trial #     Average Error   Average Time     Average Error   Average Time

ACKNOWLEDGMENT

Partial funding of this research was provided by the National Science Foundation under grants EEC9731748 (CISST ERC), IIS9801684, IIS0099770, and IIS0205318. The authors also gratefully acknowledge the advice and collaboration of Dr. Masaru Ishii, Dr. Allison Okamura, Dr. Greg Hager, Dr. Peter Kazanzides, Greg Fischer, Anand Viswanathan, Panadda Marayong, and the rest of the CISST ERC "crew".

REFERENCES

[1] Taylor, R. and L. Joskowicz, Computer-Integrated Surgery and Medical Robotics, in Standard Handbook of Biomedical Engineering & Design, M. Kutz, Ed. 2002, McGraw-Hill, p. 29.3-29.45.
[2] Taylor, R., P. Jensen, L. Whitcomb, A. Barnes, R. Kumar, D. Stoianovici, et al., A Steady-Hand Robotic System for Microsurgical Augmentation, International Journal of Robotics Research, 1999, 18(12).
[3] Kumar, R., T. Goradia, A. Barnes, P. Jensen, L. Whitcomb, D. Stoianovici, et al., Performance of Robotic Augmentation in Microsurgery-Scale Motions, 2nd Int. Symposium on Medical Image Computing and Computer-Assisted Surgery, 1999, Cambridge, England, Springer, September 19-22, p. 1108-1115.
[4] Bettini, A., S. Lang, A. Okamura, G. Hager, Vision Assisted Control for Manipulation Using Virtual Fixtures, IEEE/RSJ International Conference on Intelligent Robots and Systems, 2001, p. 1171-1176.
[5] Marayong, P., M. Li, A. Okamura, G. Hager, Spatial Motion Constraints: Theory and Demonstrations for Robot Guidance Using Virtual Fixtures, International Conference on Robotics and Automation, 2003, Taiwan, September 15-19, p. 1954-1959.
[6] Funda, J., R. Taylor, B. Eldridge, S. Gomory, K. Gruben, Constrained Cartesian motion control for teleoperated surgical robots, IEEE Transactions on Robotics and Automation, 1996, p. 453-465.
[7] Li, M. and R. Taylor, Optimal Robot Control for 3D Virtual Fixtures in Constrained ENT Surgery, Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2003, Montreal, Nov 15-18, p. 165-172.
[8] Lawson, C. and R. Hanson, Solving Least Squares Problems, 1974, Englewood Cliffs, NJ: Prentice-Hall.
[9] Taylor, R.H., et al., eds., Computer-Integrated Surgery, 1996, MIT Press: Cambridge, Mass.
[10] Williams, J., R. Taylor, and L. Wolff, Augmented k-d techniques for accelerated registration and distance measurement of surfaces, Computer Aided Surgery: Computer-Integrated Surgery of the Head and Spine, 1997, Linz, Austria, September, p. 01-21.
[11] 3D Slicer web site, http://www.slicer.org, 2003.
[12] Horn, B.K.P., Relative orientation, International Journal of Computer Vision, 1990, 4, p. 59-78.

