Automatic registration method using EM sensors in the IoT operating room

The Internet of Things (IoT) in the operating room can help improve the quality of computer-aided surgical systems. Patient-to-image registration is an important issue for such systems. Automating the registration procedure can increase tracking accuracy and reduce the time required to perform it. We therefore propose an automatic registration method that addresses this issue by constructing a wireless sensor network system for surgery. A plastic fiducial object combined with specific markers is developed to perform registration, with the ultimate purpose of integrating them into a surgical robotic system for surgical navigation. The specific markers are designed to localize the position of a small electromagnetic (EM) sensor and can be automatically detected in CT/MRI images by an automatic algorithm. The positions of the EM tracking sensors can be calibrated during the registration procedure. Experiments were designed and performed, and the results demonstrate that the proposed registration method is robust and accurate. The proposed method is a foundational link between surgical robots and virtual or augmented reality technology, all of which will be combined in future surgical navigation.


Introduction
The Internet of Things (IoT) in the operating room is the concept of connecting the medical devices in the operating room to each other via the Internet to make them smart. The IoT operating room can help improve the quality of the surgical navigation system. Figure 1 illustrates an IoT-based image-guided system. Image-guided surgical navigation is a computer-aided technique that helps surgeons use a tracked medical instrument for the treatment of specific diseases, assisting them in improving therapy and reducing invasiveness and complications [1,2].
The tracking system plays a key role in image-guided surgical navigation. Electromagnetic (EM) tracking systems are widely used for real-time tracking of medical instruments with attached EM sensors, because EM sensors are small and impose no line-of-sight restriction [1][2][3][4][5]. The working principle of an EM tracking system is based on electromagnetic mutual induction: the positions of EM sensors can be localized when they are placed in a local EM field generated by an EM transmitter. EM tracking combined with 2D or 3D images, acquired by computed tomography (CT), magnetic resonance imaging (MRI), or ultrasound, has been applied clinically in various procedures for navigating medical instruments [1,6,7]. The EM tracking system allows medical instruments to be visualized in the preoperative 3D image dataset, which is especially helpful when guiding a puncture needle to a target [8]. Image-patient registration is fundamental for tracking medical instruments in 3D image guidance; it is a rigid registration that aligns the patient to the preoperative images. Manually selecting anatomical landmarks combined with fiducial markers in the image and patient spaces is a widely used routine procedure for image-patient registration in image-guided navigation [9]. A 3D localizer, called a pointer, is used to select anatomical landmarks and fiducial markers in patient space [10]. Since the positions of anatomical landmarks and fiducial markers are picked by a pointer in patient space or selected manually on a computer screen, they are subject to manual measurement errors, which results in inaccurate registration [11].
In addition, medical instruments and the pointer have to be calibrated before use for tracking and for selecting anatomical landmarks and fiducial markers. A commonly used calibration method is to rotate the instrument or pointer around its tip while recording the position of its EM sensor [12]. Since the tip remains stationary throughout the calibration process, its location can be solved by fitting all the recorded EM sensor positions to a sphere whose center is the position of the tip. This method is nevertheless prone to errors, and its accuracy depends on the size of the instrument or pointer. Generally, the positioning error introduced by the pointer is approximately 2 mm (including the tracking error of the EM tracking system of 1.40 mm), which has a significant effect on the registration and navigation errors [13]. The surgical navigation stage is where imaging and tracking techniques play a key role [14]. Before navigation, image-patient registration must be performed to acquire the transformation between the two spaces, and the EM sensors must be calibrated to guide medical instruments or pick anatomical landmarks and fiducial markers. Image-patient registration is a key step that determines the accuracy of surgical navigation. To obtain highly accurate image-patient registration, we employ an external fiducial object. We therefore developed a fiducial object and specific markers with the ultimate purpose of integrating them into a surgical robotic system [15]. In this study, we focus on the development of an automatic registration method to determine the relationship between the image and patient spaces.
Automatic registration is an important issue in image-guided navigation. Marker-based solutions have been proposed in our previous works and by other researchers [16][17][18][19][20]. Our proposed method differs from these in the design of the fiducial object and in the automatic registration algorithm. By performing automatic registration in the IoT operating room, the accuracy of surgical navigation can be improved because surgeons are able to focus on the operation. Our strategy is to design a plastic fiducial object combined with specific markers to determine the position of the EM tracking sensor, and then to use the positions of the specific markers to automatically register the image and patient spaces. The fiducial object has slots distributed on its surface, each of which holds one specific marker exactly. The specific markers are designed to localize the position of the small EM sensor and can be automatically detected in CT/MRI images by an automatic algorithm. Several experiments are performed to validate the effectiveness of the proposed method.

Fiducial object and specific marker design
For automatic and fast patient-to-image registration, a fiducial object and specific markers are fabricated. The fiducial object is made of polyoxymethylene, and its main body is a cube of 350 mm × 350 mm × 200 mm. To place an infant patient on the fiducial object, a cylinder-like volume is cut from the main body, as shown in Fig. 2. An EM sensor Model 800 can be placed in a slot on the bottom of the fiducial object; the slot is cut to the shape of the sensor so that the sensor is fixed relative to the fiducial object. Several cylindrical slots are cut into the surface of the fiducial object to hold the specific markers. Each specific marker is a cylindrical body consisting of two cylinders. On top of the small cylindrical part, a cylindrical pinhole with a diameter of 1 mm and a height of 8 mm is made to hold the small EM sensor, as shown in Fig. 3. To make the marker easy to segment from the patient and the fiducial object and to reduce artifacts in the CT images, polyvinyl chloride (PVC) is used to produce the specific marker. The specific marker thus presents high contrast to the fiducial object and the patient's soft tissue and is easily segmented from the CT images.

Calibration of fiducial object and EM sensor models
Manually picking fiducials in the two spaces is still a routine registration method. It consumes much time and yields inaccurate registration due to manual error [21]. With EM tracking, the EM sensor can be used to perform automatic registration because the sensor can be identified in the images owing to the high grayscale value of metal. However, it is not possible to determine the exact position in the image that corresponds to the position reported by the EM tracking system. In addition, the fiducial object must be calibrated in the coordinate frame of the EM tracking system before it can be used for automatic registration. We therefore propose a strategy to calibrate the fiducial object and the EM sensor models in the coordinate frame of the EM tracking system. A 3D Guidance trakSTAR (Northern Digital, Inc., ON, Canada) is used as the tracking component of the image-guided surgical navigation (IGSN) system to perform the surgical registration and track the surgical instrument. The sensor Model 800 is used to track the movement of the fiducial object, and the sensor Model 90 is used to track the movement of the surgical instrument; the Model 90 is also used to obtain the position of the pinhole of each specific marker when calibrating the fiducial object. The EM transmitter is placed close to the fiducial object so that the sensors move within its tracking range. To implement automatic patient-to-image registration, we first calibrate the fiducial object by calculating the relationship between the sensor Model 800 and the specific markers. Instead of identifying the sensors, the specific markers are then identified by the automatic localizing method. The positions of the specific markers obtained in the images are registered to the positions reported by the trakSTAR in the fiducial object calibration stage. This completes the calibration of the EM sensor models.

Fiducial object calibration
The trakSTAR system can report the position and directions of the EM sensor Model 800, which is sufficient information for patient-to-image registration. Nevertheless, the CT image of the EM sensor Model 800 is seriously distorted by the image artifact caused by the metal in the sensor. In addition, it is difficult to identify from the images the exact position corresponding to the position reported by the trakSTAR system, as well as the directions of the EM sensor. To precisely localize the position of the EM sensor, a fiducial object is constructed that assembles the EM sensor and the specific markers. The specific markers can be identified in both spaces. To calibrate the fiducial object, we first select the markers used for registration and place them into the slots of the fiducial object. The EM sensor Model 800 is then fixed into the fiducial object. The small EM sensor Model 90 is used to localize the positions of the markers by inserting it into the pinholes of the markers. During calibration, the fiducial object is kept in the same position, and the positions and directions of the EM sensors Model 800 and Model 90 are simultaneously recorded by the trakSTAR system. To clearly explain the calibration procedure, the world coordinate frame $S_w(O_w, X_w, Y_w, Z_w)$ is constructed. The origin $O_w$ of $S_w$ is attached to the position of the EM transmitter, and the $X_w$, $Y_w$, and $Z_w$ axes lie along the X, Y, and Z directions of the EM transmitter, respectively. Similarly, the coordinate frames $S_s(O_s, X_s, Y_s, Z_s)$ and $S_p(O_p, X_p, Y_p, Z_p)$ are constructed for the small EM sensor Model 90 and the EM sensor Model 800, respectively. All coordinate frames in this paper are defined using the right-hand rule, and all three frames are shown in Fig. 4.
The trakSTAR system can track and acquire the position and directions of the small EM sensor Model 90 in the world coordinate frame. The acquired position is denoted as $p_{w90} = [x_{w90}, y_{w90}, z_{w90}]^T$, and the acquired angles between the three axes of the world coordinate frame and the three axes of the small EM sensor Model 90 are represented as $\theta_{x90}$, $\theta_{y90}$, and $\theta_{z90}$, respectively. Similarly, the position and directions of the EM sensor Model 800 are denoted as $p_{w800} = [x_{w800}, y_{w800}, z_{w800}]^T$, $\theta_{x800}$, $\theta_{y800}$, and $\theta_{z800}$, respectively. These parameters can be used to calculate the relationship between the EM transmitter and the EM sensors. Let $T_{pw}$ be the translation vector from the coordinate frame $S_p$ to the coordinate frame $S_w$; then

$$T_{pw} = [x_{w800}, y_{w800}, z_{w800}]^T. \qquad (1)$$

Let $R_{pw}$ be the rotation matrix from the coordinate frame $S_p$ to the coordinate frame $S_w$; then

$$R_{pw} = R_{pw\_z} R_{pw\_y} R_{pw\_x}, \qquad (2)$$

where $R_{pw\_x}$, $R_{pw\_y}$, and $R_{pw\_z}$ are the rotation matrices of the EM sensor Model 800 about the $X_w$, $Y_w$, and $Z_w$ axes with the angles $\theta_{x800}$, $\theta_{y800}$, and $\theta_{z800}$, respectively:

$$R_{pw\_x} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_{x800} & -\sin\theta_{x800} \\ 0 & \sin\theta_{x800} & \cos\theta_{x800} \end{bmatrix},\quad
R_{pw\_y} = \begin{bmatrix} \cos\theta_{y800} & 0 & \sin\theta_{y800} \\ 0 & 1 & 0 \\ -\sin\theta_{y800} & 0 & \cos\theta_{y800} \end{bmatrix},\quad
R_{pw\_z} = \begin{bmatrix} \cos\theta_{z800} & -\sin\theta_{z800} & 0 \\ \sin\theta_{z800} & \cos\theta_{z800} & 0 \\ 0 & 0 & 1 \end{bmatrix}.$$

Similarly, the translation vector $T_{sw}$ and the rotation matrix $R_{sw}$ from the coordinate frame $S_s$ to the coordinate frame $S_w$ can be computed from $p_{w90}$ and the rotation matrices $R_{sw\_x}$, $R_{sw\_y}$, and $R_{sw\_z}$. Let $p_w$ be a point in $S_w$, and let $p_p$ denote its corresponding coordinate in $S_p$. The relationship between $p_p$ and $p_w$ satisfies

$$p_w = R_{pw} p_p + T_{pw}. \qquad (3)$$

Let $p_s$ be the corresponding coordinate of $p_w$ in the coordinate frame $S_s$; then

$$p_w = R_{sw} p_s + T_{sw}. \qquad (4)$$

Combining Eqs. (3) and (4), the relationship between $p_s$ and $p_p$ can be expressed as

$$p_p = R_{sp} p_s + T_{ps}, \qquad (5)$$

where $R_{sp} = R_{pw}^{-1} R_{sw}$ and $T_{ps} = R_{pw}^{-1}(T_{sw} - T_{pw})$ are the rotation matrix and the translation vector from the coordinate frame $S_s$ to the coordinate frame $S_p$.
The schematic diagram of the transformation is shown in Fig. 4.
Hereto, the position and directions of the small EM sensor Model 90 can be transferred to the coordinate frame of the EM sensor Model 800 using Eq. (5). When the small EM sensor Model 90 is inserted into the pinhole of a marker, the position and directions of the marker in the coordinate frame $S_p$ can be acquired. Because the diameter of the small EM sensor Model 90 is only 0.9 mm, its coordinates along the axes $X_s$ and $Z_s$ are easier to determine than the corresponding coordinates of the EM sensor Model 800. However, it is difficult to determine the exact position of the small EM sensor Model 90 along the axis $Y_s$ because its sensor length is 7.25 mm. A method to deal with this problem is described in detail in the next subsection. This completes the procedure of calibrating the relationship between the specific markers and the EM sensor Model 800.
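As a concrete sketch of Eqs. (1)–(5), the following Python snippet (a minimal illustration using NumPy; the function names are ours, not part of the navigation software, and the Z-Y-X angle composition is an assumed tracker convention) builds the rotation matrices from the tracker-reported angles and maps a point from the Model 90 frame $S_s$ into the Model 800 frame $S_p$:

```python
import numpy as np

def rot_xyz(theta_x, theta_y, theta_z):
    """Rotation matrix R = Rz @ Ry @ Rx built from the three angles
    reported by the tracker for one sensor (angles in radians).
    The Z-Y-X composition is an assumed convention."""
    cx, sx = np.cos(theta_x), np.sin(theta_x)
    cy, sy = np.cos(theta_y), np.sin(theta_y)
    cz, sz = np.cos(theta_z), np.sin(theta_z)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def sensor_to_sensor(R_pw, T_pw, R_sw, T_sw):
    """Transform from the Model 90 frame S_s to the Model 800 frame S_p:
    R_sp = R_pw^-1 R_sw,  T_ps = R_pw^-1 (T_sw - T_pw)   (Eq. 5)."""
    R_sp = R_pw.T @ R_sw          # rotation matrices: inverse == transpose
    T_ps = R_pw.T @ (T_sw - T_pw)
    return R_sp, T_ps
```

Because rotation matrices are orthogonal, $R_{pw}^{-1} = R_{pw}^T$, so no explicit matrix inversion is needed.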

Calibration of EM sensor models
The position and orientation of the specific markers are automatically determined by matching the 3D-reconstructed marker with the point cloud extracted from the surface of the marker's 3D CAD model. By registering the specific markers in the two spaces, the calibration of the EM sensor models can be acquired. Figure 5a shows that the grayscale value of the specific markers is higher than that of everything except bone in the CT image. A band-pass method with two threshold values is used to separate bone and the fiducial markers in the 3D reconstructed images. Morphological operations and a Gaussian filter are applied to eliminate noisy voxels, and the volumes of the remaining connected regions are computed [22,23]. The 3D model of a specific marker is then matched against the candidate regions by comparing their volumes. A matching result is presented in Fig. 5b.
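The segmentation step can be sketched as follows (a simplified illustration with NumPy and SciPy, not the authors' code; the threshold band and the expected voxel-count range are placeholders that would be tuned to the scanner and marker size, and the Gaussian filtering is omitted for brevity):

```python
import numpy as np
from scipy import ndimage

def find_marker_components(volume, low, high, min_vox, max_vox):
    """Band-pass threshold a CT volume, clean it up morphologically, and
    return the centroids of connected components whose voxel count matches
    the expected marker volume."""
    mask = (volume >= low) & (volume <= high)   # band-pass: above soft tissue, below bone
    mask = ndimage.binary_opening(mask)         # drop isolated noisy voxels
    labels, n = ndimage.label(mask)             # connected-component labeling
    centroids = []
    for i in range(1, n + 1):
        component = labels == i
        size = int(component.sum())
        if min_vox <= size <= max_vox:          # volume match against the CAD model
            centroids.append(ndimage.center_of_mass(component))
    return np.array(centroids)
```

Each surviving centroid then serves as a candidate marker location for the subsequent pose estimation.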
The 3D markers detected from the images are dispersed with different poses. To acquire the position of the small EM sensor when it is inserted into the pinhole of a marker, the poses of the markers must be estimated accurately; this is a pose estimation problem. In this paper, we estimate the relative pose between the 3D reconstruction of a specific marker and its 3D CAD model. A point cloud can be extracted from the surface of the marker's CAD model, so the pose estimation can be formulated as an optimization problem that finds the relationship between the model and the 3D reconstruction, usually solved with the iterative closest point (ICP) algorithm [3] or a variant. Because the ICP algorithm requires the two point clouds to be adequately close, initial positions of the CAD model and the specific markers must be obtained in image space. An initial position estimation method, proposed in our previous work [16,17,24], is applied before ICP: the centroids and principal axes of the two point clouds are overlapped. After ICP, the transformation between the two point clouds is obtained. To determine the position of the EM sensor Model 90 in image space, the relationship between the EM sensor Model 90 and the 3D CAD model must be acquired in patient space. The coordinate frame of the 3D CAD model is constructed and denoted as $S_m(O_m, X_m, Y_m, Z_m)$. The $Y_m$ axis lies along the symmetry axis of the CAD model, as the model is axisymmetric. The origin $O_m$ of $S_m$ is attached to the intersection of the $Y_m$ axis and the bottom of the CAD model. The $X_m$ and $Z_m$ axes are then established in the plane of the model's bottom, as shown in Fig. 6. The pinhole of a specific marker is built with high accuracy to the diameter of the EM sensor Model 90, so that the symmetry axis of the sensor is coaxial with the axis $Y_m$.
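The centroid/principal-axes initialization used before ICP can be sketched as below (an illustrative NumPy implementation, not the authors' code; note that principal axes carry a sign ambiguity, which in practice is resolved by testing the flipped axes against the ICP residual):

```python
import numpy as np

def principal_axes_init(src, dst):
    """Coarse alignment before ICP: translate the source cloud so the
    centroids coincide and rotate its principal axes (covariance
    eigenvectors) onto those of the destination cloud.
    src, dst: (N, 3) and (M, 3) point arrays. Returns (R, t) with
    dst ~ R @ src + t as the initial guess."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    # eigenvectors of the 3x3 covariance matrices give the principal axes,
    # sorted by ascending eigenvalue in both clouds
    _, Vs = np.linalg.eigh(np.cov((src - mu_s).T))
    _, Vd = np.linalg.eigh(np.cov((dst - mu_d).T))
    R = Vd @ Vs.T
    if np.linalg.det(R) < 0:        # keep a proper rotation, not a reflection
        Vd[:, 0] *= -1
        R = Vd @ Vs.T
    return R, mu_d - R @ mu_s
```

The returned transform is only a starting point; ICP then refines it by alternating closest-point matching and least-squares alignment.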
Let q be the position of the EM sensor Model 90 recorded by the trakSTAR system. When the small EM sensor Model 90 is inserted into the pinhole of a marker, the coordinate of the position q in the coordinate frame $S_m$ can be written as $q_m = [0, l, 0]^T$, as shown in Fig. 6. The parameter l is difficult to determine directly because the sensor length of the small EM sensor Model 90 is 7.25 mm. Obviously, l satisfies $a \le l \le b$, where a and b are the top and bottom coordinates of the pinhole along the axis $Y_m$. The parameter l can be obtained through the fiducial object calibration procedure. Suppose two point sets of marker positions, each containing N markers, are acquired in the image and patient spaces. Let $P = \{p_i \mid i = 1, 2, \cdots, N\}$ and $Q = \{q_i \mid i = 1, 2, \cdots, N\}$ be the point sets in the patient and image spaces, respectively. The relationship between P and Q satisfies

$$q_i = R_{pi} p_i + T_{pi}, \qquad (6)$$

where $R_{pi}$ and $T_{pi}$ are the rotation and translation from the patient space to the image space. However, a measurement error exists between the point set obtained in the patient space and the point set obtained in the image space, which is called the fiducial registration error (FRE):

$$\mathrm{FRE} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left\| q_i - (R_{pi} p_i + T_{pi}) \right\|^2}. \qquad (7)$$

Denote the point set of the EM sensor Model 90 positions in the coordinate frame $S_m$ as $Q_m = \{q_{mi} \mid i = 1, 2, \cdots, N\}$, where $q_{mi} = [0, l_i, 0]^T$, and the parameter set as $L = \{l_i \mid i = 1, 2, \cdots, N\}$.
The transformation between the coordinate frame $S_m$ of marker i and the image space, obtained from the pose estimation, is expressed as

$$q_i = R_{mi} q_{mi} + T_{mi}. \qquad (8)$$

Thus, substituting Eq. (8) into Eq. (7), the FRE can be written as

$$\mathrm{FRE} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left\| R_{mi} [0, l_i, 0]^T + T_{mi} - (R_{pi} p_i + T_{pi}) \right\|^2}. \qquad (9)$$

The parameter l is different for each EM sensor Model 90. If two Model 90 sensors are used in one calibration procedure, Eq. (9) has many solutions and the exact value of each $l_i$ cannot be found. Thus, a single EM sensor Model 90 is used for each calibration procedure, so that the parameter set reduces to the single parameter l, and Eq. (9) can be reformulated as

$$\mathrm{FRE}(l) = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left\| R_{mi} [0, l, 0]^T + T_{mi} - (R_{pi} p_i + T_{pi}) \right\|^2}. \qquad (10)$$

Therefore, the solution for l can be found by minimizing Eq. (10).
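Since the objective is quadratic in l, the minimizer has a closed form: writing $u_i = R_{mi}[0,1,0]^T$ for the pinhole axis of marker i in image space and $r_i$ for the corresponding sensor position already mapped into image space, $l^* = \sum_i u_i \cdot (r_i - T_{mi}) / \sum_i u_i \cdot u_i$. A minimal NumPy sketch of this solve (our own illustration under the stated assumption, not the authors' implementation):

```python
import numpy as np

def solve_sensor_depth(Rs, Ts, targets):
    """Closed-form least-squares solution for the insertion depth l:
    minimize sum_i || R_i @ [0, l, 0]^T + T_i - target_i ||^2.
    Rs, Ts: per-marker poses in image space; targets: sensor positions
    mapped into image space."""
    e_y = np.array([0.0, 1.0, 0.0])
    num, den = 0.0, 0.0
    for R, T, q in zip(Rs, Ts, targets):
        u = R @ e_y          # pinhole axis of marker i in image space
        d = q - T            # sensor position relative to marker origin
        num += u @ d
        den += u @ u         # equals 1 for proper rotations
    return num / den
```

In practice the recovered l should also be checked against the physical bound $a \le l \le b$ given by the pinhole geometry.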

Patient-to-image registration
In order to make the registration of the patient to image more convenient and accurate, we proposed a framework to register the preoperative images to the intraoperative patient in the IoT operating room by constructing the wireless sensor networks system, as shown in Fig. 7. Registration between the fiducial object and the image is a key step.
After the specific markers are localized in the 3D images via the pose estimation method, the markers in the images are matched with the corresponding markers in the world coordinate frame $S_w$. Pairing the two position sets correctly is important for automatic registration: a false point-to-point pairing leads to registration failure and, in turn, navigation failure. Thus, a robust pairing method is essential for ensuring correct registration. In this paper, we use our previously proposed method to compute the pairing of the two point sets before matching [5]. This method treats the point-to-point correspondence problem as a constraint satisfaction problem (CSP) to automatically match the two point sets. For each point in each coordinate frame, the distances to all the other points are computed, and a constraint graph is constructed from all the distances in each frame. A case-based procedure then determines the correct pairing of the two position sets; the details are presented in our previous work [24][25][26]. The rigid transformation between the two spaces is then acquired by a least-squares-based algorithm, yielding the transformation matrix $[R_{wi}, T_{wi}]$ that aligns the world coordinate frame with the preoperative image coordinate frame. To perform navigation, the transformation $[R_{pi}, T_{pi}]$ between the EM sensor Model 800 and the image space must be acquired. It satisfies the relationship

$$R_{pi} = R_{wi} R_{wp}^{-1}, \qquad T_{pi} = T_{wi} - R_{wi} R_{wp}^{-1} T_{wp}, \qquad (11)$$

where $[R_{wp}, T_{wp}]$ can be acquired from the position and directions of the EM sensor Model 800 reported by the trakSTAR system. Therefore, the transformation matrix $[R_{si}, T_{si}]$, which relates the EM sensor Model 90 to the image coordinate frame, can be obtained by combining Eq. (11) with Eq. (5):

$$R_{si} = R_{pi} R_{sp}, \qquad T_{si} = R_{pi} T_{ps} + T_{pi}. \qquad (12)$$
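The least-squares alignment of the paired point sets can be sketched with the standard SVD (Kabsch) solution (an illustrative NumPy implementation; the paper does not name its exact solver, so this is one common choice):

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid transform [R, T] with Q_i ~ R @ P_i + T,
    computed by the SVD (Kabsch) method on centered point sets."""
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)
    H = (P - mu_p).T @ (Q - mu_q)           # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = mu_q - R @ mu_p
    return R, T

def fre(P, Q, R, T):
    """Fiducial registration error: RMS residual after alignment."""
    res = Q - (P @ R.T + T)
    return float(np.sqrt((res ** 2).sum(axis=1).mean()))
```

With correct point pairing, the residual returned by `fre` corresponds to the FRE of Eq. (7).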
Figure 8 shows the FREs obtained from all the calibrations, defined as the residual distances of all markers involved in the registration after transforming from patient space to image space. A mean FRE of 1.38 mm is measured over all calibrations, indicating that the calibration method for the EM sensor models is accurate enough to perform the patient-to-image registration.
To evaluate the accuracy of the EM sensor model calibration, a square board phantom is built by 3D printing, as shown in Fig. 9. Four small holes are printed near the corners of the board phantom and are sized to fit the EM sensor Model 90. The coordinates of the four holes are used to compute the edge lengths of the rectangle; as illustrated in Fig. 9a, the four edges are denoted as $L_1$, $L_2$, $L_3$, and $L_4$. A Vernier caliper is used to measure the lengths of the four edges 60 times. The EM sensor Model 90 is placed into each small hole in turn, also 60 times, to measure the positions of the holes. The four edge lengths measured with the EM sensor Model 90 are compared with those measured with the Vernier caliper; the results are presented in Table 1. The absolute and relative errors between the two methods are below 1 mm and less than 1%, respectively (see Table 1). The absolute errors between the two methods are shown in Fig. 10. The maximum absolute error among all measurements is 0.65 mm. Manual error is introduced while manually positioning the holes of the board phantom with either method, which accounts for this maximum; however, most of the absolute errors are below 0.40 mm. These experimental results indicate that the proposed method achieves high calibration accuracy.
To verify the accuracy of the patient-to-image registration, a human phantom is used to perform the proposed registration method for navigation, as shown in Fig. 11. Navigation software cooperating with the trakSTAR system was developed for displaying the 3D image and performing navigation. The navigation procedure for the human phantom is implemented as follows. A CT scan is performed to obtain preoperative images, which are then loaded into the navigation system. Six targets are selected on the heart surface of the human phantom for verifying the registration accuracy. The navigation procedure is carried out after the patient-to-image registration. Each target is localized fifteen times with the EM sensor Model 90 to obtain its positions. The target registration errors (TREs) are computed for all targets, where the TRE is defined as the distance between a target and the tip of the EM sensor Model 90 in image space. Figure 12 presents the TREs measured for all the targets. The average TRE over all targets is 1.38 mm with a standard deviation of 0.32 mm. These results suggest that the proposed registration method achieves high accuracy for the electromagnetic navigation system.

Conclusions and future work
A registration procedure for an electromagnetic navigation system should be both fast and convenient. To achieve this, we developed an automatic and accurate registration method that integrates patient-image registration with the calibration of the fiducial object and the EM sensors. The board phantom experiments demonstrate that the calibration method for the EM sensor models presented in this paper is convenient and accurate, and the human phantom experiments show that the proposed automatic registration method achieves high accuracy. The fiducial object and the specific markers are the basis of both the calibration method and the automatic registration method. The proposed method is automatic and allows us to match the EM sensor Model 800 in the two spaces in real time via the specific markers, and the calibration of the EM sensor models is obtained during the fiducial object calibration procedure. The experimental results show that the proposed registration method plays an important role in the implementation of an electromagnetic navigation system. Our future work will focus mainly on extending this method to work with a robotic arm, which will enable more accurate surgical navigation.