
Automatic registration method using EM sensors in the IoT operating room

Abstract

The Internet of Things (IoT) in the operating room can help improve the quality of computer-aided surgical systems. Patient-to-image registration is an important issue for such systems. Automating the registration procedure can increase tracking accuracy and reduce the time the procedure requires. We therefore propose an automatic registration method that addresses this issue by constructing a wireless sensor network system for surgery. A plastic fiducial object combined with specific markers is developed to perform registration, with the ultimate goal of integrating them into a surgical robotic system for surgical navigation. The specific markers are designed to localize the position of a small electromagnetic (EM) sensor and can be detected automatically in CT/MRI images. The positions of the EM tracking sensors can be calibrated during the registration procedure. Experiments are designed and performed, and the results demonstrate that the proposed registration method is robust and accurate. The proposed method is a foundational link between surgical robots and virtual or augmented reality technology, all of which will be combined in future surgical navigation.

Introduction

The Internet of Things (IoT) in the operating room is the concept of connecting the medical devices in the operating room to each other via the Internet to make them smart. The IoT operating room can help improve the quality of the surgical navigation system. Figure 1 illustrates an IoT-based image-guided system. Image-guided surgical navigation is a computer-aided technique that helps surgeons use tracked medical instruments for the treatment of specific diseases, assisting them in improving therapy and reducing invasiveness and complications [1, 2].

Fig. 1 Image-guided system based on IoT

The tracking system plays a key role in image-guided surgical navigation. Electromagnetic (EM) tracking systems are widely used for real-time tracking of medical instruments, with EM sensors attached to the instrument, because EM sensors are small and have no line-of-sight restriction [1,2,3,4,5]. The working principle of an EM tracking system is based on the electromagnetic theory of mutual induction: the positions of EM sensors can be localized when they are placed in a local EM field generated by an EM transmitter. EM tracking combined with 2D or 3D images, acquired by computed tomography (CT), magnetic resonance imaging (MRI), or ultrasound, has been applied clinically in various procedures to navigate medical instruments [1, 6, 7]. The EM tracking system allows medical instruments to be visualized in the preoperative 3D image dataset, which is especially helpful when guiding a puncture needle to a target [8]. Image-patient registration is fundamental for tracking medical instruments in 3D image guidance; it is a rigid registration that aligns the patient to the preoperative images. Anatomical landmarks combined with fiducial markers are manually selected in the image and patient spaces to perform image-patient registration, a widely used routine procedure in image-guided navigation [9]. A 3D localizer, called a pointer, is used to select the anatomical landmarks and fiducial markers in patient space [10]. Since the positions of the landmarks and markers are picked with a pointer in patient space or selected manually on a computer screen, they are subject to manual measurement errors, which results in inaccurate registration [11].

Besides, medical instruments and the pointer have to be calibrated before they can be used for tracking and for selecting anatomical landmarks and fiducial markers. A common calibration method is to rotate the instrument or pointer around its tip while the position of the EM sensor is recorded [12]. Since the tip remains stationary throughout the calibration, its location can be solved by fitting all recorded EM sensor positions to a sphere whose center is the tip position. This method is nevertheless prone to error, and its accuracy depends on the size of the instrument or pointer. Generally, the positioning error introduced by the pointer is approximately 2 mm (including the 1.40 mm tracking error of the EM tracking system), which has a significant effect on the registration and navigation errors [13].
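As a concrete illustration, the tip-pivot calibration just described reduces to a linear least-squares sphere fit: every recorded sensor position ideally lies on a sphere centered at the stationary tip. A minimal NumPy sketch (the function name is ours, not part of any tracking SDK):

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit; returns (center, radius).

    points: (N, 3) array of EM sensor positions recorded while the
    instrument is pivoted about its stationary tip. The fitted center
    approximates the tip location.

    Uses the linearization |p|^2 = 2 c.p + (r^2 - |c|^2), so the fit
    is a single linear solve rather than an iterative optimization."""
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((p.shape[0], 1))])
    b = (p ** 2).sum(axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = w[:3]
    radius = np.sqrt(w[3] + center @ center)
    return center, radius
```

In practice the residual of this fit also indicates how steadily the tip was held during the pivot.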

The surgical navigation stage is where imaging and tracking techniques play a key role [14]. Before navigation, image-patient registration must be implemented to acquire the transformation between the two spaces, and the EM sensors must be calibrated to guide medical instruments or to pick anatomical landmarks and fiducial markers. Image-patient registration is the key step determining the accuracy of surgical navigation. To obtain highly accurate image-patient registration, we employ an external fiducial object. We therefore developed a fiducial object and specific markers with the ultimate purpose of integrating them into a surgical robotic system [15]. In this study, we focus on developing an automatic registration method that determines the relationship between the image and patient spaces.

Automatic registration is an important issue in image-guided navigation. Marker-based solutions have been proposed in our previous works and by other researchers [16,17,18,19,20]. Our method differs from these in the design of the fiducial object and in the automatic registration algorithm. By performing automatic registration in the IoT operating room, the accuracy of surgical navigation can be improved because surgeons are able to focus on the operation itself. Our strategy is to design a plastic fiducial object combined with specific markers to determine the position of the EM tracking sensor, and then to use the positions of the specific markers to automatically register the image and patient spaces. Slots are distributed on the surface of the fiducial object, each of which holds one specific marker exactly. The specific markers are designed to localize the position of the small EM sensor and can be automatically detected in CT/MRI images by an automatic algorithm. Experiments are performed to validate the effectiveness of the proposed method.

Proposed methods

Fiducial object and specific marker design

For automatic and fast patient-to-image registration, a fiducial object and specific markers are made. The fiducial object is manufactured from polyoxymethylene; its main body is a block of 350 mm × 350 mm × 200 mm. To place an infant patient on the fiducial object, a cylinder-like volume is cut from its main body, as shown in Fig. 2. An EM sensor Model 800 can be placed in a slot on the bottom of the fiducial object, cut to the shape of the sensor so that the sensor is fixed relative to the object. Cylindrical slots are cut into the surface of the fiducial object to hold the specific markers. Each specific marker is a cylindrical body consisting of two cylinders. On top of the small cylindrical part, a cylindrical pinhole with a diameter of 1 mm and a height of 8 mm is made to hold the small EM sensor, as shown in Fig. 3. To ease segmentation from the patient and the fiducial object and to reduce artifacts in the CT images, the specific markers are made of polyvinyl chloride (PVC). The markers therefore present high contrast to the fiducial object and the patient's soft tissue and are easily segmented from the CT images.

Fig. 2 Fiducial object. (a) The top view. (b) The bottom view

Fig. 3 Specific marker placed in the fiducial object, with the EM sensor inserted into the pinhole of the marker

Calibration of fiducial object and EM sensor models

Manually picking fiducials in the two spaces is still a routine registration method. It consumes much time and achieves inaccurate registration because of manual error [21]. With EM tracking, the EM sensor could in principle be used for automatic registration, since metal appears with high grayscale values in the images. However, the exact image position corresponding to the position reported by the EM tracking system cannot be determined this way. Besides, the fiducial object must be calibrated in the coordinate frame of the EM tracking system before being used for automatic registration. Thus, we propose a strategy to calibrate the fiducial object and the EM sensor models in the coordinate frame of the EM tracking system. A 3D Guidance trakSTAR (Northern Digital, Inc., ON, Canada) is used as the tracking component of the image-guided surgical navigation (IGSN) system to perform the surgical registration and track the surgical instrument. The sensor Model 800 tracks the movement of the fiducial object; the sensor Model 90 tracks the movement of the surgical instrument and is also used to obtain the position of the pinhole of each specific marker when calibrating the fiducial object. The EM transmitter is placed close to the fiducial object so that the sensors move within its tracking range. To implement automatic patient-to-image registration, we first calibrate the fiducial object by calculating the relationship between the sensor Model 800 and the specific markers. Instead of identifying the sensors themselves, the specific markers are identified in the images by the automatic localization method. The marker positions obtained in the images are registered to the positions reported by the trakSTAR during the fiducial object calibration stage, which completes the calibration of the EM sensor models.

Fiducial object calibration

The trakSTAR system reports the position and directions of the EM sensor Model 800, which is sufficient information for patient-to-image registration. Nevertheless, the CT image of the EM sensor Model 800 is seriously distorted by the artifact caused by the metal in the sensor. In addition, it is difficult to identify from the images the exact position corresponding to the position reported by the trakSTAR system, as well as the directions of the EM sensor. To localize the sensor position precisely, a fiducial object is constructed that assembles the EM sensor and the specific markers; the markers can be identified in both spaces. To calibrate the fiducial object, we first select the markers used for registration and place them into the slots of the fiducial object. The EM sensor Model 800 is then fixed into the fiducial object. The small EM sensor Model 90 is used to localize the marker positions by inserting it into the pinholes of the markers. During calibration, the fiducial object is kept in the same position, and the positions and directions of the EM sensors Model 800 and Model 90 are recorded simultaneously by the trakSTAR system.

To clearly explain the calibration procedure, the world coordinate frame Sw(Ow, Xw, Yw, Zw) is constructed. The origin Ow of Sw is attached to the position of the EM transmitter, and the Xw, Yw, and Zw axes lie along the X, Y, and Z directions of the EM transmitter, respectively. Similarly, the coordinate frames Ss(Os, Xs, Ys, Zs) and Sp(Op, Xp, Yp, Zp) are constructed for the small EM sensor Model 90 and the EM sensor Model 800, respectively. All coordinate systems in this paper follow the right-hand rule. The three coordinate frames are shown in Fig. 4. The trakSTAR system can track and acquire the position and directions of the small EM sensor Model 90 in the world coordinate frame. The acquired position is denoted as pw90 = [xw90, yw90, zw90]T, and the acquired angles between the three axes of the world coordinate frame and the three axes of the small EM sensor Model 90 are denoted as θx90, θy90, and θz90, respectively. Similarly, the position and directions of the EM sensor Model 800 are denoted as pw800 = [xw800, yw800, zw800]T, θx800, θy800, and θz800, respectively. These parameters can be used to calculate the relationship between the EM transmitter and each EM sensor. Let Tpw be the translation vector from the coordinate frame Sp to the coordinate frame Sw; then Tpw can be written as

$$ {\boldsymbol{T}}_{pw}={\left[{x}_{w800},{y}_{w800},{z}_{w800}\right]}^{\mathrm{T}}. $$
(1)
Fig. 4 Schematic diagram of transformations involved in the three coordinate frames

Let Rpw be the rotation matrix from the coordinate frame Sp to the coordinate frame Sw, then Rpw can be denoted as

$$ {\boldsymbol{R}}_{pw}={\boldsymbol{R}}_{pw\_z}\cdotp {\boldsymbol{R}}_{pw\_y}\cdotp {\boldsymbol{R}}_{pw\_x}. $$
(2)

where Rpw _ x, Rpw _ y, and Rpw _ z are the rotation matrices of the EM sensor Model 800 about the Xw, Yw, and Zw axes with the angles θx800, θy800, and θz800, respectively. They can be represented as

$$ {\boldsymbol{R}}_{pw\_x}=\left[\begin{array}{ccc}1& 0& 0\\ {}0& \cos {\theta}_{x800}& -\sin {\theta}_{x800}\\ {}0& \sin {\theta}_{x800}& \cos {\theta}_{x800}\end{array}\right], $$
$$ {\boldsymbol{R}}_{pw\_y}=\left[\begin{array}{ccc}\cos {\theta}_{y800}& 0& \sin {\theta}_{y800}\\ {}0& 1& 0\\ {}-\sin {\theta}_{y800}& 0& \cos {\theta}_{y800}\end{array}\right], $$

and

$$ {\boldsymbol{R}}_{pw\_z}=\left[\begin{array}{ccc}\cos {\theta}_{z800}& -\sin {\theta}_{z800}& 0\\ {}\sin {\theta}_{z800}& \cos {\theta}_{z800}& 0\\ {}0& 0& 1\end{array}\right]. $$

Similarly, we can compute the translation vector Tsw and the rotation matrix Rsw from the coordinate frame Ss to the coordinate frame Sw from the reported position pw90 and the angles θx90, θy90, and θz90. Let pw be a point in Sw, and denote its corresponding coordinate in Sp as pp. Then pp and pw satisfy the following equation:

$$ {p}_w={\boldsymbol{R}}_{pw}{p}_p+{\boldsymbol{T}}_{pw}. $$
(3)

Let ps be the corresponding coordinate of pw in the coordinate frame Ss, then the relationship between ps and pw satisfy the following equation:

$$ {p}_w={\boldsymbol{R}}_{sw}{p}_s+{\boldsymbol{T}}_{sw}. $$
(4)

Clearly, by combining Eqs. (3) and (4), the relationship between the point ps and pp can be expressed as

$$ {p}_p={\boldsymbol{R}}_{sp}{p}_s+{\boldsymbol{T}}_{sp}, $$
(5)

where \( {\boldsymbol{R}}_{sp}={\boldsymbol{R}}_{pw}^{-1}{\boldsymbol{R}}_{sw} \) and \( {\boldsymbol{T}}_{sp}={\boldsymbol{R}}_{pw}^{-1}\left({\boldsymbol{T}}_{sw}-{\boldsymbol{T}}_{pw}\right) \) are the rotation matrix and the translation vector from the coordinate frame Ss to the coordinate frame Sp. The schematic diagram of the transformation is shown in Fig. 4.
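The transformation chain of Eqs. (1)–(5) translates directly into code. The sketch below uses NumPy; the helper names are our own, and the angles are assumed to be already converted to radians from the values reported by the tracker:

```python
import numpy as np

def rot_xyz(tx, ty, tz):
    """Eq. (2): R = Rz @ Ry @ Rx from the reported angles (radians)."""
    cx, sx = np.cos(tx), np.sin(tx)
    cy, sy = np.cos(ty), np.sin(ty)
    cz, sz = np.cos(tz), np.sin(tz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def frame_s_to_p(R_pw, T_pw, R_sw, T_sw):
    """Eq. (5): R_sp = R_pw^{-1} R_sw, T_sp = R_pw^{-1} (T_sw - T_pw)."""
    R_sp = R_pw.T @ R_sw              # inverse of a rotation is its transpose
    T_sp = R_pw.T @ (T_sw - T_pw)
    return R_sp, T_sp
```

Because rotation matrices are orthogonal, the inverse in Eq. (5) is computed as a transpose rather than a general matrix inverse.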

At this point, the position and directions of the small EM sensor Model 90 can be transferred into the coordinate frame of the EM sensor Model 800 using Eq. (5). When the small EM sensor Model 90 is inserted into the pinhole of a marker, the position and directions of the marker in the coordinate frame Sp can be acquired. Because the diameter of the small EM sensor Model 90 is only 0.9 mm, its coordinates along the axes Xs and Zs are easier to determine than those of the EM sensor Model 800. However, it is difficult to determine its exact position along the axis Ys, because the sensor length of the Model 90 is 7.25 mm. A method to deal with this problem is described in detail in the next subsection. This completes the calibration of the relationship between the specific markers and the EM sensor Model 800.

Calibration of EM sensor models

The position and orientation of the specific markers are obtained automatically by matching the 3D reconstruction of each marker with the point cloud extracted from the surface of its 3D CAD model. By registering the specific markers in the two spaces, the calibration of the EM sensor models can be acquired.

Figure 5a shows that, except for bone, the grayscale values of the specific markers are higher than those of the other structures in the CT image. A bandpass of two threshold values is used to separate the bones and the fiducial markers in the 3D reconstructed images. Morphological operations and a Gaussian filter are applied to eliminate noisy voxels, and the volumes of the remaining connected regions are computed [22, 23]. The fiducial markers are then identified by comparing these volumes against that of the 3D model of a specific marker. A matching result is presented in Fig. 5b.
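A minimal sketch of this marker segmentation, assuming SciPy's `ndimage`. The intensity thresholds and volume bounds are placeholders to be tuned for the actual scanner and marker material, and the Gaussian pre-filter from the paper is omitted for brevity:

```python
import numpy as np
from scipy import ndimage

def segment_markers(ct, lo, hi, voxel_mm3, vol_min, vol_max):
    """Bandpass-threshold the CT volume, clean it morphologically, and
    keep only connected components whose physical volume matches a
    specific marker.

    ct        : 3D intensity volume
    lo, hi    : bandpass thresholds isolating marker-like intensities
    voxel_mm3 : physical volume of one voxel in mm^3
    vol_min, vol_max : accepted component volume range in mm^3"""
    mask = (ct >= lo) & (ct <= hi)        # two-threshold bandpass
    mask = ndimage.binary_opening(mask)   # remove isolated noisy voxels
    labels, n = ndimage.label(mask)       # 3D connected components
    out = np.zeros(ct.shape, dtype=bool)
    for i in range(1, n + 1):
        comp = labels == i
        if vol_min <= comp.sum() * voxel_mm3 <= vol_max:
            out |= comp
    return out
```

The volume test is what rejects bone fragments that survive the bandpass threshold, since their connected components are far larger than a single marker.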

Fig. 5 CT images of the specific markers. a An example of a 2D image. b The 3D reconstruction result

The 3D markers detected in the images are dispersed with different poses. To acquire the position of the small EM sensor when it is inserted into the pinhole of a marker, the pose of each marker must be estimated accurately. This is a pose estimation problem: we estimate the relative pose between the 3D reconstruction of a specific marker and its 3D CAD model. A point cloud can be extracted from the surface of the marker's CAD model, so pose estimation can be formulated as an optimization problem for the transformation between the model and the 3D reconstruction, usually solved with the iterative closest point (ICP) algorithm [3] or similar methods. Because ICP requires the two point clouds to be adequately close, initial positions of the CAD model and the specific markers must first be obtained in image space. Before running ICP, an initial position is estimated by overlapping the centroids and principal axes of the two clouds, as proposed in our previous work [16, 17, 24].
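The centroid/principal-axis initialization can be sketched as below. Principal axes are only defined up to sign, so a full pipeline such as the one in [16, 17, 24] would evaluate the candidate axis flips and keep the one with the lowest ICP residual; this sketch simply enforces a proper rotation:

```python
import numpy as np

def initial_alignment(model_pts, recon_pts):
    """Coarse pre-ICP pose: overlap centroids and principal axes.

    model_pts : (N, 3) points sampled from the CAD model surface
    recon_pts : (M, 3) points of the 3D reconstruction
    Returns (R, t) mapping model coordinates toward the reconstruction."""
    cm, cr = model_pts.mean(axis=0), recon_pts.mean(axis=0)
    # Principal axes are the eigenvectors of the covariance matrices
    _, Vm = np.linalg.eigh(np.cov((model_pts - cm).T))
    _, Vr = np.linalg.eigh(np.cov((recon_pts - cr).T))
    R = Vr @ Vm.T
    if np.linalg.det(R) < 0:      # resolve a reflection into a rotation
        Vr = Vr.copy()
        Vr[:, 0] = -Vr[:, 0]
        R = Vr @ Vm.T
    t = cr - R @ cm
    return R, t
```

The returned pose brings the two clouds close enough for ICP to converge to the correct local minimum.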

After ICP, the transformation between the two point clouds is obtained. To determine the position of the EM sensor Model 90 in image space, the relationship between the sensor Model 90 and the 3D CAD model must be acquired in patient space. The coordinate frame of the 3D CAD model is constructed and denoted as Sm(Om, Xm, Ym, Zm). The Ym axis lies on the symmetry axis of the CAD model, as the model is axisymmetric. The origin Om of Sm is the intersection of the Ym axis with the bottom of the CAD model, and the Xm and Zm axes lie in the plane of the model's bottom, as shown in Fig. 6. The pinhole of a specific marker can be machined with high accuracy to the diameter of the EM sensor Model 90, so that the symmetry axis of the sensor is coaxial with the axis Ym. Let q be the position of the EM sensor Model 90 as recorded by the trakSTAR system. When the sensor is inserted into the pinhole of a marker, the coordinate of q can be written as qm = [0, l, 0]T in the coordinate frame Sm, as shown in Fig. 6. The parameter l is difficult to determine because the sensor length of the Model 90 is 7.25 mm. Obviously, l satisfies a ≤ l ≤ b, where a and b are the top and bottom coordinates of the pinhole along the axis Ym. The parameter l can be obtained during the fiducial object calibration. Suppose two point sets of N marker positions are acquired in the image and patient spaces, respectively. Let P = {pi | i = 1, 2, …, N} and Q = {qi | i = 1, 2, …, N} be the point sets in patient and image space, respectively. P and Q satisfy the following equation:

$$ \boldsymbol{Q}={\boldsymbol{R}}_{pi}\boldsymbol{P}+{\boldsymbol{T}}_{pi}, $$
(6)
Fig. 6 The estimated position of EM sensor model 90 in the coordinate frame of the CAD model

where Rpi and Tpi are the rotation and translation from patient space to image space. However, a measurement error exists between the point set obtained in patient space and the point set obtained in image space; it is called the fiducial registration error (FRE) and is given by:

$$ \mathrm{FRE}=\sqrt{\frac{1}{N}\sum \limits_{i=1}^N{\left|\left({\boldsymbol{R}}_{pi}{p}_i+{\boldsymbol{T}}_{pi}\right)-{q}_i\right|}^2}. $$
(7)

Denote the point set of the EM sensor Model 90 in the coordinate frame Sm as Qm = {qmi | i = 1, 2, …, N}, where qmi = [0, li, 0]T, and denote the parameter set as L = {li | i = 1, 2, …, N}. The transformation between the coordinate frame Sm and the image space is expressed as:

$$ {\boldsymbol{Q}}_i={\boldsymbol{R}}_{mi}{\boldsymbol{Q}}_m+{\boldsymbol{T}}_{mi}. $$
(8)

Substituting Eq. (8) into Eq. (7), the FRE can be written as:

$$ \mathrm{FRE}\left(\boldsymbol{L}\right)=\sqrt{\frac{1}{N}\sum \limits_{i=1}^N{\left|\left({\boldsymbol{R}}_{pi}{p}_i+{\boldsymbol{T}}_{pi}\right)-\left({\boldsymbol{R}}_{mi}{q}_{mi}+{\boldsymbol{T}}_{mi}\right)\right|}^2}. $$
(9)

The parameter l differs between individual EM sensor Model 90 units. If two Model 90 sensors were used in one calibration procedure, Eq. (9) would admit many solutions, and the exact value of each parameter li could not be found. Thus, a single EM sensor Model 90 is used for each calibration procedure, so that the parameter set reduces to the single parameter l. Equation (9) can then be reformulated as:

$$ \mathrm{FRE}(l)=\sqrt{\frac{1}{N}\sum \limits_{i=1}^N{\left|\left({\boldsymbol{R}}_{pi}{p}_i+{\boldsymbol{T}}_{pi}\right)-\left({\boldsymbol{R}}_{mi}{q}_{mi}+{\boldsymbol{T}}_{mi}\right)\right|}^2}. $$
(10)

Therefore, the solution of parameter l can be found by minimizing Eq. (10).
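Because every marker's residual depends on the same scalar l, minimizing Eq. (10) is a bounded one-dimensional problem. A sketch using SciPy, where the array shapes and names are our assumptions (the per-marker poses R_mi, T_mi come from the pose estimation step, and p_reg holds the marker positions already mapped into image space):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def solve_sensor_depth(p_reg, R_mi, T_mi, a, b):
    """Find the sensor depth l that minimizes FRE(l) of Eq. (10).

    p_reg : (N, 3) marker positions mapped into image space, R_pi p_i + T_pi
    R_mi  : (N, 3, 3) per-marker rotation of the CAD frame in image space
    T_mi  : (N, 3)    per-marker translation of the CAD frame in image space
    a, b  : bounds on l from the pinhole geometry (a <= l <= b)"""
    e_y = np.array([0.0, 1.0, 0.0])     # sensor sits on the Ym axis

    def fre(l):
        q = np.einsum('nij,j->ni', R_mi, l * e_y) + T_mi
        return np.sqrt(np.mean(np.sum((p_reg - q) ** 2, axis=1)))

    res = minimize_scalar(fre, bounds=(a, b), method='bounded')
    return res.x, res.fun
```

With the geometry reported later in the paper, the bounds would be a = 11 mm and b = 18.5 mm.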

Patient-to-image registration

To make patient-to-image registration more convenient and accurate, we propose a framework that registers the preoperative images to the intraoperative patient in the IoT operating room by constructing a wireless sensor network system, as shown in Fig. 7. Registration between the fiducial object and the image is the key step. After the specific markers are localized in the 3D images via the pose estimation method, the markers in the images are matched with the corresponding markers in the world coordinate frame Sw. Correctly pairing the two position sets is critical for automatic registration: a false point-to-point pairing leads to registration failure and, in turn, navigation failure. A robust pairing method is therefore essential. Here we use our previously proposed method to determine the pairing of the two point sets before matching [5]. It treats the point-to-point correspondence problem as a constraint satisfaction problem (CSP) to automatically match the two point sets. For each point in each coordinate frame, the distances to all other points are computed, and a constraint graph is constructed from all the distances in each frame. A case-based procedure then determines the correct pairing of the two position sets; the details are presented in our previous work [24,25,26]. The rigid transformation between the two spaces is then acquired with a least-squares-based algorithm. Thus, the transformation matrix [Rwi, Twi], which aligns the preoperative image coordinate frame with the world coordinate frame, is obtained. To perform navigation, the transformation [Rpi, Tpi] between the image space and the EM sensor Model 800 must be acquired. The transformation matrix [Rpi, Tpi] satisfies the relationship:

$$ \left[\begin{array}{cc}{\boldsymbol{R}}_{pi}& {\boldsymbol{T}}_{pi}\\ {}0& 1\end{array}\right]={\left[\begin{array}{cc}{\boldsymbol{R}}_{wp}& {\boldsymbol{T}}_{wp}\\ {}0& 1\end{array}\right]}^{-1}\cdot \left[\begin{array}{cc}{\boldsymbol{R}}_{wi}& {\boldsymbol{T}}_{wi}\\ {}0& 1\end{array}\right]. $$
(11)
Fig. 7 Electromagnetic navigation system and the coordinate transformations involved in spatial registration in the IoT operating room

where [Rwp, Twp] is acquired from the position and directions of the EM sensor Model 800 reported by the trakSTAR system. Therefore, the transformation matrix [Rsi, Tsi], which relates the EM sensor Model 90 to the image coordinate frame, can be expressed as:

$$ \left[\begin{array}{cc}{\boldsymbol{R}}_{si}& {\boldsymbol{T}}_{si}\\ {}0& 1\end{array}\right]=\left[\begin{array}{cc}{\boldsymbol{R}}_{sw}& {\boldsymbol{T}}_{sw}\\ {}0& 1\end{array}\right]\cdot \left[\begin{array}{cc}{\boldsymbol{R}}_{wi}& {\boldsymbol{T}}_{wi}\\ {}0& 1\end{array}\right]. $$
(12)

where [Rsw, Tsw] is acquired from the position and directions of the EM sensor Model 90 reported by the trakSTAR system. To track the position and directions of a surgical instrument, an EM sensor must be attached to the instrument before the operation. By applying the transformation matrix [Rsi, Tsi] to coordinates acquired in the frame of the EM sensor Model 90, the registration of patient to image is accomplished, and the corresponding position and directions of the surgical instrument can be shown in image space.
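The registration chain above can be sketched end to end: a simplified distance-signature pairing (a stand-in for the CSP method of [24,25,26], which is more robust), the least-squares rigid fit, and the homogeneous-matrix composition of Eq. (11). All helper names are ours:

```python
import numpy as np

def pair_by_distances(A, B):
    """Match points between two frames by comparing each point's sorted
    distances to all other points; rigid motions preserve these
    signatures. Simplified stand-in for the CSP pairing method."""
    sig = lambda X: np.sort(np.linalg.norm(X[:, None] - X[None], axis=2), axis=1)
    sa, sb = sig(A), sig(B)
    return [int(np.argmin(np.linalg.norm(sb - s, axis=1))) for s in sa]

def rigid_fit(P, Q):
    """Least-squares rigid transform (SVD/Kabsch): Q ~ R P + T."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cp).T @ (Q - cq))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                # D guards against reflections
    return R, cq - R @ cp

def homogeneous(R, T):
    """Pack [R, T] into a 4x4 homogeneous matrix."""
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, T
    return M

def image_to_sensor800(M_wp, M_wi):
    """Eq. (11): [R_pi, T_pi] = [R_wp, T_wp]^{-1} [R_wi, T_wi]."""
    return np.linalg.inv(M_wp) @ M_wi
```

Chaining the homogeneous matrices this way keeps each intermediate transformation reusable when other instruments are added to the network.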

Experimental results and analysis

A fiducial object with twenty-eight cylindrical slots distributed around it is manufactured (see Figs. 1 and 2). The fiducial object is scanned after the specific markers are placed into the twenty-eight slots. A dual-source CT system (SOMATOM Definition, Siemens Medical Solutions, Forchheim, Germany) is used in high-resolution mode (0.3 mm × 0.3 mm × 0.3 mm) to scan the fiducial object. The specific markers are segmented from the obtained CT images, and the segmented markers are reconstructed and automatically registered to the CAD model of the marker. The calibration of the EM sensor models is performed thirty times, with the fiducial object placed in a different position for each calibration. One EM sensor Model 90 is used in all calibrations. The height and diameter of the large cylindrical part are 8.5 mm and 16 mm, and those of the small cylindrical part are 10 mm and 6 mm. The height of the pinhole is 7.5 mm. Thus, the values of the parameters a and b are 11 mm and 18.5 mm. The parameter l is solved using Eq. (10) for each calibration. The position of the EM sensor Model 800 can therefore be calibrated in image space, and the transformation between the EM sensor Model 800 and the image space can be obtained. Figure 8 shows the FREs of all the calibrations, defined as the residual distances of all marker positions involved in the registration when transformed from patient space to image space. The mean FRE over all calibrations is 1.38 mm, indicating that the calibration method for the EM sensor models is accurate enough to perform patient-to-image registration (Fig. 8).

Fig. 8 FRE values obtained from all the calibrations

To evaluate the accuracy of the EM sensor model calibration, a square board phantom is constructed by 3D printing, as shown in Fig. 9. Four small holes, large enough to fit the EM sensor Model 90, are printed near the corners of the board. The coordinates of the four holes are used to compute the edge lengths of the rectangle; the four edges are denoted L1, L2, L3, and L4, as illustrated in Fig. 9a. A Vernier caliper is used to measure the four edge lengths 60 times. The EM sensor Model 90 is placed into each small hole in turn, also 60 times, to measure the hole positions. The edge lengths measured with the EM sensor Model 90 are compared with those measured with the Vernier caliper; the results are presented in Table 1. The absolute and relative errors between the two methods are below 1 mm and below 1%, respectively (see Table 1), with no significant difference between the methods. To compare the EM sensor measurements against the Vernier caliper measurements, the caliper values are subtracted from the EM sensor values; the resulting errors are presented in Fig. 10. The maximum absolute error is 0.65 mm. Manual error introduced while positioning the holes of the board phantom with the two methods accounts for this maximum; most errors, however, are below 0.40 mm in absolute value. These results indicate that the proposed method achieves high calibration accuracy.
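The edge-length comparison reduces to a few lines; the hole ordering around the rectangle and the function names are our assumptions:

```python
import numpy as np

def edge_lengths(holes):
    """Edge lengths L1..L4 of the board from the four corner-hole
    positions, ordered around the rectangle; holes has shape (4, 3)."""
    return np.array([np.linalg.norm(holes[i] - holes[(i + 1) % 4])
                     for i in range(4)])

def length_errors(em_lengths, caliper_lengths):
    """Signed errors (EM minus caliper) and relative errors in percent."""
    err = em_lengths - caliper_lengths
    return err, 100.0 * err / caliper_lengths
```

Computing lengths between hole pairs, rather than comparing raw coordinates, cancels any constant offset between the two measurement frames.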

Fig. 9 A square board phantom is constructed to verify the accuracy of surgical navigation

Table 1 Length of four board’s edges measured by EM sensor model 90 and Vernier caliper in mm
Fig. 10 The measurement errors between the two methods

To verify the accuracy of registration from patient space to image space, a human phantom is produced and the proposed registration method is performed for navigation, as shown in Fig. 11. Navigation software cooperating with the trakSTAR system is designed for displaying the 3D image and for navigation. The navigation procedure for the human phantom is implemented as follows. A CT scan is performed to obtain preoperative images, which are then loaded into the navigation system. Six targets are selected on the heart surface of the human phantom to verify the registration accuracy. Navigation is performed after the registration of patient space to image space. Each target is positioned fifteen times with the EM sensor Model 90. The target registration errors (TREs) are computed for all targets, where the TRE is defined as the distance between a target and the tip of the EM sensor Model 90 in image space. Figure 12 shows the TREs measured for all targets. The average TRE over all targets is 1.38 mm with a standard deviation of 0.32 mm. These results suggest that the proposed registration method achieves high accuracy for the electromagnetic navigation system.
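For reference, the TRE statistics for one target reduce to distances in image space (the array shapes and function name are our assumptions):

```python
import numpy as np

def tre_stats(target_img, tip_positions_img):
    """TRE per repeat: distances between one target and the tracked
    sensor tip, both expressed in image space.

    target_img        : (3,) target position in image space
    tip_positions_img : (K, 3) tip positions over K repeated probings
    Returns the mean and standard deviation of the K distances."""
    d = np.linalg.norm(tip_positions_img - target_img, axis=1)
    return d.mean(), d.std()
```

Averaging over repeated probings of each target separates the registration error from the random positioning error of a single measurement.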

Fig. 11 Platform of electromagnetic navigation

Fig. 12 TRE values achieved by performing the proposed automatic registration

Conclusions and future work

The registration procedure of an electromagnetic navigation system should be fast and convenient. To this end, we developed an automatic and accurate registration method that integrates patient-image registration with the calibration of the fiducial object and the EM sensors. The board phantom experiments demonstrate that the calibration method for the EM sensor models presented in this paper is convenient and accurate, and the human phantom experiments show that the proposed automatic registration method achieves high accuracy. The fiducial object and the specific markers are the basis of both the calibration method and the automatic registration method.

The proposed method is automatic and matches the EM sensor model 800 between the two spaces in real time via the specific markers, and the calibration result of the EM sensor models can be obtained during the calibration procedure of the fiducial object. The experimental results show that the proposed registration method plays an important role in the implementation of an electromagnetic navigation system. Future work will focus mainly on extending the method to work with a robotic arm, which should enable more accurate surgical navigation.

Availability of data and materials

The relevant analysis data used to support the findings of this study are included in the article.

Abbreviations

IoT: Internet of Things

EM: Electromagnetic

CT: Computed tomography

MR: Magnetic resonance

MRI: Magnetic resonance imaging

PVC: Polyvinyl chloride

IGSN: Image-guided surgical navigation

ICP: Iterative closest point

FRE: Fiducial registration error

CSP: Constraint satisfaction problem

TRE: Target registration error

References

  1. B. Moulin, L. Tselikas, T. De Baere, F. Varin, A. Abed, L. Debays, C. Bardoulat, A. Hakime, C. Teriitehau, F. Deschamps, CT guidance assisted by electromagnetic navigation system for percutaneous fixation by internal cemented screws (FICS). Eur. Radiol. 1–7 (2019)
  2. K. Jahn, R. Hartmann, D. Schumann, J. Bremerich, M. Tamm, D. Stolz, Electromagnetic navigation bronchoscopy for peripheral nodules. ERS International Congress (2019)
  3. O. Awais, M.R. Reidy, K. Mehta, V. Bianco, W.E. Gooding, M.J. Schuchert, J.D. Luketich, A. Pennathur, Electromagnetic navigation bronchoscopy-guided dye marking for thoracoscopic resection of pulmonary nodules. Ann. Thorac. Surg. 102(1), 223–229 (2016)
  4. A. Chen, N. Pastis, B. Furukawa, G.A. Silvestri, The effect of respiratory motion on pulmonary nodule location during electromagnetic navigation bronchoscopy. Chest 147(5), 1275–1281 (2015)
  5. E.E. Folch, M.A. Pritchett, M.A. Nead, M.R. Bowling, S.D. Murgu, W.S. Krimsky, B.A. Murillo, G.P. LeMense, D.J. Minnich, S. Bansal, Electromagnetic navigation bronchoscopy for peripheral pulmonary lesions: one-year results of the prospective, multicenter NAVIGATE study. J. Thorac. Oncol. 14(3), 445–458 (2019)
  6. M.H. Mozaffari, W.-S. Lee, Freehand 3-D ultrasound imaging: a systematic review. Ultrasound Med. Biol. 43(10), 2099–2124 (2017)
  7. B.J. Wood, H. Zhang, A. Durrani, N. Glossop, S. Ranjan, D. Lindisch, E. Levy, F. Banovac, J. Borgert, S. Krueger, Navigation with electromagnetic tracking for interventional radiology procedures: a feasibility study. J. Vasc. Interv. Radiol. 16(4), 493–505 (2005)
  8. A.M. Franz, T. Haidegger, W. Birkfellner, K. Cleary, T.M. Peters, L. Maier-Hein, Electromagnetic tracking in medicine—a review of technology, validation, and applications. IEEE Trans. Med. Imaging 33(8), 1702–1725 (2014)
  9. M.R. Bowling, E.E. Folch, S.J. Khandhar, J. Kazakov, W.S. Krimsky, G.P. LeMense, P.A. Linden, B.A. Murillo, M.A. Nead, M.A. Pritchett, Fiducial marker placement with electromagnetic navigation bronchoscopy: a subgroup analysis of the prospective, multicenter NAVIGATE study. Ther. Adv. Respir. Dis. 13, 1–13 (2019)
  10. W. Lehmann, J.M. Rueger, J. Nuechtern, L. Grossterlinden, M. Kammal, M. Hoffmann, A novel electromagnetic navigation tool for acetabular surgery. Injury 46, S71–S74 (2015)
  11. Z. Dai, R. Yang, F. Hang, J. Zhuang, Q. Lin, Z. Wang, Y. Lao, Neurosurgical craniotomy localization using interactive 3D lesion mapping for image-guided neurosurgery. IEEE Access 7, 10606–10616 (2019)
  12. P.-W. Hsu, R.W. Prager, A.H. Gee, G.M. Treece, Freehand 3D ultrasound calibration: a review, in Advanced Imaging in Biology and Medicine (Springer, 2009), pp. 47–84
  13. P.-W. Hsu, R.W. Prager, N.E. Houghton, A.H. Gee, G.M. Treece, Accurate fiducial location for freehand 3D ultrasound calibration. Proc. SPIE 6513, 651315 (2007)
  14. Q. Lin, K. Cai, R. Yang, H. Chen, Z. Wang, J. Zhou, Development and validation of a near-infrared optical system for tracking surgical instruments. J. Med. Syst. 40(4), 107 (2016)
  15. L. Zheng, H. Wu, L. Yang, Y. Lao, Q. Lin, R. Yang, A novel respiratory follow-up robotic system for thoracic-abdominal puncture. IEEE Trans. Ind. Electron., in press (2020)
  16. Q. Lin, R. Yang, K. Cai, X. Si, X. Chen, X. Wu, Real-time automatic registration in optical surgical navigation. Infrared Phys. Technol. 76, 375–385 (2016)
  17. Q. Lin, R. Yang, K. Cai, P. Guan, W. Xiao, X. Wu, Strategy for accurate liver intervention by an optical tracking system. Biomed. Opt. Express 6(9), 3287 (2015)
  18. R. Yang, Z. Wang, S. Liu, X. Wu, Design of an accurate near infrared optical tracking system in surgical navigation. J. Lightwave Technol. 31(2), 223–231 (2012)
  19. M. Wang, Z. Song, Automatic localization of the center of fiducial markers in 3D CT/MRI images for image-guided neurosurgery. Pattern Recogn. Lett. 30(4), 414–420 (2009)
  20. S. Krueger, S. Wolff, A. Schmitgen, H. Timinger, M. Bublat, T. Schaeffter, A. Nabavi, Fast and accurate automatic registration for MR-guided procedures using active microcoils. IEEE Trans. Med. Imaging 26(3), 385–392 (2007)
  21. Q. Lin, K. Cai, R. Yang, W. Xiao, J. Huang, Y. Zhan, J. Zhuang, Geometric calibration of markerless optical surgical navigation system. Int. J. Med. Robot. Comp. 15(2), e1978 (2019)
  22. H. Tang, B. Xiao, W. Li, G. Wang, Pixel convolutional neural network for multi-focus image fusion. Inf. Sci. 433, 125–141 (2018)
  23. B. Xiao, K. Wang, X. Bi, W. Li, J. Han, 2D-LBP: an enhanced local binary feature for texture image classification. IEEE Trans. Circ. Syst. Vid. 29(9), 2796–2808 (2018)
  24. Q. Lin, R. Yang, L. Yang, H. Chen, B. Li, K. Cai, Optimization model for the distribution of fiducial markers in liver intervention. J. Med. Syst. 44(4), 1–11 (2020)
  25. H. Wu, Q. Lin, R. Yang, Y. Zhou, L. Zheng, Y. Huang, Z. Wang, Y. Lao, J. Huang, An accurate recognition of infrared retro-reflective markers in surgical navigation. J. Med. Syst. 43(6), 153 (2019)
  26. Q. Lin, R. Yang, Z. Zhang, K. Cai, Z. Wang, M. Huang, J. Huang, Y. Zhan, J. Zhuang, Robust stereo-match algorithm for infrared markers in image-guided optical tracking system. IEEE Access 6, 52421–52433 (2018)

Acknowledgements

Not applicable.

Funding

This work was supported in part by the China Postdoctoral Science Foundation (grants 2018T110880 and 2017M620375), the National Natural Science Foundation of China (grant 81671788), the Guangdong Provincial Science and Technology Program (grants 2016A020220006, 2017B020210008, 2017B010110015, and 2017A040405054), the Fundamental Research Funds for the Central Universities (grants x2yxD2182720 and 2017ZD082), and the Guangzhou Science and Technology Program (grants 201704020228 and 202002030246).

Author information

Contributions

All authors read and approved the final manuscript.

Corresponding author

Correspondence to Ken Cai.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Lin, Q., Yang, R., Dai, Z. et al. Automatic registration method using EM sensors in the IoT operating room. J Wireless Com Network 2020, 136 (2020). https://doi.org/10.1186/s13638-020-01754-w

Keywords

  • Wireless sensor networks
  • Automatic registration
  • Electromagnetic navigation
  • Electromagnetic sensor calibration