Invented by Bernhard Adolf Fuerst, Pablo E. Garcia Kilroy, Berk Gonenc, Jose Luis Cordoba, Joan Savall, and Alexander Barthel; assigned to Verb Surgical Inc.
One of the key drivers of this market is the increasing demand for minimally invasive surgeries. These procedures are less invasive than traditional surgeries, resulting in less pain, shorter hospital stays, and faster recovery times for patients. Robotic surgery is particularly well-suited for minimally invasive procedures, as it allows surgeons to operate with greater precision and control than traditional surgical methods.
Another factor driving the market for user-input devices for robotic surgery is the increasing adoption of robotic surgery by hospitals and healthcare providers. As more hospitals invest in robotic surgical systems, the demand for user-input devices to control these systems is also increasing.
There are several different types of user-input devices for robotic surgery, including joysticks, hand-held controllers, and foot pedals. These devices allow surgeons to control the movement of robotic surgical instruments with greater precision and accuracy than traditional surgical methods.
One of the key challenges facing the market for user-input devices for robotic surgery is the high cost of these devices. Robotic surgical systems are already expensive, and adding user-input devices can add significantly to the cost. This can make it difficult for smaller hospitals and healthcare providers to invest in these systems.
Despite these challenges, the market for user-input devices for robotic surgery is expected to continue to grow in the coming years. As more hospitals and healthcare providers adopt robotic surgical systems, the demand for user-input devices to control these systems will also increase. With advances in technology and the development of new and innovative user-input devices, the market for robotic surgery is poised for continued growth and success.
The Verb Surgical Inc. invention works as follows:
User input devices (UIDs) for controlling a surgical robotic system are described. A UID may include one or more tracking sensors that generate spatial state signals according to the pose of the UID. One of the tracking sensors can be a video camera. When there are multiple tracking sensors, a sensor fusion algorithm processes the spatial state signals to produce a single, more robust tracking signal along with a quality measure. The tracking signal and the quality measure are used by a digital control system to control the motion of an actuator associated with the UID. Other embodiments are described and claimed.
Background for User-input device for robotic surgery
Field
Embodiments relating to robotic systems are described. More specifically, embodiments relating to surgical robotic systems and user input devices are disclosed.
Background Information
Endoscopic surgery is the practice of looking into a patient's body with an endoscope and performing procedures inside the body with surgical instruments. Laparoscopic surgery, for example, can use a laparoscope to view and access the abdominal cavity. Endoscopic surgery may be performed with manual tools or with a robotic surgical system that has robotically assisted instruments.
A surgeon can remotely operate a surgical robotic system from a user console to control a robotically assisted tool at an operating table. The surgeon can use a computer console, located in the operating room or as far away as another city, to command a robot that manipulates a surgical tool mounted on the operating table. The robotically controlled surgical tool can be, for example, a grasper mounted on a robotic arm, and the remote surgeon can direct the surgical robotic system to grasp tissue during a robotic operation.
The surgeon may be required to provide control inputs in order to operate the surgical robotic system. The surgeon can use a UID (user input device) such as a mouse or joystick to control the motion of surgical robotic system components.
Existing UIDs that rely on a single tracker are spatially restricted and subject to error. These errors can lead to unwanted and potentially dangerous movements of the robotic manipulator. For medical applications, sub-millimeter accuracy (in both translation and orientation) may be needed to ensure clinically viable operation. Filtering the UID control signal can reduce the system noise that leads to control error, but filtering introduces latency, which adversely affects the stability of the robotic manipulator. A real-time, low-noise sensing method is therefore needed to detect the position, orientation, and state of the UID.
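To illustrate the smoothing-versus-latency trade-off described above, here is a minimal sketch of a first-order low-pass (exponential) filter. The class name and parameter values are illustrative assumptions, not taken from the patent: a heavier smoothing setting rejects more noise, but makes the filtered pose lag further behind the operator's hand.

```python
class ExponentialSmoother:
    """First-order low-pass filter for a scalar UID signal (illustrative).

    A higher alpha tracks the raw signal faster but passes more noise;
    a lower alpha smooths more aggressively but adds lag (latency),
    which is the stability problem described in the text.
    """

    def __init__(self, alpha):
        self.alpha = alpha  # hypothetical smoothing factor in (0, 1]
        self.state = None

    def update(self, sample):
        if self.state is None:
            # First sample initializes the filter state directly.
            self.state = sample
        else:
            # Blend the new sample with the previous filtered value.
            self.state = self.alpha * sample + (1.0 - self.alpha) * self.state
        return self.state
```

In practice each axis of the UID's position and orientation would be filtered separately, and alpha would be tuned against the manipulator's stability requirements.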
An aspect of the disclosure is a UID that controls a surgical robotic system based on a combination of several tracking modalities. This is in contrast to a UID that relies on a single tracking modality, which can be spatially restricted and prone to error. The tracking modalities may include both a visual modality and an inertial modality for estimating the UID's pose. Visual/inertial odometry fuses i) a pose estimate based on the visual modality with ii) a pose estimate based on the inertial modality. To generate an accurate pose, the pose estimates calculated using imaging optics are combined with measurements from a tracking device such as an inertial measurement unit (IMU) and/or an electromagnetic sensor. The UID can thus enable robust, fast (e.g., real-time) tracking over a large (unrestricted) range of motion, and may track without being defeated by optical distortions or electromagnetic noise.
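The fusion idea can be sketched as a confidence-weighted blend of the two pose estimates. The function below is a deliberately simplified illustration with hypothetical names: it blends 3-D positions only, whereas a real implementation would fuse full 6-DoF poses (typically with a Kalman-style filter) and derive the quality measure from the sensors themselves.

```python
def fuse_poses(visual_pos, visual_conf, inertial_pos, inertial_conf):
    """Blend two 3-D position estimates by confidence weight (illustrative).

    Returns the fused position and a simple combined quality measure,
    standing in for the patent's fused tracking signal and quality measure.
    """
    total = visual_conf + inertial_conf
    fused = tuple(
        (visual_conf * v + inertial_conf * i) / total
        for v, i in zip(visual_pos, inertial_pos)
    )
    # Crude quality measure: the average of the two confidences.
    quality = total / 2.0
    return fused, quality
```

For example, if the camera temporarily loses sight of the UID, `visual_conf` can drop toward zero and the fused estimate falls back on the inertial measurement.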
The summary above is not an exhaustive list of aspects of the invention. The invention is intended to include all systems and methods that can be practiced from any combination of the aspects summarized above and those disclosed in the Detailed Description and particularly pointed out in the claims filed with the application. Such combinations may offer advantages not specifically mentioned in the summary.
Embodiments of a user input device (UID) that controls a robotic system, and more specifically a surgical robotic system, are described. UIDs can also be used to control other medical systems, such as medical vision systems and interventional cardiology systems.
Various embodiments are described with reference to the figures. Certain embodiments can be practiced without these details or with other methods or configurations. To provide a thorough understanding of the embodiments, the following description sets out many specific details, such as configurations, measurements, and processes. In other instances, well-known processes and manufacturing methods have not been described in detail to avoid obscuring the description. Reference throughout this specification to "one embodiment," "an embodiment," or the like means that at least one embodiment includes a particular feature, structure, or configuration; these phrases do not necessarily refer to the same embodiment. Furthermore, the features, structures, or configurations may be combined in any suitable way in one or more embodiments.
The description may use relative terms to indicate a relative direction or position. For example, "distal" can indicate a first direction away from a reference point, such as away from an operator. Similarly, "proximal" can indicate a position in a direction opposite the first direction, e.g., toward the operator. These terms are used to establish frames of reference, not to restrict the use or orientation of a UID to the specific configurations described below.
FIG. 1 is a pictorial view of an example surgical robotic system 100 in an operating room. The robotic system 100 comprises a user console 120, a control tower 130, and one or more surgical robotic arms 112 located at a surgical robotic platform 111 (e.g., a table, bed, etc.). The system 100 may incorporate any number of tools, devices, or accessories used to perform surgery on a patient 102. For example, the system 100 can include one or more surgical tools 104 used to perform the surgery. A surgical tool 104 can be an end effector attached to the distal end of a surgical arm 112 and used for performing a surgical procedure.
Each surgical tool 104 can be manipulated manually, robotically, or both during the surgery. For example, the surgical tool 104 can be used to view or manipulate the internal anatomy of the patient 102. In one embodiment, the surgical tool 104 is a grasper that can grasp tissue of the patient 102. The surgical tool 104 can be controlled manually by a bedside operator 106, or robotically via the surgical robotic arm 112 to which it is attached. The robotic arms 112 are shown table-mounted, but they can also be mounted on a cart, a ceiling, or a sidewall.
Generally, a remote operator 107, such as a surgeon or another operator, can use the user console 120 to remotely operate the arms 112 and/or the attached surgical tools 104, for example by teleoperation. The user console 120 can be located in the same operating room as the rest of the system 100, as shown in FIG. 1. In other environments, however, the user console 120 may be in an adjacent or nearby room, or in a remote location such as a different building, city, or country. The user console 120 may comprise a seat 122, foot-operated controls 124, one or more handheld user input devices (UIDs) 126, and at least one operator display 128 configured to show, for example, a view of the surgical site inside the patient 102. In this example user console 120, the remote operator 107 sits in the seat 122 viewing the operator display 128 while manipulating the foot-operated controls 124 and handheld UIDs 126 to remotely control the arms 112 (and the surgical tools 104 mounted at the distal ends of the arms 112). The foot-operated controls 124 can be foot pedals (such as seven pedals) that generate motion control signals when actuated. The user console 120 can include additional input devices, such as a joystick or keyboard, to receive manual inputs.
In some variations, the bedside operator 106 can also operate the system 100 in an "over the bed" mode, in which the bedside operator 106 stands at the patient's side and simultaneously manipulates a robotically driven tool (an end effector attached to an arm 112) with a handheld UID 126 in one hand and a manual laparoscopic tool in the other. For example, the bedside operator's left hand may manipulate the handheld UID 126 to control a robotic component while the right hand manipulates a manual laparoscopic tool. In these variations, the bedside operator 106 can perform both robotically assisted minimally invasive surgery and manual laparoscopic surgery on the patient 102.
During an example procedure (surgery), the patient 102 is prepped, draped, and anesthetized in a sterile manner. Initial access to the patient's anatomy can be gained using known techniques, such as forming an incision in the skin; a trocar or other surgical tool can then be inserted through the incision and positioned at the surgical site. Initial access can be performed manually while the arms 112 of the robotic system are in a stowed or withdrawn configuration (to facilitate access to the surgical site), or in a parking position defined by the operator. Once initial access is completed, the robotic system and its arms 112 can be prepared or positioned in an initial configuration. The remote operator 107 then performs the surgery at the user console 120, using the foot-operated controls 124 and the UIDs 126 to manipulate various end effectors and possibly an imaging system. Manual assistance can also be provided at the procedure table or bed by sterile-gowned bedside personnel, e.g., the bedside operator 106, who performs tasks such as retracting tissue, manual repositioning, and tool exchange on one or more robotic arms 112. Non-sterile personnel may also assist the remote operator 107 at the user console 120. After the procedure is complete, the system 100 and/or the user console 120 may be configured for post-operative tasks such as cleaning or sterilization.
In one embodiment, the remote operator 107 holds and moves the UID 126 to provide an input command to move a robot arm actuator 114 in the robotic system 100. The UID 126 can be communicatively coupled to the rest of the robotic system 100, e.g., via the console computer system 110. The UID 126 can generate spatial state signals corresponding to movement of the UID 126 (e.g., the position and orientation of its handheld housing), and the spatial state signals can serve as input signals for controlling motion of the robot arm actuator 114. The robotic system 100 can produce control signals, derived from the spatial state signals, to control proportional motion of the actuator 114. In one embodiment, a console processor of the console computer system 110 receives the spatial state signals and generates the corresponding control signals. These control signals determine how the actuator 114 moves a segment or link of the arm 112, and thus the movement of the attached surgical tool, including its end effector. Similarly, interaction between the remote operator 107 and the UID 126 can generate a grip signal that causes a jaw of a grasper of the surgical tool to close and grip tissue of the patient 102.
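The proportional mapping from UID motion to actuator motion can be sketched as a simple motion-scaling step. The function name and scale factor below are hypothetical; the text only states that the control signals produce proportional motion of the actuator 114.

```python
def uid_to_actuator_command(uid_delta, scale=0.25):
    """Map an incremental UID displacement (e.g., metres of hand motion)
    to a proportional actuator displacement (illustrative).

    Scaling hand motion down by a fixed factor is a common way for
    teleoperated systems to gain precision at the surgical site;
    the 0.25 default here is an assumption, not a value from the patent.
    """
    return tuple(scale * d for d in uid_delta)
```

With this kind of mapping, a 4 cm hand motion would command roughly a 1 cm tool motion, which helps meet the sub-millimeter accuracy requirement discussed earlier.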
The sensed movement of the UID 126 can also control other aspects of the surgical robotic system 100. For example, a finger clutch may generate a clutch signal to pause motion of the actuator 114 and the corresponding surgical tool 104. When the operator presses the finger clutch of the UID 126, a clutch signal is generated, and this clutch signal can serve as an input to pause motion of the actuator 114. The UID 126 can also include one or more capacitive sensing pads, which the operator can use to control a camera view of an endoscope, or a cursor on a display of the user console 120, while performing a robotic, diagnostic, surgical, or laparoscopic procedure.
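The clutch behavior can be sketched as a gate that zeroes out motion deltas while the finger clutch is pressed, so the arm holds its pose until the operator releases the clutch (for example, to reposition the UID within its workspace). The class below is an illustrative assumption about how such a gate might be structured, not the patent's implementation.

```python
class ClutchGate:
    """Pass UID motion deltas through only while the clutch is released
    (illustrative sketch).

    While the clutch is pressed, the gate emits zero motion, so the
    actuator holds position even though the UID keeps moving.
    """

    def __init__(self):
        self.clutched = False

    def set_clutch(self, pressed):
        # Driven by the finger-clutch signal from the UID.
        self.clutched = pressed

    def filter(self, delta):
        # Suppress motion commands while clutched; pass through otherwise.
        return tuple(0.0 for _ in delta) if self.clutched else delta
```

A real system would also have to re-align the UID and tool frames when the clutch is released, which this sketch omits.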
The surgical robotic system 100 may include several UIDs 126, with respective control signals generated from each UID to control the actuators, and the surgical tool, of a respective arm 112. For example, the remote operator 107 may move a first UID 126 to control the motion of an actuator 114 in a left robotic arm, where the actuator responds by moving linkages, gears, etc., in that arm 112. Similarly, movement of a second UID 126 by the remote operator 107 controls the motion of another actuator 114, which in turn moves other linkages, gears, etc., of the robotic system 100. The robotic system 100 can include a right arm 112 secured to the bed, table, or other surface on the right side of the patient, and a left arm 112 on the left side. An actuator 114 may include one or more motors that are controlled to drive rotation of a joint of the arm 112, for example to change, relative to the patient, the orientation of an endoscope or grasper of the surgical tool attached to that arm. Motion of several actuators 114 in the same arm 112 can be controlled by the spatial state signals generated from a particular UID 126. The UIDs 126 can also control motion of respective surgical tool graspers. For example, each UID 126 can generate a respective grip signal to control motion of an actuator, such as a linear motor, that opens or closes jaws of the grasper at the distal end of the surgical tool to grip tissue within the patient 102.
In some aspects, communication between the robotic platform 111 and the user console 120 may be through a control tower 130, which may translate operator commands received from the user console 120 (and, more particularly, from the console computer system 110) into robotic control commands that are transmitted to the arms 112 on the robotic platform 111. The control tower 130 can also transmit status and feedback from the platform 111 back to the user console 120. The communication connections between the robotic platform 111, the user console 120, and the control tower 130 can be via wired and/or wireless links, using any suitable data communication protocols. Any wired connections may be built into the floor, walls, or ceiling of the operating room. The robotic system 100 can provide video output to one or more displays, including displays within the operating room as well as remote displays accessible via the Internet or other networks. The video output or feed can also be encrypted to ensure privacy, and all or portions of the video output may be saved to a server or electronic healthcare record system.