Invented by Aleksandra Popovic, David Paul Noonan, Koninklijke Philips NV
Robotic surgery has revolutionized the field of medicine, allowing for more precise and minimally invasive procedures. With the advancement of technology, the integration of virtual reality (VR) devices into robotic surgery has opened up new possibilities for surgeons and patients alike. The system, controller, and method for using a virtual reality device in robotic surgery have created a significant impact on the market, with a growing demand for this innovative technology.
Virtual reality in robotic surgery offers several advantages. Firstly, it provides surgeons with a more immersive and realistic view of the surgical site. By wearing a VR headset, surgeons can visualize the patient’s anatomy in three dimensions, enhancing their spatial awareness and depth perception. This improved visualization can lead to more accurate and precise surgical maneuvers, reducing the risk of complications and improving patient outcomes.
Secondly, the integration of VR devices allows for enhanced training and education for surgeons. Trainees can practice complex surgical procedures in a virtual environment, simulating real-life scenarios without the need for live patients. This not only improves their skills but also reduces the risk associated with learning on actual patients. As a result, the demand for VR-based surgical training programs has been on the rise, driving the market growth for the system, controller, and method for using a virtual reality device in robotic surgery.
Furthermore, virtual reality in robotic surgery has the potential to improve patient engagement and satisfaction. Patients can have a better understanding of their condition and the proposed surgical procedure through immersive VR experiences. This increased patient involvement can lead to better decision-making and a higher level of trust between the patient and the surgeon. Consequently, the market for the system, controller, and method for using a virtual reality device in robotic surgery is driven by the growing demand for patient-centered care and improved surgical outcomes.
The market for the system, controller, and method for using a virtual reality device in robotic surgery is expected to witness significant growth in the coming years. Technological advancements in both robotic surgery and virtual reality are driving the development of more sophisticated and user-friendly devices. Additionally, the increasing adoption of robotic surgery in various medical specialties, such as urology, gynecology, and general surgery, is further fueling the demand for VR integration.
However, there are challenges that need to be addressed for the widespread adoption of virtual reality in robotic surgery. One such challenge is the cost associated with implementing this technology. VR devices and software can be expensive, making it a significant investment for hospitals and surgical centers. Additionally, there may be concerns regarding the learning curve for surgeons and the need for specialized training to effectively use VR devices in robotic surgery.
In conclusion, the market for the system, controller, and method for using a virtual reality device in robotic surgery is experiencing rapid growth due to the numerous benefits it offers. Improved visualization, enhanced training, and increased patient engagement are driving the demand for this innovative technology. As technology continues to advance and costs decrease, virtual reality in robotic surgery is expected to become a standard practice in the field of medicine, revolutionizing surgical procedures and improving patient outcomes.
The Koninklijke Philips NV invention works as follows:
The control unit includes a processor configured to: transmit acquired live images of a surgical site within a patient, received from an image acquisition device, to a virtual reality (VR) device for display; receive input data from the VR device, including tracking data from a VR tracking system of the VR device based on the user's response to the live images displayed on the viewer of the VR device's display; process the received input data to select a target in the surgical site; determine a path for guiding the robot end-effector towards the selected target; and transmit corresponding robot control signals to the robot.
Background for The system, controller, and method for using a virtual reality device in robotic surgery
Surgery relies on the individual skills of each surgeon. Dexterity is usually limited to the surgeon's hands and rigid instruments. These limitations are especially evident in minimally invasive surgery and natural orifice surgery, where the space available to operate is constrained by the entry point as well as by the anatomy. In minimally invasive surgery, an endoscope is used to provide visual feedback.
Controlling dexterous hand-held devices can be challenging. The user must combine non-dexterous proximal motion, typically about an entry point into the body (the fulcrum), with complex dexterous movements inside the body. Robotic positioning of dexterous devices is one way to solve this problem, but it increases the footprint of the operating room as well as the cost and length of the surgery. The problem is exacerbated if the proximal part of the device is out of the field of view of the imaging devices: an endoscope only captures images from inside the patient, and the field of view of a portable imaging device, such as a C-arm, can be too small to capture the entire device, increasing radiation exposure for the operator. Dexterous devices can also drift out of alignment once the desired position has been achieved, due to hand tremor or involuntary hand movements. To increase the surgeon's dexterity and improve control, some surgical robots have more than six degrees of freedom, which makes them difficult to use.
This problem is worsened by redundant and hyper-redundant robots, such as snake robots, which are controlled by handles that require a steep learning curve and are difficult to operate. Users navigate the surgical area using endoscopic images, and it is hard to map the motion of the handle to those images.
It is therefore desirable to provide a system, method, and computer-readable medium for controlling a surgical robotic device using a combination of live imagery and tracking data provided by a virtual reality device, enabling target selection through motion detection without relying on the user's hands or general dexterity.
According to an illustrative embodiment, the processor is configured to transmit acquired live images of a patient, received from at least one image acquisition device, to a virtual reality (VR) device for display on a display unit of the VR device, and to receive input data from the VR device, including tracking data based on the user's response to the displayed live images.
According to another illustrative embodiment, a surgical robot system includes a robot configured to operate an end-effector at a surgical site within a patient; at least one image acquisition device configured to acquire live images of the surgical site; and a VR device configured to display the live images and to determine tracking data based on actions taken by a user through the VR device. A control unit comprises input/output (I/O) circuitry and a processor. The I/O circuitry is configured to receive the acquired live images from the at least one image acquisition device, provide the acquired images to the VR device for display, receive the determined tracking data, and provide robot control signals to the robot. The processor is configured to send the acquired images to the VR device via the I/O circuitry; process the determined tracking data from the VR device to select a surgical target within the patient; determine a path for the end-effector to reach the selected target based on the acquired images and the determined tracking data; and transmit corresponding robot control signals to the robot.
According to another illustrative embodiment, a non-transitory computer-readable storage medium is provided that stores machine-readable instructions executable by a processor to operate a surgical robot system. The system comprises a robot configured to move at least one end-effector at a surgical site within a patient; at least one image acquisition device configured to capture live images of the surgical site; and a head-mounted display (HMD) device, worn by the user, configured to display the live images and to determine head motion and/or eye movement of the wearer. The non-transitory computer-readable medium comprises transmitting code that causes transmission of the acquired live images from the at least one image acquisition device to the HMD; processing code that processes the determined eye-tracking and/or head-tracking information from the HMD to select a surgical target within the patient; determining code that determines a path to reach the selected surgical target based on the acquired images and the determined eye-tracking and/or head-tracking information; and robot code that causes transmission of corresponding robot control signals to the robot.
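The pipeline described above (transmit live images, process tracking data to select a target, plan a path, emit robot control signals) can be sketched as a small control loop. This is a minimal illustrative sketch, not Philips' implementation: the class and method names, the mapping of normalized gaze coordinates to image pixels, and the straight-line interpolation of waypoints are all assumptions made for the example.

```python
# Hypothetical sketch of the patent's control pipeline. All names and the
# concrete target-selection / path-planning strategies are assumptions.
from dataclasses import dataclass


@dataclass
class TrackingData:
    gaze_x: float  # normalized [0, 1] gaze position on the HMD display
    gaze_y: float


class SurgicalRobotController:
    def __init__(self, image_width: int, image_height: int):
        self.width = image_width
        self.height = image_height

    def select_target(self, tracking: TrackingData) -> tuple:
        """Map normalized gaze coordinates to a pixel in the live image."""
        return (round(tracking.gaze_x * (self.width - 1)),
                round(tracking.gaze_y * (self.height - 1)))

    def plan_path(self, start, target, steps=5):
        """Linearly interpolate waypoints from the end-effector to the target."""
        return [(start[0] + (target[0] - start[0]) * i / steps,
                 start[1] + (target[1] - start[1]) * i / steps)
                for i in range(1, steps + 1)]

    def control_signals(self, start, path):
        """Emit one incremental (dx, dy) motion command per waypoint."""
        signals, prev = [], start
        for waypoint in path:
            signals.append((waypoint[0] - prev[0], waypoint[1] - prev[1]))
            prev = waypoint
        return signals


# Example: a gaze at the center of a 640x480 endoscopic image selects the
# central pixel, and the planner produces four incremental motion commands.
ctrl = SurgicalRobotController(640, 480)
target = ctrl.select_target(TrackingData(gaze_x=0.5, gaze_y=0.5))
path = ctrl.plan_path((0, 0), target, steps=4)
signals = ctrl.control_signals((0, 0), path)
```

In a real system the path would be planned in three dimensions against the patient's anatomy rather than in image coordinates; the sketch only illustrates the data flow between the tracking input and the robot control output.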
The present invention is now described in greater detail with reference to the accompanying drawings, which show embodiments of the invention. The invention can be implemented in many different ways and is not limited to the embodiments shown here; they are offered as examples of the invention, not as limitations.
The VR device receives live images, which can be endoscopic images acquired, for example, by a camera mounted at the distal end of an endoscope (operable with a dedicated controller or by a robot) or by a forward-looking camera. The VR device may be an HMD device that displays an image of the surgical site on a display within a headset worn by the user. The HMD device also includes one or more sensors that detect the motion of the user, for example eye movement or head movement. This motion is then processed to select a target in the surgical site shown on the display. The live images, combined with head- and/or eye-movement detection, improve the usability of the robotic system by simulating the experience of conventional surgery, where the surgeon directs his or her eyes and/or head towards the area of focus.
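One plausible way to turn the detected eye or head motion described above into a target selection is dwell selection: the target is confirmed when the gaze stays within a small radius for a minimum number of consecutive samples. The sketch below is an illustrative assumption; the function name, radius, and frame thresholds are not taken from the patent.

```python
# Hedged sketch of dwell-based target selection from a stream of gaze points
# (pixel coordinates on the HMD display). Thresholds are illustrative.
import math


def dwell_select(gaze_samples, radius=10.0, dwell_frames=5):
    """Return the first gaze point held steady for `dwell_frames` consecutive
    samples (all within `radius` pixels of it), or None if the gaze never
    settles."""
    anchor, count = None, 0
    for point in gaze_samples:
        if anchor is not None and math.dist(anchor, point) <= radius:
            count += 1
        else:
            # Gaze moved away: restart the dwell timer at the new point.
            anchor, count = point, 1
        if count >= dwell_frames:
            return anchor
    return None
```

A steady fixation near one pixel returns that pixel as the selected target, while a wandering gaze returns None, so involuntary sweeps of the head or eyes do not trigger a selection.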
The terminology used in this document is intended only to describe particular embodiments and is not intended to be limiting. Any defined terms are in addition to the technical and scientific meanings of those terms as commonly understood.
As used in the specification and appended claims, the terms "a" and "an" include both singular and plural referents, unless the context clearly dictates otherwise. Thus, for example, "a device" includes both one device and multiple devices.
As used herein, the term "coupled" means that two or more components or parts are joined or operate together, either directly or indirectly through one or more intermediary components, as long as a link exists.
Relative terms may be used to describe the relationships between various elements as illustrated in the accompanying drawings. These relative terms encompass orientations of the device or element in addition to those shown in the drawings.
A "computer-readable storage medium", as used herein, encompasses any tangible storage medium that can store instructions executable by a processor of a computing device. To distinguish it from transitory media, such as signals propagating in the air, a computer-readable storage medium may be called a non-transitory computer-readable medium; it may also be called a tangible computer-readable medium.
In some embodiments, the computer-readable storage medium may also store data that is accessible by the processor. Examples of computer-readable storage media include, but are not limited to: a floppy disk, a magnetic hard disk drive, a solid-state hard disk, flash memory, a USB thumb drive, Random Access Memory (RAM), Read-Only Memory (ROM), an optical disk, a magneto-optical (MO) disk, and the register file of the processor. Examples of optical disks include Compact Disks (CD) and Digital Versatile Disks (DVD), for example CD-ROM, CD-RW, CD-R, DVD-ROM, DVD-RW, or DVD-R disks. The term also refers to various types of recording media capable of being accessed by the computer device via a network or communication link; for example, data may be retrieved over a modem or over the internet. A reference to a computer-readable storage medium should be interpreted as possibly referring to multiple computer-readable storage media: different executable components of a program may be stored in different locations, and the medium may comprise multiple media within the same computer system or be distributed among multiple computer systems or computing devices.
"Memory" is an example of a computer-readable storage medium. Computer memory is any memory that is directly accessible to a processor; examples include RAM, registers, and register files. A reference to "computer memory" or "memory" should be interpreted as possibly referring to multiple memories: the memory may, for example, be multiple memories within the same computer system, or distributed among multiple computers or computing devices.
"Computer storage" is any non-volatile computer-readable storage medium; examples include a hard disk drive, a USB thumb drive, a floppy disk, a smart card, a DVD, a CD-ROM, and a solid-state hard drive. In some embodiments, computer storage may also be computer memory. A reference to "computer storage" or "storage" should be interpreted as possibly referring to multiple storage devices or components: the storage may include multiple storage devices within the same computer system or computing device, or multiple storages spread across multiple computers or computing devices.
A "processor", as used herein, is an electronic component capable of executing a machine-executable program. A reference to a computing device comprising "a processor" should be interpreted as possibly containing more than one processor or processing core; a multi-core processor is one example. A processor may also refer to a collection of processors within a single computer system, or distributed across multiple systems. The term "computing device" should likewise be interpreted as possibly referring to a collection or network of computing devices, each comprising a processor. Many programs have their instructions performed by multiple processors, which may be within the same computing system or distributed across multiple devices.
A "user interface" or "user input device", as used herein, is an interface that allows a user or operator to interact with a computer or computer system. A user interface may provide data or information to the operator and/or receive data or information from the operator: it allows input from the operator to be received by the computer and output to be provided to the operator by the computer. In other words, the user interface allows an operator to control or manipulate a computer, and allows the computer to indicate the effects of that control or manipulation. The display of data or information on a screen or graphical user interface is one example of providing information to an operator. Data may be received through a keyboard, mouse, trackball, touchpad, pointing stick, graphics tablet, joystick, gamepad, webcam, headset, gear stick, steering wheel, wired glove, wireless remote control, accelerometer, or touch screen.
A "hardware interface", as used herein, is an interface that enables the processor of a computer system to interact with and/or control an external computing device or apparatus. A hardware interface may allow the processor to send control signals or instructions to an external computing device or apparatus, and may also allow the processor to exchange data with it. Examples of hardware interfaces include, but are not limited to: a universal serial bus, an IEEE 1394 port, a parallel port, an IEEE 1284 port, a serial port, an RS-232 port, an IEEE-488 port, a Bluetooth connection, a wireless local area network connection, a TCP/IP connection, an Ethernet connection, an analog input interface, and a digital input interface.