Invented by Zehua Huang, Panqu Wang, Pengfei Chen, Tusimple Inc
The market for systems and methods to transition between autonomous driving mode and manual driving mode based on the detection of the driver’s alertness level is rapidly growing. With the increasing popularity of autonomous vehicles, there is a growing need for systems that can detect when a driver is not paying attention and can safely transition the vehicle back to manual mode.
The transition between autonomous and manual driving modes is a critical aspect of autonomous vehicle technology. While autonomous driving technology has come a long way, there are still situations where drivers need to take control of the vehicle. For example, if the vehicle encounters a situation that it cannot handle, such as construction or an accident, the driver needs to take control of the vehicle to navigate through the situation safely.
However, if the driver is not paying attention, the transition from autonomous to manual mode can be dangerous. This is where systems and methods that can detect the driver’s alertness level come into play. These systems use a variety of sensors, including cameras, infrared sensors, and radar, to monitor the driver’s behavior and determine if they are alert and paying attention.
If the system detects that the driver is not paying attention, it can work to regain the driver's attention before safely transitioning the vehicle back to manual mode. This can be done in a variety of ways, including visual and auditory alerts, vibrations in the steering wheel or seat, and, if the driver remains unresponsive, even automatic braking.
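As a rough illustration of this kind of escalation logic, the sketch below shows a hypothetical alert loop in Python. All names here (`driver_is_attentive`, `issue_alert`, `execute_safe_stop`) and the stage timings are invented for illustration and are not drawn from any specific product:

```python
import time

# Hypothetical escalation ladder: each stage is (alert type, seconds to wait
# for the driver to respond before escalating to the next stage).
ESCALATION_STAGES = [
    ("visual", 3.0),    # e.g., dashboard warning message
    ("auditory", 3.0),  # e.g., chime or spoken warning
    ("haptic", 3.0),    # e.g., steering-wheel or seat vibration
]

def attempt_handover(driver_is_attentive, issue_alert, execute_safe_stop):
    """Escalate alerts until the driver responds; otherwise stop safely."""
    for alert_type, wait_seconds in ESCALATION_STAGES:
        issue_alert(alert_type)
        deadline = time.monotonic() + wait_seconds
        while time.monotonic() < deadline:
            if driver_is_attentive():
                return "manual"  # driver is ready; hand over control
            time.sleep(0.1)
    # No response at any stage: keep autonomous control and brake to a stop.
    execute_safe_stop()
    return "autonomous_safe_stop"
```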
The market for these systems and methods is growing rapidly, as more and more companies are investing in autonomous vehicle technology. According to a report by MarketsandMarkets, the global market for autonomous vehicle technology is expected to reach $65.3 billion by 2027, with a compound annual growth rate of 17.7%.
One of the key drivers of this growth is the increasing demand for safety features in autonomous vehicles. As more people begin to use autonomous vehicles, there is a growing need for systems that can ensure the safety of both the driver and other road users.
Another driver of this growth is the increasing adoption of artificial intelligence and machine learning in autonomous vehicle technology. These technologies are essential for developing systems that can accurately detect the driver’s alertness level and safely transition the vehicle back to manual mode.
In conclusion, the market for systems and methods to transition between autonomous driving mode and manual driving mode based on the detection of the driver’s alertness level is growing rapidly. With the increasing adoption of autonomous vehicle technology, there is a growing need for systems that can ensure the safety of both the driver and other road users. As the market continues to grow, we can expect to see more advanced systems that use artificial intelligence and machine learning to improve safety and reliability.
The Tusimple Inc invention works as follows
A system and method to transition between an autonomous driving mode and a manual driving mode, based on the detection of a driver's ability to control a vehicle, are disclosed. One embodiment involves receiving sensor data about a vehicle driver's ability to take manual control of an autonomous vehicle; determining, using the sensor data, whether the driver is capable of performing an action or providing an input; and sending a vehicle control signal to a subsystem to instruct it to take appropriate action based on the driver's capability to control the autonomous vehicle manually.
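Reduced to its essentials, the claimed flow is a three-step pipeline: receive sensor data, assess the driver's capability from it, and send a control signal to a vehicle subsystem. A minimal sketch of that shape might look like the following; the feature names, threshold, and subsystem hook are all hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class DriverSensorData:
    # Hypothetical features derived from in-cabin sensors.
    eyes_on_road: bool          # from an inward-facing camera
    hands_on_wheel: bool        # from steering-wheel sensors
    response_latency_s: float   # time to respond to an interactive prompt

def driver_capable(data: DriverSensorData, max_latency_s: float = 2.0) -> bool:
    """Assumed capability test: attentive gaze, hands on wheel, prompt response."""
    return (data.eyes_on_road
            and data.hands_on_wheel
            and data.response_latency_s <= max_latency_s)

def transition_step(data: DriverSensorData, send_control_signal) -> None:
    """Send the appropriate control signal based on the capability assessment."""
    if driver_capable(data):
        send_control_signal("enable_manual_mode")
    else:
        send_control_signal("remain_autonomous")

# Example usage with a stand-in subsystem hook:
transition_step(
    DriverSensorData(eyes_on_road=True, hands_on_wheel=True,
                     response_latency_s=0.8),
    send_control_signal=print,
)
```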
Background for System and method to transition between an autonomous driving mode and manual driving mode, based on the detection of the driver’s ability to control a vehicle.
Autonomous vehicles use a variety of computing systems to assist in autonomous vehicle control. Some autonomous vehicles require input from an operator, such as a pilot or driver, while other systems, such as autopilot systems, may be used only when the system has been engaged. This allows the operator to change from a manual mode, in which the operator exercises a high degree of control over vehicle movement, to an autonomous mode, in which the vehicle essentially drives itself, or to modes that lie somewhere in between. Conventional systems are unable to provide a safe and smooth transition between autonomous driving modes and driver-controlled driving modes. In particular, traditional systems cannot automatically initiate a transition of control without the driver forcing control away from the autonomous vehicle control software.
A method and system for transitioning between an autonomous driving mode and a manual driving mode, based on the detection of the driver's ability to control the vehicle, are disclosed in this document. The method and system provide a controlled transition between driving modes within an autonomous vehicle by using detection data that corresponds to the driver's capacity or fitness to control the vehicle manually. The system can include sensors, cameras, interactive devices, and computing devices. Cameras can be mounted inside or on the autonomous vehicle to monitor the driver's facial features and activities and to collect facial feature data. The system can also include steering wheel sensors that capture steering patterns in real time, and localization sensors that monitor the vehicle's movement and location. The system may prompt the driver/user to take certain actions, with instructions provided via an interactive device such as a sound device, a dashboard display screen, or a voice command. The computing device analyzes the driver's responses to the interactive prompt instructions and uses that information to categorize the driver's current activity into predefined states or classes, which can include talking on the phone, sleeping, and other activities. An example embodiment uses facial analysis to determine the state, taking into account the driver's responses and actions. The user/driver's ability to control the vehicle can then be assessed and determined. Based on this assessment, the driver/user may be permitted to take control of the vehicle, or other measures may be taken to protect the safety of both the driver/user and the vehicle.
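To make the classification step concrete, here is a deliberately simplified sketch of how facial-feature data and prompt responses might be mapped onto predefined driver states. The states mirror the examples above (talking on the phone, sleeping), but the feature names, thresholds, and rules are illustrative assumptions, not the patent's actual classifier, which could instead be, for example, a trained neural network:

```python
from enum import Enum

class DriverState(Enum):
    ATTENTIVE = "attentive"
    SLEEPING = "sleeping"
    TALKING_ON_PHONE = "talking_on_phone"
    DISTRACTED = "distracted"

def classify_driver_state(eyes_closed_ratio: float,
                          phone_near_ear: bool,
                          responded_to_prompt: bool) -> DriverState:
    """Toy rule-based stand-in for the state classification step."""
    if eyes_closed_ratio > 0.8 and not responded_to_prompt:
        return DriverState.SLEEPING
    if phone_near_ear:
        return DriverState.TALKING_ON_PHONE
    if not responded_to_prompt:
        return DriverState.DISTRACTED
    return DriverState.ATTENTIVE

# States assumed compatible with taking manual control:
CAPABLE_STATES = {DriverState.ATTENTIVE}

state = classify_driver_state(eyes_closed_ratio=0.1,
                              phone_near_ear=False,
                              responded_to_prompt=True)
print(state, state in CAPABLE_STATES)
```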
The following description provides a detailed explanation of various embodiments. However, it will be evident to those of ordinary skill in the art that many embodiments can be practiced without these details.
As described in various examples, a system is disclosed that allows for the transition between autonomous and manual driving modes based on detecting a driver's ability to control a vehicle. One example embodiment can be used within the context of an in-vehicle control system 150 in a vehicle ecosystem 101. In one example embodiment, an in-vehicle control system 150 can be combined with a driving control transition module 200 located in a vehicle, as shown in FIG. 1. However, those of ordinary skill in the art will recognize that the driving control transition module 200 described and claimed herein can be implemented, configured, and used in a variety of other applications and systems.
Referring now to FIG. 1, a block diagram shows an example ecosystem 101 in which an in-vehicle control system 150 and a driving control transition module 200 of an example embodiment can be used. These components are described in detail below. The ecosystem 101 is a collection of components and systems that can generate or deliver information/data to the in-vehicle control system 150 and the driving control transition module 200. These components can be mounted in the vehicle 105. A camera mounted in the vehicle 105 can, for example, generate image and timing data that can be received by the in-vehicle control system 150. One of the cameras can be placed in an inward-facing position to view the head and face of the vehicle driver and capture images of the driver's facial features over time. This image and timing data can be input to the in-vehicle control system 150 and the image processing module that executes therein. The image processing module can collect sensor data from the vehicle subsystems 140; this data, along with the image and timing data, can be used to monitor the driver and identify driver features. The driving control transition module 200, described in greater detail below, processes the sensor data and driver facial data to determine the driver's ability or fitness to control the vehicle. An autonomous vehicle control subsystem, which is also part of the vehicle subsystems 140, can use this determination. For example, the autonomous vehicle control subsystem can determine whether the driver is able to safely and effectively perform vehicle control operations, navigate the vehicle 105 through real-world driving conditions, and avoid obstacles while safely controlling the vehicle.
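The paragraph above describes a pipeline: timestamped camera frames and vehicle sensor readings flow into the image processing and transition modules, whose output feeds the autonomous control subsystem. A minimal sketch of that wiring, with hypothetical module interfaces not taken from the patent, might be:

```python
from typing import Any, Callable, Dict

def make_pipeline(extract_facial_features: Callable[[Any], Dict],
                  assess_capability: Callable[[Dict, Dict], bool],
                  control_subsystem: Callable[[bool], None]) -> Callable:
    """Chain image processing, the transition module, and the control subsystem."""
    def on_frame(image, timestamp: float, vehicle_sensor_data: Dict) -> None:
        # Image processing module: derive driver facial-feature data per frame.
        facial_features = extract_facial_features(image)
        facial_features["timestamp"] = timestamp
        # Driving control transition module: combine facial and vehicle data.
        capable = assess_capability(facial_features, vehicle_sensor_data)
        # Autonomous vehicle control subsystem acts on the assessment.
        control_subsystem(capable)
    return on_frame

# Example usage with stand-in callables:
pipeline = make_pipeline(
    extract_facial_features=lambda img: {"eyes_open": True},
    assess_capability=lambda face, veh: face["eyes_open"]
                                        and veh.get("speed_kph", 0) < 120,
    control_subsystem=lambda ok: print("manual ok" if ok else "stay autonomous"),
)
pipeline(image=None, timestamp=0.0, vehicle_sensor_data={"speed_kph": 80})
```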
In an example embodiment, the in-vehicle control system 150 can be in communication with a plurality of vehicle subsystems 140, all of which can be located in a user's vehicle 105. A vehicle subsystem interface 141 is provided to facilitate data communication between the vehicle subsystems 140 and the in-vehicle control system 150. The in-vehicle control system 150 can include a data processor 171 to process sensor data and driver facial data received from any of the subsystems 140. A data storage device 172 can be used in conjunction with the data processor 171 as part of a computing system 170 within the in-vehicle control system 150. The data storage device 172 is used to store processing parameters and instructions. A processing module interface 165 is provided to facilitate data communication between the data processor 171 and the driving control transition module 200. In various example embodiments, a plurality of processing modules can be provided for execution by the data processor 171. As illustrated by the dashed lines in FIG. 1, the driving control transition module 200 can be either integrated into the in-vehicle control system 150 or optionally downloaded to it.
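As a sketch of how a processing module interface like element 165 might let the data processor load and run pluggable modules such as the driving control transition module, consider the following hypothetical structure; the class names and `process` signature are invented for illustration:

```python
class ProcessingModuleInterface:
    """Hypothetical stand-in for interface 165: registers pluggable modules."""
    def __init__(self):
        self._modules = {}

    def register(self, name, module):
        # Modules can be built in, or downloaded and installed at runtime.
        self._modules[name] = module

    def run(self, name, **inputs):
        return self._modules[name].process(**inputs)

class DrivingControlTransitionModule:
    """Hypothetical stand-in for module 200."""
    def process(self, sensor_data, facial_data):
        # Placeholder capability assessment for demonstration only.
        return {"driver_capable": bool(sensor_data) and bool(facial_data)}

iface = ProcessingModuleInterface()
iface.register("driving_control_transition", DrivingControlTransitionModule())
print(iface.run("driving_control_transition",
                sensor_data={"speed": 65}, facial_data={"eyes_open": True}))
```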
The in-vehicle control system 150 can be configured to transmit data to or receive data from a wide area network 120 and the network resources 122 connected thereto. Communication with the network 120 can take place via an in-vehicle web-enabled device 130 or a user mobile device 132. The in-vehicle control system 150 can use a web-enabled device interface 131 to facilitate data communication between the in-vehicle control system 150 and the network 120 via the in-vehicle web-enabled devices 130. Similarly, the in-vehicle control system 150 can use a user mobile device interface 133 to facilitate data communication between the in-vehicle control system 150 and the network 120 via the user mobile devices 132. In this manner, the in-vehicle control system 150 has real-time access to the network resources 122 via the network 120. The network resources 122 can provide processing modules executable by the data processor 171, data content used to train neural networks, system parameters, or other data.
The ecosystem 101 may also include the wide area data network 120. The network 120 can be one or more conventional wide-area data networks, such as the Internet, a cellular telephone network, a satellite network, a pager network, a wireless broadcast network, a gaming network, a WiFi network, a peer-to-peer network, a Voice over IP (VoIP) network, and the like. One or more of these networks 120 can be used to connect a client or user system to network resources 122, such as servers, central control sites, websites, and the like. The network resources 122 can generate and/or distribute data, which can be received in the vehicle 105 via the in-vehicle web-enabled devices 130 or the user mobile devices 132. Network cloud services can also be hosted on the network resources 122, which allow object input and analysis to be computed or assisted in processing. Antennas can connect the in-vehicle control system 150 and the driving control transition module 200 to the data network 120 via cellular, satellite, or radio links. Such cellular data networks are currently available (e.g., Verizon, AT&T, T-Mobile, etc.). Such satellite-based data or content networks are also currently available (e.g., SiriusXM, HughesNet, etc.). Conventional broadcast networks, such as UHF networks, pager networks, gaming networks, WiFi networks, peer-to-peer networks, Voice over IP (VoIP) networks, and the like, are also well known. As described below, the in-vehicle control system 150 can receive web-based content via the web-enabled device interface 131, which can be used to connect with the network 120 through the in-vehicle web-enabled devices 130. In this manner, the in-vehicle control system 150 and the driving control transition module 200 can support a variety of network-connectable in-vehicle devices from within the vehicle 105.
As shown in FIG. 1, the in-vehicle control system 150 and the driving control transition module 200 can receive input data, processing parameters, and training content from the user mobile devices 132, which can be located inside or near the vehicle 105. The user mobile devices 132 can represent standard mobile devices such as smartphones, cellular phones, MP3 players, tablets, and personal digital assistants (PDAs), as well as laptop computers, CD players, and other mobile devices that can create, receive, and/or deliver data, processing parameters, and content for the in-vehicle control system 150 and the driving control transition module 200. As shown in FIG. 1, the mobile devices 132 are also capable of data communication with the network 120, and can access data and content either from their internal memory or from the network resources 122 via the network 120. The mobile devices 132 may additionally include GPS data receivers, accelerometers, and WiFi triangulation components, which can be used to determine the real-time location of a user (via the mobile device). As shown in FIG. 1, the in-vehicle control system 150 and the driving control transition module 200 can both receive data from the mobile devices 132.
Still referring to FIG. 1, the example embodiment of the ecosystem 101 can include the vehicle operational subsystems 140. Many standard vehicles include operational subsystems, such as electronic control units (ECUs), that support monitoring/control subsystems for the engine, brakes, and transmission. Data signals sent from the vehicle operational subsystems 140 (e.g., the ECUs of the vehicle 105) to the in-vehicle control system 150 via the vehicle subsystem interface 141 can include information about the state of one or more components or subsystems of the vehicle 105. In particular, the vehicle subsystem interface 141 can receive and process data signals exchanged with the vehicle operational subsystems 140 over a Controller Area Network (CAN) bus of the vehicle 105. The systems and methods described herein can be applied to virtually any mechanized system that uses a CAN bus, or a similar data communications bus as defined herein, including, but not limited to, industrial equipment, boats, trucks, machinery, or automobiles; as such, the term 'vehicle' as used herein can include any such mechanized system. The embodiments of the systems and methods described herein can also be used with any system that employs some form of network data communication; however, such network communication is not required.
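For readers unfamiliar with CAN-style data exchange, the snippet below shows one common way to read frames from a vehicle CAN bus in Python using the third-party python-can library on a Linux SocketCAN interface. The channel name, arbitration ID, and payload decoding are assumptions, and the patent does not prescribe any particular library or message layout:

```python
import can  # third-party: pip install python-can

# Assumed SocketCAN channel name; real vehicles expose OEM-specific buses.
bus = can.interface.Bus(channel="can0", bustype="socketcan")

STEERING_ANGLE_ID = 0x25  # hypothetical arbitration ID for steering data

try:
    while True:
        msg = bus.recv(timeout=1.0)  # blocks up to 1 s for the next frame
        if msg is not None and msg.arbitration_id == STEERING_ANGLE_ID:
            # Payload layout is vehicle-specific; this decoding is illustrative.
            raw = int.from_bytes(msg.data[:2], byteorder="big", signed=True)
            print(f"steering angle raw value: {raw}")
finally:
    bus.shutdown()
```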
Still referring to FIG. 1, the example embodiment of the ecosystem 101 can include a vehicle 105 having the vehicle operational subsystems 140. The vehicle 105 can be a car, truck, motorcycle, bus, boat, plane, or other vehicle. The vehicle 105 can be configured to operate in an autonomous mode, in which it determines the current state of the vehicle and its environment, predicts the behavior of at least one other vehicle in the environment, estimates a confidence level corresponding to the likelihood that the other vehicle will perform the predicted behavior, and controls the vehicle 105 based on this information. The vehicle 105 can also be configured to operate in the autonomous mode without human intervention or control.
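The autonomous-mode description above amounts to: predict another vehicle's behavior, attach a confidence level, and act on the prediction only when that confidence is high enough. A toy sketch of such a decision rule follows; the behaviors, actions, and threshold are invented for illustration:

```python
def plan_action(predicted_behavior: str, confidence: float,
                threshold: float = 0.7) -> str:
    """Act on a behavior prediction only when it is sufficiently confident."""
    if confidence >= threshold:
        # High confidence: adapt proactively to the predicted behavior.
        return {"lane_change_left": "yield_space",
                "hard_brake": "increase_following_distance"}.get(
                    predicted_behavior, "maintain_course")
    # Low confidence: fall back to a conservative default.
    return "increase_following_distance"

print(plan_action("lane_change_left", confidence=0.85))  # yield_space
print(plan_action("lane_change_left", confidence=0.40))  # increase_following_distance
```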
The vehicle 105 may include various subsystems, such as a vehicle drive subsystem 142, a vehicle sensor subsystem 144, a vehicle control system 146, and an occupant interface subsystem 148. The vehicle 105 may also include the in-vehicle control system 150, the computing system 170, and the driving control transition module 200. Each subsystem of the vehicle 105 may include multiple elements, and the elements and subsystems of the vehicle 105 can be interconnected. One or more of the described functions of the vehicle 105 could be divided into additional functional or physical components or combined into fewer functional or physical components. In some further examples, additional functional and physical components may be added to the example illustrated in FIG. 1.
The vehicle drive subsystem 142 may include components that provide powered motion for the vehicle 105. In an example embodiment, the vehicle drive subsystem 142 may include an engine or motor, wheels/tires, a transmission, an electrical subsystem, and a power source. The engine or motor can be any combination of an internal combustion engine, an electric motor, a steam engine, a fuel cell engine, a propane engine, or other types of engines or motors. In some examples, the engine can be designed to convert a power source into mechanical energy. In some examples, the vehicle drive subsystem 142 may include multiple types of engines or motors; a gas-electric hybrid vehicle, for instance, could include both a gasoline engine and an electric motor. Many other examples are possible.
The wheels of the vehicle 105 may be standard tires. The wheels of the vehicle 105 may be configured in various formats, including a unicycle, tricycle, or four-wheel format such as on a car or truck; six or more wheels are also possible. Any combination of the wheels of the vehicle 105 may be operable to rotate differentially with respect to the other wheels. At least one of the wheels may be fixedly attached to the transmission, and at least one tire could be coupled to a rim of a wheel that could make contact with the driving surface. The wheels may be made of a combination of metal and rubber, or another combination of materials. The transmission may include elements that are operable to transmit mechanical power from the engine to the wheels. The transmission could include a gearbox, a clutch, drive shafts, and other elements; the drive shafts could include one or more axles that could be coupled to the wheels. The electrical system may include elements that are operable to transfer and control electrical signals in the vehicle 105. These electrical signals can be used to activate lights, servos, electrical motors, and other electrically driven or controlled devices of the vehicle 105. The power source may be a source of energy that powers the engine or motor, in whole or in part; that is, the engine or motor can be configured to convert the power source into mechanical energy. Examples of power sources include gasoline, diesel, other petroleum-based fuels, propane, other compressed-gas-based fuels, ethanol, fuel cells, solar panels, batteries, and other sources of electrical power. The power source could additionally or alternatively include any combination of fuel tanks, batteries, capacitors, or flywheels. The power source may also provide energy for other subsystems of the vehicle 105.
The vehicle sensor subsystem 144 may include a number of sensors configured to sense information about the environment or condition of the vehicle 105. For example, the vehicle sensor subsystem 144 could include an inertial measurement unit (IMU), a Global Positioning System (GPS) transceiver, a RADAR unit, a laser range finder/LIDAR unit, and one or more cameras. The vehicle sensor subsystem 144 may also include sensors configured to monitor internal systems of the vehicle 105, such as an O2 monitor, a fuel gauge, and an engine oil temperature sensor. Other sensors are possible as well. One or more of the sensors included in the vehicle sensor subsystem 144 may be configured to be actuated separately or collectively in order to modify a position, an orientation, or both. The vehicle sensor subsystem 144 may also include an input device through which a user can provide an input signal, such as by activating a button, speaking a phrase, or moving a lever or pedal.
The IMU may include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense position and orientation changes of the vehicle 105 based on inertial acceleration. The GPS transceiver may be any sensor configured to estimate the geographic location of the vehicle 105; to this end, the GPS transceiver may include a receiver/transmitter operable to provide information regarding the position of the vehicle 105 with respect to the Earth. The RADAR unit may represent a system that utilizes radio signals to sense objects within the local environment of the vehicle 105. In some embodiments, in addition to sensing the objects, the RADAR unit may additionally be configured to sense the speed and heading of objects proximate to the vehicle 105. The laser range finder/LIDAR unit may be any sensor configured to sense objects in the environment of the vehicle 105 using lasers. In an example embodiment, the laser range finder/LIDAR unit may include one or more laser sources, one or more laser scanners, and one or more detectors, among other system components. The laser range finder/LIDAR unit could be configured to operate in a coherent (e.g., using heterodyne detection) or an incoherent detection mode. The cameras may include one or more devices configured to capture a plurality of images of the environment of the vehicle 105; the cameras may be still image cameras or motion video cameras.
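A simple way to picture how readings from the sensors just listed might be gathered into one timestamped snapshot for downstream modules is sketched below; the field names, units, and structure are assumptions for illustration, not the patent's data model:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class SensorSnapshot:
    """Hypothetical timestamped bundle of vehicle sensor subsystem outputs."""
    timestamp: float                                          # seconds, monotonic
    imu_accel_mps2: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    imu_gyro_radps: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    gps_lat_lon: Optional[Tuple[float, float]] = None
    radar_tracks: List[dict] = field(default_factory=list)    # range/speed/heading
    lidar_points: List[Tuple[float, float, float]] = field(default_factory=list)
    camera_frame_id: Optional[int] = None

snapshot = SensorSnapshot(timestamp=12.5, gps_lat_lon=(37.77, -122.42))
print(snapshot.gps_lat_lon)
```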
The vehicle control system 146 may be configured to control the operation of the vehicle 105 and its components. Accordingly, the vehicle control system 146 may include various elements such as a steering unit, a throttle, a brake unit, a navigation unit, and an autonomous control unit.
The steering unit may represent any combination of mechanisms that are operable to adjust the heading of the vehicle 105. The steering unit may include sensors to detect the position and movement of the steering wheel, or the pressure applied to it. The throttle may be configured to control the operating speed of the engine and, in turn, control the speed of the vehicle 105. The brake unit can include any combination of mechanisms configured to decelerate the vehicle 105. The brake unit can use friction to slow the wheels in a standard manner, or it can convert the kinetic energy of the wheels into electric current. The brake unit may take other forms as well. The navigation unit may be any system configured to determine a driving path or route for the vehicle 105. The navigation unit may additionally be configured to update the driving path dynamically while the vehicle 105 is in operation. In some embodiments, the navigation unit may be configured to incorporate data from the driving control transition module 200 and the GPS transceiver to determine the driving path for the vehicle 105. The autonomous control unit may represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the vehicle 105. In general, the autonomous control unit may be configured to control the vehicle 105 for operation without a driver. In some embodiments, the autonomous control unit may be configured to incorporate data from the driving control transition module 200, the GPS transceiver, and the LIDAR unit to determine the driving path or trajectory for the vehicle 105. The vehicle control system 146 may additionally or alternatively include components other than those shown and described.
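To tie these control elements together, here is a compact sketch of how an autonomous control unit might choose between handing control to a capable driver and continuing to actuate the vehicle itself; all interfaces (`steer`, `throttle`, `brake`, `plan_trajectory`) are invented stand-ins, not the patent's design:

```python
def control_tick(driver_capable: bool, handover_requested: bool,
                 steer, throttle, brake, plan_trajectory) -> str:
    """One hypothetical control-loop step for the autonomous control unit."""
    if handover_requested and driver_capable:
        # Release the actuators so the driver's inputs take effect.
        return "manual"
    # Otherwise stay autonomous: follow the planned trajectory.
    steering_angle, accel = plan_trajectory()
    steer(steering_angle)
    if accel >= 0:
        throttle(accel)
    else:
        brake(-accel)  # negative acceleration maps to braking
    return "autonomous"
```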
The occupant interface subsystems 148 may be configured to allow interaction between the vehicle 105 and external sensors, other vehicles, other computer systems, and/or an occupant or user of the vehicle 105. For example, the occupant interface subsystems 148 may include standard visual display devices (e.g., plasma displays, liquid crystal displays (LCDs), touchscreen displays, heads-up displays, and the like), speakers or other audio output devices, microphones or other audio input devices, navigation interfaces, and interfaces for controlling the internal environment (e.g., temperature, fan, etc.) of the interior of the vehicle 105.
In one example embodiment, the occupant interface subsystems 148 may provide means for a user/occupant of the vehicle 105 to interact with the other vehicle subsystems. A visual display device may provide information to a user of the vehicle 105, and a touchscreen may be available to receive input from the user or driver. The touchscreen may be configured to sense at least one of the position and movement of a user's finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities. The touchscreen may be capable of sensing finger movement in a direction parallel or planar to the touchscreen surface, in a direction normal to the touchscreen surface, or both, and may also be capable of sensing the level of pressure applied to the touchscreen surface. The touchscreen may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. The touchscreen may take other forms as well.