Invention for System to recommend sensor views for quick situational understanding

Invented by Siddharth Thakur, Armelle GUERIN, Julius S. Gyorfi, Kevin Poulet, Aude Laurent, Mark Allan, Omar Bentahar, Renault SAS, Nissan Motor Co Ltd

The market for systems to recommend sensor views for quick situational understanding is rapidly growing as organizations across various industries seek to enhance their decision-making processes. In today’s fast-paced world, having access to real-time information and the ability to interpret it quickly is crucial for effective decision-making and response to rapidly changing situations. This is where sensor views and recommendation systems come into play.

Sensor views refer to the data collected by various sensors, such as cameras, radars, and other monitoring devices. These sensors capture information about the environment, objects, and events, providing valuable insights into the current situation. However, the sheer volume of data generated by these sensors can be overwhelming, making it challenging for humans to process and make sense of it in a timely manner.

To address this challenge, systems that recommend sensor views have emerged. These systems leverage advanced technologies such as artificial intelligence (AI), machine learning (ML), and computer vision to analyze the sensor data and provide recommendations on the most relevant views for quick situational understanding. By automating the process of data analysis and interpretation, these systems enable decision-makers to focus on critical information and make informed decisions faster.

The market for these systems is driven by several factors. Firstly, the increasing adoption of IoT (Internet of Things) devices and sensors in various industries has led to a massive influx of data. Organizations are now looking for ways to extract meaningful insights from this data to gain a competitive edge. Sensor view recommendation systems offer a solution by filtering and presenting the most relevant information, saving time and improving decision-making.

Secondly, the growing complexity of modern environments, such as smart cities, industrial facilities, and transportation networks, requires advanced systems to handle the vast amount of data generated by multiple sensors. These systems can integrate data from different sources and provide a holistic view of the situation, enabling organizations to respond effectively to emergencies, security threats, or operational challenges.

Furthermore, the advancements in AI and ML technologies have significantly improved the accuracy and efficiency of sensor view recommendation systems. These systems can learn from historical data, adapt to changing environments, and continuously improve their recommendations over time. As a result, organizations can rely on these systems to provide accurate and up-to-date information for situational understanding.

The market for systems to recommend sensor views for quick situational understanding is not limited to any specific industry. It has applications in various sectors, including defense and security, transportation and logistics, healthcare, manufacturing, and smart cities. For example, in defense and security, these systems can help identify potential threats and provide real-time situational awareness to military personnel. In healthcare, they can assist in monitoring patient conditions and alerting healthcare providers to critical situations.

In conclusion, the market for systems to recommend sensor views for quick situational understanding is experiencing significant growth due to the increasing need for real-time data analysis and decision-making. These systems leverage advanced technologies to filter and present the most relevant sensor views, enabling organizations to make informed decisions faster and respond effectively to changing situations. As industries continue to embrace IoT and sensor technologies, the demand for these systems is expected to rise, driving further innovation and development in this market.

The Renault SAS and Nissan Motor Co Ltd invention works as follows

A method for exception handling for an autonomous vehicle (AV) includes identifying an exception situation; identifying sensors relevant to the exception situation; identifying tools relevant to the exception situation, including tools that a teleoperator can use to resolve it; and presenting data from the relevant sensors and tools on a display for the teleoperator.
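As a rough illustration, the claimed flow can be sketched in Python. Everything here — the ExceptionSituation type, the lookup tables, and handle_exception — is a hypothetical rendering of the method's steps, not detail taken from the patent.

```python
from dataclasses import dataclass

# Hypothetical illustration of the claimed method; none of these names
# come from the patent itself.

@dataclass
class ExceptionSituation:
    kind: str        # e.g., "road_obstruction", "unmapped_pickup"
    location: tuple  # (latitude, longitude)

# Assumed lookup tables mapping an exception kind to the sensors and
# teleoperator tools presumed relevant to it.
RELEVANT_SENSORS = {
    "road_obstruction": ["front_camera", "lidar", "front_radar"],
    "unmapped_pickup": ["surround_cameras", "gps"],
}
RELEVANT_TOOLS = {
    "road_obstruction": ["trajectory_editor", "map_overlay"],
    "unmapped_pickup": ["map_editor", "two_way_audio"],
}

def handle_exception(situation: ExceptionSituation) -> dict:
    """Assemble the payload shown on the teleoperator's display."""
    sensors = RELEVANT_SENSORS.get(situation.kind, ["front_camera"])
    tools = RELEVANT_TOOLS.get(situation.kind, ["two_way_audio"])
    # A real system would render live feeds and tool widgets on the
    # teleoperator's console; here we simply return the selection.
    return {"situation": situation, "sensors": sensors, "tools": tools}

print(handle_exception(ExceptionSituation("road_obstruction", (48.85, 2.35))))
```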

Background for System to recommend sensor views for quick situational understanding

Autonomous vehicles (AVs) offer drivers the convenience of efficient transportation from one place to another without having to pay attention to the road. A self-driving (e.g., computer-controlled) vehicle is one capable of obeying traffic laws and standards while driving on roads. However, even the best autonomous vehicle programming cannot account for and control all the situations and conditions that may arise while the vehicle is operating. There are times when an autonomous vehicle encounters conditions or situations where the assistance of a human operator (e.g., a teleoperator) is needed.

Disclosed herein are aspects, features, elements, and implementations of solutions for autonomous vehicles.

The disclosed implementations include a method for exception handling for an autonomous vehicle. The method comprises identifying an exception situation; identifying sensors relevant to the situation; identifying tools, including tools a teleoperator can use to resolve the situation; and presenting data from those sensors and tools on a display for the teleoperator.

Another aspect of the disclosed implementations is a system for exception handling for an autonomous vehicle. The system comprises a memory and a processor. The processor is configured to execute instructions stored in the memory to identify an exception situation; identify sensors relevant to the exception situation; identify tools relevant to the exception situation, including tools that a teleoperator can use to resolve it; and display data from the relevant sensors and tools on the teleoperator's screen.

Yet another aspect of the disclosed implementations is a system for exception handling in autonomous driving. The system comprises a memory and a processor. The processor is configured to execute instructions stored in the memory to receive, from an autonomous vehicle, an assist request to resolve an exception situation; identify an onboard sensor of the autonomous vehicle; identify an offboard sensor; select either the onboard sensor or the offboard sensor; present data from the selected sensor to a teleoperator; receive a validated solution from the teleoperator; and transmit the validated solution to the AV.
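This onboard/offboard flow could be sketched as below. The Sensor and AssistRequest types, and the rule of preferring an offboard sensor that covers the exception location, are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical sketch of the assist-request flow; the types and the
# coverage-based selection rule are illustrative assumptions.

@dataclass
class Sensor:
    name: str
    onboard: bool
    covers: Callable[[tuple], bool]  # does this sensor see the location?

@dataclass
class AssistRequest:
    vehicle_id: str
    location: tuple  # (lat, lon) of the exception situation

def resolve_assist_request(req: AssistRequest, sensors: List[Sensor],
                           ask_operator: Callable[[Sensor], str]) -> str:
    # Prefer an offboard sensor (e.g., an infrastructure camera) that
    # covers the exception location; otherwise fall back to an onboard one.
    offboard = [s for s in sensors if not s.onboard and s.covers(req.location)]
    onboard = [s for s in sensors if s.onboard]
    selected = (offboard or onboard)[0]
    # Present the selected sensor to the teleoperator, receive a validated
    # solution, and transmit it back to the requesting AV.
    solution = ask_operator(selected)
    print(f"transmit to {req.vehicle_id}: {solution}")
    return solution

sensors = [
    Sensor("front_camera", True, lambda loc: True),
    Sensor("intersection_cam_12", False, lambda loc: loc == (48.85, 2.35)),
]
resolve_assist_request(AssistRequest("av-7", (48.85, 2.35)), sensors,
                       ask_operator=lambda s: f"proceed using {s.name} view")
```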

These and other aspects of this disclosure are set out in the following description, the appended claims, and the accompanying figures.

Autonomous vehicles drive on roads shared with other road users, whose behavior is usually predictable. On a freeway, for example, all vehicles typically travel in the same direction at high speed, and there are usually no pedestrians. In other road settings, such as dense urban areas, driveways, and parking lots, the behavior of other road users may be unpredictable.

An autonomously driven vehicle may also be unable (or not confident enough) to deal with some situations. These situations are referred to herein as exceptions or exception situations. For example, the AV may not be able to fully assess, classify, and/or understand the intentions of other road users, including pedestrians (e.g., construction workers), pets, police officers, other vehicles, cyclists, and stationary objects. To simplify explanations and references, the AV's occupants, such as passengers, are also referred to herein as "other road users". The AV can generate a solution to deal with the exception and then execute (e.g., carry out) that solution.

Examples of exception situations include a passenger engaging in prohibited behavior; an item left in the vehicle by a previous passenger; a pick-up or drop-off location that is not mapped; a pick-up or drop-off maneuver that cannot be completed or performed; an obstruction on the road; an activity associated with a service offered by the AV that has not been completed; and so on.

The AV may be providing a particular service, such as a delivery or taxi service. An example of an unfinished activity related to such a service is a customer failing to close a compartment (e.g., a trunk or a door) after retrieving an item.

In some cases, an AV may request help from a human (e.g., a teleoperator or mobility manager) to resolve an exception. In other cases, an autonomous system can assist the AV remotely. The autonomous system may be, for example, a cloud-based system accessible to the AV. The autonomous system may determine a solution for the exception and send that solution to the AV. If the autonomous system determines that its confidence in the solution is low, it forwards the solution to a teleoperator before sending it to the AV. The teleoperator may confirm the solution, modify it, or create a new one for the exception situation.
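A minimal sketch of this confidence-gated escalation, assuming a numeric confidence score and an arbitrary 0.9 threshold (neither value nor the function names come from the patent):

```python
# Illustrative confidence-gated dispatch; the threshold is an assumption.
CONFIDENCE_THRESHOLD = 0.9  # below this, a human reviews the solution

def dispatch_solution(automatic_solution, confidence, send_to_av,
                      review_with_teleoperator):
    if confidence >= CONFIDENCE_THRESHOLD:
        # High confidence: send the automatic solution directly to the AV.
        send_to_av(automatic_solution)
    else:
        # Low confidence: a teleoperator confirms, modifies, or replaces
        # the solution before it reaches the AV.
        send_to_av(review_with_teleoperator(automatic_solution))

dispatch_solution("creep forward 2 m", 0.62,
                  send_to_av=lambda s: print("AV executes:", s),
                  review_with_teleoperator=lambda s: s + " (operator-approved)")
```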

The autonomous system may be more capable than the AV and can resolve situations that the AV cannot (or cannot resolve with enough confidence). For example, the autonomous system may have more programming than the AV and/or access to additional road data (such as information from vehicles in the vicinity) that is not available to the AV, and it can use this additional data to resolve exception situations. The additional programming may, for instance, allow the autonomous system to recognize more features, or feature values, within the scene of an exception situation than the AV can.

When the autonomous system encounters a situation it cannot handle, or cannot handle confidently, it can request (e.g., ask for) the assistance of a human teleoperator to handle the exception situation. The teleoperator is better able to judge (e.g., assess, evaluate) the situation and provide a solution. The autonomous system then provides the solution to the AV, and the AV reacts according to the solution.

For convenience, a system-generated solution is called an automatic solution, and a solution that involves a teleoperator is called a validated (or verified) solution. If the teleoperator reviews an automatic solution without changing it, the validated solution is the same as the automatic solution. The teleoperator can also modify the automatic solution or generate a new one; both the modified solution and the new solution are likewise referred to as "validated solutions".
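One possible way to model this terminology in code, purely as an illustration (the Solution and Provenance types are hypothetical):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Provenance(Enum):
    AUTOMATIC = "automatic"  # generated by the autonomous system
    VALIDATED = "validated"  # confirmed, modified, or created by a teleoperator

@dataclass
class Solution:
    action: str
    provenance: Provenance

def validate(automatic: Solution, operator_edit: Optional[str]) -> Solution:
    # An unchanged solution, an edited one, and a brand-new one are all
    # "validated" once a teleoperator has been involved.
    action = operator_edit if operator_edit else automatic.action
    return Solution(action, Provenance.VALIDATED)

auto = Solution("wait for obstruction to clear", Provenance.AUTOMATIC)
print(validate(auto, None))                     # unchanged -> validated
print(validate(auto, "route around obstacle"))  # modified -> validated
```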

As mentioned above, the human judgement of the teleoperator can be used online (i.e., in real time) to solve the exception situations the AV faces. Human judgement can also be used offline, to review past exceptions the AV has faced and to label, annotate, or otherwise describe them in a way the autonomous system understands. The autonomous system can then learn how to solve identical or similar exceptions autonomously in the future.
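A hedged sketch of this offline labeling loop; the record fields and the annotate_exception helper are hypothetical:

```python
import json

# Past exceptions are labeled by humans and stored so the autonomous
# system can later learn from them; the schema below is an assumption.

def annotate_exception(record: dict, label: str, notes: str) -> dict:
    record.update({"label": label, "operator_notes": notes})
    return record

history = [{"id": 101, "kind": "road_obstruction",
            "resolution": "route around obstacle"}]
labeled = [annotate_exception(r, label="double_parked_vehicle",
                              notes="safe to cross center line at low speed")
           for r in history]
# The labeled corpus would feed a training pipeline so that similar
# exceptions can later be resolved autonomously.
print(json.dumps(labeled, indent=2))
```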

As previously mentioned, the AV may be unable to assess the road situation, for example when determining whether other road users might be present around a corner blocked by a building or on the other side of a hill. In some situations, such as obstruction situations, the AV may be required to depart from normal (e.g., legal, socially accepted) driving rules in a way that would not be acceptable without human supervision. Herein, road situations that an AV cannot handle are called exception situations. An exception situation is one in which the AV must suspend certain driving rules to make progress (such as toward the vehicle's final destination).

When the AV encounters an exception situation, it can request help from a teleoperator. For instance, when the AV comes across an obstruction (e.g., a construction site or a stopped car), it can stop and request teleoperator assistance, since the AV may not drive around the obstruction if doing so requires traveling through a restricted area of the road, even if that area is physically safe. A teleoperator can help the AV navigate the difficult situation, for example by mapping a path (i.e., a trajectory) around the obstruction. The teleoperator may be one of many teleoperators available at a tele-operation center (i.e., a remote vehicle support center), where each teleoperator can monitor the state or condition of one or more vehicles.
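As an illustration, a teleoperator-supplied trajectory could be packaged as an ordered list of waypoints; the waypoint format (latitude, longitude, speed cap) is an assumption, not the patent's:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    lat: float
    lon: float
    speed_limit_mps: float  # assumed speed cap while executing the detour

def build_detour(waypoints: List[Waypoint]) -> dict:
    # Package the operator-drawn path as a message the AV can execute.
    return {"type": "validated_trajectory",
            "points": [(w.lat, w.lon, w.speed_limit_mps) for w in waypoints]}

detour = build_detour([Waypoint(48.8500, 2.3500, 3.0),
                       Waypoint(48.8501, 2.3502, 3.0),
                       Waypoint(48.8503, 2.3503, 5.0)])
print(detour)
```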

To gain situational awareness, the teleoperator may access a real-time or near-real-time (collectively, real-time) data feed from one or more sensors of the AV. The real-time data feed can be a live stream of data or static information captured when the exception situation is identified (e.g., detected, recognized). The teleoperator can then provide input (e.g., a validated solution) to the AV based on the real-time sensor data.
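A small sketch of choosing between the two feed types, assuming a hypothetical SensorFeed API (none of this is the patent's interface):

```python
class SensorFeed:
    """Stand-in for an AV sensor's data interface (assumed API)."""
    def live_stream(self) -> str:
        return "live frames..."  # real-time / near-real-time feed
    def snapshot_at_exception(self) -> str:
        return "frame captured at detection time"  # static fallback

def feed_for_operator(feed: SensorFeed, live_available: bool) -> str:
    # Prefer the live stream; fall back to the static capture taken when
    # the exception situation was identified.
    return feed.live_stream() if live_available else feed.snapshot_at_exception()

print(feed_for_operator(SensorFeed(), live_available=False))
```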

However, a stopped AV can cause a public nuisance or create dangerous road conditions. Thus, even when teleoperator assistance is requested and provided, the teleoperator must respond to the request quickly.

When an exception occurs, the AV can stop and ask for teleoperator help. This model of teleoperator support may require a large number of teleoperators, since one is needed to resolve each exception.

Implementations in accordance with this disclosure can reduce teleoperation assistance (e.g., intervention) and can reduce the time a teleoperator requires to respond to and resolve an exception situation, for example by reducing the number of exception situations that require teleoperator assistance, such as providing a trajectory to an AV.
