Invention for User interface for presenting decisions

Invented by Bibhrajit HALDER, SafeAI Inc

The market for user interface (UI) for presenting decisions is rapidly growing as businesses and organizations recognize the importance of effective decision-making processes. In today’s fast-paced and data-driven world, making informed decisions is crucial for success, and having a user-friendly interface to present these decisions is equally important.

User interface refers to the visual and interactive elements of a software or application that enable users to interact with and navigate through the system. When it comes to decision-making, a well-designed UI can significantly enhance the understanding and acceptance of decisions by stakeholders.

One of the key factors driving the market for UI in decision presentation is the increasing complexity of data and information. With the advent of big data and advanced analytics, organizations have access to vast amounts of data that can influence decision-making. However, presenting this data in a meaningful and comprehensible way can be challenging. A good UI can help simplify complex information, making it easier for decision-makers to grasp and analyze.

Furthermore, the rise of remote work and virtual collaboration has also fueled the demand for UI in decision presentation. As more teams work remotely or across different locations, having a digital platform with a user-friendly interface becomes essential for effective decision-making. A well-designed UI can facilitate communication, collaboration, and consensus-building among team members, regardless of their physical location.

In addition, the market for UI in decision presentation is driven by the need for real-time updates and dynamic visualizations. Decision-making is often an iterative process that requires continuous monitoring and adjustment. A UI that provides real-time updates and allows for interactive visualizations can help decision-makers stay informed and make timely adjustments as needed.

Moreover, the market for UI in decision presentation is not limited to business organizations. Government agencies, non-profit organizations, and educational institutions also require effective UI to present decisions to their stakeholders. Whether it is presenting policy decisions, fundraising strategies, or educational plans, a well-designed UI can enhance transparency, engagement, and understanding among stakeholders.

As the market for UI in decision presentation continues to grow, so does the competition among software developers and designers. Companies are investing in research and development to create innovative UI solutions that cater to the specific needs of different industries and sectors. From interactive dashboards and data visualization tools to collaborative decision-making platforms, there is a wide range of UI options available in the market.

In conclusion, the market for user interface for presenting decisions is expanding rapidly as organizations recognize the importance of effective decision-making processes. A well-designed UI can simplify complex information, facilitate remote collaboration, provide real-time updates, and enhance transparency and engagement. As the demand for UI in decision presentation grows, companies are investing in innovative solutions to cater to the diverse needs of different industries and sectors.

The SafeAI Inc invention works as follows

Herein are described techniques for providing information about one or more actions that an autonomous vehicle control system plans to take. The autonomous vehicle control system can also provide information about one or more reasons behind a planned action. The information can be displayed on a user interface that shows both the action and the reason behind it. The information provided by the user interface can improve user confidence in the safety of autonomous vehicles, give context for evaluating the decisions made by the autonomous vehicle management system, and let the user decide whether or not the actions planned by the autonomous vehicle make sense.

Background for User interface for presenting decisions

Recently, there has been a significant rise in the use and adoption of autonomous driving technology (e.g., autonomous vehicles). The large-scale adoption and application of artificial intelligence (AI)-based technologies in the autonomous driving domain has played a part in this rise. AI-based applications for autonomous driving are used to perform tasks such as identifying objects within the environment of an autonomous vehicle, making automated decisions that affect the vehicle's motion, and so on. However, current autonomous driving solutions that use AI systems lack the tools necessary to ensure functional safety, which is a major obstacle to the adoption and use of these technologies.

The present disclosure relates to autonomous vehicles and, more specifically, to artificial intelligence-based and machine learning techniques used by an autonomous vehicle management system to control the operations of an autonomous vehicle in a safe way. Described herein are various inventive embodiments, including methods, systems, and non-transitory computer-readable storage media that store programs, code, or instructions executable by one or more processors.

An infrastructure that increases the safety of autonomous systems, such as autonomous vehicles and autonomous machines, is provided. The autonomous vehicle management system, also referred to as a controller system, is configured to automatically perform one or more autonomous functions of a vehicle or machine in a manner that ensures the autonomous operations are carried out safely. Examples of autonomous operations include, without limitation: autonomous driving along a route, scooping or dumping operations, moving objects or materials (e.g., moving dirt from one place to another), lifting materials, driving, rolling or spreading dirt, and excavating or transporting objects or materials from one point to another.

In certain embodiments, an autonomous vehicle management system receives sensor data from one or more sensors associated with the vehicle. Based on this sensor data, the autonomous vehicle management system generates and keeps updated an internal map that contains information representing the state of the autonomous vehicle's environment (e.g., objects detected). The internal map is used in conjunction with other inputs, such as the goal to be achieved (e.g., change lanes, turn right/left, perform a special operation like digging or scooping) and safety considerations, to generate a plan of action for the autonomous vehicle. The plan of action may include a sequence of planned actions that the autonomous vehicle will perform in order to reach the goal in a safe manner. The autonomous vehicle management system can then control one or more vehicle systems to execute the actions specified in the plan.
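
To make that sense-map-plan-act flow concrete, here is a minimal sketch in Python. All names (`InternalMap`, `plan_actions`, the `sensors` and `vehicle_systems` interfaces) are hypothetical illustrations, not terminology from the patent, and the distance check is a placeholder for the real safety logic.

```python
from dataclasses import dataclass, field


@dataclass
class InternalMap:
    """Perceived state of the vehicle's environment (detected objects, etc.)."""
    objects: list = field(default_factory=list)

    def update(self, sensor_data):
        # A real system would fuse camera/lidar/radar detections here.
        self.objects = sensor_data.get("detected_objects", [])


def plan_actions(internal_map, goal):
    """Return a sequence of planned actions that pursues the goal safely."""
    actions = []
    # Placeholder safety consideration: slow down near close objects.
    if any(obj["distance_m"] < 10 for obj in internal_map.objects):
        actions.append("slow_down")
    actions.append(goal)  # e.g. "change_lanes"
    return actions


def control_step(sensors, vehicle_systems, internal_map, goal):
    internal_map.update(sensors.read())      # 1. ingest sensor data
    plan = plan_actions(internal_map, goal)  # 2. plan with safety checks
    for action in plan:                      # 3. execute via vehicle systems
        vehicle_systems.execute(action)
    return plan
```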

The autonomous vehicle management system can use a variety of artificial intelligence-based techniques, such as neural networks, reinforcement learning (RL) techniques, and others, along with models, as part of its processing. For example, the autonomous vehicle management system may use a convolutional neural network (CNN) to identify objects within the autonomous vehicle's environment using sensor data captured by the vehicle's sensors (e.g., images taken by vehicle-mounted cameras). The autonomous vehicle management system can also use RL techniques to determine the actions to be included in a plan of action for the autonomous vehicle in order to reach a goal in a safe manner.
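
As a hedged illustration of the object-identification step, the sketch below runs an off-the-shelf pretrained CNN detector (torchvision's Faster R-CNN) on a camera image. The patent does not name a specific model or library, so this is just one plausible instantiation of "CNN identifies objects from camera data".

```python
import torch
import torchvision

# Pretrained Faster R-CNN detector (torchvision >= 0.13 weights API).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()


def detect_objects(camera_image, score_threshold=0.8):
    """camera_image: float tensor of shape (3, H, W) with values in [0, 1]."""
    with torch.no_grad():
        prediction = model([camera_image])[0]
    # Keep only confident detections; these would feed the internal map.
    keep = prediction["scores"] > score_threshold
    return prediction["boxes"][keep], prediction["labels"][keep]
```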

The autonomous vehicle management system employs a variety of techniques to enhance the safety of autonomous operations. For example, the autonomous vehicle management system can provide information about one or more actions that the autonomous vehicle plans to take in the future. The autonomous vehicle management system may also provide information about one or more reasons behind a planned action. The information can be provided to a user or passenger of the autonomous vehicle (e.g., the driver or a passenger). This informs the user of the action(s) the vehicle plans to take and why, assures the user that the vehicle is behaving as it should and is not acting erratically, and enables the user to anticipate the action(s) the vehicle will take. This helps the user feel secure while the vehicle is driving or performing other tasks autonomously, and increases the user's confidence in the safety of the autonomous vehicle. The user can also take manual actions where necessary, e.g., emergency actions, to override planned actions. The information may also be provided to an object, person, or system in the autonomous vehicle's environment (e.g., to a remote operator monitoring the operation of the autonomous vehicle).

In certain embodiments, the autonomous vehicle management system can generate a user interface, such as a graphical user interface, to be output to a user in the vehicle or at a remote location. The user interface may include information that helps the user understand what the autonomous vehicle management system plans to do and why it has chosen a certain course of action. The information provided by the user interface may improve user confidence in the safety of an autonomous vehicle, for example, by providing advance notice of planned actions. It gives the user context for evaluating the decisions made by the autonomous vehicle management system. It also allows the user to decide whether or not the actions planned by the autonomous vehicle are logical, and to intervene in autonomous operations if needed.

In certain embodiments, the process of generating a user interface involves determining, based on information provided by a plurality of sensors, an action that the vehicle will perform and a reason why the action will be performed. The action can be determined using the various machine learning and artificial intelligence-based techniques described in this document. The user interface may be generated with an indication of the action as well as an indication of the reason, and can then be output on a display to a user of the vehicle or a remote user. Outputting the user interface may include displaying both the indication of the action and the indication of the reason: at least one graphical element can represent the vehicle performing the action, and at least one other graphical element can represent the reason. The controller system can then decide whether to proceed with the action. In some embodiments, the decision to proceed with the action is made either before or after the user interface has been output. In certain embodiments, the decision to proceed with the action is based on updated information from the plurality of sensors received after the action has been determined.
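
A minimal sketch of this step, assuming a simple textual rendering in place of the graphical elements; the `PlannedAction` fields and function names are invented for illustration and are not the patent's terminology.

```python
from dataclasses import dataclass


@dataclass
class PlannedAction:
    action: str  # e.g. "slowing down"
    reason: str  # e.g. "pedestrian detected ahead"


def render_user_interface(planned: PlannedAction) -> str:
    """Textual stand-in for the two graphical indications."""
    return f"[ACTION] {planned.action}\n[REASON] {planned.reason}"


ui = render_user_interface(
    PlannedAction(action="slowing down", reason="pedestrian detected ahead")
)
print(ui)
# [ACTION] slowing down
# [REASON] pedestrian detected ahead
```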

In certain embodiments, the user interface is generated to include an indication of an interaction between the vehicle and an object in the environment around the vehicle if the planned action is not performed. This lets the user understand what could happen, such as a dangerous situation, if the planned action is not taken.

In certain embodiments, the user interface is generated to include an indication of an interaction between the vehicle and an object in the environment around the vehicle when the planned action is performed. This lets the user see what could happen if the action is performed.

In certain embodiments, the user interface includes an indication of multiple actions. For example, the indication may include an indication of a second action determined before the determination of the first action. The second action may be an action that has already been completed or an action that was superseded by the first action. This allows the user to track the reasoning behind the sequence of decisions taken by the controller system.

In certain embodiments, the user can provide input in response to the user interface that causes the controller system to cancel an action indicated by the user interface. This allows the user to intervene in autonomous operations as necessary.
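
The override path might look like the following sketch, where the user's response to the displayed plan decides whether the controller proceeds. The function and parameter names are assumptions for illustration, not the patent's API.

```python
def execute_with_override(action: str, reason: str, read_user_input, execute):
    """Display the planned action, then run it unless the user cancels it."""
    print(f"Planned: {action} (reason: {reason})")
    if read_user_input() == "cancel":
        return "cancelled by user"  # controller abandons the action
    execute(action)
    return "executed"


# Example: the user lets the action proceed.
result = execute_with_override(
    "slow down", "pedestrian ahead",
    read_user_input=lambda: "ok",
    execute=lambda a: None,
)
print(result)  # -> "executed"
```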

These and other features and embodiments will be more readily understood from the following specification, the claims, and the accompanying drawings.

In the following description, specific details are set forth in order to provide a thorough understanding of certain inventive embodiments. However, it will be apparent that various embodiments may be practiced without these specific details. The figures and description are not intended to be restrictive. The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs.

Reference is made throughout this specification to "one embodiment," "an embodiment," or similar language, which means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in one embodiment," "in an embodiment," and similar language throughout this specification do not necessarily all refer to the same embodiment.

As noted above, the present disclosure relates to autonomous vehicles and, more specifically, to artificial intelligence-based and machine learning techniques used by an autonomous vehicle management system to control the operations of an autonomous vehicle in a safe way.

The autonomous vehicle management system described in this disclosure uses different techniques to increase the safety of autonomous operations. For example, the autonomous vehicle management system can dynamically control the behavior of the sensors associated with the autonomous vehicle that provide the sensor data the system processes. For a given sensor, the autonomous vehicle management system can dynamically change and control: what sensor data is captured by the sensor and/or communicated from the sensor to the autonomous vehicle management system (e.g., granularity/resolution of the data, field of view of the data, partial/detailed data, how much data is communicated, zoom associated with the data, and the like); when the data is captured and/or communicated (e.g., on demand, according to a schedule); and how the data is captured and/or communicated (e.g., communication format, communication protocol, rate of data communication to the autonomous vehicle management system). Since the autonomous vehicle management system builds its internal map based upon sensor data received from the sensors, being able to dynamically control the behavior of the sensors means that the information used to build and maintain the internal map can also be dynamically controlled.
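
A minimal sketch of such dynamic sensor control, with `SensorConfig` fields mirroring the what/when/how categories above; all names here are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class SensorConfig:
    resolution: str = "full"          # what: granularity of the captured data
    field_of_view_deg: float = 120.0  # what: spatial extent of the captured data
    schedule: str = "continuous"      # when: "continuous" or "on_demand"
    rate_hz: float = 10.0             # how: rate of communication to the manager


class ManagedSensor:
    def __init__(self):
        self.config = SensorConfig()

    def reconfigure(self, **changes):
        """Called by the management system to adjust sensor behavior at runtime."""
        for key, value in changes.items():
            setattr(self.config, key, value)


camera = ManagedSensor()
# e.g. narrow the field of view and raise the frame rate ahead of a turn
camera.reconfigure(field_of_view_deg=60.0, rate_hz=30.0)
```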

As another example, the autonomous vehicle management system can simulate and evaluate various "what-if" scenarios as part of its decision-making process. These what-if scenarios project different behavioral predictions onto the internal map of the autonomous vehicle and can be used to determine the safest sequence of actions that the autonomous vehicle should take to achieve a specific goal. For example, the autonomous vehicle management system can run different what-if scenarios to determine the safest way to perform a turn. Each what-if simulation may simulate a different behavior pattern (e.g., varying speeds, paths, pedestrian movements, etc.). The autonomous vehicle management system then determines the safest course of action for the autonomous vehicle based on these simulations.
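
A toy sketch of the what-if evaluation: each candidate plan is simulated against the internal map, scored for risk, and the lowest-risk plan is selected. The risk model and the `simulate` interface are placeholders, since the patent does not define them.

```python
def safest_plan(candidate_plans, simulate):
    """simulate(plan) -> risk score (lower is safer); an assumed interface."""
    scored = [(simulate(plan), plan) for plan in candidate_plans]
    risk, best = min(scored, key=lambda pair: pair[0])
    return best, risk


# Example with a toy risk model: slower turns score as safer.
plans = [{"turn_speed_mph": s} for s in (5, 15, 25)]
best, risk = safest_plan(plans, simulate=lambda p: p["turn_speed_mph"] / 25)
print(best, risk)  # -> {'turn_speed_mph': 5} 0.2
```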
