Invention for Autonomous vehicle navigation system and method

Invented by James Paduano, Terrence McKenna, Aurora Flight Sciences Corp

Autonomous vehicle navigation systems have been a topic of interest for many years, and the market for these systems is expected to grow significantly in the coming years. These systems are designed to allow vehicles to navigate and operate without human intervention, using a combination of sensors, cameras, and other technologies.

The market for autonomous vehicle navigation systems is expected to grow at a CAGR of over 20% from 2021 to 2026. This growth is driven by the increasing demand for autonomous vehicles in applications such as transportation, logistics, and delivery services, as well as by the increasing need for efficient and safe transportation.

One of the key factors driving the growth of the autonomous vehicle navigation system market is the increasing demand for connected cars. Connected cars are vehicles that are equipped with advanced communication technologies that allow them to communicate with other vehicles, infrastructure, and the internet. These technologies enable vehicles to share information about traffic, road conditions, and other factors that can affect their navigation.

Another factor driving the growth of the autonomous vehicle navigation system market is the increasing demand for advanced driver assistance systems (ADAS). ADAS are systems that are designed to assist drivers in various tasks such as parking, lane changing, and collision avoidance. These systems are becoming increasingly popular in modern vehicles, and they are expected to become a standard feature in the coming years.

The market for autonomous vehicle navigation systems is also being driven by the increasing adoption of artificial intelligence (AI) and machine learning (ML) technologies. These technologies are being used to develop advanced navigation systems that can learn from their environment and make decisions based on real-time data.

In terms of geography, North America is expected to dominate the autonomous vehicle navigation system market during the forecast period. This is due to the presence of several major players in the region, as well as the increasing adoption of autonomous vehicles in various applications.

In conclusion, the market for autonomous vehicle navigation systems is expected to grow significantly in the coming years, driven by the increasing demand for connected cars, advanced driver assistance systems, and AI and ML technologies. As the technology continues to evolve, we can expect to see even more advanced and sophisticated navigation systems that will enable vehicles to operate safely and efficiently in a wide range of environments.

The Aurora Flight Sciences Corp invention works as follows

An autonomous vehicle can be improved by a navigation system that includes both cameras and echolocation sensors with overlapping fields of view. The cameras and echolocation sensors may be part of an optical system and an echolocation system that work in conjunction with a global positioning system (GPS). Together, these determine a course for the autonomous vehicle to reach an objective while detecting and avoiding obstacles.

Background for Autonomous vehicle navigation system and method

Unmanned aerial vehicle (UAV) technology has been a valuable tool in mission profiles involving intelligence, surveillance, and reconnaissance, as well as payload delivery. A UAV such as a micro-air vehicle (MAV) may encounter both large and small obstacles during low-altitude urban reconnaissance. The positions of these obstacles, which may be fixed or moving, are not known beforehand, which can make navigation difficult for a UAV. It is therefore important to have autonomous vehicle navigation systems that can respond to unknown and varied obstacles in complex navigational environments.

An autonomous vehicle can be improved by a navigation system that includes both cameras and echolocation sensors with overlapping fields of view. The cameras and echolocation sensors may be part of an optical system and/or an echolocation system that works together with a global positioning system (GPS). This allows the autonomous vehicle to plan a route toward an objective while detecting and avoiding obstacles.

In one aspect, a navigation system for a vehicle comprises a housing and an optical system having a number of cameras mounted within the housing. The cameras may have a first predetermined overlap in their fields of view, and the optical system may aggregate at least one hundred eighty degrees of optical field of view about the housing. The navigation system may also include an echolocation system with a number of echolocation sensors mounted within the housing. The echolocation sensors may have a second predetermined overlap in their fields of view, and the echolocation system may aggregate at least ninety degrees of acoustic field of view about the housing.

In another aspect, a method of navigating a vehicle comprises: determining the vehicle's position using a global positioning system; determining a course from that position to an objective; detecting a first obstacle using one or more cameras; calculating a revised course to the objective that avoids the first obstacle; detecting a second obstacle using an array of echolocation sensors; and calculating a maneuver to avoid the second obstacle and return to the revised course. A minimal sketch of this loop appears below.
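
To make that sequence concrete, here is a minimal Python sketch of the kind of control loop this aspect describes. Every name in it (the planner, the optical and echolocation detectors, and so on) is a hypothetical placeholder for illustration, not the patented implementation.

```python
# Sketch of the two-mode navigation loop described above. All helper
# objects and method names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Position:
    lat: float
    lon: float
    alt: float

def navigate(vehicle, objective: Position) -> None:
    """Steer toward an objective, revising the course around obstacles."""
    position = vehicle.gps.fix()                        # GPS position fix
    course = vehicle.planner.plan(position, objective)  # initial course

    while not vehicle.at(objective):
        # Long-range mode: camera-based (e.g., optical flow) detection.
        obstacle = vehicle.optical.detect()
        if obstacle is not None:
            # Calculate a revised course to the objective that avoids it.
            course = vehicle.planner.replan(course, obstacle)

        # Short-range mode: echolocation sensor array.
        near = vehicle.echolocation.detect()
        if near is not None:
            # Calculate and fly a maneuver around the second obstacle,
            # then return to the revised course.
            vehicle.steering.execute(vehicle.planner.avoid(near, course))

        vehicle.steering.follow(course)
```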

In another aspect, a computer program product, executing on one or more computers, navigates a vehicle to an objective using one or more cameras.

Described herein are devices, systems, and methods for autonomous vehicle navigation, including navigation that uses multiple sensing modes to avoid obstacles.

All documents mentioned herein are hereby incorporated by reference in their entirety. Unless the text explicitly states otherwise or it is clear from the context, any reference to items in the singular should be understood to include items in the plural, and vice versa. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context. The term "or" should therefore generally be understood to mean "and/or," and so forth.

Recitations of ranges of values herein are not intended to be limiting, but instead refer individually to each and every value falling within the range, unless otherwise indicated herein, and each separate value within such a range is incorporated into the specification as if it were individually recited. The words "about," "approximately," and the like, when accompanying a numerical value, should be understood as denoting a deviation that one of ordinary skill in the art would recognize as operating satisfactorily for the intended purpose. Values and ranges of values are provided here only as examples and do not limit the scope of the described embodiments. Any and all examples and exemplary language ("e.g.," "such as," and the like) are provided merely to better illuminate the embodiments and do not limit their scope. No language in the specification should be construed as indicating that any unclaimed element is essential to the practice of the embodiments.

In the following description, it is understood that terms such as "first," "second," "top," "bottom," "side," "front," "back," and the like are words of convenience and are not to be construed as limiting terms.

FIG. 1 shows a perspective view of an autonomous vehicle. The vehicle 100 may include a navigation module 102, a housing 104, and a steering mechanism 106.

While the vehicle 100 shown in FIG. 1 is an aerial vehicle, which may also be operated as a remote-controlled vehicle, it should be noted that the autonomous vehicles described herein may include any other vehicle, device, component, or element that can navigate using the principles of this invention. They may also or instead include vertical-takeoff-and-landing (VTOL) aircraft with forward flight capability, as well as helicopters and other vehicles that use horizontal propellers for lift.

The navigation module 102 is generally used to navigate the vehicle 100. This "module" is conceptual rather than a single physical item on the vehicle. The navigation module 102 may be part of a larger navigation system, or it may contain all of that system's components. Any component described in relation to the navigation system may also be used by or included in the navigation module 102, and vice versa.

Based on signals from its components, the navigation module 102 can determine a navigational path for the vehicle 100 to follow toward an objective, and it may direct the steering mechanism 106 to steer the vehicle 100 along that path. The navigation module 102 may be contained entirely or partially within the housing 104 or the fuselage 108, and it may include any component of the housing or of the navigation system (see FIG. 5 below). The navigation module 102 may also be placed in communication with a remote location, sending and receiving signals between the vehicle 100 and that location.

The housing 104 may be removable from the fuselage 108 and replaceable with another housing, and it may contain any of the systems or subsystems of a navigation system as described herein. For example, the housing 104 may include the optical sensors and echolocation sensors for augmented navigation in a removable and replaceable package that can be taken off the vehicle 100 and swapped with an identical package for easy reuse. Functionality may be distributed in any suitable manner between components in the housing 104 and components elsewhere in the vehicle 100, with a suitable electronic and mechanical communication interface provided to permit removal and replacement of the housing 104 on the fuselage 108.

The steering mechanism 106 may include rudders and elevators on the vehicle 100, as well as any other mechanism for steering an autonomous vehicle. For other aerial vehicles, such as helicopters, the steering mechanism 106 may include a number of rotors, which may be fixed or steerable, along with foils and other control surfaces. The steering mechanism 106 may also include articulated electric motors that use vectored-thrust control to change the thrust vector. For land-based vehicles, the steering mechanism 106 may include a rack-and-pinion system, variably rotatable treads, a recirculating-ball system, and the like. The steering mechanism 106 may further include any components that provide thrust, acceleration, and deceleration for the vehicle 100. Although vehicles may in general use separate or integrated components for drive and direction, all combinations that permit control over vehicle movement fall within the "steering mechanism" contemplated herein.

FIG. 2 is a perspective view of a navigation module 200 that could be used in a navigation system as described herein. The navigation module 200 may include a modular housing 202, an optical system 204 including cameras 206, and an echolocation system 208 including echolocation sensors 210. The navigation module 200 may be attached to the vehicle's exterior or installed inside the vehicle; it may be removable and replaceable, or it may be permanently coupled to or integrated into the vehicle. The navigation module 200 can be used in conjunction with the navigation system described herein to determine the vehicle's navigational path.

The components of the navigation module 200 sense data used to determine the vehicle's navigational path. For example, the navigation module 200 may send signals based on the sensed data to the navigation system, which in turn may send signals to a steering control system to direct the vehicle along the navigational path. The navigation module 200 may include all components of the navigation system, or it may be a single component of it; the module can be made modular at whatever level suits a specific implementation. A rough illustration of this signal flow follows.
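
The sketch below models that flow: a sensor report leaving the module, and a steering command leaving the navigation system. The message fields, the ten-meter threshold, and the trivial steer-away policy are invented for illustration only.

```python
# Sketch of module-to-navigation-to-steering signal flow. All fields
# and thresholds are illustrative assumptions, not patent values.
from dataclasses import dataclass

@dataclass
class SensorReport:
    source: str          # "optical" or "echolocation"
    bearing_deg: float   # direction to the detected obstruction
    range_m: float       # estimated distance to the obstruction

@dataclass
class SteeringCommand:
    heading_deg: float   # commanded heading along the navigational path
    speed_m_s: float     # commanded speed

def to_steering(report: SensorReport,
                course_heading_deg: float) -> SteeringCommand:
    """Trivial policy: turn away from a near obstruction and slow down,
    otherwise hold the planned course."""
    if report.range_m < 10.0:  # arbitrary proximity threshold
        return SteeringCommand((report.bearing_deg + 90.0) % 360.0, 2.0)
    return SteeringCommand(course_heading_deg, 5.0)
```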

The components of the navigation module 200 may be housed in the modular housing 202, which can be made of metal, plastic, wood, composite materials, ceramic, or any other material suitable for a particular vehicle or vehicle type. The modular housing 202 may be detachable or ejectable, or it may be permanently attached to the vehicle, and it can be attached in any manner known to one of ordinary skill in the art. The modular housing 202 may include openings for sensors such as the cameras 206 and the echolocation sensors 210.

The optical system 204 may include optical sensors such as the cameras 206. These may be digital still cameras, multi-lens cameras, or any other optical sensors capable of capturing images at a frame rate and resolution appropriate for the systems and methods described herein. The optical system 204 uses these sensors to capture images within their fields of view (FOVs). It may process these images to identify obstructions, for example using optical flow, or it may forward them to another processor. The optical system 204 may send the images and/or processed sensor data to a component or processor of the navigation system, where the data can be used to determine the vehicle's navigational path.
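
As a concrete illustration of the optical-flow approach just mentioned, the sketch below flags a possible obstruction when the flow field in the image center grows large, as it tends to for an object on a collision course. It uses OpenCV's dense Farnebäck flow; the threshold and the choice of central region are arbitrary assumptions, not values from the patent.

```python
# Sketch: flagging a potential obstruction from dense optical flow
# between two grayscale frames. Threshold is illustrative only.
import cv2
import numpy as np

def obstruction_suspected(prev_gray: np.ndarray,
                          curr_gray: np.ndarray,
                          threshold: float = 4.0) -> bool:
    """Return True if mean flow magnitude in the image center suggests
    an approaching object (flow expands around objects on a collision
    course)."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])

    # Examine the central region of the FOV, where an object on a
    # collision course produces large, expanding flow.
    h, w = mag.shape
    center = mag[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    return float(center.mean()) > threshold
```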

The cameras 206 may include any number of cameras, such as a first camera 206a, a second camera 206b, and a third camera 206c. The FOV of the first camera 206a, the first FOV 212a, is shown as the region between the dashed double lines 214a and 214b. The FOV of the second camera 206b, the second FOV 212b, is shown as the region between the dashed double lines 216a and 216b. The FOV of the third camera 206c, the third FOV 212c, is shown as the region between the dashed double lines 218a and 218b. The first shaded region 220 shows where the first FOV 212a and the second FOV 212b overlap, and the second shaded region 222 shows where the second FOV 212b and the third FOV 212c overlap. The aggregate FOV of the first, second, and third cameras 206a, 206b, 206c lies between the dashed double lines 214a and 218b, and includes the first FOV 212a, the second FOV 212b, and the third FOV 212c, together with the shaded regions 220 and 222 where the FOVs overlap. Overlapping sensors in this way allows an aggregate FOV of any desired horizontal or vertical range (or azimuth/altitude in a spherical coordinate system), such as ninety degrees, one hundred twenty degrees, one hundred eighty degrees, or three hundred sixty degrees. As shown in FIG. 2, the optical system 204 may aggregate at least one hundred eighty degrees of optical FOV about the modular housing 202 in a plane P1. Any desired overlap between the optical sensors, and any desired combined field of view, may be used; one of ordinary skill in the art will recognize that more cameras with overlapping FOVs, or cameras with larger individual FOVs, yield a larger aggregate optical FOV. In one example, the optical system may aggregate three hundred sixty degrees of optical FOV about the housing, and smaller aggregate optical FOVs are also possible. The cameras and optical FOVs described here are only one example of the systems possible with the techniques discussed herein; many configurations can be implemented.
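
The arithmetic behind these aggregate FOV figures is simple to check. The sketch below computes the union of several overlapping camera FOVs on a circle; the three-camera layout in the example is invented for illustration and is not the specific configuration of FIG. 2.

```python
# Sketch: aggregate horizontal FOV of overlapping cameras.
# Camera headings and per-camera FOV below are invented values.
def aggregate_fov(headings_deg, fov_deg):
    """Union of camera FOVs (degrees) on a circle, assuming each camera
    covers heading +/- fov/2. Returns the total covered arc in degrees."""
    covered = [False] * 360
    half = fov_deg / 2
    for heading in headings_deg:
        for d in range(int(heading - half), int(heading + half)):
            covered[d % 360] = True
    return sum(covered)

# Three cameras with 70-degree FOVs, mounted 60 degrees apart: each
# adjacent pair overlaps by 10 degrees, and the union spans 190
# degrees -- more than the one-hundred-eighty-degree aggregate above.
print(aggregate_fov([-60, 0, 60], 70))  # -> 190
```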

The echolocation system 208 may include acoustic transceivers, such as the echolocation sensors 210, that can output acoustic signals and detect echoes of those signals. A single echolocation sensor can detect obstacles along its line of sight, while an array of such sensors can provide more robust three-dimensional detection using beam forming or similar techniques. The echolocation system 208 may use the echolocation sensors 210 to detect obstructions within the array's acoustic FOV and send the sensor data to a component of the navigation system, where it can be used to determine the vehicle's navigational path. The echolocation system 208 may provide processed data, such as data characterizing the shape, size, distance, and movement of an obstruction, or it may simply provide an alert based on a detected obstacle. The echolocation system 208 may also interpret the sensor data itself to create a navigational response for the vehicle, such as a collision-avoidance maneuver, and it may work with the optical system 204 to enhance contextual data or to provide an alternative mode of obstacle sensing.
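
The basic range measurement behind any such echolocation sensor is a time-of-flight calculation: emit a ping, time the echo, and halve the round trip. A minimal sketch, assuming the nominal speed of sound in air (a value chosen here for illustration, not taken from the patent):

```python
# Sketch: range from an echolocation ping's round-trip time.
# Speed of sound taken as ~343 m/s (dry air, 20 C); real systems
# would compensate for temperature and airspeed.
SPEED_OF_SOUND_M_S = 343.0

def echo_range_m(round_trip_s: float) -> float:
    """Distance to a reflecting obstacle from echo round-trip time.
    The pulse travels out and back, hence the division by two."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A 23.3 ms echo corresponds to an obstacle roughly 4 m away.
print(round(echo_range_m(0.0233), 2))  # -> 4.0
```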
