Sony Patent | Information processing device, information processing method, and information processing system

Patent: Information processing device, information processing method, and information processing system


Publication Number: 20210224976

Publication Date: 2021-07-22

Applicant: Sony

Abstract

The present technology relates to an information processing device, an information processing method, and an information processing system by which a user can easily recognize a position of a sensor having a defect from among sensors mounted on a vehicle. Position-related information about a relative position or direction with respect to a vehicle is acquired, and a combined image obtained by combining a defect image representing a position of the sensor having a defect from among the sensors mounted on the vehicle with a vehicle image reflecting the vehicle is displayed in response to the position-related information. The present technology can be applied so that a user can recognize a sensor having a defect from among the sensors mounted on a vehicle.

Claims

  1. An information processing device comprising: a position-related information acquisition unit configured to acquire position-related information about a relative position or direction with respect to a vehicle; and a display controller configured to display a combined image obtained by combining a defect image representing a position of a sensor having a defect from among sensors mounted on the vehicle with a vehicle image reflecting the vehicle in response to the position-related information.

  2. The information processing device according to claim 1, wherein the display controller displays the combined image obtained by combining an image representing a relative position of the information processing device with respect to the vehicle and the defect image with the vehicle image overlooking the vehicle in response to the position-related information.

  3. The information processing device according to claim 1, wherein the display controller displays the combined image obtained by combining the defect image with the vehicle image of the vehicle viewed at a position of the information processing device in response to the position-related information.

  4. The information processing device according to claim 3, wherein the vehicle image is computer graphics (CG) of the vehicle viewed at the position of the information processing device or a captured image obtained by capturing an image of the vehicle using a camera at the position of the information processing device.

  5. The information processing device according to claim 1, wherein the display controller switches bird’s eye view display for displaying the combined image obtained by combining an image representing a relative position of the information processing device with respect to the vehicle and the defect image with the vehicle image overlooking the vehicle, virtual reality (VR) display for displaying the combined image obtained by combining the defect image with computer graphics (CG) of the vehicle viewed at the position of the information processing device, and augmented reality (AR) display for displaying the combined image obtained by combining the defect image with a captured image obtained by capturing an image of the vehicle using a camera at the position of the information processing device.

  6. The information processing device according to claim 5, wherein the display controller performs the bird’s eye view display when the sensor having the defect is not viewed at the position of the information processing device, and performs the VR display or the AR display when the sensor having the defect is viewed at the position of the information processing device.

  7. The information processing device according to claim 5, wherein the display controller performs the VR display or the AR display in response to brightness of the captured image.

  8. The information processing device according to claim 1, wherein the vehicle causes light-emitting parts around the sensor having the defect to be turned on.

  9. The information processing device according to claim 8, wherein the vehicle causes the light-emitting parts to be turned off when the defect of the sensor having the defect has been eliminated.

  10. The information processing device according to claim 8, wherein the sensor includes the light-emitting parts.

  11. The information processing device according to claim 1, wherein the sensor is a camera.

  12. The information processing device according to claim 1, wherein the defect is dirt on the sensor.

  13. The information processing device according to claim 12, wherein the defect image includes an image representing a position of the dirt on the sensor having the dirt.

  14. The information processing device according to claim 1, which is a portable terminal.

  15. An information processing method comprising: acquiring position-related information about a relative position or direction with respect to a vehicle; and displaying a combined image obtained by combining a defect image representing a position of a sensor having a defect from among sensors mounted on the vehicle with a vehicle image reflecting the vehicle in response to the position-related information.

  16. An information processing system comprising: a sensor configured to be mounted on a vehicle; a defect detector configured to detect a defect of the sensor; a display unit configured to display an image; a position-related information acquisition unit configured to acquire position-related information about a relative position or direction with respect to the vehicle; and a display controller configured to display a combined image obtained by combining a defect image representing a position of a sensor having a defect from among sensors mounted on the vehicle with a vehicle image reflecting the vehicle in response to the position-related information.

Description

TECHNICAL FIELD

[0001] The present technology relates to an information processing device, an information processing method, and an information processing system, and particularly, to an information processing device, an information processing method, and an information processing system by which a user can easily recognize, for example, a position of a sensor at which there is a defect from among sensors mounted on a vehicle.

BACKGROUND ART

[0002] Recently, a large number of sensors such as cameras have been provided on the outside of vehicles such as four-wheeled vehicles for driving assistance functions such as automatic braking.

[0003] Dirt easily adheres to sensors provided outside a vehicle, and dirt adhering to a sensor hinders its sensing; it is therefore necessary to clean such dirt off the sensor.

[0004] For example, PTL 1 proposes a technology for displaying an image captured using a camera as a sensor on a portable terminal and notifying a user of a contamination state of the sensor.

CITATION LIST

Patent Literature

[0005] [PTL 1]

[0006] JP 2012-106557A

SUMMARY

Technical Problem

[0007] When an image captured by a camera serving as a sensor is displayed on a portable terminal, a user can recognize that the camera has dirt but has difficulty recognizing the position on the vehicle at which the dirty camera is mounted. It is accordingly difficult for the user to clean the dirt adhering to the sensor when the position of the dirty camera is unclear.

[0008] In view of such situations, the present technology causes a user to be able to easily recognize a position of a sensor having a defect such as dirt from among sensors mounted on a vehicle.

Solution to Problem

[0009] An information processing device of the present technology is an information processing device including: a position-related information acquisition unit configured to acquire position-related information about a relative position or direction with respect to a vehicle, and a display controller configured to display a combined image obtained by combining a defect image representing a position of a sensor having a defect from among sensors mounted on the vehicle with a vehicle image reflecting the vehicle in response to the position-related information.

[0010] An information processing method of the present technology is an information processing method including: acquiring position-related information about a relative position or direction with respect to a vehicle; and displaying a combined image obtained by combining a defect image representing a position of a sensor having a defect from among sensors mounted on the vehicle with a vehicle image reflecting the vehicle in response to the position-related information.

[0011] An information processing system of the present technology is an information processing system including: a sensor configured to be mounted on a vehicle; a defect detector configured to detect a defect of the sensor; a display unit configured to display an image; a position-related information acquisition unit configured to acquire position-related information about a relative position or direction with respect to the vehicle; and a display controller configured to display a combined image obtained by combining a defect image representing a position of a sensor having a defect from among sensors mounted on the vehicle with a vehicle image reflecting the vehicle in response to the position-related information.

[0012] In the information processing device, the information processing method, and the information processing system of the present technology, position-related information about a relative position or direction with respect to a vehicle is acquired, and a combined image obtained by combining a defect image representing a position of a sensor having a defect from among sensors mounted on the vehicle with a vehicle image reflecting the vehicle is displayed in response to the position-related information.

[0013] Note that the information processing device may be an independent device or may be an internal block constituting a single device.

[0014] In addition, the information processing device can be realized by causing a computer to execute a program. The program can be provided by being transmitted through a transmission medium or by being recorded in a recording medium.

Advantageous Effects of Invention

[0015] According to the present technology, a user can easily recognize a position of a sensor having a defect from among sensors mounted on a vehicle.

[0016] The effects described herein are not necessarily limiting and any effect described in the present disclosure may be obtained.

BRIEF DESCRIPTION OF DRAWINGS

[0017] FIG. 1 is a diagram showing an overview of one embodiment of a vehicle control system as an information processing system to which the present technology is applied.

[0018] FIG. 2 is a block diagram showing an example of an electrical configuration of a vehicle 1.

[0019] FIG. 3 is a block diagram showing an example of an electrical configuration of a vehicle exterior terminal 40.

[0020] FIG. 4 is a diagram illustrating augmented reality (AR) display as a first display example of a combined image.

[0021] FIG. 5 is a diagram illustrating virtual reality (VR) display as a second display example of a combined image.

[0022] FIG. 6 is a diagram illustrating bird’s eye view display as a third display example of a combined image.

[0023] FIG. 7 is a diagram illustrating an example of control of equipment of the vehicle 1 synchronized with notification processing.

[0024] FIG. 8 is a flowchart illustrating notification processing of a vehicle control device 10.

[0025] FIG. 9 is a flowchart illustrating notification processing of the vehicle exterior terminal 40.

[0026] FIG. 10 is a diagram showing a display example of display of a defect image including an image representing a position of dirt.

[0027] FIG. 11 is a cross-sectional view and a plan view showing an example of a configuration of a sensor 30.sub.i having light-emitting parts.

[0028] FIG. 12 is a block diagram showing an example of a configuration of one embodiment of a computer to which the present technology is applied.

DESCRIPTION OF EMBODIMENTS


[0030] FIG. 1 is a diagram showing an overview of one embodiment of a vehicle control system as an information processing system to which the present technology is applied.

[0031] In FIG. 1, the vehicle control system includes a vehicle 1 and a vehicle exterior terminal 40.

[0032] The vehicle 1 is a four-wheeled car, for example, and includes a vehicle control device 10. The vehicle 1 is further equipped with on-board lights 20.sub.1 and 20.sub.2 as light-emitting parts attached to the vehicle 1, such as headlights, taillights, and light-emitting diode (LED) lamps, and with sensors 30.sub.1, 30.sub.2, 30.sub.3, 30.sub.4, and 30.sub.5, such as cameras, on the outside of the vehicle.

[0033] The vehicle control device 10 performs various types of control such as driving assistance in response to sensor information supplied from sensors 30.sub.i (i=1, 2, 3, 4, 5 in the present embodiment). In addition, the vehicle control device 10 performs control such as turning on and turning off of the on-board lights 20.sub.1 and 20.sub.2.

[0034] Further, the vehicle control device 10 performs wireless communication with the vehicle exterior terminal 40 to exchange various types of information.

[0035] For example, the vehicle control device 10 detects a defect of a sensor 30.sub.i in response to the sensor information from the sensors 30.sub.i, generates defect information including a position of the sensor 30.sub.i (on the vehicle 1) having the defect, and the like and transmits the defect information to the vehicle exterior terminal 40. In addition, the vehicle control device 10 receives position-related information and a cleaning completion notification transmitted from the vehicle exterior terminal 40.

[0036] The vehicle control device 10 turns on an on-board light 20.sub.j (j=1, 2 in the present embodiment) close to the sensor 30.sub.i having the defect in response to the position-related information from the vehicle exterior terminal 40. In addition, the vehicle control device 10 turns off the on-board light 20.sub.j in response to the cleaning completion notification from the vehicle exterior terminal 40.
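Selecting the on-board light 20.sub.j "close to" the defective sensor, as described in [0036], can be sketched as a nearest-neighbor lookup. The coordinates below are hypothetical (the text does not give a layout); they merely illustrate picking the light with the smallest Euclidean distance to the defective sensor in a vehicle-fixed frame.

```python
import math

# Hypothetical 2D positions (meters, vehicle coordinate frame) of the
# on-board lights 20_j and the sensors 30_i; the actual layout is not
# specified in the text.
LIGHT_POSITIONS = {1: (1.8, 0.0), 2: (-1.8, 0.0)}
SENSOR_POSITIONS = {1: (1.9, 0.5), 2: (1.9, -0.5), 3: (0.0, 0.9),
                    4: (-1.9, 0.5), 5: (0.0, -0.9)}

def nearest_light(sensor_id):
    """Return the id of the on-board light 20_j closest to sensor 30_i."""
    sx, sy = SENSOR_POSITIONS[sensor_id]
    return min(LIGHT_POSITIONS,
               key=lambda j: math.hypot(LIGHT_POSITIONS[j][0] - sx,
                                        LIGHT_POSITIONS[j][1] - sy))
```

With the layout above, dirt on front-mounted sensor 30.sub.1 would select light 20.sub.1, while dirt on rear-mounted sensor 30.sub.4 would select light 20.sub.2.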

[0037] The vehicle exterior terminal 40 is, for example, an information processing device such as a smartphone or other portable terminal that can be carried by a user of the vehicle 1 (e.g., its driver), and performs wireless communication with the vehicle control device 10 to exchange various types of information.

[0038] For example, the vehicle exterior terminal 40 acquires position-related information about a relative position or direction of the vehicle exterior terminal 40 with respect to the vehicle and transmits the position-related information to the vehicle control device 10. In addition, the vehicle exterior terminal 40 transmits a cleaning completion notification representing that cleaning of, for example, dirt as a defect of a sensor 30.sub.i is completed to the vehicle control device 10 in response to an operation of a user.

[0039] Further, the vehicle exterior terminal 40 receives defect information transmitted from the vehicle control device 10. The vehicle exterior terminal 40 generates a defect image representing a position of a sensor 30.sub.i having a defect from the defect information from the vehicle control device 10 and combines the defect image with a vehicle image reflecting the vehicle 1 in response to position-related information. In addition, the vehicle exterior terminal 40 causes a display unit 145 to display a combined image obtained by combining the defect image with the vehicle image.

[0040] In FIG. 1, in the vehicle exterior terminal 40, a blackened circular image indicating the position of a sensor 30.sub.i having a defect on the vehicle 1 and a balloon image carrying the message “This camera is dirty with mud” are used as a defect image, and this defect image is combined (superimposed) with a vehicle image and displayed as a combined image.
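The superimposition just described can be sketched as a pixel-level composite: a filled circular marker drawn over the vehicle image at the pixel where the defective sensor appears. This is only an illustration of the combining step; the marker shape, size, and projection of the sensor position into image coordinates are assumptions, not details from the text.

```python
import numpy as np

def overlay_defect_marker(vehicle_image, center, radius=6, color=(0, 0, 0)):
    """Superimpose a filled circular defect marker onto a vehicle image.

    vehicle_image: H x W x 3 uint8 array (the vehicle image).
    center: (row, col) pixel at which the defective sensor appears;
    obtaining this pixel from the position-related information is out of
    scope for this sketch.
    """
    combined = vehicle_image.copy()  # leave the source image untouched
    h, w = combined.shape[:2]
    rows, cols = np.ogrid[:h, :w]
    # Boolean mask of all pixels inside the circle.
    mask = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
    combined[mask] = color
    return combined
```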

[0041] As described above, in the vehicle control system of FIG. 1, the vehicle exterior terminal 40 displays a combined image obtained by combining the vehicle image reflecting the vehicle 1 with a defect image indicating the position of the sensor 30.sub.i having a defect. A user can therefore easily recognize the position of the defective sensor 30.sub.i from among the sensors 30.sub.1 to 30.sub.5 mounted on the vehicle 1 simply by carrying the vehicle exterior terminal 40 outside the vehicle and viewing the combined image.

[0042] Here, when the vehicle control device 10 detects a defect such as dirt on a sensor 30.sub.i mounted on the outside of the vehicle 1, there are cases in which the user needs to exit the vehicle and clean the sensor 30.sub.i. For example, the user needs to do so when the vehicle 1 is not equipped with a cleaning system for automatically cleaning the sensor 30.sub.i with water, a cleaning solution, air, or the like, or when the dirt on the sensor 30.sub.i cannot be removed by such a cleaning system even though the vehicle is equipped with one.

[0043] When the user exits the vehicle to clean the sensor 30.sub.i, the user needs to recognize the position of the dirty sensor 30.sub.i on the vehicle 1 and, as necessary, the condition of the dirt. When information on the dirt is displayed on an in-vehicle display of the vehicle 1, the user needs to view that display, memorize the position of the dirty sensor 30.sub.i and the condition of the dirt, and then exit the vehicle to find and clean the sensor. In addition, after cleaning the dirty sensor 30.sub.i, the user needs to return to the inside of the vehicle and confirm whether the vehicle control device 10 determines that the dirt has been eliminated and the sensor 30.sub.i now has no problem, that is, whether the dirt on the sensor 30.sub.i is no longer detected.

[0044] Further, when the vehicle 1 is realized as a self-driving vehicle in the future, the number of sensors 30.sub.i mounted on the vehicle 1 is expected to increase and their sizes are expected to decrease in order to improve robustness of sensing. In addition, in a world in which self-driving vehicles are widespread, people are expected to have more opportunities to use vehicles that they do not own, such as automatically dispatched taxis and car-sharing vehicles. It is thus expected to take time for the user to find a dirty sensor 30.sub.i when the vehicle 1 is not the user’s own vehicle, because the user is not familiar with the positions of the sensors 30.sub.i on the vehicle 1 or their mounted states.

[0045] Accordingly, a method for causing the user to recognize the position of the sensor 30.sub.i having dirt, for example, by blinking a light (e.g., an on-board light 20 or the like) at a position close to the sensor 30.sub.i having dirt or outputting a sound at a position close to the sensor 30.sub.i having dirt may be conceived.

[0046] However, when the number of sensors 30.sub.i mounted on the vehicle 1 increases, as described above, a large number of sensors 30.sub.i may be present close to lights. In this case, it may be difficult for the user to easily recognize the position of the sensor 30.sub.i having dirt even when a light at a position close to the sensor 30.sub.i having dirt is blinked.

[0047] Accordingly, in the vehicle control system of FIG. 1, a combined image obtained by combining the vehicle image reflecting the vehicle 1 with the defect image representing the position of the sensor 30.sub.i having a defect mounted on the vehicle 1 is displayed on the portable vehicle exterior terminal 40, as described above. In this case, the user can easily recognize the position of the sensor 30.sub.i having a defect such as dirt from among the sensors 30.sub.1 to 30.sub.5 mounted on the vehicle 1 by carrying the vehicle exterior terminal 40 outside the vehicle and viewing the combined image.

[0048] Note that a truck, an agricultural vehicle, a construction vehicle such as an excavator, and any other vehicle can be employed as the vehicle 1 in addition to the four-wheeled vehicle. In addition, a drone, an airplane, a robot, or any other device equipped with a sensor can be employed instead of the vehicle 1.

[0049] Further, defects of the sensors 30.sub.i include failure of the sensor 30.sub.i and any other defects in addition to adherence of dirt (an obstacle that hinders sensing) such as dust, insects and mud. Hereinafter, it is assumed that (adherence of) dirt is employed as a defect of the sensors 30.sub.i.

[0050] In addition, cameras (including a stereo camera), distance measuring sensors (a time of flight (ToF) sensor, a light detection and ranging (LIDAR) sensor, a millimeter wave sensor, and the like), an acceleration sensor, a temperature sensor, and any other sensors can be employed as the sensor 30.sub.i.

[0051] Further, although the five sensors 30.sub.1 to 30.sub.5 are mounted on the vehicle 1 in FIG. 1, the number of sensors 30.sub.i mounted on the vehicle 1 is not limited to five.

[0052] In addition, as a (bidirectional) wireless communication method performed between the vehicle control device 10 and the vehicle exterior terminal 40, for example, any method such as a wireless local area network (LAN) or Bluetooth (registered trademark) can be employed.

[0053] Further, wireless communication between the vehicle control device 10 and the vehicle exterior terminal 40 may be performed peer-to-peer or performed via a network such as the Internet.

[0054] In addition, the vehicle exterior terminal 40 can be realized as a dedicated terminal or realized by causing a general-purpose information processing device such as a smartphone to execute an application. Further, the vehicle exterior terminal 40 can be configured as a single detachable block of the vehicle control device 10.


[0056] FIG. 2 is a block diagram showing an example of an electrical configuration of the vehicle 1 in FIG. 1.

[0057] As described in FIG. 1, the vehicle 1 includes the vehicle control device 10, the on-board lights 20.sub.1 and 20.sub.2 and the sensors 30.sub.1 to 30.sub.5 such as cameras.

[0058] The vehicle control device 10 includes a display unit 111, an input unit 112, an information processing unit 113, and communication units 114 and 115.

[0059] The display unit 111 displays various types of information according to control of the information processing unit 113.

[0060] The input unit 112 is operated by a user and supplies input information corresponding to a user operation to the information processing unit 113. When the input information is supplied from the input unit 112 to the information processing unit 113, the information processing unit 113 performs processing in response to the input information from the input unit 112.

[0061] Note that the display unit 111 and the input unit 112 can be integrally configured using a touch panel or the like, for example.

[0062] The information processing unit 113 performs control of each block constituting the vehicle control device 10 and various other types of processing.

[0063] The communication unit 114 performs wired communication with on-board lights 20.sub.j, sensors 30.sub.i, and the like.

[0064] For example, the communication unit 114 transmits light control information supplied from the information processing unit 113 to the on-board lights 20.sub.j. The on-board lights 20.sub.j are turned on, blink, are turned off, or change directions and intensities of light emission according to the light control information from the communication unit 114.

[0065] In addition, for example, the communication unit 114 receives sensor information transmitted from the sensors 30.sub.i and supplies the sensor information to the information processing unit 113. Sensor information of a sensor 30.sub.i includes a sensing result of the sensor 30.sub.i (e.g., a captured image having RGB (red, green, and blue) pixel values acquired by a camera, a distance image having distance pixel values acquired by a distance measuring sensor, or the like), a condition of dirt as a defect of the sensor 30.sub.i, and the like. The information processing unit 113 detects dirt as a defect of a sensor 30.sub.i according to the sensor information from the communication unit 114 and generates defect information representing whether the sensor 30.sub.i has dirt. Further, the information processing unit 113 adds sensor position information indicating the position of the sensor 30.sub.i (on the vehicle 1) to the defect information of the sensor 30.sub.i and supplies the defect information including the sensor position information to the communication unit 115, which transmits the defect information to the vehicle exterior terminal 40.
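The text does not specify how the information processing unit 113 detects dirt, so the sketch below substitutes a common heuristic: a mud or dust spot on a camera lens tends to produce an image region whose pixel values barely vary, so a block with near-zero contrast is treated as a sign of dirt. The block size, threshold, and the dict layout of the defect information are all illustrative assumptions.

```python
import numpy as np

def detect_dirt(gray_image, block=16, contrast_threshold=2.0):
    """Flag a camera image as dirty when any block has near-zero contrast.

    gray_image: H x W array of grayscale pixel values. This stand-in
    heuristic is NOT the patent's detection method, which is unspecified.
    """
    h, w = gray_image.shape
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            if gray_image[r:r + block, c:c + block].std() < contrast_threshold:
                return True
    return False

def make_defect_info(sensor_id, sensor_position, dirty):
    """Defect information including the sensor position, as in [0065]."""
    return {"sensor_id": sensor_id,
            "position": sensor_position,
            "has_defect": dirty}
```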

[0066] The communication unit 115 performs wireless communication with the vehicle exterior terminal 40 and the like.

[0067] For example, the communication unit 115 transmits the defect information of the sensor 30.sub.i supplied from the information processing unit 113 to the vehicle exterior terminal 40.

[0068] In addition, for example, the communication unit 115 receives position-related information and a cleaning completion notification transmitted from the vehicle exterior terminal 40 and supplies the position-related information and the cleaning completion notification to the information processing unit 113.

[0069] The position-related information transmitted from the vehicle exterior terminal 40 is information about a relative position or direction of the vehicle exterior terminal 40 with respect to the vehicle 1, and the information processing unit 113 obtains the relative position of the vehicle exterior terminal 40 with respect to the vehicle 1 (hereinafter referred to simply as a relative position) from the position-related information. That is, when the position-related information indicates the relative position of the vehicle exterior terminal 40 with respect to the vehicle 1, the information processing unit 113 employs the relative position indicated by the position-related information as the relative position of the vehicle exterior terminal 40. In addition, when the position-related information indicates the relative direction of the vehicle exterior terminal 40 with respect to the vehicle 1, the information processing unit 113 obtains, for example, a position at a predetermined distance from the vehicle 1 in the relative direction indicated by the position-related information as the relative position of the vehicle exterior terminal 40. In addition, the information processing unit 113 controls the on-board lights 20.sub.j by supplying light control information for controlling the on-board lights 20.sub.j in response to the relative position or the like of the vehicle exterior terminal 40 to the communication unit 114 and causing the communication unit 114 to transmit the light control information to the on-board lights 20.sub.j.
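The two cases in [0069] — the position-related information carrying either a relative position directly, or only a relative direction from which a position at a predetermined distance is derived — can be sketched as follows. The dict layout, the 2D vehicle frame, and the default distance are illustrative assumptions, not details from the text.

```python
import math

def relative_position(position_related_info, default_distance=3.0):
    """Derive the terminal's relative position as described in [0069].

    position_related_info: assumed dict carrying either 'position'
    (x, y) in the vehicle frame or 'direction' in radians.
    """
    if "position" in position_related_info:
        # Case 1: the relative position is given directly.
        return position_related_info["position"]
    # Case 2: only a direction is given; take a point at a
    # predetermined distance from the vehicle in that direction.
    theta = position_related_info["direction"]
    return (default_distance * math.cos(theta),
            default_distance * math.sin(theta))
```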

[0070] The cleaning completion notification transmitted from the vehicle exterior terminal 40 is a notification of completion of cleaning of a sensor 30.sub.i having dirt, and when the cleaning completion notification is supplied from the communication unit 115, the information processing unit 113 repeats detection of (presence or absence of) dirt as a defect of a sensor 30.sub.i according to the sensor information of the sensors 30.sub.i from the communication unit 114 and generation of defect information.


[0072] FIG. 3 is a block diagram showing an example of an electrical configuration of the vehicle exterior terminal 40 of FIG. 1.

[0073] The vehicle exterior terminal 40 includes a communication unit 141, an information processing unit 142, a camera 144, and a display unit 145.

[0074] The communication unit 141 performs wireless communication with the vehicle control device 10 and the like.

[0075] For example, the communication unit 141 transmits position-related information and a cleaning completion notification supplied from (a controller 155 of) the information processing unit 142 to the vehicle control device 10. In addition, the communication unit 141 receives defect information transmitted from (the communication unit 115 of) the vehicle control device 10 and supplies the defect information to (the controller 155 of) the information processing unit 142.

[0076] The information processing unit 142 performs control of each block constituting the vehicle exterior terminal 40 and various other types of processing.

[0077] The information processing unit 142 includes an image processor 151, a position-related information acquisition unit 152, an image generator 153, a display controller 154, and the controller 155.

[0078] An image captured by the camera 144 is supplied to the image processor 151. When the captured image from the camera 144 is a vehicle image reflecting the vehicle 1, the image processor 151 supplies the captured image that is the vehicle image to the position-related information acquisition unit 152 and supplies it to the display controller 154 as necessary.

[0079] In addition, the image processor 151 generates computer graphics (CG) of the vehicle 1 from the captured image that is the vehicle image and provides the vehicle image as the CG to the display controller 154.

[0080] The position-related information acquisition unit 152 acquires position-related information about the relative position or direction of the vehicle exterior terminal 40 with respect to the vehicle 1 by estimating the position-related information, for example, according to simultaneous localization and mapping (SLAM) using the captured image that is the vehicle image from the image processor 151 and provides the position-related information to the image generator 153 and the controller 155.

[0081] Although the position-related information acquisition unit 152 acquires the position-related information of the vehicle exterior terminal 40 using SLAM in this case, the position-related information of the vehicle exterior terminal 40 can be acquired through any other methods.

[0082] For example, the position-related information of the vehicle exterior terminal 40 can be estimated from the shape of the vehicle 1 reflected in the captured image that is the vehicle image. In addition, a predetermined marker can be provided on the vehicle 1 and the position-related information of the vehicle exterior terminal 40 can be estimated from the marker on the vehicle 1 reflected in the captured image that is the vehicle image. Further, radio waves having strong directivity can be transmitted from the vehicle 1 and the position-related information of the vehicle exterior terminal 40 can be estimated from a reception state of the radio waves in the vehicle exterior terminal 40. In addition, when the vehicle 1 and the vehicle exterior terminal 40 have a Global Positioning System (GPS) function, the position-related information of the vehicle exterior terminal 40 can be estimated from positions of the vehicle 1 and the vehicle exterior terminal 40 acquired using the GPS function. Further, when a sensor 30.sub.i of the vehicle 1 radiates infrared rays for measuring a distance or the vehicle 1 is equipped with an LED emitting visible rays, the position-related information of the vehicle exterior terminal 40 can be estimated from a reception state of the infrared rays or visible rays in the vehicle exterior terminal 40.
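Of the alternatives listed in [0082], the GPS-based one lends itself to a short sketch: given GPS fixes for the vehicle 1 and the vehicle exterior terminal 40, the terminal's offset from the vehicle can be approximated with a local equirectangular projection, which is adequate over the few meters involved. The coordinate convention (east/north offsets in meters) is an assumption for illustration.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, meters

def gps_relative_position(vehicle_latlon, terminal_latlon):
    """Approximate (east, north) offset in meters of the terminal from
    the vehicle, given two (latitude, longitude) fixes in degrees.

    Uses a local equirectangular approximation; fine at short range.
    """
    lat0, lon0 = map(math.radians, vehicle_latlon)
    lat1, lon1 = map(math.radians, terminal_latlon)
    east = (lon1 - lon0) * math.cos(lat0) * EARTH_RADIUS_M
    north = (lat1 - lat0) * EARTH_RADIUS_M
    return east, north
```

In practice consumer GPS error (several meters) is comparable to the distances involved, which is one reason [0085] suggests combining multiple estimation methods for robustness.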

[0083] Although methods by which the vehicle exterior terminal 40 obtains the position-related information have been described above, the position-related information of the vehicle exterior terminal 40 can instead be obtained in the vehicle 1. In that case, the position-related information obtained in the vehicle 1 is transmitted from the vehicle 1 to the vehicle exterior terminal 40, and the vehicle exterior terminal 40 acquires the position-related information by receiving it from the vehicle 1.

[0084] In the vehicle 1, position-related information of a user carrying the vehicle exterior terminal 40 outside the vehicle can be obtained as the position-related information of the vehicle exterior terminal 40 by capturing an image of the user using a sensor 30.sub.i as a camera and recognizing the user reflected in a captured image acquired by the capturing, for example. In addition, the position-related information of the user can be obtained as the position-related information of the vehicle exterior terminal 40, for example, by employing a human body sensor as a sensor 30.sub.i and using a sensing result of the human body sensor. Further, radio waves having strong directivity can be transmitted from the vehicle exterior terminal 40 and the position-related information of the vehicle exterior terminal 40 can be obtained (estimated) from a reception state of the radio waves in the vehicle 1. In addition, when the vehicle exterior terminal 40 can emit infrared rays or visible rays, the position-related information of the vehicle exterior terminal 40 can be obtained from a reception state of the infrared rays or the visible rays in the vehicle 1.

[0085] In addition, the position-related information of the vehicle exterior terminal 40 can be obtained by combining a plurality of the above-described methods to improve robustness.
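Combining a plurality of the methods can be as simple as a confidence-weighted fusion of the individual estimates. Below is a minimal sketch, assuming each method (SLAM, marker, GPS, radio, and the like) reports a distance, a bearing, and a confidence weight; all names and the weighting scheme are illustrative assumptions, not from the patent:

```python
import math

def fuse_estimates(estimates):
    """Fuse a list of (distance_m, bearing_deg, confidence) estimates
    from several methods into one estimate.  Bearings are averaged on
    the unit circle so that, e.g., 359 deg and 1 deg fuse to about 0 deg
    rather than 180 deg."""
    total = sum(c for _, _, c in estimates)
    distance = sum(d * c for d, _, c in estimates) / total
    x = sum(c * math.cos(math.radians(b)) for _, b, c in estimates)
    y = sum(c * math.sin(math.radians(b)) for _, b, c in estimates)
    bearing = math.degrees(math.atan2(y, x)) % 360
    return distance, bearing
```

The circular averaging of bearings is the design point here: a naive arithmetic mean of 359 and 1 degrees would give a wildly wrong 180 degrees.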

[0086] The position-related information of the vehicle exterior terminal 40 is supplied from the position-related information acquisition unit 152 to the image generator 153, and the defect information transmitted from the vehicle control device 10 is supplied from the controller 155 to the image generator 153. The image generator 153 generates a defect image representing the position of a sensor 30.sub.i having dirt according to the position-related information from the position-related information acquisition unit 152 and the defect information from the controller 155, and provides the defect image to the display controller 154.
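The image generator's role can be sketched as mapping each sensor reported as having dirt, via the terminal's current pose, to a highlight marker in screen coordinates. The following is a hypothetical sketch; the data layout and the `project` callback (which stands in for the pose-dependent projection derived from the position-related information) are assumptions, not from the patent:

```python
def defect_markers(defect_info, project):
    """Build the content of a defect image: one highlight marker per
    sensor reported as having dirt.  `project` maps a sensor position in
    vehicle coordinates to screen coordinates for the current terminal
    pose derived from the position-related information."""
    markers = []
    for sensor in defect_info:
        if sensor["has_dirt"]:
            markers.append({"sensor_id": sensor["sensor_id"],
                            "screen_xy": project(sensor["position"])})
    return markers
```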

[0087] The display controller 154 performs display control for supplying various images to the display unit 145 and causing the display unit 145 to display the images. In addition, the display controller 154 combines the defect image from the image generator 153 with the vehicle image that is a captured image or a CG vehicle image supplied from the image processor 151 to generate a combined image. Then, the display controller 154 provides the combined image to the display unit 145 and causes the display unit 145 to display the combined image.
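Combining the defect image with the vehicle image can be a per-pixel overlay in which defect pixels are alpha-blended over the captured or CG vehicle image. A minimal sketch using plain nested lists of RGB tuples follows; the pixel representation and the alpha value are illustrative, not from the patent:

```python
def combine(vehicle_img, defect_img, alpha=0.6):
    """Overlay the defect image on the vehicle image.  Both arguments
    are equally sized 2-D grids of RGB tuples; a defect pixel of None
    is treated as fully transparent."""
    out = []
    for vrow, drow in zip(vehicle_img, defect_img):
        row = []
        for v, d in zip(vrow, drow):
            if d is None:
                row.append(v)  # no defect marking here: keep vehicle pixel
            else:
                row.append(tuple(round(alpha * dc + (1 - alpha) * vc)
                                 for dc, vc in zip(d, v)))
        out.append(row)
    return out
```

In practice the same blend would be done with an image library rather than nested lists; the sketch only shows the compositing rule.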

[0088] The controller 155 performs control of each block of the information processing unit 142 and various other types of processing.

[0089] For example, the controller 155 generates a cleaning completion notification in response to an operation of the input unit 143, provides the cleaning completion notification to the communication unit 141, and causes the communication unit 141 to transmit the cleaning completion notification. In addition, the controller 155 provides the position-related information supplied from the position-related information acquisition unit 152 to the communication unit 141 and causes the communication unit 141 to transmit the position-related information. Further, the controller 155 provides the defect information supplied from the communication unit 141 and transmitted from the vehicle control device 10 to the image generator 153.

[0090] The input unit 143 is operated by a user and provides input information corresponding to a user operation to the controller 155.

[0091] The camera 144 captures, for example, an image of a subject such as the vehicle 1 and provides a captured image acquired by the capturing to the image processor 151.

[0092] The display unit 145 displays the image such as the combined image supplied from the display controller 154 according to display control of the display controller 154.

[0093] Note that the input unit 143 and the display unit 145 can be integrally configured using a touch panel or the like, for example.

[0094] In the vehicle control system of FIG. 1 which is composed of the vehicle 1 of FIG. 2 and the vehicle exterior terminal 40 of FIG. 3, notification processing of notifying a user of dirt as a defect of a sensor 30.sub.i is started at a predetermined timing in a state in which the vehicle control device 10 of the vehicle 1 starts and the vehicle 1 is stopped in a safe place. As the predetermined timing, for example, a timing at which the user has approached the vehicle 1 (the user has entered a predetermined range of the vehicle 1), a timing at which the user has unlocked the lock (key) of the vehicle 1 from the outside of the vehicle, a timing at which the user has started the engine of the vehicle 1, and the like can be employed.
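The predetermined timing described above can be sketched as a simple disjunction of the example trigger conditions. The function and parameter names, and the proximity range, are illustrative assumptions:

```python
def should_start_notification(user_distance_m, unlocked, engine_started,
                              proximity_range_m=10.0):
    """Decide whether to start the notification processing, using the
    example triggers from the description: the user entering a
    predetermined range of the vehicle, unlocking the vehicle from
    outside, or starting the engine."""
    return (user_distance_m <= proximity_range_m) or unlocked or engine_started
```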

[0095] Note that it is desirable to perform the notification processing in a state in which the engine of the vehicle 1 has been started such that sufficient power can be used in the vehicle control device 10. However, when the vehicle 1 is an electric vehicle, the notification processing may be performed without the engine of the vehicle 1 being started because sufficient power can be used in the vehicle control device 10 even when the engine of the vehicle 1 is not started.

[0096] In addition, it is assumed that the combined image and the like are not displayed on the vehicle exterior terminal 40 while the vehicle 1 is traveling, because a user who is driving the vehicle 1 cannot view them even if they are displayed.

[0097] When the notification processing is started in the vehicle control device 10 (FIG. 2), the sensor 30.sub.i acquires the condition of dirt as a defect of the sensor 30.sub.i, incorporates the condition of dirt into sensor information, and transmits the sensor information. The sensor information transmitted by the sensor 30.sub.i is received by the communication unit 114 and provided to the information processing unit 113. Note that any method can be employed as a method of acquiring the condition of dirt on the sensor 30.sub.i. In addition, the condition of dirt of the sensor 30.sub.i can be acquired at any timing in addition to the timing at which the notification processing is started. For example, the condition of dirt of the sensor 30.sub.i can be acquired and stored in a memory (not shown) of the vehicle control device 10 or the sensor 30.sub.i while the vehicle 1 is traveling, stopped, or the like. Then, the condition of dirt of the sensor 30.sub.i stored in the memory can be used in the notification processing performed thereafter.
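The sensor information that a sensor 30.sub.i transmits can be sketched as a small record incorporating the condition of dirt. The numeric dirt-level scale in [0, 1] and the field names are illustrative assumptions, not from the patent:

```python
def build_sensor_info(sensor_id, dirt_level):
    """Sketch of the sensor information a sensor transmits when the
    notification processing starts: the condition of dirt (here an
    illustrative level in [0, 1]) is incorporated into the message
    alongside the sensor identifier."""
    return {"sensor_id": sensor_id, "dirt_level": dirt_level}
```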

[0098] The information processing unit 113 detects the sensor 30.sub.i having dirt as a defect (detects the dirt on the sensor 30.sub.i) from (the condition of dirt included in) the sensor information from the communication unit 114. When the sensor 30.sub.i having dirt is detected, the information processing unit 113 causes the display unit 111 to display information representing that the sensor 30.sub.i has dirt, and the like. Further, the information processing unit 113 causes the display unit 111 to display the position of the sensor 30.sub.i having dirt, the condition (state) of dirt, and the like.

[0099] In addition, the information processing unit 113 generates defect information representing whether dirt is present on the sensor 30.sub.i. Further, the information processing unit 113 incorporates sensor position information representing the position of the sensor 30.sub.i into the defect information of the sensor 30.sub.i, provides the defect information to the communication unit 115, and causes the communication unit 115 to transmit the defect information to the vehicle exterior terminal 40.
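Detection of dirty sensors and generation of defect information can be sketched as follows, assuming the sensor information carries a numeric dirt condition and the sensor positions are known in vehicle coordinates. The threshold value and field names are illustrative assumptions, not from the patent:

```python
def build_defect_info(sensor_infos, sensor_positions, dirt_threshold=0.3):
    """Detect sensors having dirt from the condition of dirt included in
    the sensor information, and generate defect information with sensor
    position information incorporated so that the terminal can render
    the defect image at the right place."""
    return [{"sensor_id": s["sensor_id"],
             "has_dirt": s["dirt_level"] >= dirt_threshold,
             "position": sensor_positions[s["sensor_id"]]}
            for s in sensor_infos]
```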

[0100] When a user inside the vehicle 1 views display of the display unit 111 and recognizes that the sensor 30.sub.i has dirt, the user cleans the sensor 30.sub.i having dirt.

[0101] For cleaning of the sensor 30.sub.i having dirt, the user exits the vehicle carrying the vehicle exterior terminal 40 and confirms the sensor 30.sub.i having dirt using the vehicle exterior terminal 40. To do so, the user exits the vehicle, directs the display unit 145 of the smartphone as the vehicle exterior terminal 40 toward himself or herself, and assumes a posture for capturing an image of the vehicle 1 with the camera 144 provided on the side opposite to the display unit 145 (the rear side, when the side on which the display unit 145 is provided is taken as the front side).

[0102] When the notification processing is started in the vehicle exterior terminal 40 (FIG. 3), the camera 144 starts capturing an image.

[0103] Accordingly, a captured image that is a vehicle image reflecting the vehicle 1 viewed at the position of the vehicle exterior terminal 40 (the position of the user) is captured by the camera 144 and provided to the image processor 151. The image processor 151 provides the captured image as the vehicle image from the camera 144 to the position-related information acquisition unit 152.

[0104] Further, the image processor 151 provides the captured image as the vehicle image from the camera 144 to the display controller 154. Alternatively, the image processor 151 generates CG of the vehicle 1 viewed at the position of the vehicle exterior terminal 40 from the captured image as the vehicle image from the camera 144 and provides the vehicle image as the CG to the display controller 154.

