Sony Patent | Information processing device, information processing method, and computer program

Patent: Information processing device, information processing method, and computer program

Publication Number: 20210160150

Publication Date: 20210527

Applicant: Sony

Abstract

[Problem] To provide an information processing device, an information processing method, and a computer program that can intuitively present a connection state between wireless appliances in a space. [Solution] An information processing device includes a control unit configured to perform control for displaying, on a display unit, a first virtual object indicating a link between a first wireless appliance and a second wireless appliance based on connection information about a connection between the first wireless appliance and the second wireless appliance, and positional information of the first wireless appliance and positional information of the second wireless appliance.

Claims

  1. An information processing device comprising: a control unit configured to perform control for displaying, on a display unit, a first virtual object indicating a link between a first wireless appliance and a second wireless appliance based on connection information about a connection between the first wireless appliance and the second wireless appliance, and positional information of the first wireless appliance and positional information of the second wireless appliance.

  2. The information processing device according to claim 1, wherein the positional information is three-dimensional position information indicating a position in a space, and the control unit performs control for displaying, in a case in which the first wireless appliance is connected to the second wireless appliance, the first virtual object that links a three-dimensional position of the first wireless appliance with a three-dimensional position of the second wireless appliance.

  3. The information processing device according to claim 2, wherein the first virtual object is a line-shaped display image that links the first wireless appliance with the second wireless appliance.

  4. The information processing device according to claim 3, wherein the first virtual object includes a virtual cable image of the first wireless appliance and a virtual cable image of the second wireless appliance.

  5. The information processing device according to claim 2, wherein the control unit detects a disconnection operation performed by a user for disconnecting the connection between the first wireless appliance and the second wireless appliance, and performs disconnection processing for disconnecting the connection between the first wireless appliance and the second wireless appliance after detecting the disconnection operation.

  6. The information processing device according to claim 5, wherein the disconnection operation performed by the user is an operation by gesture or voice.

  7. The information processing device according to claim 6, wherein a motion of a hand that disconnects the first virtual object linking the first wireless appliance with the second wireless appliance is detected as the gesture.

  8. The information processing device according to claim 1, wherein the control unit acquires positional information of a third wireless appliance in a case of detecting the third wireless appliance that is not connected to the first wireless appliance, and performs control for displaying a second virtual object indicating a non-connected state with respect to the first wireless appliance at a position of the third wireless appliance.

  9. The information processing device according to claim 8, wherein the control unit detects a connection operation performed by a user for connecting the first wireless appliance with the third wireless appliance, and performs connection processing for connecting the first wireless appliance with the third wireless appliance after detecting the connection operation.

  10. The information processing device according to claim 9, wherein the connection operation performed by the user is an operation by gesture or voice.

  11. The information processing device according to claim 10, wherein the gesture is at least an operation for a third virtual object indicating a non-connected state displayed at the position of the first wireless appliance, or the second virtual object indicating a non-connected state displayed at a position of the third wireless appliance.

  12. The information processing device according to claim 11, wherein the second virtual object and the third virtual object are virtual cable images having different display modes for respective wireless communication schemes.

  13. The information processing device according to claim 9, wherein the control unit performs connection processing using a wireless communication scheme that is different depending on the connection operation.

  14. The information processing device according to claim 1, wherein the control unit displays an image indicating a fourth wireless appliance present in another space, acquires connection information about a connection between the first wireless appliance and the fourth wireless appliance, and performs control for displaying a fourth virtual object that links a three-dimensional position of the first wireless appliance with a display position of the image indicating the fourth wireless appliance in a case in which the first wireless appliance is connected to the fourth wireless appliance.

  15. An information processing method performed by a processor, the information processing method comprising: performing control for displaying, on a display unit, a first virtual object indicating a link between a first wireless appliance and a second wireless appliance based on connection information about a connection between the first wireless appliance and the second wireless appliance, and positional information of the first wireless appliance and positional information of the second wireless appliance.

  16. A computer program for causing a computer to function as a control unit configured to perform control for displaying, on a display unit, a first virtual object indicating a link between a first wireless appliance and a second wireless appliance based on connection information about a connection between the first wireless appliance and the second wireless appliance, and positional information of the first wireless appliance and positional information of the second wireless appliance.

Description

FIELD

[0001] The present disclosure relates to an information processing device, an information processing method, and a computer program.

BACKGROUND

[0002] In recent years, as information processing and communication techniques have developed, the number and types of appliances that can be connected to the Internet have increased remarkably. Users can now access the Internet through such various Internet-connectable appliances to acquire information or to send an instruction from one appliance to another. The concept of the Internet of Things (IoT), in which a large number of appliances are connected to exchange information with each other more dynamically and autonomously, has started to attract attention.

[0003] Such an appliance that can be connected to the Internet to transmit or receive information is also called an “IoT device”.

[0004] Notably, many products have been turned into IoT devices, and IoT-compatible household electrical appliances such as televisions, refrigerators, acoustic devices, air conditioning devices, and digital cameras have been developed and have become widespread.

[0005] Against this increase in IoT devices, for example, Patent Literature 1 below discloses an information processing device that can dynamically change its input/output form depending on the situation of the user and adjust output content, in order to solve the problem that, when the user owns a plurality of IoT devices, the user may forget to change a setting and disturb surrounding people with an alarm or a notification sound.

CITATION LIST

Patent Literature

[0006] Patent Literature 1: JP 2016-091221 A

SUMMARY

Technical Problem

[0007] However, to check which of the objects present in the periphery are IoT devices, the current connection state of an IoT device present in the periphery, or the like, a dedicated GUI screen needs to be called up, so that the operation takes time and effort.

[0008] The present disclosure provides an information processing device, an information processing method, and a computer program that can intuitively present a connection state of a wireless appliance in a space.

Solution to Problem

[0009] According to the present disclosure, an information processing device is provided that includes: a control unit that performs control for displaying, on a display unit, a first virtual object indicating a link between a first wireless appliance and a second wireless appliance based on connection information about a connection between the first wireless appliance and the second wireless appliance, and positional information of the first wireless appliance and positional information of the second wireless appliance.

[0010] According to the present disclosure, an information processing method performed by a processor is provided, the information processing method including: performing control for displaying, on a display unit, a first virtual object indicating a link between a first wireless appliance and a second wireless appliance based on connection information about a connection between the first wireless appliance and the second wireless appliance, and positional information of the first wireless appliance and positional information of the second wireless appliance.

[0011] According to the present disclosure, a computer program is provided that causes a computer to function as a control unit that performs control for displaying, on a display unit, a first virtual object indicating a link between a first wireless appliance and a second wireless appliance based on connection information about a connection between the first wireless appliance and the second wireless appliance, and positional information of the first wireless appliance and positional information of the second wireless appliance.

Advantageous Effects of Invention

[0012] As described above, according to the present disclosure, the connection state of the wireless appliance in the space can be intuitively presented.

[0013] The effects described above are not necessarily limiting, and any one of the effects described in the present description or another effect that may be grasped from the present description may be exhibited in addition to or in place of the effects described above.

BRIEF DESCRIPTION OF DRAWINGS

[0014] FIG. 1 is a diagram for explaining an outline of an information processing system according to one embodiment of the present disclosure.

[0015] FIG. 2 is a diagram for explaining an example of an information processing terminal according to one embodiment of the present disclosure.

[0016] FIG. 3 is a block diagram illustrating an example of a configuration of the information processing terminal according to the present embodiment.

[0017] FIG. 4 is a flowchart illustrating operation processing related to display of a connection state of the information processing terminal according to the present embodiment.

[0018] FIG. 5 is a diagram illustrating an example of an activation gesture for AR cable display according to the present embodiment.

[0019] FIG. 6 is a diagram illustrating an example of AR cable display according to the present embodiment.

[0020] FIG. 7 is a diagram illustrating another example of AR cable display according to the present embodiment.

[0021] FIG. 8 is a diagram illustrating an example of a connection gesture according to the present embodiment.

[0022] FIG. 9 is a diagram illustrating another example of the connection gesture according to the present embodiment.

[0023] FIG. 10 is a diagram for explaining an example of a display mode of an AR cable image according to the present embodiment.

[0024] FIG. 11 is a diagram for explaining another example of the display mode of the AR cable image according to the present embodiment.

[0025] FIG. 12 is a diagram for explaining filtering of a connection destination at the time of starting a connection operation according to the present embodiment.

[0026] FIG. 13 is a diagram illustrating an example of a disconnection gesture according to the present embodiment.

[0027] FIG. 14 is a diagram illustrating another example of the disconnection gesture according to the present embodiment.

[0028] FIG. 15 is a block diagram illustrating a hardware configuration example of the information processing terminal according to one embodiment of the present disclosure.

DESCRIPTION OF EMBODIMENTS

[0029] The following describes a preferred embodiment of the present disclosure in detail with reference to the attached drawings. In the present description and the drawings, constituent elements having substantially the same functional configuration are denoted by the same reference numerals, and redundant description will not be repeated.

[0030] Description will be made in the following order.

[0031] 1. Outline of information processing system according to one embodiment of present disclosure

[0032] 2. Configuration of information processing terminal

[0033] 3. Operation processing

[0034] 4. Complement

[0035] (4-1. Application example)

[0036] (4-2. Hardware configuration)

[0037] 5. Conclusion

1. OUTLINE OF INFORMATION PROCESSING SYSTEM ACCORDING TO ONE EMBODIMENT OF PRESENT DISCLOSURE

[0038] FIG. 1 is a diagram for explaining an outline of an information processing system according to one embodiment of the present disclosure. As illustrated in FIG. 1, the information processing system according to the present embodiment includes an information processing terminal 1 and wireless appliances 2 (wireless appliances 2A to 2F). All of the information processing terminal 1 and the wireless appliances 2A to 2F are devices that can perform wireless communication.

(Background)

[0039] As described above, in recent years, various objects present in the periphery have been enabled to be connected to the Internet due to an IoT technique. For example, in addition to a mobile device such as a smartphone, a tablet, or a notebook personal computer (PC), and a wearable device such as a head mounted display (HMD), smart eyeglasses, a smart band, smart earphones, and a smart necklace, consumer electronics (CE) devices such as a television, a recorder, a digital camera, a game machine, an air conditioner, an acoustic device, a lighting device, a refrigerator, a washing machine, a microwave oven, a home projector, or a desktop PC have been enabled to be connected to the Internet to transmit, receive, or control data.

[0040] In recent years, the smart home, in which the house itself is made IoT-compatible, has started to be realized, and IoT products that can be connected to the Internet have become widespread.

[0041] However, while various products used in daily life are made IoT-compatible and connected to the Internet or a home network to improve convenience as described above, such IoT products need to be frequently connected, disconnected, reconfigured, and so on, which imposes a burden on users. Moreover, on the dedicated graphical user interface (GUI) screens used when establishing a wireless communication connection and the like, the IDs of wireless appliances are often displayed in a list format, so the association between a real object (IoT product) and its ID is unclear, and it is difficult for a general user to identify the association at a glance.

[0042] This poor operability for wireless connection and disconnection has caused lost opportunities and wasted user operations.

[0043] Thus, the information processing system according to the present embodiment enables the user to intuitively grasp a connection state by displaying a virtual object indicating the connection state between wireless appliances present in a space based on the positions of those wireless appliances. The information processing system according to the present embodiment also improves usability by allowing connection or disconnection of a wireless appliance to be controlled by an intuitive operation such as a gesture.

[0044] Specifically, as illustrated in FIG. 1 for example, in a system configuration including the information processing terminal 1 owned by the user and the wireless appliances 2A to 2F, the information processing terminal 1 receives, from the wireless appliance 2A, connection information about connections between the wireless appliance 2A and the other wireless appliances 2B to 2F, and recognizes the three-dimensional positions of the respective wireless appliances 2A to 2F in the real space. The connection information may be included in wireless information (Wi-Fi wireless information and the like) sent from the wireless appliance 2A, for example, or may be acquired from the wireless appliance 2A in response to a request from the information processing terminal 1 after the wireless appliance 2A is connected to the information processing terminal 1 for communication.
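
Purely as an illustrative sketch (the published application does not define any concrete data format), the connection information received from the wireless appliance 2A could be modeled along the following lines in Python; every name below is an assumption introduced for illustration, not part of the disclosure.

    # Hypothetical model of the connection information sent by the Wi-Fi master
    # unit (wireless appliance 2A); all names are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Link:
        peer_id: str            # ID of a connected wireless appliance, e.g. "2B"
        scheme: str             # wireless communication scheme: "wifi", "bluetooth", ...
        band_restricted: bool   # band utilization setting (restricted/unrestricted)

    @dataclass
    class ConnectionInfo:
        master_id: str                                          # ID of the wireless appliance 2A
        links: List[Link] = field(default_factory=list)         # currently connected peers
        detected_ids: List[str] = field(default_factory=list)   # appliances detected nearby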

[0045] The information processing terminal 1 owned by the user and the wireless appliances 2A to 2F are present in the same space (for example, in the same room). The information processing terminal 1 may be what is called an augmented reality (AR) terminal, which performs control for displaying a virtual object on a transmissive display unit so that the object appears superimposed on the real space, thereby implementing augmented reality. With reference to FIG. 2, the following describes an AR terminal as an example of the information processing terminal 1.

[0046] As illustrated in FIG. 2, the AR terminal may be a spectacle-type wearable terminal, for example. As illustrated in FIG. 2, the information processing terminal 1 according to the present embodiment is implemented by a spectacle-type head mounted display (HMD) worn by a user U on his/her head part, for example. Display units 13 corresponding to spectacle lens portions that are positioned in front of eyes of the user U at the time of wearing may be a transmissive type or a non-transmissive type. By displaying a virtual object on the display units 13, the information processing terminal 1 can present the virtual object in a field of vision of the user U. The HMD as an example of the information processing terminal 1 is not limited to the HMD that presents an image to both eyes, but may be an HMD that presents an image to only one eye. For example, the HMD may be a monocular type in which the display unit 13 that presents an image to one eye is disposed.

[0047] The information processing terminal 1 also includes an outward camera 110 that images the sight line direction of the user U at the time of wearing, that is, the field of vision of the user. Additionally, although not illustrated in FIG. 1, various sensors such as an inward camera that images the eyes of the user U at the time of wearing and a microphone may also be disposed in the information processing terminal 1. A plurality of the outward cameras 110 and the inward cameras may be disposed.

[0048] The shape of the information processing terminal 1 is not limited to the example illustrated in FIG. 1. For example, the information processing terminal 1 may be an HMD of a headband type (worn with a band that surrounds the entire circumference of the head; the band may pass not only along the sides of the head but also over its top) or an HMD of a helmet type (in which the visor portion of the helmet corresponds to the display). Alternatively, the information processing terminal 1 may be implemented by a wearable device such as a wristband type (for example, a smart watch, with or without a display), a headphone type (without a display), or a neckphone type (a neck-hanging type, with or without a display).

[0049] The following describes the wireless appliances 2A to 2F. The wireless appliances 2A to 2F are IoT devices that can perform wireless communication, and various appliances are assumed. By way of example, the wireless appliance 2A is a Wi-Fi (registered trademark) master unit, the wireless appliance 2B is a television apparatus, the wireless appliance 2C is a speaker, the wireless appliance 2D is a lighting device, the wireless appliance 2E is a button-type terminal, and the wireless appliance 2F is a wearable terminal. The wireless appliance 2A is a communication appliance that is connected to the peripheral wireless appliances 2B to 2F for communication, and relays communication with the Internet or a home network. The button-type terminal is, for example, a terminal dedicated to ordering in Internet shopping, with which a predetermined commodity can be automatically purchased by pushing a button disposed on the terminal.

[0050] The information processing terminal 1 can enable the user to intuitively grasp the connection state of the wireless appliance 2 by AR-displaying a virtual object indicating a link between the wireless appliance 2A and the other wireless appliances 2B to 2F in accordance with three-dimensional positions of the wireless appliances present in the space based on the connection information received from the wireless appliance 2A.

[0051] The information processing system according to one embodiment of the present disclosure has been described above. The following describes a specific configuration of the information processing terminal included in the information processing system according to the present embodiment with reference to the drawings.

2. CONFIGURATION OF INFORMATION PROCESSING TERMINAL 1

[0052] FIG. 3 is a block diagram illustrating an example of a configuration of the information processing terminal 1 according to the present embodiment. As illustrated in FIG. 3, the information processing terminal 1 includes a sensor unit 11, a control unit 12, a display unit 13, a speaker 14, a communication unit 15, an operation input unit 16, and a storage unit 17.

[0053] (Sensor Unit 11)

[0054] The sensor unit 11 has a function of acquiring various kinds of information about the user or a peripheral environment. For example, the sensor unit 11 includes an outward camera 110, an inward camera 111, a microphone 112, a gyro sensor 113, an acceleration sensor 114, an azimuth sensor 115, a position measuring unit 116, and a biosensor 117. A specific example of the sensor unit 11 described herein is merely an example, and the present embodiment is not limited thereto. The number of the respective sensors may be plural.

[0055] The specific example of the sensor unit 11 illustrated in FIG. 3 is a preferred example, but not all of its components are necessarily provided. For example, the sensor unit 11 may include only part of the specific example illustrated in FIG. 3, such as a configuration including only the outward camera 110, the acceleration sensor 114, and the position measuring unit 116, or may further include other sensors.

[0056] Each of the outward camera 110 and the inward camera 111 includes a lens system constituted of an imaging lens, a diaphragm, a zoom lens, a focus lens, and the like, a driving system that causes the lens system to perform a focus operation and a zoom operation, a solid-state imaging element array that performs photoelectric conversion on imaging light obtained by the lens system to generate an imaging signal, and the like. The solid-state imaging element array may be implemented, for example, by a charge coupled device (CCD) sensor array or a complementary metal oxide semiconductor (CMOS) sensor array.

[0057] In the present embodiment, an angle of view and orientation of the outward camera 110 are preferably set to image a region corresponding to a field of vision of the user in the real space.

[0058] The microphone 112 collects a voice of the user and a surrounding environmental sound to be output to the control unit 12 as voice data.

[0059] The gyro sensor 113 is, for example, implemented by a triaxial gyro sensor, and detects an angular speed (rotational speed).

[0060] The acceleration sensor 114 is, for example, implemented by a triaxial acceleration sensor (also referred to as a G sensor), and detects acceleration at the time of movement.

[0061] The azimuth sensor 115 is, for example, implemented by a triaxial geomagnetic sensor (compass), and detects an absolute direction (azimuth).

[0062] The position measuring unit 116 has a function of detecting a current position of the information processing terminal 1 based on a signal acquired from the outside. Specifically, for example, the position measuring unit 116 is implemented by a Global Positioning System (GPS) measuring unit, and receives radio waves from GPS satellites, detects the position at which the information processing terminal 1 is present, and outputs the detected positional information to the control unit 12. In addition to GPS, the position measuring unit 116 may also detect the position via Wi-Fi (registered trademark), Bluetooth (registered trademark), transmission/reception to/from a cellular telephone, a PHS, a smartphone, or the like, short-range communication, or the like.

[0063] The biosensor 117 detects biological information of the user. Specifically, for example, the biosensor 117 may detect a heartbeat, a body temperature, sweating, a blood pressure, a pulse, respiration, nictitation, an eye movement, a gazing time, a pupil diameter, brain waves, a body motion, a posture, a skin temperature, electric skin resistance, micro vibration (MV), myoelectric potential, blood oxygen saturation (SpO2), or the like.

[0064] (Control Unit 12)

[0065] The control unit 12 functions as an arithmetic processing unit and a control device, and controls the entire operations in the information processing terminal 1 in accordance with various computer programs. The control unit 12 may be implemented by an electronic circuit such as a central processing unit (CPU) and a microprocessor, for example. The control unit 12 may also include a read only memory (ROM) that stores a computer program to be used, an arithmetic parameter, and the like, and a random access memory (RAM) that temporarily stores a parameter and the like that vary as appropriate.

[0066] The control unit 12 according to the present embodiment controls starting or stopping of each configuration, for example. The control unit 12 can also input a control signal to the display unit 13 or the speaker 14. As illustrated in FIG. 3, the control unit 12 according to the present embodiment may also function as a wireless appliance association processing unit 120, a user operation recognition unit 122, a connection state acquisition unit 124, a connection state display processing unit 126, and a connection management unit 128.

[0067] Wireless Appliance Association Processing Unit 120

[0068] The wireless appliance association processing unit 120 acquires three-dimensional position information of the wireless appliance present in the space based on the information acquired by the sensor unit 11 or the communication unit 15, and associates the three-dimensional position information with wireless information (for example, Wi-Fi wireless information).

[0069] For example, the wireless appliance association processing unit 120 analyzes a taken image sensed by the sensor unit 11 to recognize objects, and acquires the three-dimensional position of the wireless appliance. Subsequently, the wireless appliance association processing unit 120 specifies the wireless appliance in the space based on device specification information received from the wireless appliance via the communication unit 15, and associates the wireless appliance with the wireless information. The device specification information is included in the wireless information, and includes, for example, information about a physical characteristic of the corresponding wireless appliance (a characteristic amount, image information, and the like). The wireless appliance association processing unit 120 can specify the wireless appliance in the space by comparing an object recognition result with this characteristic information. The control unit 12 of the information processing terminal 1 may recognize the three-dimensional space in advance by using a Simultaneous Localization and Mapping (SLAM) technique, and recognize three-dimensional position information of real objects in the periphery.
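
As a minimal sketch of the matching step described above (the application does not prescribe a particular algorithm), each piece of wireless information could be associated with the recognized object whose physical characteristics it resembles most; feature_similarity, the dictionary keys, and the threshold are hypothetical.

    import math

    def feature_similarity(spec_vec, obj_vec):
        # Cosine similarity between two equal-length feature vectors (placeholder metric).
        dot = sum(a * b for a, b in zip(spec_vec, obj_vec))
        norm = math.sqrt(sum(a * a for a in spec_vec)) * math.sqrt(sum(b * b for b in obj_vec))
        return dot / norm if norm else 0.0

    def associate_appliances(recognized_objects, wireless_infos, min_score=0.8):
        # recognized_objects: list of (feature_vector, position_3d) from image analysis.
        # wireless_infos: list of dicts with "id" and "device_spec" (feature vector).
        # Returns a mapping from appliance ID to its three-dimensional position.
        associations = {}
        for info in wireless_infos:
            best_pos, best_score = None, 0.0
            for features, position in recognized_objects:
                score = feature_similarity(info["device_spec"], features)
                if score > best_score:
                    best_pos, best_score = position, score
            if best_pos is not None and best_score >= min_score:
                associations[info["id"]] = best_pos
        return associations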

[0070] For example, the wireless appliance association processing unit 120 analyzes the taken image taken by an AR camera included in the sensor unit 11, and detects an AR marker given to the wireless appliance to acquire the three-dimensional position. The wireless information acquired by the communication unit 15 includes AR marker information, and the wireless appliance association processing unit 120 can associate the wireless information with the wireless appliance recognized (the position of which is specified) in the space by performing comparison with the AR marker information.

[0071] Alternatively, for example, the wireless appliance association processing unit 120 may analyze the image taken by the camera included in the sensor unit 11, detect a QR code (registered trademark) given to the wireless appliance to acquire its three-dimensional position, and compare the detected code with the wireless information received via the communication unit 15 to make the association therebetween.

[0072] For example, the wireless appliance association processing unit 120 may also analyze a specific image, a sound, or the blinking of an LED (including an infrared or ultraviolet LED) emitted from the wireless appliance and detected by a camera, a microphone, a light receiving unit, or the like included in the sensor unit 11 to specify the three-dimensional position of the wireless appliance, and associate that three-dimensional position with the wireless information received from the communication unit 15. The wireless information includes, for example, image information, sound information, or blinking information, which the wireless appliance association processing unit 120 can compare with the analysis result.

[0073] User Operation Recognition Unit 122

[0074] The user operation recognition unit 122 performs processing of recognizing a user operation using various kinds of sensor information sensed by the sensor unit 11. For example, the user operation recognition unit 122 can recognize a gesture of the user based on a taken image, depth information, positional information, motion information, and the like sensed by the sensor unit 11. The user operation recognition unit 122 can also recognize a request from the user by performing voice recognition based on an utterance of the user sensed by the sensor unit 11.

[0075] The user operation recognition unit 122 can also detect a user operation for managing a connection between the wireless appliances. A user operation for managing a connection is either a connection operation for connecting wireless appliances or a disconnection operation for disconnecting a connection between wireless appliances. For example, the user operation recognition unit 122 specifies the wireless appliance designated by the user based on various kinds of sensor information sensed by the sensor unit 11, such as the position of the user's hand or the content of the user's uttered voice. Additionally, based on a motion of the hand, a shape of a finger, or the content of the uttered voice, the user operation recognition unit 122 recognizes the wireless appliance that the operation instructs to be connected or disconnected. The user operation may be a combination of a gesture and a voice.
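
The following is only a rough sketch of the kind of decision logic implied here; the gesture labels, the keyword handling, and the intent names are assumptions and are not taken from the application.

    def recognize_operation(gesture, utterance, pointed_appliance_id):
        # gesture: label from a (hypothetical) gesture recognizer, e.g. "cut", "pinch_and_pull".
        # utterance: recognized speech text, or None if nothing was said.
        # Returns (intent, target_id), where intent is "connect", "disconnect", or None.
        text = (utterance or "").lower()
        # Check "disconnect" first, since the word also contains "connect".
        if gesture == "cut" or "disconnect" in text:
            return "disconnect", pointed_appliance_id
        if gesture == "pinch_and_pull" or "connect" in text:
            return "connect", pointed_appliance_id
        return None, None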

[0076] Connection State Acquisition Unit 124

[0077] The connection state acquisition unit 124 acquires information about the connection state between the wireless appliances based on the connection information received from the wireless appliance 2A via the communication unit 15. For example, based on the received connection information, the connection state acquisition unit 124 acquires information indicating which of the wireless appliances 2n are connected to the wireless appliance 2A and, in the case of a connected state, which communication scheme is used (Bluetooth, Wi-Fi, short-range communication, ZigBee (registered trademark), or the like), band utilization setting information (restricted/unrestricted), and the like.
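
As a rough sketch only (the dictionary layout is assumed for illustration), the received connection information could be turned into a per-appliance state table roughly as follows.

    def build_state_table(connection_info, detected_ids):
        # connection_info: e.g. {"links": [{"peer_id": "2B", "scheme": "bluetooth",
        #                                   "band_restricted": False}, ...]}
        # detected_ids: IDs of every wireless appliance 2n detected in the space.
        connected = {link["peer_id"]: link for link in connection_info["links"]}
        table = {}
        for dev_id in detected_ids:
            if dev_id in connected:
                table[dev_id] = {"state": "connected",
                                 "scheme": connected[dev_id]["scheme"],
                                 "band_restricted": connected[dev_id].get("band_restricted")}
            else:
                table[dev_id] = {"state": "not_connected", "scheme": None}
        return table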

[0078] Connection State Display Processing Unit 126

[0079] The connection state display processing unit 126 performs control for displaying a virtual object indicating a link between the wireless appliances (indicating the connected state) on the display unit 13 based on connection state information between the wireless appliances acquired by the connection state acquisition unit 124 and three-dimensional position information of a corresponding wireless appliance acquired by the wireless appliance association processing unit 120. The virtual object indicating the connected state between the wireless appliances may be an AR image connecting the wireless appliances using a line, for example, an image of a virtual cable.
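
By way of illustration only, since the application describes the virtual cable image but no particular geometry, a line-shaped virtual object linking two three-dimensional positions could be approximated as a slightly sagging polyline that the display unit 13 then renders; the sag amount and segment count are arbitrary visual choices.

    def virtual_cable_points(pos_a, pos_b, sag=0.1, segments=16):
        # pos_a, pos_b: (x, y, z) positions of the two connected appliances.
        # Returns a list of 3-D points approximating a drooping cable between them;
        # the vertical sag is a purely visual parameter.
        points = []
        for i in range(segments + 1):
            t = i / segments
            x = pos_a[0] + (pos_b[0] - pos_a[0]) * t
            y = pos_a[1] + (pos_b[1] - pos_a[1]) * t - sag * 4.0 * t * (1.0 - t)
            z = pos_a[2] + (pos_b[2] - pos_a[2]) * t
            points.append((x, y, z))
        return points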

[0080] The connection state display processing unit 126 can perform not only control for displaying the virtual object indicating the connected state but also control for displaying a virtual object indicating a disconnected state for the wireless appliance 2n that is not connected to the wireless appliance 2A in a wireless manner, for example. Which wireless appliance is in the disconnected state can be determined by the information processing terminal 1 by referring to wireless information received from the wireless appliance 2 in the periphery.

[0081] The connection state display processing unit 126 may also control a motion of the virtual object to be displayed in accordance with the connection operation or the disconnection operation performed by the user that is recognized by the user operation recognition unit 122.

[0082] Connection Management Unit 128

[0083] The connection management unit 128 performs connection management (connection, disconnection, and the like) of a target wireless appliance in accordance with the user operation recognized by the user operation recognition unit 122.

[0084] Specifically, for example, the connection management unit 128 transmits, to the wireless appliance 2A (in this case, the Wi-Fi master unit), a control signal instructing it to connect or disconnect the connection between the wireless appliance 2A and the wireless appliance 2n designated by the user. The connection management unit 128 may instead transmit the control signal to the wireless appliance 2n side, that is, the side that is connected to or disconnected from the master unit.
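
The sketch below only illustrates this flow; send_to_appliance stands in for whatever transmission the communication unit 15 actually performs, and the message fields are assumptions.

    def send_to_appliance(appliance_id, message):
        # Placeholder transport; a real terminal would transmit via its communication unit.
        print("to", appliance_id, ":", message)

    def manage_connection(intent, master_id, target_id, scheme="wifi"):
        # intent: "connect" or "disconnect", as produced by the user operation recognition.
        if intent not in ("connect", "disconnect"):
            raise ValueError("unsupported intent: %r" % (intent,))
        message = {"command": intent, "master": master_id,
                   "target": target_id, "scheme": scheme}
        send_to_appliance(master_id, message)   # or the target appliance 2n side instead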

[0085] (Display Unit 13)

[0086] The display unit 13 is, for example, implemented by a lens unit (an example of a transmissive display unit) that performs display using a hologram optical technique, a liquid crystal display (LCD) device, an organic light emitting diode (OLED) device, and the like. The display unit 13 may be a transmissive type, a transflective type, or a non-transmissive type.

[0087] (Speaker 14)

[0088] The speaker 14 reproduces a voice signal in accordance with control by the control unit 12.

[0089] (Communication Unit 15)

[0090] The communication unit 15 is a communication module for transmitting/receiving data to/from another device in a wired/wireless manner. For example, the communication unit 15 communicates with an external apparatus directly or via a network access point using a scheme such as a wired local area network (LAN), a wireless LAN, Wireless Fidelity (Wi-Fi, registered trademark), infrared communication, Bluetooth (registered trademark), short-distance/non-contact communication, or a portable communication network (Long Term Evolution (LTE) or a third-generation mobile communication scheme (3G)).

[0091] (Operation Input Unit 16)

[0092] The operation input unit 16 is implemented by an operation member having a physical structure, such as a switch, a button, or a lever.

[0093] (Storage Unit 17)

……
……
……
