Patent: Display processing device, display processing method, and program
Publication Number: 20210181854
Publication Date: 2021-06-17
Applicant: Sony
Assignee: Sony Semiconductor Solutions Corporation
Abstract
There is provided a display processing device, a display processing method, and a program that can provide a better user experience when performing virtual display on a real space. An indication point recognition processing unit performs recognition processing to recognize an indication point indicating a point on a real space for creating a virtual drawing image that is an image virtually drawn. An operation information acquisition unit acquires operation information according to a user operation that makes a change to the virtual drawing image in creation. Then, a virtual drawing data processing unit generates virtual drawing data for drawing the virtual drawing image created according to the indication point while reflecting the change according to the operation information. A virtual drawing image display processing unit performs display processing to display the virtual drawing image in creation on a display screen in real time on the basis of the virtual drawing data. The present technology can be applied to, for example, an AR display device.
Claims
1. A display processing device comprising: a recognition processing unit configured to perform recognition processing to recognize an indication point that indicates a point on a real space for creating a virtual drawing image that is an image virtually drawn; an operation information acquisition unit configured to acquire operation information according to a user operation that makes a change to the virtual drawing image in creation; a data processing unit configured to generate virtual drawing data for drawing the virtual drawing image created according to the indication point, while reflecting the change according to the operation information; and a display processing unit configured to perform display processing to display the virtual drawing image in creation on a display screen in real time on a basis of the virtual drawing data.
2. The display processing device according to claim 1, wherein the operation information acquisition unit acquires the operation information in response to a touch operation of a user on a touch panel including a display unit that displays the display screen.
3. The display processing device according to claim 2, wherein the recognition processing unit recognizes the indication point by following the point moving continuously, the operation information acquisition unit acquires the operation information according to a continuous change in the touch operation of the user on the touch panel, and the data processing unit associates timing of continuous movement of the point indicated by the indication point with timing of the continuous change according to the operation information to generate the virtual drawing data.
4. The display processing device according to claim 1, wherein on a basis of a time difference between timing of emitting light and timing of receiving reflected light obtained by the light being reflected by an object, the recognition processing unit recognizes the indication point by using a distance image acquired by a time of flight (TOF) sensor that obtains a distance to the object.
5. The display processing device according to claim 1, further comprising a feedback control unit configured to feed back to a user that the virtual drawing image is being created.
6. The display processing device according to claim 1, further comprising a voice recognition unit configured to recognize a voice uttered by a user to acquire utterance information obtained by transcribing the voice, wherein the data processing unit generates the virtual drawing data for drawing the virtual drawing image in which a character based on the utterance information is virtually placed at a position indicated by the indication point at timing when the character is uttered.
7. The display processing device according to claim 1, further comprising a storage unit configured to store the virtual drawing data generated by the data processing unit, wherein the data processing unit supplies the virtual drawing data read from the storage unit to the display processing unit to perform the display processing of the virtual drawing image.
8. A display processing method to be executed by a display processing device that displays a virtual drawing image that is an image virtually drawn, the method comprising: performing recognition processing to recognize an indication point indicating a point on a real space for creating the virtual drawing image; acquiring operation information according to a user operation that makes a change to the virtual drawing image in creation; generating virtual drawing data for drawing the virtual drawing image created according to the indication point, while reflecting the change according to the operation information; and performing display processing to display the virtual drawing image in creation on a display screen in real time on a basis of the virtual drawing data.
9. A program for causing a computer of a display processing device that displays a virtual drawing image that is an image virtually drawn to perform display processing comprising: performing recognition processing to recognize an indication point indicating a point on a real space for creating the virtual drawing image; acquiring operation information according to a user operation that makes a change to the virtual drawing image in creation; generating virtual drawing data for drawing the virtual drawing image created according to the indication point, while reflecting the change according to the operation information; and performing display processing to display the virtual drawing image in creation on a display screen in real time on a basis of the virtual drawing data.
Description
TECHNICAL FIELD
[0001] The present disclosure relates to a display processing device, a display processing method, and a program, and in particular, to a display processing device, a display processing method, and a program that can provide a better user experience when performing virtual display on a real space.
BACKGROUND ART
[0002] In recent years, technology such as augmented reality and mixed reality, which performs display processing to make an object displayed on a screen appear to really exist in a real space, has been put into practical use. For example, an application is provided for a so-called smartphone that performs virtual display in which an object appears to be placed on a real space, by displaying an image captured by a camera on a touch panel and superimposing an object image on that image.
[0003] Conventionally, in such an application, for example, a user interface is employed in which a user places a virtual object on a real space and performs operations on the virtual object through touch panel operations. However, such a user interface results in a user experience that gives a feeling of low compatibility with the real space.
[0004] Furthermore, a user interface is provided that performs a virtual display in which, when a user moves a smartphone itself, a line is drawn on a real space according to the locus of the smartphone. However, with such a user interface, it is difficult to draw a virtual line on a real space as the user intends, resulting in a user experience with a low degree of freedom.
[0005] In contrast to this, for example, a user interface is proposed that captures a user’s gesture with a camera, places a virtual object on a real space according to the gesture, and performs an operation on the virtual object.
[0006] For example, Patent Document 1 discloses a user interface technology that provides feedback to a user by using a depth sensor to recognize a user’s gesture.
CITATION LIST
Patent Document
Patent Document 1: Japanese Patent Application Laid-Open No. 2012-221498
SUMMARY OF THE INVENTION
Problems to be Solved by the Invention
[0007] Meanwhile, in a user interface using the user's gestures as described above, for example, an operation to place a virtual object, an operation to make changes to the placement of the virtual object, and the like each need to be performed independently by its own corresponding gesture. For this reason, it is difficult, for example, to continuously change the width, color, or the like of a virtual line while drawing the line on a real space, and it is difficult to provide a good user experience.
[0008] The present disclosure has been made in view of such a situation, and is intended to provide a better user experience when performing a virtual display on a real space.
Solutions to Problems
[0009] A display processing device according to one aspect of the present disclosure includes: a recognition processing unit configured to perform recognition processing to recognize an indication point that indicates a point on a real space for creating a virtual drawing image that is an image virtually drawn; an operation information acquisition unit configured to acquire operation information according to a user operation that makes a change to the virtual drawing image in creation; a data processing unit configured to generate virtual drawing data for drawing the virtual drawing image created according to the indication point, while reflecting the change according to the operation information; and a display processing unit configured to perform display processing to display the virtual drawing image in creation on a display screen in real time on the basis of the virtual drawing data.
[0010] A display processing method according to one aspect of the present disclosure is to be executed by a display processing device that displays a virtual drawing image that is an image virtually drawn. The method includes: performing recognition processing to recognize an indication point indicating a point on a real space for creating the virtual drawing image; acquiring operation information according to a user operation that makes a change to the virtual drawing image in creation; generating virtual drawing data for drawing the virtual drawing image created according to the indication point, while reflecting the change according to the operation information; and performing display processing to display the virtual drawing image in creation on a display screen in real time on the basis of the virtual drawing data.
[0011] A program according to one aspect of the present disclosure causes a computer of a display processing device that displays a virtual drawing image that is an image virtually drawn to perform display processing including: performing recognition processing to recognize an indication point indicating a point on a real space for creating the virtual drawing image; acquiring operation information according to a user operation that makes a change to the virtual drawing image in creation; generating virtual drawing data for drawing the virtual drawing image created according to the indication point, while reflecting the change according to the operation information; and performing display processing to display the virtual drawing image in creation on a display screen in real time on the basis of the virtual drawing data.
[0012] According to one aspect of the present disclosure, recognition processing to recognize an indication point that indicates a point on a real space for creating the virtual drawing image that is an image virtually drawn is performed; operation information according to a user operation that makes a change to the virtual drawing image in creation is acquired; virtual drawing data for drawing the virtual drawing image created according to the indication point is generated, while reflecting the change according to the operation information; and display processing to display the virtual drawing image in creation on a display screen in real time on the basis of the virtual drawing data is performed.
Effects of the Invention
[0013] According to one aspect of the present disclosure, it is possible to provide a better user experience when performing a virtual display on a real space.
[0014] Note that advantageous effects described here are not necessarily restrictive, and any of the effects described in the present disclosure may be applied.
BRIEF DESCRIPTION OF DRAWINGS
[0015] FIG. 1 is a view showing a usage example of an AR display application.
[0016] FIG. 2 is a view showing a display example of an application screen.
[0017] FIG. 3 is a view describing a user interface when starting creation of a virtual drawing image.
[0018] FIG. 4 is a view describing the user interface when performing an operation to change a line thickness of the virtual drawing image.
[0019] FIG. 5 is a view describing the user interface when finishing the creation of the virtual drawing image.
[0020] FIG. 6 is a view describing a usage example of creating the virtual drawing image on the basis of voice recognition.
[0021] FIG. 7 is a block diagram showing a configuration example of a smartphone to which the present technology is applied.
[0022] FIG. 8 is a flowchart describing display processing to be performed in the AR display application.
[0023] FIG. 9 is a view describing a first effect display for the virtual drawing image.
[0024] FIG. 10 is a view describing a second effect display for the virtual drawing image.
[0025] FIG. 11 is a view describing a third effect display for the virtual drawing image.
[0026] FIG. 12 is a view describing an example of applying the AR display application to virtual reality.
[0027] FIG. 13 is a block diagram showing a configuration example of one embodiment of a computer to which the present technology is applied.
MODE FOR CARRYING OUT THE INVENTION
[0028] A specific embodiment to which the present technology is applied will be described in detail below with reference to the drawings.
[0029] <Usage Example of AR Display Application>
[0030] First, with reference to FIGS. 1 to 6, usage examples of an application that implements display processing to which the present technology is applied (hereinafter referred to as an AR display application) will be described. For example, the AR display application can be executed by a smartphone 11 including an image capturing device, a time of flight (TOF) sensor, a touch panel, or the like.
[0031] A of FIG. 1 shows a user A using the smartphone 11, and B of FIG. 1 shows an AR image display screen 13 displayed on the touch panel of the smartphone 11.
[0032] For example, the user A operates the smartphone 11 to execute the AR display application, and moves a fingertip such that the fingertip appears in an image captured by the image capturing device of the smartphone 11. At this time, a position of the user’s fingertip is recognized on the basis of a distance image acquired by the TOF sensor of the smartphone 11. With this configuration, by following a locus of the user’s fingertip, the AR display application can display, on the AR image display screen 13, an AR image obtained by superimposing a virtual drawing image 14 drawn by a line following the locus of the fingertip on the image of a real space captured by the image capturing device.
[0033] In the usage example shown in A of FIG. 1, the user A points the image capturing device of the smartphone 11 at a vase 12 and moves the fingertip so as to draw a flower arranged in the vase 12. With this configuration, as shown in B of FIG. 1, an AR image in which the virtual drawing image 14 representing the flower virtually drawn by the line corresponding to the locus of the fingertip is arranged in the vase 12 shown in the image of a real space is displayed on the AR image display screen 13.
[0034] At this time, the user B sees that the user A is just moving the fingertip in the air, but when the image capturing device of the smartphone 11 is pointed at the vase 12 from the user B side, the virtual drawing image 14 viewed from the user B side is displayed on the AR image display screen 13. That is, the AR display application can generate virtual drawing data for displaying the virtual drawing image 14 according to the absolute coordinate system of the real space (for example, data indicating the locus of the fingertip represented in that absolute coordinate system). This allows the AR display application to display the created virtual drawing image 14 on the AR image display screen 13 from all directions, as if the virtual drawing image 14 were virtually placed on the real space.
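To make the stroke viewable from any direction, each fingertip sample must be stored in world coordinates rather than device coordinates. The following is a minimal sketch of that conversion, assuming the position attitude sensor's output is available as a rotation matrix and a translation vector; the pose representation and function names are illustrative, not taken from the patent.

```python
# A minimal sketch (not from the patent) of storing fingertip samples in the
# absolute coordinate system of the real space: each sample measured relative
# to the smartphone is transformed into world coordinates using the device
# pose. The pose representation (rotation matrix R and translation t) is an
# assumption made for illustration.
import numpy as np

def to_world(p_device: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Transform a fingertip point from device coordinates to world coordinates.

    p_device: (3,) fingertip position relative to the smartphone.
    R:        (3, 3) device orientation (device-to-world rotation).
    t:        (3,)   device position in world coordinates.
    """
    return R @ p_device + t

# Example: device at (1, 0, 0), rotated 90 degrees about the vertical axis,
# fingertip 0.3 m in front of the camera.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
print(to_world(np.array([0.0, 0.0, 0.3]), R, np.array([1.0, 0.0, 0.0])))
```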
[0035] FIG. 2 shows a display example of an application screen displayed on the touch panel of the smartphone 11 when the AR display application is executed.
[0036] As shown in FIG. 2, the AR image display screen 13 (see B of FIG. 1) is displayed in an upper part of an application screen 21, and a line drawing operation button 22, a line width operation panel 23, and a line color operation panel 24 are displayed below the AR image display screen 13. For example, the line drawing operation button 22, the line width operation panel 23, and the line color operation panel 24 are preferably displayed within reach of a finger of one hand when the user holds the smartphone 11 with the one hand.
[0037] On the AR image display screen 13, the image captured by the image capturing device of the smartphone 11 is displayed in real time, and in superimposition on the image, an AR image is displayed in which the virtual drawing image 14 as described with reference to FIG. 1 is displayed.
[0038] The line drawing operation button 22 is a graphical user interface (GUI) for performing an operation to start or finish the creation of the virtual drawing image 14 in response to a touch operation on the touch panel of the smartphone 11. For example, when it is recognized that the user has touched the line drawing operation button 22, while the touch operation of the user is recognized, a line representing the virtual drawing image 14 in creation is displayed according to the locus of the fingertip of the user. Then, when it is recognized that the user has released the touch from the line drawing operation button 22, generation of the virtual drawing image 14 is finished. Note that for example, the operation on the line drawing operation button 22 may switch the start or finish of the creation of the virtual drawing image 14 each time the touch is performed, or when the touch is recognized, the virtual drawing image 14 may be created until the next touch is recognized.
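The two behaviors just described (press-and-hold and tap-to-toggle) can be sketched as a small piece of state handling; the class and method names below are assumptions for illustration only, not identifiers from the patent.

```python
# A minimal sketch of the two button behaviors: press-and-hold (draw only
# while the button is touched) and toggle (each tap switches between
# starting and finishing the stroke).
class LineDrawingButton:
    def __init__(self, mode: str = "hold"):
        self.mode = mode        # "hold" or "toggle"
        self.drawing = False    # True while the virtual drawing image is in creation

    def on_touch_down(self) -> None:
        if self.mode == "hold":
            self.drawing = True                 # start creating the stroke
        else:
            self.drawing = not self.drawing     # each tap flips start/finish

    def on_touch_up(self) -> None:
        if self.mode == "hold":
            self.drawing = False                # releasing the touch finishes it

btn = LineDrawingButton(mode="hold")
btn.on_touch_down(); print(btn.drawing)   # True: stroke in creation
btn.on_touch_up();   print(btn.drawing)   # False: stroke finished
```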
[0039] The line width operation panel 23 is a GUI for continuously operating changes to the width of the line representing the virtual drawing image 14 in response to the touch operation on the touch panel of the smartphone 11 while creating the virtual drawing image 14. For example, when a touch operation to move a slider displayed on the line width operation panel 23 to the right is recognized, the line width of the created virtual drawing image 14 is changed to increase according to the position where the slider is moved. Meanwhile, when a touch operation to move the slider displayed on the line width operation panel 23 to the left is recognized, the line width of the created virtual drawing image 14 is continuously changed to decrease according to the position where the slider is moved.
[0040] The line color operation panel 24 is a GUI for continuously operating changes to the color of the line representing the virtual drawing image 14 in response to the touch operation on the touch panel of the smartphone 11 while creating the virtual drawing image 14. For example, a color palette representing a hue circle in which RGB values change continuously can be used for the line color operation panel 24, and the color of the created virtual drawing image 14 is changed continuously according to the color displayed at the touch position.
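Both panels map a continuous touch input to a continuous style value. A minimal sketch of such mappings is shown below, assuming a normalized slider position for the width and a hue-circle palette whose angle around the center selects the hue; the width range and palette geometry are illustrative assumptions.

```python
# A minimal sketch of the two continuous style controls: a normalized slider
# position mapped to a line width, and a touch offset on a hue-circle palette
# mapped to an RGB color.
import colorsys
import math

def slider_to_width(x: float, min_w: float = 1.0, max_w: float = 30.0) -> float:
    """Map a slider position x in [0, 1] to a line width; moving right increases it."""
    return min_w + x * (max_w - min_w)

def hue_circle_to_rgb(dx: float, dy: float) -> tuple:
    """Map a touch offset (dx, dy) from the palette center to an RGB color.

    The angle around the center selects the hue, so the color changes
    continuously as the finger moves around the circle.
    """
    hue = (math.atan2(dy, dx) / (2.0 * math.pi)) % 1.0
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

print(slider_to_width(0.5))          # mid slider -> medium width
print(hue_circle_to_rgb(0.0, 1.0))   # touch above center -> one hue
```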
[0041] With reference to FIG. 3, a user interface when starting creation of the virtual drawing image 14 in the AR display application will be described.
[0042] For example, as shown in an upper side of FIG. 3, when the user moves the fingertip of the right hand to a position to start drawing the virtual drawing image 14 and then performs a touch operation to touch the line drawing operation button 22 with the left finger, the creation of the virtual drawing image 14 is started. Then, when the user moves the fingertip of the right hand while performing the touch operation on the line drawing operation button 22, the virtual drawing image 14 is created according to the locus of the fingertip. With this operation, as shown in a lower side of FIG. 3, an AR image in which the virtual drawing image 14 is placed on a real space is displayed on the AR image display screen 13.
[0043] With reference to FIG. 4, a user interface for performing an operation to change the line thickness of the virtual drawing image 14 in the AR display application will be described.
[0044] For example, as shown in an upper side of FIG. 4, when the user moves the fingertip of the right hand to a position to change the line thickness of the virtual drawing image 14 while drawing the line of the virtual drawing image 14, and then performs an operation on the slider of the line width operation panel 23 with the left finger, the line thickness of the virtual drawing image 14 is changed. Then, when the user changes the line thickness by using the slider of the line width operation panel 23 and moves the fingertip of the right hand, according to the locus of the fingertip, the virtual drawing image 14 in which the line thickness is continuously changed is created. With this operation, for example, as shown in a lower side of FIG. 4, the virtual drawing image 14 in which the line thickness is continuously changed to increase is created.
[0045] Furthermore, similarly, when the user moves the fingertip of the right hand to a position to change the line color of the virtual drawing image 14 while drawing the line of the virtual drawing image 14, and then performs a touch operation on the line color operation panel 24 with the left finger, the line color of the virtual drawing image 14 is changed.
[0046] With reference to FIG. 5, a user interface for finishing creation of the virtual drawing image 14 in the AR display application will be described.
[0047] For example, as shown in an upper side of FIG. 5, when the user moves the fingertip of the right hand to a position to finish drawing the virtual drawing image 14 and then releases the touch on the line drawing operation button 22 with the left finger, the creation of the virtual drawing image 14 is finished. Thereafter, even if the user moves the fingertip of the right hand, for example, along the dashed arrow shown in the upper side of FIG. 5, the line of the virtual drawing image 14 is not drawn on the AR image display screen 13, as shown in a lower side of FIG. 5.
[0048] With the above-described user interface, the AR display application can implement operations to change the line width and line color continuously while creating the virtual drawing image 14, providing operability with a higher degree of freedom than before. Furthermore, the user can create the virtual drawing image 14 by moving the fingertip on a real space while checking the virtual drawing image 14 in creation on the AR image display screen 13, which provides operability that is highly compatible with the real space. Therefore, a better user experience can be implemented by the AR display application.
[0049] That is, the AR display application can recognize one hand, finger, or the like captured by the image capturing device of the smartphone 11, follow movement thereof, create the virtual drawing image 14, and continuously reflect the change to the virtual drawing image 14 in creation in response to the operation of the other hand. With this configuration, continuous changes to the virtual drawing image 14 with the degree of freedom higher than before can be implemented.
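One way to realize this continuous reflection, sketched below, is to timestamp each fingertip sample and store it together with the width and color in effect at that instant, so that the finished stroke reproduces the changes exactly as they occurred; the data layout is an assumption, not a format specified by the patent.

```python
# A minimal sketch of associating the timing of fingertip movement with the
# timing of style changes: each sampled stroke point is stored together with
# the width and color in effect at that instant.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class StrokeSample:
    t: float                            # timestamp of the fingertip sample
    point: Tuple[float, float, float]   # position in the real-space coordinate system
    width: float                        # line width in effect at time t
    color: Tuple[float, float, float]   # RGB color in effect at time t

@dataclass
class VirtualStroke:
    samples: List[StrokeSample] = field(default_factory=list)

    def add(self, t, point, width, color) -> None:
        self.samples.append(StrokeSample(t, point, width, color))

stroke = VirtualStroke()
stroke.add(0.00, (0.00, 0.0, 0.3), 2.0, (1.0, 0.0, 0.0))
stroke.add(0.05, (0.01, 0.0, 0.3), 2.5, (1.0, 0.1, 0.0))  # width/color changed mid-stroke
print(len(stroke.samples))
```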
[0050] With reference to FIG. 6, a usage example of creating the virtual drawing image 14 on the basis of voice recognition in the AR display application will be described.
[0051] For example, when the user gives utterance while moving the fingertip to appear in the image captured by the image capturing device of the smartphone 11, the AR display application can perform voice recognition on the uttered voice and create a virtual drawing image 14a that displays a character string indicating details of the utterance according to the locus of the fingertip. For example, in the example shown in FIG. 6, when the user moves the fingertip while giving utterance “Thank you”, an AR image in which the virtual drawing image 14a that displays a character string “Thank you” according to the locus of the fingertip is placed on a real space is displayed on the AR image display screen 13.
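A sketch of how recognized characters could be anchored to the locus follows: each character is placed at the locus sample whose timestamp is nearest to the character's utterance time. A speech recognizer that returns per-character timestamps is assumed here for illustration.

```python
# A minimal sketch of anchoring recognized characters to the fingertip locus:
# each character is placed at the locus sample nearest in time to when the
# character was uttered.
from bisect import bisect_left

def nearest_index(times, t):
    """Index of the timestamp in the sorted list `times` nearest to t."""
    i = bisect_left(times, t)
    if i == 0:
        return 0
    if i == len(times):
        return len(times) - 1
    return i if times[i] - t < t - times[i - 1] else i - 1

def place_characters(locus, chars):
    """locus: list of (t, (x, y, z)); chars: list of (t, character)."""
    times = [t for t, _ in locus]
    return [(ch, locus[nearest_index(times, t_char)][1]) for t_char, ch in chars]

locus = [(0.0, (0.0, 0.0, 0.3)), (0.5, (0.1, 0.0, 0.3)), (1.0, (0.2, 0.0, 0.3))]
chars = [(0.1, "T"), (0.6, "h"), (0.9, "x")]
print(place_characters(locus, chars))
```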
[0052] With such voice recognition input, the AR display application is suitable for use during, for example, a presentation or a school class, to input a voice with a microphone or the like while pointing with a finger at a drawing to be explained. That is, it can easily be used, for example, to virtually place the character string at the pointed position. Furthermore, the AR display application is also suitably used for, for example, a situation log at a construction site, a precaution for maintenance, or the like.
[0053] <Configuration Example of Smartphone>
[0054] FIG. 7 is a block diagram showing a configuration example of the smartphone 11 that executes the AR display application.
[0055] As shown in FIG. 7, the smartphone 11 includes an image capturing device 31, a TOF sensor 32, a position attitude sensor 33, a sound pickup sensor 34, a touch panel 35, a vibration motor 36, and an AR display processing unit 37. Furthermore, the AR display processing unit 37 includes an indication point recognition processing unit 41, a voice recognition unit 42, an operation information acquisition unit 43, a feedback control unit 44, a storage unit 45, a virtual drawing data processing unit 46, and a virtual drawing image display processing unit 47.
[0056] The image capturing device 31 includes, for example, a complementary metal oxide semiconductor (CMOS) image sensor or the like, and supplies an image obtained by capturing a real space to the indication point recognition processing unit 41 and the virtual drawing image display processing unit 47 of the AR display processing unit 37.
[0057] The TOF sensor 32 includes, for example, a light-emitting unit that emits modulated light toward an image capturing range of the image capturing device 31 and a light-receiving unit that receives reflected light obtained by the modulated light being reflected by an object. With this configuration, the TOF sensor 32 can measure a distance (depth) to the object on the basis of a time difference between timing of emitting the modulated light and timing of receiving the reflected light, and acquire a distance image that is an image based on the distance. The TOF sensor 32 supplies the acquired distance image to the indication point recognition processing unit 41 of the AR display processing unit 37.
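The round-trip timing described above gives the distance directly: d = c · Δt / 2, halved because the light travels to the object and back. A minimal sketch of the computation:

```python
# TOF distance from the emit/receive time difference: d = c * dt / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(delta_t_seconds: float) -> float:
    """Distance to the object from the emit/receive time difference."""
    return C * delta_t_seconds / 2.0

# A 2-nanosecond round trip corresponds to roughly 0.3 m.
print(tof_distance(2e-9))
```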
[0058] The position attitude sensor 33 includes, for example, a positioning sensor that measures the absolute position of the smartphone 11 by receiving various radio waves, a gyro sensor that measures the attitude on the basis of an angular velocity generated in the smartphone 11, or the like. Then, the position attitude sensor 33 supplies position and attitude information indicating the absolute position and the attitude of the smartphone 11 to the virtual drawing data processing unit 46 of the AR display processing unit 37.
[0059] The sound pickup sensor 34 includes, for example, a microphone element, collects a voice uttered by the user, and supplies voice data thereof to the voice recognition unit 42 of the AR display processing unit 37.
[0060] The touch panel 35 includes a display unit that displays the application screen 21 described above with reference to FIG. 2, and a touch sensor that detects a touched position on a surface of the display unit. Then, the touch panel 35 supplies touch position information indicating the touched position detected by the touch sensor to the operation information acquisition unit 43 of the AR display processing unit 37.
[0061] The vibration motor 36 provides feedback about the user operation by vibrating the smartphone 11 according to the control by the feedback control unit 44 of the AR display processing unit 37.
[0062] The AR display processing unit 37 includes respective blocks necessary for executing the AR display application, and implements the user interface as described with reference to FIGS. 1 to 6.
[0063] The indication point recognition processing unit 41 recognizes the fingertip captured by the image capturing device 31 as an indication point indicating the locus of the line for drawing the virtual drawing image 14 on the basis of the image supplied from the image capturing device 31 and the distance image supplied from the TOF sensor 32. For example, by performing image recognition processing on the image captured by the image capturing device 31, the indication point recognition processing unit 41 can recognize the fingertip of the user that appears in the image. This allows the indication point recognition processing unit 41 to identify the relative position of the fingertip with respect to the smartphone 11 by obtaining the distance to the fingertip shown in the image according to the distance image, and recognize the indication point. Then, the indication point recognition processing unit 41 supplies relative position information indicating the relative position of the indication point with respect to the smartphone 11 to the virtual drawing data processing unit 46.
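A common way to combine a detected fingertip pixel with the TOF depth, sketched below, is to back-project the pixel through a pinhole camera model into a 3D position relative to the device; the camera intrinsics are illustrative assumptions, as the patent only states that the image and the distance image are combined.

```python
# A minimal sketch of back-projecting the fingertip pixel found by image
# recognition, using the depth read from the TOF distance image, into a 3D
# position in camera (device) coordinates. The intrinsics (fx, fy, cx, cy)
# are assumptions for illustration.
def backproject(u: float, v: float, depth: float,
                fx: float, fy: float, cx: float, cy: float):
    """Pixel (u, v) with depth in meters -> (x, y, z) in camera coordinates."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Fingertip detected at pixel (700, 400), 0.4 m away, for a camera with a
# focal length of 1000 px and the principal point at the image center.
print(backproject(700.0, 400.0, 0.4, fx=1000.0, fy=1000.0, cx=640.0, cy=480.0))
```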
[0064] The voice recognition unit 42 performs voice recognition processing on the voice data supplied from the sound pickup sensor 34, acquires utterance information obtained by transcribing the voice uttered by the user, and supplies the utterance information to the virtual drawing data processing unit 46.
[0065] The operation information acquisition unit 43 acquires operation information indicating details of the operation according to the touch operation by the user on the basis of the application screen 21 displayed on the touch panel 35 and the touch position information supplied from the touch panel 35. For example, as described with reference to FIG. 2, in response to the touch operation on the line drawing operation button 22, the operation information acquisition unit 43 can acquire operation information indicating that creation of the virtual drawing image 14 is started or finished. Furthermore, the operation information acquisition unit 43 acquires operation information indicating that a change is made to the line width representing the virtual drawing image 14 in response to the touch operation on the line width operation panel 23. Furthermore, the operation information acquisition unit 43 acquires operation information indicating that a change is made to the line color representing the virtual drawing image 14 in response to the touch operation on the line color operation panel 24.
[0066] When the operation information indicating that the operation to start the creation of the virtual drawing image 14 has been performed is supplied from the operation information acquisition unit 43, the feedback control unit 44 controls the vibration motor 36 to vibrate. Then, the feedback control unit 44 continues to vibrate the vibration motor 36 until the generation of the virtual drawing data is finished, and stops the vibration of the vibration motor 36 when the operation information indicating that the operation to finish the generation of the virtual drawing data has been performed is supplied from the operation information acquisition unit 43. This allows the feedback control unit 44 to perform feedback control for causing the user to recognize that the virtual drawing image 14 is being created.
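A minimal sketch of this feedback behavior follows, with a stand-in motor interface (the names are assumptions): vibration starts on the start operation, continues while the stroke is being generated, and stops on the finish operation.

```python
# A minimal sketch of the feedback control described above.
class FeedbackController:
    def __init__(self, vibration_motor):
        self.motor = vibration_motor

    def on_operation(self, op: str) -> None:
        if op == "start_drawing":
            self.motor.start()   # vibrate so the user knows creation is active
        elif op == "finish_drawing":
            self.motor.stop()    # stop vibrating when the stroke is finished

class FakeMotor:
    def start(self): print("vibration on")
    def stop(self):  print("vibration off")

fb = FeedbackController(FakeMotor())
fb.on_operation("start_drawing")
fb.on_operation("finish_drawing")
```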
……
……
……