Sony Patent | Information Processing Apparatus, Information Processing Method, Information Processing System, Program Production Method, And Program

Patent: Information Processing Apparatus, Information Processing Method, Information Processing System, Program Production Method, And Program

Publication Number: 20200241851

Publication Date: 20200730

Applicants: Sony

Abstract

There is provided an information processing apparatus to enable a user to easily grasp a state of a computer program and to easily perform a process of changing the computer program, using a three-dimensional object. The information processing apparatus includes an output unit and a controller. The output unit is configured to output data indicating a predetermined three-dimensional object having a face on which at least a portion of a computer program that runs on a computer is displayed. The controller is configured to instruct, according to a predetermined operation, the computer to perform a change in the computer program, the predetermined operation being performed by a user with respect to the three-dimensional object.

TECHNICAL FIELD

[0001] The present technology relates to an information processing apparatus, an information processing method, an information processing system, a program production method, and a program that make it possible to perform, in the process of development of a computer program, processing regarding, for example, a modification to a program.

BACKGROUND ART

[0002] In related art, there exist technologies that support the process of development of a computer program. For example, Patent Literature 1 described below discloses that a debugger of an application program collects every state of a kernel, an interrupt handler, and respective tasks; arranges a plurality of objects in a virtual three-dimensional space on a screen; and allocates the collected pieces of information to respective properties of a shape, a size, and a color of each of the plurality of objects, and to a property indicating the behavior of each of the plurality of objects in the three-dimensional space.

CITATION LIST

Patent Literature

[0003] Patent Literature 1: Japanese Patent Application Laid-open No. 10-261109

Disclosure of Invention

Technical Problem

[0004] However, the technology disclosed in Patent Literature 1 enables a user (a programmer) to intuitively grasp, for example, a state of a task of a computer program in a three-dimensional space, but does not enable the user to directly operate and change the computer program in the three-dimensional space using the grasped information.

[0005] In view of the circumstances described above, it is an object of the present technology to provide an information processing apparatus, an information processing method, an information processing system, a program production method, and a program that enable a user to easily grasp a state of a computer program and to easily perform an operation of changing the computer program, using a three-dimensional object.

Solution to Problem

[0006] In order to achieve the object described above, an information processing apparatus according to an embodiment of the present technology includes an output unit and a controller. The output unit is configured to output data indicating a predetermined three-dimensional object having a face on which at least a portion of a computer program that runs on a computer is displayed. The controller is configured to instruct, according to a predetermined operation, the computer to perform a change in the computer program, the predetermined operation being performed by a user with respect to the three-dimensional object.

[0007] Accordingly, using the three-dimensional object, the information processing apparatus enables the user to intuitively and easily grasp a state of the computer program, and to easily perform a change operation for optimizing the computer program. Here, the computer may be a personal computer (PC), a smartphone, a tablet terminal, a smartwatch or another wearable terminal, a camera, a game device, a TV device, a personal video recorder (PVR), an audio player, an electronic book, or any other apparatus.

[0008] The controller may control the output unit to output, together with the data indicating the three-dimensional object, data indicating an operation method for changing the computer program.

[0009] Accordingly, the information processing apparatus enables the user to grasp the operation method and to change the computer program easily.

[0010] The controller may control the output unit to output the data indicating the three-dimensional object in a three-dimensional space that is a space in which an operation is performed by the user.

[0011] Accordingly, the information processing apparatus enables the user to grasp the computer program more intuitively and to input an operation of the change.

[0012] The controller may recognize a predetermined gesture of the user as the operation, and may give, according to the gesture, an instruction to perform the change.

[0013] Accordingly, the information processing apparatus enables the user to more intuitively input the operation of the change.

[0014] The controller may instruct the computer to start to run the computer program, and may analyze running data of the computer program to generate analysis data, the running data being received from the computer.

[0015] In this case, the controller may control the output unit such that the computer program is displayed on a first face of the three-dimensional object, and the running data or the analysis data is displayed on a second face of the three-dimensional object, the second face being different from the first face.

[0016] Accordingly, the information processing apparatus enables the user to grasp, together with (code of) the computer program, a running condition of the computer program or an analysis result on the different faces of the three-dimensional object.

[0017] The three-dimensional object may have a structure including at least one rectangular parallelepiped stacked in a predetermined direction. In this case, the controller may control the output unit such that the computer program is displayed on one of two faces from among faces of the rectangular parallelepiped, and the running data or the analysis data is displayed on one of four lateral faces from among the faces of the rectangular parallelepiped, the two faces being orthogonal to the predetermined direction, the four lateral faces being parallel to the predetermined direction.

[0018] Accordingly, the information processing apparatus enables the user to intuitively grasp the computer program, and the running condition of the computer program or an analysis result through different faces of the stacked rectangular parallelepiped object.

[0019] Each stacked rectangular parallelepiped may correspond to a single loop process of the computer program. In this case, the controller may control the output unit such that data indicating an execution time for the single loop process is displayed on the one of four lateral faces as the running data or the analysis data.

[0020] Accordingly, the information processing apparatus enables the user to intuitively grasp a loop structure included in the computer program, and its execution time, through the different faces of the rectangular parallelepiped object.

[0021] The controller may determine, for each of a plurality of mutually different change processes, whether the change process can be performed with respect to the computer program, and the controller may control the output unit to output, as the operation method, data indicating an operation method for a change process determined to be possible.

[0022] Accordingly, the information processing apparatus determines whether each of the plurality of different change processes can be performed with respect to the computer program, and clearly specifies an operation method only for a change process that is possible. This enables the user to intuitively grasp which change processes are possible, and spares the user from unproductively considering an impossible change process.

[0023] The controller may give, according to an operation of the user, an instruction to perform the change in the computer program, the operation of the user corresponding to the operation method for the change process determined to be possible, and the controller may control the output unit such that the three-dimensional object is deformed to be displayed according to a result of the change.

[0024] Accordingly, the information processing apparatus deforms and displays the three-dimensional object according to a result of the change triggered by the user's operation, the operation giving an instruction to perform one of the plurality of change processes. This makes it possible to give the user feedback regarding the operation.

[0025] The change process may be one of splitting, parallelization, unrolling, vectorization, or reordering of the computer program.

[0026] These change processes can be performed in a programming language such as “Halide”, but are not limited to this.
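For concreteness, the five change processes named above correspond directly to Halide's loop-scheduling primitives. The following minimal C++ sketch is illustrative only (the pipeline and its names are not from the patent); it applies the primitives to a trivial stand-in for one stage of an image processing program:

```cpp
#include "Halide.h"
using namespace Halide;

int main() {
    Func blur("blur");                       // stand-in for one stage of the
    Var x("x"), y("y"), xo("xo"), xi("xi");  // image processing program

    blur(x, y) = (x + y) / 2;  // trivial loop body

    // The five change processes as Halide scheduling directives; they
    // restructure the loop nest without changing the computed result.
    blur.split(x, xo, xi, 8);  // splitting: loop over x -> outer/inner pair
    blur.vectorize(xi);        // vectorization of the inner loop (extent 8)
    blur.parallel(y);          // parallelization across rows
    blur.reorder(xi, xo, y);   // reordering of the loop nest
    // Unrolling would be blur.unroll(xi); it is an alternative to
    // vectorizing the same inner loop.

    Buffer<int> out = blur.realize({256, 256});
    return 0;
}
```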

[0027] The three-dimensional object may be in a form of a book. In this case, the controller may control the output unit to output an image of a bookshelf in which a plurality of the three-dimensional objects is arranged in a three-dimensional space such that the second face corresponds to a spine of the book.

[0028] Accordingly, the information processing apparatus depicts each computer program in the form of a book, and arranges the books in a bookshelf such that the user can take in the running condition of each computer program, or an analysis result, at a glance. This enables the user to easily grasp a tuning-target program.

[0029] When the controller detects a user’s operation of selecting one of the plurality of the three-dimensional objects arranged in the bookshelf, the controller may set, to be a target of the change, the computer program displayed on the first face of the selected three-dimensional object.

[0030] Accordingly, the information processing apparatus enables the user to easily select a tuning-target program.

[0031] The controller may analyze an execution time for the computer program as the running data, and may control the output unit such that the second face having an area corresponding to a length of the execution time is displayed.

[0032] Accordingly, the information processing apparatus enables the user to intuitively grasp an execution time for the computer program using the area of the spine of a book virtually arranged in the bookshelf, and to select a tuning-target computer program.

[0033] An information processing system according to another embodiment of the present technology includes a computer and an information processing apparatus. The computer includes a storage and a first controller, and the information processing apparatus includes an output unit and a second controller. The storage is configured to store therein a computer program, and the first controller is configured to change the computer program. The output unit is configured to output data indicating a predetermined three-dimensional object having a face on which at least a portion of the computer program is displayed, and the second controller is configured to instruct, according to a predetermined operation, the computer to perform a change in the computer program, the predetermined operation being performed by a user with respect to the three-dimensional object.

[0034] An information processing method according to another embodiment of the present technology includes: outputting data indicating a predetermined three-dimensional object having a face on which at least a portion of a computer program that runs on a computer is displayed; and

[0035] instructing, according to a predetermined operation, the computer to perform a change in the computer program, the predetermined operation being performed by a user with respect to the three-dimensional object.

[0036] A computer program production method according to another embodiment of the present technology includes:

[0037] outputting data indicating a predetermined three-dimensional object having a face on which at least a portion of a computer program that runs on a computer is displayed; and

[0038] instructing, according to a predetermined operation, the computer to perform a change in the computer program, the predetermined operation being performed by a user with respect to the three-dimensional object.

[0039] A program according to another embodiment of the present technology causes an information processing apparatus to perform a process including:

[0040] outputting data indicating a predetermined three-dimensional object having a face on which at least a portion of a computer program that runs on a computer is displayed; and

[0041] instructing, according to a predetermined operation, the computer to perform a change in the computer program, the predetermined operation being performed by a user with respect to the three-dimensional object.

Advantageous Effects of Invention

[0042] As described above, the present technology enables a user to easily grasp a state of a computer program and to easily perform an operation of changing the computer program, using a three-dimensional object. However, this effect does not limit the present technology.

BRIEF DESCRIPTION OF DRAWINGS

[0043] FIG. 1 illustrates a configuration of a computer-program development system according to an embodiment of the present technology.

[0044] FIG. 2 illustrates a hardware configuration of a program development server included in the computer-program development system.

[0045] FIG. 3 is a flowchart of a procedure of an operation of the computer-program development system.

[0046] FIG. 4 illustrates an example of a three-dimensional object used to develop a computer program being displayed by the program development server.

[0047] FIG. 5 illustrates a method for operating the three-dimensional object used to develop a computer program.

[0048] FIG. 6 illustrates an example of displaying an iteration object taken out of the three-dimensional object used to develop a computer program.

[0049] FIG. 7 illustrates an example of displaying an iteration object taken out of the three-dimensional object used to develop a computer program.

[0050] FIG. 8 illustrates an example of displaying an iteration object taken out of the three-dimensional object used to develop a computer program.

[0051] FIG. 9 illustrates an example of displaying an iteration object taken out of the three-dimensional object used to develop a computer program.

[0052] FIG. 10 illustrates an example of displaying an iteration object taken out of the three-dimensional object used to develop a computer program.

[0053] FIG. 11 illustrates an example of displaying an iteration object taken out of the three-dimensional object used to develop a computer program.

[0054] FIG. 12 illustrates an example of displaying the three-dimensional object according to another embodiment of the present technology.

[0055] FIG. 13 illustrates an example of displaying the three-dimensional object according to another embodiment of the present technology.

[0056] FIG. 14 illustrates an example of displaying the three-dimensional object according to another embodiment of the present technology.

MODE(S) FOR CARRYING OUT THE INVENTION

[0057] Embodiments according to the present technology will now be described below with reference to the drawings.

[0058] [Outline of System]

[0059] FIG. 1 illustrates a configuration of a computer-program development system according to an embodiment of the present technology.

[0060] As illustrated in the figure, the system includes a program development server 100, a 3D display device 110, a gesture-input-and-recognition device 120, and a camera 200.

[0061] The program development server 100 is connected to the 3D display device 110 and the gesture-input-and-recognition device 120 as well as to the camera 200 through a wired or wireless communication interface.

[0062] The camera 200 is an example of a tuning-target computer, and includes a storage such as a flash memory that stores therein an image processing program 210 that is an example of a computer program, and a calculator 220 that can execute the image processing program 210.

[0063] Examples of the 3D display device 110 include a virtual reality (VR) device such as a head-mounted display and VR glasses, and the 3D display device 110 is worn by a user. Examples of the gesture-input-and-recognition device 120 include a camera and other sensors.

[0064] The program development server 100 performs processing that occurs in a program development environment, the processing including a change in the image processing program 210, building of the image processing program 210, and a calculation necessary to analyze a profile of the image processing program 210, the image processing program 210 being executed in the camera 200 described above.

[0065] Specifically, using the 3D display device 110, the program development server 100 can display at least a portion of the image processing program 210 on a predetermined face of a three-dimensional object in a three-dimensional VR space.

[0066] Then, during the three-dimensional display of the image processing program 210, the program development server 100 transmits, to the camera 200, an instruction to change the image processing program 210, according to a user’s gesture that is recognized by the gesture-input-and-recognition device 120.

[0067] Further, the program development server 100 instructs the camera 200 to execute (to start to run) the image processing program 210, and receives, from the camera 200, static-and-dynamic-profile information regarding the image processing program 210 or the calculator 220 at the time of execution of the image processing program 210 performed by the calculator 220. Then, the program development server 100 arranges, on the three-dimensional object, the profile information or information regarding a result obtained by further analyzing the profile information, and outputs the profile information or the information regarding the analysis result to the 3D display device 110 as VR display information so that the 3D display device 110 displays the VR display information.

[0068] The program development server 100 can cause the 3D display device 110 to display the image processing program 210 together with the profile information received from the camera 200, or can cause the 3D display device 110 to switch between the image processing program 210 and the received profile information to perform display.

[0069] [Hardware Configuration of Program Development Server]

[0070] FIG. 2 illustrates a hardware configuration of the program development server 100. As illustrated in the figure, the program development server 100 includes a central processing unit (CPU) 11, a read only memory (ROM) 12, a random access memory (RAM) 13, an input/output interface 15, and a bus 14 through which these components are connected to one another.

[0071] The CPU 11 accesses, for example, the RAM 13 as appropriate when necessary, and comprehensively controls the entirety of respective blocks of the program development server 100 while performing various arithmetic processes. The ROM 12 is a nonvolatile memory that statically stores therein firmware, such as an OS, a program, and various parameters, that is executed by the CPU 11. The RAM 13 is used as, for example, a work region for the CPU 11, and temporarily holds an OS, various applications that are being executed, and various pieces of data that are being processed.

[0072] For example, a display unit 16, an operation acceptance unit 17, a storage 18, and a communication unit 19 are connected to the input/output interface 15.

[0073] The display unit 16 is a display device that uses, for example, a liquid crystal display (LCD) or an organic electroluminescence display (OELD). In the present embodiment, the display unit 16 does not necessarily have to be provided since the 3D display device 110 is provided separately from the program development server 100. When the program development server 100 is integrated with the 3D display device 110, the display unit 16 displays a 3D image of, for example, the three-dimensional object.

[0074] Examples of the operation acceptance unit 17 include a pointing device such as a mouse, a keyboard, a touch panel, and other input devices. When the operation acceptance unit 17 is a touch panel, the touch panel may be integrated with the display unit 16.

[0075] Examples of the storage 18 include a nonvolatile memory such as a hard disk drive (HDD), a flash memory (a solid-state drive; SSD), and other solid-state memories. The storage 18 stores therein the OS, the various applications, and the various pieces of data described above. In particular, in the present embodiment, the storage 18 stores therein a program and data for, for example, an application necessary to perform processing that includes arranging at least a portion of the image processing program 210 of the camera 200 on a three-dimensional object together with profile information, outputting them to the 3D display device 110, and instructing the camera 200 to change the image processing program 210, according to a gesture of a user (a change operation).

[0076] Examples of the communication unit 19 include a network interface card (NIC) for Ethernet, various modules for wireless communication such as a wireless LAN, and other communication interfaces, and the communication unit 19 performs processing of communication among the 3D display device 110, the gesture-input-and-recognition device 120, and the camera 200. In other words, the communication unit 19 serves as an output unit that outputs a three-dimensional object generated by the CPU 11 to the 3D display device 110 and causes the 3D display device 110 to display the three-dimensional object.

[0077] Note that a basic hardware configuration of the camera 200 is substantially the same as the hardware configuration of the program development server 100, although this is not illustrated.

[0078] [Operation of Computer-Program Development System]

[0079] Next, the operation of the computer-program development system having the configuration described above is described, with a focus on the operation of the program development server 100. The operation of the computer-program development system is performed by the hardware of the program development server 100, such as the CPU 11 and the communication unit 19, cooperating with the software stored in the storage 18. In the following descriptions, for convenience, it is assumed that the CPU 11 is primarily involved in performing the operation.

[0080] FIG. 3 is a flowchart of a procedure of the operation of the computer-program development system.

[0081] First, the program development server 100 establishes a connection with the camera 200 through the communication interface described above.

[0082] Next, the CPU 11 of the program development server 100 transmits, to the camera 200, an instruction to execute (to start to run) the image processing program 210 (Step 32). This execution is, for example, an execution of a loop structure of the image processing program 210.

[0083] The camera 200 that has received the execution instruction described above executes (a loop structure of) the image processing program 210 using the calculator 220 (Step 33).

[0084] Next, the camera 200 transmits, to the program development server 100, a result of executing the image processing program 210 and dynamic profile information regarding the image processing program 210 (Step 34). Here, the dynamic profile information includes, for example, an execution time for the loop structure, and the utilization and the power consumption of the calculator 220 upon the execution, but is not limited to them.
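The patent does not fix a concrete data format for this profile information; as a rough sketch, the record the camera returns per loop iteration might look as follows (all field names and units here are assumptions, not taken from the patent):

```cpp
#include <cstdint>
#include <vector>

// Hypothetical per-iteration record of the dynamic profile information
// described above; field names and units are assumptions.
struct IterationProfile {
    uint32_t iteration_id;     // identifies the loop iteration
    double   execution_ms;     // execution time for this iteration
    double   utilization_pct;  // utilization of the calculator 220
    double   power_mw;         // power consumption during execution
};

// Hypothetical payload of Step 34: the execution result plus one
// profile record per executed loop iteration.
struct RunningData {
    bool executed_ok;                          // result of the execution
    std::vector<IterationProfile> iterations;  // dynamic profile information
};
```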

[0085] Upon receiving the execution result and the dynamic profile information (running data), the CPU 11 of the program development server 100 causes, according to the received data, the 3D display device 110 to three-dimensionally display the image processing program 210 in the form of a three-dimensional object (Step 35).

[0086] In this case, the CPU 11 may display the execution result and the dynamic profile information on the three-dimensional object without modification, or may generate analysis data by analyzing the execution result and the dynamic profile information and display the analysis data on the three-dimensional object.
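The patent leaves the analysis itself open; one plausible minimal form (a sketch under that assumption, not the patent's method) is to reduce the per-iteration execution times into aggregate statistics for display:

```cpp
#include <algorithm>
#include <vector>

// Sketch: reduce per-iteration execution times (running data) into the
// kind of aggregate "analysis data" that could be shown on the object.
struct AnalysisData {
    double total_ms = 0.0;  // total time across all iterations
    double mean_ms  = 0.0;  // average per-iteration time
    double worst_ms = 0.0;  // slowest iteration, a natural tuning target
};

AnalysisData analyze(const std::vector<double>& iteration_times_ms) {
    AnalysisData a;
    for (double t : iteration_times_ms) {
        a.total_ms += t;
        a.worst_ms = std::max(a.worst_ms, t);
    }
    if (!iteration_times_ms.empty())
        a.mean_ms = a.total_ms / iteration_times_ms.size();
    return a;
}
```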

[0087] FIG. 4 illustrates an example of the three-dimensional object being displayed by the program development server 100.

[0088] As illustrated in the figure, the program development server 100 displays a three-dimensional object O on which code in a portion of a loop process is displayed, the loop process being a process from among a plurality of processes included in source code of the image processing program 210. The three-dimensional object O has, for example, a rectangular-parallelepiped (box) shape, but the shape of the three-dimensional object O is not limited to this.

[0089] The three-dimensional object O has a structure including a plurality of rectangular parallelepipeds stacked in a predetermined direction (a Z direction). The code in the portion of a loop process is displayed on a face P1 that is orthogonal to a direction of the stacking (the Z direction).

[0090] Each of the stacked rectangular parallelepipeds corresponds to a single loop process included in the image processing program 210.

[0091] Further, an execution result and dynamic profile information with respect to each of the loop processes, or information obtained by analyzing the execution result and the dynamic profile information is displayed on a face P2 (for example, a face extending toward a depth direction) from among four lateral faces that are parallel to the direction of the stacking (the Z direction) of the three-dimensional object O.

[0092] Specifically, on the face P2, the direction of the Z axis in the figure corresponds to an iteration ID that identifies a loop process from among a plurality of loop processes included in the image processing program 210; the Z axis thus also serves as a temporal axis of the image processing program 210. Further, an execution-time bar B is displayed along the direction of the X axis of each rectangular parallelepiped in the figure, the execution-time bar B having a length corresponding to an execution time for the loop structure identified by the iteration ID.

[0093] The execution-time bar B is data generated by the program development server 100 using the execution result and execution-time information regarding an execution time for each loop structure, the execution-time information being included in the dynamic profile information, the execution result and the dynamic profile information being received from the camera 200.
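As a rough illustration of this step, the server could normalize the per-iteration execution times so that the slowest iteration's bar spans the full width of the face. The helper below is hypothetical (not from the patent):

```cpp
#include <algorithm>
#include <vector>

// Hypothetical helper: map each iteration's execution time to the length
// of its execution-time bar B along the X axis of face P2, normalized so
// the slowest iteration spans the full face width.
std::vector<float> bar_lengths(const std::vector<double>& times_ms,
                               float face_width) {
    double worst = 0.0;
    for (double t : times_ms) worst = std::max(worst, t);

    std::vector<float> lengths;
    lengths.reserve(times_ms.size());
    for (double t : times_ms) {
        float frac = worst > 0.0 ? static_cast<float>(t / worst) : 0.0f;
        lengths.push_back(frac * face_width);
    }
    return lengths;
}
```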

[0094] As described above, using a plurality of faces of the three-dimensional object O, the program development server 100 enables a user to intuitively and easily grasp a state of a computer program and a running condition of the computer program, or an analysis result.

[0095] When the loop process has a nested structure (a multiple-loop structure), the program development server 100 identifies an outermost loop process and an innermost loop process using different iteration IDs, and independently displays these iteration IDs on the face P1.

[0096] Returning to FIG. 3, the CPU 11 of the program development server 100 determines whether the gesture-input-and-recognition device 120 detects a gesture of a user on the three-dimensional object O displayed by the 3D display device 110 (Step 36).

[0097] FIG. 5 illustrates a method for operating the three-dimensional object O.

[0098] As illustrated in A of the figure, when a user traces, with his/her finger, an iteration-ID axis (the Z axis in FIG. 4) on the face P2 of the three-dimensional object O, the program development server 100 recognizes this gesture using the gesture-input-and-recognition device 120. When the gesture is recognized, the program development server 100 causes the 3D display device 110 to perform display such that a plate-like iteration object I slightly protrudes in, for example, a direction indicated by an arrow in the figure (a direction of a Y axis in FIG. 4) as represented by broken lines, the iteration object I indicating a profile of the iteration ID corresponding to the position on the iteration-ID axis that is pointed to with the finger of the user.

[0099] Here, the program development server 100 may cause the 3D display device 110 to display profile information regarding the iteration such that the information is transparently visible on, for example, a front face of an iteration object I corresponding to the pointed iteration ID.

[0100] Further, as illustrated in B of the figure, when the program development server 100 recognizes, using the gesture-input-and-recognition device 120, a user’s gesture of taking the protruding iteration object I out of the three-dimensional object O with his/her finger, the program development server 100 moves the protruding iteration object I in a direction indicated by an arrow in the figure, and causes the 3D display device 110 to display the moved iteration object I. This enables the user to take an iteration object I indicating a desired iteration out of the three-dimensional object O.

[0101] Returning to FIG. 3, when the CPU 11 of the program development server 100 determines that the gesture-input-and-recognition device has detected a gesture of the user, and determines that the gesture is an operation of changing a state of displaying the three-dimensional object O, the CPU 11 changes the state of displaying the three-dimensional object according to the gesture. The operation of changing a display state is, for example, the above-described operation of tracing the face P2 over the iteration axis, an operation of holding and pulling a specific iteration object I with a finger, and an operation of changing (rotating) an orientation of displaying the three-dimensional object O (a face displayed in the front). According to these operations, the CPU 11 causes the 3D display device 110 to perform 3D display processing such as causing an iteration object I to slightly protrude from the three-dimensional object O, moving a specific iteration object I in the direction of the Y axis in FIG. 4, and rotating the three-dimensional object O (Step 35).

[0102] On the other hand, when the CPU 11 determines that a gesture of a user has been detected, and determines that the gesture is an operation of changing the image processing program 210, the CPU 11 changes the image processing program 210 according to the gesture (Step 37).

[0103] Here, a specific example of changing the image processing program 210 is described. In the present embodiment, in a state in which a loop structure of the image processing program 210 has been taken out in the form of the iteration object I, it is possible to perform splitting, parallelization, unrolling, vectorization, and reordering as change operations for optimizing the loop structure. These change processes can be performed in a programming language such as “Halide”, but are not limited to this. The processes are described below in the order described above.

[0104] (Splitting Operation)

[0105] FIG. 6 illustrates an example of a splitting operation of a loop structure. This operation is a process of equally splitting a loop structure in the time direction.

[0106] As illustrated on the left in the figure, the exact syntax of the original code is displayed on, for example, an upper face P1 of the iteration object I.

[0107] Subsequently to this state, when, as indicated by an arrow (1) in the figure, a user's gesture of cutting, with his/her hand (in a chopping pose), a lateral face (a face corresponding to the face P2 described above) of the iteration object I along a plane (an XY plane) that is parallel to the upper face P1 of the iteration object I is recognized, the iteration object I is split into two plate-like iteration objects, an upper iteration object I1 and a lower iteration object I2, which are then displayed.

[0108] Subsequently to this state, when, as indicated by an arrow (2) in the figure, a gesture of holding the upper iteration object I1 of the two iteration objects obtained by the splitting, and of displacing the held iteration object I1 in a horizontal direction (to the right in the figure), is recognized, a splitting parameter P is adjusted according to the amount of the displacement.
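In Halide terms, this drag gesture ends up choosing the factor passed to split. The sketch below shows how a displacement-derived parameter P could be applied; the gesture-to-factor mapping and the stand-in loop body are invented assumptions, not the patent's method:

```cpp
#include "Halide.h"
#include <algorithm>
using namespace Halide;

// Hypothetical mapping from the drag displacement to a split factor P;
// here, every 5 mm of displacement doubles P, clamped to [2, 64].
int split_factor_from_displacement(float displacement_mm) {
    int doublings = std::min(5, static_cast<int>(displacement_mm / 5.0f));
    return std::max(2, 2 << doublings);
}

int main() {
    Func f("loop_body");
    Var x("x"), xo("xo"), xi("xi");
    f(x) = x * 2;  // stand-in for the loop body shown on face P1

    int P = split_factor_from_displacement(12.0f);  // a ~12 mm drag -> P = 8
    // The splitting operation itself: the single loop over x becomes an
    // outer loop (xo) and an inner loop (xi) of extent P.
    f.split(x, xo, xi, P);

    f.realize({128});
    return 0;
}
```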

……
……
……
