Patent: Virtualized Product Configuration And Quotation System
Publication Number: 20200302501
Publication Date: 2020-09-24
Applicants: Microsoft
Abstract
Systems and methods are provided for enabling configure, price, quote (CPQ) systems to generate visualizations of configurable products by displaying 3-dimensional virtual views of such objects on a head-mounted display (HMD) device. Users may interact with the displayed views by executing gestures recognized by the HMD device, the gestures corresponding to operations on the virtual view. Embodiments enable the user to add, modify or remove parts of the displayed virtual views of the configurable product. Embodiments may also permit rapid cycling through different configuration options through suitable gesture operations. Embodiments are configured to provide price information within the field of view including the displayed virtual view, the price information corresponding to the current configuration, and being updated as the user changes the active configuration.
BACKGROUND
[0001] Configure, price, quote (CPQ) software solutions often fulfill a critical business process function whereby sellers employ CPQ systems to configure, price and quote configurable products for their customers. The value of a CPQ system becomes particularly apparent where the configurable products are complex and/or where the number of possible configurations is unwieldy. For example, suppose a customer is shopping for a new laptop computer. If the customer chooses a certain base model of computer (e.g., Dell Latitude 3000 series), the available display screen sizes may be limited. Then, given a certain choice of display screen size, a touch screen display may or may not be available. Likewise, the type and quantity of installable system memory may be constrained by the underlying motherboard. CPQ systems are typically quite capable of enumerating the various configurations and calculating prices corresponding to each configuration.
[0002] Such systems, however, typically offer little if any means for rapidly visualizing the various configuration options or illustrating how the product may physically vary from configuration to configuration, and may not simultaneously permit rapid determination of how configuration changes impact product price. For example, visualization has historically required drafting CAD drawings or the like. In many cases, however, such drawings may be of little use for verification and validation purposes, or where the configured products are intended to be placed in a particular physical location. Likewise, such drawings represent a static view of a particular configuration available at a particular time based on sub-components available at that time, and can indicate only the configuration price available at that time.
SUMMARY
[0003] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0004] Methods, systems and apparatuses are provided that address limitations of current configure, price, quote (CPQ) systems inasmuch as such systems are incapable of providing a virtual 3-dimensional visualization of a configurable product, particularly where the visualization includes dynamically updating price information as the configuration is modified.
[0005] In aspects, methods are provided that enable virtual configuration of a configurable object via a head-mounted display device. In one aspect, 3-dimensional (“3D”) models of the configurable object to be configured are received, along with an initial configuration state for the configurable object and the price for an instance of the configurable object having the initial configuration. A 3D rendering of the configurable object, along with the corresponding price, is displayed in the forward field of view of a head-mounted display device, wherein the rendering reflects the received initial configuration state. Configuration changes are accepted for the configurable object, the 3D rendering of the configurable object as displayed by the head-mounted display device is modified to reflect the received configuration change, and the displayed price corresponding to the modified configuration is likewise updated. A final configuration may be generated based upon the selection of a particular configuration, wherein the final configuration forms a basis for a price quote for a purchase of physical instances of the configurable object configured per the final configuration. In another aspect, the virtual images are rendered and displayed in the forward field of view of a head-mounted display device such that the virtual image is superimposed on an instance of the configurable object present in the physical environment visible in the forward field of view. In an aspect, configuration changes may be received by receiving gesture data from the head-mounted display device, or an associated input device, and identifying a configuration change based on an operation associated with a gesture corresponding to the received gesture data.
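By way of non-limiting illustration, the following sketch outlines one possible realization of the flow just described. All identifiers used (e.g., cpq_api, hmd, ConfigurationState) are hypothetical stand-ins and are not part of the disclosed embodiments.

```python
# Illustrative only: one possible realization of the flow described in
# [0005]. All names here are hypothetical, not a required implementation.
from dataclasses import dataclass

@dataclass
class ConfigurationState:
    product_id: str
    options: dict    # e.g., {"color": "silver", "memory_gb": 16}
    price: float

def run_virtual_configuration(hmd, cpq_api, product_id):
    # Receive 3D models, the initial configuration state, and its price.
    models = cpq_api.get_models(product_id)
    state = cpq_api.get_initial_configuration(product_id)

    # Display a 3D rendering, with price, in the forward field of view.
    hmd.render(models, state.options, price_overlay=state.price)

    # Accept configuration changes (e.g., derived from recognized
    # gestures) until the user settles on a final configuration.
    while True:
        change = hmd.next_configuration_change()  # None once accepted
        if change is None:
            break
        state.options.update(change)
        state.price = cpq_api.price_for(product_id, state.options)
        hmd.render(models, state.options, price_overlay=state.price)

    # The final configuration forms the basis for a price quote.
    return cpq_api.generate_quote(product_id, state.options, state.price)
```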
[0006] In one implementation, a virtualized configuration system includes a head-mounted display device, a model database including 3D models of a configurable object to be configured, a configuration database including configuration variations for the configurable object and further including corresponding pricing information, a configuration management component that exposes an application programming interface (API) configured to provide access to the model and configuration databases, and a virtualized configuration application component. In one aspect, the virtualized configuration application component is configured to receive via the API 3D models corresponding to the configurable object, and the initial configuration variation and price for the configurable object under configuration. The virtualized configuration application component may be further configured to render or cause to be rendered by the head-mounted display device a virtual image of the configurable object configured according to the initial configuration, where the virtual image likewise includes the price and the virtual image is superimposed on the forward field of view of the head-mounted display device.
[0007] Further features and advantages, as well as the structure and operation of various examples, are described in detail below with reference to the accompanying drawings. It is noted that the ideas and techniques are not limited to the specific examples described herein. Such examples are presented herein for illustrative purposes only. Additional examples will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein.
BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
[0008] The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate embodiments of the present application and, together with the description, further serve to explain the principles of the embodiments and to enable a person skilled in the pertinent art to make and use the embodiments.
[0009] FIG. 1 depicts a virtualized configuration system, according to an embodiment.
[0010] FIG. 2 depicts a schematic view of a mixed reality configuration system, according to an embodiment.
[0011] FIG. 3 depicts an example head-mounted display device, according to an embodiment.
[0012] FIG. 4 depicts a functional diagram of example mixed reality head-mounted display optics, according to an embodiment.
[0013] FIG. 5 depicts a schematic view of a user wearing the head-mounted display device of FIG. 2 and viewing an example configuration environment, according to an embodiment.
[0014] FIG. 6 depicts the schematic view of FIG. 5 including mixed reality augmentation of the example configuration environment, according to an embodiment.
[0015] FIG. 7 depicts a flowchart of a method for virtual configuration of a configurable object via a head-mounted display device, according to an embodiment.
[0016] FIG. 8 is a block diagram of an example computer system in which embodiments may be implemented.
[0017] The features and advantages of embodiments will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the corresponding reference number.
DETAILED DESCRIPTION
I. INTRODUCTION
[0018] The following detailed description discloses numerous embodiments. The scope of the present patent application is not limited to the disclosed embodiments, but also encompasses combinations of the disclosed embodiments, as well as modifications to the disclosed embodiments.
[0019] References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
[0020] Numerous exemplary embodiments are described as follows. It is noted that any section/subsection headings provided herein are not intended to be limiting. Embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection. Furthermore, embodiments disclosed in any section/subsection may be combined with any other embodiments described in the same section/subsection and/or a different section/subsection in any manner.
II. EXAMPLE EMBODIMENTS
[0021] The example embodiments described herein are provided for illustrative purposes and are not limiting. The examples described herein may be adapted to any type of CPQ system. Further structural and operational embodiments, including modifications/alterations, will become apparent to persons skilled in the relevant art(s) from the teachings herein.
[0022] Conventional CPQ systems are typically more than adequate for many configuration, pricing and quotation tasks, particularly when configurable products have only limited configuration options. In such instances, a typical process flow for generating a price quote may proceed as follows. First, a sales representative may call or otherwise communicate with a customer to receive a list of product requirements. Second, the sales representative may use the CPQ system to input the constraints imposed by the product requirements, and receive in turn a list of configurable products along with a list of specific configurations for each configurable product, and including price information for each. Finally, the sales representative may generate a price quote for one or more of the configurable products, and provide such quotes to the customer for consideration.
[0023] The above-described process can certainly suffice where the configurable products have relatively few configurable options, or in situations where the specific physical features and/or aesthetics of the product are relatively unimportant. For example, some configurable products such as computer system memory DIMMs must ordinarily conform to tight dimensional specifications, and the aesthetics of such DIMMs are generally irrelevant because the DIMMs are wholly invisible inside the computer. In such a case, the inability of a conventional CPQ system to provide adequate visualization and manipulation capabilities may be unimportant.
[0024] For more complex configurable products, manual configuration may be difficult. Moreover, even where enumerating configuration alternatives may be relatively straightforward, it may be difficult for a customer to imagine what the final product looks like, or what it would look like in a particular location. Further, in situations where physical review or inspection of various configurations for a product would be preferable, it may not be feasible to do so as the number of configuration permutations grows.
[0025] To address the current shortcomings of CPQ systems, embodiments described herein are enabled to provide CPQ systems capable of producing configurable product visualizations by rendering on a display device virtual instances of configurable products. Embodiments may, for example, render such virtual instances as 3-dimensional (3D) views on a suitably equipped head-mounted display (HMD) device. Such rendered instances may, as discussed herein below, comprise or be incorporated into any of virtual reality (VR) content, augmented reality (AR) content, or mixed reality (MR) content.
[0026] Moreover, embodiments may permit interaction with the displayed virtual instances of the configurable products, whereby the user of the HMD device may provide gestures of one type or another for performing corresponding operations on the displayed virtual product. For example, such gestures may trigger rotation of the product, opening or closing parts, pushing buttons, turning wheels, or otherwise interacting with manipulable portions of the displayed instance and causing actions to be taken.
[0027] Likewise, embodiments may enable the user to add or remove optional parts of the displayed instance of the configurable product. Embodiments may also permit rapid cycling through different configuration options through suitable gesture operations. For example, embodiments may permit rapid visualization of different color options for the configurable product (or sub-portions thereof) by enabling the user to trigger a color change with a virtual “double tap” of the rendered configurable product.
[0028] Embodiments may also include price information within the rendered instance of the configurable product, wherein the price information corresponds to configuration being viewed and further wherein the displayed price information is updated in real-time as the user cycles through various configuration options for the configurable product.
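Continuing the hypothetical sketch introduced after [0005], the handler below shows how a recognized “double tap” might cycle a color option while the displayed price tracks the active configuration. The option list, gesture name, and function names are assumptions rather than features required by the disclosure.

```python
# Hypothetical gesture handler: a "double tap" advances to the next color
# option and re-prices the configuration so the price overlay stays current.
COLOR_OPTIONS = ["silver", "black", "red"]  # assumed option list

def on_gesture(gesture, state, cpq_api, hmd, models):
    if gesture == "double_tap":
        i = COLOR_OPTIONS.index(state.options["color"])
        state.options["color"] = COLOR_OPTIONS[(i + 1) % len(COLOR_OPTIONS)]
        # Update the displayed price in real time as the configuration changes.
        state.price = cpq_api.price_for(state.product_id, state.options)
        hmd.render(models, state.options, price_overlay=state.price)
```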
[0029] Enabling a CPQ system to allow users to perform the functions described herein above may be accomplished in numerous ways. For example, FIG. 1 depicts a virtualized configuration system 100, according to an embodiment. Virtualized configuration system 100 includes a configuration database 102, a configuration management component 104, a 3-dimensional (3D) model database 110, a virtualized configuration application component 108, and a head-mounted display device 106 (hereinafter “HMD device 106”).
[0030] In embodiments, configuration database 102 is configured to store configuration data 112 which may comprise any data or metadata related to the available configurations for the available configurable products. Moreover, configuration data 112 also includes price information that comprises, or may be used at least in part to generate, the price for a particular configuration of a particular configurable product. As described in detail herein below, such configuration data 112 may be retrieved by configuration management component 104 in response to one or more requests 114 from virtualized configuration application component 108. In another embodiment, configuration data 112 may be pushed to virtualized configuration application component 108.
[0031] In embodiments, 3D model database 110 is configured to store 3D models 118 for each configurable product and/or 3D models for each configurable part or sub-portion of each configurable product. In embodiments, 3D models 118 enable virtualized configuration application component 108 and/or HMD display device 106 to render a 3D virtual instance of the chosen configurable product, and to thereafter modify or otherwise re-render the displayed instance of the configurable product. As described in detail herein below, such 3D models 118 may be retrieved by configuration management component 104 in response to one or more requests 114 from virtualized configuration application component 108. In another embodiment, 3D models 118 may be pushed to virtualized configuration application component 108.
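For illustration, the records below sketch one possible shape for configuration data 112 and 3D models 118. The field names are hypothetical; the disclosure does not prescribe any particular schema.

```python
# Hypothetical record shapes for configuration data 112 and 3D models 118.
from dataclasses import dataclass, field

@dataclass
class ConfigurationRecord:      # one entry of configuration data 112
    product_id: str
    option_values: dict         # e.g., {"color": ["silver", "black"]}
    constraints: list = field(default_factory=list)  # inter-option rules
    base_price: float = 0.0
    option_prices: dict = field(default_factory=dict)  # per-option price deltas

@dataclass
class ModelRecord:              # one entry of 3D models 118
    product_id: str
    part_id: str                # whole product or a configurable sub-part
    mesh_uri: str               # location of the 3D asset
```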
[0032] Configuration database 102 and 3D model database 110 may each comprise any type of datastore that enables the storage and retrieval of their respective data according to one or more match criteria. For example, configuration database 102 and 3D model database 110 may each comprise a relational database system (e.g., MySQL), a graph database (e.g., Neo4j), a hierarchical database system (e.g., JET Blue) or various types of file systems. Likewise, although each is depicted as a single database, configuration database 102 and 3D model database 110 may each comprise one or more databases that may be organized in any manner, both physically and virtually. In an embodiment, configuration database 102 and/or 3D model database 110 may comprise any number of servers, and may include any type and number of other resources, including resources that facilitate communications with and between the servers, configuration management component 104, and any other necessary components. Servers of configuration database 102 and/or 3D model database 110 may be organized in any manner, including being grouped in server racks (e.g., 8-40 servers per rack, referred to as nodes or “blade servers”), server clusters (e.g., 2-64 servers, 4-8 racks, etc.), or datacenters (e.g., thousands of servers, hundreds of racks, dozens of clusters, etc.). In an embodiment, the servers of configuration database 102 and/or 3D model database 110 may be co-located (e.g., housed in one or more nearby buildings with associated components such as backup power supplies, redundant data communications, environmental controls, etc.) to form a datacenter, or may be arranged in other manners. Accordingly, in an embodiment, configuration database 102 and/or 3D model database 110 may comprise a datacenter in a distributed collection of datacenters.
[0033] In embodiments, configuration management component 104 is communicatively coupled to configuration database 102, 3D model database 110 and virtualized configuration application component 108 and may be configured to perform CPQ system functions. For example, configuration management component 104 may be configured to retrieve configuration data 112 and/or 3D models 118, and deliver the same to virtualized configuration application component 108 in response to system 100 being directed to render virtual instances of a particular configurable product. Although depicted as a monolithic component, configuration management component 104 may comprise any type and number of other resources, including resources that facilitate communications with and between the servers of configuration database 102 and/or 3D model database 110, and virtualized configuration application component 108, and any other necessary components. Moreover, embodiments of configuration management component 104 may be constituted, organized and co-located in any of the manners described herein above in relation to configuration database 102 and/or 3D model database 110.
[0034] In embodiments, and as discussed above, virtualized configuration application component 108 is configured to make requests 114 and receive 3D models 118 and configuration data 112 from configuration management component 104. Requests 114 may arise in conjunction with a user selecting a particular configurable product and/or configuration for visualization with HMD device 106. In embodiments, virtualized configuration application component 108 may be further configured to provide 3D models 118 and configuration data 112 to HMD device 106 for processing and display. Alternatively, virtualized configuration application component 108 may be configured to process 3D models 118 and configuration data 112 local to virtualized configuration application component 108, and then transfer displayable data directly to HMD device 106 via a suitable media interface (e.g., HDMI or DVI) for display. Of course, other structural and operational embodiments will be apparent to persons skilled in the relevant art(s).
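Under assumed names (mgmt for configuration management component 104, hmd for HMD device 106, render_frames for a local renderer), requests 114 and the two delivery modes just described might be sketched as follows.

```python
# Illustrative sketch of requests 114 and the two delivery modes.

def render_frames(models, config):
    """Placeholder for a renderer local to application component 108."""
    raise NotImplementedError

def visualize(product_id, mgmt, hmd, render_locally=False):
    models = mgmt.get_models(product_id)         # 3D models 118
    config = mgmt.get_configuration(product_id)  # configuration data 112
    if render_locally:
        # Process locally, then transfer displayable data to the HMD
        # via a media interface (e.g., HDMI or DVI).
        hmd.display_frames(render_frames(models, config))
    else:
        # Provide models and configuration data to the HMD for
        # on-device processing and display.
        hmd.load(models, config)
```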
[0035] HMD device 106 may comprise any type of head-mounted display device suitable for presenting 3D virtual reality, augmented reality or mixed reality content to the user. In embodiments, and as discussed in detail below, HMD device 106 may be enabled to detect gestures made by the user and communicate gesture data to virtualized configuration application component 108 for subsequent processing and action. For example, and as described above, embodiments of HMD device 106 may be enabled to capture video of the forward field of view of the HMD device, to process the video to detect and identify gestures (pre-defined motions with the hands or arms), and having detected gestures, to perform an operation on the rendered virtual image. Additionally or alternatively, user gestures may be detected by one or more user input devices (e.g., motion controllers, clickers, gamepads, or the like) used in conjunction with HMD device 106. More detailed aspects of embodiments are described herein below.
[0036] In an embodiment, configuration database 102 and/or 3D model database 110 and/or configuration management component 104 may be components in a pre-existing CPQ system, and virtualized configuration application component 108 is specifically adapted to use any existing methods of access to that CPQ system. In an alternative embodiment, however, configuration database 102 and 3D model database 110 may be components in an existing CPQ system, and configuration management component 104 serves as a glue layer between the CPQ system and virtualized configuration application component 108. For example, configuration management component 104 may expose an application programming interface (API) for consumption by virtualized configuration application component 108 for accessing CPQ databases. In this manner, configuration management component 104 serves to adapt different CPQ systems to the needs of virtualized configuration application component 108 without the need of virtualized configuration application component 108 having any knowledge of the underlying CPQ system.
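As a non-limiting example, the API exposed by configuration management component 104 might resemble the interface below. The method names are hypothetical; a concrete adapter for each CPQ system would implement this interface by translating the calls into that system’s native queries, keeping virtualized configuration application component 108 independent of the underlying CPQ system.

```python
# Hypothetical glue-layer interface; illustrative only.
from abc import ABC, abstractmethod

class ConfigurationManagementAPI(ABC):
    @abstractmethod
    def get_models(self, product_id: str) -> list:
        """Return 3D models 118 for the given configurable product."""

    @abstractmethod
    def get_configuration(self, product_id: str) -> dict:
        """Return configuration variations and pricing (configuration data 112)."""

    @abstractmethod
    def price_for(self, product_id: str, options: dict) -> float:
        """Return the price of a specific configuration."""
```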
[0037] Further operational aspects of system 100 of FIG. 1 will now be discussed in conjunction with FIG. 2 which depicts a schematic view of a mixed reality configuration system 200, according to an embodiment. Although described with reference to system 100 of FIG. 1, mixed reality configuration system 200 is not limited to that implementation. Other structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the following discussion regarding system 200 of FIG. 2.
[0038] FIG. 2 shows a schematic view of one embodiment of a mixed reality configuration system 200. Mixed reality configuration system 200 includes a computing device 202 and HMD device 106. Computing device 202 includes a mass storage 204, a memory 210 and a processor 212. Mass storage 204 may include one or more of any type of storage mechanism, including a magnetic disc (e.g., in a hard disk drive), an optical disc (e.g., in an optical disk drive), a magnetic tape (e.g., in a tape drive), a memory device such as a RAM device, a ROM device, etc., and/or any other suitable type of storage medium.
[0039] Computing device 202 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., a Microsoft® Surface® device, a laptop computer, a notebook computer, a tablet computer such as an Apple iPad™, a netbook, etc.), home entertainment computer, interactive television, gaming system, or other suitable type of computing device. Additional details regarding the components and computing aspects of example computing devices are described in more detail below with reference to FIG. 8.
[0040] Furthermore, although computing device 202 and HMD device 106 may generally be described herein as separate devices, embodiments may combine computing device 202 and HMD device 106 into a single device such as, for example, a head-mounted device such as Microsoft® HoloLens® or so-called smart glasses such as Google® Glass™.
[0041] Mixed reality configuration system 200 includes a virtualized configuration application component 108 that may be stored in mass storage 204 of computing device 202, in an embodiment. Embodiments of virtualized configuration application component 108 may be loaded into memory 210 and executed by processor 212 of computing device 202 to perform one or more of the methods and processes described in more detail below.
[0042] Virtualized configuration application component 108 may generate a virtual environment 206 for display on a display device, such as HMD device 106, to create a mixed reality environment 222. Virtual environment 206 includes one or more virtual images, such as two-dimensional virtual objects and three-dimensional holographic objects. In the present example, virtual environment 206 includes virtual objects in the form of selectable virtual objects 208. As described in more detail below with respect to FIG. 3, selectable virtual objects 208 may correspond to identifiable and/or manipulable targets that may be rendered by virtualized configuration application component 108 within the forward field of view of mixed reality environment 222. More specifically, virtual objects 208 may comprise a 3D rendering of a physical object. Alternatively, in embodiments, virtual objects 208 may also comprise a sub-portion of a rendered virtual object, and wherein the sub-portions may be manipulated independently of the entire virtual object. Virtual objects 208 may also comprise, and as will be discussed in greater detail below, a mixed reality virtual object rendered on HMD device 106 to modify the apparent appearance of a real, physical object visible within the forward field of view.
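A minimal sketch of one possible representation of selectable virtual objects 208 follows, assuming a simple tree of named parts so that sub-portions can be located and manipulated independently of the whole; this structure is an assumption, not a required design.

```python
# Hypothetical scene-node structure for selectable virtual objects 208.
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    name: str
    mesh_uri: str
    transform: list                               # e.g., a 4x4 matrix
    children: list = field(default_factory=list)  # manipulable sub-parts
    selectable: bool = True

    def find(self, name):
        """Locate a sub-part (e.g., "driver_door") for independent manipulation."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit is not None:
                return hit
        return None
```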
[0043] Computing device 202 may be operatively connected with HMD device 106 in a variety of ways. For example, computing device 202 may be connected with HMD device 106 via a wired connection such as, e.g., Ethernet, Universal Serial Bus (USB), DisplayPort, FireWire, and the like. Alternatively, computing device 202 and HMD device 106 may be operatively connected via a wireless connection. Examples of such connections may include IEEE 802.11 wireless LAN (WLAN), Worldwide Interoperability for Microwave Access (WiMAX), cellular network, Bluetooth™, or near field communication (NFC). It should be understood, of course, that the abovementioned examples for coupling HMD device 106 with computing device 202 are applicable only in embodiments where computing device 202 and HMD device 106 are physically distinct devices.
[0044] Note also that the foregoing general description of the operation of system 200 is provided for illustration only, and embodiments of system 200 may comprise different hardware and/or software, and may operate in manners different than described above. Indeed, embodiments of system 200 may include various types of HMD device 106.
[0045] For example, and with continued reference to FIG. 2, FIG. 3 depicts an example HMD device 106, according to an embodiment. In particular, HMD device 106 as shown in FIG. 3 takes the form of a pair of wearable glasses with a display 302. It will be appreciated that in other examples, and as will be discussed herein below, HMD device 106 may take other suitable forms in which a transparent, semi-transparent or non-transparent display is supported in front of a viewer’s eye or eyes. Additionally, many other types and configurations of display devices having various form factors may also be used within the scope of the present disclosure. As discussed in more detail below, such display devices may include, but are not limited to, smart phones, tablet computers, and other suitable display devices.
[0046] Again, with reference to FIGS. 1 and 2, the example HMD device 106 of FIG. 3 includes display system 230 of FIG. 2 (not shown in FIG. 3), a display 302, lenses 304, an inward facing sensor 306, outward facing sensors 308, microphones 310, motion sensors 312, a processor 314 and speakers 316.
[0047] In embodiments, display system 230 and display 302 are configured to enable virtual images to be delivered to the eyes of a user in various ways. For example, display system 230 and display 302 may be configured to display virtual images that are wholly computer generated. This type of rendering and display is typically referred to as “virtual reality” since the visual experience is wholly synthetic and objects perceived in the virtual world are not related or connected to physical objects in the real world.
[0048] In another embodiment, display system 230 and display 302 may be configured to display virtual images that are a combination of images of the real, physical world and computer-generated graphical content, whereby the appearance of the physical environment may be augmented by such graphical content. This type of rendering and display is typically referred to as “augmented reality.”
[0049] In still another embodiment, display system 230 and display 302 may also be configured to enable a user to view a physical, real-world object in physical environment 224. Physical environment 224 comprises all information and properties of the real-world environment corresponding to the forward field of view of HMD device 106, whether such information and properties are directly or indirectly perceived by the user. That is, physical environment 224 is sensed by the user and one or more cameras and/or sensors of the system, and none of physical environment 224 is created, simulated, or otherwise computer generated.
[0050] In an embodiment, a user may be enabled to view the physical environment while wearing HMD device 106 where, for example, display 302 includes one or more partially transparent pixels that are displaying a virtual object representation while simultaneously allowing light from real-world objects to pass through lenses 304 and be seen directly by the user. In one example, display 302 may include image-producing elements located within lenses 304 (such as, for example, a see-through Organic Light-Emitting Diode (OLED) display). Other means for combining computer images with real-world views will be discussed herein below regarding FIG. 4. This combination of real-world views and computer-generated graphics is usually referred to as “mixed reality.” It should be noted that although superficially similar, mixed reality and augmented reality differ in that, in mixed reality, physical, real-world objects are directly viewed. In augmented reality, on the other hand, although the user perceives a view of the real world, the view is not directly perceived by the user, but instead is typically a captured view of the real world. For example, although photos and videos of the real world may be augmented with computer graphics and displayed, the real-world objects in the photos and videos are not directly perceived, only the augmented reproduction.
[0051] Embodiments of HMD device 106 may also include various sensors and related systems. For example, HMD device 106 may include an eye-tracking sensor system (not shown in FIG. 3) that utilizes at least one inward facing sensor 306. Inward facing sensor 306 may be an image sensor that is configured to acquire image data in the form of eye-tracking information from a user’s eyes. Provided the user has consented to the acquisition and use of this information, the eye-tracking sensor system may use this information to track a position and/or movement of the user’s eyes.
[0052] In one example, an eye-tracking system 232 of HMD device 106 may include a gaze detection subsystem configured to detect a direction of gaze of each eye of a user. The gaze detection subsystem may be configured to determine gaze directions of each of a user’s eyes in any suitable manner. For example, the gaze detection subsystem may comprise one or more light sources, such as infrared light sources, configured to cause a glint of light to reflect from the cornea of each eye of a user. One or more image sensors may then be configured to capture an image of the user’s eyes. Images of the glints and of the pupils as determined from image data gathered from the image sensors may be used to determine an optical axis of each eye. Using this information, an eye-tracking sensor system may then determine a direction and/or at what physical object or virtual object the user is gazing. Captured or derived eye-tracking data may then be provided to virtualized configuration application component 108 as eye tracking data 214 as shown in FIG. 2. It should be understood that a gaze detection subsystem may have any suitable number and arrangement of light sources and image sensors.
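By way of example only, one well-known technique consistent with the glint-and-pupil description above is pupil-center/corneal-reflection (PCCR) mapping, in which the offset between the pupil center and the corneal glint is mapped through a per-user calibration to a gaze point. The quadratic calibration form sketched below is an assumption, not a requirement of eye-tracking system 232.

```python
# PCCR sketch: map the pupil-glint offset to display coordinates via a
# polynomial fitted during a per-user calibration routine.
def gaze_point(pupil_xy, glint_xy, coeffs):
    """coeffs: a pair of 6-tuples (for x and y) of a quadratic mapping."""
    dx = pupil_xy[0] - glint_xy[0]
    dy = pupil_xy[1] - glint_xy[1]
    basis = (1.0, dx, dy, dx * dy, dx * dx, dy * dy)
    x = sum(c * b for c, b in zip(coeffs[0], basis))
    y = sum(c * b for c, b in zip(coeffs[1], basis))
    return x, y
```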
[0053] HMD device 106 may also include sensor systems that receive physical environment data 228 from physical environment 224. For example, HMD device 106 may include optical sensor system 236 of FIG. 2 that utilizes at least one of outward facing sensors 308, such as an optical sensor (i.e., a camera sensor). Outward facing sensors 308 may also capture two-dimensional image information and depth information from a physical environment and physical objects within the environment. For example, outward facing sensors 308 may include a depth camera, a visible light camera, an infrared light camera, and/or a position tracking camera, in embodiments.
[0054] Outward facing sensors 308 of HMD device 106 may also provide depth sensing image data via one or more depth cameras. In one example, each depth camera may include left and right cameras of a stereoscopic vision system. Time-resolved images from one or more of these depth cameras may be provided, for example, to virtualized configuration application component 108 as image data 216 for further processing. For example, such images included in image data 216 may be registered to each other and/or to images from another optical sensor such as a visible spectrum camera, and then combined to yield depth-resolved video.
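For the stereoscopic case described above, depth follows from the disparity between matched left and right image points by the standard relationship Z = f·B/d; the sketch below is illustrative only.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters: Z = f * B / d, with focal length f in pixels,
    camera baseline B in meters, and disparity d in pixels (d > 0)."""
    return focal_px * baseline_m / disparity_px
```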
[0055] In other examples, a structured light depth camera may be configured to project a structured infrared illumination, and to image the illumination reflected from a scene onto which the illumination is projected. The captured images may likewise be provided to virtualized configuration application component 108 as image data 216 for construction of a depth map of the scene based on spacings between adjacent features in the various regions of an imaged scene. In still other examples, a depth camera may take the form of a time-of-flight depth camera configured to project a pulsed infrared illumination onto a scene and detect the illumination reflected from the scene. It will be appreciated that any other suitable depth camera may be used within the scope of the present disclosure.
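Likewise, a time-of-flight camera recovers depth directly from the round-trip time of the reflected pulse, since the light travels to the scene and back:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_depth(round_trip_seconds: float) -> float:
    """Depth in meters from a pulse's round-trip time: d = c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```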
[0056] Outward facing sensors 308 may detect movements within their field of view, such as gesture-based inputs or other movements performed by a user or by a person or physical object within the forward field of view. For example, outward facing sensors 308 may capture images as described above, determine that motion detectable within some portion of the captured image may match one or more pre-defined gesture definitions, and provide gesture-related information to virtualized configuration application component 108 as gesture data 218. Gesture data 218 may comprise the gesture-related images captured by outward facing sensors 308, depth information of gesture targets, image coordinates defining the gesture target, and the like as understood by those skilled in the relevant art(s). Gesture data 218 may then be analyzed or processed, alone or in combination with image data 216, by virtualized configuration application component 108 to identify the gesture and the corresponding operation to perform.
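The disclosure leaves the recognition method open. As one possibility, trajectories derived from gesture data 218 could be compared against pre-defined gesture templates, as in the simplified sketch below (resampling only, with no rotation or scale normalization; the match threshold is an arbitrary assumption).

```python
# Simplified template matcher for gesture trajectories; illustrative only.
import math

def resample(points, n=32):
    """Resample a 2D trajectory to n evenly spaced points."""
    pts = [tuple(p) for p in points]  # work on a copy
    total = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    if total == 0:
        return [pts[0]] * n
    step, acc, out = total / (n - 1), 0.0, [pts[0]]
    for i in range(1, len(pts)):
        d = math.dist(pts[i - 1], pts[i])
        while d > 0 and acc + d >= step:
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts[i - 1], d, acc = q, d - (step - acc), 0.0
        acc += d
    while len(out) < n:  # guard against floating-point rounding loss
        out.append(pts[-1])
    return out[:n]

def classify(trajectory, templates, threshold=40.0):
    """Return the name of the closest pre-defined gesture, or None."""
    pts = resample(trajectory)
    best_name, best_cost = None, float("inf")
    for name, template in templates.items():
        cost = sum(math.dist(a, b) for a, b in zip(pts, resample(template)))
        cost /= len(pts)
        if cost < best_cost:
            best_name, best_cost = name, cost
    return best_name if best_cost < threshold else None
```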
……
……
……