
Search form

Supports entry of multiple search phrases (one per line). The search provides morphological support for Russian and English.
Total found: 45. Displayed: 45.
Publication date: 26-01-2016

Interactions of virtual objects with surfaces

Number: US0009245388B2

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.
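
As a rough illustration of the attach/detach behavior this abstract describes, the sketch below attaches a free-floating object to the nearest surface that its policy permits once it comes within a threshold distance, and detaches it on request. The threshold value, the plane-based Surface type, and the snapping step are assumptions, not the patented implementation.

```python
import numpy as np
from dataclasses import dataclass
from typing import Optional

ATTACH_DISTANCE = 0.15  # metres; assumed trigger threshold

@dataclass
class Surface:
    point: np.ndarray            # a point on the surface plane
    normal: np.ndarray           # unit normal of the plane
    allowed_by_policy: bool      # outcome of comparing the surface to the object's policy

    def distance_to(self, p: np.ndarray) -> float:
        return abs(float(np.dot(p - self.point, self.normal)))

@dataclass
class VirtualObject:
    position: np.ndarray
    attached_to: Optional[Surface] = None

def update_attachment(obj: VirtualObject, surfaces: list, detach_requested: bool) -> None:
    """Attach a free-floating object to a nearby permitted surface, or detach it on request."""
    if detach_requested:
        obj.attached_to = None                      # display as free-floating again
        return
    if obj.attached_to is None:
        for s in surfaces:
            # trigger: object within a threshold distance of a surface its policy allows
            if s.allowed_by_policy and s.distance_to(obj.position) < ATTACH_DISTANCE:
                obj.attached_to = s
                # snap the object onto the plane so it is displayed as attached
                obj.position = obj.position - np.dot(obj.position - s.point, s.normal) * s.normal
                break
```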

Publication date: 15-09-2011

ITERATIVELY LOCATING A POSITION CORRESPONDING TO A DESIRED SEEK TIME

Number: US20110225186A1
Assignee: Microsoft Corporation

Techniques enable locating a position within a file that corresponds to a desired seek time without having access to an index specifying the desired seek time's position. An iterative process may be used to estimate the position that corresponds to the desired seek time. The process may iterate through multiple estimations until a difference between a time corresponding to an estimated position and the desired seek time is within an acceptable amount or until the process reaches an iteration threshold. The file may then be played beginning at or near the desired seek time. The techniques may therefore allow a user to seek within a file while the user progressively downloads or streams the file.
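
The iterative estimation can be sketched as interpolation search over byte offsets. The helper read_time_at (returning the media timestamp of the sample found at a given offset) is hypothetical, and the tolerance and iteration cap stand in for what the abstract calls "an acceptable amount" and "an iteration threshold".

```python
def locate_seek_position(file_size: int, duration: float, target_time: float,
                         read_time_at, tolerance: float = 0.25, max_iterations: int = 16) -> int:
    """Estimate the byte offset whose timestamp is close to target_time, without an index."""
    duration = max(duration, 1e-9)
    lo, lo_t = 0, 0.0
    hi, hi_t = file_size, duration
    # first guess: assume a roughly constant bitrate
    estimate = int(file_size * (target_time / duration))
    for _ in range(max_iterations):
        t = read_time_at(estimate)                 # hypothetical: time of sample at/after offset
        if abs(t - target_time) <= tolerance:
            break                                  # close enough; playback can start near here
        if t < target_time:
            lo, lo_t = estimate, t                 # narrow the search window from below
        else:
            hi, hi_t = estimate, t                 # narrow it from above
        span_t = max(hi_t - lo_t, 1e-9)
        estimate = lo + int((hi - lo) * (target_time - lo_t) / span_t)
        estimate = min(max(estimate, lo + 1), hi - 1)
    return estimate
```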

Publication date: 19-12-2017

Holographic bird's eye view camera

Number: US0009846968B2

A system and method are disclosed for capturing views of a mixed reality environment from various perspectives which can be displayed on a monitor. The system includes one or more physical cameras at user-defined positions within the mixed reality environment. The system renders virtual objects in the mixed reality environment from the perspective of the one or more cameras. Real and virtual objects from the mixed reality environment may then be displayed from the perspective of the one or more cameras on one or more external, 2D monitor for viewing by others.
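
A rough sketch of the compositing step: virtual objects registered in the shared scene map are rendered from the physical camera's pose and layered over that camera's real-world frame. The render_virtual_layer helper, the pose representation, and the alpha-blend are assumptions for illustration only.

```python
import numpy as np

def composite_third_person_view(camera_pose: np.ndarray,        # 4x4 camera-to-world transform
                                camera_intrinsics: np.ndarray,  # 3x3 projection parameters
                                real_frame: np.ndarray,         # HxWx3 image from the physical camera
                                scene_virtual_objects,
                                render_virtual_layer) -> np.ndarray:
    """Render registered virtual objects from the camera's perspective and overlay them."""
    view = np.linalg.inv(camera_pose)                    # world-to-camera matrix
    # render_virtual_layer is a hypothetical renderer returning an HxWx4 RGBA layer
    rgba = render_virtual_layer(scene_virtual_objects, view, camera_intrinsics,
                                real_frame.shape[1], real_frame.shape[0])
    alpha = rgba[..., 3:4] / 255.0
    out = real_frame * (1.0 - alpha) + rgba[..., :3] * alpha   # holograms over video
    return out.astype(np.uint8)
```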

Publication date: 12-10-2021

Systems and methods for detecting collaborative virtual gestures

Number: US0011146661B2
Assignee: Rec Room Inc.

An endpoint system including one or more computing devices receives user input associated with an avatar in a shared virtual environment; calculates, based on the user input, motion for a portion of the first avatar, such as a hand; determines, based on the user input, a first gesture state for first avatar; transmits first location change notifications and a representation of the first gesture state for the first avatar; receives second location change notifications and a representation of a second gesture state for a second avatar; detects a collision between the first avatar and the second avatar based on the first location change notifications and the second location change notifications; and identifies a collaborative gesture based on the detected collision, the first gesture state, and the second gesture state.
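
A compact sketch of the detection step: using the positions exchanged in location-change notifications, an endpoint tests the two hands for a collision and, if they collide, maps the pair of gesture states to a collaborative gesture. The collision radius and the gesture table are illustrative assumptions.

```python
import numpy as np

COLLISION_RADIUS = 0.12   # metres; assumed hand-collision threshold

# assumed mapping of paired gesture states to a collaborative gesture
COLLABORATIVE_GESTURES = {
    frozenset(["open_palm"]): "high_five",
    frozenset(["fist"]): "fist_bump",
    frozenset(["open_palm", "grip"]): "handshake",
}

def detect_collaborative_gesture(first_hand_pos, second_hand_pos,
                                 first_gesture_state: str, second_gesture_state: str):
    """Return the collaborative gesture, if any, implied by a hand collision."""
    distance = np.linalg.norm(np.asarray(first_hand_pos) - np.asarray(second_hand_pos))
    if distance >= COLLISION_RADIUS:
        return None                                   # no collision detected
    return COLLABORATIVE_GESTURES.get(frozenset([first_gesture_state, second_gesture_state]))
```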

Publication date: 30-03-2021

Remote rendering for virtual images

Number: US0010962780B2

One or more sensors of a virtual reality device track a pose of the virtual reality device. The virtual reality device requests a virtual image having a perspective corresponding to a future pose from a remote computer. After receiving the requested virtual image, the virtual reality device adjusts the virtual image to an adjusted virtual image having an updated perspective corresponding to an updated tracked pose of the virtual reality device. Then, a virtual reality display displays the adjusted virtual image.
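
The request/adjust loop could look roughly like the following, with predict_pose, request_remote_render and reproject standing in for the device's pose prediction, networking, and late-stage reprojection stages (all hypothetical names), and a buffer period tracked against measured round-trip latency.

```python
import time

class RemoteRenderLoop:
    def __init__(self, predict_pose, request_remote_render, reproject, current_pose):
        self.predict_pose = predict_pose             # predict_pose(t): device pose expected at time t
        self.request_remote_render = request_remote_render
        self.reproject = reproject                   # late-stage reprojection step
        self.current_pose = current_pose             # callable returning the latest tracked pose
        self.buffer_s = 0.050                        # initial guess for round-trip latency

    def next_frame(self):
        t0 = time.monotonic()
        future_pose = self.predict_pose(t0 + self.buffer_s)     # pose expected when the image arrives
        image = self.request_remote_render(future_pose)         # ask the remote computer for that view
        latency = time.monotonic() - t0
        # track the buffer period with the measured communications latency
        self.buffer_s = 0.9 * self.buffer_s + 0.1 * latency
        # adjust the received image to the freshest tracked pose before display
        return self.reproject(image, rendered_pose=future_pose, display_pose=self.current_pose())
```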

Publication date: 05-08-2021

SYSTEMS AND METHODS FOR ASSISTING VIRTUAL GESTURES BASED ON VIEWING FRUSTUM

Number: US20210240262A1
Assignee: Rec Room Inc.

An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.
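
A minimal sketch of one possible path adjustment: the thrown object's velocity is blended toward the gaze vector when the two roughly agree, so the flight path bends toward what the user is looking at. The blend weight and the agreement threshold are assumptions.

```python
import numpy as np

ASSIST_WEIGHT = 0.35        # how strongly the path bends toward the gaze (assumed)
AGREEMENT_COS = 0.5         # only assist when throw and gaze are within ~60 degrees (assumed)

def adjust_throw_velocity(throw_velocity: np.ndarray, gaze_vector: np.ndarray) -> np.ndarray:
    """Blend the throw direction toward the gaze direction while keeping the throw speed."""
    speed = float(np.linalg.norm(throw_velocity))
    if speed == 0.0:
        return throw_velocity
    throw_dir = throw_velocity / speed
    gaze_dir = gaze_vector / np.linalg.norm(gaze_vector)
    if float(np.dot(throw_dir, gaze_dir)) < AGREEMENT_COS:
        return throw_velocity                        # user is not looking where they threw; leave it
    assisted = (1.0 - ASSIST_WEIGHT) * throw_dir + ASSIST_WEIGHT * gaze_dir
    assisted /= np.linalg.norm(assisted)
    return assisted * speed                          # adjusted trajectory, original speed
```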

Publication date: 29-12-2020

Systems and methods for transferring object authority in a shared virtual environment

Number: US0010874943B2
Assignee: Rec Room Inc.

In some embodiments of the present disclosure, endpoint systems participating in a shared virtual environment simulate objects locally that a user of the endpoint system is likely to interact with. In some embodiments, object authority is thus managed by the endpoint systems, and is not managed by a central server. In some embodiments, a subsequent endpoint system likely to interact with an object may be predicted, and object authority may be transferred to the subsequent endpoint system before the interaction in order to provide an immersive experience for a user of the subsequent endpoint system. In some embodiments, efficient techniques for transmitting notifications between endpoint systems are provided.
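
The hand-off can be sketched as follows: each endpoint simulates the objects it owns, and authority is transferred to the endpoint whose avatar is predicted to touch the object next. The proximity-based prediction, the look-ahead horizon, and the notification shape are illustrative assumptions.

```python
import numpy as np

def predict_next_interactor(object_pos, object_vel, avatar_positions: dict, horizon: float = 0.5):
    """Return the endpoint id whose avatar will be closest to the object `horizon` seconds ahead."""
    future = np.asarray(object_pos) + horizon * np.asarray(object_vel)
    return min(avatar_positions,
               key=lambda endpoint_id: np.linalg.norm(np.asarray(avatar_positions[endpoint_id]) - future))

def maybe_transfer_authority(current_owner: str, object_pos, object_vel,
                             avatar_positions: dict, send) -> str:
    """Transfer object authority ahead of the predicted interaction (send is an assumed transport)."""
    predicted = predict_next_interactor(object_pos, object_vel, avatar_positions)
    if predicted != current_owner:
        send({"type": "authority_transfer", "new_owner": predicted})
        return predicted
    return current_owner
```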

Publication date: 12-12-2019

SYSTEMS AND METHODS FOR DETECTING COLLABORATIVE VIRTUAL GESTURES

Number: US20190379765A1
Assignee: Against Gravity Corp.

An endpoint system including one or more computing devices receives user input associated with an avatar in a shared virtual environment; calculates, based on the user input, motion for a portion of the first avatar, such as a hand; determines, based on the user input, a first gesture state for first avatar; transmits first location change notifications and a representation of the first gesture state for the first avatar; receives second location change notifications and a representation of a second gesture state for a second avatar; detects a collision between the first avatar and the second avatar based on the first location change notifications and the second location change notifications; and identifies a collaborative gesture based on the detected collision, the first gesture state, and the second gesture state.

Claim 1: A system comprising: a first endpoint system; and a second endpoint system; wherein the first endpoint system comprises one or more computing devices programmed to: receive first user input associated with a hand of a first avatar in a shared virtual environment; calculate, based on the first user input, first motion for the hand of the first avatar; determine, based on the first user input, a first gesture state for the hand of the first avatar; transmit first location change notifications and a representation of the first gesture state for the hand of the first avatar; receive second location change notifications and a representation of a second gesture state for a hand of a second avatar; detect a collision between the hand of the first avatar and the hand of the second avatar based on the first location change notifications and the second location change notifications; and identify a collaborative gesture based on the detected collision, the first gesture state, and the second gesture state; and wherein the second endpoint system comprises one or more computing devices programmed to: receive second user input associated with the hand of the second avatar in the shared virtual environment; calculate, based on the ...

Publication date: 26-06-2018

Interactions of virtual objects with surfaces

Number: US0010008044B2

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.

Publication date: 02-06-2016

INTERACTIONS OF VIRTUAL OBJECTS WITH SURFACES

Number: US20160155270A1
Assignee: Microsoft Technology Licensing, LLC

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.

Claims:
1. On an augmented reality computing device comprising a display system, a method of operating a user interface, the method comprising: displaying a virtual object via the display system as free-floating; identifying a surface via image data acquired with the augmented reality computing device; comparing the surface to a policy associated with the virtual object; and, based at least on comparing the surface to the policy associated with the virtual object, displaying the virtual object as attached to the surface.
2. The method of claim 1, wherein the surface is a first surface, further comprising identifying a second surface via the image data, comparing the second surface to the policy associated with the virtual object, and not displaying the virtual object as attached to the second surface based upon comparing the second surface to the policy associated with the virtual object.
3. The method of claim 1, wherein the surface comprises a real-world surface.
4. The method of claim 1, wherein the trigger comprises a threshold distance between the virtual object and the surface.
5. The method of claim 1, further comprising updating displaying of the virtual object on the display system based on a change in a location of a gaze of a user.
6. The ...

Publication date: 13-12-2022

Systems and method for managing permission for interacting with virtual objects based on virtual proximity

Number: US0011524232B2
Assignee: Rec Room Inc.

In some embodiments, techniques for managing interaction permissions for an object in a shared virtual environment are provided. The techniques may determine whether to present an object in a limited-interaction mode or in an interactive mode based on a permission condition. The permission condition may include a proximity condition that specifies a proximity threshold between two objects within the shared virtual environment that should be met in order to provide the object in the interactive mode. The proximity threshold may specify a distance between an avatar of an owning user of the object and the object; a distance between an avatar of an owning user of the object and an avatar of a user being presented the object, or other distances.
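
The permission condition reduces to a distance test: the object stays in a limited-interaction mode unless a proximity condition involving the owning user's avatar is satisfied. The threshold value and the particular pair of distances tested below are assumptions.

```python
import numpy as np

PROXIMITY_THRESHOLD = 2.0   # metres within the virtual environment (assumed)

def interaction_mode(object_pos, owner_avatar_pos, viewer_avatar_pos) -> str:
    """Decide whether to present the object as interactive or in a limited-interaction mode."""
    owner_near_object = (np.linalg.norm(np.asarray(owner_avatar_pos) - np.asarray(object_pos))
                         <= PROXIMITY_THRESHOLD)
    owner_near_viewer = (np.linalg.norm(np.asarray(owner_avatar_pos) - np.asarray(viewer_avatar_pos))
                         <= PROXIMITY_THRESHOLD)
    # either proximity condition satisfies the permission condition in this sketch
    return "interactive" if (owner_near_object or owner_near_viewer) else "limited"
```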

Publication date: 03-10-2017

Automatic generation of virtual materials from real-world materials

Number: US0009779512B2

Methods for automatically generating a texture exemplar that may be used for rendering virtual objects that appear to be made from the texture exemplar are described. In some embodiments, a head-mounted display device (HMD) may identify a real-world object within an environment, acquire a three-dimensional model of the real-world object, determine a portion of the real-world object from which a texture exemplar is to be generated, capture one or more images of the portion of the real-world object, determine an orientation of the real-world object, and generate the texture exemplar using the one or more images, the three-dimensional model, and the orientation of the real-world object. The HMD may then render and display images of a virtual object such that the virtual object appears to be made from a virtual material associated with the texture exemplar.
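
At its core the exemplar step samples the captured image over the chosen portion of the object, using the 3-D model and the object's orientation to map surface points back to pixels. The sketch below is a heavy simplification (pinhole projection, no occlusion handling or reflection removal), so treat every name and parameter as an assumption.

```python
import numpy as np

def build_texture_exemplar(image: np.ndarray,          # HxWx3 capture of the object portion
                           surface_points: np.ndarray, # Nx3 points sampled on the selected model region
                           object_pose: np.ndarray,    # 4x4 object-to-camera transform (orientation + position)
                           intrinsics: np.ndarray,     # 3x3 camera matrix
                           size: int = 64) -> np.ndarray:
    """Gather image colours for the selected surface region into a small square exemplar."""
    pts_h = np.c_[surface_points, np.ones(len(surface_points))]
    cam = (object_pose @ pts_h.T).T[:, :3]                    # points in camera space
    pix = (intrinsics @ cam.T).T
    pix = pix[:, :2] / pix[:, 2:3]                            # perspective divide -> pixel coords
    h, w = image.shape[:2]
    u = np.clip(pix[:, 0].astype(int), 0, w - 1)
    v = np.clip(pix[:, 1].astype(int), 0, h - 1)
    colours = image[v, u]                                     # sampled colours for each surface point
    # pack the samples into a square exemplar (ignores parameterisation; illustration only)
    idx = np.linspace(0, len(colours) - 1, size * size).astype(int)
    return colours[idx].reshape(size, size, 3)
```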

Publication date: 29-11-2022

Systems and methods for assisting virtual gestures based on viewing frustum

Number: US0011513592B2
Assignee: Rec Room Inc.

An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.

Publication date: 27-04-2017

REMOTE RENDERING FOR VIRTUAL IMAGES

Number: US20170115488A1

One or more sensors of a virtual reality device track a pose of the virtual reality device. The virtual reality device requests a virtual image having a perspective corresponding to a future pose from a remote computer. After receiving the requested virtual image, the virtual reality device adjusts the virtual image to an adjusted virtual image having an updated perspective corresponding to an updated tracked pose of the virtual reality device. Then, a virtual reality display displays the adjusted virtual image.

Claims:
1. A virtual reality device, comprising: one or more sensors configured to track a pose of the virtual reality device; a pose prediction machine configured to predict a future pose of the virtual reality device at a future time; a communications machine configured to send to a remote computer a request for a virtual image having a perspective corresponding to the future pose of the virtual reality device and to receive from the remote computer the virtual image; a late stage reprojection machine configured to adjust the virtual image to an adjusted virtual image having an updated perspective corresponding to an updated tracked pose of the virtual reality device; and a virtual reality display configured to display the adjusted virtual image.
2. The virtual reality device of claim 1, wherein the communications machine is further configured to determine a communications latency by measuring an elapsed time between the request being sent and the virtual image being received.
3. The virtual reality device of claim 2, wherein the pose prediction machine is configured to dynamically adjust a buffer period between a current time and the future time for which the future pose is predicted, the buffer period dynamically increased with increasing communications latency and dynamically decreased with decreasing communications latency.
4. The virtual reality device of claim 1, wherein the one or more sensors are configured to detect a motion of the virtual reality ...

Publication date: 19-07-2007

Traversal of datasets using positioning of radial input device

Number: US20070168879A1
Assignee: Microsoft Corporation

Items can be displayed radially by positioning representative icons in a circular or helical fashion on a display. Upon receiving an angle input from a radial input device, an item at a corresponding angle on the display can be selected. The set of selectable items can be recalculated with each new selection, thereby removing the constraint of available display space and allowing easy and continuous traversal of even very large datasets.
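
The angle-to-item mapping is straightforward; the sketch below assumes items are laid out evenly around a circle and recomputes the selectable set after every pick, as the abstract describes. The get_neighbours and read_angle callables are assumed to be supplied by the application.

```python
import math

def select_by_angle(items: list, angle_radians: float):
    """Map an angle from the radial input device to the item displayed at that angle."""
    if not items:
        return None
    sector = 2 * math.pi / len(items)                 # each item occupies an equal arc
    index = int((angle_radians % (2 * math.pi)) // sector)
    return items[index]

def traverse(start_item, get_neighbours, read_angle, steps: int):
    """Continuously traverse a large dataset: each selection yields a new ring of candidates."""
    current = start_item
    for _ in range(steps):
        candidates = get_neighbours(current)          # recalculated set of selectable items
        current = select_by_angle(candidates, read_angle())
        if current is None:
            break
    return current
```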

Publication date: 11-03-2021

SYSTEMS AND METHOD FOR MANAGING PERMISSION FOR INTERACTING WITH VIRTUAL OBJECTS BASED ON VIRTUAL PROXIMITY

Number: US20210069589A1
Assignee: Rec Room Inc.

In some embodiments, techniques for managing interaction permissions for an object in a shared virtual environment are provided. The techniques may determine whether to present an object in a limited-interaction mode or in an interactive mode based on a permission condition. The permission condition may include a proximity condition that specifies a proximity threshold between two objects within the shared virtual environment that should be met in order to provide the object in the interactive mode. The proximity threshold may specify a distance between an avatar of an owning user of the object and the object; a distance between an avatar of an owning user of the object and an avatar of a user being presented the object, or other distances.

Publication date: 30-05-2024

BI-DIRECTIONAL EXCHANGE OF EDITED VIRTUAL OBJECTS BETWEEN A DESIGN ENVIRONMENT AND A VIRTUAL ENVIRONMENT

Number: US20240177373A1
Assignee: Rec Room Inc.

In some embodiments, a computer-implemented method of managing editing of virtual environment objects is provided. A computing system receives a first studio-version object created in a creation studio application executed by a designer computing device. The creation studio application is associated with a game engine. The computing system transmits a first binary object based on the first studio-version object to a virtual environment client computing device. The computing system receives a first VE editor object created in a virtual environment editor executed by the virtual environment client computing device. The first VE editor object includes at least a reference to the first binary object. The computing system creates a first studio-version VE editor object that includes at least a reference to the first studio-version object and the first VE editor object; and transmits the first studio-version VE editor object to the designer computing device.

Publication date: 06-11-2008

Iteratively Locating A Position Corresponding To A Desired Seek Time

Number: US20080276173A1
Assignee: Microsoft Corporation

Techniques enable locating a position within a file that corresponds to a desired seek time without having access to an index specifying the desired seek time's position. An iterative process may be used to estimate the position that corresponds to the desired seek time. The process may iterate through multiple estimations until a difference between a time corresponding to an estimated position and the desired seek time is within an acceptable amount or until the process reaches an iteration threshold. The file may then be played beginning at or near the desired seek time. The techniques may therefore allow a user to seek within a file while the user progressively downloads or streams the file.

Publication date: 08-06-2006

Tool for real-time graphical exploration of interconnected friends and groups

Number: US20060121988A1
Assignee: Microsoft Corporation

The present invention is directed to a Live Network Explorer (LNE) which allows users to graphically navigate through a web of interconnected friends on a gaming console. The LNE provides a real-time graphical representation of friend-of-friend relationships, with the currently-viewed user as a visual node in the screen center. The center node is surrounded by nodes representing that user's friends. Real-time animation is used to represent transitions as the user navigates from person to person. The LNE also represents groups as a visual node in this graphically interconnected web so that the user can travel to a group that a person belongs to, and can see all the people inside the group by connections. The user can navigate to any of these people in the group. The user can click on a person to bring up their digital identity details.

Publication date: 04-08-2016

AUTOMATIC GENERATION OF VIRTUAL MATERIALS FROM REAL-WORLD MATERIALS

Number: US20160225164A1

Methods for automatically generating a texture exemplar that may be used for rendering virtual objects that appear to be made from the texture exemplar are described. In some embodiments, a head-mounted display device (HMD) may identify a real-world object within an environment, acquire a three-dimensional model of the real-world object, determine a portion of the real-world object from which a texture exemplar is to be generated, capture one or more images of the portion of the real-world object, determine an orientation of the real-world object, and generate the texture exemplar using the one or more images, the three-dimensional model, and the orientation of the real-world object. The HMD may then render and display images of a virtual object such that the virtual object appears to be made from a virtual material associated with the texture exemplar.

Claims:
1. An electronic device for generating and controlling virtual objects within an augmented reality environment, comprising: one or more processors, the one or more processors acquire one or more images of a portion of a real-world object and acquire a three-dimensional model of the real-world object, the one or more processors identify one or more diffuse reflections within the one or more images and generate a texture exemplar using the one or more images and the three-dimensional model of the real-world object, the one or more processors generate the texture exemplar such that the one or more diffuse reflections are substantially removed from the texture exemplar, the one or more processors generate the texture exemplar such that distortions caused by a curved surface of the real-world object are corrected, the one or more processors render images of a virtual object such that the virtual object appears to be at least partially covered with a virtual material corresponding with the texture exemplar; and a display in communication with the one or more processors, the display displays the rendered images.
2. The ...

Publication date: 13-04-2017

INTERACTIONS OF VIRTUAL OBJECTS WITH SURFACES

Number: US20170103583A1
Assignee: Microsoft Technology Licensing, LLC

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.

Claims:
1. On an augmented reality computing device comprising a display system, a method of operating a user interface, the method comprising: displaying a virtual object via the display system as free-floating; identifying a surface via image data acquired with the augmented reality computing device; comparing the surface to a policy associated with the virtual object; and, based at least on comparing the surface to the policy associated with the virtual object, displaying the virtual object as attached to the surface.
2. The method of claim 1, wherein the surface is a first surface, further comprising identifying a second surface via the image data, comparing the second surface to the policy associated with the virtual object, and not displaying the virtual object as attached to the second surface based upon comparing the second surface to the policy associated with the virtual object.
3. The method of claim 1, wherein the surface comprises a real-world surface.
4. The method of claim 1, wherein the trigger comprises a threshold distance between the virtual object and the surface.
5. The method of claim 1, further comprising updating displaying of the virtual object on the display system based on a change in a location of a gaze of a user.
6. The ...

Publication date: 18-07-2019

SYSTEMS AND METHOD FOR MANAGING PERMISSION FOR INTERACTING WITH VIRTUAL OBJECTS BASED ON VIRTUAL PROXIMITY

Number: US20190217192A1
Assignee: Against Gravity Corp.

In some embodiments, techniques for managing interaction permissions for an object in a shared virtual environment are provided. The techniques may determine whether to present an object in a limited-interaction mode or in an interactive mode based on a permission condition. The permission condition may include a proximity condition that specifies a proximity threshold between two objects within the shared virtual environment that should be met in order to provide the object in the interactive mode. The proximity threshold may specify a distance between an avatar of an owning user of the object and the object; a distance between an avatar of an owning user of the object and an avatar of a user being presented the object, or other distances.

Publication date: 27-04-2021

Systems and methods for assisting virtual gestures based on viewing frustum

Number: US0010990169B2
Assignee: Rec Room Inc.

An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.

Publication date: 24-11-2020

Systems and method for managing permission for interacting with virtual objects based on virtual proximity

Number: US0010843073B2
Assignee: Rec Room Inc.

In some embodiments, techniques for managing interaction permissions for an object in a shared virtual environment are provided. The techniques may determine whether to present an object in a limited-interaction mode or in an interactive mode based on a permission condition. The permission condition may include a proximity condition that specifies a proximity threshold between two objects within the shared virtual environment that should be met in order to provide the object in the interactive mode. The proximity threshold may specify a distance between an avatar of an owning user of the object and the object; a distance between an avatar of an owning user of the object and an avatar of a user being presented the object, or other distances.

Publication date: 16-06-2015

Iteratively locating a position corresponding to a desired seek time

Number: US0009060190B2

Techniques enable locating a position within a file that corresponds to a desired seek time without having access to an index specifying the desired seek time's position. An iterative process may be used to estimate the position that corresponds to the desired seek time. The process may iterate through multiple estimations until a difference between a time corresponding to an estimated position and the desired seek time is within an acceptable amount or until the process reaches an iteration threshold. The file may then be played beginning at or near the desired seek time. The techniques may therefore allow a user to seek within a file while the user progressively downloads or streams the file.

Publication date: 05-07-2011

Iteratively locating a position corresponding to a desired seek time

Number: US0007975225B2

Techniques enable locating a position within a file that corresponds to a desired seek time without having access to an index specifying the desired seek time's position. An iterative process may be used to estimate the position that corresponds to the desired seek time. The process may iterate through multiple estimations until a difference between a time corresponding to an estimated position and the desired seek time is within an acceptable amount or until the process reaches an iteration threshold. The file may then be played beginning at or near the desired seek time. The techniques may therefore allow a user to seek within a file while the user progressively downloads or streams the file.

Publication date: 23-05-2019

SYSTEMS AND METHODS FOR ASSISTING VIRTUAL GESTURES BASED ON VIEWING FRUSTUM

Number: US20190155384A1
Assignee: Against Gravity Corp.

An endpoint system including one or more computing devices presents an object in a virtual environment (e.g., a shared virtual environment); receives gaze input corresponding to a gaze of a user of the endpoint system; calculates a gaze vector based on the gaze input; receives motion input corresponding to an action of the user; determines a path adjustment (e.g., by changing motion parameters such as trajectory and velocity) for the object based at least in part on the gaze vector and the motion input; and simulates motion of the object within the virtual environment based at least in part on the path adjustment. The object may be presented as being thrown by an avatar, with a flight path based on the path adjustment. The gaze vector may be based on head orientation information, eye tracking information, or some combination of these or other gaze information.

Claims:
1. A system comprising: an endpoint system including one or more computing devices programmed to: present an object in a virtual environment; receive gaze input corresponding to a gaze of a user of the endpoint system; calculate a gaze vector based on the gaze input; receive motion input corresponding to an action of the user that causes motion of the object in the virtual environment; determine a path adjustment for the object based at least in part on the gaze vector and the motion input; and simulate motion of the object within the virtual environment based at least in part on the path adjustment.
2.-5. (canceled)
6. The system of claim 1, wherein the one or more computing devices are further programmed to calculate a motion vector based on the motion input, wherein the motion vector includes a direction.
7. The system of claim 6, wherein the motion input comprises hand motion input.
8. (canceled)
9. The system of claim 6, wherein the gaze vector includes a direction, and wherein determining the path adjustment for the object comprises: performing a comparison of the direction ...

Publication date: 30-07-2015

RADIAL SELECTION BY VESTIBULO-OCULAR REFLEX FIXATION

Number: US20150212576A1

Methods for enabling hands-free selection of objects within an augmented reality environment are described. In some embodiments, an object may be selected by an end user of a head-mounted display device (HMD) based on detecting a vestibulo-ocular reflex (VOR) with the end user's eyes while the end user is gazing at the object and performing a particular head movement for selecting the object. The object selected may comprise a real object or a virtual object. The end user may select the object by gazing at the object for a first time period and then performing a particular head movement in which the VOR is detected for one or both of the end user's eyes. In one embodiment, the particular head movement may involve the end user moving their head away from a direction of the object at a particular head speed while gazing at the object.

Claims:
1. An electronic system, comprising: a display; and one or more processors in communication with the display, the one or more processors detect that an end user of the electronic system has gazed at a selectable object displayed on the display for a first period of time, the one or more processors detect that the end user has performed a particular head movement while gazing at the selectable object subsequent to detecting that the end user has gazed at the selectable object for the first period of time, the one or more processors detect a first vestibulo-ocular reflex during a first portion of the particular head movement, the one or more processors detect that the end user has performed the particular head movement based on the detection of the first vestibulo-ocular reflex, the one or more processors determine a selection of the selectable object based on detecting that the end user has performed the particular head movement while gazing at the selectable object, the one or more processors determine a size of the selectable object and determine the first period of time based on the size of the selectable object.
2. The electronic system ...

Publication date: 21-07-2016

HOLOGRAPHIC BIRD'S EYE VIEW CAMERA

Number: US20160210783A1

A system and method are disclosed for capturing views of a mixed reality environment from various perspectives which can be displayed on a monitor. The system includes one or more physical cameras at user-defined positions within the mixed reality environment. The system renders virtual objects in the mixed reality environment from the perspective of the one or more cameras. Real and virtual objects from the mixed reality environment may then be displayed from the perspective of the one or more cameras on one or more external, 2D monitor for viewing by others.

Claims:
1. A system for presenting a mixed reality environment including real and virtual objects, the system comprising: a head mounted display device including a display unit for displaying a three-dimensional virtual object in the virtual environment; one or more camera assemblies; and a processing unit operatively coupled to the display device and the one or more camera assemblies, the processing unit generating a scene map comprising a three-dimensional coordinate space in which the head mounted display, the three-dimensional virtual object and the one or more camera assemblies are registered, the processing unit and a camera assembly of the one or more camera assemblies generating an image of the mixed reality environment for display from a perspective of the camera assembly.
2. The system of claim 1, the camera assembly including a image capture device for capturing a color or black and white image of real objects in the mixed reality environment.
3. The system of claim 1, the camera assembly including an image sensor for capturing depth data of the mixed reality environment, the depth data used to generate a depth map of the mixed reality environment from a perspective of the image sensor.
4. The system of claim 1, the camera assembly and the head mounted display device calibrated to a common coordinate system by the camera assembly and the head mounted display device capturing one or more images of a ...

Publication date: 13-11-2014

INTERACTIONS OF VIRTUAL OBJECTS WITH SURFACES

Number: US20140333666A1

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.

Claims:
1. On an augmented reality computing device comprising a display system, a method of operating a user interface, the method comprising: displaying a virtual object via the display system as free-floating; detecting a trigger to display the object as attached to a surface; and, in response to the trigger, displaying the virtual object as attached to the surface via the display system.
2. The method of claim 1, wherein the surface comprises a real-world surface.
3. The method of claim 1, wherein the trigger comprises a threshold distance between the virtual object and the surface.
4. The method of claim 1, further comprising updating displaying of the virtual object on the display system based on a change in a location of a gaze of a user.
5. The method of claim 1, further comprising obtaining one or more policies associated with the virtual object, where the one or more policies dictate one or more of which surfaces the virtual object may be attached to and under what conditions the virtual object may be attached to a surface, and displaying the virtual object on the display system based on the one or more policies associated with the virtual object.
6. The method of claim 1, further comprising: detecting a trigger to detach the virtual object from the ...

Publication date: 24-01-2017

Radial selection by vestibulo-ocular reflex fixation

Number: US0009552060B2

Methods for enabling hands-free selection of objects within an augmented reality environment are described. In some embodiments, an object may be selected by an end user of a head-mounted display device (HMD) based on detecting a vestibulo-ocular reflex (VOR) with the end user's eyes while the end user is gazing at the object and performing a particular head movement for selecting the object. The object selected may comprise a real object or a virtual object. The end user may select the object by gazing at the object for a first time period and then performing a particular head movement in which the VOR is detected for one or both of the end user's eyes. In one embodiment, the particular head movement may involve the end user moving their head away from a direction of the object at a particular head speed while gazing at the object.
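
A rough sketch of the selection test: the object counts as selected only after the gaze has dwelt on it for a first period and a sufficiently fast head movement then occurs while the gaze stays pinned to the object (i.e. the vestibulo-ocular reflex is stabilising the eyes). The thresholds below are assumptions; per the abstract, the dwell period could instead be scaled with the object's size.

```python
def vor_selection(samples, dwell_time: float = 0.4,
                  min_head_speed: float = 0.6, max_gaze_drift_deg: float = 2.0) -> bool:
    """samples: time-ordered (t, gaze_on_object, head_speed, gaze_error_deg) tuples."""
    dwell_start = None
    for t, gaze_on_object, head_speed, gaze_error_deg in samples:
        if not gaze_on_object or gaze_error_deg > max_gaze_drift_deg:
            dwell_start = None                     # gaze left the object; restart the dwell timer
            continue
        if dwell_start is None:
            dwell_start = t
        # after the dwell period, a fast head movement with gaze still on the object
        # is taken as VOR-stabilised fixation -> treat it as a selection
        if t - dwell_start >= dwell_time and head_speed >= min_head_speed:
            return True
    return False
```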

Publication date: 28-08-2014

MIXED REALITY AUGMENTATION

Number: US20140240351A1

Embodiments that relate to providing motion amplification to a virtual environment are disclosed. For example, in one disclosed embodiment a mixed reality augmentation program receives from a head-mounted display device motion data that corresponds to motion of a user in a physical environment. The program presents via the display device the virtual environment in motion in a principal direction, with the principal direction motion being amplified by a first multiplier as compared to the motion of the user in a corresponding principal direction. The program also presents the virtual environment in motion in a secondary direction, where the secondary direction motion is amplified by a second multiplier as compared to the motion of the user in a corresponding secondary direction, and the second multiplier is less than the first multiplier.

Claim 1: A mixed reality augmentation system for providing motion amplification to a virtual environment in a mixed reality environment, the mixed reality augmentation system comprising: a head-mounted display device operatively connected to a computing device, the head-mounted display device including a display system for presenting the mixed reality environment; and a mixed reality augmentation program executed by a processor of the computing device, the mixed reality augmentation program configured to: receive from the head-mounted display device motion data that corresponds to motion of a user in a physical environment; present via the display system the virtual environment in motion in a principal direction, the principal direction motion being amplified by a first multiplier as compared to the motion of the user in a corresponding principal direction, and present via the display system the virtual environment in motion in a secondary direction, the secondary direction motion being amplified by a second multiplier as compared to the motion of the user in a corresponding secondary direction, wherein the second multiplier is less than the ...
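
The asymmetric amplification amounts to a per-axis scale: motion along the principal direction is multiplied more than motion along the secondary (orthogonal) directions. The multiplier values below are illustrative, not taken from the application.

```python
import numpy as np

def amplify_motion(user_delta: np.ndarray, principal_dir: np.ndarray,
                   principal_multiplier: float = 3.0, secondary_multiplier: float = 1.5) -> np.ndarray:
    """Scale the user's physical motion into virtual motion, favouring the principal direction."""
    principal_dir = principal_dir / np.linalg.norm(principal_dir)
    principal_component = float(np.dot(user_delta, principal_dir)) * principal_dir
    secondary_component = user_delta - principal_component        # everything orthogonal to it
    # the second multiplier is less than the first, as the application describes
    return principal_multiplier * principal_component + secondary_multiplier * secondary_component
```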

Publication date: 30-05-2019

SYSTEMS AND METHODS PROVIDING TEMPORARY DECOUPLING OF USER AVATAR SYNCHRONICITY FOR PRESENCE ENHANCING EXPERIENCES

Number: US20190160378A1
Assignee: Against Gravity Corp.

In some embodiments, a detecting endpoint system accessing a shared virtual environment detects a collision between a target avatar and an object within the shared virtual environment. The detecting endpoint system transmits a location change notification for a head of the target avatar. An observer endpoint system moves the head of the target avatar based on the location change notification. A target endpoint system associated with the target avatar does not move its viewpoint based on the location change notification. In some embodiments, this decoupling of viewpoint from the avatar allows for a more immersive experience for all users.

Claims:
1. A method of temporarily decoupling a position of a viewpoint from a position of a head of a target avatar in a shared virtual environment, the method comprising: presenting, by a target endpoint system, the shared virtual environment from a viewpoint at a motion-tracked position, wherein the motion-tracked position corresponds to a position of a head-mounted display device detected by a motion sensor device of the target endpoint system; presenting, by an observing endpoint system, the shared virtual environment including the head of the target avatar at the motion-tracked position; transmitting, by a detecting endpoint system, a first location change notification that includes a reaction position for the head of the target avatar; in response to receiving the first location change notification, animating, by the observing endpoint system, the head of the target avatar from the motion-tracked position to the reaction position; and in response to receiving the first location change notification, maintaining, by the target endpoint system, the viewpoint at the motion-tracked position instead of moving the viewpoint from the motion-tracked position to the reaction position.
2. The method of claim 1, wherein transmitting, by the detecting endpoint system, the first location change notification that includes the reaction ...

Publication date: 02-02-2021

Systems and methods providing temporary decoupling of user avatar synchronicity for presence enhancing experiences

Number: US0010905956B2
Assignee: Rec Room Inc.

In some embodiments, a detecting endpoint system accessing a shared virtual environment detects a collision between a target avatar and an object within the shared virtual environment. The detecting endpoint system transmits a location change notification for a head of the target avatar. An observer endpoint system moves the head of the target avatar based on the location change notification. A target endpoint system associated with the target avatar does not move its viewpoint based on the location change notification. In some embodiments, this decoupling of viewpoint from the avatar allows for a more immersive experience for all users.
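
The decoupling amounts to handling the same location-change notification differently on each endpoint: observers animate the avatar's head to the reaction position, while the endpoint that owns the avatar leaves its viewpoint at the motion-tracked position. The message fields and endpoint roles below are assumptions for illustration.

```python
def handle_head_location_change(notification: dict, endpoint_id: str,
                                avatar_heads: dict, viewpoints: dict) -> None:
    """Apply a head location-change notification, decoupling viewpoint from the avatar head."""
    target = notification["target_avatar"]           # avatar whose head was hit by an object
    reaction_pos = notification["reaction_position"]
    if endpoint_id == target:
        # target endpoint: keep the viewpoint (in `viewpoints`) at the motion-tracked position,
        # so the wearer's camera does not jerk even though other users see the head react
        return
    # observer endpoints: animate the target avatar's head toward the reaction position
    avatar_heads[target] = reaction_pos

# usage sketch
heads, views = {"alice": (0.0, 1.70, 0.0)}, {"alice": (0.0, 1.70, 0.0)}
note = {"target_avatar": "alice", "reaction_position": (0.1, 1.65, 0.05)}
handle_head_location_change(note, "bob", heads, views)    # an observer sees the head move
handle_head_location_change(note, "alice", heads, views)  # the target's viewpoint stays put
```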

Publication date: 27-09-2012

SYSTEM FOR EDITING AN AVATAR

Number: US20120246585A9
Assignee: MICROSOFT CORPORATION

Systems, methods and computer readable media are disclosed for updating the appearance of an avatar that exists across an online multi-player gaming system, including an executing video game. In addition to the general system, systems, methods and computer readable media for updating the avatar, techniques are disclosed for prompting networked video games to update an avatar that has been modified while the video game has been executing.

Claims:
1. A method for updating the appearance of an avatar stored on a console and used in a plurality of video games executed on the console, while the console is executing a video game, comprising: receiving, by the console, while executing the game, an instruction from a user to update the appearance of the avatar; updating the appearance of the avatar; storing the updated avatar on the console; and instructing the game to load and display the updated avatar.
2. The method of claim 1, wherein updating the appearance of the avatar includes: overlaying on top of the game an editor window; and receiving at least one instruction from the user on how to update the appearance of the avatar.
3. The method of claim 1, wherein said updating comprises updating one from the set of: a hair color, a hair length, a hair style, a facial hair color, a facial hair length, a facial hair style, a facial hair position, an eye color, an eye style, an eye position, a nose style, a nose position, a mouth style, a mouth color, a mouth position, an ear style, an ear position, a skin color, a height, a weight, and a body build.
4. The method of claim 1, wherein the video game is an online-multi-player video game comprising a session and a plurality of other users participating in said session, and said instructing the game to load and display the updated avatar includes ...

Publication date: 31-10-2019

SYSTEMS AND METHODS FOR TRANSFERRING OBJECT AUTHORITY IN A SHARED VIRTUAL ENVIRONMENT

Number: US20190329129A1
Assignee: Against Gravity Corp.

In some embodiments of the present disclosure, endpoint systems participating in a shared virtual environment simulate objects locally that a user of the endpoint system is likely to interact with. In some embodiments, object authority is thus managed by the endpoint systems, and is not managed by a central server. In some embodiments, a subsequent endpoint system likely to interact with an object may be predicted, and object authority may be transferred to the subsequent endpoint system before the interaction in order to provide an immersive experience for a user of the subsequent endpoint system. In some embodiments, efficient techniques for transmitting notifications between endpoint systems are provided.

Publication date: 11-06-2015

Interactions of virtual objects with surfaces

Number: WO2015047453A3
Assignee: MICROSOFT CORPORATION

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.

Publication date: 28-07-2016

Mixed reality system

Number: WO2016118371A1
Assignee: Microsoft Technology Licensing, LLC

A system and method are disclosed for capturing views of a mixed reality environment from various perspectives which can be displayed on a monitor. The system includes one or more physical cameras (50a, 50b) at user-defined positions within the mixed reality environment. The system renders virtual objects (40) in the mixed reality environment from the perspective of the one or more cameras (50a, 50b). Real and virtual objects from the mixed reality environment may then be displayed from the perspective of the one or more cameras (50a, 50b) on one or more external, 2D monitor for viewing by others.

Publication date: 02-04-2015

Interactions of virtual objects with surfaces

Number: WO2015047453A2
Assignee: MICROSOFT CORPORATION

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.

Publication date: 06-08-2015

Radial selection by vestibulo-ocular reflex fixation

Number: WO2015116475A1
Assignee: Microsoft Technology Licensing, LLC

Methods for enabling hands-free selection of objects within an augmented reality environment are described. In some embodiments, an object may be selected by an end user of a head-mounted display device (HMD) based on detecting a vestibulo-ocular reflex (VOR) with the end user's eyes while the end user is gazing at the object and performing a particular head movement for selecting the object. The object selected may comprise a real object or a virtual object. The end user may select the object by gazing at the object for a first time period and then performing a particular head movement in which the VOR is detected for one or both of the end user's eyes. In one embodiment, the particular head movement may involve the end user moving their head away from a direction of the object at a particular head speed while gazing at the object.

Publication date: 23-03-2016

Interactions of virtual objects with surfaces

Number: EP2997449A2
Assignee: Microsoft Technology Licensing LLC

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.

Publication date: 22-06-2006

Tool for real-time graphical exploration of interconnected friends and groups

Number: JP2006164241A
Assignee: Microsoft Corp

[Problem] To provide a Live Network Explorer (LNE) that allows a user to graphically navigate a web of interconnected friends on a game console. [Solution] The central node is surrounded by nodes representing the user's friends. Real-time animation is used to show transitions as the user navigates from person to person (202). Within this graphically interconnected web, the LNE also represents groups as visual nodes, so that the user can travel to a group an individual belongs to and, through the connections, see all of the people inside that group (204). The user can navigate to any of those people in the group. The user can click on a person to display that person's digital identity details (206). [Selected drawing] FIG. 2

Publication date: 15-04-2010

Programming apis for an extensible avatar system

Number: WO2010009175A3
Assignee: MICROSOFT CORPORATION

Disclosed is an application programming interface (API) that provides for an extensible avatar system. In one embodiment an API may allow video game applications to retrieve structures of data which represent an avatar. The game can then take those structures and incorporate the data into its own rendering system. In another embodiment an API may allow a video game application to render an avatar to a render target or texture wherein the video game system performs rendering and animation functions.

Publication date: 21-01-2010

Programming apis for an extensible avatar system

Number: WO2010009175A2
Assignee: MICROSOFT CORPORATION

Disclosed is an application programming interface (API) that provides for an extensible avatar system. In one embodiment an API may allow video game applications to retrieve structures of data which represent an avatar. The game can then take those structures and incorporate the data into its own rendering system. In another embodiment an API may allow a video game application to render an avatar to a render target or texture wherein the video game system performs rendering and animation functions.

Publication date: 27-12-2016

Interactions of virtual objects with surfaces

Number: US09530252B2
Assignee: Microsoft Technology Licensing LLC

Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a display system. For example, one disclosed embodiment includes displaying a virtual object via the display system as free-floating, detecting a trigger to display the object as attached to a surface, and, in response to the trigger, displaying the virtual object as attached to the surface via the display system. The method may further include detecting a trigger to detach the virtual object from the surface and, in response to the trigger to detach the virtual object from the surface, detaching the virtual object from the surface and displaying the virtual object as free-floating.
