
REGISTRATION OF LOCAL CONTENT BETWEEN FIRST AND SECOND AUGMENTED REALITY VIEWERS


Title:
REGISTRATION OF LOCAL CONTENT BETWEEN FIRST AND SECOND AUGMENTED REALITY VIEWERS
Link:
Published: 2022
Media type: Patent
Additional information:
  • Indexed in: USPTO Patent Applications
  • Languages: English
  • Document Number: 20220130116
  • Publication Date: April 28, 2022
  • Appl. No: 17/429100
  • Application Filed: February 26, 2020
  • Assignees: Magic Leap, Inc. (Plantation, FL, US)
  • Claim: 1. A viewing system comprising: a first augmented reality viewer that includes: a first display that permits a first user to see real world objects; a first data source to hold image data of local content; a first projector to display the image data of local content through the first display to the first user while the first user views the real world objects; a first processor; a first computer-readable medium connected to the first processor; and a first set of vision data and algorithms on the first computer-readable medium and executable by the first processor, wherein the first set of vision data and algorithms comprises: a first device coordinate frame (DCF); a first registration marker operable by the first user to select a first feature point (FP1) and a second feature point (FP2) on at least one of the real world objects; and a first uniform coordinate system (UCS) alignment module including: a first feature point storing module that stores locations of the first registration marker when selecting the FP1 and the FP2; a first user coordinate frame calculator determining a first user coordinate frame (UCF) based on the locations of the first registration marker when selecting the FP1 and the FP2; a first transformer to transform the first DCF to the first UCF; and a first render engine that displays the image data of local content on the first data source based on the transformation from the first DCF to the first UCF.
  • Claim: 2. The viewing system of claim 1, further comprising: a second augmented reality viewer that includes: a second display that permits a second user to see real world objects; a second data source to hold image data of local content; a second projector to display the image data of local content through the second display to the second user while the second user views the real world objects; and a second processor; a second computer-readable medium connected to the second processor; a second set of vision data and algorithms on the second computer-readable medium and executable by the second processor, including: a second device coordinate frame (DCF); a second registration marker operable by the second user to select the FP1 and the FP2 on said at least one of the real world objects; a second uniform coordinate system (UCS) alignment module having: a second feature point storing module that stores locations of the second registration marker when selecting the FP1 and the FP2; a second user coordinate frame calculator determining a second user coordinate frame (UCF) based on the locations of the second registration marker when selecting the FP1 and the FP2; a second transformer to transform the second DCF to the second UCF; and a second render engine that displays the image data of local content on the second data source based on the transformation from the second DCF to the second UCF.
  • Claim: 3. The viewing system of claim 2, wherein the first augmented reality viewer includes: a first head-mountable frame that is wearable on a head of the first user; and a first six degree of freedom (6 dof) controller that is movable relative to the first head-mountable frame by the first user to select the FP1 and FP2, wherein the first set of vision data and algorithms includes: a first feature point location calculator that determines the locations of the first registration marker when selecting the FP1 and FP2 by determining locations of the first 6 dof controller relative to the first head-mountable frame when selecting the FP1 and FP2; and a first feature storing module that stores the locations of the first 6 dof controller in the first DCF.
  • Claim: 4. The viewing system of claim 3, wherein the second augmented reality viewer includes: a second head-mountable frame that is wearable on a head of the second user; and a second six degree of freedom (6 dof) controller that is movable relative to the second head-mountable frame by the second user to select the FP1 and FP2, wherein the second set of vision data and algorithms includes: a second feature point location calculator that determines the locations of the second registration marker when selecting the FP1 and FP2 by determining locations of the second 6 dof controller relative to the second head-mountable frame when selecting the FP1 and FP2; and a second feature storing module that stores the locations of the second 6 dof controller in the second DCF.
  • Claim: 5. The viewing system of claim 1, wherein the first viewing device further includes: a first DCF determining routine executable by the first processor to calculate the first DCF that changes upon movement of the first head-mountable frame; and a first DCF storing instruction executable by the first processor to store the first DCF on the first computer-readable medium.
  • Claim: 6. The viewing system of claim 5, wherein the first viewing device further includes: a first real object detection device that detects positioning of at least one real object; a first world object identification routine executable by the first processor to identify positioning of at least one point on a surface of the real object; a first world frame determining routine executable by the first processor to calculate a first world coordinate frame based on the at least one point; and a first world frame storing instruction executable by the first processor to store the first world coordinate frame on the first computer-readable medium, wherein the first DCF determining routine determines the first DCF relative to the first world coordinate frame.
  • Claim: 7. The viewing system of claim 6, wherein the first real object detection device is a camera.
  • Claim: 8. The viewing system of claim 6, wherein the first real object detection device detects positioning of a plurality of real objects.
  • Claim: 9. The viewing system of claim 6, wherein the first world object identification routine identifies positioning of a plurality of points on a surface of the real object.
  • Claim: 10. The viewing system of claim 9, wherein the first world frame determining routine calculates the first world coordinate frame based on the plurality of points.
  • Claim: 11. The viewing system of claim 5, wherein the first viewing device further includes: a first inertial measurement unit (IMU) secured to the first head-mountable frame, the first IMU including a first gravitational sensor that detects a first direction of gravitational force relative to the first head-mountable frame, and the first DCF determining routine calculates the first DCF based on the first direction of gravitational force.
  • Claim: 12. The viewing system of claim 11, wherein the first IMU includes at least one of a first accelerometer and a first gyroscope.
  • Claim: 13. A method of viewing image data of local content comprising: creating a first augmented reality view including: storing a first device coordinate frame (DCF) on a first computer-readable medium; moving, by a first user, a first registration marker to select a first feature point (FP1) and a second feature point (FP2) on at least one real world object viewable by the first user through a first display; executing a first uniform coordinate system (UCS) alignment module by: storing locations of the first registration marker when selecting the FP1 and the FP2; determining a first user coordinate frame (UCF) based on the locations of the first registration marker when selecting the FP1 and the FP2; transforming the first DCF to the first UCF; and displaying image data of local content received on a first data source with a first projector through the first display to the first user, while the first user views real world objects, based on the transformation from the first DCF to the first UCF.
  • Claim: 14. The method of claim 13, further comprising: creating a second augmented reality view including: storing a second device coordinate frame (DCF) on a second computer-readable medium; moving, by a second user, a second registration marker to select the first feature point (FP1) and the second feature point (FP2) on the at least one real world object viewable by the second user through a second display; executing a second uniform coordinate system (UCS) alignment module by: storing locations of the second registration marker when selecting the FP1 and the FP2; determining a second user coordinate frame (UCF) based on the locations of the second registration marker when selecting the FP1 and the FP2; transforming the second DCF to the second UCF; and displaying image data of local content received on a second data source with a second projector through the second display to the second user, while the second user views the real world objects, based on the transformation from the second DCF to the second UCF.
  • Claim: 15. The method of claim 14, further comprising: wearing, on a head of the first user, a first head-mountable frame; and moving, by the first user, a first six degree of freedom (6 dof) controller relative to the first head-mountable frame to select the FP1 and FP2; executing a first feature point location calculator to determine the locations of the first registration marker when selecting the FP1 and FP2 by determining locations of the first 6 dof controller relative to the first head-mountable frame when selecting the FP1 and FP2; and executing a first feature storing module to store the locations of the first 6 dof controller in the first DCF.
  • Claim: 16. The method of claim 15, further comprising: wearing, on a head of the second user, a second head-mountable frame; and moving, by the second user, a second six degree of freedom (6 dof) controller relative to the second head-mountable frame to select the FP1 and FP2; executing a second feature point location calculator to determine the locations of the second registration marker when selecting the FP1 and FP2 by determining locations of the second 6 dof controller relative to the second head-mountable frame when selecting the FP1 and FP2; and executing a second feature storing module to store the locations of the second 6 dof controller in the second DCF.
  • Claim: 17. The method of claim 13, further comprising: executing, with a first processor, a first DCF determining routine to calculate the first DCF that changes upon movement of the first head-mountable frame; and executing, with the first processor, a first DCF storing instruction to store the first DCF on the first computer-readable medium.
  • Claim: 18. The method of claim 17, further comprising: detecting, with a first real object detection device, positioning of at least one real object; identifying, with a first world object identification routine executable by the first processor, positioning of at least one point on a surface of the real object; calculating, with a first world frame determining routine executable by the first processor, a first world coordinate frame based on the at least one point; and storing, with a first world frame storing instruction executable by the first processor, the first world coordinate frame on the first computer-readable medium, wherein the first DCF determining routine determines the first DCF relative to the first world coordinate frame.
  • Claim: 19. The method of claim 18, wherein the first real object detection device is a camera.
  • Claim: 20. The method of claim 18, wherein the first real object detection device detects positioning of a plurality of real objects.
  • Current International Class: 06; 06; 06; 06; 02
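Claims 1 and 13 describe computing a user coordinate frame (UCF) from two user-selected feature points (FP1, FP2) and transforming the device coordinate frame (DCF) into it, so both viewers can render the local content against a shared frame. A minimal sketch of one way such a two-point frame construction could work; the axis conventions, the world-up hint, and all function names here are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def normalize(v):
    """Scale a vector to unit length."""
    return v / np.linalg.norm(v)

def user_coordinate_frame(fp1, fp2, up=(0.0, 1.0, 0.0)):
    """Build a 4x4 homogeneous pose for a user coordinate frame (UCF):
    origin at FP1, x-axis toward FP2, remaining axes fixed by a
    world-up hint (an assumed convention)."""
    fp1 = np.asarray(fp1, dtype=float)
    fp2 = np.asarray(fp2, dtype=float)
    x = normalize(fp2 - fp1)                                  # x-axis: FP1 -> FP2
    z = normalize(np.cross(x, np.asarray(up, dtype=float)))   # orthogonal to x and up
    y = np.cross(z, x)                                        # completes right-handed frame
    pose = np.eye(4)
    pose[:3, 0], pose[:3, 1], pose[:3, 2] = x, y, z
    pose[:3, 3] = fp1
    return pose

def dcf_in_ucf(dcf_pose, ucf_pose):
    """Re-express a device-frame pose in UCF coordinates, so content
    anchored in the UCF appears at the same real-world spot for any viewer."""
    return np.linalg.inv(ucf_pose) @ dcf_pose
```

Because each viewer derives its UCF from the same two physical feature points, content placed in UCF coordinates lines up for both users even though their DCFs differ.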
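Claims 11 and 12 tie the DCF to a gravity direction measured by a head-mounted IMU. A sketch of how a gravity-aligned frame might be derived from accelerometer samples; the averaging step, the forward hint, and the names are hypothetical simplifications (a real IMU pipeline would filter and fuse gyroscope data):

```python
import numpy as np

def gravity_direction(accel_samples):
    """Estimate the gravity direction in the headset frame by averaging
    accelerometer readings taken while the frame is roughly still."""
    g = np.mean(np.asarray(accel_samples, dtype=float), axis=0)
    return g / np.linalg.norm(g)

def gravity_aligned_dcf(g_dir, forward_hint=(0.0, 0.0, -1.0)):
    """Build a rotation whose y-axis opposes measured gravity ("up"),
    with a forward hint resolving the remaining yaw ambiguity."""
    y = -np.asarray(g_dir, dtype=float)                      # up opposes gravity
    x = np.cross(np.asarray(forward_hint, dtype=float), y)   # right axis
    x = x / np.linalg.norm(x)
    z = np.cross(x, y)                                       # completes right-handed frame
    return np.column_stack([x, y, z])
```

Anchoring one axis of the DCF to gravity keeps vertical directions consistent as the head-mountable frame moves, which is what lets the DCF determining routine recompute the frame on every movement.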
