
Gaming systems and methods using image analysis authentication

LNW Gaming, Inc.
2023
Online Patent

Media type: Patent
Other:
  • Indexed in: USPTO Patent Grants
  • Languages: English
  • Patent Number: 11,854,337
  • Publication Date: December 26, 2023
  • Appl. No: 17/834220
  • Application Filed: June 07, 2022
  • Assignees: LNW Gaming, Inc. (Las Vegas, NV, US)
  • Claim: 1. A gaming terminal comprising: an input device configured to receive physical user input from a user; an image sensor configured to capture image data of a user area associated with the gaming terminal, the input device being at a predetermined location relative to the user area; and logic circuitry communicatively coupled to the input device and the image sensor, the logic circuitry configured to: detect user input received at the input device, the detected user input associated with a restricted action; receive, via the image sensor, input image data of the captured image data, the input image data corresponding to the detected user input; apply at least one neural network model to the input image data to classify pixels of the input image data as representing human characteristics, the human characteristics including at least one face and at least one pose model; based at least partially on (i) pixel coordinates of the human characteristics within the input image data and (ii) pixel coordinates of a user input zone within the input image data and associated with the detected user input, compare each of the at least one pose model to the user input zone and the at least one face; and in response to the comparison indicating suspicious behavior, prevent the restricted action.
  • Claim: 2. The gaming terminal of claim 1, wherein the gaming terminal includes a three-dimensional camera including the image sensor.
  • Claim: 3. The gaming terminal of claim 2, wherein each of the human characteristics has an associated depth detected by the three-dimensional camera, the logic circuitry configured to remove the human characteristics having the associated depth exceeding a depth threshold.
  • Claim: 4. The gaming terminal of claim 1, wherein, in response to preventing the restricted action, the logic circuitry is configured to present the user with an authentication challenge, and wherein, in response to the user successfully responding to the authentication challenge as an authorized user, the logic circuitry is further configured to permit the restricted action.
  • Claim: 5. The gaming terminal of claim 1, wherein the suspicious behavior is indicated by at least one of: (i) the comparison failing to match the user input zone to any face of the at least one face or any pose model of the at least one pose model, (ii) an absence of any face or pose model, or (iii) detecting a printed image or display device presenting one or more of the human characteristics.
  • Claim: 6. The gaming terminal of claim 1, wherein the input device is a touchscreen, the detected user input associated with touch coordinates indicating a location on the touchscreen of the detected user input, wherein the logic circuitry is configured to calculate the pixel coordinates of the input zone at least partially as a function of the touch coordinates.
  • Claim: 7. The gaming terminal of claim 1, wherein the logic circuitry transmits an alert to an attendant device associated with the gaming terminal in response to the indication of suspicious behavior.
  • Claim: 8. The gaming terminal of claim 7, wherein the alert is transmitted to the attendant device further in response to the user failing an authentication challenge.
  • Claim: 9. A method for authenticating a user at a gaming terminal of a gaming system, the gaming system including at least one image sensor and logic circuitry in communication with the gaming terminal and the at least one image sensor, the method comprising: receiving, by an input device of the gaming terminal, physical user input from the user, the physical user input associated with a restricted action; receiving, by the logic circuitry via the at least one image sensor, input image data that corresponds to the physical user input; applying, by the logic circuitry, at least one neural network model to the input image data to classify pixels of the received image data as representing human characteristics, the human characteristics including at least one face and at least one pose model; based at least partially on (i) pixel coordinates of the human characteristics within the input image data and (ii) pixel coordinates of a user input zone within the input image data and associated with the physical user input, comparing, by the logic circuitry, each of the at least one pose model to the user input zone and the at least one face; and in response to the comparison indicating suspicious behavior, preventing, by the logic circuitry, the restricted action.
  • Claim: 10. The method of claim 9, wherein the at least one image sensor includes a three-dimensional camera, and wherein each of the human characteristics has an associated depth detected by the three-dimensional camera, the method further comprising removing, by the logic circuitry, the human characteristics having the associated depth exceeding a depth threshold.
  • Claim: 11. The method of claim 9 further comprising, wherein, in response to preventing the restricted action, presenting, by the logic circuitry, the user with an authentication challenge, and, in response to the user successfully responding to the authentication challenge as an authorized user, permitting, by the logic circuitry, the restricted action.
  • Claim: 12. The method of claim 9, wherein the suspicious behavior is indicated by at least one of: (i) the comparison failing to match the user input zone to any face of the at least one face or any pose model of the at least one pose model, (ii) an absence of any face or pose model, or (iii) detecting a printed image or display device presenting one or more of the human characteristics.
  • Claim: 13. The method of claim 9, wherein the input device is a touchscreen, the physical user input associated with touch coordinates indicating a location on the touchscreen of the physical user input, and wherein the method further comprises calculating, by the logic circuitry, the pixel coordinates of the input zone at least partially as a function of the touch coordinates.
  • Claim: 14. The method of claim 9 further comprising transmitting, by the logic circuitry, an alert to an attendant device associated with the gaming terminal in response to the indication of suspicious behavior.
  • Claim: 15. A gaming system comprising: a gaming terminal comprising an input device configured to receive physical user input from a user; an image sensor configured to capture image data of a user area associated with the gaming terminal, the input device being at a predetermined location relative to the user area; and logic circuitry communicatively coupled to the input device and the image sensor, the logic circuitry configured to: detect user input received at the input device, the detected user input associated with a restricted action; receive, via the image sensor, input image data of the captured image data, the input image data corresponding to the detected user input; apply at least one neural network model to the input image data to classify pixels of the input image data as representing human characteristics, the human characteristics including at least one face and at least one pose model; based at least partially on (i) pixel coordinates of the human characteristics within the input image data and (ii) pixel coordinates of a user input zone within the input image data and associated with the detected user input, compare each of the at least one pose model to the user input zone and the at least one face; and in response to the comparison indicating suspicious behavior, prevent the restricted action.
  • Claim: 16. The gaming system of claim 15 further comprising a three-dimensional camera including the image sensor, wherein each of the human characteristics has an associated depth detected by the three-dimensional camera, the logic circuitry configured to remove the human characteristics having the associated depth exceeding a depth threshold.
  • Claim: 17. The gaming system of claim 15, wherein, in response to preventing the restricted action, the logic circuitry is configured to present the user with an authentication challenge, and wherein, in response to the user successfully responding to the authentication challenge as an authorized user, the logic circuitry is further configured to permit the restricted action.
  • Claim: 18. The gaming system of claim 15, wherein the suspicious behavior is indicated by at least one of: (i) the comparison failing to match the user input zone to any face of the at least one face or any pose model of the at least one pose model, (ii) an absence of any face or pose model, or (iii) detecting a printed image or display device presenting one or more of the human characteristics.
  • Claim: 19. The gaming system of claim 15, wherein the input device is a touchscreen, the detected user input associated with touch coordinates indicating a location on the touchscreen of the detected user input, wherein the logic circuitry is configured to calculate the pixel coordinates of the input zone at least partially as a function of the touch coordinates.
  • Claim: 20. The gaming system of claim 15, wherein the logic circuitry transmits an alert to an attendant device associated with the gaming terminal in response to preventing the restricted action.
  • Claim: 21. The gaming system of claim 20, wherein the alert is transmitted to the attendant device further in response to the user failing the authentication challenge.
  • Patent References Cited: 5103081 April 1992 Fisher et al. ; 5451054 September 1995 Orenstein ; 5757876 May 1998 Dam et al. ; 6460848 October 2002 Soltys et al. ; 6501982 December 2002 Ruchti et al. ; 6514140 February 2003 Storch ; 6517435 February 2003 Soltys et al. ; 6517436 February 2003 Soltys et al. ; 6520857 February 2003 Soltys et al. ; 6527271 March 2003 Soltys et al. ; 6530836 March 2003 Soltys et al. ; 6530837 March 2003 Soltys et al. ; 6533276 March 2003 Soltys et al. ; 6533662 March 2003 Soltys et al. ; 6579180 June 2003 Soltys et al. ; 6579181 June 2003 Soltys et al. ; 6595857 July 2003 Soltys et al. ; 6663490 December 2003 Soltys et al. ; 6688979 February 2004 Soltys et al. ; 6712696 March 2004 Soltys et al. ; 6758751 July 2004 Soltys et al. ; 7011309 March 2006 Soltys et al. ; 7124947 October 2006 Storch ; 7316615 January 2008 Soltys et al. ; 7319779 January 2008 Mummareddy et al. ; 7753781 July 2010 Storch ; 7771272 August 2010 Soltys et al. ; 8000505 August 2011 Gallagher ; 8130097 March 2012 Knust et al. ; 8285034 October 2012 Rajaraman et al. ; 8606002 December 2013 Rajaraman et al. ; 8896444 November 2014 Knust et al. ; 9165420 October 2015 Knust et al. ; 9174114 November 2015 Knust et al. ; 9378605 June 2016 Koyama ; 9511275 December 2016 Knust et al. ; 9795870 October 2017 Ratliff ; 9889371 February 2018 Knust et al. ; 10032335 July 2018 Shigeta ; 10096206 October 2018 Bulzacki et al. ; 10192085 January 2019 Shigeta ; 10242525 March 2019 Knust et al. ; 10242527 March 2019 Bulzacki et al. ; 10380838 August 2019 Bulzacki et al. ; 10398202 September 2019 Shigeta ; 10403090 September 2019 Shigeta ; 10410066 September 2019 Bulzacki et al. ; 10452935 October 2019 Li et al. ; 10493357 December 2019 Shigeta ; 10529183 January 2020 Shigeta ; 10540846 January 2020 Shigeta ; 10574650 February 2020 Wallace et al. 
; 10580254 March 2020 Shigeta ; 10593154 March 2020 Shigeta ; 10600279 March 2020 Shigeta ; 10600282 March 2020 Shigeta ; 10665054 May 2020 Shigeta ; 10706675 July 2020 Shigeta ; 10720013 July 2020 Main, Jr. ; 10740637 August 2020 Garcia Rodriguez et al. ; 10741019 August 2020 Shigeta ; 10748378 August 2020 Shigeta ; 10755524 August 2020 Shigeta ; 10755525 August 2020 Shigeta ; 10762745 September 2020 Shigeta ; 10846985 September 2020 Shigeta ; 10825288 November 2020 Knust et al. ; 10832517 November 2020 Bulzacki et al. ; 10846980 November 2020 French et al. ; 10846986 November 2020 Shigeta ; 10846987 November 2020 Shigeta ; 11183012 November 2021 Eager et al. ; 20030190076 October 2003 Delean ; 20050059479 March 2005 Soltys et al. ; 20060019739 January 2006 Soltys et al. ; 20180034852 February 2018 Goldenberg ; 20180053377 February 2018 Shigeta ; 20180061178 March 2018 Shigeta ; 20180068525 March 2018 Shigeta ; 20180075698 March 2018 Shigeta ; 20180114406 April 2018 Shigeta ; 20180211110 July 2018 Shigeta ; 20180211472 July 2018 Shigeta ; 20180232987 August 2018 Shigeta ; 20180239984 August 2018 Shigeta ; 20180336757 November 2018 Shigeta ; 20190043309 February 2019 Shigeta ; 20190088082 March 2019 Shigeta ; 20190102987 April 2019 Shigeta ; 20190147689 May 2019 Shigeta ; 20190172312 June 2019 Shigeta ; 20190188957 June 2019 Bulzacki et al. ; 20190188958 June 2019 Shigeta ; 20190236891 August 2019 Shigeta ; 20190259238 August 2019 Shigeta ; 20190266832 August 2019 Shigeta ; 20190318576 October 2019 Shigeta ; 20190320768 October 2019 Shigeta ; 20190333322 October 2019 Shigeta ; 20190333323 October 2019 Shigeta ; 20190333326 October 2019 Shigeta ; 20190340873 November 2019 Shigeta ; 20190344157 November 2019 Shigeta ; 20190347893 November 2019 Shigeta ; 20190347894 November 2019 Shigeta ; 20190362594 November 2019 Shigeta ; 20190371112 December 2019 Shigeta ; 20200026919 January 2020 Boiko et al. ; 20200034629 January 2020 Vo et al. 
; 20200035060 January 2020 Shigeta ; 20200098223 March 2020 Lyons et al. ; 20200118390 April 2020 Shigeta ; 20200122018 April 2020 Shigeta ; 20200175806 June 2020 Shigeta ; 20200202134 June 2020 Bulzacki et al. ; 20200226878 July 2020 Shigeta ; 20200234464 July 2020 Shigeta ; 20200242888 July 2020 Shigeta ; 20200258351 August 2020 Shigeta ; 20200265672 August 2020 Shigeta ; 20200273289 August 2020 Shigeta ; 20200294346 September 2020 Shigeta ; 20200302168 September 2020 Vo et al. ; 20200342281 October 2020 Shigeta ; 20200349806 November 2020 Shigeta ; 20200349807 November 2020 Shigeta ; 20200349808 November 2020 Shigeta ; 20200349809 November 2020 Shigeta ; 20200349810 November 2020 Shigeta ; 20200349811 November 2020 Shigeta ; 20200364979 November 2020 Shigeta ; 20200372746 November 2020 Shigeta ; 20200372752 November 2020 Shigeta ; 20210307621 October 2021 Svenson et al.
  • Other References: US 10,854,041 B2, 12/2020, Shigeta (withdrawn) cited by applicant ; “Face-detection-adas-0001”, OpenVINO™ Toolkit, retrieved Oct. 5, 2020 from: https://docs.openvinotoolkit.org/latest/omz_models_intel_face_detection_adas_0001_description_face_detection_adas_0001.html, 5 pages. cited by applicant ; “Human-pose-estimation-0001”, OpenVINO™ Toolkit, retrieved Oct. 5, 2020 from: https://docs.openvinotoolkit.org/latest/omz_models_intel_human_pose_estimation_0001_description_human_pose_estimation_0001.html, 4 pages. cited by applicant ; “ColorHandPose3D network”, Computer Vision Group, Albert-Ludwigs-Universität Freiburg, retrieved Oct. 5, 2020 from: https://github.com/lmb-freiburg/hand3d, 7 pages. cited by applicant ; Dibia, Victor, “Real-time Hand-Detection using Neural Networks (SSD) on Tensorflow.”, retrieved Oct. 5, 2020 from: https://github.com/victordibia/handtracking, 17 pages. cited by applicant
  • Primary Examiner: Garner, Werner G
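The independent claims (1, 9, and 15) all recite the same check: classify pixels into faces and pose models, then compare each pose model against both the user input zone and the detected faces, treating a failed match (or the absence of any face or pose) as suspicious. A minimal sketch of that comparison, folding in the depth filter of claims 3/10/16 and the "suspicious behavior" conditions (i) and (ii) of claim 5, might look as follows. The keypoint names (wrist, head), the 1.5 m depth threshold, and the axis-aligned boxes are all illustrative assumptions; the claims fix none of them, and condition (iii) of claim 5 (printed-image/display spoof detection) is not sketched here.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class Zone:
    """Axis-aligned rectangle in image pixel coordinates."""
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, p: Point) -> bool:
        return self.x0 <= p[0] <= self.x1 and self.y0 <= p[1] <= self.y1

@dataclass
class Face:
    box: Zone        # face bounding box in image pixels
    depth_m: float   # distance reported by the 3-D camera (claims 3/10/16)

@dataclass
class Pose:
    wrist: Point     # keypoint assumed to produce the physical input
    head: Point      # keypoint used to link this pose to a face
    depth_m: float

def is_suspicious(faces: List[Face], poses: List[Pose],
                  input_zone: Zone, depth_threshold_m: float = 1.5) -> bool:
    """True when the claimed comparison indicates suspicious behavior."""
    # Claims 3/10/16: discard detections beyond the depth threshold,
    # e.g. bystanders behind the seated user.
    faces = [f for f in faces if f.depth_m <= depth_threshold_m]
    poses = [p for p in poses if p.depth_m <= depth_threshold_m]

    # Claim 5, condition (ii): absence of any face or pose model.
    if not faces or not poses:
        return True

    # Claims 1/9/15 and claim 5(i): compare each pose model to the
    # user input zone and to the detected faces.
    for pose in poses:
        if input_zone.contains(pose.wrist):
            if any(f.box.contains(pose.head) for f in faces):
                return False  # the person touching the input also presents a face
    return True
```

On this reading, a `True` result would trigger the downstream steps the dependent claims describe: prevent the restricted action, issue an authentication challenge (claims 4/11/17), and alert an attendant device (claims 7/14/20).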
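Claims 6, 13, and 19 compute the input zone's pixel coordinates "at least partially as a function of the touch coordinates". One plausible reading, since the touchscreen sits at a predetermined location relative to the user area, is a fixed planar calibration from touch coordinates to camera pixels, with a zone of fixed size centred on the projected touch point. The affine coefficients and zone size below are hypothetical placeholders, not values from the patent.

```python
from typing import Tuple

def touch_to_zone(tx: float, ty: float,
                  half: float = 40.0) -> Tuple[float, float, float, float]:
    """Map touchscreen coordinates (tx, ty) to a pixel-space input zone.

    Hypothetical affine calibration: camera_px = A @ (tx, ty) + b,
    with A and b obtained offline from known touch/camera correspondences.
    """
    a11, a12, b1 = 0.5, 0.0, 60.0
    a21, a22, b2 = 0.0, -0.5, 700.0  # negative a22: touchscreen y is flipped
    cx = a11 * tx + a12 * ty + b1    # relative to the camera's y axis here
    cy = a21 * tx + a22 * ty + b2
    # Fixed-size square zone (x0, y0, x1, y1) centred on the projected point.
    return (cx - half, cy - half, cx + half, cy + half)
```

The resulting rectangle plays the role of `input_zone` in the comparison of the independent claims; a perspective homography would serve equally well if the touchscreen plane is tilted relative to the image plane.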
