TECHNIQUES FOR PROVIDING COMPUTER ASSISTED EYE EXAMINATIONS
This application is a continuation of U.S. application Ser. No. 15/966,944, filed on Apr. 30, 2018, which claims the benefit of U.S. Provisional Application No. 62/500,089, filed on May 2, 2017. The disclosure of each of the above applications is incorporated herein by reference in its entirety. The present disclosure relates to providing eye examinations. Physicians can perform a series of eye tests to assess a patient's vision and ability to focus. For example, a physician may test a patient's visual acuity by having the patient read letters from an eye chart posted in the physician's office. The visual acuity test may test the patient's clarity of vision. Poor performance on the visual acuity test may indicate that the patient has one or more eye disorders or diseases. The physician may write a prescription for an underperforming patient, which the patient can use to purchase corrective optics, such as contact lenses and/or glasses. In one example, the present disclosure is directed to a non-transitory computer-readable medium comprising computer-executable instructions. The computer-executable instructions cause a processing unit of a user device to instruct a user to stand a predetermined distance away from the user device, acquire an image of the user using a camera, and determine a user interpupillary distance based on the acquired image. The user interpupillary distance indicates a distance between the user's pupils in terms of pixels. The computer-executable instructions further cause the processing unit to determine a current distance between the user and the user device based on the user interpupillary distance, a predetermined interpupillary distance, a width of the acquired image in terms of pixels, and a field of view of the camera. 
Additionally, the computer-executable instructions cause the processing unit to display the current distance between the user and the user device on a display and display an eye chart on the display when the determined current distance is equal to the predetermined distance. In another example, the present disclosure is directed to a non-transitory computer-readable medium comprising computer-executable instructions. The computer-executable instructions cause a processing unit of a user device to instruct a user to stand a predetermined distance away from the user device, acquire an image of the user using a camera, and transmit the image to a remote eye test system via a network. The computer-executable instructions further cause the processing unit to receive a user interpupillary distance value from the eye test system via the network. The user interpupillary distance indicates a distance between the user's pupils in terms of pixels. Additionally, the computer-executable instructions cause the processing unit to determine a current distance between the user and the user device based on the user interpupillary distance, a predetermined interpupillary distance, a width of the acquired image in terms of pixels, and a field of view of the camera. The computer-executable instructions further cause the processing unit to display the current distance between the user and the user device on a display and display an eye chart on the display when the determined current distance is equal to the predetermined distance. The present disclosure will become more fully understood from the detailed description and the accompanying drawings. In the drawings, reference numbers may be reused to identify similar and/or identical elements. The disclosure is directed to an eye test software application and remote eye test system that assist a user in performing one or more eye tests, such as a red reflex test and/or a visual acuity test. 
The user may use their computing device (e.g., smartphone, tablet, laptop, or desktop) to run the eye test software application (hereinafter “test application”). In some implementations, while running the test application, the user device may communicate with the remote eye test system (hereinafter “test system”) in order to analyze and generate data associated with the one or more eye tests. The test system can provide eye test data associated with the eye tests to a physician for review. Example eye test data may include user information, images, and video recordings. Upon approval of the eye tests, the physician may make a prescription for the user (e.g., a prescription for contact lenses or eye glasses). The user may then order contact lenses and/or eye glasses using the test application. The test application and test system may provide users with the ability to take eye tests remotely (e.g., at home), without the presence of a physician. For example, a user may download the test application onto their computing device (referred to herein as a “user device”), take the eye tests using the test application, and submit the eye test data to the test system (e.g., a remote server). A physician may then retrieve the eye test data from the test system for review and approval. The user may be notified of the results (e.g., a prescription) via the user device and subsequently order their contact lenses. The remote eye tests may be convenient for both the user and the physician, as the testing and review may be performed at their convenience without an in-person meeting. Although the test application and test system may be used for convenient remote ordering of contact lenses (e.g., from home), the test application and test system described herein may be used in a variety of different contexts. 
For example, the test application and test system may be used for vision tests at various locations, including a physician's office, at a Department of Motor Vehicles (DMV) office, or at a kiosk in a pharmacy/store that sells contacts and/or eye glasses. For example, the functionality attributed to the user device and test application herein may be implemented in other computing devices at these locations. Although a test application installed on a user device may be used to complete the eye tests and submit eye test data, in some implementations, the test system and test application together may provide the functionality to complete the eye tests. For example, some of the functionality attributed to the test application herein may be performed by the test system (e.g., see The user device 100 executes the test application 108 and communicates with the test system 102 in order to provide the eye test functionality described herein. For example, the user device 100 may communicate with the test system 102 in order to provide an eye test, submit the eye test data to a physician 112, and facilitate the purchasing of contact lenses upon physician approval. Example user devices may include, but are not limited to, a smartphone, tablet computer, laptop computer, or desktop computer. In general, any computing device including a display and a camera (e.g., facing the same direction as the display) can be used for the eye tests. In some cases, multiple devices can be used, such as a separate camera, display (e.g., a monitor), and computing device (e.g., a desktop tower). The user device 100 can include an operating system 116, one or more web browser applications 118, the installed test application 108, and additional applications 120. Example operating systems may include, but are not limited to, ANDROID® developed by Google Inc., IOS® developed by Apple Inc., MICROSOFT WINDOWS® by Microsoft Corporation, MAC OS® by Apple, Inc., and Linux. 
Although the test application 108 can be used to provide the eye tests, in some implementations, the test system 102 may provide a web-based interface that is accessible by the web browser application 118. The web-based interface may provide similar functionality as the test application 108. The environment includes one or more shopping servers 122 that sell contact lenses to users. In some cases, the owner/operator of the test system 102 may own/operate a shopping server 122 that sells contact lenses to the users. The user may also purchase contact lenses from shopping servers 122 that are owned/operated by other parties. In some implementations, the user may use the test application 108 to purchase contact lenses from the shopping servers 122. In other implementations, the user may use the web browser 118 to purchase contact lenses from the shopping servers 122. In some cases, the digital prescription can be used to buy contact lenses in a retail store location. The user can then begin the eye tests. In the example method of In block 206, the test application 108 initiates the visual acuity test. With respect to In response to the user 110 positioning themselves at the testing distance, in block 210, the test application 108 may present the user with one or more eye test charts and assist the user in taking the visual acuity test ( The processing unit 308 can execute computer-executable instructions in the memory 310. For example, the processing unit 308 can execute the test application 108, an operating system 116, the web browser application 118, and additional applications 120, all of which can be implemented as computer-executable instructions. The memory 310 and storage device 312 can include one or more computer-readable mediums (e.g., random-access memory, hard disk drives, solid state memory drives, flash memory drives, etc.) 
that can store any suitable data that is utilized by the operating system 116 and/or any of the applications that are executed by the user device 100. In some implementations, the storage device 312 may include non-volatile memory. The network device 314 may be configured to perform wired and/or wireless communication with the network 106. The user device includes image acquisition hardware 316 that, with the camera 300, may acquire images and/or video of the user and make the images available to the test application 108. The image acquisition hardware 316 and the camera 300 may include one or more lenses, an image sensor (e.g., a charge-coupled device or complimentary metal-oxide-semiconductor sensor), lighting hardware, and circuits that support the operation of the aforementioned hardware to acquire images and/or video. The test application 108 includes an information collection and instruction module 320 (hereinafter “information module 320”) that can retrieve user information and provide information and/or instructions to the user. For example, the information module 320 may generate GUIs (e.g., The test application 108 includes a redness test module 322 that assists the user in performing an eye redness test. The redness test module 322 can generate GUIs for performing the eye redness test, such as GUIs instructing the user how to hold the user device 100 during the test. The redness test module 322 may also acquire video/images during the eye redness test and instruct the user to perform specific actions (e.g., “Please look up” of The test application 108 includes a vision test module 324 that assists the user in performing the visual acuity test. For example, the vision test module 324 can generate GUIs that assist the user in finding the proper testing distance along with a vision chart for the user to read during the visual acuity test. 
The vision test module 324 may also acquire video/images during the visual acuity test and instruct the user to perform specific actions during the vision acuity test. The image acquisition functionality associated with the redness test module 322 and the vision test module 324 may be performed by native OS libraries on the user device 100 outside of the test application 108, by code included in the test application 108 (e.g., the modules themselves), and/or a combination of native OS libraries and code included in the test application 108. Image acquisition techniques may vary based on the platform of the user device (e.g., operating system and/or device type). In general, images may be acquired via native operating system methods, frameworks, or libraries that interact with the operating system. In one specific example for IOS®, video frames may be accessed using the AVCaptureVideoDataOutputSampleBufferDelegate method and captureOutput:didOutputSampleBuffer:fromConnection, which is part of the AVFoundation framework that may be included with IOS®. With respect to ANDROID®, the onPreviewFrame callback provided in the Camera. PreviewCallback interface in the android.hardware package can be used. With respect to web browsers, a live video stream can be set up (e.g., using Web Real-Time Communication or a dedicated media streaming server) and be rendered to a canvas by calling drawlmage on a CanvasRenderingContext2D object. The test application 108 includes a shopping module 326 that the user may use to purchase items after taking the eye redness test and/or the visual acuity test. For example, the shopping module 326 may generate GUIs that a user may use to select contact lenses from one or more of the shopping servers 122. For example, using the test application 108 (e.g., the shopping module 326), the user may use a doctor's prescription to purchase the appropriate contact lenses. 
The prescription can include the parameters and brand of the contact lenses, as well as any other metadata (e.g., required by state law). Contact lens parameters may include, for each eye, Base Curve, Sphere, Cylinder, Axis, Add., Prism, Diameter, and sometimes color. Example metadata can include issuing state, the prescribing doctor's name signature and license number, address of the doctor's office, issue date, and expiration date. The test application 108 stores application data 328 associated with the test application 108. Example application data 328 may include user information or other information collected by the information module 320. Application data 328 may also include redness test data and vision test data generated by the redness test module 322 and the vision test module 324, respectively. The user information (or other information), redness test data, and vision test data may be referred to herein generally as “eye test data.” In some cases, the application data 328 may include data received from the eye test system 102, such as data included in the physician response (e.g., prescription data). The test system 102 includes a server eye test module 330 (hereinafter “server test module 330”) that communicates with the test application 108. For example, the server test module 330 may receive data from the test application 108, such as user information, redness test data, and vision test data. The server test module 330 can store the received data in the system data store 332. In some implementations (e.g., see The test system 102 includes a physician interface module 334 that interfaces with physician devices 104. The physician interface module 334 may generate GUIs for the physicians and transmit eye test data to the physician devices 104 for review. 
For example, the physician interface module 334 may generate GUIs for evaluating the eye test data, such as GUI elements that present audio, video, and/or images of the user taking the redness and visual acuity tests. Additionally, the GUIs may show the test chart(s) presented to the user during the visual acuity test. The physician interface module 334 may also receive physician response data and store the data in the system data store 332. Example physician response data may include, but is not limited to, the physician name, approval or disapproval of the tests, reasons for the decision, prescription data, and additional notes. The test system 102 includes a prescreening module 336. The prescreening module 336 may generate GUIs that operators (e.g., employees) of the test system 102 may use to prescreen received eye test data. For example, the operators may screen for completed user information, check the quality of the audio/video associated with the tests (e.g., proper lighting in the images/videos and proper sound acquisition), determine whether the user is at the proper distance, and screen for other factors. If the eye test data is not satisfactory, then the operator may notify the user that the user should retake one or more of the eye tests. Such a prescreening functionality may ensure quality eye test data that optimizes the physician's time. Note that Upon starting the test application 108, the information module 320 generates GUIs for collecting user information and providing instructions in block 402. In In block 404, the redness test module 322 assists the user in taking the eye redness test and acquires eye redness test data. The redness test module 322 may then generate GUIs that display the user's face and eyes (see The redness test module 322 may acquire the video and/or images of the user while the user is taking the eye redness test. 
For example, the redness test module 322 may acquire the video and/or images of the user while they are looking up. Upon completion of the eye redness test (e.g., after looking up), the user may be prompted to indicate whether they performed the test properly (e.g., see After completion of the eye redness test, the user device 100 may store the redness test data and/or transmit the redness test data to the test system 102 for later review by the physician. The redness test data may include the video and/or images of the user taking the eye redness test. The physician may use the video and/or images to determine how the user's contact lenses sit on the user's eyes and also to ensure there are no signs of inflammation. In block 406, the vision test module 324 initiates the visual acuity test and assists the user into the proper position. For example, the vision test module 324 may generate a GUI that instructs the user to stand at the testing distance. The vision test module 324 then calculates the user's distance from the user device 100 and generates GUIs that assist the user into the proper position. The GUIs may include a number (e.g., in feet or meters) that indicates the user's distance from the user device 100. After the user is positioned at the testing distance, the vision test module 324 may generate an eye chart GUI and record the user's reading of the eye chart in block 408. While viewing the GUI of Although the figures illustrate how the test application 108 can update distance numbers on the GUI, in other implementations, the user device 100 (e.g., according to the test application 108) may provide additional, or alternative, types of feedback to the user to aid the user in finding the proper testing distance. In one example, the GUI may display commands to the user indicating that the user should come closer to the user device 100 or move away from the user device (e.g., “Approach” or “Move back”). 
In another example, the test application 108 may provide audio cues to the user (e.g., via the speaker 304) to indicate how the user should be positioned. For example, the test application 108 may provide audio cues indicating the distance between the user and the user device 100 (e.g., the measurement in feet). As another example, the test application 108 may provide audio cues including instructions for the user to follow, such as “back up” or “move forward.” The test application 108 may provide a variety of visual and audio cues when the user has reached the correct testing position. For example, the GUI may display 10 ft when the user has reached the correct position. The GUI may also display text indicating that the user has reached the correct position and that the user should remain in the position for the visual acuity test. As another example, the GUI may change colors when the user has reached the correct position. For example, the GUI may flash a green color to indicate that the user has reached the proper testing distance (e.g., 10 ft). In some implementations, the test application 108 may provide audio cues to the user indicating that the user has reached the correct distance and that the user should maintain the position for the subsequent visual acuity test. In general, the test application 108 will direct the user to test each of their eyes independently (e.g., by covering one eye). The user may also be directed to test both eyes (e.g., without covering). The test application 108 may provide audio cues to the user to indicate that the user should cover an eye and read the letters presented on the display. In some implementations, the test application 108 may provide the instructions as text on the screen. The vision test module 324 may record video and audio of the visual acuity test as the user is taking the test. 
For example, the vision test module 324 may record video and audio of the user covering/uncovering their eyes and reading the presented text. The test application 108 may provide the video/audio recording and corresponding exam chart (e.g., the correct characters) to the test system 102 so that the physician can review the video test along with the exam chart. In some implementations, the vision test module 324 may generate random characters for the visual acuity test so that the user is prevented from memorizing the test characters. Referring back to In block 412, the test system 102 (e.g., the server test module 330) processes the eye test data received from the test application 108. For example, the server test module 330 may format and store the received eye test data in the server data store 332 for later use. In some implementations, an operator (e.g., employee) of the test system 102 may pre-screen the received eye test data before the data is made available to a physician for review. In block 414, the physician accesses the eye test data via the physician interface module 334. In block 416, the physician provides a physician response (e.g., a prescription) to the test system 102 after reviewing the eye test data. In block 418, the test system 102 makes the prescription available to the user device 100. For example, the test system 102 may transmit the prescription to the user device 100 and/or email the prescription to the user. The user may use the received prescription to purchase contact lenses. For example, in block 420, the shopping module 326 generates a contact lens shopping GUI for the user based on the received prescription data. The shopping module 326 may access one or more shopping servers 122 and present a variety of contact lenses to the user for purchase. In some implementations, the shopping module 326 may filter the available lenses according to the prescription. 
In block 422, the user may place an order for contact lenses in the shopping GUI. Although the user may place the order for contact lenses after receiving a prescription, in some implementations, the user may place the order before the prescription is confirmed, which may result in a cancellation of the order if the prescription is denied. In block 1002, the vision test module 324 acquires an image of the user (e.g., using the camera 300) as the user is finding the testing position. In block 1004, the vision test module 324 determines the distance of the user from the user device 100 based on the acquired image. For example, the vision module 324 may determine the user's distance based on the user's interpupillary distance (IPD), a predetermined IPD value associated with a population of individuals, and other factors described with respect to If the vision test module 324 determines that the user is at the proper testing distance in block 1006, the vision test module 324 may initiate the eye chart portion of the visual acuity test in block 1010 (e.g., as described with respect to As described herein, the test application 108 (e.g., the vision test module 324) and/or the test system 102 (e.g., the sever test module 330) may determine the distance of the user from the user device 100 based on an acquired image (e.g., a single image and/or video). In The illustration, annotations, and equations in The acquired image 1108 includes the user's face and eyes. The acquired image 1108 has a width of L2, in terms of pixels. The image width L2 (in pixels) may be determined based on analysis of the acquired image 1108 and/or based on camera specifications associated with the user device. The field of view (FoV) of the camera, illustrated as theta (e), can also be determined based on camera specifications associated with the user device 100. The user device 100 may provide the various camera specifications to the vision test module 324 in some examples. 
In other examples, the test system 102 may store the camera specifications (e.g., width L2 in pixels and/or e), which may be retrieved by the test application 108. The focal length (f) may then be determined based on L2 and e, for example, using the equation f=L2/(2*tan(e/2)), as illustrated in The user device 100 and/or the test system 102 may process the acquired image to determine the IPD distance L1 in terms of pixels. For example, the user device 100 and/or the test system 102 may identify the user's face and the user's eyes/pupils in the acquired image and then determine the number of pixels between the user's pupils. The test application 108 and/or the test system 102 may include a stored IPD value (e.g., predetermined IPD value), referred to in Although a median IPD value of a population may be used, other IPD values associated with the population may be chosen. For example, a mode or average IPD value may be used, assuming the value(s) can provide sufficient calculation of user distance. In some cases, the mode value in the population may be similar to the median value in the population. Predetermined IPD values may also be stored based on several demographic factors including age, sex, and race. In this case, the user may specify age, sex, and/or race, which can be used for selection of the IPD values. In some implementations, the test application may use a linear regression or classifier to better predict the IPD of a patient using a combination of any of these factors. In other implementations, the test application may ask the user to specify their measured IPD. For example, the user may specify their measured IPD (e.g., as measured by a doctor), which may be included on their external glasses or contact lens prescription in some cases. The mathematical model in The test application 108 and/or the test system 102 may determine the distance between the user and the user device 100 based on the values determined above. 
For example, the test application 108 and/or the test system 102 may determine the distance d between the user and the user device 100 using the following equation: d=(median IPD)*f/L1. The user device 100 may report this distance to the user (e.g., In block 1302, the image acquisition module 1200 acquires an image of the user from the camera 300 and the image acquisition hardware 316 as the user is finding the testing position. In block 1304, the feature detection module 1202 process the acquired image to detect the user's pupils. For example, the feature detection module 1202 may identify the user's face in the acquired image and then identify the user's eyes/pupils in the acquired image. The feature detection module 1202 may detect facial regions in an image or video frame using a trained classifier (e.g., a trained histogram of oriented gradients (HOG) identifier) or Convolutional Neural Network. The feature detection module 1202 may then detect facial landmarks including pupils, within each facial region of interest, using another trained classifier (e.g., an ensemble of regression trees). In some implementations, the feature detection module 1202 may detect the user's pupils locally (e.g., using local processing on the user device). In other implementations (e.g., In some implementations, the feature detection module 1202 may detect the user's eyes and then estimate the location of the user's pupils within the user's eyes. For example, the feature detection module 1202 may detect the user's eyes and set the user's pupil location at the center of the eyes. As another example, the feature detection module 1202 may detect multiple points around the border of where the eyelid opens and then calculate the position of the pupil based on the multiple points (e.g., by calculating an average position). In block 1306, the IPD determination module 1204 determines the user's IPD distance in pixels and the width of the image L2 in pixels. 
In block 1308, the user distance module 1206 determines the distance of the user d based on the median IPD value, the IPD in pixels, the image width L2 in pixels, and the camera FoV, as described with respect to In block 1312, the user distance module 1206 determines whether the user is at the proper testing distance. If the user is not at the proper testing distance, the user distance module 1206 may prompt the user to move to the proper testing distance in block 1314 (e.g., The vision test module 324 includes a capabilities assessment module 1212 (hereinafter “assessment module 1212”) that may assess the capabilities of the user device's hardware and/or software. For example, the assessment module 1212 may determine camera properties, such as FoV and image capture dimensions (e.g., image width). The assessment module 1212 may also determine other user device properties, such as display size and display resolution. Additionally, the assessment module 1212 may determine whether the user device 100 is capable of feature detection, such as face detection, eye detection, and/or pupil detection. The assessment module 1212 may also determine the network connection speed between the user device 100 and the test system 102. In some cases, the assessment module 1212 can retrieve the information related to the user device 100 from the application data 328 and/or the system data store 332. Additionally, or alternatively, the assessment module 1212 may query the user device 100 to determine properties of the user device 100. In some implementations, the assessment module 1212 can also determine whether the user granted permissions to access the camera and microphone. The assessment module 1212 may also assess the level of volume to ensure that instructions can be heard. The assessment module 1212 can be configured to query the operating system for screen sizes and resolutions on some user devices (e.g., for specific user devices listed in the app data 328). 
The app data 328 may also include the screen size/resolution for a variety of specified user devices. For a web-based implementation, since screen resolution varies among user devices, and programmatic access to the screen size/resolution may be inaccessible, the assessment module 1212 may ask the user to perform a set of steps for determining screen size/resolution. For example, the user may be prompted to place an object of known size (e.g., a credit card) over a resizable object (e.g., a resizable credit card) on the display. Then the user can resize the object to fit the known size object. Using this data, the assessment module 1212 may map a fixed width (e.g., card width) to a number of pixels, and then calculate the requisite size of the letters based on this mapping. The system feature detection module 1400 and the system IPD determination module 1402 may perform similar functions as the device feature detection module 1202 and the device IPD determination module 1204, respectively. The functionality may be offloaded from the user device 100 to the test system 102 for a variety of reasons. In some cases, the user device 100 may lack sufficient hardware/software computing resources to provide the functionality associated with the device feature detection module 1202 and the device IPD determination module 1204. For example, the user device 100 may lack sufficient processing power and/or the proper software libraries to implement the functionality associated with the device feature detection module 1202 and the device IPD determination module 1204. The assessment module 1212 may determine whether the user device 100 includes sufficient hardware/software computing resources to provide such functionality. 
If the user device 100 does not include sufficient computing resources, the vision test module 324 may offload the functionality of the device feature detection module 1202 and the device IPD determination module 1204 to the test system 102. In some implementations, the division of processing tasks between the user device 100 and the test system 102 may vary based on the internet connection speed between the user device 100 and the test system 102, as determined by the assessment module 1212. For example, for faster network connections (e.g., greater than a threshold network speed), the user device 100 may offload the image processing to the test system 102 by transmitting the videos/images to the test system 102 for processing. On the other hand, if the network connection is slow (e.g., less than a threshold network speed), the user device 100 may attempt to minimize the data transmitted to the test system 102 via the network 106 (e.g., by compressing the videos and/or cropping/masking the images). Although the vision test module 324 is described as including the device feature detection module 1202 and the device IPD determination module 1204, this functionality may instead be provided by the test system 102, as described above.

In some cases, the distance determination may be subject to some uncertainty when using the median IPD, since actual IPD deviates among users. The user device 100 and/or the test system 102 may determine a confidence score associated with the determined distance that reflects a level of confidence in the distance. For example, the confidence score may represent a statistical confidence that the patient is 10 feet or more away from the camera. The confidence score may be included in the eye test data that is reviewed by the physician.

The user device 100 and/or the test system 102 may compute the confidence score in the following manner. The user device 100 and/or the test system 102 may determine a test IPD value (e.g., in actual distance) based on the measured IPD in pixels.
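The speed-dependent split described above might be sketched as a simple policy function; the 10 Mbps threshold and the three outcome labels are purely illustrative assumptions, not values from the source:

```python
def processing_plan(connection_mbps: float, device_can_detect_features: bool,
                    threshold_mbps: float = 10.0) -> str:
    """Choose where feature detection / IPD measurement runs.

    Fast link: stream full frames to the test system. Slow link: process
    on-device when possible; otherwise compress/crop before offloading.
    """
    if connection_mbps >= threshold_mbps:
        return "remote"             # fast link: send full videos/images
    if device_can_detect_features:
        return "local"              # slow link, capable device: send results only
    return "remote-compressed"      # slow link, incapable device: shrink payload
```

A capable device on a slow connection thus keeps the work local, matching the data-minimization behavior described above.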
The test IPD may be the IPD that the user would have to have in order to be a target distance away from the user device 100 (e.g., 10 ft). The user device 100 and/or the test system 102 may then use a complementary cumulative distribution function (e.g., a tail function) to compute the probability that a human could have an IPD greater than or equal to the test IPD. The computed probability may indicate the probability that the user is at least the target distance away. In a specific example, a user standing an unknown distance from the camera may have to have a hypothetical IPD of 60 mm in order to be at least 10 feet away. In this case, the computation may return a confidence value of 0.81, indicating that there is an 81% probability that the user is at least 10 feet away.

Although the user device 100 and/or the test system 102 may use a single median IPD value across a plurality of users, in some implementations, the user device 100 and/or the test system 102 may determine an IPD value specifically for the user. For example, the user device 100 and/or the test system 102 may analyze an image to determine user characteristics, such as the user's age, sex, and/or race. The user device 100 and/or the test system 102 may then determine an IPD value for the particular user based on the determined characteristics. For example, the user device 100 and/or the test system 102 may select a different IPD value from a set of IPD values associated with the determined characteristics and/or modify the median IPD value based on the determined characteristics. In these implementations, the user-specific IPD value may be more accurate than the median IPD value.

The user device 100 and/or the test system 102 may implement alternative techniques for determining the distance between the user and the user device 100.
For example, the user device 100 and/or the test system 102 may implement a trained classifier to determine the distance between the user device 100 and the user. The trained classifier may be trained on a variety of faces at representative distances from the camera. In these implementations, the classifier may take as input a series of images and/or a snapshotted video. The classifier may then return a distance based on the inputted images. The user device 100 and/or the test system 102 may implement the classifier as an alternative to the IPD-based distance determination described above.

Although the distance determination is described herein as being used for the visual acuity test, in some implementations, the user device 100 and/or the test system 102 may determine the distance between the user and the user device 100 for other eye tests, such as the eye redness test. In these implementations, the user device 100 and/or the test system 102 may determine the user distance based on the median IPD and/or another technique (e.g., using a trained classifier).

In some implementations, the user device 100 and/or the test system 102 may be configured to identify the user among other people and objects included in the acquired image. Put another way, the user device 100 and/or the test system 102 may isolate the user from other people and/or objects in the acquired image. For example, the user device 100 and/or the test system 102 may independently track the user in the image so that the user can properly take the eye tests in the presence of others.

As described herein, the user device 100 (e.g., according to the test application 108) and/or the test system 102 may provide functionality associated with providing eye tests. Put another way, the user device 100 (e.g., the test application 108) and the test system 102 may share in providing the eye testing functionality described herein.
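The idea of isolating and tracking the user among several detected faces could be sketched with a simple heuristic; everything here (nearest-to-previous-center matching, largest-box fallback) is an illustrative assumption, not the source's method:

```python
def select_user_face(faces, previous_center=None):
    """Pick the tracked user's face from detected boxes (x, y, w, h).

    With no tracking history, fall back to the largest (presumably
    nearest) face; otherwise prefer the box whose center is closest
    to the previously tracked center.
    """
    if not faces:
        return None
    if previous_center is None:
        return max(faces, key=lambda b: b[2] * b[3])
    px, py = previous_center

    def dist_sq(b):
        cx, cy = b[0] + b[2] / 2, b[1] + b[3] / 2
        return (cx - px) ** 2 + (cy - py) ** 2

    return min(faces, key=dist_sq)
```

Re-running this per frame and feeding each selection's center back in as `previous_center` keeps the test locked onto one person even when bystanders enter the frame.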
Accordingly, functionality attributed herein to the user device 100 and/or the test system 102 may be provided individually by either the user device 100 or the test system 102, or shared between the user device 100 and the test system 102. The division of the various functions between the user device 100 and the test system 102 may vary depending on factors described herein, or for other reasons.

The modules and data stores described herein may be embodied by electronic hardware, software, firmware, or any combination thereof. Depiction of different features as separate modules and data stores does not necessarily imply whether the modules and data stores are embodied by common or separate electronic hardware or software components. In some implementations, the features associated with the one or more modules and data stores depicted herein may be realized by common electronic hardware and software components. In other implementations, they may be realized by separate electronic hardware and software components.

The test system 102 may include one or more computing devices that are configured to implement the techniques described herein. Each of the one or more computing devices may include any combination of electronic hardware, software, and/or firmware described above. For example, each of the one or more computing devices may include any combination of processing units, memory components, and storage devices. The one or more computing devices of the test system 102 may also include various human interface devices, including, but not limited to, display screens, keyboards, pointing devices (e.g., a mouse), touchscreens, speakers, and microphones. In some examples, the test system 102 may include one or more server computing devices configured to communicate with user devices 100.
The one or more computing devices may reside within a single machine at a single geographic location in some examples. In other examples, the one or more computing devices may reside within multiple machines at a single geographic location. In still other examples, the one or more computing devices of the test system 102 may be distributed across a number of geographic locations.

A computer-readable medium includes computer-executable instructions that cause a processing unit of a user device to instruct a user to stand a predetermined distance from the user device, acquire an image of the user, and determine a user interpupillary distance based on the acquired image. The user interpupillary distance indicates a distance between the user's pupils in terms of pixels. The computer-executable instructions further cause the processing unit to determine a current distance between the user and the user device based on the user interpupillary distance, a predetermined interpupillary distance, a width of the acquired image in terms of pixels, and a field of view of the camera. Additionally, the computer-executable instructions cause the processing unit to display the current distance between the user and the user device on a display and display an eye chart when the determined current distance is equal to the predetermined distance.

1. A non-transitory computer-readable medium comprising computer-executable instructions that cause a processing unit of a user device to:
instruct a user to stand at a visual acuity testing distance away from the user device, wherein the visual acuity testing distance is a predetermined distance away from the user device for taking a visual acuity examination;
acquire an image of the user using a camera;
determine a user interpupillary distance based on the acquired image, wherein the user interpupillary distance indicates a distance between the user's pupils in terms of pixels;
determine a current distance (d) between the user and the user device based on a width of the acquired image in terms of pixels, a field of view of the camera, and the equation d=(IPD)*f/L, wherein IPD is a predetermined interpupillary distance that is representative of a population of people, f is the focal length of the camera in pixels, and L is the user interpupillary distance in pixels;
display the current distance between the user and the user device on a display included on the user device;
display an eye chart on the display in response to the user reaching a distance from the user device that is equal to the visual acuity testing distance;
acquire testing images and testing audio of the user reading the displayed eye chart at the visual acuity testing distance; and
receive a user input indicating that the user has completed reading the displayed eye chart.

2. The computer-readable medium of
instruct the user to move towards the user device if the determined current distance is greater than the visual acuity testing distance; and
instruct the user to move away from the user device if the determined current distance is less than the visual acuity testing distance.

3. The computer-readable medium of
detect the user's pupils in the acquired image; and
determine the user interpupillary distance based on the detected user's pupils in the acquired image.

4.
The computer-readable medium of
detect the user's eyes in the acquired image;
estimate the location of the user's pupils within the user's eyes; and
determine the user interpupillary distance based on the estimated location of the user's pupils in the acquired image.

5. The computer-readable medium of
6. The computer-readable medium of
7. The computer-readable medium of
8. The computer-readable medium of
9. The computer-readable medium of
10. The computer-readable medium of
11. The computer-readable medium of

12. The computer-readable medium of
acquire user information from the user, wherein the user information includes at least one of a user age, a user location, and a date of a last eye examination; and
transmit the user information to the remote eye test system.

13. The computer-readable medium of
prompt the user to begin an eye redness test;
acquire eye redness test images of the user during the eye redness test; and
transmit the eye redness test images to the remote eye test system.

14. The computer-readable medium of
acquire an image of an object of known size;
determine a letter size based on the image of the object of known size; and
display letters in the eye chart that are sized according to the letter size.

15. The computer-readable medium of

16. A non-transitory computer-readable medium comprising computer-executable instructions that cause a processing unit of a user device to:
instruct a user to stand at a visual acuity testing distance away from the user device, wherein the visual acuity testing distance is a predetermined distance away from the user device for taking a visual acuity examination;
acquire an image of the user using a camera;
transmit the acquired image to a remote eye test system via a network;
receive a user interpupillary distance value from the remote eye test system via the network, wherein the user interpupillary distance value indicates a distance between the user's pupils in terms of pixels;
determine a current distance (d) between the user and the user device based on a width of the acquired image in terms of pixels, a field of view of the camera, and the equation d=(IPD)*f/L, wherein IPD is a predetermined interpupillary distance that is representative of a population of people, f is the focal length of the camera in pixels, and L is the user interpupillary distance in pixels;
display the current distance between the user and the user device on a display included on the user device;
display an eye chart on the display in response to the user reaching a distance from the user device that is equal to the visual acuity testing distance;
acquire testing images and testing audio of the user reading the displayed eye chart at the visual acuity testing distance; and
receive a user input indicating that the user has completed reading the displayed eye chart.

17. The computer-readable medium of
instruct the user to move towards the user device if the determined current distance is greater than the visual acuity testing distance; and
instruct the user to move away from the user device if the determined current distance is less than the visual acuity testing distance.

18. The computer-readable medium of
19. The computer-readable medium of
20. The computer-readable medium of
21. The computer-readable medium of
22. The computer-readable medium of
23. The computer-readable medium of
24. The computer-readable medium of

25.
The computer-readable medium of
acquire user information from the user, wherein the user information includes at least one of a user age, a user location, and a date of a last eye examination; and
transmit the user information to the remote eye test system.

26. The computer-readable medium of
prompt the user to begin an eye redness test;
acquire eye redness test images of the user during the eye redness test; and
transmit the eye redness test images to the remote eye test system.

27. The computer-readable medium of
acquire an image of an object of known size;
determine a letter size based on the image of the object of known size; and
display letters in the eye chart that are sized according to the letter size.

28. The computer-readable medium of
![](/ipUS20210022603A1/0.png)
![](/ipUS20210022603A1/1.png)
![](/ipUS20210022603A1/2.png)
![](/ipUS20210022603A1/3.png)
![](/ipUS20210022603A1/4.png)
![](/ipUS20210022603A1/5.png)
![](/ipUS20210022603A1/6.png)
![](/ipUS20210022603A1/7.png)
![](/ipUS20210022603A1/8.png)
![](/ipUS20210022603A1/9.png)
![](/ipUS20210022603A1/10.png)
![](/ipUS20210022603A1/11.png)
![](/ipUS20210022603A1/12.png)
![](/ipUS20210022603A1/13.png)