AI-powered communication board with dwell-based eye tracking
Requires an eye tracking device (Tobii or similar) to control the cursor with your gaze.
Add OpenAI and Claude keys via Settings for AI voice and word prediction. Without them, local fallbacks are used. Keys stay in your browser.
A 9-point calibration will start after you dismiss this screen. Look at each dot until it fills to map your gaze accurately.
Hold your gaze on any button to select it. Build sentences, use word prediction, and speak with text-to-speech.
Press Enter or click the button above
How long you must hold your gaze on a button to select it. Research recommends 600–1000 ms. Longer = fewer accidental selections.
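Dwell selection can be sketched as a timer that restarts whenever the gaze leaves a button and fires once the gaze has stayed on one button for the full dwell duration. This is an illustrative sketch, not the app's actual implementation; `DwellTimer` and `update` are made-up names.

```typescript
// Dwell-based selection (sketch): a button fires once gaze has rested on it
// continuously for `dwellMs`. Names here are illustrative, not the app's API.
class DwellTimer {
  private target: string | null = null; // button currently under the gaze
  private startedAt = 0;                // when the gaze landed on it

  constructor(private dwellMs: number) {}

  // Call on every gaze sample; returns the button id once dwell completes.
  update(buttonId: string | null, nowMs: number): string | null {
    if (buttonId !== this.target) {
      this.target = buttonId; // gaze moved: restart the timer
      this.startedAt = nowMs;
      return null;
    }
    if (buttonId !== null && nowMs - this.startedAt >= this.dwellMs) {
      this.target = null;     // fire once, then require a fresh fixation
      return buttonId;
    }
    return null;
  }
}
```

A 600 ms dwell means three samples at 0, 300, and 600 ms on the same button trigger a selection; any intervening glance away resets the clock.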
Radius within which the gaze cursor magnetically snaps to the nearest button. Higher = easier targeting but less precision. 0 = off.
Smooths jittery gaze input using an exponential moving average. Higher = smoother but slower response. 0 = off.
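The smoothing described above is a standard exponential moving average: each displayed cursor position is a weighted blend of the previous smoothed position and the new raw sample. A minimal sketch, assuming a smoothing factor in [0, 1) where 0 means no filtering (function and type names are illustrative):

```typescript
// Exponential moving average for gaze jitter (sketch).
// smoothing = 0 passes raw samples through; values near 1 are very smooth
// but lag behind real eye movement.
type Point = { x: number; y: number };

function smoothGaze(prev: Point | null, raw: Point, smoothing: number): Point {
  if (prev === null || smoothing <= 0) return raw; // first sample or filter off
  const alpha = 1 - smoothing; // weight given to the new raw sample
  return {
    x: alpha * raw.x + smoothing * prev.x,
    y: alpha * raw.y + smoothing * prev.y,
  };
}
```

With smoothing 0.5, a jump from (0, 0) to (10, 10) first shows the cursor at (5, 5), which is why higher values feel smoother but slower.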
How sticky a button is once the cursor has snapped to it: a multiplier on the snap radius that the gaze must move beyond before the button releases. Higher = harder to leave a button.
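Snap and stickiness work together as hysteresis: the cursor acquires a button inside the base snap radius, but only releases it outside an enlarged radius (snap radius × stickiness). A sketch under those assumptions; all names are illustrative, not the app's actual code.

```typescript
// Magnetic snap with hysteresis (sketch). Acquire within `snapRadius`;
// once held, release only beyond `snapRadius * stickiness`.
type Pt = { x: number; y: number };
type Button = { id: string; center: Pt };

function snapTarget(
  gaze: Pt,
  buttons: Button[],
  snapRadius: number,
  stickiness: number,
  current: string | null, // button currently held, if any
): string | null {
  const dist = (a: Pt, b: Pt) => Math.hypot(a.x - b.x, a.y - b.y);

  // Hysteresis: stay on the held button until gaze leaves the release radius.
  const held = buttons.find((b) => b.id === current);
  if (held && dist(gaze, held.center) <= snapRadius * stickiness) return held.id;

  // Otherwise acquire the nearest button inside the base snap radius.
  let best: string | null = null;
  let bestD = snapRadius;
  for (const b of buttons) {
    const d = dist(gaze, b.center);
    if (d <= bestD) {
      bestD = d;
      best = b.id;
    }
  }
  return best;
}
```

The asymmetry between the acquire and release radii is what stops the cursor from flickering on and off a button when the gaze hovers near its edge.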
Larger targets improve eye gaze accuracy (research recommends a minimum of 80 px).
More points improve accuracy, especially with glasses. 9 is standard; 16 or 25 is recommended for glasses wearers, since the extra points enable polynomial correction that models lens distortion.
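A second-order polynomial correction maps each raw gaze coordinate through six coefficients per axis (constant, linear, and quadratic terms), which is why it needs more calibration points than a simple linear fit. The sketch below only applies such a correction; the least-squares fit of the coefficients from calibration data is not shown, and all names are illustrative.

```typescript
// Second-order polynomial gaze correction (sketch).
// Maps raw gaze through x' = a + b·x + c·y + d·x² + e·x·y + f·y²
// (and likewise for y'). Six coefficients per axis ⇒ at least six
// calibration points per axis are needed to fit them.
type Coeffs = [number, number, number, number, number, number]; // [a, b, c, d, e, f]

function applyPolyCorrection(
  cx: Coeffs,
  cy: Coeffs,
  x: number,
  y: number,
): { x: number; y: number } {
  const basis = [1, x, y, x * x, x * y, y * y];
  const dot = (c: Coeffs) => c.reduce((sum, ci, i) => sum + ci * basis[i], 0);
  return { x: dot(cx), y: dot(cy) };
}
```

The quadratic terms are what let the model bend the mapping near the screen edges, where eyeglass lenses distort gaze estimates the most.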
Enter your OpenAI API key for high-quality AI voice output. Falls back to browser speech if not set.
Choose the AI voice for spoken output (only used when OpenAI key is set).
Browser voice used when OpenAI key is not set.
Speed of spoken output. Higher = faster speech.
Volume of the word echo when picking words.
Volume when speaking the full sentence.
Number of word predictions shown. More = higher hit rate but more scanning.
Enter your Anthropic API key to enable AI-powered predictions. Key is stored locally in your browser only.