
Keystroke Aura — How I Turned Typing Rhythm into a Personality Test

Built a monthly playground toy that measures your typing rhythm and maps it to one of 64 named auras with dual rarity scores. Here's how the feature vector + cosine distance pipeline works.

by Jay · 10 min read · VIBE.LOG

Series: VIBE.LOG

  1. The Layout Vocabulary Cheat Sheet: What to Call That Thing on Your Screen
  2. I Spent 3 Hours Trying to Proxy a Blog Subdomain. Here's My Descent Into Madness.
  3. The Complete SEO Guide: How to Make Google Actually Notice Your Website
  4. Why Your Next.js Favicon Isn't Showing (And the Three Ways to Actually Fix It)
  5. GitHub Keeps Telling Me My Branch Is Fine. And Also Not Fine. At the Same Time.
  6. Mobile-First Playground: Making an Astrology Grid Actually Work on a Phone (And Go Viral While Doing It)
  7. Playground Is Live: The Destiny Grid, Real Astrology, and Why I'm Shipping a Toy Every Month
  8. The Interactive Component Cheat Sheet: What to Call That Clickable Thing
  9. Google Rejected My Site for 'Low-Value Content.' Here's What I Actually Fixed.
  10. I Actually Fixed Everything. Here's What That Looked Like.
  11. I Hired 131 AI Employees Today. Here's How.
  12. I Let My AI Run 72 Backtests While I Watched. It Picked the Winner.
  13. I Taught My AI to Stop Asking Questions. It Took Five Rewrites.
  14. Obsidian Turned My Scattered Notes Into a Second Brain. Here's How to Set It Up.
  15. The Destiny Grid Gets Its East Wing: I Rebuilt Saju (四柱八字) in TypeScript
  16. Molecule Me: Your Personality, Encoded in Chemistry
  17. OpenAI Just Built a Plugin for Their Competitor's Tool. I Installed It.
  18. I Combined Two Open-Source Repos Into an AI That Plans, Builds, and Reviews Its Own Code
  19. I Built a Weekly Directory for Claude Code Agents (Because My Brain Couldn't Keep Up)
  20. Karpathy Tweeted an Idea. I Spent a Day Putting It in My Obsidian Vault.
  21. Keystroke Aura — How I Turned Typing Rhythm into a Personality Test ← you are here

I was about to ship another "enter your name" personality toy.

You know the kind. Type your birthday, get a star sign. Type your name, get a Hogwarts house. I'd already built two of those (Destiny Grid and Molecule Me) and the third one in the sketch pile was, honestly, more of the same.

Then my cursor paused on the name field just long enough for me to notice: the rhythm itself could be the input.

Not what you type. How you type it. The little drag after a comma. The three-character burst before a space. The way your hand hesitates on vowels that remind you of someone. You never notice you do this, but you do it the same way every time. Your typing has a fingerprint, and nobody had turned it into a personality quiz yet.

So I built one. This is Toy #3 in Playground, and it is called Keystroke Aura.


What It Actually Does

You pick one of four sentences — melancholy, anger, joy, calm — and type it. That's it. No name field, no birthday, no "what's your favorite color." The app watches how you type and hands you back an aura.

┌─────────┐    ┌──────────┐    ┌─────────┐    ┌──────────┐
│ Landing │───>│ Sentence │───>│ Typing  │───>│ Reveal   │
└─────────┘    │  Picker  │    │ (capture│    │  (aura + │
               │  (4)     │    │ rhythm) │    │  rarity) │
               └──────────┘    └─────────┘    └──────────┘

Four phases. Each one is a single screen, no modals, no "sign in to continue." The sentence picker is intentionally small — four options, four cohorts, done — because the more choices I gave testers, the longer they stared at the screen and the less they actually typed. Decision fatigue is real and it is sneaky.

The Typing phase captures every keydown and keyup with performance.now() timestamps. The Reveal phase does the math and shows you your result. The whole round takes about 20 seconds.
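The capture phase can be sketched in a few lines. This is a minimal version of what that screen might do — the class and field names are my guesses, not the app's actual schema; the real handlers would pass `performance.now()` in from the browser's keydown/keyup events:

```typescript
// Minimal sketch of the capture phase. Call sites pass
// performance.now() from the real keydown/keyup handlers;
// names here are illustrative, not the app's actual schema.
type KeyEvent = { key: string; type: "down" | "up"; t: number };

class RhythmRecorder {
  events: KeyEvent[] = [];

  onKeyDown(key: string, t: number) {
    this.events.push({ key, type: "down", t });
  }

  onKeyUp(key: string, t: number) {
    this.events.push({ key, type: "up", t });
  }

  // Gaps between consecutive keydowns — the raw material for
  // mean_interval_ms, variance, entropy, and friends.
  intervals(): number[] {
    const downs = this.events.filter((e) => e.type === "down").map((e) => e.t);
    return downs.slice(1).map((t, i) => t - downs[i]);
  }

  // How long each key was held: keyup minus the matching keydown.
  holds(): number[] {
    const out: number[] = [];
    const open = new Map<string, number>();
    for (const e of this.events) {
      if (e.type === "down") open.set(e.key, e.t);
      else if (open.has(e.key)) {
        out.push(e.t - open.get(e.key)!);
        open.delete(e.key);
      }
    }
    return out;
  }
}
```

Everything downstream — the feature vector, the centroids, the rarity math — works off those two arrays.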

How to prompt AI:

"Design a 4-phase flow for a typing rhythm personality test. Phase 1 is a landing screen with one CTA. Phase 2 is a sentence picker with exactly 4 options, one per emotional cohort. Phase 3 captures keydown/keyup events with performance.now timestamps. Phase 4 reveals the result. No login, no modals, no branching. State machine, not a wizard."


How the Rhythm Becomes an Aura

This is the part where I pretend to be a data scientist.

Every typing session gets distilled into a 9-dimensional feature vector. Not raw keystrokes — features extracted from them:

  • mean_interval_ms — average gap between keys
  • interval_variance — how uneven that gap is
  • interval_entropy — how unpredictable the rhythm is
  • mean_hold_ms — how long you hold each key
  • pause_count_norm — how often you pause longer than expected
  • correction_burst — how many backspaces show up in clusters
  • tempo_delta — whether you speed up or slow down mid-sentence
  • consistency_index — how self-similar your rhythm is
  • error_rate — typos per character

That vector gets normalized (milliseconds and ratios can't be compared raw) and compared to 64 precomputed centroids via cosine distance. The nearest centroid wins. That centroid is an aura — a family × variant pair, like staccato × copper or legato × velvet.

your typing     feature         nearest        your aura
 ──────────> [ 9-dim vector ] ──────────> [ centroid ] ─────> "Velvet Legato"
  (20s)       extract + norm   cosine      1 of 64
                               distance

8 families × 8 variants = 64 auras. The family is the rhythm shape (how you type). The variant is the aesthetic tone (how the aura looks — copper, velvet, glass, ember, etc.). Same rhythm + different palette = related but distinct auras. You and your friend might both be Legato typists, but one of you is Velvet and the other is Glass, and that actually matters to people when they screenshot it.
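The classification step itself is small. Here's a sketch of the mechanics the post describes — per-feature normalization, then cosine distance to each centroid, nearest one wins. The centroid values, scales, and two-entry aura list are invented for illustration; the real table has 64:

```typescript
// Nearest-centroid lookup. Centroid values and scales are invented;
// only the mechanics follow the pipeline described above.
type Aura = { family: string; variant: string; centroid: number[] };

function cosineDistance(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] ** 2;
    nb += b[i] ** 2;
  }
  return 1 - dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Divide each feature by a per-feature scale so milliseconds,
// ratios, and counts are comparable before the distance is taken.
function normalize(v: number[], scales: number[]): number[] {
  return v.map((x, i) => x / scales[i]);
}

function classify(features: number[], scales: number[], auras: Aura[]): Aura {
  const v = normalize(features, scales);
  return auras.reduce((best, a) =>
    cosineDistance(v, a.centroid) < cosineDistance(v, best.centroid) ? a : best
  );
}
```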

How to prompt AI:

"I need to classify a 9-dim feature vector into one of 64 categories. The categories are a 2D grid: 8 rhythm families × 8 aesthetic variants. Give me a centroid-based approach where each category has its own centroid vector, distance is cosine, and normalization weights are per-feature because features are on different scales (ms vs ratio vs count)."


The Two-Rarity Trick

Here's the design move I'm proudest of: every aura gets two percentiles, not one.

┌────────────────────────┐
│   [pulsing orb]        │
│                        │
│   Velvet Legato        │
│                        │
│   Top 0.8% globally    │ ← your identity
│   Top 4.2% among       │ ← your tribe
│   Anger typists        │
│                        │
└────────────────────────┘

The first number is your global rarity — how unusual your rhythm is against every typist on the platform. The second is your cohort rarity — how unusual you are among people who picked the same sentence. Anger typists. Melancholy typists. Joy typists. Calm typists.

Why both? Because one number gives you identity, and the other gives you tribe.

If you only see "top 0.8% globally," you feel special but isolated. If you only see "top 4.2% among Anger typists," you feel like a member of a club but you don't know if the club is rare. You need both to feel located — "I'm unusual, and here's the specific corner of humanity I'm unusual in."

Under the hood, both percentiles come from Firestore histograms. One doc per (global, device), one doc per (cohort, device). 100 buckets each. Every session contributes to both histograms and reads back its own percentile in the same round-trip. Fast enough that the Reveal screen feels instant, cheap enough that I don't lose sleep over Firestore reads.
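The bucket math, stripped of the Firestore plumbing, looks roughly like this. The max-distance cap, the "larger distance = rarer" convention, and the cold-start fallback are my assumptions; in the real app `counts` lives in a Firestore doc per distribution:

```typescript
// Rarity-histogram math, independent of Firestore. In production,
// `counts` would be a 100-element array stored in one doc per
// distribution (global, and one per cohort). Conventions are guesses.
const BUCKETS = 100;

// Map a cosine distance (0..maxDist) to a bucket index.
function bucketOf(distance: number, maxDist = 2): number {
  return Math.min(BUCKETS - 1, Math.floor((distance / maxDist) * BUCKETS));
}

// "Top X%": share of sessions at least as far from their centroid
// as this one. Lower number = rarer rhythm.
function rarityPercentile(counts: number[], distance: number): number {
  const total = counts.reduce((a, c) => a + c, 0);
  if (total === 0) return 50; // cold start: no data yet
  const atLeastAsRare = counts.slice(bucketOf(distance)).reduce((a, c) => a + c, 0);
  return (100 * atLeastAsRare) / total;
}

// Each session writes itself into the histogram after reading,
// so the distribution stays honest as users arrive.
function record(counts: number[], distance: number): void {
  counts[bucketOf(distance)]++;
}
```

Run the same two functions against the global histogram and the cohort histogram and you have both numbers from one distance measurement.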

How to prompt AI:

"I want to show a user two rarity numbers: their percentile globally, and their percentile within their self-selected cohort. Use Firestore histograms with 100 buckets per distribution. Both percentiles should be computed from a single cosine distance measurement. The user's own session contributes to both histograms after it's measured, so new users don't see a stale distribution."


Why I Didn't Use a Timer

The obvious version of this toy is "typing speed test." WPM, go.

I hate the obvious version. Let me tell you why.

Speed flattens people. A fast typist and a slow typist look identical on a speed test except for one number, and that number maps trivially to years of practice. It's not about them, it's about their keyboard muscle memory. Boring.

Rhythm is personal. The same WPM can be produced by ten totally different rhythms. Someone who types in bursts of four letters and pauses. Someone who types at perfectly even intervals like a metronome. Someone whose rhythm speeds up as they gain confidence in the sentence. Same speed, completely different feel. A speed test can't see any of that. A rhythm-based feature vector can.
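In miniature, with made-up numbers: two typists whose average gap is identical, so a WPM test scores them the same, while the variance feature tells them apart instantly.

```typescript
// Two typists, same average speed, completely different rhythm.
// Interval values (ms between keydowns) are invented for illustration.
const metronome = [150, 150, 150, 150, 150, 150]; // perfectly even
const burst = [60, 60, 60, 390, 60, 270];         // bursts + pauses

const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;
const variance = (xs: number[]) => {
  const m = mean(xs);
  return mean(xs.map((x) => (x - m) ** 2));
};

// Same mean gap → same WPM…
console.log(mean(metronome), mean(burst)); // 150 150
// …but wildly different variance → different aura families.
console.log(variance(metronome), variance(burst)); // 0 17400
```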

Speed invites cheating. If I tell you your WPM, you'll type that same sentence 5 times and screenshot the fastest one. If I tell you "you're a Velvet Legato typist, top 0.8% globally," there's nothing to optimize against. You can't game rhythm because you don't know what rhythm you're trying to produce. You just type, and the truth about your hands comes out.

Removing the timer wasn't a feature — it was a deletion. The best design decisions usually are.


What I Got Wrong the First Time

I thought I could seed the rarity histograms with 10,000 fake typists from a Gaussian distribution and no one would notice.

Turns out cosine distance in a 9-dim space hates you.

What happened: I generated 10k synthetic feature vectors by sampling each of the 9 dimensions from its own normal distribution. Looked reasonable on paper. Then I computed cosine distances from those synthetic vectors to the 64 centroids and built histograms.

Every real user came back as "top 0.1% globally." Everyone. Maximum rarity across the board — which means the number meant nothing.

Why? Because real typing rhythms have correlated features. People who type fast also hold keys shorter. People who pause a lot also have higher entropy. The correlations make real vectors cluster in a specific manifold inside 9-dim space. My Gaussian synthetic data was uniformly distributed across the whole space — so real users were always anomalies compared to the fakes.
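The failure mode fits in a toy example. All numbers below are invented; the point is only the geometry: centroids learned from correlated data sit on the manifold, a correlated "real" vector lands near one, and an independently-sampled vector with the correlation broken lands far from all of them.

```typescript
// Why Gaussian seeding failed, in miniature. Real features are
// correlated (fast typists also hold keys briefly), so real vectors
// hug a manifold. Sampling each dimension independently breaks the
// correlation and lands far from every centroid. Numbers invented.
const cosDist = (a: number[], b: number[]) => {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return 1 - dot / (norm(a) * norm(b));
};

// Hypothetical centroids that respect the correlation:
// interval and hold move together. [interval, hold, pause_rate]
const centroids = [
  [0.2, 0.2, 0.5], // fast family: short intervals, short holds
  [0.9, 0.8, 0.3], // slow family: long intervals, long holds
];

const realFast = [0.25, 0.18, 0.45]; // correlated → near a centroid
const synthetic = [0.2, 0.9, 0.5];   // short intervals, LONG holds

const nearest = (v: number[]) => Math.min(...centroids.map((c) => cosDist(v, c)));
console.log(nearest(realFast) < nearest(synthetic)); // true
```

Scale that up to 10,000 synthetic typists and every real human is the odd one out.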

The fix was embarrassingly simple: don't seed. Just let the histograms start empty and fill up from real users. The first 100 users see noisy percentiles; user #500 sees accurate ones. That's fine. Honesty beats a fake bell curve every time.

I spent two days on the seeder. Deleted all of it. The app got better.


How to Prompt AI (Meta Section)

The 64-aura taxonomy — 8 families × 8 variants — didn't come from me sitting down with a spreadsheet. It came from a back-and-forth with Claude where I described the rhythm shapes in words and let the AI riff on evocative names.

My prompt was something like:

"I have 8 rhythm families: staccato (punchy bursts), legato (flowing), flutter (erratic fast), metronome (perfectly even), sprint (accelerating), drift (pausing mid-sentence), tremor (unstable), surge (decelerating). I need 8 aesthetic variants that work as adjectives before each family name. Variants should feel material — not colors, but textures and substances. Give me 8 options ordered roughly from warm/dense to cool/ethereal, plus a one-line rationale for each."

What came back: copper, velvet, ember, glass, smoke, mercury, obsidian, aurora. Every one of those survived into production. I would not have come up with "mercury" on my own. I would have written "silver" and stopped.

The lesson: AI is great at generating taxonomies — structured vocabularies where you know the shape but not the fill. Ask for the fill.


Cheatsheet — Family to Rhythm Trait

Family      Rhythm trait               Feels like
Staccato    Punchy, short bursts       A drummer warming up
Legato      Flowing, connected keys    Someone sight-reading piano
Flutter     Fast but erratic           A hummingbird on espresso
Metronome   Perfectly even intervals   Court stenographer
Sprint      Starts slow, accelerates   Confidence building mid-sentence
Drift       Pauses mid-sentence        Thinking and typing in parallel
Tremor      Uneven, unstable           Typing with cold hands
Surge       Starts fast, slows down    Front-loaded energy, tapered finish

If you want to cheat and target a specific family: you can't. That's the whole point. Your hands already decided.


Closing

Every toy in Playground is an experiment in turning something quiet and personal into something screenshotable. Destiny Grid turned birthdays into color maps. Molecule Me turned names into fake pharmacology. Keystroke Aura turns 20 seconds of typing into an identity card with two rarity numbers.

The common thread: the input is something you already do, and the output makes you feel seen. Not manipulated, not profiled — seen. You typed a sentence. The app told you something about your hands that you didn't know. That's it.

Next month's toy is already sketched. It involves sound, and it involves a mirror, and I'll shut up about it now before I jinx the build.

Try this one: Keystroke Aura →

See you next month.


2026.04.24

Written by

Jay

Licensed Pharmacist · Senior Researcher

Building production-grade AI tools across medicine, finance, and productivity — without a CS degree. Domain expertise first, code second.
