Examples: cricket, vodka, crisis, carnival, asia, desert, malaria, coffee
Word2Vec associates a vector with each word by processing large amounts of text. The vectors land in a semantic space where similar words sit close together, and you can do arithmetic on them (the classic example: king − man + woman ≈ queen). Restricting the comparison set to country (or US state) names turns that space into a fuzzy choropleth of meaning.
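The arithmetic can be sketched with hand-built toy vectors. These 2-D embeddings (one axis roughly "gender", one roughly "royalty") are purely illustrative — real word2vec vectors are 300-dimensional and learned from text — but they show why king − man + woman lands nearest to queen under cosine similarity:

```python
from math import sqrt

# Hand-built 2-D toy embeddings; illustrative only, not learned vectors.
vectors = {
    "man":    [ 1.0,  0.0],
    "woman":  [-1.0,  0.0],
    "king":   [ 1.0,  1.0],
    "queen":  [-1.0,  1.0],
    "prince": [ 0.9,  0.8],
    "apple":  [ 0.1, -0.2],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def nearest(target, exclude=()):
    """Word whose vector is most cosine-similar to `target`."""
    return max((w for w in vectors if w not in exclude),
               key=lambda w: cosine(vectors[w], target))

# king - man + woman, component-wise
result = [k - m + w for k, m, w in
          zip(vectors["king"], vectors["man"], vectors["woman"])]
print(nearest(result, exclude={"king", "man", "woman"}))  # prints "queen"
```

Restricting the candidate set in `nearest` is exactly the trick used here: comparing a vector only against country names forces every word to land on the map somewhere.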
The model is Google News word2vec (3M words, 300-dim). I converted the 12 GB binary into a sqlite-vec database that ships in this project's static directory: each lowercase word stores the sum of the vectors of its case-variants as a single float32 vector, retrieved by a primary-key lookup on every query.
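The summing-and-storage scheme can be sketched with a plain sqlite table holding float32 blobs. This is a simplified stand-in, not the project's actual conversion script — the real database uses sqlite-vec, and the table and column names here are made up — but it shows how case-variants (e.g. "Paris" and "paris") collapse into one summed row keyed by the lowercase word:

```python
import sqlite3
from array import array

def build_db(vectors):
    """vectors: iterable of (word, float list) pairs in original case.
    Sums all case-variants of each word into one lowercase row."""
    sums = {}
    for word, vec in vectors:
        key = word.lower()
        if key in sums:
            sums[key] = [a + b for a, b in zip(sums[key], vec)]
        else:
            sums[key] = list(vec)
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE words (word TEXT PRIMARY KEY, vector BLOB)")
    db.executemany(
        "INSERT INTO words VALUES (?, ?)",
        # array('f', ...) packs each vector as raw float32 bytes
        ((w, array("f", v).tobytes()) for w, v in sums.items()),
    )
    db.commit()
    return db

def lookup(db, word):
    """Primary-key lookup; decodes the float32 blob back to floats."""
    row = db.execute(
        "SELECT vector FROM words WHERE word = ?", (word.lower(),)
    ).fetchone()
    return list(array("f", row[0])) if row else None
```

Keying on the lowercase word keeps lookups O(1) via the primary-key index, and summing (rather than picking one casing) lets frequent variants like "Turkey"/"turkey" both contribute to the stored vector.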