Show HN: Finding similarities in New Yorker covers

New Yorker cover matcher goes live; Hacker News cheers, roasts the UI, and wants smarter magic

TLDR: A tool launched to find similar New Yorker covers, but users struggled with a confusing interface. Commenters praised the concept and demanded clearer “why” explanations, match scores, and modern AI like CLIP/DINO to make results feel truly similar—turning a cool idea into a lively debate over art vs algorithms.

A new web toy promises to spot look-alike New Yorker covers across a century, but the real show was the comment section. Some readers swooned over the idea of "find the echoes in art," while others slammed the interface. One baffled voice cried, "I dont understand the UI at all," wondering why the "similar" results didn't look, well, similar. The app lets you click through different image "fingerprints" and color matches, but without any "why" or "how similar" guidance, the crowd felt lost.

Then the armchair product managers arrived. Fans asked for clearer explanations: highlight what matches, give a similarity score, and show examples of "this is similar to that because…". Meanwhile, the tech-savvy chorus pushed for a glow-up: skip the old pixel tricks and use modern AI. One suggestion: CLIP (a model that links images and text) or DINOv2 (a self-supervised vision model) to capture "vibes" rather than raw pixels, in the spirit of same.energy. Others wanted to know whether it relies on public libraries and begged for a GitHub repo. And yes, the memes landed: "Is this Tinder for Eustace Tilley?", "Where's the 'because' button?", and "Who wore that skyline better?" Call it a friendly roast: big love for the concept, big push for a smarter brain and a clearer face.
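For readers curious what the embedding suggestion would look like in practice, here is a minimal sketch using the publicly available clip-ViT-B-32 checkpoint through the sentence-transformers library. The model choice and file names are illustrative assumptions, not the project's actual pipeline, which appears to rely on image hashes instead.

```python
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# Hypothetical cover files; the real archive's paths are not public here.
covers = ["cover_1925.jpg", "cover_2025.jpg"]

# CLIP maps images (and text) into a shared embedding space, so cosine
# similarity captures semantic "vibes" rather than raw pixel overlap.
model = SentenceTransformer("clip-ViT-B-32")
embeddings = model.encode([Image.open(path) for path in covers])

# Cosine similarity in [-1, 1]; higher means the covers "feel" more alike.
score = util.cos_sim(embeddings[0], embeddings[1]).item()
print(f"CLIP similarity between the two covers: {score:.3f}")
```

Swapping in DINOv2 features and a nearest-neighbour index over the whole archive would give the same.energy-style browsing the commenters were asking for.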

Key Points

  • Project provides a browsable archive of The New Yorker covers organized by multi-year ranges from the 1920s to the 2020s.
  • Each cover entry links to similarity searches using multiple hashing methods: average, difference, perceptual, wavelet, and color hashes (see the sketch after this list).
  • Hash implementations are indicated as 32-bit variants (e.g., avg_hash_32, d_hash_32, p_hash_32, w_hash_32, color_hash_32).
  • Recent entries list specific dates and credited artists for covers (e.g., late 2025 issues).
  • A dedicated “Color” similarity option focuses on palette-based comparisons, separate from structure/perception-based hashes.
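The hash names above line up with the functions exposed by the Python imagehash library; whether the project actually uses that library is unconfirmed (commenters asked the same question), and the file names below are placeholders. A minimal sketch of comparing two covers with each hash family, using default hash sizes rather than whatever the project's *_32 variants configure:

```python
from PIL import Image
import imagehash

# Placeholder file names for two covers to compare.
cover_a = Image.open("cover_a.jpg")
cover_b = Image.open("cover_b.jpg")

hash_families = {
    "average": imagehash.average_hash,  # avg_hash: mean-brightness fingerprint
    "difference": imagehash.dhash,      # d_hash: gradients between neighbouring pixels
    "perceptual": imagehash.phash,      # p_hash: DCT-based structure
    "wavelet": imagehash.whash,         # w_hash: wavelet-based structure
    "color": imagehash.colorhash,       # color_hash: palette rather than layout
}

for name, hash_fn in hash_families.items():
    # Subtracting two hashes gives the Hamming distance:
    # 0 means identical fingerprints, larger means less similar.
    distance = hash_fn(cover_a) - hash_fn(cover_b)
    print(f"{name:>10} hash distance: {distance}")
```

A distance like this, surfaced next to each result, is essentially the "how similar" number commenters were asking for.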

Hottest takes

"I dont understand the UI at all" — smusamashah
"Would be nice to have some visual indication on "what" was similar, or why, or even how much?" — multisport
"Using a CLIP or Dino v2 model to produce image embeddings would probably improve the similarity search a lot" — Samin100