Greetings

Part 3 - How do I choose a KNN algorithm?

Part 3 in the Search and recommendations with embeddings series. There are multiple algorithms to compute KNN, including both "Exact KNN" and Approximate Nearest Neighbors (ANN). When choosing between them, there is a tradeoff between quality and speed. The simplest nearest neighbor approach is a Brute…
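
To make the brute-force baseline concrete, here is a minimal sketch in Python (assuming NumPy); the data and function name are illustrative, not the post's actual implementation. It compares the query against every stored embedding and keeps the k closest, which is exact but scales linearly with the number of embeddings:

```python
import numpy as np

def brute_force_knn(query, embeddings, k=3):
    # Exact KNN: measure the Euclidean distance from the query to every embedding...
    distances = np.linalg.norm(embeddings - query, axis=1)
    # ...then keep the indices of the k smallest distances
    return np.argsort(distances)[:k]

# Illustrative data: five 2-D embeddings and one query point
embeddings = np.array([[2.0, 5.0], [7.0, 1.0], [3.0, 4.0], [6.0, 6.0], [1.0, 2.0]])
query = np.array([3.0, 3.0])
print(brute_force_knn(query, embeddings, k=3))  # indices of the 3 nearest embeddings
```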

Continue reading...

Part 2 - What is K-Nearest Neighbors (KNN)?

Part 2 in the Search and recommendations with embeddings series. An embedding's "nearest neighbors" are those other embeddings that are the shortest distance away in vector space. Continuing the 2-dimensional example from Part 1 (animal size), let's say we measure an unknown animal, and we find…
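
As a quick illustration of "shortest distance away in vector space," here is a toy Python sketch; the animal measurements are invented for this example, not taken from the post:

```python
import math

# Invented 2-D "animal size" embeddings: (height in meters, weight in kg)
animals = {
    "cat":   (0.3, 4.5),
    "dog":   (0.6, 20.0),
    "horse": (1.6, 450.0),
}
unknown = (0.5, 18.0)  # measurements of the unknown animal

def distance(a, b):
    # Euclidean distance between two points in vector space
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# The nearest neighbor is the known animal with the shortest distance
nearest = min(animals, key=lambda name: distance(animals[name], unknown))
print(nearest)  # -> dog
```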

Continue reading...

Part 1 - What are embeddings?

Part 1 in the Search and recommendations with embeddings series. "Embeddings" are a representation of something using numbers. For words, images, songs, etc., we create a number to represent each item (e.g. [7], or several numbers together as a vector, [2, 5, 3]). To create embeddings, we use Machine…
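
For a concrete picture of items represented as numbers, here is a tiny Python sketch; the item names and values are hypothetical, mirroring the [7] and [2, 5, 3] examples above:

```python
# Hypothetical embeddings: each item is represented by numbers
single_number = [7]      # one number representing an item
vector = [2, 5, 3]       # several numbers together as a vector

# A toy lookup from items to their embedding vectors
# (in practice, a Machine Learning model produces these values)
embeddings = {
    "song_a": [2, 5, 3],
    "song_b": [7, 1, 4],
}
print(embeddings["song_a"])  # -> [2, 5, 3]
```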

Continue reading...

Search and recommendations with embeddings

We built a system that serves tens of millions of song recommendations to tens of millions of customers at low latency. This multi-part series explains how we built it, starting from core concepts and working our way up to the practical details. This isn't the only way…

Continue reading...

FAANG interviewing is purposely flawed

FAANG interviews stink on purpose. Because so many qualified candidates apply, FAANG interviewers have the luxury of saying "no" to lots of high-quality people, knowing there are plenty more options. For example, Amazon has a large candidate pool because lots of engineers want to work there. The pay is…

Continue reading...