AI City Popy 🏙️
District 7 · Memory Library

Welcome to The Memory Library

Where AI remembers meanings, not just exact words.

Enter The Library

Meaning cluster · furniture

These words all live close together in AI memory.

Why Exact Words Fail

Old-style systems search for exact letter matches. If the word isn't there, the shelf stays empty.

Keyword Search Engine
🤔

The Problem

“Couch” and “sofa” mean the same thing — but a keyword system treats them as completely different.
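
The gap is easy to reproduce. Here is a minimal sketch of an old-style keyword matcher (the sample documents and the keyword_search helper are invented for illustration):

```python
documents = [
    "cozy sofa for sale, barely used",
    "beachfront hotel with ocean views",
]

def keyword_search(query, docs):
    # Old-style matching: the query must appear letter-for-letter.
    return [d for d in docs if query.lower() in d.lower()]

print(keyword_search("sofa", documents))   # matches the first document
print(keyword_search("couch", documents))  # [] : the shelf stays empty
```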

✨

The Solution

Embeddings teach AI to recognise meaning — not just spelling.

📚

Humans understand meanings — not just exact spellings. Wouldn't it be magical if AI could too?

Meaning Neighbourhoods

The Memory Library organises words by meaning, not alphabet. Similar ideas live in the same neighbourhood — no matter how they're spelled.

📚

In the Memory Library, words aren't filed alphabetically — they float near the words they feel like.

Meet Popy

Your guide through the Memory Library — the AI librarian who organises thoughts by meaning.

📚
Popy

Semantic Librarian · District 7

The Semantic Galaxy

Every word becomes a glowing star floating in a meaning galaxy. Related words drift close together — like stars in the same constellation.

sofa · couch · chair · recliner · bench
pizza · pasta · sushi · noodles
hotel · vacation · beach · tourism
python · function · debug · algorithm
📚

Each word becomes a coordinate in meaning-space. Similar meanings have similar coordinates โ€” so they land near each other!
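
To make that concrete, here is a toy sketch with hand-made three-number coordinates (invented values — real embeddings use hundreds of dimensions) and the cosine-similarity measure commonly used to compare them:

```python
import math

# Hand-made "meaning coordinates" (invented, for illustration only).
words = {
    "sofa":  [0.9, 0.1, 0.0],
    "couch": [0.8, 0.2, 0.1],
    "pizza": [0.1, 0.9, 0.2],
}

def cosine_similarity(a, b):
    # 1.0 means pointing the same way in meaning-space; near 0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(words["sofa"], words["couch"]))  # high: same neighbourhood
print(cosine_similarity(words["sofa"], words["pizza"]))  # low: different district
```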

Search By Meaning

The Memory Library doesn't need exact words. It finds the closest meanings. Pick a query and watch the semantic search light up.
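
Under the hood, a meaning search is just "rank everything by similarity to the query coordinate". A self-contained sketch with invented two-number coordinates (a real system would get these vectors from an embedding model):

```python
import math

# Invented toy coordinates for a few shelved words.
shelf = {
    "sofa":     [0.9, 0.1],
    "recliner": [0.8, 0.3],
    "sushi":    [0.1, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def semantic_search(query_vec, shelf, top_k=2):
    # Rank every shelved word by closeness to the query coordinate.
    ranked = sorted(shelf, key=lambda w: cosine(query_vec, shelf[w]), reverse=True)
    return ranked[:top_k]

# Invented query coordinate for something like "comfy seat".
print(semantic_search([0.85, 0.2], shelf))  # ['sofa', 'recliner']
```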

How AI Fetches Memories

RAG — Retrieval-Augmented Generation — means AI looks up relevant memories before answering. Like a student who reads notes before the exam.

โ“
Question arrives

Tourist asks: "Where can I relax in Jaipur?"

🔍
Popy searches

Converts the question to a meaning coordinate and scans the library.

📚
Memories retrieved

Nearest documents fly off the shelves: spa guides, lounge reviews, recliner cafés.

🧠
AI reads context

The AI worker reads the retrieved documents — getting rich, relevant context.

💬
Better answer

"Try Anokhi cafe's rooftop lounge or Dera Mandawa spa — both are very relaxing!"

📚

RAG helps AI fetch useful memories before answering — like checking the library shelves before writing an essay.
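
The five steps can be sketched as one retrieve-then-prompt skeleton. Everything below is illustrative: the library, its coordinates, and the query coordinate are invented, and build_prompt stands in for the real LLM call:

```python
import math

# Invented meaning coordinates for a tiny document library.
library = {
    "spa guide":       [0.9, 0.1],
    "lounge reviews":  [0.8, 0.3],
    "python tutorial": [0.0, 1.0],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def retrieve(query_vec, library, top_k=2):
    # Steps 2-3: scan the library, fetch the nearest documents.
    ranked = sorted(library, key=lambda d: cosine(query_vec, library[d]), reverse=True)
    return ranked[:top_k]

def build_prompt(question, docs):
    # Steps 4-5: hand the retrieved context to the model before it answers.
    return "Context:\n" + "\n".join(docs) + "\n\nQuestion: " + question

docs = retrieve([0.85, 0.2], library)  # invented coordinate for a "relaxing" query
print(build_prompt("Where can I relax in Jaipur?", docs))
```

In a production system the final string would be sent to a language model, which writes the answer using the retrieved context.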

Embeddings in Code

Two lines. That's all it takes. Hover the glowing words to reveal what each piece does.

# Step 1: embed your query
embedding = model.embed("comfortable couch")
# Step 2: search the vector database
results = vector_db.search(embedding)
📚

A vector database doesn't compare raw text — it compares meaning coordinates (the embeddings). Searching those coordinates is how semantic similarity works.

Mission: Build AI City's Tourism Memory

Tourists are arriving and the Memory Library shelves are mixed up! Sort each word into its correct semantic neighbourhood — and watch AI City come alive 🌆

ottoman
sushi
resort
variable
recliner
noodles
Get started

Build your first semantic memory 📚

Four steps. From raw text to a searchable meaning-library in minutes.

  1. Install the libraries

    Grab OpenAI for embeddings and a vector store client.

    pip install openai chromadb
  2. Create an embedding

    Send any text to the embedding model. Get back a list of meaning-coordinates.

    from openai import OpenAI
    client = OpenAI()
    
    resp = client.embeddings.create(
      model="text-embedding-3-small",
      input="comfortable couch",
    )
    embedding = resp.data[0].embedding
    # → [0.012, -0.187, 0.443, ...] (a list of 1,536 numbers)
  3. Store in a vector database

    Save the embedding alongside its original text so you can retrieve it later.

    import chromadb
    db = chromadb.Client()
    collection = db.create_collection("library")
    
    collection.add(
      documents=["comfortable couch"],
      embeddings=[embedding],
      ids=["doc_1"],
    )
  4. Search by meaning

    Embed your query, then ask the database for the closest stored meanings.

    query_resp = client.embeddings.create(
      model="text-embedding-3-small",
      input="places to sit",
    )
    query_vec = query_resp.data[0].embedding
    
    results = collection.query(
      query_embeddings=[query_vec],
      n_results=3,
    )
    # finds: "comfortable couch" (no words shared with "places to sit"!)
💡

Pro tip

Vector search never compares the exact words — it compares the coordinates of their meaning. That's why searching works across synonyms and paraphrases.

Ask Popy

Explore the library chat ✨

Questions about embeddings, vector search, RAG, or how AI remembers? Popy is ready.

Hi! I'm Popy 📚 Ask me anything about embeddings, vector databases, or how AI remembers meanings!

You're a Semantic Librarian now

You now understand how AI systems store meanings as coordinates, organise related ideas in semantic space, and retrieve the right knowledge by meaning — no exact match needed.

Continue Journey →
Mini Project
Build Quest

Semantic Shelf

Deliverable: Embed five short docs and return top-2 matches for a user query.

Stretch: Display similarity score for each retrieved result.

Complete the deliverable first, then unlock the stretch goal.

Previous
🔧 Tool Workshop
Next
๐Ÿซ Open-Book School