Atlas

In 2013, with funding from Bill Gates, I co-founded Atlas on a simple but radical premise: that the way people searched their digital lives was fundamentally broken.

Traditional search tools were built around files and filenames — they required you to remember where something was stored in order to find it. But that is not how human memory works. We remember moments. We remember context. We remember who we were with, what we were trying to accomplish, and why something mattered — not which folder we saved it in. Atlas was built to work the way memory actually works: organizing a person's entire digital history around moments and meaning rather than files and keywords.

In the knowledge economy, where the speed at which you can recover and apply what you already know is a direct driver of productivity, that distinction is not cosmetic. It is the difference between a tool that serves you and a tool you have to serve.

What we built

Atlas developed a personal semantic index: a continuously updated, AI-powered model of a user's digital life that captured not just what they had created or consumed, but when, with whom, in what context, and toward what purpose. The index spanned devices, applications, and formats — making the entirety of a user's digital past searchable through meaning rather than metadata.

The design of the search interface was as important as the technology behind it. We developed what we called the Mad-Lib Search UI: a natural language query builder that allowed users to reconstruct past events through images, actions, and semantic cues rather than filenames or dates. The interface was intuitive enough that users didn't need to learn it — it worked the way they already thought. It was also distinctive enough to anchor our IP defensibility, establishing a set of proprietary visualizations and semantic UI patterns that competitors couldn't easily replicate.

Automated personalized insights surfaced relevant moments and connections without being asked, positioning Atlas as a behavioral layer on top of the operating system rather than just another search application. The goal was daily habitual use — the kind of product relationship that transforms a tool into a dependency.

The pivot

Atlas launched as a consumer product, but the market found us before we found it. Knowledge workers and executive assistants were among our earliest and most engaged adopters — people whose professional value was directly tied to their ability to recover and apply information quickly across complex, high-stakes contexts. The signal was clear: the highest-value application of what we had built was not personal productivity. It was enterprise knowledge management and collaboration.

We pivoted to B2B, validated enterprise market fit, and established the pricing and revenue models that would support deployment at organizational scale. The technology, the team, and the IP we had built in service of a consumer vision turned out to be exactly what the enterprise needed.

What it delivered

Atlas validated product-market fit through alpha and beta adoption with knowledge workers and enterprise teams. We established defensible IP through the Mad-Lib Search UI and our proprietary semantic visualization patterns. We validated enterprise pricing and revenue models. And we sustained the confidence of investors including Bill Gates and Nathan Myhrvold through continued engagement and follow-on support.

The team and the technology we built at Atlas were ultimately acquired by Xinova — recognized as the AI capability and the rare, experienced team that Xinova needed to power the next phase of its ambition.

My role

I came into Atlas with a specific conviction: that the design of the search experience was not a layer on top of the technology — it was the technology's primary expression of value. A semantic index that users didn't trust, couldn't understand, or found cognitively demanding would fail regardless of how sophisticated the model beneath it was. The interface had to make the intelligence feel natural.

That conviction shaped every product decision I made.

I defined the product vision, roadmap, and requirements from the ground up, and directed the iterative design and user research process that produced the semantic search interface, the Mad-Lib UI, the index visualizations, and the contextual insight tools. I established a biweekly user research cadence that kept the team anchored to real user behavior rather than internal assumptions — and identified the trust metrics that proved most predictive of adoption and retention.

I built the design language and comprehensive design system that gave the product coherence across every surface and interaction. I co-led fundraising for subsequent rounds and drove the go-to-market strategy, collateral, and sales materials that supported the enterprise pivot. And I led that pivot itself — from consumer to B2B — identifying the enterprise opportunity in our early adoption data before it was obvious, and validating the market fit that made the acquisition possible.

The acquisition validated the technology. The pivot validated the judgment. The team we built validated the leadership.

Deep Dive

Capturing the Moment: The Design of Atlas

Most search tools are built around retrieval. You remember something exists, you search for it, you find it. The interface assumes you know what you're looking for and need help locating it.

Atlas was built around a different premise entirely. The problem it was solving wasn't retrieval. It was reconstruction.

Knowledge workers don't just need to find things. They need to return to moments: the context in which something was created, read, shared, or decided. The email that started a project. The webpage that changed a direction. The conversation that happened alongside the document being written. Traditional search returns the needle. Atlas returned the haystack: not just the item, but the moment it lived in, reconstructed from the behavioral evidence of everything that surrounded it.

That distinction shaped every design decision we made.

The Index

Atlas ran a continuous semantic index of everything on your device: files, emails, messages, browser activity, application usage, and the interactions between them; copy-paste behavior, window switching, links opened from messages, the music playing while you worked. It captured not just what you had created or consumed, but the relational context that gave it meaning: who else was involved, what you did next, where you were, what time it was.

The result was a personal timeline of moments rather than a catalog of items. When you searched Atlas you weren't searching a file system. You were searching your own history of engagement with information, organized the way memory actually works: around context and association rather than location and filename.
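A rough sketch helps make that distinction concrete. The types and field names below are illustrative only, not Atlas's actual schema; the point is that the unit of storage is the moment, with the items, people, and topics that shared its context hanging off it.

```typescript
// Illustrative sketch only: hypothetical types, not Atlas's actual schema.
// The index stores moments, and artifacts are grouped by the context they shared,
// not by where they were filed.

interface Artifact {
  id: string;
  kind: "file" | "email" | "message" | "webpage" | "calendar-entry";
  title: string;
  source: string;           // originating application
}

interface Moment {
  start: Date;
  end: Date;
  artifacts: Artifact[];    // everything active in this window
  people: string[];         // participants inferred from email, messages, meetings
  topics: string[];         // semantic labels extracted from content
  location?: string;
  relatedMomentIds: string[]; // links formed by copy-paste, link-opening, window switching
}

// Because queries resolve against moments rather than items,
// a match returns the whole context, not just the item that satisfied the terms.
function findMoments(index: Moment[], topic: string, person?: string): Moment[] {
  return index.filter(
    (m) =>
      m.topics.includes(topic) &&
      (person === undefined || m.people.includes(person))
  );
}
```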

The index was built on an opt-out model. Everything was included by default, and users could choose to exclude specific applications, websites, or content types. That decision was grounded in two principles we held as non-negotiable from the start. First, the raw data never left your device unencrypted; the index was processed and stored locally, and only encrypted data was transmitted. Second, the Atlas index was yours alone. No one else could search it, access it, or benefit from it. It was a personal memory tool, not a surveillance system, and the privacy model was designed to make that distinction impossible to misunderstand.

This was 2013. The conversation about what AI systems should be allowed to know about you, and who should have access to that knowledge, was only beginning. Atlas had already answered it.

Visual Search

Rather than returning a ranked list of results, Atlas displayed search results visually: images and artifacts grouped by time, topic, people, and location, arranged to reflect the relational structure of the moment rather than the relevance score of individual items.

The visual approach was a direct consequence of what Atlas was actually indexing. A list format assumes discrete items with clear boundaries. Atlas was indexing moments, which are inherently relational and contextual. A visual display could show those relationships in a way a list never could: the email next to the document next to the browser tab next to the calendar entry, all present simultaneously, the moment reconstructed rather than the needle returned.
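As a sketch of the underlying idea, results can be clustered along whichever contextual axis the user is reasoning in, rather than sorted into a single ranked list. The helper below is illustrative only, not Atlas's implementation.

```typescript
// Illustrative sketch only, not Atlas's implementation.
// Cluster results along a contextual axis instead of ranking them in one list.

interface Result {
  title: string;
  day: string;        // e.g. "2013-06-12"
  topic: string;
  people: string[];
}

function groupBy<T>(items: T[], key: (item: T) => string): Map<string, T[]> {
  const groups = new Map<string, T[]>();
  for (const item of items) {
    const k = key(item);
    const bucket = groups.get(k);
    if (bucket) {
      bucket.push(item);
    } else {
      groups.set(k, [item]);
    }
  }
  return groups;
}

// The same result set can then be laid out by time, by topic, or by person:
// groupBy(results, (r) => r.day);
// groupBy(results, (r) => r.topic);
```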

The Mad-Lib Search Interface

We built the Mad-Lib Search Interface in 2013, before prompting was an established interaction model. Users weren't yet comfortable typing natural language sentences into a search box, and asking them to abandon familiar keyword conventions produced friction and uncertainty. We needed a way to give users the expressive power of natural language search without requiring them to compose sentences from scratch.

The solution turned the search query into a sentence with fillable blanks: "Recall [content type] related to [topic] from [time period] that I [action] at [location]" with each element represented by a dropdown menu populated with options drawn from the user's own index. You didn't type keywords. You read a sentence that already made sense and selected the words that made it yours.

Each selection narrowed the search while simultaneously generating a set of parallel queries displayed as a cascade of complete sentences, showing at a glance every adjacent search that might return the moment you were looking for.
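A minimal sketch of that fill-in-the-blank pattern, assuming a query is simply a sentence template whose slots are populated from the user's own index (the names and structure here are hypothetical, not the shipped implementation):

```typescript
// Hypothetical sketch of the fill-in-the-blank query pattern described above.
// Slot names and the fan-out logic are illustrative only.

interface FilledQuery {
  contentType?: string;
  topic?: string;
  timePeriod?: string;
  action?: string;
  location?: string;
}

// Render the sentence the user reads, with unfilled slots shown as blanks.
function renderQuery(q: FilledQuery): string {
  return (
    `Recall ${q.contentType ?? "[content type]"} related to ${q.topic ?? "[topic]"} ` +
    `from ${q.timePeriod ?? "[time period]"} that I ${q.action ?? "[action]"} ` +
    `at ${q.location ?? "[location]"}`
  );
}

// Each selection also fans out into adjacent queries: the same sentence with
// one slot varied, displayed as a cascade of complete alternatives.
function adjacentQueries(
  q: FilledQuery,
  slot: keyof FilledQuery,
  options: string[]
): string[] {
  return options
    .filter((opt) => opt !== q[slot])
    .map((opt) => renderQuery({ ...q, [slot]: opt }));
}
```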

The interface anchored a set of design patents that made the interaction model genuinely protectable. We had invented a prompting interface before anyone was calling it that.

Returning to the Moment

When a search returned results, Atlas didn't surface an item. It reconstructed a moment: the browser tabs open alongside it, the emails that preceded and followed it, the people involved, the applications in use. Everything that had surrounded that item when it mattered was present again. From there you could share the moment with a colleague or simply resume your own work exactly where you had left off.

Atlas was less a search tool and more a memory augmentation: private, local, and designed around how human recall actually works rather than how file systems are organized.

That design philosophy ran through everything that came after it. AI should serve the person, invisibly and without extracting value from the relationship.
