
You’re Not Boring… It's The Algorithm

  • Algorithms now know us with startling accuracy, often better than our closest friends.
  • In this blog, psychologist Sandra Matz shows how this power can either strip away our complexity or be harnessed to improve well-being.
  • The real challenge is learning how to make algorithms serve us without losing what makes us human.

When you grow up in a small town, you probably don’t need to introduce yourself. Everyone already knows who you are.

The neighbors watch you race for the school bus every morning. They might know who you’re dating, what music you like, and which bumper sticker is on your car. Sometimes that intimacy feels like support, but other times it feels like surveillance. The upside is comfort and guidance; the downside is gossip and interference.

Today, the villagers are not flesh and blood but algorithms. They know you not by watching you in the street but by watching you online. Every like, every search, every movement with your phone in your pocket creates a trail. That trail tells a story, and the machines read it.

The result is a modern version of the old village. The difference is that this village never sleeps, and it never forgets.

How the Village Became a Computer

A decade ago, the idea that a computer could understand you better than your spouse would have sounded like science fiction. Today, it is a documented fact. With a few hundred Facebook likes, an algorithm can predict your personality traits as well as the person who lives with you.

It makes sense when you stop and think about it. Many people tell Google and ChatGPT things they would never tell their closest friends. At 2 a.m., when a question keeps us up at night, we whisper it into a search bar. We do not filter or polish, and we do not worry about being judged. That honesty is revealing.

The same goes for the songs we stream, the photos we post, the routes we drive, and the purchases we make. Each is a small clue. Alone, they may mean little, but combined, they become a pattern, and the pattern is often more accurate than what we would say about ourselves in a questionnaire.

For companies, this is gold. For individuals, it’s both a gift and a curse.

The Shrinking of Complexity

Human beings are complex because we contain contradictions within ourselves. An introvert can love dancing on Friday nights. An extrovert can spend Sunday alone watching tennis. Life is situational.

But algorithms do not like situational. They like averages, and they’re trained to minimize risk. If the system thinks of you as extroverted, it will keep showing you extroverted products. Why risk being wrong? Why suggest something outside the profile when safe prediction is rewarded?

Over time, this creates what Dr. Sandra Matz calls the “Basic Bitch Effect.” It smooths out our edges, and we start to look alike. We are fed the same music, the same movies, the same recommendations. We become simpler versions of ourselves.

The investor who has only lived through bull markets is not prepared for a crash. The child who only talks with polite algorithms is not prepared for the playground fight. Complexity is what gives us resilience. When it disappears, fragility follows.

The Messiness of Reality

Think about an argument. Online, the bots are designed to be polite. They disagree gently and explain their case in civil tones. In the real world, people are not so kind. They interrupt, raise their voices, and insult.

Learning to survive those messy interactions is part of growing up. It is uncomfortable, but it is necessary. Without it, we are unprepared for life outside the screen.

That’s the danger of a generation raised on frictionless interactions. When the world hits back, the shock may be overwhelming. Just as an investor who only owns smooth assets is exposed when volatility arrives, a person who only knows smooth conversations is exposed when conflict comes.

The Bright Side

It’s easy to focus on the risks, but ignoring the upside would be equally short-sighted.

Data can be an early warning system. Imagine a smoke detector, not for fire but for depression. Your phone notices that you are leaving the house less. It sees that your step count has dropped, your calls have dwindled, and your screen time has soared late at night. Alone, each signal is harmless. Together, they suggest a shift in your mental state.

A well-designed system could send an alert: not a diagnosis, but a nudge. It might remind you to reach out to a friend, or it might notify someone you trust to check in. For millions of people who never seek help until it is too late, such a system could be life-changing.
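The logic of such a nudge is simple to sketch. The following toy example is purely hypothetical: the signal names and thresholds are invented for illustration and are not drawn from any real product or clinical guideline. The point is only the structure: no single signal means anything, but several weak signals pointing the same way trigger a gentle check-in.

```python
# Hypothetical sketch of a "smoke detector" nudge: combine individually
# harmless signals into one check-in prompt. All names and thresholds
# below are illustrative assumptions, not a real diagnostic rule.

def should_nudge(days_out_of_house, avg_daily_steps,
                 calls_this_week, late_night_screen_hours):
    """Return True when several weak signals point the same way."""
    signals = [
        days_out_of_house < 2,        # leaving the house less
        avg_daily_steps < 3000,       # step count has dropped
        calls_this_week < 1,          # calls have dwindled
        late_night_screen_hours > 3,  # late-night screen time has soared
    ]
    # No single signal triggers anything; three or more together do.
    return sum(signals) >= 3

print(should_nudge(1, 2500, 0, 4))  # several signals align -> True
print(should_nudge(5, 8000, 3, 1))  # all signals look healthy -> False
```

Note that the output is a nudge, not a diagnosis: the cost of a false positive (an unnecessary "reach out to a friend" reminder) is deliberately kept far lower than the cost of a missed warning.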

The same principle applies to treatment. Not everyone responds to therapy in the same way. Some heal best by walking in nature, others need close family nearby. Algorithms that learn which interventions work for which personalities could personalize care in ways today’s system cannot.

The tools that now sell us sneakers could, if redirected, help us heal.

Making Data Work for Us

The question is not whether data will be collected. It already is. The question is how we shape the rules.

One idea is to reverse the default. Today, most data collection is opt-out. You are tracked unless you actively say no. Since few people read the fine print, saying no rarely happens. If the default were opt-in, companies would need to prove value before earning our consent. Laziness, which once exposed us, would now protect us.

Another idea is called federated learning. Instead of sending your data to the company, the company sends its algorithm to your device. Your data never leaves your phone. The recommendations are just as accurate, but the risk of breach or misuse vanishes. You enjoy personalization without surrendering privacy.
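In spirit, the idea can be sketched in a few lines. This is a toy illustration, not any real framework's API: each "phone" fits a tiny model on its own data and shares only the fitted parameter with the server, which averages the parameters it receives. The raw data never moves.

```python
# Toy federated-averaging sketch: each device trains locally and
# shares only a model parameter, never the underlying data.
# The data and the one-parameter "model" are illustrative assumptions.

def local_fit(private_data):
    """Train 'on-device': here the model is just the mean of the data."""
    return sum(private_data) / len(private_data)

def federated_average(devices):
    """The server averages parameters; raw data never leaves a phone."""
    params = [local_fit(data) for data in devices]
    return sum(params) / len(params)

phones = [
    [2.0, 4.0],    # user A's private data stays on user A's device
    [6.0, 8.0],    # user B's
    [10.0, 12.0],  # user C's
]
print(federated_average(phones))  # -> 7.0
```

Real federated systems iterate this exchange many times with far richer models, but the privacy property is the same as in the sketch: the server only ever sees parameters, not the data that produced them.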

A third idea is data co-ops. Individuals band together and pool their data under a structure that has a fiduciary duty to its members. Imagine expectant mothers who join a co-op to combine genetic, medical, and lifestyle data. The group uses that pool to generate insights, then shares those insights with each member’s doctor. The benefits flow back to the community instead of to a corporation.

None of these solutions is simple. However, all of them point toward a future where individuals can retain their humanity while still reaping the benefits of personalization.

The Investor’s Analogy

Investing is a constant negotiation between risk and return. Every strategy has both, and diversification is often called the only free lunch.

The same is true with data. Complete openness brings personalization but invites manipulation. Complete secrecy preserves privacy but leaves us lost in a world of overwhelming choice. The answer lies somewhere in between: enough data to help us, but not enough to flatten us.

One key to surviving markets is preparing for the environment you cannot yet see. That wisdom applies here. The environment we cannot yet see is one where algorithms define identity. Preparing means building systems that preserve complexity.

In Conclusion...

The village has always shaped us. The old one did it through gossip and watchful eyes; the new one does it through clicks and code. Both offer support, and both impose control.

The question is whether we let ourselves become predictable averages or insist on remaining complex individuals. In markets, the investors who survive are not the ones who maximize return in one environment but the ones who endure across many. In life, the people who thrive will not be the ones who match an algorithm’s profile but the ones who remain resilient when life refuses to follow the script.

The village is still here. The only difference is that now it lives in our pocket.


This is based on an episode of Top Traders Unplugged, a bi-weekly podcast with the most interesting and experienced investors, economists, traders and thought leaders in the world. Sign up for our Newsletter or subscribe on your preferred podcast platform so that you don't miss out on future episodes.