Think Wrong, Move Fast and Break Things

I’m currently reading Careless People by Sarah Wynn-Williams. I’m less than halfway through, but it already feels like this book deserves more than one post. So far, it doesn’t paint Mark Zuckerberg and Sheryl Sandberg as supervillains, and I’m getting a glimpse into Facebook’s early culture.

One of the big ideas from Facebook’s early years was Move Fast and Break Things. This mantra has been both vindicated and discredited many times – often by engineers like me, who’ve lived through its successes and catastrophic failures.

Moving Fast and Breaking Things Works

It works because the software industry can be like a gator-infested pool. When a new idea drops like a piece of meat in the pool, everyone jumps on it. The biggest reward goes to the fastest gator that ships first and markets well. There’s often no time to make things well.

Facebook won the social network race in large parts of the world. Twitter and a few others got the leftovers. But this principle applies beyond tech giants – down to much smaller scales. It’s a form of the Pareto principle: 80% of the outcomes stem from 20% of the causes. If you can roughly identify the 20% and validate an idea quickly, you’ve already won even if the idea fails: you’ve saved the effort for one that may succeed.

On an individual level, it also feels like it works. You get a task, you ship something quickly – it shows up in your weekly update, your team’s update, maybe even the leadership sees it. You’re productive, visible, and valuable.

But It Also Doesn’t Work

Once an idea is validated, it gains users, traction, and revenue. A bug that shows up once in 1000 runs might never happen with 10 users a day. Once you have a million users, it happens 1000 times a day. And while one broken user profile may be easy to fix, a million of them? Not so much.
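The arithmetic behind this is simple but worth spelling out. A quick back-of-the-envelope sketch (assuming, for illustration, one run per user per day and an independent failure chance):

```python
# Expected daily occurrences of a rare bug as the user base grows.
# Simplifying assumptions: each user triggers one run per day, and
# each run fails independently with the same probability.
failure_rate = 1 / 1000  # bug appears once per 1000 runs

for users_per_day in (10, 1_000, 1_000_000):
    expected_failures = users_per_day * failure_rate
    print(f"{users_per_day:>9} users/day -> ~{expected_failures:g} failures/day")
```

At 10 users a day you expect a failure roughly once every hundred days, which is easy to mistake for "never"; at a million users a day the same bug fires a thousand times daily.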

Zuckerberg himself cited this kind of thinking when Facebook moved away from the motto around 2014. You can’t keep patching the same issues over and over at scale. Stability becomes a requirement.

From an individual contributor’s point of view, it looks like profitable ideas attract many layers of heavily invested people – technical, marketing, finance, data, legal, executive, investors. And when something breaks, you’re not just dealing with bugs. You’re affecting dashboards, KPIs, morale, and your own job security. Blame becomes easier to assign. Ten of these people will know how things work and won’t blame you, but the eleventh may have a bad day and push the button.

How to Make a Difference?

In early-stage product development or during moments of intense change, moving fast and breaking things can be the right move. But in mature projects, where uptime matters and stakeholders are many, the priority shifts. It’s more about stability, reliability, and trust.

Ultimately, Mark Zuckerberg hung that motto on Facebook’s wall – and eventually took it down. He may put it back up if he recognizes a need for it. Recognizing the moment is a key part of leadership.

Zuck’s Moderation Changes And My Blog

Mark Zuckerberg announced that Meta is changing its moderation policy. The announcement was posted on Threads in 6 points and struck me as unusual. I went over these points several times and used ChatGPT to help separate the meaning from the PR. I came to the conclusion that three things were announced, wrapped in zuckspeak:

  • Meta will cut moderation
  • Meta will relocate moderation teams from California to Texas
  • Meta will support a future initiative by President Trump

From the point of view of a personal blogger who actively uses Facebook, Threads, and formerly Instagram, Meta is my second-biggest source of traffic. I’m interested in them being successful and helping me succeed as well. Unfortunately, I’ve experienced problems with their services and think the problems are not going anywhere with these changes.

The problems that I’ve experienced with Facebook:

  • Publicize (auto-sharing) to my profile is no longer possible; I have to add my blog posts manually. This is likely meant to keep traffic within the platform, which is something OpenAI, Google, and X also do
  • AI moderation bans/removes my blog posts, for example it banned this scary post and this photo of a bridge in my neighborhood
  • The moderation decisions are not reversible. I didn’t get a response to any appeal
  • Algorithmic deprioritization of manually shared posts if they contain external links
  • Propaganda/triggering articles reach my attention due to insufficient filtering of fake news
  • The response to any report I filed against hate speech was that it doesn’t violate their guidelines (but my fitness achievements violated them)

So whatever moderation they do, it likely hides humans behind multiple layers of automation that leave bloggers vulnerable to frequent unfair treatment.

The problems that Facebook likely needs to solve and are lurking in the 6 changes:

  • Many parents from my generation consider Facebook and Instagram inappropriate for children. We are raising a generation that’s banned from using these services and lives on YouTube, TikTok, and a few messenger apps. Some of these kids are already entering the workforce and not spending any time or money on Meta.
  • Some social networks thrive because they allow X-rated content (Reddit, X)

My expectation for changes based on these observations

  • I think Zuckerberg is expressing support for the new administration in the hope of getting TikTok banned while Meta is not, which could push some young users to Instagram
  • I think they’ll let X-rated content slip in on all platforms to increase engagement, following similar moves by other networks, and label that a battle for the freedom of speech.

I don’t expect that any personal blogger will benefit from the changes unless they’re involved in whistleblowing/political criticism. Meanwhile, bloggers like me will continue to be banned when posting about outdoor walking.