Building Multiplayer in Under a Month: Lessons from a Rescue Mission
When the person responsible for our multiplayer system turned out to be faking progress, I had to build the entire thing from scratch. Here's how I approached it and what I learned.
The multiplayer system in our game was broken. The person we’d hired to fix it had been faking progress reports for weeks. When we finally tested his implementation, nothing worked. We let him go.
Now it was my problem, and we had a month.
The Situation
The game was a multiplayer home decoration sandbox. Players needed to see each other’s houses, visit them, and interact in real time. The existing system was unreliable—desyncs, dropped connections, state corruption.
Our stack was Unity with Steam as the distribution platform. I chose Netcode for GameObjects as the networking framework and Steamworks for the transport layer and matchmaking.
I’d worked with other networking solutions before—Mirror, Photon, the old UNet—but Netcode for GameObjects was relatively new to me. The documentation was decent but incomplete. I’d be learning as I built.
The Approach
When you’re under pressure, the temptation is to start coding immediately. I’ve learned that’s usually wrong.
I spent the first few days just thinking. What does this game actually need? What can I cut? What’s the simplest architecture that could work?
Some decisions:
Host-client, not dedicated servers. We didn’t have the infrastructure or budget for dedicated servers, and the game’s architecture assumed one player’s game would be authoritative anyway. Host-client was the practical choice.
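In Netcode for GameObjects terms, that decision boils down to which start method each player's game calls. A minimal sketch, assuming a hypothetical bootstrap script (the transport component attached to the NetworkManager handles the Steam-specific details):

```csharp
using Unity.Netcode;
using UnityEngine;

// Hypothetical session bootstrap: the house owner's game instance is the
// authoritative host, and visitors join it as clients.
public class SessionBootstrap : MonoBehaviour
{
    public void StartAsHouseOwner()
    {
        // Host = server + local client in one process; no dedicated server.
        NetworkManager.Singleton.StartHost();
    }

    public void StartAsVisitor()
    {
        NetworkManager.Singleton.StartClient();
    }
}
```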
Authority on the host, but trust the client where it can’t hurt. Player movement was predicted on the client, while all building and persistent state changes were validated by the host. This kept gameplay responsive without opening the door to serious cheating.
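Host validation in Netcode for GameObjects typically runs through a ServerRpc: the client requests an action, and the host decides whether it happens. A sketch of that pattern, where `CanPlaceObject` and `SpawnPlacedObject` are hypothetical game-side helpers:

```csharp
using Unity.Netcode;
using UnityEngine;

// Sketch: host-validated building. Clients request placements; only the
// host mutates persistent state.
public class BuildingController : NetworkBehaviour
{
    [ServerRpc(RequireOwnership = false)]
    void PlaceObjectServerRpc(int itemId, Vector3 position,
                              ServerRpcParams rpcParams = default)
    {
        // This body only ever runs on the host.
        ulong sender = rpcParams.Receive.SenderClientId;

        // Reject placements the sender isn't allowed to make
        // (wrong plot, out of bounds, item not owned, etc.).
        if (!CanPlaceObject(sender, itemId, position)) return;

        SpawnPlacedObject(itemId, position); // replicated to all clients
    }

    bool CanPlaceObject(ulong clientId, int itemId, Vector3 pos)
    {
        /* ownership and bounds checks go here */
        return true;
    }

    void SpawnPlacedObject(int itemId, Vector3 pos)
    {
        /* instantiate the prefab and call its NetworkObject.Spawn() */
    }
}
```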
Don’t synchronize everything. The game had thousands of decorative objects. Synchronizing every lampshade would kill bandwidth. I identified what actually mattered—player positions, building actions, ownership—and only synced that.
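The split between synced and local state can be expressed directly in the component design: replicated fields live in NetworkVariables, and everything cosmetic simply never touches the network. A sketch, with `HousePlot` as a hypothetical component:

```csharp
using Unity.Netcode;

// Sketch: replicate only what matters. Ownership changes sync via a
// NetworkVariable; decorative detail stays entirely local.
public class HousePlot : NetworkBehaviour
{
    // Written by the host, readable by every client, synced automatically.
    public NetworkVariable<ulong> OwnerClientId = new NetworkVariable<ulong>(
        0,
        NetworkVariableReadPermission.Everyone,
        NetworkVariableWritePermission.Server);

    // Cosmetic-only state (particle effects, LOD, UI highlights) is kept
    // in plain fields on plain components — it never costs bandwidth.
}
```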
The Build
The actual implementation was heads-down work. A few things that helped:
Start with the happy path. Get two players connected and seeing each other before worrying about edge cases. Early wins matter for morale when you’re racing the clock.
Test constantly. I had colleagues help with testing throughout, not just at the end. Finding a fundamental design flaw in week three would have been fatal.
Document as you go. I wrote notes about every decision. When I inevitably forgot why something was built a certain way, the notes were there. This also helped when onboarding others to the system later.
Accept imperfection. Some edge cases weren’t worth solving in the initial build. I kept a list of “known issues that don’t block launch” and moved on.
The Problems
Nothing goes smoothly.
Steam transport issues. Steamworks has its own opinions about how networking should work. Integrating it with Netcode for GameObjects required understanding both systems’ assumptions about connection lifecycle, and they didn’t always agree.
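One way to keep the two lifecycles reconciled is to drive all game-side connection logic from Netcode’s own callbacks rather than assuming anything about the transport’s timing. A minimal sketch using the NetworkManager events:

```csharp
using Unity.Netcode;
using UnityEngine;

// Sketch: treat Netcode's callbacks as the single source of truth for
// connection state, regardless of what the Steam transport does underneath.
public class ConnectionEvents : MonoBehaviour
{
    void Start()
    {
        NetworkManager.Singleton.OnClientConnectedCallback += clientId =>
            Debug.Log($"Client {clientId} connected");

        NetworkManager.Singleton.OnClientDisconnectCallback += clientId =>
            Debug.Log($"Client {clientId} disconnected");
    }
}
```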
Serialization surprises. Some of our existing game state wasn’t designed with network serialization in mind. Objects that worked fine locally would fail to reconstruct properly on the client. I had to rewrite several serialization paths.
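In Netcode for GameObjects, making a custom type reconstruct reliably on the client usually means implementing INetworkSerializable, so reads and writes share one code path. A sketch, where `PlacedItem` is a hypothetical stand-in for our decoration data:

```csharp
using Unity.Netcode;
using UnityEngine;

// Sketch: a network-serializable struct. The same NetworkSerialize body
// runs for writing on the host and reading on the client, so a field
// can't silently go missing on one side.
public struct PlacedItem : INetworkSerializable
{
    public int ItemId;
    public Vector3 Position;
    public float Rotation;

    public void NetworkSerialize<T>(BufferSerializer<T> serializer)
        where T : IReaderWriter
    {
        serializer.SerializeValue(ref ItemId);
        serializer.SerializeValue(ref Position);
        serializer.SerializeValue(ref Rotation);
    }
}
```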
The existing codebase. The game had years of single-player assumptions baked in. Some systems expected to be the only writer to certain data. Finding and fixing these hidden assumptions took time I hadn’t budgeted.
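The typical fix for a hidden single-writer assumption is an explicit authority guard. A sketch, with `DayNightCycle` as a hypothetical example of a system that used to write world state directly:

```csharp
using Unity.Netcode;

// Sketch: a formerly single-player system retrofitted with an authority
// guard so only the host mutates shared state.
public class DayNightCycle : NetworkBehaviour
{
    NetworkVariable<float> hour = new NetworkVariable<float>();

    public void SetTimeOfDay(float newHour)
    {
        // In single-player this wrote directly. In multiplayer, a client
        // writing here would silently desync from the host.
        if (!IsServer) return;
        hour.Value = newHour;
    }
}
```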
What Shipped
A month later, we had working multiplayer. Players could connect, see each other, visit houses, and build together. The system handled disconnections gracefully and recovered from most error states.
Was it perfect? No. There were edge cases I knew about and probably some I didn’t. But it worked, it was stable, and it shipped.
What I Learned
Constraints clarify. Having a month forced me to make decisions I might have procrastinated on otherwise. “We don’t have time” is a legitimate reason to choose the simpler approach.
Networking is about tradeoffs. Every choice—authority model, sync frequency, what to replicate—involves trading off between responsiveness, bandwidth, complexity, and cheat resistance. There’s no universally right answer.
The hard part isn’t the networking. The framework handles packets. The hard part is figuring out what your game actually needs to synchronize and how to integrate that with existing systems that weren’t built for multiplayer.
Trust, but verify. We trusted someone’s progress reports and got burned. Now I ask for demos, not descriptions. “Show me it working” is a reasonable request.
Would I Do It Again?
Building a multiplayer system under this kind of pressure isn’t something I’d recommend seeking out. But I’m glad I went through it. The deadline forced clarity, and shipping under pressure taught me what I’m actually capable of.
The next time someone tells me something is impossible in the available time, I’ll remember the month I built multiplayer from scratch. Usually “impossible” just means “requires focus.”