I started building Retro Recordings XR in Unity. That sentence contains more history than it seems.
Unity was the obvious choice at the time. It had the best VR support, the largest community, the most documentation, and a free tier that made sense for a solo developer without external funding. I learned it deeply. I built things in it that I was genuinely proud of. I spent years in it.
I left it, and starting over was one of the most difficult decisions I have made in this project — and one of the most necessary.
This is the honest account of how that happened.
The frustrations that preceded everything else
Before the events that made Unity’s direction undeniable, there were years of friction that I had been absorbing and rationalising.
Interfacing with native C++ plugins in Unity is designed for teams with infrastructure. As a solo developer, every time I needed native code — and in a VR project with custom audio processing, that was frequently — I was fighting a system that was not built for my situation. This meant hours of debugging compilation errors with limited community support for the specific combination of Unity version, VR SDK version, and platform target I was working with. I got through it. But I got through it the way you get through something unpleasant — by enduring it, not by solving it.
Audio routing was its own category of difficulty. The Unity audio mixer, at the level of complexity that a working studio simulation requires — custom signal chains, flexible routing between virtual channels, hardware-inspired gain staging — was a constant source of rework. Not because it lacked capability, but because the capability it had was difficult to compose into the architecture I needed. Patches that worked in one Unity version broke in the next. Behaviour that was documented one way behaved differently in practice.
These were manageable problems. They were the tax I paid for the benefits Unity provided. I paid the tax and kept building.
When the platform stopped working for you
In 2023, Unity announced changes to its pricing model that introduced a runtime fee — a per-install charge that would apply retroactively to projects built on past versions of Unity. The specifics changed rapidly in response to the reaction from the developer community, but the nature of what had happened was clear.
For a solo developer building a long-term project, the message was not primarily about the money. It was about the relationship. Unity had changed the terms of an implicit agreement that the development community had built on for years. The trust that you could build something on a platform and understand your long-term costs had been broken.
To be clear: the runtime fee would not have directly affected me. I had never shipped a commercial application with recurring revenue built in Unity. Everything I had built there was B2B work, free projects, or experiments — things you do while learning or while working for others. I had no Unity product generating installs. By the letter of the new rules, I was not in the line of fire.
It did not matter. That was almost the point.
I had also attended several Unite events in Amsterdam over the years. I genuinely enjoyed them. The international community around Unity was something real — developers from everywhere, sharing work, comparing approaches, the particular energy of a platform that had made game development accessible to people who had no business being there. I was one of those people. I liked that about it.
But that community feeling was already starting to change before the pricing announcement. The founders who had built Unity into what it was had moved on. What replaced them was a leadership structure with different incentives — investment groups, quarterly pressures, a product being steered toward monetisation rather than toward developers. The pricing announcement was not a surprise in hindsight. It was consistent with a direction that had been set before it happened.
I had been absorbing friction for years. This made the total cost of staying visible in a way it had not been before. The compilation frustrations, the audio routing rework, the version-to-version instability — they added up differently once I stopped assuming the platform was working with me.
I made the decision in Sri Lanka.
I was travelling — a few weeks of actually being away from the desk, exploring, not thinking about the project. In the evenings I would check my phone, read through the developer forums and social feeds, and watch the Unity situation unfold in real time. The announcement, the community reaction, the backpedalling, the counter-reaction. And somewhere in those evenings, reading all of it on a phone while the day’s dust settled, I realised I had been fed up with Unity for a long time before this happened. The pricing announcement had not created the frustration. It had made it impossible to ignore.
The project had been quietly sliding into the background for a while. The frustration had accumulated in a way I had not fully acknowledged. This was why.
I did not decide to leave Unity in Sri Lanka. I decided to scope out other options when I got back. That is all. I gave myself permission to look. When I returned, in October 2023, I started evaluating. I only went back to Unity twice after that — to export some assets I needed. That was it.
The migration
Switching engines mid-project is genuinely as difficult as it sounds.
Some things transfer. The conceptual architecture of what you are building, the asset work, the design decisions, the hard-won understanding of the problem you are trying to solve. None of that is engine-specific.
Most of the implementation does not transfer. Scripting languages, component systems, physics implementations, audio middleware integrations, VR SDK bindings — all of it needed to be rebuilt from scratch in Unreal Engine 5’s framework.
I had looked at Unreal before. I had opened UE4, evaluated it, decided the workflow and visual language were not for me at the time, and closed it. That is not the same as knowing it. My first real steps in Unreal were in 5.1 — I needed MetaSounds, which had shipped with that version — and then 5.2. I was starting from close to zero.
The VR framework problem nobody talks about
In Unity, I had worked through roughly three different VR frameworks over the years. Each migration was expensive — weeks of rework, rebuilding interaction systems from scratch — but each time it was worth it. My last switch in Unity was to the XR Interaction Toolkit, which I treated as a long-term commitment. A foundation that would grow with the project rather than get abandoned by its maintainers.
I went into UE5 expecting to find something equivalent. There is not one.
The official VR Template is very basic — a starting point, not a framework. Epic does not offer a free, maintained VR interaction framework the way Unity does. The plugin ecosystem was underwhelming: a lot of UE4-era tools that had not been properly updated, incomplete implementations, or commercial plugins at prices that made no sense for an unproven purchase. Three hundred euros for a system with no track record from a small developer — not for me.
What I found eventually was VRExpansion — mordentral’s framework, originally built for UE4 and ported to UE5, still actively maintained. The documentation at vreue4.com is not great, but the framework itself had most of what I needed to get going: grabbable objects, interaction primitives, a reasonable foundation to build on. So that is what I used. It is still what I use today.
I will probably want to move to something closer to Meta’s native framework or whatever Epic eventually ships — but both currently lack too much of the basic functionality I have already built on top of VRExpansion. That migration, if it happens, is a long-term plan. Not now.
Two edits per minute
There was a period during early UE5 development where the editor became nearly unusable.
I am not being figurative. There were days where I could make approximately two meaningful edits per minute — open a Blueprint, wait, make a change, wait, compile, wait, evaluate result, wait, iterate. The rest of the time the editor was processing.
The hardware was part of it. But the bigger problem was that I was thinking in Unity terms.
In Unity, it is natural to nest components — a behaviour attached to an object, which contains sub-behaviours, which contain references to other components. I brought that mental model into UE5 and applied it to Blueprints. The result, across roughly the first two to three months — spanning UE5.1 through 5.2 — was one large Blueprint with approximately 200 embedded Blueprints nested inside it. Every object, every interactive element, every piece of logic, all living inside a single massive graph because that is how I had been trained to think about structure.
Unreal does not perform well with this. A Blueprint with that level of embedded complexity compiles slowly, loads slowly, and when you have many instances of it placed in a level, the editor slows to a crawl. Clicking nodes, dragging connections, making any change at all — everything becomes laboured. The feedback loop breaks down.
The architectural shift was to build using custom components — extending C++ classes with Blueprint, keeping each component focused on one thing, then composing behaviour from there. For RRXR this meant all the buttons, knobs, and faders could live in one Blueprint and be controlled from a central place, while changes to the base component or its extensions would propagate across the whole project automatically. More manageable, more compact, and much easier to reason about than the nested structure had been.
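The difference between the two structures can be sketched in plain C++. To be clear, this is an illustration, not Unreal's actual component API: `Control`, `Fader`, `Knob`, and `Console` are hypothetical names standing in for the Blueprint-extended components described above. The point is the composition pattern itself: each control does exactly one thing, and the console holds a list of them instead of embedding their logic inline.

```cpp
// Illustrative sketch only; hypothetical names, not Unreal's UActorComponent API.
#include <cassert>
#include <memory>
#include <vector>

// Base interface: the equivalent of a small, focused custom component.
struct Control {
    virtual ~Control() = default;
    virtual void SetValue(float v) = 0;
    virtual float GetValue() const = 0;
};

// Focused subclasses, one behaviour each. A change here reaches every console
// that uses them, mirroring how an update to a base component or its
// extensions propagates across the whole project.
struct Fader : Control {
    float gain = 0.0f;
    void SetValue(float v) override { gain = v; }
    float GetValue() const override { return gain; }
};

struct Knob : Control {
    float position = 0.5f;
    void SetValue(float v) override { position = v; }
    float GetValue() const override { return position; }
};

// The console composes controls rather than nesting their implementations,
// so it stays small no matter how many control types exist.
struct Console {
    std::vector<std::unique_ptr<Control>> controls;
    void ResetAll(float v) {
        for (auto& c : controls) c->SetValue(v);
    }
};
```

A single call on the console drives every control through the shared interface, which is the "controlled from a central place" part; the nested version had the equivalent logic duplicated inside one enormous graph.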
But I want to be honest: the slowdowns have not fully gone away. The cause now seems to be less about architecture and more about scale — a large number of Blueprint nodes, many instances in the level, and the editor starts to resist you. Clicking things, dragging things, any interaction becomes sluggish. The interaction system in particular needs a lot of code within one object, and I have not found a way to split it that does not create its own problems. My current workaround is to open an empty level and edit the Blueprint there, which makes editing faster but does not help with iteration — you still have to switch back to the full scene to see the actual result. It is a workaround, not a solution. I am still working on a better one.
It is one of those limitations you learn to work around rather than solve. UE5’s power comes with a certain amount of this.
But there were months where I was not sure any of it would come together. There were evenings — and if I am being honest, there were longer stretches than evenings — where it felt genuinely irrational to keep going. Years of work on a project that only a handful of people had ever seen or tried. A clear vision for what it should be, and not enough time or resources to close the gap between where it was and where it needed to be. No funding, no team, just the project and the job that pays for the project.
The version of quitting I came closest to was not dramatic. It was just the thought that maybe this was a hobby that had outgrown its time, and the practical parts of life — the video work, the photography, the web projects, the live production — were sufficient. That those things were real and this was something else.
I did not quit. Partly stubbornness. Partly because nine years of work is not something you just shelve. But mostly because the project is something I genuinely want to exist, and that is a different kind of motivation than professional obligation or sunk-cost reasoning. I want to use it. I want to stand in front of that console in VR and record something real. That has not changed.
The side project that is not work, that no one is depending on — it turns out that is exactly the kind of thing worth finishing.
What Unreal Engine 5 gave me
First, a clarification: Lumen and Nanite, UE5’s flagship rendering features, are not part of RRXR. Both are still works in progress for the forward renderer that VR requires, and the performance cost in a headset would be prohibitive regardless. If you are building VR in UE5 expecting to use those features, you will be disappointed for now. They are primarily for flat-screen applications — Nanite has gained partial VR support in later UE5 versions, but it is not production-ready for demanding VR use cases yet. That gap will close eventually, but it has not closed enough.
What UE5 actually gave me was different, and in some ways more useful for where the project was.
MetaSounds — Unreal’s procedural audio system — replaced years of Unity audio frustration in one move. Unity’s DOTS audio had been promised for years and kept not arriving, while the legacy audio stack aged and the gap between what I needed and what I had kept growing. MetaSounds gave me a node-based audio graph with the kind of composability and signal-processing capability that the studio simulation actually requires. It was the right tool where Unity had given me the wrong one.
The material editor and Blueprint system also clicked for me in a way that Unity’s equivalents never quite had. Some of this is familiarity — you get better at tools you use every day. But there is something in how UE5 exposes its rendering pipeline through the material editor, and how Blueprint nodes map to engine systems, that I find more legible than what I was working with before. The particle and VFX tooling is in the same category. I understand what I am doing in Unreal in a way that sometimes felt like guessing in Unity.
The renderer itself — even without Lumen — handles the interior studio environment better than Unity did at comparable settings. The lighting tools available within the forward renderer are capable, and the overall visual quality of a well-lit indoor scene is meaningfully higher.
And it gave me a fresh start on the interaction systems, which I rebuilt with the experience of knowing what had not worked in Unity and a clearer sense of what needed to.
What I miss, and what I don’t
I miss the Unity Inspector and Hierarchy.
That sounds like a small thing. It is not. Being able to inspect live object state at runtime — move things, disable components, tweak variables — without going into VR for every single test was essential for the kind of debugging this project requires. In Unreal, the Outliner is significantly harder to work with. Finding anything in a scene with many objects takes real effort. Inside the Blueprint editor it is worse: you cannot snap to selected objects, search components effectively, or jump to a selected item’s location in the viewport with a key press. These things all work in the level viewport but not in the Blueprint editor, which is exactly where you need them most. The workflow tax is real and ongoing.
I also miss the platform relationship Unity had with Meta. For most of the project’s life, everything Meta and Oculus released came to Unity first — proper SDK support, developer tools, documentation. Meta Avatars still do not have proper Unreal support. This is a genuine frustration for a VR developer building on Quest hardware. The reasons for Meta’s slower support for Unreal are not entirely clear to me, but the effect is that you are developing on the less-supported platform for the hardware you are targeting, which adds overhead at every platform-level integration.
What I do not miss is C#.
I have been programming since I was six — Sinclair BASIC, C64, MSX, Atari GFA BASIC, Pascal, then Lingo in Macromedia Director, ActionScript, PHP, JavaScript after ActionScript died, and eventually C# and some C++. I liked C# in Visual Studio. But I have always been a practical programmer rather than a rigorous one — I build things that work, I do not always build them the right way, and I have a tendency to solve a problem quickly rather than correctly.
Blueprint has, counterintuitively, made me a more disciplined developer than C# did.
Something about the visual structure — the way you have to define interfaces, use inheritance explicitly, build functions as discrete nodes — pushes toward better patterns. There is less ability to write a quick dirty solution and move on. The constraints of the system encourage structure.
There is also something I did not expect: the time I spend rearranging Blueprint nodes, straightening connection lines, cleaning up the visual layout of a graph — that time is not wasted. It is thinking time. It gives me a moment to look at what I just built before I test it, and to catch the things I would have missed if I had run it immediately. It is an accidental review process built into the workflow.
I would not have predicted that a visual scripting system would be better for my coding habits than a typed language. But here we are.
What I would tell another developer
Two things I wish someone had told me on day one of UE5:
Watch the keynote first. Epic’s Unreal events have a main keynote — Chris Murphy’s State of Unreal presentations are a good example — that covers the engine at a high level, what is new, what the philosophy is, how the systems relate to each other. Watching that before touching anything else would have saved me a significant amount of time fumbling around with the wrong mental model. It is the orientation that YouTube tutorials assume you already have.
And look beyond YouTube. Unity’s community is more open and more searchable — forum answers, tutorials, third-party documentation. Unreal’s best resources are less visible but they exist. Community forums, the documentation itself, specific Discord communities, conference talks. The information is there if you look in the right places.
On whether to switch: I would not recommend it unless you are genuinely hitting limits in your current engine, or you have a specific reason the other platform serves your use case better. The migration cost is real. If the issue is financial and you are considering alternatives, look at Godot first — for many applications it is a serious option and the cost is zero.
For VR specifically: Unity is still a strong contender. More than I would have said three years ago, honestly. I have not spent enough time in the current version to say exactly where it stands, but a lot has changed and as a prototyping environment in particular, it is faster to move in than Unreal. The platform support from Meta alone is worth considering.
I switched because I hit real limits and the relationship with the platform had broken down. That is a valid reason. It is also a high bar. If you are on the other side of it, the switch is worth it. If you are not, keep building.
Where things stand
Retro Recordings XR is further along now than it has ever been. The studio environment is photorealistic in a way it was not in Unity. The interaction systems work better. The audio processing is closer to what the product actually needs to be.
The migration cost years. I would not choose to do it again. But given where Unity went and where UE5 has taken this project, I would not choose differently either.
Nine years in, still building. The studio doors are not open yet. But they exist now.
Retro Recordings XR is a VR music production studio built in Unreal Engine 5, currently in development. All screenshots are captured in real-time — no renders, no mockups.
Subscribe to the Studio for development updates, or join the Discord to follow the build.
This post is a rewrite of my earlier post on this subject, which has a few more examples from the beginning days of Unreal. You can find it under Burning the Ships: Why I Left Unity for Unreal After 8 Years.