
Unity and Unreal are the two leading XR development platforms, both known for letting teams build from a single codebase and ship across devices. At Cubix, we rely on both engines, and they have consistently delivered quality results on tight timelines.
Let us walk you through the potential of these XR platforms, and their setbacks, based on real projects, real clients, and real code.
“When you’re prototyping fast and testing on devices like the Quest, Unity just gets out of your way. That’s why it was our starting point.”
– Shoaib Abdul Ghaffar, VP of Engineering, Cubix
It all started with our first XR deployment: a decor VR app where users could move furniture around a digital layout of their house on Meta Quest. Unity's XR Interaction Toolkit made it super quick to get things working.
Here’s a basic interaction script we used:
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class DecorObjectInteraction : MonoBehaviour
{
    private XRGrabInteractable grabInteractable;
    private Vector3 initialScale;

    void Awake()
    {
        grabInteractable = GetComponent<XRGrabInteractable>();
        initialScale = transform.localScale;
        grabInteractable.selectEntered.AddListener(OnGrabbed);
        grabInteractable.selectExited.AddListener(OnReleased);
    }

    void OnDestroy()
    {
        grabInteractable.selectEntered.RemoveListener(OnGrabbed);
        grabInteractable.selectExited.RemoveListener(OnReleased);
    }

    private void OnGrabbed(SelectEnterEventArgs args)
    {
        Debug.Log($"{gameObject.name} grabbed.");
        // Optional: highlight or play a sound
    }

    private void OnReleased(SelectExitEventArgs args)
    {
        Debug.Log($"{gameObject.name} released.");
        // Optional: snap to grid or ground
    }

    // Optional: add scale/rotate logic via UI or gestures
    public void ScaleUp()
    {
        transform.localScale += Vector3.one * 0.1f;
    }

    public void ScaleDown()
    {
        // Never shrink below the original authored size
        transform.localScale = Vector3.Max(initialScale, transform.localScale - Vector3.one * 0.1f);
    }

    public void RotateRight()
    {
        transform.Rotate(Vector3.up, 15f);
    }

    public void RotateLeft()
    {
        transform.Rotate(Vector3.up, -15f);
    }
}
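For completeness, here is roughly how those optional scale and rotate methods can be wired to controller input. This is a simplified sketch rather than our production code; the DecorObjectInputDriver class and the InputActionReference fields are placeholders that you would bind to your own input actions (for example, a thumbstick flick and a face button on the Quest controller).

using UnityEngine;
using UnityEngine.InputSystem;

public class DecorObjectInputDriver : MonoBehaviour
{
    public DecorObjectInteraction target;           // The currently held decor object
    public InputActionReference rotateRightAction;  // e.g. a thumbstick flick or face button
    public InputActionReference scaleUpAction;      // e.g. the primary (A) button

    void OnEnable()
    {
        // Assumes the actions are enabled elsewhere (e.g. by an Input Action Manager)
        if (rotateRightAction != null) rotateRightAction.action.performed += OnRotateRight;
        if (scaleUpAction != null) scaleUpAction.action.performed += OnScaleUp;
    }

    void OnDisable()
    {
        if (rotateRightAction != null) rotateRightAction.action.performed -= OnRotateRight;
        if (scaleUpAction != null) scaleUpAction.action.performed -= OnScaleUp;
    }

    void OnRotateRight(InputAction.CallbackContext ctx)
    {
        if (target != null) target.RotateRight();
    }

    void OnScaleUp(InputAction.CallbackContext ctx)
    {
        if (target != null) target.ScaleUp();
    }
}

In practice we also exposed the same methods to on-screen UI buttons, so the prototype worked with or without controller gestures.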
As the VP of Engineering at Cubix, I led the project personally and kept a close eye on the prototyping phase. Unity got us to a testable MVP within a week, so we could run a quick demo on Quest headsets. The client's positive response, in turn, gave us a head start in delivering even more captivating results.
Read More: Unity Video Game Engine – The Future of Gaming
In this case we had a real estate client demanding a cinematic VR experience. We learned quickly that Unity was not the right platform for VR films and high-end visuals; it lacked the realism. The other option was Unreal.
So we took a leap of faith and adopted Unreal's Collaborative Viewer Template. Unreal ships with powerful tools such as Lumen and Nanite, which gave us fast, immersive multi-user condo walkthroughs practically out of the box.
Here’s an example of how we updated materials at runtime using C++:
// MyMaterialChangerComponent.h
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "MyMaterialChangerComponent.generated.h"

class UMaterialInterface;
class UMeshComponent;

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class YOURPROJECT_API UMyMaterialChangerComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UMyMaterialChangerComponent();

    // Call this to change material at runtime
    UFUNCTION(BlueprintCallable, Category="Material")
    void ChangeMaterial(UMaterialInterface* NewMaterial, int32 ElementIndex = 0);

protected:
    virtual void BeginPlay() override;

private:
    UPROPERTY()
    UMeshComponent* MeshComponent;
};
// MyMaterialChangerComponent.cpp
#include "MyMaterialChangerComponent.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Components/SkeletalMeshComponent.h"

UMyMaterialChangerComponent::UMyMaterialChangerComponent()
{
    PrimaryComponentTick.bCanEverTick = false;
}

void UMyMaterialChangerComponent::BeginPlay()
{
    Super::BeginPlay();

    // Try to find a mesh component on the owner
    AActor* Owner = GetOwner();
    if (Owner)
    {
        MeshComponent = Owner->FindComponentByClass<UStaticMeshComponent>();
        if (!MeshComponent)
        {
            MeshComponent = Owner->FindComponentByClass<USkeletalMeshComponent>();
        }
    }
}

void UMyMaterialChangerComponent::ChangeMaterial(UMaterialInterface* NewMaterial, int32 ElementIndex)
{
    if (MeshComponent && NewMaterial)
    {
        MeshComponent->SetMaterial(ElementIndex, NewMaterial);
    }
}
This part was implemented late at night, right before the client review, and it turned out to be the deal maker. With this exact code, clients could change the finishes and layout of the house just by pointing a laser controller at a surface. It was convenient, it felt innovative, and it kept people engaged.
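To give a feel for the controller side, here is a minimal sketch of the laser-pointer flow, not our production code: the AWalkthroughPawn class name and its RightController (a UMotionControllerComponent) and SelectedMaterial members are placeholders. It traces along the laser from the motion controller and hands the chosen material to the component shown above.

// Minimal sketch; AWalkthroughPawn, RightController, and SelectedMaterial are hypothetical names.
void AWalkthroughPawn::ApplySelectedMaterial()
{
    if (!RightController || !SelectedMaterial)
    {
        return;
    }

    // Cast a ray along the laser pointer from the right motion controller
    const FVector Start = RightController->GetComponentLocation();
    const FVector End = Start + RightController->GetForwardVector() * 5000.f;

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility))
    {
        if (AActor* HitActor = Hit.GetActor())
        {
            // Reuse the runtime material changer shown above, if the hit actor has one
            if (UMyMaterialChangerComponent* Changer = HitActor->FindComponentByClass<UMyMaterialChangerComponent>())
            {
                Changer->ChangeMaterial(SelectedMaterial);
            }
        }
    }
}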
Read More: How VR Fitness Apps Transform Home Workouts
Unreal may give you some of the best visuals out there, but like any other platform, it has its issues. In one of our safety training apps, the frame rate dropped badly, which caused plenty of confusion until we optimized the scene with Hierarchical Instanced Static Meshes and baked lighting.
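The instancing half of that fix is simple in concept: repeated props stop being hundreds of individual actors and become instances of a single Hierarchical Instanced Static Mesh component, which Unreal can batch and cull far more efficiently. A rough sketch, with hypothetical names (ASafetyPropBatcher, PropMesh, Transforms) rather than our actual code:

// Illustrative sketch only; class and parameter names are placeholders.
// Requires: #include "Components/HierarchicalInstancedStaticMeshComponent.h"
void ASafetyPropBatcher::BuildPropInstances(UStaticMesh* PropMesh, const TArray<FTransform>& Transforms)
{
    if (!PropMesh || Transforms.Num() == 0)
    {
        return;
    }

    // One HISM component replaces hundreds of individual static mesh actors
    UHierarchicalInstancedStaticMeshComponent* Batch =
        NewObject<UHierarchicalInstancedStaticMeshComponent>(this);
    Batch->SetStaticMesh(PropMesh);
    Batch->AttachToComponent(GetRootComponent(), FAttachmentTransformRules::KeepRelativeTransform);
    Batch->RegisterComponent();

    // Every copy becomes an instance, so the whole set renders in a handful of draw calls
    for (const FTransform& InstanceTransform : Transforms)
    {
        Batch->AddInstance(InstanceTransform);
    }
}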
Comparing performance tuning across the two engines, Unity was much easier to work with: a small script let us switch LODs based on distance for mobile VR.
using UnityEngine;

[RequireComponent(typeof(LODGroup))]
public class DynamicLODController : MonoBehaviour
{
    public Transform targetCamera; // Usually the XR Camera or Main Camera
    public float[] lodDistances = { 5f, 10f, 20f }; // Customize these

    private LODGroup lodGroup;

    void Start()
    {
        lodGroup = GetComponent<LODGroup>();
        if (targetCamera == null)
        {
            Camera cam = Camera.main;
            if (cam != null) targetCamera = cam.transform;
        }
    }

    void Update()
    {
        if (!targetCamera || lodGroup == null) return;

        float distance = Vector3.Distance(transform.position, targetCamera.position);

        // Map the distance bands onto a 0..1 factor, then onto an LOD index
        float lodBias = 1f;
        if (distance < lodDistances[0])
            lodBias = 0f; // Highest LOD
        else if (distance < lodDistances[1])
            lodBias = 0.5f;
        else if (distance < lodDistances[2])
            lodBias = 0.8f;
        else
            lodBias = 1f; // Lowest LOD

        lodGroup.ForceLOD(Mathf.FloorToInt(lodBias * (lodGroup.lodCount - 1)));
    }
}
This simple logic saved us almost 30 percent of frame time on Quest 2. In these situations I always stress to junior developers: never underestimate the power of distance-based LOD and culling techniques when building for standalone VR.
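Culling is just as cheap to set up. Here is a minimal sketch, assuming small props live on their own layer (the "SmallProps" layer name and the 15-metre distance are placeholders): per-layer cull distances tell the camera to stop rendering distant clutter entirely.

using UnityEngine;

public class LayerCullingSetup : MonoBehaviour
{
    public Camera xrCamera;                      // Assign the XR rig's camera
    public string smallPropLayer = "SmallProps"; // Layer used for small decorative props
    public float smallPropCullDistance = 15f;    // Beyond this, props on that layer are not rendered

    void Start()
    {
        if (xrCamera == null) xrCamera = Camera.main;
        if (xrCamera == null) return;

        // Unity expects a 32-entry array; 0 means "use the camera's far plane"
        float[] distances = new float[32];
        int layer = LayerMask.NameToLayer(smallPropLayer);
        if (layer >= 0) distances[layer] = smallPropCullDistance;

        xrCamera.layerCullDistances = distances;
        xrCamera.layerCullSpherical = true; // Measure distance radially, which feels more natural in VR
    }
}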
Read More: How to Develop Console Games with Unreal Engine
Using Unreal's MetaHuman Animator for the first time was one of the coolest experiences I have had with my team. We used it to build an app that scans a face and turns it into a lifelike digital character in next to no time.
We used Live Link to sync facial motion and imported the resulting asset straight into our VR project, which let us drop the character into a virtual training room where users practice conversations with digital humans.
The Blueprint flow itself was minimal: the Live Link face subject drives the MetaHuman's facial animation, and the character drops straight into the scene.
When we showed this to the client, the response was overwhelming; they even assumed we had outsourced the character work to a movie studio. That's the power of MetaHuman paired with good hardware.
Read More: Unity vs Unreal Engine – Better for Metaverse Development
Next up, MR. We have run experiments on both the Quest Pro and the Vision Pro, and the results were mixed: distorted visuals, inconsistent lighting, and hand-tracking lag. One of our apps overlays virtual workstations on the real desk, and under bright lighting the environment mapping struggled.
That experience made one thing clear: MR currently works best for short, guided tasks. Don't build anything that depends on pixel-perfect passthrough yet, and treat MR as an optional feature rather than a core mechanic until the tech matures.
Read More: Will Vision Pro Revolutionize The Way We ‘Experience’ Things
When it comes to XR development, one personal experience taught me a lot. I once set up Tilt Brush (itself built with Unity) for my kid and watched him paint a 3D dragon in mid-air; it gave me a real sense of how XR can change the way kids learn.
We built a Unity-based AR flashcard app where users scan a dinosaur card and a 3D dino pops up and roars. Here's a Unity code snippet we used:
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using System.Collections.Generic;

public class DinoImageTracker : MonoBehaviour
{
    public ARTrackedImageManager trackedImageManager;

    [System.Serializable]
    public class DinoData
    {
        public string imageName;      // Must match the reference image name
        public GameObject dinoPrefab; // 3D prefab to spawn
    }

    public List<DinoData> dinoLibrary = new List<DinoData>();
    private Dictionary<string, GameObject> spawnedDinos = new Dictionary<string, GameObject>();

    void OnEnable()
    {
        trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    }

    void OnDisable()
    {
        trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;
    }

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
    {
        // Handle newly detected images
        foreach (ARTrackedImage trackedImage in eventArgs.added)
        {
            SpawnOrUpdateDino(trackedImage);
        }

        // Handle updates (position, tracking state)
        foreach (ARTrackedImage trackedImage in eventArgs.updated)
        {
            SpawnOrUpdateDino(trackedImage);
        }

        // Handle removed images
        foreach (ARTrackedImage trackedImage in eventArgs.removed)
        {
            if (spawnedDinos.ContainsKey(trackedImage.referenceImage.name))
            {
                Destroy(spawnedDinos[trackedImage.referenceImage.name]);
                spawnedDinos.Remove(trackedImage.referenceImage.name);
            }
        }
    }

    void SpawnOrUpdateDino(ARTrackedImage trackedImage)
    {
        string imageName = trackedImage.referenceImage.name;

        if (!spawnedDinos.ContainsKey(imageName))
        {
            GameObject prefab = GetDinoPrefab(imageName);
            if (prefab != null)
            {
                GameObject dino = Instantiate(prefab, trackedImage.transform.position, trackedImage.transform.rotation);
                spawnedDinos[imageName] = dino;

                // Optional: play roar animation and sound
                Animator animator = dino.GetComponent<Animator>();
                if (animator != null) animator.SetTrigger("Roar");

                AudioSource audioSource = dino.GetComponent<AudioSource>();
                if (audioSource != null) audioSource.Play();
            }
        }
        else
        {
            // Update position if already spawned
            GameObject dino = spawnedDinos[imageName];
            dino.transform.position = trackedImage.transform.position;
            dino.transform.rotation = trackedImage.transform.rotation;
        }
    }

    GameObject GetDinoPrefab(string imageName)
    {
        foreach (var data in dinoLibrary)
        {
            if (data.imageName == imageName)
                return data.dinoPrefab;
        }
        return null;
    }
}
XR looks a lot like the future of primary education, so our weekend project became a full XR prototype built around that idea, and it was striking how engaged kids become once learning turns spatial.
That is really what XR is: AR, VR, and MR blended into one immersive spectrum within a single development platform, with applications that range from training and education to gaming and entertainment.
Read More: Can Custom Software Revolutionize Remote Education
Having extensive experience with both platforms, I would say they offer vastly different things and cater to different niches.
When it comes to Unity, it's a lightweight, flexible, fast-to-deploy tool. It is perfect for MVPs, multiplatform apps, and any time you want to get something working across devices quickly.
For Unreal, it’s a high-end production engine. Use it when quality matters and when you have the hardware budget to back it up.
Read More: Top 10 Cross-Platform App Development Frameworks
With years of experience across development projects, I believe learning is a never-ending process. At Cubix we have a couple of test builds available, from a motion capture performance viewer on Meta Quest to an AR model viewer for product demonstrations.
Ping me or join our newsletter, and I’ll send my APKs. Let’s expand our horizons and discover what more these tools have to offer.
If you are getting into XR development, try both Unity and Unreal. They each have their strengths. The trick is knowing when to use which.
Need help deciding? Want actual source code examples to start from? I am always happy to share what I have learned.
Read More: Ideal Game Engine for Creating Powerful Simulation Games