Earth Operation Center

Overview

The Earth Operation Center (EOC) is a virtual reality project that I worked on as part of my job at Meteor Studio. I was the lead user experience designer on the team and had to expand my horizons beyond a 2D Figma design file, learning new software like Unity, ArcGIS and Cesium.

Project Duration: Around 12 months.

Team size: 4.

Roles: Creative XR Developer, UI/UX Designer, Unity Developer.

Deliverables: A Virtual Tablet (Unity Asset), Figma Design Files, Experimental Data, Interaction Mechanisms.

Tools: Unity, Figma, Cesium, ArcGIS, Blender, Slack, Zoom.

Achievements

In 1 month, learned Unity from scratch and started development of the virtual tablet.

Created a scalable virtual tablet that met the desired specs and offered 3 levels of interaction.

Successfully presented the project in front of 5 stakeholders from Dreamscape at ASU.

Successfully delivered the project on time despite my lack of experience and changing timelines.

Joining Meteor Studio

I joined Meteor Studio in February 2023. The job came as a surprise to me because I did not remember applying for the position. Later, during the onboarding call, the other participants and I were informed that all of us had been referred to Meteor by well-respected authorities on campus. My best guess is that I was referred by the lab I volunteered for as an R developer the previous semester, but to this day I do not know for sure!

During the call, we were told about the exciting new technologies we would be working on. Meteor Studio focuses exclusively on creating extended reality experiences and operates under Arizona State University. I had never worked with these technologies before, apart from making some 3D models in Blender (knowledge that turned out to be of no use for my entire stint at Meteor Studio). Everything seemed like an exciting new challenge and I was eager to get started.

The opportunity to work at Meteor Studio is, to this day, one of the best opportunities I have received in my lifetime. I could not be more grateful! The projects were new and positively challenging, the supervisors were hugely inspiring, and my peers were absolutely masterful.

Testing my work at Dreamscape, ASU.

Problem Statement

Climate change is a real threat to our planet. Although there have been efforts to make things better, much still needs to be done to make an impact. Even today, there is a lack of awareness about our planet’s health, and serious gaps remain in the public’s knowledge about the causes of climate change and the relief efforts underway. These gaps keep people from facing reality, which in turn stops them from making decisions in favor of the planet. We need better educational tools that make climate change easier for the general public to comprehend. In addition, these tools should let researchers collaborate around crucial datasets, visualized in a way that gives them as realistic a picture as possible, both visually and in terms of data.

Responsibilities and End-Goals

My responsibilities at Meteor Studio were:

  • To be the lead user experience researcher in a team of 4.

  • To create interactions and interfaces for the virtual reality products being developed at Meteor Studio.

  • To keep up to date with the latest design trends in VR and keep learning new things to make myself more well-rounded as a VR developer.

  • To learn C# programming and other Unity development tools.

The end-goal of my role at Meteor Studio varied depending on my projects. For Earth Operation Center, my end-goal was to create a product with intuitive user interactions and to tell a coherent story through the product’s design and presentations.

A Shaky Start

Unity was new software to me. I had some experience with 3D models but little to no experience with game engines. When I joined Meteor, I was under the impression that they needed me to use Figma as the design tool and handle the other responsibilities of a user experience designer, such as user surveys, user stories and interactions. Slowly it became clear that if I was to succeed at Meteor Studio, I would have to learn Unity and even C# scripting.

To start with, I divided my role in the project into two parts: Unity developer, and user experience and interaction designer.

Unity Development

I started Unity with a few basic tutorials. Once I had the hang of the basic terminology and features, I opened the Earth Operation Center project, which was, at that time, being developed by our team’s Unity developer. I was still learning the ropes as I explored the project but admittedly, I took too long to get used to Unity development. By now, a month had passed.

User Experience and Interaction Design

The user experience requirements for the project were still unclear to me, so I started gathering more information from my then-project lead, Matthew Soson. I was instructed to come up with a good virtual tablet design that would become the central point of interaction for users inside the EOC experience.

Since this was my first virtual reality project, I had some biases and hurdles to cross before I could empathize with VR users, let alone encourage non-VR users to give our technology and our application a try. To accomplish this, I created mind maps and free-form associations on paper, which helped me overcome my inexperience with VR and get comfortable with the idea of headsets, controllers and VR visuals.

One Month Mark

For the next couple of months, I shifted my attention to sketching and wireframing the tablet and creating user journeys for the tablet interaction as I worked on learning Unity during my off-hours.

Application Interactions

This mind-map represents all the features that we initially planned to offer in Earth Operation Center (EOC). The circles or blobs represent features and entities, and the arrows represent a parent-child (contains) relation, while the dotted lines represent an association.

When I approached Matthew with some other concepts and diagrams, he wasn’t impressed. When I showed him this diagram, I got an audible gasp of excitement out of him. That was all the sign I needed: these types of diagrams are not only helpful to my way of thinking, they are expected from me as a user experience designer, because they convey the higher-level concept far better than text or rough sketches.

Mind-map for admin controls

Different concepts for how the tablet would look

Calculating all the buttons

Labelling all the buttons on the controllers

Some Extremely Rough Sketches

Putting pen to paper helps my thoughts flow and gives me a better understanding of my next steps.

I am not saying I am proud of how these shabby pages look, but I vouch for my process. These sketches are a necessary part of the design process; they bloom into something wonderful when the process ends, but the seeds are sown right here.

Understanding XR Interactions

Since this was my first time developing for extended reality (XR), I wanted to understand what I was working with. Meteor Studio provided me with a VR headset for the duration of the project to test out the features. The first headset I got stopped working for some reason, so I had to get it replaced with another, which cost me some time.

The headset I finally worked on was the Meta Quest 2. The controllers in the diagrams represent that headset. I literally counted each and every button on the controller and made a list of all the types of interactions that were available to developers through the controllers.
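For reference, the sketch below (illustrative, not the project’s actual code) shows how the inputs I catalogued map onto Unity’s XR input API: every physical control on a Quest 2 controller is exposed as a feature usage that a script can poll. The class name and the logging are my own placeholders.

```csharp
// A minimal sketch of reading Quest 2 controller inputs through
// Unity's XR input system. CommonUsages covers the controls I counted:
// A/B and X/Y buttons, thumbstick, trigger and grip.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class ControllerInputAudit : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.Controller | InputDeviceCharacteristics.Right,
            devices);

        foreach (var device in devices)
        {
            // Digital input: the A button (primaryButton on the right controller).
            if (device.TryGetFeatureValue(CommonUsages.primaryButton, out bool aPressed) && aPressed)
                Debug.Log("A button pressed");

            // Analog input: the trigger reports a 0..1 squeeze value.
            if (device.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue))
                Debug.Log($"Trigger: {triggerValue:F2}");

            // 2D input: the thumbstick reports an (x, y) axis.
            if (device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
                Debug.Log($"Thumbstick: {stick}");
        }
    }
}
```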

Exploring existing tools

A myriad of climate analysis tools exist for people to use on their computers, but very few exist in VR. Even those that do are not widely available for commercial or personal use; they are specialized tools that are either used for creating learning experiences, like our very own EOC, or products of research open only to a small section of the public.

So, for research, we looked at tried and tested desktop tools like ArcGIS. We learned how these tools process information to provide high-speed, accurate data to the user at such a large scale. We identified the features that were relevant to EOC and implemented them in our project.

UX inspirations

I was tasked with developing a UI for EOC, but I had no experience creating user interfaces for a VR product. Since VR is still an emerging field with no established design conventions yet, we have some liberty to experiment with our designs and create something new. Regardless, any user interface should still feel somewhat familiar to the user and must be intuitive to learn and use; I personally believe the UI should be one of the flattest parts of a system’s learning curve. To achieve this, I played around with some popular VR applications that had features similar to the ones we planned to implement in EOC. One of these applications was Tiltbrush, which inspired me to consider a compact form factor with the ability to hide the UI.

Why make a list of button interactions for a virtual tablet? Won’t it all be touchscreen?

Good Question!

Firstly, since a virtual touchscreen provides no tactile feedback in VR, I wanted the option to interact through controller buttons, because VR controllers provide plenty of haptic feedback to the user.

Secondly, I initially imagined a larger-than-life virtual tablet in the VR setting; picture something like a theatre screen in front of the user. The reasoning was that climate data involves a lot of numbers and long text, so a bigger screen would serve the purpose better, and within VR there are no limits on size, so why not give the user such an experience? With a screen that large, using the tablet in a touchscreen format would clearly be awkward: it might be too tall for some users, and users would want to keep such a large screen at a distance rather than at arm’s length. One can only imagine how awful it would be to watch a movie in a theatre seated an arm’s length from the screen.
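To make the first point concrete, here is a minimal sketch of the kind of haptic pulse a controller-button press can trigger. It uses Unity’s XR input API, but the helper name and the amplitude/duration values are hypothetical, not code from the project.

```csharp
// A hedged sketch: rumble the right controller briefly whenever the user
// "presses" a tablet button, to stand in for the missing touch feedback.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public static class TabletHaptics // hypothetical helper, not project code
{
    // amplitude is 0..1; duration is in seconds.
    public static void PulseRightController(float amplitude = 0.5f, float duration = 0.05f)
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.Controller | InputDeviceCharacteristics.Right,
            devices);

        foreach (var device in devices)
        {
            // Only send the impulse if the device actually supports haptics.
            if (device.TryGetHapticCapabilities(out HapticCapabilities caps) && caps.supportsImpulse)
                device.SendHapticImpulse(0u, amplitude, duration);
        }
    }
}
```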

Creating User Journeys

I created the user journey to aid my UI design. I always put user experience over the aesthetics of the app. A good user interface should accompany the best possible user experience, and not the other way around (otherwise you get products like the Magic Mouse by Apple).

I conducted stakeholder interviews to determine the “ultimate purpose” of the application and then tried to create user journeys to meet that purpose. I got constant feedback from my project lead, Matthew Soson, on these journeys. Overall the feedback remained positive and the team was happy with the progress I was making.

I did not conduct user interviews for these journeys because it was tough to find users for VR applications of this scale. We had to figure everything out ourselves. I did conduct internal interviews within Meteor Studio. My plan at this stage was to create a design and get feedback from people on that design by loaning them my VR headset.

Three Months Mark

Figma Wireframes

After completing the user journeys and rough sketches, I was confident enough to start making low-fidelity wireframes in Figma. The tablet took a rectangular shape. The bar on the left is a menu containing various options leading to different screens such as settings, data input, admin controls, etc.

“Apply data” takes the users to a screen where they can apply different datasets to the virtual globe contained in the Earth Operation Center scene. “Inspect Layers” gives the users the option to inspect and modify different layers of data applied on the globe. The idea was to allow users to add multiple datasets on the same globe as different layers.
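To illustrate the layering idea, here is a hypothetical sketch of the data model these two screens imply: datasets stack on the globe as independent layers that can be listed, toggled or removed without touching the others. All names here are illustrative, not the project’s actual code.

```csharp
// A sketch of "Apply data" / "Inspect Layers" as a simple layer stack.
using System.Collections.Generic;

public class DataLayer
{
    public string DatasetName;   // e.g. a climate dataset's display name
    public bool Visible = true;  // "Inspect Layers" can hide a layer
}

public class GlobeLayerStack
{
    private readonly List<DataLayer> layers = new List<DataLayer>();

    // "Apply data": add a dataset as a new layer on top of the stack.
    public void ApplyDataset(string datasetName) =>
        layers.Add(new DataLayer { DatasetName = datasetName });

    // "Inspect Layers": enumerate the layers currently on the globe.
    public IReadOnlyList<DataLayer> Layers => layers;
}
```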

Four Months Mark

High-Fidelity Wireframes and My Biggest Mistake

Up to this point, I had been creating the designs in Figma without checking out Unity’s design interface, tools and limitations. This turned out to be my biggest mistake in the entire project, as it cost me a lot of time and effort.

As you can see in the embedded Figma file, I had all sorts of interactions and UI elements ready for the tablet design. I had created intricate details and animations and shown the prototype to the entire team. For example, I had added an interaction design where users could select multiple datasets on a single screen: the selected datasets would change to a darker background, and a check mark in a highlighted box would appear at the top-right of the dataset’s bounding box. I had also created transitions for switching from one screen to another.

The reason this was a big mistake is that I was the one in charge of developing the screen in Unity. I did not have the right skillset to execute such a detailed design and interaction system, let alone finish it by the deadline. When I began developing in Unity, I quickly realized that I would not be able to accomplish many of my design goals, which led me to go back to Figma and change the design to a more minimal one without changing the user experience. I know this sounds like I gave up, but at the time, finishing this part of the project took priority over implementing a detailed design. I had to be realistic and adjust my goals to avoid letting the entire team down and delaying delivery of the product.

Five Months Mark

Figma to Unity, My Second Mistake and a Transformational Call

When the time came to transfer my high-fidelity wireframes into Unity, I had no idea how to go about it.

Google handed me some good results, one of which was a Figma to Unity plugin that would transfer all my Figma frames of a prototype to Unity and also keep the interactions from the prototype intact.

After I set the plugin up and tried it with my prototype, it worked exactly as it was supposed to. I could see my designs in Unity and could even interact with them. I also tested it on my Meta Quest headset, and it worked well enough for a prototype. I was elated! I had not only designed a VR tablet but also made it appear in VR without any prior VR development experience. I showed this to my team, and they were happy with how the tablet looked.

Satisfied with completing the task, I started fine-tuning the tablet in Unity and poking things here and there to see how everything worked.

A Call with Meteor Studio’s Director

As you know, it had already been 5 months since I started work on this project. Although I had a prototype to show for it, the prototype didn’t do all that much. Sure, you could click and go to different screens, but it had no functionality beyond that. It was not added to the Earth Operation Center’s main Unity repository, it wasn’t present in the main scene, and no controls were wired to it; it was just a standalone project.

Robert LiKamWa, the director of Meteor Studio, scheduled a meeting with me for the coming week. I was scheduled for my other part-time job at that time, but the call was too important to miss, so I rescheduled that day to make things work smoothly. Although I had interacted with Robert before, I had never had a 1-on-1 meeting with him and did not know what to expect. I will not discuss the conversation, but let’s just say that it wasn’t the sweetest. It was a wake-up call. I realized that I had been underperforming and that the expectations of me were far higher. That did not sit right with me, and I got to work immediately. I spent late nights working on the project (I did not charge those hours). I finally stopped being scared of Unity and pushed through. I needed tangible results that proved my talents as a designer, and I needed them quickly.

Six Months Mark

Conquering Unity

The plugin I used to take my design from Figma to Unity worked by converting all the prototype frames to PNGs and then layering them in Unity inside a UI panel. The fundamental issue with it was that the elements within a frame were not separately programmable. So, if a frame contained a button, it did not have the programmability of a native Unity button. This was a crucial functionality, and I made it my goal to finally conquer Unity and learn its design system.
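To show the difference concretely, here is a minimal sketch of what a native Unity button provides that a flattened PNG cannot: a click event a script can subscribe to. The field and method names are my own placeholders, not the project’s code.

```csharp
// A native uGUI Button exposes onClick; a PNG layer exposes nothing.
using UnityEngine;
using UnityEngine.UI;

public class TabletScreenButton : MonoBehaviour
{
    [SerializeField] private Button applyDataButton;    // a real, programmable button
    [SerializeField] private GameObject applyDataScreen;
    [SerializeField] private GameObject homeScreen;

    void Start()
    {
        // Wire the button's click event to a screen switch from script.
        applyDataButton.onClick.AddListener(ShowApplyDataScreen);
    }

    private void ShowApplyDataScreen()
    {
        homeScreen.SetActive(false);
        applyDataScreen.SetActive(true);
    }
}
```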

Unity’s user interface design tools were not as robust and did not have as much potential as, say, the tools for a website. I followed a YouTube playlist in which the creator built a VR project in Unity, and I also went through Unity’s documentation. It soon became clear to me how a basic UI panel and fundamental elements like buttons and switches could be created and used. Unity, at that time, had also newly released a UI Builder that was supposed to make a UI designer’s job much easier: there was no need to learn scripting, and users could just drag and drop elements in the window to create a UI panel and similar things.

I started my work in the UI Builder but soon learned that user interfaces built there had no VR functionality; they would not work, or even be displayed, in VR. This is because for a UI element to be seen in VR, it needs to exist in the world space of the Unity scene, whereas the UI Builder only worked in the overlay format.
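Here is a small sketch of the constraint that ruled out the UI Builder, using the uGUI canvas approach I fell back on: the canvas must be switched to world space and scaled to a plausible physical size before a headset can render it. The specific scale and position values are illustrative guesses.

```csharp
// Overlay canvases never render in VR; the canvas has to live in the
// scene's world space instead, at a sensible physical scale.
using UnityEngine;

public class WorldSpaceTabletCanvas : MonoBehaviour
{
    void Awake()
    {
        var canvas = GetComponent<Canvas>();

        // The key switch: render in world space so the headset can see it.
        canvas.renderMode = RenderMode.WorldSpace;

        // Canvas units are pixels, so shrink it to a plausible physical size:
        // at 0.001 scale, a 1000-unit-wide canvas is about a metre wide.
        transform.localScale = Vector3.one * 0.001f;
        transform.position = new Vector3(0f, 1.2f, 0.5f); // roughly desk height
    }
}
```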

Redoing High-Fidelity Wireframes

I redid the wireframes to simplify them and make them more suitable for Unity. I accomplished this by first playing around with Unity’s UI building tools and understanding what I could and could not do from a design perspective. I then transferred that knowledge to Figma and came up with a simplified design, built on top of the original wireframe.

Color Palette

I did not spend much time deciding the colors of the tablet. Since Meteor had yellow in its brand colors, I decided to go forward with that. The Meteor Studio yellow is a brighter shade, though, which caused high contrast in a virtual reality environment, so I brought the color down to a more pastel tone so that it did not take attention away from the central globe where the actual data was to be shown.
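As an aside, here is a minimal sketch of how a bright brand color can be pulled toward a pastel tone in Unity: keep the hue, reduce the saturation and lift the brightness in HSV space. The input color and the factors are illustrative, not the palette values we actually used.

```csharp
// Derive a pastel variant of a brand color via HSV adjustment.
using UnityEngine;

public static class PastelPalette // hypothetical helper
{
    public static Color ToPastel(Color brandColor, float saturationFactor = 0.45f)
    {
        Color.RGBToHSV(brandColor, out float h, out float s, out float v);

        // Keep the hue, soften the saturation, and push brightness up
        // so the tablet stops competing with the globe for attention.
        s *= saturationFactor;
        v = Mathf.Max(v, 0.95f);

        return Color.HSVToRGB(h, s, v);
    }
}
```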

Eight Months Mark

Project Lead’s Departure and an Organization Structure Shakeup

Eight months into the project, our project lead’s departure was announced. I also learned that our project had graduated from the program under which it was being funded. So, for the next month, we were told to work on the project according to our capacities while Meteor Studio figured out the next direction for the Earth Operation Center.

New Team

After a month, we were introduced to our new team lead, and our project was merged with another Meteor Studio project called Graphviz. The goal of both EOC and Graphviz shifted from being research-oriented, consumer-facing projects to an educational Dreamscape experience for undergraduate students on the ASU campus. We spent the first month introducing ourselves, getting to know each other’s projects, holding brainstorming sessions and coming up with new ideas.

Nine Months Mark

Unity Development and Refining the Design

As our team switched gears, we grew more and more comfortable as time passed. We hit our deadlines and kept the project on schedule. We learned how to deploy projects to Dreamscape, a laborious, high-risk task, as one wrong move could very easily crash the entire system.

Developing for Dreamscape called for changes to the project we had already created and also demanded new skills. Unlike in my earlier days, I took the challenge head on and learned the required skills in relatively little time. We integrated the two projects, Graphviz and Earth Operation Center, into one single mega project.

No more controllers

Dreamscape is a fully immersive experience that works best without any kind of controllers. That said, since it is used as a classroom experience, it allows multiple students and professors to experience a scene together. The students sit together in a classroom and use controllers specifically built for Dreamscape. When I started studying the Dreamscape controllers for our project, we were asked not to rely on the controllers and to focus only on touch interactions.

Final Pitch Presentation to the Dreamscape Director and Other Stakeholders, Approaching in 3 Months

12 Months Mark

Pitch Presentation and the Final Product

The day of the presentation finally arrived. The last month consisted of late-night work, frantic meetings, unanticipated bugs, frustrating conversations, repeated testing and a celebration when everything worked as planned.

We all played our role in the preparation of the pitch by making our own slides and adding them to the final presentation file.

On the day of the presentation, we each spoke about our role in the project and presented the slides we had made. Then we demonstrated the product to all the stakeholders. The stakeholders acted as students, and our project lead played the role of the professor. The stakeholders were pleasantly surprised by our work, and the feedback was positive overall.

Criticisms and Demands from the Stakeholders

  • Can the tablet be made smaller to match the dimensions of a physical tablet in the real world?

  • I would like to hide the tablet when it is not required.

  • I do not want the students to see the tablet. It is only the professor who needs it, so only they should be able to see it.

  • I would like to interact with the tablet using my fingers instead of my entire hand (this is a limitation of the current SDK; a new SDK that allows finger interactions is rolling out soon).

  • Can I have a miniature version of the globe in my hand to move it as I please? And can those changes be replicated on the central globe? (Yes, awesome idea! We will implement it in the future.)

Future Improvements

  • Decrease the tablet size (when the new SDK is released) to match the dimensions of a 12.9-inch iPad, to evoke a sense of familiarity when using the user interface.

  • Make the tablet follow the user around so that they can make use of the physical space around them and move around in the virtual space (a minimal sketch of this behaviour follows this list).

  • Add designs for Dataset Form to allow the usage of a wider variety of datasets.

  • Allow the user to orient the tablet in whichever way they see fit.

  • Test a curved design for the tablet.

  • Continuous improvements based on user feedback.
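As referenced above, here is a hedged sketch of the planned follow behaviour: keep the tablet at a fixed offset in front of the headset and ease it toward that target so it never snaps (sudden movement is a motion-sickness risk, as noted under Lessons Learned). The offsets and smoothing values are guesses, not tuned project values.

```csharp
// Smoothly keep the tablet in front of the user as they move around.
using UnityEngine;

public class FollowUserTablet : MonoBehaviour
{
    [SerializeField] private Transform head;        // the headset camera
    [SerializeField] private float distance = 0.6f; // metres in front of the user
    [SerializeField] private float smoothing = 4f;  // higher = snappier

    void LateUpdate()
    {
        // Follow only horizontal gaze, so looking down at the floor
        // doesn't drag the tablet down with it.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 targetPos = head.position + forward * distance;

        // Ease toward the target instead of snapping to it.
        transform.position = Vector3.Lerp(transform.position, targetPos, smoothing * Time.deltaTime);

        // Face the user: a world-space canvas reads correctly when its
        // forward axis points away from the viewer.
        transform.rotation = Quaternion.Slerp(
            transform.rotation,
            Quaternion.LookRotation(transform.position - head.position),
            smoothing * Time.deltaTime);
    }
}
```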

Lessons Learned

  • Motion Sickness - Experiencing VR can easily cause motion sickness in newer users. Therefore, things in VR need to have good resolution and good readability, and should not move in a sudden or laggy manner. When I first experienced the initial designs of my tablet in VR with the previous color palette (which has since been completely changed), the contrast caused a lot of discomfort to my eyes. This led me to change the color palette and move towards pastel shades to make the tablet more comfortable to look at and interact with. The contrast also drew a lot of unnecessary attention to the tablet, which the more muted palette solved.

  • Physical comfort - VR adds a new dimension (or really, allows a dimension that already exists to be more pronounced) of physical movement to software experiences. This means that a user experience doesn’t just include how they interact with the interface and how it makes them feel, but also how their body feels throughout and after the experience.

  • Developing in Unity - Prior to this project, I had no experience in developing in Unity. I had to learn how to develop working user interfaces for VR from scratch.

Conclusion

This project taught me a LOT of lessons, and that doesn’t come as a surprise to me. This was my first VR project ever and I made some beginner mistakes. I am proud of myself for overcoming such new challenges and finishing the project successfully. After this project, I was confident enough to add Unity as a skill on my CV. I believe I have grown well versed in Unity and can work on any XR project as a user experience designer.

Even for more mainstream user experience roles, my ability to adapt to such a new skillset and environment, and to carry my designs from Figma into this new medium, stands as a display of my tenacity, creativity and wit.