Best Practices for Building Spatial Solutions
Covid-19 is forcing us into a ‘new normal’: a new normal that makes most of us work from home and leads to long hours on Zoom or other video conferencing platforms. Meetings, inceptions, discovery workshops, conferences and webinars happen in the same monotonous way. We are stuck in front of a screen in almost the same pose and posture. Some might use a mobile phone or tablet, but that limits us to a small screen. We miss our office days and face-to-face meetings: the brainstorming, retrospectives, daily stand-ups and other important workshop exercises. Using physical boards and whole office walls, and seeing our colleagues’ and clients’ body language, was so normal pre-Covid. Now, even as offices re-open, it will be difficult to go back to business as usual while maintaining social distancing norms. We have to face it: there is no option but to shape what the ‘new normal’ means for our ways of working and interacting. We need touchless and contactless experiences that lead to the same results, the same great digital products and the same customer experiences.
Can XR help here? XR is moving from nicety to necessity.
At ThoughtWorks, we brainstormed on what new ways of working that are more engaging, interesting and productive could look like. One of those experiments is ThoughtArena, where users can collaborate in 3D space. It is an augmented reality based mobile app that allows users to create a space with virtual boards in it.
ThoughtArena allows its users to extend traditional 2D workspaces into real-world environments. Users can place virtual whiteboards in the 3D environment and use them like physical boards. They can place those boards wherever is comfortable and collaborate on them in real time with other users. In addition, users can add audio and video notes and vote on notes.
This article shares our learnings from building spatial applications and also presents results from analysing user experience and behaviour with XR applications. We have done a number of experiments and project deliveries in XR; learnings from those can also be found here.
Best Practices for Product Definition
XR tech is evolving at great speed, and a number of products are being built to meet different business needs. XR-based products are still maturing, and there are not yet enough standards and practices to validate product maturity. Here are our learnings from defining the product features for ThoughtArena.
- Go for Lean Inception - Inception is typically the phase where we define the product and its roadmap, as well as align on the big picture. However, after the initial meetings with stakeholders, we realized the need for a lean inception approach, which gives us a quick start and iteratively discovers the MVP. We planned a few remote meetings with stakeholders from the business, marketing and technology groups and came up with an initial feature list. We bet on Bare Minimum Viable Product (BMVP) features and a product vision within the first couple of days.
- Evolutionary product definition - We knew that we had signed up for a product that would evolve with time, and that we would need a plan that enables evolution. In ThoughtArena, we have added many new features since inception and deprioritized others based on user expectations and needs.
- Recruit a diverse user group - One of the enablers for product evolution is a user group with diverse skills and experience. We recruited a user group that was diverse in both ethnic and professional background, including members from IT, Operations, Delivery, Experience Design, Marketing and Support. The group helped us strengthen the product definition and vision, and helped us think about newer ways of interacting with the 3D boards, such as adding multimedia, which is typically not available on a physical or digital wallboard.
- State the Risky Assumptions - Defining a set of hypotheses for risks and assumptions and then validating them is valuable for a product, especially in an emerging technology space. We stated the following risky assumptions for ThoughtArena:
- 3D experiences may be more enjoyable and better than 2D experiences
- Spatial (3D) experiences may improve creativity and productivity
- XR Tech is ready for prime time.
- It may be a remedy for Zoom fatigue.
- XR products may leave a long-lasting impact on users.
- Plan to Validate the Risky Assumptions - Once we know the risky assumptions, we should plan to validate them quantitatively and qualitatively:
- Is the product a good fit for a 3D/spatial experience?
- Is spatial experience enjoyable?
- Does the product really solve the user’s problem?
- Map the user’s mental model of a physical collaboration space - Users should be able to relate the experience of the product to the flexibility and intuitiveness of a physical collaboration space.
- Focus on lifecycle and diversity of content (creation, reorganization, consumption) - The tool should assist users in the different stages of the content lifecycle: Creation (editing, duplication, look and feel of content), Reorganization (grouping, sorting, moving) and Consumption (updates, viewing real-time participation, tagging, commenting etc.). Users should also be able to choose from a diverse range of content such as freehand drawing, shapes, flowcharts, tables etc.
Understand the user journeys for different use cases and focus on mapping their end-to-end functional flows. The product should aim to provide more transparency to view and manage collaborators and activity across spaces and boards.
- Adapt the experience to the user’s context - Identify the relevant use cases to allow users to engage using their mobile devices. The tech should be stable enough to adapt to the user’s need to move around or stay stationary without impacting the experience.
- Market analysis and coverage - Market analysis of similar products is important to understand what is already out there. It also helps prioritize the product features. We analyzed XR collaboration products such as Spatial.io and Big Screen, as well as popular products like Jamboard and Mural. This analysis enabled us to prioritize features such as audio, voice-to-text and video notes, and to target Android and iOS/iPadOS, which in turn gives us broader user base coverage.
Best Practices of User Experience Design
The “Reality” is what we believe, and we believe what we see, hear and sense. So “Extended Reality (XR)” is about extending or changing that belief. Once users are immersed in the new digital environment, they expect to interact with the digital objects presented to them in the same or similar ways as they interact with real objects in the physical world.
- A minor glitch in an AR application can break the immersive experience, and users may stop believing in the virtual objects. This makes it all the more important to keep the user engaged all the time.
- AR applications need great onboarding — As AR is a new technology, most first-time users don’t know what to expect. They are familiar only with 2D interactions and will interact with the 3D environment in the same way while using a mobile device. This often means that interactions with digital content in AR are not intuitive.
E.g. in the ThoughtArena app, users did not realise that they needed to physically move around, or move their phone or tablet, to be able to work on the virtual board. We needed to design a good onboarding experience that properly communicated and educated users about these interactions, to help them get familiar with the new experience.
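As a rough illustration of this kind of onboarding guidance, here is a minimal Unity/AR Foundation sketch, with hypothetical component and field names rather than the actual ThoughtArena code: a “scan the room” prompt stays visible until the first plane is detected.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical onboarding helper: keeps a UI hint visible until
// AR Foundation has detected at least one plane to anchor boards on.
public class OnboardingHint : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;  // assigned in the scene
    [SerializeField] GameObject scanPrompt;        // e.g. "Move your phone to scan the room"

    void Update()
    {
        // trackables.count grows as the device discovers planes in the environment.
        bool planeFound = planeManager != null && planeManager.trackables.count > 0;
        scanPrompt.SetActive(!planeFound);
    }
}
```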
- We need to build hybrid user interactions that keep the best of both the digital and physical worlds. E.g. during the user testing phase of the ThoughtArena app, we saw that many users want to use gestures and patterns that are familiar to them. Users want to zoom in and out on the board or board items; however, there is no such thing as zooming in the real world. In the real world, to see things more clearly, we simply move closer to the objects. Similarly, on a real board, editing is done by removing an item and adding it again, whereas on a virtual board users expect edit functionality on the added objects. For the best experience, we should include the best features of both 2D and 3D, as in the sketch below.
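As an illustration of such a hybrid interaction, here is a minimal Unity sketch (the component and the `boardTransform` field are hypothetical) that reuses the familiar two-finger pinch gesture to scale the focused board instead of forcing the user to walk closer:

```csharp
using UnityEngine;

// Hypothetical hybrid interaction: reuse the familiar 2D pinch gesture
// to scale a virtual board instead of requiring physical movement.
public class PinchToScaleBoard : MonoBehaviour
{
    [SerializeField] Transform boardTransform;   // the board currently in focus
    [SerializeField] float minScale = 0.5f;
    [SerializeField] float maxScale = 3.0f;

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch t0 = Input.GetTouch(0);
        Touch t1 = Input.GetTouch(1);

        // Compare current and previous finger distances to get a scale factor.
        Vector2 prev0 = t0.position - t0.deltaPosition;
        Vector2 prev1 = t1.position - t1.deltaPosition;
        float prevDistance = (prev0 - prev1).magnitude;
        float currDistance = (t0.position - t1.position).magnitude;
        if (Mathf.Approximately(prevDistance, 0f)) return;

        float factor = currDistance / prevDistance;
        float newScale = Mathf.Clamp(boardTransform.localScale.x * factor, minScale, maxScale);
        boardTransform.localScale = Vector3.one * newScale;
    }
}
```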
- Understanding the environment — As AR technology lets you add virtual content to the real world, it’s important to understand the users’ needs and their real-world environment. While designing an AR solution, consider the use cases, lighting conditions and environments your users will be using the product in. Is it a small apartment, a vast field, a public space? Give users a clear understanding of the amount of space they’ll need for your app and the ideal conditions for using it.
- Be careful before introducing a new interaction language. XR interactions themselves are new to a lot of users, and we realized that at this stage it would be too much to introduce a completely new language of interaction on top of that. Following Jakob’s Law of UX, we continued the 2D board-and-sticky paradigm in the 3D environment and kept interactions such as pasting notes, moving them from one place to another, removing them etc. similar to what we do in 2D.
- We also researched whether a 3D arrangement of content is always better than a 2D arrangement, and found that spatial cognitive memory comes into play when the objects are distinct in look and feel and fewer in number; with a large number of similar objects arranged in 3D, the user interface looks more cluttered. Ref : 2D vs 3D
- Text input is a challenge in XR applications. If text input is taken in the spatial environment, a 3D on-screen keyboard has to be shown to the user, and it is hard to type on one. The alternative is to take text input with the device’s 2D keyboard, but that breaks the immersive experience. The user interface needs to be designed to provide pseudo-immersion (see the sketch below).
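One pragmatic option on mobile AR is to open the device’s native on-screen keyboard and accept a brief 2D overlay in exchange for familiar typing. A minimal Unity sketch, with a hypothetical note component:

```csharp
using UnityEngine;

// Hypothetical sketch: open the native keyboard for a note, accepting a
// brief 2D overlay instead of a hard-to-use 3D keyboard in the scene.
public class NoteTextInput : MonoBehaviour
{
    TouchScreenKeyboard keyboard;
    public string NoteText { get; private set; } = "";

    public void BeginEditing()
    {
        keyboard = TouchScreenKeyboard.Open(NoteText, TouchScreenKeyboardType.Default);
    }

    void Update()
    {
        if (keyboard == null) return;
        if (keyboard.status == TouchScreenKeyboard.Status.Done)
        {
            NoteText = keyboard.text;   // push the text back onto the 3D note
            keyboard = null;
        }
    }
}
```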
- Define product aesthetics — Define the color scheme, theme and application logo. We conducted a survey to gather opinions on different color themes. Depending on the environment, the color of a UI component may need to be adjusted for better visibility.
- XR spatial apps may allow users to roam around in the space. This may be a remedy for Zoom fatigue and may break the monotonous style of working. It may also make people more creative or attentive, or at least lead to a healthier work style. At the same time, we need to be careful while designing interactions, for the following reasons:
- Too much movement may cause physical fatigue.
- It may even be fatal if the user, immersed in the virtual environment, is not conscious of the real environment while interacting with virtual objects.
- Phone-based XR applications that require the user to hold the phone in an awkward position for long durations can cause hand strain.
- Understand human anatomy while designing user experiences in XR:
- Field of view (FOV) — The field of view is everything a user can see while looking straight ahead, both in reality and in XR content. The average human field of view is approximately 200 degrees with comfortable neck movement. Devices with a smaller FOV provide a less natural experience.
- Field of regard (FOR) — The space a user can see from a given position, including when moving the eyes, head and neck.
- Content positioning needs to be carefully defined, with provision to change the depth as per the user’s preference. The human eye is comfortable focusing on objects half a meter to 20 meters in front of us. Anything too close makes us cross-eyed, and anything further away tends to blur. Beyond 10 meters, the sense of 3D stereoscopic depth perception diminishes rapidly, until it is almost unnoticeable beyond 20 meters. So the comfortable viewing distance, where we can place relevant content, is 0.5 to 10 meters (see the sketch after this list).
- Neck/hand movement — Position primary UI elements in the direct FOV so that they need no neck or hand movement; secondary UI elements can be placed where some neck or hand movement is required. UI elements placed beyond that range require physical movement. Arm length is another vital factor: consider the arm’s reach while designing the UI and try to place fundamental interactions within that distance.
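As a minimal sketch of the content-positioning guidance above (the helper class and its names are hypothetical), content can be clamped to the comfortable 0.5 m to 10 m range along the camera’s forward direction:

```csharp
using UnityEngine;

// Hypothetical placement helper: keep content within the comfortable
// viewing range (roughly 0.5 m to 10 m) and centred in the user's FOV.
public static class ContentPlacement
{
    const float MinDistance = 0.5f;
    const float MaxDistance = 10f;

    // Returns a position along the camera's forward direction,
    // clamped to the comfortable depth range.
    public static Vector3 ComfortablePosition(Camera arCamera, float requestedDistance)
    {
        float distance = Mathf.Clamp(requestedDistance, MinDistance, MaxDistance);
        return arCamera.transform.position + arCamera.transform.forward * distance;
    }
}
```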
Best Practices of XR Development
XR development introduces us to a new set of technologies and tools. Standard development practices such as Test Driven Development, eXtreme Programming and Agile development help build a better product. XR is heavily influenced by gaming, and many standard practices of enterprise software development may not be straightforward to apply.
- Plan for spikes - XR is still fairly new, so it is better to plan spikes for uncovering the unknowns.
- CI/CD first - Having a CI/CD setup makes the development cycle smoother and faster. Investing some time at the beginning to set up CI so that it can run the test suite and deploy to a given environment with minimal effort saves a lot of time later on. We used CircleCI for building both the Unity app and the server-side app. For the server side, we integrated with GCP; we maintained different environments on GCP itself, and CircleCI handled the deployment to those environments.
- Quick feedback cycles - We followed a two-week iteration model and had regular showcases at the end of each iteration. The showcases included the diverse user group (it is important to have a user group for better inputs; see the earlier parts of this series), which helped us cover all aspects of the app: the features, the learnings, the process. All the feedback helped make our app better. Regular catch-ups and IPMs made sure we had stories detailed and ready to be picked up for development.
- Learn the art of refactoring - Refactoring often takes a back seat when delivering POCs or working to tight deadlines. We did not let that happen here: the team kept refactoring the code as we went along. To refactor safely, one must have a test suite to support it, so we made sure we had good code coverage and scenarios covered through our unit and integration tests. There will always be scope for more refactoring; the trick is to keep doing it continuously, in small bits and pieces, so we do not end up with a huge amount of tech debt.
- Decision analysis and resolution - Collect facts and figures and maintain decision records. For example, here are some of the decisions we had to take for the ThoughtArena app:
- Unity for the mobile apps — With Unity and its AR Foundation API, we were able to develop our app for both Android and iOS devices with minimal extra effort. The built-in support for handling audio and video also helped speed up the development process.
- Java and Micronaut for the server — We decided to work in the Java ecosystem given our tight schedule and the team’s familiarity with it. We chose Micronaut for its faster startup times and smaller footprint. We were focusing mainly on APIs and deploying to the cloud, so it felt like the right framework for us.
- Websockets for real-time communication — There are multiple ways to go about real-time communication. The first that comes to mind is a websocket: an open TCP connection between server and client for exchanging messages and events quickly. There are other options for achieving real time, such as queues, some databases, etc. We went with websockets because they do not require any additional infrastructure to get up and running, and they fit our use case: both client and server send messages to each other in real time.
- Relational database for transactional data — All our data is structured and straightforward, so it was easy for us to choose a relational database for transactional data.
- Tech evaluation - Make sure there is a plan for evaluating tech choices to check whether the tech is ready for prime time. We evaluated multiple tech choices, for example cloud anchors, environment scanning, video playback, real-time sync, session management, maintaining user profile state locally etc.
- Extendable design — Separate the input system from the business logic, so that multiple input mechanisms such as click, touch, tap, double tap and gaze can be integrated. We used Unity’s event system as a single source of event triggers. A minimal sketch of this separation is shown below.
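A minimal sketch of what this separation could look like (the interface and event names are hypothetical, not the actual ThoughtArena code):

```csharp
using System;
using UnityEngine;

// Hypothetical input abstraction: business logic listens to a single event,
// while concrete providers translate touch, click or gaze into that event.
public interface IBoardInputProvider
{
    event Action<Vector2> NotePlacementRequested;   // screen position of the interaction
}

// One possible provider: simple touch/click input.
public class TouchInputProvider : MonoBehaviour, IBoardInputProvider
{
    public event Action<Vector2> NotePlacementRequested;

    void Update()
    {
        // GetMouseButtonDown(0) also fires for a single touch on mobile.
        if (Input.GetMouseButtonDown(0))
        {
            NotePlacementRequested?.Invoke(Input.mousePosition);
        }
    }
}
```

The business logic, such as placing a note on a board, then subscribes to `NotePlacementRequested` without caring whether the event came from a tap, a click or a gaze trigger.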
- XR tech considerations - AR takes a bit of time to initialize and learn the environment. The speed of this initialization depends on the hardware and OS capabilities (see the sketch after this list).
- Tracking of the environment goes haywire when the device shakes a lot.
- The AR system is hard to control, as it is a platform-specific capability.
- Longer use of AR can make the hardware hot and drains the battery, as AR algorithms are really CPU intensive.
- AR tech is getting more mature year over year, but good hardware is still a dependency for this technology.
- AR cloud anchors are at an early stage but maturing rapidly. They need proper calibration; in simple terms, the user needs to do a little more scanning to get good results (to locate the anchor points). We had to de-prioritize this feature due to its instability.
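As a small illustration of working with slow AR initialization (the component and field names are hypothetical), feature UI can stay disabled until AR Foundation reports that the session is actually tracking:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical guard: wait for the AR session to be tracking before
// enabling board placement, since initialization time varies by device.
public class ARReadinessGate : MonoBehaviour
{
    [SerializeField] GameObject boardPlacementUI;

    IEnumerator Start()
    {
        boardPlacementUI.SetActive(false);

        // Wait until the device has understood enough of the environment.
        while (ARSession.state != ARSessionState.SessionTracking)
        {
            yield return null;
        }

        boardPlacementUI.SetActive(true);
    }
}
```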
- Real-time collaboration considerations:
- Define how long a session should be.
- Handle disconnections gracefully (see the sketch after this list).
- Define an approach for scaling sessions; we had to introduce a distributed cache for managing sessions.
- Cloud considerations — We used Google App Engine (Standard environment) for our app because of its quick setup, Java 11 support and auto-scaling features; GCP also handles the infrastructure for GAE apps. However, the Standard environment does not support websockets, so we had to switch to the Flexible environment, which does not support Java 11 out of the box. To keep things on Java 11, we had to provide a custom Docker configuration in the pipeline.
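As a rough sketch of handling disconnections gracefully on the client side (a generic C# example with hypothetical names, not the actual ThoughtArena implementation), the websocket client can reconnect with exponential backoff so a flaky network does not end the collaboration session:

```csharp
using System;
using System.Net.WebSockets;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical session client: reconnects with exponential backoff.
public class BoardSessionClient
{
    readonly Uri serverUri;

    public BoardSessionClient(Uri serverUri) => this.serverUri = serverUri;

    public async Task RunAsync(CancellationToken token)
    {
        var delay = TimeSpan.FromSeconds(1);

        while (!token.IsCancellationRequested)
        {
            using var socket = new ClientWebSocket();
            try
            {
                await socket.ConnectAsync(serverUri, token);
                delay = TimeSpan.FromSeconds(1);          // reset backoff after a successful connect
                await ReceiveLoopAsync(socket, token);    // returns when the connection drops
            }
            catch (OperationCanceledException) { return; }
            catch (WebSocketException)
            {
                // Connect failed or the connection dropped: fall through to the backoff below.
            }

            try { await Task.Delay(delay, token); }
            catch (OperationCanceledException) { return; }

            delay = TimeSpan.FromSeconds(Math.Min(delay.TotalSeconds * 2, 30));   // cap at 30 s
        }
    }

    async Task ReceiveLoopAsync(ClientWebSocket socket, CancellationToken token)
    {
        var buffer = new byte[4096];
        while (socket.State == WebSocketState.Open)
        {
            var result = await socket.ReceiveAsync(new ArraySegment<byte>(buffer), token);
            if (result.MessageType == WebSocketMessageType.Close) return;
            // Hand the received bytes over to the board-sync logic here.
        }
    }
}
```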
Best Practices of XR Testing
- Our learnings from developer-level testing of XR applications:
- We learned that most devices have software emulators that integrate with XR development tools such as Unity, which helps developers test the logic without a physical device. We realized that around 70% of the functionality of the ThoughtArena app did not require a physical environment or movement: adding and moving stickies, playing videos, making API calls to get data, displaying the list of boards or members in a space, and adding or deleting boards. All of this could be tested in the editor’s instant preview.
- Cases where physical movement was required, like pinning and unpinning boards, could not be tested directly in the editor without manually rotating the user’s POV from the inspector.
- Things that rely on platform-specific plugins, like uploading an image or saving a snapshot of a board to local storage, could not be tested in the editor (see the sketch after this list).
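A minimal sketch of how such platform-specific calls can be fenced off with Unity’s conditional compilation so the rest of the code stays testable in the editor (the class is hypothetical, and the device branch is left as a placeholder for whichever native plugin is used):

```csharp
using UnityEngine;

// Hypothetical wrapper: isolate platform-specific work (e.g. saving a
// board snapshot) so that everything else stays testable in the editor.
public static class SnapshotSaver
{
    public static void Save(Texture2D snapshot, string fileName)
    {
#if UNITY_EDITOR
        // In the editor there is no device gallery; log instead of failing.
        Debug.Log($"[Editor] Would save snapshot '{fileName}' ({snapshot.width}x{snapshot.height}).");
#elif UNITY_ANDROID || UNITY_IOS
        // On a device, call into the platform-specific plugin here
        // (image gallery / photo library APIs differ per platform).
#endif
    }
}
```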
- Unit testing - The Unity engine comes with a great unit testing framework, and by plugging in a mocking framework we can test units very well. We used it to cover the following (see the test sketch after this list):
- Features dependent on API calls — We set up an internal mock server early in development to reduce dependency on the APIs. Once the API contracts were settled, we could write tests against them and continue development even while the APIs were still being built.
- Board and note interactions, by mocking user inputs such as click/touch
- Movement-based interactions, such as pinning/unpinning boards and moving stickies, simulated through mocks
- 2D UI interactions and user flows
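A minimal sketch of an edit-mode test in the Unity Test Framework (NUnit style), using a hand-rolled in-memory fake in place of the real API-backed service; the interface and class names are hypothetical, not the actual ThoughtArena code:

```csharp
using System.Collections.Generic;
using NUnit.Framework;

// Hypothetical service interface and fake, purely for illustration.
public interface IBoardService
{
    int NoteCount(string boardId);
    void AddNote(string boardId, string text);
}

public class InMemoryBoardService : IBoardService
{
    readonly Dictionary<string, int> counts = new Dictionary<string, int>();

    public int NoteCount(string boardId) => counts.TryGetValue(boardId, out var c) ? c : 0;
    public void AddNote(string boardId, string text) => counts[boardId] = NoteCount(boardId) + 1;
}

public class BoardTests
{
    [Test]
    public void AddingANote_IncreasesTheNoteCount()
    {
        IBoardService service = new InMemoryBoardService();   // stands in for the real API-backed service

        service.AddNote("retro-board", "Went well: pairing");

        Assert.AreEqual(1, service.NoteCount("retro-board"));
    }
}
```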
- Automation testing - We extended Unity Editor’s unit testing framework and built a functional test suite that can run integration tests and end-to-end tests for some scenarios on a plugged-in device. We have open-sourced this automation testing framework as Arium; have a look here.
- Integration testing - We have an integration testing suite for the backend and for testing the APIs. Integration testing also helped us verify that the websockets work in the intended way. These tests run on CI after every push to the backend repo. Since our code also used GCP storage, we needed to make sure we could replicate that behaviour; we found that the GCP storage APIs provide a test storage that mimics the real storage locally, and we used that in our integration tests.
- Functional testing - Since XR depends on the environment, testers need to test the functionality in different environments: different lighting conditions, noise levels, indoors, outdoors, and while moving through the environment, to check the stability of the functionality.
- Acceptance criteria for XR stories are quite different from what we see in general software development stories; a lot of factors need to be considered:
- Keep the user immersion intact while working.
- Acceptable performance under different environmental conditions.
- A functionality that works very well may not work as well when the environmental conditions change; the user may then lose immersion, and the XR app will not meet expectations.
- Testers must consider spikes to define the correct acceptance criteria. The factors that have an impact may not be known beforehand; for example, what would be the impact of rain on a business story that requires outdoor testing?
- Acceptance criteria also need to evolve as the product evolves.
- Make sure the test team has supported devices with the required resolutions; plan for this from the start.
User Experience Testing
Plan for user experience testing to get user insights and validate the hypotheses with users. It helped us better understand the impact, benefits, limitations and flexibility of our spatial solution. The user experience testing also focused on collaboration and on assessing the effectiveness of the developed concept in terms of usefulness and usability. It can be divided into two parts: user research and usability testing.
- User research covers an as-is analysis of the users, their interactions and the tools they use relevant to the problem statement of the solution, and also notes user expectations in the changing environment.
- Usability testing covers how well the solution engages the user, and focuses on usability and usefulness of the solution.
The next section describes the results of user experience testing. User experience testing should cover quantitative and qualitative analysis. Here are our key learnings:
- Recruit users for dedicated testing sessions with the UX analysis team; recruit a minimum of 9 users.
- Plan user interviews before and after the testing sessions, and observe user behaviour during the testing.
- Recruit independent test users, and ask them to share their observations in a survey for quantitative analysis.
- Recruit users with diverse backgrounds, experiences and familiarity with XR.
- Define a test execution plan with a set of scenarios for dedicated user testing sessions, and try out mock sessions to see if the plan is working and we are getting the expected inputs.
- Define detailed questionnaires for independent testers.
- Collect observations only after a few trials of the app; otherwise, if the app onboarding/user training is not that great, we might not get accurate feedback on the app’s features.
Conclusion
XR tech is evolving at such a rate that the untapped potential boggles the mind. XR business use cases are growing with the convergence of AI and ML. XR devices are now able to scan and detect environments and adjust to them. Standards and practices are growing out of the learnings and experiments the industry is doing. We have shared our experience from building XR applications here.
Authors and contributors: Kuldeep Singh, Aileen Pistorius, Sumedha Verma and the XR Practices team at ThoughtWorks (XRPractices).
This article was originally published on Medium.