Designing for new realities in the age of spatial computing and AI

Lauren Cervera

April 24, 2024


In the dynamic intersection of spatial computing and AI, today’s futuristic technologies are reshaping our digital interactions. As these modes of computing converge, new considerations are emerging for how we build for the world around us.

The announcement of Meta Horizon OS promises a more accessible mixed reality ecosystem, empowering developers to bring apps, experiences, and hardware to more users.

Top innovative design leaders shared key insights on how to navigate these systems. From human biometric inputs and accessibility to digital literacy and AI integration, we are in a new era requiring heightened intentionality and personalization.

Here are the key takeaways on how to design for these new realities.

Is spatial computing ready for mass consumers?

Spatial computing encompasses technologies that leverage the physical context of the surrounding world, gradually integrating more contextual information into computational processes. 

"Spatial computing is really anything that uses the physical context of the world that surrounds you — there's been this gradual emergence of these technologies which have this characteristic of bringing in more of your context and being able to use that in a computation."
— Keiichi Matsuda, Founder of Liquid City


Products like the Apple Vision Pro exemplify this trend by recognizing and labeling real world physical objects. According to Keiichi, "We're now in the stage of how all of those different parts will become aggregated and become part of a system that can link into all of those different areas." 

This progression marks a transformative shift towards a more integrated and contextual computing experience, where spatial computing technologies seamlessly interact with and augment our physical environments.

How do major tech investments influence universal design standards?

Michelle Cortese, XR design lead at Meta and VR educator at NYU, warns against the risk of major tech companies developing standards solely for profitability. She urges the industry to push back and set these standards itself, advocating for design choices that cater to human needs for all users.

The development of design standards in XR is crucial. Drawing on hedonomics, a systematic framework for conceptualizing affective human factors design, she organizes these standards into five categories: safety, functionality, usability, pleasure, and individuation.

"We are all in the process together right now of setting these standards and we have to hold human needs as the highest order problem."
— Michelle Cortese, Meta XR Design Lead


These collaborative efforts will shape spatial design standards that prioritize the individual user experience above commercialization.

How will real-time personalized products impact the spatial computing industry?

The move towards real-time, personalized software experiences marks a significant shift in the spatial industry, as discussed by Daniel Marqusee from Bezi and Agatha Yu, former lead designer at Oculus. They explored the potential for more adaptable interfaces that cater to individual needs, signaling a departure from static designs towards dynamic ones that enhance user experiences.

Daniel explored what more personalized software means for design practice: "We're not going to be designing these macro flows that essentially a hundred designers would take to create an entire application."

"There's so much opportunity here to make more malleable, more personal software where it adapts itself — based on what your needs are. Systems that reposition themselves according to an individual's needs will help people feel connected to technology rather than feeling forced to adapt it.”
— Agatha Yu, Human Interface Designer Manager


How are design roles evolving in an era of individually tailored UI experiences?

Designers who prioritize functionality, adaptability, and user empowerment will play a crucial role in shaping the future of AI-driven interfaces. The panelists dove in:

"The role of a designer is shifting towards understanding the importance of API access and strategic placement within the user experience. Designers will need to think about where the overall experience of their work is going to be exposed."
— Keiichi Matsuda, Founder of Liquid City


Michelle shared her perspective: "There’s going to be a lot of bodily data that will be used to power these devices." This highlights the growing importance of understanding and utilizing that data to improve AI recommendations and actions.

Agatha highlights the opportunities for interaction designers to create new paradigms for guiding AI, "We will have implicit data that will inform better AI recommendations or AI actions. One of the great opportunities for every interaction designer out here is that you now have a chance to actually create new paradigms about how to guide AI."

These evolving roles extend beyond traditional design tasks and encompass tuning behavior models and generating data to drive innovation in this space.

How can designers create safe spatial computing environments?

The immersive nature of spatial computing also amplifies threats, such as VR harassment. Michelle took on this topic, highlighting the critical need to establish safe spatial environments.

"By incorporating features based on proxemics and consent theory, borrowing existing ways in the real world that we protect people and their bodies, we can translate those into VR."
— Michelle Cortese, Meta XR Design Lead


Agatha expanded on this discussion, underlining the pivotal role of designers in actively engaging with leadership to prioritize ethical considerations. She remarked, "It requires the designer to have a very active conversation with leadership just to make sure that they understand and are able to wrap their head around it viscerally."

Keiichi contributed by discussing the necessity for loyal AI agents capable of understanding both physical and human context to address these issues. Instead of being a single super-intelligent entity, he proposed,

"There's a way of thinking about AI as more of an ecosystem where we can have many different agents that exist in this space and that you could curate your circle of trust and let only the ones that you want to have that information into it."
— Keiichi Matsuda, Founder of Liquid City


By implementing consent paradigms, loyal AI agents, and spatial boundaries within VR environments, designers can collaboratively create safer spaces in this spatial landscape, fostering a more secure and respectful digital environment for users.
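Michelle's proxemics-and-consent idea can be made concrete with a toy example. The sketch below is a hypothetical, minimal model, not anything shown on the panel: avatars a user has not consented to fade out as they cross distance thresholds loosely inspired by real-world proxemic zones. Every name and numeric threshold here is an illustrative assumption.

```python
# Hypothetical sketch of a proxemics-based personal boundary for a social
# VR scene. Thresholds are illustrative, loosely following Hall's proxemic
# distances; none of these names or values come from the panel discussion.

from dataclasses import dataclass, field

INTIMATE = 0.45  # meters: inside this zone, non-consented avatars are hidden
PERSONAL = 1.2   # meters: outside this zone, everyone is fully visible

@dataclass
class Avatar:
    name: str
    position: tuple                            # (x, y, z) in meters
    consented: set = field(default_factory=set)  # names allowed in close range

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def visibility(viewer: Avatar, other: Avatar) -> float:
    """Return an opacity from 0.0 (hidden) to 1.0 (fully visible).

    Avatars the viewer has not consented to fade out as they cross the
    personal zone, and disappear entirely inside the intimate zone.
    """
    if other.name in viewer.consented:
        return 1.0
    d = distance(viewer.position, other.position)
    if d >= PERSONAL:
        return 1.0
    if d <= INTIMATE:
        return 0.0
    # Linear fade between the intimate and personal thresholds.
    return (d - INTIMATE) / (PERSONAL - INTIMATE)
```

The design choice mirrors the quote: rather than inventing a new safety mechanism, it translates a real-world protection (bodily distance and consent) into a rendering rule the scene can enforce automatically.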

How can designers develop innovative input paradigms for spatial computing?

Diving into the evolution of input paradigms across technological milestones, Agatha explained the historical impact of input methods and how they are driving new mediums forward.

“You can’t take traditional inputs or repurpose inputs and push them into XR. The value proposition will be ambiguous if it’s emulating and replicating another medium.”
— Agatha Yu, Human Interface Designer Manager


The transition towards embodied experiences in XR requires implicit inputs and an understanding of spatial relationships that we don’t have with screen devices.

“You need to have a lot of implicit signals, like knowing your body posture, understanding your intent, what are you looking at, what are you paying attention to, so that it can actually emulate a reality that feels coherent to you — the standard will emerge from there.”

Looking forward, she anticipates a shift towards more diverse platforms and investment strategies in XR. Envisioning the AI future: “AI is particularly interesting for this medium because we are already overwhelmed by technology. AI will be able to filter, be able to personalize a spatial medium for the user.”
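As a toy illustration of Agatha's point about implicit signals, the sketch below fuses gaze, posture, and dwell time into a single attention estimate that an adaptive interface could consult before surfacing detail. Every signal name, weight, and threshold is an invented assumption for illustration, not an API or method from the panel.

```python
# Hypothetical sketch: fusing implicit input signals into an attention
# score for an adaptive XR interface. Weights and thresholds are made-up
# illustrative values, not tuned or sourced from any real system.

def attention_score(gaze_hit: bool, leaning_in: bool, dwell_seconds: float) -> float:
    """Combine implicit signals into a 0..1 attention estimate."""
    score = 0.0
    if gaze_hit:       # the user's gaze ray intersects the object
        score += 0.5
    if leaning_in:     # torso posture oriented toward the object
        score += 0.2
    # Sustained dwell contributes up to 0.3, saturating after 3 seconds.
    score += 0.3 * min(dwell_seconds, 3.0) / 3.0
    return min(score, 1.0)

def should_expand(score: float, threshold: float = 0.6) -> bool:
    """Expand a detail panel only when implicit attention is high enough."""
    return score >= threshold
```

The point of the sketch is the shape of the problem, not the numbers: instead of waiting for an explicit click, the interface reads posture, gaze, and dwell together, which is exactly the kind of implicit-input paradigm the quote describes.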

How can designers prioritize digital ethics in spatial computing?

Converging on how to contribute positively to the evolving digital landscape, the panelists offered their final thoughts.

Michelle advocated for designing with a focus on spatial computing while considering primary standards. "When designing for spatial computing, regularly look to the primary standards by which we build safety and functionality and usability and pleasure and meet human needs in the real world — translate those over into all these paradigms."

Keiichi expressed a desire for a loyal AI that respects user privacy and empowerment. "I would like a loyal AI. I don't want to have an ambiguous relationship with the super intelligent being that knows everything about me.” He also emphasized the need to “give people the tools to be able to contribute to that world and to disrupt this idea that you have a platform and a consumer, where people can freely create and engage with each other.”

Agatha emphasized the importance of autonomy and need for a richer design vocabulary for how we engage with these new data and agents. “Designers need to have an active role in shaping not just how the models are built and how it actuates any types of recommendations or assumptions about you, but also think about new interaction affordances and patterns that allow people to guide the model's behavior.” 

How will you explore a human-centric digital future?

Spatial computing and AI present a transformative opportunity to revolutionize human communication and interaction. By prioritizing design principles, leveraging real-time personalization, and building loyal AI agents, designers can drive positive change and innovation while providing empowering solutions for all users.


Watch the full panel discussion

About the speakers

Michelle Cortese is an augmented and virtual reality (AR/VR) designer, artist and educator. She splits her professional time between design leadership at Meta Reality Labs and teaching at NYU. Her work explores immersive interaction systems; the ethical implications of embodied technology on end users; and the transmutation of human expression across new technologies and formats.

Agatha Yu has been working in spatial computing for 9 years, focusing on amplifying human senses with previous experience at Oculus and Adobe. She explores ways to translate spatial, social and emotive connections, so machines can understand our fuzzy human realities. Along the way, she assembled and grew design teams to dive into machine learning, spatial computing and procedural experiences.

Keiichi Matsuda is a designer of new realities. His cautionary short film *HYPER-REALITY* has been widely exhibited, and was awarded Vimeo’s “best drama of 2016,” amongst other accolades. Next Reality named Keiichi as a top Augmented Reality influencer while he was serving as Leap Motion’s VP Design, and he led the Mixed Reality experience design team at Microsoft while developing next-generation AR and VR devices. His studio Liquid City is a design practice specializing in XR and AI with a focus on shaping a positive future for technology in society.

Daniel Marqusee is a spatial designer helping build tools at Bezi, dedicated to teaching and exploring extended reality UX design and philosophy. Daniel is passionate about creating tutorials, UX deep dives, design philosophy, and is a self-proclaimed clumsy man muttering about XR and the Metaverse. His true passion is to work as hard as possible to create incredible content and to make learning more accessible to creatives from all socioeconomic backgrounds.


