The Metaverse has been attracting attention in recent years.
If we think of the Metaverse as "a network environment of every form of information, built in 3D virtual space," then the information network technologies that make it up will likely become visually and experientially integrated into the physical environment around us in the future.
Just as a building is made up of many materials and infrastructure, the Metaverse is made up of numerous information technologies, scripts, and 3D/2D models. As XR continues to be researched, developed, and implemented in society, we spoke with Stefano Corazza, who provides the creative tools Adobe Aero and Mixamo, asking questions from the standpoint of an artist and platform operator.
(Text: God Scorpion)
* Adobe Aero: A free app that allows you to create AR content without coding. * Mixamo: A service that allows you to add animation to 3D characters. Founded by Corazza and acquired by Adobe in 2015.
Gospi: Nice to meet you, and thank you for your time today. I've actually used Aero to display my own work and experience it in AR, and it's a really easy-to-use tool.
In particular, being able to check and produce smoothly even on the iPad version was a great experience. I felt that Aero focuses on directly linking making and seeing.
Corazza: That's right. Adobe sees the iPad and other tablets as the best platform and the best hardware for expressing creativity.
Gospi: Do you have any plans to support other devices, such as Android, in the future?
Corazza: So far, Aero lets you "view" content on an Android device, but that isn't the full experience yet; AR creation isn't supported on Android. We do plan to support it in the future, though.
We decided to support the iOS and iPad platforms first because that environment is less fragmented. There are many kinds of Android devices, so the situation is a bit more complicated.
On the desktop side, you can create works in a Windows or Mac environment and then test and publish them on a phone or tablet, so I hope people will use it that way.
Aero wants to be a content aggregator
Gospi: I think the future is one in which the number of XR platforms and metaverses keeps growing, with users choosing among them and experiencing each in turn, rather than everything being aggregated into one specific platform or metaverse.
Right now Aero handles everything from production to the viewing experience within the app, but will it be possible in the future to take projects created with Aero to other platforms?
Corazza: That's a sharp question, and it's exactly our plan for this year. First, viewers will be able to see works made with Aero without installing Aero themselves. We also plan to ensure comprehensive interoperability of file formats between Aero and various metaverses and applications.
This is a slightly bigger topic, but let me talk about what Aero's purpose is. Our goal is to "become an aggregator of content" (editor's note: "aggregator" here refers to the role of gathering content, talent, technology, and so on across organizational boundaries and placing them where they belong). In other words, we want creators and viewers to enjoy the interactivity and the final experience.
Gospi: Over the last few years, the use of AR/MR glasses in actual urban and living spaces has been increasing, and more projects now start from a specific place or situation.
In Illustrator and Photoshop, you can choose templates suited to various output styles. Do you plan to provide features like templates and sample situations in Aero in the future?
Corazza: Of course. However, we are still researching which templates cover the typical use cases. For now we're asking the community what kinds of templates are needed, and we plan to ask what kinds of scenes people want as well, so it will take a little longer.
Gospi: Are you also developing user-interface design and layout for those kinds of living spaces?
Corazza: We're still learning, and the ecosystem itself is still evolving, but Aero plans to offer features with hidden user-interface elements later this year.
Why Aero doesn't have a bird's-eye-view timeline bar
Gospi: I usually develop in Unity, and when I build a project I often use its timeline feature. Aero, on the other hand, adjusts the time scale of the whole space according to what kind of interaction is created with each object.
Was there a reason you didn't include a timeline bar that gives a bird's-eye view of the whole scene?
Corazza: That's right, the timeline bar was deliberately left out. A timeline can only work as a kind of script if you have decided in advance that "the scene will unfold like this." We, on the other hand, are focusing on full interactivity, and I think those are two different things.
So the way time moves in Aero is that the time axis extends from wherever in the scene an event occurs: a trigger fires, and time proceeds from there.
In other words, it would be possible to record triggers in advance and bake them into a timeline, but our premise is to build scenes in a state where there is no fixed time axis.
Gospi: I see. That shows exactly where the platform is focusing, and the image is very easy to understand.
Next, I'd like to talk about Mixamo, which I also use in my work. Right now many of its animations come from games, but I'd like more animations that occur in everyday spaces and everyday life.
Corazza: Thank you for using it all this time! And thank you for the feedback (laughs). Animations such as "sitting on a chair," "tapping a smartphone," and "eating a meal" are already supported. But there aren't many more everyday actions beyond that so far, so we are considering what to do in the future.
By the way, Aero supports animated characters, and a feature for creating animations with characters whose movements can be configured was added a year ago. As for the future of animation, I expect that will move forward first.
Gospi: I see, I'm looking forward to it. I definitely want to use it.
Editorial department: Now we'd like to ask a few questions from Gizmodo. First, what kind of presence do you want Adobe's creative tools, such as Aero, to have for creators?
Corazza: Aero has two pillars. First, AR experiences created by creators should be deployable on all platforms, including the Metaverse. Second, creators should be able to build interactivity with no code, without knowing a programming language.
Based on these two concepts, we will continue to support AR glasses, VR headsets, the Metaverse, and mobile.
Gospi: That's great. When creating interactions, I found Aero's ability to set triggers on objects without coding very easy to understand.
I also operate an XR platform called "STYLY," and watching a wide range of artists, from the general public to photographers, filmmakers, and sculptors, create works in AR/VR, I came to think that the essence of what they want to make is not the coding itself but how they want to express it. It's pretty important that creators don't have to build object functions and behavior types from scratch, right?
There is only one AR metaverse
Editorial department: In 2021, awareness of the word "Metaverse" rose sharply. What kind of vision do you, Mr. Corazza, who have been working on AR/VR since around 2008, have for the Metaverse?
Corazza: I think many people try to cram every element into the word "Metaverse," but in reality, when you look closely, the "Metaverse" must consist of various distinct elements.
To begin with, the AR metaverse and the VR metaverse are completely different things, so I think we have to separate them. In the case of the VR metaverse, as Gospi mentioned earlier, there are an infinite number of virtual worlds. In the case of the AR metaverse, by contrast, there is only one metaverse. We call it the "meta-earth." All the various elements are packed into that one space.
In other words, VR is a "metaverse" composed of many planets, while AR has only the planet we live on, with layers of content piled up on it.
Personally, I'm a fan of the AR metaverse. In an AR metaverse environment, it's good to be able to discover improvements and new aspects of the world we already inhabit.
Gospi: I see, the term "meta-earth" is interesting. It matches the image of AR as modifying or building on the existing earth.
As I mentioned earlier in Gizmodo's "Dialogue about the Matrix and the Metaverse," I personally picture the Metaverse as something developed around "how to place symbols in my cognitive world and how to manipulate them." It's close to the concept of Umwelt advocated by the German biologist Jakob von Uexküll. The "meta-earth" in AR that Mr. Corazza describes is exactly the opposite idea. (* Umwelt: a philosophical term. In Japanese, "environment" comes close, but it carries nuances of "what the subject actively creates," including cognition and perception.)
Editorial department: Thank you both for a very interesting conversation. Finally, a question for both Mr. Gospi and Mr. Corazza. There is an impression that XR technology is spreading to the general public from the world of games. What can each of you do to bring the fun and benefits of XR to the general public in the future?
Corazza: As you say, Mixamo originally started 100% in the world of games, so our DNA began there, and we are gradually moving toward everyday life.
Aero, on the other hand, didn't start with games; we're still learning how people use the tool and AR itself. Please take a look at the video that summarizes the kinds of experiences individual artists are making with Aero. I think we can help these artists move into a mainstream where they are involved in people's daily lives.
Gospi: Through games and artworks, we not only make people feel happy and give them new cognitive experiences; we also pay attention to supporting the movements and behaviors that occur in everyday life. People already do multiple things at once, like reading something or watching a video while in a meeting, and I'd like to provide an extension of that.
By transforming 2D applications that humankind has invented, such as the web, social media, and maps, into a style that can be used in 3D space through AR glasses, things you hear or places you visit can be visualized. I think giving three-dimensional form to things that aren't originally concrete will be very important.
Recently, "STYLY" has been developing a service that can be experienced in collaboration with urban space, and I think that is one of our answers to this question.