
‘Apollo 11: Mission AR’ Showcases Impressive PC-quality Graphics – Road to VR


Epic Games today released a new video featuring a demo for HoloLens 2 that aims to show off just what sort of graphics can be achieved on Microsoft’s latest standalone AR headset. Called Apollo 11: Mission AR, the interactive demo is streamed wirelessly in real-time from networked PCs running the company’s game engine, Unreal Engine.

Unveiled earlier this summer at Microsoft Build 2019, Apollo 11: Mission AR is a recreation of the historic 1969 Apollo 11 mission, showing off the Saturn V’s launch, the lunar landing, and Neil Armstrong’s first steps on the Moon, which Epic says were reconstructed based on data and footage from the actual mission.

Epic says the demo features 7 million polygons in a physically-based rendering environment, and includes fully dynamic lighting and shadows, multi-layered materials, and volumetric effects.

Image courtesy Epic Games

That isn’t done on-device though. To achieve this level of detail, Epic says the experience’s “holographic elements” are actually streamed wirelessly in real-time from networked PCs running UE 4.23, the current version of Unreal Engine.

According to Epic’s HoloLens 2 streaming guide, the headset sends eye tracking, gesture, voice, current device pose, and spatial mapping input to your PC, which then streams rendered frames back to HoloLens 2. This, the company says, is designed to boost app performance and to make development easier, since devs won’t need to package and deploy the app on-device before running it. It’s clear, however, that it also allows HoloLens 2 to play host to more graphically involved experiences than the standalone device’s on-board processors were originally intended to handle.
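The round trip described above — headset sends input state to the PC, the PC renders and streams a frame back — can be sketched conceptually as below. This is a minimal simulation of the data flow only; every name here is illustrative and none of it reflects the actual HoloLens 2 or Unreal Engine streaming API.

```python
from dataclasses import dataclass

@dataclass
class HeadsetInput:
    pose: tuple      # current device pose (position/orientation), per the streaming guide
    gaze: tuple      # eye-tracking input
    gestures: list   # recognized hand gestures
    voice: str       # voice command, if any ("" when none)

@dataclass
class RenderedFrame:
    pose: tuple      # the pose this frame was rendered for
    pixels: bytes    # encoded frame data streamed back to the headset

def render_on_pc(inp: HeadsetInput) -> RenderedFrame:
    """Stand-in for the networked PC running Unreal Engine: it renders a
    frame for exactly the pose the headset reported."""
    return RenderedFrame(pose=inp.pose, pixels=b"<encoded frame>")

def remoting_round_trip(inp: HeadsetInput) -> RenderedFrame:
    """Headset uplinks its input, then displays the frame it gets back."""
    frame = render_on_pc(inp)       # network hop: headset -> PC -> headset
    assert frame.pose == inp.pose   # frame corresponds to the requested pose
    return frame

frame = remoting_round_trip(
    HeadsetInput(pose=(0.0, 1.6, 0.0), gaze=(0, 0, 1), gestures=["air_tap"], voice="")
)
print(frame.pose)  # (0.0, 1.6, 0.0)
```

The point of the split is visible in the types: the headset only ever produces lightweight input state and consumes finished frames, so all heavy rendering stays on the PC.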

Image courtesy Epic Games

We reached out to Epic to see whether this could also be achieved via cloud streaming, or if it’s a local machine-only implementation. We’ll update this article as soon as we hear back.

Released in early September, Unreal Engine 4.23 is the first iteration of the company’s game engine to feature production-ready support for HoloLens 2, which includes tools such as streaming and native deployment, emulator support, finger tracking, gesture recognition, meshing, voice input, and spatial anchor pinning.

Outside of the demo’s visual polish, Epic says Apollo 11: Mission AR also shows support for UE4 Composure, color temperature, and post-processing, plus OCIO LUTs, I/O for AJA video systems, and additional features that streamline mixed reality media production.
