Articles

VRoamer Generates On-The-Fly VR Experiences While Walking Through Buildings

VRoamer is a new Microsoft Research project that generates VR worlds on the fly while users walk through unknown building environments. Players can wear their HMD and no longer have to rely on artificial locomotion techniques such as teleportation: they simply walk through their environment and the game is built around them. This is made possible by a wearable camera that scans the environment in front of the user and turns it into a playable virtual world. The system paints its virtual environment over real-world objects such as doors, and it keeps the user safe from objects in the real world even though those objects are hidden from the user. Transitions are handled through corridors constructed to fit the available space in the user's environment. Players can open real doors to progress the game, and the corridors may contain weapons, enemies, keys, and more. Objects that suddenly appear in a user's environment, such as other people, may show up as skeletons or traps.

In this paper, we present VRoamer, which enables users to walk unseen physical spaces for which VRoamer procedurally generates a virtual scene on-the-fly. Scaling to the size of office buildings, VRoamer extracts walkable areas and detects physical obstacles in real time using inside-out tracking, instantiates pre-authored virtual rooms if their sizes fit physically walkable areas or otherwise generates virtual corridors and doors that lead to undiscovered physical areas. The use of these virtual structures that connect pre-authored scenes on-the-fly allows VRoamer to (1) temporarily block users' passage, thus slowing them down while increasing VRoamer's insight into newly discovered physical areas, (2) prevent users from seeing changes beyond the current virtual scene, and (3) obfuscate the appearance of physical environments. VRoamer animates virtual objects to reflect dynamically discovered changes of the physical environment, such as people walking by or obstacles that become apparent only with closer proximity.
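
The room-versus-corridor decision the abstract describes can be sketched in a few lines. The following C++ is a hypothetical illustration only, not VRoamer's actual code; the WalkableArea, RoomTemplate, and Fits names are invented for this example.

```cpp
// Hypothetical sketch of the placement decision described in the abstract:
// instantiate a pre-authored room if it fits the newly discovered walkable area,
// otherwise fall back to a generated corridor that leads onward.
#include <optional>
#include <vector>

struct WalkableArea { float width_m; float depth_m; };          // detected free space
struct RoomTemplate { float width_m; float depth_m; int id; };  // pre-authored scene

// A room template fits if it is no larger than the walkable area in both dimensions.
static bool Fits(const RoomTemplate& room, const WalkableArea& area) {
    return room.width_m <= area.width_m && room.depth_m <= area.depth_m;
}

// Returns the id of a room to instantiate, or nullopt if a corridor/door should be
// generated instead (e.g. while more of the physical space is still being scanned).
std::optional<int> ChooseVirtualStructure(const WalkableArea& area,
                                          const std::vector<RoomTemplate>& templates) {
    for (const RoomTemplate& room : templates) {
        if (Fits(room, area)) {
            return room.id;  // instantiate this pre-authored room
        }
    }
    return std::nullopt;     // no room fits: generate a corridor instead
}
```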

Discussion
Posted by cageymaru March 22, 2019 4:47 PM (CDT)

Microsoft Patent Describes a Persistence of Vision Augmented Reality Display

Microsoft has filed a patent for a new type of mixed-reality head-mounted display (HMD) called a Persistence of Vision Augmented Reality Display. The display uses movable screens to simulate a 360 degree field of view. These screens can rotate back and forth, spin around a user's head, or spin in front of the user's eyes. The device isn't guaranteed to ever see the light of day, but hopefully some of the ideas make it into other products.

Microsoft's patent FIG. 4A below illustrates a movement-based display device with a movable member configured to rotate about a user's head; FIG. 4B illustrates a movement-based display device with a movable member configured to reciprocate in front of a user's eyes; and FIG. 4C illustrates a movement-based display device with movable members configured to spin in front of the user's eyes.

Discussion
Posted by cageymaru March 22, 2019 3:17 PM (CDT)

AMD Confirms Stadia Will Run on Intel CPUs

As one of the world's most pervasive cloud service providers, Google is in a better position to launch a successful game streaming platform than almost anyone. The hardware they choose to use for the launch of their "Stadia" streaming service will undoubtedly influence future game streaming efforts, hence AMD's stock price shot through the roof when Google announced they were using AMD GPUs. However, PCGamesN writer Dave James noticed that Google was conspicuously silent when it came to Stadia's CPUs. They were happy to share clock speeds, cache numbers and the fact that they're using "custom" x86 chips, but they refused to confirm the vendor of the platform's CPU. Eventually, AMD reached out and said that "the Stadia platform is using custom AMD Radeon datacentre GPUs, not AMD CPUs." Barring any surprise announcements from VIA, that more or less confirms that Stadia will run on some sort of Intel CPU platform, but just why Google refused to mention Chipzilla by name remains a mystery. The author suggested that Intel might not want to associate itself with what might be a "doomed" venture. Maybe Google plans to switch to EPYC CPUs or an unannounced Intel server platform sometime in the future, or maybe they just don't think it's particularly relevant. Whatever the reason may be, I also find the omission to be curious, and look forward to seeing what happens with Stadia's hardware in the future.

A switch to AMD's EPYC processors has been mooted as a potential future step for Stadia, and Google's Phil Harrison told us himself that "we're just talking about Gen 1 at the moment, but there will be iterations on that technology over time," so there is some potential for a changing of the processor guard either before or after launch. Whatever the truth of the matter is I still find it beyond strange that no-one involved is talking about the Intel CPUs being used for Google Stadia, even if they're not necessarily doing anything that special with regards the innovative streaming service. Certainly the multi-GPU features on offer with the Radeon graphics cards warranted mention, but just a note on the specs slide alone could have still done good things for Intel.

Discussion
Posted by alphaatlas March 22, 2019 11:20 AM (CDT)

Unreal Engine is Getting Destructible Environment Support

High-quality destructible environments seem like something that should be standard in 2019, but even today, it's a relatively rare thing to find in a game. It was a headlining feature of Red Faction: Guerrilla back in 2009, and a prominent feature in the recently released Crackdown 3, but detailed, destructible environments are still absent from most releases. However, at GDC this year, Epic announced that they're integrating a destruction system into Unreal Engine. Given how popular the engine is, and how competitors will probably try to achieve feature parity, I expect to see more games with destructible environments in the near future.

Revealed onstage at GDC 2019 during "State of Unreal," Chaos is Unreal Engine’s new high-performance physics and destruction system coming in early access to Unreal Engine 4.23. The real-time tech demo is set within the world of Robo Recall. With Chaos, users can achieve cinematic-quality visuals in real time in scenes with massive-scale levels of destruction, with unprecedented artist control over content creation.

Discussion
Posted by alphaatlas March 21, 2019 11:47 AM (CDT)

CD Projekt Red Reiterates Plan to Release 2 Games by 2021

In a post by an official CD PROJEKT moderator on the company's forums, CD Projekt reiterated its promise to "release a second AAA game by 2021." The Polish company started teasing Cyberpunk 2077 way back in 2012, and the game still doesn't have a release window, but just what else the company is working on remains a mystery. It's not clear if the studio has been secretly chipping away at this second project for some time, if it's somehow derived from Cyberpunk 2077 or The Witcher (which could reduce development time), or if it's simply a smaller-scope AAA release, but the developer hasn't divulged any details about it so far.

"As far as the strategy of the CD PROJEKT Capital Group for 2016-2021 is concerned, its plans to release the second AAA game by 2021 remain unchanged. We are currently focusing on the production and promotion of Cyberpunk, so we do not want to comment on further projects. Donata Poplawska"

Discussion
Posted by alphaatlas March 21, 2019 11:26 AM (CDT)

All Digital Xbox One S Could Launch May 7

Following up on previous reports claiming that Microsoft could unveil a disc-less Xbox One in April, Windows Central allegedly got their hands on some photographs of the upcoming console, and recreated them in Photoshop to protect their source. "Additional documents" they obtained suggest that the new Xbone could launch on May 7 in a "global simultaneous release," but as we've noted with some of our own predictions, exact launch dates can be fuzzy this far ahead of time. The publication thinks that Microsoft is positioning this as a replacement to the original Xbox One, rather than a replacement to the newer disc-based consoles.

The design of the Xbox One S All-Digital appears to be virtually identical to the current Xbox One S, without the disc drive and eject button. The product shots we received seem to indicate that it will come with a 1TB HDD and with Forza Horizon 3, Sea of Thieves, and Minecraft digital codes bundled into the box. It doesn't look as though it will be bundled with Microsoft's Netflix-like subscription service for games, Xbox Game Pass. Our information suggests that the Xbox One S All-Digital edition will have the lowest recommended retail price (RRP) of all current Xbox One consoles, aimed at newcomers to the ecosystem, although the exact pricing is unknown at this time.

Discussion
Posted by alphaatlas March 21, 2019 11:04 AM (CDT)

AMD Radeon Software Adrenalin 2019 Edition 19.3.3

The AMD Radeon Software Adrenalin 2019 Edition 19.3.3 driver has been released and it adds support for Sekiro: Shadows Die Twice and Generation Zero. Fixed issues include intermittent corruption or flickering on some Rainbow Six Siege textures during gameplay, and stutter in DOTA 2 VR on some HMD devices when using the Vulkan API.

Known issues include mouse cursors disappearing or moving past the top boundary of a display on AMD Ryzen Mobile Processors with Radeon Vega Graphics, and inaccurate, fluctuating readings in the performance metrics overlay and Radeon WattMan gauges on the AMD Radeon VII.

Discussion
Posted by cageymaru March 20, 2019 4:51 PM (CDT)

Google Partners with AMD for Google Stadia Game Streaming Service

Google has selected AMD as its partner for the Google Stadia game streaming service. Google will use high-performance, custom AMD Radeon datacenter GPUs for its Vulkan- and Linux-based Google Stadia. AMD noted how its commitment to open-source AMD Linux drivers would allow Google and its development partners to inspect the code and understand exactly how the driver works, enabling them to better optimize their applications to interface with AMD Radeon GPUs. AMD also supplies tools such as the AMD Radeon GPU Profiler (RGP), which lets developers identify timing issues and potential optimizations. The Google Stadia service will feature game streams at resolutions up to 4K HDR at 60 FPS. Google announced a 2019 launch window for the game streaming service.

Streaming graphics-rich games to millions of users on demand and from the cloud requires ultra high-performance processing capabilities to minimize latency and maximize game performance. It also requires advanced technologies to tackle unique datacenter challenges, including security, manageability, and scalability. The AMD graphics architecture supports a wide range of today's gaming platforms -- from PCs to major game consoles -- enabling developers to optimize their games for a single GPU architecture and extend these benefits across multiple platforms which now include large-scale cloud gaming platforms.

Discussion
Posted by cageymaru March 19, 2019 4:41 PM (CDT)

AMD Talks 3D Stacking at Rice Presentation

AMD has talked up their "chiplet" based approach used in their upcoming products, and according to some reports, Marvell is already selling products based on the chiplet concept. But the next logical step from that approach is to move from 2D to 3D, where different dies are "stacked" on top of each other. In a recent presentation at Rice University, AMD confirmed that they're working on 3D stacking techniques in their future designs, and that it's a necessary step to keep the improvements coming, but didn't elaborate much beyond that. Check out the memory and stacking talk in the presentation below:
Discussion
Posted by alphaatlas March 19, 2019 10:47 AM (CDT)

Real-Time Ray Tracing Support Comes to GeForce GTX GPUs and Game Engines

NVIDIA has announced that real-time ray tracing support is coming to GeForce GTX GPUs. This driver is scheduled to launch in April. GeForce GTX GPUs will execute ray traced effects on shader cores and support is extended to both Microsoft DXR and Vulkan APIs. NVIDIA reminds consumers that its GeForce RTX lineup of cards has dedicated ray tracing cores built directly into the GPU which deliver the ultimate ray tracing experience. GeForce RTX GPUs provide up to 2-3x faster ray tracing performance with a more visually immersive gaming environment than GPUs without dedicated ray tracing cores. NVIDIA GameWorks RTX is a comprehensive set of tools and rendering techniques that help game developers add ray tracing to games. Unreal Engine and Unity have announced that integrated real-time ray tracing support is being built into their engines.

Real-time ray tracing support from other first-party AAA game engines includes DICE/EA's Frostbite Engine, Remedy Entertainment's Northlight Engine, and engines from Crystal Dynamics, Kingsoft, Netease and others. Quake II RTX uses ray tracing for all of the lighting in the game in a unified lighting algorithm called path tracing. The classic Quake II game was modified in the open source community to support ray tracing, and NVIDIA's engineering team further enhanced it with improved graphics and physics. Quake II RTX is the first ray-traced game using NVIDIA VKRay, a Vulkan extension that allows any developer using Vulkan to add ray-traced effects to their games.
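
For context on what an extension like VKRay looks like from the application side, a Vulkan renderer would typically check a physical device's extension list for VK_NV_ray_tracing before enabling ray-traced effects. The snippet below is a generic sketch rather than code from Quake II RTX, and it assumes a Vulkan SDK recent enough to define VK_NV_RAY_TRACING_EXTENSION_NAME.

```cpp
// Minimal sketch: query whether a Vulkan physical device exposes the
// VK_NV_ray_tracing extension (the basis of NVIDIA VKRay) before enabling it.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool SupportsNvRayTracing(VkPhysicalDevice device) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> extensions(count);
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, extensions.data());

    for (const VkExtensionProperties& ext : extensions) {
        if (std::strcmp(ext.extensionName, VK_NV_RAY_TRACING_EXTENSION_NAME) == 0) {
            return true;  // add the extension to VkDeviceCreateInfo::ppEnabledExtensionNames
        }
    }
    return false;  // fall back to rasterization-only effects
}
```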

Discussion
Posted by cageymaru March 18, 2019 10:09 PM (CDT)

Microsoft Announces Variable Rate Shading Support for DX12

Variable Rate Shading (VRS) is a powerful new API that gives developers the ability to use GPUs more intelligently. Shaders are used to calculate the color of each pixel on the screen, and the shading rate refers to the resolution at which those shaders are invoked (which is distinct from the overall screen resolution). A higher shading rate means better visual fidelity at the cost of more GPU power, and normally the game's shading rate affects all pixels in a frame. VRS allows developers to choose which areas of the frame are more important and increase their visual fidelity, or set parts of the frame to lower fidelity and gain extra performance. Lowering the fidelity of parts of the scene can help low-spec machines run faster. There are two tiers of support for VRS. The VRS API lets developers set the shading rate in three different ways: per draw, within a draw by using a screenspace image, or within a draw, per primitive. Hardware that supports per-draw VRS is Tier 1; hardware that supports both per-draw and within-draw variable rate shading is Tier 2. VRS support exists today on in-market NVIDIA hardware and on upcoming Intel hardware. AMD is rumored to be working on support for the feature.
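
As a rough illustration of the per-draw (Tier 1) path, a D3D12 renderer might query the reported tier and then request a coarser base rate for low-priority draws. This is a minimal sketch assuming a Windows 10 SDK recent enough to expose ID3D12GraphicsCommandList5 and D3D12_FEATURE_D3D12_OPTIONS6, not production code.

```cpp
// Sketch: query the VRS tier and, if supported, shade subsequent draws once per
// 2x2 pixel block instead of once per pixel.
#include <windows.h>
#include <d3d12.h>

void SetCoarseShadingIfSupported(ID3D12Device* device,
                                 ID3D12GraphicsCommandList5* cmdList) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &options, sizeof(options))) &&
        options.VariableShadingRateTier != D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED) {
        // Per-draw (Tier 1) path: coarse shading applies to every draw that follows
        // until the rate is changed back; nullptr means default combiners.
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    }
}
```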

One use of the screenspace-image mode is foveated rendering: rendering the most detail in the area where the user is paying attention, and gradually decreasing the shading rate outside this area to save on performance. In a first-person shooter, the user is likely paying most attention to their crosshairs and not much attention to the far edges of the screen, making FPS games an ideal candidate for this technique. Another use case for a screenspace image is running an edge-detection filter to determine the areas that need a higher shading rate, since edges are where aliasing happens. Once the locations of the edges are known, a developer can set the screenspace image based on that, shading the areas containing edges with high detail and reducing the shading rate elsewhere on the screen.
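
To make the foveated-rendering idea concrete, here is a hypothetical sketch (not taken from Microsoft's documentation) of filling a CPU-side shading-rate image in which each byte covers one hardware tile. On Tier 2 hardware this buffer would be uploaded to an R8_UINT texture and bound with RSSetShadingRateImage; the distance thresholds and rates are arbitrary illustration values, and the 4x4 rate assumes the device reports AdditionalShadingRatesSupported.

```cpp
// Sketch: build a foveated shading-rate image. Each byte covers one tile
// (ShadingRateImageTileSize pixels square) and holds a D3D12_SHADING_RATE value:
// full rate near the center of the screen, coarser rates toward the periphery.
#include <d3d12.h>   // for the D3D12_SHADING_RATE_* values
#include <cmath>
#include <cstdint>
#include <vector>

std::vector<uint8_t> BuildFoveatedRateImage(uint32_t width, uint32_t height,
                                            uint32_t tileSize) {
    const uint32_t tilesX = (width + tileSize - 1) / tileSize;
    const uint32_t tilesY = (height + tileSize - 1) / tileSize;
    std::vector<uint8_t> rates(tilesX * tilesY);

    const float cx = tilesX / 2.0f, cy = tilesY / 2.0f;
    const float maxDist = std::sqrt(cx * cx + cy * cy);

    for (uint32_t y = 0; y < tilesY; ++y) {
        for (uint32_t x = 0; x < tilesX; ++x) {
            const float d = std::sqrt((x - cx) * (x - cx) + (y - cy) * (y - cy)) / maxDist;
            uint8_t rate = static_cast<uint8_t>(D3D12_SHADING_RATE_1X1);  // full detail near the focus point
            if (d > 0.66f)      // 4x4 is an "additional" rate; needs AdditionalShadingRatesSupported
                rate = static_cast<uint8_t>(D3D12_SHADING_RATE_4X4);
            else if (d > 0.33f)
                rate = static_cast<uint8_t>(D3D12_SHADING_RATE_2X2);
            rates[y * tilesX + x] = rate;
        }
    }
    return rates;  // upload to an R8_UINT texture and bind via RSSetShadingRateImage (Tier 2)
}
```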

Discussion
Posted by cageymaru March 18, 2019 7:07 PM (CDT)

Atari VCS Is Powered by 14nm AMD Ryzen APU with Radeon Vega Graphics

Atari has announced that the upcoming Atari VCS will now be powered by a 14nm AMD processor featuring high-performance Radeon Vega graphics architecture and two "Zen" CPU cores. The hardware team at Atari has replaced the AMD "Bristol Ridge" processor with the new AMD Ryzen APU. The new AMD Ryzen platform will bring welcome upgrades such as greater efficiency, faster speeds, and cooler temps, allowing the VCS to benefit from a simpler and more effective power architecture and thermal solution. The new processor includes built-in Ethernet, native 4K video with modern HDCP, and a secure frame buffer that fully supports DRM video (Netflix, HBO, etc.). The North American schedule is now targeting the end of 2019 for delivery of the Atari VCS system to Indiegogo backers. Thanks @MixManSC!

This upgrade will translate to better overall performance in a cooler and quieter box--all with minimal impact to our manufacturing processes. While additional specifications about the new AMD processor will be announced closer to launch, be assured that the new AMD Ryzen processor is a much better fit for this project in multiple ways and will further enable the Atari VCS to deliver on its promise to be a unique and highly flexible platform for creators. Atari cannot thank our great partners at AMD enough for bringing forward this exciting new--and thus-far unannounced--product for us to utilize in the VCS.

Discussion
Posted by cageymaru March 18, 2019 5:26 PM (CDT)