Articles


Nvidia Releases "Creator Ready" RTX Drivers

Earlier this week, Nvidia rolled out a set of "creator ready" drivers that are compatible with consumer GPUs, but optimized for professional applications. This level of support is typically reserved for drivers that only work with pricey Quadro GPUs, but Nvidia says they've conducted "exhaustive multi-app testing" in programs like Adobe Premiere and After Effects. Support for this driver goes all the way back to Pascal cards, and extends to Nvidia's more affordable offerings like the GTX 1050 and the more recent 1660. Perhaps even more interestingly, Nvidia claims they've worked with a number of software vendors to leverage the raytracing and machine-learning muscle their RTX cards offer. Autodesk Arnold and Unreal Engine 4, for example, now support RTX accelerated rendering, and Redcine-X Pro seemingly uses Turing's updated video processing block to decode 8K video without taxing the CPU. Meanwhile, Lightroom uses "an extensively trained convolutional neural network to provide state-of-the-art image enhancing for RAW photographs." While I haven't tested Lightroom's new features myself, in my experience, neural networks can perform small miracles when processing images. Nvidia also claims the new driver features significant performance improvements in Photoshop, Premiere, Blender Cycles, and Cinema 4D.

"Creators are constantly faced with tight deadlines and depend on having the latest hardware and creative tools to complete their projects on time, without compromising quality," said Eric Bourque, senior software development manager at Autodesk. "We're excited that NVIDIA is introducing a Creator Ready Driver program because it will bring Arnold users an even higher level of support, helping them bring their creative visions to life faster and more efficiently." The first Creator Ready Driver is now available from NVIDIA.com or GeForce Experience. From GeForce Experience, you can switch between Game Ready and Creator Ready Drivers at any time by clicking the menu (three vertical dots in the top right corner). Creator Ready Drivers are supported for Turing-based GeForce RTX, GTX and TITAN GPUs, Volta-based TITAN V, Pascal-based GeForce GTX and TITAN GPUs, and all modern Quadro GPUs.

Posted by alphaatlas March 22, 2019 8:57 AM (CDT)

Nvidia Skips Ampere at GTC 2019

Several news outlets seem to think Nvidia's GTC presentation was relatively long-winded and unexciting this year. The three-hour keynote reportedly featured some software announcements and a low-power Jetson board, among other things, but didn't feature the 7nm Ampere GPUs many were expecting. EE Times says that the "unspoken message" at the presentation was that "Nvidia doesn't need to pre-announce a new and faster chip because it owns that software stack and channel today," and the emphasis on CUDA seemed to drive that point home. However, in one of the more exciting parts of the presentation, Nvidia did highlight the Q2VKPT project we covered earlier this year. Nvidia's CEO seemed quite excited about the introduction of raytracing to Quake II, and the company showed off some of the project's gameplay, which you can see below:

Presaging that future, Nvidia's chief scientist, Bill Dally, told reporters about a research project in optical chip-to-chip links. It targets throughput in terabits per second while drawing 2 picojoules per bit. In an initial implementation, 32 wavelengths will run at 12.5 Gbits/s each, with a move to 64 wavelengths doubling bandwidth in a follow-up generation. Dally predicted that copper links will start to run out of gas as data rates approach 100 Gbits/s, already on many roadmaps for network switches. Progress in more power-efficient laser sources and ring resonators will enable the long-predicted shift, he said. If the future evolves as he believes, bleeding-edge GPUs may continue to skip an appearance at some Nvidia events. Attendees will have to hope that as the interconnects speed up, the keynotes don't get even longer.
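Those figures are easy to sanity-check. Here is a quick back-of-the-envelope sketch in Python (our own arithmetic, not Nvidia's), using the wavelength counts and per-lane rate quoted above:

    # Back-of-the-envelope math for the optical link figures quoted above.
    PJ_PER_BIT = 2e-12  # 2 picojoules per bit, in joules

    def aggregate_gbps(wavelengths, gbits_per_lane=12.5):
        # Total link throughput in Gbit/s.
        return wavelengths * gbits_per_lane

    for wl in (32, 64):
        gbps = aggregate_gbps(wl)
        watts = gbps * 1e9 * PJ_PER_BIT  # (bits/s) * (joules/bit) = watts
        print(f"{wl} wavelengths: {gbps:.0f} Gbit/s at ~{watts:.2f} W")
    # 32 wavelengths: 400 Gbit/s at ~0.80 W
    # 64 wavelengths: 800 Gbit/s at ~1.60 W

So the first generation would land at 400 Gbit/s per link for under a watt of link power, and the 64-wavelength follow-up at 800 Gbit/s, approaching the terabits-per-second target Dally described.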

Posted by alphaatlas March 20, 2019 11:25 AM (CDT)

Shadow of the Tomb Raider Receives RTX Treatment in Latest Patch

The latest patch for Shadow of the Tomb Raider enables support for real-time ray tracing and NVIDIA's DLSS. To enable the features, gamers will need Windows 10 update 1809 or higher, an NVIDIA RTX 20-series GPU, and NVIDIA driver 419.35 or newer. Nixxes has also made a beta build available on Steam that lets players switch back to an older version of the game if problems arise.

We have just released the thirteenth PC patch for Shadow of the Tomb Raider, build 1.0.280. This patch focuses primarily on the release of Nvidia's Ray-Traced Shadows and DLSS. While we expect this patch to be an improvement for everyone, if you do have trouble with this patch and prefer to stay on the old version, we have made a Beta available on Steam, Build 279, that can be used to switch back to the previous version.
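For reference, here is a minimal sketch of how you might verify those prerequisites before flipping the new settings on. It assumes a Windows machine with the driver's nvidia-smi utility on the PATH, and relies on the fact that Windows 10 version 1809 corresponds to build 17763; the script and its thresholds are our own illustration, not anything Nixxes or NVIDIA ships:

    import subprocess, sys

    MIN_DRIVER = 419.35  # driver version required for the RTX patch
    MIN_BUILD = 17763    # Windows 10 version 1809

    def driver_version():
        # nvidia-smi ships with the NVIDIA driver package
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
            capture_output=True, text=True, check=True)
        return float(out.stdout.strip().splitlines()[0])

    if sys.platform == "win32":
        print("Windows build OK:", sys.getwindowsversion().build >= MIN_BUILD)
    print("Driver OK:", driver_version() >= MIN_DRIVER)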

Posted by cageymaru March 19, 2019 9:55 PM (CDT)

Not All RTX 2080s are Created Equal

Manufacturers have had some time to stock store shelves and warehouses with Nvidia RTX laptops, but as Techspot pointed out earlier this year, the nomenclature can be very confusing. The laptop "RTX 2080," for example, doesn't deliver the same performance as the desktop RTX 2080, and there are multiple versions of the "RTX 2080 Max-Q" with different levels of performance. Hardware Unboxed tested the performance differences between the various versions, which you can see in the video below:
The fact that Nvidia can cram a 545 mm² GPU into a low-power laptop at all is remarkable, and generally speaking, the RTX chips perform well in their relatively small power envelopes. But as the video points out, be careful if you're in the market for a gaming laptop, as the actual performance level of some RTX GPUs can be difficult to discern.
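If you are trying to work out which variant a given machine actually carries, one rough tell is the configured power limit, which nvidia-smi can report; Max-Q parts run at a much lower TGP than the full mobile chips. A minimal sketch, assuming the driver's nvidia-smi tool is installed:

    import subprocess

    # Query the GPU name and configured power limit; the power limit is a
    # rough tell for which laptop variant is installed, since a Max-Q part
    # runs at a much lower TGP than the full mobile chip.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,power.limit", "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    for line in out.stdout.strip().splitlines():
        name, limit = (field.strip() for field in line.split(","))
        print(f"{name}: power limit {limit}")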
Posted by alphaatlas March 19, 2019 10:24 AM (CDT)

Real-Time Ray Tracing Support Comes to GeForce GTX GPUs and Game Engines

NVIDIA has announced that real-time ray tracing support is coming to GeForce GTX GPUs via a driver scheduled to launch in April. GeForce GTX GPUs will execute ray-traced effects on their shader cores, and support extends to both the Microsoft DXR and Vulkan APIs. NVIDIA reminds consumers that its GeForce RTX lineup has dedicated ray tracing cores built directly into the GPU, which it says deliver the ultimate ray tracing experience: GeForce RTX GPUs provide up to 2-3x faster ray tracing performance than GPUs without dedicated ray tracing cores. NVIDIA GameWorks RTX is a comprehensive set of tools and rendering techniques that help game developers add ray tracing to games. Unreal Engine and Unity have announced that integrated real-time ray tracing support is being built into their engines.

Real-time ray tracing support from other first-party AAA game engines includes DICE/EA's Frostbite Engine, Remedy Entertainment's Northlight Engine, and engines from Crystal Dynamics, Kingsoft, Netease, and others. Quake II RTX uses ray tracing for all of the lighting in the game in a unified lighting algorithm called path tracing. The classic Quake II was modified by the open source community to support ray tracing, and NVIDIA's engineering team further enhanced it with improved graphics and physics. Quake II RTX is the first ray-traced game using NVIDIA VKRay, a Vulkan extension that allows any developer using Vulkan to add ray-traced effects to their games.
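If you want to see whether your local driver already exposes that extension, one quick check is to search the output of the Vulkan SDK's vulkaninfo tool for VK_NV_ray_tracing, the extension underlying VKRay. A minimal sketch, assuming vulkaninfo is on the PATH:

    import subprocess

    # Look for NVIDIA's Vulkan ray tracing extension (the basis of VKRay)
    # in the extension list reported by the installed driver.
    out = subprocess.run(["vulkaninfo"], capture_output=True, text=True)
    supported = "VK_NV_ray_tracing" in out.stdout
    print("VK_NV_ray_tracing exposed by driver:", supported)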

Posted by cageymaru March 18, 2019 10:09 PM (CDT)

NVIDIA Could Tease Next-Gen 7nm Ampere at GTC 2019

It isn’t clear whether NVIDIA will have any surprises to share at next week’s GPU Technology Conference (GTC), but some speculate the company could reveal aspects of its next-generation architecture, "Ampere," which will purportedly be built on the 7nm node. TweakTown and TechSpot suggest it could be the right time to do so, as the luster of Volta and Turing continues to fade. The former predicts it won’t be a gaming part, however, suggesting "a new GPU architecture tease that will succeed Volta in the HPC/DL/AI market."

For now, Ampere is only the rumored name for NVIDIA's future 7nm GPUs. If that's the case, the Ampere GPUs would bring power-efficiency improvements, higher clock rates, and perhaps higher memory bandwidth. Now would be a good time for NVIDIA to make a big announcement, considering the company just had one of the worst fiscal quarters it's ever had. Consumer and investor faith in the company is slipping, especially since adoption of RTX technology has been much slower than expected.

Posted by Megalith March 16, 2019 4:45 PM (CDT)

Nvidia's Freesync Monitor Support Tested

Nvidia recently released a driver update that adds support for FreeSync monitors, but the GPU maker was quick to point out that "non-validated" displays could exhibit serious issues. However, Techspot put that claim to the test last January, and just followed up on it with a handful of new FreeSync monitors from LG. Out of the 12 monitors they've tested so far, 11 have worked flawlessly, and the one that didn't only supports FreeSync over HDMI, while Nvidia only supports the adaptive sync technology over DisplayPort.

As we said in the original article, we think it's safe to say that if you purchase a new FreeSync model today, it will work fine with Nvidia GPUs. You shouldn't expect to see any graphical issues, whether you buy an LG monitor or a display from a different maker. Only the very early, older FreeSync monitors may have some issues. Thus our recommendation continues to be to select your next monitor on its merits, features, and your preference. If it's a FreeSync gaming monitor and you can save some money, that's great. GeForce GPU owners no longer need to bother with G-Sync to get proper variable refresh rates, unless that's the monitor you want for other reasons, as mentioned above. These results from a selection of LG monitors just reinforce that FreeSync gaming displays can and will perform perfectly.

Posted by alphaatlas March 12, 2019 9:03 AM (CDT)

NVIDIA Acquires Mellanox for $6.9 Billion

NVIDIA has reached a definitive agreement to acquire Mellanox for $6.9 billion. NVIDIA will acquire all of the issued and outstanding common shares of Mellanox for $125 per share in cash. NVIDIA and Mellanox are both known as high performance computing (HPC) industry leaders, and their products are found in over 250 of the world's TOP500 supercomputers and at every major cloud service provider. Mellanox is known for its high-performance interconnect technology, InfiniBand, and for its high-speed Ethernet products. "We share the same vision for accelerated computing as NVIDIA," said Eyal Waldman, founder and CEO of Mellanox. "Combining our two companies comes as a natural extension of our longstanding partnership and is a great fit given our common performance-driven cultures. This combination will foster the creation of powerful technology and fantastic opportunities for our people."

"The emergence of AI and data science, as well as billions of simultaneous computer users, is fueling skyrocketing demand on the world's datacenters," said Jensen Huang, founder and CEO of NVIDIA. "Addressing this demand will require holistic architectures that connect vast numbers of fast computing nodes over intelligent networking fabrics to form a giant datacenter-scale compute engine. "We're excited to unite NVIDIA's accelerated computing platform with Mellanox's world-renowned accelerated networking platform under one roof to create next-generation datacenter-scale computing solutions. I am particularly thrilled to work closely with the visionary leaders of Mellanox and their amazing people to invent the computers of tomorrow."

Posted by cageymaru March 11, 2019 9:42 AM (CDT)

Potential Navi Benchmark: Better Graphics, Lower Compute Performance than Vega 64

An unknown AMD GPU (66AF:F1) has appeared on CompuBench, and some believe it could be a Navi part. GFXBench scores point to a card that exceeds the Radeon RX Vega 64 in graphics capability but falls behind in certain compute tests, even against the Vega 56. Notebook Check cautions that this could actually be a Vega 20 GPU ("we've seen Linux drivers listing 0x66AF as Vega 20").

A comparison of the GFXBench scores of the AMD 66AF:F1 with the Radeon RX Vega 64 shows that the purported Navi variant leads significantly in the Aztec Ruins Normal Tier (1080p) and High Tier (1440p) tests. This could imply that GCN6 in Navi is tailored more towards raw graphics than compute. We aren't exactly sure about the specs of this particular entry, but expect to see variants with anywhere from 20 to 40 higher-clocked CUs when Navi launches.

Posted by Megalith March 10, 2019 1:25 PM (CDT)

NVIDIA Ending Driver Support for 3D Vision, Mobile Kepler-Series GeForce GPUs

NVIDIA has published two new support entries revealing the fate of its 3D Vision technology and Kepler notebook GPUs. After Release 418 in April 2019, GeForce Game Ready Drivers will no longer support NVIDIA 3D Vision. ("Those looking to utilize 3D Vision can remain on a Release 418 driver.") Critical security updates for mobile Kepler-series GPUs will also cease after April 2020.

Game Ready Driver upgrades, including performance enhancements, new features, and bug fixes, will be available for systems utilizing mobile Maxwell, Pascal, and Turing-series GPUs for notebooks, effective April 2019. Critical security updates will be available on systems utilizing mobile Kepler-series GPUs through April 2020. Game Ready Driver upgrades will continue to be available for desktop Kepler, Maxwell, Pascal, Volta, and Turing-series GPUs.

Posted by Megalith March 10, 2019 10:10 AM (CDT)

Windows 10 Update Can Degrade Graphics, Mouse Performance in Certain Games

Microsoft has alerted gamers that a recent Windows 10 update (KB4482887) can have adverse effects on some titles: "After installing KB4482887, users may notice graphics and mouse performance degradation with desktop gaming when playing certain games (eg: Destiny 2)." The company is still working on an official fix, but those who need an immediate solution can simply uninstall the update.

On March 1st, Microsoft released a brand-new update for Windows 10 that brought a number of quality improvements. However, after various reports, Microsoft has confirmed that this latest update can degrade graphics and mouse performance in certain games (for example, Destiny 2). Microsoft has not detailed the reasons behind this performance degradation. Moreover, this issue may not affect everyone, so we strongly suggest uninstalling this update only if you are experiencing worse performance in your favorite games after applying it.
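For those who would rather script the rollback, Windows' stock wusa.exe utility can remove an update by KB number. A minimal sketch, assuming an elevated (administrator) context and that you have confirmed the regression first:

    import subprocess

    # Remove KB4482887 with the Windows Update Standalone Installer.
    # /quiet suppresses prompts and /norestart defers the reboot so you
    # can finish other work before restarting.
    subprocess.run(
        ["wusa", "/uninstall", "/kb:4482887", "/quiet", "/norestart"],
        check=True)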

Posted by Megalith March 09, 2019 4:00 PM (CST)

NVIDIA: RTX GPUs, High-Refresh-Rate Monitors Can Improve Your Kill-Death Ratio

If only to convince gamers to upgrade their GPUs and displays, NVIDIA has published new data supporting the obvious idea that better hardware improves player performance in Battle Royale titles such as Fortnite and Apex Legends. Essentially, players who can manage 144 fps score significantly higher than those limited to 60 fps: the company’s graphs suggest its RTX cards can increase K/D ratio by as much as 53%, while playing on 240 Hz and 144 Hz monitors can improve K/D ratio by 34% and 51%, respectively.

NVIDIA used more than a million sample points collected via anonymous GeForce Experience data, and then analyzed the data (which means no AMD cards). Specifically, NVIDIA is looking at player performance in two popular battle royale games: PUBG and Fortnite. How do you quantify player performance? NVIDIA looked at kill/death ratio and matched that up with number of hours played per week, then finally broke that down into graphics hardware and monitor refresh rate. NVIDIA limited its analysis to 1080p, which provides for the highest refresh rates and also serves to normalize things a bit.
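To make that methodology concrete, here is a toy sketch of the aggregation step with entirely made-up sample data standing in for NVIDIA's telemetry; only the grouping logic, not the numbers, reflects the analysis described above:

    import pandas as pd

    # Made-up stand-in for NVIDIA's telemetry: one row per player.
    df = pd.DataFrame({
        "refresh_hz": [60, 60, 144, 144, 240, 240],
        "hours_week": [5, 20, 5, 20, 5, 20],
        "kd_ratio":   [0.9, 1.1, 1.2, 1.6, 1.3, 1.7],
    })

    # Median K/D per refresh-rate bucket, expressed as uplift vs. 60 Hz.
    by_hz = df.groupby("refresh_hz")["kd_ratio"].median()
    uplift = (by_hz / by_hz[60] - 1) * 100
    print(uplift.round(1))  # percent K/D uplift relative to 60 Hz players

In NVIDIA's actual study the buckets are further split by GPU model and hours played per week, which is what lets the company try to separate the hardware effect from the "better players buy better gear" effect, at least in principle.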

Posted by Megalith March 09, 2019 10:05 AM (CST)