Nvidia Releases "Creator Ready" RTX Drivers

Earlier this week, Nvidia rolled out a set of "creator ready" drivers that are compatible with consumer GPUs, but optimized for professional applications. This level of support is typically reserved for drivers that only work with pricey Quadro GPUs, but Nvidia says they've conducted "exhaustive multi-app testing" in programs like Adobe Premiere and After Effects. Support for this driver goes all the way back to Pascal cards, and extends to Nvidia's more affordable offerings like the GTX 1050 and the more recent 1660. Perhaps even more interestingly, Nvidia claims they've worked with a number of software vendors to leverage the raytracing and machine-learning muscle their RTX cards offer. Autodesk Arnold and Unreal Engine 4, for example, now support RTX accelerated rendering, and Redcine-X Pro seemingly uses Turing's updated video processing block to decode 8K video without taxing the CPU. Meanwhile, Lightroom uses "an extensively trained convolutional neural network to provide state-of-the-art image enhancing for RAW photographs." While I haven't tested Lightroom's new features myself, in my experience, neural networks can perform small miracles when processing images. Nvidia also claims the new driver features significant performance improvements in Photoshop, Premiere, Blender Cycles, and Cinema 4D.

"Creators are constantly faced with tight deadlines and depend on having the latest hardware and creative tools to complete their projects on time, without compromising quality," said Eric Bourque, senior software development manager at Autodesk. "We're excited that NVIDIA is introducing a Creator Ready Driver program because it will bring Arnold users an even higher level of support, helping them bring their creative visions to life faster and more efficiently." The first Creator Ready Driver is now available from NVIDIA.com or GeForce Experience. From GeForce Experience, you can switch between Game Ready and Creator Ready Drivers at any time by clicking the menu (three vertical dots in the top right corner). Creator Ready Drivers are supported for Turing-based GeForce RTX, GTX and TITAN GPUs, Volta-based TITAN V, Pascal-based GeForce GTX and TITAN GPUs, and all modern Quadro GPUs.

Discussion
Posted by alphaatlas March 22, 2019 8:57 AM (CDT)

Nvidia Skips Ampere at GTC 2019

Several news outlets seem to think Nvidia's GTC presentation was relatively long-winded and unexciting this year. The three-hour keynote reportedly featured some software announcements and a low-power Jetson board, among other things, but not the 7nm Ampere GPUs many were expecting. EE Times says the "unspoken message" of the presentation was that "Nvidia doesn't need to pre-announce a new and faster chip because it owns that software stack and channel today," and the emphasis on CUDA seemed to drive that point home. However, in one of the more exciting parts of the presentation, Nvidia did highlight the Q2VKPT project we covered earlier this year. Nvidia's CEO seemed quite excited about the introduction of raytracing to Quake II, and the company showed off some of the project's gameplay during the keynote.

Presaging that future, Nvidia's chief scientist, Bill Dally, told reporters about a research project on optical chip-to-chip links. It targets throughput measured in terabits per second while drawing roughly 2 picojoules per bit. In an initial implementation, 32 wavelengths will run at 12.5 Gbits/s each, with a move to 64 wavelengths doubling bandwidth in a follow-up generation. Dally predicted that copper links will start to run out of gas as data rates approach 100 Gbits/s, a speed already on many roadmaps for network switches. Progress in more power-efficient laser sources and ring resonators will enable the long-predicted shift, he said. If the future evolves as he believes, bleeding-edge GPUs may continue to skip an appearance at some Nvidia events. Attendees will have to hope that as the interconnects speed up, the keynotes don't get even longer.
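As a back-of-the-envelope sanity check on those figures, here is a short sketch using only the numbers reported above (and reading the energy figure as picojoules per bit) to show how the per-wavelength rates add up:

```python
# Rough math for the reported optical-link targets.
wavelengths_gen1 = 32             # wavelengths in the initial implementation
rate_per_wavelength_gbps = 12.5   # Gbit/s per wavelength
energy_per_bit_pj = 2.0           # reported ~2 picojoules per bit

gen1_bandwidth_gbps = wavelengths_gen1 * rate_per_wavelength_gbps        # 400 Gbit/s
gen2_bandwidth_gbps = 2 * wavelengths_gen1 * rate_per_wavelength_gbps    # 64 wavelengths -> 800 Gbit/s

# Power at full throughput: (bits per second) * (joules per bit) = watts
gen1_power_w = gen1_bandwidth_gbps * 1e9 * energy_per_bit_pj * 1e-12

print(f"Gen 1: {gen1_bandwidth_gbps:.0f} Gbit/s, ~{gen1_power_w:.1f} W at 2 pJ/bit")
print(f"Gen 2: {gen2_bandwidth_gbps:.0f} Gbit/s")
```

At 2 pJ/bit, even the 400 Gbit/s first generation would burn well under a watt on the link itself, which is the point of moving off copper at these data rates.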

Discussion
Posted by alphaatlas March 20, 2019 11:25 AM (CDT)

NVIDIA Could Tease Next-Gen 7nm Ampere at GTC 2019

It isn’t clear whether NVIDIA will have any surprises to share at next week’s GPU Technology Conference (GTC), but some speculate the company could reveal aspects of its next-generation architecture, "Ampere," which will purportedly be built on the 7nm node. TweakTown and TechSpot suggest it could be the right time to do so, as the luster of Volta and Turing fades. The former predicts it won’t be a gaming part, however, suggesting "a new GPU architecture tease that will succeed Volta in the HPC/DL/AI market."

For now, reports have attached the Ampere name to NVIDIA's future 7nm GPUs. If that's the case, the Ampere GPUs would bring power-efficiency improvements, higher clock rates, and perhaps higher memory bandwidth. Now would be a good time for NVIDIA to make a big announcement, considering the company just had one of the worst fiscal quarters it's ever had. Consumer and investor faith in the company is slipping, especially since the adoption of RTX technology has been much slower than expected.

Discussion
Posted by Megalith March 16, 2019 4:45 PM (CDT)

Nvidia's Freesync Monitor Support Tested

Nvidia recently released a driver update that adds support for FreeSync monitors, but the GPU maker was quick to point out that "non-validated" displays could exhibit serious issues. However, Techspot put that claim to the test back in January, and just followed up on it with a handful of new FreeSync monitors from LG. Out of the 12 monitors they've tested so far, 11 have worked flawlessly, and the one that didn't work only supports FreeSync over HDMI, while Nvidia only supports the adaptive sync technology over DisplayPort.

As we said in the original article, we think it's safe to say that if you purchase a new FreeSync model today, it will work fine with Nvidia GPUs. You shouldn't expect to see any graphical issues, whether you buy an LG monitor or a display from a different maker. Only the very early, older FreeSync monitors may have some issues. Thus our recommendation continues to be to select your next monitor on its merits, features, and your preferences. If it's a FreeSync gaming monitor and you can save some money, that's great. GeForce GPU owners no longer need to bother with G-Sync to get proper variable refresh rates, unless that's the monitor you want for other reasons, as mentioned above. These results from a selection of LG monitors just reinforce that FreeSync gaming displays can and will perform perfectly.

Discussion
Posted by alphaatlas March 12, 2019 9:03 AM (CDT)

NVIDIA Acquires Mellanox for $6.9 Billion

NVIDIA has reached a definitive agreement to acquire Mellanox for $6.9 billion. NVIDIA will acquire all of the issued and outstanding common shares of Mellanox for $125 per share in cash. NVIDIA and Mellanox are both known as high-performance computing (HPC) industry leaders, and their products are found in over 250 of the world's TOP500 supercomputers and every major cloud service provider. Mellanox is known for its high-performance InfiniBand interconnect technology and high-speed Ethernet products. "We share the same vision for accelerated computing as NVIDIA," said Eyal Waldman, founder and CEO of Mellanox. "Combining our two companies comes as a natural extension of our longstanding partnership and is a great fit given our common performance-driven cultures. This combination will foster the creation of powerful technology and fantastic opportunities for our people."

"The emergence of AI and data science, as well as billions of simultaneous computer users, is fueling skyrocketing demand on the world's datacenters," said Jensen Huang, founder and CEO of NVIDIA. "Addressing this demand will require holistic architectures that connect vast numbers of fast computing nodes over intelligent networking fabrics to form a giant datacenter-scale compute engine. "We're excited to unite NVIDIA's accelerated computing platform with Mellanox's world-renowned accelerated networking platform under one roof to create next-generation datacenter-scale computing solutions. I am particularly thrilled to work closely with the visionary leaders of Mellanox and their amazing people to invent the computers of tomorrow."

Discussion
Posted by cageymaru March 11, 2019 9:42 AM (CDT)

NVIDIA Ending Driver Support for 3D Vision, Mobile Kepler-Series GeForce GPUs

NVIDIA has published two new support entries revealing the fate of its 3D Vision technology and Kepler notebook GPUs. After Release 418 in April 2019, GeForce Game Ready Drivers will no longer support NVIDIA 3D Vision. ("Those looking to utilize 3D Vision can remain on a Release 418 driver.") Critical security updates for mobile Kepler-series GPUs will also cease by April 2020.

Game Ready Driver upgrades, including performance enhancements, new features, and bug fixes, will be available for systems utilizing mobile Maxwell, Pascal, and Turing-series GPUs for notebooks, effective April 2019. Critical security updates will be available on systems utilizing mobile Kepler-series GPUs through April 2020. Game Ready Driver upgrades will continue to be available for desktop Kepler, Maxwell, Pascal, Volta, and Turing-series GPUs.

Discussion
Posted by Megalith March 10, 2019 10:10 AM (CDT)

NVIDIA: RTX GPUs, High-Refresh-Rate Monitors Can Improve Your Kill-Death Ratio

If only to convince gamers to upgrade their GPUs and displays, NVIDIA has published new data supporting the obvious idea that better hardware improves player performance in Battle Royale titles such as Fortnite and Apex Legends. Essentially, players who can manage 144 fps score significantly higher than those limited to 60 fps: the company’s graphs suggest its RTX cards can increase K/D ratio by as much as 53%, while playing on 240 Hz and 144 Hz monitors can improve K/D ratio by 34% and 51%, respectively.

NVIDIA analyzed more than a million sample points collected anonymously through GeForce Experience (which means no AMD cards are represented). Specifically, NVIDIA looked at player performance in two popular battle royale games: PUBG and Fortnite. How do you quantify player performance? NVIDIA took kill/death ratio, matched it up with the number of hours played per week, then broke that down by graphics hardware and monitor refresh rate. NVIDIA limited its analysis to 1080p, which allows for the highest refresh rates and also serves to normalize things a bit.
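NVIDIA hasn't released the underlying telemetry, but the breakdown described above is easy to picture. Here is a minimal pandas sketch with invented column names and toy data (nothing below reflects NVIDIA's actual schema or numbers), just to illustrate the kind of grouping involved:

```python
import pandas as pd

# Hypothetical per-player telemetry table; columns are illustrative only.
df = pd.DataFrame({
    "gpu":            ["GTX 1060", "RTX 2070", "RTX 2080", "GTX 1060"],
    "refresh_hz":     [60, 144, 240, 144],
    "hours_per_week": [5, 12, 20, 12],
    "kd_ratio":       [0.9, 1.3, 1.5, 1.1],
})

# Bucket weekly playtime so hardware groups are compared at similar levels of time invested.
df["hours_bucket"] = pd.cut(df["hours_per_week"], bins=[0, 5, 10, 20, 40])

# Median K/D per (hours bucket, refresh rate) group, mirroring the breakdown described above.
summary = (df.groupby(["hours_bucket", "refresh_hz"], observed=True)["kd_ratio"]
             .median()
             .unstack("refresh_hz"))
print(summary)
```

Bucketing by hours played is the step that attempts to control for player skill and commitment; without it, the hardware comparison would mostly measure who plays the most.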

Discussion
Posted by Megalith March 09, 2019 10:05 AM (CST)

NVIDIA GeForce Game Ready Driver Version 419.35 WHQL

NVIDIA has released the GeForce Game Ready Driver 419.35 WHQL and it provides the optimal gaming experience for Apex Legends, Devil May Cry 5, and Tom Clancy's The Division II. The driver also adds support for three new G-SYNC Compatible monitors. The 3D Vision profile for Total War: Three Kingdoms is rated as fair, but the profile for Devil May Cry 5 is not recommended.

The following SLI profiles have been added or updated:
- Anthem
- Apex Legends
- Assetto Corsa Competizione (AFR enabled for Turing GPUs)
- Battlefleet Gothic: Armada 2 (AFR enabled for Turing GPUs)
- Far Cry New Dawn
- Life is Strange Season 2 (AFR enabled for Turing GPUs)
- NBA 2K19 (AFR enabled for Turing GPUs)
- Space Hulk Tactics (AFR enabled for Turing GPUs)

Fixed issues in this release:
- [G-SYNC]: With a G-SYNC and G-SYNC Compatible display connected in clone mode, flashing occurs on games played on the G-SYNC display with G-SYNC enabled. [200482157]
- [Apex Legends]: DXGI_ERROR_DEVICE_HUNG error message may appear and then the game crashes. [2503360]
- [Hitman 2]: Pixelated corruption appears in the game. [2504274]
- [Batman: Arkham Origins]: PhysX fog renders incorrectly. [2451459]
- [Turing][Star Citizen]: The game flickers and then crashes to the desktop. [2518104]
- [GeForce RTX 2080][PhysX][Assassin's Creed 4 - Black Flag]: Smoke dispersal appears accelerated. [2498928]
- Microsoft.Photos.exe randomly crashes. [200496899]
- NVDisplay.Container.exe causes high CPU usage. [200496099]

Windows 10 issues:
- [SLI][HDR][Tom Clancy's The Division II]: The game screen becomes unresponsive or goes blank when in-game HDR options are toggled. [200496967]
- [HDR][Far Cry: New Dawn DirectX 11]: Desktop brightness and color get overexposed with ALT + TAB when Windows HDR is disabled and in-game HDR is enabled. [200495279] To work around, enable both the Windows HDR and the in-game HDR.
- [ARK Survival]: Multiple errors and then a blue-screen crash may occur when playing the game. [2453173]
- [Shadow of the Tomb Raider][Ansel]: Invoking Ansel in the game causes the game to slow down or crash. [2507125]
- [Firefox]: Cursor shows brief corruption when hovering on certain links in Firefox. [2107201]
- Random desktop flicker occurs on some multi-display PCs. [2453059]

Discussion
Posted by cageymaru March 05, 2019 2:27 PM (CST)

Nvidia Adds Metro Exodus to RTX Bundle

In a bid to tempt those in the market for a new GPU, Nvidia just launched the "RTX Triple Threat" game bundle with what may be the best RTX-enabled game to date. Nvidia's previous "Game On Bundle" gave buyers Battlefield V and Anthem for free, but starting today, anyone who buys an RTX 2080 or 2080 Ti will get Metro Exodus as well. RTX 2070 and 2060 buyers get to pick one of the three games, but sadly, the recently released 1660 Ti isn't part of this particular bundle.
Pre-built desktops and laptops are covered by the deal as well, and the terms and conditions mention that the deal will be good until at least April 4, 2019. Several games that previously advertised RTX support still haven't received patches to enable it, while others are coming closer to release, so I suspect Nvidia will offer bundles with a larger variety of RTX-enabled games later this year. But, as we've noted, Metro Exodus is one of the best examples of Nvidia's raytracing and DLSS technology to date, and the developers are improving it with every update.

Update 3/6/2019: We previously reported that RTX 2070 and 2060 buyers get 2 out of the 3 games, but Nvidia's terms say they only get to pick a single game.

Discussion
Posted by alphaatlas March 05, 2019 12:49 PM (CST)

Google Starts Selling Edge TPUs

Back in 2016, Google revealed its custom-built Tensor Processing Unit chips, which were explicitly designed for machine learning tasks, and just last year the company started renting out cloud-based access to updated versions of that AI hardware. Those monster ASICs are squarely aimed at ML training tasks, but during the same period, Google also released a small, low-power Edge TPU designed to run the models the big chips train a little closer to home. Previously, these "Edge" ML chips were only available to rent through Google, but an Alphabet spin-off called Coral just started selling the Edge TPUs through Mouser. Interested parties can buy a self-contained development board complete with an ARM CPU, an integrated GPU, I/O, and Ethernet for $149, while a USB accessory akin to an Intel Compute Stick will set you back $75. A PCIe-based accelerator and a 40mm x 40mm "System-on-Module" are said to be coming sometime in 2019, but there's no word on when, or if, Google will ever sell the bigger TPU ASICs as discrete co-processors.

AI is pervasive today, from consumer to enterprise applications. With the explosive growth of connected devices, combined with a demand for privacy/confidentiality, low latency and bandwidth constraints, AI models trained in the cloud increasingly need to be run at the edge. Edge TPU is Google's purpose-built ASIC designed to run AI at the edge. It delivers high performance in a small physical and power footprint, enabling the deployment of high-accuracy AI at the edge.
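For a sense of what running a model on one of these Edge TPUs looks like in practice, here is a minimal inference sketch using the TensorFlow Lite runtime with the Edge TPU delegate. The model filename is a placeholder and the exact package names have shifted over time, so treat this as an illustrative assumption rather than Coral's official sample code:

```python
# Minimal sketch: run a pre-compiled Edge TPU model via the TFLite runtime.
# Assumes tflite_runtime and the libedgetpu delegate library are installed,
# and that "model_edgetpu.tflite" is a placeholder for a compiled model.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],  # routes supported ops to the Edge TPU
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Dummy input matching the model's expected shape and dtype (Edge TPU models are quantized).
dummy = np.zeros(input_details["shape"], dtype=input_details["dtype"])
interpreter.set_tensor(input_details["index"], dummy)
interpreter.invoke()

print(interpreter.get_tensor(output_details["index"]))
```

The key point is that training still happens elsewhere; the Edge TPU only accelerates inference on models that have already been quantized and compiled for it.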

Discussion
Posted by alphaatlas March 05, 2019 8:41 AM (CST)

NVIDIA GeForce GTX 1650 Turing Specs Allegedly Leak: 1.4GHz Base Clock, 4GB GDDR5

Bangkok-based leaker APISAK (@TUM_APISAK) has posted a 3DMark screenshot revealing the alleged specs of NVIDIA’s upcoming GeForce GTX 1650 GPU, which will reportedly launch alongside the GTX 1660 in spring. The Turing-based card is listed with a 1,395MHz base clock and 1,560MHz boost clock, and 4GB of presumed GDDR5 memory: "Past leaks peg the memory bus at 128-bit, and with a 2,000MHz effective clock speed, that would give the card 128GB/s of memory bandwidth."
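That 128GB/s figure checks out arithmetically if the quoted 2,000MHz is taken as the GDDR5 memory clock, since GDDR5 transfers four bits per pin per clock. A quick sketch of the math (the clock interpretation is our assumption, not part of the leak):

```python
# Deriving the rumored GTX 1650 memory bandwidth from the leaked figures.
bus_width_bits = 128       # rumored memory bus width
memory_clock_mhz = 2000    # rumored memory clock (assumed base clock, not data rate)

data_rate_gbps = memory_clock_mhz * 4 / 1000        # GDDR5 quad data rate -> 8 Gbit/s per pin
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps  # bytes per transfer * transfers per second

print(f"{bandwidth_gb_s:.0f} GB/s")  # 128 GB/s, matching the leak
```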

...we can reasonably surmise that this will be yet another Turing card that lacks RT and Tensor cores, which are what give GeForce RTX series cards their real-time ray tracing and Deep Learning Super Sampling (DLSS) mojo. NVIDIA rightly recognized that gamers at large are waiting for both features to be more widely supported before investing in the necessary hardware. Hence the GeForce GTX 1660 Ti, which lacks those features and is the least expensive Turing card on the market.

Discussion
Posted by Megalith March 02, 2019 9:40 AM (CST)

NVIDIA and AMD Are Shipping Fewer GPUs as Retailers Sit on Inventory

Jon Peddie Research’s review of the GPU market for Q4 2018 suggests retailers are still neck-deep in graphics cards they can’t get rid of, as overall shipments are down from the previous quarter: "all three companies saw a 2.65-percent decline in shipments from Q3 2018 to Q4 2018. AMD was down 6.8 percent, NVIDIA was down 7.6 percent, and Intel was down 0.7 percent." Peddie predicts the fallout from the cryptocurrency crash will persist through Q1 and Q2 2019.

Year-over-year, GPU shipments dropped 3.3 percent. An 8-percent increase in notebook GPUs wasn’t enough to offset a 20-percent decline in desktop video cards. "The channel’s demand for add-in boards (AIBs) in early 2018 was out of sync with what was happening in the market," Jon Peddie Research founder Dr. Jon Peddie said. "As a result the channel was burdened with too much inventory. That has impacted sales of discrete GPUs in Q4." And it’s unlikely that the market and supply chain will equalize any time soon.

Discussion
Posted by Megalith March 02, 2019 9:00 AM (CST)