Articles

Nvidia Releases "Creator Ready" RTX Drivers

Earlier this week, Nvidia rolled out a set of "creator ready" drivers that are compatible with consumer GPUs, but optimized for professional applications. This level of support is typically reserved for drivers that only work with pricey Quadro GPUs, but Nvidia says they've conducted "exhaustive multi-app testing" in programs like Adobe Premiere and After Effects. Support for this driver goes all the way back to Pascal cards, and extends to Nvidia's more affordable offerings like the GTX 1050 and the more recent 1660. Perhaps even more interestingly, Nvidia claims they've worked with a number of software vendors to leverage the raytracing and machine-learning muscle their RTX cards offer. Autodesk Arnold and Unreal Engine 4, for example, now support RTX accelerated rendering, and Redcine-X Pro seemingly uses Turing's updated video processing block to decode 8K video without taxing the CPU. Meanwhile, Lightroom uses "an extensively trained convolutional neural network to provide state-of-the-art image enhancing for RAW photographs." While I haven't tested Lightroom's new features myself, in my experience, neural networks can perform small miracles when processing images. Nvidia also claims the new driver features significant performance improvements in Photoshop, Premiere, Blender Cycles, and Cinema 4D.
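The article only quotes Adobe's description of the feature as an "extensively trained convolutional neural network" for RAW photographs. Purely as an illustration of what such a model looks like, here is a minimal PyTorch sketch of a tiny convolutional enhancer; the architecture, channel counts, and packed-Bayer input shape are assumptions for demonstration, not Adobe's actual network.

```python
# Illustrative sketch only: a tiny convolutional image-enhancement network in the
# spirit of a RAW "enhance" feature. The layer sizes, packed-Bayer input, and
# upsampling scheme are assumptions for demonstration, not Adobe's model.
import torch
import torch.nn as nn

class TinyEnhancer(nn.Module):
    """Maps a 4-channel packed Bayer mosaic tile to a full-resolution RGB tile."""
    def __init__(self, width: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(width, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            # Upsample 2x to recover full resolution from the packed mosaic.
            nn.ConvTranspose2d(width, width, kernel_size=2, stride=2),
            nn.ReLU(inplace=True),
            nn.Conv2d(width, 3, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

if __name__ == "__main__":
    # A fake half-resolution packed RAW tile: (batch, RGGB channels, height, width).
    raw_tile = torch.rand(1, 4, 64, 64)
    model = TinyEnhancer()
    rgb = model(raw_tile)
    print(rgb.shape)  # torch.Size([1, 3, 128, 128])
```

A production model would be far deeper and trained on large sets of paired mosaics and reference conversions, but the basic shape of the problem (mosaic in, full-resolution RGB out) is the same.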

"Creators are constantly faced with tight deadlines and depend on having the latest hardware and creative tools to complete their projects on time, without compromising quality," said Eric Bourque, senior software development manager at Autodesk. "We're excited that NVIDIA is introducing a Creator Ready Driver program because it will bring Arnold users an even higher level of support, helping them bring their creative visions to life faster and more efficiently." The first Creator Ready Driver is now available from NVIDIA.com or GeForce Experience. From GeForce Experience, you can switch between Game Ready and Creator Ready Drivers at any time by clicking the menu (three vertical dots in the top right corner). Creator Ready Drivers are supported for Turing-based GeForce RTX, GTX and TITAN GPUs, Volta-based TITAN V, Pascal-based GeForce GTX and TITAN GPUs, and all modern Quadro GPUs.

Discussion
Posted by alphaatlas March 22, 2019 8:57 AM (CDT)

Nvidia Skips Ampere at GTC 2019

Several news outlets seem to think Nvidia's GTC presentation was relatively long-winded and unexciting this year. The three-hour keynote reportedly featured some software announcements and a low-power Jetson board, among other things, but didn't feature the 7nm Ampere GPUs many were expecting. EE Times says that the "unspoken message" at the presentation was that "Nvidia doesn't need to pre-announce a new and faster chip because it owns that software stack and channel today," and the emphasis on CUDA seemed to drive that point home. However, in one of the more exciting parts of the presentation, Nvidia did highlight the Q2VKPT project we covered earlier this year. Nvidia's CEO seemed quite excited about the introduction of raytracing to Quake II, and the company showed off some of the project's gameplay, which you can see below:

Presaging that future, Nvidia's chief scientist, Bill Dally, told reporters about a research project in optical chip-to-chip links. It targets throughput in terabits per second while drawing 2 picojoules per bit. In an initial implementation, 32 wavelengths will run at 12.5 Gbits/s each, with a move to 64 wavelengths doubling bandwidth in a follow-up generation. Dally predicted that copper links will start to run out of gas as data rates approach 100 Gbits/s, already on many roadmaps for network switches. Progress in more power-efficient laser sources and ring resonators will enable the long-predicted shift, he said. If the future evolves as he believes, bleeding-edge GPUs may continue to skip an appearance at some Nvidia events. Attendees will have to hope that as the interconnects speed up, the keynotes don't get even longer.
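For a rough sense of what those figures imply, here is a quick back-of-envelope calculation; the per-wavelength rate and the 2 pJ/bit target are taken from the talk as reported, while the aggregate throughput and link power are simple arithmetic derived from them.

```python
# Back-of-envelope check of the optical link figures quoted above. The per-bit
# energy and per-wavelength rate come from the reported talk; the rest is arithmetic.
PJ_PER_BIT = 2e-12   # 2 picojoules per bit (quoted efficiency target)
LANE_RATE = 12.5e9   # 12.5 Gbit/s per wavelength (initial implementation)

for wavelengths in (32, 64):
    throughput = wavelengths * LANE_RATE   # bits per second
    power = throughput * PJ_PER_BIT        # joules/bit * bits/s = watts
    print(f"{wavelengths} wavelengths: {throughput / 1e9:.0f} Gbit/s aggregate, "
          f"~{power:.1f} W link power")

# 32 wavelengths: 400 Gbit/s aggregate, ~0.8 W link power
# 64 wavelengths: 800 Gbit/s aggregate, ~1.6 W link power
# At a full terabit per second, 2 pJ/bit works out to roughly 2 W per link.
```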

Discussion
Posted by alphaatlas March 20, 2019 11:25 AM (CDT)

Shadow of the Tomb Raider Receives RTX Treatment in Latest Patch

The latest patch for Shadow of the Tomb Raider has enabled support for real-time ray tracing and NVIDIA's DLSS. To enable the features, gamers will need Windows 10 update 1809 or higher, an NVIDIA RTX 20-series GPU, and NVIDIA's latest drivers (419.35 and up). Nixxes has also made a beta available on Steam that lets players switch back to an older version of the game if problems arise.
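As a convenience, here is a minimal sketch of checking two of those requirements (an RTX 20-series GPU and driver 419.35 or newer) from a script. The nvidia-smi query flags used are standard, but the simple name match and version comparison are illustrative assumptions, and the Windows build check is left out.

```python
# Minimal sketch: verify GPU model and driver version against the patch's stated
# requirements. Assumes nvidia-smi is on the PATH and a single GPU is installed.
import subprocess

def meets_rtx_requirements(min_driver: float = 419.35) -> bool:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
        text=True,
    )
    name, driver = [field.strip() for field in out.splitlines()[0].split(",")]
    has_rtx_20 = "RTX 20" in name          # simplistic model check (assumption)
    new_enough = float(driver) >= min_driver
    return has_rtx_20 and new_enough

if __name__ == "__main__":
    print("Ray tracing / DLSS requirements met:", meets_rtx_requirements())
```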

We have just released the thirteenth PC patch for Shadow of the Tomb Raider, build 1.0.280. This patch focuses primarily on the release of Nvidia's Ray-Traced Shadows and DLSS. While we expect this patch to be an improvement for everyone, if you do have trouble with this patch and prefer to stay on the old version, we have made a Beta available on Steam, Build 279, that can be used to switch back to the previous version.

Discussion
Posted by cageymaru March 19, 2019 9:55 PM (CDT)

Not All RTX 2080s are Created Equal

Manufacturers have had some time to stock store shelves and warehouses with Nvidia RTX laptops, but as Techspot pointed out earlier this year, the nomenclature can be very confusing. The laptop "RTX 2080," for example, doesn't offer the same performance as the desktop RTX 2080, and there are multiple versions of the "RTX 2080 Max-Q" with different levels of performance. Hardware Unboxed tested the performance difference between the various versions, which you can see in the video below:

The fact that Nvidia can cram a 545mm^2 GPU into a low-power laptop at all is remarkable, and generally speaking, the RTX chips perform well in their relatively small power envelopes. But as the video points out, be careful if you're in the market for a gaming laptop, as the actual performance level of some RTX GPUs can be difficult to discern.

Discussion
Posted by alphaatlas March 19, 2019 10:24 AM (CDT)

Get Ready for Targeted Ads on Your Smart TV

Disney, Comcast, NBCUniversal, and other top media companies have teamed up with VIZIO for a new standard that will bring targeted ads to television viewers. VIZIO, which recently paid $2.2 million to settle charges that it tracked and sold viewing data collected by software on its smart TVs, claims targeted ads that are "relevant" to the household will "drastically enhance" the viewing experience.

The companies are calling themselves a consortium, and they've dubbed this "Project OAR," or Open Addressable Ready. Once developed, the new, open standard will make it possible for all connected TV companies to sell targeted ads in scheduled and on-demand programs. While this will theoretically make ads more successful and therefore more valuable, it also means viewers' data will be shared with third parties. That raises the usual data privacy concerns.

Discussion
Posted by Megalith March 17, 2019 2:45 PM (CDT)

NVIDIA Could Tease Next-Gen 7nm Ampere at GTC 2019

It isn’t clear whether NVIDIA will have any surprises to share at next week’s GPU Technology Conference (GTC), but some speculate the company could reveal aspects of its next-generation architecture, "Ampere," which will purportedly be built on the 7nm node. TweakTown and TechSpot suggest it could be the right time to do so, as the luster of Volta and Turing continues to fade. The former predicts it won’t be a gaming part, however, suggesting "a new GPU architecture tease that will succeed Volta in the HPC/DL/AI market."

For now, Ampere is simply the name attached to NVIDIA's future 7nm GPUs. If those parts do arrive on 7nm, they would bring power efficiency improvements, higher clock rates, and perhaps higher memory bandwidth. Now would be a good time for NVIDIA to make a big announcement, considering the company just had one of the worst fiscal quarters it's ever had. Consumer and investor faith in the company is slipping, especially since the adoption of RTX technology has been much slower than expected.

Discussion
Posted by Megalith March 16, 2019 4:45 PM (CDT)

"Back 4 Blood": Left 4 Dead Creator Announces New Co-op Zombie Game

Nobody knows if Valve’s Left 4 Dead 3 will ever see the light of day, but the franchise’s original developer may have the next best thing: Turtle Rock Studios announced "Back 4 Blood" this week, a co-op zombie shooter with a campaign and PVP mode that should cater well to L4D fans who have been clamoring for more. "We know we have some big shoes to fill, but we’re going all out to surpass everything we’ve done before."

"We get to return to a genre that was born in our studio with over 10 years of additional experience and zombie ideas racked up in our brains. We also have some of the best teammates in the business at WBIE, who understand our development process and are equally committed to our player-first mentality. We love being able to announce, so we can start working with the community right away."

Discussion
Posted by Megalith March 16, 2019 12:30 PM (CDT)

Ubisoft Releases Tom Clancy's The Division 2 Launch Trailers

Ubisoft has released the latest trailers for Tom Clancy's The Division 2, and the first video showcases many of the AMD optimizations in the game. From high-resolution shadows to real-time reflections, the game is shaping up to be a graphical masterpiece. The second trailer showcases the developer's vision in recreating a real-life, post-pandemic Washington, DC within The Division 2. Ubisoft researchers explain in great detail why certain wildlife, monuments, locations, camps, and more are present in the game. Tom Clancy's The Division 2 launches March 15th, but Gold Edition purchasers are already playing, as those versions include 3-day early access.

How do you recreate a city? The Division 2 takes place in a post-pandemic Washington, DC, and bringing it to life required in-depth research and dedication to getting the details right. Learn about how the team tackled iconic monuments, incorporated real-life disaster planning into their designs, and adapted the map to create a believable post-pandemic city in these interviews with IP Researcher Cloe Hammoud and Lead Environmental Artist Chad Chatterton.

Discussion
Posted by cageymaru March 13, 2019 6:49 PM (CDT)

The Web's Creator Comments on Its Future

The web turned 30 this year, and CERN celebrated the anniversary with a long (and if I'm being honest, not particularly exciting) webcast featuring its creator, Sir Tim Berners-Lee. However, after the recent Cambridge Analytica data scandal and what seems like a new privacy or security scandal every day since then, Sir Tim is worried about the future of the web. Today, he published an open letter on the future of the web, noting that "while the web has created opportunity, given marginalised groups a voice, and made our daily lives easier, it has also created opportunity for scammers, given a voice to those who spread hatred, and made all kinds of crime easier to commit." The BBC recently posted an interview with Sir Tim, which you can watch below:

To tackle any problem, we must clearly outline and understand it. I broadly see three sources of dysfunction affecting today's web: Deliberate, malicious intent, such as state-sponsored hacking and attacks, criminal behaviour, and online harassment; System design that creates perverse incentives where user value is sacrificed, such as ad-based revenue models that commercially reward clickbait and the viral spread of misinformation; Unintended negative consequences of benevolent design, such as the outraged and polarised tone and quality of online discourse.

Discussion
Posted by alphaatlas March 12, 2019 10:46 AM (CDT)

Nvidia's FreeSync Monitor Support Tested

Nvidia recently released a driver update that adds support for FreeSync monitors, but the GPU maker was quick to point out that "non-validated" displays could exhibit serious issues. However, Techspot put that claim to the test last January, and just followed up on it with a handful of new FreeSync monitors from LG. Out of the 12 monitors they've tested so far, 11 have worked flawlessly, and the one that didn't only supports FreeSync over HDMI, while Nvidia only supports the adaptive sync technology over DisplayPort.

As we said in the original article, we think it's safe to say if you purchase a new FreeSync model today that it will work fine with Nvidia GPUs. You shouldn't expect to see any graphical issues, whether you buy an LG monitor or a display from a different maker. Only the very early, older FreeSync monitors may have some issues. Thus our recommendation continues to be to select your next monitor on its merits, features and preference. If it's a FreeSync gaming monitor and you can save some money, that's great. GeForce GPU owners no longer need to bother with G-Sync to get proper variable refresh rates, unless that's the monitor you want for other factors as mentioned above. These results from a selection of LG monitors just reinforce that FreeSync gaming displays can and will perform perfectly.

Discussion
Posted by alphaatlas March 12, 2019 9:03 AM (CDT)

NVIDIA Acquires Mellanox for $6.9 Billion

NVIDIA has reached a definitive agreement to acquire Mellanox for $6.9 billion. NVIDIA will acquire all of the issued and outstanding common shares of Mellanox for $125 per share in cash. NVIDIA and Mellanox are both known as high performance computing (HPC) industry leaders, and their products are found in over 250 of the world's TOP500 supercomputers and at every major cloud service provider. Mellanox is known for its high-performance InfiniBand interconnect technology and high-speed Ethernet products. "We share the same vision for accelerated computing as NVIDIA," said Eyal Waldman, founder and CEO of Mellanox. "Combining our two companies comes as a natural extension of our longstanding partnership and is a great fit given our common performance-driven cultures. This combination will foster the creation of powerful technology and fantastic opportunities for our people."

"The emergence of AI and data science, as well as billions of simultaneous computer users, is fueling skyrocketing demand on the world's datacenters," said Jensen Huang, founder and CEO of NVIDIA. "Addressing this demand will require holistic architectures that connect vast numbers of fast computing nodes over intelligent networking fabrics to form a giant datacenter-scale compute engine. "We're excited to unite NVIDIA's accelerated computing platform with Mellanox's world-renowned accelerated networking platform under one roof to create next-generation datacenter-scale computing solutions. I am particularly thrilled to work closely with the visionary leaders of Mellanox and their amazing people to invent the computers of tomorrow."

Discussion
Posted by cageymaru March 11, 2019 9:42 AM (CDT)

NVIDIA Ending Driver Support for 3D Vision, Mobile Kepler-Series GeForce GPUs

NVIDIA has published two new support entries revealing the fate of its 3D Vision technology and Kepler notebook GPUs. After Release 418 in April 2019, GeForce Game Ready Drivers will no longer support NVIDIA 3D Vision. ("Those looking to utilize 3D Vision can remain on a Release 418 driver.") Critical security updates for mobile Kepler-series GPUs will continue only through April 2020.

Game Ready Driver upgrades, including performance enhancements, new features, and bug fixes, will be available for systems utilizing mobile Maxwell, Pascal, and Turing-series GPUs for notebooks, effective April 2019. Critical security updates will be available on systems utilizing mobile Kepler-series GPUs through April 2020. Game Ready Driver upgrades will continue to be available for desktop Kepler, Maxwell, Pascal, Volta, and Turing-series GPUs.

Discussion
Posted by Megalith March 10, 2019 10:10 AM (CDT)