Articles

The DHS Issues Medical Advisory for Medtronic Cardiac Devices

The Department of Homeland Security (DHS) has issued a cybersecurity advisory documenting vulnerabilities in the Medtronic Conexus Radio Frequency Telemetry Protocol. Medtronic makes cardioverter-defibrillators that are implanted in a patient's chest and can be read and programmed by trained medical personnel. The Conexus protocol lets these devices communicate with home monitoring units and the CareLink programmers found in doctors' offices. The vulnerabilities require only a low skill level to exploit, as the proprietary Conexus telemetry protocol implements neither authentication nor authorization: an attacker can inject, replay, modify, and/or intercept data within the telemetry communication. Because the protocol provides the ability to read and write memory values on affected implanted cardiac devices, an attacker could exploit it to change memory in an implanted cardiac device. The devices also lack encryption, so attackers can eavesdrop on communications, including the transmission of sensitive data. Medtronic is working on updates to fix the vulnerabilities.

"It is possible with this attack to cause harm to a patient, either by erasing the firmware that is giving necessary therapy to the patient's heart, or by directly invoking shock related commands on the defibrillator," he said. "Since this protocol is unauthenticated, the ICD cannot discern if communications its receiving are coming from a trusted Medtronic device, or an attacker." A successful attacker could erase or reprogram the defibrillator's firmware, and run any command on the device.

Discussion
Posted by cageymaru March 22, 2019 4:05 PM (CDT)

Microsoft Patent Describes a Persistence of Vision Augmented Reality Display

Microsoft has filed a patent for a new type of mixed-reality head-mounted display (HMD) called a Persistence of Vision Augmented Reality Display. The display uses movable screens to simulate a 360-degree field of view. These screens can rotate back and forth, spin around a user's head, or spin in front of the user's eyes. The device isn't guaranteed to ever see the light of day, but hopefully some of the ideas will make it into other products.

Microsoft's patent FIG. 4A illustrates a movement-based display device with a movable member configured to rotate about a user's head; FIG. 4B illustrates a movement-based display device with a movable member configured to reciprocate in front of a user's eyes; FIG. 4C illustrates a movement-based display device with movable members configured to spin in front of the user's eyes.

Discussion
Posted by cageymaru March 22, 2019 3:17 PM (CDT)

Audeze Mobius Review

Hardware Canucks has reviewed the $400 Audeze Mobius premium 3D gaming headset. The reviewer loved the 3D mode for movies, but didn't see the point in it for gaming, and he also noted the decreased sound quality in 3D mode. The 3D head tracking would lose the headset's position and, even when it worked, was too slow to react to head movements. The reviewer didn't think the virtual 7.1 surround mode on the headset was that great, and the short range of the headset's Bluetooth caused issues when he walked around the office. The cabling included with the Audeze Mobius was subpar and the microphone quality was lackluster. He did appreciate the immersive sound quality that the planar magnetic headset exhibited in Hi-Res mode. You can read our review of the Audeze Mobius here.

The Audeze Mobius is supposed to be a gaming headset that will satisfy audiophiles with incredible planar magnetic drivers and 3D positional audio. It has literally every feature but it also costs $400. But if you have the money, this might be the best gaming headset available.

Discussion
Posted by cageymaru March 22, 2019 12:04 PM (CDT)

Ransomware Encourages Victims to Subscribe to PewDiePie

PewDiePie's battle with Indian music label T-Series has pushed some of his more enthusiastic fans to extremes. A group of hackers used printers to promote their favorite YouTuber last year, and more recently, they hacked their way into smart TVs, Chromecasts, and Google Home devices. Now, recent reports claim that new strains of ransomware are encouraging users to subscribe to PewDiePie. The "PewDiePie ransomware" released last year didn't even bother to save encryption keys, which means whatever user data it targeted was gone for good, while a new strain that popped up this January runs in Java to make detection more difficult. However, instead of asking for a ransom, the latter program simply offers a link to PewDiePie's subscription page. It claims that decryption keys will be released if PewDiePie hits 100 million subscribers before T-Series, while the user's data will never see the light of day again if T-Series hits that mark first. ZDNet says the software was "put together as a joke," but still managed to infect a few users, and that the code is now publicly available on GitHub. Thanks to AceGoober for the tip, and check out a demonstration of the ransomware below:

Both ransomware strains show the level of idiocy the competition for YouTube's top spot has reached. While T-Series fans have remained mostly quiet most of this time, a portion of PewDiePie's fans appears to have lost their minds and engaged in media stunts bordering on criminal behavior... The message itself has become a meme, and not in a good way.

Discussion
Posted by alphaatlas March 22, 2019 9:26 AM (CDT)

Nvidia Releases "Creator Ready" RTX Drivers

Earlier this week, Nvidia rolled out a set of "creator ready" drivers that are compatible with consumer GPUs, but optimized for professional applications. This level of support is typically reserved for drivers that only work with pricey Quadro GPUs, but Nvidia says they've conducted "exhaustive multi-app testing" in programs like Adobe Premiere and After Effects. Support for this driver goes all the way back to Pascal cards, and extends to Nvidia's more affordable offerings like the GTX 1050 and the more recent 1660. Perhaps even more interestingly, Nvidia claims they've worked with a number of software vendors to leverage the raytracing and machine-learning muscle their RTX cards offer. Autodesk Arnold and Unreal Engine 4, for example, now support RTX accelerated rendering, and Redcine-X Pro seemingly uses Turing's updated video processing block to decode 8K video without taxing the CPU. Meanwhile, Lightroom uses "an extensively trained convolutional neural network to provide state-of-the-art image enhancing for RAW photographs." While I haven't tested Lightroom's new features myself, in my experience, neural networks can perform small miracles when processing images. Nvidia also claims the new driver features significant performance improvements in Photoshop, Premiere, Blender Cycles, and Cinema 4D.

"Creators are constantly faced with tight deadlines and depend on having the latest hardware and creative tools to complete their projects on time, without compromising quality," said Eric Bourque, senior software development manager at Autodesk. "We're excited that NVIDIA is introducing a Creator Ready Driver program because it will bring Arnold users an even higher level of support, helping them bring their creative visions to life faster and more efficiently." The first Creator Ready Driver is now available from NVIDIA.com or GeForce Experience. From GeForce Experience, you can switch between Game Ready and Creator Ready Drivers at any time by clicking the menu (three vertical dots in the top right corner). Creator Ready Drivers are supported for Turing-based GeForce RTX, GTX and TITAN GPUs, Volta-based TITAN V, Pascal-based GeForce GTX and TITAN GPUs, and all modern Quadro GPUs.

Discussion
Posted by alphaatlas March 22, 2019 8:57 AM (CDT)

Sound BlasterX G6 Review

Audio Science Review has tested the Sound BlasterX G6 and found it to be surprisingly good for a budget device in a feature-filled package. During DAC testing it was discovered that the device's SINAD would rocket up to 112 dB if the level was dialed down by 2 dBFS (digitally). Linearity was spot on, but intermodulation distortion was a concern; Amirm discovered that dialing the device down by 2 dBFS fixed that issue as well. He speculated that "The G6 is USB powered and likely doesn't have enough capacitance in its DC input to ride out the lasting peaks at low frequencies." The headphone amplifier measured great and the output was decent, but "there is no sensation of infinite power and you would be operating near or at max volume" when using the amp with a pair of Sennheiser HD-650 headphones. The last issue that irritated him was the lack of a properly working ASIO driver on Creative's website. He really liked the Sound BlasterX G6, and the review is full of charts and measurements conducted with a $28,300 Audio Precision APx555.

ADC Audio Measurements: I was pleased that feeding the G6 2 volts resulted in 0 dBFS, showing no overflow. Performance though is not all that great, with SINAD in the high 70s. We have lots of distortion components together with mains leakage. Compared to high-end products, we are short some 40 dB! Definitely not splitting hairs here.
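SINAD is simply the ratio of the test tone to everything else in the output (noise plus distortion), expressed in dB, which is why a figure like "112 dB at -2 dBFS" can swing so much with level. As a rough illustration of how such a number is derived from a captured sine wave (a simplified sketch, not Audio Science Review's methodology; they use the Audio Precision analyzer mentioned above):

```python
# Simplified sketch of a SINAD measurement from a captured sine wave.
# Not Audio Science Review's methodology; they use an Audio Precision APx555.
import numpy as np

def sinad_db(capture: np.ndarray, fs: float, f0: float) -> float:
    """SINAD = fundamental power / (noise + distortion) power, in dB."""
    windowed = capture * np.hanning(len(capture))
    spectrum = np.abs(np.fft.rfft(windowed)) ** 2
    freqs = np.fft.rfftfreq(len(capture), 1 / fs)
    # Bins near the test tone count as signal; everything else
    # (harmonics, mains hum, noise floor) counts as noise + distortion.
    signal_bins = np.abs(freqs - f0) < 20
    signal = spectrum[signal_bins].sum()
    residual = spectrum[~signal_bins].sum() - spectrum[0]  # exclude DC
    return 10 * np.log10(signal / residual)

# Synthetic example: a 1 kHz tone at -2 dBFS with a faint 2nd harmonic and noise.
fs, f0, n = 48_000, 1_000.0, 1 << 16
t = np.arange(n) / fs
capture = (10 ** (-2 / 20)) * np.sin(2 * np.pi * f0 * t) \
          + 1e-5 * np.sin(2 * np.pi * 2 * f0 * t) \
          + 1e-6 * np.random.randn(n)
print(f"SINAD ~= {sinad_db(capture, fs, f0):.1f} dB")
```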

Discussion
Posted by cageymaru March 21, 2019 7:32 PM (CDT)

Microsoft Announces Microsoft Defender ATP for Mac

Microsoft Defender ATP for Mac brings the same robust protection found in Windows Defender ATP to the Mac platform. Microsoft describes it as "cross-platform next-generation protection and endpoint detection and response coverage," a heterogeneous solution that will help Microsoft reach its goal of securing users and data wherever they are. Core components of Microsoft's unified endpoint security platform, including the new Threat & Vulnerability Management capabilities, will now be available for Mac devices. The cloud-delivered, real-time protection antivirus solution is currently in preview.

We've been working closely with industry partners to enable Windows Defender Advanced Threat Protection (ATP) customers to protect their non-Windows devices while keeping a centralized "single pane of glass" experience. Now we are going a step further by adding our own solution to the options, starting with a limited preview today. As we bring our unified security solution to other platforms, we're also updating our name to reflect the breadth of this expanded coverage: Microsoft Defender ATP.

Discussion
Posted by cageymaru March 21, 2019 3:06 PM (CDT)

Unreal Engine is Getting Destructible Environment Support

High-quality destructible environments seem like something that should be standard in 2019, but even today, they're a relatively rare thing to find in a game. Destruction was a headlining feature of Red Faction: Guerrilla back in 2009 and a prominent feature in the recently released Crackdown 3, but detailed, destructible environments are still absent from most releases. However, at GDC this year, Epic announced that they're integrating a destruction system into Unreal Engine. Given how popular the engine is, and how competitors will probably try to achieve feature parity, I expect to see more games with destructible environments in the near future.

Revealed onstage at GDC 2019 during "State of Unreal," Chaos is Unreal Engine’s new high-performance physics and destruction system coming in early access to Unreal Engine 4.23. The real-time tech demo is set within the world of Robo Recall. With Chaos, users can achieve cinematic-quality visuals in real time in scenes with massive-scale levels of destruction, with unprecedented artist control over content creation.

Discussion
Posted by alphaatlas March 21, 2019 11:47 AM (CDT)

Intel Previews Processors and Graphics Software at GDC 2019

At their GDC 2019 conference, Intel confirmed that they'll launch 9th generation mobile processors in the 2nd quarter of 2019. While 9th generation H-series and Y-series parts recently showed up on the EEC website, Intel told PC World that these parts are based on 14nm Coffee Lake silicon rather than 10nm Ice Lake. The company also mentioned that one of their goals with this release is "longer battery life" for gamers and more casual users alike, and they're promoting their Wi-Fi 6 capable AX200 chip and 3D XPoint memory alongside the new chips. Meanwhile, Intel also showed off a new software suite for their modern IGPs and (presumably) their future GPUs. The "Intel Graphics Command Center" is essentially their answer to Nvidia's GeForce Experience and AMD's Game Advisor, as it automatically scans your PC for supported games and applies the optimal settings for your current hardware. An "early access" version of the control panel is available on the Microsoft Store, and oddly enough, it says it was "released" on 11/26/2018. Unlike other app stores, the Microsoft Store doesn't log updates or list old changes, so it was presumably in some kind of closed alpha before being officially launched today.

We asked, you answered. You're tired of our 'old, boring, corporate-looking' Graphics Control Panel. We were too, and we designed a completely new one from the ground up! We're incorporating the changes you - the gamers, home theater enthusiasts, professionals, and everyday tinkerers - requested. Using a phased approach, we're rolling out something we're proud to share with you: introducing the Intel Graphics Command Center.

Discussion
Posted by alphaatlas March 21, 2019 9:54 AM (CDT)

Linux Gaming Across 9 Distros [Review in Progress]

Jason Evangelho of Forbes has started a Linux series in which he reviews various Linux distributions (distros) for ease of use and performance with regard to Linux gaming. Jason's series isn't just about running benchmarks; he asks the questions everyday users would need answered. Where am I going to get up-to-date graphics drivers for my AMD or NVIDIA graphics card? What is the default state of gaming on the distro? Can I get Steam working right out of the box, or am I going to have to tweak my system to accomplish this task? The 9 Linux distros he is testing in the series are: Fedora 29 Workstation, Pop!_OS 18.10, Debian 9, Solus 4, Manjaro 18, Linux Mint 19, elementary OS 5, Deepin 15.9, and Ubuntu 18.10. His test system consists of an AMD Ryzen 5 2600, a Sapphire Radeon RX 580, a Gigabyte G1 Gaming GTX 1080, and more. So far he has tested Fedora 29 Workstation and Pop!_OS 18.10, with Pop!_OS 18.10 winning hands down in usability and performance. With the recent announcement that Google is leveraging Linux, Vulkan, first-party games, and open-source AMD drivers for games running on its Stadia game streaming service, Linux gaming may enter our PC gaming world very soon!

If you're an NVIDIA user, good news: Pop!_OS has a separate installer image for you which automatically installs the proprietary (and far more performant) graphics driver. Again, there's no need to enable alternative software sources or hit the command line. The moment your OS is installed you're ready to start gaming. You'll be using the latest and greatest stable driver, Nvidia 418.43. Radeon gamers have an advantage across several Linux distributions: the open source driver is part of the kernel (and thus ready to use immediately), well maintained and quite performant. This typically means less steps to get up and running with Steam and Steam Proton. One distinct difference between Pop!_OS and Fedora, however, is that Fedora runs with a much newer MESA driver. Specifically, Fedora 29 uses MESA 18.3.4 while Pop uses MESA 18.2.8. The kernel on Pop is also a bit older, but again I noticed no disadvantage on the gaming side save for one: updating your kernel to 5.0 will add Freesync support which is a feature I can't live without. It is quite literally a game-changer.
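Much of the distro-to-distro difference in that excerpt comes down to which kernel, Mesa, and NVIDIA driver versions ship out of the box. If you want to see where your own system stands, a small (hypothetical) helper along these lines works on most distros; it assumes the standard uname and glxinfo tools (package mesa-utils) are installed, and nvidia-smi will only exist if the proprietary NVIDIA driver is present.

```python
# A small, hypothetical helper to report the versions the article compares:
# kernel, Mesa (via glxinfo from mesa-utils), and the proprietary NVIDIA driver.
import shutil
import subprocess

def run(cmd: list) -> str:
    return subprocess.run(cmd, capture_output=True, text=True).stdout.strip()

print("Kernel:", run(["uname", "-r"]))

if shutil.which("glxinfo"):
    for line in run(["glxinfo"]).splitlines():
        # On AMD/Intel the OpenGL version string includes the Mesa version.
        if "OpenGL version string" in line:
            print(line.strip())

if shutil.which("nvidia-smi"):
    print("NVIDIA driver:",
          run(["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"]))
```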

Discussion
Posted by cageymaru March 20, 2019 12:23 PM (CDT)

Nvidia Skips Ampere at GTC 2019

Several news outlets seem to think Nvidia's GTC presentation was relatively long-winded and unexciting this year. The three-hour keynote reportedly featured some software announcements and a low-power Jetson board, among other things, but didn't feature the 7nm Ampere GPUs many were expecting. EE Times says that the "unspoken message" at the presentation was that "Nvidia doesn't need to pre-announce a new and faster chip because it owns that software stack and channel today," and the emphasis on CUDA seemed to really drive that point home. However, in one of the more exciting parts of the presentation, Nvidia did highlight the Q2VKPT project we covered earlier this year. Nvidia's CEO seemed quite excited about the introduction of raytracing to Quake II, and they showed off some of the project's gameplay, which you can see below:

Presaging that future, Nvidia's chief scientist, Bill Dally, told reporters about a research project in optical chip-to-chip links. It targets throughput in terabits/second while drawing 2 picojoules/bit. In an initial implementation, 32 wavelengths will run at 12.5 Gbits/s each, with a move to 64 wavelengths doubling bandwidth in a follow-up generation. Dally predicted that copper links will start to run out of gas as data rates approach 100 Gbits/s, already on many roadmaps for network switches. Progress in more power-efficient laser sources and ring resonators will enable the long-predicted shift, he said. If the future evolves as he believes, bleeding-edge GPUs may continue to skip an appearance at some Nvidia events. Attendees will have to hope that as the interconnects speed up, the keynotes don't get even longer.
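To put those link numbers in perspective, here's a quick back-of-the-envelope calculation of the aggregate bandwidth and power implied by 32 wavelengths at 12.5 Gbit/s each and roughly 2 pJ of energy per bit (my arithmetic, not figures Nvidia quoted directly):

```python
# Back-of-the-envelope numbers for the optical chip-to-chip link described above.
wavelengths = 32
per_lane_gbps = 12.5        # Gbit/s per wavelength
energy_per_bit_pj = 2.0     # pJ of energy spent per transmitted bit

aggregate_gbps = wavelengths * per_lane_gbps                      # 400 Gbit/s
power_watts = energy_per_bit_pj * 1e-12 * aggregate_gbps * 1e9    # ~0.8 W

print(f"Aggregate bandwidth: {aggregate_gbps:.0f} Gbit/s "
      f"(doubling to {wavelengths * 2} wavelengths gives {2 * aggregate_gbps:.0f} Gbit/s)")
print(f"Link power at {energy_per_bit_pj} pJ/bit: {power_watts:.2f} W")
```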

Discussion
Posted by alphaatlas March 20, 2019 11:25 AM (CDT)

Digital Foundry Analyzes Google's Stadia Platform

Following Google's "Stadia" game streaming service announcement yesterday, Digital Foundry decided to take a closer look at the hardware behind the platform. Google says they use a "Custom 2.7GHz hyper-threaded x86 CPU with AVX2 SIMD and 9.5MB L2+L3 cache," and while they didn't mention the vendor, DF notes that they haven't seen such a configuration in any of AMD's currently shipping server CPUs, and that it should significantly outpace anything found in a modern console. Meanwhile, the GPU largely resembles a Vega 56 card with 16GB of HBM2, and the games are reportedly loaded from an SSD. Through their own testing, DF came away impressed with the platform's consistent frame pacing, and in some cases, total latency is on par with locally-run games on a console or PC.

Google has also demonstrated scalability on the graphics side, with a demonstration of three of the AMD GPUs running in concert. Its stated aim is to remove as many of the limiting factors impacting game-makers as possible, and with that in mind, the option is there for developers to scale projects across multiple cloud units: "The way that we describe what we are is a new generation because it's purpose-built for the 21st century," says Google's Phil Harrison. "It does not have any of the hallmarks of a legacy system. It is not a discrete device in the cloud. It is an elastic compute in the cloud and that allows developers to use an unprecedented amount of compute in support of their games, both on CPU and GPU, but also particularly around multiplayer."

Discussion
Posted by alphaatlas March 20, 2019 9:14 AM (CDT)