Articles


Intel Delivers First Exascale Supercomputer to Argonne National Laboratory

Intel Corporation and Cray Inc. have announced that a Cray "Shasta" system will be the first U.S. exascale supercomputer. This $500 million Aurora supercomputer will be coming to the U.S. Department of Energy's Argonne National Laboratory in 2021 and will have a performance of one exaFLOP - a quintillion floating point operations per second. In addition, this system is designed to enable the convergence of traditional HPC, data analytics, and artificial intelligence -- at exascale. The program contract is valued at more than $100 million for Cray, one of the largest contracts in the company's history. The design of the Aurora system calls for 200 Shasta cabinets, Cray's software stack optimized for Intel architectures, Cray Slingshot interconnect, as well as next generation Intel technology innovations in compute processor, memory and storage technologies. Intel's Rajeeb Hazra detailed some of the futuristic technology coming to Aurora including a future generation Intel Xeon Scalable processor, the recently announced Intel Xe compute architecture, and Intel Optane DC persistent memory. "Today is an important day not only for the team of technologists and scientists who have come together to build our first exascale computer -- but also for all of us who are committed to American innovation and manufacturing," said Bob Swan, Intel CEO. "The convergence of AI and high-performance computing is an enormous opportunity to address some of the world's biggest challenges and an important catalyst for economic opportunity."

The Aurora system's exaFLOP of performance -- equal to a "quintillion" floating point computations per second -- combined with an ability to handle both traditional high-performance computing (HPC) and artificial intelligence (AI) will give researchers an unprecedented set of tools to address scientific problems at exascale. These breakthrough research projects range from extreme-scale cosmological simulations to new approaches for drug response prediction and the discovery of materials for more efficient organic solar cells. The Aurora system will foster new scientific innovation and usher in new technological capabilities, furthering the United States' scientific leadership position globally.
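For a sense of the scale involved, a rough back-of-the-envelope comparison may help; the workload size and the desktop throughput below are illustrative assumptions, not figures from the announcement.

    # Rough comparison of exascale vs. desktop throughput (illustrative numbers only).
    EXAFLOPS = 1e18        # Aurora's target: one quintillion floating-point operations per second
    DESKTOP_FLOPS = 1e12   # assumed ~1 teraFLOPS for an ordinary desktop CPU (assumption)

    workload_ops = 1e21    # hypothetical simulation requiring 10^21 floating-point operations

    print(f"Aurora:  ~{workload_ops / EXAFLOPS / 60:.0f} minutes")
    print(f"Desktop: ~{workload_ops / DESKTOP_FLOPS / (86400 * 365):.0f} years")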

Discussion
Posted by cageymaru March 18, 2019 3:24 PM (CDT)

Samsung Launches 12GB Smartphone Memory Packages

Samsung just announced what it claims to be the world's highest-capacity mobile DRAM package in production. The Korean company's new LPDDR4X modules combine six 16-gigabit, "10nm-class" DRAM ICs into a package that's 1.1 millimeters tall, allowing manufacturers to stuff just as much RAM as the desktop I'm typing this on into razor-thin phones. Samsung also says the module can hit transfer rates of up to 34.1GB per second, and claims that power consumption is only minimally increased in spite of the dramatic capacity boost. Thanks to cageymaru for the tip.

Since introducing 1GB mobile DRAM in 2011, Samsung continues to drive capacity breakthroughs in the mobile DRAM market, moving from 6GB (in 2015) and 8GB (2016) to today's first 12GB LPDDR4X. From its cutting-edge memory line in Pyeongtaek, Korea, Samsung plans to more than triple the supply of its 1y-nm-based 8GB and 12GB mobile DRAM during the second half of 2019 to meet the anticipated high demand.
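As a quick sanity check on the capacity math, six 16-gigabit dies do indeed add up to 12GB; a minimal sketch using the figures quoted in the article:

    # Capacity math for Samsung's 12GB LPDDR4X package, using the article's figures.
    dies_per_package = 6        # six "10nm-class" DRAM ICs per package
    gigabits_per_die = 16       # 16-gigabit dies
    total_gigabits = dies_per_package * gigabits_per_die
    capacity_gigabytes = total_gigabits / 8   # 8 bits per byte
    print(f"{total_gigabits} Gb total = {capacity_gigabytes:.0f} GB per package")   # 96 Gb = 12 GB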

Discussion
Posted by alphaatlas March 14, 2019 10:43 AM (CDT)

Old-School: Half-Life Running on a Quantum3D Mercury Brick

Classic game, classic hardware: [H]ardForum member TheeRaccoon is one of the lucky few to get his hands on a Quantum3D Mercury "brick," which comprises four Quantum3D Obsidian2 200SBi video boards. As The Dodge Garage explains, these were generally used for multi-channel visual simulation and training applications back in the day, but as TheeRaccoon’s video proves, they can also run a certain Valve shooter just fine. Thanks for the share, erek.

After a little over a year of ownership, I finally present to you the legendary Quantum3D Mercury brick up and running! (Don't mind my ghetto homemade passthrough cable.) In this brick configuration, there are 8 Voodoo 2 chipsets in SLI! (Each 200SBi board has two Voodoo 2 chipsets in SLI mode.) These bricks were mostly used for military simulation in the late 90's/early 2000's. The image generated by each 200SBi board is combined into one image, giving you 4 tap rotated grid full scene anti-aliasing.
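The "4 tap rotated grid" anti-aliasing TheeRaccoon describes amounts to each board rendering the scene at a slightly different sub-pixel offset, with the combiner averaging the four frames into the final image. A minimal sketch of that idea follows; the offsets and frame size are illustrative, not the actual Quantum3D sample pattern.

    import numpy as np

    # Four sub-pixel offsets on a rotated grid (illustrative pattern, not the exact
    # offsets the Mercury brick used). Each Obsidian2 200SBi board renders the scene
    # jittered by one of these offsets.
    OFFSETS = [(0.125, 0.375), (0.375, -0.125), (-0.125, -0.375), (-0.375, 0.125)]

    def render(offset):
        # Stand-in for one board's output: a dummy 480x640 RGB frame.
        rng = np.random.default_rng(abs(hash(offset)) % 2**32)
        return rng.random((480, 640, 3))

    # The brick combines the four per-board images into one anti-aliased frame
    # by averaging them.
    frames = [render(o) for o in OFFSETS]
    combined = np.mean(frames, axis=0)
    print(combined.shape)   # (480, 640, 3)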

Discussion
Posted by Megalith March 10, 2019 4:35 PM (CDT)

Intel Spotlights Quantum Computing Research

At the American Physical Society meeting in Boston this year, Intel is highlighting several of the advances they've made in the field of quantum computing. Between Monday and Thursday this week, Intel will hold several talks on topics ranging from 49-qubit processors and the testing of qubits on 300mm wafers to bottlenecks that quantum computers could solve, and even a technique for cramming more quantum bits into a smaller area with through-silicon vias, the same technology used to connect HBM memory stacks to underlying interposers. But what stood out to me most was Intel's focus on commercialization. Many research papers on quantum computing focus on the theoretical, and on what could be built at some unspecified time in the future once the engineering hurdles are worked out. Intel has their fair share of theoretical work too, but several of their talks focus on mass producing and testing quantum computers on 300mm wafers, which represents a huge step towards making quantum computing more affordable. In an interview with IEEE Spectrum, Intel's Rich Uhlig reiterated this point, claiming that the chip company is focusing on technology for commercial products rather than trying to build demonstrators with big qubit counts.

At Intel, we're focused on developing a commercially viable quantum computer, which will require more than the qubits themselves. We have successfully manufactured a 49-qubit superconducting chip, which allows us to begin integrating the quantum processing unit (the QPU) into a system where we can build all of the components that will be required to make the qubits work together in tandem to improve efficiency and scalability. Instead of focusing on the hype of qubit count, we are working to create a viable quantum system that will scale from 50 qubits to the millions of qubits that will be required for a commercial system.

Discussion
Posted by alphaatlas March 04, 2019 11:39 AM (CST)

Old BioWare Has Become "A Distant Memory"

Despite lead producer Michael Gamble's claim that the studio isn't shutting down and continues to get "great support," concerns about BioWare's future appear to be growing following Anthem's negative reception. USgamer and PC Gamer have both published articles reminiscing over the developer's legendary past and how it could recapture its former glory, but while both agree the studio should simply return to the basics (i.e., making single-player RPGs), its current obligations to huge blockbusters mean it may never get that chance. Anthem's physical sales were only half of Mass Effect Andromeda's, according to UK charts.

...the success of Mass Effect feels more and more like a poisoned chalice. It propelled BioWare to undreamed of success, but it also robbed it of its soul. It's hard to imagine it ever returning to the heights of Baldur's Gate 2, when BioWare was an independent PC developer catering to a limited but ferociously loyal audience. Anthem is the natural endpoint of a process that began more than a decade ago, when BioWare decided its traditional approach was incompatible with large-scale success.

Discussion
Posted by Megalith March 02, 2019 4:05 PM (CST)

Surf the Web Like It's 1990: CERN Rebuilds WorldWideWeb, the First Web Browser

There's no better way to celebrate the 30th anniversary of the development of WorldWideWeb than by firing up the original 1990 browser, and CERN has made that easy by rebuilding the first web browser and editor as an in-browser app (cue the Xzibit "Yo Dawg" meme). Once upon a time, users had to double-click on links.

In December 1990, an application called WorldWideWeb was developed on a NeXT machine at The European Organization for Nuclear Research (known as CERN) just outside of Geneva. This program - WorldWideWeb - is the antecedent of most of what we consider or know of as "the web" today. In February 2019, in celebration of the thirtieth anniversary of the development of WorldWideWeb, a group of developers and designers convened at CERN to rebuild the original browser within a contemporary browser, allowing users around the world to experience the rather humble origins of this transformative technology.

Discussion
Posted by Megalith February 24, 2019 4:50 PM (CST)

Nokia 9 PureView: The World's First Quintuple Camera Smartphone

Because one, two, three, or even four cameras aren't enough, Nokia has announced a new PureView phone that boasts five rear sensors: "ZEISS optics on the back, two of which are RGB and the other three - monochrome." These work in unison, capturing lighting, detail, and color data that is stitched into a single 12MP image. "According to HMD, taking a single photo results in at least 60 MP of imaging data being processed."

This is made possible with a dedicated image co-processor, which helps out with the heavy task. Every picture is HDR, can have up to 12.4 stops of dynamic range, and ends up with a full scene 12 MP depth map. Yes, this means that you will be able to use a depth editor to specifically adjust bokeh effects after the photo has already been taken. It'll even allow you to tweak colors and contrast between different depth fields. And yes, it can shoot RAW.
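HMD's "at least 60 MP of imaging data" figure appears to follow directly from five 12 MP sensors firing at once; a trivial check, with the sensor count and resolution taken from the article:

    # Imaging data per shot on the Nokia 9 PureView, per the figures above.
    sensors = 5                   # two RGB + three monochrome
    megapixels_per_sensor = 12
    total_megapixels = sensors * megapixels_per_sensor
    print(f"{total_megapixels} MP of imaging data per shot")   # 60 MP, matching HMD's claim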

Discussion
Posted by Megalith February 24, 2019 11:25 AM (CST)

SK Hynix to Spend $107 Billion on Four New Memory Chip Factories

SK Hynix has announced that it is building four new memory chip plants that will cost $107 billion. Construction of the plants will begin in 2022 at a 4.5 million square meter site south of Seoul. SK Hynix is also expected to invest $49 billion into two existing plants. Next-generation chips and DRAM are expected to be manufactured at the sites. Even though the memory market is in a downturn now, SK Hynix is preparing for cutting-edge technologies such as 5G and self-driving vehicles.

"Though there is not enough chip demand for autonomous cars now, I believe there will be much more demand for self-driving vehicles in the next 10 years or as early as in 2023 or 2024," said analyst Kim Young-gun at Mirae Asset Daewoo. "That will create more chip demand for SK Hynix," as will the commercialization of 5G networks over the next few years, Kim said.

Discussion
Posted by cageymaru February 21, 2019 6:49 PM (CST)

Plans for First Chinese Solar Power Station in Space Revealed

The Chinese are reportedly working on experimental solar power stations that would be launched into space to generate electricity. Planned for launch as early as 2021, they will provide "an inexhaustible source of clean energy for humans," reliably supplying energy "99 percent of the time, at six-times the intensity of solar farms on earth." The final plan is a Megawatt-level space solar power station for 2030.

Pang said technical challenges to be overcome include the weight of a power station, expected to be 1000 tonnes, greater than 400 tonnes of the International Space Station. Researchers are examining whether a space factory using robots and 3D printing technology could construct the power station in space, avoiding the need to launch a heavy structure from earth. Solar energy would be converted to electricity and a microwave or laser beam would transmit the energy to earth.

Discussion
Posted by Megalith February 16, 2019 3:40 PM (CST)

Tesla Model X Receives Industry-First Perfect Crash-Test Rating for an SUV

The Tesla Model X has become the first SUV to receive a perfect safety rating from the National Highway Traffic Safety Administration (NHTSA). SUVs are generally safer than cars due to their increased size, but a higher center of gravity makes the vehicles prone to rolling over during tight maneuvering or a side impact. According to statistics, "rollovers happened in 1% of serious crashes in passenger vehicles but accounted for one-third of collision-related deaths." The Tesla Model X was designed with a lower center of gravity thanks to its large battery pack located in the floor of the vehicle. This design choice, along with a larger front crumple zone, allowed the vehicle to receive a perfect rating from the NHTSA.

In other Tesla news, Dog Mode has been enabled on Tesla vehicles. It displays the cabin temperature on the car's screen so that passersby know they don't need to worry about the animals inside. Tesla also added a Sentry Mode to its cars. When customers enable Sentry Mode, the electric vehicle constantly monitors its environment from a "Standby" state, much like a home security system. When the vehicle detects a break-in in progress, it will start recording the event with its cameras, sound the car alarm, blast loud music through the stereo system, and warn the owner of the incident via the Tesla mobile app. Video of the incident can be downloaded to a USB drive.

If a minimal threat is detected, such as someone leaning on a car, Sentry Mode switches to an "Alert" state and displays a message on the touchscreen warning that its cameras are recording. If a more severe threat is detected, such as someone breaking a window, Sentry Mode switches to an "Alarm" state, which activates the car alarm, increases the brightness of the center display, and plays music at maximum volume from the car's audio system. If a car switches to "Alarm" state, owners will also receive an alert from their Tesla mobile app notifying them that an incident has occurred. They'll be able to download a video recording of an incident (which begins 10 minutes prior to the time a threat was detected) by inserting a formatted USB drive into their car before they enable Sentry Mode.
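Tesla has not published how Sentry Mode is implemented, but the description above maps onto a simple threat-level state machine. The sketch below is hypothetical: the state names come from the article, while the threat classification and actions are illustrative.

    from enum import Enum

    # Hypothetical sketch of Sentry Mode's Standby -> Alert -> Alarm behavior,
    # based only on the description quoted above.
    class SentryState(Enum):
        STANDBY = "Standby"   # monitoring the surroundings, no threat detected
        ALERT = "Alert"       # minimal threat: touchscreen warns that cameras are recording
        ALARM = "Alarm"       # severe threat: car alarm, bright display, max-volume audio, app alert

    def next_state(threat: str) -> SentryState:
        if threat == "severe":      # e.g. someone breaking a window
            return SentryState.ALARM
        if threat == "minimal":     # e.g. someone leaning on the car
            return SentryState.ALERT
        return SentryState.STANDBY

    for event in ("none", "minimal", "severe"):
        print(event, "->", next_state(event).value)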

Discussion
Posted by cageymaru February 14, 2019 8:52 AM (CST)

Digitimes Claims that Memory is Cheap, and that AMD is Gaining Market Share

Citing industry sources, a recent report from Digitimes claims that prices for 1TB "gaming" SSDs have fallen more than 50% since 2018. According to their data, a 1TB SSD used to cost 10,000 New Taiwan Dollars (about $324 USD), whereas drives are going for NT$3,000-5,000 ($97-$160) today. This seemingly lines up with historical price data, as a 1TB Samsung 860 EVO, for instance, dropped from $330 in January 2018 to $128 in December. PCPartPicker's 1TB SSD chart paints a fuzzier picture, as the "average" selling price for SSDs is much higher than the list price of the most common drives, but even the average has fallen significantly since early 2018. Digitimes also claims that 4GB and 8GB memory modules have fallen to $30 and $60, respectively. According to PCPartPicker's DDR4 charts, that's nearly half the price they were going for about a year ago, and Digitimes believes that memory prices should drop even more in the first quarter of 2019.
Meanwhile, the publication's sources also claim that AMD has a 17% share of the "gaming market," whatever that means. AMD themselves recently highlighted Mercury Research's market share numbers, which claim that AMD has a 15.8% and 12.1% slice of the desktop and notebook markets, respectively, but those numbers only represent Q4 2018.

Despite growing adoption of NAND flash chips in SSDs, the prices will see a sequential drop of over 15% in the first quarter of 2019 thanks to weakening total bit demand for PC-use SSDs and SSD price falls... The sources continued that shipments of gaming DDR4 high-spec modules have been affected by the shortages of Intel CPUs, driving gaming consumers to turn to AMD platforms. This has pushed up AMD's share of the gaming market to 17% and increased shipments of DDR4 3200 MHz and under modules, compared to a slowdown in shipments of 3600-4000 MHz modules.
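Working the quoted price points through gives a feel for the size of the drops; the figures below come straight from the article, and the percentages are simple arithmetic.

    # Percentage drops implied by the price points in the article above.
    def pct_drop(old, new):
        return 100 * (old - new) / old

    print(f"1TB gaming SSD, NT$10,000 -> NT$3,000-5,000: "
          f"{pct_drop(10000, 5000):.0f}-{pct_drop(10000, 3000):.0f}% cheaper")
    print(f"Samsung 860 EVO 1TB, $330 -> $128: {pct_drop(330, 128):.0f}% cheaper")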

Discussion
Posted by alphaatlas February 14, 2019 8:40 AM (CST)

Apex Legends Hits 25 Million Players and 2 Million Concurrent Peak in Its First Week

Vince Zampella of Respawn Entertainment has announced that Apex Legends has surpassed the 25 million player mark in its first week. The peak number of concurrent players hit 2 million over the weekend. He also announced that Season One starts in March and will bring a Battle Pass as well as new Legends, weapons, loot, and more. Events planned in the near future include the Twitch Rivals Apex Legends Challenge, which starts on February 12, 2019 and resumes the following Tuesday, February 19; 48 of Twitch's biggest personalities will compete on those days. Valentine's Day will bring limited-time, appropriately themed loot drops.

What a week! Since we launched Apex Legends last week on Monday we've seen the creation of an Apex Legends community that is excited, thriving, and full of great feedback and ideas. Our goal is to build this game with you, our community, so keep giving us your feedback because we really are listening. From everyone here at Respawn, thank you. The community's excitement for Apex Legends is electric, and we feel it here at the studio. We couldn't have gotten where we are without you and look forward to having you on this journey with us.

Discussion
Posted by cageymaru February 11, 2019 5:57 PM (CST)