
Computing Hardware Articles & Tutorials

Nvidia's Supercomputing CPU Puts Intel Under Pressure

posted on Jul 27, 2021
tags: computing computing hardware

An already-beleaguered Intel is now facing more fire, thanks to Nvidia announcing last Monday that it will start making CPUs for facilities like data centers and supercomputers.

Named Grace (after computer science pioneer Grace Hopper), Nvidia’s CPU aims to make a splash in the high-performance computing space. Nvidia says Grace is set to launch in 2023, but Los Alamos National Laboratory in the US and the Swiss National Supercomputing Centre have already announced plans for Grace-based supercomputers.

Nvidia is famed as a maker of graphics cards; certainly, PC gaming and cryptocurrency mining are the company’s bread-and-butter. But Nvidia technology already powers a whole range of high-performance, GPU-intensive applications—including AI, finance, imaging, genome sequencing, signal processing, and computational fluid dynamics.

“It’s specialty applications, really,” says Douglas Lyon, a professor of computer engineering at Fairfield University in Connecticut. “They’re killer apps,... read more

A Hacker's Nightmare: Programmable Chips Secured by Chaos

posted on Jul 27, 2021
tags: computing computing hardware

Not all chaos is bad. In the case of programmable chips, chaos can be harnessed to create a unique digital identifier. Think of it as a fingerprint—only, instead of distinctive loops and whorls on human skin, this is about the unique physical variations present in silicon chips.

These minute differences, often at an atomic level, vary from chip to chip, even if they are manufactured together. The physical variations can be amplified using a technology called physically unclonable functions (PUFs) to create a signature that is unique to a specific chip. 

By putting many PUF cells together, it is possible to create a random number of arbitrary length. Even though it is random, it is still unique to a particular instance of the chip. More importantly, the identifier does not need to be saved on the chip, as it can be generated only when required for authentication and immediately erased. Therefore, PUFs can potentially be used in smart cards and secure IDs, to track goods in supply chai... read more
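The idea of regenerating an identifier on demand rather than storing it can be sketched in a few lines. This is purely illustrative: it models each chip's atomic-scale manufacturing variation as a hidden per-chip constant, whereas a real PUF derives bits from physical circuit behavior (for example, SRAM power-up values or ring-oscillator frequency differences), not from software hashing.

```python
import hashlib

# Illustrative sketch of the PUF concept (not a real PUF): each chip's
# manufacturing variation acts like a hidden per-chip constant. A real
# PUF derives bits from physical behavior; here a hash stands in for it.

def puf_response(chip_variation: bytes, challenge: bytes, n_bits: int) -> str:
    """Derive an n-bit identifier from the chip's physical 'fingerprint'.

    The response is regenerated on demand and never stored on the chip.
    """
    digest = b""
    counter = 0
    while len(digest) * 8 < n_bits:
        digest += hashlib.sha256(
            chip_variation + challenge + bytes([counter])
        ).digest()
        counter += 1
    bits = "".join(f"{byte:08b}" for byte in digest)
    return bits[:n_bits]

chip_a = puf_response(b"chip-A-variation", b"auth-challenge", 128)
chip_b = puf_response(b"chip-B-variation", b"auth-challenge", 128)
assert chip_a != chip_b  # different chips yield different identifiers
assert chip_a == puf_response(b"chip-A-variation", b"auth-challenge", 128)  # reproducible
```

The two properties checked at the end are the ones the article highlights: the identifier is unique to the chip, yet reproducible whenever authentication requires it.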

Amazon's New Quantum Computer Design Relies On Tiny Schrödinger's Cats

posted on Jul 26, 2021
tags: computing computing hardware

Schrödinger’s cat is a thought experiment in which a quantum event suspends a cat in a box in a nebulous state between life and death: the cat only definitely becomes alive or dead when someone looks in the box. Now Amazon has unveiled a new theoretical blueprint for a quantum computer based on combining hardware versions of lots of Schrödinger’s cats.

Classical computers switch transistors either on or off to symbolize data as ones or zeroes, while quantum computers use quantum bits—qubits—that, because of the surreal nature of quantum physics, can exist in a state of superposition where they are both 1 and 0 at the same time. This essentially lets each qubit perform two calculations simultaneously.

If two qubits are quantum-mechanically linked, or entangled, they can help perform 2^2 or four calculations simultaneously; three qubits, 2^3 or eight calculations; and so on. In theory, a quantum computer with 300 qubits could perform more calculations in an instant than there are a... read more
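The scaling described above is easy to make concrete: an n-qubit register is described by 2^n complex amplitudes, which is exactly why classical simulation breaks down as n grows. A minimal sketch (illustrative only, not tied to Amazon's cat-qubit design):

```python
# The state of an n-qubit register is described by 2**n complex
# amplitudes -- the exponential growth that gives quantum computers
# their potential power, and makes them hard to simulate classically.

def amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes describing n qubits."""
    return 2 ** n_qubits

for n in (2, 3, 10, 300):
    print(f"{n:3d} qubits -> {amplitudes(n):.3e} amplitudes")
```

At 300 qubits, 2^300 is roughly 2 × 10^90, which already exceeds the commonly cited estimate of about 10^80 atoms in the observable universe.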

The Next Generation of Qubit Control: SHFSG Signal Generator Launch Event

posted on Jul 25, 2021
tags: computing computing hardware sponsored

Would you like to achieve high-fidelity control of superconducting qubits? Join Zurich Instruments' SHFSG Signal Generator launch event to find out how.

... read more

Understanding BLDC Motor Control Algorithms

posted on Jul 24, 2021
tags: computing computing hardware sponsored

Read this ebook, featuring animated examples, to learn about the control algorithms to drive a brushless DC (BLDC) motor.

... read more

How Close Is Ordinary Light To Doing Quantum Computing?

posted on Jul 24, 2021
tags: computing computing hardware

Using just a simple laser of the sort found in undergraduate optics labs, physicists may be able to perform some of the same calculations as a hard-to-handle quantum computer operating at ultracold temperatures.

The trick is to use classically entangled light, a phenomenon that has some of the same properties as the traditional entanglement spun out of quantum mechanics. Researchers Yijie Shen from Tsinghua University, Beijing, China, and the University of Southampton, UK, and Andrew Forbes from the University of the Witwatersrand, Johannesburg, South Africa, showed they could create a light beam with multiple entanglements in a recent paper in the journal Light: Science & Applications. And it’s all done with mirrors.

“Although it's always spoken about in the quantum world, the idea of entanglement is actually not unique to quantum mechanics,” says Forbes, a professor of physics at the Witwatersrand and leader of the research.

In the quantum realm, entanglement means that two particles—ele... read more

Cloud Computing’s Coming Energy Crisis

posted on Jul 22, 2021
tags: computing computing hardware

How much of our computing now happens in the cloud? A lot. Providers of public cloud services alone take in more than a quarter of a trillion U.S. dollars a year. That’s why Amazon, Google, and Microsoft maintain massive data centers all around the world. Apple and Facebook, too, run similar facilities, all stuffed with high-core-count CPUs, sporting terabytes of RAM and petabytes of storage.

These machines do the heavy lifting to support what’s been called “surveillance capitalism”: the endless tracking, user profiling, and algorithmic targeting used to distribute advertising. All that computing rakes in a lot of dollars, of course, but it also consumes a lot of watts: Bloomberg recently estimated that about 1 percent of the world’s electricity goes to cloud computing.

That figure is poised to grow exponentially over the next decade. Bloomberg reckons that, globally, we might exit the 2020s needing as much as 8 percent of all electricity to po... read more
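A quick back-of-envelope calculation shows the compound annual growth rate that jump implies, assuming roughly nine years of compounding from the article's 2021 vantage point to the end of the decade (the 1 percent and 8 percent shares are the Bloomberg estimates quoted above):

```python
# Back-of-envelope: what annual growth rate takes cloud computing's
# share of world electricity from ~1% to ~8% by the end of the 2020s?
# Assumes ~9 years of compounding (2021 -> 2030); shares are the
# Bloomberg estimates quoted in the article.

start_share, end_share, years = 0.01, 0.08, 9
annual_growth = (end_share / start_share) ** (1 / years) - 1
print(f"Implied annual growth: {annual_growth:.1%}")  # -> Implied annual growth: 26.0%
```

An eightfold rise in share over nine years works out to roughly 26 percent compound growth per year, which puts the word "exponentially" in context.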

Moving Chips Closer to Cold Qubits

posted on Jul 21, 2021
tags: computing computing hardware

Three of the biggest companies making quantum computers today—Google, Intel, and Microsoft—are betting on supercooled devices operating at close to absolute zero. But there’s a problem: These cold cathedrals of quantum computing cannot tolerate the extra heat given off by the conventional computing chips that control them.

This means the classical and quantum-­computing components must be separated, despite their marriage by design. The control chips usually reside at room temperature on top of the quantum-computing stack, while the quantum bits (qubits) remain in the coldest depths of dilution refrigerators. The dilution fridges use a mixture of helium-3 and helium-4 isotopes to supercool the environment, lowering temperatures from a baseline of 4 kelvins (–269.15 ºC) at the top to about 10 millikelvins at the bottom.

Cables running up and down the hardware stack connect each qubit with its control chip and other conventional computing components higher up... read more

Google's Quantum Computer Exponentially Suppresses Errors

posted on Jul 19, 2021
tags: computing computing hardware

In order to develop a practical quantum computer, scientists will have to design ways to deal with the errors that will inevitably pop up in its performance. Now Google has demonstrated that exponential suppression of such errors is possible, in experiments that may help pave the way for scalable, fault-tolerant quantum computers.

A quantum computer with enough components known as quantum bits or "qubits" could in theory achieve a "quantum advantage" allowing it to find the answers to problems no classical computer could ever solve.

However, a critical drawback of current quantum computers is the way in which their inner workings are prone to errors. Current state-of-the-art quantum platforms typically have error rates near 10^-3 (or one in a thousand), but many practical applications call for error rates as low as 10^-15.
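To see why "exponential suppression" matters for closing the gap between 10^-3 and 10^-15, consider the standard error-correction scaling (a textbook approximation, not a figure from Google's experiment): below a threshold physical error rate p_th, the logical error rate falls roughly as (p/p_th)^((d+1)/2), where d is the code distance. Each increase in distance multiplies the suppression.

```python
# Illustrative scaling (standard approximation from the quantum
# error-correction literature, not data from Google's experiment):
# below threshold, the logical error rate falls exponentially in the
# code distance d:  p_L ~ (p / p_th) ** ((d + 1) // 2).

def logical_error_rate(p: float, p_th: float, d: int) -> float:
    """Approximate logical error rate for code distance d."""
    return (p / p_th) ** ((d + 1) // 2)

# With physical errors at 10^-3 and a nominal threshold of 10^-2,
# each step up in distance buys more orders of magnitude:
for d in (3, 7, 11, 29):
    print(f"d={d:2d}: ~{logical_error_rate(1e-3, 1e-2, d):.0e}")
```

Under these illustrative numbers, a distance-29 code would reach the 10^-15 regime that practical applications demand, starting from hardware that errs once in a thousand operations.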

In addition to building qubits that are physically less prone to mistakes, scientists hope to compensate for high error rates using stabilizer codes. This st... read more

EXCLUSIVE: Google Snaps Up Network-on-Chip Startup Provino

posted on Jul 18, 2021
tags: computing computing hardware

Google has quietly acquired Provino Technologies, a start-up developing network-on-chip (NoC) systems for machine learning, an IEEE Spectrum investigation has discovered.

The latest processors for AI applications are home to thousands—or even hundreds of thousands—of cores, each of which needs to move vast amounts of data.

NoC technologies could accelerate communications on such “many-core” chips by replacing traditional buses and direct wires with an architecture familiar from large computer networks and the Internet, based on routers directing packets of data.
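As a concrete illustration of router-based packet delivery, here is a sketch of dimension-ordered (XY) routing on a 2-D mesh, a common textbook NoC policy; the article does not say which routing scheme Provino's technology uses.

```python
# Minimal sketch of dimension-ordered (XY) routing on a 2-D mesh NoC
# (illustrative; the routing scheme is an assumption, not Provino's).
# A packet first travels along X to the destination column, then along
# Y to the destination row -- a simple, deadlock-free policy.

def xy_route(src: tuple, dst: tuple) -> list:
    """Return the list of (x, y) router hops from src to dst."""
    x, y = src
    dx, dy = dst
    path = [(x, y)]
    while x != dx:                 # route in X first
        x += 1 if dx > x else -1
        path.append((x, y))
    while y != dy:                 # then route in Y
        y += 1 if dy > y else -1
        path.append((x, y))
    return path

print(xy_route((0, 0), (2, 1)))
# -> [(0, 0), (1, 0), (2, 0), (2, 1)]
```

Because every router makes the same local decision, packets from many cores can be in flight simultaneously, which is the parallelism advantage over a shared bus that the quote below describes.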

“Technology for communication simply hasn’t improved in the same way as that for computation,” says Md Farhadur Reza, assistant professor of Computer Science at the University of Central Missouri. “NoC’s decentralized architecture can have applications running multiple tasks in parallel and communicating with each other at the same time. And that means performance will improve, throughput will improve, and your wires will be... read more