Author: Blaze Woodard

Blaze Woodard, an editor at cubox-i.com, is currently an intern at a Silicon Valley technology company while majoring in politics at the University of Kansas. Blaze, who identifies as both a policy thinker and a self-described tech geek, offers a viewpoint on hardware and computing coverage that few editors in this field can match: the capacity to relate the workings of a circuit board to the larger political, regulatory, and social forces influencing the technology sector. Even though her academic path led her to political science, her early fascination with technology persisted. She writes about computing, AI, and hardware with the zeal of someone who truly loves the subject, not as someone assigned to cover it. Blaze plays soccer and spends her free time with friends, which is exactly how a college student should live outside the office and newsroom.


A refrigerator the size of a small car hums at temperatures lower than deep space somewhere in an upstate New York laboratory. It contains a fingernail-thin chip that, until recently, sounded like science fiction. It is solving a class of problems that the world's top deep learning models have been quietly failing at for years. Researchers don't say much as they pass by. They've mastered the art of not overselling. Spend enough time with them, though, and you notice the atmosphere has changed.

Topic: Quantum Computing vs. Deep Learning
Core Principle: Superposition, entanglement, and qubit-based parallelism
Primary Competitor Paradigm: Classical deep learning on GPUs
Notable…

Read More

The moment someone brings up automation in a meeting, you can sense the nervous energy that permeates offices these days. People take a half-second longer than usual to look at their laptops. Someone makes a joke about being replaced. Everyone laughs a bit too fast and moves on. The joke might no longer be a joke. This anxiety is quietly fueling one of the most peculiar credential booms in recent memory. Nearly 8 out of 10 American adults say they're interested in learning AI, but the majority still don't know where to begin, according…

Read More

When the Nasdaq is flat and the equal-weight S&P is quietly outperforming the headline index by multiple points, a certain silence hangs over the trading floors. You can hear it in the conversations as you pass any Midtown coffee shop in the morning: analysts hedging, portfolio managers shrugging, and retail traders checking the same five tickers. The trade in AI has not collapsed. It has simply grown more intricate. Most of Wall Street hasn't yet bothered to chronicle the slower, stranger second story that is emerging beneath it. It's the convergence of AI…

Read More

The recent movement of quantum computing stocks has a familiar feel to it. The kind of familiarity that causes seasoned investors to recline in their seats, let out a slow breath, and grab their notebooks. Shares of IonQ, Rigetti Computing, and D-Wave Systems increased by 72%, 37%, and 56%, respectively, between April 9 and April 20. Seven trading sessions. That's all. Numbers like these typically have a backstory, and this one carries too many reminders of previous manias to be disregarded.

Quick Reference: Quantum Computing Rally Snapshot
Period of Surge: April 9 – April 20 (seven trading sessions)
IonQ (IONQ) Gain: 72%
Rigetti Computing…

Read More

When you read Apple’s own description of Private Cloud Compute, the first thing you notice is how different it sounds from Apple. Almost overnight, the company that is known for keeping quiet about its supply chain has begun discussing server boards, tamper switches, and high-resolution imaging in a manner more akin to that of a defense contractor than a consumer electronics company. Something seems to have changed on the inside. It’s evident in the language. Naturally, the system itself is concealed. Apple won’t reveal the locations of the data centers, display the racks, or identify the individuals using clipboards to…

Read More

When a small hardware company places something genuinely strange on a table at a trade show, you notice a certain kind of excitement. Engineers bend over. Phones emerge. It's easy to understand why the CuBox-M, a new two-inch cube from Israel-based SolidRun, has been receiving such attention. It looks like a paperweight. It runs machine learning models. Although SolidRun isn't well-known, the company has been quietly supplying specialized hardware for years in the field of single-board computers and embedded systems. The CuBox line has been around for more than ten years, and this newest model gives the impression…

Read More

The hum in practically every data center nowadays is the same as it was a decade ago. The faint smell of warm electronics, blinking LEDs, and cold air. What's actually operating inside those metal racks has changed. Asked which operating system drove the server industry, an IT manager in 2005 would have answered with a disorganized assortment of Unix variants, Windows NT offshoots, and a small but expanding share of Linux. That debate is largely settled today. Linux is in charge. Even though the numbers are never flawless, they generally present a consistent picture. Depending on which methodology you…

Read More

The same low hum can be heard whether you enter a data center in Ashburn, Frankfurt, or Singapore. Blue lights blinking in patterns that no one really pays attention to anymore, rows of servers, and cold air forced through metal cages. Almost all of those machines are running programs written in C. Not JavaScript. Not Python. Not the languages that developers quarrel about every other week on Twitter. C. A language that is older than the majority of engineers who use it.

Topic: Most Used Programming Language for Server Operating Systems
Dominant Language: C (with growing Rust adoption since 2022)
Year of…

Read More

At tech meetups, there’s a specific type of conversation that takes place around the second beer when someone asks which Linux distribution is worth using these days. Ubuntu is always mentioned. There’s always someone who disagrees. The table falls silent for a moment when a more reserved person brings up Debian because no one wants to acknowledge that they’ve been considering switching back. In actuality, the Linux desktop, which was once dismissed as a hobbyist’s playground, has evolved into something nearly dull, and that’s the best thing you can say about an operating system. The three distributions that have supported…

Read More

In the early 1970s, a college computer lab would have looked more like a furnace room than an office. Men wearing short-sleeved shirts and wide ties attended to the entire setup, which included refrigerator-sized cabinets humming against the walls and vents expelling warm air. A single machine could fill a hall, require its own cooling system, and still not handle what a modern phone manages before breakfast. It's difficult to ignore how swiftly that world vanished. During that time, the supercomputer was the main event. Soft-spoken engineer Seymour Cray, who liked to work alone in a basement…

Read More