The Top 8 Computing Stories of 2024


This year, IEEE Spectrum readers had a keen interest in all things software: what’s going on in the tumultuous world of open source, why the sheer size of modern code is causing security vulnerabilities, and why we need to take the energy costs of inefficient code seriously. The ever-growing presence of artificial intelligence also made itself known in the computing world, introducing an LLM-powered Internet search tool, finding ways around AI’s voracious data appetite in scientific applications, and shifting from coding copilots to fully autonomous coders, something that’s still a work in progress.

And if you scroll all the way down to the bottom of our list of top computing stories of the year, you’ll find a treat in the form of an IEEE Spectrum original science fiction short.

[Image: AI speech bubbles over a background of code. Credit: Andriy Onufriyenko/Getty Images]

AI was destined to take the top spot in 2024, even in the computing realm. Coding assistants like GitHub’s Copilot and Amazon’s CodeWhisperer are already changing the way software engineering is done. This raises an obvious anxiety: Are AI coders coming for software engineers’ jobs?

The short answer is: not yet, and not for lack of trying. The AI lab Cognition created a fully autonomous AI software engineer named Devin. Devin boasts the ability to design, build, and deploy a website, fix bugs in a codebase, and fine-tune an LLM, all by itself. Open-source alternatives to Devin followed shortly after. But even at the tasks they claim to solve, these coding autopilots are not yet very good. Devin resolved only 14 percent of the GitHub issues it was presented with, for example. And the real world of software development is far more interactive and complex, with many teams working together to co-design, triage, and collaboratively solve large-scale problems. So, this article postulates, coding assistants that keep a real human in the loop will be more successful, at least for now.

[Image: A yellow and blue photo collage of a laptop with speech bubbles around it labelled with various programming languages. Credit: IEEE Spectrum/Getty Images]

The 2024 installment of this cult-favorite roundup of the most popular programming languages saw some predictable results alongside a few emerging trends.

Surprising no one, Python tops the charts as the most popular language both in the zeitgeist and among IEEE members. Employers have a slightly different preference: They give an edge to job seekers who know SQL (pronounced “sequel”), a database query language. Just knowing SQL is not enough, though; it must be paired with a more traditional programming language like Python or C++. But those who are already proficient in such a language and looking to gain an edge in the job market would do well to add SQL to their résumés.
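
What does that pairing look like in practice? Here is a minimal sketch using Python’s built-in sqlite3 module; the jobs table, column names, and query are invented purely for illustration.

```python
import sqlite3

# An in-memory database for illustration; a real application would
# connect to PostgreSQL, MySQL, or another production database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (title TEXT, language TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO jobs VALUES (?, ?, ?)",
    [("Data Engineer", "SQL", 120000),
     ("Backend Developer", "Python", 115000),
     ("Systems Programmer", "C++", 130000)],
)

# SQL does the querying; Python does everything around it.
for title, language, salary in conn.execute(
    "SELECT title, language, salary FROM jobs WHERE salary > ? ORDER BY salary DESC",
    (118000,),
):
    print(f"{title} ({language}): ${salary:,}")

conn.close()
```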

Among the emerging trends, the less well-known languages TypeScript and Rust have made substantial gains since last year. Both have features that implicitly protect the coder from making certain types of errors. TypeScript requires programmers to declare the type of each variable (floating point, integer, Boolean, or other) before it’s used, increasing reliability. Rust is memory safe, meaning it prevents the program from writing data to memory it’s not supposed to write to, closing off a class of vulnerabilities.

[Image: A photo-illustration of outstretched hands with blobby, running oodles of green software code. Credit: Daniel Zender]

In 1995, Niklaus Wirth, a computer science pioneer famous for designing the language Pascal, wrote an article titled “A Plea for Lean Software.” In it, Wirth lamented the growing size of code, both the literal number of lines and how much space it takes up in memory, which he saw as both unnecessary and dangerous. After all, the more code you write, the more opportunities you have to introduce a mistake or a security vulnerability.

Almost thirty years later, upon Wirth’s passing in January 2024, lifelong technologist Bert Hubert revisited Wirth’s plea and despaired at how catastrophically worse software bloat had become. In this renewed plea, which reads like a cry from the heart, Hubert explains how dire the situation is: Software has gotten huge, with applications as simple as garage-door openers taking up to 50 million lines of code to implement. Coders routinely import external libraries without truly knowing what’s in them, greatly expanding the size of the code and introducing plenty of potential vulnerabilities. Security breaches have become so common that many consider it unsafe to run code themselves, resorting instead to software-as-a-service.

To offer a beacon of hope to other despairing software engineers out in the raging seas of enormous code, Hubert wrote an example application, called Trifecta, for sharing images online. Trifecta has a minimal number of dependencies and clocks in at 3 megabytes of code, a fraction of the size of competing solutions. Here’s hoping the next thirty years bring software bloat under control.

[Image: A colorful, glowing blue magnifying glass against a dark background, with a wave of colorful boxes on either side. Credit: iStock]

Google Search has reigned supreme for so long that it has become a proprietary eponym, as in “just Google it.” Taking on the giant’s dominance in web search has seemed almost impossible, until now. A scrappy startup, Perplexity.ai, has used AI tools to challenge Google’s crown. Before the end of 2024, Perplexity had approximately 15 million users, which, full disclosure, includes the author of this roundup. That is still chump change compared with Google’s almost 5 billion users, but Perplexity offers something traditional search does not: the power of LLMs.

The company, which started in 2022 with four employees, stumbled upon the idea of AI-powered search in a Slack channel. It combined several AI tools, including retrieval-augmented generation (RAG) to read the web pages relevant to a particular search, a bidirectional encoder representations from transformers (BERT) model to rank those pages, and a pared-down web crawler to index the internet.
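
Perplexity hasn’t published its code, but the retrieval-augmented generation loop it builds on can be sketched in miniature. In the toy Python example below, the retriever ranks pages by simple word overlap (a stand-in for a learned ranker like BERT), and call_llm is a hypothetical placeholder for a real language-model API, not Perplexity’s.

```python
def retrieve(query: str, pages: dict[str, str], top_k: int = 2) -> list[str]:
    """Toy retriever: rank pages by how many query words they contain.
    A production system would use a learned ranker instead."""
    query_words = set(query.lower().split())
    scored = sorted(
        pages.items(),
        key=lambda item: len(query_words & set(item[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in scored[:top_k]]


def call_llm(prompt: str) -> str:
    # Stub: a real system would call a hosted or local language model here.
    return f"[LLM response to a {len(prompt)}-character prompt]"


def answer(query: str, pages: dict[str, str]) -> str:
    """Retrieval-augmented generation: stuff the retrieved pages into the
    prompt so the model answers from sources rather than from memory."""
    context = "\n\n".join(retrieve(query, pages))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)


if __name__ == "__main__":
    corpus = {
        "page1": "Perplexity combines web search with large language models.",
        "page2": "Retrieval-augmented generation grounds answers in documents.",
        "page3": "An unrelated page about gardening and tomatoes.",
    }
    print(answer("How does retrieval-augmented generation work?", corpus))
```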

[Image: Dots of light converging from the outside in. Credit: Joshua Sortino/Unsplash]

Most artificial intelligence models are data-hungry. Chatbots, for example, are trained on most of the internet before they can “speak” well. Scientific AI models are no different. In many cases—such as modeling airflow around an airplane wing, or the collapse of a star into a black hole—creating high-quality training data for AI models is slow and costly.

One approach is to use AI-generated training data to train another AI model, but even that can be costly and inaccurate. A team of researchers at the Georgia Institute of Technology, IBM Research, and MIT developed a solution that cuts the training data needed to reach a desired accuracy by a factor of 100. Their model, called a physics-enhanced deep surrogate, combines first-principles physics with a neural network to produce a model that’s better than the sum of its parts.
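
The researchers’ exact architecture isn’t reproduced here, but the underlying idea of pairing a cheap physics approximation with a small learned correction can be illustrated with a toy example. The Python sketch below uses an invented one-dimensional problem and a polynomial fit standing in for the neural network; it is a generic physics-plus-learning illustration, not the published physics-enhanced deep surrogate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy problem: the "true" system is a damped oscillation, but the
# cheap first-principles model only knows the undamped version.
def true_system(x):
    return np.exp(-0.3 * x) * np.sin(x)

def physics_model(x):
    return np.sin(x)  # crude first-principles approximation

# A handful of expensive "simulations" stand in for the scarce training data.
x_train = rng.uniform(0, 6, size=20)
y_train = true_system(x_train)

# Learn only the residual the physics model gets wrong; a polynomial fit
# plays the role of the neural network in the real work.
residual = y_train - physics_model(x_train)
correction = np.polynomial.Polynomial.fit(x_train, residual, deg=4)

def surrogate(x):
    # Surrogate prediction = physics prior + learned correction.
    return physics_model(x) + correction(x)

x_test = np.linspace(0, 6, 5)
for x, pred, truth in zip(x_test, surrogate(x_test), true_system(x_test)):
    print(f"x={x:.2f}  surrogate={pred:+.3f}  truth={truth:+.3f}")
```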

[Image: An illustration of a laptop with flowers coming out of the screen. Credit: Elias Stein]

When we talk about the energy cost of artificial intelligence, or of computing in general, we usually think of the hardware: How efficiently do the CPUs and GPUs crunch through their tasks? But the way we write software can have drastic effects on energy use that often go unnoticed. For example, properly designing a webpage can cut the emissions generated by loading it by 93 percent.

Designing greener software is a win-win: The software is more efficient, it runs faster, and it causes fewer emissions. But it takes some awareness, and some extra thought when designing and implementing a solution. This article highlights the growing green-software movement and provides rules of thumb for building more energy-efficient websites, apps, and AI implementations.

As management consultant Peter Drucker reportedly said, “What gets measured gets improved.” This holds true for the energy effects of software. There are a growing number of tools that measure the emissions from websites, codebases, AI, and more, but experts say access to reliable data remains a problem, and better measurement tools are needed before we can really decarbonize software.
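
One widely used open-source option is the CodeCarbon package for Python, which estimates the carbon footprint of a block of code from measured power draw and regional grid data. Below is a minimal sketch, assuming the package is installed (`pip install codecarbon`) and that its EmissionsTracker interface behaves as in recent releases; the workload being measured is just a stand-in.

```python
from codecarbon import EmissionsTracker

def workload():
    # Stand-in for the real computation you want to profile.
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker()  # estimates emissions from power draw and grid data
tracker.start()
try:
    workload()
finally:
    emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```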

[Image: Illustration of the Greek goddess Themis holding the scales of justice against a background of binary code. Credit: Moor Studio/iStock]

The founder of the open-source website platform WordPress published a cease-and-desist letter against the WordPress hosting service WP Engine, claiming the company doesn’t contribute money or developer hours back to the project. WP Engine sued the founder in response for violating copyright.

This legal battle highlights an underlying question at the heart of the open-source model: How does one get paid for work that’s given away for free? In the early days, open-source development was done by enthusiasts working in their spare time on passion projects. Now these passion projects power an estimated 70 to 90 percent of all apps, and large companies are making a pretty penny off them. This has brought the crisis to a head: Maintainers of open-source projects are reporting increasing levels of discontent, and undermaintained projects are creating security vulnerabilities. But there is hope, in the form of growing efforts to convince companies to pledge to pay maintainers.

[Image: An illustration of two people against a background of the Sun, a hemisphere of tiny objects surrounding a dully glowing sphere, and a number of toroidal space habitats. Credit: Andrew Archer]

What would it take to build a computer the size of a planet? In a departure from our traditional reporting model, IEEE Spectrum commissioned a science fiction writer, Karl Schroeder, to envision an answer to this question. Contributing editor Charles Choi annotated the story, explaining how the fictionalized world draws on real science and tech.

Virtual minds floating in a computer made out of the planet Mercury orchestrate an effort to terraform other planets. Will it be a brave new world, or a solar system of loneliness?
