Why do so many programmers dislike C?

Development: Why Rust is the answer to lousy software and programming errors

The 2000s and 2010s were golden times for new programming languages. There was a veritable Cambrian explosion in diversity: major revisions of well-known languages (C++ took several large evolutionary leaps in this period) and a trend toward companies creating their own languages (Swift from Apple, TypeScript from Microsoft, Go from Google, Kotlin from JetBrains, Rust from Mozilla). On top of that came a number of serious new languages outside existing corporate ecosystems, for example Elm for apps in web browsers, Julia for numerical mathematics, and Elixir for the Erlang VM.

One of the reasons for this wave of new languages is that most of the grunt work of building one has been automated away. The parser can be generated automatically from a grammar, and LLVM can serve as the backend if you are not targeting the Java VM or another VM such as Erlang's.
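As a miniature illustration of the grammar-to-parser idea: a language designer writes down rules like the one in the comment below, and a parser generator (LALRPOP, yacc, ANTLR, and friends) turns them into code mechanically. The sketch here is hand-written Rust that mirrors a single toy rule, just to show how directly grammar rules map to parsing code:

```rust
// Toy grammar of the kind a parser generator consumes:
//
//   expr := number (('+' | '-') number)*
//
// A generated parser follows such rules mechanically; this
// hand-written version mirrors the single rule above.
fn parse_expr(input: &str) -> Option<i64> {
    let mut tokens = input.split_whitespace();
    // The rule starts with a number...
    let mut value: i64 = tokens.next()?.parse().ok()?;
    // ...followed by zero or more (operator, number) pairs.
    while let Some(op) = tokens.next() {
        let rhs: i64 = tokens.next()?.parse().ok()?;
        match op {
            "+" => value += rhs,
            "-" => value -= rhs,
            _ => return None, // unknown operator: reject the input
        }
    }
    Some(value)
}

fn main() {
    assert_eq!(parse_expr("1 + 2 - 3"), Some(0));
    assert_eq!(parse_expr("foo"), None);
    println!("{:?}", parse_expr("10 + 5")); // prints: Some(15)
}
```

The point is not this particular parser but the division of labor: the hard, error-prone parts (parsing, optimization, code generation) are now reusable infrastructure, so a small team can ship a credible new language.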

This article is about Rust.

Unlike virtually all other modern programming languages, Rust does not advertise itself with increased productivity. That makes you prick up your ears, since increased productivity is the holy grail for the rest of the industry, and one about which claims are routinely fudged. Rust does want to increase productivity, of course, but not just while writing code: also during later maintenance, during debugging, and when finding and closing security holes.
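To make "finding and closing security holes" concrete: one notorious class of C and C++ bugs is use-after-free, where code keeps using memory that has already been released. Rust's borrow checker rejects such programs at compile time. A minimal sketch (the variable names are illustrative):

```rust
fn main() {
    let s = String::from("release notes");
    let view = &s; // `view` borrows `s`
    // drop(s);    // uncommenting this line does not compile:
    //             // "cannot move out of `s` because it is borrowed"
    println!("{}", view); // prints: release notes
}
```

The equivalent C program, using a pointer after free(), compiles without complaint and fails (or is exploited) at runtime. This shift of whole bug classes from runtime to compile time is what the maintenance and security claims rest on.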

Better code, easier maintenance?

Well, Java also claimed to deliver secure code and reduced maintenance, yet finding code smells and refactoring have since become a branch of the tooling industry in their own right. So it is worth looking at what Rust specifically does to achieve these goals. To put that in context, however, we have to go back a bit and look at which approaches already existed, and why they did not work. Ultimately, "How do we get our programmers to write better code?" has been the key question in software development from the very beginning.

An early theory held that programming languages were too abstract and needed to be brought closer to human language; the translation distance would then be smaller, so the idea went. COBOL tried that. It didn't work. OK, then perhaps we need to formalize more: make the programming language look like mathematical formulas. APL tried that. It wasn't successful either.

Then let's try reducing complexity. Modularization will save us. Later, modularization was applied even between concepts, not just between source files; we called it object orientation and talked about encapsulation. The bottom line: that didn't produce safer or better code either.

This has been the central insight of the past decades: the environment is not static. It is dynamic and subject to market forces. If we intervene to make complexity more manageable, the industry does not solve the old problems better; it tackles larger problems, or it cuts the time budgeted per problem. The result is that our position on the botch axis stays constant. Even highly successful concepts such as strong typing, modularization, abstraction, and encapsulation in object-oriented languages helped a great deal, but in the end the systems did not get better, just much bigger (and much slower).

Firefox: the worst case as a starting point

Rust comes from Mozilla, the people behind the Firefox browser. Firefox is, so to speak, the worst case for traditional software development: a comparatively huge piece of software that, despite applying all of the concepts above, is a constant source of security holes. The release intervals shrank over the years, but a release without critical security gaps was never among them. The Mozilla people could thus see first-hand that the previous approaches were not enough, so they sat down and analyzed the problem.

There are several fundamental problems and obstacles in software development, and none of them is technical.