Wednesday December 28, 2022; 8:58 PM EST
- (3 min read)#
- #
- Some days, you spend most of the day waiting for the test to fail. Finally, it fails at the right spot, for the right reason. Now you know why it's broken, and now you can fix it.#
- If you are also writing the test infrastructure around that code, the code that tests the code, then it is harder to be sure where the problem is. This is why I like test equipment. You can guarantee it will perform to spec, and as long as you maintain it, it will always perform to spec. If it doesn't, it's out of spec! It's beautiful, really, how it all works out.#
- Automated testing is a new thing in software development, by which I mean it is something from the most recent 20 years of computing, rather than the first 20.#
- There was a period in between when things changed pretty slowly, because performance just wasn't good enough: no matter how fast the hardware got, you still had to rewrite your software to take advantage of it. You can think of this as friction between the intended virtuality and its expression in hardware and software. This friction is the amount of time you spend optimizing for the particular platform, the cost of implementing in your own code a feature that in later generations would be provided by the system.#
- Longer word lengths, more registers, access to large amounts of memory, device access, file structure, network access, graphics generation, fonts, animation, state machines, garbage collection, request handling, response generation, pipes, handles, sockets, key-value storage, structured data storage, multiple-host synchronization. Each layer reduces this virtuality friction. Each layer enables other layers.#
- If you need a layer, and it is not yet part of what you can count on, you are building it yourself. That is the friction. At one point, you needed to manage the contents of registers yourself. Later, you'd write your own random number routine because your game needed that. #
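- To make that concrete, here is a minimal sketch of the kind of random number routine a game might have rolled for itself, a simple linear congruential generator. This is an illustration, not code from any particular machine; the constants are the well-known ones from C's rand().#

```python
class LCG:
    """A linear congruential generator: the random-number routine
    you once wrote yourself because the platform didn't provide one."""

    def __init__(self, seed=1):
        self.state = seed

    def next(self):
        # state = (a * state + c) mod m, with classic 31-bit constants
        self.state = (1103515245 * self.state + 12345) % (2 ** 31)
        return self.state

rng = LCG(seed=42)
print([rng.next() % 6 + 1 for _ in range(5)])  # five "dice rolls"
```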
- Automated testing is the layer that allows you to know for a fact that you haven't broken anything. You can't fool the compiler. And you can't fool the test. If something breaks, you add a new test, and then fix the broken thing. Passing the test means you fixed the bug, and since it's a test, you can run it again, automatically, and know for a fact that you didn't break it. Anytime you rely on manually verifying that something works, instead of having a robot do it for you, you will eventually forget to keep verifying it. We expect things to never break, so why keep checking? But eventually it will break when you aren't checking.#
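- Here is a minimal sketch of that loop using Python's built-in unittest; slugify and its empty-title bug are invented for illustration:#

```python
import unittest

def slugify(title):
    # The fix: this used to crash on empty titles.
    return "-".join(title.lower().split()) if title else ""

class TestSlugify(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_empty_title_regression(self):
        # Added the day the empty-title bug appeared; the robot
        # now re-verifies the fix on every run, forever.
        self.assertEqual(slugify(""), "")

if __name__ == "__main__":
    unittest.main()
```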
- Now, you can create applications that are websites that are social connections that are protocols that, when you get right down to it, are just a couple of key things: a handful of short strings defining what kind of information to look for, where to find it, and how to build those strings. RSS or JSON, for example.#
- The actual mathematical structure defined by these standards is extremely simple, and the entire rest of technology is built around generating and manipulating these things in all the ways we can think of. This is where we are currently paying most of the costs of virtuality friction. #
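- As a small illustration (the element names below come from RSS 2.0; the values are made up), here is the same handful of short strings rendered as JSON and as an RSS item, using only Python's standard library:#

```python
import json
import xml.etree.ElementTree as ET

# A feed entry really is just a handful of short strings.
entry = {
    "title": "Hello",
    "link": "http://example.com/1",
    "description": "A post.",
}

# The same structure as JSON...
print(json.dumps(entry))

# ...and as a minimal RSS 2.0 <item>.
item = ET.Element("item")
for name, value in entry.items():
    ET.SubElement(item, name).text = value
print(ET.tostring(item, encoding="unicode"))
```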
- As it becomes easier to treat working with these strings as a fundamental capability, ignoring things like in-memory storage, local disk, local network cache, same-host services, local network services, caching proxies in front of load-balanced clusters, remote servers, pooled connections, database index structure, query structure, query optimizations, cache optimizations, and strategies of all sorts, we can finally build other things on top of all that. That, of course, becomes the new level of virtuality friction.#