Rust has a useful concept of “features” baked into its packaging tool cargo, which allows developers to optionally toggle functionality on and off. In a simple project, features are simple, as you would expect. In more complex projects which use cargo workspaces, the behavior of features becomes much more complicated and, in some cases, surprising!
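As a minimal sketch of what cargo features look like, here is a hypothetical Cargo.toml declaring an optional "tls" feature that gates an optional dependency (the feature and dependency names are illustrative, not from the post):

```toml
# Hypothetical example: a "tls" feature that pulls in an optional dependency.
# Build with `cargo build --features tls` to enable it.
[features]
default = []
tls = ["dep:native-tls"]

[dependencies]
native-tls = { version = "0.2", optional = true }
```

In a single crate this behaves exactly as you would expect; the surprises the post alludes to come from feature unification across a workspace, where enabling a feature for one member can affect how shared dependencies are built for all of them.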
Howdy! Welcome to my blog where I write about software development, cycling, and other random nonsense. This is not the only place I write; you can find more words I typed on the Buoyant Data blog, the Scribd tech blog, and GitHub.
The thing about appendable objects in S3
Storing bytes at scale is never as simple as we lead ourselves to believe. The concept of files, or in the cloud “objects”, is a useful metaphor for an approximation of reality, but it’s not actually reality. As I have fallen deeper and deeper into the rabbit hole, my mental model of what storage really is has been challenged at every turn.
sccache is pretty okay
I have been using sccache to improve feedback loops with large Rust projects, and it has been going okay, but it hasn’t been the silver bullet I was hoping for. sccache can be easily dropped into any Rust project as a wrapper around rustc, the Rust compiler, and it will cache intermediate build artifacts. As dependencies are built, their object files are cached, locally or remotely, and can be reused on future compilations. sccache also supports distributed compilation, which can compile those objects on different computers, pulling the object files back for the final result. I had initially hoped that sccache would solve all my compile performance problems but, to nobody’s surprise, there are some caveats.
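Dropping sccache in really is as simple as the excerpt suggests: cargo supports a compiler wrapper via the RUSTC_WRAPPER environment variable, which is the documented way to hook sccache in. A minimal setup looks like this:

```shell
# Install sccache, then tell cargo to invoke rustc through it.
cargo install sccache
export RUSTC_WRAPPER=sccache
cargo build

# Inspect cache hits and misses after a build.
sccache --show-stats
```

The same wrapper can also be set persistently via `build.rustc-wrapper` in cargo’s config file instead of an environment variable.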
Jamming on Google Meet with Pulseaudio
For an upcoming hack week I wanted to have some live jam sessions with colleagues on a video call. Mostly I wanted some background music we could listen to while we hacked together, occasionally discussing our work, etc. I don’t normally use Pulseaudio in anger but it seemed like the closest and potentially simplest solution.
The AI Coding Margin Squeeze
Words cannot express how excited I am for the coming margin squeeze on every “AI company” that isn’t Anthropic, OpenAI, Microsoft, or Google. The entire industry is built on an unethical foundation, having illegitimately acquired massive amounts of content from practically everybody. I am particularly excited to see the companies selling “AI Coding Assistants” implode.
The last data file format
The layers of abstraction in most technology stacks have gotten incredibly deep over the last decade. At some point way down there in the depths of most data applications, somebody somewhere has to actually read or write bytes to storage. The flexibility of Apache Parquet has me increasingly convinced that it just might be the last data file format I will need.
Save the world, write more efficient code
Large Language Models have made the relationship between software efficiency and environmentalism click for many people in the technology field. The cost of computing matters.
Solar Laundry
The process of switching to a more electric, and therefore more renewable, house has two parts: remove fossil fuel powered components and remove load. I wrote previously about using a home battery, which has really made a big dent in the ability to shift loads towards electricity. I have recently done some more analysis on the humble clothesline.
Yes, please repeat yourself.
The oft-touted principles of “pragmatic software development” are by and large nonsense. While not unique to software development, many people like to choose their dogma and then wield it uncritically whenever an opportunity arises. “Don’t Repeat Yourself” (DRY) is one of the “principles” that has always irked me.
Low latency Parquet reads
The Apache Parquet file format has become the de facto standard for large data systems, but increasingly I find that most data engineers are not aware of why it has become so popular. The format is especially interesting when taken together with most cloud-based object storage systems, where some design decisions allow for subsecond or even millisecond latencies for Parquet readers.
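One of those design decisions is that a Parquet file ends with its metadata footer, a 4-byte little-endian footer length, and the magic bytes "PAR1". A reader against object storage can therefore locate everything it needs with a small ranged GET of the file tail rather than scanning the whole object. A minimal sketch of parsing that tail (using a fabricated byte string in place of a real file):

```python
import struct

def parquet_footer_length(tail: bytes) -> int:
    """Parse the footer length from the last bytes of a Parquet file.

    A Parquet file ends with: [footer metadata][4-byte LE length]["PAR1"].
    """
    if tail[-4:] != b"PAR1":
        raise ValueError("not a Parquet file")
    return struct.unpack("<I", tail[-8:-4])[0]

# Simulate the tail of a Parquet file with a 64-byte footer.
fake_tail = b"\x00" * 64 + struct.pack("<I", 64) + b"PAR1"
print(parquet_footer_length(fake_tail))  # 64
```

With the footer length in hand, a reader issues one more ranged read for the metadata itself, and from there can fetch only the column chunks a query actually touches; that is the mechanism behind the low latencies the post describes.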