kena

It can’t be wrong if it feels so good

In hack, lecture, reflection on January 7, 2013 at 12:00

I did it again.

It’s like self-gratification: can’t hurt anyone, reliable pleasure, kept me busy for a while, pleasant relief afterwards.

Really, I ought to have been working on new, abstract, sexy-sounding research directions to advance my career.

Instead, I spent a half week programming and engineering. And I learned a damn lot about myself in the process.

I have been hacking things together, wedging pieces into places where they were never designed to fit. And it seems to work! Despite a couple of remaining rough edges, I am quite proud of the result. The feeling of control and power I draw from these achievements is a sweet deal compared to drunken highs and successful conference talks.

Yet the achievement is modest. This week, I have extended the SPARC compatibility of my colleagues’ simulation and compilation tool chain. I did it initially because one of my colleagues had been asking me difficult technical questions lately, and I wanted better tools to answer his questions and, more generally, to help him help himself.

Then after I started, I did it for myself. “Just because.”

The situation is exciting.

This new processor that my colleagues are making is similar, but not identical, to some SPARC processors designed by Sun/Oracle. It is also similar, in different ways, to the DEC Alpha and the MIPS architecture before it. Actually, it is an entirely different thing altogether: its hardware internals look largely different. But it was designed to subvert: it can be configured to recognize SPARC, Alpha or MIPS instructions, and thus accept programs designed for these previous processors.

Work-wise, my colleagues try to determine whether that new thing fares “better” than previous devices in some way. But I’m not too excited about that myself.

It’s an irresistible impulse, really. When I see a new programmable machine, I do not feel satisfied until I understand what makes it different from other programmable machines. And the way I get satisfied is not by writing programs for it or understanding how it works “inside”; it is by studying how its programming tools are different.

Different computers, although they look different from the outside and can be controlled using very different operating systems and user interfaces, often come down to the same things from the perspective of a programmer piecing bits together. The experience gained while programming one computer, using one programming language, can often be reused advantageously with another computer. This is possible because a programmer does not hold a complete, detailed mental picture of the machine while programming. That would be mind-bogglingly difficult to achieve. Instead, programmers routinely abstract away the details of the real hardware device and build their own mental picture of how things ought to be in the machine.

This model is constructed incrementally, as the programmer gains skills and experience. It is also updated with each new machine that the programmer encounters. What really gets me going and excited is exactly this process:
how a programmer’s mental model of computers is altered upon meeting a new sort of computer.

If I could, that is, if I had the time (and money) to do so, I would study this from the cognitive sciences perspective. Unfortunately, nobody is interested in funding and supporting me this way. Instead, I study, modify and produce programming environments for (part of) a living. I am especially focused on all those occasions where programming environments need to evolve or become different, because I know that around these occasions there will be people looking at new things and starting to think differently.

Here the occasion is a new kind of processor. It can recognize instructions from previous, existing processors, but not completely: just enough to claim partial compatibility, yet not enough to reuse the previous programming tools as-is.

So I needed to know: why can’t the previous tools be reused as-is? What is the impact, on programmers of the previous platforms, of knowing the tools can’t be reused as-is? What part of the model of the previous platforms can be reused, and what part needs to be thrown away and replaced?

I have answered quite a few of these questions already, and even published some thoughts on these topics. But I am so enthused by the subject that I find it hard to stop investigating, and I needed just the nudge provided by my colleague to kindle a week-long fire of curiosity.

So I did my magic again, for the third time in three years.

I tricked the GNU C Compiler into producing machine code for SPARC in a way just slightly different from what the original SPARC design mandates. I wrote a machine code translation engine to transform this modified code into the form recognized by the new processor. Both tools are puppeteered by an abominable software thing made of many bits and pieces, but despite its horrendous, diseased bowels the thing appears, to the outside observer and user, to behave as the GNU C Compiler itself.
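For flavor, here is roughly the shape such a translation pass takes, as a minimal C sketch. The opcode remapping below is entirely invented for illustration (as are the names `translate_insn` and `translate_section`); the real engine handles far more than a single field rewrite.

```c
#include <stdint.h>
#include <stddef.h>

/* Purely illustrative: pretend the new processor uses opcode 0x2 where
 * classic SPARC used 0x3 in the top two bits of each 32-bit word. Real
 * SPARC encodings are richer; this only sketches a word-by-word pass. */
uint32_t translate_insn(uint32_t insn)
{
    uint32_t op = insn >> 30;                    /* top 2 bits of the word */
    if (op == 0x3)                               /* hypothetical remapping */
        insn = (insn & 0x3FFFFFFFu) | (0x2u << 30);
    return insn;
}

/* Rewrite a buffer of instruction words in place. */
void translate_section(uint32_t *code, size_t n)
{
    for (size_t i = 0; i < n; i++)
        code[i] = translate_insn(code[i]);
}
```

The loop is the easy part; the work lies in deciding, for every instruction format, what the new processor expects.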

If it looks like a duck, and quacks like a duck, it must be a duck, right?

Not quite, unfortunately. There were two issues remaining.

One is that every language, including C, allows programmers to use some constructs that have no equivalent in hardware. For example, C defines the ability to divide two natural numbers; however, this operation does not exist as a single instruction on the Alpha processor, and exists only in a limited form on the SPARC and MIPS processors. To provide the service, the C compiler automatically bundles programs with a library of language intrinsics: small helper functions with esoteric names like __divsi3 or __ffssi2. These functions come alongside the compiler itself, and their source code must thus also be compiled to machine code suitable for the target platform.
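To give an idea of what such an intrinsic does, here is a naive C sketch of a software unsigned divide. The name `soft_udivsi3` is mine; the real libgcc helper __udivsi3 is more careful and heavily optimized.

```c
#include <stdint.h>

/* Naive shift-and-subtract unsigned division, in the spirit of libgcc's
 * __udivsi3. On a target without a hardware divide instruction, the
 * compiler emits a call to a routine like this wherever the program
 * writes a / b on unsigned 32-bit operands. */
uint32_t soft_udivsi3(uint32_t num, uint32_t den)
{
    uint32_t quot = 0, rem = 0;

    /* Division by zero is left unspecified, as with the real helper. */
    for (int bit = 31; bit >= 0; bit--) {
        rem = (rem << 1) | ((num >> bit) & 1);  /* bring down next bit */
        if (rem >= den) {
            rem -= den;                         /* subtract when it fits */
            quot |= 1u << bit;                  /* record a quotient bit */
        }
    }
    return quot;
}
```

One loop iteration per bit: this is why, on such machines, a division costs dozens of instructions rather than one.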

So I had taken an existing C compiler for SPARC, alongside its regular SPARC intrinsics, then puppeteered it to produce code for something like-SPARC-but-not-quite. But the existing SPARC intrinsics were not suitable for the new platform!

Unfortunately, the GNU C Compiler is very tightly packaged with its intrinsics. So tightly packaged that the intrinsics cannot be compiled separately, with a different C compiler than the one they ship with, or at a different time. The intrinsics’ build process is arcane, known only to a handful of GCC developers, and tightly coupled to the build process of the compiler itself.

So I cheated. After my puppeteering code was ready, I re-ran a build of the GNU C Compiler for the regular SPARC platform. Then I interrupted the build as it was ready to process the intrinsics. At that point, I fooled the build process into using my puppeteering code in lieu of the newly built GNU C Compiler it was designed to use. Then I let it continue, tricked into producing a new set of intrinsics suitable for my target platform.

And then I stopped caring about this issue.

It’s probably heart-wrenching for a GNU C developer. In a hypothetical universe, I could spend time understanding the arcane build process, then modify it cleverly to do what I wish in an automated manner compatible with its designers’ intent. This mechanism would then be reusable by hackers in the future. But I live in no such universe, and I am not interested in doing so. Someone else will have to care.

For all I care, I simply copied the object code for my intrinsics manually out of the GNU C build tree and into my project. They now sit as 91 kilobytes of precompiled code in my project, and I don’t intend to upgrade or recompile them ever unless forcefully pushed to, e.g. by a bug. As a tribute to my future self, in case I forget how I did it or someone else is ever interested, I wrote some hints about the entire manoeuvre in the project’s documentation and my source control system.

The other issue is that a working code generator, able to produce machine code from source code, even when equipped with the suitable intrinsics, is largely insufficient to produce working programs. It misses what gives a programming language its taste and feel, its very essence: its library of services and its interfaces to the “outside world.” In C, this is the role of the so-called “standard library,” which provides access to files, screen, keyboard, network, etc.

Unlike the language intrinsics, a standard library is not necessarily provided alongside the compiler. In the case of C, bundling the two is largely uncommon outside the Microsoft world: the C library is typically provided by the operating system and shared by all C compilers for that platform.

C libraries are also commonly written in C itself. In principle, once equipped with a working C compiler, there is no surprise: run the compiler on a library’s source, produce a compiled version, and use the newly produced compiled code to compile other programs or libraries.
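Most of a library is indeed just that: plain, portable C. Here are two routines of the kind one lifts from a BSD tree or rewrites in minutes, sketched under my own hypothetical names:

```c
#include <stddef.h>

/* Count the characters of a NUL-terminated string. */
size_t my_strlen(const char *s)
{
    const char *p = s;
    while (*p)
        p++;
    return (size_t)(p - s);
}

/* Copy n bytes from src to dst; the regions must not overlap. */
void *my_memcpy(void *dst, const void *src, size_t n)
{
    unsigned char *d = dst;
    const unsigned char *sp = src;
    while (n--)
        *d++ = *sp++;
    return dst;
}
```

Routines like these compile unmodified for any target the compiler knows; the trouble starts below them.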

But here an obstacle remained: the primal library of them all, the “standard C library,” is not written entirely in C. Actually, it can’t be! For example, there is no way to implement file access (e.g. fopen, open) in C using other, more primitive C services. Any standard C library, therefore, uses its own black magic to provide its services. This black magic is invisible to programs, invisible to other libraries, and mostly invisible to compiler implementers. It is called “system interface glue,” and it works differently for every processor and every operating system. Because of this, reusing the source code of a C library on a new platform is doomed to failure until this glue is also adapted.

But that is absolute hell, an insane amount of engineering work.
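To make the nature of that glue concrete, here is a minimal sketch assuming a Linux host for illustration: the portable C surface of a library function bottoms out in a raw system call. (`my_write` is a hypothetical name; on a platform like my colleagues’ processor, this layer must be written from scratch.)

```c
#define _GNU_SOURCE          /* for syscall() on glibc */
#include <unistd.h>
#include <sys/syscall.h>

/* Hypothetical glue for one libc entry point. Everything above this
 * layer in a C library can be portable C; this function cannot be:
 * system call numbers and calling conventions differ per processor
 * and per operating system. */
long my_write(int fd, const void *buf, unsigned long count)
{
    return syscall(SYS_write, fd, buf, count);
}
```

Every platform needs its own version of each such function, which is exactly what makes porting a C library wholesale so costly.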

But I figured: this entire enterprise of mine is only going to be used by a few people in the end. These people will only use a handful of library services. So instead of adapting an existing standard C library to work for my colleagues, I pieced a new one together, reusing bits from other projects. I used mostly bits from FreeBSD, and rewrote a few other bits. It wasn’t exciting work, but I console myself that the amount of work was still lower than what the “proper” approach of porting an existing project would require. And it worked!

Mostly.

There is an often underrated, absolutely gigantic piece of code in every standard C library, a part that I found impossible either to rewrite or to import into my project: the part that deals with “mathematics,” i.e. all the numerical functions.

All implementations of this that I could find have intricate source code that has been refined over the course of five decades. The resulting multitude of optional features and platform-specific optimizations makes importing this code a nightmarish work of its own.
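For contrast, here is what a numerical routine looks like when written naively, ignoring those five decades of refinement: a square root by Newton’s method. The name `naive_sqrt` is mine; real implementations add correct rounding, errno and floating-point flag handling, and per-platform tricks, which is precisely what makes them so hard to import.

```c
#include <math.h>   /* frexp, ldexp, nan: used only to seed the guess */

/* Newton iteration for sqrt(x). A real libm routine would also
 * guarantee correctly rounded results, set errno, and raise the
 * required floating-point exception flags. */
double naive_sqrt(double x)
{
    if (x < 0.0)
        return nan("");             /* domain error, minus the bookkeeping */
    if (x == 0.0)
        return 0.0;

    int e;
    frexp(x, &e);                   /* x = m * 2^e with m in [0.5, 1) */
    double y = ldexp(1.0, e / 2);   /* guess within a small factor of sqrt(x) */

    for (int i = 0; i < 8; i++)     /* quadratic convergence: 8 steps suffice */
        y = 0.5 * (y + x / y);
    return y;
}
```

A dozen lines for a passable sqrt; thousands of lines, in any real libm, to make it exactly right on every input and every platform. (Linking needs -lm on most Unix systems.)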

So for this bit, in order to preserve my sanity, I took another shortcut. I asked a tiny yet fully functional implementation called uClibc to lend me a hand. I did so using the same trick as for the intrinsics earlier: I asked uClibc to produce a math library for a regular SPARC platform. Then I started the build. Then I interrupted it as it was about to process the code I wanted. Then I hijacked the build process to use my puppeteering code instead of the GNU C Compiler it was expecting. Then I let it resume, and watched with pleasure as it churned out a math library suitable for my platform.

Again, I stopped caring as soon as my compiled code was ready. I did not modify the uClibc build system so that I could perform this operation again automatically in the future. Nor did I tell it about the characteristics of my new platform. Instead, I swiftly copied the compiled object code into my project, and I don’t intend to involve uClibc ever again unless I absolutely have to.

Tomorrow, I will wake up, and I will not know anything more about either GNU C’s or uClibc’s build processes and design principles than I did when I started.

With 20% of the time I accomplished 80% of the work. The remaining 20% would probably require 200% of the time, or more: reaching the same goal through the “official” or proper engineering processes of the projects I have reused would require many man-months’ worth of work. But the life expectancy of this particular project makes it likely that the remaining 20% of the work will never be needed anyway. I violated all those rules of “good” engineering to “get the work done,” and it appears I have succeeded within the five days I had set aside for it.

And in the meantime, I have learned most of the ways in which my colleagues’ new processor will twist the minds of programmers who plan to use it like a SPARC processor. This knowledge may not be directly useful, but it has influenced the way I look at my work. Mission accomplished.

Maybe next month or next year, I will be tempted or requested to do my magic again, a fourth time, then a fifth, or maybe even more. Every time, I trust that I will re-learn the incantations and trick new tools into reaching my goals.

Like my piano playing, my programming is powered by an unconscious drive. When I sit before the instrument, I have only a vague notion of the relationship between the number of hours invested, the quality of the end result, and how far I will be able to reuse my skills afterwards. I am typically not interested in producing a form of “aesthetics” recognizable by peers or able to impress audiences. I act for my own sake: I sit, and I carve my sense of accomplishment out of nothingness, note by note, line by line, ever trusting myself to move forward, step by step. Pride at succeeding where I previously doubted; pride at re-learning through sheer exercise what I had previously learned and forgotten; pride at my ability to eventually meet my self-appointed deadlines and deposit period-sounding semicolons at the ends of sentences; pride in my ability to vanquish passing dispositions to procrastinate. These, and not the end result nor the process, are my powering engines.

  1. That’s a very beautiful post. Please do keep on writing about your thinking process. It’s fascinating.

    You wrote that you’re interested in how a programmer’s mental model of computers changes when he (or she) meets a new sort of computer. That research topic is of great interest among historians of computing!

    best wishes,
    Edgar
