Hacking aircraft for fun and profit

Modern commercial jets make use of AFDX networks for sending and receiving control and sensor data. The AFDX protocol is based on Ethernet, and (if you’re familiar with the OSI model) is identical up to layer 2. This means two things. First, that AFDX traffic can be (mostly) routed by standard Ethernet hardware. And second, that Ethernet software tools can (sometimes) be used to troubleshoot and hack AFDX networks.
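To make that concrete, here’s a rough sketch of what simply reading AFDX frames off a commodity port looks like. It’s plain C against libpcap/WinPcap (my own illustration, not code from any of the projects mentioned below), the interface name is a placeholder, and I’m assuming the usual ARINC 664 Part 7 convention that the Virtual Link ID rides in the last two bytes of the destination MAC.

```c
/* Minimal sketch: capture raw Ethernet frames and pull out the AFDX
 * Virtual Link ID, assuming the ARINC 664 convention that the VL ID
 * occupies the last two bytes of the destination MAC address.
 * The interface name "eth0" is a placeholder. Builds against libpcap/WinPcap. */
#include <pcap.h>
#include <stdio.h>
#include <stdint.h>

static void on_frame(u_char *user, const struct pcap_pkthdr *hdr,
                     const u_char *bytes)
{
    (void)user;
    if (hdr->caplen < 14)                 /* need a full Ethernet header */
        return;
    /* Destination MAC is bytes 0-5; the VL ID sits in the final two bytes. */
    uint16_t vl_id = (uint16_t)((bytes[4] << 8) | bytes[5]);
    printf("frame len=%u  VL=%u\n", hdr->len, (unsigned)vl_id);
}

int main(void)
{
    char errbuf[PCAP_ERRBUF_SIZE];
    /* Placeholder device name -- use pcap_findalldevs() to pick the real port. */
    pcap_t *h = pcap_open_live("eth0", 65535, 1, 100, errbuf);
    if (!h) {
        fprintf(stderr, "pcap_open_live: %s\n", errbuf);
        return 1;
    }
    pcap_loop(h, -1, on_frame, NULL);     /* capture until interrupted */
    pcap_close(h);
    return 0;
}
```

Point that at a mirrored switch port and frames and their Virtual Links scroll by, even though the laptop has no idea what any of it means yet.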

The problem is that such tools are not designed to handle a number of the things that AFDX does. AFDX is deterministic, redundant, and more fault-tolerant than standard Ethernet. And so you generally need specialized hardware and software to interface with AFDX.
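To give one example of why: the “redundant” part means every frame goes out on two independent networks (A and B) with a one-byte sequence number tacked onto the end of the payload, and the receiving end is expected to deliver the first good copy and quietly drop the duplicate. Here’s a toy version of that “first valid wins” logic, with made-up names and none of the skew-window bookkeeping a real end system would need:

```c
/* Sketch of "first valid wins" redundancy management, assuming the standard
 * AFDX convention of a one-byte sequence number appended to each frame.
 * Field names are illustrative; a real end system also tracks arrival skew. */
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

struct vl_state {
    uint8_t last_seq;   /* sequence number of the last frame delivered */
    bool    seen_any;   /* false until the first frame arrives on this VL */
};

/* Return true if this copy should be delivered, false if it is the
 * duplicate already received on the other network (A or B). */
static bool accept_frame(struct vl_state *vl, uint8_t seq)
{
    if (vl->seen_any && seq == vl->last_seq)
        return false;                    /* redundant copy: drop it */
    vl->last_seq = seq;
    vl->seen_any = true;
    return true;
}

int main(void)
{
    struct vl_state vl = {0, false};
    uint8_t copies[] = {7, 7, 8, 8};     /* same frame arriving on net A, then B */
    for (int i = 0; i < 4; i++)
        printf("seq %d -> %s\n", copies[i],
               accept_frame(&vl, copies[i]) ? "deliver" : "drop");
    return 0;
}
```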

But it doesn’t have to be that way. A laptop’s Ethernet port should be able to read and write AFDX traffic just fine. The only reason that it cannot is that it doesn’t understand the upper-level protocols. There have been a few projects to rectify this, and they have made use of the WinPcap libraries for low-level traffic reads and writes. And then they stopped there, because those involved were happy to leave it at the C-code level and lock it away behind corporate-secrecy firewalls.
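For what it’s worth, the write side at that C-code level isn’t much harder: WinPcap will push a hand-built frame out the port via pcap_sendpacket(). The sketch below (again, my own illustration, not theirs) only fabricates the Ethernet header; the source MAC and payload are placeholders, and a real AFDX frame would also need the IP/UDP layers and the trailing sequence number filled in.

```c
/* Sketch of injecting a hand-built frame with pcap_sendpacket().
 * The source MAC is a placeholder, and only the Ethernet layer is built here;
 * a real AFDX frame would also carry IP/UDP and a trailing sequence number. */
#include <pcap.h>
#include <string.h>
#include <stdint.h>

int send_afdx_frame(pcap_t *h, uint16_t vl_id,
                    const uint8_t *payload, size_t len)
{
    uint8_t frame[1514] = {0};
    if (len > 1500)                       /* standard Ethernet payload limit */
        return -1;

    /* Destination MAC: constant AFDX prefix + VL ID in the last two bytes. */
    uint8_t dst[6] = {0x03, 0x00, 0x00, 0x00,
                      (uint8_t)(vl_id >> 8), (uint8_t)(vl_id & 0xFF)};
    uint8_t src[6] = {0x02, 0x00, 0x00, 0x00, 0x00, 0x01};   /* placeholder */

    memcpy(frame, dst, 6);
    memcpy(frame + 6, src, 6);
    frame[12] = 0x08; frame[13] = 0x00;   /* EtherType: IPv4 */
    memcpy(frame + 14, payload, len);

    return pcap_sendpacket(h, frame, (int)(14 + len));
}
```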

That kind of secrecy left me somewhat less than happy, and so I’ve written a suite of LabVIEW libraries that can hijack a PC’s Ethernet port [note to the NSA: when I say “hijack”, I’m talking about taking control of an Ethernet port, not an airplane] and read, write, and otherwise manipulate AFDX traffic. If I get clearance to do so from my client, I’ll open-source these libraries. And maybe write an article on it. I’m really hoping that I can share this with the world in some way, because it’s a genuinely neat thing that fills an as-yet-unfilled niche.

Stay tuned for details!

A delicate balancing act…

There is so much of what I do that I want to write about. Much of the work that I do would make for fantastic conference or journal articles. And some of it would even have made a great master’s or doctoral thesis. BUT… the reality of the situation is that I am almost constantly under some non-disclosure agreement or other. Not that the work I do is terribly secretive. There’s no national security issue (usually) and no chance of any disclosure actually hurting whatever company I’m working for.

But the knee-jerk reaction nowadays is to hide everything that everyone does, all the time. Just in case. As though my obscure bit of network queuing code would sink the company were it ever revealed. From the standpoint of furthering the art, this is not a wise policy. From the standpoint of furthering my career, it’s damned annoying.

As always, XKCD said it best…

https://xkcd.com/664/

Failure must always be an option…

I am a scientist (if you know me at all, you’re saying “duh” right about now) but I am not a science cheerleader. By this I mean that I do not try to uphold the ivory tower at all costs. Primarily because, if we start to do this, then we are no longer doing science. That said, let me shed some light on a glaring problem with the way that science is done nowadays.

Most institutions are “publish or perish” in fact, if not in outright policy. This means that, as a working scientist, you are regularly expected to publish your results. This part, I’m actually okay with, in principle at least. Putting things into the public domain is a good thing. But now for the two not-so-good things (there are more than two, but I’ll only talk about these today).

First, most journals do not put their content into the public domain. You have to pay (and pay through the nose) in order to see it. This is not conducive to good science. Mind you, there are attempts to mitigate this. There’s arXiv, the physics pre-print archive; the Public Library of Science (PLOS), with its bioscience-related content; and most journals now have a free-content section. There are even (illegal) torrent sites and aggregators dedicated to swiping content from closed journals and sharing it with the world (nope, I won’t provide a link for those). So this is slowly getting a bit better.

Second, and much more importantly, failure is not an option when it comes to publication. With very few exceptions, only successful experiments and proven theorems are accepted for publication. This is so absolutely wrong that it almost defies logic. Science would be far more transparent and progress much more rapidly (and, more importantly, honestly) if null results could be published. Again, this is slowly starting to change; there have recently been attempts to rectify it to a degree. The Journal of Negative Results is one such attempt, though it limits itself to the biosciences.

Clearly these two factors are a huge hindrance to the reasonable progression of scientific research. I myself have been stymied in the past, needing to see a particular set of results but being unwilling or unable to pay the exorbitant journal access fees. Additionally, I could have been saved a lot of trouble had null results been published. But that’s not how scientific publishing works. And so I (and countless others) have wasted a significant amount of time following paths that could easily have been avoided, if only access were more open and honest failures held in equal esteem to successes.

I’ll end it here, though I’ll pick this up again shortly. And if you’d like to read more, here’s a better-written article:

Unpublished Results Hide the Decline Effect