One Thing To Do Today: Truth vs. Checksums

Reality is that which, when you stop believing in it, doesn’t go away.
― Philip K. Dick, I Hope I Shall Arrive Soon

So now we have all these records and backups made. What if we want to share them with someone else? We’re moving more into the Sanity section here, where we want information that can be verified by more than keeping our fingers crossed. Perversely, I have trouble conveying how deeply sacred I think the transference of an idea from one person to another to be. As a poor substitute, let’s talk about the fragility of the process. Successful communication has so very many steps, each vulnerable to failure, shown to hilarious effect in the screwball comedies of the 1930s and to great tragedy by Shakespeare.

  • Is the requestor of information who they say they are?
  • How can they be sure I’m who I say I am?
  • Did I understand the nature of their request?
  • Do I have the relevant information?
  • Do I have permission to distribute the relevant information?
  • Can my response arrive in a timely manner?
  • (Will my message get there unread by 3rd parties?)
  • Will my response message get there unchanged?
  • Will my response be properly understood?
  • Is the person delivering the message actually sent by me?
  • Suppose they accept that the message is from me and that it’s what I sent. Can they verify that my message accurately portrays a situation? The receiver should wonder: could I be delivering information that is:
    • uncertified: I haven’t done sufficient work to check it
    • accurate but irrelevant
    • inaccurate because that’s how it was delivered to me by my own sensor network
    • inaccurate because that’s how it was delivered to me by a third party
    • inaccurate because I ran a faulty algorithm on good quality data
    • inaccurate because I ran a good algorithm on bad quality data
    • inaccurate because I mean it to be
  • How can I maintain a record which can prove the actual content of my sent messages?

Whether you call this Epistemology or Information Theory, whether mediated through computers or not, trust is hard. From secret pass phrases, to sealing wax, to handwriting analysis, to… checksums? It’s all an arms race through time. The more sophisticated the tech, the more clever the attacks.

Companies with products designed to enhance the privacy or security of communications don’t litter their marketing materials with jargon only to dazzle the uninitiated. Specific technologies protect very specific elements of the communication process. The jargon communicates what narrow slice of the puzzle the company will be attempting to certify. When companies won’t name the techniques used, but instead float fluffy words like “safe,” “private,” “secure,” or even “encrypted,” start to worry.

So let’s make an example out of some words frequently used together that someone evaluating this type of software might mistake for absolute promises of truth and authenticity: “hashed,” “checksum,” and “fingerprint.”

Let’s say my mom’s very reliable mail carrier delivers to her a wooden crate filled with tasty looking chocolate chip cookies with my return address on it. Inside the crate is an envelope with a message, “Hey Mom, I’ve sent you a brown cardboard box that is 8″x12″x4″, weighs 2lb 6oz, and sealed with purple packing tape. Inside is a dozen oatmeal raisin cookies.” This apparent conflict will hopefully make her suspicious enough to call me before she actually eats the snacks.  If my message had only said, “here are cookies I hope you’ll like,” she would have no clue that perhaps someone had intercepted my package and swapped it with their own.   However, if our cookie crook had been capable of either exactly duplicating the package or swapping in their own note, we’d have a problem again.

My decision to sum up my package as a description of its dimensions, weight, tape color, and the type and number of cookies was the hashing algorithm I used to create the checksum represented by the included note. If I had sent my note separately from the package instead of inside it, that would have been more like a fingerprint.
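The cookie note can be sketched in code. This is an illustrative toy, not a real protocol: the message text stands in for the package, and Python’s standard `hashlib` plays the role of my note-writing scheme, turning arbitrary contents into a short, fixed-size summary that the receiver can recompute and compare.

```python
import hashlib

# The note in the crate is a checksum: a short summary of the contents,
# produced by a hashing algorithm (here, SHA-256).
promised = b"a dozen oatmeal raisin cookies"
checksum = hashlib.sha256(promised).hexdigest()

# Mom hashes what actually arrived and compares it to the note.
delivered = b"tasty looking chocolate chip cookies"
if hashlib.sha256(delivered).hexdigest() != checksum:
    print("Conflict! Call before eating.")
```

Just like the note, this only detects a mismatch between the description and the contents; it says nothing about who wrote the description.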

There’s not really a shared secret that only she and I would know to ensure that someone isn’t trying to impersonate me. Also, nothing about any of this means that I sent cookies my mother would actually like the taste of, or that I haven’t used an ingredient she’s allergic to, etc. Heck, she could even be in the middle of a dream. All she’s got is that, assuming the situation is actually happening, someone claiming to be me sent her cookies that match what they said they’d be sending. It’s not the Complete Truth about who sent the cookies, why, and what’s in them, but it’s not nothing either.
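The shared-secret gap mentioned above is exactly what an HMAC (a keyed hash) addresses. Here’s a minimal sketch using Python’s standard `hmac` module; the secret string is made up for the example, standing in for something only the two parties know.

```python
import hashlib
import hmac

# A secret known only to sender and receiver turns a plain checksum
# into an authenticated one. "family-recipe" is a made-up example key.
secret = b"family-recipe"
message = b"a dozen oatmeal raisin cookies"
tag = hmac.new(secret, message, hashlib.sha256).hexdigest()

# An impostor without the secret can't forge a matching tag,
# even if they know the message and the algorithm.
forged = hmac.new(b"wrong-secret", message, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, forged))  # False: forgery detected
```

Now a tampered package *and* a swapped note both fail verification, because writing a valid note requires the secret.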

This Computerphile video explains how computers implement these schemes in a way which is perhaps much more useful than my care package analogy.

If you want more on this topic, Computerphile also has a short playlist on some related Information Theory topics.  I also quite liked Eddie Woo’s Parity and Checksum videos that came at the end of his very accessible Communications & Network Systems playlist.
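As a taste of the parity idea those videos cover, here’s a tiny sketch: a single extra bit chosen so the count of 1s is even lets the receiver detect any single flipped bit.

```python
# Even parity: append one bit so the total count of 1s is even.
def parity_bit(bits):
    return sum(bits) % 2

data = [1, 0, 1, 1]                 # four data bits
sent = data + [parity_bit(data)]    # ones-count is now even

# Flipping any single bit in transit breaks the parity check.
received = sent.copy()
received[1] ^= 1
print(sum(sent) % 2 == 0)      # True: arrived intact
print(sum(received) % 2 == 0)  # False: corruption detected
```

Note that two flipped bits would cancel out and slip through, which is why real systems layer on stronger checksums.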

So how is it that we know what we know? What information can be trusted? These questions have tangled up the best minds that have ever lived, so no, there is never going to be an App for that. We mere mortals have hope, though. We can build up confidence in thin layers, each with its own specific process. When trust has been devastatingly corroded, baby steps make the fastest progress.



I make things that do stuff. The best, though, is teaching others to do the same. Founder of @crashspacela Alum of @ITP_NYU
