Mr. Dillon; smartphone innovation in Europe ought to be about people’s privacy

Dear Mark,

You and your team are working on the Jolla Phone. I’m sure you’re doing a great job, and although I think it remains hype and vaporware until we can actually buy the damn thing, I trust you to lead them.

As their leader, I would like you to allow them to provide us with all of the device’s source code and the build environments of their projects, so that we can reproduce the exact same binaries. By exact I mean bit-for-bit identical, verifiable with MD5 checksums. I’m sure you know what that means, and we both know that your team knows how to provide geeks like me with this. I worked together with some of them on Nokia’s Harmattan and Fremantle, and you can easily identify who can make this happen.
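
To make the idea concrete: given the published source tree and the exact build environment, anyone should be able to rebuild a shipped library and compare checksums. A minimal sketch of that comparison in Python (the file paths here are hypothetical, purely for illustration):

    import hashlib

    def md5sum(path, chunk_size=8192):
        """Compute the MD5 checksum of a file, reading it in chunks."""
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical paths: the binary shipped on the device versus
    # the one we rebuilt ourselves from the published source tree.
    shipped = md5sum("/usr/lib/libcontacts.so")
    rebuilt = md5sum("./build/libcontacts.so")
    print("match" if shipped == rebuilt else "MISMATCH: binaries differ")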

The reason is simple: I want Europe to develop a secure phone that can be trusted the way the Linux kernel and other open source projects are trusted: through peer review of the source code.

Kind regards,

A former Harmattan developer who worked on a component of the Nokia N9 that stores the vast majority of the user’s private data.

P.S. I also think that you should reunite Europe’s finest software developers and secure the funds to make this work. But that’s another discussion, and one I’m eager to help you with.

5 thoughts on “Mr. Dillon; smartphone innovation in Europe ought to be about people’s privacy”

  1. http://cm.bell-labs.com/who/ken/trust.html

    It’s important to accept that you basically have to trust somebody, at some point. It’d be harder, of course, to insert back doors when the source code is visible, but there are ways around that.

    Standardize a subtly broken PRNG, say. Or subvert the compiler. Or get the security to look roughly right, and have all the right buzzwords, while leaving a hole big enough to drive a truck through. Often that last one doesn’t even need external assistance, sadly.
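
    To illustrate the PRNG point: a generator can pass a quick glance and still be worthless if its seed space is tiny. A deliberately broken sketch in Python (not any real standard, purely illustrative):

        import random
        import time

        def weak_session_key():
            """Looks like a 128-bit key, but the seed carries only
            16 bits of entropy, so an attacker can simply try all
            65536 possible seeds and reproduce the 'random' key."""
            seed = int(time.time()) & 0xFFFF  # deliberately tiny seed space
            rng = random.Random(seed)
            return rng.getrandbits(128).to_bytes(16, "big")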

    There are lots of good reasons why having the complete source code to your phone would be useful and interesting, and it certainly mitigates some forms of attack, but in practice you’re going to end up blindly relying on the hardware supplier and the tool supplier.

    Concentrate instead on working out how you could run untrusted code on the platform in a safe manner. That’s the angle Android went for; I think it’s fundamentally a good angle, though I wonder if more transparency on the network traffic would be helpful.
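
    As one small illustration of that sandboxing angle, you can at least cap what untrusted code may consume; a crude, Linux-only sketch in Python (a real platform sandbox, like Android’s permission and process isolation model, does far more):

        import resource
        import subprocess

        def run_untrusted(cmd, cpu_seconds=1, mem_bytes=64 * 1024 * 1024):
            """Run an untrusted command with hard CPU and memory limits.
            This only bounds resource use; a real sandbox would also drop
            privileges and restrict filesystem and network access."""
            def limit():
                resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
                resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))
            return subprocess.run(cmd, preexec_fn=limit,
                                  capture_output=True, timeout=cpu_seconds + 5)

        # Hypothetical usage: run an untrusted script under those limits.
        # result = run_untrusted(["python3", "untrusted_script.py"])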

    1. Hi Dave, nice to hear from you after such a long time!

      Given the possibility of a compiler backdoor, I agree that “It’s important to accept that you basically have to trust somebody, at some point. It’d be harder, of course, to insert back doors when the source code is visible, but there are ways around that.” However, providing the source code for a device like a smartphone makes it far easier for people like us to verify the device and its functionality.

      Given that more and more Internet protocols are encrypted on the wire (Apple AirPlay and Miracast come to mind), we as engineers are left investigating individual devices rather than the encrypted protocols themselves (for which we’d need super- or quantum computers to read the in-transit data). Which means that in order to do a proper peer review pragmatically and realistically ‘today’, we need verifiable source code (i.e. MD5 checksums of the binaries built from it).

      Encrypted protocols end up meaning that we can’t read whatever is being sent by our devices: that makes concentrating on how to run untrusted code on the platform in a safe manner either unworkable or very inefficient. Who knows what the platform has sent, encrypted, unless I can investigate the platform itself? Will you provide me with hardware that can decrypt that traffic in a timely manner (within the next few years)? Please do if you can. Even though I’m sure I’ll have access to superior hardware in the future, I don’t have it right now. The latter fact is the relevant one; the former, right now, isn’t.

      I have relatively few ‘off-the-shelf’ answers to hardware attacks. I must admit that we’ll need to do both to tackle those.

    2. I agree that we need to acknowledge that “you basically have to trust somebody, at some point”. However, I have never understood how that is a reason to dismiss calls for shortening the “chain of uncertainty”, i.e. limiting the number of components we blindly have to trust other people with.

      An operating system is a major part of the whole system, and if everyone could audit it in an open manner, that would reduce the related risk immensely.

      Open hardware would be good. Open software would be good. We shouldn’t stop demanding one just because the other is more problematic to achieve.

    1. MD5, or any other well-known, sound method that allows us to verify binaries against source code. MD5 is an industry standard often used for this purpose.
