The tempest around the recent Carrier IQ “spyware” issue serves as an important example of a key advantage of an open platform like Android over a closed-source, locked-down alternative (of course, we’ll use iOS as the example of the latter).
To be clear, before we begin: my point is *not* about the degree of “bad” present in the various CIQ implementations. I acknowledge that (assuming you trust its statements on the matter, which I’m not arguing here) Apple allowed the use of CIQ in the past in a much more limited capacity than some of the other cases, and it claims that it is even more limited in later releases. That’s great. Wonderful. Not what I’m talking about here, though.
The point I *am* making is that I don’t want to have to take the word of the carrier or the device maker on issues like this. All of them came out with similar statements denying the degree to which the “bad stuff” happened. Some were proven to be lying. Some may have been telling the truth. Doesn’t make much difference to me in this scenario.
My point is that you can take any instance of something like this and evaluate an important question. In order to avoid confusing the issue with the irrelevant details of the CIQ case, let’s (for the purposes of the rest of this post) substitute a different, totally fictional and hypothetical but similar discovery.
Let’s say it comes out in January that HTC, Motorola, and Apple all made deals with “DJR” (fictional) software in the past, and they all (to varying degrees) stored and shared some extra information you’d rather they didn’t.
The most important question (IMHO) if I’m a customer using a device where something like this has been discovered is “what can I do about it?”
If I’m an Android user, there are several answers to that question. I could buy a different phone (since I have many to choose from) from a different carrier or manufacturer that hasn’t made the particular poor choice I have a problem with. Or I could install an open-source custom ROM on the device I have now. This may (in some cases) void my warranty, but it’s at least an option I can consider.
On the other hand, if I’m using a system like Apple’s iOS, I have nowhere to turn. There are no other iOS devices (not made by Apple) to choose from if I don’t like what Apple has decided to do on the one I have. I certainly can’t install some alternative “distribution” of iOS, since those don’t exist. Even if the source were open (or obtained by other means) and it were technically possible for someone to build an alternative *full* iOS ROM (as opposed to simply jailbreaking the stock Apple one, which doesn’t solve problems like this), it would be illegal to distribute, since doing so would violate Apple’s copyrights.
Rather, the only real choice I would have as an Apple customer would be the decision of whether I’m willing to just accept it or whether it’s a big enough deal for me to leave them over.
That last point is the one that really hit me with this, and I think it provides some insight into why some people who are really into Apple are so reluctant to ever admit that Apple has done anything “wrong” or negative, in any situation. Perhaps it’s because they know deep down that if they do acknowledge anything of that sort but continue to use Apple products anyway, they are effectively saying “and I’m willing to live with that because I want to use iOS and there’s nothing else I can do about it”.
Ultimately, that’s the point I’m making here. One of the benefits of a free / open platform is not being boxed in to those kinds of all-or-nothing choices.
Two things:
#1) This CIQ issue should make you think about which company you trust. Personally, as I’ve stated before, I trust Apple with my personal information because they haven’t done anything to prove that trust wrong. The biggest privacy issue of recent memory (since recent history matters) is the “LocationGate” issue, where location vicinities triangulated from cell towers were cached on the phone and saved to iTunes without being encrypted by default (they were encrypted if you chose to encrypt backups, which I don’t believe is the default). That was a non-issue IMO for many reasons, since it meant someone would need to steal your physical phone or computer, and if that happened they’d get information freely available in many other places (social media: Facebook, Flickr, Twitter geo-location, …). The point is, Apple did not allow a carrier to install CIQ to capture sensitive information (or install it for themselves) without the consent of the user. I can’t emphasize that enough: the user had to agree to sending diagnostic data to Apple.
Seeing CIQ on almost every phone other than iOS makes me trust Apple even more[1]. I don’t know how it couldn’t; instead, you’re trying to include them in this debate when they shouldn’t be. Your point still stands, but it’s based on a story that’s wrongly associated with it.
#2 …
Following up on the first point.
You’re assigning guilt by association. CIQ in iOS is completely different from the CIQ issue we’re seeing on pretty much every device except the two closed ones: Windows and iOS.
Don’t take my word for it; read it from the iOS hacker who confirmed it by looking at the code. He’s also the one who found the CIQ strings and associated iOS with CIQ and this story. http://blog.chpwn.com/post/13572216737?fe250de0
If you don’t trust Apple’s statements, consider what chpwn says in his update: “Update: From my examinations, Apple’s recent statement on the issue appears to be entirely accurate.”
If you don’t, that’s your right, but dismissing his opinion would be short-sighted since he knows more about this than you or I combined.
#3
Your hypothetical scenario has some holes.
You can “root” iOS, and people have distributed full OS installs (I’ve already brought this to your attention in another debate). Whether it’s illegal for the hackers shouldn’t be a concern for an iOS user (i.e., everyone reading this); the option is still available, and it comparably “breaks your warranty” (since reinstalling the stock OS makes it look as though nothing happened).
Since devs/hackers have distributed hacked versions of iOS (the full OS, not just a hack), there’s nothing stopping an alternative distribution without the offending hypothetical rootkit installed. That’s assuming you’re talking about some rootkit and not some daemon that a jailbroken iPhone can’t (for some unknown reason) disable (which I think is possible, but I’ll go along with the strange hypothetical you’re using to prove a point).
I won’t disagree that Android has more options, but to say you don’t have any under iOS is inaccurate. Regardless, I think you can make your case better if you don’t argue against Apple’s privacy policies (since they’re really good compared to any other company’s, especially Google’s) and instead argue against their lock-in model with the App Store, which is hard to argue against since every phone app store provides lock-in to the phone’s OS (Windows, Android, iOS, and BB).
[1] I also trust the Google-directed Nexus devices and Windows Mobile more.
Warning: I didn’t proofread this; it took me too long to write this and read your article.
Re: #1 and #2 – I think I was pretty clear about being uninterested in discussing the details of this particular CIQ incident, as it’s irrelevant to my point. I think I was also clear that I’m not disputing Apple’s statements regarding it.
The CIQ details are a distraction from my point. You’re saying you’re OK with trusting Apple, but my whole point is that I don’t want to be in a situation where I’m forced to make the choice to trust any company, with the only option being to not use that platform if I don’t trust them. That’s the position that iOS users are in.
Re: #3 – I think you’re wrong on that. In the scenarios you’re describing, where people distribute their own “custom” versions of iOS without access to the source, the best they can do is try to hack *out* (from the compiled, finished binary) whatever the offending components are, and hope they got them all. That’s hugely different from being able to look to knowledgeable, independent third parties to actually build another version of the full OS from source and be able to verify that there’s nothing shady or objectionable going on.
The latter part of #3 diverges a bit into other issues that I also have strong opinions on, so I’ll briefly cover the lock-in issue, but I think it’s unrelated to this point. Again, the primary point of this post has nothing to do with Apple’s privacy policy. Rather, it is that with closed systems like iOS and Windows, you are forced to trust what they’re doing, and if you don’t like something you eventually discover, your only real option is to leave. That isn’t the case on Android.
As to the issue of lock-in, we’ve been over this many times before, but it is utterly ridiculous to try to equate or put on the same level the degree of “lock-in” as it relates to Apple’s app store compared to Google’s. (I’m not as familiar with the Windows or BB ones, so I won’t comment on those). The lock-in in the Apple app store operates on many levels that *do not apply* on Android.
The issues are somewhat similar to the ones in this post, in that if you want to leave, you have no options on iOS. Your investment in apps on their store is not transferable to any devices not sold by Apple. In stark contrast, if I decide that I don’t like something that my device maker (HTC, Motorola, Dell, or anyone else) or carrier is doing, I can ditch them and pick up an Android phone made by someone else.
If it turns out that Google has put something into the OS that I don’t like, I can run a custom build of Android that doesn’t include those objectionable parts, and my investment in apps, etc. is completely transferable to that new build. Again, in contrast, if you don’t like something that Apple has put into iOS, you are stuck. If you do choose to leave, that app-store lock-in bites you hard, because you can’t take any of that stuff with you to anyone but Apple.