
March 20, 2009

Holy hand grenade

You couldn't make this stuff up. Here's slashdot's summary, which says it best:

Bomb disposal teams were called in and a nearby pub evacuated after water company engineers mistook a Monty Python film prop for a hand grenade. After nearly an hour of examination by bomb experts, they counted to three. No more. No less. Three was the number they counted, and the number they counted was three. Four they did not count, nor two, except to proceed to three. Five was right out. Once the number three had been reached, being the third number, they declared that the grenade was actually a copy of the "Holy Hand Grenade of Antioch" used in the film Monty Python And The Holy Grail. A police spokeswoman confirmed that the device was a toy and that it had been no danger to the public.

(tip to /.)

posted March 20, 2009 07:16 PM in Humor | permalink | Comments (1)

March 11, 2009

Pattie Maes

The more I think about this, the cooler I think it is.

Eons ago when I was looking at different grad schools, I remember liking Pattie Maes's work on software agents. Her "fluid interfaces group" seems just as interesting.

posted March 11, 2009 05:27 PM in Pleasant Diversions | permalink | Comments (0)

March 06, 2009

Oh so clever

(thanks xkcd)

posted March 6, 2009 10:36 AM in Humor | permalink | Comments (1)

March 01, 2009

The future of privacy

Bruce Schneier (wikipedia page) recently wrote a nice essay on the consequences of computer technology for privacy. Here's the opening paragraph:

Welcome to the future, where everything about you is saved. A future where your actions are recorded, your movements are tracked, and your conversations are no longer ephemeral. A future brought to you not by some 1984-like dystopia, but by the natural tendencies of computers to produce data.

Schneier hits the nail on the head: increasingly, our actions and statements are not lost in the sands of time, but are recorded, stored, and analyzed for profit and power. Sometimes recording information about yourself is desirable, since it can create a convenient way to remember things you might've forgotten [1]. But right now, it's rarely you who actually stores and controls the data on yourself. Instead, corporations and governments "own" data about you [2], and use it to advance their own interests.

Ideally, we would each choose how much personal data to divulge and to which party we divulge it based on how much we value the services that use our personal data. For instance, advertising is a modern annoyance that could theoretically be made less annoying if advertisers could be more targeted. That is, part of the reason advertising is annoying is that most of the ads we see are not remotely interesting to us. Enter the miracle of personal data: if only advertisers knew enough about each of our real interests, then they would know which ads weren't interesting to us, and they would show us only ads that were actually interesting! This argument is basically a lie [3], but it highlights the idea that there should be a tradeoff between our privacy and our convenience, and that we should get to choose which we value more.

My favorite example of this tradeoff is Facebook, a place where people divulge all sorts of private information. Given that Facebook holds a treasure trove of demographic data, interests, social connections, and advertising targets, it's an obvious business plan to try to monetize it through advertising. Facebook's efforts to do so, e.g., their Beacon initiative and the recent revision to their Terms of Service, have provoked a strong backlash because people really do care about how their personal data is used, and whether it's being used in a way that serves their interests or another's [4].

Another Facebook example comes from employers monitoring their employees' Facebook pages, and holding them accountable for their private actions (e.g., here and here). This issue exemplifies a deeper problem with the public availability of private data. Schneier mentions in his opening paragraph that it's bad that conversations are often no longer ephemeral. But what does that really mean? Well, consider what it might be like to try to run for public office (say, Congress) in 2030, having grown up with most of your actions and statements being recorded by Facebook, by modern advertisers, etc. During the campaign, all those records of the stupid, racy, naive things you did or said when you were young, innocent, and didn't know better will come back to haunt you. In the past, you could be assured that most of that stuff was forgotten, and you could grow into a new, better, more mature person by leaving your past behind. If everything is recorded, you can never leave your past behind. Nothing is forgotten, and nothing is truly forgiven.

So, the cost of losing our privacy is not measured simply in terms of how much other people know about our current actions and attitudes, which is a high cost anyway. It's also the cost of defending our past actions and statements (potentially even those of our childhood), and of having them judged by unfriendly or unsympathetic voices. Sometimes I wonder whether blogging now will hurt me in the future, since it would be easy for a future potential employer to trawl my blog for statements that seem controversial or attitudes that they deem undesirable. There used to be a stronger respect for the division between public and private lives, but I think that's been fading for a long time now. Blogs are public forums. Facebook is a semi-public forum [5]. Your workplace is under surveillance by your employer. Your streets are watched by the government (for your protection, naturally). In fact, the only truly private place is your home [6].

The upside of recording everything, and a point missed by Schneier, is that it's not easy to use all this data in a coherent and coordinated fashion. Credit card companies know a tremendous amount about each of us from our purchase histories, but they struggle to use that information effectively because they don't have the computational tools to individually understand their customers. Instead, they build aggregate profiles or "segments", and throw out all the other details. Although the computational tools will certainly improve, and there will be startling revelations about how much corporations, governments, and our neighbors know about us, I'm not terribly worried about the dystopian future Schneier paints. That is, most of us will be hiding in plain sight because there will be too much information out there for any of us to stick out. The real dangers lie in believing that you shouldn't be careful about what you let be recorded, that you can avoid being noticed regardless of what you do or say (aka security through obscurity), or that you can continue hiding once you've been noticed. Privacy is not dead, it's just a lot more complicated than it used to be.


[1] My favorite feature of Safari is the "Reopen All Windows From Last Session" one, which lets me remember what I was looking at before I rebooted my computer, or before Safari crashed.

[2] Who owns this data is a critical question that will ultimately require an act of Congress to sort out, I think. (I wonder if copyright law will eventually be applied here, in the sense that I "author" the data about myself and should thus be able to control who is able to profit from it.)

Generally, I come down on the side of personal control over data about yourself, at least for private parties. That is, I should be able to authorize a company to use data about myself, and I should be able to revoke that authority and know that the company will not store or sell information about me to another party. With governments, I think the issue is a little trickier, since I think they have legitimate reasons to know some things about their citizens.

[3] The marginal cost of showing an ad to someone who's not interested in it is so low that you'd be crazy to expect economics to drive advertisers to show you fewer of them. Besides, it's hard to say before seeing an ad whether we're actually not interested in it.

[4] This point makes it clear that businesses have a legitimate path to getting a hold of their customers' personal information, which is to give people something in return for it. Ideally, this would be a customized service that uses the personal data to make better recommendations, etc., but sadly it's often a one-time payment like a discount, and the data is then sold to advertisers.

[5] To their credit, Facebook gives its users better control over who can see what aspects of their profile than many past social networking websites.

[6] And if you live in a city, attached to city services, even your home is not as private as you might think. One of my favorite examples of this comes from testing the raw sewage of a neighborhood for traces of illegal drugs.

posted March 1, 2009 01:48 PM in Things to Read | permalink | Comments (3)