Key findings of the independent think tank Reform in The lawful society report:
- There has been a shift in responsibility in the criminal justice system, away from the individual and towards centralised institutions, demonstrated by 76% of Britons believing that the police and courts are responsible for controlling anti-social behaviour, compared to around 45% in France and Germany.
- Six out of ten people in Britain would be unlikely to challenge a group of 14-year-old boys vandalising a bus shelter – a higher proportion than in Germany, the Netherlands, Italy, France and Spain. In Germany, six out of ten would challenge the group.
- British people are more worried about crime and violence, with 43% reporting it as one of their greatest concerns, compared to 21% in Germany and 27% in the US.
- The UK spends the largest amount in the OECD on law and order as a percentage of GDP, with nearly 40% more in real terms spent in 2006/07 than in 1997/98. This is higher than the US, double that of Sweden, France and Denmark and around 50% greater than that of Canada, Germany and Japan.
- Administration costs across the criminal justice service have risen by around 10% since 2002/03 – faster than frontline expenditure, which has risen by 7% since 2002/03.
- International comparison shows that criminal justice is most effective where it is close to the public and has strong local accountability.
The Telegraph reveals that councils are recruiting, and paying, informers to snoop on their neighbours:
The youngsters are among almost 5,000 residents who in some cases are being offered £500 rewards if they provide evidence of minor infractions.
One in six councils contacted by the Telegraph said they had signed up teams of "environment volunteers" who are being encouraged to photograph or video neighbours guilty of dog fouling, littering or "bin crimes".
The "covert human intelligence sources", as some local authorities describe them, are also being asked to pass on the names of neighbours they believe to be responsible, or take down their number-plates.
(Henry Porter's latest column, Our obsession with crime is crushing our freedoms, comments on these news tidbits.)
What you publish today, what you give to others – who may store it in a database and then lose it (just look at all the hard disks, laptops and USB keys recently lost by the Government and banks) – what is obtained about you (through surveillance or sousveillance), and what you reveal while pursuing some specific goal, such as a search query (see Gregory Conti's Could Googling take down a president?) or a geo-tagged picture (see Jan Chipchase's Great to see you. Just not around here), may haunt you when you least expect it. It can be reused in a different context: you need to share your health data with your GP, and you may be OK with sharing it on the NHS Spine (if not, see the Big Opt Out how-to), but you're unlikely to want it widely available to insurers. It can also resurface years later in a completely different context, such as a potential employer or spouse checking up on you.
Living as a hermit, a recluse from society, is not an option, at least not for most of us, so we should look for other ways to tackle this issue, which will only grow in severity. You can't avoid surveillance, sousveillance and the unanticipated use of information leakages, but you do have the choice of whether to make it worse by blindly adopting the wisdom of the techno-utopian crowd and actively sharing your life with the world: twittering your every minute action (surely the only thing you should ever twitter is 'I'm posting a twit'?), uploading your geo-located pictures in near real-time, streaming your latest ramblings live, dribbling a stream of (un)consciousness on your blog (and reposting your twits on it), etc.
Forgetting is difficult. We all have a tendency to retain things 'just in case', and even if you do decide to delete some data, as it is copied, cached, backed up, etc., how can you be sure you've effectively destroyed all traces of it? This question has no answer, for example, for the few people who manage to get off the National DNA Database. The issue was highlighted in a recent CTO storage roundtable feature (published in the August 2008 issue of Communications of the ACM) exploring near-term challenges and opportunities facing the commercial computing community:
MACHE CREEGER: Now that we all agree that there should be a way to make information have some sort of time-to-live or be able to disappear at some future direction, what recommendations can we make?
MARGO SELTZER: There's a fundamental conflict here. We know how to do real deletion using encryption, but for every benefit there's a cost. As an industry, people have already demonstrated that the cost for security is too high. Why are our systems insecure? No one is willing to pay the cost in either usability or performance to have true security. In terms of deletion, there's a similar cost-benefit relationship. There is a way to provide the benefit, but the cost in terms of risk of losing data forever is so high that there's a tension. This fundamental tension is never going to be fully resolved unless we come up with a different technology.
ERIC BREWER: If what you want is time to change your mind, we could just wait awhile to throw away the key.
MARGO SELTZER: The best approach I've heard is that you throw away bits of the key over time. Throwing away one bit of the key allows recovery with a little bit of effort. Throw away the second bit and it becomes harder, and so on.
ERIC BREWER: But ultimately you're either going to be able to make it go away or you're not. You have to be willing to live with what it means to delete. Experience always tells us that there's regret when you delete something you would rather keep.
(An unexpressed assumption in the dialogue above is that the cryptographic algorithms used will never be broken.)
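To make the key-destruction idea concrete, here is a minimal Python sketch. It is illustrative only: it uses SHA-256 in counter mode as a toy stream cipher (not a production scheme), and the plaintext, key size and "check the decryption against the known plaintext" step are assumptions for the demo. Destroying the whole key makes the ciphertext effectively unrecoverable; discarding k bits of it, as Seltzer suggests, leaves recovery possible at a cost of roughly 2^k trials:

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """SHA-256 in counter mode as a toy keystream generator (illustrative only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream derived from the key
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream ciphers are symmetric

key = os.urandom(16)
ct = encrypt(key, b"medical record")

# "Real deletion": destroy `key` and the ciphertext is, in effect, gone.
# "Gradual deletion": forget 8 bits of the key; recovery now costs ~2**8 trials.
partial = key[1:]  # pretend the first byte of the key was discarded
recovered = None
for first in range(256):  # brute-force the missing byte
    candidate = bytes([first]) + partial
    if decrypt(candidate, ct) == b"medical record":
        recovered = candidate
        break
```

Each additional discarded bit doubles the recovery effort, which is what gives the scheme its "the longer you wait, the more deleted it is" character – and why, per Brewer's caveat, you must eventually live with the data being either recoverable or gone.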
One approach is to adopt what has worked sufficiently well for other issues. This is the conclusion reached by Daniel J. Weitzner, Harold Abelson, Tim Berners-Lee, Joan Feigenbaum, James Hendler, and Gerald Jay Sussman in Information Accountability (published as an MIT Computer Science and Artificial Intelligence Laboratory Technical Report in June 2007):
[I]nformation accountability through policy awareness, while a departure from traditional computer and network systems policy techniques, is actually far more consistent with the way that legal rules traditionally work in democratic societies. Computer systems depart from the norm of social systems in that they seek to enforce rules up front by precluding any possibility of violation, generally through the application of strong cryptographic techniques. In contrast, the vast majority of rules we live by have high levels of actual compliance without actually forcing people to comply. Rather we follow rules because we are aware of what they are and because we know there will be consequences, after the fact, if we violate them, and we trust in our democratic systems to keep rules and enforcement reasonable and fair. We believe that technology will better support freedom by relying on these social compacts rather than by seeking to supplant them.
Data protection legislation, available only in some countries, is one such example. It requires organisations that collect personal information to state the purposes for which it will be used and not to keep the data longer than necessary. Unfortunately, information commissioners often lack the resources and enforcement powers to make the consequences of a breach serious.
This month marks the 25th birthday of the GNU project, which led Richard Stallman to write the GPL and LGPL free software licences. These have been a very successful use of legal compacts to ensure a set of freedoms for software users. We need to figure out, and negotiate, what set of societal rules we must follow to ensure information accountability.