On Saturday, ORGCon2014 happened at King's College on the Southbank. It's an event organized by the Open Rights Group, a UK-based organization that campaigns for the rights of individuals on “…issues ranging from mass surveillance, to copyright, censorship, data protection and open data and privacy.” Among its patrons, notably, is Neil Gaiman.
Here’s a link to the ORGCon2014 brochure in case you’re the sort of person interested in brochures.
Cory Doctorow did the keynote and expounded on his three laws.1
Panels covered ‘if you have nothing to hide, you have nothing to fear’ (fascinating word bombs dropped by nomadic hacker, artist, and designer Eleanor Saitta); what big tech companies are doing in the age of mass surveillance (here’s an example of them going the extra mile in doing their own surveillance); the state of NSA surveillance; social data; ISP tracking; drone strikes (brilliant visualization here, accompanied by a brilliant talk by Jennifer Gibson, a human rights lawyer with Reprieve, on how “we kill people based on metadata”; here’s a great report on not-great killer robots); DRM; the whole world, really, of data, its creation and its control.
There’s a lot that could be said, and may be said in the coming weeks, as EG and I absorb everything we listened to and everything that inspired us. Right now, what seems most prudent is to be conscious of our digital consumption and our identity, to, if nothing else, take stock of two basic things.
1) All the data we create, who’s collecting it, and the wheres and what-fors.
2) How much of the stuff we buy is, for lack of a better term, ‘closed’ data? How much of the art we buy comes equipped with DRM? How much do we spend on Netflix and other such big-data companies that happily share content with us, but work very hard to control the ways in which we are allowed to consume that content?
To address (1), we decided to make a list of all the ways in which our data may be collected (either in the background or by active choice), e.g., by our internet service provider, the apps we use, and the websites we frequent, and then figure out what data’s being collected, how it’s being used, and to what degree we would prefer that our data not be collected or used.
To address (2), Doctorow suggested taking stock of everything you give to the big companies that work so hard to track you, or lock down things you buy with DRM, and give some percentage of that, each year, to organizations like the EFF or ORG, who are working to make sure that, over time, everything we are isn’t tracked, owned, and sold. Or. Heck. Just give money to the artists directly. In the end, that’s the business model we want. The one that puts a large amount of value in the baskets of the people who create the things we value.
Doctorow said something very smart at the end of his talk.
There are a lot of issues more important than a free and fair internet. Refugee rights. Police shootings. Black sites. Torture.
All of those fights, though, he said, will happen on the internet.
Happy Tuesday, readers. Keep your eyes peeled for those data brokers. They certainly have their eye on you.
- Doctorow’s three laws being:
(1) Any time someone puts a lock on something that belongs to you and won’t give you the key, the lock isn’t there for your benefit.
(2) Fame won’t make you rich, but you’ll have a hard time making money if no one’s heard of you.
(3) Information doesn’t want to be free. People do. ↩