Big Data for All

3 Responses

  1. Ben Isaacson says:

    As a privacy practitioner at a big data company, I have to disagree with a ‘one size fits all’ approach to ‘practical access’. In the vast majority of cases, access is neither practical nor relevant to most users. There have been tools and apps to access and provide insight into online data for years (e.g., BlueKai), and the recent DAA AdChoices initiative (which Jules co-founded) has made it clear that very few people really want to know what goes on behind the scenes with their online interest data — they simply need some comfort in having the option to turn off something irrelevant.

    I’m all for access in the right context (like credit reporting), but think your focus should not be on ‘practical’ access for all but rather on ‘contextual’ access for some. Finally, as we’ve seen in the online ad space, the real issue (at least in the short term) should not be access but rather meaningful and practical choices. If the context and relevance of the algorithm is skewed, the user needs a simple mechanism to turn it off, which in turn informs the analytics engine and prevents others from receiving the same mistake. User choices will drive changes to the algorithm, not simply ‘practical access’.

  2. A.J. Sutter says:

    @ “Data creates enormous value for the world economy, driving innovation, productivity, efficiency and growth.”: How can you be sure that this “value” is not erased by the impact on privacy, the plague of online advertising, “filter bubble” effects, etc.? Or is “value” a euphemism for money?

    @ “Regardless of lingering questions concerning who – if anyone – ‘owns’ the information, we think that fairness dictates that individuals enjoy beneficial use of the data about them.” Unlike beneficial ownership, which carries exclusionary rights, “beneficial use” does not, at least judging from Black’s. Why is this, and not beneficial ownership, adequate?

    @ “We trust if the existence and uses of databases were visible to the public, organizations would be more likely to avoid unethical or socially unacceptable uses of data.” This is a page out of Milton Friedman’s corporate-funded Capitalism and Freedom: the idea that disclosure will set in motion an invisible hand to restrain bad actors. Why not use more direct methods and prohibitions?

  3. Frank says:

    I applaud the proposal to require the disclosure of “the decisional criteria underpinning their data analytics machinery.” Your point that “the observer in big data analysis can affect the results of her research by defining the data set, proposing a hypothesis or writing an algorithm” is also very important. On a related note, this interview demonstrates that even academics have unfortunate incentives:

    http://www.econtalk.org/archives/2012/09/nosek_on_truth.html