Quote Originally Posted by Illusions View Post
It actually does take into account how popular a site is, but you are right that is not the only metric. Your profile on Facebook, and thus the ability to find your drunk photos there, is going to show up higher in the rankings than a snippet of text on Joe's Blog o' Drunk unless it's a popular, well-linked blog.

Allow non-users to opt out of Facebook's functions, and don't automatically opt-in current users to new features. For instance, someone who is not a member of Facebook would be able to opt out of being tagged in photos, and someone who has signed up when Facebook's functionality was X, is not automatically included in Y until they agree to it.
Google doesn't have direct knowledge of how popular a site is. It also doesn't care. It uses proxies, chiefly the number and quality of inbound links, to determine the reputation and reliability of the information on a site. The system is specifically designed not to prioritize the popular unless that "popularity" is correlated with a number of other things.

Thus, even a random blog that no one reads could be a top result for a specific search. This is the key advance that Google brought to the table ten years ago over other search engines.
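The mechanism being described is essentially PageRank: a page's score depends on the scores of the pages linking to it, not on raw traffic. A minimal power-iteration sketch (the tiny link graph and damping factor here are purely illustrative, not Google's actual data or parameters):

```python
# Minimal PageRank sketch: reputation flows along links, so a
# rarely visited page can still rank highly if well-regarded
# pages link to it.

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # every page gets a small baseline share
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# A "random blog" that two better-known sites link to
web = {
    "portal": ["blog", "news"],
    "news": ["blog"],
    "blog": ["news"],
}
ranks = pagerank(web)
```

Because nothing links to "portal", its rank converges to the baseline, while "blog" ends up near the top purely on the strength of its inbound links.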

Your idea of opting-out of Facebook's functions for non-members further clarifies why this is an impossible demand. You are opted-out of Facebook's functions if you're not a member. But that doesn't stop someone from posting a giant Facebook album with your name and details about you that everyone can see.

This is exactly what happened with Khaled Saeed in Egypt: someone posted a Facebook album of his tortured corpse, and it helped start the Egyptian revolution. And it's exactly what happens when someone posts a blog about anything.

Your "solution" is very plainly a warm-fuzzy attempt to cater to "privacy advocates" that actually is censorship. Expression is expression, whether it's on Facebook, a private blog or a newspaper.

Quote Originally Posted by CitizenCain View Post
Disingenuous little shit.

Hi everyone. I'm Jonathan, I'm the Product Manager for Google Photos. The decision to retire these name tag features didn't come lightly, but we felt they were necessary to help our team prioritize working on features that more people will find useful. I appreciate all your feedback about our recent changes and please continue to share your opinions and suggestions here in the forum. Thank you.



That is very far from a vindication of your anti-technology, pro-useless-"privacy"-measures stance. Whether the answer is a political evasion, a PR-friendly lie or even the honest truth is immaterial to the fact that your source, the source you chose and provided, gives no hint that this is in any way related to privacy concerns or the anti-technology stance of the German government.
I find it pretty rich that OG, who openly pirates games, movies and music because the laws for owning/distributing that content don't make sense to him, thinks that laws to restrict the dissemination of photos taken in public do make sense.

But more generally, I was just doing some more reading on this in the media this morning and was pointed to a post by Google's privacy lawyer. He claims to be representing his own opinions here, but I think he makes a very valuable point: people are often making "foggy" arguments about privacy that usually lead to censorship.

http://peterfleischer.blogspot.com/2...-oblivion.html

Wednesday, March 9, 2011
Foggy thinking about the Right to Oblivion


I was lucky enough to spend a few days in Switzerland working on Street View. And I treated myself to a weekend of skiing too. The weather wasn't great, we had a lot of mountain fog, but then, the entire privacy world seems to be sort of foggy these days.

In privacy circles, everybody's talking about the Right to be Forgotten. The European Commission has even proposed that the "right to be forgotten" should be written into the upcoming revision of the Privacy Directive. Originally a rather curious French "universal right" that doesn't even have a proper English translation (right to be forgotten? right to oblivion? right to delete?), le Droit a l'Oubli is going mainstream. But, what on earth is it? For most people, I think it's an attempt to give people the right to wash away digital muck, or delete the embarrassing stuff, or just start fresh. But unfortunately, it's more complicated than that.

More and more, privacy is being used to justify censorship. In a sense, privacy depends on keeping some things private, in other words, hidden, restricted, or deleted. And in a world where ever more content is coming online, and where ever more content is find-able and share-able, it's also natural that the privacy counter-movement is gathering strength. Privacy is the new black in censorship fashions. It used to be that people would invoke libel or defamation to justify censorship about things that hurt their reputations. But invoking libel or defamation requires that the speech not be true. Privacy is far more elastic, because privacy claims can be made on speech that is true.

Privacy as a justification for censorship now crops up in several different, but related, debates: le droit a l'oubli, the idea that content (especially user-generated content on social networking services) should auto-expire, the idea that data collection by companies should not be retained for longer than necessary, the idea that computers should be programmed to "forget" just like the human brain. All these are movements to censor content in the name of privacy. If there weren't serious issues on both sides of the debate, we wouldn't even be talking about this.

Most conversations about the right to oblivion mix all this stuff up. I can't imagine how to have a meaningful conversation (much less write a law) about the Right to Oblivion without some framework to disentangle completely unrelated concepts, with completely unrelated implications. Here's my simple attempt to remember the different concepts some people want to forget.

1) If I post something online, should I have the right to delete it again? I think most of us agree with this, as the simplest, least controversial case. If I post a photo to my album, I should then later be able to delete it, if I have second thoughts about it. Virtually all online services already offer this, so it's unproblematic, and this is the crux of what the French government sponsored in its recent Charter on the Droit a l'Oubli. But there's a big disconnect between a user's deleting content from his/her own site, and whether the user can in fact delete it from the Internet (which is what users usually want to do), more below.

2) If I post something, and someone else copies it and re-posts it on their own site, do I have the right to delete it? This is the classic real-world case. For example, let's say I regret having posted that picture of myself covered in mud, and after posting it on my own site, and then later deleting it, I discover someone else has copied it and re-posted it on their own site. Clearly, I should be able to ask the person who re-posted my picture to take it down. But if they refuse, or just don't respond, or are not findable, what can I do? I can pursue judicial procedures, but those are expensive and time-consuming. I can go directly to the platform hosting the content, and if the content violates their terms of service or obviously violates the law, I can ask them to take it down. But practically, if I ask a platform to delete a picture of me from someone else's album, without the album owner's consent, and only based on my request, it puts the platform in the very difficult or impossible position of arbitrating between my privacy claim and the album owner's freedom of expression. It's also debatable whether, as a public policy matter, we want to have platforms arbitrate such dilemmas. Perhaps this is best resolved by allowing each platform to define its own policies on this, since they could legitimately go either way.

3) If someone else posts something about me, should I have a right to delete it? Virtually all of us would agree that this raises difficult issues of conflict between freedom of expression and privacy. Traditional law has mechanisms, like defamation and libel law, to allow a person to seek redress against someone who publishes untrue information about him. Granted, the mechanisms are time-consuming and expensive, but the legal standards are long-standing and fairly clear. But a privacy claim is not based on untruth. I cannot see how such a right could be introduced without severely infringing on freedom of speech. This is why I think privacy is the new black in censorship fashion.

4) The Internet platforms that are used to host and transmit information all collect traces, some of which are PII (personally identifiable information), or partially PII. Should such platforms be under an obligation to delete or anonymize those traces after a certain period of time? And if so, after how long? And for what reasons can such traces be retained and processed? This is a much-debated topic, e.g., the cookies debate, the logs debate, the data retention debate, all of which are also part of the Droit a l'Oubli debate, but they are completely different from the categories above, since they focus on the platform's traffic data, rather than the user's content. I think existing law deals with this well, if ambiguously, by permitting such retention "as long as necessary" for "legitimate purposes". Hyper-specific regulation just doesn't work, since the cases are simply too varied.

5) Should the Internet just learn to "forget"? Quite apart from the topics above, should content on the Internet just auto-expire? E.g., should all user posts to social networking services be programmed to auto-expire? Or alternatively, should users have the right to use auto-expire settings? Philosophically, I'm in favor of giving users power over their own data, but not over someone else's data. I'd love to see a credible technical framework for auto-delete tools, but I've heard of a lot of technical problems with realizing them. Engineers describe most auto-delete functionalities as 80% solutions, meaning that they never work completely. Just for the sake of debate, on one extreme, government-mandated auto-expire laws would be as sensible as burning down a library every 5 years. Even if auto-expire tools existed, they would do nothing to prevent the usual privacy problems when someone copies content from one site (with the auto-expire tool) and moves it to another (without the auto-expire function). So, in the real world, I suspect that an auto-expire functionality (regardless of whether it was optional or mandatory) would provide little real-world practical privacy protection for users, but it would result in the loss of vast amounts of data and all the benefits that data can hold.
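An opt-in auto-expire setting of the kind described is technically trivial on a single platform; the hard part, as the paragraph notes, is that copies escape it. A hypothetical sketch of a per-post expiry check (the field names and data shape are invented for illustration):

```python
import time

# Hypothetical per-post auto-expire check: each post may carry an
# optional expires_at timestamp (seconds since epoch) chosen by
# its author.

def visible_posts(posts, now=None):
    """Return posts whose optional 'expires_at' has not passed.

    Note: this only governs the hosting platform. Any copy made
    to another site carries no expiry at all, which is why
    engineers call such tools 80% solutions.
    """
    now = time.time() if now is None else now
    return [p for p in posts
            if p.get("expires_at") is None or p["expires_at"] > now]

posts = [
    {"id": 1, "text": "keep me"},                       # no expiry set
    {"id": 2, "text": "mud photo", "expires_at": 100},  # already expired
    {"id": 3, "text": "recent", "expires_at": 9999},
]
alive = visible_posts(posts, now=500)
```

The filter itself is a one-liner; nothing in it can reach the re-posted copy on someone else's server, which is the whole point of the 80%-solution objection.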

6) Should the Internet be re-wired to be more like the human brain? This seems to be a popular theme on the privacy talk circuit. I guess this means the Internet should have gradations between memory, sort of hazy memories, and forgetting. Well, computers don't work that way. This part of the debate is sociological and psychological, but I don't see a place for it in the world of computers. Human brains also adapt to new realities, rather well, in fact, and human brains can forget or ignore content, even if the content itself continues to exist in cyberspace.

7) Who should decide what should be remembered or forgotten? For example, if German courts decide German murderers should be able to delete all references to their convictions after a certain period of time, would this German standard apply to the Web? Would it apply only to content that was new on the Web, or also to historical archives? And if it applied only to Germany, or say the .de domain, would it have any practical impact at all, since the same content would continue to exist and be findable by anyone from anywhere? Or to make it more personal, the web is littered with references to my criminal conviction in Italy, but I respect the right of journalists and others to write about it, with no illusion that I should have a "right" to delete all references to it at some point in the future. But all of my empathy for wanting to let people edit out some of the bad things of their past doesn't change my conviction that history should be remembered, not forgotten, even if it's painful. Culture is memory.

8) Sometimes people aren't trying to delete content, they're just trying to make it harder to find. This motivates various initiatives against search engines, for example, to delete links to legitimate web content, like newspaper articles. This isn't strictly speaking "droit a l'oubli", but it's a sort of end-run around it, by trying to make some content un-findable rather than deleted. This will surely generate legal challenges and counter-challenges before this debate is resolved.

Next time you hear someone talk about the Right to Oblivion, ask them what exactly they mean. Foggy thinking won't get us anywhere.