Google, the “Good Censor,” is Going to Think for You

    October 24, 2018 // 7 Comments »




    Google might soon add its Terms of Service to the First Amendment.

    A leaked document written by Google argues that, because of a variety of factors, including the election of Donald Trump, what it calls the “American tradition” of free speech may no longer be viable. The document lays out how the company can act as the world’s “Good Censor,” protecting us from harmful content and, by extension, harmful acts like electing the wrong president again.

    The document, which Google has officially characterized as research, is infuriatingly vague about whether the company has made any decisions or taken any action. So think of all this as a guidepost, like the Ghost of Christmas Future showing us the worst case scenario.

    The company is talking about changing the rules so the freedom to speak will no longer exist independent of the content of speech. What you can say could depend on Google’s opinion of whether or not it will negatively affect others. To Google, the personal liberty of freedom of speech might need to be balanced against collective well-being. The company acknowledges for the first time it has the responsibility and power to unilaterally adjudicate this battle between “free-for-all and civil-for-most” versions of society.

    We probably should be paying more attention to how they plan to do this, but because the document leaked on Breitbart, and because the initial rounds of censorship have fallen mostly on right-of-center voices, it has received little critical attention. But the significance of Google’s plans extends beyond the left-right fight; which content is censored is easily changed. If this plan is implemented, everything you will ever read online will be judged before it reaches you. Or doesn’t reach you.

    The old ideas seem as archaic to Google (Facebook, Twitter, and their successors) as the powdered wigs the Founders wore when they wrote them. People should be free to say nearly anything they want. In the marketplace of ideas, good will overpower bad. If we block one person’s speech, we can soon block others, right up until it comes to us. The collective right to free speech is more important than an individual’s reaction to that speech. There is an uncomfortable duty to protect speech irrespective of its content.

    Jefferson had a good run. Then the election of Donald Trump scared the free speech ideal out of Google. Could they have been… responsible… for helping elect a threat to democracy, the last president, someone who would shape-shift into a dictator? Should they have tried to stop him? Wouldn’t you have killed baby Hitler if you could have?

    Under such circumstances, free speech is reimagined by many as a liability which bad actors will exploit judo-style, the tools of democracy used to destroy democracy. The Google document warns “online manipulation and disinformation influenced elections in more than 18 countries, including the U.S. [as] free speech becomes a social, economic and political weapon.”

    The irony is the Internet was supposed to be, and maybe briefly was, the highest expression of what is now the legacy definition of unfettered speech. Anyone could start a website to stand alongside the .govs. One voice was as loud as anyone else’s, and search engines were the democratizing connective tissue. Google was created to organize the world’s knowledge, not help control it. Free speech flourished online. Government censors had real restrictions; we know them as borders.

    Not so for global entities like Google. What doesn’t pass through their search engines or social media travels through their servers and cloud storage. There is no more pretending that anyone but a small minority of users can switch to another tool, or ignore the web entirely, and still functionally live in the real world. Google sees itself at the nexus of this historic change, saying “Although people have long been racist, sexist, and hateful in many other ways, they weren’t empowered by the Internet to recklessly express their views with abandon.” We apparently can’t handle that, and Google is, for the first time in human history, in a position to do something about it. After all, they acknowledge they “now control the majority of our online conversations,” so the Internet is mostly whatever they say it is.

    At that point, Google worries, the “we’re not responsible for what happens on our platforms” defense crumbles. How much the last election was influenced doesn’t matter as much as the realization the tools are in place to do it more effectively next time. Existing laws can limit foreigners buying political ads stuffed with controversial news, but if Americans want to do the same thing laws not only don’t limit them, the legacy version of the 1A demands they be allowed to blast out hate speech and gendered bigotry. Something has to be done. Google’s document says they as the apex predator can now create online “well-ordered spaces for safety and civility.”

    There is no one to stop them. It is very clear what private companies can do vis-à-vis speech; the argument is over what they should do. Thanks to Section 230 of the Communications Decency Act, Google is shielded from traditional publishers’ liability and responsibility. The 1A does not apply. No one at Google stands for election. Users matter only in the aggregate of millions of clicks. Google as the Good Censor would be accountable to pretty much no one (though the Supreme Court last week agreed to hear a long-shot case that could determine whether users can challenge social media companies on free speech grounds).

    As proof-of-concept – what they are capable of doing – the Google document cites Charlottesville. Following racial violence, Google, GoDaddy, and Cloudflare quietly ganged up to end their relationships with The Daily Stormer, “effectively booting it off the Internet.” Google noted “While some free speech advocates were troubled by the idea that ‘a voice’ could be silenced at its source, others were encouraged by the united front the tech firms put up.” Same with Alex Jones, as corporations serially kicked him off their sites. Facebook and Twitter also actively censor, with Facebook removing over 800 political pages for “coordinated inauthentic behavior,” an Orwellian term Facebook claims means they were not forums for “legitimate political debate.”

    Google and the others aren’t acting in a vacuum. Some 69% of American college students believe intentionally offensive language should be banned. The ACLU now applies a litmus test to cases it defends, weighing their impact on other rights (for example, the right to say the N-word versus the rights of POC not to hear it), declaring free speech can be secondary to other political goals. As Google suggests, censorship has a place, per the ACLU, if it serves a greater good.

    The document makes clear Google understands current censorship efforts have fallen short. Decisions have been imprecise, biased, and influenced by shares and likes. Yet while acknowledging they never will please everyone, Google is emphatic it can’t escape “its responsibility for how society functions and progresses.” So the document is rich in words like transparency and fairness as it wrestles with the complexity of the task, with Google envisioning itself as more an imperfect but benign curator than Big Brother. But like a bad horror movie, you can see the ending from miles away.

    Eliminating voices to “not influence” an election is influencing an election. Once one starts deleting hate speech, there is no bottom to the list of things offensive to someone. Once you set your goal as manipulating thought via controlling information, the temptation to use that tool will prove great. Why not manipulate stock prices to fund “good” nonprofits and harm bad ones? Who should be elected in Guatemala? What’s the Google solution for that land dispute in St. Louis? It is so easy. Just placing links for one candidate above another in a test search increased the number of undecided voters who chose that candidate by 12%.

    The cornerstone of free speech – the absolute right to speak remaining independent of the content of the speech – is now in the hands of corporate monopolies, waiting for them to decide whether or not to use the power. Where the Supreme Court refused to prohibit hate speech, Google can do so. Where the 1A kept the government from choosing what is and isn’t called true, Google may decide. Journalists can take a first pass at writing news, but Google is the one positioned to determine if anyone sees it. Like some TV murder mystery, Google is perched on the edge of a terrible decision, having tested opportunity, means, and method. All that’s left is the decision to pull the trigger.










    Posted in Democracy, Post-Constitution America, Trump

    The Police State and License Plate Scanners

    May 18, 2016 // 17 Comments »


    One of the latest tools for violating our privacy and building the American police state is the license plate scanner.

    Watching You

    This technology allows the police to cruise through a city at normal speed and photographically gather images of vehicle license plates, along with geolocation data. All of it is stored, and can easily be used to create a record of everywhere your car has been. Couple that with cellphone and WiFi data, each carrying its own geodata, and tie it to things like tracked credit card activity, email, and the now-ubiquitous public surveillance cameras, and it is very, very easy for law enforcement to know where you are, where you have been, and to have a pretty good idea of what you were doing.

    Run that same process for lots and lots of people, and you can also tell who spent time with whom.
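    To see how little analysis that inference requires, here is a minimal sketch in Python. Everything in it is hypothetical (invented plate numbers, timestamps, and grid cells, not any agency’s or vendor’s actual schema), but it shows how co-location falls straight out of the scan logs:

        from collections import Counter
        from itertools import combinations

        # Hypothetical scan records: (plate, timestamp rounded to the hour, map grid cell).
        # Real scanners log precise GPS coordinates; coarse buckets are enough to make the point.
        scans = [
            ("ABC123", "2016-05-01T08", "grid_417"),
            ("XYZ789", "2016-05-01T08", "grid_417"),
            ("ABC123", "2016-05-03T19", "grid_902"),
            ("XYZ789", "2016-05-03T19", "grid_902"),
            ("LMN456", "2016-05-03T19", "grid_902"),
        ]

        # Group plates seen in the same place at roughly the same time.
        buckets = {}
        for plate, hour, cell in scans:
            buckets.setdefault((hour, cell), set()).add(plate)

        # Count how often each pair of plates shows up together.
        together = Counter()
        for plates in buckets.values():
            for pair in combinations(sorted(plates), 2):
                together[pair] += 1

        for (a, b), count in together.most_common():
            if count > 1:
                print(f"{a} and {b} were scanned together {count} times")

    On the toy data above it reports that ABC123 and XYZ789 were scanned together twice; with billions of real records the same join answers that question at population scale.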


    Vigilant Solutions

    Expand that process nationwide and you truly have a police state.

    How to do that? Contact a private company called Vigilant Solutions. It collects license plate scanning information from multiple police departments, as well as from its own network of private plate scanners and facial recognition/facial cataloging technology, and then sells it in database form to law enforcement.

    The Vigilant database is massive, with over 2.2 billion location data points, and it is growing by almost a million data points per day. The database means, for example, that the New York police can now monitor you and your car whether you live in New York, Miami, Chicago, Los Angeles, or elsewhere.

    The database also boasts a full suite of data analytics tools which allow police officers to track cars historically or in real time, conduct a virtual stakeout, figure out which cars are commonly seen in close proximity to each other, and predict likely locations to find a car.
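    The “predict likely locations” feature sounds exotic, but a crude version is just a frequency count over past sightings. The sketch below is a hypothetical illustration of that idea only, with made-up street names, not a description of Vigilant’s actual analytics:

        from collections import Counter

        # Hypothetical sighting history for a single plate: (weekday, hour, location).
        sightings = [
            ("Mon", 8, "5th & Main"), ("Tue", 8, "5th & Main"), ("Wed", 8, "5th & Main"),
            ("Mon", 18, "Oak St garage"), ("Tue", 18, "Oak St garage"),
            ("Sat", 14, "Riverside Mall"),
        ]

        def likely_locations(weekday, hour, history):
            """Rank past locations by how often the plate was seen near this day and time."""
            scores = Counter()
            for day, hr, place in history:
                if day == weekday and abs(hr - hour) <= 1:
                    scores[place] += 1
            return scores.most_common()

        # Where should officers look for this car on a Tuesday morning?
        print(likely_locations("Tue", 8, sightings))   # [('5th & Main', 1)]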

    Data, once collected, can exist forever. Whatever it is being used for now, it will also be available for other uses in the future, enhanced by new exploitive technology.

    As Vigilant puts it on its website, “Data is cumbersome; intelligence is actionable.”


    Let’s Google It

    All that is quite dangerous enough. However, the latest wrinkle is that the police in at least one city are going so far as to disguise their license plate scanning vehicle as an innocent Google Maps truck. You don’t even know your location information is being gathered this way.

    Matt Blaze, a University of Pennsylvania computer and information science professor, noticed an SUV tucked away in the shadows of the Philadelphia Convention Center, bearing a logo for Google Maps. Drawing on his professional expertise, Blaze also identified two high-powered license plate reader cameras mounted on top of the vehicle. To the average passerby, it might appear to be a Google Street View vehicle.

    After initially denying it, the Philly cops eventually admitted the vehicle was theirs, but refused further comment.

    “We can confirm that this is not a Google Maps car, and that we are currently looking into the matter,” a Google spokesperson said. She would not elaborate as to whether the company was concerned that law enforcement was using a vehicle with warrantless surveillance technology while pretending to be a Google vehicle.


    It is impossible to escape this network of warrantless search and still live in society. Our cars, our phones, our credit cards, and our very faces have been corrupted by a police state into tools of surveillance.










    Posted in Democracy, Post-Constitution America, Trump

    Welcome to the Memory Hole: Disappearing Snowden

    April 28, 2014 // 15 Comments »

    What if Edward Snowden was made to disappear? No, I’m not suggesting some future CIA rendition effort or a who-killed-Snowden conspiracy theory of a disappearance, but a more ominous kind.

    What if everything a whistleblower had ever exposed could simply be made to go away? What if every National Security Agency (NSA) document Snowden released, every interview he gave, every documented trace of a national security state careening out of control could be made to disappear in real-time? What if the very posting of such revelations could be turned into a fruitless, record-less endeavor?

    Am I suggesting the plot for a novel by some twenty-first century George Orwell? Hardly. As we edge toward a fully digital world, such things may soon be possible, not in science fiction but in our world — and at the push of a button. In fact, the earliest prototypes of a new kind of “disappearance” are already being tested. We are closer to a shocking, dystopian reality that might once have been the stuff of futuristic novels than we imagine. Welcome to the memory hole.

    Even if some future government stepped over one of the last remaining red lines in our world and simply assassinated whistleblowers as they surfaced, others would always emerge. Back in 1948, in his eerie novel 1984, however, Orwell suggested a far more diabolical solution to the problem. He conjured up a technological device for the world of Big Brother that he called “the memory hole.” In his dark future, armies of bureaucrats, working in what he sardonically dubbed the Ministry of Truth, spent their lives erasing or altering documents, newspapers, books, and the like in order to create an acceptable version of history. When a person fell out of favor, the Ministry of Truth sent him and all the documentation relating to him down the memory hole. Every story or report in which his life was in any way noted or recorded would be edited to eradicate all traces of him.

    In Orwell’s pre-digital world, the memory hole was a vacuum tube into which old documents were physically disappeared forever. Alterations to existing documents and the deep-sixing of others ensured that even the sudden switching of global enemies and alliances would never prove a problem for the guardians of Big Brother. In the world he imagined, thanks to those armies of bureaucrats, the present was what had always been — and there were those altered documents to prove it and nothing but faltering memories to say otherwise. Anyone who expressed doubts about the truth of the present would, under the rubric of “thoughtcrime,” be marginalized or eliminated.

    Government and Corporate Digital Censorship

    Increasingly, most of us now get our news, books, music, TV, movies, and communications of every sort electronically. These days, Google earns more advertising revenue than all U.S. print media combined. Even the venerable Newsweek no longer publishes a paper edition. And in that digital world, a certain kind of “simplification” is being explored. The Chinese, Iranians, and others are, for instance, already implementing web-filtering strategies to block access to sites and online material of which their governments don’t approve. The U.S. government similarly (if somewhat fruitlessly) blocks its employees from viewing Wikileaks and Edward Snowden material (as well as websites like TomDispatch) on their work computers — though not of course at home. Yet.

    Great Britain, however, will soon take a significant step toward deciding what a private citizen can see on the web even while at home. Before the end of the year, almost all Internet users there will be “opted-in” to a system designed to filter out pornography. By default, the controls will also block access to “violent material,” “extremist and terrorist related content,” “anorexia and eating disorder websites,” and “suicide related websites.” In addition, the new settings will censor sites mentioning alcohol or smoking. The filter will also block “esoteric material,” though a UK-based rights group says the government has yet to make clear what that category will include.

    And government-sponsored forms of Internet censorship are being privatized. New, off-the-shelf commercial products guarantee that an organization does not need to be the NSA to block content. For example, the Internet security company Blue Coat is a domestic leader in the field and a major exporter of such technology. It can easily set up a system to monitor and filter all Internet usage, blocking web sites by their address, by keywords, or even by the content they contain. Among others, Blue Coat software is used by the U.S. Army to control what its soldiers see while deployed abroad, and by the repressive governments in Syria, Saudi Arabia, and Burma to block outside political ideas.
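    To make concrete how little machinery such filtering requires, here is a minimal, hypothetical sketch of the proxy-style check such products perform (invented blocklists and function names, not Blue Coat’s actual product), blocking requests by address, by URL keyword, or by page content:

        # Hypothetical blocklists; a real product ships far larger, centrally curated lists.
        BLOCKED_HOSTS = {"example-banned-site.org"}
        BLOCKED_URL_KEYWORDS = {"proxy", "vpn"}
        BLOCKED_CONTENT_KEYWORDS = {"forbidden topic"}

        def allow_request(host: str, path: str, page_text: str) -> bool:
            """Return False if the request should be silently filtered out."""
            if host in BLOCKED_HOSTS:                                       # block by address
                return False
            url = f"{host}{path}".lower()
            if any(word in url for word in BLOCKED_URL_KEYWORDS):           # block by URL keyword
                return False
            text = page_text.lower()
            if any(word in text for word in BLOCKED_CONTENT_KEYWORDS):      # block by page content
                return False
            return True

        print(allow_request("news.example.com", "/story", "an ordinary article"))   # True
        print(allow_request("example-banned-site.org", "/", ""))                    # False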

    Google Search…

    In a sense, Google Search already “disappears” material. Right now Google is the good guy vis-à-vis whistleblowers. A quick Google search (0.22 seconds) turns up more than 48 million hits on Edward Snowden, most of them referencing his leaked NSA documents. Some of the websites display the documents themselves, still labeled “Top Secret.” Less than half a year ago, you had to be one of a very limited group in the government or contractually connected to it to see such things. Now, they are splayed across the web.

    Google — and since Google is the planet’s number one search engine, I’ll use it here as a shorthand for every search engine, even those yet to be invented — is in this way amazing and looks like a massive machine for spreading, not suppressing, news. Put just about anything on the web and Google is likely to find it quickly and add it into search results worldwide, sometimes within seconds. Since most people rarely scroll past the first few search results displayed, however, being disappeared already has a new meaning online. It’s no longer enough just to get Google to notice you. Getting it to place what you post high enough on its search results page to be noticed is what matters now. If your work is number 47,999,999 on the Snowden results, you’re as good as dead, as good as disappeared. Think of that as a starting point for the more significant forms of disappearance that undoubtedly lie in our future.
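    Burying something in the rankings does not even require deleting it. As a purely hypothetical sketch (an invented penalty rule and made-up URLs, not any search engine’s real scoring), a single adjustment before sorting is enough to move a result from the first page to oblivion:

        # Hypothetical search results: (url, relevance score).
        results = [
            ("https://whistleblower-docs.example/snowden", 0.97),
            ("https://news.example/snowden-analysis", 0.90),
            ("https://official-statement.example/response", 0.60),
        ]

        DEMOTED_DOMAINS = {"whistleblower-docs.example"}   # domains to be quietly pushed down

        def rerank(results, penalty=1.0):
            """Subtract a penalty from disfavored domains, then sort as usual."""
            adjusted = []
            for url, score in results:
                domain = url.split("/")[2]
                if domain in DEMOTED_DOMAINS:
                    score -= penalty
                adjusted.append((url, score))
            return [url for url, _ in sorted(adjusted, key=lambda r: r[1], reverse=True)]

        print(rerank(results))
        # The demoted document still exists; it has simply been moved to where
        # almost no one will ever scroll, which amounts to the same thing.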

    Hiding something from users by reprogramming search engines is one dark step to come. Another is actually deleting content, a process as simple as transforming the computer coding behind the search process into something predatory. And if Google refuses to implement the change-over to “negative searches,” the NSA, which already appears to be able to reach inside Google, can implant its own version of malicious code as it has already done in at least 50,000 instances.

    But never mind the future: here’s how a negative search strategy is already working, even if today its focus — largely on pedophiles — is easy enough to accept. Google recently introduced software that makes it harder for users to locate child abuse material. As company head Eric Schmidt put it, Google Search has been “fine-tuned” to clean up results for more than 100,000 terms used by pedophiles to look for child pornography. Now, for instance, when users type in queries that may be related to child sexual abuse, they will find no results that link to illegal content. Instead, Google will redirect them to help and counseling sites. “We will soon roll out these changes in more than 150 languages, so the impact will be truly global,” Schmidt wrote.

    While Google is redirecting searches for kiddie porn to counseling sites, the NSA has developed a similar ability. The agency already controls a set of servers codenamed Quantum that sit on the Internet’s backbone. Their job is to redirect “targets” away from their intended destinations to websites of the NSA’s choice. The idea is: you type in the website you want and end up somewhere less disturbing to the agency. While at present this technology may be aimed at sending would-be online jihadis to more moderate Islamic material, in the future it could, for instance, be repurposed to redirect people seeking news to an Al-Jazeera lookalike site with altered content that fits the government’s version of events.

    …and Destroy

    However, blocking and redirecting technologies, which are bound to grow more sophisticated, will undoubtedly be the least of it in the future. Google is already taking things to the next level in the service of a cause that just about anyone would applaud. They are implementing picture-detection technology to identify child abuse photographs whenever they appear on their systems, as well as testing technology that would remove illegal videos. Google’s actions against child porn may be well intentioned indeed, but the technology being developed in the service of such anti-child-porn actions should chill us all. Imagine if, back in 1971, the Pentagon Papers, the first glimpse most Americans had of the lies behind the Vietnam War, had been deletable. Who believes that the Nixon White House wouldn’t have disappeared those documents and that history wouldn’t have taken a different, far grimmer course?

    Or consider an example that’s already with us. In 2009, many Kindle owners discovered that Amazon had reached into their devices overnight and remotely deleted copies of Orwell’s Animal Farm and 1984 (no irony intended). The company explained that the books, mistakenly “published” on its machines, were actually bootlegged copies of the novels. Similarly, in 2012, Amazon erased the contents of a customer’s Kindle without warning, claiming her account was “directly related to another which has been previously closed for abuse of our policies.” Using the same technology, Amazon now has the ability to replace books on your device with “updated” versions, the content altered. Whether you are notified or not is up to Amazon.

    In addition to your Kindle, remote control over your other devices is already a reality. Much of the software on your computer communicates in the background with its home servers, and so is open to “updates” that can alter content. The NSA uses malware — malicious software remotely implanted into a computer — to change the way the machine works. The Stuxnet code that likely damaged 1,000 centrifuges the Iranians were using to enrich uranium is one example of how this sort of thing can operate.

    These days, every iPhone checks back with headquarters to announce what apps you’ve purchased; in the tiny print of a disclaimer routinely clicked through, Apple reserves the right to disappear any app for any reason. In 2004, TiVo sued Dish Network for giving customers set-top boxes that TiVo said infringed on its software patents. Though the case was settled in return for a large payout, as an initial remedy, the judge ordered Dish to electronically disable the 192,000 devices it had already installed in people’s homes. In the future, there will be ever more ways to invade and control computers, alter or disappear what you’re reading, and shunt you to sites you weren’t looking for.

    Snowden’s revelations of what the NSA does to gather information and control technology, which have riveted the planet since June, are only part of the equation. How the government will enhance its surveillance and control powers in the future is a story still to be told. Imagine coupling tools to hide, alter, or delete content with smear campaigns to discredit or dissuade whistleblowers, and the power potentially available to both governments and corporations becomes clearer.

    The ability to move beyond altering content into altering how people act is obviously on governmental and corporate agendas as well. The NSA has already gathered blackmail data from the digital porn viewing habits of “radical” Muslims. The NSA sought to wiretap a Congressman without a warrant. The ability to collect information on Federal judges, government leaders, and presidential candidates makes J. Edgar Hoover’s 1950s blackmail schemes as quaint as the bobby socks and poodle skirts of that era. The wonders of the Internet regularly stun us. The dystopian, Orwellian possibilities of the Internet have, until recently, not caught our attention in the same way. They should.

    Read This Now, Before It’s Deleted

    The future for whistleblowers is grim. At a time not so far distant, when just about everything is digital, when much of the world’s Internet traffic flows directly through the United States or allied countries, or through the infrastructure of American companies abroad, when search engines can find just about anything online in fractions of a second, when the Patriot Act and secret rulings by the Foreign Intelligence Surveillance Court make Google and similar tech giants tools of the national security state (assuming organizations like the NSA don’t simply take over the search business directly), and when the sophisticated technology can either block, alter, or delete digital material at the push of a button, the memory hole is no longer fiction.

    Leaked revelations will be as pointless as dusty old books in some attic if no one knows about them. Go ahead and publish whatever you want. The First Amendment allows you to do that. But what’s the point if no one will be able to read it? You might more profitably stand on a street corner and shout at passersby. In at least one easy-enough-to-imagine future, a set of Snowden-like revelations will be blocked or deleted as fast as anyone can (re)post them.

    The ever-developing technology of search, turned 180 degrees, will be able to disappear things in a major way. The Internet is a vast place, but not infinite.  It is increasingly being centralized in the hands of a few companies under the control of a few governments, with the U.S. sitting on the major transit routes across the Internet’s backbone.

    About now you should feel a chill. We’re watching, in real time, as 1984 turns from a futuristic fantasy long past into an instructional manual. There will be no need to kill a future Edward Snowden. He will already be dead.










    Posted in Democracy, Post-Constitution America, Trump

    We Meant Well App on Your Smartphone or Tablet

    August 23, 2012 // 5 Comments »

    A companion app for the book is now available on Google Play. Simply search for “we meant well” under Apps using your Android phone or tablet, or scan the QR code below with your smartphone’s QR code reader app.

    The app is a great way to enhance your experience with the book. Visit the author’s blog, Twitter and web links, and learn more about the book. Through the web site, see photos not included in the book of the people and places featured in We Meant Well. The app is free.

    If you don’t have a phone, check out the app here.

    The app is also available in the Amazon Apps marketplace. No plans for an Apple version, I’m afraid. Apple has high developer hurdles and fees for a small operation like this. If anyone out there is an Apple person and would like to help, contact me at info(at)wemeantwell.com

    In case you are curious, the app contains ads that do not benefit me; they are part of the third-party “free” building tool. That tool unfortunately also enables by default a number of permissions the app never actually uses, so don’t worry about them. Google added the rating of “low maturity,” which I assume is not a reflection of my writing.




    [QR code image]







    Copyright © 2018. All rights reserved. The views expressed here are solely those of the author(s) in their private capacity. Follow me on Twitter!

    Facebooktwittergoogle_plusredditpinterestlinkedin

    Posted in Democracy, Post-Constitution America, Trump
