Home country: USA
Member since: Mon Sep 24, 2012, 12:07 PM
Number of posts: 347
That is not an official quote, but it is what they wish they could say:
“At first we were in an arms race with sophisticated criminals,” says Eric Grosse, Google’s head of security. “Then we found ourselves in an arms race with certain nation-state actors. And now we’re in an arms race with the best nation-state actors.” Primarily, the US government.
But perhaps the most authentic expression of betrayal came from a relatively unknown Google security engineer named Brandon Downey in a post on his personal Google+ account. He prefaced his message by stating that he was speaking only for himself—but he might as well have been channeling his colleagues across the industry:
Fuck these guys. I’ve spent the last ten years of my life trying to keep Google’s users safe and secure from the many diverse threats Google faces. I’ve seen armies of machines DOS-ing Google. I’ve seen worms DOS’ing Google to find vulnerabilities in other people’s software. I’ve seen criminal gangs figure out malware. I’ve seen spyware masquerading as toolbars so thick it breaks computers because it interferes with the other spyware. I’ve even seen oppressive governments use state-sponsored hacking to target dissidents … But after spending all that time helping in my tiny way to protect Google—one of the greatest things to arise from the internet—seeing this, well, it’s just a little like coming home from War with Sauron, destroying the One Ring, only to discover the NSA is on the front porch of the Shire chopping down the Party Tree and outsourcing all the hobbit farmers with half-orcs and whips.
For all you NSA apologists out there, all you defenders of the state, your government's actions have consequences. And one of those consequences is that it harms Americans and American companies:
Certainly the tech companies felt worse off. In November, the German newsweekly Der Spiegel—another recipient of Snowden leaks—described an NSA/GCHQ exploit that seemed tailor-made to erode trust. In an attempt to gain access to the Brussels-based telecommunications firm Belgacom, the agencies set up bogus versions of sites like Slashdot and LinkedIn. When employees tried to access the sites from corporate computers, their requests were diverted to the phony replicas, which the spies used to inject malware into their machines.
Using considerable understatement, LinkedIn’s general counsel, Erika Rottenberg, says, “We are not happy that our intellectual property is being used in that way.” It is not hard to see why. If foreign customers can’t know whether they are using a legitimate social network or a spy-created fake, they are liable to log off altogether.
For years, companies from espionage-happy countries like China have been spurned by overseas buyers who didn’t trust their products. Now it’s America’s turn.
Folks, this is not for our "own good". It is not for "flag and country". It is not likely even terribly effective. But it is damn destructive.
“The NSA is willing to compromise the security of everything to get what they want,” security expert Bruce Schneier says.
“Think about the damage this does to America,” says US Representative Rush Holt (D-New Jersey), who is the rare member of Congress with a PhD in physics—and one of a number of legislators pursuing measures that would curtail the NSA’s activities. “The NSA is saying, ‘We’ve got to make sure the encryption has flaws so we can decrypt.’ Isn’t that the pinnacle of arrogance? No one else knows how to do it or is as smart as we are. They won’t realize we’ve degraded our product. But the truth always comes out. And America is worse off because of it.”
Fuck You NSA. And Fuck You to every politician that supports it.
Oh yeah, and thank you Mr. Snowden. We need more heroes like you willing to take a principled stand.
Wired - How the US Almost Killed the Internet
Posted by hueymahl | Wed Jan 8, 2014, 10:44 AM (38 replies)
From the NYTimes: http://opinionator.blogs.nytimes.com/2013/09/15/the-banality-of-systemic-evil/
In recent months there has been a visible struggle in the media to come to grips with the leaking, whistle-blowing and hacktivism that has vexed the United States military and the private and government intelligence communities. This response has run the gamut. It has involved attempts to condemn, support, demonize, psychoanalyze and in some cases canonize figures like Aaron Swartz, Jeremy Hammond, Chelsea Manning and Edward Snowden.
In broad terms, commentators in the mainstream and corporate media have tended to assume that all of these actors needed to be brought to justice, while independent players on the Internet and elsewhere have been much more supportive. Tellingly, a recent Time magazine cover story has pointed out a marked generational difference in how people view these matters: 70 percent of those age 18 to 34 sampled in a poll said they believed that Snowden “did a good thing” in leaking the news of the National Security Agency’s surveillance program.
So has the younger generation lost its moral compass?
No. In my view, just the opposite.
In “Eichmann in Jerusalem,” one of the most poignant and important works of 20th-century philosophy, Hannah Arendt made an observation about what she called “the banality of evil.” One interpretation of this holds that it was not an observation about what a regular guy Adolf Eichmann seemed to be, but rather a statement about what happens when people play their “proper” roles within a system, following prescribed conduct with respect to that system, while remaining blind to the moral consequences of what the system was doing — or at least compartmentalizing and ignoring those consequences.
This is a great op-ed piece by Peter Ludlow, a professor of philosophy at Northwestern. He advances the proposition that normal people basically lose their moral compass when subjected to bureaucracy, especially if their livelihood depends on compliance. This has happened throughout history, and it is what is happening now with the prosecution of principled whistleblowers acting outside the rules of bureaucracy (and outside codified law, though within moral law). It also pretty much demolishes the arguments advanced here and elsewhere that Snowden, et al., were evil, megalomaniacs, etc. My favorite passage:
But wasn’t there arrogance or hubris in Snowden’s and Manning’s decisions to leak the documents? After all, weren’t there established procedures determining what was right further up the organizational chart? Weren’t these ethical decisions better left to someone with a higher pay grade? The former United States ambassador to the United Nations, John Bolton, argued that Snowden “thinks he’s smarter and has a higher morality than the rest of us … that he can see clearer than the other 299,999,999 of us, and therefore he can do what he wants. I say that is the worst form of treason.”
For the leaker and whistleblower the answer to Bolton is that there can be no expectation that the system will act morally of its own accord. Systems are optimized for their own survival and preventing the system from doing evil may well require breaking with organizational niceties, protocols or laws. It requires stepping outside of one’s assigned organizational role. The chief executive is not in a better position to recognize systemic evil than is a middle level manager or, for that matter, an IT contractor. Recognizing systemic evil does not require rank or intelligence, just honesty of vision.
For all you NSA / Administration sympathizers out there, I just have one thing to say - you have plenty of company throughout history; it is just not the kind of company with whom I would want to be associated.
Posted by hueymahl | Mon Sep 16, 2013, 02:57 PM (0 replies)
Below is an excerpt from a blog post by Steve Blank (original at http://steveblank.com/2013/07/15/your-computer-may-already-be-hacked-nsa-inside/). Steve Blank is a well-respected entrepreneur and angel investor in Silicon Valley. His history, including stints in the military and "spook stuff" (his term: http://steveblank.com/about/), makes him particularly well situated to analyze this type of information and understand its possibilities.
Bottom line - If you are not scared of the NSA's power AND THE POTENTIAL FOR ITS ABUSE, you should be.
While most outside observers think the NSA’s job is cracking encrypted messages, as the Prism disclosures have shown, the actual mission is simply to read all communications. Cracking codes is a last resort.
The NSA has a history of figuring out how to get at messages before or after they are encrypted, whether by putting keyloggers on keyboards and recording the keystrokes or by detecting the images of characters as they were being drawn on a CRT.
Today every desktop and laptop computer has another way for the NSA to get inside.
It’s inevitable that complex microprocessors ship with bugs. When the first microprocessors shipped, the only thing you could hope was that a bug didn’t crash your computer. The only way the chip vendor could fix the problem was to physically revise the chip and put out a new version, and computer manufacturers and users were stuck if they had an old chip. After a particularly embarrassing math bug in 1994 that cost Intel $475 million, the company decided to fix the problem by allowing its microprocessors to load fixes automatically when your computer starts.
Starting in 1996 with the Intel P6 (Pentium Pro) and continuing to today’s P7 chips (Core i7), these processors contain instructions that are reprogrammable in what is called microcode. Intel can fix bugs on the chips by reprogramming a microprocessor’s microcode with a patch. This patch, called a microcode update, can be loaded into a processor by using special CPU instructions reserved for this purpose. These updates are not permanent: each time you turn the computer on, its microprocessor is reset to its built-in microcode, and the update needs to be applied again (through the computer’s BIOS).
Since 2000, Intel has put out 29 microcode updates to its processors. The microcode is distributed by 1) Intel, or 2) Microsoft, integrated into a BIOS, or 3) as part of a Windows update. Unfortunately, the microcode update format is undocumented and the code is encrypted. This lets Intel make sure that third parties can’t make unauthorized add-ons to its chips. But it also means that no one can look inside to understand the microcode, which makes it impossible to know whether anyone is loading a backdoor into your computer.
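As a practical aside: on Linux you can at least see which microcode revision each core booted with, via the "microcode" field in /proc/cpuinfo. Here is a minimal sketch of pulling that out; it parses inlined sample text (the values are made up) so it runs anywhere, but on a real machine you would feed it the contents of /proc/cpuinfo instead:

```python
def microcode_revisions(cpuinfo_text: str) -> set:
    """Collect the microcode revision(s) reported in /proc/cpuinfo-style text."""
    revs = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("microcode"):
            # Lines look like "microcode\t: 0x1b"; keep the value after the colon.
            revs.add(line.split(":", 1)[1].strip())
    return revs

# Synthetic sample of the relevant /proc/cpuinfo lines (two cores shown):
SAMPLE = """\
model name\t: Intel(R) Core(TM) i7 CPU
microcode\t: 0x1b
model name\t: Intel(R) Core(TM) i7 CPU
microcode\t: 0x1b
"""

print(microcode_revisions(SAMPLE))
# On a real Linux box: print(microcode_revisions(open("/proc/cpuinfo").read()))
```

Of course, this only tells you which revision is loaded, not what is in it, which is the whole problem.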
To be clear, he is not saying Intel is working with the NSA and has already installed backdoors on all our computers. What he is saying is that they have the absolute capability to do so, and there is no way for us to know if they have already done so or will do so in the future.
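One nuance worth adding: the very beginning of a microcode update file is a plaintext 48-byte loader header, which Intel documents in its Software Developer's Manual; it is the update body after the header that is opaque, which is exactly Blank's point. A sketch of reading that header, run here on a synthetic blob (not a real update file):

```python
import struct

# 48-byte Intel microcode update header: nine little-endian uint32 fields
# followed by 12 reserved bytes. Field names follow Intel's SDM.
HEADER_FMT = "<9I12x"
FIELDS = ("header_version", "update_revision", "date_bcd",
          "processor_signature", "checksum", "loader_revision",
          "processor_flags", "data_size", "total_size")

def parse_microcode_header(blob: bytes) -> dict:
    """Parse the plaintext header at the start of a microcode update blob."""
    hdr = dict(zip(FIELDS, struct.unpack_from(HEADER_FMT, blob, 0)))
    # The date is packed BCD as mmddyyyy, e.g. 0x06102013 means 06/10/2013.
    d = hdr["date_bcd"]
    hdr["date"] = f"{d & 0xFFFF:04x}-{(d >> 24) & 0xFF:02x}-{(d >> 16) & 0xFF:02x}"
    return hdr

# Synthetic example blob; the checksum and signature values are made up.
demo = struct.pack("<9I12x", 1, 0x1B, 0x06102013, 0x000306A9,
                   0xDEADBEEF, 1, 0x12, 2000, 2048)
hdr = parse_microcode_header(demo)
print(hdr["update_revision"], hdr["date"])
```

Everything past those 48 bytes is the encrypted payload, so a parser like this is as far as an outsider can get.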
Posted by hueymahl | Mon Jul 15, 2013, 02:38 PM (14 replies)