There’s been one particularly misleading claim repeated throughout coverage of CIA documents released by WikiLeaks today: that the agency’s in-house hackers “bypassed” the encryption used by popular secure-chat software like Signal and WhatsApp.

By specifically mentioning these apps, news outlets implied that the agency has a means of getting through the protections built into the chat systems. It doesn’t. Instead, it has the ability, in some cases, to take control of entire phones; accessing encrypted chats is simply one of many security implications of this. WikiLeaks’ own analysis of the documents at least briefly acknowledges this, stating that CIA “techniques permit the CIA to bypass the encryption of WhatsApp, Signal, Telegram, Wiebo, Confide and Cloackman by hacking the ‘smart’ phones that they run on and collecting audio and message traffic before encryption is applied.”
It’s difficult to buy a new TV that doesn’t come with a suite of (generally mediocre) “smart” software, giving your home theater some of the functions typically found in phones and tablets. But bringing these extra features into your living room means bringing a microphone, too — a fact the CIA is exploiting, according to a new trove of documents released today by WikiLeaks.

According to documents inside the cache, a CIA program named “Weeping Angel” provided the agency’s hackers with access to Samsung Smart TVs, allowing a television’s built-in voice control microphone to be remotely enabled while keeping up the appearance that the TV itself was switched off, a state the documents call “Fake-Off mode.” Although the display would be switched off, and LED indicator lights would be suppressed, the hardware inside the television would continue to operate, unbeknownst to the owner. The method, co-developed with British intelligence, required implanting a given TV with malware — it’s unclear whether this attack could be executed remotely, but the documentation includes reference to in-person infection via a tainted USB drive. Once the malware was inside the TV, it could relay recorded audio data to a third party (presumably a server controlled by the CIA) through the included network connection.
There are very few things that $5bn can’t buy, but one of them is manners. This week video emerged of Travis Kalanick, the CEO and founder of ride-share app Uber, patronising and swearing at one of his own drivers, who complained that harsh company policies had forced him into bankruptcy. “Some people don’t like to take responsibility for their own shit,” sneered Kalanick. Truer words were never spoken by a tycoon: for Uber, along with many other aggressive corporations, not taking responsibility for your own shit isn’t just a philosophy, it’s a business model.

Uber has barely been out of the news this year, with a succession of scandals cementing the company’s reputation as a byword for cod-libertarian douchebaggery. Accusations of strike-breaking during protests against Donald Trump’s “Muslim ban” sparked a viral campaign to get customers to delete the app. A week later, a former employee went public with accusations of sexual harassment and institutional misogyny. Kalanick, who was pressured to withdraw from a position as a business adviser to Trump, is now facing legal suits across the world from drivers who insist that they would be better able to “take responsibility” for their lives if they could earn a living wage.
Everything is possible. Nothing is possible. Nothing hurts any more, until the consequences crash through the screen. Immersed almost permanently in virtual worlds, we cannot check what we are told against tangible reality. Is it any wonder that we live in a post-truth era, when we are bereft of experience?

It is no longer rare to meet adults who have never swum except in a swimming pool, never slept except in a building, never run a mile or climbed a mountain, never been stung by a bee or a wasp, never broken a bone or needed stitches. Without a visceral knowledge of what it is to be hurt and healed, exhausted and resolute, freezing and ecstatic, we lose our reference points. We are separated from the world by a layer of glass. Climate change, distant wars, the erosion of democracy, resurgent fascism – in our temperature-controlled enclosures, all can be reduced to abstractions.
The global race to tame and civilise digital capitalism is on. In France, the “right to disconnect” – requiring companies of a certain size to negotiate how their employees handle out-of-hours work and availability – came into force on 1 January. In 2016 a similar bill was submitted to the South Korean parliament. Earlier this month a congressman in the Philippines introduced another such measure, receiving the support of an influential local trade union. Many companies – from Volkswagen to Daimler – have already made similar concessions, even in the absence of national legislation.
In April 2010, a classified U.S. military video was released through the website WikiLeaks, recorded from a camera aboard an Apache helicopter. It shows the massacre of civilians on a street in Baghdad, Iraq. The video, which WikiLeaks called “Collateral Murder,” documented in graphic, grainy black-and-white detail a helicopter gunship attack on July 12, 2007. The helicopter opens fire with machine guns on a group of men, including Reuters news agency photographer Namir Noor-Eldeen and his driver, Saeed Chmagh. Most of the men are killed instantly. Noor-Eldeen runs away, and the crosshairs follow him, shooting nonstop, until he falls dead.
Scott Shatford didn’t bargain for criminal charges.
The official complaint arrived at his front door in May, more than a year after Santa Monica, California, voted to ban the short-term home rentals flooding its small beachside community. Shatford knew the rules but had chosen to ignore them, continuing to list two properties for short stays on Airbnb. He found the city’s ban ridiculous and assumed it would be difficult to enforce. Even if he did get caught, Shatford figured a few fines would be a small price to pay for properties earning him around $60,000 a year.
Of all the big firms in Silicon Valley, Amazon had the most to lose from Donald Trump’s presidency. And lose it did, albeit briefly, its share price dropping 5% shortly after the election.
During the campaign, Trump warned that Amazon had a “huge antitrust problem” – a reasonable stance for the populist that he once aspired to be. Most likely, though, his animosity had more to do with the fact that Amazon’s founder, Jeff Bezos, also owns the Washington Post, an influential newspaper that took an early, strong dislike to Trump. By the time of Amazon’s massive cloud-computing conference, which kicked off in Las Vegas at the end of November, such squabbles seemed to have been forgotten. Amazon went on to wow the audience with impressive gimmicks. Did you know it has a truck – yes, a real truck – to drive your data to the cloud? Apparently, it’s much faster than using networks.
Fewer than 2,000 readers are on his website when Paris Wade, 26, awakens from a nap, reaches for his laptop and thinks he needs to, as he puts it, “feed” his audience. “Man, no one is covering this TPP thing,” he says after seeing an article suggesting that President Obama wants to pass the Trans-Pacific Partnership before he leaves office. Wade, a modern-day digital opportunist, sees an opening. He begins typing a story.
“CAN’T TRUST OBAMA,” he writes as the headline, then pauses. His audience hates Obama and loves President-elect Donald Trump, and he wants to capture that disgust and cast it as a drama between good and evil. He resumes typing: “Look At Sick Thing He Just Did To STAB Trump In The Back… .”
How will artificial intelligence systems change the way we live? This is a tough question: on one hand, AI tools are producing compelling advances in complex tasks, with dramatic improvements in energy consumption, audio processing, and leukemia detection. There is extraordinary potential to do much more in the future. On the other hand, AI systems are already making problematic judgements that are producing significant social, cultural, and economic impacts in people’s everyday lives.

AI and decision-support systems are embedded in a wide array of social institutions, from influencing who is released from jail to shaping the news we see. For example, Facebook’s automated content editing system recently censored the Pulitzer Prize-winning image of a nine-year-old girl fleeing napalm bombs during the Vietnam War. The girl is naked; to an image processing algorithm, this might appear as a simple violation of the policy against child nudity. But to human eyes, Nick Ut’s photograph, “The Terror of War”, means much more: it is an iconic portrait of the indiscriminate horror of conflict, and it has an assured place in the history of photography and international politics. The removal of the image caused an international outcry before Facebook backed down and restored it. “What they do by removing such images, no matter what good intentions, is to redact our shared history,” said the Prime Minister of Norway, Erna Solberg.