Taking a break to set my fear aside while reading this NYT review of former counterterrorism czar Richard Clarke’s new book, I was reminded of how much I enjoy the language of hacking:
North Korea is suspected of being behind the cyberattacks of July 2009 that took down the Web servers of the Treasury, Secret Service, Federal Trade Commission and Transportation Department and is thought to have placed “trapdoors” — code that allows hackers future access to a network — on computer networks on at least two continents.
Trapdoors are just one tool that rival nation-states and cyberterrorists can use. There are also “logic bombs” (code that sets off malicious functions when triggered), Distributed Denial of Service (D.D.O.S.) attacks (in which a site or server is flooded with more requests for data than it can process), and foreign-manufactured software and hardware that might have been tampered with before being shipped to the States.
For more — including rock phish, flip buttons, and of course, Trojan horses — consult Wikipedia’s “malware” category page.
As is now widely known, and as NYT reports, Twitter has donated the tweets from its public timeline to The Library of Congress. The data takes up much less physical and digital space than you’d think. Ten billion tweets occupy just 5TB of storage, enough to fit easily on a desktop hard drive. But dealing with the onslaught of primary source material may require a new kind of historian:
A tool like Google Replay is helpful in focusing on one topic. But it displays only 10 Tweets at a time. To browse 10 billion — let’s see, figuring six seconds for a quick scan of each screen — would require about 190 sleepless years. […] [History professor Daniel J.] Cohen encourages historians to find new tools and methods for mining the “staggeringly large historical record” of Tweets. This will require a different approach, he said, one that lets go of straightforward “anecdotal history.”
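The figures quoted above check out on the back of an envelope. A minimal sketch (assuming decimal terabytes, and the article’s rates of 10 tweets per screen at 6 seconds per screen):

```python
# Back-of-the-envelope check of the archive figures quoted above.
tweets = 10_000_000_000      # ten billion tweets
storage_tb = 5               # reported archive size, terabytes (decimal assumed)

# Average storage per tweet: 5 TB spread over 10 billion tweets.
bytes_per_tweet = storage_tb * 10**12 / tweets   # 500 bytes each

# Reading time: 10 tweets per screen, 6 seconds per screen.
screens = tweets / 10
total_seconds = screens * 6
years = total_seconds / (365.25 * 24 * 3600)     # ~190 sleepless years

print(f"{bytes_per_tweet:.0f} bytes per tweet, {years:.0f} years to skim")
```

Five hundred bytes per tweet and roughly 190 years of nonstop scanning, just as the article says.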
Rather than telling and retelling history, then, the new historians’ role will be to edit history. Liz Danzico explains,
[I]nformation overload is not a new problem and therefore does not accurately describe what’s at issue today. The critical issue is simply a failure of filters.
Enter the editor.
There has long been an invisible tribe, a mysterious group, who transform scattered thoughts into compelling stories, who splice hundreds of hours of video into feature-length films, who segregate the semicolons from the em dashes. These are editors working across media sectors — publishing, film, music, more — to deliver transformative stories with clarity and grace.
Whether we see it or not, we’re becoming editors ourselves. In the Gutenberg era, the one-to-many relationship, in which an editor dictated the content for the masses, was common. In the post-Gutenberg era, our reliance became more democratic: We sought out editors who could sift through the staggering amount of information for us, signal where to look, what to read, and what to pay attention to. Now there’s another shift at play; […] We are, for the first time, accepting the role of editor, and exhibiting our editorial qualities outward.
What’s interesting in comparing these two articles is the push and pull between the kind of bottom-up, user-generated editing that Liz describes and the kind of top-down, authority-driven editing that historians represent. My suspicion is that we’ll need a blend of the two — historians who are also users, who are sensitive to the kind of spontaneous, networked editing that Liz describes, but who are also comfortable taking a broader view than anyone in the midst of a historical moment ever could. Digital humanities, indeed.
In one of the more interesting quotes about my generation that I’ve read recently, art dealer Daniel Reich observed to NYT that “My generation grew up in a time when we didn’t have heroes. You grew up believing you were being hoodwinked and manipulated—and knowing you were, but learning to enjoy it because it came in fun colors or was on MTV.” More of the article here. Visit Reich’s gallery here. Visit Becky Smith’s gallery Bellwether here.
“In Cold Blood began, as the story goes, when Truman Capote came across a 300-word article in the back of the New York Times describing the unexplained murder of a family of four in rural Kansas” (via Salon). It’s still just as bracing to scan the paper for true crime stories today, and it’s certainly an armchair passion of mine. In 2001, “Christopher Rocancourt jumped bail in the Hamptons, running from accusations that he soaked the rich for nearly $1 million while posing as a member of the Rockefeller family.” His story in NYT may be a bit longer than Capote’s 300-word gem, but it is no less fascinating.
In recent years higher education has gone bonkers for branding. The potentially educatable are now the educated potential, and, at least in terms of marketing and focus groups, the students have become the teachers. To wit, this article from NYT, which was instrumental for our critique of the New School’s new identity for BusinessWeek.
When it opened at the Walker Art Center in 1989, “Graphic Design in America” was one of the first serious surveys of its kind. It’s tempting to say it was ahead of its time, but I think it was probably a little bit late, if anything. The relationship of design to art has always been a difficult one, and displaying design in a contemporary art museum didn’t simplify things much. Here’s what NYT thought back then, which makes for interesting reading now (via Unbeige).
Rob Walker’s plain-spoken article on branding and the counterculture appeared in this week’s NYT Magazine to the interest of many readers, including this one. One question that Walker wrestled with—and did not quite resolve—is that of how you can claim to be rebelling against consumer culture while you’re manufacturing products for people to consume. One way of addressing it is to say that the entrepreneurs Walker describes aren’t rebelling against consuming, they’re rebelling against mass-market consuming. Everything they make is made in limited quantities, exchanged from one knowing party to the other. Theirs is a boutique economy; they want to consume small, not big.
As for these entrepreneurs’ claims of honesty and authenticity, they are virtually assured, first as a necessity for entering their chosen marketplace—their customers wouldn’t buy otherwise—and second because the ceaseless churn and craving for newness in that marketplace doesn’t give them time to sell out anyway. No sooner have they made it than their customers are on to the next microbrand. Businesses such as those Walker describes face a fork in the road: either get big and become what you don’t want to be, or stay small and true, fading gracefully into obscurity. The model does not embrace longevity, nor should it: all countercultures take youth as their primary fact, and youth, as we all know, is fleeting.