Dr. Papert, who was born in South Africa, was one of the leading educational theorists of the last half-century and a co-director of the renowned Artificial Intelligence Laboratory at the Massachusetts Institute of Technology. In some circles he was considered the world’s foremost expert on how technology can provide new ways for children to learn.
In the pencil-and-paper world of the 1960s classroom, Dr. Papert envisioned a computing device on every desk and an internetlike environment in which vast amounts of printed material would be available to children. He put his ideas into practice, creating in the late ’60s a computer programming language, called Logo, to teach children how to use computers.
I missed out on using Logo as a kid, but I know many people for whom Logo was their introduction to computers and programming. The MIT Media Lab has a short remembrance of Papert as well.
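Logo’s best-remembered feature is its turtle: a cursor that children steer around the screen with commands like FORWARD and RIGHT, drawing as it goes. To give a flavor of the idea for those who also missed it, here’s a tiny, hypothetical interpreter for two turtle commands, sketched in Python rather than Papert’s actual Logo (the function name and command set are mine, not his):

```python
def run_logo(commands):
    """Interpret a tiny subset of Logo turtle commands
    (FORWARD n, RIGHT deg in multiples of 90) and return
    the points the turtle visits."""
    # Headings cycle east -> south -> west -> north on right turns.
    dirs = [(1, 0), (0, -1), (-1, 0), (0, 1)]
    x, y, d = 0, 0, 0
    path = [(x, y)]
    for cmd, arg in commands:
        if cmd == "FORWARD":
            dx, dy = dirs[d]
            x, y = x + dx * arg, y + dy * arg
            path.append((x, y))
        elif cmd == "RIGHT":
            d = (d + arg // 90) % 4  # turn right in 90-degree steps

    return path

# REPEAT 4 [FORWARD 100 RIGHT 90] -- the classic Logo square
square = [("FORWARD", 100), ("RIGHT", 90)] * 4
print(run_logo(square))
# -> [(0, 0), (100, 0), (100, -100), (0, -100), (0, 0)]
```

The payoff Papert was after is visible even here: the turtle ends up exactly where it started, and a child can discover why by walking the square themselves.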
Elie Wiesel died yesterday in NYC aged 87. He survived Auschwitz and Buchenwald during WWII and later wrote and spoke extensively about the experience, not letting the world forget what happened to so many Jews under Hitler’s boot. For his efforts, Wiesel won the Nobel Peace Prize in 1986 and this part of his acceptance speech remains as vital as when he spoke it:
And then I explained to him how naive we were, that the world did know and remain silent. And that is why I swore never to be silent whenever and wherever human beings endure suffering and humiliation. We must always take sides. Neutrality helps the oppressor, never the victim. Silence encourages the tormentor, never the tormented. Sometimes we must interfere. When human lives are endangered, when human dignity is in jeopardy, national borders and sensitivities become irrelevant. Wherever men or women are persecuted because of their race, religion, or political views, that place must — at that moment — become the center of the universe.
I am going to be thinking about that paragraph a lot in the next few months, I think.
Vanity Fair had Sam Roberts, an obituary writer from the NY Times, come up with an obit for Jesus, as it might have been written 2000 or so years ago.
His father was named Joseph, although references to him are scarce after Jesus’s birth. His mother was Miriam, or Mary, and because he was sometimes referred to as “Mary’s son,” questions had been raised about his paternity.
He is believed to have been the eldest of at least six siblings, including four brothers (James, Joseph, Judas, and Simon) and several sisters. He never married, unusual for a man of his age, but not surprising for a Jew with an apocalyptic vision.
The “about 33” in reference to his age is a nice touch.
That Rickman never won an Oscar (he did receive a Golden Globe, an Emmy, a Bafta and many more) became a perennial topic in interviews but did not seem to trouble the actor himself. “Parts win prizes, not actors,” he said in 2008. It was the wider worth of his art to which Rickman remained committed, saying that he found it easier to treat the work seriously if he could look upon himself with levity.
“Actors are agents of change,” he said. “A film, a piece of theatre, a piece of music, or a book can make a difference. It can change the world.”
I loved Rickman as Hans Gruber in Die Hard and as Dr. Lazarus in Galaxy Quest, but I’ll remember his turn as Severus Snape in the Harry Potter films the most. Among several fantastic actors in that series, Rickman’s performance was arguably the best. Many characters in Potter struggled between the good and not-so-good sides of themselves (including Harry and Dumbledore) but none of them carried that battle off as well as Rickman’s Snape.
David Bowie died Sunday from cancer. Dave Pell at Nextdraft has a nice roundup of links, writing:
In the NYT obituary, Jon Pareles writes: “Mr. Bowie wrote songs, above all, about being an outsider: an alien, a misfit, a sexual adventurer, a faraway astronaut.” Maybe that’s why there is such an outpouring of emotion at the news of David Bowie’s death at the age of 69. Everyone feels like an outsider and Bowie made being an outsider feel more like being ahead of the curve. Today, there are people who are famous for nothing. David Bowie was famous for everything.
Bowie was also quite keen on the Internet:
Quartz calls him a tech visionary, and there’s this from a 1999 Rolling Stone article: “David Bowie has pulled another cyber-coup by becoming the first major-label artist to sell a complete album online in download form.”
He didn’t get the future exactly right, but authorship and intellectual property have been “in for such a bashing” lately and music sales are down down down:
“Music itself is going to become like running water or electricity,” he added. “So it’s like, just take advantage of these last few years because none of this is ever going to happen again. You’d better be prepared for doing a lot of touring because that’s really the only unique situation that’s going to be left. It’s terribly exciting. But on the other hand it doesn’t matter if you think it’s exciting or not; it’s what’s going to happen.”
One of the world’s most famous madams, Madame Claude (real name: Fernande Grudet), has died at 92, leaving behind a colorful obituary in which it’s hard to discern what the real story was behind the mère maquerelle to the world’s most powerful men.
In 1975, the French tax authorities, who had begun taking an interest in Ms. Grudet’s business, estimated that she was taking in 100,000 to 140,000 francs a month. Her clients, whom she called “friends,” were a catalog of the rich and famous.
The soul of discretion in her heyday, Ms. Grudet became a heavy name-dropper when the time came to tell her life story, which she did in two memoirs: “Allo Oui, or the Memoirs of Madame Claude” (1975), written with Jacques Quoirez, the brother of her good friend Françoise Sagan and one of her testers; and “Madam,” published in 1994 under the name Claude Grudet.
By her account, the “friends” included John F. Kennedy, the shah of Iran, Muammar el-Qaddafi, Gianni Agnelli, Moshe Dayan, Marc Chagall, Rex Harrison and King Hussein of Jordan, who, she said, once told a Claude girl: “You and I are in the same business. We have to smile even when we don’t feel like it.”
Oliver Sacks was a champion of one of humankind’s most admirable qualities: Curiosity. The neurologist and writer died on Monday. He wrote beautifully about his impending death in a piece published a couple weeks ago:
And now, weak, short of breath, my once-firm muscles melted away by cancer, I find my thoughts, increasingly, not on the supernatural or spiritual, but on what is meant by living a good and worthwhile life…
A giant in the world of design, Massimo Vignelli, passed away this morning at the age of 83. Michael Bierut, who worked for Vignelli, has a nice remembrance of him.
Today there is an entire building in Rochester, New York, dedicated to preserving the Vignelli legacy. But in those days, it seemed to me that the whole city of New York was a permanent Vignelli exhibition. To get to the office, I rode in a subway with Vignelli-designed signage, shared the sidewalk with people holding Vignelli-designed Bloomingdale’s shopping bags, walked by St. Peter’s Church with its Vignelli-designed pipe organ visible through the window. At Vignelli Associates, at 23 years old, I felt I was at the center of the universe.
Kumar Pallana, one of Wes Anderson’s cast of regulars, has died at age 94. Pallana appears as Kumar in Bottle Rocket, Mr. Littlejeans in Rushmore, and as Pagoda in The Royal Tenenbaums.
Pallana led a massively interesting life before hitting the big screen at nearly 80. Born in colonial India, he lived all around the world, and first made a name for himself as an entertainer in America in the 1950s. Back then he was known as Kumar Of India, and his specialty was spinning plates; he even appeared on Captain Kangaroo in 1961. (Other feats included magic, balancing, swordplay, and juggling; you can see him do a handstand in The Royal Tenenbaums.)
Every few months on the web, a new candidate emerges for the Bad-Ass Hall of Fame, a collection of amazing people who lived large, walked their own path, and left their mark on history with flair. Today’s candidate is Mad Jack Churchill, a British Commando leader during World War II who died in 1996. Churchill fought in the war armed with a bow & arrows, a broadsword, and occasionally even bagpipes. Here’s a photo of him (far right) during a training exercise in Scotland, sword in hand as he storms the beach:
What a sight he must have been, leading charges brandishing a sword and sucking on his pipes. Churchill even killed a German soldier in France with an arrow, recording the only known kill by bow in the war for the British. From a profile in WWII History magazine:
During the BEF’s fighting retreat, Churchill remained aggressive, unwilling to give up a yard of ground while extracting the maximum cost from the enemy. He was especially fond of raids and counterattacks, leading small groups of picked soldiers against the advancing Germans. He presented a strange, almost medieval figure at the head of his men, carrying not only his war bow and arrows, but his sword as well.
As befitted his love of things Scottish, Churchill carried the basket-hilted claymore (technically a claybeg, the true claymore being an enormous two-handed sword). Later on, asked by a general who awarded him a decoration why he carried a sword in action, Churchill is said to have answered: “In my opinion, sir, any officer who goes into action without his sword is improperly dressed.”
The war-diary of 4th Infantry Brigade, to which Churchill’s battalion belonged, commented on this extraordinary figure. “One of the most reassuring sights of the embarkation [from Dunkirk] was the sight of Captain Churchill passing down the beach with his bows and arrows. His high example and his great work … were a great help to the 4th Infantry Brigade.”
And this bit sounds totally made up:
Churchill himself was far in front of his troopers. Sword in hand, accompanied only by a corporal named Ruffell, he advanced into the town itself. Undiscovered by the enemy, he and Ruffell heard German soldiers digging in all around them in the gloom. The glow of a cigarette in the darkness told them the location of a German sentry post. What followed, even Churchill later admitted, was “a bit Errol Flynn-ish.”
The first German sentry post, manned by two men, was taken in silence. Churchill, his sword blade gleaming in the night, appeared like a demon from the darkness, ordered “haende hoch!” and got results. He gave one German prisoner to Ruffell, then slipped his revolver lanyard around the second sentry’s neck and led him off to make the rounds of the other guards. Each post, lulled into a sense of security by the voice of their captive comrade, surrendered to this fearsome apparition with the ferocious mustache and the naked sword.
Altogether, Churchill and Corporal Ruffell collected 42 prisoners, complete with their personal weapons and a mortar they were manning in the village. Churchill and his claymore took the surrender of ten men in a bunch around the mortar. He and his NCO then marched the whole lot back into the British lines.
As Churchill himself described the event, it all sounded rather routine: “I always bring my prisoners back with their weapons; it weighs them down. I just took their rifle bolts out and put them in a sack, which one of the prisoners carried. [They] also carried the mortar and all the bombs they could carry and also pulled a farm cart with five wounded in it….I maintain that, as long as you tell a German loudly and clearly what to do, if you are senior to him he will cry ‘jawohl’ and get on with it enthusiastically and efficiently whatever the … situation. That’s why they make such marvelous soldiers…”
Crazy! After the war, he took up boat refurbishing, river surfing, and freaking out train passengers:
In his last job he would sometimes stand up on a train journey from London to his home, open the window and hurl out his briefcase, then calmly resume his seat. Fellow passengers looked on aghast, unaware that he had flung the briefcase into his own back garden.
Heaney’s take on the Anglo-Saxon most reminds me of the first of Ezra Pound’s Cantos, a weird mix of old epic and contemporary free-verse imagery and meters, a translation of a translation of Homer that begins “And then” and ends “So that:”
When Swinburne died, W.B. Yeats is said to have told his sister, “Now I am King of the Cats.” When Robert Frost died, John Berryman asked, “who’s number one?”
I note this not to pose the question “who’s number one?” now that Heaney has died, but to observe that just as champion boxers and sprinters often have outsized competitive personalities that seem like caricatures compared to other athletes, even among writers, and even when they resist, as Heaney did, being drawn into literary feuds or political debate, great poets are often magnificent and terrible and troubling and glorious and weird.
Douglas Engelbart died at his home in California yesterday at the age of 88. Engelbart invented the mouse, among other things. In 1968, Engelbart gave what was later called The Mother of All Demos, in which he demonstrated “the computer mouse, video conferencing, teleconferencing, hypertext, word processing, hypermedia, object addressing and dynamic file linking, bootstrapping, and a collaborative real-time editor”.
Not bad for a single demo. Truly one of the giants of our age.
Update: Bret Victor urges us to remember Engelbart not for the technology he created but for his vision of how people could collaborate and create together using technology.
The least important question you can ask about Engelbart is, “What did he build?” By asking that question, you put yourself in a position to admire him, to stand in awe of his achievements, to worship him as a hero. But worship isn’t useful to anyone. Not you, not him.
The most important question you can ask about Engelbart is, “What world was he trying to create?” By asking that question, you put yourself in a position to create that world yourself.
Ebert, 70, who reviewed movies for the Chicago Sun-Times for 46 years and on TV for 31 years, and who was without question the nation’s most prominent and influential film critic, died Thursday in Chicago. He had been in poor health over the past decade, battling cancers of the thyroid and salivary gland.
He lost part of his lower jaw in 2006, and with it the ability to speak or eat, a calamity that would have driven other men from the public eye. But Ebert refused to hide, instead forging what became a new chapter in his career, an extraordinary chronicle of his devastating illness that won him a new generation of admirers. “No point in denying it,” he wrote, analyzing his medical struggles with characteristic courage, candor and wit, a view that was never tinged with bitterness or self-pity.
Always technically savvy - he was an early investor in Google - Ebert let the Internet be his voice. His rogerebert.com had millions of fans, and he received a special achievement award as the 2010 “Person of the Year” from the Webby Awards, which noted that “his online journal has raised the bar for the level of poignancy, thoughtfulness and critique one can achieve on the Web.” His Twitter feeds had 827,000 followers.
Ebert was both widely popular and professionally respected. He not only won a Pulitzer Prize - the first film critic to do so - but his name was added to the Hollywood Walk of Fame in 2005, among the movie stars he wrote about so well for so long. His reviews were syndicated in hundreds of newspapers worldwide.
I got taken too the one time I saw Elvis, but in a totally different way. It was the autumn of 1971, and two tickets to an Elvis show turned up at the offices of Creem magazine, where I was then employed. It was decided that those staff members who had never had the privilege of witnessing Elvis should get the tickets, which was how me and art director Charlie Auringer ended up in nearly the front row of the biggest arena in Detroit. Earlier Charlie had said, “Do you realize how much we could get if we sold these fucking things?” I didn’t, but how precious they were became totally clear the instant Elvis sauntered onto the stage. He was the only male performer I have ever seen to whom I responded sexually; it wasn’t real arousal, rather an erection of the heart, when I looked at him I went mad with desire and envy and worship and self-projection. I mean, Mick Jagger, whom I saw as far back as 1964 and twice in ‘65, never even came close.
Mr. Koch’s 12-year mayoralty encompassed the fiscal austerity of the late 1970s and the racial conflicts and municipal corruption scandals of the 1980s, an era of almost continuous discord that found Mr. Koch at the vortex of a maelstrom day after day.
But out among the people or facing a news media circus in the Blue Room at City Hall, he was a feisty, slippery egoist who could not be pinned down by questioners and who could outtalk anybody in the authentic voice of New York: as opinionated as a Flatbush cabby, as loud as the scrums on 42nd Street, as pugnacious as a West Side reform Democrat mother.
“I’m the sort of person who will never get ulcers,” the mayor - eyebrows devilishly up, grinning wickedly at his own wit - enlightened the reporters at his $475 rent-controlled apartment in Greenwich Village on Inauguration Day in 1978. “Why? Because I say exactly what I think. I’m the sort of person who might give other people ulcers.”
Koch, New York City’s dominant political figure of the 1980s and the architect of what remains its governing political coalition, stayed politically relevant through his long political twilight, courted aggressively by figures including Presidents George W. Bush and Barack Obama for his role as a proxy for pro-Israel Democrats willing, but not eager, to cross party lines.
But Koch’s later years of quips, movie reviews, and presidential politics remain secondary to his central legacy, which is in New York’s City Hall. Tall and gangly with a domed, bald head and a knowing smile, Koch was New York’s mayor and its mascot from 1978 to 1989. Through three terms, he repeated one question like a mantra: “How’m I doing?” At first, the answer was clear to observers who had watched the city slide toward bankruptcy: exceptionally well. Koch managed New York back from the brink, drove hard bargains with municipal unions, cut jobs where he had to and reduced taxes where he could. He presided over a boom in Manhattan, and spent his new revenues on renewing the South Bronx.
But as the Koch administration moved into its third term, the mayor lost his momentum. As Wall Street boomed in the 1980s, Koch took advantage of the new revenues to double New York City’s budget and offer tax breaks to real estate developers. But the largesse couldn’t buy him friends: he clashed with black leaders and his old allies among Manhattan’s liberal Democrats. New York became famous for its racial tensions and rising crime. He courted the Democratic Party bosses of Queens and the Bronx only to be tarnished by the corruption scandals that surrounded them.
Here’s the trailer for Koch, a documentary on the former mayor that coincidentally opens today in limited release:
The accomplished Swartz co-authored the now widely-used RSS 1.0 specification at age 14, was one of the three co-owners of the popular social news site Reddit, and completed a fellowship at Harvard’s Ethics Center Lab on Institutional Corruption. In 2010, he founded DemandProgress.org, a “campaign against the Internet censorship bills SOPA/PIPA.”
I met Aaron when he was 14 or 15. He was working on XML stuff (he co-wrote the RSS specification when he was 14) and came to San Francisco often, and would stay with Lisa Rein, a friend of mine who was also an XML person and who took care of him and assured his parents he had adult supervision. In so many ways, he was an adult, even then, with a kind of intense, fast intellect that really made me feel like he was part and parcel of the Internet society, like he belonged in the place where your thoughts are what matter, and not who you are or how old you are.
I didn’t know Aaron very well, but this is just terrible. My thoughts are with his friends and family.
Here is where we need a better sense of justice, and shame. For the outrageousness in this story is not just Aaron. It is also the absurdity of the prosecutor’s behavior. From the beginning, the government worked as hard as it could to characterize what Aaron did in the most extreme and absurd way. The “property” Aaron had “stolen,” we were told, was worth “millions of dollars” - with the hint, and then the suggestion, that his aim must have been to profit from his crime. But anyone who says that there is money to be made in a stash of ACADEMIC ARTICLES is either an idiot or a liar. It was clear what this was not, yet our government continued to push as if it had caught the 9/11 terrorists red-handed.
Although he had been a Navy fighter pilot, a test pilot for NASA’s forerunner and an astronaut, Mr. Armstrong never allowed himself to be caught up in the celebrity and glamour of the space program.
“I am, and ever will be, a white socks, pocket protector, nerdy engineer,” he said in February 2000 in one of his rare public appearances. “And I take a substantial amount of pride in the accomplishments of my profession.”
A sad day; Armstrong was one of my few heroes. In my eyes, Armstrong safely guiding the LEM to the surface of the Moon, at times by the seat of his pants, is among the most impressive and important things ever done by a human being.
At fifteen, he was selling drugs on the corners of Fayette Street, but that doesn’t begin to explain who he was. For the boys of Franklin Square — too many of them at any rate — slinging was little more than an adolescent adventure, an inevitable rite of passage. And whatever sinister vision you might conjure of a street corner drug trafficker, try to remember that a fifteen-year-old slinger is, well, fifteen years old.
He was funny. He could step back from himself and mock his own stances — “hard work,” he would say when I would catch him on a drug corner, “hard work being a black man in America.” And then he would catch my eye and laugh knowingly at his presumption. His imitations of white-authority voices — social workers, police officers, juvenile masters, teachers, reporters — were never less than pinpoint, playful savagery. The price of being a white man on Fayette Street and getting to know DeAndre McCullough was to have your from-the-other-America pontifications pulled and scalpeled apart by a manchild with an uncanny ear for hypocrisy and cant.
Chris Marker, best known as a filmmaker and for his film La jetée, has died aged 91.
Marker’s creative use of sound, images and text in his poetic, political and philosophical documentaries made him one of the most inventive of film-makers. They looked forward to what is called “the new documentary”, but also looked back to the literary essay in the tradition of Michel de Montaigne. Marker’s interests lay in transitional societies - “life in the process of becoming history,” as he put it. How do various cultures perceive and sustain themselves and each other in the increasingly intermingled modern world?
La jetée is available in its 28-minute entirety on YouTube and is well worth watching.
The North Korean leader died two days ago and now no one knows who’s in charge or what’s going to happen, which is pretty much par for the course for North Korea.
Kim Jong-il, the North Korean leader who realized his family’s dream of turning his starving, isolated country into a nuclear-weapons power even as it sank further into despotism, died on Saturday of a heart attack while traveling on his train, according to an announcement Monday by the country’s state-run media.
“My chief consolation in this year of living dyingly has been the presence of friends,” he wrote in the June 2011 issue. He died in their presence, too, at the MD Anderson Cancer Center in Houston, Texas. May his 62 years of living, well, so livingly console the many of us who will miss him dearly.
Although I suspect there will be posthumous writings to come, Hitchens’ final piece for Vanity Fair, published in the January 2012 issue, is a rumination on pain and death.
Before I was diagnosed with esophageal cancer a year and a half ago, I rather jauntily told the readers of my memoirs that when faced with extinction I wanted to be fully conscious and awake, in order to “do” death in the active and not the passive sense. And I do, still, try to nurture that little flame of curiosity and defiance: willing to play out the string to the end and wishing to be spared nothing that properly belongs to a life span. However, one thing that grave illness does is to make you examine familiar principles and seemingly reliable sayings. And there’s one that I find I am not saying with quite the same conviction as I once used to: In particular, I have slightly stopped issuing the announcement that “Whatever doesn’t kill me makes me stronger.”
Gordon died at 3:38 p.m. holding hands with his wife as the family they built surrounded them.
“It was really strange, they were holding hands, and dad stopped breathing but I couldn’t figure out what was going on because the heart monitor was still going,” said Dennis Yeager. “But we were like, he isn’t breathing. How does he still have a heart beat? The nurse checked and said that’s because they were holding hands and it’s going through them. Her heart was beating through him and picking it up.”
“They were still getting her heartbeat through him,” said Donna Sheets.
We lost a tech giant today. Dennis MacAlistair Ritchie, co-creator of Unix and the C programming language with Ken Thompson, has passed away at the age of 70. Ritchie made tremendous contributions to the computer industry, directly and indirectly affecting (improving) the lives of most people in the world, whether you know it or not.
These sorts of comparisons are inexact at best, but Ritchie’s contribution to the technology industry rivals that of Steve Jobs’…Ritchie’s was just less noticed by non-programmers.
I am incredibly sad this morning. Why am I, why are we, feeling this so intensely? I have some thoughts about that but not for now. For now, I’m just going to share some of the things I’ve been reading and watching about Jobs. And after that, I think I’m done here for the day and will move on to spend some time building my little thing that I’m trying to make insanely great.
The 2005 Stanford Commencement Speech. For me, the speech is better in text than in video.
Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma - which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.
The iPhone announcement in 2007. I am with Dan Frommer on this one: this is Jobs at his absolute best. He was just so so excited about this thing that he and his team had created, so proud. His presentation is also a reminder of how revolutionary the iPhone was four years ago.
Jobs usually had little interest in public self-analysis, but every so often he’d drop a clue to what made him tick. Once he recalled for me some of the long summers of his youth. “I’m a big believer in boredom,” he told me. Boredom allows one to indulge in curiosity, he explained, and “out of curiosity comes everything.” The man who popularized personal computers and smartphones — machines that would draw our attention like a flame attracts gnats — worried about the future of boredom. “All the [technology] stuff is wonderful, but having nothing to do can be wonderful, too.”
I like to think that in the run-up to his final keynote, Steve made time for a long, peaceful walk. Somewhere beautiful, where there are no footpaths and the grass grows thick. Hand-in-hand with his wife and family, the sun warm on their backs, smiles on their faces, love in their hearts, at peace with their fate.
I have no way of knowing how Steve talked to his team during Apple’s darkest days in 1997 and 1998, when the company was on the brink and he was forced to turn to archrival Microsoft for a rescue. He certainly had a nasty, mercurial side to him, and I expect that, then and later, it emerged inside the company and in dealings with partners and vendors, who tell believable stories about how hard he was to deal with.
But I can honestly say that, in my many conversations with him, the dominant tone he struck was optimism and certainty, both for Apple and for the digital revolution as a whole. Even when he was telling me about his struggles to get the music industry to let him sell digital songs, or griping about competitors, at least in my presence, his tone was always marked by patience and a long-term view. This may have been for my benefit, knowing that I was a journalist, but it was striking nonetheless.
At times in our conversations, when I would criticize the decisions of record labels or phone carriers, he’d surprise me by forcefully disagreeing, explaining how the world looked from their point of view, how hard their jobs were in a time of digital disruption, and how they would come around.
This quality was on display when Apple opened its first retail store. It happened to be in the Washington, D.C., suburbs, near my home. He conducted a press tour for journalists, as proud of the store as a father is of his first child. I commented that, surely, there’d only be a few stores, and asked what Apple knew about retailing.
He looked at me like I was crazy, said there’d be many, many stores, and that the company had spent a year tweaking the layout of the stores, using a mockup at a secret location. I teased him by asking if he, personally, despite his hard duties as CEO, had approved tiny details like the translucency of the glass and the color of the wood.
Heartwarming/breaking: shortly following the news of Steve’s death, our daughter called me “dada” for the first time. It goes on.
The Computer That Changed My Life. Bryce Roberts shares the story of the first Apple computer he bought.
As I sat alone in my makeshift office in Sandy, UT I decided that I wanted to start fresh, all the way down to my operating system. It sounds funny now, but it was an important psychological move for me. I wanted the next level to look and feel different than what I’d experienced in the past in every possible way.
I fired up my Sony Vaio and surfed over to Apple.com. I wasn’t an Apple fanboy. I’d never owned one of their machines. And that was the point.
I didn’t know if I would love it or even like it, but it was going to be different. And different was exactly how I wanted the next level to feel.
This is *exactly* why I bought an iBook in 2002 after a lifetime of Windows/DOS machines.
For the quarter-century that followed, Mrs. Clark lived in the apartment in near solitude, amid a profusion of dollhouses and their occupants. She ate austere lunches of crackers and sardines and watched television, most avidly “The Flintstones.” A housekeeper kept the dolls’ dresses impeccably ironed.
With his deep, melodic voice and smooth soul rumble, Dogg was one of the key elements in the rise of the West Coast G-Funk sound pioneered by Death Row Records in the early 1990s. Though overshadowed by such peers as Dr. Dre, Snoop Dogg and Warren G, Nate was a critical participant in a number of major left-coast gangsta hits, including G’s “Regulate” and Dre’s iconic solo debut, 1992’s The Chronic.
Nothing in the news media yet, but many folks on Twitter and his colleague Nassim Taleb are reporting that the father of fractal geometry is dead at age 85. We’re not there yet, but someday Mandelbrot’s name will be mentioned in the same breath as Einstein’s as a genius who fundamentally shifted our perception of how the world works.
Former NBA player, shot blocker extraordinaire, and humanitarian Manute Bol died over the weekend at age 47. He died of a rare skin condition caused by a medication he took while in Africa.
“You know, a lot of people feel sorry for him, because he’s so tall and awkward,” Charles Barkley, a former 76ers teammate, once said. “But I’ll tell you this — if everyone in the world was a Manute Bol, it’s a world I’d want to live in.”
Ken Arneson emailed me to say that he heard the phrase was first used by the Sudanese immigrant basketball player Manute Bol, believed to have been a native speaker of Dinka (a very interesting and thoroughly un-Indo-Europeanlike language of the Nilo-Saharan superfamily). Says Arneson, “I first heard the phrase here in the Bay Area when Bol joined the Golden State Warriors in 1988, when several Warriors players started using the phrase.” And Ben Zimmer’s rummaging in the newspaper files down in the basement of Language Log Plaza produced a couple of early 1989 quotes that confirm this convincingly:
St. Louis Post-Dispatch, Jan. 10, 1989: When he [Manute Bol] throws a bad pass, he’ll say, “My bad” instead of “My fault,” and now all the other players say the same thing.
USA Today, Jan. 27, 1989: After making a bad pass, instead of saying “my fault,” Manute Bol says, “my bad.” Now all the other Warriors say it too.
Update: As a recent post on Language Log notes, several people picked up on this and kinda sorta got rid of the “may have” and the story became that Bol absolutely coined the phrase “my bad”. Unfortunately, the evidence doesn’t support that theory (although it doesn’t entirely disprove it either). The internet is so proficient at twisting the original meaning of things as they propagate that Telephone should really be called Internet.