
kottke.org posts about computing

What Counts As Evidence in Mathematics?

posted by Tim Carmody   Mar 15, 2019

[Image: Einstein blackboard]

The ultimate form of argument, and for some, the most absolute form of truth, is mathematical proof. Short of a conclusive proof of a theorem, though, mathematicians also consider evidence that might 1) disprove a thesis or 2) suggest its possible truth, or even avenues for proving that it’s true. But in a not-quite-empirical field, what the heck counts as evidence?

The twin primes conjecture is one example where evidence, as much as proof, guides our mathematical thinking. Twin primes are pairs of prime numbers that differ by 2 — for example, 3 and 5, 11 and 13, and 101 and 103 are all twin prime pairs. The twin primes conjecture hypothesizes that there is no largest pair of twin primes, that the pairs keep appearing as we make our way toward infinity on the number line.

The twin primes conjecture is not the Twin Primes Theorem, because, despite being one of the most famous problems in number theory, no one has been able to prove it. Yet almost everyone believes it is true, because there is lots of evidence that supports it.

For example, as we search for large primes, we continue to find extremely large twin prime pairs. The largest currently known pair of twin primes have nearly 400,000 digits each. And results similar to the twin primes conjecture have been proved. In 2013, Yitang Zhang shocked the mathematical world by proving that there are infinitely many prime number pairs that differ by 70 million or less. Thanks to a subsequent public “Polymath” project, we now know that there are infinitely many pairs of primes that differ by no more than 246. We still haven’t proved that there are infinitely many pairs of primes that differ by 2 — the twin primes conjecture — but 2 is a lot closer to 246 than it is to infinity.
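
That kind of numerical evidence is easy to gather at small scale. As a rough sketch (mine, not from the article), here are a few lines of Python that enumerate the twin prime pairs below a bound by trial division:

```python
def is_prime(n):
    """Trial division; plenty fast for small n."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def twin_primes(limit):
    """Yield pairs (p, p + 2) that are both prime, with p < limit."""
    for p in range(3, limit, 2):
        if is_prime(p) and is_prime(p + 2):
            yield (p, p + 2)

pairs = list(twin_primes(1000))
print(len(pairs), pairs[:3], pairs[-1])   # 35 pairs: (3, 5), (5, 7), (11, 13) ... (881, 883)
```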

This starts to get really complicated once you leave the relatively straightforward arithmetical world of prime numbers behind, with its clearly empirical pairs and approximating conjectures, and start working with computer models that generate arbitrarily large numbers of mathematical statements, all of which can be counted as evidence.

Patrick Honner, the author of this article, gives what seems like a simple example: are all lines parallel or intersecting? He then shows how the models one can use to answer this question vary wildly depending on one’s initial assumptions, in this case, whether one is considering lines in a single geometric plane or lines in an n-dimensional geometric space. As always in mathematics, it comes back to one’s initial set of assumptions; you can “prove” (i.e., provide large quantities of evidence for) a statement with one set of rules, but that set of rules is not the universe.
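
The parallel-or-intersecting example is easy to play with in code. In the sketch below (my own illustration, not from the article), two lines in a single plane with different directions always meet, but the same setup lifted into three dimensions can also produce skew lines that never touch:

```python
import numpy as np

def classify(p1, d1, p2, d2):
    """Classify two lines (point p, direction d) as parallel, intersecting, or skew."""
    p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
    if p1.size == 2:  # embed 2D lines in 3D so one test handles both settings
        p1, d1, p2, d2 = (np.append(v, 0.0) for v in (p1, d1, p2, d2))
    n = np.cross(d1, d2)
    if np.allclose(n, 0):
        return "parallel (or identical)"
    # The lines meet iff the vector between them lies in the plane spanned by d1 and d2.
    return "intersecting" if np.isclose(np.dot(p2 - p1, n), 0) else "skew"

# In a single plane, two lines with different directions always meet:
print(classify([0, 0], [1, 0], [0, 1], [0, 1]))                # intersecting
# The same directions in 3D, offset out of the plane, never do:
print(classify([0, 0, 0], [1, 0, 0], [0, 1, 1], [0, 1, 0]))    # skew
```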

The Secret History of Women in Coding

posted by Jason Kottke   Feb 14, 2019

In an excerpt of his forthcoming book Coders, Clive Thompson writes about The Secret History of Women in Coding for the NY Times.

A good programmer was concise and elegant and never wasted a word. They were poets of bits. “It was like working logic puzzles — big, complicated logic puzzles,” Wilkes says. “I still have a very picky, precise mind, to a fault. I notice pictures that are crooked on the wall.”

What sort of person possesses that kind of mentality? Back then, it was assumed to be women. They had already played a foundational role in the prehistory of computing: During World War II, women operated some of the first computational machines used for code-breaking at Bletchley Park in Britain. In the United States, by 1960, according to government statistics, more than one in four programmers were women. At M.I.T.’s Lincoln Labs in the 1960s, where Wilkes worked, she recalls that most of those the government categorized as “career programmers” were female. It wasn’t high-status work — yet.

This all changed in the 80s, when computers and programming became, culturally, a mostly male pursuit.

By the ’80s, the early pioneering work done by female programmers had mostly been forgotten. In contrast, Hollywood was putting out precisely the opposite image: Computers were a male domain. In hit movies like “Revenge of the Nerds,” “Weird Science,” “Tron,” “WarGames” and others, the computer nerds were nearly always young white men. Video games, a significant gateway activity that led to an interest in computers, were pitched far more often at boys, as research in 1985 by Sara Kiesler, a professor at Carnegie Mellon, found. “In the culture, it became something that guys do and are good at,” says Kiesler, who is also a program manager at the National Science Foundation. “There were all kinds of things signaling that if you don’t have the right genes, you’re not welcome.”

See also Claire Evans’ excellent Broad Band: The Untold Story of the Women Who Made the Internet.

Buy the Cheap Thing First

posted by Tim Carmody   Feb 08, 2019

[Image: cast iron skillets]

Beth Skwarecki has written the perfect Lifehacker post with the perfect headline (so perfect I had to use it for my aggregation headline too, which I try to never do):

When you’re new to a sport, you don’t yet know what specialized features you will really care about. You probably don’t know whether you’ll stick with your new endeavor long enough to make an expensive purchase worth it. And when you’re a beginner, it’s not like beginner level equipment is going to hold you back…

How cheap is too cheap?

Find out what is totally useless, and never worth your time. Garage sale ice skates with ankles that are so soft they flop over? Pass them up.

What do most people do when starting out?

If you’re getting into powerlifting and you don’t have a belt and shoes, you can still lift with no belt and no shoes, or with the old pair of Chucks that you may already have in your closet. Ask people about what they wore when they were starting out, and it’s often one of those options…

What’s your exit plan?

How will you decide when you’re done with your beginner equipment? Some things will wear out: Running shoes will feel flat and deflated. Some things may still be usable, but you’ll discover their limitations. Ask experienced people what the fancier gear can do that yours can’t, and you’ll get a sense of when to upgrade. (You may also be able to sell still-good gear to another beginner to recoup some of your costs.)

Wearing out your beginner gear is like graduating. You know that you’ve stuck with the sport long enough that you aren’t truly a beginner anymore. You may have managed to save up some cash for the next step. And you can buy the nicer gear now, knowing exactly what you want and need.

This is 100 percent the truth, and applies to way more than just sports equipment. Computers, cooking, fashion, cars, furniture, you name it. The key thing is to pick your spots, figure out where you actually know what you want and what you want to do with it, and optimize for those. Everywhere else? Don’t outwit yourself. Play it like the beginner that you are. And save some scratch in the process. Perfect, perfect advice.

The Embroidered Computer

posted by Jason Kottke   Jan 14, 2019

Artists Irene Posch & Ebru Kurbak have built The Embroidered Computer, a programmable 8-bit computer made using traditional embroidery techniques and materials.

[Images: The Embroidered Computer]

Solely built from a variety of metal threads, magnetic, glass and metal beads, and inspired by traditional crafting routines and patterns, the piece questions the appearance of current digital and electronic technologies surrounding us, as well as our interaction with them.

Technically, the piece consists of (textile) relays, similar to early computers before the invention of semiconductors. Visually, the gold materials, here used for their conductive properties, arranged into specific patterns to fulfill electronic functions, dominate the work. Traditionally purely decorative, their pattern here defines their function. They lay bare core digital routines usually hidden in black boxes. Users are invited to interact with the piece by programming the textile to compute for them.

The piece also slyly references the connection between the early history of computing and the textile industry.

When British mathematician Charles Babbage released his plans for the Analytical Engine, widely considered the first modern computer design, fellow mathematician Ada Lovelace is famously quoted as saying that ‘the Analytical Engine weaves algebraic patterns, just as the Jacquard loom weaves flowers and leaves.’

The Jacquard loom is often considered a predecessor to the modern computer because it uses a binary system to store information that can be read by the loom and reproduced many times over.
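
The loom-as-storage idea is simple to make concrete. In this toy sketch (the pattern and the weave helper are invented for illustration), each card row is just a row of bits, and re-reading the same stack reproduces the same cloth every time:

```python
# Toy illustration: 1 = hole = lift that warp thread for this row of cloth.
PATTERN_CARDS = [0b10011001, 0b01100110, 0b10011001, 0b01100110]

def weave(cards, width=8):
    """Print one line of 'cloth' per card: '#' where the thread is lifted."""
    for card in cards:
        bits = ((card >> (width - 1 - i)) & 1 for i in range(width))
        print("".join("#" if b else "." for b in bits))

weave(PATTERN_CARDS)
# #..##..#
# .##..##.
# #..##..#
# .##..##.
```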

See also Posch and Kurbak’s The Knitted Radio, a sweater that functions as an FM radio transmitter.

Papercraft Computers

posted by Jason Kottke   Dec 19, 2018

[Images: papercraft models of vintage electronics]

Rocky Bergen makes paper models of vintage electronics and computing gear. And here’s the cool bit…you can download the plans to print and fold your own: Apple II, Conion C-100F boom box, Nintendo GameCube, and Commodore 64.

Why Doctors Hate Their Computers

posted by Tim Carmody   Nov 09, 2018

Nobody writes about health care practice from the inside out like Atul Gawande, here focusing on an increasingly important part of clinical work: information technology.

A 2016 study found that physicians spent about two hours doing computer work for every hour spent face to face with a patient—whatever the brand of medical software. In the examination room, physicians devoted half of their patient time facing the screen to do electronic tasks. And these tasks were spilling over after hours. The University of Wisconsin found that the average workday for its family physicians had grown to eleven and a half hours. The result has been epidemic levels of burnout among clinicians. Forty per cent screen positive for depression, and seven per cent report suicidal thinking—almost double the rate of the general working population.

Something’s gone terribly wrong. Doctors are among the most technology-avid people in society; computerization has simplified tasks in many industries. Yet somehow we’ve reached a point where people in the medical profession actively, viscerally, volubly hate their computers.

It’s not just the workload, but also what Gawande calls “the Revenge of the Ancillaries” — designing software for collaboration between different health care professionals, from surgeons to administrators, all of whom have competing stakes and preferences in how a product is used and designed, what information it offers and what it demands. And most medical software doesn’t handle these competing demands very well.

The Stylish & Colorful Computing Machines of Yesteryear

posted by Jason Kottke   Nov 08, 2018

Holy moly, these photographs of vintage computers & peripherals by “design and tech obsessive” James Ball are fantastic.

[Images: James Ball’s photographs of vintage computers]

He did a similar series with early personal computers subtitled “Icons of Beige”.

[Image: from the “Icons of Beige” series]

(via @mwichary)

The history and future of data on magnetic tape

posted by Tim Carmody   Aug 31, 2018

[Image: IBM magnetic tape]

Maybe it’s because I’m part of the cassette generation, but I’m just charmed by IBM researcher Mark Lantz’s ode to that great innovation in data storage, magnetic tape. What could be seen as an intermediate but mostly dead technology is actually quite alive and thriving.

Indeed, much of the world’s data is still kept on tape, including data for basic science, such as particle physics and radio astronomy, human heritage and national archives, major motion pictures, banking, insurance, oil exploration, and more. There is even a cadre of people (including me, trained in materials science, engineering, or physics) whose job it is to keep improving tape storage…

It’s true that tape doesn’t offer the fast access speeds of hard disks or semiconductor memories. Still, the medium’s advantages are many. To begin with, tape storage is more energy efficient: Once all the data has been recorded, a tape cartridge simply sits quietly in a slot in a robotic library and doesn’t consume any power at all. Tape is also exceedingly reliable, with error rates that are four to five orders of magnitude lower than those of hard drives. And tape is very secure, with built-in, on-the-fly encryption and additional security provided by the nature of the medium itself. After all, if a cartridge isn’t mounted in a drive, the data cannot be accessed or modified. This “air gap” is particularly attractive in light of the growing rate of data theft through cyberattacks.

Plus, it writes fast (faster than a hard drive), and it’s dirt cheap. And hard drives are up against some nasty physical limits when it comes to how much more data they can store on platters. This is why cloud providers like Google and Microsoft, among others, still use tape backup for file storage, and why folks at IBM and other places are working to improve tape’s efficiency, speed, and reliability.

Lantz describes how the newest tape systems in labs can read and write data on tracks 100 nanometers wide, at an areal density of 201 gigabits per square inch. “That means that a single tape cartridge could record as much data as a wheelbarrow full of hard drives.” I’ve never needed a wheelbarrow full of hard drives, but I think that is pretty cool. And I’m always excited to find out that engineers are still working to make old, reliable, maybe unsexy technologies work better and better.
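
The wheelbarrow comparison holds up to a back-of-the-envelope check. In the sketch below, the tape length and width are figures I’m assuming (roughly a kilometer of half-inch, LTO-class tape per cartridge), and formatting overhead is ignored:

```python
# Back-of-the-envelope cartridge capacity at the demonstrated areal density.
GBIT_PER_SQ_IN = 201        # demonstrated areal density, in gigabits per square inch
TAPE_LENGTH_M = 1000        # assumed: about a kilometer of tape per cartridge
TAPE_WIDTH_IN = 0.5         # half-inch tape

area_sq_in = (TAPE_LENGTH_M * 39.37) * TAPE_WIDTH_IN
capacity_tb = area_sq_in * GBIT_PER_SQ_IN / 8 / 1000      # gigabits -> terabytes

print(f"~{capacity_tb:.0f} TB per cartridge, before formatting overhead")   # ~495 TB
print(f"~{capacity_tb / 10:.0f} ten-terabyte hard drives' worth")           # a wheelbarrow
```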

Alan Turing was an excellent runner

posted by Jason Kottke   Apr 17, 2018

[Image: Alan Turing running]

Computer scientist, mathematician, and all-around supergenius Alan Turing, who played a pivotal role in breaking secret German codes during WWII and developing the conceptual framework for the modern general purpose computer, was also a cracking good runner.

He was a runner who, like many others, came to the sport rather late. According to an article by Pat Butcher, he did not compete as an undergraduate at Cambridge, preferring to row. But after winning his fellowship to King’s College, he began running with more purpose. He is said to have often run a route from Cambridge to Ely and back, a distance of 50 kilometers.

It’s also said Turing would occasionally run to London for meetings, a distance of 40 miles. In 1947, after only two years of training, Turing ran a marathon in 2:46. He was even in contention for a spot on the British Olympic team for 1948 before an injury held him to fifth place at the trials. Had he competed and run at his personal best time, he would have finished 15th.

As the photo above shows, Turing had a brute force running style, not unlike the machine he helped design to break Enigma coded messages. He ran, he said, to relieve stress.

“We heard him rather than saw him. He made a terrible grunting noise when he was running, but before we could say anything to him, he was past us like a shot out of a gun. A couple of nights later we caught up with him long enough for me to ask who he ran for. When he said nobody, we invited him to join Walton. He did, and immediately became our best runner… I asked him one day why he punished himself so much in training. He told me ‘I have such a stressful job that the only way I can get it out of my mind is by running hard; it’s the only way I can get some release.’”

I found out about Turing’s running prowess via the Wikipedia page of non-professional marathon runners. Turing is quite high on the list, particularly if you filter out world class athletes from other sports. Also on the list, just above Turing, is Wolfgang Ketterle, a Nobel Prize-winning physicist who ran a 2:44 in Boston in 2014 at the age of 56.

Radiohead hid an old school computer program on their new album

posted by Jason Kottke   Jul 19, 2017

As if you didn’t already know that Radiohead are a bunch of big ole nerds, there’s an easter egg on a cassette tape included in the Boxed Edition of OK Computer OKNOTOK 1997 2017. At the end of the tape recording, there are some blips and bleeps, which Maciej Korsan correctly interpreted as a program for an old computer system.

As a kid I was the owner of a Commodore 64. I remember that all my friends were already PC users, but my parents declined to buy me one for a long time. So I stuck with my old tape-based computer, listening to its blips and waiting for the game to load. Over 20 years later I was sitting in front of my MacBook, listening to the digitised version of the tape my favourite band had just released, and then I heard a familiar sound… ‘This must be an old computer program, probably a C64 one,’ I thought.

The program turned out to run on the ZX Spectrum, a computer the lads would likely have encountered as kids.

Watch a near-pristine Apple I boot up and run a program

posted by Jason Kottke   Mar 31, 2017

Glenn and Shannon Dellimore own at least two original Apple I computers built in 1976 by Steve Wozniak, Dan Kottke, and Steve Jobs. The couple recently purchased one of the computers at auction for $365,000 and then lent it to London’s Victoria and Albert Museum for an exhibition. The hand-built machine is in such good condition that they were able to boot it up and run a simple program.

The superlative rarity of an Apple-1 in this condition is corroborated by this machine’s early history. The owner, Tom Romkey, owned the “Personal Computer Store” in Florida, and was certified as an Apple level 1 technician in 1981. One day, a customer came into his shop and traded in his Apple-1 computer for a brand new NCR Personal Computer. The customer had only used the Apple-1 once or twice, and Mr. Romkey set it on a shelf, and did not touch it again.

The Apple I was the first modern personal computer: the whole thing fit on just one board and used the familiar keyboard/monitor input and output.

By early 1976, Steve Wozniak had completed his 6502-based computer and would display enhancements or modifications at the bi-weekly Homebrew Computer Club meetings. Steve Jobs was a 21 year old friend of Wozniak’s and also a visitor at the Homebrew club. He had worked with Wozniak in the past (together they designed the arcade game “Breakout” for Atari) and was very interested in his computer. During the design process Jobs made suggestions that helped shape the final product, such as the use of the newer dynamic RAMs instead of older, more expensive static RAMs. He suggested to Wozniak that they get some printed circuit boards made for the computer and sell it at the club for people to assemble themselves. They pooled their financial resources together to have PC boards made, and on April 1st, 1976 they officially formed the Apple Computer Company. Jobs had recently worked at an organic apple orchard, and liked the name because “he thought of the apple as the perfect fruit — it has a high nutritional content, it comes in a nice package, it doesn’t damage easily — and he wanted Apple to be the perfect company. Besides, they couldn’t come up with a better name.”

In other words, Woz invented the Apple computer, but Jobs invented Apple Computer. Here’s a longer video of another working Apple I:

This one is also in great condition, although it’s been restored and some of the original parts have been replaced. If you’d like to play around with your own Apple I without spending hundreds of thousands of dollars at an auction, I would recommend buying a replica kit or trying out this emulator written in Javascript. (thx, chris)

Computer Show is back! (As an ad for HP printers.)

posted by Jason Kottke   Feb 21, 2017

Missed this earlier this month while I was on vacation: Computer Show is back with a new episode, partnering with HP to showcase one of their fast color printers. Yes it’s an ad, but yes it’s still funny.

The Brilliant Life of Ada Lovelace

posted by Jason Kottke   Dec 09, 2016

From Feminist Frequency, a quick video biography of Ada Lovelace, which talks about the importance of her contribution to computing.

A mathematical genius and pioneer of computer science, Ada Lovelace not only created the very first computer program in the mid-1800s but also foresaw the digital future more than a hundred years to come.

This is part of Feminist Frequency’s Ordinary Women series, which also covered women like Ida B. Wells and Emma Goldman.

Calculating Ada

posted by Jason Kottke   Oct 11, 2016

From the BBC, an hour-long documentary on Ada Lovelace, the world’s first computer programmer.

You might have assumed that the computer age began with some geeks out in California, or perhaps with the codebreakers of World War II. But the pioneer who first saw the true power of the computer lived way back, during the transformative age of the Industrial Revolution.

Happy Ada Lovelace Day, everyone!

RIP Seymour Papert

posted by Jason Kottke   Aug 03, 2016

Seymour Papert, a giant in the worlds of computing and education, died on Sunday aged 88.

Dr. Papert, who was born in South Africa, was one of the leading educational theorists of the last half-century and a co-director of the renowned Artificial Intelligence Laboratory at the Massachusetts Institute of Technology. In some circles he was considered the world’s foremost expert on how technology can provide new ways for children to learn.

In the pencil-and-paper world of the 1960s classroom, Dr. Papert envisioned a computing device on every desk and an internetlike environment in which vast amounts of printed material would be available to children. He put his ideas into practice, creating in the late ’60s a computer programming language, called Logo, to teach children how to use computers.

I missed out on using Logo as a kid, but I know many people for whom Logo was their introduction to computers and programming. The MIT Media Lab has a short remembrance of Papert as well.

Silicon Cowboys, a documentary film on the history of Compaq Computer

posted by Jason Kottke   Feb 18, 2016

[Image: Silicon Cowboys]

Silicon Cowboys is an upcoming documentary about Compaq Computer, one of the first companies to challenge IBM with a compatible computer.

Launched in 1982 by three friends in a Houston diner, Compaq Computer set out to build a portable PC to take on IBM, the world’s most powerful tech company. Many had tried cloning the industry leader’s code, only to be trounced by IBM and its high-priced lawyers. SILICON COWBOYS explores the remarkable David vs. Goliath story, and eventual demise, of Compaq, an unlikely upstart who altered the future of computing and helped shape the world as we know it today. Directed by Oscar-nominated director Jason Cohen, the film offers a fresh look at the explosive rise of the 1980s PC industry and is a refreshing alternative to the familiar narratives of Jobs, Gates, and Zuckerberg.

There’s no trailer yet, but the film is set to debut at SXSW in March. The first season of Halt and Catch Fire had a lot of influences, but the bare-bones story was that of Compaq.

Many reviews mention the similarity of the characters to Apple founders Steve Jobs and Steve Wozniak, but the trio of managers from Texas Instruments who left to form Compaq in the early 80s are a much closer fit. The Compaq Portable was the first 100% IBM compatible computer produced.

Computer Show

posted by Jason Kottke   Oct 14, 2015

From Adam Lisagor’s Sandwich Video comes Computer Show, a present-day send-up of a personal computing show set in 1983. The guests and their products are contemporary and real, but the hosts are stuck in 1983 and don’t really know what the web is, what Reddit is, what links are, or anything like that.

“Computer Show” is a technology talk show, set in 1983. The dawn of the personal computing revolution. Awkward hair and awkward suits. Primitive synths and crude graphics. VHS tapes. No Internet. But there’s a twist.

The guests on this show are tech luminaries — experts, founders, thinkers, entrepreneurs…from 2015. They are real, and they are really on “Computer Show” to talk about their thing. Will it go well? Can they break through to the host Gary Fabert (played by Rob Baedeker of the SF-based sketch mainstay Kasper Hauser) and his rotating cast of co-hosts, who know of neither iPhone nor website nor Twitter nor…hardly anything?

The first episode, featuring Lumi, is embedded above and here’s the second episode with Reddit cofounder Alexis Ohanian.

Hopefully they’ll get to make more.

Update: In an interview with Inc., Lisagor shares how Computer Show came about.

So it just became clear to Roxana and Tony, there was something here. Roxana had the idea to make something in this universe. To produce a show like “The Computer Chronicles”.

One day, over Slack, Roxana asked me for the contact info of a producer I know. When I asked why, she told me she had this idea, and also asked if I’d maybe be a contributor on it. When I got more info out of her (she tends to be a little private about her personal projects) and she explained to me that she had this idea of a tech talk show set in the early 80’s, where the guests would interview people from modern day, I just about flipped out and lost my mind I was so excited.

Update: While we not-so-patiently await new episodes of Computer Show, at least we can buy the t-shirt.

New York Historical Society’s Silicon City exhibit

posted by Jason Kottke   Oct 13, 2015

[Image: early computing in NYC]

The Silicon City exhibit at the New York Historical Society takes a look at the long history of computing in NYC.

Every 15 minutes, for nearly a year, 500 men, women, and children rose majestically into “the egg,” Eero Saarinen’s idiosyncratic theater at the 1964 World’s Fair. It was very likely their first introduction to computer logic. Computing was not new. But for the general public, IBM’s iconic pavilion was a high profile coming out party, and Silicon City will harness it to introduce New York’s role in helping midwife the digital age.

The exhibit opens on November 13, 2015 and runs through next April. The museum is using Kickstarter to help bring the Telstar satellite back to NYC for the exhibit.

A 1968 computer art contest

posted by Jason Kottke   Jul 06, 2015

[Image: 1968 computer art]

From the August 1968 issue of Computers and Automation magazine, the results of their Sixth Annual Computer Art Contest (flip to page 8).

[Images: 1968 computer art]

It’s also worth paging through the rest of the magazine just for the ads.

Update: Looks like The Verge saw this post and did a followup on the history of the Computer Art Contest.

In any given issue, Computers and Automation devoted equal time to the latest methods of database storage and grand questions about the future of their “great instrument,” but the Computer Art Contest was soon a regular event. A look back through old issues of the journal (available at Internet Archive) shows how the fledgling discipline of computer art rapidly evolved. At the time, computers were specialized tools, most commonly used by individuals working in research labs, academia, or the military — and this heritage shows. Both the first and second prizes for the inaugural 1963 competition went to designs generated at the same military lab.

The computer collector

posted by Jason Kottke   Jun 11, 2015

Lonnie Mimms has a gigantic collection of vintage computers, software, and peripherals. You don’t realize the scope of the collection until you see him walking around the Apple pop-up exhibit he built inside of an abandoned CompUSA.

Simple CPU

posted by Jason Kottke   Oct 20, 2014

Very quickly, here’s how a computer works at the simplest level.

[Image: D latch]

Want to see how computers store data? This next device is called a ‘D-Latch’. It holds a binary bit. The top switch is the value to be stored, the bottom switch enables storage. Eight of these devices can be used to store a byte in memory.
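
Here’s the same idea as a quick software sketch (a model of the concept, not of the circuit in the video): the latch copies its data input only while enable is high and holds the last value otherwise, and eight of them store a byte.

```python
class DLatch:
    """Level-sensitive D latch: transparent while enable is high, holds its bit otherwise."""
    def __init__(self):
        self.q = 0                      # the stored bit

    def update(self, data, enable):
        if enable:                      # "bottom switch" on: copy the "top switch" value
            self.q = data
        return self.q                   # enable low: q keeps whatever it had

# Eight latches hold one byte.
byte = [DLatch() for _ in range(8)]
for latch, bit in zip(byte, [0, 1, 0, 0, 0, 0, 0, 1]):    # write 0b01000001 (65, 'A')
    latch.update(bit, enable=1)
for latch in byte:
    latch.update(data=0, enable=0)                        # inputs change, stored value holds
print(int("".join(str(latch.q) for latch in byte), 2))    # 65
```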

Halt and Catch Fire

posted by Jason Kottke   Jun 20, 2014

[Image: Halt and Catch Fire]

I’ve been hearing some good things about Halt and Catch Fire, which is three episodes into its first season on AMC. The show follows a group of 80s computer folk as they attempt to reverse engineer the IBM PC. The first episode is available online in its entirety.

Many reviews mention the similarity of the characters to Apple founders Steve Jobs and Steve Wozniak, but the trio of managers from Texas Instruments who left to form Compaq in the early 80s are a much closer fit. The Compaq Portable was the first 100% IBM compatible computer produced. Brian McCullough recently did a piece on Compaq’s cloning of the IBM PC for the Internet History Podcast.

The idea was to create a computer that was mostly like IBM-PC and mostly ran all the same software, but sold at a cheaper price point. The first company to pursue this strategy was Columbia Data Products, followed by Eagle Computer. But soon, most of the big names in the young computer industry (Xerox, Hewlett-Packard, Digital Equipment Corporation, Texas Instruments, and Wang) were all producing PC clones.

But all of these machines were only mostly PC-compatible. So, at best, they were DOS compatible. But there was no guarantee that each and every program or peripheral that ran on the IBM-PC could run on a clone. The key innovation that Canion, Harris and Murto planned to bring to market under the name Compaq Computer Corporation would be a no-compromises, 100% IBM-PC compatibility. This way, their portable computer would be able to run every single piece of software developed for the IBM-PC. They would be able to launch their machine into the largest and most vibrant software ecosystem of the time, and users would be able to use all their favorite programs on the road.

My dad bought a machine from Columbia Data Products; I had no idea it was the first compatible to the market. My uncle had a Compaq Portable that he could take with him on business trips. I played so much Lode Runner on both of those machines. I wonder if that disk of levels I created is still around anywhere… (via @cabel)

Update: I’m all caught up, five episodes into the season, and I’m loving it.

Turing Test passed for the first time

posted by Jason Kottke   Jun 08, 2014

A supercomputer running a program simulating a 13-year-old boy named Eugene has passed the Turing Test at an event held at London’s Royal Society.

The Turing Test is based on 20th century mathematician and code-breaker Turing’s famous 1950 question and answer game, ‘Can Machines Think?’. The experiment investigates whether people can detect if they are talking to machines or humans. The event is particularly poignant as it took place on the 60th anniversary of Turing’s death, nearly six months after he was given a posthumous royal pardon.

If a computer is mistaken for a human more than 30% of the time during a series of five minute keyboard conversations it passes the test. No computer has ever achieved this, until now. Eugene managed to convince 33% of the human judges that it was human.
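
The pass criterion itself is a one-liner. Here’s a quick sketch of it; the judge counts are illustrative, and only the reported 33 percent figure comes from the event:

```python
def passes_turing_test(fooled, judges, threshold=0.30):
    """The event's rule: fooled more than 30% of judges in five-minute chats."""
    rate = fooled / judges
    return rate, rate > threshold

# Eugene reportedly convinced 33% of the judges; the raw counts here are made up.
rate, passed = passes_turing_test(fooled=10, judges=30)
print(f"{rate:.0%} -> {'pass' if passed else 'fail'}")    # 33% -> pass
```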

I’m sure there will be some debate as members of the AI and computing communities weigh in over the next few days, but at first blush, it seems like a significant result. The very first Long Bet concerned the Turing Test, with Mitch Kapor stating:

By 2029 no computer — or “machine intelligence” — will have passed the Turing Test.

and Ray Kurzweil opposing. The stakes are $20,000, but the terms are quite detailed, so who knows if Kurzweil has won.

Update: Kelly Oakes of Buzzfeed dumps some cold water on this result.

Of course the Turing Test hasn’t been passed. I think it’s a great shame it has been reported that way, because it reduces the worth of serious AI research. We are still a very long way from achieving human-level AI, and it trivialises Turing’s thought experiment (which is fraught with problems anyway) to suggest otherwise.

The Macintosh is 30 years old today

posted by Jason Kottke   Jan 24, 2014

Apple is celebrating the 30th anniversary of the Macintosh with a special subsite.

Incredible that the Mac is still around; the 90s were a dire time for Apple and it’s amazing to see the current fantastic iMacs and MacBooks that came after some epically bad mid-90s machines. Here’s Steve Jobs introducing the original Mac in 1984 (a snippet of the full introduction video):

Steven Levy writes about covering the introduction of the Mac for Rolling Stone.

First, I met the machine. From the instant the woman running the demo switched on that strange-looking contraption (inspired in part by the Cuisinart food processor), I knew the Macintosh would change millions of lives, including my own. To understand that, you must realize how much 1984 really was not like 2014. Until that point, personal computers were locked in an esoteric realm of codes and commands. They looked unfriendly, with the letters of text growing in sickly phosphorescence. Even the simplest tasks required memorizing the proper intonations, then executing several exacting steps.

But the Macintosh was friendly. It opened with a smile. Words appeared with the clarity of text on a printed page - and for the first time, ordinary people had the power to format text as professional printers did. Selecting and moving text was made dramatically easier by the then-quaint mouse accompanying the keyboard. You could draw on it. This humble shoebox-sized machine had a simplicity that instantly empowered you.

Here’s the piece Levy wrote for Rolling Stone.

If you have had any prior experience with personal computers, what you might expect to see is some sort of opaque code, called a “prompt,” consisting of phosphorescent green or white letters on a murky background. What you see with Macintosh is the Finder. On a pleasant, light background (you can later change the background to any of a number of patterns, if you like), little pictures called “icons” appear, representing choices available to you. A word-processing program might be represented by a pen, while the program that lets you draw pictures might have a paintbrush icon. A file would represent stored documents - book reports, letters, legal briefs and so forth. To see a particular file, you’d move the mouse, which would, in turn, move the cursor to the file you wanted. You’d tap a button on the mouse twice, and the contents of the file would appear on the screen: dark on light, just like a piece of paper.

Levy has also appended a never-seen-before transcript of his interview with Steve Jobs onto the Kindle version of Insanely Great, a book Levy wrote about the Mac.

Dave Winer participated on a panel of developers on launch day.

The rollout on January 24th was like a college graduation ceremony. There were the fratboys, the insiders, the football players, and developers played a role too. We praised their product, their achievement, and they showed off our work. Apple took a serious stake in the success of software on their platform. They also had strong opinions about how our software should work, which in hindsight were almost all good ideas. The idea of user interface standards was at the time controversial. Today, you’ll get no argument from me. It’s better to have one way to do things, than have two or more, no matter how much better the new ones are.

That day, I was on a panel of developers, talking to the press about the new machine. We were all gushing, all excited to be there. I still get goosebumps thinking about it today.

MacOS System 1.1 emulator. (via @gruber)

iFixit did a teardown of the 128K Macintosh.

Jason Snell interviewed several Apple execs about the 30th anniversary for MacWorld. (via df)

What’s clear when you talk to Apple’s executives is that the company believes that people don’t have to choose between a laptop, a tablet, and a smartphone. Instead, Apple believes that every one of its products has particular strengths for particular tasks, and that people should be able to switch among them with ease. This is why the Mac is still relevant, 30 years on, because sometimes a device with a keyboard and a trackpad is the best tool for the job.

“It’s not an either/or,” Schiller said. “It’s a world where you’re going to have a phone, a tablet, a computer, you don’t have to choose. And so what’s more important is how you seamlessly move between them all…. It’s not like this is a laptop person and that’s a tablet person. It doesn’t have to be that way.”

Snell previously interviewed Steve Jobs on the 20th anniversary of the Mac, which includes an essay that Jobs wrote for the very first issue of Macworld in 1984:

The Macintosh is the future of Apple Computer. And it’s being done by a bunch of people who are incredibly talented but who in most organizations would be working three levels below the impact of the decisions they’re making in the organization. It’s one of those things that you know won’t last forever. The group might stay together maybe for one more iteration of the product, and then they’ll go their separate ways. For a very special moment, all of us have come together to make this new product. We feel this may be the best thing we’ll ever do with our lives.

Here’s a look inside that first MacWorld issue.

As always, Folklore.org is an amazing source for stories about the Mac told by the folks who were there.

Susan Kare designed the icons, the interface elements, and fonts for the original Macintosh. Have a look at her Apple portfolio or buy some prints of the original Mac icons.

Stephen Fry recounts his experience with the Mac, including the little tidbit that he and Douglas Adams bought the first two Macs in Europe (as far as he knows).

I like to claim that I bought the second Macintosh computer ever sold in Europe in that January, 30 years ago. My friend and hero Douglas Adams was in the queue ahead of me. For all I know someone somewhere had bought one ten minutes earlier, but these were the first two that the only shop selling them in London had in stock on the 24th January 1984, so I’m sticking to my story.

Review of the Mac in the NY Times from 1984.

The Next Web has an interview with Daniel Kottke (no relation) and Randy Wigginton on programming the original Mac.

TNW: When you look at today’s Macs, as well as the iPhone and the iPad, do you see how it traces back to that original genesis?

Randy: It was more of a philosophy - let’s bring the theoretical into now - and the focus was on the user, not on the programmer. Before then it had always been let’s make it so programmers can do stuff and produce programs.

Here, it was all about the user, and the programmers had to work their asses off to make it easy for the user to do what they wanted. It was the principle of least surprise. We never wanted [the Macintosh] to do something that people were shocked at. These are things that we just take for granted now. The whole undo paradigm? It didn’t exist before that.

Like Daniel says, it’s definitely the case that there were academic and business places with similar technology, but they had never attempted to reach a mass market.

Daniel: I’m just struck by the parallel now, thinking about what the Mac did. The paradigm before the Mac in terms of Apple products was command-line commands in the Apple II and the Apple III. In the open source world of Linux, I’m messing around with Raspberry Pis now, and it terrifies me, because I think, “This is not ready for the consumer,” but then I think about Android, which is built on top of Linux. So the Macintosh did for the Apple II paradigm what Android has done for Linux.

A week after Jobs unveiled the Mac at the Apple shareholders meeting, he did the whole thing again at a meeting of the Boston Computer Society. Time has the recently unearthed video of the event.

Is Google’s quantum computer even quantum?

posted by Jason Kottke   Nov 12, 2013

Google and NASA recently bought a D-Wave quantum computer. But according to a piece by Sophie Bushwick published on the Physics Buzz Blog, there isn’t scientific consensus on whether the computer is actually using quantum effects to calculate.

In theory, quantum computers can perform calculations far faster than their classical counterparts to solve incredibly complex problems. They do this by storing information in quantum bits, or qubits.

At any given moment, each of a classical computer’s bits can only be in an “on” or an “off” state. They exist inside conventional electronic circuits, which follow the 19th-century rules of classical physics. A qubit, on the other hand, can be created with an electron, or inside a superconducting loop. Obeying the counterintuitive logic of quantum mechanics, a qubit can act as if it’s “on” and “off” simultaneously. It can also become tightly linked to the state of its fellow qubits, a situation called entanglement. These are two of the unusual properties that enable quantum computers to test multiple solutions at the same time.

But in practice, a physical quantum computer is incredibly difficult to run. Entanglement is delicate, and very easily disrupted by outside influences. Add more qubits to increase the device’s calculating power, and it becomes more difficult to maintain entanglement.

(via fine structure)
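
To make the “on and off simultaneously” and entanglement language a bit more concrete, here is a tiny state-vector simulation (a purely classical numpy toy, nothing to do with D-Wave’s annealing hardware): it prepares an entangled Bell pair and samples measurements, which look random qubit by qubit but always agree.

```python
import numpy as np

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
state = np.zeros(4, dtype=complex)
state[0] = 1.0                                    # start in |00>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate: puts a qubit in superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                   # flips the second qubit when the first is 1

state = np.kron(H, np.eye(2)) @ state             # superpose the first qubit
state = CNOT @ state                              # entangle it with the second

probs = np.abs(state) ** 2                        # Born rule: |amplitude|^2
rng = np.random.default_rng(0)
samples = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(samples)   # only '00' and '11' appear: each qubit looks random, but they always agree
```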

Google’s new quantum computer

posted by Jason Kottke   Oct 16, 2013

Google’s got themselves a quantum computer (they’re sharing it with NASA) and they made a little video about it:

I’m sure that Hartmut is a smart guy and all, but he’s got a promising career as an Arnold Schwarzenegger impersonator hanging out there if the whole Google thing doesn’t work out.

The Apollo Guidance Computer

posted by Jason Kottke   Aug 23, 2013

A 30-minute documentary from the 60s on the Apollo Guidance Computer.

Richard Feynman and The Connection Machine

posted by Jason Kottke   Jul 31, 2013

I will read stories about Richard Feynman all day long and this one is no exception. Danny Hillis remembers his friend and colleague in this piece originally written for Physics Today (original here).

Richard arrived in Boston the day after the company was incorporated. We had been busy raising the money, finding a place to rent, issuing stock, etc. We set up in an old mansion just outside of the city, and when Richard showed up we were still recovering from the shock of having the first few million dollars in the bank. No one had thought about anything technical for several months. We were arguing about what the name of the company should be when Richard walked in, saluted, and said, “Richard Feynman reporting for duty. OK, boss, what’s my assignment?” The assembled group of not-quite-graduated MIT students was astounded.

After a hurried private discussion (“I don’t know, you hired him…”), we informed Richard that his assignment would be to advise on the application of parallel processing to scientific problems.

“That sounds like a bunch of baloney,” he said. “Give me something real to do.”

So we sent him out to buy some office supplies. While he was gone, we decided that the part of the machine that we were most worried about was the router that delivered messages from one processor to another. We were not sure that our design was going to work. When Richard returned from buying pencils, we gave him the assignment of analyzing the router.

For more Hillis, I recommend Pattern on the Stone and for more Feynman, you can’t go wrong with Gleick’s Genius.

Don’t mess with Texas’s old computers

posted by Jason Kottke   Apr 22, 2013

As recently as last year, a liquid filtration company in Texas was still using a computer built in 1948 to run all of its accounting work.

Sparkler’s IBM 402 is not a traditional computer, but an automated electromechanical tabulator that can be programmed (or more accurately, wired) to print out certain results based on values encoded into stacks of 80-column Hollerith-type punched cards.

Companies traditionally used the 402 for accounting, since the machine could take a long list of numbers, add them up, and print a detailed written report. In a sense, you could consider it a 3000-pound spreadsheet machine. That’s exactly how Sparkler Filters uses its IBM 402, which could very well be the last fully operational 402 on the planet. As it has for over half a century, the firm still runs all of its accounting work (payroll, sales, and inventory) through the IBM 402. The machine prints out reports on wide, tractor-fed paper.

Here’s what one of the computer’s apps looks like:

[Image: IBM 402 apps]
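
A 3,000-pound spreadsheet machine is, at heart, this loop. Here’s a toy emulation (the card layout and the numbers are invented) of reading fixed-column punched-card records and printing a summed report:

```python
# Toy tabulator: each "card" is an 80-column record with fixed fields (layout invented):
# columns 1-20 hold an account name, columns 21-28 an amount in cents.
cards = [
    "ACME FILTRATION     00012500".ljust(80),
    "ACME FILTRATION     00003750".ljust(80),
    "GULF COAST SUPPLY   00020000".ljust(80),
]

def tabulate(cards):
    """Group by the name field, sum the amount field, print a report with a total."""
    totals = {}
    for card in cards:
        name = card[0:20].strip()
        cents = int(card[20:28])
        totals[name] = totals.get(name, 0) + cents
    for name, cents in totals.items():
        print(f"{name:<20} ${cents / 100:>10,.2f}")
    print(f"{'TOTAL':<20} ${sum(totals.values()) / 100:>10,.2f}")

tabulate(cards)   # prints one line per account, then a TOTAL line ($362.50)
```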

Objects in motion tend to stay in motion.

Do we live in a computer simulation?

posted by Jason Kottke   Dec 11, 2012

In 2003, British philosopher Nick Bostrom suggested that we might live in a computer simulation. From the abstract of Bostrom’s paper:

This paper argues that at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a “posthuman” stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we will one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation. A number of other consequences of this result are also discussed.

The gist appears to be that if The Matrix is possible, someone has probably already invented it and we’re in it. Which, you know, whoa.

But researchers believe they have devised a test to check if we’re living in a computer simulation.

However, Savage said, there are signatures of resource constraints in present-day simulations that are likely to exist as well in simulations in the distant future, including the imprint of an underlying lattice if one is used to model the space-time continuum.

The supercomputers performing lattice quantum chromodynamics calculations essentially divide space-time into a four-dimensional grid. That allows researchers to examine what is called the strong force, one of the four fundamental forces of nature and the one that binds subatomic particles called quarks and gluons together into neutrons and protons at the core of atoms.

“If you make the simulations big enough, something like our universe should emerge,” Savage said. Then it would be a matter of looking for a “signature” in our universe that has an analog in the current small-scale simulations.
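
The “four-dimensional grid” is meant literally. The sketch below isn’t lattice QCD, just the bookkeeping, but it shows how quickly a 4D lattice eats resources, which is the kind of constraint the researchers hope leaves a detectable signature:

```python
import numpy as np

def toy_lattice(n, spacing_fm=0.1):
    """A 4D space-time grid: n sites per dimension, spacing in femtometers (typical value assumed)."""
    grid = np.zeros((n, n, n, n))     # one number per space-time site (real codes store far more)
    return grid.size, n * spacing_fm

for n in (8, 16, 32):
    sites, extent_fm = toy_lattice(n)
    print(f"{n}^4 lattice: {sites:>9,d} sites, box about {extent_fm:.1f} fm per side")
# The site count grows as n^4, which is why simulated "universes" stay tiny,
# and why a finite lattice could leave a detectable imprint in the first place.
```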

If it turns out we’re all really living in an episode of St. Elsewhere, I’m going to be really bummed. (via @CharlesCMann)