
FRIDAY SURPRISE: This stuff has been around longer than you might believe!


Computer generated graphics are a staple of moviemaking these days. I think it's not unfair to say that the majority of blockbuster movies simply couldn't be made without extensive use of computer imagery. Here's an interesting question: when were computer generated graphics first used in Hollywood?

It goes back before the turn of the century. 1990? Nope. 1980? Still too late. 1970? Keep going!

In fact, the first use of computer graphics in a film was most likely in Alfred Hitchcock's masterpiece "Vertigo" - in 1958!

In the 1950s electronic computers were still in their infancy, and the desktop computer was decades away. There were, however, mechanical computers in wide use. While they aren't what we think of when we say 'computer', they were in fact computers in the technical sense of the term: devices that could be programmed to carry out a specific task.

World War II saw the use of a large number of mechanical computers, and one of them was an electromechanical analog computer known as the Kerrison Predictor (or, in military parlance, the M5 Gun Director). It took all the variables involved in hitting an aerial target -- speed, altitude and azimuth, wind speed, the trajectory of the round being used, and even gravity -- and calculated the angle and lead necessary to hit it. The Predictor then drove servo motors on the gun to aim it so that the round would hit the target.
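
For the curious, the heart of what the Predictor solved is the classic deflection-shooting problem: given where the aircraft is and how fast it's moving, figure out where to point the gun so the shell and the target arrive at the same place at the same time. The machine did it with cams, gears, and servo motors; here's a minimal sketch of that core calculation in Python, deliberately ignoring the gravity, drag, and wind corrections the real Predictor made. The function name and example numbers are my own illustration, not anything taken from the original hardware.

    import math

    def lead_solution(target_pos, target_vel, shell_speed):
        """Where should the gun aim so the shell and the target meet?

        target_pos: (x, y, z) of the aircraft relative to the gun, in metres
        target_vel: (vx, vy, vz) of the aircraft, in metres per second
        shell_speed: average speed of the round, in metres per second

        Simplified on purpose: no gravity, drag, or wind, all of which the
        real Predictor handled. Returns the aim point, or None if no solution.
        """
        px, py, pz = target_pos
        vx, vy, vz = target_vel

        # The shell covers shell_speed * t while the target moves to pos + vel * t.
        # Setting those distances equal gives a quadratic in the time of flight t:
        #   (|v|^2 - s^2) * t^2 + 2 * (p . v) * t + |p|^2 = 0
        a = (vx * vx + vy * vy + vz * vz) - shell_speed ** 2
        b = 2 * (px * vx + py * vy + pz * vz)
        c = px * px + py * py + pz * pz

        if a == 0:
            return None  # sketch only: skip the degenerate equal-speed case
        disc = b * b - 4 * a * c
        if disc < 0:
            return None  # the shell can never catch the target

        roots = [(-b - math.sqrt(disc)) / (2 * a), (-b + math.sqrt(disc)) / (2 * a)]
        times = [t for t in roots if t > 0]
        if not times:
            return None
        t = min(times)

        # Aim at where the target will be t seconds from now.
        return (px + vx * t, py + vy * t, pz + vz * t)

    # Example: aircraft 2 km out at 1 km altitude, crossing at 100 m/s,
    # engaged with a round averaging 800 m/s.
    print(lead_solution((2000, 0, 1000), (0, 100, 0), 800))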

In postwar Hollywood, an animator named John Whitney put a surplus Predictor to use to make the opening spirals for Hitchcock's film. He programmed the Predictor to rotate at a specific speed, then suspended a pendulum over the rotating machine and used the combination to generate the spirals seen in the title sequence.

Whitney would, just a couple of years later, form one of the first (if not THE first) computer graphics firms in the world: Motion Graphics Incorporated. Today he's considered the pioneer in computer animation.

Humble beginnings!

-=[ Grant ]=-

FRIDAY SURPRISE: The Bell tolls for one of its own.


A couple of weeks ago I posted about one of our country's greatest research facilities, Bell Labs. Yesterday came the sad news that one of the Labs' shining lights has died.

Dennis Ritchie started working for Bell Labs in 1967 after graduating from Harvard with degrees in both physics and applied mathematics. This wasn't a tremendous surprise: his father Alistair was a scientist at Bell Labs and a seminal figure in switching circuit theory. The family business, and all that.

Dennis migrated to the relatively new field of computer science, where he made a name for himself by creating the 'C' programming language, co-authoring the definitive book on 'C', and - most dear to my heart - co-developing the UNIX operating system.

That dry list of accomplishments may not mean much to you, but a large part of what your computer does has roots in Ritchie's work. If you have a Macintosh computer, an iPhone, or an iPad, you owe him a special nod of appreciation: UNIX is the underpinning of the OS X operating system, which (in one form or another) is what runs all of those devices.

The development of modern software and the existence of the web as we know it wouldn't have happened the way they did without his work.

Thank you, Dennis.

-=[ Grant ]=-

Steven Paul Jobs, 1955-2011.




-=[ Grant ]=-

FRIDAY SURPRISE: Get off my lawn or I’ll Photoshop you off!


Back in the 1980s digital imaging was still a laboratory experiment. Pictures were made on film, and if you wanted to do anything to the image after it was recorded you had to master (or know someone who had mastered) such arcane things as register masking, transparency stripping, and optical printing.

Toward the end of the decade, very powerful (and expensive) graphics workstations became available that could manipulate digitized images. Note 'digitized', not 'digital'; the pictures were still made on film, and the negatives or transparencies were digitized on a drum scanner to be read by a computer.

The big boy on the block was Scitex, an Israeli company that made a name for itself in the emerging field of digital pre-press equipment. Their digital imaging workstation was combined with a Hell drum scanner and a film recorder to provide a way to retouch and alter photographs. The negative or transparency would be scanned, manipulated on the computer, then sent to the film recorder -- which made a new negative or transparency that was processed and printed conventionally. The results were almost comically primitive by today's standards, but back then it was a viable alternative to having a very expensive stripped dye transfer made.

Scitex wasn't the only player in the market, but they were the best known. Eastman Kodak, in yet another of their half-hearted attempts to break into digital imaging, introduced their 'Premier' digital editing system in 1990. Like the Scitex, it combined a workstation, Hell scanner, and film recorder. I never used a Scitex, but I did get some experience on the only Premier system installed in Oregon. At the time it was magical, but today we can do all of the things the Scitex and Premier systems did on an iPad -- only faster and easier!

Just a couple of years later the Premier system I used was scrapped, already a victim of the emerging PC and Mac digital imaging applications. Cost was a factor in their failure; I seem to recall that the installation I used was well north of $200,000. About that time Scitex gave up on dedicated workstations and developed a more cost-efficient system based around a Mac II microcomputer and a Sharp scanner. That didn't last long, either; it was quickly surpassed by the emerging (and now ubiquitous) Photoshop.

Here's a great video from 1988 showing the then-amazing things a Scitex could do.



-=[ Grant ]=-

FRIDAY SURPRISE: Ghost of the future.


One of my favorite PBS shows was "Connections", the ten-part series from British science writer/historian James Burke. In it, Burke looked at the often surprising interrelationships of disparate discoveries and inventions that invariably culminated in something no one involved in the process could have imagined. From those connections (get it?) we see that even small changes in the past would have had huge impacts on the present. It's a concrete, approachable explanation of the butterfly effect.



What brought this to mind was last week's surprisingly frank admission by John Sculley, the long-reviled ex-CEO of Apple Inc., that his tenure there was a "mistake." (As an aside, I gained new respect for Sculley for being able to judge himself so clearly.) While I agree with that assessment with regard to Apple, when I look further at the series of connections that occurred because of his position, it's clear that something very good came of it.

You see, had Sculley not taken that job at Apple there would be no World Wide Web. Certainly not as we know it today.

Follow me: when Sculley took over at Apple, he and Steve Jobs clashed. A power struggle ensued which resulted in Jobs being forced out of the company he founded (and in which he was still the largest shareholder). Jobs spent the summer of 1985 contemplating his situation, and before the year was out he had formed a new computer company: NeXT, Inc. NeXT's goal was to produce a very powerful personal computer that could be used in education and research, to simulate things like recombinant DNA laboratories.

Jobs put together a team of talented engineers who designed the hardware and software which would become the NeXT Cube. The operating system, called NeXTStep, would combine parts of BSD Unix and the Mach kernel to produce a multitasking, object-oriented operating system. While it never achieved the market success they had envisioned (for a host of reasons, not the least of which was a retaliatory lawsuit from the Sculley-led Apple), it did make significant inroads in research labs around the world.

It was in one of those labs, at CERN on the Swiss-French border, that a 35-year-old British physicist named Tim Berners-Lee came up with an idea: take the relatively new concept of hypertext and expand it beyond the single computer (or node of computers) to which it was then limited. His idea was to use the internet's Transmission Control Protocol (TCP) to let computers across the internet serve each other's hypertext documents, so that a link on one machine could point to a document on another. That sounds dry to us today, but it was a breakthrough.
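
To make that concrete: at bottom, following a hyperlink is nothing more than opening a TCP connection to the machine named in the link and asking it for a document. Here's a minimal sketch of that exchange in Python, using the modern HTTP request format that grew out of Berners-Lee's work; the host name and the exact wording of the request are my own illustration, not anything from his original NeXT code.

    import socket

    # Following a hyperlink, reduced to its essentials: open a TCP connection
    # to the machine named in the link, ask for a document, read what comes back.
    host = "example.com"           # stand-in host; any web server will do
    request = (
        "GET / HTTP/1.0\r\n"       # ask for the document at the root path
        "Host: " + host + "\r\n"   # ...from this machine
        "\r\n"                     # a blank line ends the request
    )

    with socket.create_connection((host, 80)) as conn:
        conn.sendall(request.encode("ascii"))

        response = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:          # the server closed the connection: we have it all
                break
            response += chunk

    # What comes back is the hypertext itself (plus a few headers) -- ready for
    # a browser to render, and to pull further links out of.
    print(response.decode("latin-1", errors="replace")[:500])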

Hyperlinks and TCP are the basis on which the World Wide Web operates; without that combination, you wouldn't be able to click on the links in this article and go to other sites for more information - or even navigate www.grantcunningham.com. Without them, the web as we know it simply wouldn't exist. No Revolver Liberation Alliance, no online shopping, and no porn sites. (Ya gotta take the bad with the good.)

The computer that inspired Berners-Lee, and on which he did his development work? The NeXT, running the NeXTStep OS. Without NeXT's heavily object-oriented development environment, Berners-Lee wouldn't have been able to design the ubiquitous "www". Would someone have eventually come up with the idea? Maybe, maybe not. Even if they had, though, it wouldn't have proceeded on the same path that it did. The web, if it even existed, would be a profoundly different thing than it is today. That's the nature of interrelationships: change one, and every other one changes. Some may not happen at all.

Whether Sculley knows it or not, the (unintentional) consequences of his actions in 1985 led to you being able to read about his self-assessment on your computer screen today. Ironic, isn't it?

-=[ Grant ]=-

FRIDAY SURPRISE: The day the internet died.


Yesterday was a monumental day in the history of the 'net: Duke University, the birthplace of Usenet, shut down its Usenet server some thirty years after it first came to life.

The university cited diminishing use and rising costs as the reasons for the shutdown. It's sad news for those of us who cut our teeth on newsgroups. While there are other servers still hosting Usenet traffic, the closure of the Duke server is a sign that the end is near.

I spent far too much free time on Usenet in the '80s and '90s. Before the World Wide Web, Usenet was THE source of information and interaction on the 'net. If you know what DoD stands for, you spent a lot of time on rec.motorcycles; if you know who the KoTL is, you spent too much time there!

There are people I "met" on Usenet with whom I still correspond. I first encountered Ed Harris, whose name should not be unknown to readers of this blog, on rec.guns. That was more years ago than either of us cares to recount, and despite never having met face-to-face we've exchanged ideas, shared projects, and even collaborated a bit on a training manual for emergency communications. There are others whose names would mean nothing to you, but mean a great deal to me.

With so many ISPs dropping Usenet access, people for whom the WWW is the whole 'net don't see the loss. For those of us who remember FidoNet gateways and bang paths, it's like losing an old friend.

Virtually, of course.

-=[ Grant ]=-

FRIDAY SURPRISE: Good Morning, Dave.


Once upon a time, two geeks met in college. They had some neat ideas about the world of computers, and were anxious to put them into production. They started a little company.

Shortly after they incorporated, they introduced a new computer - one that was more accessible, more flexible, and under the control of a single person. They didn't make many of them, and very few exist today, but with that machine they changed the face of computing forever.

No, I'm not talking about Jobs & Wozniak. I'm thinking of Ken Olsen and Harlan Anderson, and the company they founded - Digital Equipment Corporation. DEC, as it would come to be known, introduced what was really the earliest commercial incarnation of the personal computer: the PDP-1.



The PDP-1 certainly didn't look like what we've come to expect of the PC. Nevertheless, it started the downsizing of computing power, and introduced a concept critical to the modern PC: user interaction, as opposed to batch data processing. This shift was the necessary step to creating true personal computers, and DEC got there first.

Interactivity opened up huge new vistas for the computer. The PDP-1 has the distinction of initiating things we now take for granted: text editing, music programs, and even computer gaming. (One of the very first computer video games, 'Spacewar!', was written for the PDP-1. Yes, you have DEC to thank for your Wii.)

DEC made only 50 PDP-1 machines, of which just three are known to have survived. All of them are currently in the collection of the Computer History Museum. One is fully operational, and is demonstrated twice a month by running that historic computer game. They've got a terrific website that details the history and restoration of the PDP-1.

-=[ Grant ]=-

FRIDAY SURPRISE: The Witch is Back.


Back in '51, the Atomic Energy Research Establishment in Oxfordshire welcomed a new member to their staff: a computer. Today we don't even bat an eyelid when a new PC shows up in the office, but back then computers were a Big Deal. (After all, how many new staff members get their own office - the largest one in the building?)

The Harwell Computer, later to be known as "WITCH" (Wolverhampton Instrument for Teaching Computation from Harwell), now occupies a unique position in computing history. It holds the distinction of being the world's oldest surviving computer with electronically stored data and programs. All the original parts are present, and it is capable, in theory, of being operated.



Though it hasn't been switched on for over 35 years, it is now being restored to operational status at The National Museum of Computing at Bletchley Park. They expect the restoration to be completed next summer, at which point the WITCH will be able to claim another title: oldest operational computer, beating out the Ferranti Pegasus whippersnapper at London's Science Museum.

-=[ Grant ]=-