Before the World Wide Web did anything, Apple’s HyperCard did everything

“Sometime around 1988, my landlady and I cut a deal. She would purchase a Macintosh computer, I would buy an external hard drive, and we would leave the system in the living room to share. She used the device most, since I did my computing on an IBM 286 and just wanted to keep up with Apple developments,” Matthew Lasar writes for Ars Technica. “But after we set up the Mac, I sat down with it one evening and noticed a program on the applications menu. ‘HyperCard?’ I wondered. ‘What’s that?’”

“I opened the app and read the instructions. HyperCard allowed you to create ‘stacks’ of cards, which were visual pages on a Macintosh screen,” Lasar writes. “Intrigued, I began composing stacks… I eventually glanced at my wrist watch. It was 4:00 AM. Startled and quite tired, I turned in with visions of stack buttons dancing in my head.”

“Even before its cancellation, HyperCard’s inventor saw the end coming. In an angst-filled 2002 interview, Bill Atkinson confessed to his Big Mistake. If only he had figured out that stacks could be linked through cyberspace, and not just installed on a particular desktop, things would have been different,” Lasar writes. “‘I missed the mark with HyperCard,’ Atkinson lamented. ‘I grew up in a box-centric culture at Apple. If I’d grown up in a network-centric culture, like Sun, HyperCard might have been the first Web browser. My blind spot at Apple prevented me from making HyperCard the first Web browser.'”

Read more in the full article – recommended – here.

MacDailyNews Take: Ah, HyperCard. You’re still missed.

Bill Atkinson discusses Apple HyperCard:

Related articles:
3 decades on, Apple’s HyperCard lives on at The Internet Archive – August 11, 2017
Former Apple CEO John Sculley: We blew it with HyperCard – October 3, 2003

33 Comments

    1. Recognizing is one thing, implementing is another… Bill implemented HyperCard on slow, memory-starved (at the time) Macintosh computers, using assembly code and without a proper database manager. Hats off to Bill…

      1. The problem is, “network centric” Apple was already having us do warranty repair and parts ordering over the ‘internet’, or at least what passed for the internet at the time, using one of the ‘bones’ of AOL. Should he have known that the internet was coming? Yes. There were plenty of clues.

    2. Hypercard ruled.

      It was among many, though, that could have been the first “web browser” if it had been developed in a network-centric way. Lots and lots of near-misses out there.

      1. Totally agree! I used it for lots of different things in my lab at UCLA and it was the only time I’ve ever managed to do “programming” of any sort. I really miss that. I know there was a short-lived “SuperCard” that someone else made, which supported color as well, but it was never as great as HC!

      2. I’m glad it wasn’t the first “web browser” as it was great at what it did and offered (at least to me).
        Look at NutScrape – one of the first (Mosaic anyone?) browsers and where is it now? HC was a great app and tool. Just wish it had lived on…maybe it does in a way with Swift, etc…

  1. The great thing about HyperCard was its programming language, HyperTalk. It’s a shame that Apple never used HyperTalk as an interface to do anything else.

    There were so many opportunities along the way to use it and let users customize their own experience, but even when Steve came back, HyperTalk was never revived. That’s what makes me so sad.
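
    For anyone who never used it, a typical HyperTalk button script looked roughly like this (a sketch from memory, not from any real stack):

      on mouseUp
        -- clicking the button asks for a name, then greets the user in a dialog
        ask "What is your name?"
        answer "Hello, " & it & "!"
      end mouseUp

    Plain, English-like, no compiler, and it ran the moment you closed the script editor. That kind of approachability is exactly what never came back.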

  2. Hyperlink, hypertext, ARexx.

    Workbench, Amiga.

    Way before 1990.

    And absolutely no need for file extensions .xxx (.jpg .mp4 etc)
    Also, any program could run in its own screen resolution, e.g. a paint program at 1600×1200, word processing at 800×600, video editing at 1280×1024, all on one and the same monitor. A feature that no other OS can muster (to this day, including Mac OS).

    The only true next-gen OS to this day. (Look up its kernel, queue handling, minimal resources needed / of course the Workbench UI and its services are)

    If Cook and the rest of the team have any sense about real technology that works for the user and not the opposite, they should buy the rights to Amiga OS, convert it to ARM (or x86), modernize it, and release it as Mac OS 11.

    Period.

    1. (Look up its kernel, queue handling, minimal resources needed / of course the classic Workbench UI and its services are ***EDIT = outdated by now, after all it’s a 1985 OS)

        1. Workbench was the only OS that took UNIX light years ahead and at the same time made it user-friendly (Amiga = female friend in Spanish IIRC).
          UNIX is… a dinosaur. From the days of punched cards lol.

          Still using Workbench today on my Mac (FS-UAE).

          I owned several Amigas back in the day, and of course the MEGA machine+combo: Amiga 4000 + Video Toaster (LightWave was in the package by default). You cannot even imagine what we could do in 1990-1994 on that setup… PCs and Macs did it 15 years later (bar the flexibility of Workbench).

          1993/4 = Amiga 3000 + Amiga 4000/VT/LW connected to 5 monitors (2 Multisync, 2 Hi-Res, 1 PAL/NTSC), everything working like a charm, fully networked. With JUST 32 megs of RAM. Read again: 32 megabytes of RAM for web surfing, video editing, image editing, music composing, and multimedia authoring.

          Also: using ShapeShifter or Fusion, the Amiga could emulate Mac OS System 7/8 and crushed real Macs (speed and CPU/GPU power). Netscape, Photoshop, and Quark ran way faster on my Amiga.

          For hundreds of dollars less.

          Today?

          Only 2 computing categories: ugly PCs (same tower design since 1993) with a crippled OS (Windows), or… Macs with laughable power (CPU/GPU, bandwidth for RAM, lanes, buses, expansion) and broken keyboards.

          If I had a time machine and went back to 1993 and described to myself the whole picture of the current state of home computing (80s mainframe power in the palm of your hand, i.e. smartphones… but used only for selfies and social media / computers that have the purpose of a coffee machine, meaning the OS is just serving the program you like to use = nobody cares about OSes anymore, like optimization, careful use of computer resources, stability, and ultimately the consumer’s wallet), I wouldn’t believe it. Like, I’d think that my future self is full of mescaline/Hofmann or something lol.

          Hey computer! You have come a long way, yes?
          Oh, hi! Yes I have! From floor-sized mainframes, then very smart 8-bit computers, after that extremely smart, user-friendly and affordable 16/32-bit machines… to very stylish, thin, half-assed specs (x86 is the worst CPU architecture), behemoth resource-needy OSes, and of course skyrocketing prices.

        2. Jumped to the Mac from an Amiga 4000 (because of OS X), never owned a Microsoft PC and never will. The Amiga OS and hardware are still ahead in some areas, similar to some of the SGI-designed computers at a much higher price. Apple has a chance with its A-series CPUs and OS to go far ahead, but it seems like Apple doesn’t have the will to do so. I miss the times when so many American companies designed hardware and OSes as one.

        3. That’s the only way: design your own hardware + make your own OS.

          The “fashion” made by Micro$oft (“we build the OS, others build the hardware”) is utterly dysfunctional. Every company except MS did it the one and true way = OS + hardware design of their own (Atari TOS, CP/M, IBM OS/2 which is really Amiga OS ported to x86 / an agreement between IBM and Commodore, IRIX, etc.).

          The last “proper” Mac OS was Snow Leopard. Since then it has become a resource hog with too many bugs (never patched, like the real-time refresh of the Finder since Mavericks).

          Cook and Co. are completely ignorant of what “technology” means. It’s not speakers, or earbuds, or whole-screen phones. Those are just gadgets. Sure, they are nice and powerful gadgets (and pricey), but they are just that = gadgets.

          True computers work for the user, not the other way around. Expandable beyond dreams (not just putting in a new PCI card; a computer must have a custom bus expansion slot, besides industry-standard expansion slots, so hardware companies can offer amazing and undiscovered ways to make something unheard of before).

          Nowadays everything looks dull, repetitive, and almost alike in the eyes of the average Joe. iOS/Android, Windows/Mac OS. Almost the same. No “custom flavor,” if you get what I mean.

          Still remember the day I worked on an SGI machine back in 1992. Sure, it cost hundreds of thousands of dollars… but it was truly innovative. The “lower cost and somewhat similar” option was the Amiga.

          Me too, I hate PCs and Windows. I went from Amiga(s) to Mac because OS X Panther got my attention. I remember being happy for the future, because OS X gave me an Amiga sense of flavor. I was truly excited for the things to come.
          I applauded Mr. Jobs when he banned Flash from iOS. The same when he was infuriated by those thieves (Google) and said “We go nuclear against them.” But, sadly, he died (RIP).

          Then Cook and the rest of the crippled team took over.
          And made Apple “not a tech company” (Cook’s wording).
          And made faulty keyboards (70s engineers are laughing).
          And underwhelming but overpriced Macs.
          And all the time giving speeches about gender equality, freedom of speech, etc.
          Dude (Cook), go be a politician. You are not the CEO of a tech company.

          Oh, I forgot. It’s not a tech company anymore…

        4. This is a reply to both this comment and the one below.
          I don’t (can’t) disagree with what you’re saying. But…

          SGI was pretty much positioned in a market niche, and at its price could never bring computing to the masses.
          And though I couldn’t care less about MS, and Vista was a total abomination, as was DOS, I do heavily favor component systems.
          This is not to say that tight integration doesn’t have advantages, but they come at the expense of latitude.

          The amount of innovation the component approach brought was nothing short of amazing.
          It offered economies of immense scale, and fostered unbounded creativity in both hardware and software. It made the computer truly personal and supported the entirety of the world’s hardware. Exactly the opposite of Apple, and that is why Wintel won the PC wars. Freedom and versatility, not love.

        5. “The amount of innovation the component approach brought was nothing short of amazing.”

          Are you mad? The Microsoft Wintel dominance and component approach brought stagnation, not innovation. Everyone in tech understands this. The industry was forced to slow down to Microsoft’s pace and level, which was awful.

          “Exactly the opposite of Apple and that is why Wintel won the PC wars.”

          You’ve lost the plot. Every computer and most tech devices now work like and look like Apple’s. It is Apple that won. Winning doesn’t mean market share.

        6. No sir, you are utterly mad if you can’t see the explosion of software and hardware available to Wintel, especially from 1980-2010, and far too often not available for the Mac.

          And it’s been a Wintel digression that they copied the more limited, less serviceable Apple model that served to make their lives easier.

        7. PS: And I can still buy from a far vaster latitude of machines and peripherals, as always, that support the universe of hardware. Or build my own. In any case, I am not beholden to one manufacturer.

        8. “the explosion of software and hardware available to Wintel, especially from 1980-2010”

          You are confusing an explosion of retail choice with innovation. The two are different animals completely. Choice is not innovation. Wintel slowed the pace of innovation down to the lowest common denominator and snuffed out new innovation in order to keep their stranglehold.

          Yes you had and still have many choices in a component based approach. Very little of that was or is innovation. It was and is a sustaining development.

          “I can still buy from a far more vast latitude of machines and peripherals as always, that support the universe of hardware. Or build my own. In any case I am not beholden to one manufacturer.”

          I will repeat, choice is not innovation. Don’t be confused that it is.

          The fact is that all computers have fallen in line with Apple’s larger vision. Differences in component vs single manufacturer approaches are not significant. From a distance you can’t tell what is a Mac or PC, what is an iPhone or an Android or other. The industry, for the most part, marches to Apple’s beat.

        9. Sir…
          You could get software applications that boggled the mind, from office to productivity to games to scientific and more. Linux too, not just Wintel; remember, it’s the liberalization of the machine I tout. Just look at even shareware and Linux repositories and see the wealth of, yes, innovation. Some good, some not so good.

          Interchangeable parts, lower costs of performance, all included due to choice.

          Where’s the big box Mac of today?

          Choice may not be innovation in and of itself, but it sure fosters it!

        10. “it’s the liberalization of the machine I tout.”

          A laudable goal but it isn’t innovation.

          “Choice may not be innovation in and of itself, but it sure fosters it!”

          Not necessarily. Choice can just as easily stifle innovation which it largely did during Wintel’s reign.

          You are off on another track now from what you first said, which was “The amount of innovation the component approach brought was nothing short of amazing.”

          Truth is Wintel’s component approach did not bring innovation. It gave us many things. Some good. Many bad. Innovation was stifled not spurred.

          You are confused about what real innovation is. I’ll be off now. You seem ideological in your thinking and not at all rational.

  3. I too stumbled across HyperCard in 1989, on a Mac which I was using at work. Once I started playing with it, I realised that it offered me the means to create a number of work-related utilities which massively increased our productivity and efficiency. The stacks I wrote had stylish graphical interfaces while, at the time, most computers used up/down arrows and cursors for interaction. Clients were always asking where we got those utilities and found it hard to believe that I created them in my (very limited) free time.

    HyperCard was the reason why I then went out and bought my first Mac. I’ve used Macs exclusively since then and have never owned a Wintel PC.

    It’s a great shame that HyperCard was left to fade away as it offered a degree of programming that anybody could understand and allowed us to create new uses for our Macs.

  4. What was so impressive was the versatility of HyperCard… I remember making stacks that (with barcodes embedded) allowed control of a videodisc player… programming a multimedia presentation well before PowerPoint. It was so easy to teach students to do it as well.

  5. I was the dealer for a HyperCard-based electronic medical record system and a HyperCard stack intended for doctors to send out referral and consultation reports.
    It allowed for tracking lab results and even graphing.
    HyperCard was brilliant.

    There were so many stacks available. One of my favourite ones was called Fortune Cookies.

  6. I did a stack as an accompaniment to an album we cut back in the 80s. Close to the only time I ever finished a programming project, because it worked so well and it was so much fun. Sometimes I think Apple peaked with HyperCard. I also often wish that the Apple User Interface guide had been made mandatory.

  7. Made a HyperTalk controller for a radio transmitter. Made one that output new bar codes for products. Just type in the product model number, and out came camera-ready artwork for labels. Now you don’t even need the camera.
