Somewhere between 2010 and 2012, while doing small business IT support for a client, I spent some time installing 1-2-3 on an old, mostly disused machine in a corner of their office, then connecting to that machine via LogMeIn on an iPad (running it under DOSBox, I think because it better supported something in LogMeIn; I honestly don't remember why not CMD.EXE). This was so the customer, a medium sized construction sub-contractor, could continue to use the 1-2-3 based estimation spreadsheet that had been developed in the early (yes, early; it might have been converted from VisiCalc) 80s. I offered to instead convert it to Excel, since it wasn't really that complicated, but the response was basically "That's probably better, but I've been doing this since the 70s and I'm going to retire soon. I don't want to learn more new things that I don't care about instead of getting jobs done as fast as possible."
It's not that 1-2-3 or a TUI was better, but that if you already know it, you don't have to care about it, and not having to care about the tool is what people actually care about.
Worse is better.
And so it goes, with apologies to Mr. Richard Gabriel. That experience helped me understand more deeply that what software should do is important things for people, not just do things better.
This reminds me of something I bring up to my development team: the point of software is to serve humans, to make their lives better. You should avoid spending time solving problems created by yourself, the programmer.
I think about this a lot when I see UI redesigns that "look fresh" or "updated" while actually reducing productivity and degrading the life of the humans who need computers to get stuff done. All for some vague concept from the head of a "visionary" who often doesn't use the system to solve the problems it is meant to solve.
Facebook flew me over from Australia to interview a decade or so ago, back when they seemed untouchable as a product. I told one of my housemates that I was interviewing with them and she fired up, angrily telling me that I needed to go over there and tell them to stop changing things. Her experience of using Facebook was basically that she’d just figure out where the button for something was, and then they would redesign it for seemingly no reason and she couldn’t find anything again. I asked her a bunch of questions (since this is great stuff to talk about in interviews). I hadn’t appreciated how upsetting redesigns are to people who just want to use the software.
But you also need to make your software good for you as a programmer too. As programmers we’re like builders. What we build today becomes our workspace tomorrow. Leaving a codebase neat and tidy makes it easier to spot bugs and make changes down the road. You can overdo it, but spending 20 minutes adding unit tests today could save you a week of debugging 6 months down the line.
The tech industry seems to have a very severe, very deep, very wide disconnect from the common man about what exactly computers, and the software that runs on them, are.
Ask a techie and they'll probably say it's something to be maintained and updated, to be replaced on a given lifecycle because new things demand it; that it's an achievement of technology or something. A computer, to the tech industry, is an end in itself.
Ask a common man and they'll probably say it's a tool to get stuff done. That 'puter in the corner working happily for the past 50 years? It's great, gets the work done and puts food on the table at night like any other tool (also 50 years old) on their 50-year-old workbench. A computer, to the common man, is a means to an end.
I hope some day the tech industry comes to terms with this disconnect. The users would absolutely be better off for it.
Even brand new kitchens get replaced if new owners don't like the style. Just like perfectly painted walls get repainted in different colors all the time or like some people move around furniture to get a "new room feeling".
Also, I would say software may not deteriorate on its own, but in the context of other software and technologies moving forward, it does deteriorate. For example, many old games have effectively deteriorated because they don't easily run on modern OSes any more.
I think the point here is that nobody puts their kitchen sink inside a cupboard and installs their oven-stove upside down in the name of innovation and challenge during a remodeling.
My biggest frustration with a lot of modern software is this habit of presenting modal feedback requests, “how likely are you to recommend this to a friend?”, new feature tours, etc. If I’m using the product it’s because I need to get crap done, and this stuff just gets in the way of work. Leave me alone and let me work. I’ve been the guy who submits feedback of “stop asking me for feedback,” but evidently it’s not valuable feedback.
I alternate between "I don't go around recommending X to people. I'm not a weirdo" and "My feedback is that asking for feedback will generate bad feedback (and also stop asking)."
> I think about this a lot when I see UI redesigns that "look fresh" or "updated" while actually reducing productivity and degrading the life of the humans
This ^^^. It totally annoys me having to relearn programs when I'm faster with 16-bit versions from the early 00s.
I’m not sure that’s the best takeaway, but maybe. It resonates with my experience with folks who are near retirement and on a glide path out - they know what they know, understand it will get them over the finish line, and aren’t interested in learning new tooling.
It has nothing to do with software design decisions and everything to do with a cost/benefit equation in the mind of someone with one foot out the door. I’ve seen this with senior DBA’s and CEOs alike. They are not objectively good judges of “what works” for anyone except themselves because of this huge conflict of interest.
I sincerely believe you see the same thing with the elderly at a certain point - I’ve known a few people that after like 85 have zero interest and feel zero compulsion to “stay on top of” new ways of doing things, whether it’s software, vehicles, whatever. They’ve got enough skills to make it out and why trouble yourself with stuff that doesn’t matter?
I understand it, but am not sure it’s wise for a retiree. For instance, Excel may come in handy for that guy doing budget planning or contracting gigs later.
Add to that, the further behind you slide with technology, the harder it is to catch back up. You miss out on integrating the models, paradigms, and affordances, and a couple gens of compounding evolution happen without you and the learning curve becomes much steeper than it needs to be. It’s like keeping your house clean by doing little bits frequently rather than massive efforts rarely.
If that is the takeaway you got from my comment, I am sorry for the bad phrasing on my side. I was trying to be short, pithy, koan-like. Can I try again?
Almost no one wants to use the tool. They want to use the tool to do the thing they want to do. For example, it is rare that someone would buy a piece of wood just so they could cut it with their new saw. They might decide to make a table so they have a reason to use their new saw, but they are not going to use a tool just to use a tool; it has to provide clear value to them. In this case, for this person, they were willing to learn new technologies (as I think is evidenced by using DOSBox on an iPad in 2012, though I didn’t say that in the original comment; this was a very, very clunky system then, and I think they didn’t use an external keyboard) but not for the thing that was on their critical path, if there was not a clear improvement in outcome.
I didn’t provide a clear improvement in outcome, I just provided a better tool. After this person had been doing their job for as long as they had, using the tool they had, investing in a better tool for that job didn’t make sense. They also knew whoever came after them would probably not want to use their tool there, so why change it?
Put yet another way (since I didn’t do a great job the first time, I’ll try again): nobody wants to use business software. I have written line-of-business, not consumer, software most of my career. Even now, where I work and we have a solidly engineered, well designed, custom CRM/glue software that has been improving since the late 90s and really fits the business well, no one goes into work in the morning thinking, ‘yay, I get to use our CRM today!’ And definitely no one thinks, ‘yay, I get to use JIRA and SAP today!’ They may think, if they even think about our software at all, ‘hey, this doesn’t stink.’ But recently I found out someone was exporting a table of data to Excel and then spending 4-8 hours a week creating a report, and in a day I was able to turn that into the press of a couple buttons and picking a few values. Now they get to do more of the things that bring our company value and make their job more interesting, and they love that they get to use my software to do it. They don’t care that I used a mini Vue app with an AJAX call to create the interface instead of our older form/POST paradigm; they care about what the tool let them do.
That’s the value that I see computer systems, when thought about and executed well, providing to users everywhere. Not doing tasks better, but doing the right tasks, for the right people, at the right time.
To extend this yet another way, and hopefully establish my point firmly, I was thinking of the above situation as a win-win: I got to say ‘just use Excel,’ they got to use a modern piece of software - what would be better? In the same way, the guys in my company’s IT Operations group decided that the new hoteling desks in the office would all be double 24-inch monitor setups. I do not really like using double monitors; I prefer one large monitor, like the 32-inch that is still kicking around the office from pre-COVID times. I asked them to just give me one flex desk with the large monitor that I could use when I came in - we already have the monitor, please! But they feel it would generate more tickets, and they are already overworked (and they are!), so no single monitor for me. They see it as a win-win: they reduce their ticket burden, have more time for other things our manufacturing conglomerate needs, and most people like dual monitors. But I don’t like it, and now my weekly trip into the office is a little less comfortable because my ergonomics are wrong and my muscle memory for where my windows are is wrong. Yes, I can learn new things, but the tools are made for man, not man for the tools.
I know that was long, hopefully that helped clarify my comments.
I once spent a summer upgrading my company's flagship app from VB6 to VB.Net. The performance of the .Net version was so bad we ended up throwing it away. I learned the hard way newer is not automatically better.
Curious. Do you know if it was because of the .NET overhead, or because the fresh implementation didn't have all the edge case optimisations of the previous one?
The app was analyzing all the pixels in a live image and overlaying data on it. Accessing the data from the camera was slower with .NET - I think more memory copies when working with buffers and byte arrays. This was long before any Span&lt;T&gt;/Memory&lt;T&gt; functionality. Iterating the pixels was way slower too; I don't remember why, but you have to remember VB6 has no garbage collector and no VM/bytecode. It was surprisingly fast. So when we updated, our FPS took a major hit and we could never get the performance back to what it was.
That brings back some good memories. I started using Lotus 123 in the early eighties. I discovered macros and wrote templates that automated our fertilizer company. Stuff like inventory and blending fertilizer.
I was at a farmer's once and got talking to his wife about Lotus, and she showed me what she had built. She had a few boxes of software and books above her desk. The Lotus 1-2-3 box looked different than mine, and it turned out she had version 1, not version 1A like the overwhelming majority that were sold.
I told her to never throw away that box and disks because sometime in the distant future she had a relatively rare collectors item. She laughed and said with my luck a competitor will come along and no one will ever want a rare copy. She may have been right.
> "That's probably better, but I've been doing this since the 70s and I'm going to retire soon, I don't want to learn more new things that I don't care about instead of getting jobs done as fast as possible."
My maternal grandfather-in-law recounts that he said the same thing when he was close to retirement, but regrets it badly today. The skills someone was trying to teach him would, it turns out, have been very valuable in retirement as well.
If I were the customer, I’d reject your proposal too. The old software has been working fine for 40 years, why would I want to find someone to rewrite it?
And faster, safer and more future proof. I wouldn't be surprised at all if most 30+ year old software ran better on modern machines under a VM than on, say, 15 year old hardware without emulation.
Faster, yes. But not all of the things modern software does are pointless overhead. For anything networked, modern crypto is a must-have if you’re going to interoperate with other devices. And I wouldn’t trust a Windows 95 virtual machine connected directly to my corporate network.
I didn't try, but firewalling the VM shouldn't be hard since it talks through a virtual dedicated interface. Also, one can isolate it from the network and use only shared directories with the host so it can read and write files without going online.
I think I can tell you why not CMD.exe: it's because DOSBox could still run 16-bit x86 MS-DOS programs, which modern 64-bit Windows can't.
I know this because my dad ran Quattro Pro, a Lotus 1-2-3 competitor up until around 2020 (he finally retired but still might use it for other reasons) and we had to keep figuring out how to keep it working over the years.
Speaking of Lotus 1-2-3, the best computer I have ever owned was the HP 200 LX Palmtop - https://www.palmtoppaper.com
It was about the size of a graphing calculator, and ran MS-DOS 5.0 with a black and white CGA display, and had Lotus 1-2-3 and rudimentary task-switching.
Simply phenomenal for 1994. Nothing has ever really come close; maybe someday I'll modify one so I can use the keyboard and replace the screen with a modern phone.
Beat me to it! ;-) I continue to use my (three!) HP 200 LXs in different parts of my lab for different reasons. Most of the 1-2-3 stuff is for analyzing S-parameters and other RF tasks (conversions, ratios, formulas, Smith charts).
Two AA batteries power it for at least a month; you could recharge NiMH with the built-in charger if you wanted. I use the PCMCIA card slot to hold a battery-backed SRAM, allowing complete backup (via automatic macros) of the entire machine (dead main batteries are not killers, here in the pre-FLASH world).
And the comms: you can plug a special cable in to get a 115,200 bps serial port. And you have IrDA - I wrote the very first IrDA-compatible 'stack' on that machine (and at the time sold compatible discrete IrDA HW, so you could attach another end, e.g. a PC).
Sorry to highjack a 1-2-3 thread with praise for the HP 200 LX, but that was (is) one HELL of an awesome machine, still in use in my lab to this day.
I was about to make the same comment. It was a remarkable machine. I remember buying it in 1995. It was so small that it could fit into a shirt pocket. I was in college at the time, and I remember that I had an old Radio Shack dot matrix printer that I could hook it up to. It was a real step up from the TI-85.
Hmm.. I take that to mean it’s serving your needs directly now, and getting a pi to do those specific tasks would be more effort than it’s worth. But it’s a bridge too far, sir, to say a 200 LX is more capable than a Pi. ;) I bet one could create a replica 200 LX with a Pi zero with more features, but one could not replicate the Pi with a 200 LX. I know, I know, POC||GTFO, but I think it’s a good way to think about it.
That's what I'm getting at - the Pi is amazingly powerful, but it doesn't come in a clamshell with a built-in screen and run for a month on a pair of double As.
I would really like something that was the exact form factor of a 200lx, but with a modern very low power chip and a color screen, even if it only ran for a week or so.
Well, you could scoop out the internals of your 200 chassis and slip in a pi zero W, an oled screen, etc. it would be some work but there’s probably no reason you couldn’t pull it off. You’d need a different (and better?) battery solution than AA, but it’s really not hard to get your pi to run off battery and take usb-C charging. You might enjoy reading these links:
I got another stroke of luck, I found a third party printer driver on an old SUNET archive for the Siemens Highprint 7400. Remarkably, it had some ancient Codeview debugging data left in it.
I quite like the anti-deletionists at the Academic Computer Club in Umeå, Sweden. I had a similar experience a while ago - they were the only ones currently hosting a once (well, somewhat) ubiquitous file.
SUNET is the name of the Swedish University NETwork. They operated ftp.sunet.se with a public archive starting 1990. In 2016 this academic computer club took over the hosting. They get the bandwidth from their university for free. Their machines are typically previous-gen donations from local companies. Oh, and the local university tends to cover the electricity costs as well. As well as the costs of the server room. Sysadmining the service is done by volunteers though.
I’m too young for Lotus 1-2-3. Or at least, I should have been.
I worked at a factory in 2011 that still used Lotus on their ancient dinosaur because why fix what wasn’t broken? The task was mostly just to record the values measured from a QC device in a spreadsheet, and print out a report for each batch. Since the computer didn’t have USB, printing involved saving the PDF to a floppy disk, running it across the factory to the printer, and inserting it into a USB-Floppy reader connected directly to the modern office printer.
Honestly, I was really impressed by how functional Lotus was for such old software. The majority of tasks I was used to in modern Excel were doable in 1-2-3, and it could even do several things that still aren’t possible on Google Sheets.
> The majority of tasks I was used to in modern Excel were doable in 1-2-3
Work for a few years in the finance or insurance industry, and this opinion of yours will likely change: people in these industries have a tendency to (ab)use functionalities of modern Excel a lot. :-)
Off topic, but the other day people were wondering why users stayed on the Microsoft platform when operating systems like the Amiga, Mac and even NeXT existed, and it was because the terminal user interface was good enough for many people.
Great work by the author. I really like hearing about extending the lifespans of DOS applications.
The way I remember it was that people often did use other platforms but Microsoft won largely because IBM lost control of the PC market allowing for hundreds of cheap clones.
It wasn’t until the mid 90s when PCs started to feel like the more capable platform.
It’s also worth noting that there is a reason Microsoft offers heavy subsidies for education: it’s because if you indoctrinate people into your platform early on then they’re likely to keep buying your platform for years to come because change is scary.
Edit: just to add some context, this is the point of view from England. Sounds like Apple had (and possibly still has?) a bigger presence in schools in the US. Whereas over here Apple machines were relatively uncommon compared to many of their counterparts.
> It’s also worth noting that there is a reason Microsoft offers heavy subsidies for education: it’s because if you indoctrinate people into your platform early on then they’re likely to keep buying your platform for years to come because change is scary.
That was Apple in the tail end of the 70's and 80's. Just about all grade schools where I am had Apple ][s and few to no PCs. Apple heavily discounted the hardware and software for education. It was the same through Jr. High and High school (graduated in '90), except by then we had Macs in the school "labs." As a kid and teenager, I didn't even know there was an IBM or Microsoft until the 90's. Everything was Apple in education. I was heartbroken to receive a 386sx/16 for my graduation present as I was heading off to software engineering in college. Until I got to college, that is, where hardly anyone had Apples and the whole college was IBM clones.
I had a similar experience in the early 90s (II followed by 68K Macs (LCIIs) and a few molar macs), but by '98 (HS for me) they were buying pallets full of Gateway and Dell desktops which gradually replaced the Mac labs.
My guess on causes in no particular order:
1. Apple got stingier in the discounting
2. They faced more aggressive competition from PC vendors and weren't willing to eat any additional margin by increasing their subsidy
3. What I suspect is the case: Schools looked around, compared Windows 95 vs Mac OS 7 and 8, looked at Apple's business marketshare (nil) and their shrinking consumer marketshare, and concluded Apple was lost in the wilderness and teaching kids to use Macs was pointless. Even as an Apple admirer at the time (couldn't afford one though) I can understand the decision.
Are you American? Macs were much less common in Europe and particularly in England. In fact, England had its own educational computer in the 80s as a venture between the BBC and Acorn, the company that invented the original ARM CPUs.
Which was funny to me because it wasn't until well past 2000 that any of my educational systems were Windows; grade school was Apple IIs and early Macs (one was color!) and high school had higher end Macs.
Cheap clones able to run 1-2-3 were a big part of it, but the real lightning bolt for everyone I knew was Windows 95 and the Internet. And most of the kids I knew would push their parents towards Windows for the games.
In U.K. it was really common for people to have 8-bit micros even into the late 80s. Typically Amstrad and Commodores (with machines from Sinclair before that) while schools had BBC Micros (from Acorn). Though there was a plethora of machines around at that time that most people have since forgotten about, like the Dragon 32 / 64.
Europe, and the U.K. particularly, had a really strong computer hardware industry in those days. In fact, France had the Minitel, which is itself a really fascinating technology and shaped early French computing trends in a different way to England and the USA too.
Going back to the U.K., you’d see a few people, usually people with a little more money, buy Atari or Amiga machines, but more often people jumped from C64 or Amstrad CPC micros to IBM-compatible PCs. There definitely were people who had an Apple Mac, but it wasn’t the norm where I lived.
Showing my age a little here, but my high school still ran Windows 3.x (I don’t think it was even 3.1. Pretty sure it was 3.0 on 286s). It was all Microsoft Office too. I ran Lotus at home but Microsoft Word and Excel had already conquered the schools, at least from what I saw. But we had Corel Draw instead of PowerPoint. Not sure why. Maybe PowerPoint hadn’t been released yet?
I wasn’t around for the earliest iterations of PowerPoint, but according to Wikipedia [0], for its first several years PP was Mac-only (mid-1987 to mid-1990), even though MS had acquired the PP developer as its first big Silicon Valley acquisition.
PowerPoint for Windows was first released in 1990 and was bundled into “Office” but didn’t have the deep interoperability with other Office apps until about 1993.
Do you remember what year your school only had CorelDraw?
And price, and gobs of network effects. Macs were much more expensive than PCs, and if you were working in an office with multiple computers, being the only one with a mac was a huge pain in ways that don't matter today.
The main way to transfer files in the 80s was floppy disks. Most PCs in the 80s had only 5.25" floppies, and Macs had 3.5". Even if you had a 3.5" drive though on your PC, the Mac had a much different format. There was some software that let you read Mac disks on a PC, but it was somewhat flakey, and then you were left with the fact that the base file format on the Mac with resource and data forks was much different, and then of course you were probably using different spreadsheet and word processing software. Overall it was just barely possible to share files if you really worked at it.
What you tended to see was that most everyone used PCs except certain industries/jobs that needed the graphical capabilities of the Mac. For example, if you were in a newsroom, everyone would be using Macs because there really wasn't good desktop publishing software for the PC until about the early nineties.
The Amiga had a niche in video production because there really wasn't anything comparable to the video toaster for the Mac or PC until much later. You'd also see various types of workstations used for CAD, and in the nineties, SGI was the only game in town for high end 3D graphics. Hardly anyone bought a NeXT because they were so expensive and therefore didn't have enough software. They were pretty good for rapidly developing GUI software though, so there were a few companies, specifically banks, that used some of them for internally developed software.
> Hardly anyone bought a NeXT because they were so expensive and therefore didn't have enough software. They were pretty good for rapidly developing GUI software though, so there were a few companies, specifically banks, that used some of them for internally developed software.
Famously, id Software also developed Doom on NeXTSTEP.
Xerox Ventura Publisher was popular before the 90's and ran on the GEM graphical desktop from Digital Research (the CP/M people). It was introduced in 1986.
One easy way to transfer files between early Macs & PCs was a null modem cable and terminal software that implemented Xmodem. Very long ago, I once pulled a DBase2 database off a CP/M system with 8" diskettes over a null modem cable by printing it to the serial port, and then I captured the output on a PC with Procomm.
There was a later "Laplink" program with a cable for parallel printer ports that ran faster than the 115.2 kbps max of the serial port.
I was pretty involved with the NeXT community back in the day.
If you recall, NeXT started as a workstation for higher ed. I worked at MIT in the 90s and compared to the Sun and SGI workstations departments routinely bought, NeXT was less expensive.
And since it ran Unix, people could use most of the software they were used to.
The premier spreadsheet for NeXT was Lotus Improv, which introduced many features that we take for granted today, like pivot tables. For that period of time, the display was so much better than anything comparable. Even the original greyscale display was stunning.
Steve Jobs only wanted the best software anyway; he never wanted lowest common denominator software.
Thanks to Display Postscript, it became worth it to buy a NeXT to use FreeHand or Illustrator that could do things other platforms couldn’t when it came to desktop publishing and graphic design.
People forget that back in the days of DOS and early MacOS, people rarely switched platforms - the waves were new people coming into the market, and they would often sway which platform was winning.
Software was expensive back then, too, many hundreds of dollars that you really really wanted to keep running.
Another reason is because the simple text-based interfaces and terminals are also much more responsive. Some people prefer a snappy response over constantly moving their hand between the keyboard and mouse.
This is massively important, especially for line-of-business apps or tools that the user touches hundreds of times a day. GUIs are great when there are multiple, equally likely paths the user could take. But LOB apps tend not to be like that: there is a mainline path (or a very small number of paths) that dominates all others, and apps that let the user keystroke/tab/enter their way through these hot paths are waaaay fast.
For US people, look at how fast the employee in your local Costco can look you up on their ancient-looking text app.
You can do this in a GUI, or at least you could in native apps, to some extent, with a series of tab presses to traverse the interface, and in the peak VB5 era this was kinda OK, but not great, and webapps are a hot mess for this use case. If I'm shopping for flights, a GUI webapp is great: lots of paths are valid, and I can poke around. If I'm a gate agent looking for the last seat on the plane out of O'Hare, I want speed.
Typically, the TUIs are also really good at consistently handling input, so even when they do get behind in processing the input, you can keep typing, and it will buffer inputs and catch up. There's often an attention key which is not buffered, and will dump the input buffer and stop any operations in progress, in case something is stuck.
Even if a GUI does buffer input, it's unusual for users to be comfortable clicking where the buttons will show up, before they do. More frequently, click processing is separate from whatever else, and early clicks (or taps) are ignored or directed elsewhere.
GUIs used to handle keyboard input consistently so you could type ahead, even into future dialog boxes, if you weren't using the mouse. Unfortunately, this tends to be broken for modern GUIs using async frameworks.
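The type-ahead behavior described above can be sketched in a few lines. This is a toy model, not any particular terminal's implementation; the choice of Ctrl+C as the unbuffered "attention" key is an assumption (real systems vary, e.g. 3270's ATTN):

```python
from collections import deque

ATTENTION = "\x03"  # hypothetical attention key (Ctrl+C here); real systems vary

class TypeAheadBuffer:
    """Toy model of TUI type-ahead: keystrokes queue up while the app is
    busy, but the attention key bypasses the queue and flushes it."""

    def __init__(self):
        self.queue = deque()
        self.interrupted = False

    def key_pressed(self, key):
        if key == ATTENTION:
            # Not buffered: dump all pending input and signal the app to stop.
            self.queue.clear()
            self.interrupted = True
        else:
            self.queue.append(key)

    def next_key(self):
        # Called by the app whenever it is ready for more input.
        return self.queue.popleft() if self.queue else None

buf = TypeAheadBuffer()
for k in "LIST\n":        # user types ahead while the app is still busy
    buf.key_pressed(k)
print("".join(iter(buf.next_key, None)))  # app catches up and replays the keys
```

The key property is that every ordinary keystroke lands somewhere deterministic, no matter when it arrives; only the attention key is handled out of band.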
This is a nice side-effect of running processing code on the GUI thread. I guess we can't do that anymore now we are dealing with network latency all the time.
I remember watching clerks at Fry's Electronics, I think it was - they had PCs with Windows on them, but all they ran was some sort of terminal emulator to access the system mainframe or whatever.
You could ask them for a pick slip for whatever, they'd turn around, ask your name, type everything in in seconds, and then turn to the next customer; the computer would slowly plug along through all the screens and then print out the slip. They knew all the commands that far ahead.
Yeah, Fry's used to be all real terminals, but I guess they ran out of equipment, so they ran a terminal emulator in Windows instead (almost always full screen, but modal popups, from the printer I think, would break it out so you could see it was Windows). The sales and returns terminals were usually set up so you could see them work. Cash registers were under the same system, but you usually couldn't see those. Word on the street was that one of the Fry brothers had written that system; I think the server was a PC in the store somewhere (they did have some ability to check other store inventory, but I don't know how realtime that was).
Airline service desks and vehicle/driver licensing are also popular places to have a well developed TUI along with experienced operators that will queue up a large buffer of inputs.
At a company years ago we had a GUI that worked "well enough" but the dirty secret was it just sent keystrokes to an older TUI program that the C-level had decided was "too old looking and slow".
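A wrapper like that is often just a translation layer: each GUI action compiles down to the keystroke sequence an operator would have typed into the legacy screen. A hypothetical sketch (the screen name, fields, and key codes here are all invented for illustration):

```python
# Hypothetical key codes for a legacy terminal session; real systems
# (3270, vt100 apps, etc.) each have their own conventions.
TAB, ENTER = "\t", "\n"

def lookup_customer_keys(name: str, region: str) -> str:
    """Compile one GUI form submission into the exact key sequence an
    operator would type on the (invented) TUI customer-lookup screen."""
    return (
        "CUST" + ENTER      # navigate to the customer screen
        + name + TAB        # first field: customer name
        + region + ENTER    # second field: region, then submit
    )

# The GUI handler would write this string to the TUI process's pty:
keys = lookup_customer_keys("ACME", "WEST")
print(repr(keys))  # 'CUST\nACME\tWEST\n'
```

The downside, of course, is that the wrapper is only as robust as the screen layout it was recorded against; one "too old looking" field reshuffle and every compiled sequence breaks.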
A TUI isn't that far from an actual API, after all.
It always warms my heart to see these classic TUIs on a screen at a place like an airline service desk or a store. (Same as the non-IP Nortel Norstar phone systems that are 25 years old but still work that are in most supermarkets and department stores)
1. GUIs were mostly pretty terrible back then. Yes, even on a Mac. You can't tell from screenshots, but GUIs were incredibly slow and clunky for daily use.
2. Clone PCs were much more affordable at a time when the cost of a computer was very high relative to now. When adjusting for inflation, a decent computer then would be $7,000-$8,000 in 2023 dollars.
Others have mentioned the ways that DOS was good enough or even better for some applications. One of the main strengths of a GUI wasn't being graphical; it was that every program had a similar interface. If your work just used one program, DOS wasn't bad.
But remember that the people who use computers aren't the people buying those computers. They're purchased by the boss, and the boss isn't going to spend a single dollar for a better user experience, much less the hundreds or thousands it would have cost. This still applies today to enterprise software.
> Can anyone explain the deep nostalgia and longing for old DOS era software, and in particular VGA text mode interfaces?
Nowadays, our computers have instant access to an uncountable amount of storage, information, and software through the Internet, and even local storage is enormous. Back then, the world was much smaller; the software and data you had in your small hard disk and boxes of floppy disks (and perhaps tape, etc.) was all that you had, and you didn't have a constantly online network to distract you with an unlimited fire hose of nonstop information.
That led to a closer relationship between you and the software on your computer. You had time to explore each and every corner of the software, and to read its manual (be it in the form of an online help or a paper book) from cover to cover. The software stayed the same; there was no automatic online update, no security scare blaming you for staying on a release that's more than a day old. You knew your software and hardware limitations, and adapted to them; they would stay the same until you bought more hardware, or new software (which came in colorful physical boxes, that you could put on a shelf).
We have gained much since, but lost some of that sense of wonder, of making the most of memory and storage measured in megabytes or even kilobytes, processors with speeds measured in megahertz or even kilohertz, and displays with few lines and columns of characters and little color.
You wrote it beautifully and if I can add - software then was a tool. Like a good old hammer. You only bought a new one when needed, or a different type for a new task. Today, software is a service.
Exactly this. I still use WordPerfect 4.2 (DOS) and DataEase 4.5 (DOS) on a reasonably frequent basis and am much more productive. Even went as far as knocking up a simple WP4-to-Markdown converter in Go to keep the process workable.
Incidentally I'm also into retro games (Spectrum, CPC, MAME, DOS) for pretty much the same reasons. In particular the memory/nostalgia is so strong because, as you put it, "the world was much smaller; the software and data you had ... was all that you had".
Those games took long enough to load and were expensive enough (for a kid) to acquire that when they finally launched I stuck with them and got value from them. Modern games are (tens of?) thousands of times larger yet they are lost in a sea of content.
There is definitely an element of pure nostalgia from growing up with those programs, but I think there is a bit more to it. For me it evokes memories and feelings of focus. Just look at all of today's IDEs and note applications that offer "distraction free" or "typewriter" modes echoing the simpler interfaces of old. And as another commenter mentioned, these apps were extremely capable and polished. So when I'd sit down to write my papers in high school, I'd open up WordPerfect and it alone would do everything I needed, and there was literally nothing else running, so I just sat down and wrote. I don't achieve that sort of single-task work style very often these days.
DOS, or more specifically MS-DOS, was the platform on which TUI (terminal user interface) applications ran. Because MS-DOS lacked universal drivers for printers and networking, as well as being single-tasking, it was up to vendors to provide the printer drivers, the network stack, and TSR (terminate-and-stay-resident) programs to allow switching between applications. For most people a computer was just a function of their job and not a career, so using a single-tasking program to edit a spreadsheet and sending it to someone on portable storage like a 3.5" floppy disk seemed normal. Most meetings were in person, and rarely did meetings happen over group calls. MS-DOS represents that world for the majority of people working with computers in the 80s. Also, there was smoking indoors.
> Can anyone explain the deep nostalgia and longing for old DOS era software, and in particular VGA text mode interfaces?
I can only speak for me, but perhaps others feel the same way. It is nostalgia for me because this is the first way I experienced a computer, when everything was new to me and anything was possible. Until we got a computer, the only window outside of the world of my city was a TV, but a computer ran software of all sorts, and games, and others knew how it worked and could teach me stuff. Then the Internet came and I could talk to people anywhere in the world! What a thing this was.
Now, I spend all day looking at screens. I get paid to make them work, and that's cool, but the wonder and magic of it all is gone, and I miss it. DOS and text apps just happened to be my first interface, and everyone longs for their first love.
On a slightly more tactical angle, I really like typing and my keyboard, and terminal apps make it so I don't have to move my hands to a mouse and change contexts. A small thing, but it adds up. Yes, some stuff is easier with a mouse, but it's always fun to not have to change between input devices for long stretches of time.
Two very good reasons. I agree with this but I’d also add:
When you use a computer every day, sometimes anything that is different feels novel. It might be the nostalgia of an older UI, or it could be lusting over something new but yet to be released.
It's a limited interface (80x25, 16 or 256 colors), and major companies spent millions of dollars over 20+ years perfecting their interfaces within those constraints. They were top tier and felt humanly understandable.
To illustrate: In a DOS or terminal interface, a date field usually expected a certain format, made that format obvious in the UI, and you'd type the date with probably 6 keystrokes (numbers) (and it'd probably jump the focus right to the next field if you needed to enter more). And likewise, good software would find a reasonable way to fit everything possible on one screen.
With 'modern' web-based business software, most of them implement a date field with a pop-up mousing widget. You may or may not be "allowed" to type at all, or it may not be obvious how to, and you might need 47 mouse clicks and 15 seconds to enter a DOB in the 1970s. And screens are usually not laid out carefully at all, so even with small amounts of info on the screen there may be plenty of whitespace, necessitating scrolling back and forth constantly.
A phone number field, likewise, may throw errors for using or not using the "expected" punctuation. Etc.
Stuff like that would be considered an absolute failure in the old text interfaces but it's the norm now that UIs are clunky and inefficient.
PS: None of this would be difficult in the web stack, it seems like the "UI/UX" designers (when they even exist) care exclusively about aesthetics.
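As a toy illustration of the fixed-format entry described above: six digit keystrokes, format made obvious by the field itself, no delimiters to type. The MMDDYY convention and the century pivot here are illustrative assumptions, not taken from any particular product:

```python
from datetime import date

def parse_mmddyy(keys: str) -> date:
    """Parse six digit 'keystrokes' in MMDDYY order into a date.

    Mimics an old fixed-format date field: exactly six digits, after
    which focus would auto-advance to the next field.
    """
    if len(keys) != 6 or not keys.isdigit():
        raise ValueError("expected exactly six digits, MMDDYY")
    mm, dd, yy = int(keys[:2]), int(keys[2:4]), int(keys[4:])
    # Simple two-digit-year pivot (an assumption for this sketch):
    # 30-99 -> 1900s, 00-29 -> 2000s.
    year = 1900 + yy if yy >= 30 else 2000 + yy
    return date(year, mm, dd)

print(parse_mmddyy("010575"))  # -> 1975-01-05
```

Six keystrokes, immediate validation, no widget. The web equivalent could do exactly this with a plain text input and a few lines of parsing.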
I AM AMAZED that the average Pascal program of the day compiled faster on my old 386 than the average program compiles today on the latest and greatest workstation. Also, don't get me started on cloud remote desktops. Go compare the immediacy of that garbage vs an old DOS computer in most applications.
About the UI... Dozens of flashing buttons, toasts and pings vs a Zen interface where you can, well, just focus on work.
> Can anyone explain the deep nostalgia and longing for old DOS era software, and in particular VGA text mode interfaces?
I’m not using any of those other old things you mention, but I love how I can run my full Emacs-configuration locally in a TTY, remotely over SSH or whatever with no loss of functionality.
Having that capability is IMO a strength, not a weakness, and I wish more software was like that.
I never used 1-2-3 in the old days (my first computer had Excel 95), but what is particularly attractive about 1-2-3 and similar software interfaces is that they can be controlled entirely by the keyboard and their input is buffered, so people can develop muscle memory and operate extremely fast, even faster than the feedback from the user interface (you don't have to look).
Since it has been ported to Linux, you can run it remotely over ssh, even on slow, high-latency connections, with no issue. It also opens and operates instantly on any computer you can find.
The nostalgia is mostly about seeing what was possible within limited hardware, and realizing we could do better with less.
Depends on the person. I too find myself looking at text interfaces for this or that every once in a while, but I always come back to the same few tools: Weechat (for Slack), w3m (the browser I'm currently replying in) and vim. More recently I've been experimenting with WordGrinder, which is very powerful, but I don't know if it's as safe as Google Docs with respect to auto-saving.
For me, I use these tools in my daily life because the normal internet is just too damn noisy and gamified and addicting. I'm big on the slow internet movement and browsing the internet in w3m is a huge boon to focusing on what really matters: the words.
Lotus 1-2-3 was the spreadsheet that sold 10 million PCs.
When you needed a quick what-if calculation, it was the go-to. Then on the Mac, someone came out with a spreadsheet as a desk accessory. FoxPro was a very simple database, and WordStar was the king of word processing. SideWays also had a Lotus 1-2-3 driver for the extended EGA modes (43x80, 43x132, and 50x132), which worked perfectly on a VGA card with a super crisp Sony 15" Trinitron, 0.25mm stripe pitch.
For me, you had direct access to the hardware, and that made writing software fun. In some cases you had to jump through hoops, which added to the excitement.
Plus, almost anyone could write programs; I saw some amazing programs created as COM files.
I assume the OP likes working in the terminal because they like working with a keyboard.
If that's the case, they could use the old Windows version of 1-2-3, which was optimized for terminal users by allowing all the old slash commands, but can use the full screen. They could even use an old Windows version of Excel, which also supported the Lotus slash commands, to help transition users over.
Random trivia: Excel still uses "/" to bring up menu. At some point the shortcuts for "Alt" and "/" converged to Excel specific shortcuts, but my fingers still use "/" for whatever reason.
This is fascinating, but at the same time I've got to ask: wouldn't it be easier for a person as experienced as Tavis Ormandy to simply write a console spreadsheet from scratch using a modern stack?
I added it to the Gentoo GURU overlay (still pending on the dev branch). It will fit along with DBF/libdbf as formerly common 90s software available to install.
EDIT: Oh wait, this is lotusdrv. I haven't added that yet. Added the previously discussed linux native version of lotus123r3.
I don’t remember Lotus 1-2-3 much, but WordPerfect 5.1 was glorious. I wonder if there is something similar to it today? (I’m happy with Markdown and Emacs, but it’s not the same.) Also, its “reveal codes” feature (or whatever it was called) made understanding HTML later on trivial.