Monday, March 19, 2018

Funny Support Stories

Over the course of my years in software, I encountered many technical support issues and bugs that were funny, either at the time or looking back at them later.  Here are a few of them.

Close The Door

In the early days of personal computers, it seemed that no matter how well you documented instructions for installation of your software, a significant number of very non-technical users needed you to hold their hands through installation.  I remember at Synex, one of the support technicians almost falling out of her chair with laughter. When she finally finished the call, she told us what had happened.

She had helped the customer find the floppy drive on their PC, and told them to take the floppy out of its paper sleeve, then slide it in with the logo on top and the open part going in first.  Then she told the customer, over the phone, to close the door. This of course meant the floppy drive door.

The customer said "just a second" and she could hear his office door closing in the background.  She almost died!

Where's The Any Key?

This is one of the old clichés of tech support, but I can assure you that it happened more than once. It was not uncommon for PC software to tell you to do something, then "press any key to continue".  More than one customer called us in a panic because no key on their keyboard was labelled "Any".

Corrupted 9-Track Tapes

Back in the day, telephones didn't have ring-tones, they actually rang. They rang by oscillating a magnetic field that made a little hammer whack two sides of a little hemispherical metal bell on the underside of the telephone.  On occasion, someone would place an office phone on top of a stack of 9-track tapes. The phone would ring, and the top tape would be very effectively erased. Depending on how strong the little magnet was, you might partially erase the next tape down, too.

Putting one of these phones on top of floppies had the same effect, as floppies, like tapes, were magnetic media.

Fridge Magnets and Floppy Drives Don't Mix

In a similar vein, if a user had trouble reading a floppy and it appeared to have been erased, it was often helpful to ask where the floppy had been before it got to the computer. On occasion, someone would have stuck it to a fridge with a fridge magnet ("I've never lost the fridge!").  This always worked nicely, as floppies were very thin and the magnet would hold one to the fridge quite securely.  Needless to say, the magnetic data was just as nicely erased in the process!

The Wrong Floppy Type

The old 5 1/4 inch floppies held limited data and were big and a bit fragile.  So with the IBM PS/2 computer (Personal System/2, not PlayStation 2), IBM came out with a new floppy, the 3 1/2 inch floppy. It held considerably more data than the 5 1/4 inch floppies, had a sturdier hard plastic case (it wasn't really "floppy" anymore), and was simply better technology.

We had a customer that we sent PK Harmony to on 5 1/4 inch floppy. When he couldn't fit it into his 3 1/2 inch drive, he tried folding it before calling us!

We sent him the right media and made a point of asking every customer what type of floppy drive they had before sending them media.

Providing Technical Support to GOD

At Liberty, we had bi-weekly support meetings to review how our tech support was doing in terms of meeting our SLAs for customers.  We had acronyms for most of our customers, and one of them was a company called Guaranteed Overnight Delivery.  Their acronym was GOD. This made for very interesting support meeting agendas and minutes!  It seems our technical support department was very helpful to the Almighty!

Mouse in the Agitator

I mentioned this story in a previous blog post from a few years ago, but I'll repeat it here.

At Liberty, we had a customer who had a D3 PICK system, and they were doing research for a cure for AIDS.  They did a lot of blood testing and had all the necessary equipment for doing it.

When ODBC was first developed, Microsoft added a query tool called MSQuery to their Excel spreadsheet product.  That query tool is still there, largely unmodified, and it does its job quite nicely. Unfortunately, MSQuery developed a bug that caused problems on just some computers. I believe that bug has long since been resolved, but at the time it drove us nuts!

This tool used the multi-threaded apartment model.  Even though it could pull data back on a thread that was distinct from the thread updating the UI, every time it wanted to notify the UI that some number of rows had been retrieved, so the running counter could update, it had to call into the UI thread.  That meant putting a blocking call onto the UI's message pump.  Usually this was not a big deal, but MSQuery would get into a state where the message pump stayed stuck until you provided some user input.  This was a problem not just for us, but, it seemed, for any ODBC driver.
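In modern terms, the plumbing looked something like this simplified Python model (purely illustrative; the real code was COM and C, and the "message pump" here is just a queue): a worker thread retrieves rows and posts progress notifications to the UI thread, which must keep draining them for anything to move.

```python
import queue
import threading

def fetch_rows(n_rows, ui_queue):
    """Worker thread: pull rows back, notifying the UI every 10 rows."""
    fetched = 0
    for _ in range(n_rows):
        fetched += 1                     # stand-in for retrieving one row
        if fetched % 10 == 0:
            # The cross-apartment call: a message the UI thread must
            # service.  If the pump is stuck, this queue never drains.
            ui_queue.put(("progress", fetched))
    ui_queue.put(("done", fetched))

def ui_message_pump(ui_queue):
    """UI thread: drain messages and update the running row counter."""
    counter = 0
    while True:
        kind, count = ui_queue.get()     # a stalled pump stops right here
        counter = count
        if kind == "done":
            return counter

ui_queue = queue.Queue()
worker = threading.Thread(target=fetch_rows, args=(25, ui_queue))
worker.start()
rows_seen = ui_message_pump(ui_queue)
worker.join()
print(rows_seen)                         # 25
```

When the pump stops being serviced, both the counter and (in MSQuery's case) the fetch itself stall, which is exactly the behaviour described above.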

This is how the bug manifested itself.  When you were doing a query, the little ODBC "World" icon would spin. When this problem happened, the world would stop spinning and nothing would happen until you moved the mouse or pressed a key on the keyboard.  Then it would start up again for a couple minutes before stalling once again.  Those stalls would stop both the UI updates and the query from retrieving data.

Our customer had run into this, and had a brilliant solution. He took his mouse, put it in a specimen agitator, and left it shaking until the query finished.  There was a constant stream of mouse movements that kept the query happening!  That was absolutely brilliant, out-of-the-box thinking on the part of our customer!

And that's one of the best things about my work over the years, getting to work with so many brilliant, kind, funny, good people!

Friday, March 16, 2018

Copy Protection Backfires

The Problem

Although there were lots of other personal computers, the IBM PC was the one that really took off for business use, and quickly surpassed all others, including Apple.  As the IBM PC and clones began to gain significant traction, software developers were faced with a dilemma.  The world of software was a bit of a wild west. Was software subject to patent law? The Patent and Trademark Office of the United States initially said "NO!"  Software was like a mathematical algorithm, they said, and as such was not subject to patent law.  Was it subject to copyright? If so, did changing the variable names and moving some subroutines around constitute a unique work?  Most software was delivered on 5 1/4 inch floppy disks. The YouTube clip below gives a demonstration of these floppies.

From day one developers of software were paranoid about someone stealing their software and reselling it.  Anyone with 2 floppy drives could make a full copy of one floppy to another.  It was easy to print out labels that would look like the original software. You could then resell potentially hundreds of copies of someone's software.  You'd get all the profit but the original authors got nothing. Because you didn't have the cost of developing or supporting the software, you could undercut them.

Copylock Protection

One solution to this was a product called Copylock.  They came up with a way to write to a track on the floppy, with special hardware, in such a way that drives could read the track, but because the track defied normal formatting, drives could not copy it.

Details of how Copylock worked are available here:

One of the most successful PC software companies, and one that used Copylock protection, was Lotus. Lotus 1-2-3 came on 5 1/4 inch floppies with Copylock protection.  It was not uncommon for users to encounter a problem with their installation and need to re-install Lotus. Copylock kept a count of how many times you had installed it, and it would only allow 3 installations before telling you that you had run out.  You could uninstall Lotus from a computer and get that installation back, but if your hard drive packed it in, or your 8-year-old had discovered the "del" command, you had lost an installation.

Prior to my joining them, Synex had sent a software product that they had developed to an American software publishing company, in a bid to convince them to resell it. The company asked to see the source code, ostensibly to see its quality, and Synex sent it to them. That company told them shortly thereafter that they weren't interested in the software. A month later that company released it under their own name.  Remember that this was the wild west. There was no precedent of anyone being successfully sued for copyright over software, so Synex didn't pursue it.

As a result, they were very paranoid about piracy.  For their PC Harmony products for  Business BASIC and Wang BASIC they enlisted Copylock to protect their software.

When I joined Synex as their development manager, I noticed that we had a significant number of support incidents around people running out of installs on their floppy disks. In addition to taking a lot of support time, we had the cost of mailing them another floppy disk. Then one day, one of our Lotus 1-2-3 disks ran out of installs. I phoned Lotus support, who sent me another floppy, but instead of being another 1-2-3 diskette, it was software to disable the copy protection. They had figured out that the cost of supporting Copylock was higher than the risk of someone making a copy of the disk!

Removing Protectionism

I tracked for a time how much of our software support was due to Copylock issues and we quickly decided that we would stop using Copylock.  Our support costs dropped and our sales stayed steady!

One thing we had learned was that our product was complex enough that most users would want support, at least for installing the host software, so there was no real value in using the Copylock protection. The complexity of our product coupled with our excellent customer support was protection enough!  Lesson learned!

Friday, March 2, 2018

I Got Hired by Synex Systems

After working with Synex as a contractor from DataSense / Escom, I switched jobs to Synex Systems to become their Development Manager for PK Harmony and the PC Harmony products.

Synex was an interesting company.  One of the key people at Synex was Chris Graham.  In addition to terminal emulators, they were writing add-ins for Lotus 1-2-3.  They had a couple of Compaq luggable computers.  Below is a video clip of someone demonstrating one.

Murray, my boss, told me about a trip he took. Back in the day the airlines didn't let you take a computer as carry-on, so he had to check them in. He was telling the person beside him in the window seat about his concern over whether his computers would be OK, when the guy looked out the window and told him, "I think your computers just fell 5 feet off the conveyor onto the tarmac!"  When he got them back to the office, he re-seated the boards in them and turned them on. One needed a new monitor and both cases were wrecked, but both ran just fine. Those Compaqs were truly rugged!

You could also get them with a built-in acoustic coupler, so you could put the phone receiver into it and dial up another computer.

Back to add-ins for Lotus 1-2-3.  You could only write them in Assembler, which was a pretty slow way to program.  There was a special piece of hardware called an Antron Probe that would mount over the CPU chip. It would track all instructions going through the CPU and could back-trace them for you.  Chris took one of the luggables home for the weekend and reverse-engineered every entry point into DOS that Lotus 1-2-3 called, creating back-traces in assembler.  He then figured out how to hook a C program in.  Productivity for Lotus add-ins was immediately improved!

Chris later went on to be Director of Interoperability for Microsoft, and was a key contributor to Windows 3.0. The Easter Egg that credited Bill Gates and the other key contributors listed Chris along with them.

I also recall seeing a complete IBM BIOS assembler listing when I was still consulting there.  The draw was irresistible!  I had to work there!

It didn't take long for me to get my first Mark Williams C compiler. I pored over the Kernighan and Ritchie book, and went through the PC Harmony C and Assembler code.  I was hooked!

This was a time when it was possible to have a grasp of what was happening from your software right down to the hardware, before computers turned into onions with many layers.  You could order a set of Intel developer manuals and they'd ship them to you for free.  The first manuals were about 3 or 4 volumes, but over time they've grown. The last hard-copy ones I ordered were about 6 volumes, and today you get them electronically: 10 volumes, though since they are electronic, they'll also let you download them as 4 massive volumes instead.  I've always felt that the ability to visualize what is happening at the different layers of a multi-layered, even distributed, application is key to being able to architect, design and troubleshoot it.

At one point, I got to use a Compaq luggable that we called the "brick".

In addition to DOS, it had this cool new graphical Microsoft program called Windows 286.  It would do CGA graphics on the monochrome amber monitor.  It could also dual-boot into IBM's equivalent to DOS called OS/2 which had its own graphical component called Presentation Manager.

It was a really cool time to be working with computer software, and I was having a blast!

Saturday, February 24, 2018

PK Harmony - Terminal Emulation for PICK, and more

The Idea

In another blog post I talked about how Synex Systems had taken what we did for Terry Winter and turned it into a terminal emulator that could capture data from Business BASIC and Wang BASIC systems.  The owners of that company decided to pursue creating a version that worked with PICK systems.  A deal was offered to me that included me getting my own IBM PC computer in return for doing this little project.  I accepted the deal with great excitement!

The initial concept was to copy what the PC Harmony product did. The idea was that the terminal emulator software would respond to a special escape sequence, that was not used by any terminal we knew of, to shift into data capture mode.  It would then screen-scrape the data into an ASCII file, and another tool would allow you to define a ruler on the columns and capture the data.

This had some limitations. If you had summary lines, they often didn't line up, and created problems.  Also, when data wrapped in a column, it created interesting problems.

PICK had a query language called English on Reality systems, and Access on most other PICK systems (it actually had a bunch of names, but those two were the most common).  It started with "LIST filename ..." or "SORT filename ...", and then you specified dictionaries for the columns you wanted on your report.  Dictionaries ranged from simple references to a field to complex computed values. There were no joins, but you could reach into another file, if you could find its key, and pull out fields.  It was a pretty rich language and was very commonly used.

The Solution

So, I wrote a PICK/BASIC program that mimicked the LIST/SORT commands but was called PICK-PC; instead of padding and wrapping data into a columnar format, it delimited it.  Because both the PC and the PICK system had problems with high-speed transfers, and because hardware and software interrupts on the PC could cause it to lose data, we had to create an error-correcting data transfer protocol for serial I/O.
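We never published that wire format, so the following is only a sketch of the general technique (a hedged Python illustration, not our actual protocol): frame each block with a sequence number, a length, and a checksum, and have the receiver reject anything that fails the check so the sender retransmits it.

```python
def make_block(seq, payload):
    """Frame a payload as: sequence byte, length byte, payload, checksum."""
    body = bytes([seq, len(payload)]) + payload
    checksum = sum(body) & 0xFF              # simple additive checksum
    return body + bytes([checksum])

def receive_block(block):
    """Return (seq, payload) if the checksum holds, else None (i.e. NAK)."""
    body, checksum = block[:-1], block[-1]
    if (sum(body) & 0xFF) != checksum:
        return None                          # corrupted: request a resend
    seq, length = body[0], body[1]
    return seq, body[2:2 + length]

blk = make_block(1, b"12345|ACME|2024-01-31")
print(receive_block(blk))

# Flip one bit "in transit": the receiver detects the damage and would
# send a NAK, prompting the sender to retransmit the same block.
damaged = bytearray(blk)
damaged[5] ^= 0x40
print(receive_block(bytes(damaged)))
```

The sequence number lets the receiver spot duplicated or missing blocks, which matters when an interrupt on the PC side silently drops a few characters mid-transfer.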

The first version didn't handle some of the more complex computed dictionaries, but you could use raw data and could use "translates" to pull data out of other files.  In a pinch, you could stage the data you wanted into another file and then PICK-PC the data over.  I worked with the PC developers to come up with an architecture and design that would be practical on the PICK side, but still leverage the power of the PC.

More Horsepower Needed

It didn't take too long to realize that I had a little problem. I did much of this development (to earn my new PC) at home. But at home I could either boot into my 5 MB PICK partition, or my 5 MB DOS partition.  It was impossible to test PICK pushing data at the DOS side of the PC, as only one could be active at a time.  I also quickly ran out of room on my 10MB drive, so the company gave me a second hard drive. A 20 MB Seagate drive.

After a while, the company bought two AT&T 6300 computers. These were clones equivalent to the IBM AT class machine, which was the next generation after the XT. The video below shows one of these booting up.

Now I had a PICK system and a DOS system that I could run at the same time.  Both computers were quite a bit faster than my XT class machine, too, so that was another benefit.

Additional Features

In addition, I designed and the team developed a number of additional features:

  • A scripting language that the PICK system could use to drive PC activity. You could do a number of things like change your current directory, create a directory, check for existence of a file, run a DOS command, program or batch file.  
  • A scripting language that you could run on the PC to drive activity on the PICK application.  
  • We allowed you to take data out of several formats and upload it into a file on your PICK system.
The full feature list of the production version is listed in the product brochure below:

My First Spectrum Show

We went to a Spectrum show in Las Vegas to show off our prototype and get market feedback.  That Spectrum show had a special room called "PC Labs", and we had a booth there.  The first time I was doing the demo, I showed a LIST statement displaying some data on a screen.  Then I said "Now here's something you haven't seen in PICK." and showed them the PICK-PC command.

It was completely unrehearsed, but that line became our "hook". Everyone for 3 or 4 booths away trotted over to see this "new thing they'd never seen on PICK"!

We pulled up the same data in Lotus 1-2-3.  We took another set of data and pushed it into a Wordstar mail merge.

Trivia Note: Wordstar was the first WYSIWYG word processor with mail merge capability.

We left that show with pre-orders for the product, and a list of interested beta customers!

What I Learned

For me, there were several things I learned:
  • I discovered PC programming and was intrigued at the possibilities.
  • I discovered the incredible rush of working with a brilliant team of creative geniuses to come up with something of value. Software is so much like magic. You start with an idea in your head and wind up with something valuable that people are willing to pay money for!
  • I discovered the excitement of working with sales and marketing teams to get something I helped author go to market!
  • I learned more about serial I/O and data communications.
Finally, although we were far from accomplishing our vision for the host scripting we had developed, that vision was in line with a recent trend called Robotic Process Automation.  Interesting to see it coming to fruition some 30 years later!

Note: For a long time, the computer systems used by most libraries in North America were PICK-based systems sold by Dynix. Synex had an annual contract to sell Dynix PK Harmony licenses in bulk to allow data transfers to other systems.  Our customers also included the largest McDonald's franchise in North America (in Florida), the London Underground system, and other companies, including Fortune 500 companies, that were using PICK systems in all industries and in countries around the world.  This was great exposure for me.

Sunday, February 18, 2018

Data Transfer from COBOL to REALITY and Back

First City Trust

While working for DataSense, one of the more active customers was First City Trust.  DataSense had software that was built for Canadian Trust companies, and First City was a big customer of theirs. I worked with a number of other Trust companies but First City was the one I spent the most time with.

First City had migrated their core deposit taking software to an IBM mainframe, but continued to run a number of business units on the Microdata Reality system.  This included mortgage banking (where they administered loans) and leasing. There may have been more but those two come to mind.

COBOL to Reality and Back

Because of Reality's convenient query and reporting language (called English on Reality, but other PICK systems called it Access), it was way faster to do reporting on the Reality system, so whenever management wanted a report from the mainframe, the quickest way was to get the data to the Reality system.

I wound up working with one of the First City IT people who knew COBOL to come up with a format that we could import into Reality. We had to do EBCDIC conversions and had to read the fixed width fields. This was a breeze compared with what I had had to do with the Terry Winter conversion, so I was always quick to create the imports for these. In some cases, I would create resulting data that would be EBCDIC converted and written to tape to go back to the IBM.
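In modern Python terms, that import boiled down to two steps: convert the EBCDIC characters (cp037 is one common EBCDIC code page; I don't recall which variant the mainframe used) and slice out the fixed-width fields. The field layout below is invented purely for illustration:

```python
# Invented layout, purely for illustration: account (8 chars),
# name (15 chars), balance (10 chars, right-justified).
FIELDS = [("account", 0, 8), ("name", 8, 23), ("balance", 23, 33)]

def parse_record(raw: bytes) -> dict:
    """Convert one EBCDIC fixed-width record into a dict of strings."""
    text = raw.decode("cp037")               # EBCDIC -> Unicode
    return {name: text[start:end].strip() for name, start, end in FIELDS}

# Build a sample record the way the mainframe side would have written it.
record = ("10045678" + "FIRST CITY".ljust(15) + "42".rjust(10)).encode("cp037")
print(parse_record(record))
```

Going the other direction, back to tape for the IBM, was the same thing in reverse: pad each field to its fixed width and encode the text back to EBCDIC.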

I wound up using the IBM terminals to track my time on the mainframe, and got used to using their messaging system.

Resetting Reports

First City had a printing system that was over 20 feet long.  They printed a ton of paper every day.  They also had to store a ton of reports. In an effort to reduce costs, they launched an initiative to figure out which reports were required, by whom, and whether other reports would provide the same information.

They managed to do some cleanup, but there was still a large number of reports, some 100 or more pages long, that they were unsure anyone needed.

So they decided to take anything that they couldn't find an owner for and simply stop printing it, and see who complained.

Answer: No one...

I think they saved a few hundred acres of Amazon rain forest by that one initiative, reduced their storage and printing costs, and took an unnecessary load off of their computers!

Saturday, February 10, 2018

Pranks and Viruses

Over the years, at many places that I worked, the programmers I worked with were a fun bunch of people.  We'd have really intense discussions about technology and about how best to solve different problems, but ultimately, while we worked hard, we played hard, too.  Even the first viruses were more in the order of a prank compared with the ones we have today.

Changing User Prompt

One of the first pranks I encountered started out unintentionally while we were reverse engineering the Reality operating system.  We discovered a lot of interesting things, including that the very first byte (byte 0) of every frame (a 512-byte block of memory; that's how Reality organized memory) was not check-summed against corruption, and that for one frame, it was actually used.  Both data and program space were organized into these 512-byte frames. The first 12 bytes were used for linking and state information.  The rest held data, or machine code for programs.

On PICK systems, the command line or shell mode was called Terminal Control Language (TCL).  The TCL prompt on Reality was a colon ":". For other PICK systems, they went with the greater than sign ">".

These first PICK systems were what we called "Native" systems, where PICK provided the full operating system. Later versions ran as a shell on top of Windows, Unix, Linux and other platforms.  Prime Information was the first one to run as a shell, on PrimeOS, but it was the rare one. On these native versions, the first, lowest-numbered frames were where the "operating system" and user assembly programs were typically loaded.  For most of them, frame 6 was the one that handled user terminal I/O.

On most PICK systems, byte 0 of frame 6 held the TCL prompt character.  If you had access to the system debugger, you could change that byte to any character that you wanted.

Be aware that while the TCL prompt character was ":", the BASIC debugger prompt was "$", and the system debugger prompt was "!".  These prompt characters would tell you that you were in different states.
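As a rough model of what made this possible (purely illustrative Python; Reality's real frames lived inside the OS, not in a list), picture memory as an array of 512-byte frames, with byte 0 of frame 6 holding the prompt character that the terminal handler displays:

```python
FRAME_SIZE = 512
memory = [bytearray(FRAME_SIZE) for _ in range(8)]   # a few toy frames

TCL_FRAME = 6                     # the frame handling user terminal I/O
memory[TCL_FRAME][0] = ord(":")   # byte 0: the normal Reality TCL prompt

def show_prompt():
    """What every logged-in user sees at TCL."""
    return chr(memory[TCL_FRAME][0])

print(show_prompt())              # ':'

# The prank: from the system debugger, poke byte 0 of frame 6.
memory[TCL_FRAME][0] = ord("!")   # now everyone sees the debugger prompt
print(show_prompt())              # '!'
```

Because that byte wasn't covered by the frame checksum, nothing flagged the change, and because every user's terminal handler read the same frame, one poke changed the prompt for everybody at once.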

When we discovered this, we changed it to another character to see what would happen. This was on a multi-user system, and we had other programmers doing work. As we changed it, we could hear swearing from the next room. We hadn't intended this as a prank, but it was too good to pass up! For fun, we changed it to the BASIC debugger prompt.  More swearing.  How about the system debugger prompt? (This was getting to be too much fun!)  The swearing was sounding worried.  Then we changed it to the character that cleared the terminal screen.  This was too much!  The programmer in the next room came running over to say that the computer was having serious problems, only to find us laughing uncontrollably!  He realized he'd been had!


On all PICK systems, there was a file that controlled the error messages.  While you could mess with any of them, by convention this was frowned on, with the exception of ERRMSG 3.  This ERRMSG simply printed the word "VERB?" and returned you to the TCL prompt.  It was the error message you got if you mistyped a TCL command or intentionally typed random garbage at TCL and pressed return.

The stored message record looked like this:

001 E VERB?
EOI 001

and produced this output:

[3] VERB?

It was generally considered OK to make changes to this ERRMSG.  For most customers, this would be a simple change to something like "WHAT???" or "SAY AGAIN?", but for a couple of our customers, they were always on the lookout for something really elaborate or different to put in it. I saw some that were like 12 lines of text, suggesting you present yourself for a Bozo award.  Finally, one customer changed it to "BUTTERFINGERS!". Unfortunately, there was a worker in a warehouse of theirs who had really big fingers, and was convinced that it was personal. That was the end of them changing that error message!

Water Detected in Drive A

In the early days of DOS, I remember finding this program called "Drain". It was actually an executable called "".  You'd run it, and it would create what looked like a DOS prompt "A:\>".  The moment you typed anything, it would send a beep to the system speaker, and display an error message:

"Water detected in drive A"

The cursor would flash for about 3 seconds, then it would display another message:

"Starting rinse cycle"

Now, a 5 1/4 inch floppy drive spinning made a sound a bit like a washing machine on a rinse or spin cycle, enough that you got the idea. It ran the rinse cycle for about 5 seconds, then printed out:

"Starting spin cycle"

This ran for another 5 seconds.

It would end with a message saying something to the effect of "Drive A is now dry - you may resume work." and it would drop to the DOS prompt.

My wife was volunteering as secretary at our church and she ran it, then called the pastor over, telling him something was wrong with the computer.  Great fun!

Airplane Pilot Attempts to Outrun Electrons

One story a colleague of mine tells is of a software package that they had developed for airplane maintenance. The computer also had a word processor called JET and pilots would often use this to write letters.  They were at the Boundary Bay airport and a pilot was using the system as a word processor to write up a letter.  In typical style, still emulated by software programs today, if you typed the keystrokes that would delete your document, the program would ask if you really wanted to delete the document and you could type "Y"es or "N"o, followed by Return.  If you typed "Y" followed by Return, the program would tell you that document "xxxxx" has been deleted.

The pilot did the wrong keystrokes, and had typed "Y" + Return, and got the message saying his document was deleted, before he realized what it had asked him.  The quick-thinking pilot reached over his terminal (they were quite big affairs - see below) and yanked the serial cable out, then ran like mad to the computer room where he pulled the serial cables out of the back of the computer. Then he walked over to my colleague, Jan, who was watching all this in disbelief from another room, and asked her if she thought he was fast enough to stop his document from deleting!


Coffee Machine is Access Protected

One of the error messages that the computer would occasionally display indicated that an operating system function had tried to access memory it was not permitted to access, or that was not properly initialized.  The message was to the effect that something "... IS ACCESS PROTECTED". It would then display the message "Abort @n.m", where "n" was a frame number and "m" was the offset within that frame; together they gave the address of the program instruction that encountered the error.  It would then drop you into the system debugger, with the exclamation prompt "!".

Encouraged by the antics of other programmers that I knew, when working at First City Trust, I decided to create my own program for locking out my terminal while I went for coffee or lunch. I created a program that displayed what looked like a blank screen with a TCL prompt ":".  The first two times you typed anything into it, it would print the "VERB?" output, giving people the idea that they either mis-typed, or something else was going on.  On the third attempt, it would print the message:

ABORT @12:00 noon

The exclamation prompt "!" made it look like the system debugger.  At any of these prompts you could type a special password and you'd drop out to real TCL.  If you typed anything else at that last "!", it would display messages to the effect that something dire was being done to the SYSPROG account, along with a steadily growing stream of dots.

(SYSPROG was like the root account for these PICK systems.)

Finally the dots would stop displaying and it would log you off.

Note: No real SYSPROG accounts were ever hurt in the running of this program...

April Fools Day Endless Reboot

When I worked at Synex Systems, one April 1st, one of the programmers pulled a prank on all the other programmers.  The night before, he stayed late and changed the autoexec.bat file on every programmer's computer to run a program that forced a cold boot just before it finished.

The effect of this was an endless reboot.  The first people arrived, started their computers and walked away. They'd come back with a coffee and the computer was still booting. OK, computers were slow at booting up (and many times still are), so they'd read a magazine for a few minutes. Finally, the realization dawned that this was taking too long, and that BIOS message they were looking at had already displayed twice...

You had to reboot off a floppy and edit that autoexec.bat file to fix it. That was a good April fools prank!

Stoned Virus

And then there was the stoned virus. It was one of the first real viruses, and very prevalent for a number of years.

When a computer boots up today, it looks for an operating system boot-loader in the "boot sector" of your hard drive. On the original IBM PCs and clones, the BIOS would start by checking drive A, then the fixed disk.  It was not uncommon for someone to load a program or data disk into drive A, forget it was there, and shut their machine down.  When they booted up the next time, they'd get an error message that the disk in drive A did not have an operating system on it.  You would remove the diskette and type Ctrl-Alt-Del to reboot.

The stoned virus was called a "boot sector virus". When you left an infected diskette in drive A, it would actually run a program that scanned your computer for diskettes and hard drives, and would add itself into the boot sector. Then it would display a message on your screen saying "This computer is now stoned." and it would print the standard error message saying that the diskette in drive A did not have an operating system.  You'd take the diskette out and reboot and all would look good. If you worked at Synex, you'd be convinced that one of those darned programmers put that "stoned" message on your computer as a prank.

Thereafter, every time you booted your computer, it would run this stub before loading the operating system. During boot-up, it would attempt to write itself into the boot sector of any diskette you had left in the drive, and after boot-up, the now-infected system would attempt to infect any diskette you put in.  Although I heard rumours that this virus would occasionally delete files at random, to my knowledge the only thing I ever saw it do, other than replicate itself, was occasionally display the "This computer is stoned" message during boot-up.  These were the days of relatively benign viruses.

I knew of a company that published 6,000 diskettes to go out along with copies of Aldus Pagemaker, only to be told by Aldus that all of these diskettes were infected with the stoned virus!

We got in the habit of write-protecting all our floppy diskettes that we sent out to prevent our media from being infected.

This practice came in handy on multiple occasions. One such occasion was when I was at a large New York law firm, and someone came to me to say that there was something wrong with my diskette. I looked and he had an error message about not being able to write to the diskette. I pointed out that he should never write to our diskette, then got him the program that checked for the stoned virus.  He ran it and it turned out that every computer in the IT group had this virus!

Virtual Hand-Brake

And finally, in the same vein as the aforementioned airplane pilot, I leave you with an innovation that a number of us came up with, but no one ever figured out how to implement: The virtual hand-brake!

Have you ever had one of those times when you realized that the computer was causing havoc faster than you could undo it?  You need a virtual hand-brake!

I could probably go on for hours, but it's been long enough for today!  I hope you enjoyed this little trip down memory lane!

Saturday, February 3, 2018

My First Computer!

The company made an offer to some of their employees, including me. There were a number of opportunities available, and if you agreed, on your own time, to create one of the suggested products, the company would buy an IBM PC XT, and you would over time become the owner.  I'll talk about the project I was offered in another post.

So, I got my first computer. It was the IBM PC XT. It had dual floppy drives and a 10 MB Shugart hard drive.  Terry Winter had a 10 MB hard drive that was physically larger than most modern desktop systems, so I was impressed with how small this one was, and convinced I wouldn't run out of space for a long time.  Was I ever wrong!

I partitioned off half the hard drive so I could boot into R83 from Pick Systems.  At that point, half my drive was unavailable for DOS stuff.

Then, I got hold of something called Revelation version G.2, and shortly thereafter, a bug-fix of G.2b.

This took up some of my space and I started creating applications with it. I had two applications that my church used for tracking finances and Sunday School attendance.

The Rev G.2b application I wrote for my church - it still runs under DOSBox!

The final blow was when I got a Mark Williams C compiler.  Suddenly, I had to manage my space, and back things up onto floppy (never just one, it was always a series of floppies.)

The problem with backups on floppy was that the likelihood of any one diskette having an error was fairly low, but when you multiplied that across 4 or 5 floppies, the likelihood went up. If you took one backup, you'd have a bad floppy. If you took two, you'd have a bad floppy in each. If you took three, all three would be fine!  Did I mention that floppies were slow? They were just slow enough to drive you crazy, and just fast enough that you couldn't really do anything else. And this was DOS. If you were copying to a floppy, you were not doing anything else!
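There is some simple arithmetic behind that old joke. If each diskette in a set has a small, independent chance of developing an error, the chance that the whole set is unreadable grows quickly with the number of diskettes. A rough sketch (the 2% figure is just an illustration, not a measured failure rate):

```python
def p_set_bad(p_disk: float, n_disks: int) -> float:
    """Probability that at least one diskette in a backup set is bad,
    assuming each disk fails independently with probability p_disk."""
    return 1.0 - (1.0 - p_disk) ** n_disks

# With a 2% per-diskette error rate, a one-disk backup fails 2% of the
# time, but a 5-diskette set fails almost 10% of the time.
```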

Still, it was exciting times, and I loved the amazing new technology!

We changed the company name.  It was Toga Computer Services ("To"ny and "Ga"ry were the founders - hence "Toga".)  Occasionally we'd get calls from people who thought we were a laundromat.  The new name was Datasense.

Sunday, January 28, 2018

Personal Computer Invasion!

I had the great pleasure of being a part of the personal computer invasion.  When I started in software, options for businesses were mainframes and mini-computers.  But there were people with a vision of computing for the masses, and they were prepared to make their vision a reality.

When I was still a teenager, my older brother saved up his money from his job and bought a Tandy Radio Shack TRS-80 computer. Below is a short clip of someone using one.

My brother's computer had no hard or floppy drives. You had to use a cassette tape drive to save programs, and to load the programs you wanted to run.  I can't recall if his had a built-in screen, but I do recall that he connected it to the TV.  There was a game where you had to shoot alien ships and recharge at your space station.  The one time I played it, I destroyed all but one alien ship, found my station, recharged, then destroyed my space station. It gave me a message about probably getting court-martialled. Then I destroyed the last alien ship.  I was the only ship left in the universe!

One of the first to really make waves was Apple.  My wife was working for Canarim Investments in downtown Vancouver, and the bottom of her building had a store where Apple Computer was showcasing its personal computer.  Those were exciting times, although the productivity gains simply weren't there in the first versions.

Here's a funny clip of some digital natives trying to figure out an old Apple computer.  It held the promise of good things, but fell a bit short on delivery:

Shortly after this, IBM came out with their IBM PC. There were several versions before the XT came out. Here's another video with someone looking at an old XT - it's a pretty long one, and the hard drive doesn't work in the end (not uncommon - the drives were always the first thing to go):

My first PC was an IBM PC XT - I'll talk about that in another post.

Soon there were lots of options! Most, like the TRS-80, were focused on games and consumers, and they were very expensive!

I remember going to computer shows, and at one of them, I ran into the Timex User Group of Vancouver (TUG).  This was the same company that made watches, but they'd delved into computers. This was the Heathkit of computers: you got to assemble it yourself and program it.  The TUG folks struck me as being like a bunch of former WWI biplane pilots playing with their toys. They had voice synthesis and other cool things that they did, none of which were commercial grade.  But then, these were called "personal" computers.

It was exciting to be involved in the early days of a new technology that held so much promise, and I got to watch it go from promise to reality!

Saturday, January 20, 2018

The Spin-off: PC Harmony

In my previous 3 blog posts I talked about a data migration that I participated in. I'm going to segue into what happened with the IBM PC software that they used for the data migration after the project was finished.

First, I'll take a minute to talk about computer terminals and personal computers.

Mini-Computers vs. Micro-Computers

Back in the day, business computers were either mainframe computers or a new class called mini-computers.  Really large companies could afford a mainframe, but smaller and mid-sized companies generally were using mini-computers. These computers had floppy, tape, or possibly one hard drive.  Occasionally, you'd have an array of hard drives, like what PBD had (another previous blog post.)

Business computers generally involved getting someone to write you a program, or buying a ready-made program from someone and customizing it to your business needs.  But a new thing, called a micro-computer, otherwise known as a personal computer, had come on the scene. One of the most powerful tools that business users were seeing on these things was something called a spreadsheet program.  Business users had always had spreadsheets, but these were on paper. This new innovation was electronic.  It was tremendously popular!  One of the best known ones, which was available on the IBM PC, was called Lotus 1-2-3. It came on a 360 KB 5 1/4 inch floppy disk.  Many PCs came with dual floppy drives, so you'd put your program disk in drive A and run the program, and you'd use drive B for your data disk.

Serial Terminals

Now, most business computers used serial terminals with keyboards and monochrome green monitors for data access.  (No, PCs did NOT have a mouse yet. Joysticks, yes; mouse, no!)  These terminals had the ability to go into what was called Slave Print mode. The computer would send a special escape sequence (a string of characters starting with an ASCII 27, otherwise known as ESC) which would tell the terminal to send output to its secondary serial port, which was usually an attached printer (referred to as a slave printer.)

A Product is Born!

The only problem was that these business computers had all the data, but the PC had the wonderful spreadsheet tool. How can you bring these two things together?

Now put that together with software that captured printer output in the previous blog posts, and an idea was born!  Synex Systems decided to come up with a software package that imitated one of several popular serial computer terminals like the ADDS Regent 40 terminal, but they added a new escape sequence, one that none of the real terminals used for anything. That code did the equivalent of a slave print, but it would capture the output to an ASCII file on the computer.
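The idea can be sketched in a few lines. The actual escape sequence PC Harmony used isn't documented here, so the codes below are stand-ins (styled after the ANSI "media copy" sequences); the point is how the emulator diverts host output into a file instead of to the screen:

```python
ESC = "\x1b"
START_CAPTURE = ESC + "[5i"   # hypothetical "begin capture" sequence
END_CAPTURE = ESC + "[4i"     # hypothetical "end capture" sequence

def process_stream(stream: str) -> tuple[str, str]:
    """Split a host output stream into (screen_text, captured_text).
    Anything between the start and end sequences goes to the capture
    file instead of the emulated terminal's screen."""
    screen, captured = [], []
    capturing = False
    i = 0
    while i < len(stream):
        if stream.startswith(START_CAPTURE, i):
            capturing = True
            i += len(START_CAPTURE)
        elif stream.startswith(END_CAPTURE, i):
            capturing = False
            i += len(END_CAPTURE)
        else:
            (captured if capturing else screen).append(stream[i])
            i += 1
    return "".join(screen), "".join(captured)
```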

Then, they had another tool that let you set a ruler on the output file, assuming that it was typical columnar report data, and it would import the data into a word processor mail-merge, a spreadsheet, an integrated suite like Framework, or a PC-based database like dBase.

They developed a version for MAI Basic Four first, as they had worked with one at Terry Winter. Then they went on to do versions for almost all the Business BASIC systems and for a Wang BASIC system.  This product family was called PC Harmony.  MAI licensed an OEM version called MAI PC-Link. Thoroughbred Business BASIC licensed an OEM version called Thoroughbred-Link.  Other Business BASIC vendors just pointed their customers at Synex.

The PC Harmony product included components that ran on the PC, and Business BASIC libraries that were made available on the mini-computer.

I wasn't involved in the development of PC Harmony, but was aware of it happening as I continued to work for Toga Computer Services.

Sunday, January 14, 2018

My First Data Migration - Terry Winter - Part 3

Part 1 is here

In the previous parts of this story, I talked about the problem to be solved, and how we did the migration of the raw data to the staging file. Now we'll talk about what needed to be done with that data.

The data we had was all in a single staging file.  We needed to extract fields and records and write them to the appropriate PICK files (think "tables" if you're from the relational world).  I had to write a program that would take each block of text, along with the control codes that said when to change print direction or issue a linefeed, and keep track of the relative offset. For backwards-printed text, I had to reverse the order of the letters, all while tracking relative position. I also set up a control file where we defined a "ruler" for the fixed-length fields in the print report from the BASIC Four machine.  I then extracted the data and, for each field, wrote it to the appropriate PICK file.
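In outline, the reconstruction looked something like the sketch below. The control bytes are placeholders (I'm not reproducing the actual Diablo codes here), and it ignores the column-padding detail, but it shows the core trick: characters received in reverse mode arrive last-first and have to be flipped back:

```python
# Placeholder control bytes standing in for the real printer codes
LF, FWD, REV = "\n", "\x06", "\x19"

def reconstruct(stream: str) -> list[str]:
    """Rebuild logical report lines from a bidirectional print stream."""
    lines, buf = [], []
    forward = True
    for ch in stream:
        if ch == FWD:
            forward = True
        elif ch == REV:
            forward = False
        elif ch == LF:
            lines.append("".join(buf))
            buf = []
        elif forward:
            buf.append(ch)
        else:
            buf.insert(0, ch)  # reverse-mode characters arrive last-first
    return lines
```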

Data Cleansing

Along the way, we ran into some data cleansing issues.

The first one was that spaces needed to be trimmed out, especially leading and trailing spaces.  But then things got very interesting.  

The system was used to track donations, but it didn't do arithmetic on them. As a result, there was no harm in using upper case "O" instead of a zero, or lower case "L" instead of a 1 (1, l - see they look very much the same!)  This meant that wherever there was a field that should have been numeric, you had to do some careful checking. We discovered that we could automatically replace a couple of these characters and recheck that it was numeric and that would get most of the cases.
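That replace-and-recheck step might look like this sketch in a modern language (the exact set of characters we repaired is from memory):

```python
def fix_numeric(field: str):
    """Repair common look-alike substitutions in a should-be-numeric
    field: letter O for zero and lowercase l for one. Returns the
    repaired digit string, or None if the field still isn't numeric."""
    repaired = field.strip().replace("O", "0").replace("o", "0").replace("l", "1")
    return repaired if repaired.isdigit() else None
```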

The other thing that had happened was that people would use the arrow keys.  Let's say you were spelling the word "Hello" but you hit the letter "p" (right beside "o") instead.  What you really should do is use the backspace key to erase the p, then type the o.  Instead, the staff would press the back-arrow key and type "o". This moved the cursor over the p and displayed the o in its place, so it looked right, but in the data you had these characters:  "Hellp.o", where the '.' was actually a low ASCII control character. With text this was not so bad, but when you did it with numbers, it created an interesting problem!
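Cleaning those up meant replaying the cursor movement to recover what the screen actually showed. A sketch, using a single stand-in control character for the back-arrow (the real code depended on the terminal):

```python
BACK = "\x08"  # stand-in for the terminal's cursor-left code

def apply_overstrikes(raw: str) -> str:
    """Replay cursor-left overstrikes: 'Hellp' + cursor-left + 'o'
    displayed as 'Hello', but the raw data kept the control character.
    This walks the stream with a cursor, overwriting as the screen did."""
    chars, pos = [], 0
    for ch in raw:
        if ch == BACK:
            pos = max(0, pos - 1)
        else:
            if pos < len(chars):
                chars[pos] = ch  # overwrite the character under the cursor
            else:
                chars.append(ch)
            pos += 1
    return "".join(chars)
```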

We also had to deal with dates.  The Microdata couldn't handle lower case dates. "14 Feb 2018" would confuse it, but "14 FEB 2018" was just fine.  The BASIC Four was taking dates as text, so it didn't care. It didn't have to do math, it just had to print it. So we had a cleanup around dates, including the back-arrow problem noted above.  If you spelled a date "14 FEV 20018" the BASIC Four didn't care. It would simply print it.  The Microdata was unable to convert it and gave you an empty string.

Also, from time to time, we had a corrupted record from the transfer into the Microdata. Despite all the delays we put in, sometimes the Microdata would lose a character or two. I would have to go in manually, figure out what the correct positioning was, put some placeholder data in, record the donor information, and we'd have to go back to the BASIC Four to make sure we updated the correct data manually.

Application Development

Finally, we had transferred all the data and were ready to develop the application in Data BASIC.  The first order of the day was two simple programs: one to capture information for a new donation, and one to print out a receipt that could be mailed to the donor.  We set up a test account that had a copy of the transferred data and started working in there.

Down the hall from their office, Terry Winter had a fairly narrow book room. They would send out books as an offer with donations of a certain amount or more.  The room had a shelf for boxes of books, and enough room for a table and a chair. That was where I worked. It also had a door to another small room that held the Microdata and the Printronix chain printer, as well as the air conditioning and power supply.  The cold air would seep through the door from the computer room to the book room, and I'd sometimes work with my jacket or a sweater on. From time to time, there would be a knock at the door and I'd help a shipper load a couple of boxes of books onto the shelves.  Still, it was a good-paying job in a recession, and I loved the opportunity to create a brand new application for a customer!

Agile Before Agile

This was long before Agile was a thing, but I made it a point of choosing a key set of features in consultation with the customer, developing code to the point where the user could see a prototype, then showing it to them and getting their feedback before proceeding.  The feature set to develop was always negotiated with the customer.  They were holding on to donation data, so as soon as the data entry program had all the essentials in it, they took it and started entering the data. The first version, having been rushed out the door, had some annoyances for the users that impeded productivity, so we focused on those for the next version.  The receipt printing program followed immediately on the heels of the first donation entry program version, as they needed to print and ship receipts; then we did the changes to deal with the annoyances. After that, we started building out new features and functionality.

When Agile methodology first came out, we had a bit of trouble understanding what the hype was.  We weren't full-on Agile by today's standards, but the concepts were baked into our DNA! I've had to deal with waterfall mode (it still has its place), and I can tell you which approach suits me better!

So that was my first data migration, and my first full application written from scratch!

A couple years later, I was still occasionally doing support for them.  Joan Winter called me up and told me about a bug they had encountered.  I told her what program to go to, roughly what line number, asked her to read me the code, told her what to change and got her to compile and catalog the program.  I fixed the bug over the phone from memory!  For me, software is like an old friend. I know it intimately, and can quickly pull it back up from memory.  OK, I am weird... I'll admit it!

While the news occasionally caught some big televangelists doing inappropriate things, Terry was the real deal.  He didn't drive a Rolls-Royce, he drove a station wagon. For quite a while it had plastic in a window because a thief had broken into it.  He wasn't about ego.  He insisted on the local churches funding his crusades and would only have one offering taken, on the last day of his crusades.  He was serious about reaching out to Canadians with the gospel. He was not as big as Billy Graham, but he had the same integrity.  In December 1998, Terry Winter passed away suddenly from an aneurysm.  I consider myself fortunate to have known him and his family!

Saturday, January 13, 2018

My First Data Migration - Terry Winter - Part 2

Click here to read part 1 of this blog.

The first day on the job, I was standing by the book room, which was just outside of the computer room. As noted in the previous post, the computer they were going to migrate to had 48 KB of RAM and a 10 MB hard drive, all fit into a chassis the size of a large fridge.

I watched my brother and his friend John carry the IBM PC in.  John had the PC, with the hard drive balanced on top, and my brother was carrying the monochrome monitor, keyboard and a power bar (no mouse, this was DOS, not Windows.)  This computer had more than 12 times the RAM and the same size hard drive, and they were carrying it in their arms!

To give you an idea what it would have looked like, here's a picture of my IBM PC XT. This one had the hard drive built in. In their case it was about half the size of the system unit, balanced on top.

They plugged it all in.  It needed one plug for the system unit, one for the hard drive (because it was external) and one for the monitor. When they turned it on, the hard drive sounded like an airplane motor starting.

The plan was to print a massive report of all the data to the Diablo, but instead of the Diablo, they would hook this into the IBM PC's serial I/O port.   This brought us to the first challenge:

The Diablo wanted the data to come to it at about 9600 baud (just under 1000 bytes per second).  The IBM PC's interrupt handler code for the serial port could barely handle 1200 baud on a good day. So they had to write an assembler routine to handle the serial I/O interrupts and replace the existing handler with this. This core routine was a critical piece of software for anyone wanting to do terminal emulation in future years on the IBM PC architecture, including all the clones that came out.

So they took the printer cable for the Diablo and had to rewire it a bit to connect to the IBM PC. On an RS-232 cable, pins 2 and 3 were used for send and receive, and you had to swap them at one end, or you'd have the equivalent of someone holding an old phone receiver upside down, listening at the microphone and talking into the earphone.

They wrote a program on the PC to capture the data and write it to the hard drive.  I believe it was in assembler, but it could have been in C.  Then they started with the surnames beginning with the letter "A" and printed the report off of that 8 inch floppy.  They repeated this until they got to the letter "Z".

Then they ran through the data and organized it. They found all the places where escape codes were used to change direction and processed them specially.  Finally, they were ready for the "forward" part of the "store and forward" operation.

This was even trickier.  The Microdata's serial I/O handler was not interrupt driven the same way as the IBM PC.  The program had to be at a BASIC INPUT statement before you could send it data. Otherwise it would just echo a BEL character (your terminal would beep!)  What's more, even in input mode, if you sent two characters too quickly, you would lose the second one.  A human could type too fast for it, let alone another computer. In later years they implemented a type-ahead buffer, but at the time we were doing this conversion it wasn't an option.  The IBM PC could out-type any human, so the output program had to have some special logic.

We would send a character, then wait for the other end to echo what we sent.  As soon as we saw the echo, we'd start a delay of n milliseconds. I believe they parameterized that delay so they wouldn't have to change the assembler program each time.  When they sent a carriage return, you waited for the carriage return and linefeed to echo, then you put in a really long delay (almost a whole second as I recall it.)  Then you could get on with the next character.  This part took several days to complete and had to be restarted from time to time, when characters would fail to echo and the PC would stall, or other problems were encountered.
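The send loop amounted to something like the sketch below. The EchoPort class is a toy stand-in for the serial line so the logic is self-contained, and the delay defaults are illustrative; as noted above, the real delays were parameterized and tuned:

```python
import time

class EchoPort:
    """Toy stand-in for the serial line: echoes every character back,
    and echoes a carriage return as CR+LF the way the Microdata did."""
    def __init__(self):
        self.sent = []
        self._echo = []

    def write(self, ch):
        self.sent.append(ch)
        self._echo.append(ch)
        if ch == "\r":
            self._echo.append("\n")

    def read(self):
        return self._echo.pop(0) if self._echo else ""

def send_slowly(port, data, char_delay=0.05, cr_delay=0.9):
    """Send one character, wait for its echo, then delay before the
    next. After a carriage return, also wait for the echoed linefeed
    and use the much longer delay."""
    for ch in data:
        port.write(ch)
        while port.read() != ch:
            pass  # spin until the echo comes back
        if ch == "\r":
            while port.read() != "\n":
                pass  # the CR echoes back as CR+LF; swallow the LF
            time.sleep(cr_delay)
        else:
            time.sleep(char_delay)
```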

Finally, the data was all captured into a staging file on the Microdata.

Next post I'll talk about how we got that data into the target files and the data cleansing we had to do.

Thursday, January 11, 2018

My First Data Migration - Terry Winter - Part 1

One of my early customers was a company called Terry Winter Christian Communication.  Terry Winter was a televangelist, similar to Billy Graham, who had a TV show in Canada and did crusades, focused on smaller Canadian cities.  When I first got to know him, his company, and his family, they were looking to replace the system they used for tracking donations and providing tax receipts and reports to what was then called Revenue Canada with something a bit newer and capable of better functionality.

Their system at the time was an MAI Basic Four system that did not have a hard drive, but used a bank of four 8 inch floppy drives for data. They organized the data so that each letter of the alphabet had its own diskette. They had recently run out of room on their "F" diskette, in part due to the large number of Mennonite donors across Canada, and the fact that the surname "Friesen" was very prevalent in that community, so they were now having to work through two floppies for the letter "F".

To print off receipts and reports, they used a serial printer called a Diablo. They referred to the MAI system as a Sol (can't find any references to it on the Internet) and this was a bit of a joke, as a Christian organization's computer Sol (soul) was connected to Diablo (Spanish for "devil").

Printing from the Basic Four to the Diablo was really interesting in that it would print a line of text, then send a line feed, then if the next line was longer you'd space out to where the last of the text would have been. Then you sent a code to tell the printer to print backwards and you'd send the next line of text in reverse. The printer would print it backwards to the start of the line! You'd send another line feed and a code to put the next line back into forward printing mode. This is important in the data migration stage.

The system they were going to go to was a Microdata Reality system with 48 K of RAM, 4 terminals, and a 10 MB hard disk. It used 9-track tape for backup and was the size of a refrigerator, but unlike the PBD system, the hard drive was inside the system unit. That 10 MB hard drive was as large as a big desktop computer is today.

We had 2 simple tasks:

1. Transfer the data to the new computer.
2. Design a system that let them take donations, and print receipts and Revenue Canada's annual reports.

After that we'd add additional functionality.

In order to do the transfer, they brought in the first ever IBM PC bought in the Vancouver area.  It was bought by Chris Graham of Synex Systems as an IBM PC 5150 with 256 KB of memory, the maximum it could hold at that time. It had 5 1/4 inch floppy drives and no hard disk.  As soon as the PC XT chip came out, he upgraded it and added a 10 MB external hard drive.  Some time later he bumped the memory to 640 KB.  This was the configuration that they used to do the data transfer.

The recession was in full swing by this time, so Toga came up with a deal where Terry Winter got me full time for just a bit more than what I cost, so I'd be paid and Toga would not be out of pocket for my salary.

Next post we'll talk about the data migration itself.

Sunday, January 7, 2018

Paranoia is a Life Skill! Backups are your Friend!

In some fields, paranoia is considered unhealthy, but when dealing with software and computers, paranoia is definitely a life skill worth having.  Here are a few of the things that helped reinforce this for me:

One of the people I used to work with had a saying: 
If you take just one backup, it will have errors and be unreadable. If you take just two backups, you will have errors on both that will make them unreadable. If you take three backups, all three will be good!
Experience tells me these words are true just often enough to be worth believing!

When testing a program that does a series of updates, make a copy of the file and test against that copy. Then verify results.  If possible, do all development work in an isolated backup account / directory / whatever...  Don't do it in production.

Before running a test of a program that does a big update, make sure you are in the right place.  This was a real conversation that I was party to:
User phones in:  We're getting data errors in xxxxx entry program!
Us, checking: Hold on a second... Huhhhhhh???!!! The master file is empty!
We put the phone down, walk over to the development manager's office.
Us: Hey {dev manager's name}, what are you doing right now?
{dev manager}: I'm setting up the test account.
Us: Did you just clear the master file?
{dev manager}: Yes, I just did, right now.  Why? Do you need the test account for something?
Us: Can you check what account you are in?
{dev manager}:  @&&&#@@@!!!!
Fortunately, we had more than three days' worth of backups, so all were good, and we got the data back. Unfortunately, staff had to re-enter the morning's data that had been entered since the backup was taken.

Microdata Reality systems had reports that would come out of the backup.  In those early days, we tried hard to train all our customers to check for a number at the end of the report: the number of "Group Format Errors" that the backup had encountered.  We trained them to call us in an immediate panic if that number was not zero, as it meant that their file system was corrupted!  There were stories of users who ignored these until their systems actually crashed. At that point, there were no backups that were fully usable. It was a mess!

As a result of all of this, I developed a healthy paranoia, and I got in the habit of hauling around a 9-track tape or two, and I would backup my own account. It didn't matter if the customer was doing a backup. I'd back my own account up.  I've had customers lose my account after I had done a week of work on it.  They thought they'd have to pay me to re-do the work, but I had my daily backup, so we only lost an hour's work.  The customer's respect was well earned!

I found a video of someone loading an old 9-track tape drive. It will start right where he loads the old one.  For you who have never had to do this, I could do it in my sleep!

Over time, the backups changed. You had 4mm, 8mm, and other tape formats. They were faster than 9-track, and held way more, so it was a good thing, but there were some downsides.  Not all 4mm or 8mm tape drives were compatible with each other. Probably the worst problem was that the report telling you of data corruption didn't exist in a lot of the newer operating systems that I worked with.  Paranoia had to take a new form.

Once a week, you would take a dummy record, somewhere near the end of your backup, rename it, and restore it from the backup tape, just to make sure the backup was good.  If you had done this for a while with consistent success, then you might drop back to once a month.
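The check itself boils down to "does the restored copy match the original, byte for byte". A minimal sketch of the comparison, using a modern checksum for the same idea:

```python
import hashlib

def restore_verified(original: bytes, restored: bytes) -> bool:
    """True if the canary record restored from the backup matches the
    live original byte-for-byte; a mismatch means the backup medium
    can't be trusted past that point."""
    return hashlib.sha256(original).digest() == hashlib.sha256(restored).digest()
```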

I no longer backup my systems to tape.  Some customers still do.

Nowadays, I have a product called Acronis.  It backs up my PC, my wife's PC, and my Mac, each to their own 2 TB USB drives every night.  I used to occasionally swap one of these drives and send it offsite to a family member, but that got to be a lot of work.  My PC had over 250 GB of data, and backup over the internet, when I had 1 Mbps DSL upload speed, was simply not practical. It would take weeks to upload. Recently I upgraded to Telus fibre optic.  Now, all 3 computers back up over the internet, once a week!  In an emergency I could restore over the internet in a couple of days.  I no longer have to get a family member to keep a backup offsite for me.

Every once in a while, I rename a file and restore from one of my backups, just to make sure it's all working!  Yes, Paranoia is definitely a useful life skill!

Note: Paranoia extends beyond backups - security, firewalls, password vaults, cloud solutions, and more.  Those will be for another day!

Friday, January 5, 2018

Recession Was a Good Teacher

When I graduated from high school in Mission, BC, I decided that I wanted to go into Mining, so I enrolled in Mining Engineering Technology at BCIT, and moved to Vancouver, rooming with my brother.  Mining at BCIT had never, since BCIT had started, had a year where they did not place their students in summer jobs by the winter break, so the choice looked like a good one.  Unfortunately for me, the year I entered BCIT was the year a big recession hit the mining industry in BC.

Mining was the number 1 employer in BC, so this was likely to have ripple effects, and it did.  I lost the part-time job that helped me pay my bills to go to college, and then, as I started going into debt to get into an industry that suddenly had massive unemployment, I tore some ligaments and the cartilage in my left knee.

I dropped out of BCIT after one term, returned home, and as soon as my knee healed, I got work with some local loggers.  That's another story; one for my SoTotallyBC blog.

After bouncing around looking for work, I got some with my brother, helping him with computer software, as noted in the two previous blog posts. After the PBD job, Toga offered me full-time employment, which I took.  They had me take an accounting course at BCIT, since I was generally helping customers with systems that did at least some accounting functions, or integrated with an accounting system.  And I began to get pretty good at programming in BASIC and PROC.

But as the year moved on, the recession's ripple effects began to have an effect on the company. Work slowed down, and people began to come up with ways to fill their time.  One of those ways was to start assigning learning projects.

One group began to reverse-assemble the Reality operating system.  I would watch as they did this, and learned lots of interesting tricks. I also learned about how a virtual machine worked.  A system engineer from the hardware vendor accidentally left their firmware manual behind. By the time they came by and picked it up, it had apparently fallen into the photocopier.  This gave us even more insight into how the system worked. I was a complete sponge, and absolutely loved it!

At the suggestion of Antoon and Gary, I started working on a reverse compiler.  The DataBASIC implementation on Reality, and PickBASIC on almost every other Pick system, compiled into P/Code. This P/Code, referred to as object code, was then executed by an interpreter.  On Reality there was a compile option (M) that would create a variable map record along with the object record.  The map record contained all the variable names, which allowed the BASIC debugger to show you variable names and contents.  I started by compiling a couple of very simple programs and hex-dumping the object code records. It didn't take long for me to build a table showing what the different P/Code instructions were.

Then I wrote the reverse compiler (in BASIC).  My reverse compiler would create a program that would recompile to identical object code, whether you had variables or not. If you had the variable map, you'd pretty well get your original program back.
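The hex-dump-and-table process can be sketched in miniature. The opcode values and mnemonics below are hypothetical stand-ins, not the real Reality P/Code instruction set; the point is the shape of table-driven disassembly, the first half of what a reverse compiler has to do:

```python
# Hypothetical single-byte opcode table, of the kind you'd build up by
# compiling tiny programs and comparing hex dumps of their object records.
# These byte values and names are illustrative only.
OPCODES = {
    0x01: "LOAD",    # push a variable onto the stack
    0x02: "STORE",   # pop the stack into a variable
    0x03: "ADD",
    0x04: "PRINT",
    0x05: "STOP",
}

def disassemble(object_code: bytes) -> list[str]:
    """Walk the object code one opcode at a time, emitting mnemonics.
    Opcodes with an operand (here: LOAD/STORE) consume the next byte
    as a variable slot number."""
    listing = []
    i = 0
    while i < len(object_code):
        op = object_code[i]
        name = OPCODES.get(op, f"UNKNOWN({op:#04x})")
        if name in ("LOAD", "STORE"):
            listing.append(f"{name} V{object_code[i + 1]}")
            i += 2
        else:
            listing.append(name)
            i += 1
    return listing

# A tiny "add two variables and print" sequence:
print(disassemble(bytes([0x01, 0x00, 0x01, 0x01, 0x03, 0x04, 0x05])))
# → ['LOAD V0', 'LOAD V1', 'ADD', 'PRINT', 'STOP']
```

With a variable map record, the `V0`/`V1` slots could be replaced by the original variable names, which is what made the recovered source nearly identical to the original.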

There were two things I couldn't do for you:

First, if you had comments, I'd put the comment marker in, but the actual text of the comment would be gone.

Second, if you had a set of GOTO commands that mimicked an IF/THEN/ELSE structure, I'd give you an IF/THEN/ELSE.  I had been trained in structured programming concepts, and to avoid GOTOs, so that was the obvious thing to do, but it was possible to use GOTOs in a manner that was indistinguishable from IF/THEN/ELSE. In a pinch you could also use GOTOs for loops and a few other constructs. When in doubt, I'd give you structured code!

The side effect of these two exercises is that I now understood what the operating system was doing, and I understood what a BASIC program actually did!

Some time after doing this, I was doing some work for a Toga customer called CJ Management.  While applying a change to one of their most heavily used data entry programs, I noticed some code that I knew was inefficient, and while I was in there, I replaced it with a more efficient approach.

The customer's key data entry people noticed the change right away, and I was asked to take a run through several other programs and apply some optimization!

While many of the systems I've worked on since then were quite different, I've always had this underlying need to understand, to the best of my ability, how the system worked, what made it efficient or inefficient.  This curiosity has been a key feature of my career, and has benefited both me and my customers!

I also had an understanding of how the security worked. This was before the internet and hackers, so people were not very security conscious!  I remember a financial institution I was working for, where the administrator lost the password for the SYSPROG account, the admin (or root) account on a Reality system.  They asked me to get them in, which I did in minutes.  That experience started me down the road to being conscious about security!

Don't get me wrong, recessions are awful, but we made the most of it, and for me, it was a learning experience I would undoubtedly never have had if there had not been a recession!

Next blog: Paranoia is a Life Skill

Thursday, January 4, 2018

Rescuing My First Customer

Many years ago, before I was married, when I was just 22, 1981 to be precise, Toga Computer Services had a customer called Pacific Brewers Distributors.  They've since been merged with other brewery distributors from other provinces into a company called Brewers Distributor Limited. But back then, they were just distributing beer for the three major BC breweries: Carling, Labatt, and Molson.

They had a computer system.  It was a Microdata 1600 (I believe - not 100% sure on the model) with 64 K of core memory and 4 Winchester disk drives that each had 50 MB capacity.  The disk drives looked like top loading washing machines. The computer was the size of a large refrigerator.  The really amazing thing was that their computer system, with only 64 K of core, ran 16 users.  If you do the math, you have 4 kilobytes of memory for each user.  It didn't really work like that. Each user used a lot more than 4 K. The system would page a user's state out to make room for another user to run.  Note: I have 128 Million kilobytes in my phone, and it runs 1 user (it can't even technically multi-task!)  This system ran a multi-valued operating system called Reality.  It was developed with Dick Pick's input, and was a variation of what was known as a Pick system.

Most of those 16 users took orders over the phone.  They would enter the order, which would be put into a phantom processing file.  Then a background process, called a phantom processor, would pick up the orders and process them.

Now, there was a problem with the data design. I'll spell it out as simply as I can:

First, Pick predated relational databases.  (Most business data at the time was stored via ISAM, an indexed-sequential access method rather than a true database.)  The idea of Pick was that if you had an invoice, a single record would hold all the header information, along with all the detail lines and options for the invoice.  One record, with multi-values for the detail lines, and sub-multi-values (also called sub-values) if the detail lines had multiple options.

This meant a single disk read would get you a small to moderate invoice into memory. A single write would write it out.  The BASIC extensions for handling all this were very easy to use, making the handling of an invoice by a programmer very easy.
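Here's a rough sketch, in Python, of how such a record packs all three levels into one string. The three delimiter characters are the classic Pick attribute, value, and sub-value marks; the invoice layout itself is a simplified illustration, not PBD's actual schema:

```python
# The classic Pick dynamic-array delimiters:
AM = chr(254)   # attribute mark: separates fields (attributes)
VM = chr(253)   # value mark: separates multi-values within an attribute
SVM = chr(252)  # sub-value mark: separates sub-values within a value

# One invoice as one record: header attributes first, then parallel
# multi-valued attributes for the detail lines; options per line are
# sub-values.  (Illustrative layout only.)
record = AM.join([
    "CUST1001",                           # attr 1: customer
    "1982-01-15",                         # attr 2: invoice date
    VM.join(["LAGER", "ALE"]),            # attr 3: product per line
    VM.join(["10", "4"]),                 # attr 4: quantity per line
    VM.join([SVM.join(["KEG", "COLD"]),   # attr 5: options per line
             "BOTTLE"]),
])

def extract(rec: str, attr: int, value: int = 0, subvalue: int = 0):
    """Dynamic-array extraction, 1-based like Pick BASIC's EXTRACT;
    0 means 'everything at this level'."""
    field = rec.split(AM)[attr - 1]
    if value == 0:
        return field
    val = field.split(VM)[value - 1]
    if subvalue == 0:
        return val
    return val.split(SVM)[subvalue - 1]

print(extract(record, 3, 2))     # product on detail line 2 → 'ALE'
print(extract(record, 5, 1, 2))  # 2nd option on detail line 1 → 'COLD'
```

One string read into memory, and the whole invoice, at every level, is addressable; that is what made a single disk read per invoice possible.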

Unfortunately, someone decided that they would track all orders for a particular brewer in a single record.  And they also had a consolidated record that tracked all orders for all brewers.  This meant that every order had to update two of these 4 records.

These records recorded, by date, all orders of all products for that brewer (or any brewer for the consolidated record) for all licensed premises or liquor stores in all of BC.  The records got very big.

The smallest one was about 16K, the consolidated one was bumping into the 32K limit that Reality imposed on records. Given that core memory was only double that, the restriction was pretty reasonable.

The other thing you might notice, if you are good at simple math, is that two of these records take up almost all of memory. But there's more!

If you add data to a record in the BASIC language, making it longer, there is a likelihood that it will be too big for the buffer the BASIC interpreter had originally allocated. At that point a new, bigger buffer gets allocated, and the data gets copied over to the new buffer along with the changes.  If you do that with the consolidated record, you have two copies of the record in memory and have now used up pretty well all of available core memory. Given some of that memory is used for other things, your working set cannot fit in memory at the same time.  And that's just the phantom processor. If any other users are trying to get work done, their state has probably been pushed out of memory.

Note that read/write times on these old drives were extremely slow by today's standards; there was no caching to speak of (not even track reads at first), and you read or wrote half a kilobyte (512 bytes) at a time.  So if you were reading a 30K record, you had to do 60 disk reads.  If the copy the BASIC processor was working with had to be written out to let another user do work, you got to read it all back in before you could do any more work on it.
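The frame arithmetic works out like this (the 512-byte transfer size is the only input, and it comes straight from the hardware described above):

```python
import math

FRAME = 512  # bytes per disk read/write on this hardware

def frames(record_bytes: int) -> int:
    """Number of disk transfers needed to move a record of this size."""
    return math.ceil(record_bytes / FRAME)

big_record = 30 * 1024           # the near-limit consolidated record
print(frames(big_record))        # 60 reads just to load it
print(2 * frames(big_record))    # 120 I/Os to write it out and read it
                                 # back in again after being paged out
```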

I won't go into fragmentation or any of the other problems this raises. The key thing is that the system got stuck reading and writing to disk; the industry term is that the system "thrashed". The other problem was that if you let the big record hit the 32K limit, it truncated, and you had data corruption that sometimes resulted in the phantom program crashing. Because it ran in the background, you might not realize it had crashed for quite some time.

The users would enter orders until 5:00 pm, then the phantom process would try to catch up.  If you hit the size limit on the big record, it would crash. On many mornings the order desk could not open at 9:00 as the phantom was not finished processing.

So, in comes Toga Computer Services, with me, laid off from Fraser Mills Plywood Mill, helping to write a conversion program and change order programs to handle a new data design.

The conversion program took the 3 levels of multi-values in each record and wrote them into 3 different files. We turned 4 records into about 600.  We also had to change the order processing programs to process records from the 3 files, both reads and writes.
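The shape of that conversion can be sketched like this, simplified to two of the three levels. The key formats and field layout here are invented for illustration, not PBD's actual design:

```python
# Pick dynamic-array delimiters:
AM, VM, SVM = chr(254), chr(253), chr(252)

def explode(brewer_id: str, record: str):
    """Explode one brewer's consolidated record (dates as multi-values,
    individual order amounts as sub-values under each date) into many
    small records: one header per brewer*date, one detail per order."""
    dates, orders_by_date = record.split(AM)
    headers = {}   # HEADER file: one small record per brewer*date
    details = {}   # DETAIL file: one record per individual order
    for dval, oval in zip(dates.split(VM), orders_by_date.split(VM)):
        orders = oval.split(SVM)
        headers[f"{brewer_id}*{dval}"] = str(len(orders))
        for i, amount in enumerate(orders, 1):
            details[f"{brewer_id}*{dval}*{i}"] = amount
    return headers, details

# One brewer, two dates, three orders total:
rec = VM.join(["1982-01-15", "1982-01-16"]) + AM + \
      VM.join([SVM.join(["100", "250"]), "75"])
h, d = explode("CARLING", rec)
print(h)  # → {'CARLING*1982-01-15': '2', 'CARLING*1982-01-16': '1'}
print(d["CARLING*1982-01-15*2"])  # → '250'
```

After a conversion like this, updating one order touches only the small records for that order's key, instead of rewriting one giant record that holds everything.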

We tested and retested, and finally we did the conversion, in January of 1982, as I recall it.

Instead of flushing all of main memory several times over for each order, the system generally touched less than 1K of memory per order. Instead of 60 reads or writes for the consolidated record, we were usually down to just 3.

I was still very rusty and needed a fair bit of help to get it right, but we finally got it good enough to do the conversion in production.

The first day on the new system we had to fix a few bugs, but the system performance was amazing: within less than a minute of the order desk closing, the phantom processor had caught up on all the orders!  The performance impact of those massive records had been dramatic, and so was the fix!

I learned a valuable lesson about data design, and came away with an appreciation of how data design, disk access, system memory management and other factors worked together to affect performance.  I also had the great pleasure of having the CEO and other executives of the company thanking us profusely for saving their system!

These were lessons that have stayed with me over the years!

Next post - Recession Was a Good Teacher...

Wednesday, January 3, 2018

My Journey Into Software

I don't do New Years resolutions. If there's something worth doing, I generally do it when I think of it. But having just published a children's Christmas story eBook and paperback, and as I'm wrapping up the marketing for it, I was thinking, what would I like to do next?

Then I ran into an old web archive of Ken North's ODBC Hall of Fame and the inspiration hit me!

I'd blog about some of the more interesting and sometimes amazing experiences I had in my journey as a software developer!  Here is the first post...

At the time I got into software, most universities, and the few colleges that had a computer department, really only trained you for academic work. The kinds of things most businesses were trying to do with computers simply weren't being taught in most colleges.  You actually got better business programmers out of technical schools like BCIT than out of the universities.

So, I was able to get in through the back door.  And what got me in?  Typing...

When I was in grade 10, I had room for an extra class.  A couple of my friends suggested typing, and I thought that would be cool.  Not to mention that the class had a lot of girls in it. At 15 years old, that was a bit of an attraction, as well, but I think I just liked the idea of being able to type. I always liked machines.

So I took typing. I don't think there was even one boy who could out-type the slowest girl, but we all did pass.

Roll forward several years, and I'm looking for work.  My brother Tony (Antoon) and his friend Gary had started a software company; by combining the first two letters of their names, they came up with Toga Computer Services. Toga had a job programming for the City of St. Albert in Alberta, over a 300 baud Datapac modem connection from Burnaby.  You got 300 baud on a good day.  When the line was bad, it dropped down to 110 baud (not sure why the odd number, but that's what it was!)

You took an old style dial phone, and put the receiver into the modem, and it squealed your data into it over a carrier signal.  This was called an acoustic coupler.  You could pick up the phone from the modem and if you hissed the right pitch into it, it would get confused and hang up.  I could out-type the modem at 300 baud, and 110 baud was annoyingly slow, but I was getting paid, and more than minimum wage, so I was quite happy!

Acoustic Coupler

For a chuckle, here's an old clip of someone using an acoustic coupler. You can see how slow it is, and at the end of the clip you can hear the carrier signal.

My brother would mark up program listings that he had printed off, and he would have me type the changes in, then compile them for him.  I'd gone with him the odd evening when I was attending BCIT for Mining Engineering Technology, and helped out a bit, but this was the first time he actually paid me.  It let him work on the next set of listings while I was typing over that annoyingly slow modem.

So that was my first software job. I had no idea how the software worked at first, but was intrigued, and started trying to learn.

At that time, Antoon decided to do some overnight training classes for me and some of his friends who were interested, and that, coupled with some books that we were told to read, began our journey into software.

Although I did not have formal college training in computers, I got to work with some truly brilliant people over the years, some of whom I'll refer to in future blog posts.

At this point, I was employed part time temporarily, and didn't get paid for the training, but I really enjoyed what I was learning, and it was better pay than unemployment insurance!

Next post will be about rescuing a local customer.