Sunday, January 28, 2018

Personal Computer Invasion!

I had the great pleasure of being a part of the personal computer invasion.  When I started in software, options for businesses were mainframes and mini-computers.  But there were people with a vision of computing for the masses, and they were prepared to make their vision a reality.

When I was still a teenager, my older brother saved up his money from his job and bought a Tandy Radio Shack TRS-80 computer. Below is a short clip of someone using one.



My brother's computer had no hard or floppy drives. You had to use a cassette tape drive to save programs, and to load the programs you wanted to run.  I can't recall if his had a built-in screen, but I do recall that he connected it to the TV.  There was a game where you had to shoot alien ships and recharge at your space station.  The one time I played it, I destroyed all but one alien ship, found my station, recharged, then destroyed my space station. It gave me a message about probably getting court-martialed. Then I destroyed the last alien ship.  I was the only ship left in the universe!

One of the first to really make waves was Apple.  My wife was working for Canarim Investments in downtown Vancouver, and at the bottom of her building was a store where Apple Computer was showcasing their personal computer.  It was an exciting time, although the productivity gains simply weren't there in the first versions.

Here's a funny clip of some digital natives trying to figure out an old Apple computer.  It held the promise of good things, but fell a bit short on delivery:



Shortly after this, IBM came out with their IBM PC. There were several versions before the XT came out. Here's another video of someone looking at an old XT - it's a pretty long one, and the hard drive doesn't work in the end (not uncommon - the drives were always the first thing to go):



My first PC was an IBM PC XT - I'll talk about that in another post.

Soon there were lots of options! Most, like the TRS-80, were focused on games and consumers, and they were very expensive!

I remember going to computer shows, and at one of them, I ran into the Timex User Group of Vancouver (TUG).  This was the same company that made watches, but they'd delved into computers. Theirs was the Heathkit of computers: you got to assemble it yourself and program it.  The TUG folks struck me as being like a bunch of former WWI biplane pilots playing with their toys. They had voice synthesis and other cool things that they did, none of which were commercial grade.  But then, these were called "personal" computers.



It was exciting to be involved in the early days of a new technology that held so much promise, and I got to watch it go from promise to reality!

Saturday, January 20, 2018

The Spin-off: PC Harmony

In my previous three blog posts I talked about a data migration that I participated in. I'm going to segue into what happened, after the project was finished, with the IBM PC software that was used for the data migration.

First, I'll take a minute to talk about computer terminals and personal computers.

Mini-Computers vs. Micro-Computers

Back in the day, business computers were either mainframe computers or a new class called mini-computers.  Really large companies could afford a mainframe, but smaller and mid-sized companies were generally using mini-computers. These computers had floppy drives, tape drives, or possibly one hard drive.  Occasionally, you'd have an array of hard drives, like what PBD had (see a previous blog post).

Business computers generally involved getting someone to write you a program, or buying a ready-made program from someone and customizing it to your business needs.  But a new thing called a micro-computer, otherwise known as a personal computer, had come on the scene. One of the most powerful tools that business users were seeing on these things was something called a spreadsheet program.  Business users had always had spreadsheets, but those were on paper. This new innovation was electronic.  It was tremendously popular!  One of the best known ones, which was available on the IBM PC, was called Lotus 1-2-3. It came on a 360 KB 5 1/4 inch floppy disk.  Many PCs came with dual floppy drives, so you'd put your program disk in drive A and run the program, and you'd use drive B for your data disk.

Serial Terminals

Now, most business computers used serial terminals with keyboards and monochrome green monitors for data access.  (No, they did NOT have a mouse on PCs yet. Joysticks, yes; mouse, no!)  These terminals had the ability to go into what was called Slave Print mode. The computer would send a special escape sequence (a string of characters starting with an ASCII 27, otherwise known as ESC) which would tell the terminal to send output to its secondary serial port, which usually had a printer attached (referred to as a slave printer).

A Product is Born!

The only problem was that these business computers had all the data, but the PC had the wonderful spreadsheet tool. How could you bring these two things together?

Now put that together with the printer-output capture software from the previous blog posts, and an idea was born!  Synex Systems decided to come up with a software package that imitated one of several popular serial computer terminals, like the ADDS Regent 40, but they added a new escape sequence - one that none of the real terminals used for anything. That code did the equivalent of a slave print, but it captured the output to an ASCII file on the PC.

Then, they had another tool that let you set a ruler on the output file, assuming that it was typical columnar report data, and it would import the data into a word processor mail merge, a spreadsheet, an integrated package like Framework, or a PC-based database like dBase.
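If you've never seen the ruler idea, here's a little C sketch of it. The column positions are invented, and this is just the flavor of the technique, not Synex's actual code:

```c
/* Minimal sketch of the "ruler" idea: slice a fixed-width report line
 * into fields at known column positions and emit delimited output.
 * Column positions are made-up examples, not PC Harmony's format. */
#include <stdio.h>
#include <string.h>

struct column { int start; int width; };   /* 0-based offset into the line */

/* Hypothetical ruler for a three-column report */
static const struct column ruler[] = { {0, 20}, {20, 10}, {30, 12} };
static const int ncols = sizeof ruler / sizeof ruler[0];

static void slice_and_print(const char *line)
{
    int len = (int)strlen(line);
    for (int c = 0; c < ncols; c++) {
        char field[64] = {0};
        int start = ruler[c].start;
        int width = ruler[c].width;
        if (start < len) {
            if (start + width > len) width = len - start;
            memcpy(field, line + start, width);
        }
        /* trim trailing blanks so "Smith     " imports as "Smith" */
        for (int i = (int)strlen(field) - 1; i >= 0 && field[i] == ' '; i--)
            field[i] = '\0';
        printf("%s%s", field, c + 1 < ncols ? "," : "\n");
    }
}

int main(void)
{
    char line[256];
    while (fgets(line, sizeof line, stdin)) {
        line[strcspn(line, "\r\n")] = '\0';
        slice_and_print(line);     /* one delimited row per report line */
    }
    return 0;
}
```

You'd feed the captured report file in on standard input and get delimited rows out the other side, ready to import into a spreadsheet.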

They developed a version for MAI Basic Four first, as they had worked with one at Terry Winter. Then they went on to do versions for almost all the Business BASIC systems and for a Wang BASIC system.  This product family was called PC Harmony.  MAI licensed an OEM version called MAI PC-Link. Thoroughbred Business BASIC licensed an OEM version called Thoroughbred-Link.  Other Business BASIC vendors just pointed their customers at Synex.

The PC Harmony product included components that ran on the PC, and Business BASIC libraries that were made available on the mini-computer.

I wasn't involved in the development of PC Harmony, but was aware of it happening as I continued to work for Toga Computer Services.

Sunday, January 14, 2018

My First Data Migration - Terry Winter - Part 3

Part 1 is here

In the previous parts of this story, I talked about the problem to be solved, and how we did the migration of the raw data to the staging file. Now we'll talk about what needed to be done with that data.

The data we had was all in a single staging file.  We needed to extract fields and records and write them to the appropriate PICK files (think "tables" if you're from the relational world).  I had to write a program that would take each block of text, and the control codes that said when to change print direction or issue a linefeed, and keep track of the relative offset. For backwards-printed text, I had to reverse the order of the letters, all while tracking relative position. I also set up a control file where we defined a "ruler" for the fixed-length fields in the print report from the BASIC Four machine.  I then extracted the data and wrote each field to the appropriate PICK file.
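To give you the flavor of the direction-change handling, here's a small C sketch. The control codes (0x12 for reverse, 0x14 for forward) are invented for illustration - the real codes were the BASIC Four's own escape sequences - but the offset tracking works the same way:

```c
/* Sketch: rebuild plain text from a bidirectional print stream.
 * 0x12 (reverse) and 0x14 (forward) are made-up control codes. */
#include <stdio.h>
#include <string.h>

#define WIDTH 132            /* classic line-printer width */
#define CTL_REVERSE 0x12
#define CTL_FORWARD 0x14

int main(void)
{
    char line[WIDTH];
    int col = 0, dir = 1;    /* dir: +1 forward, -1 backward */
    int c;

    memset(line, ' ', WIDTH);
    while ((c = getchar()) != EOF) {
        if (c == CTL_REVERSE)      dir = -1;
        else if (c == CTL_FORWARD) dir = 1;
        else if (c == '\n') {
            int end = WIDTH;       /* flush the assembled line */
            while (end > 0 && line[end - 1] == ' ') end--;
            printf("%.*s\n", end, line);
            memset(line, ' ', WIDTH);
        } else if (dir > 0 && col < WIDTH) {
            line[col++] = (char)c; /* forward: place, then advance */
        } else if (dir < 0 && col > 0) {
            line[--col] = (char)c; /* backward: step left, place   */
        }
    }
    return 0;
}
```

Forward text advances the carriage position; backward text is placed right-to-left as it arrives, so reversing the letters and tracking the relative offset happen in one motion.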

Data Cleansing


Along the way, we ran into some data cleansing issues.

The first one was that spaces needed to be trimmed, especially leading and trailing spaces.  But then things got very interesting.

The system was used to track donations, but it didn't do arithmetic on them. As a result, there was no harm in using an upper case "O" instead of a zero, or a lower case "l" instead of a 1 (1, l - see, they look very much the same!)  This meant that wherever there was a field that should have been numeric, you had to do some careful checking. We discovered that we could automatically replace a couple of these characters, recheck that the result was numeric, and that would catch most of the cases.
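Here's a C sketch of that repair, assuming just the O-for-zero and l-for-one swaps described above:

```c
/* Sketch of the numeric-field repair: swap letter-for-digit
 * look-alikes, then re-check that the field really is numeric. */
#include <ctype.h>
#include <stdio.h>
#include <string.h>

static int is_numeric(const char *s)
{
    if (*s == '\0') return 0;
    for (; *s; s++)
        if (!isdigit((unsigned char)*s)) return 0;
    return 1;
}

/* Returns 1 if the field is, or could be repaired to be, numeric. */
static int repair_numeric(char *field)
{
    if (is_numeric(field)) return 1;
    for (char *p = field; *p; p++) {
        if (*p == 'O' || *p == 'o') *p = '0';  /* letter O -> zero */
        else if (*p == 'l')         *p = '1';  /* lower case l -> one */
    }
    return is_numeric(field);
}

int main(void)
{
    char amount[] = "1O0l";     /* an amount typed with look-alikes */
    if (repair_numeric(amount))
        printf("repaired: %s\n", amount);   /* prints "repaired: 1001" */
    else
        printf("needs manual review: %s\n", amount);
    return 0;
}
```

Anything that still wasn't numeric after the swap got flagged for a human to look at.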

The other thing that had happened was that people would use the arrow keys.  Let's say you were typing the word "Hello" but you hit the letter "p" (right beside "o") instead.  What you really should do is use the backspace key to erase the p, then type the o. Instead, some staff would press the back-arrow key and type "o". This moved the cursor over the p and displayed the o in its place, so it looked right on the screen, but in the data you had these characters:  "Hellp.o", where the '.' was actually a low ASCII control character. With text this was not so bad, but when it happened with numbers, it created an interesting problem!
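Resolving those meant replaying the keystrokes the way the screen did. A C sketch, assuming the back-arrow came through as ASCII 8 (the actual control character may well have differed):

```c
/* Sketch of resolving cursor-movement keystrokes into what appeared
 * on screen: a back-arrow (assumed here to be ASCII 8) moves the
 * cursor left without deleting, and the next character overwrites. */
#include <stdio.h>
#include <string.h>

static void resolve(const char *raw, char *out, size_t outsize)
{
    size_t pos = 0, end = 0;
    memset(out, 0, outsize);
    for (; *raw; raw++) {
        if (*raw == '\b') {            /* cursor left, no deletion */
            if (pos > 0) pos--;
        } else if (pos < outsize - 1) {
            out[pos++] = *raw;         /* overwrite at the cursor  */
            if (pos > end) end = pos;
        }
    }
    out[end] = '\0';
}

int main(void)
{
    char fixed[32];
    resolve("Hellp\bo", fixed, sizeof fixed);
    printf("%s\n", fixed);             /* prints "Hello" */
    return 0;
}
```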

We also had to deal with dates.  The Microdata couldn't handle lower case dates: "14 Feb 2018" would confuse it, but "14 FEB 2018" was just fine.  The BASIC Four had taken dates as text, so it didn't care. It didn't have to do math on them; it just had to print them. So we had a cleanup pass around dates, including the back-arrow problem noted above.  If you spelled a date "14 FEV 20018", the BASIC Four didn't care - it would simply print it.  The Microdata was unable to convert it and gave you an empty string.
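The cleanup amounted to uppercasing the month and sanity-checking the pieces before handing the date to the Microdata's conversion. A C sketch of that idea (the "dd Mon yyyy" layout is an assumption for illustration):

```c
/* Sketch of the date cleanup: uppercase the month so the conversion
 * would accept it, and verify the month is real before using it. */
#include <ctype.h>
#include <stdio.h>
#include <string.h>

static const char *months[] = { "JAN","FEB","MAR","APR","MAY","JUN",
                                "JUL","AUG","SEP","OCT","NOV","DEC" };

/* Returns 1 if a "dd Mon yyyy" date could be normalized in place. */
static int normalize_date(char *date)
{
    int day, year;
    char mon[8];
    if (sscanf(date, "%d %7s %d", &day, mon, &year) != 3) return 0;
    for (char *p = mon; *p; p++) *p = (char)toupper((unsigned char)*p);
    for (int i = 0; i < 12; i++) {
        if (strcmp(mon, months[i]) == 0 && day >= 1 && day <= 31) {
            sprintf(date, "%d %s %d", day, mon, year);
            return 1;
        }
    }
    return 0;            /* "14 FEV 20018"-style junk: flag it */
}

int main(void)
{
    char good[20] = "14 Feb 2018", bad[20] = "14 FEV 2018";
    int ok1 = normalize_date(good);   /* becomes "14 FEB 2018" */
    int ok2 = normalize_date(bad);    /* rejected: FEV is not a month */
    printf("%s ok=%d\n", good, ok1);
    printf("%s ok=%d\n", bad, ok2);
    return 0;
}
```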

Also, from time to time, we had a corrupted record from the transfer into the Microdata. Despite all the delays we put in, sometimes the Microdata would lose a character or two. I would have to go in manually, figure out what the correct positioning was, put in some placeholder data, record the donor information, and then we'd go back to the BASIC Four to make sure we updated the correct data by hand.

Application Development


Finally, we had transferred all the data and were ready to develop the application in Data BASIC.  The first order of the day was two simple programs: one to capture information for a new donation, and one to print out a receipt that could be mailed to the donor.  We set up a test account that had a copy of the transferred data and started working in there.

Down the hall from their office, Terry Winter had a fairly narrow book room. They would send out books as an offer with donations of a certain amount or more.  The room had a shelf for boxes of books, and enough room for a table and a chair. That was where I worked. It also had a door to another small room that held the Microdata and the Printronix chain printer, as well as the air conditioning and power supply.  The cold air would seep through the door from the computer room to the book room, and I'd sometimes work with my jacket or a sweater on. From time to time, there would be a knock at the door and I'd help a shipper load a couple of boxes of books onto the shelves.  Still, it was a good-paying job in a recession, and I loved the opportunity to create a brand new application for a customer!

Agile Before Agile


This was long before Agile was a thing, but I made it a point to choose a key set of features in consultation with the customer, develop the code to the point where the user could see a prototype, then show it to them and get their feedback before proceeding.  The feature set to develop was always negotiated with the customer.  They were holding on to donation data, so as soon as the data entry program had all the essentials in it, they took it and started entering the data. The first version, having been rushed out the door, had some annoyances for the users that impeded productivity, so we focused on those for the next version.  The receipt printing program followed immediately on the heels of the first donation entry program version, as they needed to print and ship receipts; then we did the changes to deal with the annoyances. After that, we started building out new features and functionality.

When the Agile methodology first came out, we had a bit of trouble understanding what the hype was about.  We weren't full-on Agile by today's standards, but the concepts were baked into our DNA! I've had to deal with waterfall mode (it still has its place), and I can tell you which approach suits me better!

So that was my first data migration, and my first full application written from scratch!

A couple of years later, I was still occasionally doing support for them.  Joan Winter called me up and told me about a bug they had encountered.  I told her what program to go to, and roughly what line number, asked her to read me the code, told her what to change, and got her to compile and catalog the program.  I fixed the bug over the phone from memory!  For me, software is like an old friend: I know it intimately, and can quickly pull it back up from memory.  OK, I am weird... I'll admit it!

While the news occasionally caught some big televangelists doing inappropriate things, Terry was the real deal.  He didn't drive a Rolls Royce; he drove a station wagon. For quite a while it had plastic in a window because a thief had broken into it.  He wasn't about ego.  He insisted on the local churches funding his crusades and would only have one offering taken, on the last day of each crusade.  He was serious about reaching out to Canadians with the gospel. He was not as big as Billy Graham, but he had the same integrity.  In December 1998, Terry Winter passed away suddenly from an aneurysm.  I consider myself fortunate to have known him and his family!

Saturday, January 13, 2018

My First Data Migration - Terry Winter - Part 2

Click here to read part 1 of this blog.

The first day on the job, I was standing by the book room, which was just outside of the computer room. As noted in the previous post, the computer they were going to migrate to had 48 KB of RAM and a 10 MB hard drive, all fit into a chassis the size of a large fridge.

I watched my brother and his friend John carry the IBM PC in.  John had the PC, with the hard drive balanced on top, and my brother was carrying the monochrome monitor, keyboard and a power bar (no mouse, this was DOS, not Windows.)  This computer had more than 12 times the RAM and the same size hard drive, and they were carrying it in their arms!

To give you an idea what it would have looked like, here's a picture of my IBM PC XT. This one had the hard drive built in. In their case it was about half the size of the system unit, balanced on top.



They plugged it all in.  It needed one plug for the system unit, one for the hard drive (because it was external) and one for the monitor. When they turned it on, the hard drive sounded like an airplane motor starting.

The plan was to print a massive report of all the data to the Diablo, but instead of the Diablo, they would hook this into the IBM PC's serial I/O port.   This brought us to the first challenge:

The Diablo wanted the data to come to it at about 9600 baud (just under 1,000 bytes per second).  The IBM PC's interrupt handler code for the serial port could barely handle 1200 baud on a good day. So they had to write an assembler routine to handle the serial I/O interrupts and replace the existing handler with it. This kind of routine became a critical piece of software for anyone doing terminal emulation in future years on the IBM PC architecture, including all the clones that came out.
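The trick that lets such a handler keep up is to do almost nothing in the interrupt itself: stash the byte in a ring buffer and return, and let the main program drain the buffer at its leisure. The original was assembler; this little C sketch just shows the shape of the idea:

```c
/* The heart of an interrupt-driven serial handler: the ISR only
 * stashes the byte in a ring buffer, so it finishes well before the
 * next character arrives; the main program drains at its own pace. */
#include <stdio.h>

#define BUFSZ 256                     /* power of two for cheap wraparound */
static volatile unsigned char ring[BUFSZ];
static volatile unsigned head = 0;    /* written by the "ISR"   */
static volatile unsigned tail = 0;    /* read by the main loop  */

/* Called on each received-character interrupt. */
static void rx_isr(unsigned char c)
{
    unsigned next = (head + 1) & (BUFSZ - 1);
    if (next != tail) {               /* drop the byte if buffer is full */
        ring[head] = c;
        head = next;
    }
}

/* Main-loop side: returns -1 if nothing is waiting. */
static int rx_get(void)
{
    if (tail == head) return -1;
    int c = ring[tail];
    tail = (tail + 1) & (BUFSZ - 1);
    return c;
}

int main(void)
{
    /* Simulate a burst arriving faster than we consume it. */
    const char *burst = "REPORT LINE AT 9600 BAUD";
    for (const char *p = burst; *p; p++) rx_isr((unsigned char)*p);
    for (int c; (c = rx_get()) != -1; ) putchar(c);
    putchar('\n');
    return 0;
}
```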

So they took the printer cable for the Diablo and had to rewire it a bit to connect to the IBM PC. Pins 2 and 3 were used for send and receive, and you had to swap them at one of the ends, or you'd have the equivalent of someone holding an old phone receiver upside down, listening at the microphone and talking into the earphone.

They wrote a program on the PC to capture the data and write it to the hard drive.  I believe it was in assembler, but it could have been in C.  Then they started with the surnames beginning with the letter "A" and printed the report off of that 8 inch floppy. They repeated this until they got to the letter "Z".

Then they ran through the data and organized it. They found all the places where escape codes were used to change direction and processed them specially.  Finally, they were ready for the "forward" part of the "store and forward" operation.

This was even trickier.  The Microdata's serial I/O handling was not interrupt driven the way the IBM PC's was.  The program had to be sitting at a BASIC INPUT statement before you could send it data. Otherwise it would just echo a BEL character (your terminal would beep!)  What's more, even in input mode, if you sent two characters too quickly, you would lose the second one.  A human could type too fast for it, let alone another computer. In later years they implemented a type-ahead buffer, but at the time we were doing this conversion it wasn't an option.  The IBM PC could out-type any human, so the output program had to have some special logic.

We would send a character, then wait for the other end to echo what we sent.  As soon as we saw the echo, we'd start a delay of n milliseconds. I believe they parameterized that delay so they wouldn't have to change the assembler program each time.  When they sent a carriage return, you waited for the carriage return and linefeed to echo, then you put in a really long delay (almost a whole second, as I recall it).  Then you could get on with the next character.  This part took several days to complete and had to be restarted from time to time, when characters would fail to echo and the PC would stall, or other problems were encountered.
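In C, the pacing logic might have looked something like the sketch below. The serial routines here are simulated stubs so the sketch runs anywhere (the real ones were assembler talking to the UART), and the delay values are placeholders for the parameterized ones:

```c
/* Sketch of the echo-wait pacing logic. */
#include <stdio.h>

/* --- Simulated I/O so this runs anywhere; the real routines were ---
 * --- assembler, as described in the post.                        --- */
static int echo_queue[4], eq_head = 0, eq_tail = 0;
static void serial_send(unsigned char c)
{
    echo_queue[eq_head++ & 3] = c;                   /* far end echoes */
    if (c == '\r') echo_queue[eq_head++ & 3] = '\n'; /* ...and adds LF */
}
static int serial_recv(void)
{
    return eq_tail == eq_head ? -1 : echo_queue[eq_tail++ & 3];
}
static void msleep(int ms) { (void)ms; }  /* real code: timed wait */

#define CHAR_DELAY_MS 50   /* per-character settle time (parameterized) */
#define LINE_DELAY_MS 900  /* the "almost a whole second" after a CR    */

/* Send one character, wait for its echo, then pause before the next. */
static int send_paced(unsigned char c)
{
    serial_send(c);
    if (serial_recv() != c) return -1;        /* echo missing: stall    */
    if (c == '\r') {
        if (serial_recv() != '\n') return -1; /* expect LF after the CR */
        msleep(LINE_DELAY_MS);
    } else {
        msleep(CHAR_DELAY_MS);
    }
    return 0;
}

int main(void)
{
    const char *line = "STAGED DATA";
    for (const char *p = line; *p; p++)
        if (send_paced((unsigned char)*p) < 0) return 1;
    if (send_paced('\r') < 0) return 1;
    puts("line sent and fully echoed");
    return 0;
}
```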

Finally, the data was all captured into a staging file on the Microdata.

Next post I'll talk about how we got that data into the target files and the data cleansing we had to do.

Thursday, January 11, 2018

My First Data Migration - Terry Winter - Part 1

One of my early customers was a company called Terry Winter Christian Communication.  Terry Winter was a televangelist, similar to Billy Graham, who had a TV show in Canada and did crusades, focused on smaller Canadian cities.  When I first got to know him, his company, and his family, they were looking to replace the system they used for tracking donations and providing tax receipts and reports to what was then called Revenue Canada with something a bit newer and capable of better functionality.

Their system at the time was an MAI Basic Four system that did not have a hard drive, but used a bank of four 8 inch floppy drives for data. They organized the data with each letter of the alphabet on its own floppy. They had recently run out of room on their "F" floppy, in part due to the large number of Mennonite donors across Canada, and the fact that the surname "Friesen" was very prevalent in that community, so they were now having to work through two floppies for the letter "F".

To print off receipts and reports, they used a serial printer called a Diablo. They referred to the MAI system as a Sol (I can't find any references to it on the Internet), and this was a bit of a joke: a Christian organization's computer Sol (soul) was connected to a Diablo (Spanish for "devil").

Printing from the Basic Four to the Diablo was really interesting. It would print a line of text, then send a line feed; then, if the next line was longer, you'd space out to where the end of that line's text would be. Then you sent a code to tell the printer to print backwards, and you'd send the next line of text in reverse. The printer would print it backwards toward the start of the line! You'd send another line feed and a code to put the printer back into forward printing mode. This becomes important in the data migration stage.
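To make that concrete, here's a C sketch of the sending side. The control codes are the same invented ones I use in the decoding sketch in Part 3 above, not the Diablo's real escape sequences, and it assumes each backward line is at least as long as the one before it, per the description:

```c
/* Sketch of generating a bidirectional print stream. 0x12 (reverse)
 * and 0x14 (forward) are invented control codes for illustration. */
#include <stdio.h>
#include <string.h>

#define CTL_REVERSE 0x12
#define CTL_FORWARD 0x14

static void print_job(const char *lines[], int n)
{
    size_t col = 0;                          /* carriage position      */
    for (int i = 0; i < n; i++) {
        size_t len = strlen(lines[i]);
        if (i % 2 == 0) {                    /* even lines go forward  */
            putchar(CTL_FORWARD);
            fputs(lines[i], stdout);
            col += len;
        } else {                             /* odd lines go backward  */
            while (col < len) { putchar(' '); col++; } /* space out    */
            putchar(CTL_REVERSE);
            for (size_t j = len; j > 0; j--) /* send the text reversed */
                putchar(lines[i][j - 1]);
            col -= len;                      /* head is back at col 0  */
        }
        putchar('\n');                       /* linefeed between lines */
    }
}

int main(void)
{
    const char *report[] = { "DONOR: FRIESEN", "RECEIPT NO 1234" };
    print_job(report, 2);
    return 0;
}
```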

The system they were going to go to was a Microdata Reality system with 48 K of RAM, 4 terminals, and a 10 MB hard disk. It used 9-track tape for backup and was the size of a refrigerator, but unlike the PBD system, the hard drive was inside the system unit. That 10 MB hard drive was as large as a big desktop computer is today.

We had 2 simple tasks:

1. Transfer the data to the new computer.
2. Design a system that let them take donations, and print receipts and Revenue Canada's annual reports.

After that we'd add additional functionality.

In order to do the transfer, they brought in the first ever IBM PC bought in the Vancouver area.  It was bought by Chris Graham of Synex Systems as an IBM PC 5150 with 256 KB of memory, the maximum it could hold at that time. It had 5 1/4 inch floppy drives and no hard disk.  As soon as the PC XT came out, he upgraded it and added a 10 MB external hard drive.  Some time later he upgraded it again to bump the memory to 640 KB.  This was the configuration that they used to do the data transfer.

The recession was in full swing by this time, so Toga came up with a deal where Terry Winter got me full time for just a bit more than what I cost, so I'd be paid and Toga would not be out of pocket for my salary.

Next post we'll talk about the data migration itself.

Sunday, January 7, 2018

Paranoia is a Life Skill! Backups are your Friend!

In some fields, paranoia is considered unhealthy, but when dealing with software and computers, paranoia is definitely a life skill worth having.  Here are a few of the things that helped reinforce this for me:

One of the people I used to work with had a saying: 
If you take just one backup, it will have errors and be unreadable. If you take just two backups, you will have errors on both that will make them unreadable. If you take three backups, all three will be good!
Experience tells me these words are true just often enough to be worth believing!

When testing a program that does a series of updates, make a copy of the file and test against that copy. Then verify results.  If possible, do all development work in an isolated backup account / directory / whatever...  Don't do it in production.

Before running a test of a program that does a big update, make sure you are in the right place.  This was a real conversation that I was party to:
User phones in:  We're getting data errors in xxxxx entry program!
Us, checking: Hold on a second... Huhhhhhh???!!! The master file is empty!
We put the phone down, walk over to the development manager's office.
Us: Hey {dev manager's name}, what are you doing right now?
{dev manager}: I'm setting up the test account.
Us: Did you just clear the master file?
{dev manager}: Yes, I just did, right now.  Why? Do you need the test account for something?
Us: Can you check what account you are in?
{dev manager}:  @&&&#@@@!!!!
Fortunately, we had more than three days' worth of backups, so all were good, and we got the data back. Unfortunately, staff had to re-enter the morning's data that had been entered since the backup was taken.

Microdata Reality systems had reports that would come out of the backup.  In those early days, we tried hard to train all our customers to check for a number at the end of the report. That was the number of "Group Format Errors" that the backup had encountered.  We trained them to call us in an immediate panic if that number was not zero, as it meant that their file system was corrupted!  There were stories of users who ignored these until their systems actually crashed. At that point, there were no backups that were fully usable. It was a mess!

As a result of all of this, I developed a healthy paranoia, and I got in the habit of hauling around a 9-track tape or two, and I would backup my own account. It didn't matter if the customer was doing a backup. I'd back my own account up.  I've had customers lose my account after I had done a week of work on it.  They thought they'd have to pay me to re-do the work, but I had my daily backup, so we only lost an hour's work.  The customer's respect was well earned!

I found a video of someone loading an old 9-track tape drive. It will start right where he loads the old one.  For those of you who have never had to do this: I could do it in my sleep!

Over time, the backups changed. You had 4mm, 8mm, and other tape formats. They were faster than 9-track, and held way more, so it was a good thing, but there were some downsides.  Not all 4mm or 8mm tape drives were compatible with each other. Probably the worst problem was that the report telling you of data corruption didn't exist in a lot of the newer operating systems that I worked with.  Paranoia had to take a new form.

Once a week, you would take a dummy record somewhere near the end of your backup, rename it, and restore it from the backup tape, just to make sure the backup was good.  If you had done this for a while with consistent success, then you might drop back to once a month.
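The spirit of that check, in C: restore the renamed copy, then compare it byte-for-byte against the live original. (On the old systems this was done with the native tools, not a C program; this is just an illustration.)

```c
/* Verify a backup by comparing the restored copy to the original. */
#include <stdio.h>

static int files_match(const char *a, const char *b)
{
    FILE *fa = fopen(a, "rb"), *fb = fopen(b, "rb");
    int ca, cb, match = 0;
    if (fa && fb) {
        do {
            ca = fgetc(fa);
            cb = fgetc(fb);
        } while (ca == cb && ca != EOF);
        match = (ca == cb);   /* both hit EOF together: identical */
    }
    if (fa) fclose(fa);
    if (fb) fclose(fb);
    return match;
}

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s original restored\n", argv[0]);
        return 2;
    }
    if (files_match(argv[1], argv[2])) {
        puts("backup verified: restored copy matches");
        return 0;
    }
    puts("MISMATCH: do not trust this backup!");
    return 1;
}
```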

I no longer back up my systems to tape.  Some customers still do.

Nowadays, I have a product called Acronis.  It backs up my PC, my wife's PC, and my Mac, each to its own 2 TB USB drive, every night.  I used to occasionally swap one of these drives and send it offsite to a family member, but that got to be a lot of work.  My PC had over 250 GB of data, and backup over the internet, when I had 1 Mbps DSL upload speed, was simply not practical. It would take weeks to upload. Recently I upgraded to Telus fibre optic.  Now, all 3 computers back up over the internet, once a week!  In an emergency I could restore over the internet in a couple of days.  I no longer have to get a family member to keep a backup offsite for me.

Every once in a while, I rename a file and restore it from one of my backups, just to make sure it's all working!  Yes, paranoia is definitely a useful life skill!

Note: Paranoia extends beyond backups - security, firewalls, password vaults, cloud solutions, and more.  Those will be for another day!

Friday, January 5, 2018

Recession Was a Good Teacher

When I graduated from high school in Mission, BC, I decided that I wanted to go into mining, so I enrolled in Mining Engineering Technology at BCIT and moved to Vancouver, rooming with my brother.  Since BCIT had started, Mining had never had a year where they did not place their students in summer jobs by the winter break, so the choice looked like a good one.  Unfortunately for me, the year I entered BCIT was the year a big recession hit the mining industry in BC.

Mining was the number 1 employer in BC, so this was likely to have ripple effects, and it did.  I lost the part-time job that helped me pay my bills to go to college, and then, as I started going into debt to get into an industry that suddenly had massive unemployment, I tore some ligaments and the cartilage in my left knee.

I dropped out of BCIT after one term, returned home, and as soon as my knee healed, I got work with some local loggers.  That's another story; one for my SoTotallyBC blog.

After bouncing around looking for work, I got some with my brother, helping him with computer software, as noted in the two previous blog posts. After the PBD job, Toga offered me full-time employment, which I took.  They had me take an accounting course at BCIT, since I was generally helping customers with systems that did at least some accounting functions, or integrated with an accounting system.  And I began to get pretty good at programming in BASIC and PROC.

But as the year moved on, the recession's ripple effects began to have an effect on the company. Work slowed down, and people began to come up with ways to fill their time.  One of those ways was to start assigning learning projects.

One group began to reverse-assemble the Reality operating system.  I would watch as they did this, and learned lots of interesting tricks. I also learned about how a virtual machine worked.  A system engineer from the hardware vendor accidentally left their firmware manual behind. By the time they came by and picked it up, it had apparently fallen into the photocopier.  This gave us even more insight into how the system worked. I was a complete sponge, and absolutely loved it!

At the suggestion of Antoon and Gary, I started working on a reverse compiler.  The DataBASIC implementation on Reality, and PickBASIC on almost every Pick system, compiled into P/Code. This P/Code, referred to as object code, was then executed by an interpreter.  On Reality there was a compile option (M) that would create a variable map record along with the object record.  The map record contained all the variable names, which allowed the BASIC debugger to show you variable names and contents.  I started by compiling a couple of very simple programs and hex-dumping the object code record. It didn't take long for me to create a table that showed what the different P/Code instructions were.

Then I wrote the reverse compiler (in BASIC).  My reverse compiler would create a program that would recompile to identical object code, whether you had the variable map or not. If you had the variable map, you'd pretty well get your original program back.
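The heart of that kind of tool is a table mapping each instruction to a name and an operand count. Here's a C sketch of the table-driven lookup; the opcodes below are invented examples, NOT Reality's actual P/Code:

```c
/* Sketch of the table-driven approach: walk the object record and
 * look each opcode up in a table built by compiling tiny programs. */
#include <stdio.h>

struct op { unsigned char code; const char *name; int operand_bytes; };

static const struct op optable[] = {
    { 0x01, "LOAD.VAR",   1 },  /* hypothetical: push variable N */
    { 0x02, "LOAD.CONST", 1 },  /* hypothetical: push constant N */
    { 0x03, "ADD",        0 },
    { 0x04, "STORE.VAR",  1 },
    { 0x05, "STOP",       0 },
};

static const struct op *lookup(unsigned char c)
{
    for (size_t i = 0; i < sizeof optable / sizeof optable[0]; i++)
        if (optable[i].code == c) return &optable[i];
    return NULL;
}

int main(void)
{
    /* invented object code for something like: C = A + B ; STOP */
    unsigned char obj[] = { 0x01, 0x00, 0x01, 0x01, 0x03, 0x04, 0x02, 0x05 };
    for (size_t i = 0; i < sizeof obj; ) {
        const struct op *o = lookup(obj[i]);
        printf("%04zx: %02x ", i, obj[i]);
        if (!o) { printf("???\n"); i++; continue; }
        printf("%-11s", o->name);
        for (int j = 1; j <= o->operand_bytes; j++)
            printf(" %02x", obj[i + j]);
        printf("\n");
        i += 1 + o->operand_bytes;
    }
    return 0;
}
```

Build the table by compiling tiny known programs and hex-dumping the results, and the rest is bookkeeping.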

There were two things I couldn't do for you:

First, if you had comments, I'd put the comment marker in, but the actual text of the comment would be gone.

Second, if you had a set of GOTO commands that mimicked an IF/THEN/ELSE structure, I'd give you an IF/THEN/ELSE.  I had been trained to use structured programming concepts and to avoid GOTOs, so that was the obvious thing to do, but it was possible to use GOTOs in a manner that was indistinguishable from IF/THEN/ELSE. In a pinch you could also use GOTOs for loops and a few other constructs. If in doubt, I'd give you structured code!

The side effect of these two exercises was that I now understood what the operating system was doing, and I understood what a BASIC program actually did!

Some time after doing this, I was doing some work for a Toga customer called CJ Management.  While applying a change to one of their most heavily used data entry programs, I noticed some code that I knew was inefficient, and while in there, I replaced it with a more efficient approach.

The customer's key data entry people noticed the change right away, and I was asked to take a run through several other programs and apply some optimization!

While many of the systems I've worked on since then were quite different, I've always had this underlying need to understand, to the best of my ability, how the system worked, what made it efficient or inefficient.  This curiosity has been a key feature of my career, and has benefited both me and my customers!

I also had an understanding of how the security worked. This was before the internet and hackers, so people were not very security conscious!  I remember a financial institution I was working for, where the administrator lost the password for the SYSPROG account. This was the admin, or root account on a Reality system.  They asked me to get them in, which I did in minutes.  This knowledge started me on the way to having a consciousness about security!

Don't get me wrong, recessions are awful, but we made the most of it, and for me, it was a learning experience I would undoubtedly never have had if there had not been a recession!

Next blog: Paranoia is a Life Skill

Thursday, January 4, 2018

Rescuing My First Customer

Many years ago, before I was married, when I was just 22, 1981 to be precise, Toga Computer Services had a customer called Pacific Brewers Distributors.  They've since been merged with other brewery distributors from other provinces into a company called Brewers Distributor Limited. But back then, they were just distributing beer for the three major BC breweries: Carling, Labatt, and Molson.

They had a computer system.  It was a Microdata 1600 (I believe - not 100% sure on the model) with 64 K of core memory and four Winchester disk drives that each had 50 MB capacity.  The disk drives looked like top-loading washing machines. The computer was the size of a large refrigerator.  The really amazing thing was that their computer system, with only 64 K of core, ran 16 users.  If you do the math, that's 4 kilobytes of memory for each user.  It didn't really work like that: each user used a lot more than 4 K. The system would page a user's state out to make room for another user to run.  (Note: I have 128 million kilobytes in my phone, and it runs one user - it can't even, technically, multi-task!)  This system ran a multi-valued operating system called Reality.  It was developed with Dick Pick's input, and was a variation of what was known as a Pick system.

Most of those 16 users took orders over the phone.  They would enter the order, which would be put into a phantom processing file.  Then a background process, called a phantom processor, would pick up the orders and process them.

Now, there was a problem with the data design. I'll spell it out as simply as I can:

First, Pick predated relational databases.  (The main file technology at that time was ISAM.)  The idea of Pick was that if you had an invoice, a single record would have all the header information, and also all the detail lines and options for the invoice: one record, with multi-values for detail lines, and sub-multi-values (also called sub-values) if the detail lines had multiple options.

This meant a single disk read would get a small to moderate invoice into memory, and a single write would write it out.  The BASIC extensions for handling all this were very easy to use, making the handling of an invoice by a programmer very easy.
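Here's a C sketch of what such a record looks like on the wire. The delimiters are Pick's real ones (attribute mark 0xFE, value mark 0xFD, sub-value mark 0xFC); the invoice content itself is made up:

```c
/* Sketch of a Pick-style dynamic array: one invoice record holding
 * the header and all the detail lines. */
#include <stdio.h>
#include <string.h>

#define AM  "\xFE"   /* attribute mark: separates fields           */
#define VM  "\xFD"   /* value mark: separates detail lines         */
#define SVM "\xFC"   /* sub-value mark: separates options per line */

int main(void)
{
    char record[] =
        "ACME BREWING"          AM   /* attr 1: customer         */
        "820115"                AM   /* attr 2: date             */
        "LAGER" SVM "RUSH"      VM   /* attr 3, value 1 + option */
        "ALE"   SVM "COD";           /* attr 3, value 2 + option */

    char *sa, *sv, *ss;
    int a = 1;
    for (char *attr = strtok_r(record, AM, &sa); attr;
         attr = strtok_r(NULL, AM, &sa), a++) {
        int v = 1;
        for (char *val = strtok_r(attr, VM, &sv); val;
             val = strtok_r(NULL, VM, &sv), v++) {
            int s = 1;
            for (char *sub = strtok_r(val, SVM, &ss); sub;
                 sub = strtok_r(NULL, SVM, &ss), s++)
                printf("<%d,%d,%d> = %s\n", a, v, s, sub);
        }
    }
    return 0;
}
```

The `<attribute, value, sub-value>` triples the sketch prints mirror how Pick BASIC addresses a dynamic array, e.g. `INV<3,2,1>`.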

Unfortunately, someone decided that they would track all orders for a particular brewer in a single record.  And they also had a consolidated record that tracked all orders for all brewers.  This meant that every order had to update two of these 4 records.

These records recorded, by date, all orders of all products for that brewer (or any brewer for the consolidated record) for all licensed premises or liquor stores in all of BC.  The records got very big.

The smallest one was about 16K, the consolidated one was bumping into the 32K limit that Reality imposed on records. Given that core memory was only double that, the restriction was pretty reasonable.

The other thing you might notice, if you are good at simple math, is that two of these records could take up most of available memory. But there's more!

If you add data to a record in the BASIC language, making it longer, there is a likelihood that it will be too big for the buffer the BASIC interpreter originally allocated. At that point a new, bigger buffer gets allocated, and the data gets copied over to the new buffer along with the changes.  If you do that with the consolidated record, you have two copies of the record in memory and have now used up pretty well all of the available core memory. Given that some of that memory is used for other things, your working set cannot fit in memory at the same time.  And that's just the phantom processor. If any other users are trying to get work done, their state has probably been pushed out of memory.

Note that the read/write time on these old drives was extremely slow by today's standards, there was no caching to speak of (not even track reads at first), and you read or wrote half a kilobyte (512 bytes) at a time.  So if you were reading a 30 K record, you had to do 60 disk reads.  If the copy that the BASIC processor was working with had to be written out to let another user do work, you got to read it back in before you could do any more work on it.

I won't go into fragmentation or any of the other problems this raises. The key thing is that the system got stuck reading and writing to disk; the industry term is "the system thrashed". The other problem was that if you let the big record hit the 32 K limit, it truncated and you had data corruption, which sometimes would result in the phantom program crashing. Because it ran in the background, you might not realize it had crashed for quite some time.

The users would enter orders until 5:00 pm, then the phantom process would try to catch up.  If the big record hit its size limit, the phantom would crash. On many mornings the order desk could not open at 9:00 because the phantom was not finished processing.

So, in comes Toga Computer Services, with me, laid off from Fraser Mills Plywood Mill, helping to write a conversion program and change order programs to handle a new data design.

The conversion program took the three levels of multi-values in each record and wrote them into three different files. We turned 4 records into about 600.  We also had to change the order processing programs to process records from the three files, both reads and writes.
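The shape of that conversion, sketched in C with the same Pick delimiters as in the previous post. The record layout and file names here are hypothetical - I'm illustrating the explode-by-position idea, not reproducing the real programs, and I'm showing two of the three levels for brevity:

```c
/* Sketch: explode one deeply multi-valued record into many small
 * ones, keyed by parent id plus position. Layout/names invented. */
#include <stdio.h>
#include <string.h>

#define VM  "\xFD"   /* value mark: separates orders    */
#define SVM "\xFC"   /* sub-value mark: separates lines */

/* Stand-in for writing a record to a Pick file. */
static void write_rec(const char *file, const char *key, const char *data)
{
    printf("%-12s %-14s %s\n", file, key, data);
}

int main(void)
{
    /* A slice of the old consolidated record: orders as values,
     * order lines as sub-values. */
    char brewer[] =
        "ORD1" SVM "LAGER x10" SVM "ALE x5" VM
        "ORD2" SVM "STOUT x2";

    char key[32], *sv, *ss;
    int v = 1;
    for (char *val = strtok_r(brewer, VM, &sv); val;
         val = strtok_r(NULL, VM, &sv), v++) {
        int s = 1;
        for (char *sub = strtok_r(val, SVM, &ss); sub;
             sub = strtok_r(NULL, SVM, &ss), s++) {
            if (s == 1) {            /* first sub-value: the order id */
                snprintf(key, sizeof key, "CARLING*%d", v);
                write_rec("ORDERS", key, sub);
            } else {                 /* the rest: individual lines    */
                snprintf(key, sizeof key, "CARLING*%d*%d", v, s - 1);
                write_rec("ORDER.LINES", key, sub);
            }
        }
    }
    return 0;
}
```

Each value and sub-value becomes its own small record, keyed by the parent id plus its position, which is how 4 records became about 600.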

We tested and retested, and finally we did the conversion, in January of 1982, as I recall it.

Instead of flushing all of main memory several times over for each order, the system generally touched less than 1 K of memory per order. Instead of 60 reads or writes for the consolidated record, we were down to usually just 3.

I was still very rusty and needed a fair bit of help to get it right, but we finally got it good enough to do the conversion in production.

The first day on the new system, we had to fix a few bugs, but the system performance was amazing, and within less than 1 minute of the order desk closing, the phantom processor had caught up all the orders!  The impact of the massive records on performance was exponential! The fix was amazing!

I learned a valuable lesson about data design, and came away with an appreciation of how data design, disk access, system memory management and other factors worked together to affect performance.  I also had the great pleasure of having the CEO and other executives of the company thanking us profusely for saving their system!

These were lessons that have stayed with me over the years!

Next post - Recession Was a Good Teacher...

Wednesday, January 3, 2018

My Journey Into Software

I don't do New Years resolutions. If there's something worth doing, I generally do it when I think of it. But having just published a children's Christmas story eBook and paperback, and as I'm wrapping up the marketing for it, I was thinking, what would I like to do next?

Then I ran into an old web archive of Ken North's ODBC Hall of Fame and the inspiration hit me!

I'd blog about some of the more interesting and sometimes amazing experiences I had in my journey as a software developer!  Here is the first post...

At the time that I got into software, most universities and the few colleges that had a computer department really only trained you for academic work. The types of things most businesses were trying to do with computers simply weren't being taught in most colleges.  You actually got better business programmers out of the technical schools like BCIT than the Universities.

So, I was able to get in through the back door.  And what got me in?  Typing...

When I was in grade 10, I had room for an extra class.  A couple of my friends suggested typing, and I thought that would be cool.  Not to mention that the class had a lot of girls in it. At 15 years old, that was a bit of an attraction, as well, but I think I just liked the idea of being able to type. I always liked machines.

So I took typing. I don't think there was even one boy who could out-type the slowest girl, but we all did pass.

Roll forward several years, and I'm looking for work.  My brother Tony (Antoon) and his friend Gary had started a software company, and by combining the first two letters of their names, they came up with Toga Computer Services. Toga had a job programming for the City of St. Albert in Alberta, over a 300 baud Datapac modem from Burnaby.  You got 300 baud on a good day.  When the line was bad, it dropped down to 110 baud (not sure why the odd number, but that's what it was!)

You took an old style dial phone, and put the receiver into the modem, and it squealed your data into it over a carrier signal.  This was called an acoustic coupler.  You could pick up the phone from the modem and if you hissed the right pitch into it, it would get confused and hang up.  I could out-type the modem at 300 baud, and 110 baud was annoyingly slow, but I was getting paid, and more than minimum wage, so I was quite happy!

Acoustic Coupler

For a chuckle, here's an old clip of someone using an acoustic coupler. You can see how slow it is, and at the end of the clip you can hear the carrier signal.

My brother would mark up program listings that he had printed off, and he would have me type the changes in, then compile them for him.  I'd gone with him the odd evening when I was attending BCIT for Mining Engineering Technology, and helped out a bit, but this was the first time he actually paid me.  It let him work on the next set of listings while I was typing over that annoyingly slow modem.

So that was my first software job. I had no idea how the software worked at first, but was intrigued, and started trying to learn.

At that time, Antoon decided to do some overnight training classes for me and some of his friends who were interested, and that, coupled with some books that we were told to read, began our journey into software.

Although I did not have formal college training in computers, I got to work with some truly brilliant people over the years, some of whom I'll refer to in future blog posts.

At this point, I was employed part time temporarily, and didn't get paid for the training, but I really enjoyed what I was learning, and it was better pay than unemployment insurance!

Next post will be about rescuing a local customer.