Dog Collars
When I was about eight years old, while my father and I were visiting his foundry, I saw dog collars hanging at every door and passageway. But there were no dogs. I asked my dad, and he said, "We use dog collars for tourniquets." So many brutal accidents happened in the foundries that my father compared them to war zones. He told me gruesome stories about the foundry before he became the manager.
A grinding wheel ten feet in diameter had shattered and disemboweled workers. Chisels slipped and chopped off kneecaps. Molten steel splashed and burned through boots. Men came drunk and picked fights using pneumatic chisels and sledgehammers as weapons. In another horrendous tale, a ladle rider slipped (or purposely stepped) into a hot ladle and melted like butter on a hot skillet.
Maintenance Overalls
My dad improved the safety factors. Every maintenance worker had to wear special overalls with tight cuffs and leggings to avoid being grabbed by machinery. Around his waist hung six padlocks, which he used to lock down any piece of equipment on which he or his buddies toiled. Otherwise, another worker might turn on the machinery while a maintenance worker was inside. On the bib of the overalls, long pockets held the big fuses, which the maintenance man removed before he began certain tune-up operations. Where the fuse had been, he locked a padlock in place of the fuse. He did not want someone else to turn on the machinery. Each padlock had a number and the initials of the maintenance worker engraved into its body. He wanted people to know who was working. (Before this safety policy took effect, several incidents led to a man being ground to death inside a machine. In a few cases, his remains disappeared, e.g., his body was incinerated by the molten steel, at roughly 2,800 degrees Fahrenheit.)
Inconvenient Locations
My father moved all the fuse boxes to visible but out-of-the-way locations. On big machines, the fuses and switches often were several steps up a ladder. No one would accidentally climb the ladder.
The analogies from the foundry to programming should be obvious. Thankfully, most business programming has no life-and-death consequences. Some medical programming, like heart monitors, surgery robots, pharmacology composition, and MRIs, carries extreme danger. Avionics is full of danger.
Cheese Chucker
In Wisconsin, in 1966, UPS won the contract to ship more than a million small boxes of cheese as Christmas presents. Many dairies had catalogs; a person would order by sending a check and an address to the dairy. UPS sorted the boxes manually; cheese chuckers did the work. From an incoming semi-trailer full of boxes of cheese, a cheese chucker would pick up a two-pound gift box, read the address, and then throw the box across the warehouse to another cheese chucker standing on the tailgate of an outbound truck. As a senior in high school, I had this wonderful job from 2 a.m. to 6 a.m. each morning from November through December. I was rich at $8 per hour. About Thanksgiving time, I told my manager that the process would go faster if the addresses were in ZIP code order. The next morning, my manager took me to the warehouse manager and had me explain my idea. There was no particular feedback, but the next year there were no cheese chucker jobs. Dairies packaged entire pallets by ZIP code.
Is Blank a Number
The school district had an IBM 360, and a few lucky students worked as trainees in the spring of 1967. We got about 72 cents an hour for three hours per day, three days per week. The 72 cents was the minimum wage for educational institutions. We could work more hours without pay. The programming manager was building up his library of routines, and he recruited me. Using Basic Assembly Language (BAL), we parsed many kinds of strings which came from terminal input. He was convinced that the standard IBM routines were slow, and he would run comparisons to test his hunches. He was right; he could squeeze more data through the machine when he eliminated the overly generalized routines from IBM. His name is lost to my memory. Mr Silbermann at IBM in Milwaukee had given me a BAL manual and a Programmer's Guide, and he alerted me to the job opportunity. We set the accumulator to zero and deblanked the input. If the input was empty, the algorithm quit, returning the value 0 and an indicator of blank input.
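A minimal sketch in Python of that blank-versus-number logic; the function name and the (value, was_blank) return shape are my inventions, not the manager's BAL interface.

```python
# A sketch of the "is blank a number" routine. parse_number is a
# hypothetical name; the original was a BAL library routine.
def parse_number(text):
    """Return (value, was_blank). Blank input yields 0 plus a flag,
    so the caller can tell a true zero from an empty field."""
    accumulator = 0                    # set the accumulator to zero
    stripped = text.replace(" ", "")   # deblank the input
    if stripped == "":                 # empty input: quit early
        return 0, True
    negative = stripped.startswith("-")
    for ch in stripped.lstrip("+-"):
        accumulator = accumulator * 10 + (ord(ch) - ord("0"))
    return (-accumulator if negative else accumulator), False
```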
Teaching COBOL to Veterans
In the fall of 1968 at Fullerton Junior College, many of my friends were Vietnam veterans. Some took courses in COBOL programming, and some of those asked for my help. I did not know COBOL per se, but we could run a program through the compiler with a machine-code switch turned on. A column at the right showed BAL. And lo and behold, PERFORM mapped to a branch with some register and stack manipulations. By the end of the semester, my friends and I had learned COBOL.
Making Computer Chips
In 1969, Transmask Corporation near John Wayne Airport in Orange County had one of the most precise cameras in the world. They used the camera to make silvered masks for making computer chips. Their customers included IBM, DEC, and GE. In those days, a big chip had between eight and twenty layers as indicated in our specs. For a year or two, Transmask could make thinner lines on masks--which meant more circuits per chip--than most other mask makers. We took four sheets of four-foot-square photo-resist and cut into the red layer with drafting tools and X-Acto knives. As we peeled out much of the red celluloid layer, the circuit diagram appeared as clear space. The cameraman took all four sheets; each sheet defined a corner of the layer of the chip. The camera then reduced the diagram from eight feet square down to less than a half inch square, an area reduction factor of about 40,000.
Automation with CalComp Plotters
Also at Transmask, certain detailed patterns repeated on the layers. We isolated several that were about three inches across. Since the typical stripe was a quarter of an inch wide, a three-by-three-inch square could have about twelve horizontal cutting planes and twelve vertical. Such a square might have 30 cuts of various lengths. I suggested that a CalComp plotter fitted with a knife blade could make those cuts rapidly. For a human, the time to cut the pattern and peel it could approach an hour. The first CalComp attempt took nearly a week. The next took half a day. By the end of the second week, a human could instruct the machine in an hour, and the machine could cut the pattern for a three-inch by three-inch square in four minutes. Another improvement involved a second tool for the plotter; the tool turned up the edges of the cut pieces. A human with tweezers still needed to peel the pattern. The draftsman then transferred the pattern to the correct three-by-three spot on the big four-foot by four-foot sheet. The savings were zero for the first of a pattern, and 45 minutes for subsequent occurrences of the same pattern. On some sheets, the savings were 16 hours per sheet: that was two man-days. Instead of a layer of four sheets taking fourteen man-days or more, a layer often took less than seven days. The number of human errors also dropped because humans were doing less of the work. The number of masks that failed inspection also dropped.
Our CalComp plotter could handle about two feet square. We looked for bigger patterns. During Christmas, they decided not to raise my trainee pay, and I started work as a programmer at UC Irvine in January. Transmask continued to use the plotters until they left the building around 1972.
Cascade Report Algorithm
In 1970, a professor was working on a data base called Vulcan. He wanted a generalized method for doing reports. I copied some Knuth and wrote a program in assembly code that the professor adopted. The result was the Cascade Report algorithm. In essence, at run time, the clerk chooses the sort sequence from some pre-defined keys. When the most minor of the chosen keys changes, it triggers a subtotal, and the report continues. When a major key changes, it triggers each of the minor keys from most minor up to its own subtotal. When the data file ends, the program does all subtotals from most minor to most major and then the report final totals. The algorithm requires the acquisition of a pointer to a subtotal routine for each sort key. In a production environment, the keys often come predefined, and the clerk merely chooses the sequence from the sort keys. For example, a report about purchase orders could have keys of vendor, PO, category, manufacturer, manufacturer's part, SKU, and estimated delivery date. Not all sequences of keys make easily usable reports, but when there are seven keys, as in this example, the program can generate seven factorial variations. That is 5,040 variations. The one report could handle the mammoth portion of the reporting needs. At various companies in later years, a more sophisticated version loaded sort keys and titles when the clerk chose a report from a menu. One program; several reports. Vulcan spawned dBASE and others.
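A hedged sketch in Python of the cascade idea. The event list stands in for the pointers to subtotal routines, and all names are mine rather than Vulcan's; rows must already be sorted by the clerk's chosen key sequence.

```python
# Cascade Report sketch: when a key changes, flush subtotals from the
# most minor key up to the changed key; at end of file, flush all
# subtotals and the grand total.
def cascade_report(rows, keys, value_field):
    """rows: dicts sorted by `keys` (major first). Returns a list of
    (key_name, subtotal) events, ending with ("TOTAL", grand)."""
    totals = {k: 0 for k in keys}
    grand = 0
    events = []
    prev = None
    for row in rows:
        if prev is not None:
            for i, k in enumerate(keys):
                if row[k] != prev[k]:
                    # flush from most minor up to the changed key
                    for j in range(len(keys) - 1, i - 1, -1):
                        events.append((keys[j], totals[keys[j]]))
                        totals[keys[j]] = 0
                    break
        for k in keys:
            totals[k] += row[value_field]
        grand += row[value_field]
        prev = row
    if prev is not None:
        # end of file: all subtotals, most minor to most major
        for j in range(len(keys) - 1, -1, -1):
            events.append((keys[j], totals[keys[j]]))
    events.append(("TOTAL", grand))
    return events
```

In a production version, each event would dispatch to the subtotal routine registered for that key instead of landing in a list.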
Mobile Home Parks
A friend worked for Jim Thayer, who designed mobile home parks, golf courses, and churches. Mobile home parks might have several hundred places, which are called pads. A pad indicates the orientation of the mobile home trailer and its utility hook-ups. When groups of 60 or 100 pads might be identical, that is an opportunity for computer-aided drafting. We experimented with the CalComp plotter, but we discovered a problem: the pads had many angles as they went around curved streets. The plotter drew shaky diagonals. We discovered that we could photocopy a pad diagram onto clear plastic, and then cut and orient those pads onto blueprints. We also discovered that we could photocopy fire hydrants, manholes, and other repetitious items, then cut and glue them onto the composite blueprint. A final pass of the composite blueprint through a blueprint copy machine reduced the many pieces to one uniform layer of printing. Many other architects followed similar ideas.
Payroll Year-to-Date
Payroll systems seem to follow two patterns. The poor ones calculate the current wages on a period-by-period pattern: the wages of each pay period are proportioned up to an annual amount, then assessed taxes, then proportioned back down, and finally accumulated to year-to-date totals. Thus, increased taxes within the current period seem exaggerated; also, the totals contain a round-off for every pay period. The sum of these round-offs can become irritating.
The better method continually adds current wages to previous year-to-date wages. The current total is proportioned up to annual, then assessed taxes, then proportioned back down. The amount of current taxes is the difference between taxes on the total and taxes already paid year-to-date. This method keeps the round-off to a single instance. Some systems round wages up to the nearest penny and taxes down to the nearest penny: who would complain?
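The better method can be sketched as follows; the two-bracket annual tax table is purely illustrative, not any real jurisdiction's rates.

```python
# Year-to-date payroll sketch: annualize the running total, tax it,
# de-annualize, and subtract the tax already withheld this year.
def annual_tax(annual_wages):
    """Hypothetical brackets: 10% up to $20,000, 20% above."""
    if annual_wages <= 20000:
        return annual_wages * 0.10
    return 2000 + (annual_wages - 20000) * 0.20

def current_period_tax(ytd_wages, current_wages, ytd_tax,
                       period, periods_per_year):
    """Tax for this period alone; round-off occurs in one place."""
    total = ytd_wages + current_wages
    annualized = total * periods_per_year / period
    tax_on_total = annual_tax(annualized) * period / periods_per_year
    return round(tax_on_total, 2) - ytd_tax
```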
Average Inventory Cost
Many inventory systems use a running average cost. There have been urban legends about this method where the round-off became excessive. The current average cost is the result of dividing the current total cost by the current total quantity. Depending on the industry and the discernment of the CFO, this works reasonably well. However, in businesses where costs and quantities swing wildly, this method can generate noticeable discrepancies. For example, consider 10 pieces on hand at fifty cents each plus an incoming count of 1000 pieces at a dollar each. The result is a decimal cost of .995049. Rounding down to .99 loses about $5, and rounding up overstates by $5. One solution is to carry several digits of precision and delay round-off until a sale occurs. A more precise solution is to keep the cost in pennies, keep the remainder in a separate field, and include the remainder in every calculation.
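The penny-plus-remainder idea can be sketched in Python; integer division naturally yields the cost in whole cents and the remainder that the separate field would carry. The function names are hypothetical.

```python
# Costing in pennies with an explicit remainder field: the running
# total stays in whole cents, so no precision is ever thrown away.
def receive(qty_on_hand, total_cost_cents, qty_in, unit_cost_cents):
    """Book an incoming lot against the running totals."""
    return qty_on_hand + qty_in, total_cost_cents + qty_in * unit_cost_cents

def average_cost(qty, total_cost_cents):
    """Average cost as (whole cents, remainder). The remainder rides
    along in its own field instead of being rounded away."""
    return divmod(total_cost_cents, qty)

# The example from the text: 10 pieces at 50 cents plus 1000 at $1.
qty, total = receive(10, 500, 1000, 100)
```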
At the Federated Group, with inventory on hand exceeding $400 million and two or more turns yearly, we kept the remainder but wrote the discrepancies to a sub-ledger for analysis. Some years the sub-ledger hit $200,000. Merrill Lyons, VP of Finance, liked keeping the remainder; Michael Pastore, VP of Stores, shrugged his shoulders.
Inventory With No Part Numbers
At Pic'n'Save, the philosophy was to sell at a 1% markup. They demonstrated their definition by purchasing items for fifty cents and selling them for one dollar. Obviously, there is a joke here because that is a 100% markup, and they enjoyed it all the way to the bank. In fact, they kept most of the inventory by price point per category, which can be demonstrated as Men's Shoes at $4 and Men's Shoes at $9. No part numbers. The people procuring the inventory made great purchases, but the irregular nature of the inventory led to having no part numbers. Good purchasing and good sales hide a multitude of seemingly sloppy procedures.
Negative Inventory Costs
Usually a negative cost suggests an error. For some sub-businesses within the Pic'n'Save conglomerate, negative costs added to the profits. Some of the sub-businesses were paid by large retail chains to take merchandise off the shelves of the big stores. For example, Nordstrom paid a subsidiary of Pic'n'Save to take stray pairs of shoes out of the store. Those pairs would arrive at a large warehouse, where Pic'n'Save employees would consolidate the pairs of shoes into larger lots, like dozens, and a second subsidiary would re-sell those pairs at wholesale prices. Sometimes Nordstrom re-bought the shoes.
Recursive Warehouse Algorithms
Pic'n'Save had an enormous warehouse of 1.6 million square feet. Choosing locations for inventory created massive decision trees with potentially constipated pallet flow. The bin-picking algorithm combined direct manager input, product history and value, and buyer history. We ignored the recommendations of some buyers as absurdly optimistic. In addition, current activity in the warehouse affected the bin-picking program. Displays with keyboards presented current pallet instructions. A dispatcher or a driver would choose a load, and a printer immediately printed the bin label, which might re-trigger calculations about loads still in the queue.
A programmer could run simulations in order to research possible strategies. The mix of factors was extensive: pallet weight, lane directions, bin heights, available loading zones, dollar value, fragility, stacking height, number and kind of forklifts. Some shelves were 65 feet high; a semi-automated forklift did the lifting and examined the location before attempting to shove in a pallet. Not good to shove a pallet into a full bin.
The Million Dollar Day
When the Federated Group first started clearing credit cards through Security Pacific Bank, a programmer at the bank had a file of transactions that thoroughly exercised the process. He had designed the file so that a perfect run would have an exact total of one million dollars. Even more fascinating, he had assigned unique amounts to certain transactions within each category. Some were expressed as powers of two in pennies. As a result, the dollar amount of the error would point to records that possibly were processed incorrectly. This concept was not new, but his artfulness was pleasing.
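The powers-of-two trick can be sketched in Python: if test record i carries 2^i pennies, the binary digits of any shortfall name exactly which records dropped out. The function is my reconstruction, not the bank programmer's code.

```python
# Decode an error amount back into the test records it implicates.
def decode_error(error_cents):
    """Return the indices of the missing power-of-two records."""
    missing = []
    bit = 0
    while error_cents:
        if error_cents & 1:
            missing.append(bit)
        error_cents >>= 1
        bit += 1
    return missing
```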
Volume Pricing
Several wholesalers complained that a customer would sign a contract for 100,000 parts and get a great price. After maybe 20,000 parts had shipped, the customer would cancel the order. The hassle and the legal costs of recovering the lost revenue made the situation worse. The solution was to assign prices that declined as the orders shipped. The average cost might be a dollar, and the initial cost might approach two dollars. With each order, the price on the next order went down until the price on the last order approached some silly amount like ten cents. This mirrors average costing.
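One hedged way to build such a schedule is a linear decline whose mean equals the contracted average price; the linear shape and the function name are my assumptions.

```python
# Declining volume prices: the first shipments carry high prices, the
# last approach a token amount, and a customer who takes the whole
# contract pays the agreed average.
def declining_prices(n_orders, avg_price, final_price):
    """Linear schedule whose mean over n_orders equals avg_price."""
    first = 2 * avg_price - final_price
    step = (first - final_price) / (n_orders - 1)
    return [round(first - i * step, 2) for i in range(n_orders)]
```

A customer who cancels early has already paid more than the average on what shipped, which removes the incentive to game the contract.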
Extensible Data Bases
Ever-changing waves of technology drive changes in attributes and policies for handling vendors and customers. Pagers came and went. Terms on contracts come and go. Accessories come and go. At NEC and at the Group, we solved the problem by artificially extending the main data base tables. For example, we created a table to hold extra information about vendors. The new table had columns for the vendor ID, a tag, a sequence, a status, and the value of the new attribute. We then made a new entry in the data dictionary to describe the tag, its specs, and its data format. We then added a field to the required screens. The field held a tag, a position, and a data field from working storage. We had already added the capability for programs to handle the new fields from the table in the data buffers of the main input programs. As such, only the edits from the dictionary took effect. In essence, a non-programmer could add or remove new fields. If more specific edits were needed, then a programmer put an edit routine for the field into the program.
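A small sketch of the extension-table pattern in Python. The dictionary entry and the pager example are hypothetical; a real system would keep the rows in the data base and the specs in the data dictionary.

```python
# Extension-table pattern: new attributes become rows of
# (vendorID, tag, sequence, status, value), validated against a
# data-dictionary spec rather than hard-coded program edits.
DATA_DICTIONARY = {
    "pager_number": {"format": "digits", "max_len": 10},
}

def add_attribute(extension_table, vendor_id, tag, value, sequence=1):
    """Validate against the dictionary spec, then append a row."""
    spec = DATA_DICTIONARY.get(tag)
    if spec is None:
        raise ValueError("unknown tag: " + tag)
    if spec["format"] == "digits" and not value.isdigit():
        raise ValueError("value must be digits: " + value)
    if len(value) > spec["max_len"]:
        raise ValueError("value too long: " + value)
    extension_table.append({"vendor_id": vendor_id, "tag": tag,
                            "sequence": sequence, "status": "active",
                            "value": value})

def attributes_for(extension_table, vendor_id):
    """Collect the active extension attributes for one vendor."""
    return {row["tag"]: row["value"] for row in extension_table
            if row["vendor_id"] == vendor_id and row["status"] == "active"}
```

A non-programmer adds a field by adding a dictionary entry; only fields needing deeper edits require a programmer.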
Processing by the Penny
At Sierra Pacific Investments, which had more than a dozen partnerships, pennies of round-off caused many problems. For example, two investors, like husband and wife, would put equal amounts of $10,000 into the same partnership on the same day. In theory, both should receive the same quarterly returns. The first programmer, however, did not handle round off well, and with maybe a dozen income lines on a quarterly disbursement, the returns might differ by a nickel. This irritated some investors to the point of telephoning. Each telephone call had a cost and represented some dissatisfaction.
We solved the problem at its root. The funds each had eight thousand to forty-four thousand partners; total partners came in at nearly 200 thousand. Total investments per partner averaged nearly $20 thousand. The Chief Accountant, Bob Quaid, was persuaded to allocate distributions as pennies per partner rather than dollars per fund. If a fund had 22,034 partners, allocations were made as increments of $220.34. One penny per partner. Obviously business expenses do not come in such amounts, so any excess per category was retained until the next disbursement. No more round-off and no more telephone calls on that topic.
A side effect was that the MS-Excel spreadsheets became exact. Quaid did not have to deal with round off while making his calculations in preparation for quarterly reports.
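The penny-per-partner allocation reduces to integer division. This sketch ignores unequal holdings and simply shows the increment-and-retain mechanics; the function name is mine.

```python
# Distributions move in whole pennies per partner; the excess waits
# for the next disbursement, so nothing is ever lost to round-off.
def allocate(available_cents, n_partners):
    """Return (cents per partner, cents retained)."""
    per_partner = available_cents // n_partners
    retained = available_cents - per_partner * n_partners
    return per_partner, retained
```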
K1's Saved Millions
The big savings came in the processing of the annual K1 tax forms. The accounting firm charged a fee like $35,000 per fund per year to bless the accuracy of the accounting. Then the firm charged about $3000 per fund as a data preparation fee to set up the K1 form and the necessary software on a big IBM machine. Finally, the firm charged between $7 and $8 per partner to print each K1, plus postage to mail it. The elapsed time was five weeks, which put pressure on accounting to make changes so that management could review the typical K1 and the summary.
I told Quaid about a printing service near his house in Yorba Linda. On our little HP3000, we prepared a sample tape for the smallest fund. Quaid delivered the tape on his way home from work, and he picked up the K1's in the morning. Ecstasy. There was no $3000 set-up cost, and each partner K1 was less than $1, or at least $7 less expensive than the accounting firm. The quick turnaround gave management many more weeks to review and re-run, when needed. The savings exceeded $1.4 million each year. The downside was having to stuff pre-stamped envelopes, so employees and their kids got bonus money.
Base 3600 in College Credits
The Babylonians used base 60; many fractions convert exactly to parts per 3600, which is 60 squared. Transfer credits from semester colleges, quarter systems, and military institutes pose a similar problem. Usually I see programmers try to solve the problem by going out to many decimal places. They still must contend with the situation where one third becomes 33 hundredths, and so three thirds become 99 hundredths, not the expected 100. Base 3600 handles the decimals from military institutes and the thirds from quarter systems. (4 quarter units map to 3 semester units.) The round-off going into the common denominator is eliminated. The organization keeps the credits as 3600ths and must then decide how to express the sums. When using base 3600, one third appears as .33 in decimals, and the sum of three thirds appears as 1.00 in decimals. Discrepancies still exist but less frequently.
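A sketch of credits kept as parts per 3600; the storage stays exact, and rounding happens only when a number is displayed, so three thirds sum to exactly one unit. The function names are mine.

```python
# Credits stored as 3600ths of a semester unit.
def quarter_to_parts(quarter_units):
    """4 quarter units map to 3 semester units, so one quarter unit
    is exactly 2700 parts."""
    return quarter_units * 2700

def display(parts):
    """Round only at display time; the stored 3600ths stay exact."""
    return round(parts / 3600, 2)

ONE_THIRD = 3600 // 3   # one third of a unit is exactly 1200 parts
```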
Battleship Billing
Under the old system, a business had to keep detailed records on each emitting device about hours of operation, temperatures, fuel usage, repairs, and many more tiny details. With as many as forty pages of details per emitting device, the business would come to the AQMD hearings and present its case on a device-by-device basis. The result was hundreds of hours of haggling and a fee of about $12.00 for a simple fryer at a restaurant. Maybe a restaurant with a strong case could reduce the fee to $11.00. The wasting of time was not worth the savings in money. The solution got the nickname of Battleship billing because the fees were arranged in a rectangular array like the game called Battleship. The lowest fee was in A1, which was column A and row 1. The highest fees were in column H. Each square contained many different devices with the same pre-set charge. With the Battleship method, restaurants and other businesses could choose to buy equipment and know the fees in advance. According to Mr Greenberg, General Counsel for the AQMD, the new method saved his officers tens of thousands of hours per year. Hearings and inspections became easier. Only important cases and general cases went to hearings.
For the data processing department, the change in billing was like night and day. Before, the billing program had a zillion sub-programs that tried to calculate emissions and thus fees. Each device had fuel inputs, equations for physical processes, and variations in output format. With thousands of devices, maintaining the complex program was a nightmare; the programmers could not keep up with the changes in the real world. After Battleship billing, each permit basically required a table lookup. When the device on a permit failed to reference a fee in the table, the program postponed the permit for about two weeks and wrote to an error log. An engineer would read the log and make corrections to the table. The size of the billing program shrank from tens of thousands of lines to less than a thousand. Maintenance time became hours per year, whereas before the program required multiple full-time programmers.
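The after picture reduces to a table lookup plus an error log; the fee grid and device names below are hypothetical examples, not the AQMD's actual table.

```python
# Battleship billing sketch: each device maps to a grid square, and
# each square carries one pre-set fee. Unknown devices go to the
# error log and the permit is postponed for an engineer to fix.
FEE_TABLE = {("A", 1): 12.00, ("A", 2): 45.00, ("H", 8): 40000.00}
DEVICE_SQUARE = {"simple_fryer": ("A", 1), "small_boiler": ("A", 2)}

def bill(device_type, error_log, postponed):
    """Return the fee, or None when the table needs a correction."""
    square = DEVICE_SQUARE.get(device_type)
    if square is None or square not in FEE_TABLE:
        error_log.append(device_type)
        postponed.append(device_type)
        return None
    return FEE_TABLE[square]
```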
Calculating vs Sorting vs Reporting
The AQMD had tens of thousands of permits and their related devices. Daily, the statisticians wanted to run listings and comparisons and trend reports. In the early 1980's, the programming department adopted Quiz and Quick from Cognos as the tools of choice. Programmers wrote many similar Quiz scripts each week. Some scripts became choices on the menu. For each new script, a programmer had the task of spending four hours to estimate the system impact; that task was absurd because the typical script ran only ten minutes. The slowest script ran about an hour to process and ten hours to print.
Using printers in the basement, operators printed the output on paper and delivered the paper to the requesting department during the day. Often the computer ran scripts through the night to catch up with the demand.
A new programmer analyzed the processing, and he added several pre-sorted indices. Many scripts could then run in one or two minutes. With that success, the programmer analyzed more trends. Sorting, merging, and subtotals for percentages slowed many scripts; calculations took minutes per script; the job listings showed the phases of the Quiz processing. The idea for pre-processed data structures materialized, but Quiz could not handle the structures.
With the help of Hupert Wilson, the programmer designed a COBOL program that could read the new structures. The programmer put the program on Hupert's menu. The new program could accept input from the terminal and display results in seconds. Hupert wanted both screen display and hard copy options. Hupert wanted inputs from saved files, and he wanted to keep output files indefinitely. Within weeks he wanted comma-separated files for Excel. Hupert wanted more of the pollutants besides the common five: NOX, SOX, CO, CO2, and particulate. After six more weeks of testing and enhancements, the programmer agreed that Hupert could share the program with people outside of Hupert's department, which was Air Chemistry.
Meanwhile, the manager and operators in the basement noticed a severe decline in batch jobs and paper output. Also, the manager saw that the CPU was often idle during part of each minute. The manager of programming saw a distinct drop in requests for new Quiz scripts, which required twelve to thirty hours to implement, which lowered the need for programmers. A ruckus ensued, and Mr Sweet, the Director of DP, investigated and declared that everyone should embrace the future: end of bickering.
The only major drawback was the need to re-calculate and re-load the data structures each evening or at noon. The process took about 25 minutes. The re-load date-stamp became a reportable item, which pleased the users for its completeness. "All active permits as of 03:20 am 06 JAN 1990."
The Delta Table
The Filemaker data base at APU could not handle the large daily replacement of nearly a quarter million records. It was a shame that so much data was being copied to such a weak data base, and the process took many hours. While the Filemaker data base existed, it was a liability for backups and security; it required too much time from a programmer to load and reload regularly. Then a clerk named Linda stated the obvious: "Please give us just the changes."
The programmer had been unloading the entire large file from the main data base and then loading that same file into Filemaker. Instead, a new second data base contained a copy of the main data base as of the previous process date. A new program then used the date of modification on the records in the main data base to create a file of adds, changes, and non-changes; that process took 25 minutes. To find deletes, another program compared the new second data base against the extracted file, which took another 25 minutes. In less than an hour, the programmer had the file of changes. Lastly, the programmer copied the current data base to the copy data base. Filemaker was happy until it was replaced about a year later.
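The heart of the delta process is a comparison of two snapshots. This sketch uses in-memory dictionaries where the real system compared unload files, and the names are mine.

```python
# Delta-table sketch: previous and current map a record key to its
# data; the output is the change file the clerk asked for.
def delta(previous, current):
    """Return (adds, changes, deletes) between two snapshots."""
    adds = {k: v for k, v in current.items() if k not in previous}
    changes = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    deletes = [k for k in previous if k not in current]
    return adds, changes, deletes
```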
Authoritative Reports
The University had more than thirty departments and almost as many reports of students per department. Imprecise methods led to a greatly overstated enrollment and awkward debates about funding per student. The IT department and the Provost got together and agreed that during the term, a person was a student if and only if the person was enrolled in a class. Secondly, the Provost wanted to count students per department as the number of students whose primary major was sponsored by the department. Immediately, the enrollment matched the sum of majors per department. A second column showed second majors or minors per department. A third column showed the total unduplicated head count from classes per department, and a fourth column showed the duplicate count. A duplicate count occurs when a student takes more than one class from the same department. The fairness and the precision of the "Majors by Department" report soon forced the other reports into the shadows.
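The agreed counting rules can be sketched in Python. The data shapes are my assumptions: enrollments are (student, class department) pairs, and primary_major maps each student to one sponsoring department.

```python
# Authoritative counts: a person is a student iff enrolled in a class;
# majors are counted by sponsoring department; head counts are
# unduplicated, with duplicates reported separately.
from collections import Counter, defaultdict

def department_counts(enrollments, primary_major):
    """Return (majors per dept, unduplicated head count per dept,
    duplicate count per dept)."""
    seen = defaultdict(set)   # department -> distinct students
    dupes = Counter()         # department -> duplicate enrollments
    for student, dept in enrollments:
        if student in seen[dept]:
            dupes[dept] += 1  # same student, second class, same dept
        else:
            seen[dept].add(student)
    enrolled = {s for s, _ in enrollments}
    majors = Counter(primary_major[s] for s in enrolled)
    head = {d: len(students) for d, students in seen.items()}
    return dict(majors), head, dict(dupes)
```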
User Preferences
The debates about colors, fonts, and minor features on screens often waste much time. Allowing the user to set personal preferences eliminates much of the squabble. Many times, I have observed that power users dim or remove labels. Later, they might restore the labels to understand some anomaly, and then they again dim all the superfluous items on the screen. The power user does not want the visual clutter. A good screen handler gives the power user the ability to remove labels and field numbers, change initial values, allow multiple input formats, change the sequence of fields, and control all commands from the keyboard. Using the mouse might help occasional users, but the mouse should be optional for power users.
Screen Navigation Automatically
Allowing the clerk to set initial values and the order of navigation greatly increases the usefulness of most input screens. For example, at a college, the admissions office receives about fifty inquiries per week from a magazine. The clerk sets the initial values for the kind of advertising and name of the advertising, which is the magazine. The clerk then puts the screen into add mode and chooses the order of the fields to visit. Finally the clerk turns on automatic update at the end of the fields. The result is that for the cost of initializing the fields, the clerk saves many strokes and much time.
No Garbage Data
The proliferation of data threatens to overwhelm many businesses. Concentrating on the important still remains essential. One method to eliminate bad data from a system is to restrict bad data at the point of entry. This should seem obvious, but people often complain about their own bad data. A more sophisticated approach evaluates the quality of each incoming record. What value does the information convey? Is the name and address good, but the remainder is bad? Consider having a field to indicate the quality of the record, or consider having a second repository.
In Wisconsin, in 1966, UPS won the contract to ship more than a million small boxes of cheese as Christmas presents. Many dairies had catalogs; a person would order by sending a check and an address to the dairy. UPS sorted the boxes manually; cheese chuckers did the work. From an incoming semi-trailer full of boxes of cheese, a cheese chucker would pick up a two-pound gift box, read the address, and then throw the box across the warehouse to another cheese chucker standing on the tailgate of an outbound truck. As a senior in high school, I had this wonderful job from 2am to 6am each morning from November thru December. I was rich at $8 per hour. About Thanksgiving time, I told my manager that the process would go faster if the addresses were in ZIP code order. The next morning, my manager took me to the warehouse manager and had me explain my idea. There was no particular feedback, but the next year there were no cheese chucker jobs. Dairies packaged entire pallets by ZIP code.
Is Blank a Number
The school district had an IBM 360, and a few lucky students worked as trainees in the spring of 1967. We got about 72 cents an hour for three hours per day, three days per week. The 72 cents was the minimum wage for educational institutions. We could work more hours without pay. The programming manager was building up his library of routines, and he recruited me. Using Basic Assembly Language (BAL), we parsed many kinds of strings which came from terminal input. He was convinced that the standard IBM routines were slow, and he would run comparisons to test his hunches. He was right; he could squeeze more data through the machine when he eliminated the overly generalized routines from IBM. His name is lost to my memory. Mr Silbermann at IBM in Milwaukee had given me a BAL manual and a Programmer's Guide, and he alerted me to the job opportunity. We set the accumulator to zero and deblanked the input. If the input was empty, the algorithm quit: we returned the value 0 and an indicator of blank input.
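The behavior of that routine can be sketched in modern terms (the original was BAL; Python and the function name here are only illustrative):

```python
def parse_number(field: str):
    """Return (value, was_blank). Blank input yields 0 plus a flag."""
    stripped = field.replace(" ", "")   # "deblank" the input
    if stripped == "":
        return 0, True                  # empty: value 0, blank indicator set
    return int(stripped), False
```

The indicator lets the caller distinguish a true zero from a field that was never touched, which is the whole point of the "is blank a number" question.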
Teaching COBOL to Veterans
In the fall of 1968 at Fullerton Junior College, many of my friends were Vietnam veterans. Some took courses in COBOL programming, and some of those asked for my help. I did not know COBOL per se, but we could run a program thru the compiler with a machine-code switch turned on; a column at the right showed the BAL. Lo and behold, PERFORM maps to a Branch with some register and stack manipulations. By the end of the semester, my friends and I had learned COBOL.
Making Computer Chips
In 1969, Transmask Corporation near John Wayne Airport in Orange County had one of the most precise cameras in the world. They used the camera to make silvered masks for making computer chips. Their customers included IBM, DEC, and GE. In those days, a big chip had between eight and twenty layers, as indicated in our specs. For a year or two, Transmask could make thinner lines on masks--which meant more circuits per chip--than most other mask makers. We took four sheets of four-foot-square photo-resist and etched the patterns into the red layer with drafting tools and X-Acto knives. As we peeled out much of the red celluloid layer, the circuit diagram appeared as clear space. The cameraman took all four sheets; each sheet defined a corner of the layer of the chip. The camera then reduced the diagram from eight feet square down to less than a half inch square, a downward factor of about 40,000.
Automation with CalComp Plotters
Also at Transmask, certain detailed patterns repeated on the layers. We isolated several that were about three inches across. Since the typical stripe was a quarter of an inch wide, a three-by-three square could have about twelve horizontal cutting planes and twelve vertical. Such a square might have 30 cuts of various lengths. I suggested that a CalComp plotter fitted with a knife blade could make those cuts rapidly. For a human, the time to cut the pattern and peel it could approach an hour. The first CalComp attempt took nearly a week. The next took half a day. By the end of the second week, a human could instruct the machine in an hour, and the machine could cut the pattern for a three-inch by three-inch square in four minutes. Another improvement involved a second tool for the plotter; the tool turned up the edges of the cut pieces. A human with tweezers still needed to peel the pattern. The draftsman then transferred the pattern to the correct three-by-three spot on the big four-foot by four-foot sheet. The savings were zero for the first instance of a pattern and 45 minutes for subsequent occurrences of the same pattern. On some sheets, the savings reached 16 hours per sheet: that was two man-days. Instead of a layer of four sheets taking fourteen man-days or more, a layer often took less than seven days. The number of human errors also dropped because humans were doing less of the work. The number of masks that failed inspection also dropped.
Our CalComp plotter could handle about two feet square. We looked for bigger patterns. During Christmas, they decided not to raise my trainee pay, and I started work as a programmer at UC Irvine in January. Transmask continued to use the plotters until they left the building about 1972.
Cascade Report Algorithm
In 1970, a professor was working on a data base called Vulcan. He wanted a generalized method for doing reports. I copied some Knuth and wrote a program in assembly code that the professor adopted. The result was the Cascade Report algorithm. In essence, at run-time, the clerk chooses the sort sequence from some pre-defined keys. When the most minor of the chosen keys changes, it triggers a subtotal, and the report continues. When a major key changes, it triggers each of the minor keys from most minor up to its own subtotal. When the data file ends, the program does all subtotals from most minor to most major and then the report final totals. The algorithm requires the acquisition of a pointer to a subtotal routine for each sort key. In a production environment, the keys often come predefined, and the clerk merely chooses the sequence from the sort keys. For example, a report about purchase orders could have keys of vendor, PO, category, manufacturer, manufacturer's part, SKU, and estimated delivery date. Not all sequences of keys make easily usable reports, but when there are seven keys, as in this example, the program can generate seven factorial orderings. That is 5,040 variations. The one report could handle the vast majority of the reporting needs. At various companies in later years, a more sophisticated version loaded sort keys and titles when the clerk chose a report from a menu. One program; several reports. Vulcan spawned dBase and others.
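The cascade itself fits in a few lines of modern code (the original was assembly; the event shape and field names below are only illustrative):

```python
def cascade_report(rows, keys):
    """Yield report events for rows pre-sorted by keys (most major first).

    When a key changes, subtotals fire from the most minor key up to the
    key that changed; end of data fires every level plus the final total.
    """
    prev, grand = None, 0
    counts = [0] * len(keys)                 # rows since last subtotal, per level
    for row in rows:
        if prev is not None:
            for level, k in enumerate(keys):
                if row[k] != prev[k]:        # most major key that changed
                    for lv in range(len(keys) - 1, level - 1, -1):
                        yield ("subtotal", keys[lv], prev[keys[lv]], counts[lv])
                        counts[lv] = 0
                    break
        yield ("row", row)
        counts = [c + 1 for c in counts]
        grand += 1
        prev = row
    if prev is not None:                     # end of data: all levels, then total
        for lv in range(len(keys) - 1, -1, -1):
            yield ("subtotal", keys[lv], prev[keys[lv]], counts[lv])
    yield ("total", grand)

rows = [{"vendor": "A", "po": 1}, {"vendor": "A", "po": 1},
        {"vendor": "A", "po": 2}, {"vendor": "B", "po": 3}]
events = list(cascade_report(rows, ["vendor", "po"]))
```

In the production version, each subtotal event would dispatch through the acquired pointer to that key's subtotal routine rather than just counting rows.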
Mobile Home Parks
A friend worked for Jim Thayer, who designed mobile home parks, golf courses, and churches. Mobile home parks might have several hundred places, which are called pads. A pad indicates the orientation of the mobile home trailer and its utility hook-ups. When groups of 60 or 100 pads are identical, that is an opportunity for computer-aided drafting. We experimented with the CalComp plotter, but we discovered a problem: the pads sat at many angles as they went around curved streets, and the plotter drew shaky diagonals. We discovered that we could photocopy a pad diagram onto clear plastic, and then cut and orient those pads onto blueprints. We also discovered that we could photocopy fire hydrants, manholes, and other repetitious items, then cut and glue them onto the composite blueprint. A final pass of the composite blueprint thru a blueprint copy machine reduced the many pieces to one uniform layer of printing. Many other architects followed similar ideas.
Payroll Year-to-Date
Payroll systems seem to follow two patterns. The poor ones calculate the current wages on a period-by-period pattern: the wages of each pay period are proportioned up to an annual amount, taxes are assessed, the tax is proportioned back down, and the results are accumulated into year-to-date totals. Thus, increased taxes within the current period seem exaggerated; also, the totals contain a round-off for every pay period. The sum of these round-offs can become irritating.
The better method continually adds current wages to previous year-to-date wages. The current total is proportioned up to annual, taxes are assessed, and the tax is proportioned back down. The amount of current taxes is the difference between the taxes on the total and the taxes already paid year-to-date. This method confines the round-off to one instance. Some systems round wages up to the nearest penny and taxes down to the nearest penny: who would complain?
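The two patterns can be contrasted in a sketch. The flat tax rate and the wage figures below are invented for illustration; real withholding tables are bracketed, but the round-off behavior is the same:

```python
PERIODS = 26            # biweekly payroll
RATE = 0.1515           # hypothetical flat tax rate

def period_method(wages):
    """Poor: annualize each period, tax, de-annualize, round, accumulate."""
    ytd_tax = 0.0
    for w in wages:
        # one independent round-off per pay period
        ytd_tax += round(w * PERIODS * RATE / PERIODS, 2)
    return round(ytd_tax, 2)

def ytd_method(wages):
    """Better: tax the running YTD; current tax = target minus tax already paid."""
    ytd_wages, ytd_tax, n = 0.0, 0.0, 0
    for w in wages:
        ytd_wages += w
        n += 1
        annual = ytd_wages * PERIODS / n        # proportion up to annual
        target = annual * RATE * n / PERIODS    # proportion the tax back down
        ytd_tax += round(target - ytd_tax, 2)   # round-off confined to one place
    return round(ytd_tax, 2)

wages = [1000.07] * PERIODS
```

Run both on the same wages and the period method drifts by the sum of 26 separate round-offs, while the YTD method never strays more than half a penny from the exact annual tax.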
Average Inventory Cost
Many inventory systems use a running average cost. There have been urban legends about this method where the round-off became excessive. The current average cost is the result of dividing the current total cost by the current total quantity. Depending on the industry and the discernment of the CFO, this works reasonably well. However, in businesses where costs and quantities swing wildly, this method can generate noticeable discrepancies. For example, start with 10 pieces on hand at fifty cents each plus an incoming count of 1000 pieces at a dollar each. The result is a decimal cost of .995049. Rounding down to .99 loses about $5, and rounding up overstates by $5. One solution is to carry several digits of precision and delay round-off until a sale occurs. A more precise solution is to keep the cost in pennies, keep the remainder in a separate field, and include the remainder in every calculation.
At the Federated Group, with inventory on hand exceeding $400 million and two or more turns yearly, we kept the remainder but wrote the discrepancies to a sub-ledger for analysis. Some years the sub-ledger hit $200,000. Merrill Lyons, VP of Finance, liked keeping the remainder; Michael Pastore, VP of Stores, shrugged his shoulders.
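The penny-plus-remainder idea is a few lines of code (the class and field names are illustrative):

```python
class AvgCostItem:
    """Carry total value in integer pennies; the remainder is never lost."""
    def __init__(self):
        self.qty = 0
        self.total_cents = 0

    def receive(self, qty, unit_cost_cents):
        self.qty += qty
        self.total_cents += qty * unit_cost_cents

    def unit_cost(self):
        """Average cost as (whole pennies, leftover pennies across all units)."""
        return divmod(self.total_cents, self.qty)

item = AvgCostItem()
item.receive(10, 50)       # 10 pieces at $0.50
item.receive(1000, 100)    # 1000 pieces at $1.00
# unit_cost() gives 99 cents each with 510 leftover pennies spread
# across 1010 units -- the $5.10 that naive rounding down would lose.
```

Because the total stays in exact integer pennies, the remainder rides along in every subsequent calculation instead of evaporating at each receipt.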
Inventory With No Part Numbers
At Pic'n'Save, the philosophy was to sell at a 1% markup. They demonstrated their definition by purchasing items for fifty cents and selling them for one dollar. Obviously, there is a joke here because that is a 100% markup, and they enjoyed it all the way to the bank. In fact, they kept most of the inventory by price point per category, which can be demonstrated as Men's Shoes at $4 and Men's Shoes at $9. No part numbers. The people procuring the inventory made great purchases, but the irregular nature of the inventory led to having no part numbers. Good purchasing and good sales hide a multitude of seemingly sloppy procedures.
Negative Inventory Costs
Usually a negative cost suggests an error. For some sub-businesses within the Pic'n'Save conglomerate, negative costs added to the profits. Some of the sub-businesses were paid by large retail chains to take merchandise off the shelves of the big stores. For example, Nordstrom's paid a subsidiary of Pic'n'Save to take stray pairs of shoes out of the store. Those pairs would arrive at a large warehouse, where Pic'n'Save employees would consolidate the pairs of shoes into larger lots, like dozens, and a second subsidiary would re-sell those pairs at wholesale prices. Sometimes Nordstrom's re-bought the shoes.
Recursive Warehouse Algorithms
Pic'n'Save had an enormous warehouse of 1.6 million square feet. Choosing locations for inventory created massive decision trees with potentially constipated pallet flow. The bin-picking algorithm combined direct manager input, product history and value, and buyer history. We ignored the recommendations of some buyers as absurdly optimistic. In addition, current activity in the warehouse affected the bin-picking program. Displays with keyboards presented current pallet instructions. A dispatcher or a driver would choose a load, and a printer immediately printed the bin label, which might re-trigger calculations about loads still in the queue.
A programmer could run simulations in order to research possible strategies. The mix of factors was extensive: pallet weight, lane directions, bin heights, available loading zones, dollar value, fragility, stacking height, number and kind of forklifts. Some shelves were 65 feet high; a semi-automated forklift did the lifting and examined the location before attempting to shove in a pallet. Not good to shove a pallet into a full bin.
The Million Dollar Day
When the Federated Group first started clearing credit cards thru Security Pacific Bank, a programmer at the bank had a file of transactions that thoroughly exercised the process. He had designed the file so that a perfect run would have an exact total of one million dollars. Even more fascinating, he had assigned unique amounts to certain transactions within each category. Some were expressed as powers of two in pennies. As a result, the dollar amount of the error would point to records that possibly were processed incorrectly. This concept was not new, but his artfulness was pleasing.
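The powers-of-two trick is easy to sketch (the function names are invented; the idea is his):

```python
def diagnostic_amounts(n):
    """Amounts in pennies for n probe transactions: 1, 2, 4, 8, ..."""
    return [1 << i for i in range(n)]

def missing_records(expected_total, actual_total, amounts):
    """Decompose the shortfall into the unique set of dropped records."""
    shortfall = expected_total - actual_total
    return [i for i, a in enumerate(amounts) if shortfall & a]

amounts = diagnostic_amounts(10)                  # 1, 2, 4, ..., 512 pennies
expected = sum(amounts)
actual = expected - amounts[3] - amounts[7]       # suppose records 3 and 7 drop
```

Because each probe amount is a distinct bit, any shortfall has exactly one decomposition, so the dollar amount of the error names the failing records.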
Volume Pricing
Several wholesalers complained that a customer would sign a contract for 100,000 parts and get a great price. After maybe 20,000 parts had shipped, the customer would cancel the order. The hassle and the legal costs of recovering the lost revenue made the situation worse. The solution was to assign prices that declined as the orders shipped. The average cost might be a dollar, and the initial cost might approach two dollars. With each order, the price on the next order went down until the price on the last order approached some silly amount like ten cents. This mirrors average costing.
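One way to realize such a schedule is a linear decline whose endpoints are chosen so the full contract averages exactly the contract price; the numbers below are illustrative, not from any actual contract:

```python
def unit_price(shipped, contract_qty, avg_price=1.00, first_price=1.90):
    """Price of the next unit after `shipped` units have gone out.

    Linear decline from first_price down to (2*avg_price - first_price),
    so a completed contract averages exactly avg_price per unit.
    """
    last_price = 2 * avg_price - first_price
    frac = shipped / (contract_qty - 1)   # 0.0 on the first unit, 1.0 on the last
    return first_price + (last_price - first_price) * frac

prices = [unit_price(i, 5) for i in range(5)]   # tiny 5-unit contract for illustration
```

A customer who cancels after 20% of the contract has already paid well above the average price, so the seller needs no lawsuit to recover the discount.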
Extensible Data Bases
Ever-changing waves of technology drive changes in attributes and policies for handling vendors and customers. Pagers came and went. Terms on contracts come and go. Accessories come and go. At NEC and at the Group, we solved the problem by artificially extending the main data base tables. For example, we created a table to hold extra information about vendors. The new table had columns for the vendorID, a tag, a sequence, a status, and the value of the new attribute. We then made a new entry in the data dictionary to describe the tag, its specs, and its data format. We then added a field to the required screens. The field held a tag, a position, and a data field from working storage. We had already added the capability for programs to handle the new fields from the table in the data buffers of the main input programs. As such, only the edits from the dictionary took effect. In essence, a non-programmer could add or remove new fields. If more specific edits were needed, then a programmer put an edit routine for the field into the program.
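The extension-table shape can be sketched in SQLite (table and column names here are illustrative, not the original schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE vendor_ext (
    vendor_id  INTEGER,
    tag        TEXT,      -- which extra attribute this row carries
    seq        INTEGER,   -- ordering when a tag repeats
    status     TEXT,
    value      TEXT
);
CREATE TABLE data_dictionary (
    tag    TEXT PRIMARY KEY,
    descr  TEXT,
    fmt    TEXT           -- edit/format spec applied by the screen handler
);
""")
# a non-programmer adds a "field" by adding a dictionary row, not a column
con.execute("INSERT INTO data_dictionary VALUES ('PAGER', 'Pager number', '999-9999')")
con.execute("INSERT INTO vendor_ext VALUES (42, 'PAGER', 1, 'A', '555-1212')")
rows = con.execute(
    "SELECT value FROM vendor_ext WHERE vendor_id=? AND tag=?", (42, "PAGER")
).fetchall()
```

When pagers go away, their rows and their dictionary entry go with them; the vendor table itself never changes shape.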
Processing by the Penny
At Sierra Pacific Investments, which had more than a dozen partnerships, pennies of round-off caused many problems. For example, two investors, like husband and wife, would put equal amounts of $10,000 into the same partnership on the same day. In theory, both should receive the same quarterly returns. The first programmer, however, did not handle round-off well, and with maybe a dozen income lines on a quarterly disbursement, the returns might differ by a nickel. This irritated some investors to the point of telephoning. Each telephone call had a cost and represented some dissatisfaction.
We solved the problem at its root. The funds each had eight thousand to forty-four thousand partners; total partners came in at nearly 200 thousand. Total investments per partner averaged nearly $20 thousand. The Chief Accountant, Bob Quaid, was persuaded to allocate distributions as pennies per partner rather than dollars per fund. If a fund had 22,034 partners, allocations were made in increments of $220.34. One penny per partner. Obviously business expenses do not come in such amounts, so any excess per category was retained until the next disbursement. No more round-off and no more telephone calls on that topic.
A side effect was that the MS-Excel spreadsheets became exact. Quaid did not have to deal with round off while making his calculations in preparation for quarterly reports.
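A simplified sketch of the allocation, assuming equal partners (function and figures are illustrative):

```python
def allocate(amount_cents, partners, carry_cents=0):
    """Distribute whole pennies per partner; retain the excess for next time."""
    pool = amount_cents + carry_cents
    per_partner = pool // partners            # identical pennies for every partner
    return per_partner, pool - per_partner * partners

# A $50,000.00 distribution across 22,034 partners:
per, carry = allocate(5_000_000, 22_034)
# per = 226 pennies each ($2.26, identical for equal holdings);
# carry = 20_316 pennies retained until the next disbursement.
```

Because only whole multiples of one-penny-per-partner ever leave the pool, equal investors receive identical amounts by construction, and nothing is lost: the carry rejoins the next quarter's pool.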
K1's Saved Millions
The big savings came in the processing of the annual K1 tax forms. The accounting firm charged a fee like $35,000 per fund per year to bless the accuracy of the accounting. Then the firm charged about $3000 per fund as a data preparation fee to set up the K1 form and the necessary software on a big IBM machine. Finally, the firm charged between $7 and $8 per partner to print each K1, plus postage to mail it. The elapsed time was five weeks, which put pressure on accounting to make changes so that management could review the typical K1 and the summary.
I told Quaid about a printing service near his house in Yorba Linda. On our little HP3000, we prepared a sample tape for the smallest fund. Quaid delivered the tape on his way home from work, and he picked up the K1's in the morning. Ecstasy. There was no $3000 set-up cost, and each partner K1 was less than $1, or at least $7 less expensive than the accounting firm. The quick turnaround gave management many more weeks to review and re-run, when needed. The savings exceeded $1.4 million each year. The downside was having to stuff pre-stamped envelopes, so employees and their kids got bonus money.
Base 3600 in College Credits
The Babylonians used base 60, under which many fractions convert exactly to parts per 3600. Transfer credits from semester colleges, quarter systems, and military institutes pose a similar problem. Usually I see programmers try to solve the problem by going out to many decimal places. They still must contend with the situation where one third becomes 33 hundredths, so three thirds becomes 99 hundredths, not the expected 100. Base 3600 handles the decimals from military institutes and the thirds from quarter systems. (4 quarter units map to 3 semester units.) The round-off going into the common denominator is eliminated. The organization keeps the credits as 3600ths and must then decide how to express the sums. When using base 3600, one third appears as .33 in decimals, and the sum of three thirds appears as 1.00. Discrepancies still exist, but less frequently.
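A sketch of the conversion, with the semester unit as the base (function names are illustrative; the 4-quarter-to-3-semester factor is from the text):

```python
SEM_UNIT = 3600                        # one semester unit = 3600 parts

def from_quarter(units):
    """Quarter units: 4 quarter = 3 semester, so 1 quarter unit = 2700 parts."""
    return units * SEM_UNIT * 3 // 4

def from_thirds(thirds):
    """Credits expressed in thirds of a unit: 1/3 unit = 1200 parts, exactly."""
    return thirds * SEM_UNIT // 3

total = from_thirds(1) * 3             # three one-third credits
# total == 3600: exactly 1.00 semester unit, not the 0.99 that
# two-decimal arithmetic produces.
```

Both divisions are exact because 3600 is divisible by 3, 4, and the usual decimal fractions, which is precisely why the base was chosen.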
Battleship Billing
Under the old system, a business had to keep detailed records on each emitting device about hours of operation, temperatures, fuel usage, repairs, and many more tiny details. With as many as forty pages of details per emitting device, the business would come to the AQMD hearings and present its case on a device-by-device basis. The result was hundreds of hours of haggling over a fee of about $12.00 for a simple fryer at a restaurant. Maybe a restaurant with a strong case could reduce the fee to $11.00. The waste of time was not worth the savings in money. The solution got the nickname of Battleship Billing because the fees were arranged in a rectangular array like the game called Battleship. The lowest fee was in A1, which was column A and row 1. The highest fees were in column H. Each square contained many different devices with the same pre-set charge. With the battleship method, restaurants and other businesses could choose to buy equipment and know the fees in advance. According to Mr Greenberg, General Counsel for the AQMD, the new method saved his officers tens of thousands of hours per year. Hearings and inspections became easier. Only important cases and general cases went to hearings.
For the data processing department, the change in billing was like night and day. Before, the billing program had a zillion sub-programs that tried to calculate emissions and thus fees. Each device had fuel inputs, equations for physical processes, and variations in output format. With thousands of devices, maintaining the complex program was a nightmare; the programmers could not keep up with the changes in the real world. After Battleship Billing, each permit basically required a table lookup. When the device on a permit failed to reference a fee in the table, the program postponed the permit for about two weeks and wrote to an error log. An engineer would read the log and make corrections to the table. The size of the billing program shrank from tens of thousands of lines to less than a thousand. Maintenance time became hours per year, whereas before the program required multiple full-time programmers.
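The after-picture really is this small; the grid values and device assignments below are invented for illustration:

```python
FEE_TABLE = {                       # (column, row) -> pre-set annual fee
    ("A", 1): 12.00,                # column A, row 1: simplest devices
    ("A", 2): 45.00,
    ("H", 8): 30000.00,             # column H: highest-fee devices
}
DEVICE_CELL = {                     # each device type maps to one grid square
    "restaurant fryer": ("A", 1),
    "small paint booth": ("A", 2),
}

def fee_for(device):
    cell = DEVICE_CELL.get(device)
    if cell is None:
        # unknown device: postpone the permit and log it for an engineer
        raise LookupError(f"{device}: no grid cell assigned")
    return FEE_TABLE[cell]
```

All the old physics equations collapse into the table's contents, which engineers, not programmers, maintain.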
Calculating vs Sorting vs Reporting
The AQMD had tens of thousands of permits and their related devices. Daily, the statisticians wanted to run listings and comparisons and trend reports. In the early 1980's, the programming department adopted Quiz and Quick from Cognos as the tools of choice. Programmers wrote many similar Quiz scripts each week. Some scripts became choices on the menu. For each new script, a programmer had the task of spending four hours to estimate the system impact; that task was absurd because the typical script ran only ten minutes. The slowest script ran about an hour to process and ten hours to print.
Using printers in the basement, operators printed the output on paper and delivered the paper to the requesting department during the day. Often the computer ran scripts through the night to catch up with the demand.
A new programmer analyzed the processing, and he added several pre-sorted indices. Many scripts could then run in one or two minutes. With that success, the programmer analyzed more trends. Sorting, merging, and subtotals for percentages slowed many scripts; calculations took minutes per script; the job listings showed the phases of the Quiz processing. The idea for pre-processed data structures materialized, but Quiz could not handle the structures.
With the help of Hupert Wilson, the programmer designed a COBOL program that could read the new structures. The programmer put the program on Hupert's menu. The new program could accept input from the terminal and display results in seconds. Hupert wanted both screen display and hard copy options. Hupert wanted inputs from saved files, and he wanted to keep output files indefinitely. Within weeks he wanted comma-separated files for Excel. Hupert wanted more of the pollutants besides the common five: NOX, SOX, CO, CO2, and particulate. After six more weeks of testing and enhancements, the programmer agreed that Hupert could share the program with people outside of Hupert's department, which was Air Chemistry.
Meanwhile, the manager and operators in the basement noticed a severe decline in batch jobs and paper output. Also, the manager saw that the CPU was often idle during part of each minute. The manager of programming saw a distinct drop in requests for new Quiz scripts, which had required twelve to thirty hours each to implement; the drop lowered the need for programmers. A ruckus ensued, and Mr Sweet, the Director of DP, investigated and declared that everyone should embrace the future: end of bickering.
The only major drawback was the need to re-calculate and re-load the data structures each evening or at noon. The process took about 25 minutes. The re-load date-stamp became a reportable item, which pleased the users for its completeness. "All active permits as of 03:20 am 06 JAN 1990."
The Delta Table
The Filemaker data base at APU could not handle the large daily replacement of nearly a quarter million records. It was a shame that so much data was being copied to such a weak data base, and the process took many hours. While the Filemaker data base existed, it was a liability for backups and security; it required too much time from a programmer to load and reload regularly. Then a clerk named Linda stated the obvious, "Please give us just the changes."
The programmer had been unloading the entire large file from the main data base and then loading that same file into Filemaker. Instead, a new second database contained a copy of the main database as of the previous process date. A new program then used the date of modification on the records in the main database to create a file of adds, changes, and non-changes; that process took 25 minutes. To find deletes, another program compared the second database against the extracted file, which took another 25 minutes. In less than an hour, the programmer had the file of changes. Lastly, the programmer copied the current database to the copy database. Filemaker was happy until it was replaced about a year later.
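Logically the delta is a three-way diff of two snapshots; the production version used modification dates to find adds and changes cheaply, but the result it produced is the same as this sketch (names illustrative):

```python
def delta(previous, current):
    """Diff two snapshots keyed by record id: (adds, changes, deletes)."""
    adds    = sorted(k for k in current if k not in previous)
    deletes = sorted(k for k in previous if k not in current)
    changes = sorted(k for k in current
                     if k in previous and current[k] != previous[k])
    return adds, changes, deletes

prev = {1: "Alice", 2: "Bob", 3: "Carol"}   # yesterday's copy
curr = {1: "Alice", 2: "Bobby", 4: "Dave"}  # today's extract
# delta(prev, curr): record 4 added, record 2 changed, record 3 deleted
```

Shipping only the delta is what turned a many-hour full reload into something Filemaker could digest.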
Authoritative Reports
The University had more than thirty departments and almost as many reports of students per department. Imprecise methods led to a greatly overstated enrollment and awkward debates about funding per student. The IT department and the Provost got together and agreed that, during the term, a person was a student if and only if the person was enrolled in a class. Secondly, the Provost wanted to count students per department as the number of students whose primary major was sponsored by the department. Immediately, the enrollment matched the sum of majors per department. A second column showed second majors or minors per department. A third column showed the total unduplicated head count from classes per department, and a fourth column showed the duplicate count. A duplicate count occurs when a student takes more than one class from the same department. The fairness and the precision of the "Majors by Department" report soon forced the other reports into the shadows.
User Preferences
The debates about colors, fonts, and minor features on screens often waste much time. Allowing the user to set personal preferences eliminates much of the squabble. Many times, I have observed that power users dim or remove labels. Later, they might restore the labels to understand some anomaly, and then they again dim all the superfluous items on the screen. The power user does not want the visual clutter. A good screen handler gives the power user the ability to remove labels and field numbers, change initial values, allow multiple input formats, change the sequence of fields, and control every command from the keyboard. Using the mouse might help occasional users, but the mouse should be optional for power users.
Screen Navigation Automatically
Allowing the clerk to set initial values and the order of navigation greatly increases the usefulness of most input screens. For example, at a college, the admissions office receives about fifty inquiries per week from a magazine. The clerk sets the initial values for the kind of advertising and the name of the advertisement, which is the magazine. The clerk then puts the screen into add mode and chooses the order of the fields to visit. Finally, the clerk turns on automatic update at the end of the fields. The result is that for the cost of initializing the fields, the clerk saves many keystrokes and much time.
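The mechanics are simple enough to sketch; the field names and preset values are invented for illustration:

```python
PRESETS = {"ad_kind": "magazine", "ad_name": "Collegiate Monthly"}    # clerk's defaults
VISIT_ORDER = ["last_name", "first_name", "street", "city", "phone"]  # clerk's order

def add_inquiry(read_field, save_record):
    """One add-mode pass: presets cost nothing, fields visit in the clerk's order."""
    record = dict(PRESETS)            # initial values set once, reused every record
    for field in VISIT_ORDER:         # navigation follows the clerk's chosen order
        record[field] = read_field(field)
    save_record(record)               # automatic update at the end of the fields
    return record

saved = []
record = add_inquiry(lambda f: f.upper(), saved.append)  # stand-in for terminal input
```

Every record from that magazine now costs only the fields that actually vary.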
No Garbage Data
The proliferation of data threatens to overwhelm many businesses. Concentrating on the important still remains essential. One method to eliminate bad data from a system is to reject bad data at the point of entry. This should seem obvious, but people often complain about their own bad data. A more sophisticated approach evaluates the quality of each incoming record. What value does the information convey? Are the name and address good while the remainder is bad? Consider having a field to indicate the quality of the record, or consider having a second repository.
One Page Transfer Documents
Several studies discovered that warehouse personnel handled short transfer documents better than long transfer documents. The pickers and shippers were more accurate and faster per page with shorter transfer documents. Likewise, the receiving agents did better with short documents. In both cases, the clerks lost time when reviewing multiple pages. The solution was to publish all transfers as one page documents. The clerks were more efficient and precise with many one page documents than with a large document having the same number of lines.
Enumerated Signatures
The loan agency had to go to court frequently and argue that the paperwork had been signed on the date written on the document. For various reasons, the plaintiffs often alleged that the loan agency had back-dated the signature. After being caught cheating, the loan agency was having a difficult time in court. As much as the defense wanted to argue each case separately and privately, news of the cheating was part of the court proceedings. A notary came up with a simple solution. The policy became that each agent numbered each signature along with writing the date. This solved the signature-date problem. The problem of back-dating and altering still exists in many other forms. Is the paperwork complete and unchanged? See Check Digits.
Check Digits Revived
Forty years ago, during data entry, trips to the data base to look up part numbers were very expensive and time-consuming. On some slow systems, a single look-up could take a big fraction of a second. Meanwhile, CPU time was relatively cheap. So designers assigned part numbers according to numeric formulas. When a clerk entered a part number, the program would verify that the part number fit the formula; if not, the program sent an error to the clerk. Even if the part number passed, there was no guarantee that the look-up would succeed, but at least the format was valid.
Databases now are wonderfully fast, but check digits still exist. The typical credit card purchase requires the kind of card, like MasterCharge, because the kind of card determines a verifying formula. Likewise, many document systems now calculate a checksum when the document is saved. An unauthorized change will cause the re-calculation of the checksum to fail on the next opening of the document by the system. The checksum might be long, like ten or sixteen digits. The likelihood of an unauthorized change generating the same checksum can be designed to be quite low.
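The best-known verifying formula for card numbers is the Luhn check; a standard implementation:

```python
def luhn_ok(number: str) -> bool:
    """True when the digits satisfy the Luhn check formula."""
    digits = [int(c) for c in number][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:            # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9            # same as summing the two digits of the product
        total += d
    return total % 10 == 0

# luhn_ok("79927398713") -> True (the classic Luhn test number)
```

The formula catches every single-digit typo and most adjacent transpositions before any database is consulted, which is exactly the role the old part-number formulas played.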
Date or Time Interval Arithmetic
A good friend was trying to determine if the classes taken by a student happened to have overlapping class hours. The vendor of our software had a very complicated bit-mapping technique that worked to the nearest 12 minutes. We showed him a simple calculation to compare two time intervals. Basically, express begin and end times in military-extended time and sort the intervals by beginning time. Then compare rows where the interval of a record contains the begin time of any subsequent records.
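The calculation fits in a few lines (function name and sample times are illustrative):

```python
def find_overlaps(intervals):
    """intervals: (begin, end) pairs in military-extended time, e.g. (1330, 1445).

    Sort by begin time; an overlap exists exactly when a later interval
    begins before the latest end seen so far.
    """
    ordered = sorted(intervals)
    hits = []
    open_iv = None                     # interval with the latest end so far
    for iv in ordered:
        if open_iv is not None and iv[0] < open_iv[1]:
            hits.append((open_iv, iv))
        if open_iv is None or iv[1] > open_iv[1]:
            open_iv = iv
    return hits

classes = [(900, 950), (930, 1045), (1100, 1215)]
# find_overlaps(classes) flags the 9:00-9:50 and 9:30-10:45 conflict
```

One sort and one linear pass, exact to the minute, with no bit maps and no 12-minute granularity.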