Front-Load Effort Like a 1980s Video Game Developer

Video game developers in the 1980s faced a significant challenge. They wanted to build games with rich graphics, immersive sound, and realistic movement. Doing this for even a simple platform game such as the classic Super Mario Brothers requires many calculations for every frame drawn, yet they were working with computers less powerful than the chips that run today's toasters.
Constrained by their hardware, developers invented techniques to front-load effort so that the computer didn’t need to do as much work for each frame. Today we can look for opportunities to use similar techniques when designing processes and records: to reduce effort, lower cognitive load, and eliminate sources of error.


Remember Super Mario Brothers?

Take the example of Mario jumping off a platform. With each frame you need to calculate: the character’s movement, including some simple physics for the rise and descent; the changing graphics for the figure; whether there’s been a collision with a Goomba or a flagpole; and the movement of the other characters and the background. All while playing music and keeping track of which buttons have been pressed.
That’s a lot of calculations that have to complete on time, every frame. And you need to repeat them at least 20 times per second! And remember, we’re talking about Super Mario here, not some immersive 3D game that also has light sources, moving shadows, and speaking characters.

It's Basically a Souped-Up Toaster

Now, you might think that this is exactly what computers are good at - doing calculations - and you’re correct. However, back in the day game developers were working with computers that were terribly constrained in both memory and processing power, and under pressure to constantly push the boundaries of what was possible with those souped-up toasters.
While computers are pretty good at doing calculations, they are even better at looking up information at a fixed place in memory. So developers found ways to front-load the effort that the computer needed to do on-the-fly so that it did not need to work so hard with every frame.

When Things are Looking Down, Look Up

Developers used various strategies for this, including two we’re going to talk about today: look-up tables and memoization.
A Look-Up Table is exactly what it sounds like: a table of pre-calculated values where you just look up the one you want. The key here is that you don't make a table of every possible result, just the ones that are necessary for the context. Way better than solving a differential equation, don’t you think?
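A classic game-development instance of this is pre-computed trigonometry. As a minimal sketch (in Python rather than the assembly of the era), a sine table built once at design time replaces per-frame trig:

```python
import math

# Pre-compute sine values for whole-degree angles at design time,
# just as 1980s game engines did, so no trigonometry runs per frame.
SINE_TABLE = [math.sin(math.radians(deg)) for deg in range(360)]

def fast_sin(degrees: int) -> float:
    """Look up a pre-calculated sine instead of computing it on the fly."""
    return SINE_TABLE[degrees % 360]
```

The table is deliberately constrained to whole degrees: 360 stored values cover every case the game will ever ask for, which is exactly the trade-off described above.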
Memoization is a related concept, more like combining mathematical refactoring with look-up tables. Here, you identify and combine portions of procedures and calculations so that they are performed more efficiently, storing the results for later reuse. For example, if you have to do a set of calculations many times, but a portion of them is the same for every iteration, you can do that portion just once and store the result for the remaining iterations.
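In code, memoization usually looks like caching a function's results so repeated calls with the same input do the work only once. A minimal Python sketch, with invented part names and costs for illustration:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def unit_cost(part: str) -> float:
    # Stand-in for an expensive calculation or lookup; with memoization
    # it runs once per distinct input and the cached result is reused.
    return {"bolt": 0.10, "panel": 4.50}[part]

# 1000 calls for "bolt", but the underlying work happens only once.
total = sum(unit_cost("bolt") for _ in range(1000)) + unit_cost("panel")
print(unit_cost.cache_info().misses)  # 2 distinct inputs -> 2 cache misses
```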
Both of these strategies front-load effort to eliminate repetition of work - where you can you do the work once, at design time, so the values are explicitly written into look-up tables that can be held in the computer's memory and accessed quickly. That only works when you can reasonably constrain the number of results required - for example "I'm only ever going to calculate this for these 100 cases".
When things are more variable you can still front-load effort by identifying parts of the calculation that are repetitive, then finding ways to compartmentalize and store those parts for reuse. You're basically finding ways to reuse part of the work, creating useful references on the fly that help you later in the task.

Back to the Real World

So what do Look-Up Tables and Memoization in video game development have to do with us?
Well, it turns out that like computers, people are better at looking things up than we are at doing calculations on the fly: humans have adapted to recognize patterns quickly. We're also prone to errors when we have to do too much at once, and have somewhat leaky memories.
So while we don't usually expect operators to do hundreds of complex calculations in short periods of time, we do expect them to perform complex tasks consistently, efficiently, and while holding a lot of context in their heads. Instead of pushing them to operate at the boundaries of their toasters... er, brains, we can significantly increase quality by reducing effort and cognitive load, and removing some sources of error entirely.
Here are a few ways we can apply these two strategies across our quality systems.

Look-Up Tables on a Form or SOP

Look for ways to incorporate pre-defined values as much as possible into forms and SOPs:
You could provide a table of useful pre-calculated results, example calculations or an expected range. This provides the operators with an order-of-magnitude check.
Instead of (or in addition to) percentages or ratios, give concrete numbers in the units and format in which they will be measured: so “between 4.1 cm and 5.5 cm” instead of “4.8 cm ± 15%”.
Where possible, design data input areas on forms or in software to limit data entry to only valid options. For example, use yes and no check boxes, restricted date entries that only allow the correct format, or lists of relevant values.
If you’re using reference tables, provide a link to the entire reference, but also try to summarize the relevant part(s) in the SOP or form. For example, don’t just reference the ANSI sampling tables - you will probably only use a handful of those values. Take the time to figure out what the appropriate sample sizes and AQLs are for your tolerance levels and the usual batch size ranges of your products, and construct a summary table for those ranges.
Look-up tables and other reference data improve repeatability and reduce the cognitive load on operators, which in turn reduces the opportunities for calculation errors. They can also improve the quality of your records, and might even reveal ways to simplify documentation requirements. Just make sure you're not restricting operators so much that they need to resort to comments to document unforeseen or special cases!
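As a sketch of what such a summary table can look like when built into software, here is a Python version of a pre-tabulated sampling lookup. The batch-size bands and sample sizes below are placeholders invented for illustration, not values from any real standard; take yours from your own sampling plan.

```python
import bisect

# PLACEHOLDER bands -- substitute values from your own sampling standard.
BAND_UPPER_LIMITS = [500, 5000, 50000]  # largest batch size in each band
SAMPLE_SIZES = [20, 50, 125]            # pre-tabulated sample size per band

def sample_size(batch_size: int) -> int:
    """Look up the pre-tabulated sample size for a batch -- no calculation."""
    band = bisect.bisect_left(BAND_UPPER_LIMITS, batch_size)
    if band >= len(SAMPLE_SIZES):
        raise ValueError("batch size outside the pre-tabulated ranges")
    return SAMPLE_SIZES[band]
```

Note that the table only covers the batch-size ranges you actually run, and anything outside them fails loudly instead of silently guessing.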

Write Specifications Where You Can

Specifications also act like a look-up table. We're not talking about full design specs where we specify that "this button is 37 mm wide and red", just the minimum, known constraints such as regulatory and process requirements that aren't going to change anytime soon.
When you take the time to write down minimum requirements for a process, equipment, software or spreadsheet, you are providing a set of data that consistently informs design, testing and review throughout the lifecycle of that item. Changes may be validated and justified against them. New revisions can be easily checked against them. Forms and QC/testing programs can be built against them.
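A hypothetical sketch of a written specification acting as a look-up that designs, changes, and QC results are all checked against. The parameter names and limits here are invented for illustration:

```python
# Invented example limits -- a real spec would come from your own
# regulatory and process requirements.
SPEC = {
    "fill_volume_ml": (98.0, 102.0),
    "cap_torque_Nm": (0.9, 1.4),
}

def meets_spec(parameter: str, measured: float) -> bool:
    """Check a measured value against the written specification."""
    low, high = SPEC[parameter]
    return low <= measured <= high
```

Because the limits live in one place, a change justification, a new revision, or a QC result can each be checked against the same stored values, which is what makes the spec consistently inform design, testing, and review.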

Memoization, not Memorization

How about front-loading the work that's part of a procedure? Are there ways that we can emulate memoization too?
In its simplest form, you can provide guides for operators to record data that will be referenced later on. For example, avoid making the operator do calculations in their head, and break calculations into simple steps with key data explicitly recorded.
Ensure that data is recorded as displayed on the instrument. Any rounding, conversion, or other calculations (work) should be performed at the last possible point before it’s needed, keeping the record of the original observation intact.
Similarly, provide spaces for any intermediate values so that calculations are as simple as possible and all pertinent steps to the final value are visible to the operator and any verifiers.
If there are multiple, intermediate measurements (for example, tare, initial and final weights when combining material) then provide spaces for the operator to record all of them. Don’t make them use marginalia, scraps of paper, or their memories. Even tare functions, though convenient, can be sources of errors when skipped before a container is filled.
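The same idea in a small sketch: record each observation as read from the instrument, and leave the arithmetic to the last step so every intermediate value stays visible. The field names are invented for illustration:

```python
def net_weight_record(tare_g: float, gross_g: float) -> dict:
    """Keep the raw observations intact; calculate the net weight last."""
    return {
        "tare_g": tare_g,                     # as read from the balance
        "gross_g": gross_g,                   # as read from the balance
        "net_g": round(gross_g - tare_g, 2),  # derived, traceable to the above
    }

entry = net_weight_record(tare_g=12.31, gross_g=112.46)
```

A verifier can recompute the net weight from the recorded tare and gross values, which is exactly what a scrap of paper or a balance's tare button doesn't allow.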

Ordering and Reordering

A more subtle form of memoization is to identify work that can be moved to more convenient or efficient places in the workflow:
Move tasks out of the critical path of a procedure, freeing up the operator to focus on the important parts. You probably already do this when organising for a study day or production batch, for example by signing out and gathering the relevant forms, pre-labelling containers, batching container tare weights. Formalising these kinds of preparatory tasks in a procedure will allow you to standardize and find efficiencies as well.
You might also want to bring in some work to the critical path, for example by scheduling record and form review before the end of a shift or batch. This way any problems can be found and corrected before the batch is completed or data goes into a report. Similarly, you can bring QC personnel onto the production or laboratory floor at key times so that you can inline QC sampling.
Often Quality Assurance tasks can be in-lined as well. This is already built into GLP insofar as in-study inspections during critical phases are required. In a production scenario you could, for example, have QA present for key parts of production and get portions of the batch review and reconciliation completed early.

Today we looked at two concepts that are useful when designing quality processes, forms, and automation tools: Look-Up Tables and Memoization. These are ways of front-loading effort, storing information within our tools so that complex work performed during a task becomes a simple read-and-understand.
By looking for ways to front-load effort like this, we can reduce effort and cognitive load, and remove some sources of error entirely. You'll have fewer deviations, happier operators and better quality overall.
Until next time, thanks for reading!
– Brendan

© 2022 Brendan Hyland. All rights reserved.