In the early 1950s, specimens were delivered, usually by the house officers who had collected them, but occasionally by students or nurses. This had some advantages: one could assess the degree of urgency, and sometimes the houseman would wait for the result. For a time, it was possible to deal on the spot with such problems as illegible requests, lack of clinical details and unsuitable specimens but, as the workload increased, this became impractical. However, the biggest disadvantage was the unpredictable time of arrival of the main workload. Late working by staff of all grades was unpaid and was common. Out-of-hours analytical work was done by the university teaching staff, who worked a one-week-in-four unpaid roster which included the weekends. One of these staff members was not on the telephone. When he was on call, the hospital phoned the Police Station nearest his home and they sent round a "bobby" to tell him. An out-of-hours service staffed by the technical staff was instituted around 1965. (ref: 10, 39)
Within the laboratory, specimens were centrifuged and recentrifuged at a succession of work places. Each technician pipetted what he required directly from the specimen tube using a bulb pipette. At that time, large specimen volumes were required for most assays. For example, it took all the serum from a 10mL specimen of blood for a single Clark and Collip measurement of calcium, and the procedure took so long that emergency results could not be provided. (ref: 10)
Initially, centrifuges were hand-operated, with two 15mL conical tubes which were swung round inside a wire mesh cage. Sometimes these were driven by water turbine devices, which were liable to cause floods, or they were massive belt-driven machines bolted to the floor. Small electrically driven bench-top centrifuges, made in America by the International Equipment Company, became available in the early 1950s. They were reliable, although subject to vibration troubles, and had a brown crackle enamel finish which was very difficult to clean. International also made a floor-standing model with a capacity of eight 50mL or 100mL tubes; these had a large rheostat as a speed control which got very hot during running. The first refrigerated version had an ammonia compressor, and its expansion coils were inside the steel bowl where they could have been fractured had a bucket come off during a run. Not long after this, MSE began producing their Minor, Medium and Major range and the importation of American centrifuges largely stopped. (ref: 10)
Tom Allison was the Chief Technician in the 1950s when Sam Frazer joined the department, and Frazer described him as "a pillar of strength, with a willing and cheerful acceptance of whatever burdens the day might hold in store". Allison kept a loose-leaf "cook book" of analytical methods for a wide range of tests, which he would never let out of his possession except to lend a single sheet for an hour or two. This book proved so useful that, on one occasion when Allison was on sick leave, Frazer took a microfilm copy of it (this was before the days of photocopiers). Over the next two decades, he referred to this copy (using a ×25 magnifying glass) whenever a rarely requested assay was required. Allison was succeeded by John Proffitt in the 1960s. (ref: 10, 32)
In the early 1950s the original Hagedorn & Jensen method for blood sugar, the manometric Van Slyke procedures for carbon dioxide combining power (performed after equilibrating the blood with expired air in a milk bottle) and for urea (measuring the nitrogen liberated by alkaline hypobromite), and the Volhard method for chloride were still in use. Albumin and globulin measurements were made by salt fractionation followed by Kjeldahl digestion, steam distillation and back titration. (Some departments had introduced the biuret method for total protein but this was considered by others as part of an unacceptable philosophy of "good enough for clinical purposes".) (ref: 10, 39, 101)

Serum sodium was measured gravimetrically by the zinc-uranyl acetate method and potassium by the cobaltinitrite method. These gravimetric methods took all day, by the time the precipitate had been dried, but in an emergency they could be speeded up to take about 4 hours. In 1950 CP Stewart obtained one of the first flame photometers in this country. It was an internal standard instrument of American origin with a large glass spray chamber which required relatively large volumes of specimen. It was replaced by a much simpler EEL flame photometer. The staff were actively involved in introducing this method, which gave relatively reliable and rapid results for sodium and potassium. As this became an increasingly important part of the laboratory's work, there were many discussions with clinicians as they became familiar with electrolyte disturbances. For example, until it was fully appreciated that serum electrolytes did not necessarily parallel the whole-body state, the sometimes dramatic fall in serum potassium which accompanied the treatment of diabetic ketoacidosis was taken as evidence that the potassium values were unreliable rather than as an indication for urgent potassium administration. (ref: 10, 39, 101)
The early pH meters were mostly vulcanite and mahogany instruments which were reasonably satisfactory for checking the pH of buffers but of little use for precise work. The Cambridge Instruments Company had a "serum electrode", a glass electrode in the shape of a small cup into which the serum specimen was placed, in contact with air, and the salt bridge was lowered into the specimen. Marconi had a simpler device, but neither had temperature control nor anaerobic measurement facilities, and the practical limit of the readings was around 0.2 units (15 to 20 nmol/L hydrogen ion in the physiological range). EIL and Pye produced more reliable and stable instruments, which allowed reliable measurements to be made on small specimens, in the late 1950s. The high input impedance of these instruments created the interesting phenomenon that they were generally more stable when used by men than when used by women. This was due to the static charges generated by nylon, which was more commonly used in clothing for women than for men. At that time (ca 1958), the Radiometer Company in Denmark produced an instrument which coupled Astrup's concept of "standard bicarbonate" with his elegant equilibration technique for determination of pCO2, and this achieved almost universal adoption by clinical biochemistry departments. (ref: 10)
In Edinburgh, a team of four consultants (two physicians, one surgeon and a biochemist) reviewed each case where dialysis was proposed. Initially there was only one machine to serve a population of a million and, therefore, rigorous criteria were applied before dialysis was undertaken. Repeated pre-dialysis assessment was performed. Dialysis fluids were made up separately for each patient according to what was thought to be required. The composition of each batch of dialysis fluid was checked for sodium, potassium, chloride, hydrogen ion and glucose concentrations and adjusted if necessary. This had to be done before dialysis could begin and again in mid-session when the fluid was replaced. It was essential because occasionally the chemicals supplied did not correspond to those listed on the label of the bottle. Hourly serum sodium, potassium, carbon dioxide combining power, urea and glucose concentrations were assayed while dialysis was in progress. As there was no mechanised analytical equipment available, other than a centrifuge, this was a formidable task over the eight to ten hour period, especially if the patient's clinical condition led to specimens being taken more frequently than hourly. (ref: 10) Once the technology became available to allow chronic renal dialysis to be performed, it was realised that staff working in these units were at particular risk of infection with the hepatitis B virus ("serum hepatitis"). In 1969 a patient who was being maintained by chronic renal dialysis in the Edinburgh Royal Infirmary Unit developed hepatitis. By the following year, several other patients and members of the hospital staff had become infected. There were four deaths among the hospital staff (two doctors, a laboratory technician and a laboratory clerk/receptionist). This prompted an urgent and careful review of the standards of practice in departments working with specimens from potentially infected patients.
(IW Percy-Robb, J Proffitt & LG Whitby, J Clin Path 1970; 23: 751).
The growth of Clinical Biochemistry as a science during the working life of CP Stewart can be illustrated by comparing the prefaces of successive editions of his textbook, co-authored with Professor Derrick Dunlop, "Clinical Chemistry in Practical Medicine". The preface of the first edition, published in 1936, contained "an apology for the place of clinical chemistry in patient care"; the preface of the sixth edition, published in 1962, cautioned against excessive reliance on biochemistry, to the detriment of careful history taking and bedside examination. (ref: 1)
The record-keeping system which had developed over the years in the department involved the maintaining of day books and bench notebooks. Individual alphabetically filed cards carried a limited amount of hand-written information about individual patients. The report to the ward was a typewritten chit and, for this reason, with the increasing workload the reporting room became the rate-limiting step in the production of reports. A combined no-carbon-required request and report form was introduced, which saved much clerical time. A system of cumulative reports was introduced in 1964/65. In 1966 the department embarked on an eighteen-month trial of off-line data processing for 5 AutoAnalyzer channels using an Elliott Automation 803 computer at the Scottish Medical Automation Centre. In 1968 an Elliott 903 was installed in the department and, by 1976 when the assays were transferred to the SMAC, this system processed the data from up to 18 channels of AutoAnalyzer seven days a week. In 1978 the 903 was replaced by an ICL 2903. (ref: 10, 17, 80)
Sam C Frazer (1956 to 1962) introduced a quality control scheme into the laboratory in the late 1950s. All the methods in the laboratory at that time were manual and, therefore, any such scheme could only add to the work of the already overburdened staff. It would also involve an element of scrutiny, if not criticism, of the work of individual analysts. (Now, when quality control schemes are commonplace and are concerned with the functioning of analysers and the reliability of reagent kits rather than with the individual skill or conscientiousness of the technical staff, it is easy to forget the anxiety, and indeed the resentment, which was evoked by the introduction of test specimens.) The real problem in a manual laboratory was that rogue results were sporadic, not systematic, and it took a great deal of effort to achieve very little improvement. (ref: 10, 39, 80)
David G Campbell from Wellington, New Zealand was appointed in 1964 and he took responsibility for quality control (Frazer having moved to Aberdeen). Campbell left in 1965 to become Director of Biochemistry at the Royal Women's Hospital, Melbourne and he later moved to the Royal Melbourne Hospital. (ref: 10, 39, 80)
Specimen collection techniques were far from satisfactory. Syringes were metal-ended, with metal plungers. They were supposed to be sterilised at the ward by boiling in a soda solution followed by a rinse in sterile water, and then they were dried on the top of the ward steriliser. Quite often syringes were used wet, sometimes straight from the steriliser without having had the soda solution rinsed off. Despite the foregoing, at least two thirds of samples reached the laboratory in a usable condition. The central supply of sterilised all-glass syringes was a significant improvement, although even this was not without problems. On at least one occasion, erratically high potassium results were traced to the use of soft soap followed by inadequate rinsing prior to sterilisation. (ref: 10)