I'm an information technologist. There, I said it, and I'm glad. It's such a relief to get that off my chest.
I have worked in IT since the 1980s, which makes me a very old guy in that space. Back then, there were not nearly as many of us as there are now. If we define information technologists by the use and enhancement of IT tools, there are now many millions of us worldwide. The spread of IT skills, knowledge, and abilities across the globe has accelerated the pace of development and even the rate of change itself. (A nod to the literature spanning Thomas Kuhn to Ray Kurzweil and the thousands of other authors who have charted these transformations.) My experience in this space is more applied and personal.
I began with IBM punch cards, usually waiting hours for my job to run--often at 3 AM because that's when the computer science students finally went home and mere students in the humanities could get time on the mainframe. My frustration level bounced from irritated to furious during these sessions. Irritated because it was 3 AM and furious with myself when a misplaced comma in line 13 caused a syntax error, stopping the job. The fury was because it usually took 45 minutes to get back the cards and a green-bar printout showing me the error of my ways. Then another 45 minutes to submit the cards and see the result.
I took several such courses across my undergraduate and graduate studies, never really mastering any of the arcane science. I vowed not to touch a computer again for the rest of my working life as a historian. I was confident that my oath would stand since I used pen and paper, or at most a typewriter. Life then took a turn and things did not follow my plan.
Within three years of finishing my dissertation, I owned a microcomputer and was using it to develop a fund accounting system for the historical society where I was working. I backed into IT by working at a non-profit that had only six employees, so everyone had to wear multiple hats. Because I had some coursework in programming languages (and was the only one in the office who had any), I began helping out with the membership database. That turned into creating the accounting system. That turned into developing an archival materials database. That turned into creating a database index to photographic archives. I found that I had become conversant in two languages not spoken by human beings but very useful for computers. In fact, I had come to see the fun and the potential of these technologies.
How does a Luddite find fun in technology? By seeing it work. The microcomputers and then the PCs that I worked on let me find the issues in my programs within seconds (I've never been a great coder) and correct them. Each cycle of coding, compiling, and executing fed my learning because I got better each time through. The drop from 45 minutes to 45 seconds to 4.5 seconds between conception and execution helped me build my expertise.
Another phenomenon was taking place as I moved from the life of a scholar to that of a computer programmer. I shifted from machines that recorded data on floppy disks to hard disks to optical discs, all within a few years. The growth in storage size and speed of response was exponential. Over the same period, I moved from paper punch cards to green monitors, to amber monitors, to color monitors. At one point, I wondered why anyone would want a color monitor. I saw the reason in a magazine advertisement that filled the page with random text but showed the message in red letters. Color de-cluttered the data and let the message stand out.
It seemed that I needed to replace my perfectly functional set of equipment with a new generation every three years or so. I was learning the paradox that information technology is a strategic asset that one throws away on a regular basis. That seems wasteful, but is not when properly considered. The changes in processing power (famously expressed by Moore's Law), size of storage, and resolution of display--just to name three--have kept growing at rates from fast to very fast. The first magnetic hard drive I saw was the size of a modern washing machine, cost $250,000, and held 10 MB. Today, I can buy a 64 GB SD card for $120. That's a 6,400-fold increase in storage that can fit under a postage stamp. (Remember letters?) On another plane, my smartphone has more than 300 pixels per inch--twice that of previous versions. Without a microscope, I cannot see the pixels. The increase in power, capacity, and resolution has created a new set of technologies every 2 to 3 years. I had to upgrade just to stay in place.
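The storage comparison is easy to check with a few lines of arithmetic. A minimal sketch, assuming decimal units (64 GB = 64,000 MB) and taking the two price points in the text as illustrative figures: the raw capacity ratio is in the thousands, while the improvement in cost per megabyte runs into the millions.

```python
# Back-of-the-envelope check of the storage comparison above.
# Figures from the text: a washing-machine-sized drive holding 10 MB
# for $250,000, versus a 64 GB SD card for $120.
# Assumes decimal units (1 GB = 1,000 MB).

early_capacity_mb = 10          # the first hard drive the author saw
early_cost_usd = 250_000

modern_capacity_mb = 64 * 1000  # 64 GB SD card
modern_cost_usd = 120

# Raw capacity: how many times more data fits under a postage stamp.
capacity_ratio = modern_capacity_mb / early_capacity_mb
print(f"Capacity increase: {capacity_ratio:,.0f}x")        # 6,400x

# Cost per megabyte: the improvement is far larger still.
cost_per_mb_early = early_cost_usd / early_capacity_mb     # $25,000 per MB
cost_per_mb_modern = modern_cost_usd / modern_capacity_mb  # fractions of a cent
price_ratio = cost_per_mb_early / cost_per_mb_modern
print(f"Cost-per-MB improvement: {price_ratio:,.0f}x")     # ~13,333,333x
```

The two ratios differ by three orders of magnitude, which is why it matters whether one compares raw capacity or price per unit of storage.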
The issue in play was not a technological arms race; it was remaining conversant with the rest of the field. If my code written for version X was to run on version Y, I had to move to the latest platform. Hence I became adept at relearning a language that had changed, or learning a new-to-me language. Everything I knew about IT changed every five years, requiring me to constantly study, adapt, and transform my approach. Finally, the capabilities of each new version outstripped the earlier ones by at least an order of magnitude. But there is more to say.
The changes transformed our approaches again when we networked computers into LANs and WANs, and again when we connected to the Internet. The capabilities of the Internet are still rapidly developing as we come to understand crowd computing (for good and ill), which brings millions of people together to contribute to a shared experience. The rise of these social spaces challenges the old notions of relationships and the definition of society. They seem to spring at us from nothing and become household terms faster than a baby learns to walk.
What does this mean to those who have invested in previous efforts in our field of biosurveillance? It means that old systems must be replaced by new ones. Sunk cost is an anchor dooming unchanging approaches to irrelevance. History is speeding up. One must change faster just to keep pace. Lewis Carroll's Red Queen was a prophet.