Every few weeks, or months, we hear about how some new super-computer has claimed the crown of the ‘fastest’ computer in the world. We are then bombarded with numbers documenting its impressive CPU count, RAM size, x gazillion operations per second, bus speed, etc. But have you ever wondered-
What are these machines being used for anyway?
While running benchmarking programs to showcase the capabilities of your new toy is certainly satisfying, I cannot help but wonder what happens next. The conventional line fed to journalists who cover such events goes something like this-
“These new machines will help us model the universe/weather systems/climate change/protein folding or any fashionable cause that attracts more funding.”
But haven’t we been doing those things since the beginning of the computer age? What have we achieved so far? How helpful has the simulation of complex natural systems been in understanding them? Can we make better predictions using faster computers or more refined algorithms? So far, computer simulations have not helped us understand or find dark matter- if something like that even exists. Our ability to predict the weather is still shit, and our climate models require “correction” factors to even approach observed values. Our ability to model protein folding and bio-molecular interactions is still pretty pathetic. This state of affairs has persisted in the face of colossal increases in available computational power. So what is going on? Why haven’t the computer gods delivered? Why would throwing more computational power at a problem solve it when previous attempts to do so have proved futile?
I believe that the problem lies elsewhere. Maybe our paradigms, assumptions, theories and algorithms are defective. While accepting this premise might be hard, it explains why the almost exponential increase in available computational power has failed to produce even a linear increase in our ability to accurately model complex systems.
I have long believed that science today is closer to a mystery-based religion than an objective methodology for understanding the surrounding world. How many scientists and “experts” really understand what they are talking about in their jargon-laden dialects? How much nonsense and bullshit is accepted and canonized because it sounds smart or knowledgeable? Aren’t their jargon-laden explanations rather similar to priests and witches communicating with deities, demons and spirits? How many scientists have even thought through the multitude of theories and paradigms they so enthusiastically profess and promote?
Isn’t the search for dark matter a lot like the search for the ‘luminiferous ether’ in the late 1800s, or the holy grail in a previous era? Aren’t most theories of the universe which can be proved only by self-referential mathematical manipulations a bit too much like religious beliefs based on misleadingly complex and sophistic arguments? Isn’t modeling climate with algorithms that require significant correction factors reminiscent of astrology, palm reading or reading animal entrails? While there are many reasons for this sad state of the so-called “cutting edge” of science- one factor stands out by its sheer obviousness and impact.
The need to show evidence of high, metrics-driven productivity in a system riddled with bureaucracy, bloated egos, stupidity and callousness.
Much of scientific research has long ceased to be an endeavor to expand the frontiers of human knowledge and benefit mankind- if that was ever the case. Today, it is mainly an exercise in hand-waving driven by the need to give the appearance of hard work to ensure continued funding. It is now a giant con game that preys on the hopes and fears of other people and repeatedly promises them things that it simply cannot deliver- kinda like religion promising salvation, heaven or an eternal afterlife.
I believe that using super-computers to create, store and distribute porn is a far better and more appropriate use of such machines.
What do you think? Comments?