
Computing “Revolution” of Past Two Decades as a Showy Failure: 2

October 30, 2020

In the previous part of this series, I wrote about how almost every technological and scientific achievement we associate with the current era was developed before the personal computing “revolution” of the past 20-25 years. We successfully designed and built everything from nuclear submarines, ICBMs, nuclear weapons, modern airliners, modern drugs and interplanetary space probes before this so-called “revolution”. Even more interestingly, the past 20-25 years have been the most stagnant period for useful technological advancement in over 200 years. It is as if these two decades have not produced anything which has actually improved our lives or allowed us to do real stuff that was previously considered out of reach.

In this post, I will go into some of the stuff I promised in that post- starting with automobiles. As Scotty Kilmer always likes to remind his audience, Japanese cars from the mid- to late-1990s consistently last for over 400k miles as long as you don’t go out of your way to abuse them. So let me ask you the next logical question- has any of the “computerization” of cars introduced since then made them last longer, made them significantly safer or made them somehow “better” for the consumer? I think we all know the answer to that question. Which brings us to the next inevitable question- why do corporations keep doing something that does not result in a better product.. and why does this trend keep getting worse? What is going on?

Why are car companies incorporating circuits into their engines which make them easier to hack, far more sensitive to damage and often result in a lower-quality product that does not last as long? Why do so many of them want to replace very ergonomic physical controls with virtual controls that make using them a far bigger chore than necessary? Why are so many car companies pushing hybrids with excessively complex, hard-to-repair and often finicky hardware when they seldom deliver even 5% better real-life mileage than their conventional counterparts? Also, curiously, why are some Japanese and Korean corporations far less likely to implement the worst of these costly and dangerous trends than their North American or European counterparts? What explains this difference?

Moving on to housing.. has the quality of housing or the experience of living in one improved in the past 20-25 years? Have “smart” thermostats or “smart” security systems improved the quality of your indoor environment or security? Has having “Alexa” or its Google equivalent in your home improved the quality of your life, apart from showing others that you are “hip” and “with it”? Also, what sort of idiot wants to pay corporations and the government to constantly spy on them in their own home? Have “smart” bulbs or LEDs really improved the quality of lighting in your house or substantially reduced your electricity bill? And why do all the “smart” refrigerators, washing machines, coffee makers and other appliances fail much sooner, in addition to being unrepairable and more expensive, than their “dumb” analog counterparts?

Let us talk about education- both K-12 and university. Has the extensive use of computers in education improved the quality of learning or made it less expensive? Are 2020 graduates somehow better than their counterparts from two decades ago? A large increase in the use of computers for education has not improved its quality or made it less expensive. But if it hasn’t made education better, why is there still a continued push to increase the level of computer use in education? If something does not make the situation better, why keep pushing for more of it? And this phenomenon goes far beyond automobiles, household appliances and education.

Consider the supposedly indispensable role of modern computing in running corporations. Did you know that large and multi-national corporations existed for decades before electronic computers of any sort existed? Did you also know that corporations of all sizes were able to run their supply chains, manage production, develop innovative products and pay employees and creditors on time before the first electronic computer of any sort was assembled? How did they do that? How did the USA, the USSR and Nazi Germany produce all the weapons and vehicles necessary for WW2 without modern computers for running logistics or access to Excel tables and PowerPoint presentations? How did Ford, General Motors, Chrysler, GE, Motorola, IBM and many other corporations become big without access to CRM software?

How did large oil refineries run in the era before electronic computers? What about machine tooling? How did they build big stuff such as nuclear submarines, supersonic fighters and bombers, and aircraft carriers, or make millions of rifles, submachine guns, semi-auto handguns, assault rifles and artillery pieces in the pre-computer era? What about nationwide electrical grids, highway systems, railway networks etc.? How come they ran just fine before the era of electronic computers, let alone the computing “revolution”? Why didn’t the lack of electronic computers stop people from designing or building large dams, hydroelectric projects, irrigation projects, coal-powered stations or electric grids? It is as if the lack of even older electronic computers has little to no effect on the ability of human beings to get things done in a way compatible with maintaining a modern lifestyle.

Since we are, once again, close to a thousand words, I will now wrap up this post. In the next part, I will write more about how the so-called computational “revolution” has not improved drug development, everyday financial transactions or popular entertainment.

What do you think? Comments?

Computing “Revolution” of Past Two Decades as a Showy Failure: 1

October 23, 2020

One of the defining features of the past two decades in the West has been the dominant position in public consciousness of corporations involved in manufacturing personal computing hardware (desktops, laptops, smartphones, tablets, IoT crap, embedded electronics etc.) or making it function and do stuff (‘IT’ corporations such as Google, FakeBook, MicroShaft etc.). One could say that Amazon is an ‘IT’ company which sells stuff that people used to buy in department stores. A large part of the current market value of many stock indices in the West now comes from corporations which either make personal computing hardware or the software it runs.

But have you ever asked yourself- has the rise of these corporations, or the widespread usage of the products and services they sell, actually improved the quality of life for the vast majority of people? To understand what I am talking about, let us ask two more basic questions. Question #1: Would the absence of the personal computing “revolution” during the past twenty years have had any negative effect on the quality of life, or somehow constrained the development of other technologies? Question #2: Has the computing “revolution” improved the quality or reliability of other products and services, let alone increased the general quality of life for the vast majority? As you will soon see, the answers to both questions are obvious as well as surprising.

The unpleasant fact for many geeks is that the computational ‘revolution’ of the past two decades has been the most sterile and unproductive period of general technological advancement in the past two hundred years- and I do not make that claim lightly. To better understand what I am getting at, ask yourself if you can name a single non-computer product that has improved your life, or is somehow associated with the modern world, which would not have existed without this pseudo “revolution”. Give it a try.. can you think of even one?

Since we have to start from somewhere- let us start with modern jet airliners. Well.. every airliner designed until the late 1990s was largely designed by competent engineers using their experience and some combination of slide rules, desktop calculators and 8- or 16-bit desktops connected to a few clunky mainframes. The DC-9, DC-10, 737, 747 etc. were designed in what was essentially the pre-computer era. The A-320 was designed at the very start of the era when electronic computers (mostly mainframes) of any type were widely used for aircraft design. The 777 was the last aircraft designed with a combination of good engineering and primitive CAD technology. Only the 787 was designed in the era of modern “computing”- and it has been the most over-budget and troubled design of them all.

And this is not just restricted to airliners. Consider space exploration and missiles. The space race between the erstwhile USSR and the USA occurred before the modern computing “revolution”. People went into space before their rockets and vehicles contained a single solid-state transistor, let alone an IC or CPU. The flight control computer used in the Apollo missions was a hand-made 16-bit machine with about the same computational capability as an early Apple II, TRS-80 or Commodore PET. The Pioneer and Voyager probes, of which Voyager 2 remains the only man-made object to have visited Uranus and Neptune (albeit in a fly-by), did not have CCD cameras or microprocessor CPUs. The same is true for both Viking probes which landed on Mars in the 1970s, as well as the Venera family of space probes that the USSR successfully landed on Venus in that era. Oh.. and all those lunar probes and soviet lunar rovers too.

The vast majority of space probes launched prior to the late 1990s used tube technology (or very primitive CCDs) for imaging and very basic IC circuits joined to make ersatz CPUs. And guess what.. they performed their jobs magnificently. But it gets even more interesting when you look at aircraft and missiles used by the military. Did you know that the first ICBMs did not use solid-state electronics, and that it was not until the 1980s that ICBMs using integrated circuits for guidance became commonplace? The funny thing is that the accuracy of ICBMs has not increased by a worthwhile margin since the 1980s. Even ALCMs (Air-Launched Cruise Missiles) achieved almost the same accuracy and guidance capabilities as those used today with what was essentially a mixture of custom ASICs along with 8- and 16-bit CPUs. The GPS system worked just fine with receivers that contained what were essentially 8- and 16-bit CPUs.

Even the state of design for nuclear weapons, which were often designed using a combination of previous experience and calculations on some of the first real “supercomputers”, has not progressed much beyond where it was in the mid-1980s. Remember that every single warhead in the American and Russian inventories was (at best) designed on a “supercomputer” with less computational power than the original Xbox game console. The same holds for the design of everything from nuclear submarines to tanks, guns and missiles. To put it bluntly, even in the areas where the computational “revolution” should have helped the most, things have been pretty stagnant since the 1980s- and not for lack of money and resources thrown at the Military-Industrial complex. It is as if big and substantial technological advances haven’t occurred in these and many other fields since the late 1980s to mid-1990s.

Since we are at almost a thousand words, I will wrap up this post. In the next parts, I will write about how the so-called computational “revolution” has not improved the quality of housing and automobiles, school and university education, transport and corporate logistics, drug development, everyday financial transactions and.. yes.. even popular entertainment.

What do you think? Comments?