
The Connection Between Fake ‘Innovation’ and Late Capitalism

December 10, 2020 16 comments

My previous post about the ongoing crapification of personal computing and my series about how the computing “revolution” of the past two decades has been a showy failure lead us to a seldom asked question. While it can be phrased in many ways, here is the simplest version- Why do so many who are rich, in power or aspire to either keep incessantly talking about “innovation”, “paradigm shifts” and “disruption” when we clearly live in an age of profound technological stagnation? Why do people pretend that Apple is an innovative company when the last time they did anything remotely innovative was 2007? Why do so many fanboys celebrate every fart emanating from Elon Musk? What is innovative about Uber, DoorDash or any other service which replicates the services already available in third-world countries full of poor and desperate people? What is innovative about the plantation-lite work environment of Amazon?

Now let us get back to the central question- why do so many people want to believe in fake “innovation”? What is the upside of celebrating fake “innovation” even when it results in regression rather than progress? Consider user interface design for personal computers or software. What is the gain from producing and pimping increasingly shittier “redesigns” and “upgrades” which make the interface or program less useful, slower, buggier and more resource intensive? Also, why did user interfaces remain fairly constant for over two decades (mid-1980s to mid-2000s) before starting to become progressively shittier? Or take automobiles.. why haven’t their reliability, safety and longevity improved since the late 1990s- even though their “complexity” has? Why are automobiles from non-Japanese manufacturers (and Nissan) full of progressively bad design choices in everything from the layout of engine and powertrain components to increasingly gimmicky but dangerous driver control panels?

The same can be said for the increasingly shitty style of management of retail store chains, which has caused many to go out of business in the past decade, or Boeing building progressively worse versions of their older airliners. Why do multi-million dollar homes in western countries look bland, formulaic and ugly? Are you seeing a common thread running through all of them, and what does any of this have to do with the strong connection between fake “innovation” and late capitalism aka neoliberalism aka financialism? To understand what I am going to talk about next, we have to first go into the pillars holding up the unstable edifice of late capitalism. As I have mentioned in some previous posts, one of these pillars is credentialism. But how does it work in practice? Well.. the real function of credentialism is to cultivate incestuous insider networks with other “elites” by going to the same “elite” educational institutions. But how does this lead to fake “innovation” and actual regression of technological progress?

It comes down to its interaction with another pillar of late capitalism. Have you noticed that every corporation and rich person seems to have an unusually high level of investment in how they are allegedly perceived by the public? But who are they trying to impress? Do average people buy into the bullshit about “caring” and “socially responsible” corporations any more than they believe that the HR person at work is on their side? If average people don’t give a fuck about the “social liberal” causes which are heavily supported by corporations, who are they trying to impress anyway? The simple answer is that all of these virtue displays, fake philanthropy and shows of social liberalism by “elites” are about one-upping each other. But hasn’t this always been the case? Haven’t the “elites” of all societies throughout history spent too much time trying to one-up each other? So what is different now?

Well.. in previous eras, the “elites” of those societies did not pretend to have reached their positions because they were competent or actually good at whatever they were supposed to be doing. They were quite honest that being an “elite” was about being born to the right parents, being married to the right person or being good at violence. Consequently, they left the actual work of getting things done to competent people employed by them. That is why, for example, the Medici family of Renaissance Italy stuck to the business of merchant banking and political influence while being great patrons of art, rather than pretending to be great artists themselves. That is also why a lot of the industrial and banking dynasties of late 19th- and 20th-century Europe and the USA stuck to their original vocations rather than seriously dabbling in stuff which would make them look liberal, hip or “progressive”.

So what happens when incompetent but rich people try to do stuff at which they suck? Ask Nicholas II of the erstwhile Russian Empire, who decided to personally take charge of military operations during the later stages of WW1. Or what about Enver Pasha of the erstwhile Ottoman Empire, who decided to cosplay as a military leader during WW1? The same holds true for the thousands of generals and officers who gained their pre-WW1 positions in the French Army through connections, bribery and kissing the right behinds. Long story short, when incompetent but powerful people enter roles they are not capable of fulfilling.. things go to shit. While this is especially obvious during acute crises such as large wars or economic meltdowns, it still occurs in times without obvious crises- albeit at a slower pace. The point I am trying to make is that “elites” who seriously dabble in real work almost always end up as massive and spectacular failures.

The reason this is a far bigger problem today than in the past is that, back then, nobody expected “elites” or aspirational elites to do anything beyond being idle or playing insider games of the type seen in royal courts of yore. However, the ideology of late capitalism aka neoliberalism is built around the concept of “meritocracy”, which requires participants to act as if they are involved in doing important work. This results in people with power and money but no intellectual ability or competence pretending they are geniuses- with predictable results. It is even worse for aspirational elites, who in previous eras just had to play along with these insider court games or marry the right person, but who now have to pretend even harder to be the competent and brilliant people they are not, rather than the greedy power grabbers they really are.

The only way to succeed in institutions (including corporations) dominated by “elites” who subscribe to the ideology of neoliberalism is to fake the appearance of progress, even if it destroys all of the real progress made in the past. That is why people who push bad ideas like Windoze 8 and 10 will be promoted over those who wanted to improve Windows 7. It is also why developing more fragile phones and computers protected by increasing amounts of hardware DRM is now the business model of Apple. Now you know why the UIs of Gmail, Google Maps and MS Office, to name a few, have become worse with each iteration. This is also why people celebrate frauds such as Elon Musk, who pretend that technological capabilities which were refined 50 years ago are the result of recent “innovation”. And guess what.. you don’t even have to possess any actual technology to fake “disruption”- just ask that Theranos woman. It is all about appearance and style, not substance.

To summarize, the rise of fake “innovation” under late capitalism has much to do with a toxic combination of financial incentives and “elites” and aspirational “elites” being out of their depth in a culture which very strongly favors the appearance of work, as measured by “metrics”, rather than anything close to the real thing. Might write more about this topic in the future, based on comments and responses.

What do you think? Comments?

Recent Articles about Ongoing Crapification of Personal Computing

December 3, 2020 20 comments

While browsing the intertubes in the past few weeks, I came across a few articles about the ongoing crapification of personal computing. As you know, this is interesting to me since I am also writing a short series about how the computing “revolution” of the past two decades has been a showy failure. Hope to finish the next part in that series sometime soon. But till then, have a look at these posts by other people making similar observations.

Bring back the ease of 80s and 90s personal computing

Back in time when things were easy: You could opt into purchasing major (feature) upgrades every 2–3 years, and got minor (quality) updates for free very infrequently (say, 1–2 times a year). You made a conscious decision whether and when to apply upgrades or updates, and to which applications. You usually applied updates only if there was a specific reason (e.g., a feature you wanted or a bug you were running into and needed to be fixed). Systems typically ran on the exact same software configuration for months if not years.

Contrast this with today: Systems increasingly become “moving targets” because both the operating system and the applications change by updating themselves at will, without conscious decisions by the user. The absolute perversion of this is “forced automatic updates” as are common in some organizations, where users have no choice but to accept that updates are installed on the machine (even requiring reboots of the machine) whenever some central system administrator decides that it is time to do so.

Computer latency: 1977-2017

It’s a bit absurd that a modern gaming machine running at 4,000x the speed of an apple 2, with a CPU that has 500,000x as many transistors (with a GPU that has 2,000,000x as many transistors) can maybe manage the same latency as an apple 2 in very carefully coded applications if we have a monitor with nearly 3x the refresh rate. It’s perhaps even more absurd that the default configuration of the powerspec g405, which had the fastest single-threaded performance you could get until October 2017, had more latency from keyboard-to-screen (approximately 3 feet, maybe 10 feet of actual cabling) than sending a packet around the world (16187 mi from NYC to Tokyo to London back to NYC, more due to the cost of running the shortest possible length of fiber).

On the bright side, we’re arguably emerging from the latency dark ages and it’s now possible to assemble a computer or buy a tablet with latency that’s in the same range as you could get off-the-shelf in the 70s and 80s. This reminds me a bit of the screen resolution & density dark ages, where CRTs from the 90s offered better resolution and higher pixel density than affordable non-laptop LCDs until relatively recently. 4k displays have now become normal and affordable 8k displays are on the horizon, blowing past anything we saw on consumer CRTs. I don’t know that we’ll see the same kind of improvement with respect to latency, but one can hope.
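As a rough sanity check on the quoted keyboard-to-screen versus round-the-world comparison, here is a small back-of-the-envelope sketch in Python. The ~16,187 mi route length comes from the quoted article; the refractive index of ~1.47 for optical fiber and the assumption of a perfectly straight fiber run are mine, so the result should be read as a lower bound on real network latency rather than a measurement.

```python
# Back-of-the-envelope check of the quoted claim: how long does light in fiber
# take to cover the ~16,187 mi NYC -> Tokyo -> London -> NYC route?
# Assumptions (mine, not the article's): refractive index of silica fiber ~1.47,
# a perfectly straight fiber run, and no routing, queuing or switching delays.

MILES_TO_KM = 1.609344
C_VACUUM_KM_S = 299_792.458          # speed of light in vacuum, km/s
FIBER_REFRACTIVE_INDEX = 1.47        # typical value for optical fiber

route_km = 16_187 * MILES_TO_KM                       # ~26,000 km
fiber_speed_km_s = C_VACUUM_KM_S / FIBER_REFRACTIVE_INDEX
travel_time_ms = route_km / fiber_speed_km_s * 1000

print(f"Route length: {route_km:,.0f} km")
print(f"Light-in-fiber travel time: {travel_time_ms:.0f} ms")
# Prints roughly 128 ms; real packets take longer because fiber never runs in a
# straight line, which is the article's point: a default desktop setup can add
# more keyboard-to-screen latency than even this round-the-world figure.
```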

Things are so bad that a google search for ‘why is windows 10 so bad‘ yields hundreds of results, including long discussion threads on multiple subreddits and official Microsoft support newsgroups. You can get almost the same number of hits for asking ‘why is office 365 so bad‘. And it is not just Microsoft, as you can find similar opinions about the past few iterations of Mac OS X and iOS. In case you are wondering, Android has always been a shitshow, though recent versions are a little better than the older ones. Did I mention that even widely used google services such as Google Maps and the web version of Gmail have become significantly worse and more inconsistent over the past few years? And then there are the numerous poorly executed design updates by Amazon, FakeBook, Twatter, InstaCrack etc. My point is that this is an industry-wide phenomenon.

What do you think? Comments?

Computing “Revolution” of Past Two Decades as a Showy Failure: 2

October 30, 2020 5 comments

In the previous part of this series, I wrote about how almost every technological and scientific achievement we associate with the current era was developed before the personal computing “revolution” of the past 20-25 years. We successfully designed and made everything from nuclear submarines, ICBMs and nuclear weapons to modern airliners, modern drugs and interplanetary space probes before this so-called “revolution”. Even more interestingly, the past 20-25 years have been the most stagnant period, in terms of useful technological advancement, in over 200 years. It is as if these two decades have not produced anything which has actually improved our lives or allowed us to do stuff that was previously considered out of reach.

In this post, I will go into some of the stuff I promised in that post- starting with automobiles. As Scotty Kilmer always likes to remind his audience, Japanese cars from the mid- to late-1990s consistently last for over 400k miles as long as you don’t go out of your way to abuse them. So let me ask you the next logical question- has any of the “computerization” of cars introduced since then made them last longer, become significantly safer or somehow “better” for the consumer? I think we all know the answer to that question. Which brings us to the next inevitable question- why do corporations keep doing something that does not result in a better product.. and why does this trend keep getting worse? What is going on?

Why are car companies incorporating circuits into their engines which make them easier to hack, far more sensitive to damage and often result in a lower quality product that does not last as long? Why do so many of them want to replace very ergonomic physical controls with virtual controls that make using them a far bigger chore than necessary? Why are so many car companies pushing hybrids that have excessively complex, hard to repair and often finicky hardware when they seldom deliver even 5% better real-life mileage than their conventional counterparts? Also, curiously, why are some Japanese and Korean corporations far less likely to implement the worst of these costly and dangerous trends than their North American or European counterparts? What explains this difference?

Moving on to housing.. Has the quality of housing or the experience of living in one improved in the past 20-25 years? Have “smart” thermostats or “smart” security systems improved the quality of your indoor environment or security? Has having “Alexa” or its Google equivalent in your home improved the quality of your life, apart from showing others that you are “hip” and “with it”? Also, what sort of idiot wants to pay corporations and the government to constantly spy on them in their own home? Have “smart” bulbs or LEDs really improved the quality of lighting in your house or substantially affected your electricity bill? Why do all the “smart” refrigerators, washing machines, coffee makers and other appliances fail much sooner, in addition to being unrepairable and more expensive, than their “dumb” analog counterparts?

Let us talk about education- both K-12 and university. Has the extensive use of computers in education improved the quality of learning or made it less expensive? Are 2020 graduates somehow better than their counterparts from two decades ago? The large increase in the use of computers for education has not improved its quality or made it less expensive. But if it hasn’t made education better, why is there still a continued push to increase the level of computer use in education? If something does not make the situation better, why keep pushing for more of it? And this phenomenon goes far beyond automobiles, household appliances and education.

Consider the supposedly indispensable role of modern computing in running corporations. Did you know that large and multi-national corporations existed for decades before electronic computers of any sort? Did you also know that corporations of all sizes were able to run their supply chains, manage production, develop innovative products and pay employees and creditors on time before the first electronic computer was assembled? How did they do that? How did the USA, USSR and Nazi Germany produce all the weapons and vehicles necessary for WW2 without possessing modern computers for running logistics or access to Excel tables and PowerPoint presentations? How did Ford, General Motors, Chrysler, GE, Motorola, IBM and many other corporations become big without access to CRM software?

How did large oil refineries run in the era before electronic computers? What about machine tooling? How did they build big stuff such as nuclear submarines, supersonic fighters and bombers, and aircraft carriers, or make millions of rifles, submachine guns, semi-auto handguns, assault rifles and artillery pieces in the pre-computer era? What about nationwide electrical grids, highway systems, railway networks etc? How come they ran just fine before the era of electronic computers, let alone the computing “revolution”? Why didn’t the lack of electronic computers stop people from designing or building large dams, hydroelectric projects, irrigation projects, coal-fired power stations or electric grids? It is as if the lack of even older electronic computers had little to no effect on the ability of human beings to get things done in a way compatible with maintaining a modern lifestyle.

Since we are, once again, close to a thousand words, I will now wrap up this post. In the next part, I will write more about how the so-called computational “revolution” has not improved the process of drug development, everyday financial transactions and popular entertainment.

What do you think? Comments?

Computing “Revolution” of Past Two Decades as a Showy Failure: 1

October 23, 2020 40 comments

One of the defining features of the past two decades in the west has been the dominant position in public consciousness of corporations involved in manufacturing personal computing hardware (desktops, laptops, smartphones, tablets, IoT crap, embedded electronics etc) or making it function and do stuff (‘IT’ corporations such as Google, FakeBook, MicroShaft etc). One could say that Amazon is an ‘IT’ company which sells stuff that people used to buy in department stores. A large part of the current market value of many stock indices in the west now comes from corporations which either make personal computing hardware or the software it runs.

But have you ever asked yourself- has the rise of these corporations, or the widespread usage of the products and services sold by them, actually improved the quality of life for the vast majority of people? To understand what I am talking about, let us ask two more basic questions. Question #1: Would the absence of the personal computing “revolution” during the past twenty years have had any negative effect on the quality of life or somehow constrained the development of other technologies? Question #2: Has the computing “revolution” improved the quality or reliability of other products and services, let alone increased the general quality of life for the vast majority? As you will soon see, the answers to both questions are obvious as well as surprising.

The unpleasant fact for many geeks is that the computational ‘revolution’ of the past two decades has been the most sterile and unproductive period of general technological advancement in the past two hundred years- and I do not make that claim lightly. To better understand what I am getting at, ask yourself if you can name a single non-computer product that has improved your life, or is somehow associated with the modern world, which would not have existed without this pseudo “revolution”. Give it a try.. can you think of any non-computer product which would not have existed without this so-called “revolution”?

Since we have to start from somewhere- let us start with modern jet airliners. Well.. every airliner designed until the late 1990s was largely designed by competent engineers using their experience and some combination of slide rules, desktop calculators and 8- or 16-bit desktops connected to a few clunky mainframes. The DC-9, DC-10, 737, 747 etc were designed in what was essentially the pre-computer era. The A320 was designed at the very start of the era when electronic computers (mostly mainframes) of any type were widely used for aircraft design. The 777 was the last aircraft designed with a combination of good engineering and primitive CAD technology. Only the 787 was designed in the era of modern “computing”- and it has been the most over-budget and troubled design of them all.

And this is not just restricted to airliners. Consider space exploration and missiles. The space race between the erstwhile USSR and the USA occurred before the modern computing “revolution”. People went into space before their vehicles had a single solid-state transistor, let alone an IC or CPU, on board. The guidance computer used in the Apollo missions was a hand-built machine with about the same computational capability as an early Apple II, TRS-80 or Commodore PET- though it was a 16-bit machine. The Pioneer and Voyager probes- of which Voyager 2 remains the only man-made object to have visited Uranus and Neptune (albeit in fly-bys)- did not have CCD cameras or CPUs. The same is true for both Viking probes which landed on Mars in the 1970s, as well as the Venera family of space probes that the USSR successfully landed on Venus in that era. Oh.. and all those lunar probes and Soviet lunar rovers too.

The vast majority of space probes launched prior to the late 1990s used tube technology (or very primitive CCDs) for imaging, and very basic IC circuits joined together to make ersatz CPUs. And guess what.. they performed their jobs magnificently. But it gets even more interesting when you look at aircraft and missiles used by the military. Did you know that the first ICBMs did not use solid-state electronics, and that it was not until the 1980s that ICBMs using integrated circuits for guidance became commonplace? The funny thing is that the accuracy of ICBMs has not increased by a worthwhile margin since the 1980s. Even ALCMs (Air Launched Cruise Missiles) achieved almost the same accuracy and guidance capabilities as those used today with what was essentially a mixture of custom ASICs along with 8- and 16-bit CPUs. The GPS system worked just fine with receivers that contained what were essentially 8- and 16-bit CPUs.

Even the state of design for nuclear weapons, which were often designed using a combination of previous experience and calculations on some of the first real “supercomputers”, has not progressed much further than it was in the mid-1980s. Remember that every single warhead in the American and Russian inventories was (at best) designed on a “supercomputer” with less computational power than the original Xbox game console. The same holds for the design of everything from nuclear submarines and tanks to guns and missiles. To put it bluntly, even in areas where the computational “revolution” should have helped the most, things have been pretty stagnant since the 1980s- and not for the lack of money and resources thrown at the Military-Industrial complex. It is as if big and substantial technological advances haven’t occurred in these and many other fields since the late 1980s to mid-1990s.

Since we are at almost a thousand words, I will wrap up this post. In the next ones, I will write about how the so-called computational “revolution” has not improved the quality of housing and automobiles, school and university education, transport and corporate logistics, the process of drug development, everyday financial transactions and.. yes.. even popular entertainment. Even popular entertainment..

What do you think? Comments?