HARMONIC ONE – COMMENTARY | December 2022 Edition
Article Dedications (Quotes)
“Remember, there are more people in the world than yourself. Be modest! You have not yet invented nor thought anything which others have not thought or invented before.”
-Unknown
Neo: Why do my eyes hurt?
Morpheus: You've never used them before.
- “The Matrix”, 1999 Movie
Introduction
The ability to use tools and build machines dates back millions of years among humans and other animals. While animals may develop basic tools and methods to facilitate their day-to-day existence, humans take it a step further; in addition to training animals, humans invent tools and machines to solve a problem, make a laborious task easier, or create an entirely new device for some critical aspect of day-to-day life in society.
But the picture becomes complicated when the “accepted” history of technological advancement contains instances of sudden growth and radical innovation that then end abruptly in an “innovation plateau”. This “plateau” is often characterized by continually repackaging an unchanged product with a “new shiny skin” or other fresh appearance and presenting it to the public as “radically new” or “high tech”.
When examined closely, these changes neither “shake up” the industry nor qualify as “ground-breaking” products that create a “completely new industry”. There are examples of this kind of stagnation across many industries, but one that has affected almost every device we use is the semiconductor electronic circuit. These circuits underpin every digital application, such as computing, phones, the internet, transportation systems, and mass communications, and will be examined in the sections below.
Examining Innovation in Computer Technology
When examining whether there has been any real progress in the technology people use on a daily basis, it is important to review the definition of a “computer” and survey the “Westernized” history associated with digital electronic semiconductor circuitry.
There are many definitions of a computer, but it is common to start with the word itself to organize a meaning. Using logical reasoning and observation of how it conceptually works, one view is that it is a tool to assist the user with computation, hence the name “computer”.
Within this context, “computing” is characterized by the machine or tool “accepting” an input and then conducting a mathematical operation (“algorithmic” instructions) to produce an output. This output can then be examined by the user for accuracy, and adjustments are made to minimize errors. Additionally, the user configures the machine to “automatically” store and retrieve this output on demand.
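This input → algorithm → output → store/retrieve cycle can be sketched in a few lines of code. The function name, the running-sum “algorithm” and the dictionary-based “storage” below are illustrative assumptions, not the design of any particular machine:

```python
# A minimal sketch of the input -> algorithm -> output model described above.

def compute(inputs):
    """Apply an 'algorithmic' instruction (here, a running sum) to the inputs."""
    total = 0
    for value in inputs:
        total += value
    return total

storage = {}  # stands in for on-demand store/retrieve

result = compute([2, 3, 5])   # the user supplies an input
storage["last_run"] = result  # the machine stores the output
print(storage["last_run"])    # the user retrieves it on demand -> 10
```

Whether the machine is a slide rule, ENIAC or a modern CPU, the conceptual loop remains this simple.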
With this definition in mind, the last two centuries (and, by some accounts, much earlier) saw several different types of computing devices that supposedly created the base for the wide range of applications used today. According to Western historians, this included the research, development and use of simple handheld computing devices (the Antikythera mechanism, slide rules, the abacus, etc.), gear-driven mechanical analog computers, electro-mechanical computers, digital computers and finally micro-digital computers. Each machine type, in the order stated, supposedly served as a base or inspiration for the next iteration of what we call a computer.
The last type, micro-digital circuitry, is the hardware upon which many technological applications are developed. Its construction is characterized by the exponential miniaturization of components that were once visible to the naked eye in mechanical and electro-mechanical computers (e.g., the Babbage machine and the ENIAC computer).
Thus, in micro-digital electronics, semiconducting components such as microscopic resistors, capacitors, transistors and diodes are fabricated in layers onto a thin wafer substrate, forming an integrated circuit, the major component of a “central processing unit” (CPU), also known as the “brain” of the computer. Claimed benefits include faster performance, lower power consumption, and the miniaturization of electronic devices such as desktop computers, phones, medical devices and military applications. Hence the use of miniaturized electronics exploded into a wide variety of applications in nearly every aspect of society.
This miniaturization appears to have started in the 1950s, and the same basic semiconducting components (resistors, capacitors, diodes, transistors) have been in use for roughly 70 years. Technology is supposedly “more advanced” as the parts “get smaller”, hence the recent term “nanotechnology” to represent the “next level” of micro-computing. There has even been research to push miniaturization to the “quantum” level, or to sizes below the “nano” scale, such as “pico”, “femto”, “atto”, “zepto” and “yocto” sized digital entities.
But is the miniaturization of the same components used for over a century a “disruptive technology”, or is it “hiding” the stagnation of innovation in semiconductor components? Are there other phenomena that could truly replace resistors, capacitors, relays and transistors, irrespective of size? Could such new phenomena run on mediums other than the current old micro-technology hardware? If so, could such a replacement completely transform digital devices into breakthrough wonders usable in every facet of our society?
The answer is not clear, especially if there are emerging technologies that have either not reached the mainstream or barely scratched the surface, such as quantum computing. Despite some advances, quantum computing is nowhere near the everyday ubiquity of old micro-computing technology. It also has its limitations compared to conventional computing, including being subject to the constraints of the conventional hardware on which much quantum research still depends.
Others may cite “supercomputers” as evidence of “technological advancement”. But these “refrigerator behemoths”, reminiscent of the large older mainframes, are commonly clusters of mostly commodity desktop-class processors assembled to run in parallel.
To bring more to this observation, one can examine the lack of disruptive innovation in the following sample areas of our current society.
Television/Radio/Telephone Technology
According to Western/Eurocentric historians, the Morse-code-based telegraph led the way for developments in radio, telephone and television.
But phone technology has largely remained unchanged. The basic function of two people speaking over an “electromagnetic wave” in the early 1900s relies on the same radio-frequency principles used between two people speaking over cell phones in 2022/2023. This lack of innovation is especially evident in the messy sprawl of telephone wiring hanging from wooden poles, some as much as 100 years old.
Additionally, wireless technology and radio are older still and have changed little in over 120 years, with the first wireless devices up and running by the late 1800s. The same principles that governed those early devices are still at work in cell phones and other devices, such as the “Internet of Things” (IoT).
Like the computer, TV went through a series of mechanical developments before stabilizing, with the first color TVs demonstrated in the late 1940s and early 1950s. Electro-mechanical and digital televisions also existed during that period, capable of both monochrome and color display. Aside from the shift from the thick cathode ray tube to the flatscreen, there has been no real innovation in TV technology in nearly 100 years; most flatscreens are possible only because old semiconductor technology miniaturized the function of the cathode ray tube.
The Internet
The internet’s precursors and instant messaging (a.k.a. “texting” and “email”) were up and operational by the 1960s and have not seen much real innovation since. This early networking included the first network multiplexers (predecessors of the network switch) and massive databases. The relational database variant followed around 1970, and relational databases, more than 50 years later, are still in widespread use and have changed little since Codd’s description of primary and foreign keys.
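The primary/foreign key idea from Codd’s relational model can be demonstrated with Python’s built-in sqlite3 module. The table and column names below are illustrative assumptions chosen for this sketch, not taken from any real system:

```python
import sqlite3

# A minimal sketch of the relational model: a primary key in one table
# referenced by a foreign key in another, then joined back together.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce the foreign key constraint
conn.execute("""CREATE TABLE authors (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL)""")
conn.execute("""CREATE TABLE articles (
    id        INTEGER PRIMARY KEY,
    title     TEXT NOT NULL,
    author_id INTEGER REFERENCES authors(id))""")
conn.execute("INSERT INTO authors VALUES (1, 'E. F. Codd')")
conn.execute("INSERT INTO articles VALUES (1, 'A Relational Model', 1)")

# A join across the primary/foreign key pair, conceptually unchanged since 1970.
row = conn.execute("""SELECT a.title, au.name
                      FROM articles a
                      JOIN authors au ON a.author_id = au.id""").fetchone()
print(row)  # ('A Relational Model', 'E. F. Codd')
```

This is essentially the same pattern of keys and joins used by modern web backends, five decades on.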
Due to this innovation stagnation, internet-based social media such as YouTube, Facebook and TikTok just integrate the old technologies of TV, radio and instant messaging into one “screen location”. Hence, they are not “disruptive” or “ground-breaking” technology (using a mouse or finger to “select” between social media channels is conceptually no different from turning the knob on an old cathode-tube TV with rabbit-ear antenna; you still get the commercials in both settings).
Gas Powered Vehicles
Despite the gas engine helping to spark what Western/Eurocentric scholars call the “Second Industrial Revolution”, the combustion engine and its variants (rotary, diesel, jet and rocket engines, etc.) have not changed in principle for over 150 years.
The internal combustion engine still powers millions of vehicles around the globe and is one of the leading contributors to what scientists call “global warming”. Electronic devices built with old micro-computer technology mostly function as measurement aids rather than transforming the engine itself (speedometers, gas gauges, flat-tire indicators, rear-view cameras, turn signals, etc.). Changing the interior and exterior with different colors, “eye-pleasing” contours, leather seats, “navigation systems”, exotic materials and shapes does not modernize the internal combustion engine. These elements help companies resell conceptually the same, or slight variations of, car, rocket or jet engines over and over again.
Electric Vehicles
Some people may point to electric cars as a new “mode of propulsion” for automobiles and deem EVs “high tech” or “modern auto technology”. This belief rests on the assumption that the electric vehicle was “invented” or “first brought to market” by the company we call “Tesla”, and that other car companies “are following in pursuit”. Another assumption is that electric vehicles will help reduce greenhouse gas emissions, a claim for which there is still no definitive answer on effectiveness.
What many may not know is that, within Western/American history, the electric car is an old technology actually developed in the late 1800s thanks to progress with batteries and electromagnetic propulsion systems. One researcher attributed the EV’s quiet operation and ease of recharging to why it was very popular among women in the early 1900s. While many explanations have been offered for why gas-powered cars outpaced electric cars in usage, some speculate that Henry Ford’s mass-produced Model T dealt the blow to the equal expansion of the electric car in the 20th century.
Centralized Electric Energy
The fundamental principles of generating electric power from primal energy sources such as water dams and coal have not changed since the technology’s beginnings roughly 150 years ago, according to Western scholars. Coal is still being mined, dug out and processed in 2022/2023 much as it was in the early 1800s. Most of the transmission and distribution lines used to deliver power to the common consumer are just as old. As in automobiles, power plants use micro-digital technology to monitor the large machines that produce electricity, not to radically “modernize” electric distribution.
As populations grew in certain cities, the strains on the grid that plagued engineers in the late 1800s still do so in 2022/23. Evidence includes rolling and total blackouts, repeated attempts to create “smart energy grids”, and experiments with solar, wind or hybrid sources that either did not scale enough to become replacement energy sources or outright failed. This challenge makes one wonder whether the increased push for consumers to buy electric cars is sustainable, or whether a large proliferation of electric cars will put an even greater drain on what is left of the energy grid.
Traffic Technology
Many people who live in a large metropolitan area can identify with the agony and discomfort of sitting in long lines of cars, waiting to move. Traffic congestion is a problem that is over 200 years old. Traffic signals, first installed over 150 years ago to reduce collisions among horse-drawn buggies (with the familiar three-light version appearing about a century ago), are still being used today and have changed little since. Some may mention the “Connected-Vehicle Environment” (CVE) as the next “evolution” of traffic technology, but it is not clear how such an idea will be integrated into current traffic infrastructure. Additionally, CVE uses basic machine-learning software sitting atop old micro-computer technology, and it is not clear whether this would be effective at addressing this agonizing problem.
Machine Learning /Artificial Intelligence
Machine learning and AI were developed and initiated in the 1940s and 1950s. Despite the plethora of algorithms since, the basic principle of AI, attempting to mirror the decision-making of human cognition and neural networks, has not changed in nearly 80 years. This may be directly or indirectly related to the limitations of the micro-computer hardware on which AI applications operate.
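That unchanged basic principle, a weighted sum of inputs compared against a threshold, in the spirit of McCulloch and Pitts’s 1943 artificial neuron, fits in a few lines. The weights and the AND-gate example below are illustrative assumptions for this sketch:

```python
# A minimal artificial neuron: weighted sum of inputs, then a threshold.
# Modern deep networks still stack and chain this same basic unit.

def neuron(inputs, weights, threshold):
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# With weights (1, 1) and threshold 2, the neuron behaves as a logical AND.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], [1, 1], 2))
```

Today’s networks add learned weights, many layers and smoother activation functions, but the core unit is recognizably the same.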
Additionally, most AI development focuses on finding new ways to apply AI to various situations, not on radically innovating AI itself. Hence, the plethora of media stories reporting “new technologies” and “advancements” across the many industries where AI is used may not describe anything really new at all. Developing a new algorithm for a “self-aware” swarm of flying machines is conceptually no different from developing an AI application that “advises” on the investment feasibility of a real estate endeavor. Both are computers using hardware and software to process a user’s input into an output.
Questioning Disruptive Technologies: What Happened?
Many of the technologies mentioned (and many others not mentioned) appear not to have changed in the last 150 to 200 years. It is not clear why the innovation momentum built between approximately 1700 and 1900 seems to have stopped after roughly 1960. Science journals indicate that innovative technologies are still being developed but are largely not making it to the public domain. There are many theories, reasons and lines of research around this point that one can independently ponder and explore.
Furthermore, reflecting on what we believe is the age of the earth and the billions of people who have lived and are living on it, is it realistic to think that the technologies people use in today’s society are the “best” we have? Throughout their existence, humans have observed nature to inspire new developments in machines and tools. Some recent examples include:
· Automobiles, bicycles and motorcycles largely drew imitation and inspiration from humans riding on horses (or being pulled by one);
· Airplanes are often designed using methods and terms associated with observations of the flight behavior and movement of birds;
· Boats and submarines are built on principles of physics similar to those governing the movement of sea animals at and below sea level;
· Artificial intelligence/machine learning drew inspiration from attempts to understand the cognitive construct of the human brain and how it processes information.
Did people suddenly start observing nature to develop technologies only in the last 200 years, compared to the far longer span of human existence? That seems unlikely, since there have been archeological discoveries reporting astonishing technology in the remnants of ancient civilizations thought to be “primitive”.
Adding to the complexity, many of these same literature archives exclude recognition of fascinating technological achievements, such as early aircraft development, reported from ancient Asia, Africa, South America and India.
Conclusion
In research references governed mostly by Eurocentric/Western points of view, there have been, almost every year since approximately 1800, research and reverse-engineering efforts involving many “unusual” scientific subjects. These include electromagnetic gravitics, quantum vacuum energy, unusual construction technologies, unconventional medical techniques and machines said to provide miraculous healing, advanced capabilities of the human mind/body, and other “strange” and so-called “scary” research.
In conclusion, we must remember that technology is a tool used by humans for various endeavors in our day-to-day existence. It should be used to solve problems, enhance human quality of life and support the natural, progressive development of our ecosystem. Problems such as global warming, homelessness, hard-to-cure diseases, inefficient transportation, food distribution, education and some social ills could be transformed by unconventional technologies that truly represent innovation, if they do indeed exist. Without such advancements, our existence could face unnecessary hardships that hinder growth and damage the earth and its inhabitants.