In an interview with Confare founder Michael Ghezzo, Reinhold M. Karner, FRSA, (aka RMK) talks about economic developments and crises and what opportunities and risks they hold for IT and digitalisation decision-makers in companies.
CONFARE: There are currently many issues challenging our society. How fit are our societies for the challenges of the future?
RMK: We live on very thin ice. Not only because of the multiple global crises and the Russian war of aggression on Ukraine. Many things are in question. A new zeitgeist is even emerging. As a result, we are facing considerable risks – but also the potential for opportunities.
I have confidence in our society that we can cope with the many challenges we face. Human beings are fundamentally resilient and very creative – especially when there is a proverbial fire.
Still, we are no longer as fit as we should be. We have become sluggish, too comfortable, too saturated, untrained and spoiled. We already had many diseases of affluence before, but the global crises and their consequences have – to sustain the metaphor – increased our risk of a heart attack. The somewhat out-of-control blood pressure and sugar levels of our situation show up as inflation at a decades-long high, enormous new national debts and the explosion of energy prices.
The medical advisor in this situation – after encouraging us for years to get fitter and live a healthier, more modest lifestyle – now asks again: Do you finally want to face the brutal facts, get your act together and change something quickly? Will you prescribe yourself a fitness programme, practise renunciation and implement it in a disciplined way, or carry on as usual and hope everything will be okay?
Even independent of the war in Europe, the list of challenges is already long. Whether it is our reckless consumption of resources (Earth Overshoot Day), climate change or numerous other issues at many levels – in the economy, globalisation, democracy and politics, the financial and social system – the list goes on.
It is clear that a great deal is changing, that we have to adapt, and that many things should change for the better. Everything else is an illusion. The situation is complex. Interestingly, under the lens of a careful analysis, we realise that we have brought some of it on ourselves.
Let's take the issue of inflation. The last major inflation was triggered by the two oil crises of 1973 and 1979. After that, there was largely a moderate level of inflation for some three decades. Now the inflation rate is higher than it has been for 70 years. But the current inflation problem is not as monocausal as it was in the 1970s.
The reason is not only the shortage of labour and raw materials, the Ukraine war and the other current global crises. Since 2010, in response to the financial crisis of 2007 to 2010, the ECB has almost tripled the money supply of the Eurozone economy, from around €4 trillion (2006) to over €11 trillion (2021). It has also inflated its balance sheet total from about €1.5 trillion (2006) to some €8.7 trillion (2022), while GDP has grown by only a little over 30 per cent in that time. The ECB simply tried to sit out the earlier crises with an ostrich policy, using the larger money supply to paper over them and thus spare the highly indebted euro countries. In this respect, a single currency for such disparate economies was a cardinal flaw, because the economic performance of all EU member states will never be at a similar level.
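The arithmetic behind this mismatch is easy to check. A minimal back-of-the-envelope sketch, using only the approximate figures quoted above (not official ECB data):

```python
# Rough check of the money-supply figures quoted in the text.
money_2006 = 4.0    # Eurozone money stock, trillions of euros (~2006)
money_2021 = 11.0   # Eurozone money stock, trillions of euros (~2021)
gdp_growth = 0.30   # ~30 per cent GDP growth over the same period

money_factor = money_2021 / money_2006  # ~2.75x -> "almost tripled"
gdp_factor = 1 + gdp_growth             # 1.30x
excess = money_factor / gdp_factor      # money grew ~2.1x faster than output

print(f"Money supply: x{money_factor:.2f}, GDP: x{gdp_factor:.2f}, excess: x{excess:.2f}")
```

When roughly 2.75 times the money chases only 1.3 times the output, that gap of over 2x has to show up somewhere – sooner or later, in prices.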
That this bubble of fiat money would burst one day and fall on our heads had been abundantly clear for a long time. Economists such as Prof. Hans-Werner Sinn of the Ifo Institute in Munich have warned about it for years. When too much money meets a smaller quantity of goods, it is only a few steps to a chain reaction. This is what happened in 2021.
Long-term problems in the aftermath of the financial crisis were joined by the trade conflicts initiated by the then US President Donald Trump, especially between the US and China. Then two “black swans” – unlikely events – entered the world stage that no one could have foreseen: the COVID-19 crisis and the accompanying further expansion of the money supply and Vladimir Putin's invasion of Ukraine.
These two events exposed our strategically dangerous dependence on other countries. More significant disruptions occurred in already strained supply chains, and energy resources (gas for electricity production) became scarce and very costly.
Overall, the changes and upheavals resulting from all these global crises will be enormous. We will see a clear de-globalisation, with a bulk of production relocated, brought back or regionalised. Investments in the green energy transition, but also in military defence capabilities, will skyrocket. The world's clusters of economic zones will reorganise themselves. Distinctions between friends and partners, "frenemies", arch-competitors and enemies will play an essential role in this.
It is understandable to me when renowned experts like Prof. Dr Hermann Simon (inducted into the Thinkers50 Hall of Fame in 2019, he coined the term “hidden champions”) now assume that high inflation will probably accompany us for another ten years or so, as was already the case in the 1970s.
The effects of even moderate inflation should not be underestimated. Our currencies lost about 40 per cent of their purchasing power in the 30 years to 2021 (the euro was introduced in 1999). The US dollar had, by the end of 2021, lost a staggering 84.9 per cent of its purchasing power since 1971, when US President Richard Nixon, to finance the Vietnam War, changed the rules of the post-war world economic order by abandoning gold convertibility (the gold standard), thus triggering the “Nixon shock”.
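Taking the rounded figures above at face value, one can back out the average annual inflation they imply with the usual compounding formula:

```python
# Implied average annual inflation from a cumulative loss of purchasing power.
# Inputs are the rounded figures quoted in the text, not official statistics.
def implied_inflation(loss, years):
    """Average annual inflation rate consistent with losing `loss`
    (as a fraction) of purchasing power over `years` years."""
    remaining = 1.0 - loss
    return (1.0 / remaining) ** (1.0 / years) - 1.0

euro_like = implied_inflation(0.40, 30)   # ~40% loss, 1991-2021
usd = implied_inflation(0.849, 50)        # ~84.9% loss, 1971-2021

print(f"~{euro_like:.1%} p.a.")  # roughly 1.7% per year
print(f"~{usd:.1%} p.a.")        # roughly 3.9% per year
```

Even a seemingly tame 1.7 per cent a year compounds into a 40 per cent loss over a working life – which is precisely why moderate inflation should not be underestimated.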
Yet inflation is not inevitable. Before the abolition of the gold standard, governments were forced to exercise discipline in their use of money. From 1680 to 1930 there were constant ups and downs between rising and falling prices (inflation/deflation), but these evened out over the years, so inflation averaged zero per cent across that period.
In this respect, it should be clear today that the current overall situation will also cause our prosperity to melt away noticeably.
CONFARE: While we have talked a lot about transformation, innovation and progress in recent years, today's crises have slipped strongly into the public perception. Is the flight of fancy over?
RMK: Nobody knows the future. As Sir Isaac Newton observed as early as the 17th century, logical analysis, thinking and calculation cannot fully capture complex dynamic systems. You cannot explain the functioning of something living with cause-and-effect logic alone. The same applies to the hyped expectations of artificial intelligence (AI). Even today, the mathematical foundation of cause and effect is problematic, as explored in The Book of Why by Prof. Judea Pearl.
I can, therefore, only speculate. I think continuous technological development will hardly slow down – after all, necessity is the mother of invention. On the other hand, economic development will probably continue for some time, in real terms, at a somewhat lower altitude. Let us hope the situation with Taiwan does not escalate, because the world economy would take a nosedive due to the massive dependence on the semiconductor industry there – two-thirds of all microchips needed worldwide come from this island state.
Let us hope the situation with Taiwan does not escalate because the world economy would take a nosedive due to the massive dependence on the semiconductor industry there – Reinhold M. Karner
The good thing about the current situation may be that it gives us more grounding. After all, topics such as innovation, start-ups, globalisation or other so-called advances were also riddled with considerable PR bluster, most of which ultimately did not pass the acid test.
According to economist and founder of modern management Peter F. Drucker (1909-2005), a business enterprise has only two essential functions – marketing and innovation – because only these produce results; everything else is a cost. Customer value is indeed created mainly through innovation, but a certain sobriety is then appropriate in practice. Studies (e.g. by Simon-Kucher & Partners) show that about 70 per cent of innovations are simply disappointing when it comes to increasing customer value; paradoxically, this percentage is even much higher for digital and high-tech innovations.
And what about new businesses and start-ups – the latter being, by definition, only those that are innovative, show significant growth and are younger than ten years? According to the Eurostat report Key figures on European business – 2022 edition, only 45 per cent survive their fifth year; more than half die off by then. Most start-ups, especially in the high-tech, deep-tech and digital sectors, do not survive three years, and fewer than 20 per cent of all former start-ups survive their tenth year. This shows a considerable gap between desire and reality in the public perception – too many things are wrong from the start. To explain all this in more detail, I ask for your patience until probably autumn 2023, when I will publish a book on the subject with both the analysis and the solutions.
The crux of the matter is that wanting to protect the environment and simultaneously focusing on limitless growth are mutually exclusive. It just doesn't work.
On the other hand, the central question we have to ask is: What do we really need? Our current economic system is geared towards limitless progress and permanent growth – that is the promise of salvation. It has a lot to do with Keynesianism and its errors. It is becoming increasingly evident in many minds that things cannot go on like this.
Our times appear to be somewhat extraordinary and turbulent. Yes, they have their difficulties, but also their solutions. But we are not unique in this – and it’s not the first time this has happened. And as history has shown, a new day will follow – that may herald new problems, but also new opportunities and possible solutions.
CONFARE: With Metaverse, crypto, NFT or blockchain, many new hype topics have appeared on the horizon. What recommendations do you have for correctly assessing the scope of such developments?
RMK: Let's start with the currently tangible things, where we don't need a crystal ball. Our use of language is already perverted when we speak of virtual reality, for instance. There is no such thing. Something is either virtual or real – you can't be a little bit pregnant. There is only a binary answer here: 0 or 1, virtual or real.
The fact that cryptocurrencies and the entire crypto scene have been in turmoil for some time should not come as a surprise. These applications and their digitally securitised values are only virtual. They have been sold to the public as the new alchemy by modern means. Many have made billions on this, and others have lost just as much. "Crypto is the new betting,” wrote Martin Hock in the famous German daily newspaper FAZ. He is correct; the element of speculation is disproportionately inherent in it.
Rana Foroohar wrote in The Financial Times in November 2022: “New asset, old problem – If the bankruptcy of FTX and the subsequent meltdown of all things crypto have shown us anything, it is that this time, it still isn't different when it comes to the financial sector and risk. The product at the heart of the current market collapse may be high-tech, but the details of how we got here mirror many aspects of the 2008 financial crisis and other periods of financial speculation, such as the dot-com bubble or even the run-up to the 1929 market collapse."
We see that people do not learn consistently. Greed is always dangerously tempting – and we have known this since the bursting of the first well-documented speculative bubble in economic history, the Tulip Mania (1637) in the Netherlands.
In addition, cryptocurrencies have been used for a great deal of fraud, money laundering and illegal transactions, precisely because the scene is a completely opaque black box. Hence part of the previous boom: it attracted many people from the shadow world.
The billions in illegal money transfers and thefts so far have also been breathtaking. Cryptocurrencies – despite, or even because of, the blockchain – are at best no more secure than conventional assets, yet they are even more rewarding targets. Such a heist can be carried out from any living room or home office around the world, without the need for a large getaway vehicle or storage space for the stolen fortune.
This parallel financial world is not adequately supervised and regulated because it operates globally without borders in the digital space and is difficult to catch.
I had dinner in October 2018 with Changpeng Zhao (nicknamed “CZ”), the founder and CEO of Binance (est. 2017), by far the largest cryptocurrency exchange in the world – which is why CZ was even considered one of the wealthiest people in the world for a while. At the time, he was still raving about his vision of a liberal, unregulated financial system that bypassed the establishment. However, in November 2022, after the FTX bankruptcy, he urged leaders at the G20 summit in Bali, Indonesia, to introduce strict regulations, rules and safeguards.
This scene has now arrived in reality after all – despite its virtual trappings. The change from gambling to serious predictability and the creation of trust will still need a lot of effort and time, but the path in the right direction has – by necessity – been taken. It seems.
If large central banks like the ECB make our currency available digitally, it will probably be subject to the same rules and controls as our current monetary system – unlike Bitcoin, which the ECB itself has claimed is ‘on the road to irrelevance’. That will help massively. Nevertheless, one can only hope that cash and giro money will not be abolished, because that would have fatal consequences; in the long run, I think that risk is widely underestimated. Regarding blockchain, the world's leading IT market researcher Gartner Inc. was right from the start in its assessment that this is a fundamental technology, not an application, and it will therefore take a long time before it is standardised, robust, energy-efficient, secure and trustworthy.
The father of the World Wide Web, Sir Tim Berners-Lee, did not rate the decentralised database technology blockchain as a suitable solution for building the next generation of the internet at the Web Summit in Lisbon in November 2022. “Ignore the Web3 stuff,” the physicist and computer scientist advised, and was not at all convinced by the future scenarios of the crypto-visionaries. He even called it a “real shame that the Ethereum people have adopted the already existing term Web3 for a lot of the things they are doing with blockchain, when in fact this form of Web3 is not the web at all”.
On almost every website of a Web3 project, you can find the term “decentralisation”. Web3 itself likes to be called the “decentralised web”, but the reality is far from that. The well-known hacker Moxie Marlinspike concluded that “all the promises of decentralisation are just PR, because the well-known products all depend on a handful of central services”.
The beacons of Web3 themselves do not adhere to its core values and promises. Moreover, the question arises as to what practical value abstract decentralisation has, because the WWW is also technically decentralised, yet we see monopolies forming.
When dealing with Web3, one is repeatedly amazed. Go through the system and you find that smart contracts, once deployed, can only be patched at significant cost and under certain conditions – while any attacker can read their code. Or one wonders why NFTs, touted to us as the future of digital ownership, give you no ownership or licensing rights to anything, nor any guarantee that they are the only token referring to a specific object.
Even though Web3 evangelists never tire of beating the drum for the miracle of the next web, the situation is similar to the blockchain debate: one always speaks only in the subjunctive; nothing is binding or fixed. So it's better to keep a wary eye on things for now.
Consider also – although Ethereum has meanwhile switched to energy-saving operation – that, as studies show, the digital currency Bitcoin causes greater climate damage than the global production of beef or all SUVs. More electricity is consumed in a year for Bitcoin transactions than by Austria or Portugal. This shows it will be many years before we have established a standardised, acceptable fundamental technology for blockchains at full scale. Everything in between will be interim solutions.
Incidentally, all digitalisation (hardware, infrastructure) accounts for around eight per cent of global electricity consumption. This share is expected to increase by 50 to 80 per cent by 2030. This is not only due to more users and applications on the web but also because even AI is immensely energy-hungry.
Hugging Face estimates that training a large language model (LLM) produces at least 25 tonnes of CO2 emissions. This figure doubles if one also considers the emissions caused by producing the supercomputer hardware, building the computing infrastructure and the subsequent regular operation. Put differently, the roughly 50 tonnes of CO2 emissions for such a model correspond to about 60 long-distance flights between London and New York.
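The flight comparison is easy to reproduce. A rough sketch, assuming (this per-passenger figure is my assumption, not from the estimate above) somewhat under one tonne of CO2 per passenger for a one-way London–New York flight:

```python
# Sanity check of the LLM-vs-flights comparison above.
llm_total_t = 50.0    # tonnes CO2: training plus hardware and operation
per_flight_t = 0.85   # assumed tonnes CO2 per passenger, London-New York
flights = llm_total_t / per_flight_t
print(f"about {flights:.0f} flights")  # in the ballpark of the ~60 quoted
```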
Nevertheless, once the enabling blockchain technology is fully baked, its dissemination, integration and business volume should be gigantic. Where today's blockchain possibilities already bring real advantages – a tangible benefit – and you are sitting in the driver's seat, i.e. you have technological and application control, I would already work with low-hanging-fruit applications, even with simple smart contracts or non-speculative NFTs, and gather experience, but in the knowledge that this will not be the mature technology architecture.
On the Metaverse, I'm cautious because I still see it as more of the next hype, a kind of technological greenwashing. I also see Facebook's renaming to Meta "because of this" more as a flight forward, to distract from the many systemic challenges reflected in the decline of its share price and the current wave of lay-offs. The name choice also seems to have been made too hastily, because the company will have to change its name again at the next big technology shift, which weakens the brand.
With all this hype, whether about crypto, AI or Metaverse, we should keep in mind the Gartner Inc. hype cycle, which in my experience is the most accurate. This was introduced by their analyst Jackie Fenn in 1995 and is still valid.
The waves and phases of this hype cycle for technologies lead across its timeline from the ‘Technology Trigger’ rapidly uphill to the ‘Peak of Inflated Expectations’, then nosedive down into the ‘Trough of Disillusionment’, then recover somewhat along the ‘Slope of Enlightenment’ before entering the ‘Plateau of Productivity’. In my 40 years in business, I have hardly seen any technology develop differently.
Tim Cook, CEO of Apple, made it clear in an interview with the Dutch news channel Bright in September 2022 that he doesn't think much of this vision of Facebook founder Mark Zuckerberg, saying: “I'm really not sure the average person can tell you what the Metaverse is. … And I don't think you want to live your whole life like that”.
He is probably right. In fact, I think hardly anyone can provide a general definition because there are so many ideas, approaches, and visions about it. Everyone is pulling in a different direction. And that is the problem.
In an October 2022 Wall Street Journal Tech interview with Apple's vice presidents Craig Federighi, head of software engineering, and Greg Joswiak, global marketing boss, the latter said that “Metaverse” was a word he would never use.
Hanna Henning, Siemens Group CIO, said at the Tyrolean Economic Forum in November 2022 that virtual parallel worlds, such as those being developed in the Metaverse, would play a major role in industrial production in the future. People would meet in the Metaverse to create products. This would have several advantages because pre-developments could be made with a minimum of material expenditure, their actual practical suitability could be put through its paces before production, and any problems could be ironed out. Metaverse technology would usher in a new era and come faster in the industry than elsewhere.
I’ve stopped counting the times we have heard such promises. Even though I can understand Hanna Henning's enthusiasm, I see nothing more in it than an amalgamation and further evolutionary development of already existing technologies, repackaged under a different name.
It is about expanding the visual interface – more 3D visualisation, a more powerful user experience (UX, MX, CX, EX) – and about further possibilities for human interaction with digital equipment beyond the keyboard, mouse, touchscreen, monitor and stick. Specifically, the combination of technologies is crucial: digital twins, CAD simulations, computer gaming, Microsoft's HoloLens, video conferencing or 3D video chat solutions à la Google's Starline, hologram projection, voice dialogue and control and sound à la Alexa or Siri, and VR glasses.
As I see it, this is a logical development, even a consolidation, but neither a new era nor a General Purpose Technology (GPT). However, with such concepts, especially for private and consumer use, one should be somewhat concerned about the “escape from reality into virtuality”, just as the extent of gambling addiction in the gaming scene has long been very alarming.
Creating hype and surfing on these waves while making a lot of money and pushing up company valuations is something we have always been very good at in the information technology scene and the world of digitalisation.
The big question is: Which side are you on? As a provider, it may give you a significant advantage to jump on such a bandwagon with great momentum. If you are on the user side, things may look different.
Stanford professor Jim Collins and his research team, in a widely acknowledged analysis of long-term, extremely successful listed corporations in the million-selling bestseller “Good to Great”, identified an engaging strategic key factor in this regard, which in my eyes still hits the nail on the head today and also applies to SMEs.
Highly successful corporations think differently about technologies and technological change than mediocre companies. These top performers avoid technological fads and hypes and yet become pioneers in applying carefully selected technologies. Technology by itself is not an impulse generator but a catalyst. Therefore, to judge whether technology serves as an accelerator of business success for one's company, the most important criterion is to answer whether it fits the company's strategy. After all, a company cannot use technology meaningfully without understanding and examining its exact significance for its core business. Therefore, it is best to avoid blindly jumping into the latest “innovations” for fear of being left behind.
How a company responds to technological change is a good indicator of its inner drive for extraordinary success rather than mediocrity. Great companies respond with thoughtfulness and creativity, driven by a desire to turn untapped potential into results. Mediocre companies, on the other hand, too often react out of fear of being left behind – FOMO, the fear of missing out.
CONFARE: In your opinion, what are the megatrends (digitalisation, AI …) that decision-makers should address today?
RMK: In my consultations and coaching sessions, I generally find that many business leaders still find it challenging to understand the breadth and depth of digitalisation in principle and even more so regarding the potential benefits or damage to their business. This is understandable because most well-experienced entrepreneurs or managers do not come from an information technology background, and one cannot know everything.
Nevertheless, it’s a must for success to deal with it sensibly. Gobbledygook should be left out because it doesn't help at all. The more straightforward and well-founded the explanations and practical examples, the better. This can be done professionally in a few days with an intensive crash course.
Today, it is almost impossible to imagine any industry without the professional application of digitalisation. Not doing so will be a severe competitive disadvantage in the foreseeable future. However, using this technology – correctly, appropriately and sensibly – can lead to unimagined success.
In my view, there are three central issues that business leaders should know and understand in principle from a bird's-eye perspective.
First, nowadays we can obtain a practically infinite amount of knowledge in the blink of an eye, almost effortlessly. On the internet, in the digital space, we can retrieve knowledge at any time of day or night, in every possible and impossible situation in life. We have the world's knowledge at our disposal.
But it’s not about adding more knowledge to our knowledge. It's about essential knowledge. Even more so, about a well-founded, correct understanding of facts. And as a result, we must avoid turning off our common sense or becoming dumbed down by an overabundance of information.
The second is that algorithms (mostly AI-based) have long been the secret weapons of companies. Some systems somewhere in the world know us better than our mothers or partners do. With only 68 ‘thumbs up’ on, say, Facebook, an analysis can determine someone's skin colour, sexual orientation or political orientation; with 70 likes, a suitable algorithm can assess the user better than their friends; with 150 likes, it knows them better than their parents; and with 300 likes, the life partner no longer stands a chance against the machine. Finally, 350 likes are enough for the machine to know the user better than they know themselves.
The fundamental element of the new digitalised world in the age of Big Data is the human being as a customer, i.e. a personalised data package. The aim is not to discover what this ‘package’ thinks but how its behaviour can be controlled and predicted. Human behaviour is all the more predictable the more it moves in habits, which is usually the case. That is why habits are the preferred object of desire of behavioural research in the consumer goods industry; in fact, they are usually its gold mines.
Today, behavioural specialists are no longer psychologists and sociologists but statisticians, Big Data specialists and computer scientists. AI algorithms are increasingly taking on the leading role. With digital technologies, far more information is collected, processed and then translated into new knowledge using AI algorithms, and faster than ever before. This is not only related to consumers but also to almost all business processes and areas.
Today, behavioural specialists are no longer psychologists and sociologists but statisticians, Big Data specialists and computer scientists – Reinhold M. Karner
And thirdly, it should be understood what artificial intelligence is, because the term is entirely misleading. Of the original ideas behind AI, ‘only’ machine learning (ML) has worked to date – in particular deep learning, thanks to the 2012 breakthrough of the team led by Stanford professor Fei-Fei Li. (See her exciting TED Talk on this.)
The resulting algorithms, often highly complex and extensively trained, are to date based over 95 per cent on ‘supervised learning’. Full stop. These already offer powerful, splendid and beneficial applications, and it is all based on mathematics, especially probability calculations.
For the business world, I find the definition of the Canadian AI professors Ajay K. Agrawal, Avi Goldfarb and Joshua Gans, who have also written several fascinating books on the subject, to be the best so far, namely: “AI is a Prediction Machine. It is a very cheap technology for business decisions, autonomous driving, medical analysis, weather, language processing (NLP), control, chatbots, financial decisions, etc.
“Prediction is the process of filling in the missing information. Prediction takes the information you have, often called ‘data’, and uses it to generate information you don’t have.”
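This framing – prediction as filling in missing information – can be illustrated with a deliberately tiny sketch (a toy nearest-neighbour predictor with invented data, not any vendor's actual algorithm):

```python
# Prediction = using the information you have (history) to generate
# the information you don't (the outcome for a new case).
from math import dist

# Invented examples: (behavioural features, outcome)
history = [((1.0, 0.0), "bought"), ((0.9, 0.2), "bought"),
           ((0.1, 1.0), "skipped"), ((0.2, 0.8), "skipped")]

def predict(features):
    # Return the outcome of the most similar known case.
    return min(history, key=lambda case: dist(case[0], features))[1]

print(predict((0.8, 0.1)))  # -> bought
print(predict((0.0, 0.9)))  # -> skipped
```

Real ‘prediction machines’ replace this toy distance rule with deep networks trained on vast data, but the economic logic – cheap, ubiquitous prediction – is the same.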
The extent to which Amazon, for instance, will one day manage to increase the hit rate of purchase recommendations from the current five-plus per cent to over 95 per cent – to realise its corporate vision of turning the business model from today's “shopping-then-shipping” into “shipping-then-shopping” – remains to be seen. Still, it shows what the company is aiming for, thanks to AI.
The technological father and principal designer of ARM processors, Prof. Steve Furber, is also the chief developer of the Human Brain Project (HBP), where, after more than a decade of planning and construction, he and his team put the neuromorphic SpiNNaker system with over one million ARM processor cores into operation for the first time at the end of 2018 at the University of Manchester and TU Dresden. This is a research project of the European Commission, which aims to compile all the knowledge about the human brain and reproduce it using computer-based models and simulations.
Furber, together with the Cambridge-based Austrian Dr Hermann Hauser (then CEO, now a deep-tech star investor with Amadeus Capital) and computer architect Sophie Wilson, founded the British ARM Ltd. in 1990, now a decacorn, a multibillion-dollar company. Thanks to their energy-efficient RISC architecture, we have smartphones and tablets today. Many times more ARM processors (over 230 billion units shipped by 2022) are now manufactured and installed annually than Intel produces.
When Hauser and I asked ourselves during the commissioning of SpiNNaker what the brain simulation performance of this specialised neuromorphic supercomputer would be and to what extent it would be exponential in any way, Furber wrote me the following explanation:
"The brain is quite modular, so the number of connections does not grow exponentially with the number of neurons. In fact, it has been shown that a form of Rent’s rule applies equally to brains as to digital circuits – the amount of wiring follows a power law as a function of the number of neurons.
"However, a simpler way to look at this is just to count neurons and/or synapses. The human brain has just under 10^11 neurons and 10^15 synapses. On SpiNNaker we can model up to a thousand neurons per core, so a million cores can model one per cent of the number of neurons in the human brain, but we can probably handle at most 10^12 synapses. These would have to be the simplest sorts of neuron and synapse model. We can probably handle a mouse brain model, which is 1,000 times smaller than the human brain. In practice, a whole (real time) human brain model is beyond the capabilities of any existing machine. However, a lot of data is missing to build such a model, even when we have a powerful enough machine!”
The fact is that, until now, we still do not understand how the human brain works at all. That is why Furber wrote that they work with only the simplest neuron and synapse models. This means it is impossible to recreate a precise, identical digital twin in hardware and software for a 1:1 simulation. Until brain research unravels all the riddles, this remains hopeless and can only be approached cautiously.
For me, it’s clear our brain does not work digitally and mathematically. I think we are even looking in the wrong place to understand our brain, our mind. By creation, the universe consists of only about five per cent matter; everything else, the central part, is oscillation, vibration, spirit, or whatever you want to call it. To me, it currently looks as if someone encountering digitalisation for the very first time, wanting to fathom this “modern miracle” scientifically, were examining only what is visible and physically tangible: the hardware and the infrastructure. I think we can quickly agree that even after a hundred years of such research, you would never understand how and why digitalisation works, because operating systems, firmware, software and so much more of the “invisible” also contribute to the overall workings of digitalisation.
It’s a bit like the Big Bang which, as I understand it, explains too little and is really just a pretty picture of the limits of what can be explored. Thanks to modern technologies such as the James Webb Space Telescope (JWST), we can look back almost to the Big Bang, to just after it, but not at the Big Bang itself. And certainly not behind it: what was there before the Big Bang, and why did it happen in that way and at that time?
That’s why I think we need different perspectives in science. Even research into the fascinating field of quantum physics will probably not be enough to understand the whole secret of how our brain, our thinking, inspiration and intuition work, let alone to recreate them.
Long before the advent of modern physics, Johann Wolfgang von Goethe (1749-1832) posed the question in his work Faust (Faust I, lines 382–385) about the nature of the basic building blocks of the world: “To grant me a vision of Nature’s forces, that bind the world, all its seeds and sources. And innermost life—all this I shall see, And stop peddling in words that mean nothing to me.”
Physics is still searching for a mathematical understanding of the universe, for the ‘world formula’, and likes to cite this Goethe passage for that endeavour. Science’s answer is still a long time coming.
In 2017, the scientists of the Swiss Blue Brain Project published humbling, astonishing findings on the human brain: our brain thinks in as many as 11 dimensions. “We found a world that we had never imagined. There are tens of millions of these objects even in a small speck of the brain, up through seven dimensions. In some networks, we even found structures with up to 11 dimensions,” said the lead researcher, neuroscientist Henry Markram of EPFL in Switzerland.
But today’s computers can only calculate in binary, digital form, and consume an enormous amount of energy, of electricity, relative to the brain. Incidentally, quantum computers do not work in a much more brain-like way either.
Training large ML models requires powerful computers that consume many megawatts. Our brain, by contrast, draws only around 20 watts and, with its roughly 90 billion neurons and 500 trillion synapses, is capable of even more complex reasoning. AI is still light years away from this, if it ever achieves it.
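The scale of that energy gap is easy to put into numbers. The cluster figure below is an assumed order of magnitude for illustration only, not a measured value; the brain’s ~20 watts comes from the text above:

```python
# Order-of-magnitude comparison of power draw: a large ML training
# cluster vs. the human brain. The 10 MW cluster figure is an
# illustrative assumption, not a measurement.

BRAIN_POWER_W = 20                 # the brain draws roughly 20 W
CLUSTER_POWER_W = 10 * 10**6       # assumed 10 MW training cluster

ratio = CLUSTER_POWER_W / BRAIN_POWER_W
print(f"The cluster draws about {ratio:,.0f} times the power of a brain")
```

Under that assumption, the cluster consumes around half a million times more power than the brain it is trying to rival.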
And it is not just about computing power, which is all that AI really does; our brains can do so much more.
The fact that computers are superior to humans at certain tasks has nothing to do with intelligence. Mechanical machines have been far superior to us in muscular strength since the Industrial Revolution, and before that it was animals such as horses. That doesn’t bother us either; rather, it helps us. When it comes to large computing tasks on large amounts of data, the first spreadsheet, VisiCalc by Dan Bricklin on the Apple II, a predecessor of Excel, was already vastly superior to us in 1979. But even that has nothing to do with intelligence. As with AI, it is simply complex machine computation.
Of course, we could look at possible future computer technologies such as quantum computing, optical computing or DNA computing, or at why the nanometre designation for chips has long since become a misleading label. This is all very exciting, but it is not yet relevant for business users and would probably go too far here.
A perennial issue, of course, is cybersecurity. I don’t think we can ever address it enough, as new tricks are invented and loopholes discovered and exploited every day, with devastating consequences for those who do not protect themselves adequately.
CONFARE: Digitalisation and IT have an enormous impact on society. At the same time, access to the internet and technology is not fairly distributed worldwide – how much explosive social power do you see in that?
RMK: Fair and balanced development has probably never existed since the dawn of civilisation. We can only keep working on it, yet, as we know, many countries and people who could afford to do so are not pulling their weight. What helps extend the internet’s reach to as many people as possible are increasingly sophisticated and cheaper technologies. A lot has already happened here and will continue to happen.
But, as I have already explained in my articles for daily and business newspapers, which can also be found on my website, I currently see much more social explosive power in the question of how the Russian war of aggression on Ukraine will end. It is the attempt of a despot to overturn the world order, to re-establish the old system of the right of the strongest and, in doing so, to crush the weaker. If this were to succeed, the world order would be so shaken to its core that no country in the world would be safe any more, especially a smaller one. Then the question of fairly distributed internet access would no longer matter either.
But the conflicts, especially between China and the G7 countries, the Western world, in the struggle over which political system will gain the upper hand globally, also play a crucial role. Especially when countries like Russia and China are already threatening to shoot down satellites that would give their populations free access to the internet, because they would then lose their leverage for censorship and for the autocratic or totalitarian control they need to stay in power.
Nevertheless, I hope that the step-by-step expansion of internet coverage will not be stopped, even in poorer or autocratically run countries.
Reinhold M. Karner, FRSA, is an entrepreneurship and start-up evangelist, multiple chairman (e.g. AP Valletta), corporate philosopher, entrepreneur, author, university lecturer and fellowship connector of the Royal Society for Arts, Manufactures and Commerce (RSA) for Malta and Austria.