Noah Crisp
Cybersecurity Reporter
A member of the Project Counsel Media team
19 July 2024 (Washington, DC) — Ah, timing is everything.
This morning’s monthly “cyberchat” breakfast sponsored by Lockheed Martin (which does an enormous amount of cybersecurity work for the Federal government) was going to be on cyber kill chains. The cyber kill chain is a series of steps that trace the stages of a cyberattack, from early reconnaissance through to the exfiltration of data. The kill chain helps us understand and combat ransomware, security breaches, and advanced persistent threats (APTs).
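For readers who have never seen the framework laid out, the Lockheed Martin Cyber Kill Chain is usually described in seven stages, from reconnaissance through to actions on objectives. Here is a minimal sketch in Python that lays those stages out as a simple data structure: the stage names are Lockheed Martin’s published ones, while the one-line descriptions and the small helper function are my own illustrative assumptions, not anything from the company.

```python
# A minimal sketch of the Lockheed Martin Cyber Kill Chain.
# The seven stage names are the published ones; the short descriptions
# and the helper below are illustrative assumptions, not Lockheed Martin code.

KILL_CHAIN = [
    ("Reconnaissance",        "harvesting targets: emails, credentials, public data"),
    ("Weaponization",         "coupling an exploit with a deliverable payload"),
    ("Delivery",              "phishing email, malicious link, USB drop"),
    ("Exploitation",          "triggering the vulnerability on the victim system"),
    ("Installation",          "installing malware and establishing persistence"),
    ("Command and Control",   "beaconing out to attacker-controlled infrastructure"),
    ("Actions on Objectives", "data exfiltration, encryption, or destruction"),
]

def stages_completed(last_observed_stage: str) -> list[str]:
    """Return every stage up to and including the one last observed."""
    names = [name for name, _ in KILL_CHAIN]
    if last_observed_stage not in names:
        raise ValueError(f"Unknown kill chain stage: {last_observed_stage!r}")
    return names[: names.index(last_observed_stage) + 1]

if __name__ == "__main__":
    # Example: an intrusion detected while malware was calling home.
    for name in stages_completed("Command and Control"):
        description = dict(KILL_CHAIN)[name]
        print(f"- {name}: {description}")
```

The point usually drawn from the model is that a defender only has to break one link in the chain to disrupt the attack, which is why it shows up so often in breach post-mortems.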
But the global IT crash knocked out that agenda as the chaps from Lockheed Martin ran us through how today’s global IT meltdown happened, and why. The technical explanation is quite complex but there is enough out there in the mainstream media to give you a briefing.
As all the presenters noted, it really should be a “wake-up call”, but nobody is going to listen. And as one said:
“If a single bug or error can take down airlines, banks, retailers, media outlets, and more – on a regular basis – what on earth makes you think we are ready for AGI? It is insane people are still talking about AGI. The world needs to up its software game massively.
We need to invest in improving software reliability and methodology, not rushing out half-baked chatbots. But we won’t do that, will we?”
Twenty years ago, Alan Kay said it best:
“Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves. Few people have any clue the fragile system we are building.”
And just a note on Alan, whom I actually met once and whom our boss knows well. He is the American computer scientist best known for his pioneering work on object-oriented programming and windowed graphical user interface design. At Xerox PARC he led the design and development of the first modern windowed computer desktop interface. It was his ideas that Steve Jobs famously poached, after visiting Xerox PARC, to build the Lisa and the Macintosh.
Or better yet, read the book Rebooting AI (Gary Marcus and Ernest Davis), which explained that part of the reason we are struggling with complex AI systems is that we still lack adequate techniques for engineering complex software reliably. None of that has changed.
Chasing down black-box AI, which is always difficult to interpret and especially difficult to debug, is not the answer. And leaving more and more code writing to generative AI, which grasps syntax but may never grasp meaning, as recent analyses from Stephen Wolfram show, is not the answer, either. As we have noted before, Stephen is the “go-to” guy for explaining how AI works, and generative AI in all of its manifestations.
As tech CEO Thorsten Linz (founder of multiple tech companies and an expert on AI infrastructure and communications infrastructure) said on Twitter: “tech giants really need a serious commitment to software robustness, and I do not see it. Rushing innovative tech without robust foundations, which is exactly what everybody seems to be doing, is utter stupidity”.
And with the TRUMP 2.0 machine seeking to end all tech and AI regulation, an unregulated AI industry is surely a recipe for complete disaster.
Yet it still boggles the mind that this disruption, which reached what some experts called “historic” proportions, is – yet again – a stunning example of the global economy’s fragile dependence on certain software, controlled by only a few companies, with a single failure cascading across the world.