Sunday, January 14, 2024

Supercomputers Are A Load Of Crap

To begin with, there's something I need to tell you. My family has a pretty serious health crisis. The "our next year is completely going to be about survival" kind of crisis. Sorry. Hardware projects are going on soft hold, woodwork projects are going on soft hold, garden projects are going on soft hold, and blogging is going on soft hold. 

So when I say supercomputers are a load of crap, I honestly really mean it. In the days of yore (OMFSM was it ever 1984? Forty years ago? Well fk me dead.) I remember being in so much awe that a Commodore 64 or and Amstrad 464 could run a spreadsheet or word processor that I'd have had to code by hand for my old ZX80, and load from my cassette player every time I wanted to run it. 

I was in so awe, much big eyes! for that first 386, the 486, the first Pentium I owned. Don't know what I'm talking about? That just proves my point innit? They were powerhouses for their day. And each reigned supreme for a matter of a year or two, sometimes only a matter of a few months, before the next superprocessor came along. 

Silicon Graphics came along with a few superb workstations that amazed (I've never actually used an Apple of any kind), and the CRAY-1 from the previous decade was still one of the most amazing supercomputers I knew about, because it was a first, and it had a new successor, the CRAY-2.

It's almost fifty years after the CRAY-1, and now I've used little single-board computers from the PIC8F series through Arduinos and ESP32s and RP2040 boards. That last one is the same processor family as the Raspberry Pi, the little single-board computer that's the size of a pack of playing cards. CRAY-1? You're outta here! Yeah, I cried a silent little tear over that.

But there's a lesson in that. 

Everything is capable of being improved on. E.V.E.R.Y.T.H.I.N.G. I just recently posted on one of my other blogs about how far AI and tech in general will improve in the future - if we let it. 

I'll be posting pretty soon (crisis permitting) about how we've gone from just knowing that CO2 is A Bad Thing for the planet and "sequestering" it in the ground, to trying to scrub it out of the air with really big vacuum cleaners with great HEPA filters, to the point where we can now convert it into useful chemicals and materials and even clean-burning fuel.

I've also seen EV batteries go from lead acid to lithium to - well, read on. From a range of a few dozen miles and a life of a few thousand miles before needing to be replaced, to a few hundred miles of range and tens of thousands of miles before replacement, to the fabled million-mile battery with a range of a thousand miles per recharge.

Materials have gone from exotic and difficult to engineer, like Kevlar originally was, to stuff ten times stronger than Kevlar that can be produced in a medium-high-tech bathtub. Solar panels have gone from pathetic little things that failed in the full heat of the sun after a few months and produced under 100 watts, to panels (like the one I use to keep 12V on in the shed and provide mains voltage for some equipment so I don't need an upgraded breaker for the workshop) that clock in at 240 watts and will probably outlast me.

And quantum computers are already in the shadow of The Next Big Thing. We're so good at continually improving and even on improving the speed at which we make improvements that there will come a point where the line goes vertical. 

There are also rumours that we already live in a simulation, and perhaps when that point is reached and we create The Last Big Computer, it'll be capable of reversing time and running the simulation on itself so that we can bring its existence about.


Not going to beat about the bush: unless I can manage some kind of asymptotic increase in productivity myself, I simply won't be able to post as many articles. Fewer articles means fewer social media notifications. Fewer of those means fewer visitors to the blogs, and that means fewer people patronising these pages and less money to keep things rolling along online. It's a death-spiral, people.

Unless you can, you know, maybe donate using one of the links below, share this article or the link to Ted's News Stand, and that way maybe I can keep my own money and pay those things out of patronage. It's a dream I have...

If this was a video I'd be saying "Make sure to like and subscribe, and be sure to visit my Ko-Fi page!" 
