Chris X Edwards


Internet Denied

2016-09-25 16:04

Last week my cable modem suddenly died and that single point of failure cast me into the prehistoric world of no internet. While I was sorting that mess out, computer security hero Brian Krebs was also having internet trouble. He came back online today after being squeezed off the internet for a few days by a distributed denial of service (DDoS) attack, a rather nasty one it seems. He wrote an interesting report and a follow-up, both worth taking a look at. It kind of makes my head spin. There are a lot of technical details, but the overall impression is that the internet is sick.

Let’s take a look at some of the rotten parts. First is DDoS itself. I never paid this much attention. It seemed like a craven and clumsy kind of petulance more than anything serious. "These are non-professionals who use DDoS…to instigate attacks out of boredom or spite" says this report. Do we really need to be worried about bored punks? Maybe. It seems this stuff is becoming professionalized, which is making it more prevalent. Beyond denying you service to a legitimate resource you’d like to have access to, the aforementioned DDoS clumsiness pollutes the entire internet with junk packets. It basically degrades everyone’s bandwidth or increases everyone’s cost or both. I can’t find a solid number but it seems like between 2% and 5% of internet traffic is bogus packets. I don’t even know if that counts spam.

Brian mentions the Internet of Things. I find the notion somewhat cloying since I’ve been waiting for computers to interact with the real world for a long time. And like the sadly clumsy applications that have been contrived for our marvelously small and efficient new computers, the dumb things that are being envisioned for the IoT make me think, don’t bother. Brian’s experience (many of the attacking hosts were IP webcams) shows that these devices, which are really just difficult-to-manage computers with bad proprietary controls, are a security nightmare and a threat to the internet of data.

Brian mentions border gateway protocol, BGP, in his last article before the recent attack (perhaps not a coincidence). I’ve been concerned about BGP hijacking since last year when I learned about it. (It didn’t cause my problem, but it easily could have. You have been warned.) We’ve already figured out that DNS is pretty much sucker bait for computer criminals. We knew that when you typed in a name, you might go to the wrong number. I think we’re going to see a rise in cases where you get the right number, but the mysterious routers deep in the internet send you to the wrong machine anyway. All I can tell you is learn to appreciate SSH host keys and don’t use the WWW for anything serious.

These Krebs articles also tipped me off to RFC 1701, Generic Routing Encapsulation. Is this some kind of joke? You’ve heard of Voice over IP; this is IP over IP. Or am I missing something here? But should I be surprised? VMs run the world now and the virtual machine, like IP over IP, is an admission of failure. It’s basically saying, ok, we’ve mismanaged (the OS|the network) so badly, let’s just start again with a pristine one, (emulated|encapsulated) in the mismanaged one. On the other hand, if you screw that up, the same recursive solution is always available to you. So there’s that.
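To see how unmagical IP over IP really is, here is a minimal sketch of RFC 1701 encapsulation using Node.js Buffers. The byte layout follows the RFC's base header; the function names are my own invention, not anything from the RFC.

```javascript
// RFC 1701 base GRE header: 2 bytes of flags/version (all zero when none of
// the optional checksum/key/sequence fields are present) followed by 2 bytes
// of protocol type (the EtherType of the payload).
function greHeader(protocolType) {
  const header = Buffer.alloc(4);
  header.writeUInt16BE(0x0000, 0);       // C, R, K, S, s bits and version: 0
  header.writeUInt16BE(protocolType, 2); // e.g. 0x0800 for IPv4
  return header;
}

// "IP over IP": the inner IP packet simply becomes the payload of the
// outer one, with this GRE header glued in between.
function greEncapsulateIPv4(innerPacket) {
  return Buffer.concat([greHeader(0x0800), innerPacket]);
}
```

The outer delivery IP header would then be prepended by the tunnel endpoint; the point is that nothing fancier than concatenation is going on.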

Oh well, enjoy your internet while it lasts. Hopefully it all just stumbles along sufficient to requirements. But it might be wise to cultivate a lifestyle, or at least some hobbies, that don’t require it at all. A prolonged internet outage would probably do many of us some good.

The Ugly, The Good, And The Bad

2016-09-22 22:06

Hearing about Yahoo’s massive leak tonight made me think I should double check that I don’t care too much about that password. Not that it really matters since this cow left the barn two years ago. Or something like that. The details are pretty vague: "miscreant", "dark web" and "state sponsored" all mentioned. Whatever.

I don’t think I’ve got anything to fear personally since I don’t really use Yahoo at all. But for good measure I tried to log in.

They said I needed to enter a verification code that they were emailing me. They showed an email "template", something like this.

Verification was sent to:

The only problem is that I cannot think of how I would ever have used an email address with that form. My domain is and I would definitely have used some Yahoo-specific name at that domain (as my notes clearly indicate). Sure enough I eventually did get an email but it did not contain a verification code. Only an exhortation to change my password. Uh, ya, that’s what I was trying to do. But at least the message did say this.

On Thu, Sep 22, 2016 9:46 PM PDT, we noticed an attempt to sign in
to your Yahoo account  from an unrecognized device in United

So far so good. But then they falter with this ridiculous request.

If this was you, please sign in from a device you regularly use.

It’s quite possible I haven’t logged into my Yahoo account in years, in fact, since I was living elsewhere. Should I go knock on the door of my former residence and ask if I can borrow their internet and hope somehow the IP address hasn’t changed? So that’s very ugly Yahoo.

Let’s compare this with Google. I don’t use Google much either but today I needed to send an SMS text message (no, not to RSVP for a turn of the century theme party). I first had to install Chromium to use as my Google Voice client since I don’t like the long tentacles of Google stuff mixing with non-Google stuff, for which I use Firefox. Anyway…

It’s pretty rare for me to log into Google at all and I did this on my laptop at work and so I got this perfectly competent message.

Hi Chris X, Your Google Account …… was just used to sign in from Chrome on Linux.

Chris X Edwards ……

Linux
Thursday, September 22, 2016 8:27 AM (Pacific Daylight Time)
San Diego, CA, USA
Chrome

Don’t recognize this activity?

Why are we sending this? We take security very seriously and we want to keep you in the loop on important actions in your account.

Nicely done, Google. That’s a good example of how this should be done.

And finally, the bad. Let’s say you create a brand new AWS account from San Diego, California. Then a couple of hours later you fire up the maximum number of VMs and start mining cryptocurrency full gas… from China. What kind of security email do you get verifying that you took some kind of super fast military aircraft from California to China to do the exact thing criminals are most likely to do? Answer: none. That is one reason why I believe AWS' security is bad. It’s not even a problem across all of Amazon: if a retail order is placed, even a small one, a verification email is sent. But not AWS. AWS waits until the charges are many times more than the credit limit of the credit card on file for the account and then decides to double check with you. It’s hard not to be cynical.

Best Brexit Justification Yet

2016-09-09 13:34

One of the reasons a noticeable number of British people felt that EU membership offered a suboptimal return on investment is the perception of over-regulation. While the merits of this point are debatable, clearly it’s a meme. I don’t really have much of an opinion about it other than to encourage that people be left alone to do what they please unless it truly becomes a serious problem.

One issue I do have a strong opinion on is a type of regulation that is, to me, clearly terrible: criminalizing WWW hyperlinks to copyrighted material. Yesterday the EU Court of Justice apparently did just that.

Here’s a good article by the EFF describing the problem.

This is not a new issue. Here’s a good article from 2007 from John Dvorak which is also worth noting. He says, "If linking without prior permission is infringement then the web is dead." I pretty much agree with that. On the other hand, if all web content goes through two social media hubs the web would seem to be dead for that reason too, but, it lumbers on. Perhaps we can call it the World Wide Wheel based on its new and not improved topology. But I digress.

The only positive aspect of this I can see is if all scientific publishers hosting proprietary content go out of business because their articles inevitably and knowingly link to proprietary scientific publications. Not going to hold my breath.


2016-09-06 17:01

I’ve been doing a very tiny amount of JavaScript "programming" recently. I think I’ve finally figured out what the other problem with JavaScript is.

Before I describe that, I obviously have to mention the first problem and, simultaneously, why my JavaScript skills are intentionally very weak. The incredible thing about JavaScript’s first problem is that it has been entirely cured. Although JavaScript’s name implies some kind of relationship to Java, I can think of only one aspect of similarity: both JavaScript and Java are quintessential examples of what I call "turd polishing". Say what you will about Java, but these days it can program a Minecraft, which is astonishing when one considers how truly awful it was in every dimension in the 1990s. Exactly like its otherwise completely unrelated namesake, JavaScript underwent the same kind of transformation from completely useless as a programming language to, well, less useless.

What exactly then was JavaScript’s glaring problem that has been completely cured? In the 1990s and early 2000s JavaScript suffered from being utterly pointless to learn. Sure, learning it and saying as much on a resume served many people well, but I’m talking about fundamentals, not irrational market forces. The reason that learning JavaScript was, in my opinion, a completely worthless investment was that it was not general purpose. You’d have been no less enlightened learning PLC programming in elevators or traffic lights.

Not only was JavaScript confined to the browser in a very bizarre way, but what could be done with it was absurdly limited. It seemed to me that JavaScript specialized in tepidly validating form boxes, thwarting client rendering preferences, creating inconsistent mouse over surprises, and some other truly forgettable annoyances.

That wasn’t even what made it pointless to learn. Maybe you needed to implement some of those annoyances. The problem I had was that whenever I felt I had a reasonable occasion to apply JavaScript, rather than learning it properly, I could simply search for the exact functionality I had in mind, and there would be the JavaScript code I needed. Not only that but there would be huge collections of such standard techniques. All JavaScript could do was control the browser and because the server controlled the proper content anyway, the only time it was interesting was when there was user input. Back in those days, that meant form boxes. Because there was such limited applicability, it seemed to me that every possible problem that JavaScript could conceivably be the solution to was not only solved, but those solutions were completely enumerated, implemented, and ready for download to all. It was a programming language where the random monkeys had typed all possible programs and helpfully put any vaguely sensible ones on the web. Why would one bother to recreate their work?

Although I do know a lot about JavaScript compared to normal people, I’m no JavaScript programmer. I feel like I know obscure languages that I’ve just dabbled in better than JavaScript even though it is so prevalent in everyday life. That brings me to the second problem that I’ve realized about JavaScript: side effects. Side effects in programming are basically where you send a function some input and it returns some output… And, oh ya, some other stuff somewhere else might be different now. Good luck with that.

I am aware that JavaScript has transformed into a completely proper and sensible programming language thanks to efforts like Node.js and I’m sure there are sensible people doing sensible things with it all the time. I’ve even seen that Node.js sports relatively high performance. However, it still seems to me that the overwhelming majority of JavaScript code that is run is run by a browser. And, frankly, as someone who learned and accepted how K&R wrote programs, browser JavaScript just seems damned weird. Speaking of K&R, the first place to look for an example of the weirdness is Hello World, i.e. the first program in traditional programming language pedagogy. "Hello world" is not exactly natural in C, but in JavaScript it’s hard to even be sure you’re really writing something along those lines at all. This is because normal JavaScript has no natural input/output system (yes, I know, but still). To "print" a message somewhere, you have to either create a new web page, replace part of an existing web page, or generate a pop up alert box.
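To illustrate, here is a sketch of what "hello world" looks like when the language has no natural output channel. The sink parameter is my own device, standing in for whichever of the workarounds you pick.

```javascript
// Browser JavaScript has no stdout, so "printing" means picking a sink:
// a pop-up, a page mutation, or the developer console.
function helloWorld(sink) {
  const msg = 'hello, world';
  sink(msg); // the "output" happens wherever the sink puts it
  return msg;
}

// In a browser any of these would do:
//   helloWorld(alert);
//   helloWorld(s => { document.body.innerHTML = s; });
//   helloWorld(console.log);
```

None of these is really "printing" in the K&R sense; each is a different way of disturbing the browser until something visible happens.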

Basically all real functionality is done by sly state changes in the DOM (basically the web page). The DOM is like one huge ugly global variable discouraging good design practices. This brings up a programming ethos question: are side effects bad? In some basic sense they’re almost necessary just to get a screen to update or something noticeable to happen. But in C, you can write a program that just calculates some things and returns an answer as an exit code with absolutely zero side effects. With JavaScript not only is it possible to write code with nothing but side effects, that’s the typical way! Changing some nebulous DOM state seems more common than changing properly scoped variables. I feel like JavaScript not only allows you to shoot yourself in the foot, it sets up the loaded gun already aimed at it.
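A small sketch of the contrast (the cart variable and the element id here are hypothetical, invented for illustration):

```javascript
// Pure: the output depends only on the input, and calling it changes
// nothing else anywhere.
function total(prices) {
  return prices.reduce((sum, p) => sum + p, 0);
}

// Typical browser style: the "work" is all mutation of shared state,
// DOM included, rather than a returned value.
let cart = [5, 10]; // stand-in for nebulous shared state
function addItemAndRedraw(price) {
  cart.push(price); // side effect 1: mutates shared data
  if (typeof document !== 'undefined') { // guard so the sketch runs headless
    document.getElementById('total').textContent = total(cart); // side effect 2: mutates the page
  }
}
```

The first function could be tested in isolation forever; the second only makes sense relative to whatever the page and the shared state happen to be at the moment it runs.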

Thanks to HTML5 (the "canvas" tag especially), WebGL, and SVG, JavaScript has become quite a useful language capable of solving far more problems than it could 15 years ago. Despite this, I still have a hard time letting my mind get comfortable with the style of programming JavaScript encourages. Incidentally, if you fastidiously run NoScript like I do, it quickly becomes apparent, sadly, that most JavaScript is still pure evil. Oh well, at least it’s improving. 😕

My Crazy Microsoft Assertion

2016-08-18 09:15

I have previously written about Microsoft’s excellent transformation into a respectable member of our technological civilization. The trend continues with the news that "Microsoft Is Open-Sourcing PowerShell for Linux, Macs". Once again, I congratulate the Nadella Dynasty Microsoft for being intelligent and absolutely doing the right thing. Good job!

The technical details of Microsoft’s latest display of humility are not something I need to elaborate on. Today’s news provides a good perspective for understanding a radical assertion I’ve made now for at least a decade.

Microsoft has retarded computer technology by 20 years.

Because this is so counterintuitive to most people and the explanation is complex, I’ll try to make my case for this idea so that I can avoid long discussions on the topic in the future which, frankly, I’m as tired of as my regular readers must be.

One reason my premise is so shocking is that most people believe the complete opposite. I think this has a lot to do with the cult of Microsoft’s original Khan. When I hear Bill Gates praised as a great personage I cringe. Bank robbers can be philanthropists too, often on the advice of their tax accountants. I think Bill Gates needs to be judged on what he did to get the money, not how he’s spending it.

This completely typical assessment of Bill Gates lists the following "five greatest achievements".

  1. Inspiring the era of the home computer

  2. Commercializing the operating system

  3. Windows

  4. Becoming the richest man in the world

  5. Giving his money away

Although I consider it more cancerous than praiseworthy, number 4 is correct enough. But with Gates still the richest man in the world today by far, number 5 looks a bit dubious.

I am horrified by the shockingly entrenched but false idea of number 1. In my opinion Bill Gates inspired nothing having to do with home computers. If you want the person who inspired the era of home computing, Steve Wozniak owns that like Gates owns the Forbes list of billionaires. Even Steve Jobs deserves way more credit. Woz demonstrated that home computers were technically possible and Jobs correctly sensed that the devastating social stigma associated with them could be scraped off in time. I don’t even give Gates credit for "inspiring the era of the office computer". Clearly that was IBM’s inspiration all along.

It is true that Bill Gates was at the helm when Microsoft produced Windows. But is this genius? Or even benign? Despite having no preconceived bias at the time, I found Windows (3.1) to be a complete usability nightmare. My epiphany came when I realized that with Windows, I couldn’t do extremely basic things that I could do 15 years earlier using a Wozniak designed computer. A poignant example for me was that Windows lacked a native (to the OS) way to display graphical images. The list of frustrations was long and with Windows 95 the list was enlarged as much as it was reduced. But does Microsoft Windows really explain why computers are advanced now in a way that they otherwise wouldn’t have been? As much as I hate 2-D hokey GUI metaphors based on office equipment, it certainly isn’t right to let Microsoft have any credit for pioneering them. Those terrible anachronistic metaphors are office related because they came from an office equipment company, Xerox. Note that Alan Kay and Xerox created their Alto GUI computer 19 years before Windows 3.1 was released. See what I mean about 20 years? As with almost everything Microsoft did, their GUI OS interface was not some kind of inspired protean innovation. It may be remembered as such because it was ultimately so dominant and destructive to other potential innovations.

The really important item on this list of Bill Gates' alleged accomplishments is the commercialization of the operating system. But "commercialization" is a bit too euphemistic. Red Hat and others have commercialized a free and open source operating system but that business model hardly allows all competition to be utterly obliterated. What Microsoft did that was truly historic, and this returns to my original premise, is that they seized almost total control over how humanity created and exchanged information. Whether Microsoft understood that themselves or not, they have certainly come much closer to achieving that goal than any entity in history. Computers have become the dominant communication tool of our species. With exclusive unilateral control of and access to the system that manages the computer itself, Microsoft came dangerously close to ruling all information.

And this was bad. I tend to agree with Lord Acton who believed, "Despotic power is always accompanied by corruption of morality." Microsoft may have thought of itself as a benevolent dictator with the best intentions, but after some serendipitous success, once network effects had eliminated their competition, I feel they focused primarily on stifling positive innovation that could have threatened their dominance. And they were good at that. An example is the word processor market, which went from dozens of competing products to essentially none. The way humanity luckily escaped this oppression was by moving most business from the direct control of the OS to the mediated control of the web browser. This was strongly opposed by Microsoft at the time.

Not only was Microsoft working hard to limit computing choices but those choices were degenerating too. Since every person who needed to communicate would need a computer and every person who needed a computer would need a Microsoft OS, the main focus of Microsoft that might generously be seen as prosocial involved introducing more ordinary people to computers. Since they didn’t have to compete with any better approaches they decisively moved toward the lowest common denominator. As my computer agenda became more technical and serious, Microsoft was doing everything possible to infantilize computer use. In that they definitely succeeded.

It is for these reasons that I believe Microsoft has set us back twenty years. The reason I’m writing this today is to justify the full two decades. OS X was released 15 years ago. I consider this a milestone indicating Apple’s return to competent computing and a competitive marketplace. OS X has always had Unix. There was a choice of shells (the original default was tcsh), SSH client/server, and all the wholesome Unix tools that every sane system should include. Microsoft is just announcing that they’ve seen the light and, 15 years later, they’re working on catching up. Maybe in a couple of years they will. That still leaves a couple of years. Well, OS X may have demonstrated that a mainstream popular OS could be competent 15 years ago, but Linux was comfortably doing it many years earlier. I’m standing by my time frame.


Chris X Edwards © 1999-2016