Chris X Edwards

--------------------------

The Ugly, The Good, And The Bad

2016-09-22 22:06

Hearing about Yahoo’s massive leak tonight made me think I should double-check that I don’t care too much about that password. Not that it really matters since this cow left the barn two years ago. Or something like that. The details are pretty vague - "miscreant", "dark web" and "state sponsored" all mentioned. Whatever.

I don’t think I’ve got anything to fear personally since I don’t really use Yahoo at all. But for good measure I tried to log in.

They said I needed to enter a verification code that they were emailing me. They showed an email "template", something like this.

Verification was sent to:
●●●●●●●●●●●@t●●●●●●●●●.com

The only problem is that I cannot think of how I would ever have used an email address of that form. My domain is xed.ch and I would definitely have used some Yahoo-specific name at that domain (as my notes clearly indicate). Sure enough I eventually did get an email, but it did not contain a verification code. Only an exhortation to change my password. Uh, ya, that’s what I was trying to do. But at least the message did say this.

On Thu, Sep 22, 2016 9:46 PM PDT, we noticed an attempt to sign in
to your Yahoo account  from an unrecognized device in United
States.

So far so good. But then they falter with this ridiculous request.

If this was you, please sign in from a device you regularly use.

It’s quite possible I haven’t logged into my Yahoo account in years; in fact, not since I was living elsewhere. Should I go knock on the door of my former residence and ask if I can borrow their internet and hope somehow the IP address hasn’t changed? So that’s very ugly, Yahoo.

Let’s compare this with Google. I don’t use Google much either but today I needed to send an SMS text message (no, not to RSVP for a turn-of-the-century theme party). I first had to install Chromium to act as a Google Voice client since I don’t like the long tentacles of Google stuff mixing with non-Google stuff, for which I use Firefox. Anyway…

It’s pretty rare for me to log into Google at all, and I did this on my laptop at work, so I got this perfectly competent message.

Hi Chris X,

Your Google Account …….@gmail.com was just used to sign in from Chrome on Linux.

Chris X Edwards
…….@gmail.com

Linux
Thursday, September 22, 2016 8:27 AM (Pacific Daylight Time)
San Diego, CA, USA
Chrome

Don’t recognize this activity?

Why are we sending this? We take security very seriously and we want to keep you in the loop on important actions in your account.

Nicely done, Google. That’s a good example of how this should be done.

And finally, the bad. Let’s say you create a brand new AWS account from San Diego, California. Then a couple of hours later you fire up the maximum number of VMs and start mining cryptocurrency at full gas… from China. What kind of security email do you get verifying that you took some kind of super fast military aircraft from California to China to do the exact thing criminals are most likely to do? Answer: none. That is one reason why I believe AWS' security is bad. It’s not even a problem across all of Amazon; if a retail order is placed, even a small one, a verification email is sent. But not with AWS. AWS waits until the charges are many times more than the credit limit of the credit card on file for the account and then decides to double-check with you. It’s hard not to be cynical.

Best Brexit Justification Yet

2016-09-09 13:34

One of the reasons a noticeable number of British people felt that EU membership offered a suboptimal return on investment is the perception of over-regulation. While the merits of this point are debatable, clearly it’s a meme. I don’t really have much of an opinion about it other than to say that people should be left alone to do what they please unless it truly becomes a serious problem.

One issue I do have a strong opinion on is a type of regulation that is, to me, clearly terrible: criminalizing WWW hyperlinks to copyrighted material. Yesterday the EU Court of Justice apparently did just that.

Here’s a good article by the EFF describing the problem.

This is not a new issue. Here’s a good article from 2007 by John Dvorak which is also worth noting. He says, "If linking without prior permission is infringement then the web is dead." I pretty much agree with that. On the other hand, if all web content goes through two social media hubs the web would seem to be dead for that reason too, but it lumbers on. Perhaps we can call it the World Wide Wheel based on its new and not improved topology. But I digress.

The only positive aspect of this I can see is if all scientific publishers hosting proprietary content go out of business because their articles inevitably and knowingly link to proprietary scientific publications. Not going to hold my breath.

alert("Javascript!");

2016-09-06 17:01

I’ve been doing a very tiny amount of JavaScript "programming" recently. I think I’ve finally figured out what the other problem with JavaScript is.

Before I describe that, I obviously have to mention the first problem and, simultaneously, why my JavaScript skills are intentionally very weak. The incredible thing about JavaScript’s first problem is that it has been entirely cured. Although JavaScript’s name implies some kind of relationship to Java, I can think of only one aspect of similarity: both JavaScript and Java are quintessential examples of what I call "turd polishing". Say what you will about Java, but these days it can program a Minecraft, which is astonishing when one considers how truly awful it was in every dimension in the 1990s. Like its otherwise completely unrelated namesake, JavaScript underwent the same kind of transformation from completely useless as a programming language to, well, less useless.

What exactly then was JavaScript’s glaring problem that has been completely cured? In the 1990s and early 2000s JavaScript suffered from being utterly pointless to learn. Sure, learning it and saying as much on a resume served many people well, but I’m talking about fundamentals, not irrational market forces. The reason that learning JavaScript was, in my opinion, a completely worthless investment was that it was not general purpose. You’d have been no less enlightened learning PLC programming for elevators or traffic lights.

Not only was JavaScript confined to the browser in a very bizarre way, but what could be done with it was absurdly limited. It seemed to me that JavaScript specialized in tepidly validating form boxes, thwarting client rendering preferences, creating inconsistent mouseover surprises, and some other truly forgettable annoyances.

That wasn’t even what made it pointless to learn. Maybe you needed to implement some of those annoyances. The problem I had was that whenever I felt I had a reasonable occasion to apply JavaScript, rather than learning it properly, I could simply search for the exact functionality I had in mind, and there would be the JavaScript code I needed. Not only that, but there would be huge collections of such standard techniques. All JavaScript could do was control the browser, and because the server controlled the proper content anyway, the only time it was interesting was when there was user input. Back in those days, that meant form boxes. Because there was such limited applicability, it seemed to me that every possible problem that JavaScript could conceivably be the solution to was not only solved, but those solutions were completely enumerated, implemented, and ready for download to all. It was a programming language where the random monkeys had typed all possible programs and helpfully put any vaguely sensible ones on the web. Why would one bother to recreate their work?

Although I do know a lot about JavaScript compared to normal people, I’m no JavaScript programmer. I feel like I know obscure languages that I’ve just dabbled in better than JavaScript even though it is so prevalent in everyday life. That brings me to the second problem that I’ve realized about JavaScript: side effects. Side effects in programming are basically where you send a function some input and it returns some output… And, oh ya, some other stuff somewhere else might be different now. Good luck with that.
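Here’s a tiny contrived sketch of the idea (the names are mine, purely for illustration):

  // A function with a side effect: it maps input to output, but it also
  // quietly mutates state that lives somewhere else entirely.
  let totalCalls = 0;                     // state outside the function

  function addWithSideEffect(a, b) {
    totalCalls += 1;                      // the side effect
    return a + b;                         // the honest input-to-output part
  }

  console.log(addWithSideEffect(2, 3));   // 5
  console.log(totalCalls);                // 1 -- "other stuff" is different now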

I am aware that JavaScript has transformed into a completely proper and sensible programming language thanks to efforts like Node.js and I’m sure there are sensible people doing sensible things with it all the time. I’ve even seen that Node.js sports relatively high performance. However, it still seems to me that the overwhelming majority of JavaScript code that is run is run by a browser. And, frankly, as someone who learned and accepted how K&R wrote programs, browser JavaScript just seems damned weird. Speaking of K&R, the first place to look for an example of the weirdness is Hello World, i.e. the first program in traditional programming language pedagogy. "Hello world" is not exactly natural in C, but in JavaScript it’s hard to even be sure you’re writing something along those lines at all. This is because normal JavaScript has no natural input/output system (yes, I know, but still). To "print" a message somewhere, you have to either create a new web page, replace part of an existing web page, or generate a pop-up alert box.
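To make that concrete, here’s a sketch of those three options as they might look in a browser (note that every one of them reaches outside the program to the page or the window):

  // Three ways to "print" Hello World from browser JavaScript:
  document.write("Hello world");              // create new page content
  document.body.textContent = "Hello world";  // replace part of the page
  alert("Hello world");                       // generate a pop-up alert box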

Basically all real functionality is done by sly state changes in the DOM (essentially, the web page itself). The DOM is like one huge ugly global variable discouraging good design practices. This brings up a programming ethos question: are side effects bad? In some basic sense they’re almost necessary just to get a screen to update or something noticeable to happen. But in C, you can write a program that just calculates some things and returns an answer as an exit code with absolutely zero side effects. With JavaScript, not only is it possible to write code with nothing but side effects, that’s the typical way! Changing some nebulous DOM state seems more common than changing properly scoped variables. I feel like JavaScript not only allows you to shoot yourself in the foot, it sets up the loaded gun already aimed at it.

Thanks to HTML5 (the "canvas" tag especially), WebGL, and SVG, JavaScript has become quite a useful language capable of solving far more problems than it could 15 years ago. Despite this, I still have a hard time letting my mind get comfortable with the style of programming JavaScript encourages. Incidentally, if you fastidiously run NoScript like I do, it quickly becomes apparent, sadly, that most JavaScript is still pure evil. Oh well, at least it’s improving. 😕

My Crazy Microsoft Assertion

2016-08-18 09:15

I have previously written about Microsoft’s excellent transformation into a respectable member of our technological civilization. The trend continues with the news that "Microsoft Is Open-Sourcing PowerShell for Linux, Macs". Once again, I congratulate the Nadella Dynasty Microsoft for being intelligent and absolutely doing the right thing. Good job!

The technical details of Microsoft’s latest display of humility are not something I need to elaborate on. Today’s news provides a good perspective for understanding a radical assertion I’ve made now for at least a decade.

Microsoft has retarded computer technology by 20 years.

Because this is so counterintuitive to most people and the explanation is complex, I’ll try to make my case for this idea so that I can avoid long discussions on the topic in the future which, frankly, I’m as tired of as my regular readers must be.

One reason my premise is so shocking is that most people believe the complete opposite. I think this has a lot to do with the cult of Microsoft’s original Khan. When I hear Bill Gates praised as a great personage I cringe. Bank robbers can be philanthropists too, often on the advice of their tax accountants. I think Bill Gates needs to be judged on what he did to get the money, not how he’s spending it.

This completely typical assessment of Bill Gates lists the following "five greatest achievements".

  1. Inspiring the era of the home computer

  2. Commercializing the operating system

  3. Windows

  4. Becoming the richest man in the world

  5. Giving his money away

Although I consider it more cancerous than praiseworthy, number 4 is correct enough. But with Gates still the richest man in the world today by far, number 5 looks a bit dubious.

I am horrified by the shockingly entrenched but false idea of number 1. In my opinion Bill Gates inspired nothing having to do with home computers. If you want the person who inspired the era of home computing, Steve Wozniak owns that like Gates owns the Forbes list of billionaires. Even Steve Jobs deserves way more credit. Woz demonstrated that home computers were technically possible and Jobs correctly sensed that the devastating social stigma associated with them could be scraped off in time. I don’t even give Gates credit for "inspiring the era of the office computer". Clearly that was IBM’s inspiration all along.

It is true that Bill Gates was at the helm when Microsoft produced Windows. But is this genius? Or even benign? Despite having no preconceived bias at the time, I found Windows (3.1) to be a complete usability nightmare. My epiphany came when I realized that with Windows, I couldn’t do extremely basic things that I could do 15 years earlier using a Wozniak-designed computer. A poignant example for me was that Windows lacked a native (to the OS) way to display graphical images. The list of frustrations was long, and with Windows 95 the list was enlarged as much as it was reduced. But does Microsoft Windows really explain why computers are advanced now in a way that they otherwise wouldn’t have been? As much as I hate 2-D hokey GUI metaphors based on office equipment, it certainly isn’t right to let Microsoft have any credit for pioneering them. Those terrible anachronistic metaphors are office-related because they came from an office equipment company, Xerox. Note that Alan Kay and Xerox created their Alto GUI computer 19 years before Windows 3.1 was released. See what I mean about 20 years? As with almost everything Microsoft did, their GUI OS interface was not some kind of inspired protean innovation. It may be remembered as such because it was ultimately so dominant and destructive to other potential innovations.

The really important item on this list of Bill Gates' alleged accomplishments is the commercialization of the operating system. But "commercialization" is a bit too euphemistic. Red Hat and others have commercialized a free and openly licensed operating system but that business model hardly allows all competition to be utterly obliterated. What Microsoft did that was truly historic, and this returns to my original premise, is that they seized almost total control over how humanity created and exchanged information. Whether Microsoft understood that themselves or not, they have certainly come much closer to achieving that goal than any entity in history. Computers have become the dominant communication tool of our species. With exclusive unilateral control of and access to the system that manages the computer itself, Microsoft came dangerously close to ruling all information.

And this was bad. I tend to agree with Lord Acton who believed, "Despotic power is always accompanied by corruption of morality." Microsoft may have thought of itself as a benevolent dictator with the best intentions, but after some serendipitous success, once network effects had eliminated their competition, I feel they focused primarily on stifling positive innovation that could have threatened their dominance. And they were good at that. An example: there used to be dozens of word processors in a genuinely competitive market. The way humanity luckily escaped this oppression was by moving most business from the direct control of the OS to the mediated control of the web browser. This was strongly opposed by Microsoft at the time.

Not only was Microsoft working hard to limit computing choices but those choices were degenerating too. Since every person who needed to communicate would need a computer and every person who needed a computer would need a Microsoft OS, the main focus of Microsoft that might generously be seen as prosocial involved introducing more ordinary people to computers. Since they didn’t have to compete with any better approaches they decisively moved toward the lowest common denominator. As my computer agenda became more technical and serious, Microsoft was doing everything possible to infantilize computer use. In that they definitely succeeded.

It is for these reasons that I believe Microsoft has set us back twenty years. The reason I’m writing this today is to justify the full two decades. OS X was released 15 years ago. I consider this a milestone indicating Apple’s return to competent computing and a competitive marketplace. OS X has always had Unix. There was a choice of shells (the original default was tcsh), SSH client/server, and all the wholesome Unix tools that every sane system should include. Microsoft is just now announcing that they’ve seen the light and, 15 years later, they’re working on catching up. Maybe in a couple of years they will. That still leaves a few years short of twenty. Well, OS X may have demonstrated that a mainstream popular OS could be competent 15 years ago, but Linux was comfortably doing it many years earlier. I’m standing by my time frame.

Calculus Is A Weird Anachronism

2016-08-09 19:18

Here’s a parody of a calculus problem for you.

dQ/dt = du/dt - di/dt + M

I don’t know how to solve it but I know enough to know it’s not really a proper calculus problem. In this equation Q is quality of life, u is the utility of calculus, and i is the investment one makes in developing a calculus proficiency sufficient for u. M is the intrinsic motivation to learn and be knowledgeable about calculus; mine is used up! Although this equation is quite silly, it parallels all real-world textbook problems involving calculus by distorting the situation into an absurd simplification.

I consider this equation a very dubious justification for the extraordinary emphasis placed on calculus in the educational system I was (and still am) a part of. For me, calculus was no small investment (i). Between high school calculus and a university engineering degree, I was studying calculus for about 3 solid years. And although I don’t have a typical engineering career, I am horrified that I have put my calculus training to good practical use exactly zero times in my life (u). The reason for this, perhaps the reason I intuitively let myself not be as proficient at calculus as possible, is that I believe that calculus is never essential if you have access to a computer. And if you do have access to a computer (which is everybody reading this), calculus is actually irritatingly counterintuitive because it implies some wrongish things about how best to model the world (an assumption that the world is analog, for example). I’m not talking about using a computer instead of calculus to "get answers" the way a pocket calculator (app?) can do basic arithmetic. A pocket calculator does not replace the need to understand arithmetic but numerical solutions with a computer do obviate the need to understand calculus.
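To illustrate, here’s a toy sketch (my own example, nothing authoritative) of getting the usual calculus answers with nothing but arithmetic and a loop:

  // Numerically approximate the derivative and integral of f(x) = x*x,
  // no symbolic calculus required.
  function f(x) { return x * x; }

  // Central difference: f'(x) is roughly (f(x+h) - f(x-h)) / (2h)
  function derivative(f, x, h = 1e-5) {
    return (f(x + h) - f(x - h)) / (2 * h);
  }

  // Trapezoid rule: integral of f from a to b using n slices
  function integral(f, a, b, n = 100000) {
    const h = (b - a) / n;
    let sum = (f(a) + f(b)) / 2;
    for (let i = 1; i < n; i++) sum += f(a + i * h);
    return sum * h;
  }

  console.log(derivative(f, 3));   // ~6, matching the exact answer 2x = 6
  console.log(integral(f, 0, 3));  // ~9, matching the exact answer x^3/3 = 9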

Hokey religions and ancient weapons are no match for a good blaster.

Apologists argue that calculus may not be super useful, but that the equation above comes out close enough to positive that a radical restructuring of education wouldn’t be worth a mere quibble. The problem with this equation, however, is that it is missing a very important term. Here’s the better version.

dQ/dt = du/dt - di/dt - dc/dt + M

Here c represents opportunity costs. In less economic terms, critics of calculus must answer the question, "What should we be teaching/learning instead?" There is no argument that calculus is useful. It is, just not very. This implies there are better ways to spend our time. There are, many. Besides potential engineers and physicists, I can’t think of any reason for high school students to learn calculus that is better than the reasons to learn the following things.

  • Vocational (machining, welding, plastics, ag) - Even if you aren’t going to fix your own car or work in a factory, understanding the foundations of the most real parts of our civilization can’t possibly be a complete waste of time. If I had to relinquish either my machine shop apprenticeship or my university engineering education, I would jettison the latter.

  • Language History - This is commonly called "Latin", but really learning some Old English and Latin and how English came to be the way it is (throw in King James, Shakespeare, etc.) turns out to be extremely useful in building solid communication skills. Many cultures speaking other languages now teach their kids English well enough to smoothly participate in the Anglosphere. Language history would help English speakers preserve an edge. For the same reason, I would mention enhanced grammar study as being much more useful than calculus but I understand that, unlike the colorful history of language, it’s only slightly more interesting.

  • Art - Did you know that the BLS predicts that by 2024 "Arts, design, entertainment, sports, and media occupations" will increase by 4.1% (source)? Despite wasting so much of my life learning calculus I have enough math sense to notice that because the percent increase in total occupations is 6.5%, this is actually a per capita net loss. So we really should be focusing on engineering, right? Uh… No. That is projected to grow at only 2.7%. That’s 50% more of a reason to support the arts over engineering. The US is a world leader in design, fashion, fine art, graphic art, digital art, performing art, movies, video games, photography, cuisine, typography, and sports. All despite calculus. We could reinvest in our waning art culture or leave it to other cultures to take over.

  • Music - And if American art has been a grand success, America has been to music what the 13th century Mongols were to Asia. Calculus is probably best learned well into adulthood if the need arises. Music is best learned when young. We neglect our true cultural legacy at our peril.

  • Home Economics - Are our real problems today a lack of people who can derive formulae for ballistic trajectories using 18th century techniques? Or is it that as a species we’re all becoming depressingly unhealthy and fat? Empowering high schoolers to make better decisions about food would surely pay society back far more than calculus. Or maybe offer kids the option of an hour of running or an hour of calculus lessons - obesity epidemic cured!

  • Personal Finance - How about helping hapless high school seniors out with the facts of life about debt before they take on those predatory student loans? And credit card debt, payday loans, adjustable rate mortgages, etc. To give them calculus instead is a shameful dirty trick.

  • Statistics - Proponents claim calculus is good mental exercise for later skills in technical fields that are essential. They also say the history of calculus is important for understanding modern technology. That’s fine, but the warm-up doesn’t also have to be useless and disorienting. I have a lot of problems with statistics as it’s normally taught (I had almost 2 solid years of that), but even if it’s all completely bogus, it’s still topical and essential for engaged discourse. Who knows, maybe if we treat it seriously we’ll produce an Einstein-type figure who will revolutionize the field and create a paradigm shift appropriate for the modern uses (e.g. quantum physics) which Bernoulli, Laplace, Gauss, and other early pioneers had no intention of contributing to.

  • Linear Algebra - "Think of the engineers!" cry the calculus apologists. Surely they need all this "math" just for good practice and it just might be useful. Bollocks. If we care about that, linear algebra is the way to go. If you’re an engineer in 2016 and you’re using Newtonian calculus way more than linear algebra, you’re doing it wrong. In fact, if you’re doing something serious and you’re not using linear algebra to do your Newtonian calculus you’re probably doing that wrong (see the sketch after this list). Linear algebra has the delightful bonus property that it teaches itself to many adolescents (and adults) - "Hey, who’d like to make a 3D video game?"

  • Numerical Analysis - Less interesting than linear algebra but way more useful than calculus is numerical analysis. If we’re compulsively fetishizing the cult of personality of Sir Isaac then numerical analysis is ideal.

  • Information Theory - "No, no, we need calculus because $BetterAlternative is too easy." If this is your feeling, that high school kids need to be tormented with a weird subject that is incomprehensible in any way to normal people, then information theory fits the bill: it is, ironically, hard to communicate to students. Honestly, it’s not a great idea to add information theory to a standard high school curriculum but it would be much better (more useful and interesting) than calculus! If we could just broadly teach people that "password" is not a good password, it’d be a good trade.

  • Computer Science/Programming - It may not have been obvious in 1953 when the first transistorized computers were built that we, as a society, should immediately scrap calculus and instead focus on these miraculous new tools. But it is obvious now! You may never need to know how a linked list works, but if you’re reading this, you’re using a profusion of them right now. And that’s esoteric computer science mumbo jumbo that may or may not be useful. The starkly obvious real-world potential utility of computers that goes untapped because of a lack of education is pathological.

  • Philosophy/Ethics - A common justification for calculus (here’s one) is that it helps "teach people to think", including logic, problem solving, etc. I believe that if we have to disguise the study of philosophy as calculus (Newton and Leibniz did the reverse, by the way) then that itself is terrible philosophy and proof that we have a problem. Just teach philosophy! It’s worth it!
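As promised in the linear algebra item above, here’s a toy sketch (again my own example) of linear algebra doing "Newtonian calculus": differentiation of sampled data becomes a matrix multiplying a vector.

  // Sample f(x) = x*x at n evenly spaced points, build a central-difference
  // matrix D, and compute D * samples to approximate f'(x) = 2x.
  const n = 11, a = 0, b = 1, h = (b - a) / (n - 1);
  const xs = Array.from({length: n}, (_, i) => a + i * h);
  const fs = xs.map(x => x * x);

  // D is (n-2) x n: row i has -1/(2h) and +1/(2h) straddling the diagonal.
  const D = [];
  for (let i = 1; i < n - 1; i++) {
    const row = new Array(n).fill(0);
    row[i - 1] = -1 / (2 * h);
    row[i + 1] = 1 / (2 * h);
    D.push(row);
  }

  // Matrix-vector product: each entry of D*fs is a derivative estimate.
  const dfs = D.map(row => row.reduce((s, d, j) => s + d * fs[j], 0));
  console.log(dfs);  // ~[0.2, 0.4, ..., 1.8], i.e. 2x at the interior points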

Just as we have stopped beating kids with the lash (not sure about Texas), sometimes a society needs to accept that it’s on the wrong track and abandon the cultural script that mindlessly prescribes suboptimal practices. Although there is gathering momentum for calculus reform, by speaking out against this educational hazing I’m doing what I can to break with our obsolete past. In an ideal world everyone would learn calculus - right after the thousand other worthier subjects.

--------------------------

For older posts and RSS feed see the blog archives.
Chris X Edwards © 1999-2016