Friday, June 26, 2015

Guys, "guys" is perfectly fine for addressing diverse groups

With the Political Correctness police gaining momentum again after being laughed out of the '80s, the word "guys" has apparently come under attack as being "non-inclusive". After discussing the topic a bit on Twitter, I saw Peter Hosey's post declaring the following:

"when you’re addressing a mixed-gender or unknown-gender group, you should not use the word 'guys'."
As evidence, he references a post by Julia Evans purportedly showing that for most uses, people perceive "guys" to be gender-specific. Here is the graph of what she found:

What I find interesting is that the data show exactly the opposite of Peter's claim. Yes, most of the usage patterns are perceived as gender-specific by more people than not, but all of those are third person. The one case that is second person plural, the case of addressing a group of people, is overwhelmingly perceived as being gender neutral, with women perceiving it as gender neutral slightly more than men, but both groups at over 90%.

This matches my intuition, or, to be more precise, I find it somewhat comforting that my intuition about this still appears to match reality. I find "hey guys" neutral (2nd person plural), whereas "two guys walked into a store" is male.

Prescription vs. Description

Of course, they could have just checked their friendly local dictionary, for example Webster's online:
guy (noun)
Definition of GUY
  1. often capitalized : a grotesque effigy of Guy Fawkes traditionally displayed and burned in England on Guy Fawkes Day
  2. chiefly British : a person of grotesque appearance
  3. a : man, fellow
    b : person —used in plural to refer to the members of a group regardless of sex
  4. : individual, creature <the other dogs pale in comparison to this little guy>
So there we go: "used in plural to refer to the members of a group regardless of sex". It is important to note that unlike continental dictionaries (German, French), which prescribe correct usage, the Anglo-Saxon tradition is descriptive, meaning actual use is documented. In addition, my recollection is that definitions are listed chronologically, with the oldest first and newer ones last. So the word's meaning is shifting to be more gender neutral. This is called progress.

What I found interesting is that pointing out the dictionary definition was itself perceived as prescriptive, as if I were trying to force an out-of-touch dictionary definition on a public that perceives the word differently. Of course, the opposite is the case: a few people are trying to force their perception, based on outdated definitions of the word, on a public and a language that have moved on.

Language evolution and the futility of PC

Speaking of Anglo-Saxons and language evolution: does anyone feel the oppression when ordering beef or pork? Well, you should. These words for the meat of certain animals entered English with the conquering Normans after 1066. The French words for the animals came to describe the food the upper class was served, whereas the Anglo-Saxon words shifted to denote just the animals that the peasants herded. Yeah, and medieval oppression was actually real, unlike some other "oppression" I can think of.

Of course, we don't know about that today, and the words don't have those associations anymore, because language simply shifts to adapt to and reflect reality. Never the other way around, which is why the PC brigade's attempts to affect reality by policing language are so misguided.

Take the long history of euphemisms for "person with disability". It started out as "cripple", but that word was seen as stigmatizing, so it was replaced with "handicapped", because it wasn't a defect of the person, but a handicap they had. Then that word became stigmatized and we switched to "disabled". Then "person with disabilities", "special", "challenged", "differently abled". And so on and so forth. The problem is that it never works: the stigma moves to the new word, which was chosen precisely because it was thus far stigma-free, so nowadays calling someone "special" is no longer positive. And calling homeless people "the temporarily underhoused" because "home is wherever you are" also never helped.

So leave language be and focus on changing the underlying reality instead. All of this does not mean that you can't be polite: if someone feels offended by being addressed in a certain way, by all means accommodate them and/or come to some understanding.

Let the Hunting begin :-))

Wednesday, June 17, 2015

Protocol-Oriented Programming is Object-Oriented Programming

Crusty here. I just saw that my young friend Dave Abrahams gave a talk based on a little keyboard session we had just a short while ago. Really sharp fellow, you know; I am sure he'll go far someday. But that's the problem with young folk these days: they go rushing out telling everyone what they've learned when the lesson is only one third of the way through.

You see, I was trying to impart some wisdom to the fellow using the old Hegelian dialectic: thesis, antithesis, synthesis. And yes, I admit I wasn't completely honest with him, but I swear it was just a little white lie for a good educational cause. You see, I presented ADT (Abstract Data Type) programming to him and called it OOP. It's a little ruse I use from time to time, and decades of Java, C++ and C# have gone a long way toward making it an easy one.


So the thesis was simple: we don't need all that fancy shmancy OOP stuff, we can just use old fashioned structs 90% of the time. In fact, I was going to show him how easy things look in MC68K assembly language, with a few macros for dispatch, but then thought better of it, because he might have seen through my little educational ploy.

Of course, a lot of what I told him was nonsense. For example, OOP isn't at all about subclassing; the guy who coined the term, Alan I think, wrote: "So I decided to leave out inheritance as a built-in feature until I understood it better." So not only is inheritance not the defining feature of OOP as I let on, it actually wasn't even in the original conception of the thing that was first called "object-oriented programming".

Absolute reliance on inheritance, and therefore on structural relationships, is, in fact, a defining feature of ADT-oriented programming, particularly when strong type systems are involved. But more on that later. In fact, OOP best practices have always (since the late '80s and early '90s) called for composition to be used for known axes of customization, with inheritance used for refinement, when a component needs to be adapted in a more ad-hoc fashion. That knowledge had filtered down even to young Turks writing their master's theses back in, what, 1997, so you can rest assured that the distinction was well known and not exactly rocket science.
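That old guideline can be sketched in a few lines. This is a toy illustration in Python rather than anything from the talk; all the class names are made up for the example:

```python
# Composition for a KNOWN axis of customization (how records are
# rendered), inheritance for an AD-HOC refinement (deduplication).

class Formatter:
    """The known axis of customization: pluggable via composition."""
    def format(self, record):
        return str(record)

class UpperFormatter(Formatter):
    def format(self, record):
        return str(record).upper()

class Logger:
    def __init__(self, formatter=None):
        # Composition: the formatter is a component handed in.
        self.formatter = formatter or Formatter()
        self.lines = []

    def log(self, record):
        self.lines.append(self.formatter.format(record))

class DeduplicatingLogger(Logger):
    # Inheritance: an ad-hoc refinement of the Logger itself,
    # not something the original designer planned an axis for.
    def log(self, record):
        if not self.lines or self.lines[-1] != self.formatter.format(record):
            super().log(record)

log = DeduplicatingLogger(UpperFormatter())
log.log("hello")
log.log("hello")
print(log.lines)   # -> ['HELLO']
```

The point is the division of labor: the formatter slot was designed in, so it composes; the deduplication wasn't, so it refines by subclassing.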

Anyway, I kept all that from Dave in order to really get him excited about the idea I was peddling to him, and it looks like I succeeded. Well, a bit too well, maybe.


Because the idea was really to first get him all excited about not needing OOP, and then turn around and show him that all the things I had just shown him in fact were OOP. And still are, as a matter of fact. Always have been. It's that sort of collision of conflicting, true-seeming ideas that gets the gray matter going. You know, "sound of one hand clapping" kind of stuff.

The reason I worked with him on a little graphics context example was, of course, that I had written a graphics context wrapper on top of CoreGraphics a good three years ago. In Objective-C. With a protocol defining the, er, protocol. It's called MPWDrawingContext and lives on GitHub, but I also wrote about it, showed how protocols combine with blocks to make CoreGraphics patterns easy and intuitive to use, and how to combine this type of drawing context with a more advanced OO language to make live coding/drawing possible. And of course this is real live programming, not the "not-quite-instant replay" programming that is all Swift playgrounds can provide.

The simple fact is that actual Object Oriented Programming is Protocol Oriented Programming, where "protocol" means a set of messages that an object understands. In a true and pure object-oriented language like Smalltalk, that is all there can be, because the only way to interact with an object is to send it messages. Even if you do simple metaprogramming like checking the class, you are still sending a message. Checking for object identity? Sending a message. Doing more intrusive metaprogramming like "directly" accessing instance variables? Message. Control structures like if and while? Message. Creating ranges? Message. Iterating? Message. Comparing object hierarchies? I think you get the drift.
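To make that concrete, here is a toy sketch, in Python rather than Smalltalk, of what "the only way to interact is a message" looks like. The `send`/selector machinery and all the names below are invented for illustration; real Smalltalk does this in the language itself:

```python
# Every interaction funnels through a single entry point: send().
# The set of selectors an object answers IS its protocol.

class SmalltalkishObject:
    def __init__(self, **ivars):
        self._ivars = dict(ivars)

    def send(self, selector, *args):
        handler = getattr(self, "msg_" + selector, None)
        if handler is None:
            # Unknown selector? Also handled by sending a message.
            return self.send("doesNotUnderstand", selector)
        return handler(*args)

    # Even class checks, identity checks and "direct" instance
    # variable access are just more messages:
    def msg_class(self):
        return type(self).__name__

    def msg_isIdenticalTo(self, other):
        return self is other

    def msg_instVarNamed(self, name):
        return self._ivars[name]

    def msg_doesNotUnderstand(self, selector):
        raise AttributeError("does not understand %r" % selector)

class Point(SmalltalkishObject):
    def msg_x(self):
        return self._ivars["x"]

p = Point(x=3, y=4)
print(p.send("x"))                  # -> 3
print(p.send("class"))              # -> Point
print(p.send("instVarNamed", "y"))  # -> 4
print(p.send("isIdenticalTo", p))   # -> True
```

Nothing here touches the object except by sending it a message, and `Point`'s protocol is exactly the set of messages it understands: `x`, `class`, `instVarNamed`, `isIdenticalTo`, `doesNotUnderstand`.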

So all interacting is via messages, and the set of messages is a protocol. What does that make OO? Say it together: Protocol Oriented Programming.


So we don't need objects when we have POP, but at the same time POP is OOP. Confused? Well, that's kind of the point of a good dialectic argument.

One possible solution to the conflict could be that we don't need any of this stuff. C, FORTRAN and assembly were good enough for me, they should be good enough for you. And that's true to a large extent. Excellent software was written using these tools (and ones that are much, much worse!), and tooling is not the biggest factor determining success or failure of software projects.

On the other hand, if you want to look beyond what OOP has to offer, statically typed ADT programming is not the answer. It is the question that OOP answered. And statically typed ADT programming is not Protocol Oriented Programming, OOP is POP. Repeat after me: OOP is POP, POP is OOP.

To go beyond OOP, we actually need to go beyond it, not step back in time to the early '90s, forget everything we learned in the meantime, and declare victory. My personal take is that our biggest challenges are in "the big", meaning programming-in-the-large: how to connect components together in a meaningful, tractable and understandable fashion. Programming the components is, by and large, a solved problem; making it a tiny bit better may make us feel better, but it won't move the needle on productivity.

Making architecture malleable, user-definable and thus a first class citizen of our programming notation, now that is a worthwhile goal and challenge.

Crusty out.

As always, comments welcome here and on HN.

Sunday, June 7, 2015

Steve Jobs on Swift

No, there is no actual evidence of Steve commenting on Swift. However, he did say something about the road to sophisticated simplicity.

In short, at first you think the problem is easy because you don't understand it. Then you begin to understand the problem and everything becomes terribly complicated. Most people stop there, and Apple used to make fun of the ones that do.

To me this is the perfect visual illustration of the crescendo of special cases that is Swift.

The answer to this, according to Steve, is "[..] a few people keep burning the midnight oil and finally understand the underlying principles of the problem and come up with an elegantly simple solution for it. But very few people go the distance to get there."

Apple used to be very much about going that distance, and I don't think Swift lives up to that standard. That doesn't mean it's all bad or that it's completely irredeemable, there are good elements. But they stopped at sophisticated complexity. And "well, it's not all bad" is not exactly what Apple stands for or what we as Apple customers expect and, quite frankly, deserve. And had there been a Steve in Dev Tools, he would have said: do it again, this is not good enough.

As always, comments welcome here or on HN.