
I’ve been reading Isaacson’s da Vinci biography (that’s another post) and thinking about metaphors, analogies, teaching and learning.

Teaching is hard. The world is a complex place, so that’s to be expected. Learning is hard, although many people expect it to be easy. I mean, really, like, you can just Google things.

Well, really, not so much.

For me teaching is all about the group and the motivating example. Humans learn best by metaphor, going from the known to the unknown. Kind of like having one foot firmly planted on the lip of the hot tub and testing the temperature with the other. Just jumping in might work. Not something to rely on though. If you give people a framework they can relate to, it affords them a place from which to extend what they know.

My high school senior physics final included a problem that began, “A rock explodes into three pieces …”. Really? Why? It’s been a lifetime since that event and yet the premise of the problem still sticks with me. During my undergraduate studies, I had a physics professor whose motivating examples were based on James Bond situations. As contrived as physics problems tend to be in order to tease out a self-contained use of some specialized equation, at least contextualizing things via James Bond gave them a veneer of reason. Mostly. Sort of.

During my graduate studies, I dropped a class in neural networks because the professor presented the material in such an abstract fashion that I couldn’t anchor it. It wasn’t until I took Andrew Ng’s first Machine Learning class on Coursera (one of the first two courses offered) that neural networks actually made sense to me. He presented the material in the context of real-world use cases.

I’m not saying that everything can be learned simply by having a good story. If you work with computer software long enough, you’ll have to confront numbers represented in binary, octal or hexadecimal. You’ll just have to memorize the conversions. The same is true for operator precedence.
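If you want to check your memorization while it sets in, Python will happily do the bookkeeping. A minimal sketch of my own, not from any course material:

    # int() parses a string in a given base; format() renders a number in one.
    n = 0b10101111            # 175, written in binary
    print(format(n, 'o'))     # '257' -- octal
    print(format(n, 'x'))     # 'af'  -- hexadecimal
    print(int('af', 16))      # 175
    print(int('257', 8))      # 175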

Let’s look at learning for a minute, lest everyone think that I’ve forgotten it.

In order to learn something, assuming that it’s not rote memorization, you must accept the framework within which it exists. Unless you can do that, things won’t stick. You will forever be condemned to Google-it hell. I can usually tell which people will have difficulty learning a programming language when they complain that it’s not like the language they’re used to. As I like to say, “you can program C in any language.” Some people never get past that point. And we all suffer because of it.

I’m not limiting this to C-based languages. The interpreted world has more than its fair share of people still programming BASIC in any language. I like to think of them as the Python-without-classes crowd. I’m not sure where the whole “classes are bad” mentality came from, but it seems to have a strong following.

For a less software-centric example, consider using a word processor. Do you still type two spaces after the period? Unless you’re using a typewriter, all you’re doing is confusing the formatting software (technically, the hyphenation and justification (H&J) system). Try this experiment: take a word processing document and look at how it formats the text with both a single and a double space. The difference becomes especially evident when paragraphs are fully justified.

All well and wonderful, but what about the pretzels?

Yeah, about those. It struck me that this whole teaching/learning thing can be likened to making pretzels. You know, the big, soft, knotted, salt-covered ones. Consider the dough as the learner, the salt and shape as the material to be learned, and the kitchen equipment as the methodology. The cook is the teacher. If the dough is frozen or dried out, it can’t be shaped. This is a refusal to accept the rules of the material. If the equipment is inadequate, or the cook lacks an understanding of how to use it, the results will be inconsistent. Likewise, if the cook doesn’t understand how to handle the dough or when to apply the salt, things will probably not turn out for the best. It is only when all three elements are brought together properly that the expected outcome is achieved consistently.

In the realm of teaching, this means that the teacher needs to be able to create a motivating example and framework that works for the learners. This changes over time, just as the world changes. The teacher should always be looking for signs that a student is frozen and be ready with additional material that they may more readily relate to. The most difficult cases are the dried-out students. They see no need to learn the new material and are, at best, taking up air. At worst, they are disruptive. These individuals should be given to understand that their presence is optional and that others should be allowed to learn.

Finally, as a teacher, always, always be looking for what you can learn from the students. The world is bigger than your little pretzel shop.

Whenever I take a long trip, I try to bring a book to read. When I went to CppCon 2017 in September, I brought Ray Dalio’s Principles. Ray founded the world’s largest hedge fund. His company, Bridgewater, is the exemplar for his book. [29 April 2018 Note: I’d actually completed the book during that trip, but got distracted by other things. I let this linger for far too long.]

At this point, I could assert that having created an organization as large and successful as Bridgewater would be justification for following the methodology he espouses. After all, isn’t that what we do, chase success in search of our own? If your desire is to leverage Ray’s book for that, you’ve failed before you’ve even started.

The information you will find in Principles is slow burning. Everything about his methodology requires tremendous time, effort and attention to detail. Let’s look at how the book decomposes:

In the first part (of three), he gives his personal backstory. This takes about 120 pages of the 550-or-so-page book. That’s a lot of exposition for a book ostensibly about how life and work can be systematized. Then again, this isn’t a “here’s a fish” kind of book. If someone is going to purport to present a set of principles leading to success, they need to establish that they actually have the chops. And those chops don’t come from those whose success is inherited or the result of random chance. As has been said in many forms, “you learn nothing from success, but you learn everything from failure.”

The second part (about 150 pages) is devoted to life principles. The takeaway here is that work success is an extension of who you are. Your work, as opposed to your job, is not a coat to be put on, like some bulwark against the financial storms of life. To many, his life principles will seem quite Machiavellian. While having clear goals is the basis of any true achievement, and root-causing problems and designing solutions around them are in service to that end, both not tolerating problems that stand in your way and doing whatever is necessary to achieve results fall into the ruthless bucket. In no way am I opposed to his principles. In fact, I wouldn’t have personally accepted his work principles had he not been ruthless in his personal ones.

The third part is the thing we came to see: his work principles. Here again, we see the division into thirds: culture, people and organization.

In the area of culture, first and foremost are the dual concepts of radical truth and radical transparency. On this foundation rest meaningful work and meaningful relationships. Next, and where I’ve seen many companies fail miserably, the culture needs to accept mistakes but demand that people learn from them. Once things are spun up, you have to keep everyone in sync. This goes back to the ideas of radical truth and the acceptance of mistakes.

On the people front, you must hire the right people. Hiring the wrong people will eventually kill your company. In that vein, who people are (their life principles) is more important than what they know. People with a good foundation can be built upon. Those without one can’t. Finally, you must constantly refresh people’s skills. This rigorous regime of renewal is not something that everyone can embrace. When people don’t … well, we’re back to Machiavelli.

Lastly, he addresses the organization itself. In many ways, his approach to the organization is identical to his approach to a person. The same issues of goals, problem tolerance, evaluation and improvement apply. That may sound like I’m shortchanging that part of the book, but given Ray’s premise that work is an extension of the self, it’s only natural that the organization is an extension of the people working there.

This book is one you will either embrace as an affirmation, or reject as too demanding. As to why I don’t believe that there’ll be any middle ground, the title says it all. You either see the material in the book as principles or you don’t.

I, for one, do.

The early years of computing were like a Renaissance dance: lots of people who somehow managed to dance with each other at least once. A Mind at Play: How Claude Shannon Invented the Information Age gives us yet another place to stand and watch that dance.

Claude Shannon is one of those people who fundamentally changed the way we look at the world. The problem with fundamental change is that we tend to be on one side or the other of it. Today we speak of information theory as though it’s as obvious a concept as making paper. Kind of the same way we obsess over software developers being able to write code to sort numbers or reverse linked lists. At some point, the fundamental reality of the existence of high-quality libraries and data structures will make these queries as relevant as requiring people to explain a tape sort. But I digress.

He was a researcher, tinkerer, teacher, juggler, and for all appearances didn’t seem attached to labels. He had Vannevar Bush looking out for him. As an MIT professor, he had Danny Hillis and Ivan Sutherland, among others, as doctoral students. He worked with Alan Turing during World War II. And the box-switch-thing that turns itself off. That was him.

Reading the book, you get a sense of possibilities explored. So often, people either dismiss or defer possibilities. He literally had a basement full of them. If only he’d known Ron Popeil, every home might have a few of them.

I don’t know how well he would fare in the world today. In his time, Bell Labs basically paid to have him around. He had cachet. He also helped focus people’s ideas. He brought this sensibility with him to MIT as a professor. We get so terribly wrapped up in being hyper-specialized, in knowing the what but not the why. Too often we come across the proverbial Gordian knot and turn away. People are either unwilling to try or, believing themselves to be special, simply act as though the problem does not exist. (Treating people poorly and flouting the law fall into this category.) Few people are willing to question the fundamentals. What do you need? What do you have?

The interesting people are those who solve problems and help other people solve problems, not by merely telling them what the answer is, but by enabling them to see that solutions can come from places that aren’t necessarily rooted in the past ways of doing things.

In our day and age, when we focus on special skills and special languages and special hardware, it would behoove us to remember that there is no best skill or language or hardware. There is only the universe of problems. It is far more valuable to be able to help others see the shape of the solution than to be an individual capable of providing an answer to a well-defined question whose value will in time expire.

When I was asked to create and teach a Python class, I had to ask myself, “where is the starting point for this language?”

When it comes to computer languages, I like them logical, powerful, compact and fast. The language currently at the top of my list is Swift. When it comes to longevity, C++ wins hands down. Python is neither compact nor fast. It is, however, very popular. It’s also very flexible.

The C language has so many children that it’s easy to use analogs. What about Python?

I’ve taken intro CS courses from Harvard, Rice and Stanford, all of which use Python. In my opinion, they all teach C programming in Python. I get where they’re coming from. You’ve spent years using FORTRAN, then Pascal, then Java.

Happily, my first language was FORTRAN. You think you need to build all your own data structures if you use C? Consider yourself lucky. But that’s a story for another day. My second language was APL. Go ahead, try to teach APL the way you teach C. Knock yourself out. A bit later I picked up BASIC, which after FORTRAN was trivial. Next came Forth. That took a bit to wrap my procedural head around. My experience with APL had taught me that it’s perfectly fine to focus on the data and not the process. Forth’s focus on the stack is strangely intriguing. The fact that it lives on in UEFI and PostScript is a testament to the fact that there is value in that view of the world.

So my approach was to start with data representation. In Python, the world is all objects and references. But for some reason, people don’t want to approach the language from that standpoint. They like to talk about how easy it is to write ‘hello, world’ programs or how much more readable it is than Perl. Aside from APL, I don’t know of anything less readable than Perl. Except maybe TECO macros.
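To make that standpoint concrete, here’s the kind of minimal sketch I have in mind; my illustration, not an excerpt from the class:

    # Every value is an object; variables are just names bound to objects.
    a = [1, 2, 3]
    b = a                  # b is a second name for the same list object
    b.append(4)
    print(a)               # [1, 2, 3, 4] -- both names see the mutation
    print(a is b)          # True: same object, not a copy
    c = list(a)            # an explicit copy creates a distinct object
    print(a is c)          # False: equal contents, different identity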

Now that I had a place to start, life should be a hop, skip and a jump to classes, yes? Not so fast, pilgrim. It’s easy to explain that everything is an object and that integers are a subclass of rationals. It’s easy to explain that 0 takes 12 bytes of storage. You can even justify the lack of a character object. But I think you do a true disservice if you don’t address the fact that strings are Unicode-based. I’ve dealt with enough internationalization issues to know better than to gloss over this. I probably spent more time working on the string section of my class than on any other.

The reason? It’s one thing to present information that raises obvious questions like, “so you’ve told me that there are multiple ways to represent the same grapheme and that these strings will have different code units, but what am I supposed to do about it?” Or “how am I supposed to sleep knowing that a Vai digit four isn’t the same as an Arabic numeral 4?” You might as well throw them under a bus if you honestly believe that you’ve discharged your duty as a teacher by telling the students that there are land mines out there and that they should bring an umbrella. You’ve essentially just given them a compelling reason to never use the language.
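For the curious, this is the sort of demonstration I mean, sketched with the standard unicodedata module (my example, not the class notes):

    import unicodedata

    # Two spellings of the same grapheme: precomposed vs. combining sequence.
    cafe1 = 'caf\u00e9'    # 'café' using U+00E9 LATIN SMALL LETTER E WITH ACUTE
    cafe2 = 'cafe\u0301'   # 'café' using 'e' plus U+0301 COMBINING ACUTE ACCENT
    print(cafe1 == cafe2)  # False: different code points, same grapheme
    print(unicodedata.normalize('NFC', cafe1) ==
          unicodedata.normalize('NFC', cafe2))   # True after normalization

    # A Vai digit four is a digit, but it isn't the character '4'.
    print(unicodedata.digit('\uA624'))  # 4 -- U+A624 VAI DIGIT FOUR
    print('\uA624' == '4')              # False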

Once all the ‘core’ data types are out of the way, it’s time for some core data structures like list, tuple, and namespace.
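A quick sketch of the distinctions I lean on, with a dict standing in for the namespace idea (my choice of example):

    point = (3, 4)              # tuple: immutable, fixed shape
    path = [(0, 0), point]      # list: mutable sequence, grows as needed
    path.append((6, 8))
    # point[0] = 5              # TypeError: tuples can't be modified in place
    names = {'x': 3, 'y': 4}    # dict: the mapping behind every namespace
    names['z'] = 5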

Functions, generators, and lambdas come next. These are relatively straightforward. The trick to generators is to show the equivalent implementation in C++. Yes, they are different beasts, but you can get close enough. Similarly with lambdas.
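A Python-side sketch of what I mean; the C++ analog would be a hand-written iterator object carrying the same state in member variables:

    # A generator suspends between yields; its local state persists in between.
    def countdown(n):
        while n > 0:
            yield n        # execution pauses here until the next value is requested
            n -= 1

    for i in countdown(3):
        print(i)           # 3, 2, 1

    # A lambda is just an anonymous, single-expression function.
    double = lambda x: 2 * x
    print(double(21))      # 42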

Now, you’re in a position where classes can be reasonably explained. A point to mention here is that back before embarking on an exposition of data types, keywords and variable conventions were addressed. For most students, this goes in one ear and out the other. Having arrived at classes, all those naming conventions come back like an overeager Sheltie wanting to play. Ignore them at your peril. Enumerations are also introduced here.
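Something along these lines (a made-up example of my own) lets the conventions and enumerations earn their keep:

    import math
    from enum import Enum

    class Color(Enum):         # class names use CapWords by convention
        RED = 1
        GREEN = 2

    class Circle:
        def __init__(self, radius, color=Color.RED):  # dunder names hook into the language
            self._radius = radius                     # single underscore: internal by convention
            self.color = color

        @property
        def area(self):
            return math.pi * self._radius ** 2

    c = Circle(2.0, Color.GREEN)
    print(c.area, c.color.name)   # 12.566... GREEN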

For many, all the object bits will now come into focus. This is a good thing. I’ve never liked the approach where students are taught how to use bits and pieces of libraries without the foundation to understand what’s actually happening. They end up with a false sense of accomplishment and may never seek to build an accurate model of the language’s world. They’re like Jeff Goldblum’s character in ‘The Fly’, who had no clue how things actually worked since he just specified what he wanted a given part to do and plugged the parts together. We all know how that worked out for him. It also emphasizes the importance of being able to debug your system.

I’ve never understood why some people shy away from classes in Python. They act as though organization is an inherently evil thing. They also probably have all their laundry in two piles in the middle of their bedroom (one dirty, one clean).

Classes point us to resource management, but we can’t have that discussion until input/output is covered. That gives us the idea of using classes (file streams) and their methods to process data. Here’s where string formatting and core data conversion come in.
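A sketch of the shape of it (the file name is invented for illustration):

    # A file object is a class instance; 'with' ties its lifetime to a scope.
    with open('scores.txt', 'w', encoding='utf-8') as out:
        for name, score in [('Ada', 95.5), ('Alan', 88.25)]:
            out.write(f'{name},{score:.2f}\n')     # f-strings do the formatting

    with open('scores.txt', encoding='utf-8') as src:
        for line in src:                           # the file object is its own iterator
            name, score = line.rstrip().split(',')
            print(name, float(score))              # core data conversion: str -> float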

Up until now, exceptions have only been alluded to. Now there’s enough structure in place not only to address them, but to give meaningful examples of their use.
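For instance, a small made-up validator shows the error path separated from the happy path:

    def parse_port(text):
        port = int(text)            # raises ValueError on non-numeric input
        if not 0 < port < 65536:
            raise ValueError(f'port out of range: {port}')
        return port

    try:
        print(parse_port('8080'))   # 8080
        print(parse_port('99999'))  # raises
    except ValueError as err:
        print(f'rejected: {err}')
    finally:
        print('done')               # runs whether or not an exception occurred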

At this point, you’re done with the core language, so we’ll address unit testing. We’ve done bits of this along the way since the introduction of classes, but now we can talk about the unit test library.
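A minimal example of the flavor, testing a helper invented for this sketch:

    import unittest

    def slugify(title):
        # the function under test; made up for this example
        return '-'.join(title.lower().split())

    class SlugifyTests(unittest.TestCase):
        def test_spaces_become_hyphens(self):
            self.assertEqual(slugify('Hello World'), 'hello-world')

        def test_clean_titles_pass_through(self):
            self.assertEqual(slugify('python'), 'python')

    if __name__ == '__main__':
        unittest.main()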

What remains are sections on sequence and associative containers. An important aspect of this is teaching how to select the appropriate container. Yes, you can use list and tuple alone, but there are better things to do with your life than reinventing the wheel. Technical interviews insisting that people be able to balance binary trees notwithstanding.
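The selection lesson in miniature; my examples, all from the standard library:

    from collections import Counter, deque

    seen = set()                    # O(1) membership tests; a list would scan
    seen.add('alice')
    print('alice' in seen)          # True

    queue = deque(['a', 'b'])       # O(1) at both ends; list.pop(0) is O(n)
    queue.appendleft('z')
    print(queue.popleft())          # 'z'

    tally = Counter('mississippi')  # counting without hand-rolled bookkeeping
    print(tally.most_common(2))     # [('i', 4), ('s', 4)]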

Finally, a brief introduction to the standard library. Before Googling, how about being aware of what’s already in the box?
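Two quick samples of what’s in the box, no third-party installs required (the file name is invented):

    import json
    import pathlib
    import statistics

    data = [2, 4, 4, 4, 5, 5, 7, 9]
    print(statistics.mean(data), statistics.stdev(data))  # no NumPy needed for the basics

    config = {'retries': 3, 'verbose': True}
    path = pathlib.Path('config.json')
    path.write_text(json.dumps(config))
    print(json.loads(path.read_text()))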

You’ll note a distinct absence of web browsers, GUI applications, client-server systems, etc. Just the language here.

Would my class make you a Python expert? By no means. As with all things, you become proficient through years of study and practice. It is my hope that my Python class would give you a good start on that journey.

I just finished reading Blink: The Power of Thinking Without Thinking by Malcolm Gladwell. Perception is a fickle thing. When IBM created a text editor for their mainframe terminals, they ran into a weird problem. It was too fast. Changes were applied to the screen faster than people could perceive them. In order to nudge the brains of their users, they made the screen flash when a change occurred.

Working in an industry where people want high returns on low risk, I find myself at a loss to explain my sense of what is the right or wrong path. On more than one occasion, I have spent days preparing presentations to give decision makers warm fuzzies about issues which seemed perfectly obvious to me. It takes days because there is a long way between perfectly obvious and the breadcrumb-laden trail that people seem to need.

Unfortunately, you can’t Google yourself into a state of experience. So the question becomes: do you want to understand, or just cover your ass? A paper trail does the latter quite nicely. At some point, you must make the decision.

If you want to glimpse the process, read the book.


I’ve just finished reading Code Warriors: NSA’s Codebreakers and the Secret Intelligence War Against the Soviet Union by Stephen Budiansky. It tells the story of the US National Security Agency (NSA) up through the end of the Cold War.

Given the number of dry histories of the people and agencies who deal with cryptography and spying, this book is reasonably readable. If you’re looking for a less arch, more human view of how things got to where they are, you’ll like this book.

The takeaway from the book is that the biggest hindrance in the world of security is people. People who are control freaks, who don’t believe that rules apply to them, who believe that the “other side” is stupid, or who are just too damn lazy to do the simple things that would avoid issues are the problem. You can’t design your way around them. If you try, you’ll only make things worse.

You can’t pretend you have the moral high ground when you’re collecting enough information to make the US National Archives, the Library of Congress, Google and Facebook look redundant. I’m not picking sides. I’m just saying that if you’ve got a hammer and you use the hammer, own up to the fact, and don’t go around telling everybody and their brother that they shouldn’t use hammers, and that in fact hammers either don’t exist or are illegal (or would be if they did actually exist, which they don’t).

Along the way, you’ll be introduced to a cast of well-intentioned, clueless, brilliant and ruthless individuals. There are miscommunications, denial of responsibilities, bruised egos, moments of insight and face palm moments.

Please keep in mind Hanlon’s razor.


On Point

The Chrysanthemum and The Sword is an exploration of what makes the Japanese tick, at least from the standpoint of an early 20th-century scholar. Ruth Benedict was one of that era’s foremost anthropologists.

I could go into the interesting discussion of how the American and Japanese cultures are compared and contrasted. Or how she describes the Japanese approach to Buddhism as being free of non-corporeal entanglements. I’ll leave those to the earnest reader.

What made the greatest impression on me was the lengthy and detailed exploration of on (恩) and giri (義理). These can broadly be thought of as debt and obligation. In the West, we have a fixation on equivalent exchange. We like to pretend that there is nothing which cannot be fully bought and paid for. The Japanese fully recognize that this is not the case and have built a society around the concept of overlapping obligations.

One obligation that I have always found difficult to explain is shogimu (諸義務), the obligation to one’s teacher or mentor. This is always an asymmetrical relationship. The apprentice has nothing to offer in exchange compared to what they are given. In the West, we, as they say, “just take the money and run.” This is especially true of the attitude of Googling for answers. People have come to believe that they can monetize the collected knowledge of the world without cost to themselves. The problem comes not to this first generation of internet users, but to the second, and is fully realized in the third. The problem is that of who supplies the knowledge.

One of the big complaints of the pre-internet era was how big companies hoarded knowledge, making it available only at a premium. Should it not be free to all? Between the (relatively speaking) small band of developers who contributed to open source, the internet, and Google, we find ourselves in a place where you don’t need a master to monetize. And so, rather than having to pay for software as a function of complexity and craftsmanship, we tolerate mediocre software because it’s free (with ads, or in exchange for our personal information).

Meanwhile, the high-quality software we wouldn’t pay for when it was $400 with a year of updates, we will pay for when it’s a $120 annual subscription to a cloud service. Except that now, if we stop paying, we lose access.

But back to the thorny question. Do we really believe that the Google-for-code and I-built-it-all-from-other-people’s-stuff crowd are going to give back to the community? What will happen when those who made all these goodies possible and available retire? If we take the C++ community as an example, there are hundreds supporting millions. This doesn’t scale.

Answers? Nope.

Thoughts? Some.

Personally, I’ve got a bucket full of on, so I should get back to it.
