Archive for the ‘Apple’ Category

I believe that you can learn a lot about people through the things they fill their heads with. Over time the mechanisms for this process have grown in number and availability. Once, people would travel great distances seeking out teachers. Of course, if you were powerful enough you could have them come to you. Once we managed to get the teaching down in permanent (well, mostly) form, you didn’t actually need to bother with the whole physical presence thing. Still, the need for a scribe made this practical only for the silly rich. By the time we get to the 20th century, the super clever public library idea meant that you could recommend a book to someone without running the risk that their dog would eat the only copy for a thousand miles. The mid 20th century added audio and, by its end, video to the menu. With the advent of the internet, we could not only find materials to borrow from libraries with ridiculous ease, but could reserve them and get an email when they were ready for pickup. Then came the Kindle, Zinio, Netflix and the iTunes Store. Now if someone is ingesting some bit of knowledge, in all likelihood, you can too (and within minutes).

So, what’s with the Burke-ian prologue?

Well, I was reading The New Yorker’s article The Shape of Things to Come, about Jony Ive and the future of Apple. Among the bits of past, present and impact was a fascinating detail: Ive had been watching Moon Machines [iTunes Store]. Not what I expected.

I’ve always been a bit of a space wonk, so I was interested just on the face of it. What I found fascinating was that a person born in England in 1967, best known for early 21st century industrial design, saw something of interest in a series dedicated to the United States’ Apollo program.

Having now watched the series, there are things that jump out at me. As with every time I take in something that’s been recommended (if a person with the time constraints of an SVP at Apple mentions that they see value in spending the time, that’s a hint one would be ill-advised to ignore), I strive to understand how it relates to the person, their work and their goals. Ive’s comment speaks volumes.

… like the Apollo program, the creation of Apple products required “invention after invention after invention that you would never be conscious of, but that was necessary to do something that was new.”

The Apollo program was a tech start-up writ large. The goal was abstract; the timetables unyielding; the cost astronomical (literally); the toll on people and their relationships severe. In the end, the successes were ascendant and the failures devastating. The six episodes take on the major aspects which had to work together in order to assure the success of the program.

The lessons of Apollo are applicable to endeavors in science, business, politics and design. Issues of control, quality, planning, communication and contingency are laid bare. As are their failures. Of particular distinction are the moments of crisis. Unlike anything before or since, we have documentation of and visibility into the people who stepped up to lead their teams, and the processes through which those crises were overcome.

In an era of ever-increasing abstraction and the misplaced belief that you don’t actually need to understand how things work in order to produce something of quality, Moon Machines provides timely lessons. The quality of the end product begins with the confluence of domain and technology, not the application of one to the other. The speed and manner of disposing of problems during a crisis depend greatly on the depth of the team’s understanding of two questions: What do I have? and What do I need? As well as the understanding of how to get to the latter using the former.

In the end, I have a greater admiration of those involved in Apollo thanks to a comment by Jony Ive.

Read Full Post »

It’s been five months since Apple’s Worldwide Developers Conference ended. This year the WWDC iOS application had the session videos available sooner than ever. I’d been watching them during my commute to and from downtown Portland. Well, this past week, I finished watching the last one. And with 107 videos, it’s Apple’s version of Netflix putting up a few years of Doctor Who episodes for people to gorge on.

Truth be told, I would have finished a month ago. So, what was the hold-up? iOS 8 was released. In doing so, it revealed that some change in the OS caused the WWDC app to crash when you tried to view a video. Very sad. The really unfortunate part is that I was only four videos away from having watched them all.

Well, I’m happy to report that the WWDC app is now working fine and I’d encourage anyone who’s doing OS X or iOS development to take advantage of them. There is a tremendous amount of information on the latest tools, technologies and techniques. Swift and Metal feature prominently.

Now, I’m not saying that these are all Jobsian quality presentations. As much as I appreciate that software development is a global endeavor, whoever decided that someone with a pronounced French accent should be littering his presentation with the word banana should be forced to watch that session from beginning to end. Subtitles would also have helped some sessions.

All-in-all these presentations are well crafted and delivered. Apple also continues its tradition of providing substantive sample code. Nothing is worse than code snippets that don’t convey the true flavor of using an API as you would in practice. Happily gone are the days when the AOCE (Apple Open Collaboration Environment) documentation was 1200 pages of paper with no index. Not the best way to pick up a new technology.

So if you need a break from watching the original run of The Tomorrow People, the WWDC sessions are there for you.

Read Full Post »

Update 2015-12-21: Apple released Swift source code. I’ve updated my sample again to reflect what I learned.

Update 2015-09-27: It’s been a year and much has happened with Swift. Please see my latest post on Swift command line input for current code.


Recently I’ve been watching Stanford intro CS classes. I like to see how they present the fundamental concepts and techniques of programming. This got me thinking about those missing bits of Swift that would allow me to actually write a command line-based application. [see my previous post Swift: Second Things First] Having these bits would allow me to teach Swift as a first language without having to teach the abstractions and interfaces required to properly develop for a graphical interface. I’m not much into attempting to teach in a way that breaks the “go from strength to strength” methodology. If you’re going to teach me to sing, it’s a whole lot easier if I don’t have to learn how to spin plates at the same time.

So, I spent some time and created a simple set of routines that, when added to a Swift command line application, allow you to get and put strings, integers and floats. Not exactly rocket science, which raises the question, “Why didn’t Apple do it?” Well, since I’m not Apple, I have no idea.

Here, without further comment, is the content of the file I wrote. That it is not the best Swift code, I have no doubt. If you can make it better, cool. And Apple, if you read this, please make something sensible of it.

Update: I’ve manually wrapped a few lines as WordPress is clipping.

//
//  swift_input_routines.swift
//  swift input test
//
//  Created by Charles Wilson on 9/27/14.
//  Copyright (c) 2014 Charles Wilson.
// Permission is granted to use and modify so long as attribution is made.
//

import Foundation

func putString (_ outputString : NSString = "")
{
  if outputString.length >= 1
  {
    NSFileHandle.fileHandleWithStandardOutput().writeData(
               outputString.dataUsingEncoding(NSUTF8StringEncoding)!)
  }
}

func getString (_ prompt : NSString = "") -> NSString
{
  if prompt.length >= 1
  {
    putString(prompt)
  }

  var inputString : NSString = ""
  let data        : NSData?  = NSFileHandle.fileHandleWithStandardInput().availableData

  if ( data != nil )
  {
    inputString = NSString(data: data!, encoding: NSUTF8StringEncoding)!

    if inputString.length > 0
    {
      inputString = inputString.substringToIndex(inputString.length - 1)
    }
  }

  return inputString
}

func getInteger (_ prompt : NSString = "") -> Int
{
  if prompt.length >= 1
  {
    putString(prompt)
  }

  var inputValue : Int = 0
  let inputString = getString()

  inputValue = inputString.integerValue

  return inputValue
}

func getFloat (_ prompt : NSString = "") -> Float
{
  if prompt.length >= 1
  {
    putString(prompt)
  }

  var inputValue : Float = 0.0
  let inputString = getString()

  inputValue = inputString.floatValue

  return inputValue
}

And here’s a little test program that uses it.


//
//  main.swift
//  swift input test
//
//  Created by Charles Wilson on 9/27/14.
//  Copyright (c) 2014 Charles Wilson. All rights reserved.
//

import Foundation

var name = getString("What is your name? ")

if name.length == 0
{
  name = "George"

  putString("That's not much of a name. I'll call you '\(name)'\n")
}
else
{
  putString("Your name is '\(name)'\n")
}

let age = getInteger("How old are you \(name)? ")

putString("You are \(age) years old\n")

let number = getFloat("Enter a number with a decimal point in it: ")

putString("\(number) is a nice number")

putString("\n\n")
putString("bye\n")

You’re probably wondering why I don’t use print(). Well, print() doesn’t flush stdout’s buffer. And, I really like to enter data on the same line as the prompt. And for those of you who say that you can use print() in Xcode’s output window, I’ll remind you that a simulator isn’t the target device.
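If all you wanted was the prompt on the same line, the missing piece is a flush. A minimal sketch (assuming the C library’s fflush and stdout are visible from Swift, which importing Foundation makes so):

import Foundation

// print() leaves the prompt sitting in stdout's buffer; with no trailing
// newline, it may never appear before the program blocks waiting for input.
print("What is your name? ")

// Forcing the flush pushes the prompt out so input happens on the same line.
fflush(stdout)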

“But, wait. You didn’t comment the code.” No, I didn’t. By the time a student has enough understanding of Cocoa to compose the I/O routines provided above, the comments would be unnecessary.

So, there you have it. The typical second program that you’d ask a student to write.

Read Full Post »

I’ve been looking at Swift for about a month now. My first thought when I see a new language is,

How would I teach this language to someone new to programming?

After spending countless hours dealing with the little things in a language that contribute to the lack of patience developers have with non-developers, I still hold out hope that we will have learned from our past and can create a language which will enable those of us who toil in darkness to get out a bit more.

Back in the day, when a new language emerged from the primordial soup, it was accompanied by a language description. This document ensures that, in the case where the creator(s) of the language are run down by a rampaging gaggle of salvage yard geese mysteriously loosed by the supporters of the favorite language du jour, the language will still be available to the six people already using it. [Just like Apple’s linker that was written in Oberon. But that’s a story for another day.] Lest you be under the mistaken impression that this document is the be-all and end-all of the language, I would refer the gentle reader to the first Ada spec, which indicated that a unary minus could be present in the middle of a digit string.

Sometime later, a language manual would appear. If you are very lucky, this will be written by a teacher (Pascal User Manual and Report). Alternately, it may be written by a practitioner known for their ability to create concise code like awk (The C Programming Language). If you’re really lucky, the author may be both a teacher and a practitioner whose book printings required the deforestation of small Pacific islands (The C++ Programming Language). Regardless of the provenance, only those who live on the bleeding edge or college (more recently high school) students embrace these tomes of wonderment.

Assuming that the language becomes popular enough to catch the attention of people other than the full-stack crowd, a book may appear whose clarity will ensure that it is longer lived than the inevitable dummies book. This rare collection includes A FORTRAN Coloring Book, Basic BASIC, and Programming in C.

So far, Apple has released three documents on Swift. The first was the language reference. The second details Swift and Objective-C interoperability. The latest is the Swift standard library reference.

This year’s WWDC included seven Swift-specific sessions and eight others that referred to it. This level of coverage is quite impressive, but then again, they’ve been working on the language for about four years.

Enough background already, how would I teach Swift as a first programming language?

Unfortunately, right now, today, I can’t. You can tinker with Swift in playgrounds. You can integrate Swift and Objective-C. You can create Swift-based iOS or OS X applications. What you can’t do is write a CLI program that is pure Swift.

Look at any programming language instructional methodology. What’s the second thing they teach? The second? Yes, the second. The first, since K&R, is hello, world.

The second thing that you have people do is prompt for their name and say hello back to them. Output is important, but without input programs are pretty boring. I’m not ignoring the vast and glorious mound of Objective-C (and, by extension, C and C++) code that’s accessible to Swift, but that’s not the same as being able to create the same things in Swift.

Generally, I find Swift a compelling language, but today it’s not a first language. I’m hoping that Apple will correct this deficiency in the not too distant future.


So, that was the post. It’s now a week later. Why is it still sitting unpublished? Well, I just wasn’t happy with my conclusion. Having had a bit of a mull, I’ve not changed my mind but I believe that I need to revisit my basic assumption as to what constitutes the baseline for teaching a first computer language.

The idea is that to teach a person how to program, you should have as little magic at play as possible. What is magic? Elaborate command invocations, for one. Just being able to use the word invocation should be enough of a clue. Requiring the construction of things that have nothing to do with the actual language is another. This is probably the aspect that I have the most difficulty with.

“I’ll teach you how to program, but first you’ll need to lash together the user interface.” That would be all well and fine except that print is provided. Why don’t I need to provide a mechanism for stuff to go out if I need to provide one for stuff that comes in?
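To put the asymmetry in concrete terms, here is a sketch in the Swift of the day (my illustration; the Foundation calls are the ones a pure-Swift command line program has to lean on):

import Foundation

// Output: one built-in call, no ceremony.
println("hello, world")

// Input: no built-in. You reach through Foundation for the standard
// input file handle and decode the raw bytes yourself.
let data = NSFileHandle.fileHandleWithStandardInput().availableData

if let reply = NSString(data: data, encoding: NSUTF8StringEncoding)
{
  println("hello, \(reply)")
}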

So, where do we start? The advantage of the pre-GUI age was that there was one true interface to the computer. The way we thought about our programs was dictated by the programming language we used. For a long time after the GUI was introduced we tried to treat our interfaces as extensions of our programs rather than partner environments. Even after we decided that there was sufficient power to run multiple applications at once, we were still mucking about with low-memory globals.

Trying to make the UI an independent entity took the idea of an abstraction penalty to new heights. The things that worked didn’t scale. And, in general, the things that scaled didn’t work. We won’t even talk about speed. Or fragile base classes … I’ll leave GUI evolution posts for another time.

Suffice it to say that the bones of many developers were used to pave the smooth road on which today’s applications travel to get from creator to customer. Somewhere in the process, we went from being a bunch of villages connected by trails to a planet full of complexity and wonder.

So now, we think about desktop, embedded systems, mobile devices, web, distributed systems, databases and games in radically different ways. These ‘once computers’ are now ‘delivery platforms.’ In order to create a product that aims to make use of (or be available on) several of these, it is necessary to perform the equivalent of running a restaurant where the staff are all expert in what they do, but each speaks their own language. To complicate matters, sometimes they want to use the same tools in the kitchen (usually the knives) and the customers tend to fight over getting the ‘best’ table.

If I teach someone C, C++, Java, Lisp, PHP, Python, or [insert language here], I don’t have to teach them the UI language of the system at the same time. With Swift I do. Is this going to complicate things? Probably. Will it take longer? If I want to be sure that they realize that this UI metaphor isn’t ‘the one true’ metaphor, absolutely.

I believe that Swift has a lot of potential. I would hate to see it restricted to being used only in the context of ‘Developing for Apple devices with Swift.’

Read Full Post »

I’ll be the first to admit that I obsess over security. My internship in college dealt with Unix security. I’ve created encrypted protocols for wireless data communication. And for my master’s thesis, I created a highly virus-resistant computer architecture (AHVRC – aka Aardvark). I wrote it in 1993. I put it up on the web in 1999.

So, what to my wondering eye did appear a few days ago? None other than the latest installment of Apple’s “iOS Security” document.

Personally, I like reading Apple documentation. But then again, I read owner’s manuals. Anyway …

So, I find myself reading iOS Security and keep thinking, “that’s what I would have done.” Wait, that’s what I did do.

I was casting about for a thesis topic and my department chair noted that no one was doing anything in secure architectures. So I spent a chunk of time thinking and put a little 124-page missive together. Now, gentle reader, having taken it upon yourself to read a few pages in, you begin thinking, “this can’t be serious, it’s got animals instead of sub-systems.” True, true. The master’s level is supposed to have a certain level of awe and wonder associated with it. Boring. Here’s a little secret. In a traditional master’s program, you devote the equivalent of three courses to the research and writing of a document (thesis). The point of the thesis and its defense is to demonstrate mastery of the discipline. The defense is done publicly. Anyone may attend. You must advertise it to the student body. Some number of professors, typically in your discipline and of your choosing, make up the group who decide if you and your work are up to snuff. Questions may be asked in any area of your studies, but primarily the discussions will revolve around your thesis. Hence, its being called a defense. Once the professors have had at you, the gallery gets their shots.

You already knew that didn’t you. Well, that’s not the secret.

The secret is that the defense is conducted within the context of the thesis. They attack, but you get to build the world. Think of it as a duel. You get to choose the weapons.

Nothing warms the cockles of my heart more than to see the distinguished faculty discussing a highly technical matter in the context of dolphins, gophers and kinkajous.

I even applied (with Rose-Hulman generously funding it) for a patent. Had I had more patience and a more informed examiner at the USPTO, I would probably have a patent for the work.

I’m not sure if the developers at Apple ever read my thesis or referenced my patent filing. I do find the similarities in the two architectures interesting.

I hope everyone who reads this posting takes the opportunity to read both documents. Apple’s, because it presents the state of the art in application security model implementation. Mine, because I think I’m pretty well pleased with myself about it.

Read Full Post »
