Thursday, May 15, 2014


Republicans Care About Idaho

I was going to run in the Republican primary to be governor of Idaho, but there were some tough questions in the first debate that I just didn't know how to answer. For instance:

1. If you were governor of Idaho, what would you do about jobs?

I thought the answer revolved around increasing sales of potatoes to fast-food restaurant chains, but nobody seems to care about the iconic spud. Some good ideas that hadn't occurred to me include more logging, more mining, digging for natural gas, and killing more wolves. All of these will provide good jobs.

2. If you were governor of Idaho, what would you do about the economy?

Even though growing more potatoes is pretty much my answer for everything, it seems that too many Idahoans are working for minimum wage in fast-food restaurant chains, and the ones who might be helping the economy by killing wolves are out of work. Wiser heads suggest more logging, more mining, digging for natural gas, killing more wolves, and clear-cutting our overgrown forest to prevent costly fires so that we have more timber to harvest and sell, and more roads for mining and drilling.

3. If you were governor of Idaho, how would you help to preserve the state's natural resources for future generations?

God gave us Luther Burbank, and Luther Burbank gave us the Idaho potato. Just like Idahoans, a good potato might be brown on the outside, but it's pure white and flaky on the inside! As it turns out, and I just never knew it, Idaho has other natural resources as well. There's logging, mining, and drilling, and we just need to do more of it to preserve what's ours from those land-grabbing feds. And we should make sure every school kid in the state knows this by teaching Idaho superiority in our schools.

4. If you were governor of Idaho, what would you do about taxes?

Shiiiiiit (excuse my French)! One thing we can all agree on is that we don't need no God-damned taxes to pay the feds to come in here and tell us what to do with Idaho!! We've got our God-given natural resources that belong to all Idahoans; says so right here in the State Constitution.

5. Do you have any concluding remarks?

The problem with Idaho is the Feds. We've got all these lands and almost none of it belongs to us. Just take back what's rightfully ours, tell the Feds to go suck on a rock in some other, less-enlightened state, and we'll have full employment with real, manly-man jobs. None of this flipping burgers for minimum wage. We'll have guns, chainsaws, oil rigs, bulldozers, and we'll know what to do with them. These is our God-given rights and I don't care who knows it.

Thursday, April 10, 2014

Big Deal: Amazon Fire TV

Amazon Fire TV is a big deal. Why? 

Because it's Amazon, it's new, and it's going up against products from Apple and Google. It's also a big deal for Amazon, because Amazon wants to own the world's content by owning the devices used to deliver and consume the world's content. Fire TV fills a big hole in Amazon's quest for world-content dominance. 

Do I want to buy an Amazon Fire TV? No. I already have an Apple TV, which satisfies my video-streaming and content-casting needs.


If I were new to video streaming would I buy Fire TV? No. It's unable to stream content from my other devices to my TV (content-casting).

Nonetheless, Fire TV has three distinct advantages: it's faster, which doesn't affect streaming performance but does make the interface more responsive; it includes voice command, so you can speak into the remote to select movies; and there's a game controller available as an additional purchase.

Amazon's Hyperbole Machine

Despite the hyperbolic praise, these are not breakthrough features, and I get the sense that Amazon knows this. To me, there's something sinister behind the huge marketing campaign for Fire TV.

Have you seen the "Gary Busey Meets Amazon Fire TV" ad? It features a mentally-unstable-seeming actor playing himself and talking to inanimate objects. He talks at a Roku remote and gets nothing, but his Fire TV remote hears him when he says his own name and brings up movies he's in. He smiles an iconic Amazon smile (sort of).

It's a nicely produced ad, but as much a dig at Roku as it is praise for Fire TV. Oddly, Amazon.com makes money selling Roku devices. Isn't there something disingenuous about this?

What's Up Amazon's Sleeve?

Another example of the games Amazon is playing: let's have a look at Amazon's Apple TV product page. (They make money on these, too.) It looks like any product page, but there's something here I'm having trouble finding on other Amazon listings. Before the Apple TV information begins, there's a banner in Amazon-smile orange labeled "Similar Items to Consider," listing two Roku models and Google Chromecast.



There isn't a similar banner across the top of the Roku or Chromecast product pages showing Apple TV as a similar item to consider. Nothing like this on the Fire TV page. Is Amazon saying, "We'd rather you didn't buy this product?"

But wait, there's more! Amazon, ever helpful to its customers, has created a "Streaming Media Players Store." More marketing sleight of hand: "Stream your favorite content anywhere, anytime." This statement is the real tip-off, because Amazon is really a content store. Buy a Kindle or a Fire TV and you'll be paying for content from Amazon for the rest of your life (or at least for the life of your Amazon-branded device).

George Eastman, Mr. Kodak, made the remarkable discovery that if you give away cameras, you'll be selling film forever. Ditto for Gillette and razors (the holder, not the blade). Same trick for Polaroid.

God of Consumerism

Jeff Bezos sees each of us as a consumer, and whatever we want to consume, he's going to sell it to us. He wants all of us to be Amazon consumers, all the time. Fire TV is a Trojan horse for content, a way for Bezos to get Amazon into our living rooms and keep everyone else out!

This first iteration of Fire TV isn't going to win the battle of the living room. For one thing, its customer satisfaction rating on amazon.com is only 3.7, compared to 3.9 for Chromecast, 4.3 for Roku 3, and 4.4 for Apple TV. 

But if this one isn't quite good enough, you can bet there'll be another, better one. And if that doesn't make it, Bezos can cut prices so low that he's practically giving them away. He's not after huge profits, just total market domination.

Wednesday, March 26, 2014

Throwing an Unforgettably Productive Work Party

Work Party? That's an oxymoron that has somehow turned into standard practice. But who says work can't, or shouldn't, be enjoyable? I think we can all drink to that!

My Favorite Work Party, Ever, Was a Big Mistake

I once hosted a "Work Party." It was terrifying: scores of the nerdiest freelance developer/authors on the planet, all in San Francisco to attend one of Apple's WWDC gatherings. This was an after-hours gathering, and few of us had met face-to-face before, so it had to be socially comfortable—AKA: beer, wine, and nerd snacks aplenty.

I had some distinctly work-oriented objectives, and once the conversations got going, there was plenty of talk of our many collaborative projects, current and future. The fact is, iOS and mobile developers are a congenial and basically happy lot. The place was pleasant, the food was good, and most people stuck around through more than a single bottle of suds.

We made lots of new friends, and so many substantial ideas were discussed that I had to break down and take notes for follow-up. In fact, we ate all the food and hung around chatting until we were kicked out. Everything seemed to have gone successfully. I later learned that it was all doomed from the start!

Experiencing Both Sides of The Double-Edged Sword

As it turned out, the corporate culture of my employer was distinctly anti-Work Party. This was not an acceptable way to do business, for all sorts of non-business-like reasons that went unsaid during the planning process. It seems there were managers who were jealous of our "fun," something I failed to anticipate.

In short, from a productivity point of view, much was accomplished by gathering to party. From a corporate point of view, no good deed goes unpunished. But my lesson is not a negative one. You see, I remain a friend and collaborator of many who attended the party, and I long ago said goodbye forever to the corporation!

Saturday, March 22, 2014

Apple—Singin' & Dancin' in the Rain

I read the following article in Quartz: Why Apple should make its own TV shows, just like Netflix. It's a wrongheaded, poorly argued piece that made me angry. Here's the nutshell version:

  • Statement: Netflix, Sony, Yahoo, Amazon, and Microsoft are all acquiring and/or producing exclusive content.
  • Problem: Apple "confronts slowing growth in the sales of its devices."
  • Conclusion: "Maybe it’s something Apple should consider as well."
So says Macquarie Equities, which, according to Quartz, "became the 63rd research house to cover the world’s biggest company this week." And they show an impressive lack of expertise on the subject.

Let's restate the problem by changing one word:
Apple notes slowing growth in the sales of its devices.
It's true, Apple's sales curve is no longer growing exponentially. The numbers for Apple's most recent quarter, reported January 27th, are as follows:

  • 51 million iPhones vs 47.8 million the previous year, up 6.7% 
  • 26 million iPads vs 22.9 million the previous year, up 13.5%

Both of these figures are all-time quarterly records for Apple. Since the article speaks of devices as a single category, I'll lump Apple's iDevices together, yielding:

  • 77 million iDevices vs 70.7 million previously, up 8.9%

IF 9% growth is a problem, would 10% growth, to 77.8 million units, still be perceived as a problem? At what point is Apple's growth sufficient? Would 15% growth, to 81.3 million units, avoid the need for problem confrontation?
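The percentages above come from simple year-over-year arithmetic; here's a quick sketch, using the unit counts (in millions) quoted from Apple's January 27th report:

```python
def yoy_growth(current, previous):
    """Year-over-year growth, as a percentage of the previous period."""
    return (current - previous) / previous * 100

# Unit sales in millions, per Apple's most recent quarterly report.
iphones = yoy_growth(51.0, 47.8)
ipads = yoy_growth(26.0, 22.9)
idevices = yoy_growth(51.0 + 26.0, 47.8 + 22.9)

print(f"iPhone: {iphones:.1f}%")     # → 6.7%
print(f"iPad: {ipads:.1f}%")         # → 13.5%
print(f"iDevices: {idevices:.1f}%")  # → 8.9%
```

Note that lumping the devices together yields growth between the two individual rates, weighted toward the much larger iPhone figure.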

This value judgement notwithstanding, the rate of growth has undoubtedly slowed, which is what one expects in any product lifecycle. At the same time, "Apple… is being criticized for not innovating enough." To which we can only wonder: how much innovation is enough?

For the sake of argument, let's accept the assertions that slowing iDevice sales are a concern and that Apple's ability to innovate is in doubt. We must also assume that the likely introduction of an iPhone 6 in June will be a disappointment, requiring Apple to look beyond its devices for additional sources of revenue.

I know, let's make content! Macquarie says:
We believe that Apple would benefit from the deployment of some of its considerable cash balance toward securing exclusive media content. In our 15 years of covering the interactive entertainment space, we have frequently observed the value that can be generated through high-quality, desirable content that is exclusive to a platform (the original Xbox is a classic example of this, with early-stage growth driven in large part by the popularity of Halo). We think just one or two key exclusives could be very helpful in establishing new products and extending iOS’s reach.
Translation: sometimes high-quality movies, TV, and video games make money.

This is not a profound insight. Furthermore, what has this got to do with iDevice sales? The article says:
There are already reports that, as sales on iTunes dwindle, Apple is trying to convince record companies to provide it with music that only it may sell.
Except that Apple reported a 20% increase in sales on iTunes for the last quarter. So maybe there's something else?
Of course, there’s no suggestion the company is even contemplating this, but if it ever did, Apple’s ruthless obsession with quality means it would probably be worth watching.
There it is. We'd really like to see something produced with such ruthless attention to quality that only Apple could make it. If only they'd admit that they're getting clobbered in the marketplace by the likes of Google and Samsung, we could end this charade and get down to some serious entertainment!

Thursday, February 27, 2014

Buttermilk Biscuits & Civilization, A True Story

Mythical Buttermilk Biscuit Cow


Just as pancakes are merely a vehicle for maple syrup, biscuits provide the perfect vehicle for blackberry, strawberry, raspberry, peach, or apricot jam. And just as I've always made jam, I've always made biscuits and scones. I also count myself among those who have sought in vain to differentiate scone from biscuit and biscuit from scone. So it was with eager anticipation that I read:


Biscuits and Scones Share Tender Secrets, in the New York Times: Dining & Wine, Feb 25, 2014.

In which Julia Moskin tells us that American biscuits originated in the British Isles as scones, first mentioned in print in the 16th century. Her source authority is Elisabeth Luard, a director of the Oxford Symposium on Food and Cookery and the author of The Old World Kitchen.
The proto-scone is believed to come from Scottish kitchens, where rounds of oat and barley dough were cooked on large griddles, then cut into wedges. They were a simple combination of fat, flour and liquid, which became softer and lighter as wheat, butter and leaveners like baking soda and baking powder became widely available.
Not exactly breaking news, but good journalism with source clearly attributed. But what is it about the New York Times food writers? So often they choose to state opinion as fact. For instance:
Height is paramount to a good biscuit or scone.
I like a tall biscuit, too, but an acquaintance from Mobile, Alabama, makes a traditional New Year's biscuit that is flat and nearly as crispy as a cracker. With a breakfast pork chop and gravy, it's delicious. That's opinion, because deliciousness is a matter of taste, not fact.

To be fair, opinions stated authoritatively can easily be taken as fact, but we want to trust the New York Times to at least get the facts right. To wit:
Buttermilk is a traditional liquid for biscuits and used to contain more butterfat, but today it is a lean and sour product.
This is not only wrong, it's gratuitous. Are we supposed to think that old-fashioned buttermilk made better biscuits? Let's set the record straight, which isn't too hard, because the name says it all (and there are plenty of sources, like this one from a cheese-making professor of biology and chemistry).

Traditional buttermilk was the liquid left behind after churning cream into butter. The fat solidifies into butter leaving a liquid byproduct, buttermilk. You can't buy traditional buttermilk, at least not in these parts, but you can make it yourself.

Try the recipe for Home-churned butter and buttermilk: the sweet-cream type, on page 245 of Milk: The Surprising Story of Milk Through the Ages, by Anne Mendelson. She also goes into the differences between "true" buttermilk and the modern-day product. She finds both delicious.

Commercially available buttermilk is generally a "cultured" product, i.e., bacteria are added to milk and allowed to ferment. As with yoghurt, you can ferment any kind of milk with any amount of fat you choose. Most grocery-store buttermilk appears to be a cultured, low-fat product. It's thicker than milk, thinner than yoghurt, and both pourable and drinkable. More importantly for our biscuits, it's more acidic than unfermented milk.
Cooking Chemistry 101: Acid + Baking Soda = Gaseous Lift
The chemical reaction creates carbon dioxide and gives baked goods a quick rise, which is why buttermilk is a traditional addition to biscuits, scones, and Irish soda bread, too. (It's almost St. Patrick's Day.)
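For the curious, the main acid in cultured buttermilk is lactic acid, so the lift reaction works out roughly like this (a standard acid/base sketch, not from the Times article):

```latex
\mathrm{NaHCO_3} + \mathrm{CH_3CH(OH)COOH} \;\longrightarrow\; \mathrm{CH_3CH(OH)COONa} + \mathrm{H_2O} + \mathrm{CO_2}\uparrow
```

The carbon dioxide is the lift; the sodium lactate and water stay behind in the crumb.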

Milk begets cream, sour cream, butter, buttermilk, yoghurt, kefir, and a long list of other fermented, cultured, and ripened products, including countless varieties of cheese.

Goat's, sheep's, or cow's milk, plus some milled grain, like wheat, oats, or corn, and you've got the dough of civilization. Let it sit around to collect yeasts, and you've got bread. If you're in a hurry to leave town, you've got matzo. Or you can move civilization forward to 1846, when Messrs. Dwight & Church, two New York bakers, built the first baking soda factory (now Arm & Hammer), and matzos magically became biscuits. (Though a high-fat cream certainly makes a richer-tasting biscuit.)

Leave out the double-acting baking powder and stick to buttermilk and baking soda, plus butter for richness and flavor. I'm going to have to try the crowding tip of "huddling" unbaked biscuits close together on the baking sheet so they've got no place to expand but up! But then again, nice flat crackers have their devotees, as well.

Saturday, January 11, 2014

Cory Doctorow Dons Chicken-Little Suit to Fight DRM in HTML5, Why?

Holy smokes, Cory Doctorow thinks The End is Near! We are Huxleying ourselves into the full Orwell.
I can’t shake the feeling that 2014 is the year we lose the Web. -Cory Doctorow
Either Doctorow knows something we don't, or he's been infected with Chicken-Little virus and believes the sky is falling. But Doctorow is not so easily dismissed. In fact, we like and admire him and his works. He writes smart stuff and he understands the Web. He's seriously concerned with the free and open sharing of ideas and information, and he's not greedy.

Doctorow is the sworn enemy of DRM, Digital Rights Management, and is an active, articulate, and vociferous leader for a DRM-free world. When the "W3C green-lights adding DRM to the Web's standards," Doctorow responds: "I can't let you do that, Dave."

To make matters even scarier, no less than Tim Berners-Lee is openly supporting the W3C's decision! This feels to many like a betrayal, as if TB-L cares more about streaming his favorite Hollywood movies over Netflix than about genuine Net Neutrality and Openness. (Though one wonders if TB-L ever watches movies.)

But we must take a step back and reflect again on Doctorow's contention that this has anything to do with "losing the Web." What's really happened and what are the implications?

On September 30, 2013, the W3C announced that content protection for video is "in-scope" for discussion in the HTML Working Group. This is a euphemism for the inclusion of Encrypted Media Extensions (EME) in HTML5, a form of DRM that the movie industry has been lobbying for.

The HTML5 standard includes the video tags necessary to support streaming media within browsers. But vendors who wish to enforce their copyrights with encryption must turn to browser plug-ins for DRM.

EME would provide a Web Standard platform for plug-ins, like Flash, which may or may not make a huge difference to end-users, but might make life a little more transparent for developers. But this is hardly the issue.

As much as we're sympathetic to Doctorow and the forces for an open Web, it's hard to see the W3C's decision as a threat to the Web's survival.

TB-L, never at a loss for detailed explanations, says:
if content protection of some kind has to be used for videos, it is better for it to be discussed in the open at W3C, better for everyone to use an interoperable open standard as much as possible, and better for it to be framed in a browser which can be open source, and available on a general purpose computer rather than a special purpose box.
We concur: 2014 will not be the end of the Web; including some kind of EME platform within HTML5 will not create an opaque experience for users and developers (as Doctorow claims); and the sky is not falling.

We also support Doctorow's quest for a DRM-free world, but we can't see how the W3C's pragmatic support for DRM will bring about the end of the world. In the inimitable words of Science Officer Spock: "It's not logical."

-Professor Walrus