
2010

The funny thing about iPhone apps

So a friend of mine told me about his iPhone app, called Etude. He sang its praises, and it sounded really cool to me. So I whipped out my iPhone and went to download the app. Wouldn't you know it, it costs $4.99. I turned to him and said, "Hey, it's not free!"

Yeah, silly reaction, given how I've ranted myself about free and not-free. And here's the funnier part still. He said, "Oh yes, it's $3.00." I said, "No, it's more like $5.00." He said, "Oh yes, we just raised the price." I gave him a blank look.

Here's the kicker. He said, "What if I give you $5.00 right now?"

I still didn't want to buy it… How does that make any sense?

The “T Word” – Trust

I am not sure what this article about Trust means exactly, but it's thought-provoking, don't you think?

"Trust is present or it is absent. Grab a nerd and he’ll tell you that even the absence of trust is a measure of trust and that particular measure is zero. When trust is non-zero (which is better, believe me) it is based on one of two methodologies — empiricism or transparency (the other T-word)." (from I, Cringely)

Bring me different rocks

Scott Kirsner summarizes a new book "Mastering the VC game" and re-tells a funny VC scenario:

"He also talks about an exercise called the "rock fetch," when VCs ask an entrepreneur to spend time finding other investors willing to join them on the investment (bringing them "rocks"), but then decline to collaborate on a deal with those investors ("bring us a different rock.")" (from Scott Kirsner)

Ok, not really funny. More ~~sad~~ ~~depressing~~ ~~poignant~~ familiar…

Sequoia Voting System Witch Hunt, err… Study Project

Check this post Sequoia Voting System Witch Hunt, err… Study Project from Coder's Revolution:

Matt Woodward pointed out this Slash Dot article today about the accidental release of code from the Sequoia Voting Systems and a web site dedicated to studying that code. Apparently the Election Defense Alliance obtained a copy of the election data for Riverside County, California. It came in the form of a Microsoft SQL Server backup that was SUPPOSED to have all the code such as stored procs and triggers redacted. I wandered over to the "Sequoia Voting System Study Project" and scored me a copy of the data. [More] (from: Sequoia Voting System Witch Hunt, err… Study Project)

There's been (among election geeks, anyway) a flurry of comment on the accidental discovery that Sequoia uses SQL stored procedures. First of all, the code snippet that was released was, IMHO, innocuous; second, I don't think it's really clear where that code snippet is from. So I agree with this post (and its author's vigorous defense in the comment stream) framing this as a witch hunt.

Originally posted on Oct 23, 2009. Reprinted courtesy of ReRuns plug-in.

Two fun (to me) articles about arcane mathematical topics

Ok, probably to a mathematician these are not arcane, but to normal people (oops, sorry, I love mathematicians) I think they might be. Anyway, read and enjoy; my only addition is a small toy sketch after the excerpts:

  • Needle-in-a-haystack Problems: "[snip…]A needle-in-a-haystack problem is a problem where the right answer is very difficult to determine in advance, but it's easy to recognize the right answer if someone points it out to you. Faced with a big haystack, it's hard to find the needle; but if someone tells you where the needle is, it's easy to verify that they're right[snip…]"

  • Needle-in-a-haystack Problems, and P vs. NP. "[snip…]Last week I wrote about needle-in-a-haystack problems, in which it's hard to find the solution but if somebody tells you the solution it's easy to verify. A commenter asked whether such problems are related to the P vs. NP problem, which is the most important unsolved problem in theoretical computer science. It turns out that they are related, and that needle-in-a-haystack problems are a nice framework for explaining the P vs. NP problem, which few non-experts seem to understand.[snip…]"
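Here's that small toy sketch of the verify-vs-find asymmetry, using subset-sum as the haystack (my own illustration, not something from the linked articles): finding a subset of numbers that hits a target means wading through exponentially many candidates, but checking a claimed answer is a single sum.

```python
# Toy illustration of the needle-in-a-haystack asymmetry: *finding* a subset that sums
# to the target takes exhaustive search over 2^n subsets, but *verifying* a proposed
# subset is a one-line check.
from itertools import combinations

def find_subset(numbers, target):
    """Exhaustive search: try every subset (exponential in len(numbers))."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return combo
    return None

def verify_subset(numbers, target, candidate):
    """Checking a claimed answer is easy: its elements must come from the list and hit the target.
    (Multiplicity is glossed over here for simplicity.)"""
    return all(c in numbers for c in candidate) and sum(candidate) == target

if __name__ == "__main__":
    haystack = [267, -53, 14, 902, 41, -88, 330, 7]
    needle = find_subset(haystack, 214)                   # slow part: searching the haystack
    print(needle, verify_subset(haystack, 214, needle))   # fast part: checking the needle
```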

CAPTCHA’s cracked

CAPTCHA is the nickname of the venerable (ok, only a few years of veneration) technique for verifying that the person on the other side of the screen is actually a person and not a computer. We've all seen them a million times: a very hard-to-read bit of text in a small box, with a request that you decipher it and type the text into another box.
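As a rough sketch of the mechanics (my own toy example, assuming a plain-text challenge instead of the distorted image a real CAPTCHA would render): the server invents a random string, keeps the answer on its side, and later compares what you typed.

```python
# Toy text-only CAPTCHA round-trip -- a sketch of the idea, not a real implementation.
# A real CAPTCHA renders the challenge as a distorted image to keep OCR software out.
import secrets
import string

def new_challenge(length: int = 6) -> str:
    """The string the user will be asked to retype (the server keeps a copy)."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def check_response(expected: str, typed: str) -> bool:
    """Case-insensitive check of the stored answer against what the user typed."""
    return secrets.compare_digest(expected.upper(), typed.strip().upper())

if __name__ == "__main__":
    challenge = new_challenge()
    print("Please type:", challenge)                      # in practice, shown as a mangled image
    print(check_response(challenge, challenge.lower()))   # True: the "user" answered correctly
```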

Well it seems that spammers have figured out a way to defeat them: yes, you guessed it, "Captcha Farms." I first thought this was some exaggerated fear-mongering, but I suppose they're commonplace enough that it's being reported even in, the, g-u-l-p, New York Times:

"Sophisticated spammers are paying people in India, Bangladesh, China and other developing countries to tackle the simple tests known as captchas, which ask Web users to type in a string of semiobscured characters to prove they are human beings and not spam-generating robots." (from Spammers Paying Others to Solve Captchas, from the NYT.)

Except I've heard of an even more subversive variation, where spammers put up what appear to be run-of-the-mill porn sites that, instead of asking you to pay cash for access, ask you to solve a captcha for each image or video or whatever. (Whatever?) The poor guy isn't just solving it to gain access; he's also solving it, in real time, to help a spammer get past a captcha on some site they're attacking. Creative.

Security by obscurity and other slogans

If you've been in computing for any length of time, you may have been hit over the head with the slogan "Security by Obscurity is No Security". As I have understood it, the argument has a few components:

  1. If your security relies on secret tricks, trap doors, and a hope that no one will be able to find out or guess the workaround, then you're fooling yourself. Sooner or later someone will guess the trick, see the code, or quit your company and take the secret with them.
  2. Allowing your code and methods to be inspected and analyzed by the public (bad guys included) is the only way to learn about weaknesses you would otherwise be blind to, and it gives you a chance to close them. The other slogan, which I will tackle some other time, is "All bugs are shallow to a thousand eyes," implying that no matter how subtle the weakness, if you let lots and lots of people look, they will find them all.

(Actually Wikipedia has a longer and probably more correct summary of the Security By Obscurity concept.)
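To make the distinction concrete, here's a minimal sketch of my own (in Python, with a message-authentication flavor; none of this comes from the Wikipedia article): one scheme's only protection is that nobody knows how it works, the other publishes how it works and keeps just a key secret (Kerckhoffs's principle).

```python
# Toy contrast: "obscurity" keeps the *algorithm* secret; the conventional design publishes
# the algorithm (HMAC-SHA256) and keeps only a *key* secret.
import hashlib
import hmac
import secrets

def obscure_tag(message: bytes) -> str:
    """'Secret sauce' checksum: its only protection is that nobody knows how it works."""
    return hashlib.sha256(b"xor-then-reverse-then-" + message[::-1]).hexdigest()

KEY = secrets.token_bytes(32)  # the only secret in the conventional design

def keyed_tag(message: bytes) -> str:
    """Published construction (HMAC-SHA256); security rests entirely on KEY."""
    return hmac.new(KEY, message, hashlib.sha256).hexdigest()

if __name__ == "__main__":
    msg = b"hello world"
    print(obscure_tag(msg))  # anyone who reads this file can forge these
    print(keyed_tag(msg))    # forging these requires KEY, even with the full source in hand
```

Once the "secret sauce" in the first function leaks, there is nothing left to protect; if the second scheme's code leaks, you lose nothing, and if the key leaks, you just rotate it.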

In the past I was usually quickly persuaded, or at least silenced, when confronted with these arguments, although at a gut level it never really sat right with me. While the arguments are strong, I had a vague sense that obscurity does in fact help security and is often a useful part of the whole security story. But who was I to argue?

With that background I was interested to see an article in the New York Times the other day, "Cyberattack on Google Said to Hit Password System":

"[snip…]But a person with direct knowledge of the investigation now says that the losses included one of Google’s crown jewels, a password system that controls access by millions of users worldwide to almost all of the company’s Web services, including e-mail and business applications. The program, code named Gaia for the Greek goddess of the earth, was attacked in a lightning raid taking less than two days last December, the person said. Described publicly only once at a technical conference four years ago, the software is intended to enable users and employees to sign in with their password just once to operate a range of services.[snip…]" (from New York Times , "Cyberattack
on Google Said to Hit Password System
")

This got me thinking: where is the "security by obscurity" crowd now? If you read the whole article, you see that there is considerable concern at Google about the fact that the operation of this single sign-on security system has been revealed.

Not that passwords or digital certificates were compromised; (apparently) just the operation, or algorithm, or code for it was. Isn't that just security by obscurity?

It makes perfect sense to me that these are state secrets for Google and that it's considered a major breach.

[GEEKY] iPhone earphones revisited

The earphone/microphone set that came with my iPhone is acting odd. First I thought it was a problem with my ear, then with my iPhone, but it seems to be a problem with the earphones, which I never thought of as something that could actually fail, short of my breaking them physically.

The sound quality is just fine, but it only gets 1/2 as loud as it should. How do I know? If I plug in a pair of brand-x earphones, I get the right sound level. Is this a plausible failure mode for the earphones?

P.S. Have you noticed that all these earphone pairs, the Apple ones but other brands too, have an unexplained tiny little clip along the wire through which you can slide another part of the wire? Nowhere is the purpose of this thing explained, yet they all have it. What is it for???

A classic article that I finally read

The article "Reflections on Trusting Trust" is often mentioned in conversation and cited. I finally tracked down and read this classic, and it is indeed a classic of computer science. And typically for classics, it's short, clear, readable and impactful:

"The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ peoplelike me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect." (from Reflections on Trusting Trust)