Wednesday, April 27, 2011

Patriotic Millionaires mix it up with Orrin Hatch

I am a "member" of a "group" called Patriotic Millionaires for Fiscal Strength. The reason I put the words in scare quotes is that nothing binds us together except for the fact that we've all signed a letter to the powers that be in Washington urging them to raise taxes on the rich (which is us) in order to reduce the budget deficit. This apparently got the attention of Senator Orrin Hatch who felt the need to lecture us about the fact that we can make voluntary contributions to the government:


For those that are interested in making voluntary contributions to pay down the national debt, the process is both easy and advantageous. Federal law authorizes the Secretary of the Treasury to accept conditional gifts to the United States for the purposes of reducing the public debt. Individuals can go to the website pay.gov where they are able to make a tax deductible charitable contribution to pay down the public debt.


I'm proud to have been the principal drafter of a response to Senator Hatch. I reproduce it here in its entirety:


Dear Senator Hatch:

Thank you for your letter of April 20. With all due respect, you appear to be laboring under a number of misapprehensions. On behalf of the "so‐called" (your words) Patriotic Millionaires we would like to take this opportunity to set the record straight.

First, we are well aware that making voluntary contributions to reduce the deficit is an option that is open to us. That you seem to think reminding us of this is a constructive contribution to this serious debate indicates that you have missed the point. In our democracy, individual citizens do not get to pick and choose what government spending to pay for. You and your colleagues over the past decade have voted for vast outlays that many of us as individuals might not agree with. Nonetheless, we recognize our responsibility as citizens to pay for these expenditures, which were authorized by our elected representatives, and are therefore ultimately our collective responsibility. That is an intrinsic part of living in a democracy: you don't get to opt out.

But letting people opt out is precisely what you are suggesting with your proposal of paying down our debt with voluntary contributions. In World War II, when we faced great challenges as a nation, we didn't ask for voluntary contributions to pay for the war, or ask only those who supported the war to contribute. We had high taxes during the war, and high taxes afterward to pay down the debt. Today, we benefit from that fiscal discipline. But we are undoing those benefits to society by cutting taxes on the wealthy at the same time we face enormous expenses and are carrying enormous debt. We need both higher taxes and spending restraint to address this problem, just as we have done in the past.

During World War II, we even resorted to rationing to share the burden of war more equally. Who is paying the burden of war, today? Our less privileged, who fight and die in disproportionate numbers, and our future generations, who will bear the burden of the debt. We think that is shameful.

We are ready to step up to the plate with a willingness to sacrifice for the greater good but we are not willing to make that sacrifice in vain, which it surely would be if we followed the course that you suggest. You even point this out yourself in your letter when you note that "the Bureau of Public Debt recorded only $3.1 million in gifts in 2010." We have been more fortunate than most people, but we are a very small group. If there were even the remotest chance of making a noticeable dent in the problem by acting alone we would have done it already. But we are a few dozen people in a nation of over 300 million facing a debt measured in the tens of trillions. To suggest that we try to tackle this problem by making individual contributions is, frankly, insulting. It is like suggesting to someone expressing a desire to serve their country by bearing arms that they buy a rifle and a plane ticket to Afghanistan. Some problems are too big to be solved except through collective effort and shared sacrifice, and this is one of them.

Second, you write: "this debt crisis is not caused because we tax too little. It is caused because our nation spends too much." This is quibbling over semantics. Deficits result when spending exceeds receipts. Whether that happens because spending is too high or receipts are too low is a matter of perspective and priorities.

In 1977, when you first became a Senator, the U.S. national debt was approximately $700 billion – that's with a B – or 36% of then-GDP. At the end of 2008, before Barack Obama came to the White House, the national debt had ballooned to almost $10 trillion – that's with a T – or about 70% of 2008 GDP (OMB). While there are different opinions as to how this happened, the national debt did not creep up on us suddenly. The spending that led to such debt resulted from the collective actions of Senators and Representatives, including you.

It is true that government spending levels are at historic highs, but it is also true that tax rates (and hence receipts) are at historic lows in terms of percentage of GDP. It is the combination of these two factors that has taken us from surplus to near‐catastrophic deficits in a mere decade.

Third, you cite Kevin Williamson's argument that "a public school administrator earning $130,000 married to a pharmacist earning $125,000 a year is rhetorically lumped together with millionaires and billionaires." That may be, but it is Williamson doing the rhetorical lumping, not us. We have urged the President to raise taxes only on people earning over $1 million a year, so Williamson's argument is a complete non sequitur. But even under the most aggressive plan currently on the table, Williamson's hypothetical couple would pay zero additional taxes, as deductions and exemptions would reduce their taxable income to well below $250,000.

Finally, we would like to remind you of two historical points. The first is that the Constitution of the United States of America was established for the express purpose of "promot[ing] the general welfare" and not just the welfare of the rich and powerful. Over the last ten years we, the signatories of the PM letter, have done very well, in no small measure because we benefited from public education, government services, a civil society, and world‐class infrastructure, all provided by the government. However, our good fortune has not been shared by the vast majority of our fellow citizens and since our success has been supported by the general public, we feel an obligation to pay back.

The second historical point is that we have faced a crisis like this before. In the early 1990s we successfully addressed a similar crisis through a combination of tax increases and spending cuts. As a result, in 2000 we were not debating how to address a debt crisis, but rather how best to dispose of a budget surplus. It is also worth noting, as a matter of historical fact, that we reached this happy state of affairs through a bipartisan effort involving a Democratic president and a Republican Congress. This makes us fundamentally optimistic that the problem we face today is surmountable.

You close by expressing concern about raising taxes on us "during a vulnerable economic recovery." It is precisely because we do not want this problem solved solely on the backs of the most vulnerable that we have asked the President to call us to our duty. To him and to you we say again: raise our taxes. We can take it.

A comedy of political errors

The White House has "released" (you'll see why I put that word in scare quotes when you follow the link) the long form of President Obama's birth certificate. Donald Trump immediately took credit for "hav[ing] played such a big role in hopefully getting rid of this issue." The CSM gives the win to Trump, saying, "In politics, anytime you force an opponent to react to you, as opposed to talk about what they want to talk about, you've often scored a point."

The CSM is right, but it overlooks one important point: this will not get rid of the issue. It will only fan the flames higher. There is a reason that conspiracy theories have "conspiracy" as a prominent part of their name. They are impervious to facts because "facts" can be manufactured. You don't even have to try very hard to concoct a birther-friendly explanation of the alleged long-form birth certificate. For starters, the White House did not release the document in a sane format, like an image or a PDF. They "released" it as an Adobe Flash file. So you can't look at the whole document; you can only peer at it through a teeny weeny little window. Yes, you can zoom in and out and pan around, but if you zoom in far enough to be able to read the text you can only see a tiny part of the document. Why did they do this? Obviously there can only be one "reasonable" explanation: the White House is desperately trying to conceal the fact that the document is a forgery.

So yes, Trump scored a point by making the President react. But then he conceded the game by implicitly accepting the document as legitimate! You can't have it both ways. Either you're playing for the loons or you aren't. Switching horses in mid-stream is political suicide. Trump already lost all the sane people when he signed on as a birther in the first place, but now he will lose the birthers too because he's become part of the conspiracy!

This is Trump's greatest weakness: he can't stop himself from gloating over a victory. It just cost him the White House.

[Update] Turns out they released it as a PDF too. But of course that won't matter either; they'll just come up with some other story.

Tuesday, April 26, 2011

Is it Lisp or is it me?

John D. Cook writes:


I’m skeptical when I hear someone say that he was able to program circles around his colleagues and it’s all because he writes Lisp. Assuming such a person accurately assesses his productivity relative to his peers, it’s hard to attribute such a vast difference to Lisp (or any other programming language).

...

There are genius programmers who write Lisp, and Lisp may suit them well. But these same folks would also be able to accomplish amazing things in other languages.


As someone who has specifically gone on the record saying pretty much exactly that (more than once) I feel the need to refute this claim.

Since the argument is based on the premise that I was in fact more productive than my peers, I have to toot my own horn a bit more than I'm generally comfortable with. I spent twelve years of my early career at the Jet Propulsion Lab during which time* I was promoted to the rank of Principal, the highest rung on the technical career ladder. I think it's safe to say that I did not achieve this through my sparkling personality, my willingness to brown-nose, or my finely honed political skills. In fact, looking back on it, I think at one time or another I managed to alienate about 90% of the people I came in contact with. So the only reasonable explanation of the fact that I was promoted instead of fired was that I produced results.

That still leaves open the possibility that I'm just freakin' brilliant, I could have (and would have) produced the same results in any old programming language. But there is actual data to refute that theory.

First, when I was at Google I got to see firsthand what real coding genius looks like, and it ain't me. There were dozens of people at Google who could code so fast and so effectively that it literally left me slack-jawed. My colleagues regularly did things that I would not have even thought possible had I not seen them with my own eyes.

Second, when I tried to learn to do what they were doing, I failed miserably. I tried to wrap my brain around Java and C++ and I just couldn't. I found myself so frustrated by the fact that I had to manually worry about a zillion little details that I could just ignore if I were using Lisp that I got a mental wedgie and I just couldn't get past it. I'd get bogged down in protection faults, STL errors that scrolled off the top of the screen, the fetid cesspool that is Perl, and get so frustrated that I just couldn't make any progress.

Third, I'm not the only one this has happened to. When I was working on the remote agent experiment an attempt was made to port part of that code (the planner) from Lisp to C++. After a year that effort had to be abandoned because the planner team just couldn't make it work. So it's not just me.

Fourth, it just seems obvious from first principles that if you're using a language where you don't have to worry about memory management, you're going to get things done quicker and more reliably than in a language where you do.

It is because of all this that I attribute my own success more to Lisp than to my personal coding prowess.

This is not to say that Lisp is a panacea. I do believe in the Lisp curse that the power of the language is in some respects self-undermining because it empowers the individual and so tends to attract people who don't work well in teams. A team of competent programmers willing to put up with bullshit will out-compete a lone wolf striving for elegance no matter how brilliant he is. It's sad, but that's the way it is.

---
[*] To be strictly accurate, after 12 years I was a Senior, one rung below Principal. I then left for a year to work at Google, and when I returned I was re-hired as a Principal.

Friday, April 22, 2011

New startup rule: revenue pages should get top priority

My world, along with that of a zillion other geeks, came crashing down around me about 24 hours ago when Amazon Web Services went down and took Reddit with it. Things are slowly returning to normal, but at the moment only paid Reddit Gold members are being allowed to log in. I've actually been meaning to sign up for Gold for a while just on general principle because I like Reddit and want to see it continue to thrive, so this seemed like as good a time as any. Unfortunately, it turns out that to sign up you have to log in. Catch-22.

So... if you're doing an on-line startup, you should build it in such a way that the revenue path stays up no matter what, or at least is absolutely the last thing to go. You never know when or why people might want to give you money. You should never make it impossible for them to do so.
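The principle can be sketched in code. This is a minimal, hypothetical illustration (all names are mine, not from any real payment system): the payment path depends on nothing that can fail, so when the session store is unreachable the money is still accepted and the account is linked up later.

```python
def accept_payment(email, amount_cents, session_store=None):
    """Record a payment. The session store is optional: if it is
    unavailable or the user is unknown, queue the payment against
    the email address alone and reconcile the account later."""
    payment = {"email": email, "amount_cents": amount_cents}
    try:
        if session_store is None:
            raise ConnectionError("session store unreachable")
        payment["user_id"] = session_store[email]  # normal path
        payment["status"] = "credited"
    except (ConnectionError, KeyError):
        # Degraded path: take the money now, link the account later.
        payment["status"] = "pending_reconciliation"
    return payment

# Normal operation: the user is known and credited immediately.
up = accept_payment("reader@example.com", 399, {"reader@example.com": 42})

# Outage: the payment still succeeds, just deferred for reconciliation.
down = accept_payment("reader@example.com", 399, None)
```

The point isn't the specific mechanism; it's that the revenue-accepting code has no hard dependency on the subsystem that just fell over.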

Thursday, April 21, 2011

Articles on the web need to be prominently dated

During my morning read-through of the day's news I ran across this link pointing to a Washington Post article disclosing the existence of what are essentially secret police operations conducted by the FBI without a court order. Naturally my blood began to boil and I dashed off an indignant blog post. Only after I published the post and it had been up for a while did I notice that the article was from 2007.

This is one of the problems with digital media. Back when newspapers were on paper it was easy to tell if an article was old: the paper would be yellow and frayed. There are no such cues on the web. Every article is as pristine as the day it was written, so unless there are some obvious stylistic cues there is no way to tell when a particular article was written unless it is dated. And even if it is dated, can you really trust the date?

The web really needs some infrastructure for producing reliable and prominent timestamps.
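Some of that infrastructure already exists, at least in embryonic form: HTML5's `<time>` element with a `datetime` attribute gives an article a machine-readable publication date. Here's a sketch (the class name and sample markup are mine) of how a reader or aggregator could pull that date out with nothing but the Python standard library and flag stale articles automatically:

```python
from datetime import datetime
from html.parser import HTMLParser

class TimestampFinder(HTMLParser):
    """Collect machine-readable dates from <time datetime="..."> tags."""
    def __init__(self):
        super().__init__()
        self.dates = []

    def handle_starttag(self, tag, attrs):
        if tag == "time":
            for name, value in attrs:
                if name == "datetime":
                    self.dates.append(datetime.fromisoformat(value))

# A hypothetical article marked up the way the 2007 FBI story might be.
article = ('<article><time datetime="2007-11-06">Nov. 6, 2007</time>'
           '<p>The FBI is assembling...</p></article>')

finder = TimestampFinder()
finder.feed(article)

# How stale was the article on the day I read it?
age_years = (datetime(2011, 4, 21) - finder.dates[0]).days / 365.25
```

Of course this only helps if publishers actually emit the markup, and (as noted above) if you trust the date they put in it.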

Wednesday, April 20, 2011

A split-brain experiment

My sister pointed me to this video about a particularly interesting split-brain experiment. In these experiments, the corpus callosum, the bundle of nerves that connects the two hemispheres of the brain, is surgically severed, usually as a treatment for severe epilepsy. The result is literally two separate brains residing in the same body, which can develop two separate personalities. In this case, one of the personalities believes in God and the other one doesn't. As the speaker observes, this raises a profound theological question: what happens when this person dies? (And no, saying this person has two souls doesn't solve the problem, because that just raises the question of when he acquired his second soul.)

Personally, I believe this is just an extreme case of what is actually a much more common phenomenon than is generally appreciated. Our conscious selves are not, in fact, the coherent whole we perceive them to be. The perception we have that we are "individuals" with an "essence" or an identity that is constant across time or even coherent at any given moment is an illusion. This can be demonstrated by a wide range of psychophysics experiments, but I don't have time to look those up at the moment. Fodder for a future post. Someone remind me.

Abortion and the Bible

Abortion is once again in the news because the anti-choice mafia is successfully taking over the state house. So this seems like as good an excuse as any to point out the fact that the Bible does not support the pro-life position.

I wouldn't go so far as to say that the Bible is pro-choice (at least not for women) but it turns out that it is explicitly against the proposition that life begins at conception. Exodus 21:22 says:


“If people are fighting and hit a pregnant woman so that she miscarries but there is no serious injury, the offender must be fined ..."


So killing a fetus should be punishable by nothing more than a fine.

That should be game, set and match in the debate over whether abortion is murder, but of course it won't be because religion is not about truth, it's about feeling good, so as long as fighting for "innocent unborn children" feels good, no amount of logic nor the Word of God will deter the anti-choicers from their fix.

Tuesday, April 19, 2011

Oh, puhleez!

People are playing Quidditch. Or at least they're trying. They're adapting the game to conform to the laws of physics, but the result sounds less like Quidditch and more like the bastard stepchild of curling (they use brooms), rugby and Calvinball:


Welcome to the wild and weird sport of Muggle Quidditch, where boundary lines are suggestions, four balls are in play at any given time, and every player -- except for the elusive golden Snitch -- dashes about with large, bristly broomsticks held mid-thigh. Six years ago, the game was just a cool idea hatched by a group of students at Vermont's Middlebury College; today, there are more than 700 teams on high school and college campuses worldwide.


No, I'm sorry, but this is not "a cool idea", this is a completely stupid idea. The whole point of Quidditch (the "real" version, the one in J. K. Rowling's books) is that you can't play it in real life because it requires actual magic. That's what gives it its charm (so to speak). Quidditch without magic is like water polo without water. And, I'm sorry, but carrying a broom between your legs while you run around is just ridiculous. I hate to be a wet blanket, but someone needs to get a life.

Raising taxes on the rich will not harm the recovery

We've actually done this experiment once before, back in the '90s. The Republicans screamed bloody murder. They were wrong then, and they are wrong now.


The universal claim of Republicans in 1993 was that if the Clinton budget and its tax hikes on the wealthy ever became law, the country would immediately be plunged into another recession and millions of Americans would lose their jobs. That didn't happen at all. Moreover, when the economy began growing rapidly in the middle and late years of the decade, the new, higher rates established by the Clinton budget helped produce a massive revenue windfall -- enough to lead to record surpluses and to bring into sight the elimination of the entire national debt. You don't need to claim that Clinton's '93 budget itself spurred the economy or that it was the only reason a balanced budget was reached so quickly (yes -- there were other factors!) to acknowledge that (a) it did not cause any of the economic turmoil that Republicans guaranteed Americans it would and it did not hinder -- at all -- the sustained growth that marked the rest of the decade; and (b) it brought in far more revenue than the pre-Clinton budget tax rates would have -- and this revenue made achieving a balanced budget much easier.

Kentucky church comes out in support of gay marriage

Tip o' the hat to the Douglass Boulevard Christian Church in Louisville, Kentucky whose members have voted to support gay marriage. In fact, they've gone one step further and voted to publicly protest marriage discrimination against gays by not signing marriage licenses until gay marriage is legalized.


"In our attempt to live out God's call to pursue justice for all, the Elders of the congregation joined the Pastors in witnessing to the right for gay and lesbian persons to God's blessing on their union and witnessing to the Commonwealth toward ending the refusal to recognize these unions," said Rev. Chuck Lewis, Chair of Elders with the church.


Kudos!

Saturday, April 16, 2011

Thank you to my adoring fans! :-)

Just over a year ago (holy cow, has it been that long?) I was privileged to attend the premiere of a film made by a friend of mine named Hillary Scarl called See What I'm Saying, the Deaf Entertainers Documentary. (It's a really great film. If you haven't seen it, DVDs are now available :-)

That was a real Hollywood premiere. It took place at the Egyptian theatre. About ten million people attended (well, maybe not quite that many, but it felt like it). There were limos, a band, paparazzi, a red carpet, acrobats (no, really!) ... the whole shlemobble.

My premiere last night wasn't like that.

We were planning to show up half an hour ahead of time just to leave some margin, and it's good we did because we hit traffic and only arrived ten minutes early. We were the first ones there. "There" turned out to be an old naval base in Alameda (near where they film "Mythbusters"). We were really wondering if we had the right place. If you can imagine a place that looks nothing at all like the place where you'd hold a film festival, that's what this place looked like.

Until you got inside.

There behind the doors was what has to be one of California's best kept secrets: an absolutely gorgeous art-deco theatre, not quite as nice as the Egyptian, but in the same league. It was at once beautiful and sad because it was absolutely empty. And it was still empty except for me and Nancy when the show started.

Now, my film was the third one of the evening. The first was a ten-minute short music video, and the second was an hour-long film about sustainable industry. By the time the second film started there was still no one there, which meant that not even the people who made the first two films had bothered to show up to their own screening.

I was starting to get a little worried.

To my great relief, before the second film was half-way through people started to trickle in, and by the time my film started there were a fair number of people there (hard to tell exactly how many because it was dark).

Now, I have seen my own film a zillion times. I know every frame, every note in the soundtrack, every syllable spoken, every helicopter flying overhead (sound was the bane of my existence for two years when I was filming). We have a pretty big screen hi-def TV at home with a fairly decent sound system. But watching my film in a theatre was a completely different experience, not just because of the audience (which despite its small size was still the biggest group of people who had ever seen it at once), but also because there was something about the sound system that brought out details in a way that headphones and our home audio system just never did. I can't put my finger on exactly what it was (reverb maybe?) but whatever it was it made a huge emotional impact on me, and I guess it did on the audience as well because they all laughed at all the funny parts and (as far as I could tell) cried at all the sad parts. And when it was done the applause sounded enthusiastic, not merely polite. It felt great. Made the whole five-year long effort worth it.

Afterwards, the festival host called me on stage (he called me "brother Ron" which was the first time anyone has ever called me that, and I thought that was pretty cool too) and we had some Q&A. There were a lot of questions, and a lot of compliments. (If anyone who attended is reading this, I would really appreciate if you would write up a quick review and send it to me so I can put it up on the web site. Thanks!) It was a really great feeling. I can only imagine how Hillary felt in the same position in the Egyptian with a standing room only crowd.

I have no idea where the film will go from here. I'm very much hoping I'll be able to line up some more screenings. If you saw the film and liked it please tell your friends. If you know a film festival programmer or Jon Feltheimer, tell them :-) I'm not much of a marketeer so my main hope of reaching a broader audience is to have it go viral.

Thanks to everyone who attended last night!

Wednesday, April 13, 2011

Rich people are not the creators of wealth

I was a signatory to the original Patriotic Millionaires campaign back in November, and I recently signed on for round two. It's looking more promising this time. Obama finally seems to have found his spine:


In December, I agreed to extend the tax cuts for the wealthiest Americans because it was the only way I could prevent a tax hike on middle-class Americans. But we cannot afford $1 trillion worth of tax cuts for every millionaire and billionaire in our society. And I refuse to renew them again.


Actually, it's a pretty good speech. Worth reading the whole thing. (I don't think I've ever said that about a political speech before, and it will probably be a long time before I say it again. Sorry about the NYT link. I tried to find the text on the whitehouse.gov site and failed. If anyone from the White House PIO is reading this, if someone like me can't find the text of the president's most recent speech on your site, you have a serious problem.)

Of course, the Republicans were attacking the speech, particularly the part about raising taxes on the rich, before it was even delivered, lending credence to the theory that Republicans don't actually think about what they are saying but just mechanically object to anything the president says or does, even if it's their own idea.

But there's one very dangerous notion that really needs to be squashed: "now it's time to tax the people who create the wealth." (That was Michele Bachmann, but you can count on hearing this line from the chorus on Fox News for the next few weeks.) The implication is that 1) rich people create wealth and 2) if you tax them, they'll stop. Both of these ideas are wrong.

Making money and creating wealth are two very different things. You can make money without creating wealth, and you can create wealth without making money. Salman Khan has created vastly more wealth than, say, Angelo Mozilo, but Mozilo made vastly more money. (Mozilo actually made his money by destroying wealth.)

Money and wealth are not completely unrelated, of course, but even when they go together the causality is often backwards from what Republicans tacitly assume. Money is usually the result of creating wealth, not the cause.

"But," you might object, "you need money to create wealth." That's not true. You need capital, but just as money isn't wealth, it isn't capital either. Money can be exchanged for capital (tools, factories, computers) just as it can be exchanged for wealth (food, clothing, shelter, entertainment). But money is distinct from both wealth and capital, and just because someone has money doesn't mean they got it by creating wealth, or will use it to create more. Furthermore, the government has been a very effective creator of capital: Interstate highways. The military-industrial complex (whether you like it or not, it's capital). Much of our medical and aerospace research and higher-education system. The Internet. All created by or with significant help from the government.

The proposition that the rich are the (sole) creators of wealth, and that taxing the rich destroys wealth, is sheer nonsense.

Tuesday, April 12, 2011

Seeking a non-technical co-founder

I'm looking for a non-technical co-founder for a startup I'm working on. Main qualification is good interpersonal skills. Your primary initial responsibility will be sales/bizdev. Prior sales/bizdev experience is a huge plus, as is prior experience in (or at least some passion for) the aviation industry. Being located in the Silicon Valley is a plus but not a requirement. There will be significant travel (mostly domestic at first but possibly international later on) involved, so this is not a good opportunity for someone who wants to spend a lot of time with their family in the short term. You would be employee #1 so there is significant upside potential. If you are interested, drop me a line.

Ron rambles on

So it seems that the Ramblings will continue. Thanks to everyone who responded to my last post. It was a real revelation to learn that there are nearly 300 people subscribed to Rondam Ramblings via RSS. Seriously, who are all you people?

This still leaves me with the problem of deciding what to write about. So for now, here's a video of an A380 clipping the tail of a CRJ-700 at JFK yesterday. If you think you're having a bad day at work, watch this and it will put your troubles in perspective.

Monday, April 11, 2011

To blog or not to blog

I've been writing Rondam Ramblings for nearly eight years now. It started out with no clear purpose, and now eight years on that's still pretty much the case. At various times I've written about science, programming, politics, religion, philosophy, current events, travel, my personal life, and short links to things that seemed cool to me. If I were starting this project today I'd probably be doing it on Facebook, but when I started RR, FB didn't exist (and it didn't open to the public until much later).

Along the way I've picked up thirty-four followers, and a few regular contributors have come and gone (mostly gone). I have no idea who most of you are or why you decided to subscribe, though I'm grateful you did. There is no greater reward for a writer than to be read. But it seems to have been a very long time since I've posted anything here that generated much interest. The last time anything really seemed to resonate was last October (and I don't want to make a blogging career out of taking pot shots at people).

Keeping up a blog is a non-trivial amount of work, and it is becoming less clear that Rondam Ramblings is worth the effort. If you thirty-four people, whoever you are, are still getting something out of this I'm happy to keep it up. But if you've moved on and just not bothered to unsubscribe then I'm going to redirect my efforts to other venues. So if you're still out there please let me know: send me an email, leave a comment, or just click on a reaction box, if for no other reason than so I'll know what kinds of things I ought to be posting here, or whether it's time to turn the page. Either way, thanks for listening.

Taking ridiculous to a whole 'nuther level

The Ramblings have been quiet because I just got back from a week aboard the MS Allure of the Seas, the world's largest cruise liner. Having seen her up close and personal, I rank her as one of the great achievements of human civilization, right up there with the Egyptian pyramids, the Coliseum, and vanilla soft-serve. She's a quarter million gross tons, carries over six thousand passengers and two thousand crew, and is the only ship in the world (except for her sister ship, the Oasis of the Seas) to have cabins with balconies that don't have a view of the water. She also has a basketball court, two FlowRiders, more restaurants, shops and pools than I can count, and an ice skating rink. She is a fully fledged floating city. In her one may even catch a glimpse of the future of human settlements.

By sheer happenstance, in one of our ports of call (St. Maarten) we docked right next to a little boat called Eclipse, which at 536 feet is at the moment the world's biggest private yacht:



(Click on the image for a hi-res version.)

They were in the middle of provisioning her for a trip and so I had a chance to chat briefly with one of the crew. I asked if she was available for charter. The answer: "I'm not allowed to tell you that." I took that as a "no" :-)

Sunday, April 03, 2011

Nuclear power is the safest way to make electricity

In particular, it is vastly safer than coal pretty much any way you slice it.

This article links to a particularly remarkable study (though the anchor text is misleading, so the link is hard to find) of the long-term health effects of Chernobyl, an accident so bad that it could almost be considered a "dirty bomb". From the abstract:


Among adult populations, there is no strong evidence to suggest that risk of thyroid cancer, leukaemia, or other malignant disease has increased as a result of the Chernobyl accident.


The article also says that in a study of 120,000 Hiroshima and Nagasaki survivors, some of whom were exposed to the blasts and others of whom were away at the time, there have been 822 "excess" deaths among the exposed population since 1950, or about 2% of the 42,304 people who have died during the study period.
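For what it's worth, the quoted percentage checks out. A quick sanity check using only the numbers cited above:

```python
# Figures from the Hiroshima/Nagasaki survivor study cited above
excess_deaths = 822
total_deaths = 42304

fraction = excess_deaths / total_deaths
print(f"{fraction:.1%}")  # about 2%, as claimed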

Yes, the situation in Japan is very serious and very tragic. But can we please stop freaking out about radiation?

Thursday, March 31, 2011

Pioneer Anomaly Solved

The Pioneer anomaly, one of the longest-standing mysteries in physics, has apparently been solved. It seems someone forgot to carry the one :-)

Actually, that's not so far from the truth. The problem with earlier models turned out to be that they didn't take into account the way heat from the spacecraft's plutonium power source reflected off the antenna.

What is most mind boggling to me is that it's actually possible to measure the position of the spacecraft accurately enough to tell the difference.

More hope for the world

Fox News viewership is down 21% with Glenn Beck taking the biggest hit.

Tuesday, March 29, 2011

Tax the Super Rich now or face a revolution

Think 2008 was bad? Paul B. Farrell says that was nothing compared with what lies in store if we don't start taxing the super-rich:


Warning: The Super-Rich Delusion has pushed us to the edge of a great precipice: Remember the Roaring Twenties? The Crash of 1929? Great Depression? Just days before the crash one leading economist, Irving Fisher, predicted that stocks had “reached what looks like a permanently high plateau.”

Yes, he was trapped in the “Great Gatsby Syndrome,” an earlier version of today’s Super-Rich Delusion. It was so blinding in 1929 that the president, Wall Street, all America were sucked in … until the critical mass hit a mysterious flash point, triggering the crash.

Yes, we’re reliving that past — never learn, can’t hear. And oddly it’s not just the GOP’s overreach, the endlessly compromising Obama, too-greedy-to-fail Wall Street banksters, U.S. Chamber of Commerce billionaires and arrogant Forbes 400. America’s entire political, financial and economic psyche is infected, as if our DNA has been rewired.

The Collective American Brain is trapped in this Super-Rich Delusion, replaying the run-up to the ’29 Crash.


Worth reading.

Newt Gingrich joins the raving loon club

Newt is afraid of "a secular atheist country ... dominated by radical Islamists."

Do Republicans stop to think at all about what they are saying, or do they just robotically spout buzzwords? Atheists! Terrorists! Muslims! (Oh My!)

Good grief.

Two more ways Obama is like Bush

I've said before that Obama is becoming increasingly indistinguishable from George Bush in terms of his policies. Add two more data points: Obama has now started his own war in the Middle East with an unclear mission and justification, and he has suppressed scientific research:


"The Obama administration has issued a gag order on data over the recent spike of dead dolphins, including many stillborn infants, washing up on Mississippi and Alabama shorelines, and scientists say the restriction undermines the scientific process."

...

One biologist involved with tracking dolphin mortalities for over 20 years and speaking on the condition of anonymity, told Reuters that: “It throws accountability right out the window. We are confused and ... we are angry because they claim they want teamwork, but at the same time they are leaving the marine experts out of the loop completely.”

Some scientists said they have received a personal rebuke from government officials about “speaking out of turn” to the media over attempts at determining the dolphins’ deaths.


Excuse me? Speaking out of turn? What does that even mean? Since when have scientists had to wait their turn to speak to the media? Is this the U.S.A. or the U.S.S.R.? Honestly, it's starting to get a little hard to tell the difference.

Monday, March 28, 2011

The NYT paywall: a digital train wreck in slow motion

I believe in supporting quality independent journalism, so it's painful to watch the digital train wreck that is the New York Times paywall. They've apparently spent $40 million on it, and not only is it badly broken, but it has a fundamental design flaw that makes it trivial to work around.

But that's not the worst problem.

The worst problem is their pricing. It's deceptive, and deeply hidden in their subscription page, which proudly touts 99-cent teaser rates but hides the actual underlying rates under a tangle of links and redirects. Here are the actual weekly rates for an electronic subscription:

Web + smartphone: $3.75
Web + tablet: $5.00
Web + tablet + smartphone: $8.75

And here are the rates for home delivery, which come with "free all-digital access" (i.e. the third option above):

7 days a week: $7.40
Friday-Sunday: $5.20
Sunday only: $3.75
Monday-Friday: $3.75

How is this pricing model broken? Let me count the ways.

1. There is no web-only option. If I have neither a smartphone nor a tablet I have to pay for one of those services regardless.

2. The cost of Web+tablet+smartphone is the sum of web+smartphone and web+tablet. So if I want all three, I have to pay for web access twice. Unless...

3. The cheapest way to get all three is to subscribe to the Sunday Times. The NYT in effect will pay me $5.00 a week to take a paper copy of the Sunday Times and throw it in the recycling bin.
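To make the perversity concrete, here's the arithmetic spelled out, using the weekly rates listed above:

```python
# Weekly rates quoted above
web_smartphone = 3.75
web_tablet = 5.00
web_tablet_smartphone = 8.75
sunday_home_delivery = 3.75   # comes with "free all-digital access"

# Point 2: the full digital bundle costs exactly the sum of the two
# partial bundles, so web access is effectively billed twice.
assert web_smartphone + web_tablet == web_tablet_smartphone

# Point 3: Sunday home delivery is the cheapest route to all-digital access.
weekly_diff = web_tablet_smartphone - sunday_home_delivery
print(f"${weekly_diff:.2f}/week")        # $5.00/week
print(f"${weekly_diff * 52:.2f}/year")   # $260.00/year
```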

A more perverse set of incentives is hard to imagine. Like I said, I believe in supporting quality independent journalism. I would happily pay to read the NYT on line if I felt that the price I was being charged is fair. But I just want to read the Times on the web, not my phone, and I don't own a tablet. I don't want to pay $250 a year for the privilege of not having to throw out a three-inch-thick stack of newsprint every week.

And I'm also pretty leery of supporting an organization that can't figure out how to build a freakin' paywall for less than forty million dollars.

Friday, March 25, 2011

The climate wars have begun

The Marshall Islands have fired the first shot in what I predict will eventually become a global conflict: instead of sitting idly by and watching while their country sinks beneath rising sea levels caused by global climate change, the Marshall Islanders have taken the initiative by legalizing cocaine in order to attract tourists and raise hard currency.

Good for them.

Now, I don't want to be misunderstood. I am not advocating the use of cocaine. Cocaine -- especially crack cocaine -- is a horrible scourge. But the fallout from making cocaine and other narcotics illegal is an equally horrible scourge. It does nothing but create a market opportunity for those willing to take the risk of breaking the law, and a barrier to entry for those unwilling to do so. Making cocaine illegal is nothing more than an indirect tax subsidy for drug dealers, one the Marshall Islanders have decided to avail themselves of. And they arguably deserve it:


The Marshall Islands were the site of nuclear testing prior to independence from the United States, and Marshallese claims against the US in regard to the fallout from the atomic testing are ongoing.


Don't buy your plane tickets just yet though. This is almost certainly nothing more than a bluff to try to draw attention to their plight and maybe get a bit of leverage with the international community. But I predict that before long (10-20 years) we'll see other island nations trying similar tactics as the gravity of their situation becomes apparent. It's going to get ugly. Civilization depends on at least the perception of everyone having something to gain by playing by the rules. When your country is sinking beneath the waves, that proposition becomes a very tough sell.

Monday, March 21, 2011

It's official!

My movie, But for the Grace of God?, a feature-length documentary about homelessness, is premiering at the Oakland Film Festival on April 15! There's more information and a trailer on the web site. If you live in Northern California I would love to see you there.

I'm not going to write much about the film here (I'm setting up a separate blog for that) so if you want updates please sign up for the mailing list on the film web site or subscribe to my twitter feed (@rongarret).

Sunday, March 20, 2011

An exceptionally good summary of what's going on at the Japanese reactors

In case you were wondering. Bottom line: the reactors worked exactly as they were designed to. These are forty-year-old reactors; they experienced one of the worst natural disasters in recorded history, and while they have been damaged, there has been no danger to public health. And the most likely long-term danger to the public is that everyone will freak out over nuclear energy, which will hamper efforts to control carbon emissions.

Friday, March 18, 2011

The NBC Nightly News Drinking Game

Want to get smashed in 30 minutes? Then pull up a bottle of tequila and the NBC Nightly News with Brian Williams and take a shot whenever Brian says any of the following words or phrases: "Massive", "As Always" or "Our very own." I guarantee you will not be able to walk a straight line at the end of this exercise.

[UPDATE - 3/23] - I've actually noticed that ever since I posted this, Brian seems to be using these phrases a lot less. I wonder if maybe someone at NBC reads my blog? :-)

Friday, March 11, 2011

What Kind of Sick Culture Blames an 11-Year-Old for Being Gang-Raped?

When I first saw that headline I assumed they were talking about some backwards tribal culture in a Muslim country. But no, this happened (actually, is happening) in Texas. Guess I was wrong about the Muslim part.

Wednesday, March 09, 2011

A bumpy ride through the moral landscape

Sam Harris takes a valiant whack at the dragon (or is it a windmill?) of moral relativism in his new book, "The Moral Landscape." Harris argues that, contrary to Hume, one can derive "ought" from "is", at least to a first-order approximation, by starting with the premise that morality is that which advances the interests of conscious beings.

One of my mentors in graduate school once told me that it is worth paying attention to what smart people have to say even -- perhaps especially -- when they are wrong, because they are usually wrong in interesting ways. "The Moral Landscape" is a perfect example. It's wrong, but it is wrong in a very interesting way.

Let me say up front that I have tremendous sympathy for Harris's agenda. I wish it were true that one could derive "ought" from "is" (and I think it might be possible, but it's much, much harder than Harris -- or any of the new atheists -- seem to recognize). I do accept Harris's premise that maximizing utility for conscious beings is not only a reasonable foundation for such an endeavor, it is the only possible reasonable basis for it.

Unfortunately, between Harris's premises and his (predictable) conclusion that religion is the root of all evil (his Introduction has a section prominently entitled "The Problem of Religion") lies a whole host of tacit biases and assumptions that render his reasoning circular. Harris defines the problem of morality as maximizing some utility function with respect to consciousness (which is perfectly fine), but then he goes on to assume without any foundation (and, worse, without being explicit about it) that this quality metric should have certain characteristics. Like the assumptions of erstwhile provers of Euclid's fifth postulate, the assumptions he makes appear intuitively obvious. But if science teaches us anything, it is that what is intuitively obvious is often wrong.

Harris's argument runs off the rails almost from the very beginning. This is taken from his introduction:


For my argument ... to hold, I think one need only grant two points: (1) some people have better lives than others, and (2) these differences relate, in some lawful and not entirely arbitrary way, to states of the human brain and to states of the world. To make these premises less abstract, consider two generic lives that lie somewhere near the extremes on this continuum:

The bad life

You are a young widow who has lived her entire life in the midst of civil war. Today, your seven-year-old daughter was raped and dismembered before your eyes. Worse still, the perpetrator was your fourteen-year-old son, who was goaded to this evil at the point of a machete by a press gang of drug-addled soldiers... Since the moment you were born your world has been a theatre of cruelty and violence. You have never learned to read...


I won't quote the whole thing; you get the idea. The good life, on the other hand, I reproduce here in its entirety because the details matter:



The good life

You are married to the most loving, intelligent and charismatic person you have ever met. Both of you have careers that are intellectually stimulating and financially rewarding. For decades, your wealth and social connections have allowed you to devote yourself to activities that bring you immense personal satisfaction. One of your greatest sources of happiness has been to find creative ways to help people who have not had your good fortune in life. In fact, you have just won a billion-dollar grant to benefit children in the developing world. If asked, you would say that you could not imagine how your time on earth could be better spent. Due to a combination of good genes and optimal circumstances, you and your closest friends and family will live very long, healthy lives, untouched by crime, sudden bereavements, and other misfortunes.


Surely it is obvious that the Good Life is preferable to the Bad Life in every way? Well, alas, no it is not. It is certainly preferable from the point of view of an affluent Western academic, which both Harris and I happen to be, so I can certainly understand the appeal. But it is not true that this need be the case for all conscious beings, or even for all rational conscious beings. But Harris dismisses this possibility out of hand:


Anyone who doesn't see that the Good Life is preferable to the Bad Life is unlikely to have anything to contribute to a discussion about human well-being. Must we really argue that beneficence, trust, creativity, etc. enjoyed in the context of prosperous civil society are better than the horrors of civil war endured in a steaming jungle filled with aggressive insects carrying dangerous pathogens?


Well, yes, we must. Hidden in the trees of horrific detail is the forest that makes Harris's Bad Life preferable to his Good Life for many people: the woman in the Bad Life scenario (one wonders if Harris considers being a woman to be a salient characteristic of the Bad Life) has children while the person (notably with gender unspecified) in the Good Life scenario doesn't (or, if s/he does, they don't figure prominently in Harris's reckoning.)

Now, I do not mean to suggest that any rational person would choose the totality of Harris's Bad Life over his Good Life. I merely point out that Harris's quality metric is heavily prejudiced by the fact that he is an affluent Western academic male. Money, in particular, figures very prominently. He mentions it three times. It is particularly noteworthy, I think, that helping the poor unfortunate children in the developing world is done with a billion dollars of other people's money rather than your own.

Let us give Harris the benefit of the doubt and assume that he is simply ignorant of the evidence that providing financial assistance to developing countries does more harm than good and that his heart is actually in the right place. But look at where he puts the emphasis: helping other people is not good because of the benefit it provides to others, but because of the personal satisfaction that it provides to the benefactor. The Good Life is not good because you are good, it is good because you feel good. You are free of pain and want, and on top of that you get to bestow a billion dollars of largesse on some poor unfortunate urchins without compromising your standard of living. That sounds good to me because I am a member of Harris's demographic. But I wonder how it sounds to the urchin.

(If you still doubt this point, let me add just one sentence to Harris's Bad Life: "Because of your suffering, the attention of the world's media has been drawn to the plight of your people, and years after you are dead millions will be living better lives because of your sacrifices." And another to the Good Life: "Unfortunately, though you are blissfully unaware of it, the money you have given out to third world countries has ended up in the pockets of corrupt dictators and the net result is that you have made the lives of millions of people worse, not better." Now which life is the Good Life and which is the Bad Life?)

Again, my point here is not to argue that Harris's Bad Life is superior to his Good Life, only to plant a seed of doubt that the superiority of every aspect of Harris's Good Life is beyond question. Unfortunately, even this small seed of doubt undermines Harris's entire agenda. The problem with applying science to morality is that it requires you to choose a quality metric from a complex space with multiple incommensurate dimensions. Even as simple a premise as, say, all else being equal it is good to minimize physical pain is open to rational doubt: it may well be that a certain amount of physical pain is necessary to psychological well-being (as measured according to some other quality metric). Maybe people who never experience any physical pain end up being so risk-averse that they become dysfunctional cowards. I really enjoy my affluent lifestyle, but I really wonder if I'm going to be up to the challenges that are going to come our way when, say, the planet's reserves of crude oil start to run out. (Or, what ought to be even more frightening, phosphorus.)

The fundamental problem is that "the interests of conscious beings" is not well defined. What exactly are those interests? To exist? To exist free of pain? To exist at some balance of pain and comfort that maximizes some other ineffable quality like "self-fulfillment" (whatever that might mean)?

The ultimate irony is that the reason the interests of consciousness are not a coherent basis for morality (or anything else, for that matter) is precisely that consciousness was created by evolution and not by God. Consciousness exists not because it is the cosmic destiny of the universe, but rather because, like all other complex things, it has survival value -- but not for itself. Consciousness is not an end, it is a means. Consciousness exists because it provides a powerful motivator for an entity afflicted by it to keep itself -- and hence its genes -- alive. Wealth and physical comfort feel like wins because up to a point they increase reproductive fitness. But as soon as it gets to the point where consciousness starts to value things like "self-fulfillment" over having children, trouble begins. The interests of that sort of consciousness are no longer aligned with those of its creator.

This is why Harris's program is almost certainly doomed to fail. Advancing the interests of consciousness will not lead to a planet full of humans singing kumbaya in blissful conscious harmony because that's not what consciousness is for. Consciousness exists to make us care about making sure our children stay alive long enough to have children of their own. Our children. Not someone else's. Of course, the situation is complicated by the fact that the boundaries between "ours" and "theirs" are fluid and can change opportunistically (because that kind of flexibility also has survival value). But there is nothing in the laws of physics that says those lines should be drawn around a single species, or even a single mental attribute like consciousness.

I wish things were different. I really do. I would like nothing more than to be able to preach the Gospel of Sam and so help to bring peace and harmony to the world. But Sam Harris of all people should be able to sympathize with someone like me who has a limited ability to suspend disbelief, so I trust he will forgive me.

Is the Square reader a security hole?

Verifone took a swipe at Square today, saying that the Square credit card reader, which plugs into an iPhone headset port and lets anyone accept credit card payments, is a security hole. Are they right?

Yes and no. Yes, it is possible to use the Square reader to steal credit card information. But no, the Square reader does not make the existing credit card security situation appreciably worse than it already is.

Credit cards are basically 1950's technology, and their security model is fundamentally broken for on-line transactions. Back in the 1950's when credit cards were invented, the security model was that you had to physically present the card to the merchant, who created a physical imprint of the card using a mechanical device. The consumer then signed the imprint. This made the security model essentially the same as that for checks: you had a physical token (the check or the card imprint) and a signature. Perpetrating credit card fraud was about as hard as perpetrating check fraud. You had to produce a physical artifact (a fake check or a fake credit card) and forge a signature. That was a high enough bar that fraud was rare by today's standards.

The descent from the halcyon days of the 1950's to today's chaos happened very gradually. Although finding documentation for this is probably very hard, the first step was almost certainly the result of merchants dealing with mechanical failures in the card imprint machines by writing down the credit card number on the sales slip by hand. Then one day a merchant gets the bright idea that because they can write the number down by hand, they can accept orders over the phone. In the 1960's, magnetic stripes were added to cards, which allowed the entire end-to-end process of processing a credit card transaction to be computerized. This was a big win for efficiency, but in the process it completely eliminated the two features of credit cards that provided security: the physical imprint and the signature. The result, predictably, was a dramatic increase in fraud.

The fundamental problem with credit cards for on-line transactions is that, by definition, an on-line transaction can involve only the exchange of information, not any kind of physical token. But the information that you have to give to a merchant in order to conduct one transaction is the same information that is needed to conduct an arbitrary number of transactions.

The credit card industry has responded to this situation with breathtaking naivete. A number of "security" measures have been added over the years, but they all amount to minor variations on one of two themes: 1) requiring additional information to conduct a transaction (expiration date, billing address, CVV code), and 2) running sophisticated pattern-recognition algorithms to try to detect fraudulent activity. Neither of these measures is even remotely adequate for the task. As long as the information needed to process a transaction is the same for every transaction, it doesn't matter how much of it there is; a fraudster can easily acquire this information (whatever it is) simply by posing as a legitimate merchant, which is trivial to do on the web. And heuristic fraud detection helps, but it will always have both false positives and false negatives. The result is a horrifically inefficient and fraud-prone system. The Square reader does make it slightly easier to perpetrate credit card fraud: now a fraudster can scan the card instead of, say, taking a photo of the front and back. But letting a fraudster copy a card in two seconds instead of six is unlikely to have even a detectable impact on current fraud levels.

The credit card companies could easily solve this problem by deploying smart cards with embedded processors that use cryptographic techniques to produce tokens that are unique to a particular transaction. This would all but eliminate credit card fraud overnight. Why don't they do it? That's a good question. The honest answer is that I don't know, but I strongly suspect that it's because the card companies are not the ones feeling the pain. The cost of fraud is substantial, but it's just fobbed off onto the merchants in the form of ridiculously high transaction fees, chargebacks, and rules that prevent the merchants from passing these costs on to the customers. The merchants are a captive audience because consumers, understandably, insist on paying with cards, blissfully ignorant of the fact that billions of dollars are being silently funneled out of their pockets and into the coffers of fraudsters and banks.
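To illustrate the idea of a token that is unique to a particular transaction, here is a minimal sketch. Everything in it is hypothetical (the names, the message format, the use of HMAC); real smart-card schemes are considerably more involved. But the principle is the same: the card signs each transaction's details with a secret that never leaves the card, so a captured token is worthless for any other transaction.

```python
import hashlib
import hmac
import os

def card_sign(card_secret: bytes, merchant_id: str, amount_cents: int, nonce: bytes) -> str:
    # This would run inside the card's embedded processor;
    # the secret key never leaves the card.
    msg = f"{merchant_id}|{amount_cents}|{nonce.hex()}".encode()
    return hmac.new(card_secret, msg, hashlib.sha256).hexdigest()

card_secret = os.urandom(32)   # provisioned into the card by the issuing bank
nonce = os.urandom(16)         # fresh per transaction, e.g. supplied by the terminal

token = card_sign(card_secret, "merchant-42", 1999, nonce)

# The issuer, which also holds the secret, recomputes and verifies the token...
assert hmac.compare_digest(token, card_sign(card_secret, "merchant-42", 1999, nonce))

# ...but the same token is useless for any other amount, merchant, or nonce.
assert token != card_sign(card_secret, "merchant-42", 2999, nonce)
assert token != card_sign(card_secret, "merchant-99", 1999, nonce)
```

A stolen token here reveals nothing reusable: without the card's secret, a fraudster cannot produce a valid token for any new transaction, which is exactly the property that a static card number lacks.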

Normally, a situation like this would be ripe for a startup to come in with a better, more efficient, disruptive solution. But there is a huge chicken-and-egg problem: merchants won't want to use a new payment system unless consumers are using it, and consumers won't want to use a new payment system until merchants are using it. So at the moment, unless the banks decide to do the Right Thing (don't hold your breath), we're stuck in this local minimum.

Sunday, March 06, 2011

Did NASA discover bacteria in meteorites (again)?

Did they?

No.

Worth reading the second link, if for no other reason than to learn what "pareidolia" means (I didn't know).

Tuesday, March 01, 2011

Pardon me while I pick my jaw up off the floor

The Supreme Court just ruled that corporations do not have a right to privacy. And it was unanimous! My worldview lies in shambles. I would have given you long odds that Roberts, Scalia, Thomas, and Alito would rule in favor of corporations at every possible turn. Maybe there is some hope for the future yet.

Monday, February 28, 2011

The obfuscater of last resort

I never thought I'd find myself agreeing with Antonin Scalia, but I have to join him in decrying the Supreme Court's ruling in Michigan v. Bryant. The Constitution is very clear that "In all criminal prosecutions, the accused [has] the right ... to be confronted with the witnesses against him." Period, end of story, no exceptions. But the Court has ruled that "because the primary purpose of the interrogation was to enable police to deal with an ongoing emergency, the statements resulting from that interrogation were not testimonial and could be admitted without violating the Confrontation Clause."

Scalia in his dissent writes that the decision "distorts" the Constitution and "leaves it in shambles... Instead of clarifying the law, the court makes itself the obfuscator of last resort." Which, of course, it does. But no more than the proposition that corporations are entitled to Constitutional protections but gays and women aren't. Perhaps Justice Scalia needs to be reminded that as ye sow, so shall ye reap.

Wednesday, February 23, 2011

Another domino falls

This is a big one. President Obama has ordered the justice department to stop defending the Defense of Marriage Act. Wow.

Now if we can just get taxes raised on multimillionaires I can retire from politics.

Wednesday, February 16, 2011

Why Isn't Wall Street in Jail?

Matt Taibbi plumbs the depths of Wall Street and Washington corruption. Long, but as usual worth reading all the way through. The U.S. is not quite yet a banana republic but it's getting too close for my comfort. What has the world come to when China is the leader in innovation, Russia wins the space race by default, and Egypt sets the standard for government of the people, by the people and for the people?

Republicans are hypocrites

Yeah, I know, tell you something you didn't already know. But this is pretty brazen even by Republican standards: the Democrats introduced an amendment to the Patriot Act that would require government officials to follow the Constitution. All but two Republicans voted against it.

Sunday, February 13, 2011

Glenn Beck is a complete lunatic

This video is proof. In it, Beck spins this bizarre conspiracy theory about how labor unions, communists and radical Islamists are plotting to bring about "fundamental change" in the United States under the rubric of "One Nation." He specifically cites onenation.org as the cyberspace HQ for this nefarious plot.

Trick is, onenation.org is a right-wing web site advocating English-only education. One has to wonder if Beck (or his producers) even bothered to look at the site before deciding to point to it as the bogeyman for the twenty-first century.

No wonder even Bill O'Reilly looked skeptical.

Saturday, February 12, 2011

If you think sexism is dead...

Think again.

Watching C-SPAN is usually about as exciting as watching the proverbial paint dry but I found this hour-long argument about parliamentary procedure to be oddly captivating. In it Congressman Anthony Weiner of New York presses a point of order that the Republicans failed to follow their own rule requiring every new piece of legislation to cite the specific section of the Constitution that gives Congress the authority to enact the proposed law. It's fun watching the Republicans squirm while Weiner keeps asking them which section of the Constitution gives them the power to outlaw abortion and they are, of course, unable to provide an answer.

But what is most striking is the badinage between all the good old boys while Jan Schakowsky of Illinois tries for almost thirty minutes to be recognized to speak while the chairman pointedly ignores her. If you're an American it's really worth taking an hour out of your life to watch the whole thing. This is your government at work.

Good night. And good luck.

The hardest part of getting what you want -- part 3

When I was a kid I went through a new-agey period where I "realized" that the boundaries between "me" and "the rest of the universe" were not as clearly delineated as I had thought, and I spent a few months feeling as one with the Universe. But that is, of course, wrong. Just because the boundaries aren't crisp doesn't mean they aren't there. It is in fact one of the hallmarks of life to draw the line between itself and its environment. The invention of the cell wall arguably marked the beginning of life as we know it. Separating itself from its environment is essential to -- perhaps even the essence of -- life. You cannot be alive without distinguishing (even if not explicitly) between "myself" and "the other."

We humans have multiple layers of boundaries. We are eukaryotes. We draw boundaries even within the confines of our own cells, which glom together to make higher-order boundaries between organs, which glom together to make individual humans.

Because our human consciousness resides as an (apparently) unified whole within a body that is also a physically unified whole with more or less clearly defined boundaries it is tempting to think that the hierarchy ends with those bodies. But this is not necessarily the case. The idea that physically distinct entities ought to be considered logically distinct individual life forms is a prejudice. Ants and honeybees, for example, glom together to form colonies which have more of a claim to the title of "individual living entity" than do individual ants or honeybees. Most individual ants and bees are sterile and cannot reproduce when separated from the colony or hive, just as most of the organs in our bodies exist under similar constraints. So it is a not unreasonable point of view to look at an ant colony as a living individual whose component parts just happen not to be rigidly attached to each other.

Most "higher" life forms cannot reproduce as single individuals. It takes at least two to tango, as it were. And in the case of humans, it takes, as they say, a village. It is an extremely rare human who can survive for more than a few days without the support structure of at least some kind of society. This is why we are social creatures and being alone for too long drives us mad.

This is another thing that the rationalists get wrong. Discussion of rational behavior is invariably predicated on the assumption that the quality metric is bound to an individual human. We speak of rational people but never of rational groups whose group interests may or may not coincide with the interests of the individuals making up those groups. Indeed, self-proclaimed rationalists seem to actively resist even considering such quality metrics, and take it as axiomatic that, for example, having individual humans whose minds believe in objective truth is a good thing. There is no rational justification for this, which makes it all the more ironic.

So you get Lisp programmers who spend all their time honing the arguments for why Lisp is the best programming language, and C programmers who ignore those arguments, write a bunch of ugly code, and take over the world. You get evolutionists who spend all their time honing the arguments for why evolution is true, and creationists who ignore those arguments, tell people that they are special because they are created in the image of God, and take over the world, not because they were correct but simply because the "you are special" meme reproduces better in the mind of a social creature than the "you are alone in a hostile universe" meme does.

Kudos to Egypt

I have to say that I am amazed at the events in Egypt over the last 48 hours. Right now it really does appear as if that country is on the road to the successful conclusion of a (mostly) peaceful democratic revolution. Two weeks ago I would not have thought that possible. I knew there were reasonable Muslims out there, I just didn't think there were so many of them in Egypt of all places. It remains to be seen if the center can hold, but right now it's looking a lot more promising than I would have predicted two days ago.

I think a lot of the credit has to go to the army. I find it hard to imagine that it never crossed Mubarak's mind to use Tiananmen Square as a model for how to resolve the situation in Tahrir Square. Someone high up in the army must have, either implicitly or explicitly, let Mubarak know that was not going to happen on his watch. I really want to know who that was.

Friday, February 11, 2011

Maybe it's just a plain ol' conspiracy after all

Glenn Greenwald on the BofA campaign against Wikileaks:


The U.S. Government's obsession with destroying WikiLeaks has been well-documented. And because the U.S. Government is free to break the law without any constraints, oversight or accountability, so, too, are its "private partners" able to act lawlessly. That was the lesson of the Congressional vesting of full retroactive immunity on lawbreaking telecoms, of the refusal to prosecute any of the important Wall Street criminals who caused the 2008 financial crisis, and of the instinctive efforts of the political class to protect defrauding mortgage banks.

The exemption from the rule of law has been fully transferred from the highest level political elites to their counterparts in the private sector. "Law" is something used to restrain ordinary Americans and especially those who oppose this consortium of government and corporate power, but it manifestly does not apply to restrain these elites.

...

In this world, law does not exist as a constraint [on large corporations]. It's impossible to imagine the DOJ ever, ever prosecuting a huge entity like Bank of America for doing something like waging war against WikiLeaks and its supporters. These massive corporations and the firms that serve them have no fear of law or government because they control each. That's why they so freely plot to target those who oppose them in any way. They not only have massive resources to devote to such attacks, but the ability to act without limits.


Worth reading the whole thing.

Tuesday, February 08, 2011

The hardest part of getting what you want... part 2

I realized from Don's comments on yesterday's post that I gave a mistaken impression. That post was intended to be about politics, not my personal quest for meaning in life. (I am not quite so narcissistic as to think that that is worth blogging about.) I was just couching it in first-person terms to be illustrative. The point I was trying to make was that rationality is in some sense self-undermining. It leads you inexorably to the conclusion that our fundamental nature as humans is a substrate for competing replicators (and even that is just an approximation to the underlying metaphysical truth). This is not a statement about purpose, it's a statement about objective reality, or at least some reasonably accurate approximation of it. And the point I was trying to make was simply that some people can't handle the truth and so they turn to God. I hope I don't have to describe how this has political ramifications.

There is an additional aspect of the underlying objective truth that I hinted at yesterday: "It is not possible to rule out the possibility that there are other replicators resident in ourselves whose nature is not quite so apparent." This alludes to what I dubbed (to my everlasting regret) the "great conspiracy" (a name which turns out, incidentally, to be already taken), the possibility that there are "mega-memes" or "meme-complexes" that are coherent replicating wholes that are distributed across multiple brains. This could account for the fact that, for example, no one seems to be in charge of the protests in Tunisia and Egypt. One might be tempted to argue that this is not evidence for a "mega-meme" but merely the large-scale replication of an ordinary meme of the sort that lives in one brain at a time. The problem with this theory is that it is not readily apparent what that meme might be. The obvious candidate, the freedom-and-democracy meme, is also obviously wrong. When (no longer "if") Hosni Mubarak falls the most likely replacement is the Muslim Brotherhood, and the most likely result of that is Sharia law. Egypt is not trading repression for freedom, it is trading secular repression for religious repression [UPDATE: it seems I may be wrong about this -- see the comments] despite the fact that not a single individual involved in the protests would be willing to concede that.

You can see the same disconnect between individual rhetoric and underlying collective reality in American politics as well. Republicans talk about being fiscally responsible and getting government out of people's lives while at the same time running up record deficits, and going to some rather extraordinary lengths to invade people's privacy when it comes to things like drugs, abortion, and "national security." Democrats talk about ending Republican abuses but then continue nearly every one of those abusive policies. No one on either side of the ideological divide seems to notice the gaping disconnect between rhetoric and reality.

There are at least three plausible explanations of all this. One is that the vast majority of people are simply too stupid to notice that what they are being told is not what is actually going on. The main problem with this theory is that there are an awful lot of apparently smart people falling for a wide variety of flim-flam, both political and not. Say what you will about Karl Rove, he is not stupid.

The second possibility is that there is an actual conspiracy, a shadowy cabal of powerful individuals who are consciously and deliberately manipulating the world's politics for their own benefit. The problem with that theory is: who are these people? I don't see any plausible candidates for the role of puppet master, particularly in the Middle East. In the U.S. I can see an argument to be made that there is a sociopolitical elite that essentially cultivates the general population like a herd of domestic animals to provide them with whatever their avarice desires, but who is pulling the strings in Cairo and Tunisia? The elephant in the living room of the traditional conspiracy theorists is Islam: who benefits from its rise? In the case of Christianity you can always point to the Church, but Islam has no Church, no cache of wealth, no central locus of power. If Islam is a conspiracy, who is conspiring, and to what end? (Yes, I suppose one could point to the House of Saud, but surely the Saudis have nothing to gain by fomenting discontent in Egypt, to say nothing of Yemen.)

The third, and to my mind most plausible, explanation of all this is that individual humans are not in fact the principal actors on the world stage. The political and macroeconomic forces at play in the modern world do not seem to be working for the benefit of the vast majority of individual humans in the world, but they are clearly benefiting, in Darwinian terms, certain meme complexes, with Islam being the poster child, and various forms of corporatism and nationalism running close behind.

It may be that in the deep dark recesses of the world's mosques and membership clubs and private jets is a small collection of imams, politicians and corporate leaders who are living deliriously happy lives at the expense of the rest of us. But I really doubt it. I really believe that very nearly every individual human struggles to figure out their role in the scheme of things, and what we see is the predictable, understandable, and cosmically (or maybe that should be comically?) tragic result.

Monday, February 07, 2011

The hardest part of getting what you want...

Over the weekend I got a very flattering but somewhat bizarre request in the comments of a post I wrote nearly a year ago. Someone who goes by the handle lordbap asked me to write a book about my "take on politics, and hints toward conspiracy theories." It's very gratifying to know that someone out there cares what I think about politics, but writing a book is a non-trivial undertaking, and writing a book about a topic about which I don't actually know very much would probably suck up the better part of a year. So if I were going to take seriously lordbap's request to write about conspiracy theories I would first cast a very jaundiced eye towards the request itself and start to wonder about what sinister ulterior motives might be lurking behind it.

Which brings me to the topic of this post.

In the ongoing battle between the forces of rationalism and faith my natural sympathies tend towards the rational. From my privileged vantage point near the top of the socioeconomic ladder (at least when measured on a logarithmic scale) I can see the positive effects that rationality has on the material well-being of me and my fellow humans. It is natural then to wonder why irrationality (a.k.a. faith) seems to be so resilient. One theory is that faith is an effective palliative against existential angst and other psychological (and even physical) maladies, and it derives its resilience from the same source as other drugs: it's addictive. I still believe that, but in the nearly two years since I first advanced that theory I have come to believe that there is something much deeper going on.

In terms of Darwinian evolution, humans are the caretakers of at least three different kinds of replicators: genes, memes, and, in the last decade or so, computer viruses. (I say "at least" because it is not possible to rule out the possibility that there are other replicators resident in ourselves whose nature is not quite so apparent.) Even leaving aside computer viruses for the moment it is apparent that we serve the reproductive interests of both ideas and DNA. Sometimes those two replicators are symbiotic (like when memes invent antibiotics and sanitation), but not always. Promiscuous sex, for example, can be a very effective reproductive strategy for genes, but having to raise a few dozen kids doesn't leave much time for scholarly pursuits. Memes have since responded by inventing contraception and pornography, which for the moment seems to have given them the upper hand in the eternal reproductive arms race, at least in certain circles.

If one opts to travel the rational road, that realization is, if not the final destination, at least a major truck stop along the way.

Having reached this place, one can for a while bask in the warm glow of understanding, revel in the power of being able to manipulate one's environment, and enjoy the hedonistic pleasures that become available in an industrialized world of material plenty. But after a while one comes face to face with a very thorny problem: having freed oneself from the constraints that guided the existence of one's ancestors one must now make a weighty decision: now what do I do? Hang out on the beach? That gets old surprisingly quickly (at least it did for me). Write? Write what? Novels? Screenplays? Essays? Code? A book? That seems like it could be interesting, but is that really what I want, or is that just the part of my brain that has been parasitized by my memes manipulating me into frittering away the rest of my life on serving their interests? (I actually hate writing because I hate reading the crap that most of what I write turns out to be. But I love having written.)

Then too, I love learning new things, which seems like an honorable pursuit (but again maybe that's just my memes talking?). Maybe I should take a class? (I sometimes toy with the idea of going to law school.) Oh, but it's so much easier and faster to just look things up on the web. In fact, it's so much easier and faster that I can quench the desire to learn, at least for a while, by skimming a few Wikipedia articles. Maybe it's too easy? Wikipedia seems like scholarly pornography -- you get the endorphin rush without having to go to the bother and risk of seducing an actual woman or doing some actual research work.

Write some code? Start a company? So much easier (and more fun) to just take pot shots at the people who are actually doing it (but getting it all wrong of course) and justify it to myself as bestowing the benefits of my wisdom and experience on all those green aspiring entrepreneurs and angel investors out there. And how is that so different from what I'm doing right now? Well, the person I'm taking pot shots at right now is me, and I'm doing it because someone asked me to instead of on my own initiative, but is this really what I want out of life? Am I really so unsure of my own self-worth that I need to cultivate a pack of ass-kissing (lordbap's description, not mine) sycophants to make me feel whole? (Note to lordbap: you asked me not to edit. Careful what you wish for.)

Deep understanding and financial independence are not the panacea they may appear to be. Don't get me wrong, I wouldn't trade my problems for anyone else's. But even nice problems to have can still be real problems.

The hardest part of getting what you want.... is figuring out what it is. And the more options you have the harder that becomes. Which is, I think, why beyond a certain point money really can't buy happiness.

When it comes to choosing a direction in life the rationalist is truly adrift. He has no god to turn to. He cannot blindly follow the dictums of society. He has nothing to rely on but himself, the data, and Bayes' theorem. It works for a while because there are clear immediate needs to attend to that both genes and memes can agree on: vanquishing disease, increasing crop yields. But, like I pointed out earlier, once basic needs are met the agendas diverge. And here the rationalist has no choice but to peer into the abyss, because this is a fundamentally irreconcilable conflict as decreed by none other than Darwin himself.

The reason religion is so resilient is not just that it helps deal with existential angst, it's because it provides a goal. Rationality can provide the tools for making decisions, but it cannot provide a quality metric, at least not for an individual. "What do you want out of life" is a question that only you can answer. The problem with rationality is that it eventually leads you to wonder who -- and what -- "you" are. Is it really "you" who wants to start that business? Write that book? Surf that wave? Fuck that woman (or man)? Vote for that candidate? Argue for that position? Or is it some gene or meme complex that has hijacked your brain for its own soulless purpose? (For that matter, do you really have "free will" to make decisions about what "you" want? Do you even exist, or are you just a butterfly dream or an artifact of quantum decoherence?

It's very easy to get wrapped around the axle over such questions. I think it's a rare human being who has fully grappled with them and not either decided to punt or gone insane. When you gaze into the abyss...

For the other seven billion of us it is very tempting to just hand that kind of heavy lifting over to God and let Him deal with it. And, on those occasions when God isn't available, the Republican Party is there ready to pick up the slack.

Well, lordbap, that's the result of five hours of unedited (except for typos and grammar) writing. Not the twenty you were asking for, but I'm afraid that's all I have in me right now.

Perspective

If you shrank the sun down to the size of a ping pong ball, the earth would be the size of a very fine grain of sand about three yards away.

Jupiter would be the size of a small pea about 15 yards away.

Pluto would be just over 100 yards away, and it would be so small you'd need a microscope to see it.

At this scale, light travels about a foot a minute, and the nearest star is... 400 miles away.

It's a big universe. And it's mostly empty space.
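The arithmetic behind these comparisons is easy to check. Here is a quick sketch, assuming a ball of about 25 mm diameter, which is roughly the scale the figures above imply (a regulation ping pong ball is 40 mm, which would stretch every distance by about 60%):

```python
# Shrink the solar system: scale factor = (ball diameter) / (sun diameter).
BALL_M = 0.025            # assumed scaled sun diameter, meters
SUN_M = 1.3927e9          # actual sun diameter, meters
scale = BALL_M / SUN_M

YARD = 0.9144             # meters per yard
FOOT = 0.3048             # meters per foot
MILE = 1609.34            # meters per mile

earth_orbit_yd   = 1.496e11 * scale / YARD   # about three yards
jupiter_orbit_yd = 7.785e11 * scale / YARD   # about 15 yards
pluto_orbit_yd   = 5.906e12 * scale / YARD   # just over 100 yards

# Light (3.0e8 m/s) slows to roughly a foot per minute at this scale.
light_ft_per_min = 2.998e8 * scale * 60 / FOOT

# Proxima Centauri, about 4.25 light-years away, lands a few hundred miles off.
proxima_mi = 4.25 * 9.461e15 * scale / MILE
```

The exact numbers wobble with the assumed ball size, but the punch line survives any reasonable choice: the nearest star ends up hundreds of miles from a ball you could hold in your hand.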

Monday, January 31, 2011

Obamacare is doomed

SCOTUS will have the final say of course, but this sure sounds like the death knell to me:


"Because the individual mandate is unconstitutional and not severable the entire Act must be declared void."


Now if I could just figure out how to use my gift of prophecy (or is it a curse?) to pick stocks.

Monday, January 24, 2011

Geek corner: on blurring the distinction between code and data

A while back I wrote this as a throwaway comment in a discussion on comp.lang.lisp:


IMHO (one of) the hallmark(s) of "real" programming is a general blurring of the distinction between "compile time" and "run time". Compilation is just one kind of optimization. Running that optimization as a batch job makes it easier to apply, but the real challenge is refining the optimization on a continual basis in response to new information, including changes to the operational spec.


Someone sent me an email asking me to expand on that thought, and I promised I would. It took me a lot longer to render that expansion into words than I anticipated, so I thought I'd put it up here in case others might find it useful.

Writing programs typically goes something like this: First, a specification of what the program is supposed to do is written. Then that specification is rendered into code. The code is then (typically) compiled into some kind of an executable image. That image is then delivered to users who run the program and (again, typically) provide it with input, which we call "data".

This distinction between code and data is purely artificial. On a fundamental theoretical level there is no distinction between the two. All "code" can be viewed as "data" that is fed as input into an interpreter or a compiler, and all "data" can be viewed as a "code" for a specialized interpreter (or compiler) that comprises the application. From the computer's point of view it's all the same: bits go in, bits come out. Whether those bits are code or data is in the eye of the beholder.
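A concrete illustration of the duality, sketched in Python (any language with `eval` or `compile` primitives would do): the same string is inert data one moment and executable code the next, and compiled code is itself an object you can pass around as data.

```python
# A string is plain data...
source = "lambda x: x * x"

# ...until we feed it to an interpreter, at which point it becomes code.
square = eval(source)
assert square(7) == 49

# Conversely, code is just data to the right consumer: compile() treats
# program text as input and produces a bytecode object, which is itself
# data we can store, inspect, or evaluate later.
code_obj = compile("3 + 4", "<string>", "eval")
assert eval(code_obj) == 7
```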

We choose to make the (artificial) distinction between code and data because doing so has benefits. "Programs" can serve to bridge the often severe impedance mismatch between the mental states of typical users and the underlying reality of computational hardware. They can also restrict what a user can do in order to prevent him or her from getting the machine into undesirable states. And constraining what a program does allows optimizations that make the resulting code run faster.

But making this distinction also has drawbacks. There is, obviously, a fundamental tradeoff between writing "programs" according to certain assumptions and constraints (and hence availing yourself of the benefits of those assumptions and constraints) and the freedom and flexibility to discharge those assumptions and constraints. This is the reason that code "maintenance" is considered an activity in its own right. Code doesn't require "maintenance" the way that mechanical systems do. Code doesn't degrade or wear out or require periodic lubrication. What happens instead is that the users of a program come to the realization that what the program does isn't quite what they wanted. There are bugs, or missing features, or parts that run too slowly or consume too much storage. So now you have to go back and change the code to conform to new assumptions and constraints. Often this is more work than the initial development.

The important point is not that these things happen, but that they happen because of an engineering decision, namely, the strong distinction between code and data, and the correspondingly strong distinction between programmer and user. There is nothing wrong with the decision to make this distinction. There are perfectly sound reasons to make this decision. But it is a decision. And because it is a decision, it can be changed. And it often is changed in small ways. For example, a spreadsheet blurs the distinction a little. Embedding a macro programming language like Visual Basic into, say, a word processor blurs the distinction more. Javascript probably blurs the distinction more than anything nowadays. Anyone with a web browser and a text editor has a Javascript development environment.

The line between code and data is blurrier now than it used to be, but it is still quite distinct nonetheless. There is still a strong division of labor between those who write web browsers and Javascript interpreters and those who write Javascript, and also between those who write Javascript and those who typically use web pages. There are still fairly clear distinctions between "scripting" languages, which tend to be easier to use but slow, and "real programming languages," which tend to be harder to use but faster, though this distinction, too, is beginning to blur. But the final merging of code and data, coder and user, compile-time and run-time, is still a ways off, and for a very good reason: it's really, really hard to do. That is what I meant by my original quip.

Whether or not the trend towards blurring the distinction will continue to the point where it disappears entirely is an open question. There are theoretical reasons to believe that a complete blurring might not be possible or even desirable. But the trend is inarguably in that direction.

One of the reasons I like to program in Lisp in general and Common Lisp in particular (and one of the reasons I think CL has had so much staying power) is that it is still the language that most effectively blurs the distinction between code and data, compile-time and run-time. (It doesn't blur the distinction between coder and user because of its abstruse syntax. Like I said, this is a really hard problem.) It's the only language in existence that allows you to change the lexical syntax while a program is running. On top of that, you can get native-code compilers for it. That is a stunning -- and massively under-appreciated -- accomplishment. Suddenly decide you want to use infix notation to write your code? You can do that, and you don't have to stop your already-running program in order to do it. That is mind-blowing. It's incredibly powerful. And, of course, it's dangerous: if you don't know what you're doing you can get yourself into deep trouble. Solving that part of the puzzle is still an open problem.
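Python can't match Common Lisp's trick of altering the lexical syntax of a running program (that requires something like CL's programmable reader), but a rough sketch of the milder idea, compiling and installing new code into a program that is already running, looks like this. The dispatch-table setup is my own illustration, not anything from the original comp.lang.lisp thread:

```python
# A long-running "program": a dispatch table of named operations.
handlers = {"greet": lambda name: f"Hello, {name}!"}

def handle(op, arg):
    return handlers[op](arg)

assert handle("greet", "world") == "Hello, world!"

# While the program is "running", we receive new source text (data),
# compile it (compilation as just another optimization, per the quip
# above), and install it -- no restart, no separate build step.
new_source = "lambda name: f'Goodbye, {name}.'"
handlers["farewell"] = eval(new_source)

assert handle("farewell", "world") == "Goodbye, world."
```

This captures the blurring of compile-time and run-time, but not the danger the paragraph above mentions: in Lisp, redefinitions can reach much deeper, all the way into the syntax the reader itself uses.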

Friday, January 14, 2011

The quantum conspiracy

I gave a talk at Google the other day titled, tongue in cheek, The Quantum Conspiracy: What Popularizers of Quantum Mechanics Don't Want You to Know. It's basically a recap of this paper that I wrote ten years ago. Despite my efforts to enlighten the world, you will still read in the popular press nonsense like, "When an aspect of one [entangled] photon’s quantum state is measured, the other photon changes in response, even when the two photons are separated by large distance."

No, nothing changes when you "measure" an entangled photon. Watch the talk (or read the paper) to find out why. Physicists have known this for decades now. Why does the popular press have such a hard time getting it right?