Umm... did it ever really start?
The White House has silently tripled the number of Web pages that it forbids Google and other search engines from accessing. Is this a bad omen or much ado about nothing?
Those bloggers drunk on hope who desperately wanted proof of Obama's commitment to his campaign promises of transparency and Google Government now face a difficult choice. Either they accept that robots.txt files are not a set of digital tea leaves through which to read the new administration, or, if robots.txt really does carry that weight, they must explain a 200 percent increase in the number of directories blocked by Obama's Web team as anything other than Cheney-esque secrecy.
Simply put, the robots.txt file was created and managed by engineers, not lawyers or policy makers. It is not the place to judge the president on tech policy issues.
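For readers unfamiliar with the mechanism: robots.txt is a plain-text convention, not a legal instrument. A site lists directories under `Disallow:` lines, and compliant crawlers like Googlebot simply skip them. The sketch below uses Python's standard-library `robotparser` to show the effect; the directory paths are invented examples, not the actual Whitehouse.gov file.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt illustrating how directories get blocked.
# These paths are examples only, not the real Whitehouse.gov entries.
ROBOTS_TXT = """\
User-agent: *
Disallow: /includes/
Disallow: /search/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A compliant crawler must skip the listed directories...
print(parser.can_fetch("*", "http://www.whitehouse.gov/search/"))  # False
# ...but remains free to index everything else.
print(parser.can_fetch("*", "http://www.whitehouse.gov/blog/"))    # True
```

Note that blocking a directory this way usually hides boilerplate like search results or server-side includes from search engines; it says nothing about what policy makers intend.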
The president's tech policy should instead be judged on real issues: how many former RIAA and MPAA lawyers will be given positions of power in the administration, who ends up working at the FTC and FCC, and who will be named the new cybersecurity czar.
As for the president's commitment to transparency, he has already violated his pledge to post all nonemergency bills on the Whitehouse.gov Web site for five days before signing them. The text of the Lilly Ledbetter Fair Pay Act of 2009, which was signed into law yesterday, was certainly not posted to Whitehouse.gov for anywhere near five days.
Obama's broken commitment to transparency remains advertised on the White House blog:
One significant addition to WhiteHouse.gov reflects a campaign promise from the president: we will publish all nonemergency legislation to the Web site for five days, and allow the public to review and comment before the president signs it.
It is by looking to these kinds of concrete issues that we can judge the president, not robots.txt.
Continued evidence that The One is not The One.