You might have heard about WordPress turning 10 years old on May 27th. To celebrate, local WordPress communities around the world are having anniversary parties on May 27th, 2013. This includes the greater Salt Lake City, Utah area. The details are at http://www.meetup.com/WordPress/Salt-Lake-City-UT/930892/:
When: Monday, May 27, 2013, 7:00 PM
Where: Sonny Brian’s; 33 East 11400 South in Sandy
If you are coming, be sure to RSVP so that we have an idea of how many people to expect. Bluehost is also giving away 10th Anniversary WordPress t-shirts to the first 30 people who fill out this form on wpslc.com.
It will be a fun time to hang out and chat with other local WordPress fans/users/designers/developers.
Today VaultPress announced the Code Garage migration details. We really wanted to make sure that we had the details and options on this right. I know that migrations like this can often be annoying, so we went out of our way to make the process smooth and inviting.
Code Garage users that migrate to VaultPress will get their first two months on VaultPress free. For those who don’t want to migrate, we’ll refund your last payment.
Even if you aren’t a Code Garage customer, you should go read Peter’s CodeGarage Locker is Migrating to VaultPress post. He gives a personal history of how Code Garage came to be, how it grew, and how it ultimately was sold to Automattic.
Ted Dziuba on The Speculation Trap:
As soon as you accept the invitation to speculation, the boundaries of the hypothetical problem you need to solve are beyond your control. The other person can change the situation as needed to throw you off your balance.
His suggestion for dealing with these kinds of speculative questions is to stay on message.
If you’ve looked at front end performance you’ve probably come across webpagetest.org before. Did you know they have an API for running tests as well? Docs for the WebPagetest.org API are at https://sites.google.com/a/webpagetest.org/docs/advanced-features/webpagetest-restful-apis.
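As a quick sketch of what using the API looks like: the docs describe a `runtest.php` endpoint that takes the URL to test, a response format, and your API key. A minimal request builder in Python (the parameter names follow the docs linked above; the API key value is a placeholder):

```python
from urllib.parse import urlencode

# Build a request URL for the WebPagetest runtest endpoint.
# Parameter names (url, f, k) follow the REST API docs linked above;
# the API key here is a placeholder.
def runtest_url(test_url, api_key, fmt="json"):
    params = urlencode({"url": test_url, "f": fmt, "k": api_key})
    return "https://www.webpagetest.org/runtest.php?" + params

print(runtest_url("https://example.com/", "YOUR_API_KEY"))
```

Submitting that URL kicks off a test run; the JSON response includes URLs to poll for the results.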
The ioprofile tool captures a process’s I/O activity through lsof and strace and summarizes it. The result is a tabular display that shows you where the process spent its time on I/O operations.
The first example in the original ioprofile docs shows the time spent in I/O for each file. Another useful option is to see the byte count of I/O for each file.
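The underlying idea is straightforward to sketch: parse the per-call timings that `strace -T` emits and sum them per file descriptor. A minimal, hypothetical version in Python (the sample lines and the regular expression are illustrative, not ioprofile's actual implementation):

```python
import re
from collections import defaultdict

# Illustrative lines in the shape `strace -T` produces: the syscall,
# its first argument (the fd), and the elapsed time in <...> at the end.
SAMPLE = """\
read(3, "data", 4096) = 4096 <0.000120>
write(4, "log line", 512) = 512 <0.000045>
read(3, "data", 4096) = 1024 <0.000300>
"""

CALL_RE = re.compile(r'^(\w+)\((\d+)[,)].*<([\d.]+)>$')

def io_time_by_fd(trace_text):
    """Sum elapsed syscall time per file descriptor."""
    totals = defaultdict(float)
    for line in trace_text.splitlines():
        m = CALL_RE.match(line)
        if m:
            _syscall, fd, elapsed = m.groups()
            totals[int(fd)] += float(elapsed)
    return dict(totals)

print(io_time_by_fd(SAMPLE))
```

ioprofile does more than this (it uses lsof to map file descriptors back to filenames, and can group by syscall or report byte counts instead of times), but the aggregation step is essentially this.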
Baron Schwartz on TokuDB going open source:
I think TokuDB will rapidly become the storage engine of choice for MongoDB, and could catapult MongoDB into the lead in the NoSQL database arena. This would have profound implications for opensource databases of all flavors, not just NoSQL databases.
It is easy to forget how long it took MySQL to get to where it is today ( and it still has plenty of issues ). When I look at MongoDB I see it following a similar path to MySQL ( and most other software ) to reach a point of critical features, stability, and mass. Improving the underlying storage engine, by adopting an existing open source one, could move it down that path much faster.
I was reading up on encoding issues and came across the chess symbols in Unicode Wikipedia page. I knew Unicode had support for all kinds of things, but chess pieces too? That was news to me.
The Wikipedia entry included a codepoint and HTML entity table, which makes it easy to list out all of the pieces.
White pieces:
♔ ♕ ♖ ♗ ♘ ♙
Black pieces:
♚ ♛ ♜ ♝ ♞ ♟
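The pieces occupy a contiguous run of codepoints, U+2654 through U+265F, so you can also generate the list programmatically:

```python
# The chess symbols occupy the contiguous Unicode range U+2654..U+265F:
# white king through white pawn, then black king through black pawn.
white = [chr(cp) for cp in range(0x2654, 0x265A)]
black = [chr(cp) for cp in range(0x265A, 0x2660)]

print("White pieces:", " ".join(white))
print("Black pieces:", " ".join(black))
```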
I’ve posted the slides and example code from my ‘Site Testing with CasperJS’ presentation at the OpenWest conference.
A PDF of the slides and all of the example code is available at https://github.com/josephscott/casperjs-2013-05.
I’ve posted the slides and example code from my ‘Simple Filesystems with Python and FUSE’ presentation at the OpenWest conference.
A PDF of the slides and all of the example code is available at https://github.com/josephscott/python-fuse-2013-05.
Welcome to May 2013, and snow:
Fortunately snow this late in the season doesn’t accumulate very much, and burns off quickly.
Last week I mentioned the concerns over the $1 sale price of the iProvo fiber network to Google. Well, the Provo Google Fiber project continues to get even stranger than that. From the Daily Herald article on the Provo city council vote:
Curtis also introduced new information and obligations that had not been discussed during the initial excitement of last week. There will be a need to spend some money. For one, the map on where the fiber conduits are actually laid is not available and it may take some guessing at a few locations as to what side of the street the fiber backbone is under. There is also an agreement the city will have control of the fiber to the schools and the city operations. Money has already been set aside from the telecom fund to take care of those needs. An insurance policy will also be needed to protect the city from the unknown. The total cost for city outlay will be approximately $1.7 million.
Emphasis at the end is mine.
A more detailed breakdown was reported by The Salt Lake Tribune:
- $722,000 “for equipment in order to continue using the gigabit service for government operations already using the network, such as the operation of traffic lights and police and fire services.”
- $500,000 “to a civil engineering firm to determine exactly where the fiber optic cables are buried, a requirement by Google”
- $500,000 “for an insurance policy to help mitigate any possible legal damages should Provo’s network not be presented to Google as promised”
Of course Google is paying Provo $1 for the network, so the real cost to Provo for selling their existing fiber network to Google is only $1,721,999. Still a fair bit of money to pay someone to take an asset off your hands.
Then there is the issue of not even knowing where all of the fiber in the ground actually is. Didn’t they have to file permits with the city when they installed it in the first place? If they moved it later on, wouldn’t that require getting permits from the city as well? For something that they paid $39M for, I would have thought they would keep a closer eye on it.
The Salt Lake Tribune also reported numbers on how much is still due on the original $39M in bonds:
With interest, taxpayers still have to pay $3.3 million in bond payments per year for the next 12 years.
For a total of $39.6M that Provo will have paid out over the next 12 years.
This story may still have a happy ending. If Google Fiber in Provo blossoms into everything it could be, then all of this may have been worth it.
The money thing doesn’t really freak me out though. What really freaks me out is that the city of Provo has connected “operation of traffic lights and police and fire services” to the same fiber network that connects to the Internet. That strikes me as a really, really, really bad idea.
I came across openclipart.org while looking for some icons to use in an upcoming presentation. Their default ( and suggested ) method of licensing is public domain.
It took some digging around, but I did eventually find some decent quality graphics that I was able to use in my slides.
Brython, Python in your browser:
That is a trip. Practical? Probably not. Definitely neat though.
The brython.js code that does all the heavy lifting is only 175,936 bytes.
We can find these historic details because links have at least a provisional permanence to them. They are, literally, paths to locations. Thanks to those, we can document the history we make, and learn from it as well.
- Doc Searls on Why Durable Links Matter
The Internet in general, and the web specifically, have given people an unprecedented power to both record and erase history.
Last week I mentioned that Google Fiber was coming to Provo. This process would be accelerated because they were purchasing the existing fiber network, iProvo, instead of having to build out a completely new one.
Turns out the sale price of iProvo fiber network to Google is $1. It should come as no surprise that some people are less than thrilled about this price.
The original cost to build out the iProvo fiber network is reported at $39 million. To fund this the city of Provo issued bonds. Those bonds have not yet been paid in full, and the $1 in revenue from the sale isn’t going to pay them off. In an effort to help pay off the bonds, a surcharge was added in 2011 to the utility bills of every household in Provo.
While most people seem happy to have Google coming in to upgrade and run the fiber network in Provo, others are concerned that the sale of the network does nothing to address the remaining millions of dollars in bond obligations.
Both sides are benefiting from this, but the $1 sale does feel a bit odd. I’d assume the existing fiber network has some value, less than the original $39 million and more than $1. Perhaps something on the order of $5 to $10 million would have been reasonable. Google would still have been given a great deal on existing infrastructure, and the city of Provo would have some money to pay down the bonds faster.
In the negotiations for this I’m sure Google held all the cards. The city of Provo was likely happy to bend over backwards and into a pretzel to make sure Google didn’t walk away from the deal.
The next meeting is tomorrow night ( April 24th ), where Matt Jones will be talking about e-commerce with WordPress.
The point is, some days you have to drop down into tcpdump and pcap files to get the truth.
- Steve Souders while tracking down how much HTML5 video data iOS actually receives.
Turns out server logs were giving an incorrect picture of the amount of data that was being transmitted. It took watching individual packets to track down what was really happening. Go read the post for the odd twist at the end.
Reading this reminded me of using Wireshark to track down multiple packet issues in HTTP POST requests from XMLHttpRequest.
The big announcement in Utah yesterday was that Google Fiber is coming to Provo. Instead of building out completely new infrastructure, Google will be purchasing the existing fiber network in Provo. From there they will be upgrading it to be in line with other Google Fiber installs.
I live about a 30 minute drive north of Provo, so this won’t make it to my home. Though Comcast recently upped my connection to 50Mbps down, so my current connection isn’t horrible. I wonder if Comcast knew this announcement from Google was coming and wanted to preempt some of the complaints from users.
The other thing that I wonder about is the NSA datacenter a short hop north of Provo. Was this a motivating factor for either side? No idea. Would be nice to know if Google Fiber in Provo ends up peering with the NSA data center.
Speaking of peering, there are a few others in the area that would probably love to have a network peering arrangement with Google Fiber in Provo. C7 has a data center in Bluffdale ( north of Provo and not far from the NSA data center ) that apparently has a hefty number of Twitter servers. Bluehost has a data center in Provo. Those are the two I could think of off the top of my head; there are other data centers in the area that would be interested as well.
This is just the beginning of Google Fiber in Provo, and even though I won’t have access to it, I’ll be watching closely to see how it develops.