Tuesday, March 14, 2017

25th Iron Horse - Republished from 1996

When my old high school friend Todd Barbey invited me to his horse ranch in Durango, Colorado, to participate in the 25th running of the Iron Horse Classic, I thought, "Hey, that would be fun." Two and a half hours into the 47-mile, 5,800-vertical-foot race, negotiating 3 inches of snow over the 10,000-foot Coal Bank Pass, I had a different impression.

Having spent six years in the Swiss Alps, I should have known better and been more prepared. But hey, a 47-mile race isn't that far, and with all that climbing I sure wasn't going to weigh myself down with heavy things like a warm rain jacket, winter gloves and booties. But that's exactly what I would have needed, and more (a tent would have been good), to finish safely. When I was finally rescued 8 miles from the finish, there were 3 inches of snow covering my abandoned bike. Anyway, all's well that ends well, and Todd and I will have a good story to tell.

The race is truly an American classic. It started 25 years ago when two brothers, one a cyclist and the other a conductor on the Durango–Silverton train line, challenged each other to a race on the opening day of the train line. On that first running the train won, but the younger brother, the cyclist, eventually prevailed, beating the train in the 3rd running. Since then the race has grown steadily, and the 25th running had well over 800 riders.

This year's winner, Jonathan Vaughters, set the course record and broke the 2-hour barrier. He simply had a superior level of fitness, having just completed the Tour DuPont, where he finished 16th in the GC.
I think I was headed for a 3:10 time, which would have put me in the top twenty of the 45+ class. The winner in that group came in at 2:17!!!

The perennial favorite, Durango's own Ned Overend, holds the most Iron Horse wins with 4 victories, and even at the age of 41 he took 2nd, also breaking the old course record. The following is excerpted from the Iron Horse Web site.

The day began as a perfect racing day in Durango with overcast skies and a cool 40 degrees. By the time the winners reached the finish in Silverton it was snowing. Jonathan Vaughters from Englewood, CO was the winner breaking the previous Iron Horse record by a whopping four minutes. At the finish Vaughters confessed, "I actually thought I was playing it conservatively". Yeah, right! Second place went to Ned Overend of Durango who was four minutes behind Vaughters.

The day began with the parade through Durango, pacing with the train. The race didn't actually begin until the racers reached the Iron Horse Inn, just north of town. A 30-bike pileup occurred shortly after that, but the racers had the advantage of a 20-30 mph tailwind coming up the Animas Valley.

The high point of the race (literally) is Molas Pass, just under 11,000 ft. By 11:30 there were 3 inches of snow, forcing race officials to stop the race for hundreds of riders further back and evacuate them off the passes. Most who finished had to brave hypothermic conditions for the last 15 miles of the race. As is often true of spring in Colorado, the weather was downright pleasant by 1:30.

Tuesday, November 29, 2016

How to Get Started with Zwift


I'm glad you're considering Zwift.  It's a fun training tool and a way to meet like-minded cyclists from around the world.  I've been riding in Zwift since March 2016, and I've ridden in the virtual world with hundreds of people from Australia to Japan to Africa and all over Europe and the US.  Zwift is very social, and it's easy to communicate with the riders next to you via text messages in the game.  As with any video game, the more you ride, the more cool bikes, wheels and kit you unlock to dress up your avatar. If you climb 164,042 feet (50,000 meters) you unlock the ultimate fastest bike, called the Tron. It will take the better part of a year to get there!  Here's a picture of me and Meylina on our Trons.

Communicating with riders after your ride is possible because you save your rides to Strava, and Strava shows who you rode with. Knowing that, you can start a conversation by commenting on their activity.  Zwift creates a .fit file (just like a Garmin) that contains real GPS coordinates, which makes it possible to apply all the Strava analytics to your rides.  Here's a picture of my screen in a group ride.

You can also ride with friends in the virtual Zwift world no matter where they are in the physical world.  I have done many rides with my brother who lives in Florida and friends I have met in Zwift from New Mexico, Texas, Connecticut, England, France, Canada and Australia.

There are all kinds of group rides and races, which make it even more fun. In Zwift all the dynamics of drafting, climbing and descending work just like in real life.  It makes for great motivation to see a rider just ahead and put the hammer down to catch them and get in their draft. I find I ride longer and harder indoors than I ever did before Zwift, when the indoor trainer was a necessary evil.  Now I'd rather ride in Zwift with over 800 people than outside by myself. Here is an example from a recent Zwift race showing my Suffer Score.  That's 142 of 145 in the RED!


Your speed in the game is based on the power you're generating relative to your weight, or watts/kg.  The more watts/kg you're generating, the faster you'll go.  Using this formula for speed replicates the cycling dynamics of real life.  Heavier riders will need to put out more watts on a climb than lighter riders.  Similarly, heavier riders will have an advantage on the descents.
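Here's a rough sketch of why watts/kg dominates on climbs (my own simplification, not Zwift's actual physics engine, which also models aero drag and rolling resistance): on a steep grade nearly all your power goes into fighting gravity, so climbing speed is roughly proportional to watts per kilogram.

```python
# Simplified climbing model: ignoring aero drag and rolling resistance,
# almost all power goes into lifting rider + bike against gravity.
G = 9.81  # gravitational acceleration, m/s^2

def climbing_speed_kmh(watts, total_kg, gradient):
    """Approximate steady-state climbing speed in km/h.
    gradient is the slope as a fraction, e.g. 0.10 for a 10% grade."""
    v_ms = watts / (total_kg * G * gradient)  # from power = m * g * grade * v
    return v_ms * 3.6

# Two riders holding the same 250 W on a 10% grade:
light = climbing_speed_kmh(250, 65 + 8, 0.10)  # 65 kg rider + 8 kg bike
heavy = climbing_speed_kmh(250, 90 + 8, 0.10)  # 90 kg rider + 8 kg bike
print(round(light, 1), round(heavy, 1))  # the lighter rider climbs faster
```

On the flats and descents the picture reverses, because aero drag (which this sketch leaves out) matters far more than gravity there.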

Here's a video of me climbing the mountain in Zwift showing the different views you can switch between.




How to Get Started

A Zwift setup includes:
  • Your bike
  • A stationary trainer 
  • A computer (laptop, PC, iPhone or iPad) to run the Zwift application and communicate with your trainer and bike via ANT+ or Bluetooth.  You download the app from Zwift.com, and the first month is free.  After that it is $10 a month.
Optionally, you may also consider:
  • A large screen TV or monitor for a more immersive experience
  • A cadence sensor to accurately reflect the cadence of your avatar
  • A heart rate monitor to know how hard you're working
  • The iPhone mobile app, which also lets you control the game and look up rider profiles during the ride.  It also has the schedule of events and lets you join them in advance.
You will want to pick a good place for your Zwift "pain cave."  The setup should offer access to your keyboard to control the game (I use a separate Bluetooth keyboard) and a fan to keep you cool (I actually use 3 fans, one in front and one on each side).

Here is a picture of my setup in the garage.

Here are some suggestions for your setup.

Your Bike
It is most convenient to have a bike dedicated to Zwifting; this greatly reduces the time it takes to get going.  Your bike should be a road bike with 700c wheels.  You may also consider mounting a trainer tire on the back wheel.  These tires are a lot quieter and last longer than standard road tires.

Your Computer
Most late-model computers (Windows or Mac) will be able to run Zwift, which requires a 64-bit OS.  Zwift is a video game, so high-performance graphics enhance the experience.  You'll also need an ANT+ USB stick that plugs into your PC or laptop; I use the Garmin one: ANT+ USB Stick.  If you have a computer, you can test it out by downloading Zwift and running it without any other equipment. Zwift has recently added support for the iPhone and iPad, which communicate only via Bluetooth, which means they only work with smart trainers that support Bluetooth.

The Trainer
The least expensive option is a Zwift-supported "dumb" trainer with a speed sensor on your back wheel.  A dumb trainer is one that does not communicate directly with the game.  With a dumb trainer you communicate with Zwift by transmitting the speed of your wheel; Zwift estimates the power you are putting out based on the speed of your wheel and the power curve of the trainer.  Here is the setup I started with: the Travel Trac Comp Fluid.  I think this is the best least-expensive option. For a speed sensor I use the Garmin Speed Sensor Link.
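A minimal sketch of how that estimation works (the calibration numbers below are invented for illustration, not the real Travel Trac curve): a fluid trainer's resistance is a known function of wheel speed, so power can be looked up by interpolating a speed-to-watts table.

```python
# Hypothetical "dumb" trainer power curve: wheel speed (km/h) -> watts.
# These calibration points are made up for illustration; a real curve
# comes from the trainer manufacturer or Zwift's supported-trainer data.
SPEED_KMH = [10, 20, 30, 40, 50]
WATTS = [40, 110, 220, 380, 600]

def estimate_power(speed_kmh):
    """Linearly interpolate the power curve at the measured wheel speed."""
    if speed_kmh <= SPEED_KMH[0]:
        return WATTS[0]
    if speed_kmh >= SPEED_KMH[-1]:
        return WATTS[-1]
    # Find the two calibration points bracketing the measured speed.
    for (s0, w0), (s1, w1) in zip(zip(SPEED_KMH, WATTS),
                                  zip(SPEED_KMH[1:], WATTS[1:])):
        if s0 <= speed_kmh <= s1:
            frac = (speed_kmh - s0) / (s1 - s0)
            return w0 + frac * (w1 - w0)

print(estimate_power(35))  # halfway between the 30 and 40 km/h points -> 300.0
```

This is also why dumb-trainer accuracy depends on consistent tire pressure and roller tension: change either and the real curve no longer matches the table.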

For a more immersive experience you may want to go for a smart trainer like the Wahoo Kickr.  With a smart trainer, Zwift controls the resistance, so when you climb it's harder to pedal. After well over 1,000 miles in Zwift on the Travel Trac Comp Fluid, I upgraded to the Wahoo Kickr SNAP.  The advantage of smart trainers like the Kickr SNAP is that they communicate your actual power directly to Zwift. DC Rainmaker has a detailed evaluation of trainers here, and the Zwift site has the complete list of supported trainers.

Here are some other resources you will find useful.
You can follow me on Strava at https://www.strava.com/athletes/8196543 and I hope to ride with you in Zwift soon.  

RideON!


Friday, August 09, 2013

Was I A Victim of Big Data?

On a recent trip to Phoenix, my flight was delayed and then cancelled.  Through rerouting and other delays, what should have been a 1.5-hour direct flight turned into a 12-hour ordeal.

I wonder: was I a victim of Big Data?  Let me explain.

As a Big Data marketeer, one of my favorite stories of how Big Data can force changes in the way you do business, and cause you to question recommendations that are not intuitive, is the following hypothetical dilemma.
You are the flight operations manager of an airline.  You have 2 planes about to depart.  It's snowing hard.  The airport calls down to inform you that only 1 of your 2 departures will be granted permission to depart before the airport shuts down.  One plane has 4 passengers on board; the other is full with 200 passengers.
What do you do?
You run your new "Flight Operations Optimizer" application.  It's a new "Big Data" application that calculates the downstream 72-hour impact of canceling a flight based on multiple data sets, including all passengers impacted, expected weather delays at all downstream destinations, airplane maintenance schedules and crew schedules.
The Flight Operations Optimizer comes back and advises that you let the flight with 4 passengers depart.  That alternative has the least downstream impact to the airline, a better business outcome.
To which you say: WHAT! That can't be right.  If I do that I'll have 200 pissed-off passengers at the counter to deal with, and furthermore I won't make my goal of most passengers' on-time departures!
The story points out 2 major elements of Big Data.

  1. If you can analyze enough data, often kept in different silos, the results can well be counterintuitive, leading you to dismiss them and still make the decision that doesn't lead to the best business outcome.
  2. The employee goals you have in place to drive the best corporate outcomes might be driving behaviors that don't actually accomplish that.
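The hypothetical optimizer boils down to scoring each cancellation option and picking the cheaper one. Here's a toy sketch; every number and cost weight is invented for illustration, since the application itself is fictional.

```python
# Toy sketch of the hypothetical "Flight Operations Optimizer" from the
# story above. All figures and weights are made up for illustration.
def downstream_cost(flight):
    """Score the 72-hour impact of cancelling this flight (higher = worse)."""
    return (
        flight["passengers_rebooked"] * 1.0        # stranded/rebooked passengers
        + flight["crew_repositioning_hours"] * 50  # crews out of position downstream
        + flight["maintenance_slips"] * 500        # missed maintenance windows
    )

# The 4-passenger plane, in this scenario, is due at a maintenance base and
# carries a crew needed for later flights; the full plane is end-of-day.
small = {"passengers_rebooked": 4, "crew_repositioning_hours": 30, "maintenance_slips": 2}
full = {"passengers_rebooked": 200, "crew_repositioning_hours": 2, "maintenance_slips": 0}

# Cancel whichever flight has the smaller downstream cost; the other departs.
to_cancel = min([("small", small), ("full", full)], key=lambda f: downstream_cost(f[1]))
print("cancel:", to_cancel[0])  # the counterintuitive answer: cancel the full flight
```

The counterintuitive result falls out naturally: the visible cost (200 angry passengers) is only one term, and the hidden terms in other data silos can outweigh it.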
Now back to my bad travel karma story.

Moments before boarding, we were told our flight was now departing from another gate.  We hurried over to the new gate, only to be told our flight was delayed 4 hours.  What happened?

From eavesdropping on the gate agents' conversation, I learned that the plane for the flight to LA had a mechanical problem and that our plane had been allocated to the LA flight. Our flight was delayed while they looked for another plane.  2 hours later, our flight was canceled.

Could it be that the airline's flight operations manager ran the "Flight Operations Optimizer" application, and the result was that giving our plane to the LA flight had the least downstream impact?

Then, after most of the passengers on our flight found alternatives to waiting 4 hours, there were very few left on our delayed Phoenix flight.  So the "Flight Operations Optimizer" advised that it be cancelled.

And that is how I suspect I fell victim to Big Data.  The best business outcome doesn't always mean the best personal outcome.

Saturday, December 08, 2012

The Big Data Inflection Point


NetApp has offered Big Data solutions since May of 2011.  We offer a portfolio of 10 solutions that address the major use cases of Big Data.  These solutions are based on both of our storage platforms: FAS with Clustered Data ONTAP and E-Series with SANtricity.

The Big Data market is loud and confusing.  In fact, Big Data was named the most confusing term in IT this year, surpassing Cloud, which is now number 2.  Most of this confusion results from the fact that no two cases of Big Data are alike.  Additionally, new technologies like Hadoop and its ecosystem of tools and applications are disrupting analytic technologies, resulting in many innovations and considerable VC investment in start-up companies.

Big Data has captured the imagination of many enterprises as documented success stories point to new ways of doing business that change the game and result in considerable competitive advantage or significantly better business outcomes.

NetApp has a credible seat at any customer discussion about Big Data by way of our considerable experience in managing data at scale.  Our largest customer has over an exabyte of data, and we have hundreds of customers with over ten petabytes.  Many of the storage efficiency innovations that NetApp has led, such as deduplication and thin provisioning, have led to contemplation of "keep forever" data strategies.  The "delete key" is no longer the answer to Big Data.

What makes Big Data different is that customers reach an inflection point where they can no longer continue doing what they did yesterday, just a little more of it.  Indeed, they must fundamentally rethink their data storage strategies.  It is at that inflection point that, without new approaches and technologies, data growth and Big Data become a liability, or, with the right approach, a propellant to the business.

It is at this inflection point that NetApp can be the trusted partner that helps customers use Big Data to grow their businesses efficiently and flexibly.

Saturday, March 31, 2012

Mike Olson - The Future of Hadoop

Last week I attended GigaOM Structure DATA in New York.  The conference is solely focused on Big Data with a mashup of start-ups, big companies, investors and thought leaders.  

Mike Olson did a masterful job in his interview with Jo Maitland, painting a picture of Hadoop and the Big Data analytics landscape while deftly handling numerous landmines tossed his way by Jo.  It's worth watching.


Watch live streaming video from gigaombigdata at livestream.com

Saturday, October 29, 2011

Big Data Starts with ABCs


If you haven't noticed, Big Data has created a lot of buzz lately.  Much of the buzz comes from the sheer wow factor of how big "big" is.  With the number of smart phones nearing 6 billion, all creating content, Facebook generating over 30 billion pieces of content a month, and data expected to grow at 40% year over year, it's easy to see that big really is BIG.

In fact, the digital universe has recently broken the zettabyte barrier.  A zettabyte is approximately equal to a thousand exabytes, or a billion terabytes.  How big is that?  To give you an idea of scale, it would take everyone on the planet posting to Twitter 24/7 for 100 years to generate a zettabyte.
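A quick back-of-the-envelope check of those unit conversions (the tweet size and population figures below are my own assumptions, just to sanity-check the scale):

```python
# Decimal storage units, in bytes.
TB = 10**12  # terabyte
EB = 10**18  # exabyte
ZB = 10**21  # zettabyte

# A zettabyte really is a thousand exabytes, or a billion terabytes:
print(ZB // EB, ZB // TB)  # 1000 1000000000

# Rough sanity check of the Twitter illustration, assuming ~7 billion
# people each posting one ~200-byte tweet per second, around the clock,
# for 100 years (both assumptions are mine, not from the original claim):
people = 7 * 10**9
seconds = 100 * 365 * 24 * 3600
tweet_bytes = 200
total = people * seconds * tweet_bytes
print(total / ZB)  # on the order of a few zettabytes
```

So under these rough assumptions the claim lands in the right ballpark: single-digit zettabytes.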

So you get the idea - it’s really big. 

As an IT organization you may be thinking that your own data growth will soon be stretching the limits of your infrastructure. A way to define big data is to look at your existing infrastructure, the amount of data you have now, and the amount of growth you're experiencing.  Is it starting to break your existing processes? If so, where?

“Big” refers to a size that's beyond the ability of your current tools to affordably capture, store, manage, and analyze your data. This is a practical definition, since “big” might be a different number for each person trying but unable to extract business advantage from their data.


When we talk to our customers, we find that their existing infrastructure is breaking on three major axes:

  1. Complexity.  Data is no longer about text and numbers; it includes real-time events and shared infrastructure. Data is now linked at high fidelity and includes multiple types. The sheer complexity of data is skyrocketing, and applying normal algorithms for search, storage and categorization is a lot more complex.
  2. Speed.  How fast is the data coming at you? High definition video streaming over the Internet to storage devices and player devices, full motion video for surveillance – all of these have very high ingestion rates. You have to be able to keep up with the data flow. You need the compute, network and storage to deliver high definition to thousands of people at once, with good viewing quality. For high performance computing you need systems that can perform trillions of operations per second and store petabytes of data.
  3. Volume.  For all of the data you are collecting and generating, you have to store it securely and make it available forever. IT teams today are having to make decisions about what is “too much data”. They might flush all data each week and start again. But there are certain applications, like healthcare, where you can never delete the data. It has to live forever.

These trends in data growth are something we at NetApp have been following for quite a while now.  We’ve been enhancing ONTAP to deal with the scale needed to handle large repositories of data and we have also made strategic acquisitions anticipating the need for high density high performance (Engenio) and infinite content repositories (Bycast).

In conversations with our customers dealing with the onslaught of data we have noticed 3 important use cases that are stretching the limits of their existing infrastructure.

We’ve named these axes the ABCs of Big Data.

  • Analytics - Analytics for extremely large data sets: taking advantage of that digital universe by turning data into information, giving you insight about your business to make better decisions.
  • Bandwidth - Performance for data-intensive workloads at really high speeds.
  • Content - Boundless, secure, scalable data storage that allows you to keep it forever.