
Owning up to mistakes
Going back a few years, the Patriots and Seahawks Super Bowl was an emotional roller coaster leading up to one of the most emotional endings any fan could ask for: a high if you are a Patriots fan, a low if you’re not. I was in the “not” category; the reasoning behind that is another story altogether.
Long story short, the Seahawks were a few feet away from winning; instead of handing the ball to their running back to blast through the defence, they decided to throw into a crowd of defenders. Unfortunately, that pass was intercepted, and the game was over. The announcers, Twitter, Facebook, and every other corner of social media went insane, all asking the same question: “What were you thinking?”
Looking back on that got me thinking about some of the missteps I made during my long career and how I handled them. I owned up to each one; there was no real reason not to. None of my missteps were illegal, and some did cost time and money to fix, but nothing excessive. Probably the biggest issue I caused was accidentally deleting about 75% of the clean test data in a database. It was part of a department initiative to clean up the data and automated test scripts to improve execution efficiency. Another QA analyst and I had worked on the project for over six months. It was tedious work on a mainframe that was very particular about how instructions were keyed in. During one of my data moves to the new “clean” database, I did not notice that I had accidentally hit the space bar while keying my move instructions. Because of that, the system moved all of the data from the “sandbox” database to the clean database, including all the garbage data and other non-functional data that needed a specific one-off script to work.
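For the curious, here is a rough sketch of how a slip that small can change everything. This is hypothetical Python, not the mainframe’s actual instruction language; the command format, table names, and tags are all invented for illustration.

```python
# A hypothetical sketch -- the real job ran on a mainframe with its own
# instruction syntax; the command format and table names here are invented.

def parse_filter(instruction: str) -> str:
    """Pull the TAG= value out of a keyed move instruction.

    An empty tag means "no filter": the move applies to every row.
    """
    for token in instruction.split():
        if token.startswith("TAG="):
            return token[len("TAG="):]
    return ""

def move_rows(source: list, dest: list, instruction: str) -> int:
    """Move rows matching the instruction's filter from source to dest."""
    tag = parse_filter(instruction)
    moved = [row for row in source if tag in ("", row["tag"])]
    dest.extend(moved)
    return len(moved)

sandbox = [
    {"id": 1, "tag": "clean"},
    {"id": 2, "tag": "garbage"},  # needs a one-off script to work
    {"id": 3, "tag": "clean"},
]
clean_db = []

# What I meant to key in: only the clean rows move.
move_rows(sandbox, clean_db, "MOVE SANDBOX CLEANDB TAG=clean")   # moves 2 rows

# One stray space, and "TAG= clean" splits into "TAG=" and "clean":
# the filter comes back empty, and the job moves everything, garbage included.
clean_db.clear()
move_rows(sandbox, clean_db, "MOVE SANDBOX CLEANDB TAG= clean")  # moves all 3
```

At a glance the two instructions look identical, which is exactly why the bad move ran without anyone catching it.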
There was a massive knot in my stomach as I noticed the move was taking longer than it should, with no way for me to stop it. Months of work looked destroyed, and the project would be back at square two (some of what moved was already clean in both databases). I didn’t try to hide it or fix it before anyone noticed. I went to my boss and broke the news. He was a little upset but was glad to hear that I also had a plan for fixing it. We were lucky to find out that the system did weekly backups, so after the restore we only lost about a week’s worth of work.
There are plenty of stories out there where one small mistake has enormous consequences. The story of how Toy Story 2 was accidentally deleted comes to mind (http://thenextweb.com/media/2012/05/21/how-pixars-toy-story-2-was-deleted-twice-once-by-technology-and-again-for-its-own-good/). I am sure others have experienced something similar, especially in IT, where a single keystroke can do so much.
Owning up to a mistake and working with others on how to fix it shows a lot of leadership. Even if you don’t want to lead a group or team, it shows that you take responsibility for your actions. It is better to do that than to let it fester, but that is another blog altogether.