
Entries in DR (6)


Lessons from Hurricane Irene or how are your BCP/DR plans in practice?

So the media was all over Irene, and with good reason.  At this writing, 21 people have perished at the hands of this evil lady, with hundreds of millions (or more) in damage.

In my lovely town, we greeted Irene with caution.  Sure, I had bought a couple tarps and some duck tape (yes, mine says duck, and not duct.)  After all, at 29 miles west of the ocean, I wasn’t concerned about storm surge.  I did invest $36 in a DC/AC inverter as an afterthought with the intention of streaming video from my moving F350 truck. 

Fortunately, it never got bad enough in this neck of the woods to “do foolishness.”  Nearly all my town of 18,000 lost power at 10:49AM Sunday….and this gave time to consider technology and DR/BCP plans.

As I sat with my neighbor on my front porch last night at 11PM, enjoying a crisp, post-storm sky and watching the police slowly patrolling for evil-doers, my neighbor kept saying, “What was it like without power?”

Well, what was it like without power?

I contend we were never really completely without power.  Everyone had cell phones, and they were clearly working for the duration.  AT&T, Verizon, MetroPCS and Sprint all seemed fine.

A last-minute Livestream account afforded me the opportunity to “broadcast” live from the scene.  Frankly, the scene was (fortunately) pretty boring.  When the power went out, the laptop’s battery kicked into gear, and a battery-operated MiFi unit kept me on the air.

A local gas station had to close when the power went out.  The pumps wouldn’t work. 

The local pizza house, accustomed to producing hundreds of pizzas a day using an electrically operated conveyor oven, had to use the original “old school” gas-fired pizza oven, creating a backup of as much as two hours for pizza (and making some customers gripe loudly.)  They used flashlights and emergency lighting.  Since the cash register was down, battery-operated calculators were used.  With no phone service, the only way to order was at the restaurant, and credit card numbers were written down for subsequent processing.  Was this part of their formal disaster recovery plan?  No…it was put together on the fly.

On those customers griping loudly….it’s interesting to observe crowd behavior.   The storm had been forecast for a week, with public authorities recommending people “stock up” on a “couple days” food.  While two hours is a long time to get a pizza, when you look into the darkened kitchen seeing the staff working with flashlights and the owner manning the pizza oven…somehow “starvation” doesn’t come to mind.  Patience and understanding are what come to mind.  One thirty-something patron showed the bright side when she declared, “the beer is still cold” while waiting for the pizza.

Without power, the WAN connection to my home failed.  As a Verizon FiOS user, with fiber optic to the home, I tried powering the router with the inverter and could not pull a signal (the pole-mounted repeaters were powerless).  My old-school copper phone did work, until the FiOS battery backup failed HOURS into the event.

That $36 inverter provided power for the laptop, and fluorescent lighting in two homes.  While it was nice to have some light, and to stay connected, a Disaster Recovery strategy of truly working from home would need to address any VoIP phone issues (perhaps using soft phone), and longer battery times. 
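For the curious, here’s a rough back-of-the-envelope sketch of how long such a setup might last. The battery capacity and wattages below are assumptions for illustration, not measurements from my rig:

```python
# Rough runtime estimate for a small load running off a vehicle battery
# through a DC/AC inverter. All figures below are illustrative assumptions.

def runtime_hours(battery_wh, load_watts, inverter_efficiency=0.85,
                  usable_fraction=0.5):
    """Estimated hours of runtime.

    battery_wh: nominal battery capacity in watt-hours
    load_watts: combined AC load in watts
    inverter_efficiency: fraction of DC power delivered as AC
    usable_fraction: don't fully drain a lead-acid starter battery
    """
    usable_wh = battery_wh * usable_fraction * inverter_efficiency
    return usable_wh / load_watts

# Assumed truck battery: ~12 V x 75 Ah = 900 Wh.
# Assumed load: a laptop (45 W) plus two fluorescent lamps (2 x 15 W) = 75 W.
hours = runtime_hours(battery_wh=900, load_watts=75)
print(f"~{hours:.1f} hours of runtime")
```

The point of the arithmetic is the one that bit me during Irene: a battery that seems generous drains in an evening, so a real work-from-home DR plan has to budget runtime, not just buy an inverter.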

Monitoring the police/fire scanner, the fire department was kept busy dutifully responding to reports of fire alarm failures throughout town.  As the batteries lost power in these systems, the alarms sent “one last call”….creating a subsequent follow-up task for the fire department.  It strikes me these systems should have at least a 6 hour battery backup, if not more.

Companies investing in appropriate infrastructure or co-lo weathered the storm quite nicely, and I couldn’t find any popular site or bank not responding during the height of the storm.   On the front deck we could hear the muffled generator from a distant cell tower, enabling our communications.  My town hosts a number of large data centers, and they were impervious to the storm.

The MBTA shut down service before the storm hit.  While protecting their staff and systems during the storm, I’m always reminded how many people rely on public transportation to get to the office.

At 9:49PM, my cell phone rang with a robo call from my electric company, suggesting they were working on safety first (downed lines) and then would work to restore normal power.  “It may be several days before all power is restored.”  They even have a nifty website for seeing outages.

[Sample Power Outage Website]

At 7:21AM, my power was restored, and in my corner of the world things are quickly getting back to normal.  Honey Dew Donuts is still without power and Dunkin’ Donuts is picking up the slack.  So I’m a lucky one.  My thoughts go quickly to those who lost loved ones and/or homes.  We have an expression, “never let a good disaster go to waste”, and Irene presented this central Massachusetts town with lessons.

Companies of all sizes and types can work to ensure they have solid business continuity and disaster recovery plans, and that the plans are regularly tested.  Making plans up “on the fly” is good for an adrenaline rush, and not good for business.  Investing in the appropriate infrastructure, and understanding how communications are provided to your facility (routing, including above or below ground considerations) are key.

In the Northeast, hurricanes are rare.  Yet “stuff happens” and little outages do take place almost daily.  We’ve put together business continuity and disaster recovery plans and performed assessments for companies ranging from startups to large investment management firms.  If your company doesn’t have these, the time to act is before the next event.

How did you and your company weather the storm?



DR, BCP and Dirty Little Secrets

BCP…Regulations…Real testing…DR…Capacity…Capability…

Do you have an honest view of your ability to recover? How do you fix your resilience or recoverability, or even begin?

DR and BCP – How good is your DR and BCP capability? Ask anyone in a position of authority, from the smallest firm to a well-known national company, and the response will likely be similar: “DR, BCP. Yeah, we do that.” But when pressed on what they actually do, many people don’t have a true grasp of it. There are lots of reasons for this. Some people truly don’t have a clue and just shrug it off. Others know the true state of their recoverability and just don’t want anyone to know, for obvious reasons. Others are aware of their underinvestment in Business Resilience and it scares the pants off them. Still others think they have a good program but have no confidence that it will work.

Those are my observations from the unscientific research and work I have done in this field. We like to call it the ‘Dirty Little Secret.’

Business Continuity, how could you not do that well? That’s an easy one. BC covers a broad spectrum of a business. I like to explain BC and DR as:

BC = people and DR = systems.

Whether BC or DR, to do them well one needs to know what to protect and what to let go, as well as the amount of risk the business can absorb across its lines. This can be accomplished with a BIA, a Business Impact Analysis. Properly done, it can actually save you money, as you will know what and where to spend your scarce dollars to get the most bang for the buck. Refreshing it regularly (every 12-18 months) will ensure that you keep up with any growth or change in your business. Not only growth, but these days, contraction. You may be able to see where you can trim some of your spending.
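Stripped to its essentials, a BIA is a ranking exercise: which systems cost the most per hour of downtime, and which ones must come back fastest? A minimal sketch, with invented systems, dollar figures, and recovery time objectives (RTOs):

```python
# A minimal sketch of a Business Impact Analysis (BIA) ranking.
# The systems, hourly outage costs, and RTOs below are invented examples.

systems = [
    # (name, hourly outage cost in $, required recovery time objective in hours)
    ("order entry", 50_000, 2),
    ("email", 5_000, 8),
    ("reporting", 1_000, 48),
]

# Spend scarce recovery dollars where an hour of downtime hurts most,
# breaking ties in favor of the tightest recovery window.
ranked = sorted(systems, key=lambda s: (-s[1], s[2]))

for name, cost, rto in ranked:
    print(f"{name}: ${cost:,}/hour at risk, RTO {rto}h")
```

Even a toy version like this makes the trade-off visible: the reporting system at the bottom of the list may be a candidate for trimmed spending, while the top of the list deserves the investment.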

DR and BC are similar to life insurance. Many times companies grow and grow and grow and DR/BCP gets put off until later. We need production systems, development systems, upgrade production….We can put off the DR upgrade to the next release. Once that is done, it is easy to put it off again and again. The result? Very similar to a family; young people get married and have a little insurance (enough to be buried if any at all.) Time marches on…a house, kids, cars… obligations. One day you take stock and oops, I have $10K of insurance and 1 million in needs.

Insurance you can always go and buy, and if you die before you get any…well, you won’t really care, but your family will be pretty mad at you.

Your business responsibilities, on the other hand, have a direct impact on your livelihood. Fail to recover from a DR or BCP problem and you will be out of work. You may also be liable to regulators and state or federal authorities, with possible criminal liability if attestations of recoverability are made that turn out to be patently false. Rest assured, if you fail, the lawyers will be standing over the carnage looking to assess accountability. You do not want to be in that position.

Have an honest discussion with your CEO, CIO, COO or CFO. One or all of them are ultimately responsible for assuring the survival of the firm. If you or they don’t truly know the real score, conduct a test. If this is a problem, engage a contractor or consultant to design and administer one for you or to do an assessment.

If you know you are deficient, but not sure of where to focus, conduct a BIA. This will give you the tools to build a roadmap to recoverability. Whatever you do, don’t let your firm’s recoverability go unaddressed. You can make a difference and expunge the dirty little secret.

This post was prepared by John Manning, Associate Partner at Harvard Partners. He can be reached at john.manning@harvardpartners.com


When DR is Fraud

I am a big believer in Disaster Recovery (DR), or having manual processes ready for when the electronics fail.

DR can also be used to perpetuate fraud, as I learned at a recent company.

We were a property management company, receiving rents monthly. We tended to go out on the 10th day of the month to collect unpaid rents.

Imagine our surprise when two tenants had handwritten receipts for paid rent…yet the automated system did not show anything?

It turned out our new receptionist was pocketing cash deposits, and handing out handwritten receipts. We discovered this quickly, and she spent time in the big house.

The lesson is to make sure your manual processes and automated ones “tie out.”  At this company, nobody was checking the manual receipt log…especially since we’d had no outages.
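A minimal sketch of what “tying out” could look like if both logs were captured electronically. The receipt numbers and amounts are hypothetical:

```python
# A minimal sketch of reconciling a manual receipt log against the
# automated system. Receipt numbers and amounts are invented examples.

manual_log = {"R-101": 850.00, "R-102": 900.00, "R-103": 875.00}
system_log = {"R-101": 850.00, "R-103": 875.00}  # R-102 never entered

# Any receipt written by hand but absent from the system is a red flag.
missing = {r: amt for r, amt in manual_log.items() if r not in system_log}

# Receipts present in both logs but with disagreeing amounts are also suspect.
mismatched = {r: (manual_log[r], system_log[r])
              for r in manual_log.keys() & system_log.keys()
              if manual_log[r] != system_log[r]}

for receipt, amount in missing.items():
    print(f"ALERT: {receipt} (${amount:.2f}) in manual log but not in system")
```

Run regularly, even a check this simple would have flagged the pocketed R-102 on day one rather than on the tenth of the month.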

We were fortunate…at the end of the day we lost a few hundred dollars. A bigger company with more cash could have had a devastating issue.


Is the Pandemic a bust like Y2K?

Is the Pandemic a bust like Y2K? A real problem that didn’t happen?


Over the last year we have been inundated with messages about the Swine Flu, H1N1. One needs to be vaccinated, one needs to be prepared. Is your family ready, is your business ready? What will you do if the flu strikes? What will you do if you can’t work, if infrastructure breaks down? Hmm? What will you do?

Sounds like a bad insurance ad. Well, last year’s flu season has come and gone…and by gosh, the world didn’t come to an end, and there were no Monty Python-like scenes of bodies in the streets with shouts of “bring out your dead.” OK, so I exaggerate, but only a little. Listening to or reading the public health information, the picture painted was dire. Many of us got vaccinated, or at least got our children vaccinated.

Thoughts of the Bird Flu or Swine Flu and Pandemic motivated many to action. On the positive side, families got vaccinations, and schools and businesses worked to educate and improve health practices. What practices? Like teaching kids to sneeze into their elbow instead of catching it in a little hand. Same for adults. Washing hands thoroughly every time you even walk by a bathroom…and using soap (for the kids, of course).

Also, the sale of hand sanitizer went through the roof. Many buildings have dispensers next to all the elevators and stairs.

Is hand sanitizer effective? At one time I used to work in a 30+ story building and was part of the life safety team for fire drills. At the rally point, I had a clipboard and staff rosters. I needed to account for my teams. With my ‘GO KIT’ I also had a bottle of hand sanitizer: a small one the first time, a bigger one the second time, and a HUGE one the third. Why? Figure there are about 200 people per floor, and you are on floor 20. Heading downstairs, you will put your hand on the railing that 4,000 people just touched. What a great way to spread a cold or other nasty bug. Just think of that: 4,000 people, itching, scratching, picking (OK, gross, but you get the picture).

When my teams came up to check in, I offered them some hand sanitizer. At first people were like, ‘nah, I’m good.’ After the above scenario, now I see many people sharing a squirt from their own stash of hand sanitizer. Almost reminds me of college days and shots of schnapps. That would work as a sanitizer too, and you could drink it <LOL>. Somehow I don’t think the HR folks would be digging that. Sure would improve everyone’s outlook on fire drills! I digress.

Back in the late 90’s we were all made aware of the dire consequences of not taking action. Planes would fall from the sky, phones would not work, and everything would stop working. Well, Y2K came and went and nothing happened. Except we spent a fortune improving infrastructure and testing, testing, testing, with the result that Y2K was a non-event. Post January 1, people asked, “Why did we spend that money? Was it necessary? Would it really have been that bad if we didn’t spend that money?”



The parallels with pandemic planning are interesting. Since the predictions of dire results from massive flu outbreaks failed to occur, the predictions are like crying wolf.

The problem is the government ramped up their Pandemic response plan and the pandemic failed to have the impact expected/predicted. Just like Y2K, everyone got hyped up, but Armageddon didn’t happen. The flu failed to cooperate.

It would be nice to say that this isn’t going to happen again. It will. The problem for emergency managers and planners is that the public will be skeptical to act so soon after the Swine Flu outbreak of 2009-2010. The takeaway from this article is that being prepared and aware will always be beneficial; it is never a waste of time or resources, especially since these plans can be reused and recycled as needed.

Think it doesn’t happen? Think again. I was vaccinated against the measles in 1966. My mother actually had my records! The firm I was working for a few years ago was in the process of buying a smaller firm. During the course of due diligence, many site visits were conducted. Well, lo and behold, some of the staff from the target firm had just returned from extended tours overseas in an area of the world that didn’t practice immunization the way the US and much of the first world has. The company made a decision: everyone in the building needed to either provide proof of immunization or get the shot now.

The firm had a flu/pandemic/communicable disease plan. They didn’t need to think about a response. They had one ready to roll. The Business Continuity team presented it to senior management with the options. The management team had what they needed on a timely basis, well thought out, with options. This allowed them to implement a measured response to the incident.

Regardless of event size, planning and practice will always be beneficial, even if a predicted ‘big event’ doesn’t turn out. While the Swine Flu of last season didn’t turn out to be as bad as anticipated, the H5N1 Avian flu is still building and could break out to be on the scale of the great Flu of 1918. Let’s hope not. Being prepared and aware is the best response.

This post was prepared by John Manning, Associate Partner at Harvard Partners.  

He can be reached at john.manning@harvardpartners.com


Where are Businesses with DR and Business Continuity?

I recently refinanced my house for a lower interest rate. The final days leading to the closing give insight to the business continuity and DR improvements companies can strive to achieve.

My refinance was with the mortgage holder. This US bank, one of the big four and a recent beneficiary of bailout funds, was more than happy to accept my refinance application.

As a bill-paying-never-in-arrears-with-my-mortgage customer, the approval process was lengthy. (“If only you’d missed some payments we could make this happen quickly” Argh.) Once finally approved, I wanted to move quickly to the closing to immediately begin reaping the benefits of the lower rate.

The closing was scheduled with great expectation for 8AM on a Wednesday.

This is when the company flaws became pronounced.

At 2:00PM the day before the closing the bank called, “We’re sorry, we don’t have the final closing numbers because our computers are down.”

“So a big-name bank with billions of dollars (and bonuses to match) can’t access my closing account information. OK, interesting… banks should have generally available systems; outages are really unacceptable. Oh well, I’m sure it is temporary,” I thought.

Wrong…the next day, around 9:00AM, we rescheduled to 2PM.

And then we rescheduled to 4:30PM.

At this point, I asked for a manager. The manager sheepishly acknowledged, “We’ve now got the final numbers, but the Title Company we use has staff ‘working from home’ due to heavy snow in Maryland. They’re not able to work effectively from home.”

So this bank subcontracts certain key elements of the closing process to other firms…and obviously the business continuity plans are ineffective. When was the last time these plans were exercised? If Maryland is getting hammered with snow, why not redirect the work to the west coast? Why isn’t the bank asking these questions of the firm they use?

Another day goes by, and I’m still paying the old, higher rate on the mortgage. Somehow, this doesn’t seem right. And what reasonable recourse do I have? I am paying the bank for a service, and they hired the other companies. The DR and continuity plans are clearly inadequate. How do I get reimbursed for the extra day at the old interest rate? How do we address the poor service issue?

As a customer, there’s little we can do beyond being vocal, especially at the end of a long road. The companies providing the weak service get paid no matter what, and are not held accountable.

Ironically, if a gas pump at a local gas station doesn’t work, you either use a different pump or go to a different station. There is a direct impact on the sales and profitability of the station. It’s a simple model.

How does a bank get held accountable by their customers? Go to a different bank…easily said, and harder to do at the end of a process. I don’t envy Department of the Treasury Secretary Timothy F. Geithner trying to sort the bank accountability issue!

Did I eventually close? Yes. I did discover this bank has an active social media monitoring effort. To their credit, they picked up on some tweets in the waning moments of the process and tried assisting.

The closing attorney and I did have a bit of a disagreement; I insisted the computer-generated forms use my name and not someone else’s. We’ll talk data quality in another post!


Role Clarity in a Crisis  

“Let Barbara do her job,” was the text message received from the CIO.

We were in the middle of a major crisis. The network had a glitch of some kind, and while the old-fashioned host-connected machines were fine, the Chairman wasn’t able to retrieve his email.

The conference call had been running for hours. Barbara headed (voice and data) communications, and with a deep voice background was somewhat new to data.

Since the call had run for a lengthy period, frustrations were bleeding onto the conference call. It seemed everyone was now a data communications expert, especially the desktop support people responsible for the now-disconnected clients.

So while Barbara had been leading the call, Barbara’s manager felt compelled to “help” and began directing the call, hence the text message from the CIO lurking on the conference bridge.

This brings up a couple key points in Crisis Management.

Having clarity around leadership is key. Barbara is a very competent leader, and while new to data communications was more than capable of following a process to resolution. Barbara was trying to lead her team in a structured approach AND deal with the conference call of interested parties. A more effective approach would be to have two conference calls…a technical call and a management call. Barbara should have been leading the technical call, with someone else leading the management call.

Barbara’s manager should have coordinated with Barbara if a change were needed in bridge leadership. Basically, taking over the bridge on strength of personality cut Barbara off at the knees. Everyone saw this (Barbara, Barbara’s staff, and the support organizations). It was not a smooth handoff; it was grandstanding, unnecessary during a crisis.

Knowing who is on the call is important as well. In this case, the CIO was silently lurking on the call. It was his organization, and he was on the hook to update management. While there was no reason to exclude him, obviously it was unknown he had joined. What if the company was publicly traded and the “lurking CIO” was a member of the media? One approach some companies use is to have each conference call established with unique calling IDs (although you need to be sure ex-staff aren’t still getting the text pages).

Another approach uses a gatekeeper to answer a call-in number, confirm identity, and then join the caller to the conference call already in process. While more overhead, it also gives a chance to update callers before they join a call (as often the first question is “What is going on?”, inevitably disrupting the conference call flow).

Role clarity is key in any crisis, lest a free-for-all develop. Clarity around leadership, management updates, and protocols are all important.

We are struck by the Christmas 2009 bombing attempt on Northwest Airlines flight 253 and whether Janet Napolitano would have benefited from these lessons as she uttered, “One thing I’d like to point out is that the system worked.” The system worked after the incident, arguably there were issues before. Ms. Napolitano’s words created a separate large preventable firestorm.