Mainframe Blog

Think You Can Avoid the Digital Tsunami? Think Again

John Barry
3 minute read

The explosive growth of data in the digital enterprise can hit your organization like a tsunami, so you'd better be prepared to avert a disaster. Tsunamis, as you may know, are massive ocean waves, usually caused by earthquakes, volcanic eruptions, and landslides. The destruction can be far reaching and long term.

Just like a tsunami in nature, the digital tsunami will be devastating. If you're not prepared, the wave of overwhelming data growth will slow down your applications and lead to outages, costing you customers. This digital tsunami is caused by people accessing more applications and conducting transactions more frequently than ever before. For example, several years ago you may have gone to a gas station and paid for the gas with a credit card. You conducted just one transaction with your bank. Today, while you're waiting for the tank to fill, you check your account balance, deposit a check using the camera in your phone, transfer funds from your savings account, and purchase a movie ticket for later tonight. In the same amount of time, you created five transactions with your bank. This kind of activity is driving exponential data growth for your business.

Failure to use the right tools to optimize and manage this growth can result in downtime that leads to lost customers and lost revenue. There's a new way of looking at downtime. One large credit card company that processes over 1,000 transactions a second told us they don't measure downtime in dollars, but rather in lost customers. At that rate, every minute of downtime costs the business 60,000 customers who may never return.

Pay Attention Now to Prevent a Devastating and Costly Digital Tsunami
There is a silver lining to this digital tsunami, however. In nature, monitoring systems can predict tsunamis well in advance, giving people time to prepare and get to high ground.

For the digital tsunami, let this be your early warning! The wave is coming, if it's not already breaking on you! Mobile and analytics are expected to drive an 800 percent increase in data over the next three years. Unfortunately, many organizations are still using the same utilities for routine "housekeeping" on the mainframe that they used decades ago, when transaction rates were much lower and data volumes were a fraction of what they are now. If this is you, heed this WARNING: You are at risk!

Leverage Next Generation Technology to Stay Ahead of the Storm
You simply can’t manage the high transaction rates and enormous data volumes in the digital enterprise with 1980s data management technology.

Pay attention to the warning signs:

  • Do you only reorg a small subset of partitions on a weekly basis?
  • Do you schedule reorgs for the weekend when they really need to be done now?
  • Do you avoid large non-partitioning indexes because they slow down a reorg? Or worse, do you drop them to get a reorg done?

If the answer to any of these questions is yes, you are at risk! Your applications simply aren't optimized. In the mobile world, your company's customer is just one click away from becoming someone else's customer. If your applications are too slow and your customers' transactions aren't fast enough, they get frustrated and find alternatives. So, if you use old technology to manage your data, you could be putting your business in jeopardy. As the data grows and the tsunami approaches, your problems are only going to get worse.
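To make the idea of "reorg only when needed" concrete, here is a minimal Python sketch of the kind of check a reorg-scheduling tool might perform: flag partitions whose disorganization metrics cross a threshold instead of reorganizing everything on a fixed weekend schedule. All names, metrics, and thresholds below are illustrative assumptions, not a real Db2 API; real products make this decision from Db2 real-time statistics.

```python
# Illustrative sketch only: flag table-space partitions that look like
# reorg candidates. Metric names and thresholds are made up for this
# example, not a real Db2 interface.

def needs_reorg(part):
    """Return True if a partition's stats suggest it should be reorganized.

    Thresholds are hypothetical; a real site would tune them against
    its own Db2 real-time statistics.
    """
    total = max(part["total_rows"], 1)  # guard against division by zero
    unclustered_ratio = part["unclustered_inserts"] / total
    overflow_ratio = part["overflow_rows"] / total
    return unclustered_ratio > 0.10 or overflow_ratio > 0.05

partitions = [
    {"name": "P001", "total_rows": 1_000_000,
     "unclustered_inserts": 250_000, "overflow_rows": 1_000},
    {"name": "P002", "total_rows": 1_000_000,
     "unclustered_inserts": 20_000, "overflow_rows": 500},
]

candidates = [p["name"] for p in partitions if needs_reorg(p)]
print(candidates)  # only the disorganized partition is flagged: ['P001']
```

The point of the sketch is the design choice: reorgs are driven by measured disorganization per partition, not by the calendar, which is exactly what the warning-sign questions above are probing for.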

There is hope! There is a way to manage more data than ever before and actually lower costs and simplify data management processes! The Next Generation Technology for Data Management is here!

Act Now!
Want to learn more about how to manage your largest databases with zero application outages and keep them at peak performance while reducing operational costs? Want to learn how one customer ran 6x as many reorgs and lowered their costs by 10 percent? View this on-demand webinar with Sheryl Larsen and John Barry from BMC that was hosted by IBM Systems Magazine: Tsunami Warning! Preparing DB2 for the Data Tidal Wave.


These postings are my own and do not necessarily represent BMC's position, strategies, or opinion.



About the author

John Barry

John Barry has more than 20 years of Db2 experience, dating back to Db2 V5, and has worked with every version since. He is currently the Principal Product Manager for the BMC Solutions for Db2. John is a regular speaker at user groups around the world and has helped Db2 installations reduce costs and manage growing volumes of data through improved automation strategies for Db2 data management.