Before autonomous data correction software met the mainframe, a day in my life as a DBA looked like this:
2 a.m. – Diagnose a critical maintenance utility failure for a panicked night operator, re-submit the REORG job, and head back to bed.
8 a.m. – Leverage a database tool to pull pertinent data for an emergency report on an internal customer’s sales region.
9 a.m. – Use various database tools and review performance data to improve data access for developers alarmed that their application’s performance is slowly degrading.
12 p.m. – As lunch approaches, discover unforeseen space problems, scramble to find room for a scheduled backup, and capture the backup successfully.
There I was, a highly trained computer professional, troubleshooting random bugs and issues, sometimes in the wee hours of morning.
The world changed for DBAs like me when database tools came on the market. Rudimentary at first, such software has blossomed over the past 20 years into autonomous tools that can adapt and intervene on a DBA’s behalf, addressing an event before it becomes a problem.
Add to the equation the power and performance of the mainframe to keep such systems running at peak performance, and autonomous solutions can free up a skilled employee and change entire systems for the better.
How autonomous computing improves a DBA’s life
There are five levels of autonomous computing:
Level 1 – No autonomy; all work is manual.
Level 2 – Minimal autonomy; significant human input is still required.
Level 3 – This is where things get interesting: the software can not only detect an event, but also advise users on the solution.
Level 4 – The system both advises and makes certain corrections on its own.
Level 5 – Autonomous fixes for detected events, with the ability to “watch and learn,” adapting to changes in real time.
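As a rough illustration, the escalating levels can be modeled as a dispatch on how much the software does before a human gets involved. Every name below is invented for this sketch; no actual product API is being described.

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    MANUAL = 1      # Level 1: no autonomy
    ASSISTED = 2    # Level 2: significant human input still required
    ADVISORY = 3    # Level 3: detect an event and advise on the solution
    CORRECTIVE = 4  # Level 4: advise and make certain corrections
    ADAPTIVE = 5    # Level 5: fix autonomously, then "watch and learn"

def handle_event(level: AutonomyLevel, event: str) -> str:
    """Illustrative only: what the software does at each level."""
    if level <= AutonomyLevel.ASSISTED:
        return f"log '{event}'; wait for the DBA"
    if level == AutonomyLevel.ADVISORY:
        return f"detect '{event}'; advise the DBA of a fix"
    if level == AutonomyLevel.CORRECTIVE:
        return f"detect '{event}'; advise and apply approved corrections"
    return f"detect '{event}'; fix autonomously and adapt to the workload"
```

At level 3 the 2 a.m. REORG failure from the opening schedule would still wake someone up, but with a diagnosis attached; at level 5 the DBA only reads about it the next morning.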
If all of this seems processor-intensive, it is. Autonomous software produces its own share of log data, and it consumes precious processor resources as it listens for events to act on, true to its “watch and learn” raison d’être. And with the sheer volume of data passing through any business’s systems, not just any processor can handle transactions at this scale.
With the advent of microservices running on the mainframe, each monitoring specific sensors for specific conditions, the future is actually here for DBAs — and the future is the mainframe. Suddenly, a DBA’s day can look like this:
6 a.m. – DBA reviews the previous night’s batch window activity via a mobile UI, all problems already solved.
8 a.m. – Internal customer pulls their own report with an interactive web UI and automated assistance.
9 a.m. – DBA notified via an interactive web UI of performance avoidance measures taken on business-critical objects.
12 p.m. – DBA notified of space outage avoidance measures taken during routine data backup.
ALL DAY LONG – DBA is free from tedious troubleshooting and can focus on higher-value, more business-critical tasks.
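The sensor-watching microservices behind that quieter schedule can be sketched as a small monitoring loop. This is a toy model, not any vendor’s implementation: the readings, thresholds, and adaptation rule are all invented for illustration.

```python
def monitor(readings, threshold=0.85):
    """Yield an action for each space-utilization reading (0.0 to 1.0).

    Toy "watch and learn" loop: it tracks the running average so the
    effective alert point adapts to the workload instead of staying
    fixed, letting it extend a dataset before the space runs out.
    """
    history = []
    for used_fraction in readings:
        history.append(used_fraction)
        baseline = sum(history) / len(history)
        # Adapt: alert earlier when average utilization is creeping up.
        effective = min(threshold, baseline + 0.10)
        if used_fraction >= effective:
            yield f"extend dataset (used {used_fraction:.0%})"
        else:
            yield "ok"
```

In this sketch, a reading of 50% passes quietly, but a jump to 90% triggers a corrective action before the backup job ever sees a space outage; the DBA only receives the notification that the measure was taken.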
The whole reason we build machines is to free up human capital and focus our precious time on solving bigger problems. Does autonomous software mean the days of the DBA are numbered? Absolutely not. Instead of being reactive, DBAs of the future will be proactive, able to bend their training and creativity toward building better, smarter, and more efficient systems. And mainframe database administrators know the secret sauce is the mainframe itself. With the muscle and processing power of the mainframe, autonomous troubleshooting software can handle the drudgery so the DBA can plug into their human ingenuity.