Free your digital business from the burden of technical debt

It’s hard to win a race when you’re carrying extra baggage. When your business moves at digital speed, it’s not enough to deliver new business services more quickly—you’ve also got to keep the technical debt of legacy applications from weighing you down.

As the transition to digital continues to expand and accelerate, Gartner has been working to identify the formula for success for IT leaders today—a direction it calls ContinuousNext. In a recent report, “Predicts 2019: Governing Application and Product Portfolios,” the firm calls out two priorities familiar to many organizations:

  • Supporting continuous improvement and continuous delivery for new services
  • Preventing technical debt from undermining digital agility

It’s easy to see the connection between these two themes. The accumulation of technical debt makes it increasingly difficult to keep pace with the demands of digital business, as the requirements of legacy systems drain IT resources that could otherwise fuel innovation. Initiatives for application rationalization and modernization typically deal only with the symptoms of the problems without addressing the core issue: the changing relationship between business and IT assets. Gartner puts the impact of this situation bluntly: “By 2023, 90% of all technical debt existing today will still exist, and will continue to strangle business innovations.”

There’s a better way to manage the impact of technical debt. In fact, many organizations already have the right solution in-house: application workflow orchestration. As Gartner says, “With technical debt it’s not always the amount that paralyzes an organization, it’s how the technical debt affects the business process within that organization.” By using application workflow orchestration to enable a Jobs-as-Code approach to CI/CD, you can ensure delivery of new business services even as your legacy systems continue to play a mission-critical role.

Building Jobs-as-Code into your CI/CD pipeline eases the burden of technical debt and accelerates service delivery in several ways (a short code sketch follows the list):

  • By left-shifting job creation to the development stage, you can eliminate the need for scripting at the operations stage; IT ops receives production-ready applications.
  • An automated CI/CD pipeline reduces manual work for developers, who can then focus on innovation.
  • Defects and errors can be found much earlier in the software delivery lifecycle, when they can be fixed much more quickly and inexpensively.
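What does a job defined as code actually look like? Here is a minimal sketch in Python: a job definition kept in version control as structured data, plus a validation check that runs in the CI pipeline so a malformed job fails the build instead of failing in production. The field names and the validate_job helper are illustrative, not any particular scheduler's schema.

```python
# A hypothetical job definition, stored in version control alongside the
# application code it belongs to. Field names are illustrative only.
nightly_etl_job = {
    "name": "nightly-etl",
    "type": "command",
    "command": "python etl/run_nightly.py",
    "schedule": "0 2 * * *",          # run at 02:00 every day
    "on_failure": ["notify:data-team"],
}

REQUIRED_FIELDS = {"name", "type", "command", "schedule"}

def validate_job(job: dict) -> list[str]:
    """Return a list of problems; an empty list means the job passes CI."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - job.keys()]
    if job.get("schedule", "").count(" ") != 4:
        problems.append("schedule must be a five-field cron expression")
    return problems

if __name__ == "__main__":
    # In a real pipeline this check runs on every commit, so a bad job
    # definition fails the build instead of surfacing in production.
    issues = validate_job(nightly_etl_job)
    if issues:
        raise SystemExit("job definition rejected: " + "; ".join(issues))
    print("job definition passed validation")
```

The point is the placement of the check: because the definition lives with the application code, IT ops receives something that has already passed the same gate as the rest of the build.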

The legacy systems and silos that drive technical debt won’t be modernized overnight—and trying to do so would be misguided anyway. What IT organizations really need to be asking is, how can we keep technical debt from undermining our agenda for transformation? And how can we build the fast, agile, and efficient CI/CD pipeline our business needs to enact this agenda? With Jobs-as-Code, you can answer both questions.

Data Quality Management: An Introduction

More business leaders are becoming aware of the tremendous impact big data has on the trajectory of the enterprise organization as it relates to:

  • Predicting customer expectations;
  • Assisting with effective product management;
  • Being available on-demand to influence top-down decision making;
  • Tailoring customer service innovations by investigating shopping habits of customers; and
  • Providing organizations with competitor information

However, there’s one big caveat: if your data isn’t accurate, complete, and consistent, it can lead to major missteps when making business decisions. In fact, Gartner estimates the average financial impact of poor data quality on businesses at $15 million per year,¹ which means you can’t afford not to make data quality management a priority starting right now.

What is Data Quality Management?

Data quality management (DQM) refers to a business principle that requires a combination of the right people, processes, and technologies, all with the common goal of improving the measures of data quality that matter most to an enterprise organization. That last part is important: the ultimate purpose of DQM is not to improve data quality for the sake of having high-quality data, but to achieve the business outcomes that depend on it. The big one is customer relationship management, or CRM. As is often said, CRM systems are only as good as the information they contain.

A Foundation for High-Quality Data

Effective data quality management requires a structural core that can support data operations. Here are five foundational principles for implementing high-quality big data within your data infrastructure:

#1 Organizational Structure

IT leadership should consider the following roles when implementing DQM practices across the enterprise:

DQM Program Manager: This role sets the tone with regard to data quality and helps to establish data quality requirements. He or she is also responsible for keeping a handle on day-to-day data quality management tasks, ensuring the team is on schedule, within budget and meeting predetermined data quality standards.

Organization Change Manager: This person is instrumental in the change management shift that occurs when data is used effectively, and helps make decisions about data infrastructure and processes.

Data Analyst I or Business Analyst: This individual interprets and reports on data.

Data Steward: The data steward is charged with managing data as a corporate asset.

#2 Data Quality Definition

Very simply, if you don’t have a defined standard for quality data, how can you know whether you’re meeting or exceeding it? What data quality means varies across industries and from organization to organization, but defining these rules is essential to the successful use of business intelligence software.

Your organization may wish to consider the following characteristics of high-quality data when creating your data quality definitions (a brief measurement sketch follows the list):

  • Integrity: how does the data stack up against your pre-established data quality standards?
  • Completeness: how much of the required data has actually been acquired?
  • Validity: does the data conform to the formats and value ranges defined for its data set?
  • Uniqueness: how often does the same piece of data appear, unintentionally duplicated, within a set?
  • Accuracy: does the data reflect the real-world values it is supposed to describe?
  • Consistency: does the same data hold the same value across different data sets?
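Several of these characteristics lend themselves to simple, repeatable measurement. The sketch below assumes a pandas DataFrame of customer records; the column names, the duplicate key, and the email pattern are hypothetical stand-ins for your own rules.

```python
import pandas as pd

# Hypothetical customer records; in practice this would be loaded from
# your warehouse or CRM export.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@x.com", "not-an-email"],
    "country": ["US", "DE", "DE", "US"],
})

# Completeness: share of non-null values per column.
completeness = df.notna().mean()

# Uniqueness: share of rows that are not duplicates on the chosen key.
uniqueness = 1 - df.duplicated(subset=["customer_id"]).mean()

# Validity: share of emails matching a (deliberately simple) pattern.
validity = df["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False).mean()

print(completeness.round(2).to_dict())
print(f"uniqueness={uniqueness:.2f}, email validity={validity:.2f}")
```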

In addition, to ensure these characteristics are satisfied each time, experts in data protection recommend the following guiding governance principles when implementing your DQM strategy:

  • Accountability: who’s responsible for ensuring DQM?
  • Transparency: how is DQM documented and where are these documents available?
  • Protection: what measures are taken to protect data?
  • Compliance: which compliance agencies or regulations verify that governance principles are being met?

#3 Data Profiling Audits

Data profiling is an audit process that ensures data quality. During this process, auditors validate the data against metadata and existing measures, then report on its quality. Conducting data profiling routinely is a sure way to keep your data at the quality needed to stay ahead of the competition.
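A minimal profiling pass can be as simple as comparing a dataset against the metadata it is expected to conform to and reporting every mismatch. In the hedged sketch below, the expected dictionary stands in for whatever metadata store your organization actually uses.

```python
import pandas as pd

# Illustrative dataset and the metadata we expect it to conform to.
orders = pd.DataFrame({
    "order_id": [100, 101, 102],
    "amount": [25.0, -3.0, 7500.0],
})
expected = {
    "order_id": {"dtype": "int64"},
    "amount": {"dtype": "float64", "min": 0.0, "max": 5000.0},
}

findings = []
for col, rules in expected.items():
    if col not in orders.columns:
        findings.append(f"{col}: missing column")
        continue
    if str(orders[col].dtype) != rules["dtype"]:
        findings.append(f"{col}: dtype {orders[col].dtype}, expected {rules['dtype']}")
    if "min" in rules and (orders[col] < rules["min"]).any():
        findings.append(f"{col}: values below allowed minimum {rules['min']}")
    if "max" in rules and (orders[col] > rules["max"]).any():
        findings.append(f"{col}: values above allowed maximum {rules['max']}")

# The profiling report is what the auditors review and act on.
print("\n".join(findings) or "no findings")
```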

#4 Data Reporting and Monitoring

For most organizations, this refers to the process of monitoring, reporting on, and recording exceptions. Business intelligence (BI) software can capture these exceptions automatically, flagging bad data before it becomes usable.
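One lightweight way to implement this is an exception log: each record that fails a rule is recorded with a timestamp and a reason instead of being silently passed through. The rule and the log format below are illustrative.

```python
import csv
from datetime import datetime, timezone

records = [
    {"id": "1", "email": "a@x.com"},
    {"id": "2", "email": ""},   # fails the rule below
]

def has_email(rec: dict) -> bool:
    return bool(rec["email"].strip())

# Append each failing record to an exceptions file that reviewers monitor.
with open("dq_exceptions.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for rec in records:
        if not has_email(rec):
            writer.writerow([datetime.now(timezone.utc).isoformat(),
                             rec["id"], "missing email"])
```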

#5 Correcting Errors

Once potentially bad or incomplete data has been sorted out by BI systems, it’s time to make the appropriate corrections: completing the data, removing duplicates, or addressing whatever other issue was found.
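In practice, these corrections often reduce to a few standard operations. A small sketch, again assuming pandas and hypothetical columns: remove exact duplicates and complete missing values with an explicit placeholder, leaving anything that can’t be fixed automatically for human review.

```python
import pandas as pd

contacts = pd.DataFrame({
    "email": ["a@x.com", "a@x.com", "b@x.com"],
    "country": ["US", "US", None],
})

# Remove exact duplicates, keeping the first occurrence.
cleaned = contacts.drop_duplicates()

# Complete missing countries with an explicit placeholder rather than a
# silent guess, so downstream users can tell the value was imputed.
cleaned = cleaned.fillna({"country": "UNKNOWN"})

print(cleaned)
```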

Five Best Practices for Data Quality Management

For businesses starting the data quality management process, here are five best practices to keep in mind:

#1 Review Current Data

It’s likely you have a lot of customer data to begin with. You don’t want to toss it out and start over, but, as they say in the tech world, “garbage in, garbage out.”

The last thing you need is to fill your new data infrastructure with bad insights. So when you’re getting started with data quality management, audit your current data: take inventory of inconsistencies, errors, and duplicates, and record and correct any problems you come across, so that the data that goes into your infrastructure is as high-quality as it can be.

#2 Data Quality Firewalls

A firewall is an automated process that blocks a figurative fire, which in this case is bad data. Putting up a firewall to protect your organization against bad data helps keep the system clear of errors.

User error is easy to make, and a firewall helps prevent it by blocking bad data at the point of entry. The number of people allowed to feed data into the infrastructure strongly affects its quality, yet in many large organizations multiple entry points are a necessity.

A firewall helps data stay error-free even when many people have access to enter it.
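In code, a data quality firewall is simply validation at the point of entry: every entry point calls the same gate, and a record is either accepted or rejected with a reason. The checks below are illustrative.

```python
def firewall(record: dict) -> tuple[bool, str]:
    """Accept or reject a record at the point of entry."""
    if not record.get("customer_id"):
        return False, "customer_id is required"
    if "@" not in record.get("email", ""):
        return False, "email looks malformed"
    return True, "ok"

# Every entry point (web form, batch import, API) calls the same gate,
# so one set of rules protects the whole infrastructure.
for rec in [{"customer_id": "42", "email": "a@x.com"},
            {"customer_id": "", "email": "a@x.com"}]:
    accepted, reason = firewall(rec)
    print(accepted, reason)
```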

#3 Integrate DQM with BI

In today’s business culture, the buzz is all about integration. And why shouldn’t it be?

When systems work together they work better. The idea here is that no enterprise business can justify the resources required to comb each and every data record for accuracy all the time. But integrating the DQM process with BI software can help to automate it. Based on predetermined parameters, certain datasets can be isolated for review; for instance, new data sets that are likely to be accessed often can be audited as part of the DQM cycle.
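As a rough sketch of that idea, the snippet below queues only the datasets whose access counts exceed a threshold for the next DQM review cycle; the statistics and the threshold are hypothetical.

```python
# Hypothetical access statistics exported from the BI tool.
access_counts = {"sales_daily": 1400, "hr_archive": 3, "web_events": 900}

AUDIT_THRESHOLD = 500  # illustrative: audit anything read >500 times/month

to_audit = sorted(name for name, hits in access_counts.items()
                  if hits > AUDIT_THRESHOLD)
print("datasets queued for DQM review:", to_audit)
```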

#4 Put the Right People in Place

As described above, several positions within your organization have accountability for the data quality process. Ensuring these positions are seated and dedicated to the job means ensuring governance standards can be met consistently.

#5 Ensure Data Governance with a Board

Creating a data governance board helps protect businesses from the risks inherent in data-driven decision making. The panel should consist of business and IT users and executives. The group sets the policies and standards that become the cornerstone of data governance.

In addition, the data governance board should meet periodically to set new data quality goals and monitor the success of DQM initiatives across the various lines of business (LOBs). This is where an objective measurement scale comes in handy: to improve data quality, there must be a way to measure it.

Data Quality Management is a Marathon, Not a Sprint

Big data is an important component of doing business in today’s digital world. It offers customer and competitor insights that can’t be achieved with any other tools or resources.

Because of its high velocity, big data is accessible to business leaders who can use it to make decisions in real time. But for the same reason, it’s also associated with business risks that need to be managed properly, and DQM is one effective tool for achieving just that.

Overall, DQM offers many benefits to your organization:

  • More efficient business processes: you get the right data the first time.
  • Better business outcomes: DQM gives you a clearer view of what’s going on with your customers, vendors, marketers, and more.
  • More confidence: DQM helps drive better-informed business decisions.

With these considerations in mind, it is also important to remember that DQM is an ongoing process that requires continuous data monitoring and reporting.

¹ Susan Moore, “How to Create a Business Case for Data Quality Improvement,” Smarter with Gartner, June 19, 2018. https://www.gartner.com/smarterwithgartner/how-to-create-a-business-case-for-data-quality-improvement/

Why Collaboration and Automation are Key to DevOps


Imagine what Wolfgang Amadeus Mozart, one of the most famous and prolific composers in history, would say if you asked him about the power of meticulous orchestration. He might tell you that each note performed a “job,” which had to occur at a specific time without fail. Mozart’s works brought together violins, horns, flutes, trumpets, and more to enhance the magical flow of his compositions. Good timing with just the right mix of sounds is also what makes different styles of music appealing – whether it’s from U2, Taylor Swift, Beyoncé, Garth Brooks, Miles Davis, Coldplay, the Beatles, or Mozart.

Just thinking about the process of orchestrating and integrating instruments made me consider how this process relates to digital technology, DevOps, and the cloud. Okay, I must confess that I was also inspired by listening to soothing music while drinking a few cups of coffee when this idea occurred to me. This connection isn’t really a stretch, either. I’ll explain why.

Collaboration cuts through chaos

Without the right level of orchestration and integration, as well as automation, your enterprise can’t keep up with business demands, and your DevOps teams struggle to collaborate. (Note that I added automation to the mix, which wasn’t much of a priority in the 1700s. So Mozart is off the hook for not automating his compositions.)

The right processes and tools help organizations collaborate better so they can grow and meet the demands of a complex, chaotic digital environment. To succeed, DevOps teams must efficiently coordinate their efforts to deliver high-quality, new applications and services quickly. Just as musicians have practice sessions to eliminate mistakes before performing in front of an audience, developers need to take a Jobs-as-Code approach to develop, test, and address issues before applications move into production.

Managing workloads across disparate platforms and teams needs to be done with agility and speed. As I mentioned earlier, DevOps teams can take application integration and orchestration to the next level with automation. That’s where cross-platform job scheduling comes in: managing and monitoring application workflows, with the ability to integrate every aspect from a single point of control. A platform that delivers these capabilities through an intuitive API creates a harmonic convergence of technology and teams, driving revenue and keeping developers and operations in sync.

Go from the data center to the cloud

Let’s look at how this all plays out in multi-cloud environments. It involves moving legacy applications to the cloud and developing new applications that use cloud services. That presents some challenges for scheduling workflows:

  • Migrating legacy applications can require rework of thousands of workflows for scheduling in the cloud.
  • Many workflows may not be managed with the same tools and processes.
  • IT must manage workflows across both the legacy environment and the new multi-cloud environment, before and frequently after the transition to the cloud.

Let the Concert Begin with Control-M

Control-M integrates, automates, and orchestrates digital business application services, enabling DevOps teams to overcome these and other pressing challenges. So, when services are moved between on-premises environments and any cloud, they work in concert to ensure that data and workflows move correctly (see the sketch after this list):

  • IT can move workflows from traditional environments to the cloud without rework.
  • Developers and DevOps engineers can create, test, debug, and manage service workflows using the same intuitive Control-M API.
  • Cloud-based applications that include workflows can be developed faster and more efficiently to help companies gain a competitive edge.
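To make the “without rework” idea concrete, here is a hedged sketch of a CI step that pushes a versioned definitions file to a workflow scheduler’s REST API, validating first and deploying only on success. The base URL, endpoint names, credential, and payload shape are illustrative placeholders, not the documented Control-M Automation API.

```python
import requests

BASE = "https://scheduler.example.com/api"  # hypothetical base URL
TOKEN = "REPLACE_ME"                        # hypothetical credential

def deploy_definitions(path: str) -> None:
    with open(path, "rb") as f:
        # Ask the server to validate the definitions ("build") first,
        # then promote them ("deploy") only if validation passes.
        for step in ("build", "deploy"):
            resp = requests.post(f"{BASE}/{step}",
                                 headers={"Authorization": f"Bearer {TOKEN}"},
                                 files={"definitionsFile": f})
            resp.raise_for_status()
            f.seek(0)  # rewind so the same file can be sent twice

deploy_definitions("workflows.json")
```

Because the same definitions file travels unchanged from the developer’s repository through validation to production, the workflow itself needs no rework along the way.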

One large global leader in travel services, which you may have heard of – Amadeus – developed a new operating model that leverages Control-M. The company is able to support migration to a new cloud architecture, meet the demands of explosive growth, and accelerate the delivery of new services. In the process, Amadeus saved $37 million in the conversion effort alone, based on 250,000 existing jobs, with additional savings in other areas to date. Now, that’s music to anyone’s ears.

Big Data, the Internet of Things, and Extracting Insights that Create Value


There’s no end in sight to the growth of connected devices (also known as the Internet of Things, or IoT), and this expansion can have a big impact on your business. For example, sensors can track data on shopper behavior in stores to maximize revenue or monitor manufacturing processes to ensure quality. Companies that can pull and process data from these devices to gain meaningful insights will have a distinct competitive advantage. How significant is this growth in IoT? Forrester forecasts that connected devices (such as connected cars and utility meters) will outnumber cellphones by 2018, growing from less than 5 billion in 2015 to almost 16 billion by 2021.¹ Yet while the focus on IoT tends to be all about connected devices, the data they produce is far more valuable. In fact, again according to Forrester, IoT is expected to deliver a potential impact of at least $3.9 trillion by 2025.² Yes, that’s $3.9 TRILLION.

Enter Hadoop
Forrester describes how Hadoop plays an important and growing role in extracting insight from IoT. The firm also explains how solutions for processing and obtaining insight from streaming data and connected devices open much greater opportunities for business. Control-M for Hadoop from BMC, for example, automates and accelerates Hadoop data workflows to reduce costs and speed service delivery. With IoT expanding at such a rapid pace, being able to deliver data on time, every time, between business-critical applications will help drive business success.
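As a concrete illustration of the kind of step such a workflow automates, here is a minimal PySpark job that rolls streamed sensor readings up into per-device averages. The input path, schema, and output location are hypothetical; in practice a job like this would run as one node in a scheduled workflow rather than by hand.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iot-sensor-rollup").getOrCreate()

# Hypothetical newline-delimited JSON landed in HDFS by the ingestion layer,
# e.g. {"device_id": "a1", "temp_c": 21.5, "ts": "2016-10-27T13:00:00Z"}
readings = spark.read.json("hdfs:///iot/raw/2016-10-27/*.json")

# Per-device aggregates: the batch output a BI dashboard might read.
rollup = (readings
          .groupBy("device_id")
          .agg(F.avg("temp_c").alias("avg_temp_c"),
               F.count("*").alias("n_readings")))

rollup.write.mode("overwrite").parquet("hdfs:///iot/rollups/2016-10-27")
spark.stop()
```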

Unlock the Value of IoT
Forrester highlighted some examples of how companies are delivering value by providing timely analysis of data as it streams from remote sensors. These include:

  • Integrating electric car charging into renewable energy grids more effectively, using better weather prediction and communication flows within the power grid to predict and manage energy requirements.
  • Improving the diagnosis and treatment of asthma with connected devices that are better able to survey the environments in which they will be used.
  • Moving from monthly utility consumption reports to real-time dashboards in smart homes that will ultimately deliver real-time insight to millions of customers.

How to Transform Business with Insights from the IoT
What do organizations need to do to reap the benefits of IoT? They need systems that provide insight into customer content and behavior from IoT data, extracted cost-effectively and analyzed rapidly and effectively. Without the capability to process Hadoop workloads in motion quickly and cost-effectively, the opportunity to create meaningful competitive advantage is wasted. That’s why it’s important to consider how enterprise workload automation can automate and accelerate Hadoop batch workflows to capture greater competitive value from this data. Get more details in the Forrester brief: Streaming Data from the Internet of Things Will Be the Big Data World’s Bigger Second Act.

¹ Forrester, Brief: Streaming Data From The Internet Of Things Will Be The Big Data World’s Bigger Second Act, July 5, 2016.
² Ibid.
