Building Smart DevSecOps Pipelines For Fast Deployment

Challenges we face in DevSecOps pipelines

DevSecOps pipelines aim to integrate security and compliance into each phase of the software development life cycle. Throughout this process, we collect a lot of valuable data from many sources.

● Project managers collect data around cost, resourcing, and time.

● Product owners collect data from issue tracking systems.

● Software architects collect data from static analysis.

● Developers collect data from functional test results.

● QA/testers collect various test cases and reports.

If you step back, you will notice that people are collecting and using data throughout the software delivery pipeline.

The challenge is that everybody uses the data for their own needs. They either export the data to a spreadsheet or add automation for their specific context. There is no holistic, automated, shared learning with this approach; the learning largely takes place in people’s minds. In practice, DevSecOps definitely helps teams deploy faster, but most of the decisions are still made by experts.

We know this happens because if we ask, “What is the estimate?”, we get a vague response like, “Here’s my estimate based on experience.” This approach does not scale to the speed we are trying to attain in software delivery, and a siloed model of knowledge creation and learning isn’t sustainable in the long term.

Managing business value and risk through the DevSecOps life cycle

By adding intelligence to our DevSecOps pipelines, we can create smart software pipelines: the next stage in the evolution of DevSecOps automation.

The goal is to help us get smarter about secure software development. We can do this by leveraging emerging concepts like machine learning while still being lean and agile. We should help people become more efficient by using the data generated from the software delivery life cycle.

There are a lot of benefits to adding this intelligence.

Getting a prediction based on patterns saves time. It increases the speed of software delivery. We also get higher accuracy and quick learning from our mistakes. It can eliminate many manual processes in every phase of the life cycle.

One of the biggest areas for improvement lies in the planning stage. If a plan does not reflect what we are actually doing, it means very little. Instead, we should be able to dynamically forecast based on past patterns of behavior. Risk is another example.
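As a minimal sketch of what forecasting from past behavior could look like, the snippet below uses an exponentially weighted average over historical cycle times to predict the next one. The sample data, the smoothing factor, and the function name are illustrative assumptions, not anything prescribed by a particular tool:

```python
# Sketch: forecast the next delivery cycle time from past behavior,
# instead of relying on an expert's gut estimate.
# All numbers below are hypothetical sample data.

def forecast_cycle_time(past_cycles, alpha=0.5):
    """Exponentially weighted forecast: recent cycles count more than old ones."""
    if not past_cycles:
        raise ValueError("need at least one historical data point")
    forecast = past_cycles[0]
    for observed in past_cycles[1:]:
        # Blend the newest observation with the running forecast.
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

# Hypothetical cycle times (in days) for the last five releases.
history = [12.0, 10.0, 11.0, 9.0, 8.0]
print(round(forecast_cycle_time(history), 2))
```

Even a simple model like this replaces “here’s my estimate based on experience” with a number that is reproducible and improves automatically as more releases feed it.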

What is the impact of risk on an organization? If you don’t implement a specific security control, what are the consequences? Based on historical data analysis, we should be able to get a better understanding of the impact. In the end, when there is clear alignment with business risk, you can defend why something was done, how it was done, and what components were used. This also helps teams be proactive about deploying secure applications.

Having multiple sources of risk insight can also help reduce organizational risk. For instance, in the past we would use code analysis to produce a high, medium, or low risk profile. But if your dominant risk differs from somebody else’s, your risk score should reflect that. A risk score based on risks across the entire DevSecOps life cycle will add value to your pipelines.

A myopic risk score doesn’t really mean anything unless we tie it to organizational risk scores.
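One way to tie tool-level findings to organizational risk is to weight each pipeline stage by how much the business cares about it. The sketch below assumes hypothetical stage names, severity weights, and business weights; real values would come from your own risk register:

```python
# Sketch: combine per-stage risk findings into one organizational risk score.
# Severity weights and business weights here are illustrative assumptions.

SEVERITY = {"low": 1, "medium": 3, "high": 9}

def stage_score(findings):
    """Sum severity-weighted finding counts for one pipeline stage."""
    return sum(SEVERITY[sev] * count for sev, count in findings.items())

def organizational_risk(stage_findings, business_weights):
    """Weight each stage's raw score by its business importance."""
    return sum(
        business_weights[stage] * stage_score(findings)
        for stage, findings in stage_findings.items()
    )

# Hypothetical findings from three life-cycle stages.
findings = {
    "static_analysis": {"high": 2, "medium": 5},
    "dependency_scan": {"high": 1, "low": 10},
    "runtime_tests":   {"medium": 3},
}
# Hypothetical business weights: this organization cares most about code flaws.
weights = {"static_analysis": 0.5, "dependency_scan": 0.3, "runtime_tests": 0.2}
print(organizational_risk(findings, weights))
```

Because the business weights live outside the individual tools, the same findings can yield a different score for an organization whose dominant risk is different, which is exactly the point of moving beyond a myopic per-tool score.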

Implementing a smart DevSecOps pipeline

Implementing a smart software pipeline does not mean throwing away existing DevSecOps tools. In fact, we already have a lot of excellent tools. Almost every tool reports data through log files or an API, which we can feed into a data lake to learn from and gain insights.

These insights provide feedback to all other stakeholders as well. So, we can still use existing DevSecOps tools but with a different mindset toward smarter software delivery. We create layer upon layer in the data lake and connect the dots. This is the next logical evolution of DevSecOps.
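A small sketch of that “connect the dots” step: normalizing each tool’s output onto a shared record format before it lands in the data lake. The field names and the example payload are assumptions for illustration; a real pipeline would pull these from each tool’s actual API or log format:

```python
# Sketch: map tool-specific output onto one shared data-lake schema,
# so findings from different DevSecOps tools can be analyzed together.
# Field names and the sample payload are hypothetical.

import json

def normalize(tool, payload):
    """Project a tool-specific result onto the shared record format."""
    return {
        "tool": tool,
        "stage": payload.get("stage", "unknown"),
        "severity": payload.get("severity", "info"),
        "message": payload.get("message", ""),
    }

# Hypothetical raw JSON from a static-analysis tool's API.
raw = '{"stage": "build", "severity": "high", "message": "SQL injection risk"}'
record = normalize("static-analyzer", json.loads(raw))
print(json.dumps(record))
```

Layering records like these in the data lake is what lets insights flow back to every stakeholder instead of staying locked inside one team’s spreadsheet.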

The future of DevSecOps

Eventually, every stakeholder in DevSecOps will need to analyze data. We are generating a lot of data from every component in every pipeline, and we now need to learn from the behavioral data we generate. With a smart software pipeline, we can radiate insights that help us make more informed decisions.

Work is already being done to build smart software pipelines from a continuous delivery perspective. The focus right now is on what type of data needs to be collected around delivery management. We envision that eventually there will be many tools with analytical dashboards to help us reduce the ambiguity in DevSecOps.

You can listen to our podcast to learn more about building smart software delivery pipelines.

About the Author

Hasan Yasar

Hasan Yasar is the Technical Director of the Continuous Deployment of Capability group at the Software Engineering Institute, Carnegie Mellon University. Hasan leads an engineering group to enable, accelerate, and assure transformation at the speed of relevance by leveraging DevSecOps, Agile, Lean, AI/ML, and other emerging technologies to create a smart software platform/pipeline. He has more than 25 years’ experience as a senior security engineer, software engineer, software architect, and manager in all phases of secure software development and information modeling processes. He is also an Adjunct Faculty member at CMU’s Heinz College and Institute for Software Research, where he currently teaches “Software and Security” and “DevOps: Engineering for Deployment and Operations”.
