Pitfalls to avoid when adopting DevOps

As DevOps continues to grow in popularity, some organizations still struggle with its successful implementation. Perhaps developers really don’t understand the practice and chafe at being forced to follow its concepts? Maybe the network engineers feel DevOps favors the software team while automating away many of their standard administrative tasks?

Whatever the reasons adoption proves difficult, getting things right offers many benefits to software shops of all sizes. DevOps plays a key role in boosting development efficiency to the point it becomes a competitive advantage. So, let’s take a look at a few common pitfalls to avoid when adopting DevOps.

Avoid these Mistakes when adding DevOps at your Software Shop

Rebecca Dodd, from the software development process experts at GitLab, wrote an article for DZone covering these major pitfalls to avoid during a DevOps implementation. She talked with a few people at GitLab responsible for project success with their customer base. They provided interesting food for thought on what issues hamper DevOps adoption.

Focusing Too Much on the Tools

GitLab noted that companies that invest too heavily in their toolset tend to encounter difficulty when implementing DevOps. GitLab Technical Account Manager John Woods commented on the issue. “You think you have it all when you’ve got your issue tracker, version control system, CI/CD service, etc. However, what’s the cost of setting all those up and configuring them to ‘talk’ to each other?” said Woods.

In essence, configuring and integrating multiple tools consumes valuable time and resources. GitLab calls this the “DevOps Tax.” Make it a point to choose tools that support your DevOps policies and procedures, not the other way around.
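To make the contrast concrete, here is a rough sketch of what an integrated toolchain can look like: a single `.gitlab-ci.yml` file driving build, test, and deploy from the same repository that holds the code and issue tracker. The job names and `make` targets below are placeholders, not part of the article.

```yaml
# Hypothetical .gitlab-ci.yml sketch: version control, issue tracking,
# and CI/CD live in one platform, so the stages below need no extra
# integration glue between separate tools.
stages:
  - build
  - test
  - deploy

build_app:
  stage: build
  script:
    - make build

run_tests:
  stage: test
  script:
    - make test

deploy_staging:
  stage: deploy
  script:
    - make deploy ENV=staging
  environment: staging
  only:
    - main
```

When the pipeline definition lives alongside the code it builds, there is simply less plumbing to pay the “DevOps Tax” on.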

In a similar fashion, some companies simply become too attached to their development tools. This adds difficulty if those tools aren’t really compatible with the unique DevOps methodology. GitLab notes some customers try to wrench decades-old tools into their fledgling modern workflow.

Ultimately, the smartest tack involves finding the right integrated toolset compatible with how software gets written in a DevOps world.

Deployment and Monitoring are as Important as Development and Testing

Another pitfall noted by Dodd involves companies not covering the entire SDLC when adopting DevOps. Instead, they follow its principles only for software development and QA, ignoring them for the deployment and monitoring processes. Ultimately, this isn’t a true DevOps implementation.

In most cases, companies leverage DevOps to achieve continuous integration or continuous delivery. Reaching these goals isn’t possible without a full adoption of the methodology. In short, go hard or go home!

Security needs to be part of the DevOps Equation

We previously talked about the importance of information security as part of any DevOps implementation. This is one of the reasons DevSecOps is a hot buzzword. These days, cybersecurity needs to be a core concept within any software development practice – DevOps or not.

GitLab notes that companies adopting DevOps who still treat security as an afterthought ultimately struggle with its implementation. Valuable resources end up making security-related fixes at the last minute. Consider a DevSecOps approach.

Ultimately, steer clear of these pitfalls to ensure your DevOps adoption goes smoothly!

Keep coming back to the Betica Blog for additional insights and dispatches from the wide world of software development. Thanks for reading!

News from the World of Software Development – February 2017

This fresh edition of the Betica Blog news digest contains a few interesting stories from an endlessly fascinating software development world. If interested, here is a link to last month’s article. Use these insights and ideas at your own shop to stay on the forefront of an ever-changing industry.

Developers and QA Engineers on the Frontlines of the Battle for Cybersecurity

Earlier this month, CIO Magazine reported on how software engineers and QA personnel can improve their efforts to prevent cybercriminals and other nefarious agents from hacking their systems and technical infrastructure. This battle is especially fierce considering the growing number of devices connected to the Web because of the Internet of Things (IoT) and mobile technology. Stronger coding practices and more thorough software testing are key factors in protecting applications.

Chris Wysopal, co-founder and CTO of the software security firm, Veracode, commented on the importance of stronger code and testing when considering cybersecurity. “In today’s technology environment, application security testing for vulnerabilities and flaws in software code should be a security best practice, regardless of an organization’s size or industry,” said Wysopal. Unfortunately, a survey by his company reported 83 percent of the respondents deployed code without a full vetting of the underlying application security.

The article noted companies must require developers to perform code reviews focused on security. Additionally, state-of-the-art QA techniques, like static and dynamic application testing as well as white-hat testing, are needed to ensure an application is sufficiently protected before it’s released into production. While automated testing tools help somewhat, humans also need to be involved to ensure the highest possible level of security.
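The article stays at the level of practices, but a security-focused code review often comes down to details like the one below. Here is a minimal Python sketch, assuming a hypothetical `users` table, of the parameterized query such a review should insist on in place of string concatenation:

```python
import sqlite3

def find_user(conn, username):
    # Parameterized query: the user-supplied value is bound as data,
    # never concatenated into the SQL string, so input such as
    # "alice' OR '1'='1" cannot change the query's logic.
    cur = conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    )
    return cur.fetchall()

# Demo against a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

print(find_user(conn, "alice"))             # matches only alice
print(find_user(conn, "alice' OR '1'='1"))  # injection attempt matches nothing
```

A static analysis tool can flag concatenated SQL automatically, but a human reviewer is still the one who decides whether each query handles untrusted input correctly, which is exactly the division of labor the article describes.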

CIO reported that the Open Web Application Security Project (OWASP) provides a valuable resource for companies looking to improve their cybersecurity efforts. It offers practical information on the best practices for ensuring an application’s code is safe. Ultimately, this freely-available information is vital for winning the war against hackers and other cybercriminals, especially concerning the current shortage of application security talent in the IT industry.

Is “Low-Code” the Next Wave in Software Development?

The problems discovered when maintaining and enhancing legacy applications have led to a new paradigm focused on using tools that assemble pre-written functionality into a complete application. In a sense, this is a streamlined and highly-automated take on the current microservices trend in the industry. SiliconANGLE discussed low-code software development in a February article.

The app used by the ride-sharing service, Uber, is a highly public example of an application developed using low-code techniques. It pieces together functionality from a variety of sources, including Box Inc.’s Cloud storage, Google Inc.’s Maps, payment services from Braintree, Twilio for messaging, and SendGrid’s email services. Many pundits feel the flexibility offered by the low-code model suits today’s competitive business era better than traditional application coding techniques.

The industry research analyst group, Forrester, predicts the low-code software market will grow to more than $10 billion within the next two years. “The market for these [low-code] platforms is growing fast, but selecting a platform that actually delivers without creating a [fourth-generation programming language]-like orphan in the software portfolio isn’t easy,” said Forrester. Obviously, this makes it a trend worth watching at your software development shop.

Keep coming back to the Betica Blog for additional news and information on the expanding software development universe. Thanks for reading!

Microservices – a Flexible Architecture for the Continuous Deployment Era

As more modern businesses embrace new organizational structures like DevOps, with a goal of achieving the continuous deployment of software, SOA architectures are becoming more granular. Microservices is a term used to describe these lightweight, highly portable applications used to build larger systems. Each microservice typically runs in its own process, communicating with other microservices using a protocol, such as HTTP.
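Dozens of frameworks exist for building services like this, but the core idea needs very little machinery. Here is a minimal, self-contained Python sketch (the “inventory” service and its stock data are invented for illustration) of one microservice answering another over HTTP:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A hypothetical "inventory" microservice: a small process that owns one
# business capability and exposes it to other services as JSON over HTTP.
STOCK = {"widget": 12, "gadget": 3}

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        item = self.path.lstrip("/")
        body = json.dumps({"item": item, "in_stock": STOCK.get(item, 0)})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Run the service in a background thread for this demo; in production,
# each microservice would run as its own OS process or container.
server = HTTPServer(("localhost", 0), InventoryHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# A consuming service needs nothing more exotic than an HTTP GET.
with urllib.request.urlopen(f"http://localhost:{port}/widget") as resp:
    result = json.loads(resp.read())
print(result)  # {'item': 'widget', 'in_stock': 12}
server.shutdown()
```

Because the service owns its data and speaks only HTTP, it can be rebuilt, redeployed, or scaled without touching any of its consumers.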

Like many newer technology industry buzzwords, it is hard to explicitly define microservices, but enough common attributes exist to provide a high-level overview. Perhaps this architectural approach makes sense for your team’s next application design?

An Architecture to better support a Scalable Internet

The esteemed software architecture pundit, Martin Fowler, describes how the need for microservices grew out of the hassle of making relatively minor changes to large monolithic applications running in the Cloud. For example, a simple UI change required all the components in the application to be rebuilt and redeployed across multiple servers.

Improved scalability in a Cloud-based distributed environment is another major advantage of microservices. Older applications required all of their components to be scaled together. Software designed using microservices, on the other hand, only needs the most resource-intensive portions of the application scaled.

The fact that each microservice is individually deployable ultimately makes this process easier to manage for build engineers.

Improved Flexibility when designing Applications

Being able to leverage collections of microservices is a boon for organizations looking at code reuse for quickly architecting, designing, and building a web-based application. This echoes some of the original promises of SOA – or even piecing together desktop software using components – but the improved granularity of a smaller microservice works better in this era of the Cloud. 

Using microservices also makes it easier to organize an application’s architecture. Fowler notes many enterprises create teams based on the business capability for a microservice. This means each cross-functional team includes personnel responsible for the UX, database, middleware, etc.

From an organizational standpoint, this is a structure similar to the Agile Tribes concept used at the Internet music streaming company, Spotify. Fowler mentioned that companies organizing their software development teams around their chosen application architecture is another example of Conway’s Law influencing the software engineering process – a concept we talked about last year.

Designed for Continuous Delivery

As mentioned earlier, application design using microservices helps organizations achieve a continuous delivery model compared to older software architectures. Given a scenario where only a small portion of a microservice needs updating, it is easier to rebuild that granular piece instead of an entire application. Organizations are able to leverage automated test and build routines to streamline the entire process.

Still an Emerging Software Development Model

Fowler feels it is too soon to anoint microservices as the future of software development. “While our experiences so far are positive compared to monolithic applications, we’re conscious of the fact that not enough time has passed for us to make a full judgment. Often the true consequences of your architectural decisions are only evident several years after you made them,” said Fowler.

There’s no denying that microservices architecture is worthy of further analysis by your software development organization. It just may be the missing link on your path to highly scalable and easily deployable applications.

Keep returning to the Betica Blog for additional insights on the software development world. Thanks for reading!