Add Chef to your Organization’s DevOps Kitchen

Companies looking at DevOps with the hopes of streamlining their software development process sometimes struggle with the initial implementation. Leveraging the right set of DevOps tools is an important factor in achieving success as much as any organizational or policy-based changes. One such tool – known as Chef – is especially helpful for shops taking advantage of the Cloud as part of their overall application engineering strategy.

What follows is a closer look at the features and functionality of Chef, so you can decide whether it can help your team manage server infrastructure more quickly than ever before.

Open Source Server Configuration Management for the Cloud… and more

Chef’s main functionality centers on the management of Cloud-based infrastructure. It offers value to any company, whether it manages ten servers or ten thousand – no matter the platform. It lets your development staff focus on ensuring their software runs properly, instead of dealing with the drudgery of server administration tasks. While it truly shines in the Cloud, Chef also works with on-premises servers as well as hybrid infrastructure.

A Code-based Approach to Server Management

What makes Chef unique among similar infrastructure management tools is its emphasis on using code to define and automate a collection of servers. This lets you handle automated server management in much the same fashion as your applications, with development, QA, and production environments ensuring a high level of quality. Additionally, letting your developers manage servers using code fits nicely with the overall philosophy of DevOps, where formerly segregated duties are handled in a more communal fashion.

A development kit, known as the Chef DK, includes everything required to develop and test infrastructure automation code. Test Kitchen handles the running of these tests, with InSpec providing the test-driven development (TDD) language. Not surprisingly, the included code analysis tool is known as Foodcritic.
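To give a flavor of what such a test looks like, here is a minimal InSpec sketch that Test Kitchen could run against a node; the nginx package and port 80 are purely illustrative assumptions, not details from any particular cookbook:

```ruby
# test/integration/default/web_test.rb -- an illustrative InSpec verification
# Checks that a hypothetical web server cookbook converged the node correctly.

describe package('nginx') do
  it { should be_installed }      # the package manager reports nginx installed
end

describe service('nginx') do
  it { should be_enabled }        # starts on boot
  it { should be_running }        # currently running
end

describe port(80) do
  it { should be_listening }      # the server is accepting connections
end
```

Because these checks read like plain English, even team members who don’t write Ruby can follow what the infrastructure is expected to look like.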

Continuing with this kitchen metaphor, the collection of code used to automate and define a server infrastructure is known as a cookbook, and – of course – cookbooks are made up of recipes. This nomenclature definitely helps developers new to Chef better understand the functionality of each part of the system. Behind this somewhat humorous style lies a very powerful tool.
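As an illustration, a simple recipe might look like the following sketch – a hypothetical example that installs and runs the nginx web server, not a recipe from any official cookbook:

```ruby
# recipes/default.rb -- an illustrative Chef recipe
# Declares the desired state of a node: nginx installed, configured, and running.

package 'nginx' do
  action :install                  # install via the platform's package manager
end

template '/etc/nginx/nginx.conf' do
  source 'nginx.conf.erb'          # an ERB template shipped in the cookbook's templates/ directory
  notifies :reload, 'service[nginx]'  # reload nginx whenever the config changes
end

service 'nginx' do
  action [:enable, :start]         # start now and on every boot
end
```

Note that a recipe describes the state a server should be in, rather than the steps to get there; the Chef Client figures out what (if anything) needs to change on each run.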

The Chef Server is the central repository for every cookbook in the system. This design allows the Server to manage any number of physical or virtual machines in your infrastructure. The Chef Client runs on each of these nodes, checking in with the Server at regular intervals.

An Essential Tool for DevOps

As noted earlier, Chef offers any DevOps organization the means to manage its technical infrastructure more easily than before. Its code-based scheme for server management lets you leverage your development talent in a new fashion. The kitchen metaphor also makes Chef easy for both your technical and non-technical team members to understand.

Chef and similar tools, like Ansible (which we previously covered), play an important role in helping any company derive value from its investment in DevOps. Ultimately, DevOps is a methodology requiring more than just a change in organizational structure for success. Download Chef to see if it makes sense in your shop.

Thanks for reading the Betica Blog. Keep coming back for additional insights from the software development world.

News from the World of Software Development – June 2017

Welcome to the June edition of the Betica Blog software development news digest. We hope this month’s stories provide a measure of insight and inspiration to complement your daily work routine. If you are interested in checking out last month’s stories, including the increased use of Agile at government agencies and Microsoft embracing Git, simply click on the following link.

The Forecast for Software Development is Cloudy

Companies continue to leverage Cloud-based services to make their software development processes more efficient. This trend was reported earlier this month on the blog of document management software company Formtek, as well as by other sources.

This migration to the Cloud is a natural outgrowth of organizations increasingly adopting Agile, DevOps, and other modern development frameworks in hopes of achieving continuous integration or simply delivering software faster. Some of these Cloud-based services include containers and virtualized server environments, as well as QA and database services.

Formtek explains that because many development companies are writing software meant to run in the Cloud, using Cloud-based tools is a natural fit. This fact is highlighted in its three top reasons why software engineering becomes more efficient in the Cloud: developers are already using the Cloud in their daily work; the new tools they want to use are largely Cloud-based; and DevOps and Continuous Delivery largely depend on Cloud-based tooling.

The future of software development is quite cloudy, indeed.

Volvo and NVIDIA Working on Driverless Car Technology

While NVIDIA is primarily known as a developer of graphics processor chip technology, the company is working with Volvo and two other companies on a driverless car system. News about this futuristic consortium first appeared this week in Forbes.

The heart of this new technology is NVIDIA’s Drive PX automotive computing platform. The company’s graphics processing background comes into play with Drive PX’s auto-pilot functionality, which is able to read real-time information from 12 HD cameras, processing 1.3 gigapixels of graphics data per second. The system uses dual high-powered Tegra X1 chips, capable of recording two 4K streams at 30 frames per second.

Software routines using AI and deep learning are able to make intelligent decisions based on all that graphical data. This facilitates object detection while allowing for the automated control of the vehicle. The first driverless cars from the Volvo/NVIDIA group are expected to hit the marketplace by the end of 2021.

Crowdtesting Grows in Popularity

As software companies hope to improve their QA processes, a new form of quality assurance – crowdtesting – is growing in relevance. Applause, a company involved in the practice, recently described how it works on its ARC website.

At its essence, crowdtesting uses the targeted audience demographic of an app as part of its testing team. The hope is to have the QA process mimic the real world environment of a website or mobile app as closely as possible. Applause currently has over 300,000 testers available across the world, so it is able to match a group of testers to most applications.

It will be interesting to see if crowdtesting becomes part of the QA mainstream in the next few years.

Stay tuned to the Betica Blog for additional news and insights from the constantly evolving world of software development and QA. As always, thanks for reading!

News from the World of Software Development – January 2017

Welcome to a brand new year and a fresh look at the latest news from the constantly evolving world of software development. If you are interested in checking out the stories from the end of December, simply click on this link. Leverage this month’s insights and information to help make your application engineering process more efficient and productive. Good luck!

The Cloud is making Software Engineering Faster

Software development teams are increasingly using Cloud-based services to produce new applications, make enhancements to current apps, and fix bugs at a faster rate. Collaboration with remote development teams and a new Features-as-a-Service (FaaS) API model for code reuse appear to be two of the most common use-cases for Cloud-enhanced software engineering. This growing trend was reported on this month in TechTarget.

The TechTarget article also covers the wider use of containers, something we also mentioned in our 2017 Trends in Software Development post. Tools like Docker, Vagrant, and others allow software engineering shops to leverage virtualization – either in-house or Cloud-based – to make managing development, production, and QA environments a more efficient process. Companies hoping to achieve a Continuous Delivery model are increasingly using Cloud-based virtualization as part of their methodology.

Cloud-based APIs and services – increasingly marketed with the FaaS moniker mentioned earlier – allow development teams to meet deadlines without having to “recode” the wheel. Code reuse has been in the wise developer’s toolbox for decades, and Cloud-based services simply make it easier. Amazon and Microsoft are continually adding new routines to their own publicly available Cloud-based APIs.

Componentization and microservices are two other ways development teams are using the Cloud to improve their software engineering process. Expect to hear more information on microservices in an upcoming blog post.

Tom Nolle, the writer for TechTarget, sums up this growing trend. “The most important impacts of the cloud on faster software development are being felt only now, and it’s clear that we’re heading for a true software revolution in just a few years,” said Nolle.

AI and Data Science are Important Skills for New Developers

Anyone interested in moving into software engineering – and any current developer hoping to keep their skills up to date – needs to ramp up their knowledge of Artificial Intelligence and Data Science. That is the opinion of an article published this week in InfoWorld. The IT magazine spoke with Jim McHugh, vice president and general manager for NVIDIA’s DGX-1 supercomputer, to get his insights on the growing importance of AI in the industry.

The DGX-1 is largely used in deep learning and data analysis scenarios. McHugh feels the supercomputer, with its employment of AI and data, provides an example of how the process of writing software is being transformed. “We’re using data to train the software to make it more intelligent,” said McHugh.

Parts of the application infrastructure, like the interface and flow, are still coded using largely traditional methods. The actual meat of the app, however, uses data analysis to influence new feature sets. McHugh noted that developers manage and curate the data while guiding the app through learning its new enhancements.

The influence of AI in the software development process is definitely an area to watch over the upcoming decade.

Stay tuned to the Betica Blog for additional insights and news from the ever-changing software development universe. As always – thanks for reading!