Welcome to a brand new year and a fresh look at the latest news from the constantly evolving world of software development. If you are interested in checking out the stories from the end of December, simply click on this link. Leverage this month’s insights and information to help make your application engineering process more efficient and productive. Good luck!
Software development teams are increasingly using Cloud-based services to produce new applications, enhance current apps, and fix bugs at a faster rate. Collaboration with remote development teams and a Function-as-a-Service (FaaS) API model for code reuse appear to be two of the most common use cases for Cloud-enhanced software engineering. TechTarget reported on this growing trend this month.
The TechTarget article also covers the wider use of containers, something we also mentioned in our 2017 Trends in Software Development post. Tools such as Docker and Vagrant let software engineering shops leverage virtualization – either in-house or Cloud-based – to manage development, QA, and production environments more efficiently. Companies hoping to achieve a Continuous Delivery model are increasingly using Cloud-based virtualization as part of their methodology.
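As a rough illustration of that workflow, here is a minimal sketch of spinning up and tearing down a disposable container for a QA run. It assumes the Docker SDK for Python (docker-py) is installed and a Docker daemon is running; the image and container names are illustrative only, not anything from the article.

```python
# Minimal sketch: a throwaway container for a QA environment.
# Assumes `pip install docker` and a running Docker daemon.
import docker

client = docker.from_env()

# Start a disposable database container for an integration-test run.
qa_db = client.containers.run(
    "postgres:9.6",                              # illustrative image choice
    name="qa-postgres",
    environment={"POSTGRES_PASSWORD": "example"},
    detach=True,
)

# ... run integration tests against the container here ...

# Tear the environment down so every QA run starts clean.
qa_db.stop()
qa_db.remove()
```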
Cloud-based APIs and services – increasingly marketed under the FaaS moniker mentioned earlier – allow development teams to meet deadlines without having to “recode” the wheel. Code reuse has been in the wise developer’s toolbox for decades, and Cloud-based services simply make it easier. Amazon and Microsoft continually add new routines to their own publicly available Cloud-based APIs.
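As a small example of leaning on a hosted API instead of writing the functionality in-house, the sketch below calls Amazon’s Rekognition image-labeling service through boto3. It assumes AWS credentials are configured locally; the bucket and object names are placeholders.

```python
# Minimal sketch: reuse a publicly available Cloud API rather than
# building computer-vision code in-house. Assumes `pip install boto3`
# and configured AWS credentials; bucket/key names are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Ask the hosted service to label an image stored in S3.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=5,
)

for label in response["Labels"]:
    print(label["Name"], label["Confidence"])
```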
Componentization and microservices are two other ways development teams are using the Cloud to improve their software engineering process. Expect to hear more information on microservices in an upcoming blog post.
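For readers new to the idea, a microservice is simply a small, independently deployable component that owns one narrow capability and exposes it over the network. Here is a minimal sketch using Flask; the endpoint names and payloads are illustrative only.

```python
# Minimal sketch of a single microservice. Assumes `pip install flask`.
from flask import Flask, jsonify

app = Flask(__name__)

# Each microservice owns one narrow capability and exposes it over HTTP,
# so teams can develop, deploy, and scale it independently.
@app.route("/health")
def health():
    return jsonify(status="ok")

@app.route("/orders/<int:order_id>")
def get_order(order_id):
    # A real service would read from its own data store here.
    return jsonify(order_id=order_id, status="shipped")

if __name__ == "__main__":
    app.run(port=5000)
```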
Tom Nolle, who wrote the TechTarget article, sums up this growing trend: “The most important impacts of the cloud on faster software development are being felt only now, and it’s clear that we’re heading for a true software revolution in just a few years.”
Anyone interested in moving into software engineering, as well as current developers hoping to keep their skills up to date, needs to ramp up their knowledge of Artificial Intelligence and Data Science. That is the opinion of an article published this week in InfoWorld. The IT magazine spoke with Jim McHugh, vice president and general manager for Nvidia’s DGX-1 supercomputer, to get his insights on the growing importance of AI in the industry.
The DGX-1 is used largely in deep learning and data analysis scenarios. McHugh feels the supercomputer and its use of AI and data provide an example of how the process of writing software is being transformed. “We’re using data to train the software to make it more intelligent,” said McHugh.
Parts of the application infrastructure, such as the interface and flow, are still coded using largely traditional methods. The actual meat of the app, however, uses data analysis to shape new feature sets. McHugh noted that developers manage and curate the data while guiding the application as it learns these new enhancements.
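To make the “data trains the software” idea concrete, here is a minimal sketch using scikit-learn. The data set and the resulting feature are invented for illustration: instead of hand-coding a rule, the behavior is learned from curated examples.

```python
# Minimal sketch: a feature driven by a trained model rather than
# hard-coded logic. Assumes `pip install scikit-learn`; data is invented.
from sklearn.linear_model import LogisticRegression

# Curated historical data: [items_in_cart, minutes_on_site] -> purchased?
X = [[1, 2], [4, 10], [2, 1], [6, 15], [1, 1], [5, 12]]
y = [0, 1, 0, 1, 0, 1]

model = LogisticRegression()
model.fit(X, y)

# The "feature" (e.g., showing a checkout reminder) now comes from the
# trained model instead of an if/else rule written by a developer.
print(model.predict([[3, 8]]))
```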
The influence of AI in the software development process is definitely an area to watch over the upcoming decade.
Stay tuned to the Betica Blog for additional insights and news from the ever-changing software development universe. As always – thanks for reading!
Posted on January 27, 2017 | Categories: Software Development | Tags: AI, artificial intelligence, Cloud, data science, FaaS, Software Development, software engineering