A Closer Look at the MEAN Stack

The LAMP stack – which stands for Linux, Apache, MySQL, and PHP – has been standard practice for web development at many shops for nearly a decade. Since the one constant in the technology world is its rapid pace of change, it stands to reason that a new standard is emerging in this space. The MEAN stack leverages several more recent innovations, including NoSQL databases and a set of popular JavaScript frameworks.

What follows is a high-level overview of the MEAN stack to give you some food for thought before architecting your next web development project. Leverage these insights to make an informed decision on which development stack works best for your needs.

What is “MEAN”?

The MEAN stack is made up of MongoDB, one of the most prominent NoSQL databases, used in combination with three popular JavaScript technologies: ExpressJS, AngularJS, and Node.js. The fact that nearly all code for a MEAN project – from database to client – is written in JavaScript is one of the main reasons for its rapid growth. If your organization boasts a lot of JavaScript coding talent, that alone makes MEAN worthy of consideration for your next web project.

The Four Components of the MEAN Stack

MongoDB is a NoSQL document database widely popular for all kinds of applications. MongoDB is also available through many cloud service providers, including Amazon AWS, Microsoft Azure, and Google Cloud. It stores and exchanges data in a JSON-like format, making it a natural fit as the database of choice for MEAN.
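To illustrate, here is a minimal sketch of reading and writing JSON-like documents with the official MongoDB Node.js driver; the connection string, database name, and "tasks" collection are placeholder examples rather than anything from a specific project.

```typescript
import { MongoClient } from "mongodb";

async function main() {
  // Connect to a local MongoDB instance (placeholder connection string).
  const client = await MongoClient.connect("mongodb://localhost:27017");
  const tasks = client.db("blog").collection("tasks");

  // Documents go in and come out as plain JSON-like objects.
  await tasks.insertOne({ title: "Write blog post", done: false });
  const open = await tasks.find({ done: false }).toArray();
  console.log(open);

  await client.close();
}

main().catch(console.error);
```

Because the shape of a stored document mirrors the JSON the client and server already exchange, there is no translation layer between the database and the rest of the stack.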

A lightweight framework for architecting web applications, ExpressJS was inspired by the popular Ruby framework Sinatra. It is a high-performance framework well suited for both scalability and concurrency. It also simplifies building the custom APIs a web application needs.
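As a rough sketch of what that looks like, the following Express route exposes a small JSON API; the /api/tasks endpoint and its data are invented for illustration.

```typescript
import express from "express";

const app = express();

// A simple JSON endpoint; in a full MEAN app this data would come from MongoDB.
app.get("/api/tasks", (req, res) => {
  res.json([{ id: 1, title: "Write blog post", done: false }]);
});

app.listen(3000, () => console.log("API listening on port 3000"));
```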

AngularJS is a Google-developed framework for quickly building web-based user interfaces. It makes the creation of dynamic web pages a breeze, leveraging two-way data binding along with other useful features, including client-side code execution and support for the MVC pattern. Angular’s extensibility and flexibility enhance its compatibility with other frameworks and libraries, in addition to making it a major component of the MEAN stack.
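Here is a small sketch of two-way binding in an AngularJS 1.x component (the "greeter" module and its template are hypothetical): typing in the input updates the model, and the greeting re-renders automatically.

```typescript
import * as angular from "angular";

class GreetingController {
  name = "world"; // kept in sync with the input via ng-model
}

angular.module("greeter", []).component("greeting", {
  controller: GreetingController,
  template: `
    <input type="text" ng-model="$ctrl.name">
    <p>Hello, {{ $ctrl.name }}!</p>
  `,
});
```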

Node.js provides the server-side execution environment for a MEAN application. Expect a high degree of scalability, even with a server farm charged with hosting multiple applications. Built on V8, the JavaScript engine that powers Google Chrome, Node.js by itself is growing in usage among development teams.
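For comparison with the Express sketch above, this is a minimal example of the bare Node.js HTTP server that a framework like Express builds on; the port and response body are arbitrary.

```typescript
import { createServer } from "http";

// A bare-bones HTTP server using only Node's built-in http module.
const server = createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ message: "Hello from Node.js" }));
});

server.listen(3000, () => console.log("Server listening on port 3000"));
```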

The Advantages of the MEAN Stack

Obviously, the fact that all server and client code is written in JavaScript remains one of the major advantages of the MEAN stack. Companies are able to take advantage of their staff’s familiarity with a scripting language that’s been around for two decades. The overall learning curve is reduced to getting up to speed on MEAN’s three frameworks and MongoDB.

The scalability features of ExpressJS and Node.js make the MEAN stack suitable for the highly concurrent web applications currently in vogue throughout the technology world. The flexibility of the libraries used in MEAN makes it easy to swap out any of the components for a library (or database) more familiar to your development staff. It is definitely worth exploring for your team’s next web development project.

Keep returning to the Betica Blog for additional dispatches from the wide world of software development. Thanks for reading!

News from the World of Software Development – September 2017

With autumn now upon us, it’s time to turn our eyes toward the latest software industry news to see if any interesting stories provide meaningful insights on how your team builds applications. If you want to check out last month’s stories, simply click on the following link. Stories on the use of AI to improve continuous delivery and a new DevOps metrics tool await you.

CCleaner Malware Attack places renewed Onus on “Cybersecure” Development

One of last month’s biggest stories in the technology world involved the malware attack on CCleaner, a popular system-maintenance utility now distributed by Avast, one of the most well-known antivirus companies in the industry. Hackers were able to compromise the development environment, injecting malware into deployed versions of the application – both CCleaner and CCleaner Cloud.

Ultimately, the more than two million users who installed the application on their systems effectively provided cyber criminals with a gateway into their computers. End users who believed they were taking the right steps to protect their desktops ended up burned by a security vendor unknowingly serving as the middleman for hackers. News and analysis of this insidious cyber attack was published on eWEEK, as well as many other sources.

Avast acquired the original developer of CCleaner – Piriform – in July. The attack took place sometime in August, with all versions of the application installed from August 15 to September 12 affected by the malware. Since the CCleaner installer carried a legitimate digital signature from a respected antivirus company, users installed the program unaware of the hacked code within.

This places the onus on software engineering teams to secure all computers and digital signatures involved in the development process, a point echoed by Craig Williams, a senior technical lead with Cisco. “The fact of the matter is, when it comes down to supply chain attacks, if the attacker is in your build system already, you’ve lost. Once the attacker has all the certificates and all the keys and all the passwords, there is not a lot you can do,” said Williams.

Artificial Intelligence changing Software Quality Assurance

AI continues to influence many aspects of the software engineering process, so it isn’t surprising that quality assurance is also taking advantage of machine learning routines to improve its efficacy. A variety of companies specializing in QA services – Infostretch, Appdiff, and dinCloud – are now including AI-based functionality in some of their testing products. News about the inroads artificial intelligence is making in the QA world was published this month in TechTarget.

Infostretch’s new service is called Predictive and Prescriptive QA. It relies on data analysis and machine learning to quickly give software testers the information they need to find defects. The other two companies’ products are essentially testing bots aimed at software development organizations already taking advantage of automated QA as part of their DevOps implementation.

The introduction of AI and robotic testers doesn’t mean QA engineer jobs are at risk. Instead, these tools only help them become more productive and ultimately better at finding software defects.   

Keep returning to the Betica Blog for additional news and insights from the world of software development and QA. As always, thanks for reading!

News from the World of Software Development – August 2017

Welcome to our regular look at interesting stories from the ever-dynamic software development world, this time from the month of August. Hopefully, you find a bit of actionable information to help in your daily coding activities or perhaps the strategic direction of your organization. If interested in last month’s news digest, simply click on the following link.

New Product helps Companies keep track of DevOps Metrics

As DevOps continues to become part of the technology mainstream, companies struggle with determining the return on investment of their transition to a new methodology. DevOptics, a new product from CloudBees, aims to provide a means to track the efficacy of an organization’s DevOps processes and procedures. News about DevOptics appeared in August in Enterprise Times as well as other sources.

One of DevOptics’ major features is a real-time view of an organization’s software development pipeline, allowing managers and other key personnel to track the status of code changes as they are pushed from development to QA and eventually production. The hope is to lessen the number of meetings that tend to siphon productivity. Sacha Labourey, the CEO of CloudBees, commented on DevOptics.

“This is about data. We go through a lot of code changes, use a lot of tools, make a lot of modifications but all of the data vanishes. DevOps has been adopted in many, many cases as a feature that we replicate across the organizations. It’s a feature at scale not an enterprise solution. Now we are moving towards building a system of record for IT processes,” said Labourey.

If your organization is interested in how DevOptics can help keep a handle on your DevOps implementation, contact CloudBees to schedule a demo of the product. It just might be the missing piece of the puzzle for managing your software development projects.

Continuous Delivery – powered by AI – is the Future of Software Development

A recent article in The Next Web wonders if continuous delivery, assisted by artificial intelligence algorithms, is the future of software development. Considering how often we cover DevOps and continuous delivery here on the blog, it is safe to wonder if that future is actually already here.

The Next Web article cites recent survey data from Evans Data showing that while a majority of companies – 65 percent – use continuous delivery as part of their software development process, they only leverage it on a subset of their projects. Only 28 percent of surveyed organizations use it for all their applications.

Leveraging AI and machine learning as part of automation will play a key role in making continuous delivery commonplace. This is the opinion of Diego Lo Giudice of Forrester Research. “AI can improve the way we build current software; it will change the way we think about applications — not programming step by step, but letting the system learn to do what it needs to do — a new paradigm shift,” said Lo Giudice.

Expect artificial intelligence to continue to make inroads throughout the software development world, but especially in improving processes currently using automation. Once it does, continuous delivery – and DevOps for that matter – will truly become an industry standard.

Stay tuned to the Betica Blog for additional insights from the wide world of software development. As always, thanks for reading!