
Open Source Databases in 2017 and Trends for 2018

Forrest Lymburner


With 2017 quickly coming to a close and 2018 looming on the horizon, we wanted to take a minute to reflect on what’s been happening in our industry in the past year and what we’re excited about for the future.

Johan Andersson, Severalnines CTO and Co-Founder, took a few minutes away from working on the newest releases of ClusterControl to talk with us about his thoughts on 2017.

2018 Database Trends and Predictions

Technology moves fast, but the open source world moves even faster. Here are some predictions from around the web for 2018…

FORBES

  • “In 2017, DevOps suffered from under-budgeting and a perception from management that things that were inexpensive as tools were mostly open source. However, non-standardized adoption and expensive DevOps resources skewed budget. With the realization that open source doesn’t equal free, especially as enterprise-grade support is required, there will be increased awareness of the budget needed for skilled DevOps resources. This barrier should get lowered — organizations will need a budget for experimentation and failure.”

GITHUB

  • “Data will rule all. Over the last several years, Cloud 1.0 has been about computing in big clouds, while Cloud 2.0 is all about data. This includes data movement and the tools and services that support it, like analytics and machine learning systems. Today all companies are data companies, whether they know it or not. In 2018, so long as teams know how to use it, data will become their greatest asset.”
  • “A decade ago, Linux was a big deal. Now it’s standard. Back in the day, companies like Amazon, Google, and Microsoft were forced to build their own, proprietary tools because no other software existed to meet their needs. Many of these frameworks have since been open sourced—and other open source technologies, like Kubernetes, are becoming integral to developers’ workflows. This shift is changing what companies are investing in, making open source software traditional software’s biggest competitor.”

OPENSOURCE.COM

  • “Containers gain even more acceptance. Container technology is the approach of packaging pieces of code in a standardized way so they can be “plugged and run” quickly in any environment. Container technology allows enterprises to cut costs and implementation times. While the potential of containers to revolutionize IT infrastructure has been evident for a while, actual container use has remained complex. Container technology is still evolving, and the complexities associated with the technology decrease with every advancement. The latest developments make containers quite intuitive and as easy as using a smartphone, not to mention tuned for today’s needs, where speed and agility can make or break a business.”

DBTA.COM

  • “Rapid Kubernetes adoption forms the foundation for multi-cloud deployments: We predict runaway success of Kubernetes, but it is running away with the prize of adoption so fast that this may quickly be more of an observation than a prediction in 2018. So far, however, almost everybody is thinking of Kubernetes as a way of organizing and orchestrating computation in a cloud. Over the next year, we expect Kubernetes to more and more be the way that leading-edge companies organize and orchestrate computation across multiple clouds, both public and private. On premises computation is moving to containers and orchestration style at light speed, but when you can interchangeably schedule work anywhere that it makes sense to do so, you will see the real revolution.”
  • “Fear of cloud lock-in will result in cloud sprawl: As CIOs try to diversify investment in their compute providers, inclusive of their own on-premise capabilities, the diversification will result in data, services and algorithms spreading across multiple clouds. Finding information or code within a single cloud is tough enough. The data silos built from multiple clouds will be deep and far apart, pushing the cost of management onto humans that need to understand the infrastructure.”

Severalnines 2018 Predictions

Several members of our team took a moment to share their thoughts on 2018 and the future of the open source database world…

Vinay Joosery, CEO & Co-Founder

  • Databases in Containers: I think a lot of people have moved on from debating the pros and cons of running databases on Docker, and whether it is a good idea at all, to actually trying it, not only in test/dev but in live production!
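As a rough sketch of what that kind of experimentation often looks like, here is a minimal Docker Compose file for a single MySQL instance with a named volume so data outlives the container. The image tag, credentials, and volume name are illustrative assumptions, not a recommended production setup:

```yaml
version: "3"
services:
  mysql:
    image: mysql:5.7            # illustrative tag; pin whatever version you actually run
    environment:
      MYSQL_ROOT_PASSWORD: example   # placeholder only; use secrets management in production
    volumes:
      - mysql-data:/var/lib/mysql    # persist data beyond the container's lifetime
    ports:
      - "3306:3306"
volumes:
  mysql-data:
```

The named volume is the key detail: without it, a restarted or rescheduled container starts with an empty data directory, which is exactly the concern behind the test/dev-versus-production debate.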

Alex Yu, VP of Products

  • Cloud and containerized services as the new norm: More companies will start new projects in the cloud with multiple providers, and applications will be written with cloud-native architectures in mind.
  • Traditional monolithic applications will continue to give way to more loosely coupled services, which are easier to build, deploy, update, scale, or move to other cloud and container service providers. Cloud-native services are resilient by nature and will facilitate faster development cycles and feedback loops.
  • Container technologies such as Kubernetes and Docker are already the de facto standard for typical stateless applications/services using “application containers”. Databases, however, are intrinsically stateful, so we will hopefully see “system containers” such as LXD come into the mainstream and gain adoption.
  • This “new world” has an impact on database management and monitoring applications, where containerized services come and go frequently. The lifetime of a host is many times longer than that of a container, whose uptime may be measured in mere minutes or hours. Host-centric management and monitoring will give way to a cloud-native, service-oriented model where this transient nature is the norm.
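The shift from host-centric to service-oriented monitoring can be sketched in a few lines. In this toy example (the sample data and function names are made up for illustration, not Severalnines code), metrics are keyed by a service label rather than by container or host identity, so aggregates stay meaningful as short-lived containers come and go:

```python
# Toy sketch: roll up per-container metrics under a service label, so the
# metric series survives even when individual containers are replaced.
from collections import defaultdict

# Hypothetical metric samples reported by ephemeral containers.
samples = [
    {"container_id": "web-7f2a", "service": "web", "qps": 120},
    {"container_id": "web-19bc", "service": "web", "qps": 95},  # replaced web-7f2a
    {"container_id": "db-01",    "service": "db",  "qps": 40},
]

def aggregate_by_service(samples):
    """Sum a metric across containers belonging to the same service."""
    totals = defaultdict(int)
    for sample in samples:
        totals[sample["service"]] += sample["qps"]
    return dict(totals)

print(aggregate_by_service(samples))  # e.g. {'web': 215, 'db': 40}
```

A host-centric tool keyed on `container_id` would show a series of short, dying time series; keyed on `service`, the same data forms one continuous view.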

Jean-Jérôme Schmidt, VP of Marketing

  • The European Union (EU) is the world’s 2nd largest economy after China and before the US. It’s about to enact a new piece of legislation with far-reaching consequences for anyone doing business involving its residents … and yet it seems to be going almost unnoticed. The European General Data Protection Regulation (GDPR) is due to come into effect in May 2018 and will impact anyone, and any business or organisation, that deals with and stores EU residents’ personal data in some shape or form. And it will make the organisations handling that data responsible for any breaches or misuse. How data is collected, processed and secured will therefore be of the highest importance. In other words, databases and their infrastructures will be in the spotlight more than ever before, and how these databases are automated and managed will be crucial for anyone doing business with or within the EU. The GDPR is probably not getting the attention it should because the perception that’s being maintained globally is that the US (and maybe China) are the only large-scale economies worth being concerned with, but the reality is that the EU is the one to be focussed on, particularly next year. So if you’re not sure whether you have your databases and their EU residents’ data 99.999% under control … contact us

Ashraf Sharif, Senior Support Engineer

Krzysztof Książek, Senior Support Engineer

  • The main new trend that I see today is a move towards column-store analytics databases. MariaDB has one as part of its offering, and ClickHouse seems to be getting traction as a go-to analytics database engine that works alongside MySQL. ProxySQL’s support for ClickHouse also makes it easier for an application to connect to either MySQL or ClickHouse, whichever is needed at that moment. If your dataset is small, you can do analytics in MySQL, but other tools do it better: they are faster and use less disk to store the data.
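The intuition behind that trend can be shown with a toy sketch. This is a conceptual illustration of row-oriented versus column-oriented layout (invented data, and nothing like how ClickHouse or MariaDB actually store things on disk): an aggregate over one column only needs to read that column's contiguous values, while a row store must visit every whole row.

```python
# The same dataset in two layouts.

# Row store: each record is stored together.
rows = [
    {"user_id": 1, "country": "SE", "revenue": 10.0},
    {"user_id": 2, "country": "DE", "revenue": 7.5},
    {"user_id": 3, "country": "SE", "revenue": 2.5},
]

# Column store: each column is stored contiguously.
columns = {
    "user_id": [1, 2, 3],
    "country": ["SE", "DE", "SE"],
    "revenue": [10.0, 7.5, 2.5],
}

# Row store: scan every row dict just to read one field.
total_row_store = sum(r["revenue"] for r in rows)

# Column store: read one compact list; the other columns are never touched.
total_column_store = sum(columns["revenue"])

print(total_row_store, total_column_store)  # same answer, very different I/O
```

The columnar layout also compresses far better (similar values sit next to each other), which is where the "less disk" part of the claim comes from.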
