I am, of course, paraphrasing that classic Beatles song, and as surely as the sun will rise tomorrow, the volume of data generated across the planet will continue to grow at an explosive rate. It is currently estimated that more than 2.5 quintillion bytes of data are created every single day, and that by 2020, 1.7MB of data will be created every second for every person on earth.
So, what does a new data minute look like today? As of 2018, it is estimated that:
- Every minute, Facebook users send roughly 31.25 million messages and watch 2.77 million videos.
- Every minute, 2.4 million Google searches are performed, amounting to roughly 3.45 billion searches per day.
- Every minute, Instagram users post almost 50,000 photos.
- Every minute, more than 2 million Snapchat users share snaps, and more than 200,000 new snaps are uploaded.
- Every minute, more than 350,000 tweets are sent.
- Every minute, more than 2 million minutes of Skype calls are generated.
- Every minute, YouTube users watch more than 4 million videos, and more than 400 hours of video are uploaded.
- Every minute, Spotify streams over 750,000 songs.
- Every minute, Netflix users stream more than 87,000 hours of video.
- Every minute, more than 800,000 files are uploaded to Dropbox.
- Every minute, more than 29 million WhatsApp messages are sent.
- Every minute, more than 150 million emails are sent.
On and on it goes. By 2020, there are expected to be over 6.1 billion smartphone users globally, and within five years there will be over 50 billion smart connected devices in the world, all built to collect, analyze and share data.
So, it became clear to us at Openet that we needed to build into our solutions the performance capability to handle this ever-increasing volume of data in motion, enabling our customers to manage the vast actionable data volumes they face with grace and ease. We started thinking about what our moonshot objective would be. We wanted to push the boundaries and reach for an almost unimaginable outcome, the point where audacious technology and pure innovation intersect, so we set ourselves the astounding target of one trillion (1,000,000,000,000) events per day.
And since it is estimated that by 2020 at least a third of all data will pass through the cloud, we wanted to align with our customers' future by achieving this one-trillion-events-per-day performance in the cloud. We also wanted to do it with a real-world scenario, using real-world use cases and transactions. So we took our state-of-the-art Openet Data Fabric (ODF) and built use cases around IoT devices, creating events and transactions to be ingested and aggregated by ODF, so the result would be a real one. We deployed this in a containerized fashion on Azure to closely replicate the needs of our customers, then optimized and tuned it to extract the greatest performance possible from the Microsoft Azure cloud. We began the scaling and performance tuning with an initial target of 2 MTPS (million transactions per second). From there, we scaled and tuned our Azure clusters to reach 6 MTPS, then 8 MTPS, and then the milestone of 10.2 MTPS on a 17-node cluster. Finally, we passed 11.5 MTPS on a 20-node cluster, scaling elegantly to almost 12.3 million transactions per second ingested, aggregated and processed, which took us past our ultimate goal of one trillion events per day.
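The arithmetic behind the milestone is easy to check. A back-of-the-envelope sketch (not Openet code, just the conversion between sustained transactions per second and events per day, using the figures quoted above):

```python
# Convert a sustained rate in MTPS (million transactions per second)
# into events per day: rate * 1e6 * 86,400 seconds.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def events_per_day(mtps: float) -> float:
    """Daily event count for a sustained rate given in MTPS."""
    return mtps * 1_000_000 * SECONDS_PER_DAY

# One trillion events per day corresponds to a sustained rate of
# 1e12 / 86,400 ≈ 11.57 MTPS, so holding above ~11.5 MTPS clears the bar.
print(f"Break-even rate: {1e12 / SECONDS_PER_DAY / 1e6:.2f} MTPS")
print(f"12.3 MTPS sustained = {events_per_day(12.3):.3e} events/day")

# The reported milestones also imply roughly linear scaling per node:
print(f"10.2 MTPS / 17 nodes = {10.2e6 / 17:,.0f} TPS per node")
print(f"12.3 MTPS / 20 nodes = {12.3e6 / 20:,.0f} TPS per node")
```

At around 600,000 TPS per node at both cluster sizes, the throughput grows almost linearly with node count, which is what makes the "elegant scaling" claim credible.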
Based on our findings, we believe we can tune our solutions further to achieve even greater results in these cloud-native environments. This achievement is made possible by the Openet Data Fabric's modular, microservices-based design, which draws on Openet's extensive interface library and on open-source technologies to ensure open data innovation. As a result, the Openet Data Fabric provides a truly strategic and comprehensive approach to data management, data processing and data governance.
This result was truly phenomenal, and one we at Openet are extremely proud of. A lot happens in a new data minute, and we need not only to handle it, but to action the data insightfully and with ease, now and in the future. So rest easy: we at Openet know it's alright.