Monday, December 30, 2013

Four Trends That Will Shape 2014

My favorite part of being CEO of Jaspersoft is spending time directly with our customers and partners.  I learn so much about how we are succeeding together and what Jaspersoft can do better to help them succeed more fully.  Learning from our customers is a passion at Jaspersoft.  We all take notes from our customer discussions and share them throughout the company in an effort to constantly grow our understanding and push us forward.  Their plans and ideas become our ambitions - to create a far better approach to the next generation of reporting and analytics.  Our hope and intention is that our customers will be compelled to rely on Jaspersoft for a continually larger portion of their projects and needs.

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics.  Think of this as my travel log and diary distilled into a short, easily readable series of blog posts.  I invite you to follow this short series, starting with my first installment here.  Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well.  I look forward to the ongoing dialogue, and I thank our customers and partners for their business and partnership, which mean everything to us.

Trend #1:  Forget Sentiment Analysis, Sensors + Software Will Change the World

Much of the Big Data hype has focused on social media and sentiment analysis, in an effort to get closer to the customer and better understand the market in which an organization competes.  While this is a valid goal, relatively few organizations will find both the skill and useful data patterns that add up to a material top-line difference.

Instead, the focus should be on the “Internet of Things”, for the transformative power it represents.  Every day, I see more powerful examples of sensors and software in action.  I prefer to describe it as “sensors + software” because these terms better symbolize the grittier, more real-world value that can be delivered by measuring, monitoring and better managing vast amounts of sensor-generated data. Why is this important in 2014?  Firstly, sensor technology has become remarkably low cost (an RFID tag, for instance, can cost as little as 50 cents, according to this report - which means more data points).  Secondly, the data storage and analytic technology to capture and analyze this data is incredibly low cost and widely available (often in open source editions). Lastly, sensor-based data is well suited for correlation analysis, rather than looking strictly for causation, which increases the potential for finding value among this machine-generated data.
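To make the correlation point concrete, here is a minimal sketch in Python.  The sensor names and readings are invented for illustration (simulated data, not from any real deployment); the idea is simply to screen sensor streams for co-movement rather than to claim causation:

```python
# Hypothetical illustration: correlation analysis across sensor streams.
# The sensors and readings below are simulated, not real telemetry.
import numpy as np

rng = np.random.default_rng(42)

# One week of hourly readings from two machine sensors, e.g. vibration
# and temperature, where temperature loosely tracks vibration plus noise.
vibration = rng.normal(1.0, 0.1, 168)
temperature = 20 + 5 * vibration + rng.normal(0, 0.2, 168)

# Pearson correlation coefficient: measures co-movement, not causation.
r = np.corrcoef(vibration, temperature)[0, 1]
print(f"correlation: {r:.2f}")
```

A strong correlation like this flags a relationship worth investigating and acting on (predictive maintenance, alerting), without requiring a causal model up front, which is why it suits high-volume machine-generated data.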

Analysts predict vast, far-reaching economic value from making “Things” smarter and connecting them to the Internet.  Why limit analysis to the words and attitudes of a relatively vocal few (social media and sentiment analysis) when you can analyze the actual behavior of a much larger population (sensor data)?  So I believe a quiet revolution is already underway.  In 2014, sensors + software will change the world.

Monday, April 8, 2013

Amazon, Jaspersoft, and the Future of Cloud Computing


Last month, Jaspersoft announced the industry’s first completely pay-as-you-go reporting and analytic service on Amazon’s AWS Marketplace.  With this service, you can literally be up-and-running (analyzing your data) in less than 10 minutes and pay as little as 52 cents per hour to do so.  And, as we’ve just announced, Amazon and Jaspersoft added more than 100 customers during the first month of availability – a great start to a new service destined to change the way BI is consumed for many purposes.

One of my favorite university professors recently asked me what worries me the most about being on the cutting edge with Amazon and this new service.  My response:  NOT being on the cutting edge with Amazon and this new service.  In other words, I would worry most about not innovating in this way.  Disrupting through both our business model and product innovation is a critical part of our culture at Jaspersoft.

In fact, the early success of our new Amazon-hosted service reminded me of two fast-emerging, inter-related cloud computing concepts that, though not discussed sufficiently, will have a substantial impact on the future usage and adoption of cloud-based computing services. These two concepts are: cloud-originated data and the post-transactional cloud *1.  I maintain that, as the former quickly grows, the latter becomes commonplace.

Cloud-Originated Data
While the total digital universe currently weighs in at nearly 3 zettabytes, it is estimated that more than one exabyte of that data is stored in the cloud.  Each day, the growth rate of cloud-originated data increases, because of the explosion in services and applications that rely on the cloud as infrastructure.  So, a disproportionate amount of the 13X growth projected in the digital universe between now and 2020 will come from cloud-originated data.  IDC estimates that by 2020, nearly 40% of all information will be “touched” by cloud computing (somewhere from origination to disposal).  Eventually, most of the digital universe will be cloud-based.
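A quick back-of-the-envelope check puts those figures in perspective.  This is a rough sketch using decimal units; the 3 ZB, 1 EB, and 13X numbers are simply the estimates quoted above, not independent measurements:

```python
# Back-of-the-envelope scale check for the estimates quoted above.
ZB = 10**21  # bytes in a zettabyte (decimal)
EB = 10**18  # bytes in an exabyte (decimal)

digital_universe = 3 * ZB   # ~3 ZB total digital universe today
cloud_stored = 1 * EB       # ~1 EB estimated stored in the cloud

# Cloud-stored data is a tiny sliver of the total today...
fraction = cloud_stored / digital_universe
print(f"cloud share today: {fraction:.4%}")

# ...while the 13X growth projection implies a ~39 ZB universe by 2020.
projected_2020 = digital_universe * 13
print(f"projected 2020 universe: {projected_2020 / ZB:.0f} ZB")
```

In other words, cloud storage holds only a fraction of a percent of today’s data, which is exactly why a disproportionate share of the coming growth being cloud-originated is such a significant shift.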

The growth in Amazon’s Simple Storage Service (S3) provides another compelling data point for the growth of cloud-originated data.  In the past several years, Amazon’s S3 service has seen meteoric growth, now storing nearly one trillion objects (growing by 1 billion objects per day) and handling more than 650,000 requests per second (for those objects).  The chart below illustrates this dramatic growth *2.

[Chart: Amazon S3 object growth — see note *2]
Importantly, cloud-originated data is more easily liberated (post-transaction) by other cloud services, which can unlock additional value easily and affordably.  According to a recent report by Nucleus Research, companies that more quickly utilize cloud-based analytics are likely to gain a competitive advantage:

“As companies learn to take full advantage of the analytics functionalities that are now available with utility and subscription-based pricing options, they will continue to become more able to take advantage of market trends and opportunities before their peers and take advantage of the average return of $10.66 for every dollar spent in analytics.”

Ultimately, analytics is just one of many important post-transactional uses of cloud-based data, which will surely be the subject of future posts.

Post-Transactional Cloud
My working definition of the post-transactional cloud is “the next-generation of cloud services, beyond Software-as-a-Service (SaaS), designed to enable platform and middleware tools to use cloud-originated transactional data and deliver a richer, more sophisticated computing experience.”

The concept of a post-transactional cloud provides a powerful analog to the history of the on-premises computing world.  Let me elaborate.

The ERP/CRM/Supply Chain application boom of the ‘80s and ‘90s preceded an enormous need in the ‘90s and ‘00s for additional tools and software systems designed specifically to create even greater value from the data generated by these (on-premises) transactional applications. Then, tools for data management, data integration, data warehousing and business intelligence (reporting and analytics) were born to deliver this new value.

Cloud computing has grown substantially in the last 10 years largely because of applications hosted in the cloud and made available as a service directly to consumers and businesses.  The poster child here is Salesforce.com (although there are thousands of others).  Note that we call this category “Software-as-a-Service” when it really should be called “Application-as-a-Service”, because the providers in this category are delivering a transactional, process-oriented application designed to automate and improve some functional aspect of an organization.  As the use of these managed services/applications grows, so too does the share of cloud-originated data generated by these applications.

The dramatic rise in cloud-originated data from SaaS applications portends a similar need: this time for post-transactional cloud-based tools and software systems that define a new usage curve for liberating cloud-based data and creating substantial new organizational value.  It’s just a matter of time, which makes Jaspersoft’s work with Amazon clear and understandable.

In fact, Jaspersoft’s cloud-based service (across all major Platform-as-a-Service environments, such as VMware’s Cloud Foundry and Red Hat’s OpenShift, but right now, especially with Amazon’s AWS) helps ensure our tools are the de facto standard for reporting and analysis on cloud-originated data (in the post-transactional cloud).  We’ll do this in two ways:
1. By bringing our BI service to customers who already prefer to use cloud services, and by being available in their preferred cloud instead of forcing them into our cloud; and
2. By enabling elegant, affordable, embeddable reporting and analysis within cloud-based applications, so those who deliver this software can include intelligence inside their transactional applications.
At Jaspersoft, we ultimately see our cloud-based service as vital to reaching the broadest possible audience with just the right amount of reporting and analytics (not too much, not too little).  The post-transactional cloud will be fueled by cloud-originated data, and the need to deliver cleverly designed intelligence inside this environment will be more important than ever.

Brian Gentile
CEO, Jaspersoft


1 I’ve borrowed the term “Post-Transactional Cloud” from ZDNet’s Andrew Brust, in his article entitled “Amazon Announces ‘Redshift’ cloud data warehouse, with Jaspersoft support”.
2 Data and chart excerpted from TechCrunch article “Amazon S3: 905 Billion Objects Stored, 1 Billion Added Each Day”, Sarah Perez, April 6, 2012.