Analysts remain front and center in BI 3.0.
By Assaf Araki and Ben Lorica.
Automation and AI will democratize BI, expand the user base of BI tools, and enable users to perform increasingly sophisticated analytics. New tools will reduce time to insight (TTI) by enabling data and business analysts to extract and transform data, uncover patterns, and produce more accurate forecasts, root-cause analyses, and simulations.
Organizations use a variety of BI tools to analyze structured data. These tools are used for ad-hoc analysis, and for dashboards and reports that are essential for decision making. The typical BI user is an analyst, and as BI and data management tools have evolved, analysts have been able to add advanced analytics (and even machine learning) to their toolboxes.
In this post, we describe a new set of BI tools that continue this trend. These new tools make it easier for analysts and business teams to analyze data and generate reports with minimal assistance from their IT counterparts. Accompanying improvements in ETL and data management systems expand the data sources BI users can use while lessening the need for assistance from IT teams.
The initial set of companies we list mainly offers SaaS systems. Thus, companies that want to use these new BI solutions will need to move their data to public clouds.
BI History Review
BI solutions first appeared in the 1970s with early systems from companies like SAP, Siebel, and JD Edwards. The growth of data warehouses in the 1980s gave rise to a new set of solutions including Microstrategy, Cognos, and Business Objects. This early group of BI solutions (“BI 1.0”) was owned by the IT department, meaning that most users were not capable of creating reports and dashboards on their own. Users had to undergo extensive training to become proficient in using and administering these solutions. This generation of tools focused primarily on producing reports and dashboards.
The early 2000s accelerated BI development and saw a consolidation of BI in the hands of IBM, Microsoft, SAP, Microstrategy, and Oracle. This generation of BI systems let users perform ad-hoc analysis based on pre-generated schemas. More precisely, users could create dashboards and they could “slice and dice” data using different dimensions and metrics.
The mid-aughts saw the rise of new solutions built for data analysts. This new set of “BI 2.0” tools – as exemplified by Tableau and Qlik – put a large emphasis on visualization, interactive analysis, and ease of use. These companies also introduced a new form of interacting with data – visual pivoting – which combined pivot tables with charts and visualizations. Users might still rely on IT to connect their BI tool to a data warehouse or a database, but they could also use these tools on datasets they control, like spreadsheets or text files. Once connected to a data source, these tools give analysts the ability to run ad-hoc analysis and to create and publish dashboards and complex visualizations, without involving their IT department. These tools made BI and interactive analytics ubiquitous.
Despite the advancements in BI over the last 30 years, the core function of data analysts has remained largely the same. An analyst frequently starts with a hypothesis or a question and interrogates data to refine their understanding. This is an iterative process that might entail wading through a series of hypotheses – using a BI tool pointed at a high-dimensional data set – before settling on a reasonable answer. What if this process could be automated?
BI 3.0 tools attempt to address two significant issues: reliance on IT and manual analysis. To solve the first issue, data analysts need to be able to create their own data models, either by using solutions that generate data warehouses automatically (e.g., ThoughtSpot, Hypersonix) or by using solutions that abstract the ETL process (e.g., Fivetran, Matillion). The second issue is being addressed by a new set of “AutoBI” solutions like Sisu, Anodot, YellowFin, and Outlier. These solutions automatically generate insights and recommendations and thus reduce the need for manual data analysis.
BI 3.0 Stack
In this post we highlight promising innovations that are beginning to appear in products and research systems. In doing so we list a few trends that are changing BI. First, we see solutions that reduce the need for manual analysis through automation and large-scale pattern recognition. Second, we see further simplification and democratization as new interfaces and new data preparation solutions reduce the need to call on IT teams. Finally, we see BI solutions that augment users and allow them to tackle more complex questions and problems.
Users of BI tools have grown accustomed to graphical user interfaces for creating reports or for interacting with data. GUIs can be used for ad-hoc analysis, and for creating, scheduling and managing reports and dashboards. Advanced users have access to text-based interfaces where they create custom queries (in SQL or another query language), and write scripts to automate tasks.
A new set of tools has the potential to expand the user base of BI tools and simultaneously expand what users are able to do.
- While past manifestations of natural-language query tools never lived up to the initial excitement, improvements in natural language models may finally lead to tools that allow users to write queries as text or voice. Recent examples include tools that turn simple questions about tabular data into database queries, tools that use search interfaces, and new natural language interfaces that rely on neural models.
- We are starting to see BI tools that address non-tabular data. For example, graphs and graph databases are beginning to be used in many applications including recommendation engines, fraud detection systems, identity and access management, and search. Tools like Graphistry and Linkurious allow analysts to analyze and interact with extremely large graphs. Another example is Kyrix, a research project that includes interactive visualizations involving geospatial data.
- New hardware – VR/AR, tablets, and surface computing – has led to new interaction paradigms that are starting to be used in some new BI tools.
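To make the natural-language-query idea concrete, here is a toy sketch of a rule-based translator that maps a simple question to SQL. Production systems use trained neural models rather than regular expressions, and the `sales` table and its columns here are hypothetical:

```python
import re

# Hypothetical schema: a "sales" table with columns like region, product, revenue.
# A toy rule-based translator; real BI tools use trained neural models.
def question_to_sql(question: str) -> str:
    q = question.lower().strip()
    m = re.match(r"total (\w+) by (\w+)", q)
    if m:
        metric, dim = m.groups()
        return f"SELECT {dim}, SUM({metric}) FROM sales GROUP BY {dim};"
    m = re.match(r"average (\w+)", q)
    if m:
        return f"SELECT AVG({m.group(1)}) FROM sales;"
    raise ValueError("question not understood")

print(question_to_sql("Total revenue by region"))
# SELECT region, SUM(revenue) FROM sales GROUP BY region;
```

Even this trivial sketch illustrates why the approach is attractive: the analyst states a question in business terms, and the system handles query construction against the warehouse.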
Companies deal with large amounts of data coming from many different sources. Analysts routinely have to point their BI tools to data collections with many fields and dimensions. A new set of tools scan large amounts of data with many dimensions and use machine learning to augment human analysts.
- Auto BI (descriptive analytics): Imagine an analyst who needs to understand what drives a KPI but has to manually point a BI tool at a table composed of hundreds of columns. A new set of tools scan large data management systems, surface patterns and trends, and automatically suggest charts and reports.
- Auto Analytics (predictive analytics): Another set of tools enables analysts to do complex analytics without having to call in data scientists. There are now tools that automate cluster analysis, anomaly detection, and forecasting, and that run what-if scenarios and simulations.
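As a minimal sketch of the "scan every column and surface what drives a KPI" idea, one could rank columns by the strength of their correlation with the KPI. Commercial AutoBI tools use far richer pattern mining, and the data below is invented for illustration:

```python
from statistics import mean

def pearson(xs, ys):
    # Plain Pearson correlation; a stand-in for the pattern-mining
    # machinery inside commercial AutoBI tools.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def rank_drivers(table, kpi):
    # Scan every other column and rank it by |correlation| with the KPI.
    y = table[kpi]
    scores = {c: abs(pearson(v, y)) for c, v in table.items() if c != kpi}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical weekly data: marketing spend, support tickets, and revenue (the KPI).
data = {
    "ad_spend": [10, 12, 15, 11, 18, 20],
    "tickets":  [5, 7, 4, 6, 5, 6],
    "revenue":  [100, 118, 149, 112, 178, 196],
}
print(rank_drivers(data, "revenue"))  # ad_spend should rank first
```

The point of AutoBI is that this scan runs automatically across hundreds of columns, so the analyst starts from a ranked shortlist of candidate drivers instead of a blank dashboard.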
Automating Data Preparation for BI (ETL + DWH on the fly)
Even the best BI tools rely on the creation of data warehouses that analysts and business units can interact with. A new set of automation tools compress the process of creating data warehouses for BI. These tools focus on structured and semi-structured data. They cover extracting and moving data from different systems (cloud-native ETL), as well as the automatic creation of data structures that analysts can use for BI.
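One small piece of "data warehouse on the fly" automation is schema inference: given a batch of semi-structured records, derive the table an analyst can query. The type mapping, widening rule, and `events` records below are simplified assumptions, not any vendor's actual behavior:

```python
def infer_schema(records):
    # Infer a column -> SQL type mapping from a batch of semi-structured records.
    type_map = {bool: "BOOLEAN", int: "BIGINT", float: "DOUBLE", str: "VARCHAR"}
    schema = {}
    for rec in records:
        for col, val in rec.items():
            if val is None:  # nulls carry no type information
                continue
            t = type_map[type(val)]
            # Widen an integer column to DOUBLE if a float shows up later.
            if schema.get(col) == "BIGINT" and t == "DOUBLE":
                schema[col] = "DOUBLE"
            else:
                schema.setdefault(col, t)
    return schema

def create_table_ddl(name, records):
    cols = ", ".join(f"{c} {t}" for c, t in infer_schema(records).items())
    return f"CREATE TABLE {name} ({cols});"

# Hypothetical event records landing from an application log.
events = [
    {"user": "a1", "amount": 10, "ok": True},
    {"user": "b2", "amount": 12.5, "ok": None},
]
print(create_table_ddl("events", events))
# CREATE TABLE events (user VARCHAR, amount DOUBLE, ok BOOLEAN);
```

Cloud-native ETL tools apply this kind of inference (plus incremental loading and schema-drift handling) so that analysts get queryable tables without hand-writing DDL or pipeline code.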
Over the next few years, expect plenty of innovation in the upper layers of the BI stack (AutoBI, interfaces, AI services). In this post, we described new tools that reduce reliance on IT and manual analysis. Analysts value tools that increase their productivity and reduce their TTI (Time To Insight). In an increasingly competitive business landscape, reducing TTI will be an important criterion for evaluating BI solutions. In fact, the companies we listed in this post are all designed to reduce TTI.
We also expect solutions to continue to help analysts expand their capabilities to include larger data sets, unstructured data, and advanced analytics. New tools for advanced analytics will enable analysts to generate insights that will help their organization meet (or even surpass) key metrics.
Innovations in data management will accompany developments in BI. We anticipate the emergence of self-managed, scale-out, cloud-native solutions. We also foresee cloud data management systems that mimic the serverless model by pricing based on the actual amount of resources consumed. But for data and business analysts a DBMS is a commodity, like the engine under the hood of a car. Analysts do not care about the details of storage or computation engines. They want tools that reduce their TTI, increase their productivity, open up advanced techniques, and help them exceed their KPIs.
Disclosure: Intel Capital is an investor in Hypersonix. Ben Lorica is an advisor to Graphistry.
Assaf Araki is an investment manager at Intel Capital focused on AI and Data Analytics platforms and products.
Ben Lorica is chair of the NLP Summit, co-chair of the Ray Summit, and principal at Gradient Flow.
Related content: Other posts by Assaf Araki and Ben Lorica.
- An Enterprise Software Roadmap for Sky Computing
- What is DataOps?
- The Growing Importance of Metadata Management Systems
- Demystifying AI Infrastructure
- Software 2.0 takes shape