Self-service BI
Access to data is a key driver of competitiveness for businesses. However, the limited resources of data departments often result in significant delays in processing requests, leading to frustration and inefficiency for business teams. To address this, many companies are adopting a self-service data approach, enabling business users, product owners, and executives to access relevant data directly through user-friendly tools.
In a self-service BI strategy, it is essential to add a semantic layer that business teams can understand. Data modeling and transformation tools like dbt or LookML (Looker's modeling language) help structure data into models expressed in business terms. This allows end users to manipulate and visualize data easily through solutions like Power BI, Looker, Metabase, or even Excel, without requiring advanced technical expertise.
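To make this concrete, here is a minimal dbt-style sketch of such a model; every table and column name is hypothetical, and the point is simply that technical fields are renamed and aggregated once, in business vocabulary, for every downstream tool to reuse:

    -- models/marts/customer_revenue.sql (hypothetical dbt model)
    -- Raw, technically named fields are exposed once under stable business terms.
    select
        cust_uuid                     as customer_id,
        upper(cust_segment_cd)        as customer_segment,
        date_trunc('month', order_ts) as order_month,
        sum(order_amt_net)            as net_revenue,   -- the single agreed definition
        count(distinct order_uuid)    as order_count
    from {{ ref('stg_orders') }}
    group by 1, 2, 3

Because every dashboard reads from this one model, there is a single definition of "net revenue" rather than competing versions scattered across tools.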
Implementing a self-service strategy enhances the autonomy of business teams, enabling them to access data directly without relying on IT teams, thus accelerating decision-making processes. It also contributes to creating a centralized platform, facilitating data access across the entire organization. By strengthening internal data culture, this approach helps teams better understand, use, and analyze data in their daily operations.
However, the increase in tools and access points can lead to "data chaos," where multiple versions of the same business metric coexist, complicating analysis. To avoid these issues, it is crucial to establish a robust semantic layer, ensuring consistent and uniform data interpretation across all users.
MDN’s point of view
This is the dream of every data team. However, it requires impeccable data structuring, clear governance, transparency, and upskilling of business teams in data-driven decision-making. This topic is closely linked to the implementation of One Big Table (OBT) models, a semantic layer, and especially an easily exploitable gold layer.
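As a sketch of the pattern (schema and table names are hypothetical), an OBT in the gold layer is a wide, pre-joined table that business tools can query without understanding the underlying model:

    -- Hypothetical gold-layer OBT: one wide, denormalized table per analysis domain.
    create table gold.orders_obt as
    select
        o.order_id,
        o.ordered_at,
        c.customer_name,
        c.customer_segment,
        p.product_name,
        p.product_category,
        o.quantity,
        o.quantity * p.unit_price as gross_revenue   -- pre-computed for direct use
    from silver.orders    o
    join silver.customers c on c.customer_id = o.customer_id
    join silver.products  p on p.product_id  = o.product_id;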
Theodo’s point of view
At Theodo, we recommend implementing a self-service BI strategy for companies with a significant pool of users who can benefit from simplified data access. Self-service strategies improve business team efficiency, reduce the burden on IT teams, and enhance data literacy.

Metabase
Users familiar with Qlik or Power BI know that creating visuals can be tedious, especially when data is poorly prepared. Additionally, the complex interfaces of these tools often discourage beginners. Beyond these technical difficulties, rigid dashboard deployment processes can slow down functional teams in their work.
Metabase positions itself as a Business Intelligence (BI) solution aimed at solving these issues by focusing on pragmatic data exploration.
This open-source tool, with a modern interface, allows for dashboard creation while also offering additional features that are sometimes missing in other BI technologies.
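One example of these features is Metabase's native SQL questions, where template tags turn parts of a query into filter widgets that non-technical users can drive themselves. A minimal sketch, with hypothetical table and field names:

    -- Hypothetical Metabase native question: {{segment}} renders as a filter widget.
    select
        date_trunc('month', created_at) as month,
        count(*)                        as signups
    from users
    where true
      [[and segment = {{segment}}]]   -- optional clause, dropped when the filter is empty
    group by 1
    order by 1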
MDN’s point of view
Metabase is a simple and ideal solution for business teams that are not comfortable with BI tools. With the self-hosted version, access to usage data is valuable for analyzing how the tool is used and making informed decisions. The roadmap is detailed and ambitious for the coming months and years. Superset could be a more suitable open-source alternative for expert analysts looking for a wider range of visualizations.
However, Metabase has some limitations. Its high flexibility for data exploration requires good data governance practices, especially regarding access control and KPI sharing. The available visualizations are simple, with limited customization options. Finally, SSO and row-level security (RLS) are only available in the paid Pro and Enterprise versions.
Theodo’s point of view
We recommend Metabase for ad-hoc data exploration and the autonomy it provides to functional teams. It offers similar features to Looker (SQL programming, reusable visuals) while remaining more affordable and accessible for non-developers.

Power BI
Power BI is Microsoft’s Business Intelligence solution and one of the most popular tools on the market. It enables users to develop and share interactive dashboards with a large audience. Its main strengths include its versatility across use cases, strong integration with the Microsoft ecosystem, and a high level of customization.
However, Power BI’s interface can feel overwhelming and requires some persistence to master. The tool is also prone to bugs, particularly related to its relational modeling features.
The solution is not available on Unix-based systems (such as macOS). Power BI can be tested for free on Windows, but a subscription is required to publish and view reports. Additionally, optimizing dashboards requires proficiency in two proprietary languages: DAX for measures and M (or Power Query) for data transformations.
Theodo’s point of view
We recommend trying Power BI for its versatile use cases, strong integration with the Microsoft ecosystem, high level of customization, and the availability of experts on the market.

Looker Studio
Looker Studio, formerly known as Data Studio, is a Business Intelligence tool within the Google Cloud ecosystem that offers a key advantage over other popular BI tools: a set of free basic BI features, with the option to upgrade to a pro (paid) version for additional capabilities. This free access aligns with Looker Studio’s goal of democratizing data access for business teams.
Beyond its native integration with Google services such as BigQuery and Google Sheets, Looker Studio supports connections to numerous external data sources, such as CSV files and third-party APIs. Reports can be shared or embedded in web interfaces. However, the tool has limitations in terms of customization for complex use cases, performance with large data volumes, and governance management.
Some features are also locked behind the Pro (paid) version.
Theodo’s point of view
We recommend adopting Looker Studio for companies seeking a fast and cost-effective reporting solution, especially those already using the Google Cloud ecosystem. For more advanced requirements, it may be worth considering more robust alternatives such as Looker or Power BI.

Looker
Looker, not to be confused with Looker Studio, is a Google Cloud Business Intelligence tool designed to address key challenges faced by business teams and data analysts.
Beyond the traditional dashboarding functions found in BI tools, Looker’s distinctive capabilities center on LookML, its modeling language for defining a shared semantic layer. The tool integrates seamlessly with Google Cloud services, SQL databases, and Git. It also allows users to export visualizations and embed them externally, while maintaining access controls, particularly for BigQuery through Google Cloud IAM.
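For example, read access to the dataset Looker queries can be granted with BigQuery's SQL DCL statements, keeping IAM as the single access-control point; the project, dataset, and group names below are hypothetical:

    -- Hypothetical grant: analysts get read-only access to the dataset behind Looker.
    grant `roles/bigquery.dataViewer`
    on schema `my_project.analytics`
    to "group:business-analysts@example.com";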
However, Looker has some limitations. Managing complex use cases in LookML can be challenging, and its flexibility is somewhat restricted (it is not open-source, and its supported data sources are limited to databases). Additionally, teams need training to use it effectively. Looker is also one of the most expensive BI solutions on the market.
Theodo’s point of view
We recommend Looker and its semantic layer for centralizing data and empowering business teams, provided that users receive adequate training. For simple dashboarding, more cost-effective alternatives, such as Looker Studio (free on GCP) or traditional BI solutions like Tableau and Power BI, may be more suitable.

Text-to-SQL
Text-to-SQL allows non-technical users to formulate queries in natural language, which are then automatically converted into SQL. This field is rapidly evolving, driven by generative models such as GPT-4o and Claude Sonnet, along with initiatives like Uber’s QueryGPT, which aim to improve performance by incorporating more context and evaluating the quality of generated queries.
These technologies leverage intelligent agents to interpret user intent and generate queries tailored to specific databases. However, despite significant progress, they still face limitations. Common issues include translation errors in SQL, such as inventing nonexistent columns, and challenges in handling more complex queries involving multiple joins or subqueries. Additionally, contextual understanding of data remains imperfect.
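The typical failure is easy to picture. For the hypothetical question "What was last month's revenue per country?", a model can generate SQL that looks plausible but references a column that does not exist on the queried table:

    -- Generated SQL (Postgres-style) for: "last month's revenue per country"
    select
        country,              -- pitfall: orders has no country column;
                              -- it lives on the customers table
        sum(total_amount) as revenue
    from orders
    where order_date >= date_trunc('month', current_date) - interval '1 month'
      and order_date <  date_trunc('month', current_date)
    group by country;
    -- A correct query needs schema knowledge the model may lack:
    -- a join to customers before aggregating.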
Theodo’s point of view
We recommend a cautious evaluation of these technologies. While they can boost productivity for simple queries, they are not yet reliable for complex environments without human supervision.

Datamarts
Datamarts are specialized subsets of data warehouses designed to meet the analytical needs of a specific department or business function. In the context of Business Intelligence (BI) tools, a datamart goes beyond simply copying tables: data undergoes essential transformations—cleaning, structuring, and summarization—to make it immediately usable for visualization. The focus is on delivering preprocessed, relevant information tailored to specific use cases.
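A minimal sketch of such a datamart build, assuming a hypothetical sales schema, shows the three steps at work:

    -- Hypothetical finance datamart: cleaned, structured, and pre-aggregated.
    create table finance_mart.monthly_sales as
    select
        date_trunc('month', s.sold_at)     as sales_month,   -- structuring
        coalesce(r.region_name, 'Unknown') as region,        -- cleaning
        sum(s.amount_net)                  as net_sales,     -- summarization
        count(distinct s.order_id)         as order_count
    from warehouse.sales s
    left join warehouse.regions r on r.region_id = s.region_id
    where s.status = 'completed'                             -- drop invalid rows
    group by 1, 2;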
While datamarts offer better performance, ease of use, and ready-to-use data, they also have limitations. They lack flexibility and can lead to functional and technical duplication, which may cause broader governance issues. Additionally, business users remain dependent on technically skilled data analysts to expand their analysis scope, as only they can modify or create new datamarts.
The rise of distributed data warehouses and SQL-based transformation tools has led to diverging approaches between data engineers and data analysts. Data engineers favor centralized, robust solutions aligned with Inmon’s methodology, while data analysts prefer more agile and flexible methods, closer to Kimball’s philosophy.
Even though data is now processed and stored in the same warehouse, it is often organized into distinct zones or schemas, blurring the traditional distinction between datamarts and data warehouses in favor of a more integrated approach.
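In practice this often reduces to schemas within a single warehouse; a sketch with hypothetical names, where the gold zone plays the role the standalone datamart used to play:

    -- One warehouse, several zones instead of separate datamart systems.
    create schema if not exists staging;       -- raw loads, engineer-owned (Inmon-style core)
    create schema if not exists intermediate;  -- conformed, tested models
    create schema if not exists gold;          -- analyst-facing, Kimball-style marts

    create view gold.revenue_by_segment as
    select customer_segment, sum(net_revenue) as net_revenue
    from intermediate.customer_revenue
    group by customer_segment;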
Theodo’s point of view
We recommend reassessing the use of datamarts, as their adoption has declined over the past decade for the reasons mentioned above. However, this paradigm remains relevant in cases where data engineers and analysts use different tools or when processing capabilities are limited or costly. Instead, we advocate for implementing a semantic layer, which provides greater flexibility and scalability for data analysis.

Data marketplaces
A data marketplace is a platform for data sharing where businesses can make their data accessible to other users. These platforms allow companies to select and purchase datasets through standardized APIs. By facilitating inter-company collaborations, data marketplaces not only provide an opportunity to generate additional revenue but also enable access to valuable external data sources. This enhances decision-making and optimizes internal operations.
This approach supports the growth of the data-as-a-service (DaaS) market, making data exchanges more seamless and standardized. By developing APIs for real-time data sharing, companies can convert their data into direct revenue, increasing their profit margins and overall profitability.
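As one concrete mechanism (named here only as an example, distinct from the APIs mentioned above), warehouses such as Snowflake let a provider expose read-only datasets to a consumer account in a few SQL statements, with no file exports; all object and account names below are hypothetical:

    -- Hypothetical Snowflake data share: the consumer queries the data in place.
    create share sales_share;
    grant usage on database analytics to share sales_share;
    grant usage on schema analytics.public to share sales_share;
    grant select on table analytics.public.daily_sales to share sales_share;
    alter share sales_share add accounts = partner_account;  -- consumer's account locator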
Data sharing has become a crucial driver for the growth of modern businesses. Many companies store data that, rather than being a cost, actually represents a monetization opportunity. By establishing data marketplaces, they allow other businesses to access these valuable datasets, creating a new revenue stream.
Companies that implement a data-sharing strategy can see an average revenue increase of 1 to 3%, while also benefiting from a 10 to 20% reduction in operational costs. According to a McKinsey study, by 2025, 80% of companies are expected to integrate external data into their decision-making processes, making them even more competitive.
Theodo’s point of view
We recommend data sharing for companies that have a deep understanding of their data assets and a clear vision for their projects. To ensure success, we advise applying rigorous governance from the outset to avoid costly adjustments and guarantee an optimal return on investment.

Analytics as Code
Analytics as Code is revolutionizing the work of data analysts. This approach overcomes quality issues related to the duplication and manual modification of dashboards and SQL queries. By adopting software development methodologies, it introduces modularity and versioning, significantly improving the quality and reproducibility of analyses.
The transition to code is essential but complex. A promising evolution is underway, with the emergence of declarative semantic layers such as LookML and the shift from graphical ETL tools to code-first solutions like dbt. Although these advancements remain fragmented, they pave the way for a more consistent analytical experience.
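With dbt, for instance, an analysis becomes a versioned, modular SQL file reviewed in Git like any other code; the model names below are hypothetical:

    -- models/marts/weekly_active_users.sql — versioned, reviewed, and tested like code.
    -- ref() reuses the upstream staging model instead of duplicating its SQL.
    select
        date_trunc('week', event_at) as week,
        count(distinct user_id)      as weekly_active_users
    from {{ ref('stg_events') }}
    where event_type = 'session_start'
    group by 1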
Innovative companies are offering integrated solutions that apply this code-first approach end to end.
These tools have the potential to radically transform the analytical workflow, making it more efficient and collaborative. Some solutions excel in modularity but struggle to demonstrate a quick ROI, while others, though more comprehensive, lack the power of established systems.
Theodo’s point of view
We recommend exploring these technologies with interest but caution. Their potential is undeniable, but their maturity and integration into complex environments remain unproven. A careful evaluation is necessary to determine their added value compared to existing cloud solutions and modern data stacks.

Qlik Sense
Qlik Sense comes from one of the oldest Business Intelligence (BI) vendors still on the market: Qlik was founded in 1993, and Qlik Sense is the modern successor to its original product, QlikView. It offers features very similar to Power BI (advanced visualizations, contextual variables, etc.), but with a less user-friendly interface. Some functionalities, such as cumulative sums, are hidden within a complex UI, making them harder to access.
Another drawback is Qlik’s official documentation, which is poorly indexed by search engines. Searching for help often leads to the official forums, where the quality of answers is inconsistent. Additionally, set analysis, the proprietary expression syntax used to calculate Master Items (Qlik’s equivalent of Power BI measures), combined with a lack of debugging tools, makes Qlik difficult for beginner data analysts.
One of Qlik’s key features is QlikView Data files (QVDs), binary files that allow fast data reading. However, setting them up is not automatic and requires a dedicated Qlik application, adding extra development effort compared to other BI tools.
Theodo’s point of view
We do not recommend Qlik as a BI tool. It feels outdated, with a complex and rigid interface. The writing and reading of QVDs lack transparency and scalability. For reporting needs, tools like Power BI or Looker Studio provide better usability and documentation.
