As companies like Carousell push more reporting into cloud data platforms, a bottleneck is showing up inside business intelligence stacks. Dashboards that once worked fine at small scale begin to slow down, queries stretch into tens of seconds, and minor schema errors ripple through reports. In short, teams find themselves balancing two competing needs: stable executive metrics and flexible exploration for analysts.
The tension is becoming common in cloud analytics environments, where business intelligence (BI) tools are expected to serve both operational reporting and deep experimentation. The result is often a single environment doing too much, acting as a presentation layer, a modelling engine, and an ad-hoc compute system at once.
A recent architecture change inside Southeast Asian marketplace Carousell shows how some analytics teams are responding. Details shared by the company's analytics engineers describe a move away from a single overloaded BI instance towards a split design that separates performance-critical reporting from exploratory workloads. While the case reflects one organisation's experience, the underlying problem mirrors broader patterns seen in cloud data stacks.
When BI becomes a compute bottleneck
Modern BI tools allow teams to define logic directly in the reporting layer. That flexibility can speed up early development, but it also shifts compute pressure away from optimised databases and into the visualisation tier.
At Carousell, engineers found that analytical "Explores" were frequently linked to extremely large datasets. According to Analytics Lead Shishir Nehete, datasets sometimes reached "hundreds of terabytes in size," with joins executed dynamically inside the BI layer, not upstream in the warehouse. The design worked, until scale exposed its limits.
Nehete explains that heavy derived joins led to slow execution paths. Explores pulling large transaction datasets were assembled on demand, which increased compute load and pushed query latency higher. The team found that 98th percentile query times averaged roughly 40 seconds, long enough to disrupt business reviews and stakeholder meetings. The figures are based on Carousell's internal performance monitoring, as supplied by the analytics team.
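To make the metric concrete: a 98th percentile (p98) query time means 98% of queries finish faster than that figure, so it captures the slow tail that users actually notice. A minimal sketch of how such a figure is computed from raw query timings, with invented sample numbers rather than Carousell's actual data:

```python
# Hypothetical illustration: compute a 98th-percentile query latency
# from a list of raw query durations (seconds). Numbers are invented.

def percentile(samples, pct):
    """Return the pct-th percentile using the nearest-rank method."""
    ordered = sorted(samples)
    # Nearest-rank: ceil(pct/100 * n), converted to a 0-based index.
    rank = max(1, -(-len(ordered) * pct // 100))  # ceiling division
    return ordered[rank - 1]

# Mostly fast queries with a slow tail, as in an overloaded BI layer.
latencies = [1.2, 0.8, 2.5, 1.1, 0.9, 3.0, 41.7, 1.4, 0.7, 38.2]

p98 = percentile(latencies, 98)
print(f"p98 latency: {p98:.1f}s")  # the slow tail dominates the p98
```

Averages would hide that tail entirely; here the mean is under 10 seconds while the p98 sits above 40, which is why tail percentiles are the standard yardstick for dashboard responsiveness.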
Performance was only part of the problem: governance gaps created additional risk. Developers could push changes directly into production models without tight tests, which helped feature delivery but introduced fragile dependencies. A small error in a field definition could cause downstream dashboards to fail, forcing engineers into reactive fixes.
Separating stability from experimentation
Rather than continue to fine-tune the existing setup, Carousell engineers chose to rethink where compute work should live. Heavy transformations were moved upstream into BigQuery pipelines, where database engines are designed to perform large joins. The BI layer shifted towards metric definition and presentation.
The bigger change came from splitting responsibilities across two BI instances. One environment was dedicated to pre-aggregated executive dashboards and weekly reporting. The datasets were prepared in advance, allowing leadership queries to run against optimised tables instead of raw transaction volumes.
The second environment remains open for exploratory analysis. Analysts can still join granular datasets and test new logic without risking performance degradation in their executive colleagues' workflows.
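The pre-aggregation idea can be sketched in a few lines: raw transaction rows are rolled up ahead of time, so dashboard queries touch a small summary table rather than the raw volume. All field names and values here are invented for illustration; Carousell's actual pipelines run in BigQuery, not application code.

```python
from collections import defaultdict
from datetime import date

# Invented raw transaction rows, as a warehouse fact table might hold them.
transactions = [
    {"day": date(2024, 5, 1), "category": "electronics", "amount": 120.0},
    {"day": date(2024, 5, 1), "category": "fashion",     "amount": 45.5},
    {"day": date(2024, 5, 2), "category": "electronics", "amount": 80.0},
    {"day": date(2024, 5, 2), "category": "electronics", "amount": 60.0},
]

# Pre-aggregation: roll raw rows up to (day, category) totals once, upstream,
# so executive dashboards never scan raw transactions at query time.
summary = defaultdict(lambda: {"orders": 0, "revenue": 0.0})
for row in transactions:
    key = (row["day"], row["category"])
    summary[key]["orders"] += 1
    summary[key]["revenue"] += row["amount"]

for (day, category), agg in sorted(summary.items()):
    print(day, category, agg["orders"], agg["revenue"])
```

The dashboard then reads the handful of summary rows instead of the full transaction history, which is why leadership queries stay fast regardless of raw data growth.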
The dual structure reflects a broader cloud analytics principle: isolate high-risk or experimental workloads from production reporting. Many data engineering teams now apply similar patterns in warehouse staging layers or sandbox projects. Extending that separation into the BI tier helps maintain predictable performance under growth.
Governance as part of infrastructure
Stability also relied on stronger release controls. BI Engineer Wei Jie Ng describes how the new environment introduced automated checks via Looker CI and Look At Me Sideways (LAMS), tools that validate modelling rules before code reaches production. "The system now automatically catches SQL syntax errors," Ng says, adding that failed checks block merges until issues are corrected.
Beyond syntax validation, governance rules enforce documentation and schema discipline. Every dimension requires metadata, and connections must point to approved databases. The controls reduce human error while creating clearer data definitions, an important foundation as analytics tools begin to add conversational interfaces.
According to Carousell engineers, structured metadata prepares datasets for natural-language queries. When conversational analytics tools read well-defined models, they can map user intent to consistent metrics instead of guessing relationships.
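A minimal sketch of that kind of pre-merge check, with invented model fields and rule names rather than Carousell's actual LAMS configuration: a dimension without a description, or a model pointing at an unapproved connection, produces violations that would block the merge.

```python
# Hypothetical pre-merge governance check: every dimension must carry a
# description, and the model connection must be on an approved list.

APPROVED_CONNECTIONS = {"warehouse_prod"}

def validate_model(model):
    """Return a list of violations; an empty list means the model passes."""
    errors = []
    if model.get("connection") not in APPROVED_CONNECTIONS:
        errors.append(f"unapproved connection: {model.get('connection')}")
    for dim in model.get("dimensions", []):
        if not dim.get("description"):
            errors.append(f"dimension '{dim['name']}' is missing a description")
    return errors

model = {
    "connection": "warehouse_prod",
    "dimensions": [
        {"name": "order_id", "description": "Unique order identifier"},
        {"name": "gmv"},  # no description: should be flagged
    ],
}

violations = validate_model(model)
print(violations)  # CI would block the merge while this list is non-empty
```

In practice such rules run in CI against the LookML source, but the shape is the same: machine-checkable metadata requirements, enforced before deployment rather than discovered in production.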
Performance gains – and fewer firefights
After the redesign, the analytics team reported measurable improvements. Internal monitoring shows those 98th percentile query times falling from over 40 seconds to under 10 seconds. The change altered how business reviews unfold: instead of asking whether dashboards were broken, stakeholders could discuss the data live. Just as importantly, engineers could shift away from constant troubleshooting.
While every analytics environment has unique constraints, the broader lesson is straightforward: BI layers should not double as heavy compute engines. As cloud data volumes grow, separating presentation, transformation, and experimentation reduces fragility and keeps reporting predictable.
For teams scaling their analytics stacks, the question isn't about tooling choice but about architectural boundaries – deciding which workloads belong in the warehouse and which stay in BI.
See also: Alphabet boosts cloud investment to meet rising AI demand
(Image by Shutter Speed)


