
Mistakes businesses make when it comes to Analytics Projects

I have had the pleasure (and the ensuing pain) of working in the Analytics field for more than 20 years now. I have worked at both ends of the spectrum. I have been involved in small-scale departmental analytics projects involving a few analysts. I have also been...

Expectations v Reality in Enterprise Analytics

Big Data is a big deal in the enterprise, and for good reason. If fully harnessed and leveraged, it has the potential to change the fate of an organisation. Enter the Internet of Things (IoT), and the data just got bigger, by a lot. And it is not just the...

Pentaho 8.1 Release – A Quick Summary

The Pentaho 8.1 Enterprise Edition was released a few months ago. It delivers a wide range of features and improvements, including: Improved Streaming Steps in PDI, Increased Spark Capabilities in PDI, Enhancements to Google Cloud Data, and Increased AWS Security...

Pentaho 8.1 Release – Doubling Down on its Core

At BizCubed, we are proud to be a Hitachi partner organisation. A key reason is that Hitachi has remained consistently committed to interoperability of data processing and analytics across complex data ecosystems, a core cause we are also driving. Through the...

What your Data Life Cycle probably looks like

From Analyst, to Scientist, to Engineer. A common problem we see is that different roles use different methods of preparing and reporting data. The result is very little collaboration between the key data-driven roles within the organisations that we...

Your competitors are doing this. Why aren’t you?

Unlock your business – your team has the answers. We have recently been reviewing the impact that data quality has on workforce productivity. In the analyses we have seen, 10–32% of workforce time is wasted due to poor data quality...

5 Keys to Creating a Killer Data Lake

It’s been several years since the term “Data Lake” was coined by my friend and Pentaho co-founder James Dixon.  The idea continues to be a hot topic and a challenge to execute properly. The problem is that too many people think all they need to do is dump data into...

Filling the Data Lake with Hadoop and Pentaho

The blueprint for filling the data lake refers to a modern data onboarding process for ingesting big data into Hadoop data lakes that is flexible, scalable, and repeatable.  It streamlines data ingestion from a wide variety of source data and business users, reduces...
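
To make the "repeatable" part of that blueprint concrete, here is a minimal sketch of a metadata-driven onboarding loop using the embedded Kettle (PDI) API. The transformation file name (ingest_source.ktr), the SOURCE_NAME variable, and the list of sources are hypothetical placeholders; the point is that one parameterised transformation is reused for every feed rather than a hand-built job per source.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class DataLakeOnboarding {

    public static void main(String[] args) throws Exception {
        // Initialise the Kettle environment (plugin registry, etc.) once per JVM.
        KettleEnvironment.init();

        // One parameterised transformation handles every source feed.
        // "ingest_source.ktr" and the source list below are placeholders.
        String[] sources = {"crm_customers", "pos_sales", "web_clickstream"};

        for (String source : sources) {
            TransMeta transMeta = new TransMeta("ingest_source.ktr");
            Trans trans = new Trans(transMeta);

            // The transformation reads ${SOURCE_NAME} to decide what to pull
            // and where in the Hadoop data lake to land it.
            trans.setVariable("SOURCE_NAME", source);

            trans.execute(null);        // start the transformation
            trans.waitUntilFinished();  // block until it completes

            if (trans.getErrors() > 0) {
                throw new IllegalStateException("Onboarding failed for source: " + source);
            }
        }
    }
}
```

In practice the source list would itself come from a metadata table rather than being hard-coded, which is what keeps the onboarding process flexible and scalable as new feeds arrive.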

Introducing the Adaptive Execution Layer and Spark Architecture

Recently Pentaho announced the release of Pentaho 7.1. Don’t let the point release numbering make you think this is a small release. This is one of the most significant releases of Pentaho Data Integration! With the introduction of the Adaptive Execution Layer (AEL)...
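
AEL is switched on through a PDI run configuration rather than through code, so an existing transformation needs no rewrite to run on Spark. For contrast, the hand-written alternative it replaces looks roughly like the sketch below (paths, app name, and column names are assumed placeholders): a dedicated Spark program per workload instead of one transformation that can run on either engine.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HandWrittenSparkJob {

    public static void main(String[] args) {
        // A purpose-built Spark job: the kind of per-workload coding that
        // running an existing PDI transformation through AEL avoids.
        SparkSession spark = SparkSession.builder()
                .appName("sales-aggregation")   // placeholder app name
                .getOrCreate();

        // Placeholder input path and columns, for illustration only.
        Dataset<Row> sales = spark.read()
                .option("header", "true")
                .csv("hdfs:///landing/pos_sales/*.csv");

        // Aggregate sales rows per store.
        Dataset<Row> byStore = sales.groupBy("store_id").count();

        byStore.write()
                .mode("overwrite")
                .parquet("hdfs:///curated/sales_by_store");

        spark.stop();
    }
}
```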

Westfield Case Study

Westfield Group has one of the world’s largest shopping centre portfolios with locations on four continents. Last year more than one billion customer visits generated over $40 billion in retail sales from the world’s leading retailers. Wanting to improve the customer...

De Bortoli Wines Case Study

De Bortoli Wines is a third generation family wine company established by Vittorio and Giuseppina De Bortoli in 1928. The couple immigrated to Australia from Northern Italy, and created a winery with a foundation rooted in hard work, generosity of spirit and sharing...