Most organizations have some experience building and maintaining a data warehouse or a data mart. Most of these warehouses and marts took many people a long time to build and cost the organization a significant amount of money. As the primary data source behind BI and analytics programs, the data warehouse is considered a critical component for storing and managing historical data. Over time, enterprises have learned the tools, trained the staff, and put procedures and processes in place to build and maintain the data warehouse. If the enterprise has done it the way the experts said, everything should be running smoothly. Right?
Yet data warehousing often gets a bad rap. Why? Because warehouses require a lot of money and a lot of people a long time to deliver. It’s not that the IT staff or your outsourced systems integrator is incapable of delivering. Usually it is because the data warehouse is complex to build, the analytical requirements change regularly and rapidly, and the traditional data warehouse build is largely a manual effort. Yes, software tools are available, but there is still much manual labor involved in the process. With so many moving parts, it’s easy to make errors, and if enough time elapses before go-live and between updates, the business requirements can change.
As with many modern conveniences, automation is a time-saver and a labor-saver. Yet data warehousing, for all its value to an enterprise, remains a project largely built, maintained, and operated using manual techniques. Consider these high-level steps: business requirements are captured on paper or in tools, ending with a logical design. To translate the logical design into a physical schema, DBAs write SQL programs. The data integration team uses ETL tools to map and transform source data to target data, usually requiring hundreds or thousands of discrete jobs to load the data warehouse. Then the BI team does more programming to build a semantic layer for BI tool access. This process is repeated each time a new data source is identified or new business requirements arise. Why not bring some automation to the data warehouse design, build, and operation activities? Data warehousing is an activity ripe for automation. Without a way to reduce the delivery delays caused by traditional data warehouse bottlenecks, data consumers will look for their own solutions, and as ungoverned rogue data silos and other shadow IT projects arise, debates about whose numbers are correct will only increase.
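To make the manual mapping step concrete, here is a minimal sketch of one hand-coded source-to-target transformation job. This is purely illustrative, in Python with hypothetical field names (cust_id, ord_dt, order_total); a real warehouse load involves hundreds or thousands of jobs like this, each written and maintained by hand.

```python
# One hand-written source-to-target mapping job, of the kind an ETL
# developer maintains manually. All field names here are hypothetical.

def transform_order(source_row: dict) -> dict:
    """Map one raw source record to a target fact-table row."""
    return {
        "customer_key": int(source_row["cust_id"]),
        "order_date": source_row["ord_dt"],         # assumes ISO date string
        "revenue": float(source_row["order_total"]),
        "currency": source_row.get("curr", "USD"),  # default when missing
    }

# Every new source system or changed requirement means revisiting
# code like this by hand, one job at a time.
rows = [{"cust_id": "1001", "ord_dt": "2018-03-14", "order_total": "250.00"}]
fact_rows = [transform_order(r) for r in rows]
```

Multiply this by every table, every source, and every requirement change, and the appeal of automating the design-build-operate cycle becomes clear.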
Attunity Compose is designed to bring automation to the data warehouse design, build, and operate processes. By automating the tedious, time-consuming, repetitive tasks, enterprises can realize significant time savings – TDWI estimates automation makes building the data warehouse up to 5 times faster. One Attunity Compose customer has reported being able to deploy a data mart in a single day, compared to 2 weeks before using automation – that’s 14 times faster!
While the primary process remains the same, the difference is that automation accelerates it. Attunity Compose works within common data warehouse environments, using the same skilled people and the same processes your teams know – except now those processes are automated and complete much faster. Enterprises can build and launch the first version of the data warehouse in days or weeks, and they can update it for new requirements much faster as well. Another Attunity Compose customer reported that they can now implement six times as many changes to the data warehouse in the same period as before switching to Attunity Compose.
Interested in learning more? Check out one analyst’s view and then contact us to see how data warehouse automation can enable your enterprise to get more value from your data warehouse.
John Evans is a software marketing professional with over 20 years of experience with enterprise software working with companies that include Attunity, Oracle, Kalido, and IBM.