When our clients consider leaving their legacy Microsoft Dynamics GP on-premises infrastructure to upgrade to Cloud-based Microsoft Dynamics 365 Business Central, they face a few challenges. One of the biggest may be what to do with all the data, sometimes over 20 years of data, currently housed in their servers.
Migrating all of your company’s historical data into Business Central can be very expensive and is often unnecessary. At the same time, you can’t simply throw that data out or leave it sitting on a server in a dusty basement closet. You still need access to what is a large, unwieldy pile of information.
The solution lies in the Microsoft Azure Data Lake.
Let’s review the most common questions that clients ask our Enavate experts about the Microsoft Azure Data Lake before, during and long after their Cloud migration.
The Azure (pronounced AZH-er) Data Lake as we know it today is relatively young, with the current Gen 2 version having been released in February 2019. One of the main advantages of the Azure Data Lake is that it can store both structured and unstructured data, such as pictures, improving the Microsoft ecosystem and Cloud experience for businesses of all sizes, from small and mid-size companies up to very large enterprises.
No, a Data Lake isn’t technically a database, which most people understand to be a relational database. Instead, it’s a central repository for unstructured and structured data in various formats that are frequently sourced from multiple channels. The Data Lake is a highly scalable, highly secure data storage platform for data that we can still access with common reporting and analytics software. So, a Data Lake is a holding tank for all sorts of data, including years of historical financial information on your business.
Data Lakes today sit primarily in the Cloud. The scalability of the Cloud allows organizations to store massive amounts of information. Sophisticated Cloud reporting tools such as Power BI enable you to query and gain insight from structured and unstructured data from any channel. With a Cloud Data Lake and Power BI, you never have to ask yourself, “What is the business value inside all of this unstructured data?” That’s because a Cloud Data Lake can provide a single location source for an entire organization’s data, including legacy data.
Today’s Data Lakes are fully managed under a software-as-a-service (SaaS) model with a Cloud provider such as Microsoft Azure that handles provisioning, security, reporting and performance tuning.
The cost of a Data Lake depends on your selected vendor and several variables, including how much storage you consume, how often the data is accessed and which redundancy options you choose.
For example, if you select Microsoft Azure Data Lake storage as your service provider, there are several options with different pricing. It’s free to turn a Data Lake on, as you only pay for the actual storage you consume, and you can enable the Data Lake service anytime. There doesn’t have to be a big-bang migration of legacy data before you run this week’s payroll!
Our advice: first, focus on migrating to Microsoft Dynamics 365 Business Central with your active data flows. Then return to the legacy data and run business reporting and analytics on Azure Data Lake when you’re ready.
The most notable benefits of having a Microsoft Azure Data Lake include scalable pay-as-you-go storage, support for both structured and unstructured data, strong security and easy access from reporting tools such as Power BI.
“Many Enavate customers migrating to Microsoft Dynamics 365 Business Central come from older systems. They want to be able to access the 20 years of data that they won’t end up migrating to the new platform but don’t want to keep their old Microsoft GP server running. Instead, they send all that legacy data to the Data Lake, where it’s safe and accessible. Then they can use Power BI to interact with the data and understand what it’s trying to tell them.”
– Melissa Grover, BI and Practice Leader at Enavate
Data Lakes can be complicated to set up and manage, which is why Microsoft Azure offers a SaaS Data Lake product that greatly simplifies setup and management. If your organization has a small IT team or is new to big data, an Azure Data Lake is a good option that allows you to keep your historical information securely in the Cloud. You’ll also need a strong Microsoft partner to help manage the data transition to the Data Lake.
With the Cloud, the sky really is the limit.
The Data Lake is a stepping stone in an organization’s data journey. It is the next step in asking, “How do I pull data together and do something intelligent with it? How do I turn my structured and unstructured data into business information that I can make critical strategic decisions against?”
Think of your Data Lake as a literal pool of information. It sits at the bottom of an analytical and technical process that draws data from that lake and funnels it through various branches of your IT infrastructure. A Data Lake serves an important function for organizations seeking to take both structured and unstructured data and turn it into actionable decision-making information for their business. It is the first step toward all the exciting technologies you’ve heard about, like predictive analytics and artificial intelligence (AI). But a Data Lake also serves an essential role for any company considering migrating from legacy Microsoft platforms into the Cloud.
Cloud security is always a concern for clients migrating from an on-premises environment they control. But Cloud-based Data Lakes offer increased security over on-premises data storage, particularly if you move to a trusted Cloud hosting service. Microsoft Azure holds the most security compliance certifications in the industry, with 15 global, 31 regional and 64 industry-specific formal compliance certifications. This level of security requires an economy of scale that most organizations can’t afford on their own, especially small businesses.
Some of the most common security measures for Azure Data Lakes include encryption of data at rest and in transit, Azure role-based access control, POSIX-style access control lists and network-level firewall rules.
Because Azure Data Lake is a fully managed SaaS solution, businesses do not need to stand up any infrastructure, virtual machines or Kubernetes containers. All they need to do is turn the Data Lake on via the Azure administrative portal and start channeling data into it.
But the processes for channeling that data can be complicated. Depending on the data type, whether it is historical or legacy information or data flowing in real-time, there are multiple processing frameworks for migrating and potentially standardizing and cleaning the data before it hits the Data Lake. As this can differ for each client, talk with your Enavate partner about your options.
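To make the standardizing-and-cleaning step concrete, here is a minimal Python sketch. The field names and formats are hypothetical, for illustration only; they are not actual Dynamics GP column names, and a real pipeline would use a proper ETL framework chosen with your partner.

```python
from datetime import datetime

# Hypothetical raw rows exported from a legacy system; the field names
# below are illustrative, not actual Dynamics GP column names.
RAW_ROWS = [
    {"doc_date": "07/14/2003", "amount": "$1,250.00", "account": "4100"},
    {"doc_date": "2003-07-15", "amount": "300",       "account": "4100"},
    {"doc_date": "",           "amount": "$99.50",    "account": ""},  # incomplete row
]

def normalize_date(value: str) -> str:
    """Coerce mixed legacy date formats into ISO 8601 (YYYY-MM-DD)."""
    for fmt in ("%m/%d/%Y", "%Y-%m-%d"):
        try:
            return datetime.strptime(value, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {value!r}")

def normalize_amount(value: str) -> float:
    """Strip currency symbols and thousands separators."""
    return float(value.replace("$", "").replace(",", ""))

def clean(rows):
    """Drop incomplete rows and standardize the rest before they land in the lake."""
    for row in rows:
        if not (row["doc_date"] and row["account"]):
            continue  # in practice, route incomplete rows to an exceptions file
        yield {
            "doc_date": normalize_date(row["doc_date"]),
            "amount": normalize_amount(row["amount"]),
            "account": row["account"],
        }

cleaned = list(clean(RAW_ROWS))  # two standardized rows; the incomplete one is dropped
```

The point is simply that data from different eras and systems rarely agrees on formats, so a normalization pass like this (however it is implemented) usually runs before the data hits the lake.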
The specific methods for monitoring data in a Cloud Data Lake depend on the company’s requirements and the Data Lake provider you use. For example, Microsoft Azure Data Lake storage offers several tools and services to audit your data, including monitoring tools to resolve performance issues and respond to cybersecurity threats. Data Lake analytics on Microsoft Azure allows you to monitor real-time data streams for anomalies that can trigger a cyber alert. Microsoft Azure can monitor data quality, usage, and end-user access patterns over time depending on your specific needs.
There are several tools to access your Azure Data Lake storage account. The most popular and easy to use tool is Microsoft Power BI, a data visualization and business intelligence software that can create interactive reports and dashboards. Other tools include Azure Databricks, Azure Synapse and Azure Data Lake Storage Explorer.
Another standard configuration is to add a data warehouse between the Data Lake and the end users. Think of a data warehouse as a library for structured data. Microsoft offers Azure Synapse, a Cloud-based data warehouse.
Synapse natively supports pulling data out of a Data Lake and performing high-speed manipulation and restructuring for reporting purposes. The Data Lake is a mishmash of information from various channels, including financial, marketing, HR, sales, distribution and more. Synapse takes all this data and structures it according to your instructions to support business needs such as strategic planning, reporting and operations monitoring.
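To illustrate the kind of restructuring described above, here is a tiny Python sketch that rolls mixed-channel records up into a report-ready summary table. The record shape is hypothetical, and a warehouse like Synapse would do this with SQL at far greater scale; the sketch only shows the idea of turning a mishmash into structure.

```python
from collections import defaultdict

# Illustrative mixed-channel records as they might sit in a lake;
# the shape is hypothetical, not a Synapse schema.
LAKE_RECORDS = [
    {"channel": "sales",   "month": "2023-01", "amount": 500.0},
    {"channel": "sales",   "month": "2023-01", "amount": 250.0},
    {"channel": "finance", "month": "2023-01", "amount": 120.0},
    {"channel": "sales",   "month": "2023-02", "amount": 400.0},
]

def summarize(records):
    """Aggregate raw lake records into a (channel, month) summary table,
    mirroring the restructuring a warehouse performs for reporting."""
    totals = defaultdict(float)
    for rec in records:
        totals[(rec["channel"], rec["month"])] += rec["amount"]
    return {key: round(total, 2) for key, total in sorted(totals.items())}

summary = summarize(LAKE_RECORDS)  # e.g. ("sales", "2023-01") rolls up to one total
```

The output table is what planning and operations dashboards actually consume, while the lake keeps the raw detail underneath.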
Implementing these new systems will up the ante in your analytics game. Say, for example, your company has 30 years of financial data in a legacy on-premises Microsoft GP server. You understand that the Cloud migration to Microsoft Dynamics 365 Business Central is the next step in your IT evolution and critical to modernizing your business operations. However, migrating your 30 years of data into Business Central is going to be cost- and resource-prohibitive. It makes more sense to migrate that legacy data to Azure Data Lake where you can easily reach it later without incurring excessive Business Central storage and data migration costs.
Azure Data Lake Storage offers a secure and scalable platform for storing and processing large amounts of legacy data. The old data from Dynamics GP can be extracted, transformed and loaded (ETL) into the lake. When ready, you can use Microsoft Power BI to query and make sense of what you have. This process frees you to concentrate on the more urgent goal of moving critical, current data to Microsoft Dynamics 365 Business Central. Within this context, an Azure Data Lake makes tremendous sense for companies concerned about what to do with all the data they need but don’t actively use every day.
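One common way to land decades of legacy data so it stays easy to query later is to partition it by year in the lake’s folder structure. The sketch below is a minimal, local Python illustration of that layout; the column names are hypothetical, and in production the files would be written to Azure Data Lake Storage rather than a local directory.

```python
import csv
import tempfile
from pathlib import Path

# Hypothetical legacy transactions spanning many years; columns are illustrative.
LEGACY = [
    {"doc_date": "1998-03-01", "amount": "10.00"},
    {"doc_date": "1998-11-20", "amount": "25.00"},
    {"doc_date": "2015-06-30", "amount": "99.00"},
]

def land_by_year(rows, root: Path):
    """Write rows into year=YYYY/ folders, a common lake layout that lets
    downstream tools read only the years a report actually needs."""
    by_year = {}
    for row in rows:
        by_year.setdefault(row["doc_date"][:4], []).append(row)
    written = []
    for year, group in sorted(by_year.items()):
        partition = root / f"year={year}"
        partition.mkdir(parents=True, exist_ok=True)
        path = partition / "transactions.csv"
        with path.open("w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=["doc_date", "amount"])
            writer.writeheader()
            writer.writerows(group)
        written.append(path)
    return written

root = Path(tempfile.mkdtemp())
files = land_by_year(LEGACY, root)  # one file per year partition
```

With a layout like this, a reporting tool asked for 2015 activity never has to scan the 1998 files, which is part of why querying the lake stays cheap even with 30 years of history in it.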
A Data Lake is your best option if your legacy data is holding you back from migrating to Microsoft Dynamics 365 Business Central. As Microsoft Azure Cloud implementation experts, we help companies just like yours get started on creating an Azure Data Lake as part of their Microsoft Dynamics 365 Business Central migration.
The role of an ERP implementation partner goes beyond software support. Our team can help you build an architecture that prepares you for the next step in your big data journey and continue to support you with any changes that come next.
Reach out to one of our Enavate experts to learn more about how to best use your legacy data in Microsoft Azure.
As the Global Microsoft Solutions Evangelist, Robert is responsible for helping our clients with their end-to-end digital transformation journey within the Microsoft solutions and Cloud ecosystems. With over 20 years’ experience in the Microsoft channel, he helps our clients with Hybrid Cloud architectures, strategic and technical road-mapping, DevOps automation, packaging and deployment, navigating Microsoft App Source, partner relationships and more.