No ETL? No worries, with SQL Server 2019's new data virtualisation

05 Apr 2019

Article by IT journalist David Williams

Big data is big business, and global software titan Microsoft is making big data its business to streamline businesses with the release of SQL Server 2019 late this year.

Microsoft states SQL Server 2019’s focus is to make managing a big data environment a lot easier.
In fact, the product supports a rich variety of environments like Hadoop Distributed File System (HDFS), Spark, analytics tools, and even Kubernetes-managed Linux containers.

Previously, SQL Server 2016's PolyBase feature allowed T-SQL queries to pull data from Hadoop and return it in a structured way without moving or copying the data.
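This works through external tables: SQL Server holds only the table definition, while the data stays in Hadoop. A minimal sketch of how an HDFS directory might be surfaced in SQL Server 2016 (the cluster address, file layout and all object names here are hypothetical, and PolyBase must already be installed and configured):

```sql
-- Hypothetical sketch: point SQL Server at a Hadoop cluster.
CREATE EXTERNAL DATA SOURCE HadoopCluster
WITH (TYPE = HADOOP, LOCATION = 'hdfs://10.0.0.15:8020');

-- Describe how the underlying files are delimited.
CREATE EXTERNAL FILE FORMAT PipeDelimited
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = '|'));

-- Surface an HDFS directory as a table. No data is copied.
CREATE EXTERNAL TABLE dbo.WebClicks
(
    ClickDate DATE,
    Url       NVARCHAR(400),
    Clicks    INT
)
WITH (LOCATION = '/data/webclicks/',
      DATA_SOURCE = HadoopCluster,
      FILE_FORMAT = PipeDelimited);

-- Queried like any local table; the rows stay in HDFS.
SELECT TOP (10) Url, SUM(Clicks) AS TotalClicks
FROM dbo.WebClicks
GROUP BY Url
ORDER BY TotalClicks DESC;
```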

In the 2019 release, this concept of data virtualisation is greatly expanded to data sources including Oracle, MongoDB and even CSV files.

Data virtualisation means database administrators can virtualise external data in a SQL Server instance, regardless of source, location or format, so it can be queried like any other set of tables in that instance.

Data virtualisation helps you create a single "virtual" layer of data over these disparate sources, providing unified data services to support multiple applications and users.

This sounds like a data lake, but unlike a typical data lake, data virtualisation requires no movement of data from where it lives. This is a huge advantage over traditional ETL (extract, transform, load) processes, which by necessity involve delays, extra storage, additional security overhead, and engineering effort to set up and maintain.

For database administrators, software developers, data scientists and others this new capability is transformative and powerful in its simplicity.

In effect, Microsoft has enabled SQL Server 2019 databases to hold virtual tables sourced from somewhere else entirely - whether a SQL Server database or otherwise, local or remote - and to query, join and work with these tables just as if they were native to a company's database.
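For example, once a virtual table has been defined over an external system, it joins with ordinary local tables in plain T-SQL (the table and column names below are hypothetical):

```sql
-- Hypothetical sketch: dbo.RemoteOrders is an external (virtual) table
-- over another system; dbo.Customers is an ordinary local table.
SELECT c.CustomerName, SUM(o.Amount) AS TotalSales
FROM dbo.Customers AS c
JOIN dbo.RemoteOrders AS o
  ON o.CustomerId = c.CustomerId
GROUP BY c.CustomerName;
```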

Many complex ETL processes simply disappear.

There are numerous situations where this delivers tremendous savings in time and effort.

For instance, suppose a user builds a dashboard showing their company's revenues by period, office and client from their main line-of-business app's database.

Management then asks for budgeted sales figures to be included so the dashboard contrasts actuals against budget.

However, these figures aren't in the main app but are instead held in an Excel spreadsheet.

Traditionally, the solution is to upload the spreadsheet - manually or on a schedule - into a new table created for this purpose, which the dashboard then queries.

Or, with SQL Server 2019, the user merely saves the budget figures as a CSV file on the Finance team's file share and adds this as a virtual table in their app's database.

The budget figures appear, and can be queried and reported on, as if they had always been in the database.
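A sketch of how such a CSV might be surfaced follows. All names here are invented, and in practice the file must sit in storage PolyBase can reach (such as HDFS or Azure Blob Storage) with an external data source already defined over it, rather than a plain Windows file share:

```sql
-- Hypothetical sketch: describe the CSV layout.
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ',', STRING_DELIMITER = '"'));

-- Surface the budget file as a table; FinanceStorage is assumed
-- to be an existing external data source.
CREATE EXTERNAL TABLE dbo.BudgetFigures
(
    FiscalPeriod CHAR(7),       -- e.g. '2019-04'
    Office       NVARCHAR(100),
    BudgetAmount DECIMAL(18, 2)
)
WITH (LOCATION = '/finance/budget.csv',
      DATA_SOURCE = FinanceStorage,
      FILE_FORMAT = CsvFormat);

-- The dashboard can now contrast actuals with budget in one query.
SELECT a.FiscalPeriod, a.Office, a.Revenue, b.BudgetAmount
FROM dbo.ActualRevenue AS a
JOIN dbo.BudgetFigures AS b
  ON b.FiscalPeriod = a.FiscalPeriod AND b.Office = a.Office;
```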

Another case: the company needs sales figures to move from its main application into the back-office ERP platform.

Typically, this is achieved by exporting and importing the data between the disparate products.

Or, with SQL Server 2019, they include the relevant tables from the ERP system as virtual tables in the main application's database and move the data directly via a scheduled stored procedure.
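One caveat worth noting: PolyBase external tables are read-only, so in practice the "move" runs as a pull on the side receiving the data. A hypothetical stored procedure (all object names invented) might look like:

```sql
-- Hypothetical sketch, run on the receiving side.
-- dbo.AppSales is a virtual table over the main application's
-- database; dbo.SalesImport is an ordinary local table.
CREATE PROCEDURE dbo.PullAppSales
AS
BEGIN
    INSERT INTO dbo.SalesImport (OrderId, OrderDate, Amount)
    SELECT a.OrderId, a.OrderDate, a.Amount
    FROM dbo.AppSales AS a
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.SalesImport AS s
                      WHERE s.OrderId = a.OrderId);
END;
```

Scheduled with SQL Server Agent, a procedure along these lines keeps the two systems in sync without any export/import step.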

The ERP doesn't even have to be a SQL Server database, so long as it is accessible.

SQL Server 2019’s data virtualisation supports Cosmos DB, SQL Server, Azure SQL, Oracle, HDFS and DB2.

Other SQL Server 2019 features include:

  • transformational insights over structured and unstructured data supporting Hadoop and Spark
  • scalable compute and storage clusters
  • a complete AI platform to train and operationalise R and Python models in SQL Server Machine Learning Services or Spark ML using Azure Data Studio notebooks
  • a choice of programming languages and platforms including .NET, PHP, Node.js, Java, Python, Ruby and more, with deployment on Windows, Linux or containers, both on-premises and in the cloud
  • real-time analytics on operational data using Hybrid Transactional and Analytical Processing (HTAP)
  • intelligent query processing features that improve scaling of queries
  • security enhancements including Always Encrypted secure enclaves, and Data Discovery and Classification labelling for GDPR