Jet Driver Error When Importing Data from Excel Files

Importing data from files is a common task. CSV files are popular, but they can cause issues with characters such as embedded commas. Excel files offer a nice alternative since the data is usually formatted properly and doesn't suffer from those character issues. However, a common problem can occur when scheduling a job to load the data, due to a Jet driver issue:

ImportData:Error: The requested OLE DB provider Microsoft.Jet.OLEDB.4.0 is not registered. If the 64-bit driver is not installed, run the package in 32-bit mode. Error code 0x00000000. An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040154 Description: "Class not registered".

Here is how to overcome this error. First, install the 'Microsoft Access Database Engine 2010 Redistributable', which provides the required Jet driver.

Next, open the SSIS package/solution in Visual Studio and click Project, <project name> Properties. On the Configuration Properties, Debugging tab, set Run64BitRuntime to False. The Jet driver is 32-bit only.

There is one final step, and that is a configuration setting on the SQL Agent job itself. Without this setting, the job will fail. Edit the job, click on Steps, and edit the step. On the General tab, click on Configuration, Advanced and check the box that says '32-bit runtime'.
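If you manage your Agent jobs with scripts rather than through the UI, this same setting corresponds to the /X86 switch on the SSIS step's command line. Here is a minimal T-SQL sketch, assuming a file-deployed package; the job name, step name, and package path are placeholders:

-- Hedged sketch: scripting an SSIS job step with the 32-bit runtime option.
-- Checking the 32-bit runtime box in the UI adds the /X86 switch to the step command.
-- Job name, step name, and package path below are placeholders.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Load Excel Data',
    @step_name = N'Run ImportData package',
    @subsystem = N'SSIS',
    @command   = N'/FILE "C:\SSIS\ImportData.dtsx" /X86 /CHECKPOINTING OFF /REPORTING E';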

That’s it! Execute the job and it should now finish successfully.


SQL Saturday #611 – Orange County 2017


SQL Saturday Orange County is coming up fast! The event will be on April 1st, 2017 at Golden West College, 15744 Goldenwest St, Huntington Beach, CA 92647.

Reserve your spot today by going here: http://www.sqlsaturday.com/611/EventHome.aspx

I’ll be presenting two sessions:

  • SQL Injections and How To Stop Them
  • Mastering Master Data Services (Lots of changes in SQL Server 2016!)

SQL Saturday #497 – Huntington Beach 2016


April is shaping up to be a fun, yet busy month. I have also been selected to present two topics at SQL Saturday #497 in Huntington Beach on April 2nd! I will be presenting on SQL Injections and Master Data Services. Here are descriptions of my presentations.

SQL Injections and How to Stop Them

Right now, there are hackers all around the world trying to get into your web applications. How safe are you? By using a technique called SQL injections, hackers can wreak havoc with web applications by compromising security, manipulating data, hoarding system resources, retrieving sensitive information, and manipulating database objects, even dropping databases!

During the demo, we will take on a couple of different roles. As a hacker, we will walk through the steps one might take to compromise a web application in order to retrieve sensitive data such as credit card information, usernames, passwords, and social security numbers. Assuming the role of a developer, we will then show various prevention techniques and their effectiveness in preventing SQL injections.

Attend this session to learn how SQL injections work, how to identify if you are being attacked, and how to stop them.

Update: Thank you to everyone who attended my session. We had a full room! 🙂

As requested, you can download the presentation and scripts here.

jeffprom-presenting

Mastering Master Data Services

As your organization grows, one challenge will be the management of data between systems and organizational units. With MDS, Microsoft provides the ability to have one accurate source of non-transactional data. This data can then be used within applications, other databases, and ETL processes. By leveraging data stewards to help manage the data repository, you can ensure that your MDS data is always up-to-date and accurate.

In this presentation we will cover MDS from start to finish including installation, creating models/entities, working with data, using the Excel add-in, security, hierarchies, views, versions, and business rules. After this presentation, you will have a good understanding of what master data management is, what MDS does, how to use it, and how to get started with your own MDS project.

To register for this event and to find additional information, you can visit the SQL Saturday Huntington Beach page here:
http://www.sqlsaturday.com/497/EventHome.aspx

SQL Saturday #492 – Phoenix 2016

I recently moved from Minneapolis, Minnesota to San Diego, California. I’m certainly enjoying the warmer weather and am excited that I can check out SQL Saturday events in a new area. Along those lines, I am happy to announce that I have been accepted to present on Master Data Services at SQL Saturday #492 in Phoenix on April 16th! Below is a description of my presentation. I hope to see you at this event!

Mastering Master Data Services

As your organization grows, one challenge will be the management of data between systems and organizational units. With MDS, Microsoft provides the ability to have one accurate source of non-transactional data. This data can then be used within applications, other databases, and ETL processes. By leveraging data stewards to help manage the data repository, you can ensure that your MDS data is always up-to-date and accurate.

In this presentation we will cover MDS from start to finish including installation, creating models/entities, working with data, using the Excel add-in, security, hierarchies, views, versions, and business rules. After this presentation, you will have a good understanding of what master data management is, what MDS does, how to use it, and how to get started with your own MDS project.

To register for this event and to find additional information, you can visit the SQL Saturday Phoenix page here:
http://www.sqlsaturday.com/492/EventHome.aspx

Formatting Power Query Columns

Power Query is a great tool to quickly and easily load data into an Excel workbook. However, one potential drawback on the surface is the lack of formatting you can apply to the columns. This post will describe what I mean in further detail and show how to overcome this issue.

First, let's take a look at how Power Query doesn't appear to work very well with formatting columns. Here I am pulling a set of data into Power Query and looking to format various columns with appropriate data types such as whole number, currency, and date. I set the appropriate data type on each column.

Whole Number

pq - whole number

Currency

pq - currency

Date

pq - date

Now that we have set the data types, do a Close & Load to load the data into the Excel worksheet. If we inspect each column that we formatted, we will see that the Whole Number and Currency columns turned into a General format. However, the Date data type does come through as a Date format.

Whole Number

pq - whole number - after load

Currency

pq - currency - after load

Date

pq - date - after load

Now let's look at how we can actually format the whole number and currency columns. Yes, we can set the formats in Excel as usual, but the problem is that when you refresh the data, the columns revert back to being unformatted. To get around this, click on the Data tab at the top and select Properties.

pq - data properties

Check the box that says ‘Preserve column sort/filter/layout’.

pq - preserve column layout

You can now format the columns in Excel like normal and it will retain formatting changes after data has been refreshed!

pq - format preserved

This works very well, and you can format data just like normal in Excel. The only downside is that it is workbook specific. If you use the same query elsewhere, perhaps shared through the Power BI Data Catalog, it will not retain the formatting changes and you will need to format each column again in every workbook you use.

Sorting a Chart in SSRS

Sorting a chart in SSRS sounds easy enough, but the sort option isn’t where you might expect to find it. In this example I am using Visual Studio 2012.

First let’s take a look at an unsorted chart. Even if you sort values in your underlying dataset, your chart will likely not be sorted.

ssrs chart unsorted

One might expect to find sort options by right-clicking the chart and going to Chart Properties, but that will not work. You might then think it must be on the Series Properties; that again will not work. So let's take a look at how to sort the chart. Start by double-clicking the chart, which brings up the Chart Data box. You will probably expect to find sort options under the Values area... and you would be wrong yet again. Instead, click the drop-down arrow under Category Groups and select Category Group Properties.

ssrs chart select category group

Here you will find a Sorting option. Click Add and select what you would like to sort on.

ssrs chart add sort order

Now you can see that we finally have a sorted chart.

ssrs chart final sorted result

SSRS Chart Does Not Show All Labels

When creating a new chart in SSRS, you may find that not all of the labels are displayed on the vertical or horizontal axis. This happens because, with the default setting, the chart tries to determine how many labels it can show. Sometimes this isn't the best choice. For example, as you can see in my chart below, there should be plenty of room to show all of the labels, but it decides not to.

ssrs chart final sorted result

In this case we definitely want to show all of the labels. To do this we will need to adjust some settings. Double click on the label names on the chart.

ssrs select label names

On the right hand side of Visual Studio, you should now see the Chart Axis values on the Properties area.

ssrs axis properties

Under the Interval attribute, change the value from ‘Auto’ to 1. This will specify that we want to show labels on every occurrence.

ssrs set interval value

Re-run the report and now you can see that all of the labels are shown.

ssrs chart with all labels

Importing Stock Quotes Into SQL Server Through SSIS

I've worked at a publicly traded company for the past year and a half, and one thing I always thought would be fun to do was to download stock values through SSIS and use them on reports or display them in a dashboard. Sounds easy enough, but as it turns out it was a bit harder than I expected for a few reasons:

1. Finding a stock quote provider that worked well with SSIS was a challenge.
2. I wasn’t finding any other good tutorials on how to download or process stock quotes using SSIS.
3. Lack of time to research and investigate a solution.

However, all three of the above have finally been overcome, which is why I am posting this article in case others are hoping to do this as well. So let's take a look at how this can be done. In my examples, I am using Visual Studio 2012 and SQL Server 2012. If you are loading data for just one particular stock, there won't be a massive amount of data, so it's easy to just truncate the destination table and reload each day, which is what I'll show here.
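Since we reload the full history every time, one simple option is to start the control flow with an Execute SQL Task that clears the destination table. A minimal sketch, using the destination table created later in this post:

-- Clear out the previous load before re-importing the full stock history
TRUNCATE TABLE dbo.FactStockValue;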

First, decide what stock symbol you would like to use. In my examples we will use Microsoft (MSFT). Their stock dates back to March 13th, 1986, which gives us plenty of good data. To get historical stock quotes, we are going to use an API from Yahoo Finance: a URL call that accepts a stock symbol parameter and returns a CSV file.

Here is what it looks like: http://ichart.finance.yahoo.com/table.csv?s=MSFT
The CSV file includes data starting from the previous day (yesterday) and goes all the way back to the IPO.

Now that we have a way to get stock quotes, let's create an SSIS package and work on processing the data. Retrieving data through an HTTP connection is the next challenge we need to overcome. SSIS has an option to create an HTTP connection under Connection Managers, but unfortunately it will not accomplish what we need, so let's take a different approach. Start by creating two variables:
1. LocalFilename (point to a local folder and filename)
2. RemoteFilename (http://ichart.finance.yahoo.com/table.csv?s=MSFT)

variables used

Next, add a Script Task to the Control Flow. Edit the Script Task and add the two variables we just created to the ReadOnlyVariables.

add script task

Click on 'Edit Script'. Under the public void Main() section, add the following:

// Collection to hold the SSIS package variables we need to read
Variables varCollection = null;

// Lock the two package variables for reading and retrieve them
Dts.VariableDispenser.LockForRead("User::RemoteFilename");
Dts.VariableDispenser.LockForRead("User::LocalFilename");
Dts.VariableDispenser.GetVariables(ref varCollection);

// Download the CSV from the remote URL to the local file path
System.Net.WebClient myWebClient = new System.Net.WebClient();
string webResource = varCollection["User::RemoteFilename"].Value.ToString();
string fileName = varCollection["User::LocalFilename"].Value.ToString();
myWebClient.DownloadFile(webResource, fileName);

Dts.TaskResult = (int)ScriptResults.Success;

script task code

At this point you can run the package and it should download the csv file to your local folder.

You can open the csv file and see that there are 7 columns (Date, Open, High, Low, Close, Volume, Adj Close).
(CAUTION: If you leave the CSV file open and try to run the package again, you will get an obscure error because the Script Task cannot get a lock on the file.)

error message

Now is also a good time to create a table where we can store the incoming data. Here is an example of the table I created. Feel free to modify it to fit your needs.

CREATE TABLE [dbo].[FactStockValue](
    [StockValueKey] [int] IDENTITY(1,1) NOT NULL,
    [DateKey] [int] NULL,
    [StockSymbol] [varchar](5) NULL,
    [OpenAmount] [money] NULL,
    [HighAmount] [money] NULL,
    [LowAmount] [money] NULL,
    [CloseAmount] [money] NULL,
    [Volume] [int] NULL,
    [AdjustedCloseAmount] [money] NULL,
    CONSTRAINT [PK_FactStockValue] PRIMARY KEY CLUSTERED ([StockValueKey] ASC)
)

Instead of storing just the date value, I convert the date to a smart key value, which can then be joined to a date dimension table. Feel free to skip this step and store just the date if you like.
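For example, a trading date of 2015-06-30 becomes the integer 20150630, which joins cleanly to the DateKey column of a typical date dimension. A quick T-SQL sketch of the conversion (the DimDate table and column names are just placeholders):

-- Convert a date to a yyyymmdd integer smart key (returns 20150630)
DECLARE @TradeDate date = '2015-06-30';
SELECT CONVERT(int, CONVERT(char(8), @TradeDate, 112)) AS DateKey;

-- The same expression can drive a join to a date dimension, for example:
-- SELECT d.* FROM dbo.DimDate d
-- WHERE d.DateKey = CONVERT(int, CONVERT(char(8), @TradeDate, 112));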

Now let’s hop back into the SSIS package and add a Data Flow Task. Within the Data Flow Task, add a Flat File Source and create a new connection that points to the local csv file. Since the csv file doesn’t include the stock symbol, I added a Derived Column transform to specify the symbol we are processing.

derived column

Next, I added a Data Conversion transform to convert Date to a database date [DT_DBDATE] which was needed to do a lookup on my date dimension table, and to convert the stock symbol value from unicode to string [DT_STR].

data conversion

In my example, I am doing a date lookup on a date dimension table to get a DateKey value. Finally, end with an OLE DB Destination which points to the destination table. Run the package and verify that it works.

data flow success

Check the destination table and verify that everything looks correct.

table results

Now that we have a way to retrieve and store stock values, we can use them on dashboards and reports!

stock value chart
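As an example of the kind of query that could feed a chart like this, here is a hedged T-SQL sketch that trends monthly prices; the DimDate table and its column names are assumptions, so adjust them to match your own date dimension:

-- Monthly price trend for MSFT (DimDate names are assumptions;
-- FactStockValue matches the table created earlier in this post)
SELECT
    d.CalendarYear,
    d.MonthNumberOfYear,
    AVG(f.CloseAmount) AS AvgCloseAmount,
    MAX(f.HighAmount)  AS MaxHighAmount,
    MIN(f.LowAmount)   AS MinLowAmount
FROM dbo.FactStockValue AS f
INNER JOIN dbo.DimDate AS d
    ON d.DateKey = f.DateKey
WHERE f.StockSymbol = 'MSFT'
GROUP BY d.CalendarYear, d.MonthNumberOfYear
ORDER BY d.CalendarYear, d.MonthNumberOfYear;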

Cleansing Data Through SSIS with DQS

While Data Quality Services (DQS) takes some time to build up an effective knowledge base through manual data review, there are times when it sure would be handy to use SSIS. This is only recommended when you need quick results and are feeling pretty good about your knowledge base, or are perhaps using an external knowledge base through reference data. Should you decide to cleanse data through SSIS using DQS, it's important to note that it can export quality control metrics, such as the Status and Confidence score, with the final data; these can later be used to review the results and make final decisions. So let's take a look at using the DQS Cleansing transform in SSIS.

In this example, I created a basic SSIS package with one Data Flow. I selected an Excel Source and pointed it to a file that has some data to be cleansed. In my example I am going to send in some bad addresses and attempt to cleanse them using Melissa Data. To learn more about the Melissa Data Address Check add-in and reference data in general, view my other blog post here.

Create a new package, add a Data Flow Task, and then add a DQS Cleansing transform object.

dqs cleansing transform setup

Click on the New button next to the Data quality connection manager to setup a new connection to an instance of DQS and then select a knowledge base.

dqs cleansing transform setup with connection

Click on the mappings tab. Here you will map input columns from your data source to the domains on the knowledge base.

dqs cleansing transform with mappings

If you would like to see additional columns on the final results such as the Confidence score and the Reason then click on the Advanced tab.

dql cleansing transform advanced tab

When finished, click OK and test it out. In my example I am just using a Union All object as my destination for demo purposes, but I can still see the resulting cleansed data through the Data Viewer.

dqs cleansing results from transform

As you can see from the results, two addresses were corrected, two had suggestions, and one failed to be cleansed. We could get fancy and split up the results based on the Status values: Corrected records may be considered ready to ship, Auto Suggest records should probably be reviewed, and the New records would need to be reviewed since they were not found in the knowledge base.
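If the cleansed rows are landed in a staging table instead of a Union All, that triage can also be done with a simple query. In this sketch, the staging table and address column names are placeholders; the Record Status column is the record-level status the DQS Cleansing transform adds to its output:

-- Triage cleansed addresses by the DQS record-level status
-- (staging table and address columns are placeholders)
SELECT [Address], [City], [State], [Zip], [Record Status]
FROM dbo.StagingCleansedAddress
WHERE [Record Status] IN ('Auto Suggest', 'New')   -- rows that still need manual review
ORDER BY [Record Status];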

Using the DQS Cleansing transform in SSIS is useful but doesn’t eliminate the need to review the final cleansed results. Cleansing data through SSIS is dependent on the quality of the underlying knowledge base but does provide a quick and efficient way to do an initial data cleanse. The resulting cleansed data can then be evaluated using the Status, Reason and Confidence score values. Once the data has been processed and reviewed, only then should it be considered cleansed and ‘ready for use’.

Cleansing Address Data With Data Quality Services (DQS) and Melissa Data

In this post I will show how to cleanse address data using Microsoft Data Quality Services (DQS) and Melissa Data as a reference data service. First, you need to make sure you have a working installation of DQS. If you don’t, you can view my other post here that walks you through installing DQS. Once DQS has been installed, you will need to visit the Microsoft Azure Marketplace website and subscribe to the Melissa Data Address Check service. When I wrote this post, you could get 1,000 addresses cleansed per month for free. This is useful for testing or if you have a small set of data to be cleansed. Select the desired service level/amount from the list and click the Sign-Up button. You will now have a subscription added to your account.

melissa data service on the azure marketplace

Next, we need to tell DQS about our Azure Marketplace account. On the Microsoft Azure Marketplace website, click on ‘My Account’ and look for the Primary Account Key value.

azure marketplace ID

Highlight and copy the Key value. Open the Data Quality Client tool and click on Configuration on the right-hand side. Paste your Key value into the DataMarket Account ID entry box and click on the validate button.

dqs validate account ID

Once your ID has been validated, you are ready to set up a knowledge base in DQS that will use Melissa Data. In the DQS client, click the Close button on the Configuration screen to get back to the main menu. On the top left, click on New Knowledge Base. Add four new domains (Address, City, State, Zip). Next, add one composite domain (FullAddress) and include the four domains you just created.

setting up the knowledge base domains

With the FullAddress composite domain selected, click on the Reference Data tab. Click on Browse and check the Melissa Data service.

On the Schema mapping table, create the following mappings between the RDS Schema and Domains:
1. AddressLine (M) = Address
2. City = City
3. State = State
4. Zip = Zip

adding melissa data as reference data

You can adjust several of the provider settings, such as Auto Correction Threshold, Suggested Candidates, and Min Confidence. Once finished, click the Finish button and select Publish.

save knowledge base with reference data

Next, we will cleanse a set of data using our new knowledge base. On the main screen of the DQS client, click on New Data Quality Project. Give it a name, and be sure to select the knowledge base we just finished setting up that uses the Melissa Data reference data.

new data quality project

Click Next and select a data source that points to address records you would like to cleanse. Map the 4 Source Columns to the appropriate Domains. If you have mapped all 4 correctly, you should be able to click on the ‘View/Select Composite Domains’ button which tells you that the composite domain will now be used.

data quality project - map domains

Click Next, and Start. DQS will analyze the data and provide a summary of results.

cleansing profile results

Click Next to see the cleansing suggestions and to ultimately work through corrections. In my example I am using 5 addresses which are Melissa Data office locations that I found on their website. I altered some aspects of the addresses to see how well it can cleanse the data. Let’s take a look at the results.

record results before adjustments

As you can see here, DQS suggested two corrections (Road to Rd, Drive to Dr). The interesting thing is that both of these corrections don't match what is on their website (they have Road and Dr.). Perhaps they need to cleanse their own addresses. 🙂 It also listed one entry as Invalid even though all I did was add a 1 to the address value. This record would need to be manually reviewed and corrected, or we could adjust the threshold settings to see if we get better results. DQS successfully corrected two records with a good confidence score; it wasn't fooled by me changing the City name on one and the Zip on the other.

approved results

Work through the results by making adjustments, approving or rejecting records, and then finally exporting the records to their final destination (SQL Server, CSV, or Excel).

Address cleansing can be an important yet difficult task. However, with DQS we now have the ability to tap into external knowledge bases such as Melissa Data to help. By leveraging reference data within DQS, we can quickly and effectively cleanse address information.