MODIFY FILE failed. Specified size is less than or equal to current size

I recently ran into the following error when trying to install Polybase on SQL Server 2017 Enterprise:

MODIFY FILE failed. Specified size is less than or equal to current size

Polybase creates three new databases during the installation:

  • DWConfiguration
  • DWDiagnostics
  • DWQueue

The installation seems to have a problem creating these new databases when it inherits the file size settings from the model database.


To get around this, switch over to SQL Server Management Studio and change the file settings on the model database. In my case, they were set to 1024 MB (data), 512 MB (log), and 512 MB autogrowth.

I changed them to 512 MB (data), 71 MB (log), and 10 MB (autogrowth).

I then switched back to the Polybase installation, clicked Retry, and it worked! After the installation finished, I went back and changed my model file size settings back to what they were.
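If you prefer to script the change rather than click through SSMS, something like the sketch below should work. Note that ALTER DATABASE ... MODIFY FILE cannot reduce a file's size (that is exactly the error the installer hits), so DBCC SHRINKFILE is used to shrink and MODIFY FILE is used only for the growth increment. The logical file names modeldev and modellog are the usual defaults, but verify yours against sys.master_files first:

```sql
-- check model's current logical file names and sizes (size is in 8 KB pages)
SELECT name, size * 8 / 1024 AS size_mb, growth
FROM sys.master_files
WHERE database_id = DB_ID('model');

USE model;
DBCC SHRINKFILE (modeldev, 512);  -- target data file size in MB
DBCC SHRINKFILE (modellog, 71);   -- target log file size in MB

-- autogrowth can still be changed with MODIFY FILE
ALTER DATABASE model MODIFY FILE (NAME = modeldev, FILEGROWTH = 10MB);
ALTER DATABASE model MODIFY FILE (NAME = modellog, FILEGROWTH = 10MB);
```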


Workaround for Self-Joining Table Limitations on Indexed Views

Indexed views are special views that can offer huge performance improvements over regular views. They perform faster because they are stored in the database the same way a table is stored. They also have a unique clustered index which greatly improves performance. However, these benefits come at a price. Indexed views have a lot of limitations to consider before implementation.

The full list of limitations, and more about indexed views in general, is covered in the "Create Indexed Views" topic in the SQL Server documentation.

Recently, I was creating several indexed views and came across a limitation of joining to the same dimension table multiple times (role-playing dimension). I received the following error message: Cannot create index on view “x_DW.dbo.vw_SchemaBoundView_Test”. The view contains a self join on “x_DW.dbo.DimUser”.

Google searching turned up several not-so-great suggestions for overcoming this limitation. The best of them was to create a new table and essentially write duplicate data into it; for example, create a dbo.DimUser2 table. I really do not like this approach because I would have to create another table for one single purpose and maintain the ETL and data for it.

After some thinking, I came up with another solution that requires less maintenance and performs well with less overhead. I broke my view up into two views: ViewName and ViewName_Base. The Base view has essentially everything minus the second join to the same table. I identified which of the two self-join references had the most columns or caused the biggest performance hit; that was the join I included in the Base view, because the Base view gets stored like a table and indexed. I created the Base view with schemabinding and created the unique clustered index on it.

Next, I created the other, non-base view. This was nearly identical to the Base view, but its primary source is the Base view itself. I then joined the Base view to the other self-join reference of the table to pull in the final columns I needed in my select statement.

By splitting the original view into two views, I was able to work around the self-joining table limitation on indexed views while still getting a really nice performance improvement. I did not need to create additional ETL, or another table full of duplicate data that I would have to maintain.
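To make the pattern concrete, here is a minimal sketch using hypothetical FactOrder and DimUser tables (all object and column names here are invented for illustration). The Base view carries one of the two DimUser joins and gets the unique clustered index; the outer view layers the second, role-playing join on top:

```sql
-- base view: schemabound, contains only the first DimUser join
CREATE VIEW dbo.vw_Orders_Base
WITH SCHEMABINDING
AS
SELECT f.OrderID,
       f.ApprovedByUserID,
       u.UserName AS CreatedByName
FROM dbo.FactOrder f
INNER JOIN dbo.DimUser u ON u.UserID = f.CreatedByUserID;
GO

-- the unique clustered index is what makes this an indexed view
CREATE UNIQUE CLUSTERED INDEX IX_vw_Orders_Base
    ON dbo.vw_Orders_Base (OrderID);
GO

-- outer view: adds the second (role-playing) DimUser reference
CREATE VIEW dbo.vw_Orders
AS
SELECT b.OrderID,
       b.CreatedByName,
       a.UserName AS ApprovedByName
FROM dbo.vw_Orders_Base b WITH (NOEXPAND)  -- force use of the indexed view
INNER JOIN dbo.DimUser a ON a.UserID = b.ApprovedByUserID;
GO
```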

View Running Queries by CPU Time

If you have ever done performance tuning, you know it can be a bit of an art, and you need your detective hat on. I was recently working on a server that was performing poorly. After looking at Resource Monitor, it was clear that CPU usage was unusually high, and as it turned out, SQL Server was using most of it. The hunt was on. I grabbed my magnifying glass and followed the trail. While I could have run sp_who2 to find some relevant info, I instead ran the query below. sys.dm_exec_requests returns information about each request that is executing within SQL Server, and sys.dm_exec_sql_text returns the text of the SQL batch identified by the specified sql_handle.

-- check for running queries, sorted by CPU time
SELECT a.session_id,
       DB_NAME(a.database_id) AS db_name,
       a.start_time, a.command, a.status,
       a.cpu_time, a.total_elapsed_time,
       a.reads, a.writes,
       b.text AS query
FROM sys.dm_exec_requests a
OUTER APPLY sys.dm_exec_sql_text(a.sql_handle) b
WHERE a.session_id > 50        -- filter out background tasks
  AND a.session_id <> @@SPID   -- filter out this query's session
ORDER BY a.cpu_time DESC;

This query shows the currently executing queries sorted in descending order by CPU time. I was able to nab a few culprits in the act. I copied the queries to new windows and checked the execution plans. After some intense interrogation it was clear that several tables needed an index and the sub-queries should be turned into joins. Updates were put in place, and the CPU usage went way down. Case closed!

SSRS Page Break After X Number of Records

In SQL Server Reporting Services you can add a page break after a desired amount of records. Here is how you do it.

1. Add a new Row Group and give it a name.
In our example we named our group:

SSRS Row Group

2. Go to the properties of the Row Group.

SSRS Row Group Properties

3. Use the following expression to group the row on:
=Floor((RowNumber(Nothing) - 1) / 250)
In this example, 250 is the number of records we want per page.

SSRS Row Group Expression

4. On the Page Breaks tab, check the following:
‘Between each instance of a group’
‘Also at the end of a group’

Page Breaks Tab

Go ahead and run the report.
You should now have 250 records per page.

Using RowVersion and Timestamp

If you do any kind of batch ETL processing, it can be very useful to know if any records in your database have been updated. One way to do this is to use a rowversion or timestamp column in your tables.

Books online has a good definition for timestamp:
Each database has a counter that is incremented for each insert or update operation that is performed on a table that contains a timestamp column within the database.
This counter is the database timestamp. This tracks a relative time within a database, not an actual time that can be associated with a clock. A table can have only one timestamp column. This property makes a timestamp column a poor candidate for keys, especially primary keys. Any update made to the row changes the timestamp value and, therefore, changes the key value.

Below are a couple of examples I came up with to demonstrate their potential. The first set uses timestamp; the second uses rowversion. However, timestamp is simply a deprecated synonym for rowversion, so the behavior is identical.

-- 1. create a test table and put a few records in it
CREATE TABLE Test_TimeStamp (RowID int PRIMARY KEY, Value int, TS timestamp);
INSERT INTO Test_TimeStamp (RowID, Value) VALUES (1, 0);
INSERT INTO Test_TimeStamp (RowID, Value) VALUES (2, 0);
INSERT INTO Test_TimeStamp (RowID, Value) VALUES (3, 0);

-- 2. store the latest timestamp that is currently in the table
DECLARE @TS AS timestamp;
SELECT @TS = MAX(TS) FROM Test_TimeStamp;

-- update a couple of records
UPDATE Test_TimeStamp SET Value = 2 WHERE RowID = 1;
UPDATE Test_TimeStamp SET Value = 3 WHERE RowID = 2;

-- show all of the records that have changed
SELECT * FROM Test_TimeStamp WHERE TS > @TS;

-- show the new latest timestamp stored in the table
SELECT MAX(TS) AS LatestTS FROM Test_TimeStamp;

-- 3. re-run step 2 and see the changes again and again
--DROP TABLE Test_TimeStamp

-- 1. create a test table and put a few records in it
CREATE TABLE Test_RowVersion (RowID int PRIMARY KEY, Value int, RV rowversion);
INSERT INTO Test_RowVersion (RowID, Value) VALUES (1, 0);
INSERT INTO Test_RowVersion (RowID, Value) VALUES (2, 0);
INSERT INTO Test_RowVersion (RowID, Value) VALUES (3, 0);

-- 2. store the latest rowversion that is currently in the table
DECLARE @TS AS rowversion;
SELECT @TS = MAX(RV) FROM Test_RowVersion;

-- update a couple of records
UPDATE Test_RowVersion SET Value = 2 WHERE RowID = 1;
UPDATE Test_RowVersion SET Value = 3 WHERE RowID = 2;

-- show all of the records that have changed
SELECT * FROM Test_RowVersion WHERE RV > @TS;

-- show the new latest rowversion stored in the table
SELECT MAX(RV) AS LatestRV FROM Test_RowVersion;

-- 3. re-run step 2 and see the changes again and again
--DROP TABLE Test_RowVersion
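One related shortcut worth knowing: @@DBTS returns the last-used rowversion value for the whole database, so you can record a sync point without scanning the table for MAX(RV). A quick sketch against the test table above:

```sql
-- snapshot the database-wide high-water mark before the next batch runs
DECLARE @LastSync binary(8) = @@DBTS;

-- ... other sessions insert/update rows in the meantime ...

-- pull everything changed since the snapshot
SELECT * FROM Test_RowVersion WHERE RV > @LastSync;
```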

SSRS Page 1 of 2?

Starting in Reporting Services 2008 there is a feature called on-demand processing.

With this feature, reports may be rendered only one page at a time to speed up processing. One drawback can be seen on large reports: on your action bar, you may see "Page 1 of 2?"
Here is an example:

One trick to avoid this is to add a textbox in your report header or footer. In the properties of the textbox, set Hidden=True. For the expression of the textbox, use =Globals!TotalPages.
Now when the report runs, it calculates the total number of pages to put into this textbox, and as a result it also puts the correct number in your action bar!

Using Images as links in SSRS

In SSRS you can create drill-through links to other reports from textboxes. But if you really want to jazz things up, you can instead add an image and make it clickable. One catch is that you cannot associate an image with a dataset to pass down values. How do we overcome this, you ask?

There are two methods we can use:
1) If you used parameters in your report, you can easily add these as hyperlink parameters.

2) The trickier case is non-parameter values. To pass down a value from a dataset, the value needs to be available somewhere on your report. If the value can differ for each row, you will need to display it somewhere in your report's result set. Then, on the Image's Action tab, add another parameter and give it a name.

Image Properties and Parameters

For the expression value you would enter =ReportItems!Textbox126.Value (or whatever the textbox name is in your table).

Image Properties and Parameters

SSRS Divide by Zero

In SSRS you often have calculated fields that do division. To avoid having divide by zero errors, you may need to use IIF statements in the expression:
=IIF(Fields!column2.Value = 0, 0, Fields!column1.Value/Fields!column2.Value)
which means if the denominator is 0, show 0, otherwise show the calculated value.

Curiously, this only works if you are using integers. If you are using decimal values, you will need to do some more work: IIF evaluates both of its arguments, so the division is still executed even when the denominator is zero. The best approach I've seen comes from Robert Bruckner, who provides a nice solution using custom code.

You would go to Report -> Report Properties -> Code and insert the following:

Public Function Divide(ByVal first As Double, ByVal second As Double) As Double
    If second = 0 Then
        Return 0
    End If
    Return first / second
End Function

Now you can call it from your expression; since the function already handles a zero denominator, no IIF is needed:
=Code.Divide(Fields!column1.Value, Fields!column2.Value)

SSRS Formatting

SSRS has some standard formatting options predefined, yet they seem to have left out a couple of commonly used ones.
1. Formatting currency with no cents. To do this, put =FormatCurrency(Fields!column1.Value, 0) in the column's expression field. The 0 means we want no decimal places to be shown.
2. Showing a number with commas, no decimal places, and a zero if the value is zero. One way to do this is to set the format property to #,### but if the number's value is zero it will show nothing. So we would want to use #,##0 to show a zero when necessary.

Property Owner is not available for Database

If a database becomes orphaned and has no database owner, you will get the following error message when you try to view the database properties in SSMS:

Cannot show requested dialog. (SqlMgmt)
Property Owner is not available for Database ‘[database]’. This property may not exist for this object, or may not be retrievable due to insufficient access rights.

You can use the following code to find orphaned databases (a NULL User_Name means the database has no valid owner):
SELECT d.name AS DB_Name, sp.name AS User_Name
FROM sys.databases d
LEFT OUTER JOIN sys.server_principals sp
    ON d.owner_sid = sp.sid
WHERE sp.sid IS NULL;

To assign a new owner:
EXEC sp_changedbowner 'newuser'

If you get the following error message:

Msg 15110, Level 16, State 1, Line 1
The proposed new database owner is already a user or aliased in the database

Open up the user's Login Properties window under Security\Logins, uncheck the Map checkbox for the database, and click OK. Basically, the command won't assign them as the new owner because it thinks they are already associated with that database. Now run the sp_changedbowner command again and it will work.
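As a side note, sp_changedbowner is deprecated; on current versions of SQL Server the documented replacement is ALTER AUTHORIZATION. A quick sketch (the database name below is a placeholder for your own):

```sql
-- modern equivalent of sp_changedbowner: remap the database owner to a login
ALTER AUTHORIZATION ON DATABASE::[YourDatabase] TO [newuser];
```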