Saturday, November 14, 2015

SSIS imports Excel with some blank columns

I am working with SSIS to import an Excel file into SQL Server. Not being an SSIS expert, I usually use the import wizard in SSMS which lets me create the destination table, load the data, and save the package to the file system. Then I will review the wizard-created table, modify any data types as needed, and then generate a proper CREATE TABLE statement.

In some cases, however, the wizard does not assign the correct data types, and the saved package does not work - one or more columns come through blank. This is caused by the baked-in behavior of the Excel driver, which examines the first several rows of data & guesses each column's data type. If those first few rows are not fully populated with representative data, the guess is wrong and you get blank columns like this.

The fix for this is remarkably simple, but it took me lots of Googling to find it - the solution is in Import Excel in SSIS and get Null values instead of Numeric values, posted by Kasper de Jonge...

When you open the package in SSIS, in the Excel connection manager you simply add "IMEX=1" to the Extended Properties portion of the connection string, so it looks like this:

Excel 8.0;HDR=YES;IMEX=1

which tells the driver to treat columns with mixed data as text, so SSIS reads the Excel file using the data types you specify in the package instead of relying on the driver's guess.
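
For reference, a full OLE DB connection string with that setting would look something like this - the file path here is just a placeholder:

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\MyFolder\MyFile.xls;Extended Properties="Excel 8.0;HDR=YES;IMEX=1";

The "Excel 8.0;HDR=YES;IMEX=1" portion is the Extended Properties value that the Excel connection manager shows.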

Friday, October 30, 2015

SSMS Utility Explorer

One of the features of SSMS 2008 R2 is the Utility Explorer window, which "connects to Database Engine instances to provide a tree view of all the objects in the SQL Server Utility."

This feature is available in the Enterprise & Developer Editions.

Out of curiosity, I opened that window to check it out, and then closed it, thinking that would turn off the feature. However, every time I opened SSMS, the Utility Explorer window opened too. It was easy enough to close with a single click, but I could not turn it off.

I found that there is a config file windows.prf located here:
C:\Users\first_name.last_name\AppData\Roaming\Microsoft\Microsoft SQL Server\100\Tools\Shell


... and luckily there was a backup file in there, so all I had to do was close SSMS, rename windows.prf to something else, and then rename the backup to windows.prf - and Voila! the Utility Explorer window no longer opens when you launch SSMS.

Wednesday, October 21, 2015

TortoiseSVN and SSMS icon overlays

TortoiseSVN is a popular Windows client for the Subversion source-control system, which we use at our office. If you've used this product, you'll sometimes see a message when you commit that a new version is available; I had 1.8.x installed, so this time I did the update - and got bit by version 1.9.2 ...

This is using a recent Windows 7 x64 machine, by the way.

After the update, the Windows folders all had the familiar icon overlays, but when I launched SQL Server Management Studio & hit the File Open dialog, none of the colored icons were present. Hmmm. I Googled around looking for a solution & found numerous registry hacks involving icon overlay keys - keys that were added by/for Groove and/or Windows 10 - but finally found the answer: version 1.9.2 itself seems to have a problem.

So I uninstalled the 1.9.2 client, downloaded and installed the previous 1.9.1 version, and Voila! the icons all came back. Good times.

Friday, January 09, 2015

sp_who2 with sql text

SQL Server has an undocumented stored procedure sp_who2 that provides information about the connections to the database.
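
As a quick refresher on usage: run it with no argument to list every connection, or pass 'active' to hide the idle ones:

EXEC sp_who2

EXEC sp_who2 'active'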

Recently I have been using sp_who2 to track down connections that are "suspended", which in my case means that our MS-Access application is not processing all of the requested data in a timely manner. This can interfere with other users, even though there is no actual blocking going on. Another issue that we're seeing is that users will log in to their terminal server environment, launch the application & then leave it open for days at a time. Even though the connections are sleeping, they still require some resources on the SQL server.

The task at hand is to 1) determine which connections (SPIDs) are suspended & what SQL was running, and 2) determine which connections have been sleeping for more than a day. The first item will reveal the SQL so we can find an alternate solution; the second will allow us to identify the sleepers so we can close them out.

I started by creating a temp table & inserting the sp output, and then running some code to look up the SQL text...

DECLARE
    -- set this to the name of your database
    @database sysname = 'my_database',
    @not_sleeping int,
    @not_today int,
    @spid int,
    @sqltext varbinary(128),
    @sql_command nvarchar(4000),
    @today nvarchar(50)

-- LastBatch only provides month & day
SELECT @today = LEFT(CONVERT(nvarchar,GETDATE(),101),5)

-- create temp table to receive sp output, and execute
IF OBJECT_ID(N'tempdb..#sp_who2', N'U') IS NOT NULL
DROP TABLE #sp_who2


CREATE TABLE #sp_who2 (
    SPID int,
    Status nvarchar(1000) NULL,
    Login sysname NULL,
    HostName sysname NULL,
    BlkBy sysname NULL,
    DBName sysname NULL,
    Command nvarchar(1000) NULL,
    CPUTime int NULL,
    DiskIO int NULL,
    LastBatch nvarchar(1000) NULL,
    ProgramName nvarchar(1000) NULL,
    SPID2 int,
    REQUESTID int NULL    -- sp_who2 also returns a REQUESTID column
)

INSERT #sp_who2
EXEC sp_who2

-- look up the sql text for each SPID
ALTER TABLE #sp_who2 ADD sql_command nvarchar(4000)

WHILE EXISTS (SELECT * FROM #sp_who2 WHERE sql_command IS NULL)
BEGIN
    SELECT TOP 1 @spid = SPID FROM #sp_who2 WHERE sql_command IS NULL

    SELECT @sqltext = NULL, @sql_command = NULL

    SELECT @sqltext = sql_handle
    FROM sys.sysprocesses
    WHERE spid = @spid

    IF @sqltext IS NOT NULL
        SELECT @sql_command = TEXT
        FROM sys.dm_exec_sql_text(@sqltext)

    -- ISNULL prevents an endless loop when no text is available for a SPID
    UPDATE #sp_who2
    SET sql_command = ISNULL(@sql_command, '')
    WHERE SPID = @spid
END

-- count the SPIDs of interest
SELECT @not_sleeping = COUNT(*)
FROM #sp_who2
WHERE DBName = @database
  AND Status <> 'sleeping'

SELECT @not_today = COUNT(*)
FROM #sp_who2
WHERE DBName = @database
  AND LEFT(LastBatch, 5) <> @today

-- display a list of all SPIDs of interest
IF @not_sleeping <> 0 OR @not_today <> 0
    SELECT *
    FROM #sp_who2
    WHERE DBName = @database
      AND (
            Status <> 'sleeping'
            OR LEFT(LastBatch, 5) <> @today
          )

DROP TABLE #sp_who2

Sunday, August 17, 2014

T-SQL Select Schema

Let's say that you have the same table that exists in two different schemas - A.my_table and B.my_table - and you want to write a query with a parameter so you can select from one schema or the other, without resorting to dynamic SQL.

One possible solution is to write a UNION query that will deliver the data from one schema or the other, like this:

DECLARE @schema sysname
SET @schema = 'A'

SELECT *
FROM A.my_table
WHERE @schema = 'A'
UNION ALL
SELECT *
FROM B.my_table
WHERE @schema = 'B'

Here we are using the ALL keyword to make sure that we retrieve all records; normally a UNION will only retrieve unique records.

Of course, the DDL for both tables must match, or the UNION will fail; the number of columns must be the same, and the data types must be compatible.

This idea was actually applied to an SSRS report, where the dataset query must be either T-SQL text or a stored procedure.
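
If you go the stored procedure route for the report, a minimal sketch might look like this (the procedure name is just an example):

CREATE PROCEDURE dbo.select_my_table
    @schema sysname
AS
    -- the same UNION ALL trick as above; only one branch returns rows
    SELECT *
    FROM A.my_table
    WHERE @schema = 'A'
    UNION ALL
    SELECT *
    FROM B.my_table
    WHERE @schema = 'B'
GO

...and the report passes @schema as a parameter.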

Saturday, October 20, 2012

Programming Tab Controls

MS-Access tab controls offer a method to present multiple groups of similar data, making it easy for users to locate specifics within an intuitive & user-friendly layout - very much like an Excel workbook. The main control is of course called a Tab Control, and the tabs within it are called Pages.

For example, you might have a main form where you select a customer, and then you have a tab control that has pages that display related data for such things as Contacts, Sales, Invoices, Payments, Credit Data and so on. Usually each tab has a subform that contains the data related to the current selection from the main screen.

However, when you access the tab using VBA, there are a few variations that are not exactly clear in the online documentation.

For starters, when you place a control on a tab, the Parent is actually the main form, not the page of the tab control. So if you have a main form called "main_form", a tab control named "MyTab", and several pages named Page1, Page2, and Page3 - and then you place a command button Button1 on Page1 - the full reference to that control is

Forms!main_form!Button1

...note that the page it sits on has no effect on that reference. Therefore, when you have a form with one or more tab controls, you must be careful to use a unique name for each control, whether it sits on the main form or on any of the tabs.

If you wish to detect which tab has been clicked, you use the OnChange event; in this example the first tab, Page1, happens to have a page index of 1, so when you click it the value of MyTab will be 1. In the Change event you can use

Private Sub MyTab_Change()
    With Me
        MsgBox .MyTab.Pages(.MyTab).Name
    End With
End Sub

to get the name of the page that was clicked.

Sunday, September 09, 2012

SQL - list columns in a table, updated, again

The code has been updated again; this time, instead of using sp_columns, we are using the system tables directly. This provides access to additional data, such as whether a column is computed. Plus, in the event that Microsoft changes sp_columns, that change would not break our code.

One significant change since the previous January 2012 version is that option 4 now provides a skeleton for an insert stored procedure - all of the column names are provided with data types in a DECLARE statement as parameters, followed by the INSERT column list, and finally the SELECT statement that references the parameters. In this option, the code excludes non-updating columns such as identity, timestamp, and image columns, as well as any computed columns.
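
To give a rough idea of the shape of that output, here is a hand-written sketch for a hypothetical table my_table with an identity column id - the actual output of wm_columns will differ in formatting and will use your real column names:

CREATE PROCEDURE my_table_insert
    @name varchar(50),
    @order_date datetime
AS
    -- the identity column id is excluded as a non-updating column
    INSERT INTO my_table (
        name,
        order_date
    )
    SELECT
        @name,
        @order_date
GO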

In all prior versions, we bracket [] any columns that have non-alphanumeric characters such as spaces or hyphens. In order to accommodate option 4, we have a large REPLACE statement to convert those characters to underscores. If you have any other characters to handle, you can edit the REPLACE (which occurs in two places) to make the corrections.
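
The idea is along these lines - a simplified sketch that only handles two characters, where the real procedure nests more REPLACE calls:

SELECT REPLACE(REPLACE(c.name, ' ', '_'), '-', '_') AS parameter_name
FROM sys.columns c
WHERE c.object_id = OBJECT_ID('my_table')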

EDIT ON 10/20/2012 - this new rewrite of wm_columns will not work on views, because sp_tables actually pulls in both tables and views, whereas sys.tables only looks at tables. Hmmm. I will probably add one more option to this, so views will be back in the list.

Thursday, January 12, 2012

SQL - list columns in a table, updated

Way back in April 2009, I had published the original version of my stored procedure wm_columns which can be used instead of sp_help or sp_columns to view a list of column names for a table or view.

The new version has a number of enhancements, and you can view the complete code here.

The first parameter is required, which is the name of a table or view, such as:
wm_columns 'my_table'
...and the default schema is dbo

You may also include the schema, e.g. in AdventureWorks you can use this:
wm_columns 'Person.Contact'

The second parameter is optional, and has the following choices:

0 = display the column list in physical order
1 = display the column list in alphabetical order
2 = display a select statement
3 = display an insert statement
4 = display an insert statement, with data types added as comments

For the options 2, 3, and 4, column names are bracketed with [] if they contain spaces or other non-alphanumeric characters.

For the options 3 and 4, any identity or timestamp columns are displayed but commented-out.

The third parameter, when used with option 0 or 1, allows you to apply a fuzzy filter to the column name, for example:

wm_columns 'my_table',0,'id'
...will list only the columns that contain 'id' in the name.

The third parameter, when used with option 2, allows you to use an alias for the table - for example, in AdventureWorks:

wm_columns 'Person.CountryRegion',2,'cr'
...will display a select statement in which the FROM clause uses the alias:
Person.CountryRegion cr
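
The full generated statement would look roughly like this, using the columns of that AdventureWorks table:

SELECT
    cr.CountryRegionCode,
    cr.Name,
    cr.ModifiedDate
FROM
    Person.CountryRegion cr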

Thursday, June 30, 2011

SQL Timestamp and Access

The timestamp field in a SQL Server table holds a unique value that is updated every time a record is updated. In a recent post in a SqlServerCentral forum, there was a suggestion that when you are using MS-Access as the front-end, you should add a timestamp column to every linked table - and thus prevent write-conflict messages.
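
To see the behavior, here is a small self-contained sketch - the table and column names are made up for the demo:

CREATE TABLE dbo.demo_ts (
    id int IDENTITY(1,1) PRIMARY KEY,
    name varchar(50),
    my_timestamp timestamp
)

INSERT dbo.demo_ts (name) VALUES ('first')

SELECT id, name, my_timestamp FROM dbo.demo_ts    -- note the timestamp value

UPDATE dbo.demo_ts SET name = 'changed' WHERE id = 1

SELECT id, name, my_timestamp FROM dbo.demo_ts    -- the timestamp value has changed

DROP TABLE dbo.demo_ts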

If you use the SQL Server Migration Assistant (SSMA) to convert your Access database, it will automatically add a column SSMA_timestamp to every table that has either a floating-point field (Single or Double) or has a Boolean field that allows nulls. This is supposed to help SQL Server manage updates to records. For more details you can refer to the Microsoft article Move Access data to a SQL Server database by using the Upsizing Wizard.

Now, if you want to follow the original suggestion, but some tables already have a timestamp, you could write a script to generate the update scripts, like this:

SELECT 'ALTER TABLE [' + OBJECT_NAME(t.object_id) + ']
ADD my_timestamp TIMESTAMP'
FROM sys.tables t
WHERE NOT EXISTS
    (
    SELECT *
    FROM sys.columns
    WHERE system_type_id =
        (
        SELECT system_type_id
        FROM sys.types
        WHERE name = 'timestamp'
        )
    AND object_id = t.object_id
    )
AND t.type = 'U'

...and when you run this, you get one ALTER TABLE statement for each table that does not have a timestamp column. NOTE that this does not change the tables; it only generates the scripts to make the table changes. Then, you can review the scripts & remove any tables (e.g. staging or import tables) where adding a column might break the application. This also documents the changes being made. When you actually execute the generated statements, they will add the timestamp columns as needed.

(BTW in the statement "ADD my_timestamp TIMESTAMP" I've left out the "NOT NULL" clause, simply because the timestamp data type by definition cannot be NULL.)

On the other hand, if you take a look at SQL Timestamp field for Access use there is a discussion of how the timestamp can actually cause write-conflicts if you have a form / subform combination that uses columns from the same table, which is not an uncommon practice.

So, before you make radical changes to your database, remember to backup everything, and run some test cases including measuring response times - this gives you a baseline after which you can apply the mods & evaluate the results.

Sunday, May 15, 2011

T-SQL Computed Columns, part 3

My last two posts described how to create a computed column, and how to control the data type and size.

In the first example, the computed column would be "A" if there is an approval date, otherwise it would be "".

The general rule is that a computed column can only refer to other columns within the same table. However, it is possible to use a scalar function to perform a lookup from another table. This might have a dramatic effect on performance, depending on table size, row count, etc., but it is an option according to BOL.

So then, what if that approval date actually resides in a different table?

Assuming that my_table has a primary key PK int, and other_table has the related column ID int, we could write a lookup function, like this:

CREATE FUNCTION dbo.lookup_approval    -- the function name here is just an example
    (@PK int)
RETURNS varchar(1)
AS
BEGIN
    DECLARE @Lookup varchar(1)

    SELECT @Lookup = CASE WHEN approval_date IS NOT NULL THEN 'A' ELSE '' END
    FROM other_table
    WHERE ID = @PK

    RETURN @Lookup
END

...and then use that function in our computed column, like this:

ALTER TABLE my_table
ADD A_display AS dbo.lookup_approval(PK)

Again, this could have an impact on performance - any query run against my_table will execute the function for every row - but it is one more option for computed columns.

A better solution might be to create a column in my_table to hold the A_display data, and then create a trigger on other_table that would keep A_display in sync. We'll explore that in a future post.