SSIS Excel destination: Templates!

If you have a report that always outputs to an Excel Destination, here’s a trick my team uses to export the output the same way every time: we use an Excel template to keep the output uniform.

  • Run the output SQL first to get the data.
  • Copy/paste the output into Excel.
  • Save the file with a generic but related name. We keep our templates in a separate directory for safekeeping.

[Screenshot: template directory]

[Screenshot: template file name]

  • Remove data, but keep headers.

[Screenshot: template with headers only]

  • Create two variables in SSIS:
    • srcFile (source file / template)
    • dstFile (destination file, using an expression for the datestamp; see the sketch after this list)
  • At the beginning of your SSIS package, add a File System Task that uses the two previously created variables.
    • [Screenshot: File System Task variables]
    • This step copies your template (headers included) to a new location with a unique name (based on the datestamp).
  • In the Excel Destination step, create an expression in the properties.
    • [Screenshot: Excel Destination expression]
    • This writes the output of your run into the unique file that was just created, without modifying the original template.
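
For reference, here’s a minimal sketch of the kind of expression that goes on dstFile. The share path and file name are invented for illustration, so adjust them to your environment (backslashes are doubled because SSIS expression strings treat \\ as an escape):

    "\\\\server\\share\\output\\report_"
        + (DT_WSTR, 4) YEAR(GETDATE())
        + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
        + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
        + ".xls"

In the File System Task, that typically means setting Operation to “Copy file”, IsSourcePathVariable and IsDestinationPathVariable to True, and pointing SourceVariable at User::srcFile and DestinationVariable at User::dstFile. For the Excel Destination, the usual spot for the expression is the Excel Connection Manager’s ExcelFilePath property, set to @[User::dstFile], so each run writes to the freshly copied file.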

With this technique, you can reuse the template over and over without having to manipulate the output data after the fact.

SSIS to Excel output limit fix

With many of the SSIS packages my team writes, we have to output to an Excel Destination. One problem we found is that data sets over 10k rows would not populate the Excel file all the way. We tried everything from buffer sizes to exporting to text first, and nothing quite worked!

After much research, we tested a theory. Our network admin modified the permissions on the output directory for the account that was running the SSIS packages and generating the Excel files. I believe the account had to have full admin rights on the job submission directory.

Problem solved!

It seems rather odd that permissions would limit the Excel Destination, but we could tell from testing: limiting the output of the same query to, say, 100 records was no issue. 9,999 records, no problem. As soon as the limit came off, the problem returned. That’s where we figured some type of limit was in play.
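
If you need to set up something similar, here’s a minimal sketch, assuming a local NTFS folder or Windows share. The directory path and account name are invented for illustration. From an elevated command prompt, grant the batch account full control, inherited by subfolders and files:

    icacls "E:\JobSubmission" /grant "DOMAIN\svcBatch:(OI)(CI)F"

Here (OI)(CI)F stands for object inherit, container inherit, full control, so new department folders and generated files pick up the same rights automatically.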

A couple of other points about the process:

  • We run the processes from the SQL Server Agent under a generic batch account, instead of using one of the team members’ accounts. This maintains a single point of permission management.
  • We also have a single output directory on our share drive. The ‘job submission’ directory branches off for each department, with individual outputs beneath (see the sketch below). The structure is locked down to read-only, but users can copy files off for themselves and modify them to their hearts’ content. One too many times, the original file was messed up or deleted.
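
As a rough sketch of that layout (department and file names invented for illustration):

    \\sharedrive\JobSubmission\
        Finance\
            revenue_report_20160104.xls
        HR\
            headcount_20160104.xls

Everything under the job submission root is read-only for users; only the batch account needs write access.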

2016!

Wow!

It’s been a while. Recently, my responsibilities at work tripled. This didn’t leave too much time to record any fun IT thoughts. I took a lot of notes though. More to come!