It’s Elementary!

Some recent conversations reminded me of an interesting problem my team had to resolve. Through some ingenuity, business process analysis, and a little bit of grit, we were able to find the issue that had nearly ground admissions at this particular site to a halt.

It was about a week after I started working with this client. I was working with the team of programmers when the complaints started flowing in: approximately 50% of the students enrolling in courses that term were not getting their generated email IDs or system accounts.

Problem Identification

The initial review of all the workflows and systems showed no obvious issues. At a high level, the student goes through the admissions process and then, once accepted, has their data pushed through a semi-automated process that creates their accounts. As previously stated, a lot of those students weren’t getting what they needed. No recent software updates, no system changes – we were stumped.

A temp fix, if needed

While the team dug in, we implemented a temporary ‘process’: one of the programmers could take student information and manually create the required accounts. The downside was that it took most of that person’s time each day, not leaving much for regular work. It got the job done, and luckily it was only the summer session, so it wasn’t as bad as it would have been during fall registration.

The big question

“How can students get into the system?” – meaning, what are all the possible ways for data to get into the SIS? We knew the ERP software pretty well, so we knew there were a few possibilities. After testing and ruling out the proper processes (these worked as expected), we focused on the non-traditional. We noticed that in most of the problem cases, the students were missing admissions applications. Note – this was a critical piece of the workflow that created the necessary accounts.

The Gathering

Given all the information we had, we decided we needed input from the key players. During a meeting with Admissions and Registration, the truth was revealed. After the team presented our findings, it came to light that the Registrar’s office was skipping the admissions process and putting students straight into courses. By doing that, they bypassed the process components that kick off the workflow and create the accounts – hence the daily issues. It’s kind of humorous that two people who virtually sat next to each other could do things so differently.

Win some, lose some

Now that we had the complete picture, we could implement a real solution. The simplest and most straightforward path would have been for the processes to be followed and no steps skipped. Unfortunately, this was one of those times the executive level overruled the proposed solution. We had to develop a new workflow to accommodate the different student entry point and create the needed accounts.

In the end, the issue was resolved. Sometimes, that’s all that matters even if you are left scratching your head about it. Sometimes, you have to chicken-wire and duct tape things together…and that’s a lot of the fun of working in IT. There’s a lot of tech out there, the key is getting the right people and right tech to get the job done.

Cheers! TG



I finally took my ITIL Foundation test and passed it (33/40 correct). I’m really excited that I was able to pass it with a fairly good score. I’ve already been using ITIL concepts in my daily work, so this will solidify how I apply them.

When I finally saw this, I was elated!


Looking forward to my future endeavors on the ITIL path.

Thanks for reading – TG

Going from “We can?” to “We can!”

It’s an awesome feeling when you can give someone an AHA! moment. Providing the solution to a long-standing issue can be very satisfying.

“You can do that?”


“Yes! We can do that!”

The journey to that moment can be a long one, with many struggles along the way. Sometimes it’s just a matter of knowing what is available and who to go to in order to make it happen. My team experiences this a lot. In a fairly young/new/small environment, some formal processes haven’t been cemented yet. Knowing who does what, or who owns data and processes, can be a mystery at times. You’d think that in a small, fast-moving environment, communication and collaboration would flow freely.

Not so much.

That’s where my team comes in. We provide end-to-end support for the Core services, as well as the connected software and services. Additionally, we provide:

  • Automation
  • Optimization
  • Reporting / Analytics

About two years ago, this group was formed with just one developer and myself. Since then, we’ve more than tripled in size, adding business analytics and print fulfillment. Working with our internal customers, we work out processes and create solutions for their various needs.

For example (a fraction of what we do):

  • Instead of multiple downloads of the same file, set up a process to deliver one file to a shared location and send a notification.
  • Instead of manual processes, use automated tasks to complete repeatable functions.
  • Use SSIS and SSRS to provide self-service and automated reporting in place of spreadsheets and ad hoc querying.

From the beginning, the goal was to provide solutions and ideas to make their work day easier. Word gets around and the requests increase.

“Hey, I heard you can…”

Yes, yes we can.

Office 365 – OneNote: From phone to collaboration bliss

I’m not one who usually geeks out over new tech, as it seems to me most ‘advances’ are minor mods that aren’t all that exciting or don’t do me any good. I do like to tinker with different tech to see if it will help speed up development or collaboration with my team. You never know when you’ll come across something that helps.

We employ the Office 365 suite at work, though not as thoroughly as I believe we could. Given that, I take any opportunity to make use of it – from SharePoint Online for organizing important documents, schedules, and other resources for my team, to the new ‘Planner’ software recently rolled out for task tracking. One really neat benefit of O365 that I found is the cloud aspect of connecting data and applications easily.

Scenario: Multi-team collaboration on a whiteboard.

Typically, we would work something out on the whiteboard and either handwrite the notes, type them out, or take pictures. Afterwards, we’d go back to our desks, email our notes around, and try to sync them up. Usually, we can agree on the output. If any pictures were taken, they have to be emailed or downloaded from a cloud backup. Lots of extra steps.

NOW: Office 365 and OneNote

This definitely speeds things up and reduces any questions about accuracy.

I downloaded the OneNote app for my Android phone (I believe there are versions for other platforms as well) and logged into my O365 work account. Then I created a new section to work with (Fig. 1).

Fig. 1: Creating a new OneNote section.

Once in the notebook, you can see the new section and how it relates to any other notes you have.


In the upload example section, tap the camera icon to add any related images you need to share.


Once you have your images, you can add text and name the page for the section.


This is my favorite part – this page can be viewed from the web-based OneNote or your desktop version (if connected).

[Screenshot: the page in OneNote Online]

[Screenshot: the same page in desktop OneNote]

Now, anyone who is connected to that OneNote notebook can collaborate on the notes taken – a single source of information!

Hope you found this helpful – Enjoy!


Log file from Excel COM API program

It’s a good habit to track the data your process has changed, and most programs should have some type of logging mechanism. Sometimes, in the haste to roll a tool out the door, simple tracking pieces are left out. I ran across that recently when working with some of our macro-enabled spreadsheets. Using the COM APIs provided with our billing system software, we’ve built some tools to process data in a different fashion than the standard jobs from the system.

I noticed that there wasn’t any output from these tools, though. That bothered me when updating numerous records that might need to be rolled back. After some digging online, I made some mods to the VBA code to write a ‘log’ file and track what the code did.

For this entry, here is the setup: an Excel spreadsheet with a button that runs some VBA script. One tab has the data points to process and the button to fire off the code. The other tab is for configuration: log output directory, username, etc. This is a rudimentary way to set where the file goes and how to name it.


There will be two important variables:

  • “Operator” (who is running the process)
  • “FilePath” (where the log file will be written)


The key with these fields is to name the cells [B1 and B2 in the pic] appropriately, since those names will be used in the code. In the example above, click on B1, then click in the Name Box, type the name of the cell (like FilePath), and hit Enter. This aliases the cell where you will put the output path of the log.


In the code editor (Developer -> Visual Basic), add the following to the code in order to create and write to the log:


Using some variables, capture the configuration data before moving on to the rest of the code. The vFileName value combines the configured operator value with a date stamp and a file name that identifies the function of the process. It uses the cell aliases from the config tab.
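A minimal VBA sketch of that step (the variable names and the ‘AccountUpdate’ label are my assumptions; Operator and FilePath are the cell aliases from the config tab):

```vba
' Read the named config cells from the configuration tab.
Dim vOperator As String, vFilePath As String, vFileName As String

vOperator = Range("Operator").Value
vFilePath = Range("FilePath").Value

' Combine the operator, a date stamp, and a label describing the
' process, e.g. "jsmith_20170614_AccountUpdate.log"
vFileName = vOperator & "_" & Format(Date, "yyyymmdd") & "_AccountUpdate.log"
```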

Creating the log file


This creates the file from the file path variable and file name using some built-in VBA statements.
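One way to sketch that step with VBA’s built-in file statements (logFileNum and the config variable names are assumptions carried over from the setup above):

```vba
' Grab an unused file number and open the log file for output.
Dim logFileNum As Integer
logFileNum = FreeFile
Open vFilePath & "\" & vFileName For Output As #logFileNum
```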

Writing to the log


In this case, I wrote the data to a variable first, then used the built-in VBA statement to write the variable to the log. Use this at any point in the code to log an action on the data. In the example above, I’m logging that I updated one value to another for a particular account.
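As a sketch (vAccount, vOldValue, and vNewValue are hypothetical values pulled from the data tab, and logFileNum is the file number the log was opened with):

```vba
' Build the log line in a variable first, then write it with Print #.
Dim vLogLine As String
vLogLine = Format(Now, "yyyy-mm-dd hh:nn:ss") & " | Account " & vAccount & _
           ": changed '" & vOldValue & "' to '" & vNewValue & "'"
Print #logFileNum, vLogLine

' When processing is finished, close the file so the log is flushed.
Close #logFileNum
```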

Output Result:

Based on the config, I can check my output.


Opening the file I know I just created, I can review the details of the process.



That’s about it! A pretty simple solution to help keep track of changes made to data via the VBA processing.

Thanks for reading! TG

SSIS Excel destination: Templates!

If you have a report that always outputs to an Excel destination, here’s a trick my team uses to export the output the same way each time: we use an Excel template to keep the output uniform.

  • Run the output SQL first to get the data.
  • Copy/paste the output into Excel.
  • Save the file with a generic but related name. We keep our templates in a separate directory for safekeeping.

[Screenshots: the template directory and an example template filename]

  • Remove data, but keep headers.


  • Create two variables in SSIS:
    • srcFile (source file / template)
    • dstFile (destination file, using an expression for the date stamp)
  • At the beginning of your SSIS package, add a File System Task that uses the two previously created variables.
    • This step copies your template, headers and all, to a new file location with a unique name (based on the date stamp).
  • In the Excel Destination step, create an expression in the properties.
    • This writes your run’s output into the unique file that was just created, without modifying the original template.
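As a sketch, the dstFile variable’s expression might look something like this (the share path and report name are made-up placeholders; SSIS expression syntax):

```
"\\\\fileserver\\jobsubmission\\MyReport_"
  + (DT_WSTR, 4) YEAR(GETDATE())
  + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
  + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
  + ".xls"
```

At run time this evaluates to something like \\fileserver\jobsubmission\MyReport_20170614.xls, so each run lands in its own date-stamped copy of the template.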

Using this technique, you can reuse the template over and over without having to manipulate the output data after the fact.


SSIS to Excel output limit fix

With many of the SSIS packages my team writes, we have to output to an Excel destination. One problem we found was that any data set over 10k rows would not fully populate the Excel file. We tried everything from buffer sizes to exporting to text first – nothing quite worked!

After much research, we tested a theory. Our network admin modified permissions on the output directory for the account that was running the SSIS packages and generating the Excel files. I believe it had to have full rights on the job submission directory.

Problem solved!

It seems rather odd that permissions would limit the Excel destination, but we could tell from testing: limiting the output of the same query to, say, 100 records was no issue. 9,999 records – no problem. As soon as the limits came off, the problem returned. That’s when we figured there was some type of permission-related limitation.

A couple of other points to the process:

  • We have a generic batch account run the processes from the SQL Agent, instead of having one of the team members’ accounts do it. This helps maintain a single point of permission management.
  • We also have a single output directory on our share drive. The ‘job submission’ directory branches off for each department, with individual outputs beneath. The structure is locked down to read-only, but users can copy files off for themselves and modify them to their heart’s content. One too many times, the original file was messed up or deleted.