SQL Saturday Speaker Marketing – With Powershell

Last Updated on July 3, 2018 by John Morehouse

pass_marketing_2015

If you follow the Twitter account for SQL PASS, you probably noticed that they are actively promoting individual Summit 2015 speakers.  I think this is a fantastic marketing strategy not only for the individual speakers but also for the Summit itself.

I’m currently involved with organizing a SQL Saturday event (SQL Saturday #403 – Louisville).  With these events, I am always on the lookout for ways to make our event even better.  When I saw PASS doing this type of marketing, I knew that we had to do it for our event.

Now, if you look at the event schedule, we have 36 speakers.  I'll admit it would be relatively easy to copy and paste things together for a tweet or two.  But what about those SQL Saturdays, like Atlanta, that have 75 speakers?  With 75 speakers, the copy-and-paste process would become unwieldy quickly.

sqlsat_xml_feed_example

One of the benefits of the SQL Saturday website is that it offers an XML feed for every event.  This XML feed provides a number of useful data elements, including the speakers, the sessions, and the event itself.  Given that I like to make things suck less, I wanted to find a way to dynamically generate Twitter content from this XML feed.  Once I have the content, I can schedule it appropriately in my favorite social media application.

Enter Powershell.  Stage left.

Powershell has many abilities.  One of them is consuming and manipulating XML data, which makes it a perfect fit for generating what I need.  I modified the format of the tweet from the one that PASS was using.  In short, I wanted the script to:

  • Print out the length of the tweet so that adjustments could be made if longer than 140 characters
  • Include a link to each individual session
  • Automatically shorten the session URL
  • Include session title
  • Automatically use the appropriate event hashtag, whether that is the event number or the event name
  • If the speaker included their Twitter handle, include that as well
  • Output the results either to the screen or to a CSV file

Bit.ly URL Shortening

posh_tweet_1

First, let's get set up to shorten the session URL.  There are a number of services that can do this; I chose Bit.ly for no specific reason other than that I was able to find examples of how to communicate with its service on the internet.  In order to do this, I had to create an account at Bit.ly and generate an application token.  This token allows the script to automatically pass in the long URL and retrieve the shortened version.

It is worth noting that you should not publicize the token as someone could use it to impersonate you when communicating with the service.
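As an aside, if you plan to share or publish the script, one way to keep the token out of the script body is to read it from an environment variable instead of assigning it inline.  This is just a minimal sketch, and the BITLY_TOKEN variable name is only an example:

#read the Bit.ly token from an environment variable instead of hard-coding it
$OAuthToken = $env:BITLY_TOKEN
if (-not $OAuthToken) {
    throw "Set the BITLY_TOKEN environment variable before running this script."
}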

Let’s dive into the script!

The first thing that we are going to do in the script is declare some variables that will be used later in the script. We will also create a function to shorten the URL. This function will be called as we iterate through the XML data.

#Tweet Format
#Speaker [name] presents [title] [SessionURL] /cc [twitter handle] [event hashtag]

$eventNum = "403"
$OAuthToken =  "[bit.ly token goes here]"
$outputfile = "c:\temp\sqlsat$($eventnum)_tweets.csv"
$outtofile = 1

#declare our array to hold our tweets
$tweets = @()

# used to shorten the session URL for a better character count for Twitter
# internals borrowed from http://powershellnotebook.com/2014/10/29/powershell-shorten-links-with-the-bit-ly-api/
# function wrapper was written by John Morehouse

function Get-ShortURL{
    Param([string]$longURL, 
            [string]$OAuthToken)

  # Make the call
  $MyURL=Invoke-WebRequest `
    -Uri https://api-ssl.bitly.com/v3/shorten `
    -Body @{access_token=$OAuthToken;longURL=$LongURL} `
    -Method Get

  #Get the elements from the returned JSON 
  $MyURLjson = $MyURL.Content | convertfrom-json
   
  # Print out the shortened URL 
  write-output $MyURLjson.data.url 
}   
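Before wiring the function into the loop, it can be worth a quick sanity check on its own.  The URL below is just an example; any long URL will do:

#quick test of the shortener before using it in the loop
$testURL = Get-ShortURL -longURL "http://www.sqlsaturday.com/403/" -OAuthToken $OAuthToken
$testURL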

Given that we now have the function available, we can start to parse the XML data. To do this we will use the Invoke-WebRequest cmdlet and cast the response content to [xml], which gives us a navigable $xdoc object.

#let's get the XML from the SQL Saturday website and cast it to [xml] so we can walk the elements
[xml]$xdoc = (Invoke-WebRequest -Uri "http://www.sqlsaturday.com/eventxml.aspx?sat=$eventNum" -UseBasicParsing).Content

If you examine the $xdoc object, you will notice that there are a number of elements that you can use. For this purpose, we will only need certain elements from the Speaker and Event objects.

#we only need a subset of each node of the XML, mainly the speakers and the sessions
$speakers = $xdoc.GuidebookXML.speakers.speaker | select importid, name, twitter
$sessions = $xdoc.GuidebookXML.events.event | select importID, title

To my knowledge there isn't an easy way to "join" two Powershell objects together, so I opted to use a ForEach loop instead (if someone knows a better method, please share, I would love to know).  I start with the $speakers object, and for every speaker within it, we get their respective session from the $sessions object.  The $sessions object is piped into the Where-Object cmdlet using the criterion that the session importID is equal to the speaker importID.  In SQL terms, this would be the foreign key between the two objects.

foreach ($speaker in $speakers){
$session = $sessions | where-object {$_.importid -eq $speaker.importID}
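For what it is worth, one possible alternative to the Where-Object lookup is to build a hashtable keyed on importID once, before the loop, and then index into it for each speaker.  This is only a sketch of the idea (it assumes one session per importID) and is not what the rest of this post uses:

#build a lookup table of sessions keyed on importID (alternative to Where-Object)
$sessionLookup = @{}
foreach ($s in $sessions){ $sessionLookup[$s.importID] = $s }

#inside the speaker loop, this would replace the Where-Object call:
#$session = $sessionLookup[$speaker.importID]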

Now that we have the speaker and their respective session, we need to clean up some data. The SQL Saturday website does not appear to sanitize the Twitter handle information. This means that speakers can enter a range of values for their Twitter handle, such as “http://twitter.com/sqlrus”, “sqlrus”, or “@sqlrus”.

Another issue I ran into is with the title. The site stores the title with HTML-encoded characters. For example, ampersands appear as “&amp;” and apostrophes as “&#39;”. Since we want the tweet to be as clean as possible, we will need to adjust the title for those characters.

#sanitize the data some
#if the twitter value is 2 characters or fewer, just set it to a blank value
IF ($speaker.twitter.Length -le 2){
$twitter = ""
}
#if the twitter value is longer and begins with https, replace the URL with an @
ELSEIF (($speaker.twitter.Length -gt 1 ) -and ($speaker.twitter.substring(0,5) -eq "https")){
$twitter = "/cc " + $speaker.twitter.Replace("https://twitter.com/","@")
}
#if the twitter value is longer and begins with http, replace the URL with an @
ELSEIF (($speaker.twitter.Length -gt 1 ) -and ($speaker.twitter.substring(0,4) -eq "http")){
$twitter = "/cc " + $speaker.twitter.Replace("http://twitter.com/","@")
}
#if the first character is NOT an @, add one
ELSEIF ($speaker.twitter.substring(0,1) -ne "@"){
$twitter = "/cc @" + $speaker.twitter
}
#else just build the /cc string
ELSE {$twitter = "/cc " + $speaker.twitter}

#clean up the title if there are any HTML encoded characters
$title = $session.title.TrimEnd().replace("#39;","'").replace("amp;","&")
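As an aside, if the titles contain other encoded entities beyond these two, .NET's HtmlDecode method will handle them generically.  This is a sketch of that alternative, assuming the parsed titles still contain full entities such as "&amp;" and "&#39;", and is not what the script above does:

#decode &amp;, &#39;, &quot;, and any other HTML entities in one call
$title = [System.Net.WebUtility]::HtmlDecode($session.title.TrimEnd())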

At this point, we’ve cleaned up the Twitter information and the title. Now we are ready to get the shortened URL. Since each session has a unique ID for each event, it is pretty simple to dynamically build out the long URL. We will then pass the long URL along with the OAuth security token into the function.

#get the short URL
$longurl = "http://sqlsaturday.com/$eventNum/Sessions/Details.aspx?sid=" +$session.importID
$shortURL = Get-ShortURL -longURL $longURL -OAuthToken $OAuthToken

Now we can bring it all together.  This will insert every individual tweet into our array.

#bring it all together and put it in the array
  $tweets += "Speaker " + $speaker.name + " presents `"" + $title + "`" " + $shortURL + " " + $twitter + " " + $xdoc.GuidebookXML.guide.twitterHashtag
} #end of the speaker loop

Our array is now fully populated with all of the dynamically created tweets.  Since I wanted the ability to output the contents either to the screen or to a CSV file, we need some conditional logic.  We also give the array some formatting so that things look nice and clean.

#provide some formatting to our array
$format = @{Label="Length";Expression={$_.Length}}, @{Expression={$_};Label = "Tweet"}

#if the $outtofile variable is 1, export it to CSV.  If not, pump it to the screen.  
If ($outtofile -eq 1){
    $tweets | select-object -Property $format  | export-csv $outputfile
  } else {
    $tweets | select-object -Property $format | format-table -AutoSize
  }
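One small note on the CSV output: in Windows PowerShell, Export-Csv writes a "#TYPE" information line at the top of the file by default.  If that extra line gets in the way of your tracking spreadsheet, adding -NoTypeInformation keeps the file to just the header row and the tweets:

#suppress the #TYPE header line in the CSV
$tweets | select-object -Property $format | export-csv $outputfile -NoTypeInformation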

The output to the screen looks like this:
posh_tweet_2

If I had wanted to output the results to a CSV file, it would look like this:

posh_tweet_5

In our case, we are using the CSV to track when each tweet goes out as well as who is sending it out.

Final Product

Now that I have the tweets created, all I have to do is schedule them accordingly.  I chose to use Buffer because the interface is clean and simple.  If you have a free account, you are limited to 10 tweets in the queue, so you will have to manage it accordingly.

Once you have created a Buffer account, you will need to link it to your Twitter account.  The process is simple.  After linking the accounts, you can start to add tweets to the queue.

As you can see below, I've got one ready to be added for Bill Fellows.  This particular tweet will only be going to my Twitter feed.  However, my Buffer account is linked to my Twitter feed, my LinkedIn profile, and my Facebook account, so I could publish the content there as well.

posh_tweet_3

posh_tweet_4

I simply add it to the queue and Buffer does the rest! You can also schedule it for a specific date and time.  Whether it's been scheduled or just sitting in the queue, at the appropriate time Buffer will deliver the tweet for you! You might notice that Buffer kindly converts the short URL to its own version.  You might have the same experience with your own social media tool, so it's just something to be aware of.

Summary

In today's world, we want our events to be as successful as possible.  With the number of SQL Saturdays happening almost every weekend, every little bit of marketing helps to draw in attendees.  Not only will our speakers get more press time, but the event itself will too.  Hopefully a script like this will make things easier.

You can download the entire script here.

Enjoy!

© 2015 – 2018, John Morehouse. All rights reserved.
