Recently, I've been handed a lot of Excel spreadsheets (and various other formats) and told to move the data into a SQL table. You could do this with DTS if you have that kind of access to the server, but suppose you don't. Here's a little PowerShell script that crunches through the rows in a CSV file and generates insert statements in a SQL script. The script requires a path to the CSV file and also accepts an optional parameter for the name of the table to insert into. If you don't pass a table name, the file name is used instead. The script is intentionally verbose and commented to make it easier to read and understand. Here you go...
# Verify that a file argument was passed in.
if ($args[0]) { $file = $args[0] }
else { write-warning "You must supply a path to the csv."; break }
# Verify that the file exists.
if(![System.IO.File]::Exists($file)) { write-warning "Cannot find '$file'."; break }
# Verify that the file is a csv file.
if(!$file.ToLower().EndsWith(".csv")) { write-warning "The file specified is not a CSV file."; break }
# If no table name was provided, we'll use the file name instead. Note that
# TrimEnd(".csv") would be wrong here: it trims a *set* of trailing characters,
# so a name like "docs.csv" would come back as "do".
if($args[1]) { $tableName = $args[1] }
else { $tableName = [System.IO.Path]::GetFileNameWithoutExtension($file).ToLower() }
# Generate an informative header for the sql file.
# Note that the out-file command will overwrite any existing file.
$output = [System.IO.Path]::ChangeExtension($file, ".sql")
"/************************************************************`n" +
"** Script generated by CsvToBatchInsert.ps1 `n" +
"** Date: " + [System.DateTime]::Now.ToString("MM/dd/yyyy HH:mm:ss") + " `n" +
"** From file: $file `n" +
"************************************************************/`n" | out-file -filepath $output
# Loop through the rows in the csv file.
Import-Csv $file | % {
    # Capture the current row so we can read its values inside the inner loops.
    $row = $_
    # The insert variable is used to build a single insert statement.
    $insert = "INSERT INTO $tableName ("
    # We only care about the NoteProperties; no use dealing with methods and the like.
    $properties = $row | Get-Member -MemberType NoteProperty
    # Create a comma-delimited string of all the property names to use in the insert statement.
    # Make sure that the column headings in the CSV file match the field names in your
    # table before you run the script.
    $properties | % { $insert += $_.Name + ", " }
    $insert = $insert.TrimEnd(", ") + ") VALUES ("
    # Read each value directly off the row object by property name. Since the value will be
    # enclosed in single quotes, any single quote inside it has to be escaped; in T-SQL you
    # do that by putting another single quote directly in front of it.
    $properties | % {
        $value = [string]$row.($_.Name)
        $insert += "'" + $value.Replace("'", "''") + "', "
    }
    $insert = $insert.TrimEnd(", ") + ")"
    # Append the insert statement to the end of the output file.
    $insert | out-file -filepath $output -append
}
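For example, you would run the script like this (the file and table names here are made up for illustration):

```powershell
# Table name passed explicitly; omit the second argument to fall back to the file name.
.\CsvToBatchInsert.ps1 .\customers.csv Customers

# A row like   Name,City
#              O'Brien,Boston
# comes out in customers.sql as:
# INSERT INTO Customers (Name, City) VALUES ('O''Brien', 'Boston')
```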
You can easily tweak this script to generate update statements, or delete statements, or anything else for that matter. As always, I welcome any comments, especially ones that show a better way of doing this. Enjoy!!!
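As a rough sketch of the update variant, the row loop could be rearranged like this. It assumes a hypothetical key column named Id (adjust to whatever identifies a row in your table); $file, $tableName, and $output are the same variables the script already sets up:

```powershell
# Sketch only: Id is an assumed key column, not something the CSV is guaranteed to have.
Import-Csv $file | % {
    $row = $_
    $update = "UPDATE $tableName SET "
    # PSObject.Properties exposes each column's name and value directly.
    $row.PSObject.Properties | where { $_.Name -ne "Id" } | % {
        $update += $_.Name + " = '" + ([string]$_.Value).Replace("'", "''") + "', "
    }
    $update = $update.TrimEnd(", ") + " WHERE Id = '" + ([string]$row.Id).Replace("'", "''") + "'"
    $update | out-file -filepath $output -append
}
```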
2 comments:
why not just use bcp.exe?
I wasn't aware of the bcp utility. It looks cool. :)
Can you point me to a good example of converting a csv or Excel file into a table?