SQL Server and Postgres Datatypes

 SQL Server        Postgres
 bigint            numeric(20,0)                 
 binary            bytea                         
 bit               numeric(1,0)                  
 char              character(10)                 
 date              date                          
 datetime          timestamp without time zone   
 datetime2         timestamp(6) without time zone 
 datetimeoffset    timestamp(6) with time zone   
 decimal           numeric(18,0)                 
 float             double precision              
 geography         character varying(8000)       
 geometry          character varying(8000)       
 hierarchyid       character varying(8000)       
 image             bytea                         
 money             numeric(19,4)                 
 nchar             character(10)                 
 ntext             text                          
 numeric           numeric(18,0)                 
 nvarchar          character varying(10)         
 real              double precision              
 smalldatetime     timestamp without time zone   
 smallint          numeric(5,0)                  
 smallmoney        numeric(10,4)                 
 sql_variant       character varying(8000)       
 sysname           character varying(128)        
 text              text                          
 time              time(6) without time zone     
 timestamp         character varying(8000)       
 tinyint           numeric(5,0)                  
 uniqueidentifier  uuid                          
 varbinary         bytea                         
 varchar           character varying(10)         
 xml               xml                           
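
As an illustration, here is a small table translated with the mappings above; the table and column names are hypothetical:

-- SQL Server definition
CREATE TABLE dbo.orders (
    order_id    bigint           NOT NULL PRIMARY KEY,
    customer_id uniqueidentifier NOT NULL,
    amount      money            NOT NULL,
    placed_at   datetime2        NOT NULL,
    notes       nvarchar(10)     NULL
);

-- Postgres equivalent, following the mapping table literally
CREATE TABLE orders (
    order_id    numeric(20,0)                  NOT NULL PRIMARY KEY,
    customer_id uuid                           NOT NULL,
    amount      numeric(19,4)                  NOT NULL,
    placed_at   timestamp(6) without time zone NOT NULL,
    notes       character varying(10)          NULL
);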

Postgres equivalent of SQL Server BEGIN TRANSACTION




Comparing transactions between SQL Server and Postgres:

SQL Server : BEGIN TRANSACTION -- COMMIT TRANSACTION -- ROLLBACK TRANSACTION

Postgres   : BEGIN -- COMMIT -- ROLLBACK

Unlike SQL Server, Postgres commits or rolls back a transaction at the first COMMIT/ROLLBACK command after a BEGIN statement and ignores subsequent COMMIT/ROLLBACK commands.

In SQL Server, each BEGIN TRANSACTION increments @@TRANCOUNT and requires its own COMMIT (a single ROLLBACK, however, rolls back the entire nested transaction).
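
A minimal sketch of the difference described above, runnable in each engine's console:

-- SQL Server: nested BEGIN TRANSACTION increments @@TRANCOUNT,
-- and each level needs its own COMMIT.
BEGIN TRANSACTION
    BEGIN TRANSACTION
    SELECT @@TRANCOUNT   -- returns 2
    COMMIT               -- @@TRANCOUNT drops to 1; nothing is durable yet
COMMIT                   -- the outer COMMIT makes the work durable

-- Postgres: the first COMMIT ends the transaction; the second BEGIN and
-- the extra COMMIT only raise warnings.
BEGIN;
BEGIN;     -- WARNING: there is already a transaction in progress
COMMIT;    -- ends the single open transaction
COMMIT;    -- WARNING: there is no transaction in progress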

Migrate SQL Server Database to AWS (RDS) or Azure using SSIS

If backup and restore is not an option for migrating or copying the whole database, especially to managed database services such as AWS RDS or Azure, or to any environment that restricts restoring backups, you need an import/export mechanism to migrate the data.
The top two ways to migrate the data are SSIS (the Export Wizard) and BCP. Choosing between them depends on whether you have direct access to the destination server (SSIS) or a disconnected destination (BCP), where you copy the data dump and insert the data at the destination.

Both of these techniques are familiar to most DBAs; here I am highlighting the problems I faced while working with them (SSIS/BCP).

SSIS

Export Wizard:
The Export Wizard has an option to save an SSIS package without executing it, which is a handy feature, but the saved package has some limitations that need manual intervention.

1. Identity Insert should be ON for all the tables that have an identity column. It is a very tedious manual process to identify which tables have this property and to enable it in each data flow.

2. Keep Nulls should be ON for tables with default values. E.g.: if a table has a nullable date column with a default of getdate(), then after exporting the data it inserts date values instead of NULLs, unless the data flow destination property FastLoadKeepNulls is checked (see the sketch after this list).

3. The fast-load commit size is not set by default; the property FastLoadMaxInsertCommitSize has to be set manually.

4. TABLOCK is not enabled by default.

5. ROWS_PER_BATCH is not configured by default.
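
A minimal sketch of the Keep Nulls issue from item 2 (the table and names are hypothetical):

-- Source table: a nullable date column with a default.
CREATE TABLE dbo.audit_demo (
    id         int IDENTITY(1,1) PRIMARY KEY,
    created_at datetime NULL DEFAULT (getdate())
);
INSERT INTO dbo.audit_demo (created_at) VALUES (NULL);  -- stored as NULL

-- After the export runs, if the destination data flow loads without
-- FastLoadKeepNulls, the NULL is replaced by the column default (getdate())
-- at the destination, silently changing the data.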

All five settings above have to be applied manually; without them the package can fail or perform slowly.
Think about a situation where you need to export 100 tables with millions of rows.
The VBScript below creates a new SSIS package with all of the above options applied.

How it works:
It replaces specific strings in the package XML to set the above configuration values in the source code.

Note: This has been tested on SQL Server 2008 EE. If it doesn't work in your environment, modify the key strings to match your release.

Steps:

1. Run the Export Data wizard and save the package to C:\TEMP\northwind.dtsx

2. Copy the VBScript code below to a file at C:\TEMP\SetSSIS.VBS

3. Edit the VBScript and set filePath = "<ssis file path>"

4. Run the VBScript

5. It creates a new package named <PKG_NAME>.new.dtsx

6. Open the new package in BIDS

7. Run the package

Before / After:

[Screenshots: the data flow destination properties before and after running the script]

Const ForReading = 1
Const ForWriting = 2

Set objFSO = CreateObject("Scripting.FileSystemObject")

' Path to the package saved by the Export Wizard -- edit this value.
filePath = "C:\temp\northwind.dtsx"

' FastLoadOptions value to inject into every OLE DB destination.
Dim NewrowsPerBatchString
NewrowsPerBatchString = "TABLOCK,ROWS_PER_BATCH = 50000</property>"

Set myFile = objFSO.OpenTextFile(filePath, ForReading, True)
Set myTemp = objFSO.OpenTextFile(filePath & ".new.dtsx", ForWriting, True)

' Key strings that identify the destination properties in the package XML.
Dim sIdentity
sIdentity =   "name=""FastLoadKeepIdentity"" dataType=""System.Boolean"" state=""default"" isArray=""false"" description=""Indicates whether the values supplied for identity columns will be copied to the destination. If false, values for identity columns will be auto-generated at the destination. Applies only if fast load is turned on."" typeConverter="""" UITypeEditor="""" containsID=""false"" expressionType=""None"">false</property>"
sKeepNulls = "name=""FastLoadKeepNulls"" dataType=""System.Boolean"" state=""default"" isArray=""false"" description=""Indicates whether the columns containing null will have null inserted in the destination. If false, columns containing null will have their default values inserted at the destinaton. Applies only if fast load is turned on."" typeConverter="""" UITypeEditor="""" containsID=""false"" expressionType=""None"">false</property>"
sFastLoadOptionsBatchCommit = "name=""FastLoadMaxInsertCommitSize"" dataType=""System.Int32"" state=""default"" isArray=""false"" description=""Specifies when commits are issued during data insertion.  A value of 0 specifies that one commit will be issued at the end of data insertion.  Applies only if fast load is turned on."" typeConverter="""" UITypeEditor="""" containsID=""false"" expressionType=""None"">0</property>"
sFastLoadOptions = "name=""FastLoadOptions"" dataType=""System.String"" state=""default"" isArray=""false"" description=""Specifies options to be used with fast load.  Applies only if fast load is turned on."" typeConverter="""" UITypeEditor="""" containsID=""false"" expressionType=""None"">"

Dim x
Do While Not myFile.AtEndOfStream
    myLine = myFile.ReadLine
    ' FastLoadKeepIdentity: false -> true (keeps identity values).
    If InStr(myLine, sIdentity) Then
        myLine = Replace(myLine, "expressionType=""None"">false</property>", "expressionType=""None"">true</property>")
    End If
    ' FastLoadKeepNulls: false -> true (keeps NULLs instead of applying defaults).
    If InStr(myLine, sKeepNulls) Then
        myLine = Replace(myLine, "expressionType=""None"">false</property>", "expressionType=""None"">true</property>")
    End If
    ' FastLoadMaxInsertCommitSize: 0 -> 50000 (commit in batches).
    If InStr(myLine, sFastLoadOptionsBatchCommit) Then
        myLine = Replace(myLine, "expressionType=""None"">0</property>", "expressionType=""None"">50000</property>")
    End If
    ' FastLoadOptions: inject TABLOCK and ROWS_PER_BATCH after the opening tag,
    ' then truncate the old value and duplicate closing tag.
    If InStr(myLine, sFastLoadOptions) Then
        myLine = Replace(myLine, "expressionType=""None"">", "expressionType=""None"">" & NewrowsPerBatchString)
        x = InStrRev(myLine, NewrowsPerBatchString)
        myLine = Left(myLine, x + (Len(NewrowsPerBatchString) - 1))
    End If
    myTemp.WriteLine myLine
Loop

myFile.Close
myTemp.Close
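
To run the script (step 4) from a command prompt:

cscript //nologo C:\TEMP\SetSSIS.VBS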

BCP

See the SQL Server AWS Migration BCP post below, which uses BCP to migrate the data.

SQL Server: Search Job Commands

 

A handy query to search for a particular command across all the jobs.

To find all jobs that use a mail object, set the @keyword value to 'Mail'.
To find all jobs that reference external folders, set the @keyword value to '\'.
To find all jobs that call DTS packages, set the @keyword value to '.dts'.

declare @keyword varchar(max)
set @keyword = 'mail'

set @keyword = '%' + @keyword + '%'

SELECT [sysjobs].[name] AS [job_name],
       [sysjobsteps].[step_name],
       [sysjobsteps].[command] AS [step_command],
       [sysjobsteps].[database_name],
       [sysjobsteps].[output_file_name],
       [sysjobsteps].[last_run_date]
FROM [msdb].[dbo].[sysjobsteps]
INNER JOIN [msdb].[dbo].[sysjobs]
        ON [sysjobsteps].[job_id] = [sysjobs].[job_id]
WHERE [sysjobsteps].[command] LIKE @keyword
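
For example, to run the DTS-package search from the list above, change only the keyword line:

set @keyword = '.dts'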

Postgres: can't open the log file

 

If you are using the pgAdmin tool and it shows this error whenever the code you execute encounters an error:

[Screenshot: pgAdmin "can't open the log file" error]

To fix this:

Open pgAdmin
File -- Options -- Logging -- set the path to an existing folder. E.g.: /tmp/pgadmin.log


SQL Server AWS Migration BCP


A stored procedure to generate BCP scripts to migrate a SQL Server database.
I developed this stored procedure in my lab to simulate migrating databases to AWS RDS.

Execute the output of the procedure in a CMD window.

Output:
creates folders to store the output data
creates folders to log the errors
BCP out commands
BCP in commands
Verify errors: generates PowerShell commands

[Screenshots: sample generated output]

/*
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
1) Create this sp on source

Example :
Exec  AWS_BCP_MIGRATION
'C:\BCP_DATA' ,
'my_source_server',
'my_database',
'my_dest_server',
'my_dest_database',
'my_sa_password',
0
------------
-- Before the next run, drop these tables manually (the SELECT ... INTO statements
-- below fail if they already exist). They are not dropped automatically, to avoid
-- disturbing any tables with these names that already exist in your environment.
DROP TABLE MSDB..AWS_EXPORT
DROP TABLE MSDB..AWS_IMPORT
------------


BCP Parameters

-S  server
-T  trusted connection
-U  username
-P  password
-n  native data format
-E  keep identity values (this is very important in migration)
-w  Unicode character format
-k  keep null values
-t  column delimiter / field terminator
-b  batch size
-e  error log file
-h "TABLOCK"  table-lock hint, for performance
*/
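
For illustration, one generated export/import pair looks roughly like this (server, database, and table names are hypothetical; the delimiter is the one set below):

bcp "select * from my_database.[dbo].[Orders] " queryout C:\BCP_DATA_my_database\BCP_DATA\dbo_Orders.txt -n -S my_source_server -T -E -w -k -t[@*#%^|$ -b 10000 -eC:\BCP_DATA_my_database\LOGS\EXPORT_ERRORS\dbo_Orders.TXT >> C:\BCP_DATA_my_database\LOGS\EXPORT_LOGGING\Orders.TXT

bcp my_dest_database.[dbo].[Orders] in C:\BCP_DATA_my_database\BCP_DATA\dbo_Orders.txt -n -S my_dest_server -Usa -Pmy_sa_password -h "TABLOCK" -E -w -k -t[@*#%^|$ -b 10000 -eC:\BCP_DATA_my_database\LOGS\IMPORT_ERRORS\dbo_Orders.TXT >> C:\BCP_DATA_my_database\LOGS\IMPORT_LOGGING\Orders.TXT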
---------------------------------------------------------------------------------------------------------------------------------------------------------------------
CREATE proc AWS_BCP_MIGRATION
(
@RootFolder varchar(max)
,@SourceServer varchar(max)  = NULL
,@SourceDB varchar(max) = NULL

,@DestServer varchar(max)
,@DestDB varchar(max)
,@DEST_SQL_SA_PASSWORD VARCHAR(256)
,@EXP_IMP_EACH_TABLE BIT = 0
)

AS
IF @SourceServer IS NULL SET @SourceServer = @@SERVERNAME
IF @SourceDB IS NULL SET @SourceDB = DB_NAME()

 

SET NOCOUNT ON
DECLARE @BCP_DATA_FOLDER VARCHAR(MAX)
DECLARE @BCP_EXPORT_ERROR_FOLDER VARCHAR(MAX)
DECLARE @BCP_IMPORT_ERROR_FOLDER VARCHAR(MAX)
DECLARE @BCP_EXPORT_Logging_FOLDER VARCHAR(MAX)
DECLARE @BCP_IMPORT_Logging_FOLDER VARCHAR(MAX)

----------------------------------------------------------------------------------------------------------------------------------------------------------
IF @SourceServer = @DestServer AND @SourceDB = @DestDB
BEGIN
RAISERROR ('Source and Dest connections cannot be the same',18,1)
return
END

 

 

set @RootFolder = @RootFolder + '_' + REPLACE(DB_NAME(),' ','_')

DECLARE @BCP_COL_DELIMITER VARCHAR(10)
SET @BCP_COL_DELIMITER = '[@*#%^|$'
----------------------------------------------------------------------------------------------------------------------------------------------------------

 

set @BCP_DATA_FOLDER = @RootFolder  +'\BCP_DATA'

 

SET @BCP_EXPORT_ERROR_FOLDER = @RootFolder   +'\LOGS\EXPORT_ERRORS'
SET @BCP_IMPORT_ERROR_FOLDER = @RootFolder   +'\LOGS\IMPORT_ERRORS'


SET @BCP_EXPORT_Logging_FOLDER = @RootFolder   +'\LOGS\EXPORT_LOGGING'
SET @BCP_IMPORT_Logging_FOLDER = @RootFolder   +'\LOGS\IMPORT_LOGGING'

SELECT 'MKDIR '+@BCP_DATA_FOLDER
union all
SELECT 'MKDIR '+@BCP_EXPORT_ERROR_FOLDER
union all
SELECT 'MKDIR '+@BCP_IMPORT_ERROR_FOLDER
union all
SELECT 'MKDIR '+@BCP_EXPORT_Logging_FOLDER
union all
SELECT 'MKDIR '+@BCP_IMPORT_Logging_FOLDER
union all
SELECT ''
union all
SELECT 'ECHO FOLDERS CREATED..'
union all
SELECT ''

 

IF @EXP_IMP_EACH_TABLE  = 1
    BEGIN
            -- Interleaved mode: each table's export is immediately followed by its import.
            select IDENTITY(int,1,1) as iid, ' bcp "select * from '+@SourceDB +'.['+schema_name(schema_id) +'].['+ name  +'] " queryout '+@BCP_DATA_FOLDER+'\'+schema_name(schema_id)+'_'+REPLACE(NAME,' ','_')+'.txt -n -S '+@SourceServer+' -T -E -w -k -t'+@BCP_COL_DELIMITER+' -b 10000 -e'+@BCP_EXPORT_ERROR_FOLDER+'\'+schema_name(schema_id)+'_'+REPLACE(NAME,' ','_') +'.TXT >> '+@BCP_EXPORT_Logging_FOLDER+'\'+REPLACE(NAME,' ','_')+'.TXT'
            +CHAR(13) + ' bcp '+@DestDB+'.['+schema_name(schema_id) +'].['+ name  +'] in '+@BCP_DATA_FOLDER+'\'+schema_name(schema_id)+'_'+REPLACE(NAME,' ','_')+'.txt -n -S '+@DestServer+' -Usa -P'+@DEST_SQL_SA_PASSWORD +' -h "TABLOCK" -E -w -k -t'+@BCP_COL_DELIMITER+' -b 10000 -e'+@BCP_IMPORT_ERROR_FOLDER+'\'+schema_name(schema_id)+'_'+REPLACE(NAME,' ','_') +'.TXT >> '+@BCP_IMPORT_Logging_FOLDER+'\'+REPLACE(NAME,' ','_')+'.TXT '
            as [ECHO EXPORT BEGIN...]
            INTO MSDB..AWS_EXPORT
            from sys.tables

            -- Return the interleaved commands to the caller.
            SELECT [ECHO EXPORT BEGIN...] FROM MSDB..AWS_EXPORT ORDER BY IID ASC

            SELECT 'ECHO Export & Import Completed'
            --- to verify the errors
            PRINT 'Powershell.exe -noexit -command "get-childitem '+@RootFolder+'\LOGS\'+'*.TXT -recurse | select-string -pattern ''error''" >> '+@RootFolder+'\'+@DestDB+'_'+'BCP_status.txt'
            PRINT 'Powershell.exe -noexit -command "get-childitem '+@RootFolder+'\LOGS\'+'*.TXT -recurse | select-string -pattern ''#@ Row''" >> '+@RootFolder+'\'+@DestDB+'_'+'BCP_status.txt'

    RETURN


    END


select IDENTITY(int,1,1) as iid, ' bcp "select * from '+@SourceDB +'.['+schema_name(schema_id) +'].['+ name  +'] " queryout '+@BCP_DATA_FOLDER+'\'+schema_name(schema_id)+'_'+REPLACE(NAME,' ','_')+'.txt -n -S '+@SourceServer+' -T -E -w -k -t'+@BCP_COL_DELIMITER+' -b 10000 -e'+@BCP_EXPORT_ERROR_FOLDER+'\'+schema_name(schema_id)+'_'+REPLACE(NAME,' ','_') +'.TXT >> '+@BCP_EXPORT_Logging_FOLDER+'\'+REPLACE(NAME,' ','_')+'.TXT'
as [ECHO EXPORT BEGIN...]
INTO MSDB..AWS_EXPORT
from sys.tables 


SELECT [ECHO EXPORT BEGIN...] FROM  MSDB..AWS_EXPORT ORDER BY IID ASC

SELECT 'ECHO EXPORT COMPLETE'

----------------------------------------------------------------------------------------------------------------------------------------------------------------------

select ' bcp '+@DestDB+'.['+schema_name(schema_id) +'].['+ name  +'] in '+@BCP_DATA_FOLDER+'\'+schema_name(schema_id)+'_'+REPLACE(NAME,' ','_')+'.txt -n -S '+@DestServer+' -Usa -P'+@DEST_SQL_SA_PASSWORD +' -h "TABLOCK" -E -w -k -t'+@BCP_COL_DELIMITER+' -b 10000 -e'+@BCP_IMPORT_ERROR_FOLDER+'\'+schema_name(schema_id)+'_'+REPLACE(NAME,' ','_') +'.TXT >> '+@BCP_IMPORT_Logging_FOLDER+'\'+REPLACE(NAME,' ','_')+'.TXT'
as [ECHO IMPORT BEGIN..]
,IDENTITY(int,1,1) as iid
into MSDB..AWS_IMPORT
from sys.tables a


SELECT [ECHO IMPORT BEGIN..] FROM MSDB..AWS_IMPORT ORDER BY IID ASC

SELECT 'ECHO IMPORT COMPLETE'


--get-childitem H:\BACKUP\AWS_BCP_DATA_RP_Regression\DATA\EXPORT_LOGGING\*.TXT -recurse | select-string -pattern "error"

--- to verify the errors
select  'Powershell.exe -noexit -command "get-childitem '+@RootFolder+'\LOGS\'+'*.TXT -recurse | select-string -pattern ''error''" >> '+@RootFolder+'\'+@DestDB+'_'+'BCP_status.txt'
select 'Powershell.exe -noexit -command "get-childitem '+@RootFolder+'\LOGS\'+'*.TXT -recurse | select-string -pattern ''#@ Row''" >> '+@RootFolder+'\'+@DestDB+'_'+'BCP_status.txt'
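
One way to capture the procedure's output as a runnable command file is sqlcmd; the switches below are standard (-h -1 suppresses headers, -W trims trailing spaces, -o writes the output file), while the server, database, and paths are hypothetical:

sqlcmd -S my_source_server -E -d my_database -h -1 -W -Q "EXEC AWS_BCP_MIGRATION 'C:\BCP_DATA','my_source_server','my_database','my_dest_server','my_dest_database','my_sa_password',0" -o C:\TEMP\migrate.cmd

Then run C:\TEMP\migrate.cmd from a CMD window.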