| SQL Server | Postgres |
|------------|----------|
| bigint | numeric(20,0) |
| binary | bytea |
| bit | numeric(1,0) |
| char | character(10) |
| date | date |
| datetime | timestamp without time zone |
| datetime2 | timestamp(6) without time zone |
| datetimeoffset | timestamp(6) with time zone |
| decimal | numeric(18,0) |
| float | double precision |
| geography | character varying(8000) |
| geometry | character varying(8000) |
| hierarchyid | character varying(8000) |
| image | bytea |
| money | numeric(19,4) |
| nchar | character(10) |
| ntext | text |
| numeric | numeric(18,0) |
| nvarchar | character varying(10) |
| real | double precision |
| smalldatetime | timestamp without time zone |
| smallint | numeric(5,0) |
| smallmoney | numeric(10,4) |
| sql_variant | character varying(8000) |
| sysname | character varying(128) |
| text | text |
| time | time(6) without time zone |
| timestamp | character varying(8000) |
| tinyint | numeric(5,0) |
| uniqueidentifier | uuid |
| varbinary | bytea |
| varchar | character varying(10) |
| xml | xml |
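As a sketch of how the mapping applies in practice, here is a SQL Server table and its Postgres translation per the table above (the table and column names are made-up examples; the parenthesized lengths in the mapping are samples, so carry over your actual lengths):

```sql
-- SQL Server
CREATE TABLE dbo.Orders (
    OrderID   bigint,
    OrderDate datetime,
    Amount    money,
    RowGuid   uniqueidentifier,
    Notes     nvarchar(10)
);

-- Postgres equivalent per the mapping above
CREATE TABLE orders (
    order_id   numeric(20,0),
    order_date timestamp without time zone,
    amount     numeric(19,4),
    row_guid   uuid,
    notes      character varying(10)
);
```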
CALSQL | SQL Server Community Blog
Sunday, November 5, 2017
SQL Server and Postgres Datatypes
Friday, October 27, 2017
Postgres equivalent of SQL Server BEGIN TRANSACTION
Comparing transactions between SQL Server and Postgres.
SQL Server: BEGIN TRANSACTION – COMMIT TRANSACTION – ROLLBACK TRANSACTION
Postgres: BEGIN – COMMIT – ROLLBACK
Unlike SQL Server, Postgres commits or rolls back the transaction at the first COMMIT/ROLLBACK command after a BEGIN statement and ignores subsequent COMMIT/ROLLBACK commands.
In SQL Server, by contrast, each BEGIN TRANSACTION requires its own COMMIT/ROLLBACK.
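A minimal sketch of the difference, assuming a throwaway table `t`:

```sql
-- SQL Server: each BEGIN TRANSACTION increments @@TRANCOUNT,
-- and each level needs its own COMMIT before the work is durable.
BEGIN TRANSACTION;
    BEGIN TRANSACTION;
        INSERT INTO t VALUES (1);
    COMMIT;   -- @@TRANCOUNT drops from 2 to 1; nothing is durable yet
COMMIT;       -- the outermost commit actually persists the row

-- Postgres: the first COMMIT/ROLLBACK after BEGIN ends the transaction;
-- any further COMMIT/ROLLBACK outside a transaction only raises a warning.
BEGIN;
    INSERT INTO t VALUES (1);
COMMIT;       -- transaction ends here
COMMIT;       -- WARNING: there is no transaction in progress
```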
Monday, June 20, 2016
Migrate SQL Server Database to AWS (RDS) or Azure using SSIS
When backup and restore is not an option for migrating or copying a whole database — especially with managed database services such as AWS RDS or Azure, or any environment that restricts restoring backups — an import/export mechanism is needed to move the data. Both techniques are familiar to most DBAs; here I am highlighting the problems with the SSIS Export Wizard:
1. Identity Insert must be ON for every table with an identity property. It is a very tedious manual process to identify which tables have this property and enable the option in each data flow.
2. Keep Nulls must be ON for tables with default values. E.g.: if a column allows NULLs, has a date data type, and has a default value of getdate(), then after the export the destination holds date values instead of NULLs, unless the SSIS data-flow destination property FastLoadKeepNulls is checked.
3. Fast-load options are not set by default; the FastLoadOptions property has to be set manually.
4. TABLOCK is not enabled by default.
5. ROWS_PER_BATCH is not configured by default.
The five settings above have to be done manually. Without them the package will fail or perform slowly.
How it works:
Note: This has been tested on SQL Server 2008 EE. If it doesn't work in your environment, modify the key strings as per your release.
Steps:
1. Run the Export Data wizard and save the package to C:\TEMP\northwind.dtsx
2. Copy the VBScript code below to a file: C:\TEMP\SetSSIS.VBS
3. Edit the VBScript, line 5: filePath = "<ssis file path>"
4. Run the VBScript.
5. It creates a new package with the new name <PKG_NAME>.NEW
6. Open the package in BIDS.
7. Run the package.
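For reference, points 1–5 correspond to custom properties of the OLE DB Destination component inside the .dtsx file. A simplified sketch of how those properties appear in the package XML (attribute layout abbreviated here; exact ids and surrounding elements vary by package and release):

```xml
<property name="FastLoadKeepIdentity">true</property>  <!-- point 1: identity insert -->
<property name="FastLoadKeepNulls">true</property>     <!-- point 2: keep nulls -->
<property name="FastLoadOptions">TABLOCK,ROWS_PER_BATCH = 10000</property>  <!-- points 3-5 -->
```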
BCP: check the link below, which uses BCP to migrate the data.
SQL Server: search job commands
Handy query to search for a particular command across all the jobs.
To find all jobs that use a mail object, set the @keyword value to "Mail".
To find all jobs that reference external folders, set the @keyword value to "\".
To find all jobs that call DTS packages, set the @keyword value to ".dts".
declare @keyword varchar(max) = 'Mail'  -- example keyword
set @keyword = '%' + @keyword + '%'
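The query body itself is truncated in the post; a self-contained sketch of such a search against the msdb job tables (the 'Mail' keyword is just an example) might look like:

```sql
DECLARE @keyword VARCHAR(MAX) = 'Mail';
SET @keyword = '%' + @keyword + '%';

-- sysjobs holds one row per job, sysjobsteps one row per step;
-- the step's command text is what we scan for the keyword.
SELECT j.name AS job_name,
       s.step_id,
       s.step_name,
       s.command
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s
  ON s.job_id = j.job_id
WHERE s.command LIKE @keyword;
```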
Postgres: can't open the log file
If you are using the pgAdmin tool and you hit this error whenever the executed code runs into an error, like below:
To fix this:
Open pgAdmin
-- File – Options – Logging – set the path to an existing folder. E.g.: /tmp/pgadmin.log
Sunday, February 21, 2016
SQL Server AWS Migration BCP
Developed this stored procedure in my lab to simulate migrating databases to AWS RDS.
```sql
/*
Fragment: the procedure header and several DECLAREs are truncated in the source.
The procedure only generates the migration script (MKDIR / bcp / ECHO commands)
as its result sets; it does not execute anything itself.
*/
-- ... ,@DestServer varchar(max) AS  -- destination server
SET NOCOUNT ON
--------------------------------------------------------------------------------
SET @RootFolder = @RootFolder + '_' + REPLACE(DB_NAME(), ' ', '_')
DECLARE @BCP_COL_DELIMITER VARCHAR(10)
SET @BCP_DATA_FOLDER = @RootFolder + '\BCP_DATA'
SET @BCP_EXPORT_ERROR_FOLDER = @RootFolder + '\LOGS\EXPORT_ERRORS'

-- emit the command that creates the data folder
SELECT 'MKDIR ' + @BCP_DATA_FOLDER

IF @EXP_IMP_EACH_TABLE = 1
BEGIN
    SELECT 'ECHO Export & Import Completed'
    RETURN
END

SELECT 'ECHO EXPORT COMPLETE'

-- emit one "bcp ... in ..." import command per table
-- (the FROM clause is missing in the truncated source; sys.tables is assumed,
--  since schema_id and name are referenced)
SELECT ' bcp ' + @DestDB + '.[' + SCHEMA_NAME(schema_id) + '].[' + name + '] in '
     + @BCP_DATA_FOLDER + '\' + SCHEMA_NAME(schema_id) + '_' + REPLACE(name, ' ', '_') + '.txt'
     + ' -n -S ' + @DestServer + ' -Usa -P' + @DEST_SQL_SA_PASSWORD
     + ' -E -w -k -t' + @BCP_COL_DELIMITER + ' -b 10000'
     + ' -e' + @BCP_IMPORT_ERROR_FOLDER + '\' + SCHEMA_NAME(schema_id) + '_' + REPLACE(name, ' ', '_') + '.TXT'
     + ' >> ' + @BCP_IMPORT_Logging_FOLDER + '\' + REPLACE(name, ' ', '_') + '.TXT "TABLOCK" '
FROM sys.tables

SELECT 'ECHO IMPORT COMPLETE'

--- to verify the errors
```
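Redirected to a .bat file, the procedure's output is a plain script along these lines (the server name, password, and folder here are made-up placeholders, and the export half is omitted):

```
MKDIR C:\BCP_MIG_Northwind\BCP_DATA
ECHO EXPORT COMPLETE
 bcp Northwind.[dbo].[Orders] in C:\BCP_MIG_Northwind\BCP_DATA\dbo_Orders.txt -n -S DESTSERVER -Usa -PYourPassword -E -w -k -b 10000
ECHO IMPORT COMPLETE
```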