How to escape single quotes in Unload

    conn_string = "dbname='{}' port='{}' user='{}' password='{}' host='{}'"\
            .format(dbname,port,user,password,host_url) 

    sql="""UNLOAD ('select col1,col2 from %s.visitation_hourly_summary_us where col4= '2018-07-10' and col5= '1';') TO 's3://%s/%s/%s.csv' \
            credentials 'aws_access_key_id=%s;aws_secret_access_key=%s' \
            MANIFEST GZIP ALLOWOVERWRITE;Commit;""" \
            % (schema_name,s3_bucket_name, schema,table,aws_access_key_id,\
            aws_secret_access_key)

con = psycopg2.connect(conn_string)
cur = con.cursor()
cur.execute(sql)

I'm trying to execute the above script to read the table and then create a file in S3.

Since the filter values are string literals, I'm not able to escape the single quotes, and I'm getting a syntax error near "where".

I've also tried escaping with \ in the WHERE condition, but it still shows the same error.

Any help would be highly appreciated.

Thanks

As Sarang says, simply replacing the single quotes with double quotes around the col4 and col5 values in your query should do the trick.

However, I would suggest breaking your string down into smaller chunks that are easier to read and maintain. This way, you should be able to use execute with a parameter tuple, as chepner suggests (see also the psycopg2 documentation on passing parameters):

# Create the inner SQL statement. Notice the single quotes for the general
# string and the double quotes for the col4 and col5 values
sql_stmt = ('SELECT col1, col2 '
            'FROM %s.visitation_hourly_summary_us '
            'WHERE col4 = "2018-07-10" AND col5= "1";' % schema_name)

# Format the s3 path
s3_target = 's3://%s/%s/%s.csv' % (s3_bucket_name, schema, table)

# Format credentials string
s3_credentials = 'aws_access_key_id=%s;aws_secret_access_key=%s' % (
    aws_access_key_id, aws_secret_access_key)

# Create a tuple with all preformatted strings
data = (sql_stmt, s3_target, s3_credentials)

# Format the UNLOAD skeleton. Note there are no quotes around the
# placeholders: psycopg2 adds (and correctly escapes) the quoting
# itself when it binds the parameters.
s3_stmt = ("UNLOAD (%s) TO %s "
           "CREDENTIALS %s "
           "MANIFEST GZIP ALLOWOVERWRITE;")

con = psycopg2.connect(conn_string)
cur = con.cursor()
cur.execute(s3_stmt, data)
con.commit()

UNLOAD - Amazon Redshift: unload ('select * from events where event=\'PURCHASE\' '). Per the following article, I should escape the quotes, and that's what I'm doing, but with no luck. Consider this example:

select 'This is SQL Authority''s author Pinal Dave' as result;

The result of the above select statement is:

This is SQL Authority's author Pinal Dave

As shown in the select statement, you need to use two single quotes to produce a single quote in the result.
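The same doubling rule can be applied mechanically from Python before embedding the inner query into UNLOAD; a minimal sketch (the bucket and prefix names are hypothetical, credentials clause omitted):

```python
inner_query = "select * from events where event = 'PURCHASE'"

# Double every single quote so the query can live inside UNLOAD's own
# single-quoted argument.
escaped = inner_query.replace("'", "''")

unload_sql = "UNLOAD ('%s') TO 's3://my-bucket/events_'" % escaped
print(unload_sql)
# UNLOAD ('select * from events where event = ''PURCHASE''') TO 's3://my-bucket/events_'
```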

You can also use PostgreSQL-style dollar quoting:

unload 
($$
select * from table where id='ABC'
$$)
to 's3://bucket/queries_results/20150324/table_dump/'
credentials 'aws_access_key_id=;aws_secret_access_key='
;
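From Python, dollar quoting is convenient because the quotes inside the query need no escaping at all; a sketch along the same lines (same bucket path as above, credential values left empty as in the example):

```python
inner_query = "select * from table where id='ABC'"

# Dollar quoting: the $$ delimiters replace the outer single quotes,
# so the single quotes inside the query pass through untouched.
unload_sql = (
    "unload ($$%s$$) "
    "to 's3://bucket/queries_results/20150324/table_dump/' "
    "credentials 'aws_access_key_id=;aws_secret_access_key='"
    ";" % inner_query
)
print(unload_sql)
```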

UNLOAD query fails when using quotes: the simplest method to escape single quotes in SQL is to use two single quotes. For example, to display the value O'Reilly, you would write two quotes in the middle instead of one: 'O''Reilly'. The single quote is its own escape character here.

You would want to use two single quotes to enclose the value.

If your query contains quotes (for example to enclose literal values), put the literal between two sets of single quotation marks—you must also enclose the query between single quotation marks:

Example:

UNLOAD ('select * from venue where venuestate=''NV''')

Taken from the redshift documentation: https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html
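Applied to the query from the question, the doubled-quote rule looks like this in Python (the schema, bucket, and table values are illustrative; credentials clause omitted for brevity):

```python
# Illustrative values standing in for the question's variables
schema_name, s3_bucket_name, schema, table = "myschema", "mybucket", "myschema", "mytable"

# The literals are written with doubled single quotes so they survive
# inside UNLOAD's single-quoted query argument.
inner_query = (
    "select col1, col2 from %s.visitation_hourly_summary_us "
    "where col4 = ''2018-07-10'' and col5 = ''1''" % schema_name
)

sql = ("UNLOAD ('%s') TO 's3://%s/%s/%s.csv' "
       "MANIFEST GZIP ALLOWOVERWRITE;"
       % (inner_query, s3_bucket_name, schema, table))
print(sql)
```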

Quoting in a subclause: in such cases you have to escape the single quote to avoid errors, and the easiest way to escape a single quote in a string is to double it up, i.e. add another single quote next to it. (Note also that the UNLOAD command writes data in parallel to multiple files by default; PARALLEL is ON unless you turn it off.)

A single quote can be sent as \\\\' (four backslashes followed by a quote in the Python source).

I have used this in both R and Python. Please find the solution below.

if your sql QUERY is

Select * from sample_table where register_date='2018-12-31'

then for unload command write it like this

sql = """unload ('Select * from tnltemp.otpsuccess_details where register_date=\\\\'2018-12-31\\\\' ')
        to 's3://my-bucket/migration/exported_sample_table_' credentials
        'aws_access_key_id=12234123;aws_secret_access_key=12345'
        DELIMITER AS ','
        NULL AS ''
        parallel off;"""



cur = con.cursor()
cur.execute(sql)

Unload Command Options: unload data from database tables to a set of files in an Amazon S3 bucket. If your query contains quotes (enclosing literal values, for example), you need to escape them in the query. To write data to a single file, specify PARALLEL OFF. A parameterized query (for example a JDBC PreparedStatement, or psycopg2's execute with a parameter tuple) takes care of the quoting for you, but only for the values you pass as parameters; if you build the SQL statement yourself with string formatting, you are passing the raw text to the server and must do the escaping yourself.

You can put the values in double quotes:

'select col1,col2 from %s.visitation_hourly_summary_us where col4= "2018-07-10" and col5= "1";'

Unloading data to Amazon S3: if you loaded your data using a COPY with the ESCAPE option, you must also specify the ESCAPE option with your UNLOAD command to generate the reciprocal output file. Similarly, if you UNLOAD using the ESCAPE option, you need to use ESCAPE when you COPY the same data back.
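A sketch of the reciprocal ESCAPE pair the documentation describes (the table, bucket, and IAM role names are hypothetical):

```python
iam_role = "arn:aws:iam::123456789012:role/MyRedshiftRole"  # hypothetical role

# UNLOAD with ESCAPE ...
unload_sql = (
    "UNLOAD ('select * from my_table') "
    "TO 's3://my-bucket/my_table_' "
    "IAM_ROLE '%s' ESCAPE;" % iam_role
)

# ... and the matching COPY, which must also use ESCAPE to read the
# unloaded files back correctly.
copy_sql = (
    "COPY my_table FROM 's3://my-bucket/my_table_' "
    "IAM_ROLE '%s' ESCAPE;" % iam_role
)
```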


Comments
  • What is the error you are getting? In my opinion your conn_string is wrong here.
  • Also, do not use string formatting operations to build a query. Pass a parameterized query and a tuple of arguments separately to execute instead.
  • This approach lets you have syntax highlighting for the query.